Apple’s Siri contractors will no longer hear you having sex, making drug deals

Google may have been forced by regulators (temporarily, at least) to stop its subcontractors from listening to audio captured by its smart speakers, but Apple clearly sees the benefit of not waiting until it receives a stern letter from the authorities.

Last week The Guardian reported that – like Amazon and Google – Apple was farming out a small percentage of the audio recordings made by its Siri digital assistant to third parties so that speech recognition could be improved. Sometimes that can result in personal and potentially sensitive information being exposed:

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

The Guardian now reports that Apple has decided to suspend what it calls Siri “grading” globally while it conducts a “thorough review”:

“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
