How to Keep Your Siri, Alexa, and Google Assistant Voice Recordings Private

After months of revelations and apologies, all the major smart assistant makers have revamped how they handle human review of audio snippets. Amazon, Google, Apple, and Microsoft were all using third-party contractors to transcribe and vet recordings captured by Alexa, Google Assistant, Siri, and Cortana, adding human brainpower to the underlying machine-learning algorithms. But the backlash over the lack of transparency spurred new customer controls. And with the release of Apple's iOS 13.2 on Monday, you now have one more way to rein in that data collection.


Even if Siri isn't your smart assistant of choice, it's still a good time to take stock of how you have things set up on whatever platform you use. Each service has its own mix of options and controls. Here's how to take the human element out of Siri, Alexa, Google Assistant, and Cortana. Once that's done, tell a friend to do the same.


Siri

Apple paused human review of Siri audio snippets at the beginning of August and published an apology later that month for the lack of transparency. Now, almost three months later, human review has resumed, but it's opt-in only and handled by Apple employees rather than contractors.

With iOS 13.2, your Siri snippets no longer get passed along for human review by default. Instead, Apple asks whether you'd like to opt in during the iOS 13.2 setup flow. If you opt in when you didn't mean to, or change your mind down the road, you can go to Settings > Privacy > Analytics & Improvements and toggle off Improve Siri & Dictation. You can also delete the recordings Apple already has under Settings > Siri & Search > Siri & Dictation History.
