Question: Which tech company hires contractors to listen to what you say that's recorded by your personal voice assistant?
A. Amazon
B. Google
C. Apple
D. All of the above
Answer: Of course, it's "D. All of the above."
First, it was revealed back in April that Amazon has been listening to EVERYTHING we say to Alexa, not just direct commands (see Alexa, Are You Listening?). Next, Google was called out for doing the same thing. And how did we find out? Did Amazon and Google freely and openly make it plain that this was their policy? Hardly. The information came from whistleblowers who went to the press, from errors that gave users access to the wrong audio files, and from a list of questions sent by the U.S. Congress.
And now, it's Apple's turn.
Last week (Jul 26 2019) it was reported that contractors who review Siri recordings for accuracy, and to help make improvements, may be eavesdropping on personal conversations. One of the contract workers said that Siri sometimes records audio after mistaken activations. The normal triggers are the phrase "Hey, Siri" or an Apple Watch being raised while speech is detected. But this source said that Siri is also activated by similar-sounding words or--wait for it--by the sound of a zipper! What's more, the source said that Siri has recorded--and contractors have listened to--"countless instances" of private discussions. How private? Would you believe conversations between doctors and patients, talks about business deals, criminal activity, and even sexual encounters? And the recordings are not anonymous; they are accompanied by user data showing location, contact details, and app data.
So much for Apple protecting my privacy!
As you can imagine, this has not been well received.
Earlier this week (Aug 1 2019), Apple and Google separately announced that they are "suspending" (Apple) or "pausing" (Google) these human reviews of our recordings (Amazon has said nothing about stopping its listening). Google says its pause will last three months; Apple gave no timetable. Apple says that users will "soon" (as part of a future software update) be able to opt out of letting a stranger listen to their conversations.
But why wait? You can delete your stored voice recordings for Alexa, Google Assistant, Facebook Portal, and Siri on your own today, so that they cannot be reviewed by a stranger. The detailed information is found here.
Sounds like that's a good idea.
(Thanks to Michele McTighe for this suggestion).