Siri/Cortana listening posts for Apple/Microsoft and their marketeers

Invasion of privacy by Siri and Cortana brought out into the open

“Everything you’ve ever said to Siri/Cortana has been recorded…and I get to listen to it,” says an employee of Walk N’Talk Technologies

A Redditor, FallenMyst, today stated the obvious on a Reddit thread: people other than the user can easily hear what the user says to Siri and Cortana and, in all probability, may use it to harm the user in the long run.

FallenMyst stated in the thread that he had just joined a tech firm, Walk N’Talk Technologies, where he gets to listen to sound bites, match them against what is said in each audio clip, and then give feedback about the quality to his bosses.

I started a new job today with Walk N’Talk Technologies. I get to listen to sound bites and rate how the text matches up with what is said in an audio clip and give feedback on what should be improved.

So far so good for FallenMyst, because he thought the sound bites given to him for benchmarking might be random. However, he noticed a pattern in the voice samples and realised that they were recordings of users giving voice commands to their smartphones using either Apple’s Siri or Microsoft’s Cortana.

Hearing personal communications from users, which are not supposed to be heard by anyone other than the user and Siri/Cortana, put FallenMyst in a moral dilemma.

“Soon, I realized that I was hearing peoples commands given to their mobile devices. Guys, I’m telling you, if you’ve said it to your phone, it’s been recorded…and there’s a damn good chance a 3rd party is going to hear it,” FallenMyst states in the thread.

It seems that whatever a user says to Siri/Cortana is being recorded, saved in the cloud, and made available for an unwanted third party to listen to.

It may be innocent stuff like “Siri, do you like me?”, but in the end an unwanted person is hearing a personal communication meant for Siri/Cortana’s ears only.

“I heard everything from kiddos asking innocent things like “Siri, do you like me?” to some guy asking Galaxy to lick his bxxxxxe. I wish I was kidding,” FallenMyst states.

Further, if such information is indeed being stored by Apple and/or Microsoft, did they obtain users’ explicit permission to store it? Can Apple/Microsoft guarantee that such personal communications are not misused, or breached by hackers and used against the user? Sometimes, innocent stuff can land you in a soup.

The post has already received a thousand upvotes on Reddit since it was posted an hour ago. Many redditors have given their views and comments on the post. Some of the top comments are given below:

[–]mjrbac0n 1 point 55 minutes ago
“It’s helps pay for the device, like commercials.” Once the sound waves leave your mouth, you don’t own them anymore anyway.

[–]TheGreenJedi 1 point

So i have a quick question OP,

If I remember some similar articles about this stuff happening the data is scrubbed of name and location and many other details, it’d be pretty hard for it to get traced back to a user in regards to the sexting.

Is that true?

Voice Commands are only going to get better through the work your doing at “Walk N’Talk Technologies.” So I’m really not surprised. I consider it my duty to future generations to help this technology work. I’m slightly surprised at the number of people using it for sexting though.

[–]jpgray 1 point

Pretty interesting. I wonder if there’s going to be court cases in the near future on this sort of thing & stuff like the samsung TVs and xbox kinect recording people unwittingly. I don’t care what terms of use you agree to when you buy your device, people aren’t knowingly giving consent to this level of monitoring and it’s definitely an invasion of privacy.

Redditor jpgray has a point there; perhaps Apple and Microsoft should get ready for a class-action suit for invasion of privacy and breach of trust!
