Microsoft contractors are listening in on personal conversations of users of the Skype app’s translation service and Microsoft’s voice assistant Cortana, reports Motherboard. The reviewed audio includes intimate conversations revolving around personal issues, per the report. When Microsoft presents a contractor with a piece of audio to transcribe, the contractor is also given a series of approximate translations generated by Skype’s translation system and must select the most accurate one, the report adds. Notably, the privacy policy pages of Microsoft, Skype and Cortana don’t state anywhere that audio collected by these services could be analysed by humans – a clear lack of disclosure to users and, in effect, outside the realm of what users have consented to.

Google, Apple and Amazon all have similar contractors

If this story sounds familiar, there is good reason. Last month, we reported that Google conceded it had given employees access to audio recordings from Google Assistant and Google Home speakers. To make matters worse, the files included not only voice commands and queries “consciously” made by users, but also “conversations that should never have been recorded, some of which contain sensitive information.”

In fact, that wasn’t the first story to show that our interactions with virtual assistants may not be as private and secure as we might believe. In April this year, we reported that thousands of Amazon employees around the world listen to users’ voice recordings captured on Alexa-powered Echo speakers. Amazon workers listen to the audio clips, which they then transcribe, annotate and feed back into the software to improve Alexa’s voice recognition and help it better understand commands.

Google and Amazon aren’t alone: Apple, which claims to be the de facto flagbearer of user privacy (judging by its recent ad campaign in India), has also run a similar “grading” process that allowed the company’s employees to “review” audio recordings collected by Siri. Apple announced on August 2 that it had halted the process.