Report Claims Contractors Working on Siri ‘Regularly’ Hear Recordings of Drug Deals, Sexual Escapades, and More

Contractors working on Apple’s Siri “regularly” hear drug deals, confidential medical information, and even couples in the throes of passion. A report by The Guardian shares details collected from a contractor working on a Siri team.

The contractor says workers around the world listen to Siri voice data collected from customers in order to improve the Siri voice experience and to help Siri better understand commands and queries.

The employee is said to have shared the information because they were concerned with Apple’s lack of disclosure about human access to the voice data.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Apple has previously acknowledged that this practice takes place, and reaffirmed it in a statement about the issue, shared by 9to5Mac:

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

The contractor who spoke to The Guardian said that “the regularity of accidental triggers on the watch is incredibly high,” and that some snippets were up to 30 seconds in length. Employees on the Siri project are encouraged to report accidental activations, but not the actual content of the recordings.

It should be noted that Apple has an extensive privacy policy related to Siri, and says it anonymizes all incoming data so that it is not connected to a specific Apple ID and provides no identifiable information about the user. However, the contractor claims that user data showing location, contact details, and app data is shared, and that names and addresses are sometimes disclosed when they are spoken aloud.

User voice data is saved for a six-month period so that the recognition system can use it to better understand a person’s voice. The saved voice data is identified using a random identifier that is assigned when Siri is turned on and is never linked to an Apple ID. After the six months are up, a second copy is saved without the identifier, and is used by Apple for up to two years.