
Apple Hit With Class Action Lawsuit for ‘Unlawful and Intentional’ Recording of Siri Requests

Apple has been hit with a class action lawsuit (PDF) over its use of contractors to listen to and grade anonymized Siri requests, a practice that involved the contractors hearing snippets of users’ private conversations.

The lawsuit was filed today in a Northern California court and accuses Apple of “unlawful and intentional recording of individuals’ confidential communications without their consent,” which the suit says violates California privacy laws.

Siri Devices are only supposed to record conversations preceded by the utterance of “Hey Siri” (a “wake phrase”) or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where “Hey Siri” was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.

A recent report highlighted Apple’s practice of having contractors around the world listen to Siri voice data collected from customers, in order to improve the Siri voice experience and to help Siri better understand commands and queries.

The contractors would hear private discussions between doctors and patients, business deals, criminal dealings, sexual encounters, and other personal interactions where Siri had been activated, likely by accident.

A contractor who spoke to The Guardian in July said that “the regularity of accidental triggers on the watch is incredibly high,” and that some snippets were up to 30 seconds in length. Employees in the Siri project are encouraged to report accidental activations, but not the actual content.

It should be noted that Apple has an extensive privacy policy related to Siri, and says it anonymizes all incoming data so that it is not connected to a specific Apple ID and provides no identifiable information about the user. However, the contractor claims that user data showing location, contact details, and app data is shared, and that names and addresses are sometimes disclosed when they are spoken aloud.

User voice data is saved for a six-month period so that the recognition system can use it to better understand a person’s voice. The saved voice data is identified using a random identifier that is assigned when Siri is turned on, never linked to an Apple ID. After six months, a second copy is saved without an identifier, and is used by Apple for up to two years.

The lawsuit is asking Apple to obtain consent before recording a minor’s Siri interactions, delete all existing recordings, and prevent unauthorized recordings in the future. It also seeks damages of $5,000 per violation.

The plaintiffs in the case, one of whom is a minor, say they own an iPhone XR and an iPhone 6 that they claim they would not have purchased had they been aware that their Siri recordings were stored for evaluation. The lawsuit is seeking class action status for all individuals who were recorded by a Siri device without their consent from October 12, 2011 to the present.

Apple has suspended its Siri evaluation program for the time being while it reviews the processes that are in place. The company plans to release a future software update that will let Siri users opt out of having their Siri queries included in the evaluation process, an option that does not currently exist.