Apple Must Face Class-Action Siri Privacy Lawsuit, Says Judge

Apple must face allegations that its Siri voice assistant listened in on users’ private conversations, a federal judge ruled on Thursday, saying the majority of the claims in the proposed class-action lawsuit can go forward.

Reuters reports that U.S. District Judge Jeffrey White told plaintiffs they could try to prove Siri routinely recorded their private conversations because of “accidental activations.” Apple allegedly shared these conversations with third parties, such as advertisers.

Judge White ruled that the plaintiffs can pursue their claims that Apple violated the federal Wiretap Act and California privacy law and committed breach of contract, the report said. A claim of unfair competition was dismissed.

The lawsuit, which was filed in a Northern California court in 2019, accuses Apple of “unlawful and intentional recording of individuals’ confidential communications without their consent,” which the suit says violates California privacy laws.

Siri Devices are only supposed to record conversations preceded by the wake phrase “Hey Siri” or initiated through a specific gesture, such as pressing and holding the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where “Hey Siri” was not uttered or where they did not otherwise perform a gesture intended to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.

One user claims that a private conversation with his doctor about a “brand name surgical treatment” led to him receiving targeted ads for that treatment. Other users claim they received similar targeted ads involving sneakers, sunglasses, and restaurants.

A report by The Guardian highlighted Apple’s practice of having contractors around the world listen to Siri voice recordings collected from customers in order to improve the Siri voice experience and help Siri better understand commands and queries.

The contractors would hear private discussions between doctors and patients, business deals, criminal dealings, sexual encounters, and other personal interactions in which Siri had been activated, likely by accident.

A contractor who spoke to The Guardian in July said that “the regularity of accidental triggers on the watch is incredibly high” and that some snippets were up to 30 seconds long. Employees on the Siri project were encouraged to report accidental activations, but not the recorded content itself.

Apple suspended its Siri evaluation program in August 2019 and released a software update that allows Siri users to opt out of having their Siri queries included in the evaluation process.