Apple Facing Lawsuit for 'Unlawful and Intentional' Recording of Confidential Siri Requests Without User Consent

Apple is facing a class action lawsuit [PDF] for employing contractors to listen to and grade some anonymized Siri conversations for the purpose of quality control and product improvement.

Apple's Siri practices were highlighted in a recent report in which one of the contractors evaluating Siri recordings claimed that reviewers often hear confidential medical information, drug deals, and other private information when Siri is activated accidentally.


The lawsuit, filed in a Northern California court today (and shared by CNBC's Kif Leswing), accuses Apple of "unlawful and intentional recording of individuals' confidential communications without their consent," violating California privacy laws when accidental Siri activations are recorded and evaluated by humans. From the lawsuit:

Siri Devices are only supposed to record conversations preceded by the utterance of "Hey Siri" (a "wake phrase") or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where "Hey Siri" was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.

As outlined in its privacy policies, Apple collects some anonymized Siri recordings for the purpose of improving Siri and, presumably, cutting down on accidental Siri activations. These recordings are analyzed by humans and can include details recorded when Siri mishears a "Hey Siri" trigger word.

The lawsuit claims that Apple has not informed consumers that they are "regularly being recorded without consent," though it also cites Apple's privacy policy, which states that such data can be used to improve its services.

The plaintiffs in the case, one of whom is a minor, claim to own an iPhone XR and an iPhone 6 that they would not have purchased had they known that their ‌Siri‌ recordings were stored for evaluation. The plaintiffs are seeking class action status for all individuals who were recorded by a ‌Siri‌ device without their consent from October 12, 2011 to the present.

The lawsuit asks for Apple to obtain consent before recording a minor's ‌Siri‌ interactions, to delete all existing recordings, and to prevent unauthorized recordings in the future. It also asks for $5,000 in damages per violation.

Apple has suspended its Siri evaluation program while it reviews its processes in light of the contractor's claims. Prior to the suspension, Apple said that a small, random subset (less than 1%) of daily Siri requests is analyzed to improve Siri and dictation, with requests not associated with a user's Apple ID.

Apple in the future plans to release a software update that will let ‌Siri‌ users opt out of having their ‌Siri‌ queries included in the evaluation process, something that's not possible at the current time. All collected ‌Siri‌ data can be cleared from an iOS device by turning ‌Siri‌ off and then on again, while accidental recordings can be stopped by disabling "Hey ‌Siri‌."

Top Rated Comments

10 weeks ago
Guess they should have read the terms of service better...
Rating: 18 Votes
10 weeks ago
All of that quality assurance work and Siri is still comically bad at understanding what she's asked.
Rating: 14 Votes
10 weeks ago
This lawsuit is going nowhere.
Rating: 10 Votes
10 weeks ago
"Apple in the future plans to release a software update that will let Siri users opt out of having their Siri queries included in the evaluation process, something that's not possible at the current time."

Not nearly good enough. This should be opt-in, not opt-out. Also, I imagine this could be grounds for a GDPR lawsuit.

I also read the whole privacy statement displayed when you try to enable Siri. Nowhere in it does it say that humans listen to snippets of your conversations.
Rating: 5 Votes
10 weeks ago
"What happens on your iPhone stays on your iPhone" ... apparently not.

Apple has made HUGE statements about privacy, "Privacy is King," and it turns out that is 'misleading' at best.

Will I opt in? Nope, because you have shown that you are no more trustworthy than any other company when it suits you.
Rating: 5 Votes
10 weeks ago
That was quick.
Rating: 5 Votes
10 weeks ago

"Are they suing Google and Amazon as well?"

Why would the owners of Apple devices be suing Google and Amazon?
Rating: 4 Votes
10 weeks ago
Woot! Also Microsoft is getting in trouble for it now... What I really want revealed is who are these "contractor" companies that got to listen to our recordings? Is the government included in that pool of "contractors"?
Rating: 4 Votes
10 weeks ago
1% of the millions of Apple products is still a crap ton of people. Oops.
Rating: 4 Votes
10 weeks ago
Omg.

so anyways, I used my iPhone today.
Rating: 4 Votes
