Contractors Working on Siri 'Regularly' Hear Recordings of Drug Deals, Private Medical Info and More, Claims Apple Employee

Contractors working on Siri regularly hear confidential medical information, drug deals, recordings of couples having sex, and other private moments, according to a report from The Guardian that shares details from a contractor who works on one of Apple's Siri teams.

The employee who shared the info is one of many contractors around the world who listen to Siri voice data collected from customers to improve the Siri voice experience and help Siri better understand incoming commands and queries.

According to The Guardian, the employee shared the information because he or she was concerned about Apple's lack of disclosure around the human oversight, though Apple has confirmed several times in the past that this review process takes place, and the practice has been outlined in previous reports as well.

The whistleblower said: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

In a statement, Apple confirmed to The Guardian that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri. A small, random subset (less than 1 percent) of daily Siri activations is used for grading, with each clip lasting only a few seconds.

"A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."

Apple has not made its human-based ‌Siri‌ analysis a secret, but its extensive privacy terms don't appear to explicitly state that ‌Siri‌ information is listened to by humans. The employee said that Apple should "reveal to users" that human oversight exists.

The contractor who spoke to The Guardian said that "the regularity of accidental triggers on the watch is incredibly high," and that some snippets were up to 30 seconds in length. Employees listening to Siri recordings are encouraged to report accidental activations as a technical problem, but are not instructed to report anything about the content of those recordings.

Apple has an extensive privacy policy related to Siri and says it anonymizes all incoming data so that it's not linked to an Apple ID and provides no information about the user. Still, the contractor claims that user data showing location, contact details, and app data is shared, and that names and addresses are sometimes disclosed when they're spoken aloud. To be clear, Apple maintains that all Siri data is assigned a random identifier and does not include the location or contact details the contractor describes.

As well as the discomfort they felt listening to such private information, the contractor said they were motivated to go public about their job because of their fears that such information could be misused. "There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad. It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers - addresses, names and so on."

While Apple's ‌Siri‌ privacy policy and security documents do not mention human oversight specifically, they are detailed and provide information on how ‌Siri‌ recordings are used.

As stated in Apple's security white paper, for example, user voice data is saved for a six-month period so that the recognition system can use it to better understand a person's voice. The saved voice data is identified with a random identifier that's assigned when Siri is turned on, and it is never linked to an Apple ID. After six months, a second copy is saved without the identifier and is used by Apple to improve Siri for up to two years. A small number of recordings, transcripts, and associated data without identifying information are sometimes used by Apple for ongoing improvement of Siri beyond two years.

Apple's privacy website has a ‌Siri‌ section that offers up more info, explaining that all ‌Siri‌ queries are assigned a random identifier not associated with an ‌Apple ID‌. The identifier is reset whenever ‌Siri‌ is turned off and then on again, and turning ‌Siri‌ off deletes all user data associated with a ‌Siri‌ identifier.

When we do send information to a server, we protect your privacy by using anonymized rotating identifiers so that searches and locations can't be traced to you personally. And you can disable Location Services, our proactive features, or the proactive features' use of your location at any time.
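To make the retention and identifier lifecycle described in the white paper and on the privacy page concrete, here is a minimal sketch of how such a scheme could be modeled. The types and function names are assumptions for illustration only, not Apple's implementation, and the small subset of de-identified data kept beyond two years is omitted.

```swift
import Foundation

// Illustrative model of the retention scheme described above; not Apple's implementation.
struct StoredRecording {
    var voiceData: Data
    var randomIdentifier: UUID?   // nil once the identifier has been stripped
    let createdAt: Date
}

struct SiriDataStore {
    private(set) var recordings: [StoredRecording] = []
    private(set) var currentIdentifier = UUID()   // assigned when Siri is turned on

    // New recordings are stored under the current random identifier, never an Apple ID.
    mutating func store(_ voiceData: Data, now: Date = Date()) {
        recordings.append(StoredRecording(voiceData: voiceData,
                                          randomIdentifier: currentIdentifier,
                                          createdAt: now))
    }

    // After six months the identifier is dropped from the saved copy; the
    // de-identified copy is then kept for up to two years before being discarded.
    mutating func applyRetentionPolicy(now: Date = Date()) {
        let sixMonths: TimeInterval = 182 * 24 * 3600
        let twoYears: TimeInterval = 730 * 24 * 3600
        recordings = recordings.compactMap { recording in
            let age = now.timeIntervalSince(recording.createdAt)
            if age > twoYears { return nil }
            var updated = recording
            if age > sixMonths { updated.randomIdentifier = nil }
            return updated
        }
    }

    // Turning Siri off deletes the data tied to the current identifier;
    // turning it back on assigns a fresh identifier, as the privacy page describes.
    mutating func turnSiriOff() {
        recordings.removeAll { $0.randomIdentifier == currentIdentifier }
    }

    mutating func turnSiriOn() {
        currentIdentifier = UUID()
    }
}
```

The key property the documents emphasize is that nothing in this store is ever keyed to an Apple ID, only to a rotating random identifier that the user can reset by toggling Siri off and on.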

Those concerned about Siri triggering accidentally on devices like the iPhone, Apple Watch, and HomePod can turn off the "Hey Siri" feature and activate Siri manually instead, and Siri can also be turned off entirely.

Top Rated Comments

Tivoli_, 64 months ago
No matter how much privacy is touted by companies, including Apple, it is difficult to trust any of them. Don't let any company have a spy in your house. Period.
Score: 45 Votes
Glockworkorange, 64 months ago
There will be a Rene Ritchie video defending this soon. If it were Google/Facebook, he'd go *******.
Score: 43 Votes
jclo, 64 months ago
Apple made it pretty clear that Siri is anonymized - contractors absolutely should not be able to see "contact details" of whoever the recording is from. This would be huge if that were the case, which I find hard to believe. If names and addresses are referenced separately then what does "contact details" even entail?
I'm also skeptical of this claim from the contractor. Apple's privacy policy is pretty clear on this point, even if it doesn't mention the human oversight. I believe names and addresses might sometimes be heard if spoken aloud, but not that info is sent with contact information/addresses/location.
Score: 43 Votes
gaximus, 64 months ago
So I used to tell everyone that the reason Siri is so much worse than the others, is because they don't have real people listening in to make corrections. And that I take privacy over the better performance of other assistants. But that seems to not be the case. So what the **** Apple, I brag about Apple privacy to all my Android friends (friends with Android phones, not robot friends) and this is how you repay me. SMH
Score: 40 Votes
HackerJL, 64 months ago
This here is the next week of headlines blowing this out of proportion. Brace yourself.
Score: 35 Votes
NickName99, 64 months ago
Sounds like they just need to improve the vetting of these contractors. Naturally some private information will end up in Siri requests sometimes.

I occasionally have access to sensitive information when I’m debugging using restored client database backups. I take it very seriously, I don’t go poking around, I delete the database when I’m done working on the issue. I treat it with the same respect I would want a fellow professional to treat my data with.
Score: 34 Votes