Apple Employees Internally Raising Concerns Over CSAM Detection Plans

Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out internally about how the technology could be repurposed to scan users' photos for other types of content, according to a report from Reuters.

According to Reuters, an unspecified number of Apple employees have taken to internal Slack channels to raise concerns about CSAM detection. Specifically, employees worry that governments could force Apple to use the technology for censorship by scanning for content other than CSAM. Some are also worried that Apple is damaging its industry-leading reputation for privacy.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.

Since announcing the plans last week, Apple has faced a barrage of criticism over CSAM detection, which is still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns center on how the technology could become a slippery slope, with oppressive governments and regimes pressuring Apple to expand it to other kinds of content.

Apple has firmly pushed back against the idea that the on-device technology used to detect CSAM could be repurposed for anything else. In a published FAQ, the company says it will vehemently refuse any such demand from governments.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
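For context on what matching photos against a "hash list" of known CSAM means at a high level, here is a deliberately simplified sketch. It is not Apple's implementation: the real system relies on NeuralHash (a perceptual hash), blinded hash databases, and a private set intersection protocol, and the HashListMatcher type, its properties, and the use of SHA-256 below are all hypothetical stand-ins used only to keep the example self-contained.

```swift
// A minimal sketch of on-device hash-list matching, assuming a hypothetical
// HashListMatcher type. This is NOT Apple's actual system: Apple uses NeuralHash
// plus a private set intersection protocol and a secret match threshold,
// none of which are reproduced here.
import CryptoKit
import Foundation

struct HashListMatcher {
    let knownHashes: Set<Data>   // hashes of known CSAM supplied by NCMEC and other groups
    let reviewThreshold: Int     // matches required before any human review or report

    // Stand-in for a perceptual hash; SHA-256 is used only to keep the example
    // runnable (a cryptographic hash would not survive resizing or re-encoding).
    func hash(of photoData: Data) -> Data {
        Data(SHA256.hash(data: photoData))
    }

    // Counts how many photos in a library match the known-hash list.
    func matchCount(in photos: [Data]) -> Int {
        photos.reduce(0) { total, photo in
            knownHashes.contains(hash(of: photo)) ? total + 1 : total
        }
    }

    // Per Apple's description, nothing is reported unless the threshold is met,
    // and flagged accounts still go through human review before any report to NCMEC.
    func shouldFlagForHumanReview(photos: [Data]) -> Bool {
        matchCount(in: photos) >= reviewThreshold
    }
}
```

Even in this toy version, the design point the FAQ stresses is visible: matching happens only against a fixed list of known hashes, and nothing is surfaced unless a threshold is crossed and a human reviewer confirms the match.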

An open letter criticizing Apple and calling on the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.


Top Rated Comments

haruhiko (58 months ago, score: 84):
“We will refuse any such demands… except when it’s made into a law.” [insert whatever country name you hate here] will definitely force Apple to do so or Apple may lose access to that market.
ItWasNotMe (58 months ago, score: 75):
The road to hell is littered with good intent
Xenden (58 months ago, score: 72):
I really hope apple reversed course on the CSAM stuff. No one wants child porn, but it’s easy to start with a universally reviled topic, then move on to topics that are controversial but not illegal.
TheYayAreaLiving (58 months ago, score: 72):
Enough is Enough, Apple! It is incredibly distasteful for you to SPY on us, as consumers. Apple it will be in your best interest to give us the Opt-Out option from CSAM feature, please.

Where are you, Craig? You said this yourself in WWDC - 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this Apple, leave it in government and law enforcement hands.

1. How about CSAM scans Apple's executives' iPhones first? No one wants their privacy to be exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC - 2021. Apple is up to something. Why now? When we are almost to the release date of iOS 15...

https://www.cultofmac.com/744421/apple-announces-icloud-other-new-privacy-features/

Also, this guy needs to be FIRED from Apple. He is the mastermind behind CSAM. What a joke!

Does anyone here have a game plan on how we can stop this crappy CSAM feature?
zakarhino (58 months ago, score: 53):
"We will refuse any such demands... except when the FBI ask us not to encrypt iCloud Backups.

Oh, and when the CCP ask us to hand over iCloud decryption keys"
urbZZ (58 months ago, score: 46):
No iOS 15 upgrade for me, thats for sure