Apple Provides Further Clarity on Why It Abandoned Plan to Detect CSAM in iCloud Photos

Apple on Thursday provided its fullest explanation yet for last year's abandonment of its controversial plan to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos.

Apple's statement, shared with Wired and reproduced below, came in response to child safety group Heat Initiative's demand that the company "detect, report, and remove" CSAM from iCloud and offer more tools for users to report such content to the company.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in ‌iCloud Photos‌, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the ‌Siri‌ resources are also available, but CSAM detection never launched.
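The proposed detection system was based on matching perceptual hashes of a user's photos against a database of hashes of known CSAM. As a rough illustration of the general technique only (Apple's NeuralHash was a neural-network-based hash, not the simple "average hash" shown here, and all names below are illustrative), a minimal sketch:

```python
# Illustrative sketch of perceptual-hash matching, the general technique
# behind known-image detection systems. This is NOT Apple's algorithm.
# An "average hash" over an 8x8 grayscale image: similar images produce
# hashes that differ in only a few bits, unlike cryptographic hashes.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the mean.
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches(h1, h2, threshold=5):
    """Two images 'match' if their hashes differ in only a few bits."""
    return hamming_distance(h1, h2) <= threshold

# Example: a slightly brightened copy still matches the original,
# which is the point of perceptual (rather than exact) hashing.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
bright = [[min(255, p + 3) for p in row] for row in img]
h1, h2 = average_hash(img), average_hash(bright)
print(matches(h1, h2))  # True: the near-duplicate stays within the threshold
```

The fuzziness that makes this robust to recompression and brightness changes is also what critics worried about: a match threshold, and whatever database the hashes come from, are policy choices rather than mathematical guarantees.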

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." The plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Apple's latest response to the issue comes at a time when the encryption debate has been reignited by the U.K. government, which is considering plans to amend surveillance legislation that would require tech companies to disable security features like end-to-end encryption without telling the public.

Apple says it will pull services including FaceTime and iMessage in the U.K. if the legislation is passed in its current form.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

Glenny2lappies
27 months ago

I think an on-device, privacy-centric approach could be a good middle ground. Removes the external service and is an enhancement / addition to existing on-device recognition technologies.

For example, if someone used their device to capture such material, or otherwise saved such material to their device, maybe a notice could be displayed: "this content could be illegal and causes real harm". It wouldn't remove it, or "report" it external to the device, but it could be at least some way to get someone to think twice about being in possession of such horrendous material.
No.

The slippery slope commences. Child abuse is the easy justification for putting the infrastructure for censorship in place.
Then comes the clamor against "hate speech". Then Thought Crime.

No. Child abuse is illegal and only a very few people are involved. The wider privacy issue is far more important.

Just read 1984 (the Stephen Fry audiobook is best) to see the end result which we can already see happening.
Score: 56 Votes (Like | Disagree)
Apple Knowledge Navigator
27 months ago
That’s a decent explanation, but I’m baffled that they didn’t think of these things before announcing the feature originally.
Score: 46 Votes (Like | Disagree)
xxray
27 months ago
Still super grateful Apple listened and had the “courage” to walk this back and then even enable end-to-end encryption in iCloud. It really made me lose faith in Apple for a bit when they were pursuing this. I cancelled iCloud and everything.

Yes, CSAM is the absolute worst of humanity, and people who cause/encourage it deserve to be punished. But not at the expense of ruining privacy for the entire rest of the human population.
Score: 44 Votes (Like | Disagree)
Wanted797
27 months ago

That’s a decent explanation, but I’m baffled that they didn’t think of these things before announcing the feature originally.
They didn’t.

It was a knee-jerk reaction that sounded good on paper, and I'm assuming Apple corporate bureaucracy (like most companies) meant no one spoke up.
Score: 31 Votes (Like | Disagree)
centauratlas
27 months ago

Is there a deep hidden meaning to this? Basically, is Apple saying in not so many words that if it were to create software to scan for CSAM in iCloud, that software could fall into the hands of data thieves who would exploit its use, and thus, to prevent such a thing happening, it has no intention of creating the software in the first place? Much like the argument Apple uses for not building security backdoors into iPhones: it worries such a thing would fall into the hands of criminals who would exploit its use, and therefore it is better that no such thing exists.
Not just data thieves but authoritarians of all stripes - fascist, socialist, communist, monarchist etc - would use it to scan for unacceptable speech and dissent. Then they'd use it to shut down dissent and free speech. Just ask the panda.

See 1984 and Brave New World. They treat them like instructions, not warnings.
Score: 25 Votes (Like | Disagree)
Glenny2lappies
27 months ago
Good on Apple!

First they came for the socialists (https://en.wikipedia.org/wiki/First_they_came_...), and I did not speak out—
Because I was not a socialist.

Then they came for the trade unionists, and I did not speak out—
Because I was not a trade unionist.

Then they came for the Jews, and I did not speak out—
Because I was not a Jew.

Then they came for me—and there was no one left to speak for me.

With the rise of populist-driven, one-trick-pony political movements, it is truly great to see Apple's stance. Privacy is vital, as is the right to free speech.
Score: 24 Votes (Like | Disagree)