Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available. CSAM detection, however, never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
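At its core, the published design reduced to a set-membership test: hash each photo on-device, then check the hash against the database. The sketch below illustrates only that shape. Apple's actual system used NeuralHash, a perceptual hash tolerant of resizing and recompression, plus a cryptographically blinded database resolved via private set intersection; the SHA-256 digest and plain Set here are hypothetical stand-ins, not Apple's implementation.

```swift
import CryptoKit
import Foundation

// Minimal sketch of on-device hash matching. Assumptions: Apple's real design
// used NeuralHash (a perceptual hash) and a blinded database resolved via
// private set intersection; SHA-256 and a plain Set are stand-ins here.
struct HashMatcherSketch {
    // In Apple's proposal this would be the "unreadable set of hashes"
    // shipped to the device; here it is just placeholder data.
    let knownHashes: Set<String>

    func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // True if the photo's hash appears in the known-hash database.
    func matches(_ imageData: Data) -> Bool {
        knownHashes.contains(digest(of: imageData))
    }
}

let matcher = HashMatcherSketch(knownHashes: ["placeholder-hash"])
print(matcher.matches(Data("example image bytes".utf8))) // false
```

A real perceptual hash deliberately maps visually similar images to the same digest, which is exactly why researchers worried about adversarially crafted collisions.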

Apple planned to report iCloud accounts containing known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said a "threshold" would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus human review of flagged accounts.
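The threshold is what made the headline number plausible: an account would only be flagged after many independent matches, and the probability of many simultaneous false matches falls off exponentially. Here is a back-of-the-envelope sketch of that arithmetic under a simple binomial model, with entirely made-up numbers; Apple never published its per-image false-match rate or the exact threshold value.

```swift
import Foundation

// Log of the binomial coefficient C(n, k), computed via the log-gamma
// function to avoid overflow for large n.
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
}

// P(X >= t) for X ~ Binomial(n, p): the chance that an account holding n
// photos accrues at least t false matches, assuming each photo independently
// false-matches with probability p. All numbers here are hypothetical.
func flagProbability(photos n: Int, perImageFPR p: Double, threshold t: Int) -> Double {
    (t...n).reduce(0.0) { sum, k in
        sum + exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log1p(-p))
    }
}

// With a made-up per-image false-match rate of 1 in 100,000 and a threshold
// of 30 matches, a 100,000-photo library lands around 1e-33, far below the
// stated one-in-one-trillion (1e-12) per-account target.
print(flagProbability(photos: 100_000, perImageFPR: 1e-5, threshold: 30))
```

The point of the exercise is the shape of the curve: raising the threshold slightly buys many orders of magnitude of safety margin, at the cost of requiring more matching images before an account is ever surfaced for review.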

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get that account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Populus
41 months ago
This is the right decision for Apple to make, in my opinion. I’m glad they recognized that there are better ways to prevent the spread of this type of content.

I’m sincerely surprised Apple backtracked on something as big as this (and under such heavy pressure from governments).
Score: 64 Votes
TheYayAreaLiving
41 months ago

Yeah, but now we can't catch the pedophiles
That's law enforcement's and the government's job, not Apple's.
Score: 63 Votes
TheYayAreaLiving
41 months ago
Thank you, Apple. CSAM detection was a joke. "If privacy matters in your life, it should matter to the phone your life is on." Long live!

Score: 44 Votes
Realityck
41 months ago

Apple today announced that it has abandoned its plans to detect known CSAM stored in iCloud Photos, according to a statement shared with WIRED (https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/).
Everybody should be happy CSAM detection is DOA
Score: 30 Votes
aPple nErd
41 months ago
Great news, but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Score: 27 Votes
aPple nErd
41 months ago

[S]After extensive consultation with experts[/S] After extensive public pressure...

Never should have succumbed to the public feedback! Perhaps a botched introduction, but no one else would have done it 'right' like Apple, given the scrutiny they are under.
Truly an unhinged take
Score: 26 Votes