Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available. CSAM detection, however, never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
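Apple never shipped the feature, and its published design relied on NeuralHash perceptual hashing, a cryptographically blinded hash database, and threshold secret sharing, none of which is reproduced here. Purely as a conceptual sketch of "on-device matching against a set of known image hashes, with a reporting threshold," where hash_image() is a hypothetical stand-in and the hash set is an ordinary in-memory set:

```python
# Illustrative sketch only -- not Apple's implementation. Apple's design used a
# perceptual hash (NeuralHash) and a cryptographically blinded hash database;
# here, hash_image() is a hypothetical placeholder and the database is a plain set.
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # Apple described a reporting threshold on the order of 30 matches


def hash_image(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real system would map visually
    near-identical images to the same hash, which SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many of a user's photos match the known-image hash set."""
    return sum(1 for img in images if hash_image(img) in known_hashes)


def exceeds_threshold(images: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Only accounts whose match count crosses the threshold would ever be
    surfaced for human review; individual matches reveal nothing on their own."""
    return count_matches(images, known_hashes) >= MATCH_THRESHOLD
```

In Apple's actual proposal, the device itself could not learn whether any individual photo matched; that property came from the cryptographic layers (private set intersection and threshold secret sharing) that this sketch omits.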

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, along with human review of flagged accounts.
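Apple never published the per-image false-match rate behind the "one in one trillion" figure, so the numbers below are made-up assumptions; the point is only to show how requiring many independent matches drives the account-level false-flag probability down by many orders of magnitude. A minimal sketch, assuming a per-image false-match rate p and a library of n photos:

```python
# Illustrative arithmetic only: p (per-image false-match rate) and n (library
# size) are assumed values, not figures published by Apple.
from math import exp, lgamma, log, log1p


def log_binom_pmf(n: int, k: int, p: float) -> float:
    """log of the Binomial(n, p) probability mass at k, computed in log space
    to avoid overflow for large n."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))


def prob_at_least(n: int, p: float, t: int, terms: int = 200) -> float:
    """Upper tail P(X >= t) for X ~ Binomial(n, p); the tail terms decay so
    fast when t >> n*p that summing a few hundred of them is plenty."""
    return sum(exp(log_binom_pmf(n, k, p)) for k in range(t, min(n, t + terms) + 1))


n = 100_000   # assumed size of a large photo library
p = 1e-6      # assumed per-image false-match rate
for t in (1, 5, 30):
    print(f"threshold {t:>2}: P(account wrongly flagged) ~ {prob_at_least(n, p, t):.2e}")
```

With those assumed numbers, a single false match across a large library is not especially rare, but the probability of accumulating around 30 independent false matches is vanishingly small; that is the intuition behind the threshold in Apple's claimed error bound.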

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Populus
42 months ago
This is the right decision for Apple to make, in my opinion. I'm glad they recognized that there are better ways to prevent the spread of this type of content.

I'm sincerely surprised Apple backtracked on something as big as this (and under such heavy pressure from governments).
Score: 64 Votes
TheYayAreaLiving 🎗️
42 months ago
Quoting: "Yeah, but now we can't catch the pedophiles"
That's law enforcement's and the government's job, not Apple's.
Score: 63 Votes
TheYayAreaLiving 🎗️
42 months ago
Thank you, Apple. CSAM detection was a joke. "If privacy matters in your life, it should matter to the phone your life is on." Long live!
Score: 44 Votes
Realityck
42 months ago
Quoting the article: "Apple today announced that it has abandoned its plans to detect known CSAM stored in iCloud Photos, according to a statement shared with WIRED (https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/)."
Everybody should be happy CSAM detection is DOA.
Score: 30 Votes
aPple nErd
42 months ago
Great news, but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Score: 27 Votes
aPple nErd
42 months ago
Quoting: "After extensive consultation with experts" (struck through) "After extensive public pressure..."
Quoting: "Never should have succumbed to the public feedback! Perhaps a botched introduction, but no one else would have done it 'right' like Apple because of the scrutiny they are under."
Truly an unhinged take.
Score: 26 Votes