Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
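Apple never published the full implementation, and its actual design used a perceptual hash (NeuralHash) combined with cryptographic blinding so the hash list was unreadable on the device. As a rough, hypothetical illustration of the general idea — hash each image and check membership in a set of known hashes — here is a minimal sketch using an ordinary cryptographic hash as a stand-in:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in only: a cryptographic hash matches byte-identical files.
    # Apple's NeuralHash was a perceptual hash designed to also match
    # visually similar versions of the same image.
    return hashlib.sha256(image_bytes).hexdigest()

# In Apple's design this database would be distributed to devices in an
# "unreadable" (blinded) form; a plain set is used here for illustration.
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def matches_known(image_bytes: bytes) -> bool:
    # The on-device check: does this photo's hash appear in the database?
    return image_hash(image_bytes) in known_hashes

print(matches_known(b"known-image-1"))   # a known image matches
print(matches_known(b"holiday-photo"))   # an ordinary photo does not
```

The point of the on-device design was that the matching happens before upload, so Apple's servers would only learn about matches, not see every photo's hash.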

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
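Requiring a threshold of multiple matches is what drives the account-level error rate so far below the per-image error rate. Apple never published its per-image false-positive rate or the final threshold value, but with hypothetical numbers the effect can be sketched with a Poisson tail calculation:

```python
from math import exp, factorial

# All numbers below are hypothetical, for illustration only.
p = 1e-6          # assumed chance one innocent photo falsely matches
n = 10_000        # photos an account uploads in a year
threshold = 30    # assumed matches required before the account is flagged

lam = n * p       # expected false matches per account per year (0.01)

# Probability of at least `threshold` false matches in a year
# (Poisson tail; terms beyond threshold + 100 are negligible).
prob_flagged = sum(
    exp(-lam) * lam**k / factorial(k)
    for k in range(threshold, threshold + 100)
)

print(f"chance an account is falsely flagged per year: {prob_flagged:.2e}")
```

With these assumed inputs the account-level false-positive probability is astronomically small even though individual images misfire at a rate of one in a million, which is the intuition behind Apple's "one in one trillion" claim.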

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

Populus
41 months ago
This is the right decision for Apple to make, in my opinion. I’m glad they recognized that there are better ways to prevent the spread of this type of content.

I’m sincerely surprised Apple backtracked on something as big as this (and with such a big pressure from the governments).
Score: 64 Votes
TheYayAreaLiving
41 months ago

Yeah, but now we can't catch the pedophiles
That's law enforcement's and the government's job, not Apple's.
Score: 63 Votes
TheYayAreaLiving
41 months ago
Thank you, Apple. CSAM detection was a joke. "If privacy matters in your life, it should matter to the phone your life is on." Long live!

Score: 44 Votes
Realityck
41 months ago

Apple today announced that it has abandoned its plans to detect known CSAM stored in iCloud Photos, according to a statement shared with WIRED (https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/).
Everybody should be happy CSAM is DOA
Score: 30 Votes
aPple nErd
41 months ago
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Score: 27 Votes
aPple nErd
41 months ago

[S]After extensive consultation with experts[/S] After extensive public pressure...

They never should have succumbed to the public feedback! Perhaps a botched introduction, but no one else would have done it 'right' like Apple, because of the scrutiny they are under.
Truly an unhinged take
Score: 26 Votes