Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated]

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.
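At the heart of the dispute is hash-list matching: an image is reduced to a fingerprint, which is compared against a database of fingerprints of known abuse material. The sketch below is a deliberately simplified illustration of that general idea, not Apple's actual system (Apple's NeuralHash uses perceptual hashing, which matches visually similar images, plus cryptographic threshold techniques; the exact hash and blocklist here are hypothetical):

```python
import hashlib

# Hypothetical blocklist of known-bad image fingerprints. A real system
# would hold perceptual hashes supplied by child-safety organizations;
# an exact cryptographic hash is used here only to keep the sketch simple.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (exact SHA-256 for illustration)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_blocklist(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the blocklist."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_blocklist(b"example-known-image-bytes"))  # True
print(matches_blocklist(b"some-other-photo"))           # False
```

Critics' core objection follows directly from this design: whoever controls the contents of the hash list controls what the system flags, which is why researchers worried about the database being expanded beyond CSAM.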

Apple initially attempted to dispel misunderstandings and reassure users by publishing detailed information, FAQs, additional documents, and interviews with company executives.

However, despite Apple's efforts, the controversy didn't go away. Apple eventually went ahead with the rollout of the Communication Safety feature for Messages, which went live earlier this week with the release of iOS 15.2, but it delayed the rollout of CSAM detection following the torrent of criticism that it clearly hadn't anticipated.

Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The above statement was added to Apple's Child Safety page, but it has now been removed, along with all other mentions of CSAM, raising the possibility that Apple has quietly shelved or abandoned the plan altogether. We've reached out to Apple for comment and will update this article if we hear back.

Update: Apple spokesperson Shane Bauer told The Verge that although the CSAM detection feature is no longer mentioned on its website, Apple's plans have not changed since September, meaning CSAM detection is still slated for a future release.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in September.


Top Rated Comments

Jim Lahey
34 months ago
If I ever find out this has been surreptitiously added without my knowledge then I will sell every Apple device I own and never buy another. Anyone who doesn’t have an issue with this has no clue of the concept of mission creep. If these systems are allowed to exist then it’s only a matter of time before the Feds batter your door in for having a [insert future dictator here] meme in your iCloud library. The road to hell is paved with good intentions.
Score: 62 Votes
entropys
34 months ago
Good. Too much potential for abuse (and I am not saying that ironically).
Score: 46 Votes
DaveFlash
34 months ago
Better. This was bound to fail from the start: you'd only need one bad actor feeding Apple's system wrong hashes, and everyone becomes a potential suspect for whatever governmental purpose that bad actor wants to serve, silencing criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.
Score: 31 Votes
DMG35
34 months ago
This was a mess from the beginning and pulling it was the only logical thing to do.
Score: 25 Votes
Solomani
34 months ago
Good. It is only right that Apple should listen to its userbase.
Score: 23 Votes
Agit21
34 months ago
I'm still not going to activate iMessage again or iCloud backup for pictures. Thank you, Tim.
Score: 23 Votes