Apple Outlines Security and Privacy of CSAM Detection System in New Document

Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.

Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has been particularly controversial and has prompted concerns from some security researchers, the non-profit Electronic Frontier Foundation, and others about the system potentially being abused by governments as a form of mass surveillance.

The document aims to address these concerns and reiterates some details that surfaced earlier in an interview with Apple's software engineering chief Craig Federighi, including that Apple expects to set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.
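To make the threshold mechanics concrete, here is a minimal Swift sketch of the 30-match gate. The names used (MatchRecord, shouldFlagForReview, matchThreshold) are hypothetical, not Apple's API, and the real system relies on cryptographic safety vouchers and threshold secret sharing so that no match count is visible to Apple until the threshold is exceeded; this only illustrates the review trigger described above.

```swift
// Hypothetical sketch of the match-threshold gate described above.
// In the real design the server cannot count matches in the clear;
// threshold secret sharing reveals them only once enough shares exist.

struct MatchRecord {
    let accountID: String
    let matchedImageCount: Int
}

let matchThreshold = 30  // initial threshold Apple described

func shouldFlagForReview(_ record: MatchRecord) -> Bool {
    // An account is surfaced for manual review only once the number of
    // positive matches against the known-CSAM hash database reaches the threshold.
    return record.matchedImageCount >= matchThreshold
}

let example = MatchRecord(accountID: "example-account", matchedImageCount: 12)
print(shouldFlagForReview(example))  // false: below the 30-match threshold
```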

Apple also said that the on-device database of known CSAM images contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government.

The document states: "The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised. This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports."
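As an illustration of that sourcing requirement, the following Swift sketch keeps only the hashes that appear in submissions from at least two organizations operating under different governments. The types and names here are hypothetical; Apple's actual pipeline operates on blinded NeuralHash values rather than plain strings.

```swift
// Illustrative sketch of the "two or more organizations, separate
// jurisdictions" rule. Types and names are hypothetical.

struct HashSubmission {
    let organization: String
    let jurisdiction: String
    let hashes: Set<String>   // perceptual hashes, represented here as hex strings
}

/// Keep only hashes submitted independently by organizations
/// operating under at least two different governments.
func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsPerHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}

let submissions = [
    HashSubmission(organization: "OrgA", jurisdiction: "US", hashes: ["aa11", "bb22"]),
    HashSubmission(organization: "OrgB", jurisdiction: "EU", hashes: ["bb22", "cc33"]),
]
print(buildOnDeviceDatabase(from: submissions))  // ["bb22"] — the only hash present in both jurisdictions
```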

Apple added that it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, Apple said users will be able to inspect the root hash of the encrypted database present on their device and compare it to the expected root hash in the support document. No timeframe was provided for when this verification option will be available.
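As a rough illustration of that verification step, the sketch below models the root hash as a SHA-256 digest over the database bytes and compares it to a published value. The actual construction of Apple's root hash has not been published, and the function names here are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of the root-hash comparison described above. The exact
// construction of Apple's root hash is not public; modeling it as a SHA-256
// digest over the encrypted database bytes is an assumption.

func rootHash(ofDatabase data: Data) -> String {
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Compare the hash computed from the on-device database against the value
/// Apple publishes in its support document.
func databaseMatchesPublishedRoot(onDeviceDatabase: Data, publishedRootHash: String) -> Bool {
    return rootHash(ofDatabase: onDeviceDatabase) == publishedRootHash
}

// Example with placeholder bytes standing in for the encrypted database.
let fakeDatabase = Data("placeholder database contents".utf8)
let published = rootHash(ofDatabase: fakeDatabase)  // pretend this came from the support document
print(databaseMatchesPublishedRoot(onDeviceDatabase: fakeDatabase, publishedRootHash: published))  // true
```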

In a memo obtained by Bloomberg's Mark Gurman, Apple said it will have an independent auditor review the system as well. The memo noted that Apple retail employees may be getting questions from customers about the child safety features and linked to a FAQ that Apple shared earlier this week as a resource the employees can use to address the questions and provide more clarity and transparency to customers.

Apple initially said the new child safety features would be coming to the iPhone, iPad, and Mac with software updates later this year, and the company said the features would be available in the U.S. only at launch. Despite facing criticism, Apple today said it has not made any changes to this timeframe for rolling out the features to users.

Top Rated Comments

fwmireault
59 months ago
It's funny how Apple deeply believes that we just don't understand the feature. I understand the hash-matching process, and I'm against it. Not because of the feature itself (which could be way more intrusive than that) but because of the risk of abuse of that backdoor.
Score: 129 Votes
Khedron
59 months ago
How many press releases and FAQs do we need to polish this turd?

Apple designed a system so that an external authority can gain control of your phone to scan your private files and report the results to the police. End of.
Score: 95 Votes
TheYayAreaLiving
59 months ago
Good try, but give it up, Apple. Shut this down already.

This is the end of an era for privacy!

This is literally Apple right now! (image attachment)
Score: 83 Votes
So@So@So
59 months ago
Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

Since it is on the device, it looks like a first step; the second step could be a neural network detecting new images (taken with the camera).

It's just unacceptable – I won't update software or hardware.
Score: 78 Votes
Sciomar
59 months ago
They can educate everyone as much as possible, but I think the social court has already made its emotional ruling.
Score: 69 Votes
haunebu
59 months ago
No thanks, Apple.
Score: 55 Votes