Apple Outlines Security and Privacy of CSAM Detection System in New Document

Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.

Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has been particularly controversial and has prompted concerns from some security researchers, the non-profit Electronic Frontier Foundation, and others about the system potentially being abused by governments as a form of mass surveillance.

The document aims to address these concerns and reiterates some details that surfaced earlier in an interview with Apple's software engineering chief Craig Federighi, including that Apple expects to set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.
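For illustration, the reported 30-match policy amounts to a simple threshold check like the sketch below. The type and function names are hypothetical, and Apple's actual system enforces the threshold cryptographically rather than with a plain counter.

```swift
// Hypothetical sketch of the reported review policy, not Apple's code.
// In Apple's design the threshold is enforced cryptographically, so nothing
// about matches is learnable before it is crossed; this only models the rule.
struct AccountMatchState {
    var matchedImageCount: Int = 0      // matches against the known CSAM hash database
    static let reviewThreshold = 30     // initial threshold Apple says it expects to use
}

func requiresManualReview(_ state: AccountMatchState) -> Bool {
    state.matchedImageCount >= AccountMatchState.reviewThreshold
}
```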

Apple also said that the on-device database of known CSAM image hashes contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government.

The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised. This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports.
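A minimal sketch of that two-jurisdiction inclusion rule is below. The types, names, and hex-string representation of perceptual hashes are assumptions made for illustration, not Apple's actual code or data format.

```swift
// Illustrative only: include a hash in the on-device database solely when it
// was submitted by organizations in at least two distinct sovereign jurisdictions.
struct HashSubmission {
    let organization: String
    let jurisdiction: String
    let hashes: Set<String>     // perceptual hashes, hex-encoded here for simplicity
}

func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsPerHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    // A hash qualifies only if it was submitted from two or more distinct jurisdictions.
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}
```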

Apple added that it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, Apple said users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the support document. No timeframe was provided for this.
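In practice, that user-side check could look something like the sketch below. Apple has not said exactly how the root hash is computed, so a plain SHA-256 over the database file stands in here and the function name is hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical verification step: hash the on-device encrypted CSAM database
// and compare the result with the root hash Apple publishes in its support
// document. Apple's real construction may differ (e.g. a Merkle-style root).
func databaseMatchesPublishedRoot(databaseURL: URL, publishedRootHex: String) throws -> Bool {
    let databaseBytes = try Data(contentsOf: databaseURL)
    let digest = SHA256.hash(data: databaseBytes)
    let computedHex = digest.map { String(format: "%02x", $0) }.joined()
    return computedHex == publishedRootHex.lowercased()
}
```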

In a memo obtained by Bloomberg's Mark Gurman, Apple said it will have an independent auditor review the system as well. The memo noted that Apple retail employees may be getting questions from customers about the child safety features and linked to a FAQ that Apple shared earlier this week as a resource the employees can use to address the questions and provide more clarity and transparency to customers.

Apple initially said the new child safety features would be coming to the iPhone, iPad, and Mac with software updates later this year, and the company said the features would be available in the U.S. only at launch. Despite facing criticism, Apple today said it has not made any changes to this timeframe for rolling out the features to users.

Top Rated Comments

fwmireault
41 months ago
It's funny how Apple deeply believes that we just don't understand the feature. I understand the hash-matching process, and I'm against it. Not because of the feature itself (which could be way more intrusive than it is) but because of the risk of abuse of that backdoor.
Score: 129 Votes
Khedron
41 months ago
How many press releases and FAQs do we need to polish this turd?

Apple designed a system so that an external authority can gain control of your phone to scan your private files and report the results to the police. End of.
Score: 95 Votes
TheYayAreaLiving
41 months ago
Good try, but give it up, Apple. Shut this down already.

This is the end of an era for privacy!!!

This is literally Apple right now!
Score: 83 Votes
So@So@So
41 months ago
Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

Since it is on the device, it looks like a first step; the second step could be a neural network detecting new images (taken with the camera).

It's just unacceptable – I won't update software or hardware.
Score: 78 Votes
Sciomar
41 months ago
They can educate everyone as much as possible, but I think the court of public opinion has already made its emotional ruling.
Score: 69 Votes
haunebu
41 months ago
No thanks, Apple.
Score: 55 Votes