Apple Outlines Security and Privacy of CSAM Detection System in New Document

Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.

Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has been particularly controversial and has prompted concerns from some security researchers, the non-profit Electronic Frontier Foundation, and others about the system potentially being abused by governments as a form of mass surveillance.

The document aims to address these concerns and reiterates some details that surfaced earlier in an interview with Apple's software engineering chief Craig Federighi, including that Apple expects to set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.
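
Apple's technical summary describes the matching as happening inside a cryptographic protocol (private set intersection combined with threshold secret sharing), so nothing is learned about individual matches until an account crosses the threshold. Setting that machinery aside, the threshold rule itself is simple; the sketch below is an illustrative simplification in Swift, with hypothetical names and plain string hashes standing in for the actual protocol:

```swift
// Illustrative simplification: in the real system the comparison runs inside
// a cryptographic protocol, so neither the device nor Apple sees per-image
// match results below the threshold. Names here are hypothetical.

let matchThreshold = 30  // initial threshold described by Apple

func accountExceedsThreshold(uploadedImageHashes: [String],
                             knownCSAMHashes: Set<String>) -> Bool {
    // Count uploaded images whose perceptual hash appears in the known
    // database, then apply the 30-match threshold that gates human review.
    let matchCount = uploadedImageHashes.filter { knownCSAMHashes.contains($0) }.count
    return matchCount >= matchThreshold
}
```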

Apple also said that the on-device database of known CSAM images contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government.

The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised. This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports.
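
Apple has not published the code that builds the database, but the inclusion rule described above is concrete: a perceptual hash ships on-device only if it was submitted independently by organizations in at least two separate sovereign jurisdictions. A minimal sketch of that rule in Swift, with hypothetical types and field names:

```swift
// Hypothetical sketch of the "two or more jurisdictions" inclusion rule.
// The types, field names, and data layout are invented for illustration.

struct Submission {
    let organization: String
    let jurisdiction: String
    let hashes: Set<String>   // perceptual image hashes submitted by this org
}

func eligibleDatabaseEntries(from submissions: [Submission]) -> Set<String> {
    // Map each hash to the set of distinct jurisdictions that submitted it.
    var jurisdictionsByHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsByHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    // Keep only hashes vouched for by organizations in at least two
    // separate sovereign jurisdictions.
    let eligible = jurisdictionsByHash.filter { $0.value.count >= 2 }.map { $0.key }
    return Set(eligible)
}
```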

Apple added that it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, Apple said users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the support document. No timeframe was provided for this.
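
Apple has not documented the exact digest or file format behind that root hash, so the following is only a sketch of the comparison a user or auditor would perform, using SHA-256 over a hypothetical on-device database blob as a stand-in for whatever digest Apple actually publishes:

```swift
import CryptoKit
import Foundation

// Hypothetical verification sketch: the real digest and database layout are
// not public, so SHA-256 over the database file is used as a stand-in.

func rootHashMatches(databaseURL: URL, publishedRootHashHex: String) throws -> Bool {
    // Read the encrypted CSAM hash database as shipped with the OS image.
    let databaseBlob = try Data(contentsOf: databaseURL)

    // Compute a digest of the blob and render it as lowercase hex.
    let digest = SHA256.hash(data: databaseBlob)
    let computedHex = digest.map { String(format: "%02x", $0) }.joined()

    // The published value would come from Apple's support document.
    return computedHex == publishedRootHashHex.lowercased()
}
```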

In a memo obtained by Bloomberg's Mark Gurman, Apple said it will have an independent auditor review the system as well. The memo noted that Apple retail employees may be getting questions from customers about the child safety features and linked to a FAQ that Apple shared earlier this week as a resource the employees can use to address the questions and provide more clarity and transparency to customers.

Apple initially said the new child safety features would be coming to the iPhone, iPad, and Mac with software updates later this year, and the company said the features would be available in the U.S. only at launch. Despite facing criticism, Apple today said it has not made any changes to this timeframe for rolling out the features to users.

Top Rated Comments

fwmireault
37 months ago
It's funny how Apple deeply believes that we just don't understand the feature. I understand the hash-matching process, and I'm against it. Not because of the feature itself (which could be way more intrusive than that), but because of the risk of abuse of that backdoor.
Score: 129 Votes
Khedron
37 months ago
How many press releases and FAQs do we need to polish this turd?

Apple designed a system so that an external authority can gain control of your phone to scan your private files and report the results to the police. End of.
Score: 95 Votes
TheYayAreaLiving
37 months ago
Good try, but give it up, Apple. Shut this down already.

This is the end of an era for privacy!!!

This is literally Apple right now!
Score: 83 Votes
So@So@So
37 months ago
Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

Since it is on the device, it looks like a first step; the second step could be a neural network detecting new images (taken with the camera).

It's just unacceptable – I won't update software or hardware.
Score: 78 Votes
Sciomar
37 months ago
They can educate everyone as much as possible but I think the social court has already made its emotional ruling.
Score: 69 Votes
haunebu
37 months ago
No thanks, Apple.
Score: 55 Votes