Apple Outlines Security and Privacy of CSAM Detection System in New Document

Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.

Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has been particularly controversial and has prompted concerns from some security researchers, the non-profit Electronic Frontier Foundation, and others about the system potentially being abused by governments as a form of mass surveillance.

The document aims to address these concerns and reiterates some details that surfaced earlier in an interview with Apple's software engineering chief Craig Federighi, including that Apple expects to set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.
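
For readers who want a concrete picture of the threshold, here is a minimal Swift sketch of the counting logic only. It is purely illustrative: in Apple's design the threshold is enforced cryptographically on the server (via threshold secret sharing) rather than by a plain counter, and the type and function names below are invented for this example.

```swift
// Minimal sketch of the reported 30-match threshold. Purely illustrative:
// the real system enforces this cryptographically (threshold secret
// sharing), not with a plain counter, and these names are invented.
struct MatchThresholdCheck {
    static let reportedThreshold = 30  // initial threshold reported by Apple

    /// An account becomes eligible for manual human review only once the
    /// number of positive matches meets or exceeds the threshold.
    static func isEligibleForReview(matchCount: Int) -> Bool {
        matchCount >= reportedThreshold
    }
}

print(MatchThresholdCheck.isEligibleForReview(matchCount: 29)) // false
print(MatchThresholdCheck.isEligibleForReview(matchCount: 30)) // true
```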

Apple also said that the on-device database of known CSAM images contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government.

The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised. This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports.
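
As a rough sketch of that "two or more independent sources" requirement, the hypothetical Swift below keeps only hashes submitted by organizations operating in at least two different sovereign jurisdictions. The types and placeholder strings are invented for illustration; the actual pipeline works on perceptual hash values, and the resulting database is distributed only in encrypted form.

```swift
// Hypothetical sketch of the inclusion rule: a hash goes into the on-device
// database only if it was submitted by organizations operating in at least
// two different sovereign jurisdictions. Types and names are invented.
struct HashSubmission {
    let organization: String
    let jurisdiction: String
    let hashes: Set<String>   // placeholder for perceptual hash values
}

func eligibleHashes(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsPerHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    // Requiring two distinct jurisdictions means no single government, or
    // colluding set of organizations under one government, can force an
    // entry into the database on its own.
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}
```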

Apple added that it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, Apple said users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the support document. No timeframe was provided for this.
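
Apple has not said exactly how that root hash is constructed, but the check a user could perform amounts to a digest comparison. The Swift sketch below assumes, purely for illustration, a flat SHA-256 digest over the encrypted database blob rather than whatever structure Apple actually uses, and the function names are hypothetical.

```swift
import CryptoKit
import Foundation

// Illustrative only: the actual root-hash construction has not been
// published, so this sketch simply assumes a SHA-256 digest of the
// encrypted database blob.
func rootHashHex(of encryptedDatabase: Data) -> String {
    SHA256.hash(data: encryptedDatabase)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Compare the hash computed from the on-device database against the root
/// hash Apple says it will publish in a support document.
func matchesPublishedRootHash(deviceDatabase: Data, published: String) -> Bool {
    rootHashHex(of: deviceDatabase) == published.lowercased()
}
```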

In a memo obtained by Bloomberg's Mark Gurman, Apple said it will have an independent auditor review the system as well. The memo noted that Apple retail employees may be getting questions from customers about the child safety features, and it pointed them to the FAQ Apple shared earlier this week as a resource for answering those questions and giving customers more clarity and transparency.

Apple initially said the new child safety features would come to the iPhone, iPad, and Mac with software updates later this year, and that they would be available in the U.S. only at launch. Despite the criticism, Apple said today that it has not changed that rollout timeframe.

Top Rated Comments

fwmireault
37 months ago
It’s funny how Apple deeply believes that we just don’t understand the feature. I understand the hash-matching process, and I’m against it. Not because of the feature itself (which could be way more intrusive than that), but because of the risk of abuse of that backdoor.
Score: 129 Votes
Khedron
37 months ago
How many press releases and FAQs do we need to polish this turd?

Apple designed a system so that an external authority can gain control of your phone to scan your private files and report the results to the police. End of.
Score: 95 Votes
TheYayAreaLiving
37 months ago
Good try, but give it up, Apple. Shut this down already.

This is the end of an era for privacy!!!

This is literally Apple right now!
Score: 83 Votes
So@So@So
37 months ago
Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

Since it is on the device, it looks like a first step; the second step could be a neural network detecting new images (taken with the camera).

It's just unacceptable – I won't update software or hardware.
Score: 78 Votes
Sciomar
37 months ago
They can educate everyone as much as possible but I think the social court has already made its emotional ruling.
Score: 69 Votes
haunebu
37 months ago
No thanks, Apple.
Score: 55 Votes