Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Apple has published an FAQ titled "Expanded Protections for Children" that aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features the company announced last week.

"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions," reads the FAQ. "This document serves to address these questions and provide more clarity and transparency in the process."

Some discussions have blurred the distinction between the two features, and Apple takes great pains in the document to differentiate them, explaining that communication safety in Messages "only works on images sent or received in the Messages app for child accounts set up in Family Sharing," while CSAM detection in iCloud Photos "only impacts users who have chosen to use iCloud Photos to store their photos… There is no impact to any other on-device data."

From the FAQ:

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
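To make the distinction concrete, here is a minimal Swift sketch of how an on-device communication safety decision along the lines Apple describes might be structured. Every type, name, and threshold below is an illustrative assumption rather than Apple's implementation; per the FAQ, the real feature runs a machine-learning classifier entirely on-device, and neither the image nor the verdict is sent to Apple or law enforcement.

```swift
import Foundation

/// Hypothetical on-device verdict for an image in Messages.
/// All names are illustrative; Apple has not published this API.
enum ImageVerdict {
    case allowed                       // image shown normally
    case blurred(notifyParents: Bool)  // image blurred and the child warned
}

struct CommunicationSafetyPolicy {
    let isChildAccount: Bool   // account set up via Family Sharing
    let isYoungChild: Bool     // age band where parents may opt in to alerts
    let parentsOptedIn: Bool

    /// Decide what to do with an image given an on-device classifier score.
    /// A real system would get `explicitScore` from an ML model running
    /// locally, so no image data ever leaves the device.
    func evaluate(explicitScore: Double, threshold: Double = 0.9) -> ImageVerdict {
        guard isChildAccount, explicitScore >= threshold else {
            return .allowed
        }
        // Per the FAQ, parents are notified only for young children, only if
        // they opted in, and only after the child is warned and still chooses
        // to view or send the photo.
        return .blurred(notifyParents: isYoungChild && parentsOptedIn)
    }
}

// Example: a young child's account with parental notifications enabled.
let policy = CommunicationSafetyPolicy(isChildAccount: true,
                                       isYoungChild: true,
                                       parentsOptedIn: true)
switch policy.evaluate(explicitScore: 0.97) {
case .allowed:
    print("Image displayed normally")
case .blurred(let notifyParents):
    print("Image blurred; parents alerted on viewing: \(notifyParents)")
}
```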

The rest of the document is split into three sections (headed as below), with answers to the following commonly asked questions:

Communication safety in Messages

  • Who can use communication safety in Messages?
  • Does this mean Messages will share information with Apple or law enforcement?
  • Does this break end-to-end encryption in Messages?
  • Does this feature prevent children in abusive homes from seeking help?
  • Will parents be notified without children being warned and given a choice?

CSAM detection

  • Does this mean Apple is going to scan all the photos stored on my iPhone?
  • Will this download CSAM images to my iPhone to compare against my photos?
  • Why is Apple doing this now?

Security for CSAM detection for iCloud Photos

  • Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
  • Could governments force Apple to add non-CSAM images to the hash list?
  • Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
  • Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?

Interested readers should consult the document for Apple's full responses to these questions. It's worth noting, however, that for the questions that can be answered with a simple yes or no, Apple begins every answer with "No," with the exception of the following three from the section titled "Security for CSAM detection for iCloud Photos":

Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.
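The matching step described here is conceptually a set-membership test against a fixed database of hashes, not a classifier that interprets photo content. The simplified Swift sketch below illustrates that idea; the hash function, loader, and names are assumptions for illustration only. Apple's actual system pairs its NeuralHash perceptual hash with cryptographic blinding and a private set intersection protocol, so the device itself never learns whether an individual photo matched.

```swift
import Foundation
import CryptoKit  // Apple platforms; SHA-256 stands in for the real perceptual hash

/// Illustrative stand-in for a perceptual hash. Apple's system uses NeuralHash,
/// which maps visually similar images to the same hash; SHA-256 is used here
/// only to keep the sketch self-contained.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Placeholder for the database that, per the FAQ, is built solely from hashes
/// provided by NCMEC and other child safety organizations and ships inside the OS.
func loadKnownHashDatabase() -> Set<String> {
    []  // empty placeholder; the real set is baked into the OS image
}

let knownHashes = loadKnownHashDatabase()

/// Runs only for photos being uploaded to iCloud Photos; only membership in
/// the known set is tested, so non-matching photos reveal nothing.
func matchesKnownCSAM(_ photo: Data) -> Bool {
    knownHashes.contains(imageHash(photo))
}

// Example: check a photo just before upload.
let photo = Data("example image bytes".utf8)
print(matchesKnownCSAM(photo) ? "Match against known-hash set"
                              : "No match; nothing is learned about this photo")
```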

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
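These last two answers lean on the same pair of safeguards: a match threshold that must be crossed before anything is surfaced, and human review before any report is filed. The rough Swift sketch below models that gating logic; the names and the threshold value are invented for illustration, since the FAQ does not state Apple's actual parameters.

```swift
import Foundation

/// Hypothetical per-account matching state; the threshold value is assumed
/// for illustration and is not published in this FAQ.
struct AccountMatchState {
    private(set) var matchCount = 0
    let reviewThreshold = 30

    mutating func recordMatch() { matchCount += 1 }

    var meetsThreshold: Bool { matchCount >= reviewThreshold }
}

/// Per the FAQ there is no automated reporting: even past the threshold, a
/// human reviewer must confirm the flagged images are known CSAM before a
/// report goes to NCMEC; a false positive yields no report and no disabled account.
func resolve(_ state: AccountMatchState, reviewerConfirmsCSAM: Bool) -> String {
    guard state.meetsThreshold else {
        return "Below threshold: nothing is learned about this account's photos"
    }
    return reviewerConfirmsCSAM
        ? "Human-reviewed report filed with NCMEC"
        : "False positive: no report filed, account not disabled"
}

// Example: a handful of accidental matches never reaches human review.
var state = AccountMatchState()
for _ in 0..<3 { state.recordMatch() }
print(resolve(state, reviewerConfirmsCSAM: false))
```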

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, and others for its decision to deploy the technology with the release of iOS 15 and iPadOS 15, expected in September.

This has resulted in an open letter criticizing Apple's plan to scan iPhones for CSAM in iCloud Photos and explicit images in children's messages, which has gained over 5,500 signatures at the time of writing. Apple has also received criticism from Facebook-owned WhatsApp, whose chief Will Cathcart called it "the wrong approach and a setback for people's privacy all over the world." Epic Games CEO Tim Sweeney also attacked the decision, claiming he'd "tried hard" to see the move from Apple's point of view, but had concluded that, "inescapably, this is government spyware installed by Apple based on a presumption of guilt."

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

Top Rated Comments

Hanterdro (15 months ago):
In 5 years: "It is the law that we have to scan for government-critical images. Apple only follows regional laws."
Score: 102 Votes

Feyl (15 months ago):
I'm sorry, Apple, but you are not trustworthy. You and your comrades from big tech are evil.
Score: 87 Votes

betterbegood (15 months ago):
All three FAQ questions could/should actually be answered with:

"Practically speaking yes, and if we were forced to do so by a government entity you wouldn't know."

This is the problem.
Score: 59 Votes

jshannon01 (15 months ago):
This CSAM upgrade is the only one you will hear about. When it starts scanning for other things, you won't know and will have no way of finding out. The timing of it in this era of censorship is suspicious.
Score: 54 Votes

Luis Ortega (15 months ago), replying to Feyl:
"I'm sorry, Apple, but you are not trustworthy. You and your comrades from big tech are evil."
It has become like the boy who cried wolf. Nobody really believes Apple, or anyone else, is even capable of protecting users from government snooping. The more Cook grouses about privacy, the less I believe him and the more he sounds like a lying hypocrite.
Score: 46 Votes

entropys (15 months ago), replying to an earlier comment:
"People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud (https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html). That's just one example."
So? One of the reasons we use Apple is that it had a modicum of respect for privacy. Those companies don't respect our privacy.
Score: 45 Votes

