Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Apple has published an FAQ titled "Expanded Protections for Children," which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features that the company announced last week.

"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions," reads the FAQ. "This document serves to address these questions and provide more clarity and transparency in the process."

Some discussions have blurred the distinction between the two features, and Apple takes great pains in the document to differentiate them, explaining that communication safety in Messages "only works on images sent or received in the Messages app for child accounts set up in Family Sharing," while CSAM detection in iCloud Photos "only impacts users who have chosen to use iCloud Photos to store their photos… There is no impact to any other on-device data."

From the FAQ:

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
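Stripped of the cryptography, the process the FAQ describes has three properties: photos are compared only against a fixed set of known image hashes, an account is surfaced only after some threshold of matches, and human review happens before any report to NCMEC. The following is a minimal sketch of that control flow, not Apple's implementation: the real system uses a perceptual NeuralHash and private set intersection rather than SHA-256, and the actual match threshold is undisclosed, so the hash function, the placeholder hash value, and `MATCH_THRESHOLD` below are all invented for illustration.

```python
import hashlib

# Illustrative only: Apple's system uses a perceptual hash (NeuralHash)
# and private set intersection; SHA-256 over raw bytes stands in here
# purely to show the control flow the FAQ describes.

# In Apple's design this set is supplied by NCMEC and other child safety
# organizations; the value below is just the SHA-256 of an empty byte
# string, used as a placeholder.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

# The real threshold is undisclosed; 3 is an invented value.
MATCH_THRESHOLD = 3

def photo_hash(data: bytes) -> str:
    """Hash a photo's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def count_matches(photos: list[bytes]) -> int:
    """Count how many photos match the known-hash set."""
    return sum(1 for p in photos if photo_hash(p) in KNOWN_HASHES)

def should_escalate_for_human_review(photos: list[bytes]) -> bool:
    # Per the FAQ: no automated reporting to law enforcement; accounts
    # crossing the threshold go to human review before any NCMEC report.
    return count_matches(photos) >= MATCH_THRESHOLD
```

Note that an account below the threshold triggers nothing at all in this sketch, mirroring the FAQ's claim that non-matching photos reveal no information to Apple.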

The rest of the document is split into three sections, each answering a set of commonly asked questions:

Communication safety in Messages
  • Who can use communication safety in Messages?
  • Does this mean Messages will share information with Apple or law enforcement?
  • Does this break end-to-end encryption in Messages?
  • Does this feature prevent children in abusive homes from seeking help?
  • Will parents be notified without children being warned and given a choice?

CSAM detection
  • Does this mean Apple is going to scan all the photos stored on my iPhone?
  • Will this download CSAM images to my iPhone to compare against my photos?
  • Why is Apple doing this now?

Security for CSAM detection for iCloud Photos
  • Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
  • Could governments force Apple to add non-CSAM images to the hash list?
  • Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
  • Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?

Interested readers should consult the document for Apple's full responses. It is worth noting, however, that of the questions which can be answered with a binary yes/no, Apple begins every answer with "No" except for the following three, all from the section titled "Security for CSAM detection for iCloud Photos":

Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, and others for its decision to deploy the technology with the release of iOS 15 and iPadOS 15, expected in September.

This has resulted in an open letter criticizing Apple's plan to scan iPhones for CSAM in iCloud Photos and explicit images in children's messages, which had gained over 5,500 signatures at the time of writing. Apple has also received criticism from Facebook-owned WhatsApp, whose chief Will Cathcart called it "the wrong approach and a setback for people's privacy all over the world." Epic Games CEO Tim Sweeney also attacked the decision, claiming he'd "tried hard" to see the move from Apple's point of view, but had concluded that, "inescapably, this is government spyware installed by Apple based on a presumption of guilt."

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

Top Rated Comments

Hanterdro, 12 months ago:
In 5 years: "It is the law that we have to scan for government-critical images. Apple only follows regional laws."
Score: 102 votes
Feyl, 12 months ago:
I'm sorry, Apple, but you are not trustworthy. You and your comrades from big tech are evil.
Score: 87 votes
betterbegood, 12 months ago:
All three FAQ questions could/should actually be answered with:

"Practically speaking yes, and if we were forced to do so by a government entity you wouldn't know."

This is the problem.
Score: 59 votes
jshannon01, 12 months ago:
This CSAM upgrade is the only one you will hear about. When it starts scanning for other things you won't know and will have no way of finding out. The timing of it in this era of censorship is suspicious.
Score: 54 votes
Luis Ortega, 12 months ago, replying to Feyl:
"I'm sorry, Apple, but you are not trustworthy. You and your comrades from big tech are evil."
It has become like the boy who cried wolf. Nobody really believes Apple or anyone else is even capable of protecting users from government snooping.
The more Cook grouses about privacy, the less I believe him and the more he sounds like a lying hypocrite.
Score: 46 votes
entropys, 12 months ago, replying to another commenter:
"People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:
https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html
That's just one example."
So? One of the reasons we use Apple is that it had a modicum of respect for privacy. Those companies don't respect our privacy.
Score: 45 votes
