Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Apple has published an FAQ titled "Expanded Protections for Children" that aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features the company announced last week.

"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions," reads the FAQ. "This document serves to address these questions and provide more clarity and transparency in the process."

Some discussions have blurred the distinction between the two features, and Apple takes great pains in the document to differentiate them, explaining that communication safety in Messages "only works on images sent or received in the Messages app for child accounts set up in Family Sharing," while CSAM detection in iCloud Photos "only impacts users who have chosen to use iCloud Photos to store their photos… There is no impact to any other on-device data."

From the FAQ:

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
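The matching mechanism Apple describes, comparing on-device hashes of photos destined for iCloud against a fixed database of known-CSAM hashes, can be sketched roughly as follows. This is a simplified illustration, not Apple's implementation: the real system uses the NeuralHash perceptual hash and a cryptographic private-set-intersection protocol rather than a plain set lookup, and the function names and threshold value here are our placeholders.

```python
import hashlib

# Illustrative sketch only. Apple's actual system uses NeuralHash (a
# perceptual hash that tolerates resizing/recompression) plus blinded
# hashes; SHA-256 and a plain set lookup stand in for the matching step.

KNOWN_CSAM_HASHES: set = set()  # fixed database shipped in the OS, per the FAQ


def image_hash(image_bytes: bytes) -> str:
    """Placeholder for a perceptual hash such as NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_flag_for_review(photos: list, threshold: int = 30) -> bool:
    """Count matches against the known-hash set; only a photo library
    exceeding the match threshold is surfaced for Apple's human review.
    The threshold of 30 is illustrative, not taken from the FAQ."""
    matches = sum(1 for p in photos if image_hash(p) in KNOWN_CSAM_HASHES)
    return matches >= threshold
```

The property the FAQ leans on is that only membership in the fixed known-hash set, never the content of non-matching photos, determines the outcome.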

The rest of the document is split into three sections (shown as headers below), with answers to the following commonly asked questions:

Communication safety in Messages
  • Who can use communication safety in Messages?
  • Does this mean Messages will share information with Apple or law enforcement?
  • Does this break end-to-end encryption in Messages?
  • Does this feature prevent children in abusive homes from seeking help?
  • Will parents be notified without children being warned and given a choice?

CSAM detection
  • Does this mean Apple is going to scan all the photos stored on my iPhone?
  • Will this download CSAM images to my iPhone to compare against my photos?
  • Why is Apple doing this now?

Security for CSAM detection for iCloud Photos
  • Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
  • Could governments force Apple to add non-CSAM images to the hash list?
  • Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
  • Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?

Interested readers should consult the document for Apple's full responses to these questions. It's worth noting, however, that of the questions that can be answered with a binary yes or no, Apple begins every answer with "No" except for the following three, all from the section titled "Security for CSAM detection for iCloud Photos":

Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
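The human-review gate that all three answers above rely on can be sketched as follows. This is a hypothetical illustration of the FAQ's claim that there is no automated reporting; the function and names are ours, not Apple's.

```python
def process_flagged_account(flagged_images, reviewer_confirms_csam):
    """Sketch of the review gate the FAQ describes: images flagged by the
    hash-matching system are checked by a human before any report is made.

    reviewer_confirms_csam: a callable standing in for Apple's manual
    review step. Returns the action taken for the account.
    """
    confirmed = [img for img in flagged_images if reviewer_confirms_csam(img)]
    if not confirmed:
        # Per the FAQ: for false positives, the account is not disabled
        # and no report is filed to NCMEC.
        return "no_action"
    # Only reviewer-confirmed matches result in a report to NCMEC.
    return "report_to_ncmec"
```

The design point is that hash matches alone never trigger a report; a human decision sits between the matching system and law enforcement.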

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, and others for its decision to deploy the technology with the release of iOS 15 and iPadOS 15, expected in September.

This has resulted in an open letter criticizing Apple's plan to scan iPhones for CSAM in ‌iCloud Photos‌ and explicit images in children's messages, which has gained over 5,500 signatures as of writing. Apple has also received criticism from Facebook-owned WhatsApp, whose chief Will Cathcart called it "the wrong approach and a setback for people's privacy all over the world." Epic Games CEO Tim Sweeney also attacked the decision, claiming he'd "tried hard" to see the move from Apple's point of view, but had concluded that, "inescapably, this is government spyware installed by Apple based on a presumption of guilt."

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

Top Rated Comments

Hanterdro · 24 weeks ago
In 5 years: "It is the law that we have to scan for government-critical images. Apple only follows regional laws."
Score: 102 Votes
Feyl · 24 weeks ago
I'm sorry Apple but you are not trustworthy. You and your comrades from the big tech are evil.
Score: 87 Votes
betterbegood · 24 weeks ago
All three FAQ questions could/should actually be answered with:

"Practically speaking yes, and if we were forced to do so by a government entity you wouldn't know."

This is the problem.
Score: 59 Votes
jshannon01 · 24 weeks ago
This CSAM upgrade is the only one you will hear about. When it starts scanning for other things you won't know and will have no way of finding out. The timing of it in this era of censorship is suspicious.
Score: 54 Votes
Luis Ortega · 24 weeks ago
(quoting Feyl) "I'm sorry Apple but you are not trustworthy. You and your comrades from the big tech are evil."
It has become like the boy who cried wolf. Nobody really believes Apple, or anyone, is even capable of protecting users from government snooping. The more Cook grouses about privacy, the less I believe him and the more he sounds like a lying hypocrite.
Score: 46 Votes
entropys · 24 weeks ago
(quoting another commenter) "People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud: https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html
That's just one example."
So? One of the reasons we use Apple is that it had a modicum of respect for privacy. Those companies don't respect our privacy.
Score: 45 Votes
