Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Apple has published a FAQ titled "Expanded Protections for Children" that aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety in Messages features the company announced last week.

"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions," reads the FAQ. "This document serves to address these questions and provide more clarity and transparency in the process."

Some discussions have blurred the distinction between the two features, and Apple takes great pains in the document to differentiate them, explaining that communication safety in Messages "only works on images sent or received in the Messages app for child accounts set up in Family Sharing," while CSAM detection in iCloud Photos "only impacts users who have chosen to use iCloud Photos to store their photos… There is no impact to any other on-device data."

From the FAQ:

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.

The rest of the document is split into three sections, each answering commonly asked questions:

Communication safety in Messages
  • Who can use communication safety in Messages?
  • Does this mean Messages will share information with Apple or law enforcement?
  • Does this break end-to-end encryption in Messages?
  • Does this feature prevent children in abusive homes from seeking help?
  • Will parents be notified without children being warned and given a choice?

CSAM detection
  • Does this mean Apple is going to scan all the photos stored on my iPhone?
  • Will this download CSAM images to my iPhone to compare against my photos?
  • Why is Apple doing this now?

Security for CSAM detection for iCloud Photos
  • Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
  • Could governments force Apple to add non-CSAM images to the hash list?
  • Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
  • Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?

Interested readers should consult the document for Apple's full responses to these questions. It's worth noting, however, that of the questions that can be answered with a binary yes/no, Apple begins every response with "No" except for the following three, all from the section titled "Security for CSAM detection for iCloud Photos":

Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
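To make the design Apple describes above concrete, here is a deliberately simplified Python sketch of threshold-based matching against a fixed set of known hashes. It is illustrative only: the hash values, threshold, and function names are hypothetical, and Apple's actual system uses a perceptual hash (NeuralHash) combined with cryptographic private set intersection and threshold secret sharing, not plain string comparison on readable hashes.

```python
# Illustrative sketch only. All names and values are hypothetical and do
# not reflect Apple's real implementation or parameters.

# A fixed set of hashes of known, previously validated images; in Apple's
# design this set ships identically in the OS for every device.
KNOWN_IMAGE_HASHES = {"a1b2", "c3d4", "e5f6"}

# An account is only surfaced for human review once the number of matches
# crosses a threshold, so isolated coincidental matches have no effect.
MATCH_THRESHOLD = 3

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the known set."""
    return sum(1 for h in photo_hashes if h in KNOWN_IMAGE_HASHES)

def should_flag_for_review(photo_hashes):
    """Flag for human review only after the match count meets the threshold."""
    return count_matches(photo_hashes) >= MATCH_THRESHOLD

# Two matches stay below the threshold and the account is never flagged;
# three matches cross it and trigger human review (not automated reporting).
print(should_flag_for_review(["a1b2", "c3d4", "zzzz"]))  # False
print(should_flag_for_review(["a1b2", "c3d4", "e5f6"]))  # True
```

The threshold plus the human-review step is what the FAQ leans on when it says a stray non-matching (or coincidentally matching) photo "would not be disabled and no report would be filed."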

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, and others for its decision to deploy the technology with the release of iOS 15 and iPadOS 15, expected in September.

This has resulted in an open letter criticizing Apple's plan to scan iPhones for CSAM in iCloud Photos and explicit images in children's messages, which has gained over 5,500 signatures as of writing. Apple has also received criticism from Facebook-owned WhatsApp, whose chief Will Cathcart called it "the wrong approach and a setback for people's privacy all over the world." Epic Games CEO Tim Sweeney also attacked the decision, claiming he'd "tried hard" to see the move from Apple's point of view, but had concluded that, "inescapably, this is government spyware installed by Apple based on a presumption of guilt."

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

