Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off

Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States.

User devices will download an unreadable database of known CSAM image hashes and perform an on-device comparison against the user's own photos, flagging any matches to known CSAM before they're uploaded to iCloud Photos. Apple says this is a highly accurate method for detecting CSAM and protecting children.
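To illustrate the idea, here is a minimal sketch of the on-device matching step. The hash function, database contents, and helper names are hypothetical stand-ins; Apple's actual system uses its NeuralHash perceptual hash against a blinded database shipped with the OS, which this simplified example does not reproduce.

```python
import hashlib
from typing import Set

def image_hash(photo_bytes: bytes) -> str:
    # Stand-in hash used only so this example runs. A real system uses a
    # perceptual hash (Apple's NeuralHash) so resized or re-encoded copies
    # of a known image still match; an exact byte hash like SHA-256 would not.
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_csam(photo_bytes: bytes, known_hashes: Set[str]) -> bool:
    # The device only checks membership against the downloaded database of
    # known-image hashes; it does not analyze or classify the photo's content.
    return image_hash(photo_bytes) in known_hashes

# Hypothetical usage: flag photos queued for upload to iCloud Photos.
known_hashes: Set[str] = {"placeholder-hash-1", "placeholder-hash-2"}
queued_photos = [b"photo-bytes-1", b"photo-bytes-2"]
flagged = [p for p in queued_photos if matches_known_csam(p, known_hashes)]
```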

CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the iCloud Photos feature is turned off.

Apple's method works by identifying a known CSAM photo on device and then flagging it with an attached safety voucher when it's uploaded to iCloud Photos. After a certain number of vouchers (i.e. flagged photos) have been uploaded to iCloud Photos, Apple can interpret the vouchers and performs a manual review. If CSAM content is found, the user account is disabled and the National Center for Missing and Exploited Children is notified.
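As a rough sketch of that threshold mechanism: the plain counter below is an assumption standing in for Apple's cryptographic safety vouchers (built on threshold secret sharing), which keep individual matches unreadable to Apple until enough of them accumulate.

```python
# Assumed threshold for illustration; Apple has described it as on the order
# of 30 matching images. Real vouchers cannot be decrypted below the
# threshold; this simple counter only shows the control flow.
REVIEW_THRESHOLD = 30

def process_account_vouchers(voucher_count: int) -> str:
    if voucher_count < REVIEW_THRESHOLD:
        # Below the threshold the vouchers remain uninterpretable to Apple.
        return "no action"
    # At or above the threshold, the matching vouchers can be interpreted and
    # a human reviewer checks them; confirmed CSAM leads to the account being
    # disabled and a report to the National Center for Missing and Exploited Children.
    return "manual review"

print(process_account_vouchers(3))   # -> "no action"
print(process_account_vouchers(31))  # -> "manual review"
```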

Because Apple is scanning iCloud Photos for the CSAM flags, it makes sense that the feature does not work with iCloud Photos disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if iCloud Photos is disabled on a user's device.

It's worth noting that Apple is scanning specifically for hashes of known child sexual abuse material and is not broadly inspecting a user's photo library or scanning personal images that are not already circulating among those who abuse children. Still, users who have privacy concerns about Apple's efforts to scan user photo libraries can disable iCloud Photos.

Security researchers have expressed concerns over Apple's CSAM initiative and worry that it could in the future be expanded to detect other kinds of content with political and safety implications, but for now, Apple's efforts are limited to seeking out child abusers.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

iObama
12 weeks ago

Quote: "Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn."
Here's the thing. That's great that you don't have any of that on your device!

But if you ever live in a country that, for some reason, wants to find something on your device and have it flagged in order to charge you with a crime, this sets a dangerous precedent.

Surveillance technology, while often well-intentioned, can easily end up in the wrong hands for nefarious purposes.
Score: 77 Votes
haruhiko
12 weeks ago
What’s next? Scanning your stuff on iCloud for anti-government materials on behalf of oppressive governments?
Score: 50 Votes
zakarhino
12 weeks ago

Quote: "Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn."
Random warrantless searches of your property have no effect on you because you're not a criminal.

Warrantless surveillance of your entire digital life has no effect on you because you're not a criminal.

^ Those statements are contingent on what the powers that be define as "criminal." If the definition changes tomorrow then they'll have all the infrastructure and law in place to subjugate you as they please. At one point in time it was practically considered criminal to be Japanese in the USA, not that you would care because that has "no effect" on you if you weren't Japanese during that period of time.


Quote: "If that were to happen I’d be here railing against it. But as long as they’re just helping catch these sick freaks they have nothing but my support."
If they have all the infrastructure in place then there is no "railing against it" because by that time it's too late. There's no "railing against" nuclear weapons once we've all gone up in smoke. There's no "railing against" climate change once it's already too late and the planet is no longer viable for human life. The time to "rail against" technologies like this is not after they've been abused when it's too late, the time is NOW when the technology is capable of doing those things but hasn't yet gone that far.

Apple are implementing a system that is capable of scanning all the photos on your device against a database of images that can include images not related to child abuse, regardless of whether or not you have iCloud turned on. It doesn't matter that right now it disables itself when iCloud is off, and it doesn't matter that the database supposedly only includes child abuse images; it is CAPABLE of being an authoritarian tool at a moment's notice via minimal updates, in the same way a nuclear bomb is capable of decimating a country with a few button presses even if the bomb is currently sitting in a silo.

You would say there's no issue with the Patriot Act because you're not a terrorist, but it turns out the Patriot Act and its sister policies have been used to harass journalists and climate activists. It's not like there haven't been terror attacks on US soil since the Patriot Act was enacted. There were a thousand other things the US government could have done to prevent more terrorist attacks globally, but they chose the option of spying on every single citizen and violating people's constitutional rights instead. If you're actually interested in stopping "sickos" then support systems that actually combat the core issue rather than the "let's just police the entire public more" solution, which won't actually stop "sickos" (terrorists use their own encrypted chat tools they make themselves according to various reports, and most likely so do child abusers).

Nobody wants terrorists or child abusers in their community. Increasing the reach of warrantless, global spying programs is not the way to tackle the issue. Make no mistake, this system is capable of being a spying tool that bypasses end-to-end encryption regardless of how it's configured as of right now.
Score: 42 Votes
alex00100
12 weeks ago
Sounds like damage control by Apple. It's a bad feature, period. Having a workaround to disable it does not change that.
Score: 36 Votes
TheYayAreaLiving
12 weeks ago
Sounds like I’ll be turning off iCloud.

Apple, go ahead and release that 1TB iPhone.

Please, respect our privacy as consumers. Don’t be creepy. How times have changed!



Score: 35 Votes
ArtOfWarfare
12 weeks ago

Quote: "I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about."
Who decides what's wrong, though? Regardless of where you fall politically, there is likely something you do or have that some politician wants to make illegal.

We're starting with something that people fairly universally agree is wrong, but it's a slippery slope. Now that the tools are there, an authoritarian government can start telling Apple to do whatever it likes with them.

And everyone knows that Apple's commitment to human rights and privacy goes right out the window the moment the Chinese Communist Party asks for assistance in trampling them.
Score: 34 Votes

Related Stories


Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Monday August 9, 2021 1:50 am PDT by
Apple has published a FAQ titled "Expanded Protections for Children" which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features that the company announced last week. "Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of...

Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Thursday August 5, 2021 12:00 pm PDT by
Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time. Communication Safety: First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature to warn children...

Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Thursday August 5, 2021 1:04 pm PDT by
Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC). Prior to when Apple detailed its plans, news of the CSAM initiative leaked, and security researchers have already begun...

Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Friday August 13, 2021 6:33 am PDT by
Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM). Federighi admitted that Apple...

Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

Friday October 15, 2021 12:23 am PDT by
More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times). The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called...

iOS 15 Messages Bug Causes Saved Photos to Be Deleted

Wednesday September 29, 2021 1:28 pm PDT by
A serious bug in the iOS 15 Messages app can cause some saved photos to be deleted, according to multiple complaints we've heard from MacRumors readers and Twitter users. If you save a photo from a Messages thread and then go on to delete that thread, the next time an iCloud Backup is performed, the photo will disappear. Even though the image is saved to your personal iCloud Photo...

Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images

Friday August 6, 2021 5:00 am PDT by
Apple's plans to scan users' iCloud Photos library against a database of child sexual abuse material (CSAM) to look for matches, and to scan children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF). In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns...

Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

Tuesday August 10, 2021 9:07 am PDT by
Apple's Head of Privacy, Erik Neuenschwander, has responded to some of users' concerns around the company's plans for new child safety features that will scan messages and Photos libraries, in an interview with TechCrunch. When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got ...

Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'

Thursday August 19, 2021 1:23 am PDT by
An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters). "Though these capabilities are intended to protect...

EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

Monday September 6, 2021 3:18 am PDT by
The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout. Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative...