Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis

Apple this week announced that, starting later this year with iOS 15 and iPadOS 15, the company will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children, a non-profit organization that works in collaboration with law enforcement agencies across the United States.

[Image: Apple CSAM detection flow chart]
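For readers who want a concrete picture of what matching against a list of known images involves, below is a deliberately simplified Swift sketch. It checks an exact SHA-256 digest against a set of known hashes; Apple's actual system instead relies on its NeuralHash perceptual hash and an on-device private set intersection protocol, and the KnownHashDatabase type and isKnownImage function here are hypothetical names used only for illustration.

import CryptoKit
import Foundation

// Deliberately simplified sketch: an exact SHA-256 match stands in for Apple's
// NeuralHash perceptual hash and private set intersection protocol, neither of
// which is shown here. All type and function names are hypothetical.
struct KnownHashDatabase {
    // Digests corresponding to known CSAM images supplied by child-safety
    // organizations such as NCMEC (format assumed for illustration).
    private let knownDigests: Set<SHA256.Digest>

    init(knownDigests: Set<SHA256.Digest>) {
        self.knownDigests = knownDigests
    }

    // Returns true if the image's digest appears in the known-hash list.
    func isKnownImage(_ imageData: Data) -> Bool {
        knownDigests.contains(SHA256.hash(data: imageData))
    }
}

A real perceptual hash is designed so that resized or re-encoded copies of the same image still match, which an exact digest comparison cannot do; that difference is why Apple uses a perceptual hash rather than a cryptographic one for this step.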
The plans have sparked concerns among some security researchers and other parties that Apple could eventually be forced by governments to add non-CSAM images to the hash list for nefarious purposes, such as to suppress political activism.

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

To address these concerns, Apple provided additional commentary about its plans today.

Apple's known CSAM detection system will be limited to the United States at launch. To address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that it will consider any global expansion of the system on a country-by-country basis, after conducting a legal evaluation for each country. Apple did not provide a timeframe for global expansion of the system, if such a move ever happens.

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system. Apple noted that the system's first layer of protection is an undisclosed threshold of matches that must be crossed before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said, its manual review process would serve as an additional barrier: reviewers would confirm that no known CSAM imagery is present, Apple would not report the flagged user to NCMEC or law enforcement agencies, and the system would still be working exactly as designed.
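To make the two safeguards described above easier to follow, here is a hedged Swift sketch of the reporting gate: no report is made unless the match count exceeds a threshold and a human reviewer then confirms known CSAM. The threshold value, the ReviewDecision and SafeguardPipeline names, and the callbacks are all assumptions for illustration; Apple has not disclosed its real threshold, and its actual design uses threshold secret sharing so that matches cannot even be decrypted for review until the threshold is crossed.

import Foundation

// Hypothetical illustration of the two safeguards: an undisclosed match
// threshold followed by mandatory human review. Names, the threshold, and the
// callback signatures are assumptions, not Apple's implementation.
enum ReviewDecision {
    case confirmedKnownCSAM
    case notCSAM // e.g. benign images flagged via a corrupted hash list
}

struct SafeguardPipeline {
    let matchThreshold: Int // real value undisclosed
    let humanReview: (_ accountID: String) -> ReviewDecision
    let reportToNCMEC: (_ accountID: String) -> Void

    func evaluate(accountID: String, matchCount: Int) {
        // Layer 1: below the threshold the account is never surfaced at all.
        guard matchCount > matchThreshold else { return }

        // Layer 2: a human reviewer must confirm known CSAM before any report.
        switch humanReview(accountID) {
        case .confirmedKnownCSAM:
            reportToNCMEC(accountID)
        case .notCSAM:
            // No report is made; per Apple, the system is working as designed.
            return
        }
    }
}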

Apple also highlighted proponents of the system, some of whom have praised the company for its efforts to fight child abuse.

"We support the continued evolution of Apple's approach to child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute. "Given the challenges parents face in protecting their kids online, it is imperative that tech companies continuously iterate and improve their safety tools to respond to new risks and actual harms."

Apple did admit that there is no silver-bullet answer to the potential for the system to be abused, but the company said it is committed to using the system solely to detect known CSAM imagery.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

dmx
11 months ago
This system is ripe for abuse and privacy creep over time.

Anyone it would catch will just turn off iCloud Photos anyway, defeating the purpose.

Apple should admit that they made a mistake and cancel the rollout.
Score: 156 Votes
ScottishDuck
11 months ago
US government known not to abuse systems
Score: 100 Votes
transpo1
11 months ago
This is a horrendous idea with so many ways this tech could go wrong.

Limiting it to the U.S. is not a solution and it’s obtuse of Apple to think so. Apple needs to stop now. Get rid of the feature, both the iCloud and Messages versions. No one wants this.
Score: 87 Votes
budafied
11 months ago

Apple's known CSAM detection system will be limited to the United States at launch, and to address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that the company will consider any potential global expansion of the system on a country-by-country basis after conducting a legal evaluation.
Oh, idk. I thought the US government was pretty ****ing dishonest when it comes to privacy. How did that get approved in the first place?

**** Apple for doing this.
Score: 84 Votes
StrangeNoises
11 months ago
And what happens when China, or Russia, or India gives them a big list of hashes they want to be notified of, or Apple doesn't get to sell phones in those countries any more?
Score: 83 Votes
J___o___h___n
11 months ago
I’ve nothing to hide, but this just doesn’t seem right to me.

I’m not updating any existing device to iOS 15 until this rollout is stopped. I don’t want my photos scanned and I don’t want it to happen to my children’s messages. I ensure my children are safe myself. There’s a level of trust, and these sorts of forced policies just don’t agree with me.
Score: 69 Votes
