Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).

News of the CSAM initiative leaked before Apple detailed its plans, and security researchers have already begun expressing concerns about how Apple's new image scanning protocol could be used in the future, as noted by the Financial Times.

Apple is using a "NeuralHash" system to compare hashes of photos on a user's iPhone against hashes of known CSAM images before the photos are uploaded to iCloud. If there is a match, the photo is uploaded with a cryptographic safety voucher, and once a certain threshold of matches is reached, a review is triggered to check whether the person has CSAM on their devices.
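The general shape of this kind of on-device matching (hash each photo, compare it against a database of known hashes, and escalate only once a threshold of matches is reached) can be sketched in a few lines of Python. This is purely an illustrative sketch and not Apple's NeuralHash or safety-voucher implementation: the hash database, the threshold value, and all function names here are assumptions, and an ordinary cryptographic hash stands in for a perceptual one so the example stays self-contained.

```python
# Illustrative sketch only -- NOT Apple's NeuralHash / safety-voucher code.
# Shows the general pattern: hash each photo, compare against known hashes,
# and trigger human review only after a threshold number of matches.

import hashlib

# Hypothetical database of hashes of known images (placeholder values).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

REVIEW_THRESHOLD = 30  # assumed value for illustration, not a published figure


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.

    A real perceptual hash is designed to survive resizing and recompression;
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photo_library: list[bytes]) -> int:
    """Count photos whose hash appears in the known-hash set."""
    return sum(1 for photo in photo_library if image_hash(photo) in KNOWN_HASHES)


def should_trigger_review(photo_library: list[bytes]) -> bool:
    """A review is only triggered once enough matches accumulate."""
    return count_matches(photo_library) >= REVIEW_THRESHOLD


if __name__ == "__main__":
    library = [b"known-image-1", b"vacation photo bytes", b"cat photo bytes"]
    print(count_matches(library))          # 1
    print(should_trigger_review(library))  # False: below the threshold
```

The cryptographic safety voucher mechanism the article mentions is omitted entirely here; the sketch only shows the matching and threshold logic.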

At the current time, Apple is using its image scanning and matching technology to look for child sexual abuse material, but researchers worry that in the future, it could be adapted to scan for other kinds of imagery, such as anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to ‌iCloud‌. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns about the hashes that Apple plans to use, because hash functions can produce "collisions," where two different files share the same hash. A harmless file that happens to share a hash with known CSAM could therefore result in a false flag.
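To make the collision concern concrete, the toy example below (which has nothing to do with NeuralHash) uses a deliberately weak 16-bit "average hash" that only records whether each pixel is brighter than the image's mean. Two visibly different images then hash to exactly the same value, which is the failure mode Green describes, just exaggerated; real perceptual hashes are far longer and more robust, but collisions remain possible in principle.

```python
# Toy demonstration of a hash collision with a deliberately weak hash.
# This is NOT NeuralHash; it only illustrates why "same hash" does not
# necessarily mean "same image."

def average_hash_4x4(pixels: list[int]) -> int:
    """Hash 16 grayscale pixels (0-255) to 16 bits: 1 if above the mean, else 0."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Two different 4x4 "images": one high-contrast, one low-contrast. Because the
# hash only keeps the above/below-mean pattern, they collide exactly.
image_a = [200, 10, 200, 10,
           10, 200, 10, 200,
           200, 10, 200, 10,
           10, 200, 10, 200]
image_b = [130, 120, 130, 120,
           120, 130, 120, 130,
           130, 120, 130, 120,
           120, 130, 120, 130]

assert image_a != image_b
assert hamming_distance(average_hash_4x4(image_a), average_hash_4x4(image_b)) == 0
print("collision: two different images share the same hash")
```

Apple's design counters this with the match threshold and the manual review described in the article, but the researchers' point is that any gap between "same hash" and "same image" is a potential source of false flags.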

Apple, for its part, says that its scanning technology has an "extremely high level of accuracy" to make sure accounts are not incorrectly flagged, and reports are manually reviewed before a person's iCloud account is disabled and a report is sent to NCMEC.

Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."


Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.

Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.

As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.


It's also worth noting that Apple was already scanning some content for child abuse images prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple uses screening technology to look for illegal images and disables accounts if evidence of CSAM is detected.

Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

macrumorsuser10
12 months ago
Apple should add scanning for:

1. Photos of the confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.

Score: 74 Votes

Bandaman
12 months ago
Quoting: "If you're not doing anything wrong, then you have nothing to worry about."
This is always the de facto standard for terrible replies to privacy.

Score: 74 Votes

cloudyo
12 months ago
Quoting: "If you're not doing anything wrong, then you have nothing to worry about."
You should let law enforcement install cameras in your home then, so they can make sure you are not doing anything illegal while you take a shower, for example. After all, you have nothing to hide, do you?

Score: 57 Votes

Bawstun
12 months ago
Quoting: "If you're not doing anything wrong, then you have nothing to worry about."
This simply isn’t true. As the article notes, the technology can easily be changed to other things in the future - what if they scanned for BLM supporter images or anti-government images? What if they wanted to scan and track certain political parties?

It’s not about child sex material, everyone agrees that that is wrong, it’s about passing over more and more of our rights to Big Tech. Give them an inch and they’ll take a foot.

Score: 51 Votes

contacos
12 months ago
Quoting: "If you're not doing anything wrong, then you have nothing to worry about."
Depends on the definition of „wrong“. Sometimes it is up to self serving definitions of dictators

Score: 50 Votes

jarman92
12 months ago
"Other companies already participate in this outrageous invasion of privacy" is not nearly the defense of Apple these people seem to think it is.

Score: 48 Votes

