Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).

Child Safety Feature
Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers had already begun expressing concerns about how Apple's new image scanning protocol could be used in the future, as noted by the Financial Times.

Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's iPhone before they're uploaded to iCloud. If there is a match, that photograph is uploaded with a cryptographic safety voucher, and at a certain threshold, a review is triggered to check if the person has CSAM on their devices.
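The system as described amounts to matching image hashes against a database of known hashes, with human review gated on a match count. A minimal sketch in Python of that matching logic (all names, the stand-in hash function, and the threshold value here are hypothetical; the real NeuralHash is a perceptual neural-network hash, and the safety vouchers use threshold cryptography rather than a plain counter):

```python
import hashlib

# Hypothetical stand-in for a perceptual image hash. The real NeuralHash
# is a neural-network-based hash designed to match visually similar
# images, not a cryptographic digest of the raw bytes.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Database of hashes of known images (illustrative placeholder).
KNOWN_HASHES = {image_hash(b"known-flagged-image")}

# Number of matches required before a human review is triggered.
REVIEW_THRESHOLD = 30

def crosses_review_threshold(images: list[bytes]) -> bool:
    """Return True if enough uploads match the database to trigger review."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```

The key design point this models is that no single match triggers a report; only an account that accumulates matches past the threshold is escalated to review.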

At the current time, Apple is using its image scanning and matching technology to look for child abuse material, but researchers worry that in the future it could be adapted to scan for other kinds of imagery, such as anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to ‌iCloud‌. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.
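Green's collision concern can be illustrated with a toy example. The sketch below deliberately shrinks the hash to 8 bits so a collision can be found by brute force in at most a few hundred tries; real image hashes have vastly larger output spaces, but perceptual hashes are not collision-resistant in the cryptographic sense. All names here are hypothetical:

```python
import hashlib

def tiny_hash(data: bytes) -> int:
    # Toy 8-bit "image hash": only 256 possible values, so collisions
    # between unrelated inputs are easy to find by brute force.
    return hashlib.sha256(data).digest()[0]

def find_colliding_file(target: bytes) -> bytes:
    # Search for a harmless byte string whose hash equals the target's,
    # i.e. a file that exact hash matching would falsely flag.
    wanted = tiny_hash(target)
    i = 0
    while True:
        candidate = f"harmless-file-{i}".encode()
        if candidate != target and tiny_hash(candidate) == wanted:
            return candidate
        i += 1
```

This is only a demonstration of the concept of a false flag, which is why systems like the one Apple describes pair hash matching with a threshold and manual review rather than acting on a single match.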

Apple for its part says that its scanning technology has an "extremely high level of accuracy" to make sure accounts are not incorrectly flagged, and reports are manually reviewed before a person's ‌iCloud‌ account is disabled and a report is sent to NCMEC.

Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."


Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.

Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.

As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.


It's also worth noting that Apple was already scanning some content for child abuse images prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple used screening technology to look for illegal images and disabled accounts when evidence of CSAM was detected.

Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

macrumorsuser10
50 months ago
Apple should add scanning for:

1. Photos of the confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.
Score: 74 Votes (Like | Disagree)
Bandaman
50 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
This is always the de facto standard for terrible replies to privacy.
Score: 74 Votes (Like | Disagree)
cloudyo
50 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
You should let law enforcement install cameras in your home then, so they can make sure you are not doing anything illegal while you take a shower, for example. After all, you have nothing to hide, do you?
Score: 57 Votes (Like | Disagree)
Bawstun
50 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
This simply isn’t true. As the article notes, the technology can easily be changed to other things in the future - what if they scanned for BLM supporter images or anti-government images? What if they wanted to scan and track certain political parties?

It’s not about child sex material, everyone agrees that that is wrong, it’s about passing over more and more of our rights to Big Tech. Give them an inch and they’ll take a foot.
Score: 51 Votes (Like | Disagree)
contacos
50 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
Depends on the definition of "wrong". Sometimes it is up to the self-serving definitions of dictators.
Score: 50 Votes (Like | Disagree)
jarman92
50 months ago
"Other companies already participate in this outrageous invasion of privacy" is not nearly the defense of Apple these people seem to think it is.
Score: 48 Votes (Like | Disagree)