Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones both for child sexual abuse material and for signs of organized crime and terrorist-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers said.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
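The fragility the researchers describe is easy to illustrate with a toy perceptual hash. The sketch below is not Apple's NeuralHash; it is a minimal "average hash" stand-in over a hypothetical tiny grayscale image, but it shows how a single-pixel edit near the decision boundary can flip hash bits and defeat naive matching.

```python
# Toy "average hash": each bit is 1 where a pixel is brighter than the
# image's mean brightness. Real perceptual hashes (aHash, pHash, NeuralHash)
# are far more sophisticated, but share this edit-sensitivity in principle.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count the positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

original = [
    [10, 200, 30],
    [40, 120, 220],
    [90, 100, 110],
]

# A "slight edit": nudge one pixel that sits near the mean-brightness
# threshold. Visually the image is almost unchanged.
edited = [row[:] for row in original]
edited[2][1] = 105

h1 = average_hash(original)
h2 = average_hash(edited)
print(hamming(h1, h2))  # → 1 (one hash bit flipped by a tiny edit)
```

A matcher requiring an exact (or near-exact) hash match would no longer pair the edited image with the original, which is the kind of trivial evasion critics pointed out within days of Apple's announcement.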

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and allay concerns by releasing detailed information, sharing FAQs and other new documents, and conducting interviews with company executives.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and announced in September a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system, although it remains unclear what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.

Top Rated Comments

Wanted797 Avatar
11 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes (Like | Disagree)
_Spinn_ Avatar
11 months ago

Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes (Like | Disagree)
MathersMahmood Avatar
11 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes (Like | Disagree)
LV426 Avatar
11 months ago
“Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes (Like | Disagree)
iGobbleoff Avatar
11 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes (Like | Disagree)
H2SO4 Avatar
11 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Are you sure?
Announced in August ('https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/'), the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Score: 45 Votes (Like | Disagree)

