Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
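The on-device scanning component works, in broad strokes, by hashing a user's photos and comparing the hashes against a database of known CSAM digests, with an account flagged only after a threshold number of matches. The sketch below is illustrative only: Apple's actual system uses a perceptual hash (NeuralHash) together with private set intersection and encrypted safety vouchers, and the hash values and threshold here are placeholders rather than real parameters.

```python
# Illustrative sketch of threshold-based hash matching. NOT Apple's
# implementation: hash values and the threshold are placeholders.

KNOWN_HASHES = {"hash_a", "hash_b"}  # stand-in for the known-CSAM hash database
MATCH_THRESHOLD = 30  # Apple reportedly flags an account only past ~30 matches

def library_exceeds_threshold(photo_hashes):
    """Return True if enough library hashes match the known-hash set."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

print(library_exceeds_threshold(["hash_a", "hash_c"]))  # one match: False
print(library_exceeds_threshold(["hash_a"] * 30))       # thirty matches: True
```

The threshold is what Apple pointed to when arguing the system could not flag an account from a single false-positive match.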

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for both child sexual abuse material and signs of organized crime and terrorist-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," said the researchers, who added they were publishing their findings now to inform the European Union of the dangers of its plan.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
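The brittleness the researchers describe can be illustrated with a toy perceptual hash. The sketch below uses a simple "average hash" over a synthetic 8x8 grayscale image rather than Apple's NeuralHash, so it shows only the general principle: hash bits are derived by comparing pixels to an image statistic, and pixels sitting near that statistic can flip with visually negligible edits.

```python
# Toy "average hash" (aHash), not Apple's NeuralHash, illustrating why
# perceptual-hash matching can be defeated by small, targeted edits.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale image: bit i is 1 if pixel i > mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [(7 * i) % 256 for i in range(64)]  # synthetic stand-in image
mean = sum(original) / len(original)
# Nudge only the pixels sitting near the mean; visually negligible.
edited = [p + 6 if abs(p - mean) < 5 else p for p in original]

h_orig, h_edit = average_hash(original), average_hash(edited)
print(hamming(h_orig, h_orig))  # 0: identical images always match
print(hamming(h_orig, h_edit))  # targeted tweaks flip hash bits
```

Enough flipped bits push the distance past a matching threshold, so an image that looks unchanged to a human can evade detection; conversely, adversarially crafted benign images can be made to collide with flagged hashes.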

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple announced its plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and making company executives available for interviews.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system, although it's not clear what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.


Top Rated Comments

Wanted797
47 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
47 months ago

Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
47 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes
LV426
47 months ago
“Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
47 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
47 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Are you sure?
Announced in August (https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/), the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Score: 45 Votes