Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for child sexual abuse imagery as well as signs of organized crime and terrorism.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers said.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
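The evasion point turns on how perceptual hashing works: a match is a similarity judgment (a distance threshold between hashes), not exact equality, so carefully chosen edits can push an image's hash away from the database entry while leaving the picture visually unchanged. The following toy "average hash" in Python is an illustration of the general idea only, not Apple's NeuralHash, and the sample "images" are made-up pixel lists:

```python
# Toy perceptual hash (an "average hash"): each pixel is thresholded
# against the image's mean brightness to produce a bit. Near-duplicate
# images land near each other in hash space; matching compares hashes
# by Hamming distance rather than exact equality.

def average_hash(pixels):
    """pixels: flat list of grayscale values 0-255 -> tuple of bits."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Count of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 4x4 "image" (flattened) and a slightly brightened copy.
original = [10, 200, 30, 220, 15, 210, 25, 205,
            12, 198, 28, 215, 18, 202, 22, 208]
edited = [p + 5 for p in original]      # small global edit

# The small edit shifts the mean by the same amount, so no bits flip.
print(hamming(average_hash(original), average_hash(edited)))    # → 0

# A heavy edit (inverting the image) flips every bit.
inverted = [255 - p for p in original]
print(hamming(average_hash(original), average_hash(inverted)))  # → 16
```

Real systems use far more robust hashes than this sketch, but the same structural weakness applies: because matching is threshold-based, researchers showed that adversarial edits much subtler than inversion can move an image just past the threshold and escape detection.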

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users, releasing detailed FAQs, new support documents, and interviews with company executives to allay concerns.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system. It remains unclear what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order.


Top Rated Comments

Wanted797 · 49 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_ · 49 months ago

"Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order."
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood · 49 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes
LV426 · 49 months ago
"Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system"

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff · 49 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4 · 49 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Are you sure?
"Announced in August (https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/), the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search."
Score: 45 Votes (Like | Disagree)