Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones not only for child sexual abuse material but also for signs of organized crime and terrorism-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers said.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
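To illustrate why slight edits can defeat hash-based matching: Apple's system uses NeuralHash, a proprietary perceptual hash, so the toy "average hash" below is only an illustrative sketch of the general idea, not Apple's actual algorithm. Each image is reduced to a compact fingerprint of bits, and matching compares fingerprints; a pixel edit that crosses the hash's internal threshold flips a bit, so a naive exact-match check no longer flags the image.

```python
def average_hash(pixels):
    """Hash a tiny grayscale image (list of rows of 0-255 ints):
    each bit records whether a pixel is above the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count how many fingerprint bits differ between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [[200, 50],
            [135, 110]]
# A modest edit: darken one mid-brightness pixel below the new mean.
edited = [[200, 50],
          [120, 110]]

h1, h2 = average_hash(original), average_hash(edited)
print(hamming(h1, h2))  # prints 1 -- one fingerprint bit flipped
```

Real perceptual hashes are built to tolerate such edits, which is why the researchers' concern centered on adversarial modifications crafted specifically to move an image's hash away from (or onto) a database entry.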

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users, releasing detailed FAQs and technical documents and making company executives available for interviews in order to allay concerns.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system, although it has not said what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said whether it would pull out of a market rather than obey a court order.


Top Rated Comments

Wanted797
55 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
55 months ago

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
55 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes
LV426
55 months ago
"Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system"

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
55 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
55 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Are you sure?
Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Score: 45 Votes