Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for both child sexual abuse material and signs of organized crime and terrorist-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers said.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
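The detection scheme at issue compares perceptual hashes of photos against hashes of known CSAM, rather than comparing the pixels themselves. As an illustrative sketch only (Apple's actual system uses a proprietary neural hash called NeuralHash; the toy "average hash" below is a standard stand-in), a perceptual hash reduces an image to a bit string that visually similar images should share, and matching becomes a Hamming-distance comparison. The sketch also shows why researchers focused on evasion: with a naive hash, even a one-pixel edit can flip hash bits.

```python
# Illustrative only: this is NOT Apple's NeuralHash. A simple "average
# hash" demonstrates the general idea of perceptual hashing that the
# reported scheme builds on.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 ints): each bit records
    whether a pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes; small distances
    indicate visually similar images."""
    return sum(a != b for a, b in zip(h1, h2))

# A 4x4 toy "image" and a slightly edited copy.
original = [[10, 200, 10, 200],
            [200, 10, 200, 10],
            [10, 200, 10, 200],
            [200, 10, 200, 10]]
edited = [row[:] for row in original]
edited[0][0] = 120  # a single-pixel brightness tweak

d = hamming(average_hash(original), average_hash(edited))
print(d)  # prints 1: one hash bit flipped by a tiny edit
```

A real system would tolerate some Hamming distance between hashes to catch re-encoded copies, but that tolerance is exactly the knob the researchers probed: too loose invites false positives, too tight lets slightly edited images slip through.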

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and allay concerns by releasing FAQs, detailed technical documents, interviews with company executives, and more.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give itself time to make "improvements" to the CSAM system, though it has not said what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order.


Top Rated Comments

Wanted797
46 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
46 months ago

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
46 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes
LV426
46 months ago
“Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
46 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
46 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Are you sure?
Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Score: 45 Votes