Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for child sexual abuse imagery as well as signs of organized crime and terrorism.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," said the researchers, who added they were publishing their findings now to inform the European Union of the dangers of its plan.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
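
The brittleness the researchers point to is easiest to see with a simple perceptual hash. The sketch below is a toy "average hash" in Python (using Pillow), not Apple's NeuralHash, and the file names are hypothetical; it only illustrates the general idea that a matcher comparing hash distances against a threshold can be pushed past that threshold by small edits such as cropping or re-compression.

```python
# Toy perceptual hash ("average hash"), for illustration only.
# This is NOT Apple's NeuralHash; it just demonstrates why naive
# perceptual hashes can be brittle: small edits can flip enough bits
# that two versions of the same image no longer match.
# Requires Pillow (pip install Pillow). File names below are hypothetical.

from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Downscale to hash_size x hash_size grayscale, then emit one bit
    per pixel: 1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Hypothetical files: an original image and a slightly edited copy
    # (e.g. cropped, rotated a degree or two, or re-compressed).
    original = average_hash("original.jpg")
    edited = average_hash("edited.jpg")

    # A matcher typically flags a pair only if the distance falls below
    # some threshold; edits that push the distance above it go undetected.
    print("Hamming distance:", hamming_distance(original, edited))
```

This is a deliberately minimal sketch; real systems use more robust hashes and additional safeguards, but the evasion the researchers describe targets exactly this matching step.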

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and allay concerns by releasing detailed information, sharing FAQs and other new documents, and giving interviews with company executives.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and announced in September that it would delay the rollout of the features to give itself time to make "improvements" to the CSAM system, although it remains unclear what those improvements would involve or how they would address the concerns raised.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order.

Top Rated Comments

Wanted797
35 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
35 months ago

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order. (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/)
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
35 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes
LV426
35 months ago
“Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system” (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/)

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
35 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
35 months ago

lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?
Are you sure?
Announced in August (https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/), the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Score: 45 Votes