Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for child sexual abuse material as well as signs of organized crime and terrorism-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," said the researchers, who added they were publishing their findings now to inform the European Union of the dangers of its plan.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
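As a loose intuition for that fragility, the sketch below implements a toy "average hash" in Python. This is not Apple's NeuralHash (which is a neural-network-based perceptual hash and far more sophisticated); it only illustrates the general idea that a perceptual hash reduces an image to a short fingerprint, and that a few pixel edits near the decision threshold can flip bits of that fingerprint:

```python
# Toy illustration of perceptual hashing (a simple "average hash"),
# NOT Apple's NeuralHash: it only sketches why small pixel edits
# can change the bits of a perceptual hash.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) to 64 bits:
    each bit is 1 if that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 8x8 "image": left half dark, right half bright.
original = [40] * 32 + [200] * 32

# A slightly edited copy: four pixels are nudged toward the mean
# brightness, where their hash bits are easiest to flip.
edited = list(original)
for i in (30, 31, 32, 33):
    edited[i] = 120

h1, h2 = average_hash(original), average_hash(edited)
print(hamming(h1, h2))  # prints 2: a visually minor edit changed the hash
```

Real perceptual hashes are designed to tolerate such edits, which is why the reported evasions relied on carefully chosen perturbations rather than arbitrary cropping; the toy above only conveys the threshold intuition.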

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers noted that they had begun their study before Apple announced its plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and allay concerns by releasing detailed information, including FAQs, additional documents, and interviews with company executives.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system. It has not said what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order.

Top Rated Comments

Wanted797
28 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
28 months ago

"Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order."

I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
28 months ago

"Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves"

This. 100% this.
Score: 63 Votes
LV426
28 months ago

"Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system"

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
28 months ago

"lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?"

Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
28 months ago

"lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?"

Are you sure?

"Announced in August (https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/), the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search."
Score: 45 Votes
