University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology

Respected university researchers are sounding the alarm over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling it "dangerous."

Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at Princeton University's Center for Information Technology Policy, co-authored an op-ed for The Washington Post outlining their experiences building image detection technology.

The researchers began a project two years ago to identify CSAM in end-to-end encrypted online services. They note that given their field, they "know the value of end-to-end encryption, which protects data from third-party access." That, they say, is exactly why they are horrified by CSAM "proliferating on encrypted platforms."

Mayer and Kulshrestha said they wanted to find a middle ground: build a system that online platforms could use to find CSAM while still protecting end-to-end encryption. The researchers note that experts in the field doubted such a system was feasible, but they did manage to build it, and in the process discovered a significant problem.

We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.

Knowledgeable observers argued a system like ours was far from feasible. After many false starts, we built a working prototype. But we encountered a glaring problem.
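
In rough terms, the concept the researchers describe is a fingerprint lookup against a database of known material. The Python sketch below illustrates only that matching idea; it is not Apple's NeuralHash or the researchers' prototype, which rely on perceptual hashing and cryptographic protocols (such as private set intersection) so that clients cannot read the database and the service learns nothing about non-matching content.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Exact cryptographic hash, standing in here for a perceptual hash."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical database of fingerprints of known harmful material.
# In a real deployment these would come from child-safety organizations
# and would use perceptual hashes that survive resizing and re-encoding.
KNOWN_CONTENT = {fingerprint(b"<placeholder: bytes of a known harmful file>")}

def check_shared_content(content: bytes) -> bool:
    """Alert only when shared content matches the known-content database."""
    return fingerprint(content) in KNOWN_CONTENT

print(check_shared_content(b"an ordinary holiday photo"))                    # False: no alert
print(check_shared_content(b"<placeholder: bytes of a known harmful file>")) # True: match
```

An exact hash like SHA-256 only matches byte-identical files; the perceptual hashes used in practice tolerate edits and re-encoding, which is part of what creates the false-positive risk the researchers go on to describe.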

Since Apple announced the feature, the company has been bombarded with concerns that the system behind its CSAM detection could be used to detect other types of content at the request of oppressive governments. Apple has strongly pushed back against that possibility, saying it will refuse any such request from a government.

Nonetheless, concerns about the future implications of the technology used for CSAM detection are widespread. Mayer and Kulshrestha said they were "disturbed" by how governments could use such a system to detect content other than CSAM.

A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.

We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides....
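
The false positives the researchers mention follow from the nature of perceptual hashing: it deliberately maps similar-looking, and sometimes quite different, images to the same fingerprint. The toy "average hash" below is a hypothetical simplification, not any production algorithm, but it shows how two clearly different images can collide on the same hash value, and why a motivated attacker could craft such collisions on purpose.

```python
def average_hash(pixels: list[int]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the image mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Two clearly different 2x2 grayscale "images" (flattened pixel lists)...
image_a = [10, 200, 10, 200]   # high-contrast stripes
image_b = [90, 130, 95, 140]   # nearly uniform gray

# ...that nevertheless produce the same hash: a collision, i.e. a false positive.
print(hex(average_hash(image_a)))  # 0x5
print(hex(average_hash(image_b)))  # 0x5
```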

Apple has continued to address user concerns over its plans, publishing additional documents and an FAQ page. The company maintains that its CSAM detection system, which runs on the user's device, aligns with its long-standing privacy values.
