University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology - MacRumors

Respected university researchers are sounding the alarm bells over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling the technology "dangerous."

Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at the Princeton University Center for Information Technology Policy, penned a joint op-ed for The Washington Post outlining their experience building image detection technology.

The researchers started a project two years ago to identify CSAM in end-to-end encrypted online services. Given their field, they note, they "know the value of end-to-end encryption, which protects data from third-party access." That is precisely why, they say, they are horrified by CSAM "proliferating on encrypted platforms."

Mayer and Kulshrestha said they wanted to find a middle ground: build a system that online platforms could use to find CSAM while still protecting end-to-end encryption. Experts in the field doubted such a system was feasible, the researchers note, but they managed to build one, and in the process discovered a significant problem.

We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.

Knowledgeable observers argued a system like ours was far from feasible. After many false starts, we built a working prototype. But we encountered a glaring problem.
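The matching idea the researchers describe, comparing a fingerprint of shared content against a database of fingerprints of known material, can be sketched in miniature. The snippet below is a hypothetical illustration only: it uses a toy 64-bit "average hash" and a plaintext hash set, whereas real designs (including the researchers' prototype) wrap the comparison in cryptographic protocols so the database stays secret and non-matches reveal nothing to the service.

```python
# Toy illustration of hash-based content matching. This is NOT Apple's
# NeuralHash or the researchers' protocol; it is a minimal perceptual
# "average hash" over 64 grayscale pixels, matched by Hamming distance.

def average_hash(pixels):
    """64 grayscale values (0-255), row-major order -> 64-bit fingerprint."""
    assert len(pixels) == 64
    mean = sum(pixels) / 64
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_database(pixels, known_hashes, max_distance=4):
    """Flag content whose hash is within max_distance bits of a known hash."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# Hypothetical database containing the fingerprint of one "known" image.
original = [(10 * i) % 256 for i in range(64)]
database = {average_hash(original)}

# A slightly brightened copy still matches; an unrelated image does not.
altered = [min(255, p + 3) for p in original]
unrelated = [255 if i % 2 else 0 for i in range(64)]

print(matches_database(original, database))   # True
print(matches_database(altered, database))    # True
print(matches_database(unrelated, database))  # False
```

Because the fingerprint depends only on which pixels sit above the image's mean brightness, small edits leave it intact; that robustness is what makes perceptual matching usable for detecting copies of known material.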

Since Apple's announcement of the feature, the company has been inundated with concerns that the system behind detecting CSAM could be used to detect other types of images at the request of oppressive governments. Apple has strongly pushed back against that possibility, saying it will refuse any such request from a government.

Nonetheless, concerns over the future implications of the technology being used for CSAM detection are widespread. Mayer and Kulshrestha said that the ways governments could use the system to detect content other than CSAM left them "disturbed."

A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.

We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides....
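The false positives and gaming that the researchers warn about fall directly out of the math: any perceptual hash maps many images to one fingerprint, so a crafted but innocuous image can collide with a flagged one. Here is a toy demonstration using a simple 64-bit average hash (hypothetical; no production system is this weak, but the many-to-one property holds in general):

```python
# Toy collision demo: perceptual hashes are many-to-one, so content that
# looks nothing like a flagged image can share its fingerprint exactly.
# Hypothetical average hash, not any deployed system.

def average_hash(pixels):
    """Grayscale values (0-255) -> one bit per pixel: above/below the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# A "flagged" image and its fingerprint.
flagged_img = [(10 * i) % 256 for i in range(64)]
target = average_hash(flagged_img)

# Craft a completely different image (only two gray levels) whose pixels
# fall on the same side of their mean, forcing an identical fingerprint.
mean = sum(flagged_img) / 64
crafted = [200 if p >= mean else 50 for p in flagged_img]

print(average_hash(crafted) == target)  # True: a false positive by construction
```

A malicious user who can learn or guess fingerprints in the database could plant such crafted images to put innocent recipients under scrutiny, which is the gaming scenario the researchers describe.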

Apple has continued to address user concerns over its plans, publishing additional documents and an FAQ page. The company maintains that its CSAM detection system, which runs on the user's device, aligns with its long-standing privacy values.


Top Rated Comments

62 months ago:
Apple, just cancel this on device surveillance software

nvmls, 62 months ago:
They clearly don't know how the technology works.. oh wait

62 months ago:
This is getting more and more disturbing

TheRealNick, 62 months ago:
If they don't cancel this I'm seriously going to have to look at alternative products, which saddens me.

62 months ago:
It's time for Apple to admit it's made a mistake. It's seriously bizarre they ever thought this was a good idea.

VulchR, 62 months ago:
Waiting for the surveillance apologists to argue these two researchers "don't understand" the process...