
Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System

Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material (CSAM), according to a report from The New York Times.

Filed in Northern California on Saturday, the lawsuit represents a potential group of 2,680 victims and alleges that Apple's failure to implement previously announced child safety tools has allowed harmful content to continue circulating, causing ongoing harm to victims.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create potential backdoors for government surveillance. Apple subsequently postponed and later abandoned the initiative.
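
For context on what that scanning would have involved: Apple's published design computed an on-device perceptual hash (NeuralHash) of each photo and compared it against a database of hashes of known CSAM, with a match threshold before any human review. The hypothetical Swift sketch below illustrates only the basic hash-lookup idea; the names, the placeholder digest database, and the use of an exact cryptographic hash are simplifying assumptions, not Apple's actual protocol.

```swift
import Foundation
import CryptoKit

// Hypothetical, greatly simplified sketch of hash-based image matching.
// Apple's actual 2021 design used NeuralHash (a perceptual hash),
// private set intersection, and a reporting threshold; none of that
// is reproduced here. `knownDigests` stands in for a database of
// hashes of known images.
let knownDigests: Set<String> = [
    "placeholder-hex-digest-of-a-known-image"
]

func matchesKnownImage(_ imageData: Data) -> Bool {
    // SHA-256 is an exact hash, so unlike a perceptual hash it only
    // flags byte-identical files; this check is illustrative only.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}
```

A perceptual hash, unlike the SHA-256 used above, is designed so that resized or recompressed copies of an image still map to a matching digest, which is what made on-device matching feasible and what critics feared could be repointed at other kinds of content.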

Explaining its decision at the time, Apple said that implementing universal scanning of users' private iCloud storage would introduce major security vulnerabilities that malicious actors could exploit. Apple also warned that such a system could set a problematic precedent: once content scanning infrastructure exists for one purpose, it faces pressure to expand into broader surveillance across other types of content and messaging platforms, including encrypted ones.

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple's decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company's commitment to fighting child exploitation, stating that Apple is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.



Top Rated Comments

19 months ago
Maybe we should sue display manufacturers for having the ability to display illicit content. We could also sue towns, counties, states, and national governments for allowing people who engage in illegal activities to live within their borders. I mean, surely the government should be “scanning” your home to make sure you aren’t engaged in any activity that harms others, right?

If we are OK holding innocent people accountable for the actions of the perpetrators, it kind of seems like we could sue anyone and everyone…
Score: 113 Votes
19 months ago
Sued if you do, sued if you don’t.

Attorneys are the people winning either way.
Score: 104 Votes
BelgianChoklit
19 months ago
Apple had good intentions with CSAM detection, but it was abandoned, and for good reasons. iThink Apple was drunk when it had the idea to introduce this thing.
Score: 54 Votes
cjsuk
19 months ago
Oh I'm really going to be popular with this one.

2,680 people's situation does not outweigh the greater good, which is hundreds of millions of people's communication security being put at risk by non-deterministic reporting and content moderation.

But of course 2,680 people will be happy to live with their problem, which won't be solved either way, if they get some cash for it. They just want money. And so do lawyers.

If people really want to fix this problem, it'll be a case of dealing with individuals via good old-fashioned social methods, i.e. effective policing and rehabilitation. But that's hard, so they'll take some money instead.
Score: 40 Votes
TVreporter
19 months ago
Apple is damned if they do; damned if they don’t.

While I can sympathize if the individual’s claim is true, how can they blame Apple?

The image(s) are likely circulated on far more Android and Windows devices than Apple’s.

And who is to say that, even if Apple had implemented its program, it would have detected the victim’s images? All circumstantial; a judge should quickly quash this.
Score: 27 Votes
justperry
19 months ago
This is beyond dumb.
Score: 21 Votes