Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System

Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material (CSAM), according to a report from The New York Times.

Filed in Northern California on Saturday, the lawsuit represents a potential group of 2,680 victims and alleges that Apple's failure to implement previously announced child safety tools has allowed harmful content to continue circulating, causing ongoing harm to victims.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create potential backdoors for government surveillance. Apple subsequently postponed and later abandoned the initiative.
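
For context, Apple's 2021 technical summary described on-device matching: a perceptual hash of each photo (Apple's hashing model was called NeuralHash) would be compared against a database of known CSAM hashes, and an account would only be surfaced for human review once the number of matches crossed a threshold (Apple's documentation cited an initial threshold of 30 matches). Below is a minimal sketch of that threshold-matching idea in Swift; the PerceptualHash type and the perceptualHash stub are hypothetical stand-ins, and the cryptographic layers of Apple's actual design (private set intersection, threshold secret sharing) are omitted.

import Foundation

// Illustrative sketch only -- not Apple's implementation. A real perceptual
// hash maps visually similar images to identical or near-identical values;
// this placeholder type just wraps raw bytes.
struct PerceptualHash: Hashable {
    let bytes: [UInt8]
}

// Hypothetical stub: a real system would run a hashing model over decoded
// image pixels. Taking the first 16 bytes is NOT a real perceptual hash.
func perceptualHash(of imageData: Data) -> PerceptualHash {
    PerceptualHash(bytes: Array(imageData.prefix(16)))
}

// The threshold-matching idea: count how many photos hash into the set of
// known-CSAM hashes, and flag the account only once the match count crosses
// the threshold, so a handful of false positives triggers nothing.
func shouldFlagAccount(photos: [Data],
                       knownHashes: Set<PerceptualHash>,
                       threshold: Int = 30) -> Bool {
    let matches = photos
        .map { perceptualHash(of: $0) }
        .filter { knownHashes.contains($0) }
        .count
    return matches >= threshold
}

The threshold design was one of the ways Apple argued the system preserved privacy: no single match, or even a small number of matches, would reveal anything about an account.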

Explaining its decision at the time, Apple said that implementing universal scanning of users' private iCloud storage would introduce major security vulnerabilities that malicious actors could potentially exploit. Apple also expressed concerns that such a system could establish a problematic precedent, in that once content scanning infrastructure exists for one purpose, it could face pressure to expand into broader surveillance applications across different types of content and messaging platforms, including those that use encryption.

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple's decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company's commitment to fighting child exploitation, stating that Apple is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

SpotOnT
15 months ago
Maybe we should sue display manufacturers for having the ability to display illicit content. We could also sue towns, counties, states, and national governments for allowing people who engage in illegal activities to live within their borders. I mean, surely the government should be “scanning” your home to make sure you aren’t engaged in any activity that harms others, right?

If we are OK holding innocent people accountable for the actions of the perpetrators, it kind of seems like we could sue anyone and everyone…
Score: 113 Votes
neuropsychguy
15 months ago
Sued if you do, sued if you don’t.

Attorneys are the people winning either way.
Score: 104 Votes
BelgianChoklit
15 months ago
Apple had good intentions with CSAM detection, but it was abandoned, and for good reasons. iThink Apple was drunk when it had the idea to introduce this thing.
Score: 54 Votes
cjsuk
15 months ago
Oh I'm really going to be popular with this one.

2,680 people's situation does not represent the greater good, which is the communication security of hundreds of millions of people that would be put at risk by non-deterministic reporting and content moderation.

But of course those 2,680 people will be happy to live with their problem, which won't be solved either way, if they get some cash for it. They just want money. And so do the lawyers.

If people really want to fix this problem, it'll be a case of dealing with individuals via good old-fashioned social methods, i.e. effective policing and rehabilitation. But that's hard, so they'll take some money instead.
Score: 40 Votes
TVreporter
15 months ago
Apple is damned if they do; damned if they don’t.

While I can sympathize if the individual’s claim is true, how can they blame Apple?

The image(s) are likely circulated on far more Android and Windows devices than Apple’s.

And who is to say that, had Apple implemented its program, it would have detected the victim’s images? It’s all circumstantial; a judge should quickly quash this.
Score: 27 Votes
justperry
15 months ago
This is beyond dumb.
Score: 21 Votes