Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System

Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material (CSAM), according to a report from The New York Times.

Filed in Northern California on Saturday, the lawsuit represents a potential group of 2,680 victims and alleges that Apple's failure to implement previously announced child safety tools has allowed harmful content to continue circulating, causing ongoing harm to victims.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create potential backdoors for government surveillance. Apple subsequently postponed and later abandoned the initiative.

Explaining its decision at the time, Apple said that implementing universal scanning of users' private iCloud storage would introduce major security vulnerabilities that malicious actors could potentially exploit. Apple also expressed concerns that such a system could establish a problematic precedent, in that once content scanning infrastructure exists for one purpose, it could face pressure to expand into broader surveillance applications across different types of content and messaging platforms, including those that use encryption.

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple's decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company's commitment to fighting child exploitation, stating that Apple is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

SpotOnT Avatar
10 months ago
Maybe we should sue display manufacturers for having the ability to display illicit content. We could also sue towns, counties, states, and national governments for allowing people who engage in illegal activities to live within their borders. I mean surely the government should be “scanning” your home to make sure you aren’t engaged in any activity that harms others, right?

If we are OK holding innocent people accountable for the actions of the perpetrators, it kind of seems like we could sue anyone and everyone…
Score: 113 Votes (Like | Disagree)
neuropsychguy Avatar
10 months ago
Sued if you do, sued if you don’t.

Attorneys are the people winning either way.
Score: 104 Votes (Like | Disagree)
BelgianChoklit Avatar
10 months ago
Apple had good intentions with CSAM detection, but it was abandoned, and for good reasons. iThink Apple was drunk having had the idea to introduce this thing.
Score: 54 Votes (Like | Disagree)
cjsuk Avatar
10 months ago
Oh I'm really going to be popular with this one.

2680 people's situation does not represent the greater good which is hundreds of millions of people's communication security being put at risk by non-deterministic reporting and content moderation.

But of course 2,680 people will be happy to live with their problem, which won't be solved either way, if they get some cash for it. They just want money. And so do lawyers.

If people really want to fix this problem it'll be a case of dealing with individuals via good old fashioned social methods, i.e. effective policing and rehabilitation. But that's hard, so they'll take some money instead.
Score: 40 Votes (Like | Disagree)
TVreporter Avatar
10 months ago
Apple is damned if they do; damned if they don’t.

While I can sympathize if the individual’s claim is true, how can they blame Apple?

The image(s) are likely circulated on far more Android and Windows devices than Apple’s.

And who is to say that if Apple had implemented its program it would have detected the victim's images? All circumstantial; a judge should quickly quash this.
Score: 27 Votes (Like | Disagree)
justperry Avatar
10 months ago
This is beyond dumb.
Score: 21 Votes (Like | Disagree)