EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout.

Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative feedback from a wide range of individuals and organizations, including security researchers, politicians, policy groups, and even some Apple employees.

The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

In its response to the announced delay, the EFF said it was "pleased Apple is now listening to the concerns" of users, but "the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely."

The statement by the digital rights group reiterated its previous criticisms of the intended features, which it has called "a decrease in privacy for all iCloud Photos users, not an improvement," and warned that Apple's move to scan messages and iCloud Photos could be legally required by authoritarian governments to encompass additional materials.

It also highlighted the negative reaction to Apple's announced plans by noting a number of petitions that have been organized in opposition to the intended move.

The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children. This week, EFF's petition to Apple demanding they abandon their plans reached 25,000 signatures. This is in addition to other petitions by groups such as Fight for the Future and OpenMedia, totalling well over 50,000 signatures. The enormous coalition that has spoken out will continue to demand that user phones—both their messages and their photos—be protected, and that the company maintain its promise to provide real privacy to its users.

The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It's not clear when Apple plans to roll out the "critically important" features or how it intends to "improve" them in light of so much criticism, but the company still appears determined to roll them out in some form.


Top Rated Comments

Nuvi Avatar
37 months ago

I don’t know how apple are seen as the bad guy for trying to improve reporting and protection here.

The EFF don’t seem to be proposing any alternative solution.
You honestly believe private companies should start monitoring their users? You want Apple Police, Microsoft Police, Google Police etc. of gathering info for some other private organisation so they can use it to their own purpose? Shouldn’t we leave hunting down the criminals to governments and law enforcement agencies and not to some shady groups who are not governed by the laws like law enforcement?
Score: 98 Votes
Porco Avatar
37 months ago

I don’t know how apple are seen as the bad guy for trying to improve reporting and protection here.
Literally in the article you responded to:

”for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
Score: 87 Votes
Saturn007 Avatar
37 months ago

Ah, the slippery slope fallacy.
It's no fallacy. It's how freedoms and privacy are eroded, and how authoritarian, fascist governments come to power.


When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.
When Apple receives this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know it's doing something wrong.

There. Fixed it for you!
Score: 63 Votes
DBZmusicboy01 Avatar
37 months ago
Tim Cook is acting like the bad guy from Titanic who used the little girl to get into the lifeboat? That's how I feel Apple is trying to do. To use children as an excuse when in reality it's for other reasons they want to spy on us.
Score: 38 Votes
benh911f Avatar
37 months ago
“warned that Apple's move to scan messages and iCloud Photos could be legally required by authoritarian governments to encompass additional materials.”

They keep saying “authoritarian” governments in these articles. I can’t think of any government anymore that’s NOT authoritarian.
Score: 34 Votes
matrix07 Avatar
37 months ago

1: privacy issue if it means children are protected. Those who question the privacy asspect of the issue I would have to question why because do they not want children to be protected?, therefore a persons privacy is more important than the protection of children? That concept is appalling to me. A childs protection comes before my privacy.
For a million times: it’s not Apple job to do this!

Want to protect our children? Either donate fund to FBI team who’s dealing with this issue or talk secretly to Congress to pass a law requires ALL who store our photo to scan for CSAM. Apart from these GTFO of my devices!

I’m a paying customer. I don’t like being point finger at ‘Hey! Let me check you. You COULD be a criminal’. This is not a way to treat your loyal customer.
Score: 27 Votes