EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout.

Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative feedback from a wide range of individuals and organizations, including security researchers, politicians, policy groups, and even some Apple employees.

The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

In its response to the announced delay, the EFF said it was "pleased Apple is now listening to the concerns" of users, but "the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely."

The statement by the digital rights group reiterated its previous criticisms of the intended features, which it has called "a decrease in privacy for all iCloud Photos users, not an improvement," and warned that Apple's move to scan messages and iCloud Photos could be legally required by authoritarian governments to encompass additional materials.

It also highlighted the negative reaction to Apple's announced plans by noting a number of petitions that have been organized in opposition to the intended move.

The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children. This week, EFF's petition to Apple demanding they abandon their plans reached 25,000 signatures. This is in addition to other petitions by groups such as Fight for the Future and OpenMedia, totalling well over 50,000 signatures. The enormous coalition that has spoken out will continue to demand that user phones—both their messages and their photos—be protected, and that the company maintain its promise to provide real privacy to its users.

The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It's not clear when Apple plans to roll out the "critically important" features or how it intends to "improve" them in light of so much criticism, but the company still appears determined to roll them out in some form.

Top Rated Comments

Nuvi
7 weeks ago

I don’t know how apple are seen as the bad guy for trying to improve reporting and protection here.

The EFF don’t seem to be proposing any alternative solution.
You honestly believe private companies should start monitoring their users? You want Apple Police, Microsoft Police, Google Police etc. of gathering info for some other private organisation so they can use it to their own purpose? Shouldn’t we leave hunting down the criminals to governments and law enforcement agencies and not to some shady groups who are not governed by the laws like law enforcement?
Score: 98 Votes
Porco
7 weeks ago

I don’t know how apple are seen as the bad guy for trying to improve reporting and protection here.
Literally in the article you responded to:

”for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
Score: 87 Votes
Saturn007
7 weeks ago

Ah, the slippery slope fallacy.
It's no fallacy. It's how freedoms and privacy are eroded, and how authoritarian, fascist governments come to power.


When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.
When Apple receives this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know it's doing something wrong.

There. Fixed it for you!
Score: 63 Votes
DBZmusicboy01
7 weeks ago
Tim Cook is acting like the bad guy from Titanic who used the little girl to get into the lifeboat? That's how I feel Apple is trying to do. To use children as an excuse when in reality it's for other reasons they want to spy on us.
Score: 38 Votes
benh911f
7 weeks ago
“warned that Apple's move to scan messages and iCloud Photos could be legally required by authoritarian governments to encompass additional materials.”

They keep saying “authoritarian” governments in these articles. I can’t think of any government anymore that’s NOT authoritarian.
Score: 34 Votes
matrix07
7 weeks ago

1: privacy issue if it means children are protected. Those who question the privacy asspect of the issue I would have to question why because do they not want children to be protected?, therefore a persons privacy is more important than the protection of children? That concept is appalling to me. A childs protection comes before my privacy.
For a million times: it’s not Apple job to do this!

Want to protect our children? Either donate fund to FBI team who’s dealing with this issue or talk secretly to Congress to pass a law requires ALL who store our photo to scan for CSAM. Apart from these GTFO of my devices!

I’m a paying customer. I don’t like being point finger at ‘Hey! Let me check you. You COULD be a criminal’. This is not a way to treat your loyal customer.
Score: 27 Votes
