Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'

An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters).


"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups wrote in the letter.

Some signatories of the letter, organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT), are concerned that Apple's on-device CSAM scanning system could be subverted in nations with different legal systems to search for political or other sensitive content.

"Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit," reads the letter.

The letter also calls on Apple to abandon planned changes to iMessage in family accounts, which would try to identify and blur nudity in children's messages, letting them view it only if parents are notified. The signatories claim that not only could the step endanger children in intolerant homes or those seeking educational material, it would also break end-to-end encryption for iMessage.

Some signatories come from countries in which there are already heated legal battles over digital encryption and privacy rights, such as Brazil, where WhatsApp has been repeatedly blocked for failing to decrypt messages in criminal probes. Other signers are based in India, Mexico, Germany, Argentina, Ghana and Tanzania. Groups that have also signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Apple's plan to detect known CSAM images stored in iCloud Photos has been particularly controversial and has prompted concerns from security researchers, academics, privacy groups, and others about the system potentially being abused by governments as a form of mass surveillance. The company has tried to address concerns by publishing additional documents and a FAQ page explaining how the image-detection system will work and arguing that the risk of false detections is low.
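
The matching scheme Apple's documents describe — hashing each photo headed for iCloud, comparing against a fixed database of known CSAM hashes, and only surfacing an account for human review past a match threshold (Apple's threat-model document put the initial threshold at 30) — can be sketched roughly as follows. This is a simplified illustration only: it uses an ordinary cryptographic hash in place of Apple's NeuralHash perceptual hash and omits the private set intersection protocol, and all function and variable names here are hypothetical.

```python
import hashlib

# Illustrative stand-in for the database of flagged hashes. In Apple's
# design this database is blinded and shipped to the device in a form
# the device cannot read directly.
KNOWN_CSAM_HASHES = {"a" * 64}

# Apple's threat-model document stated an initial threshold of 30 matches
# before any account is surfaced for human review.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; plain SHA-256 here for illustration.
    (A real perceptual hash tolerates resizing/recompression; SHA-256 does not.)"""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photo_library: list[bytes]) -> int:
    """Count photos whose hash appears in the known-hash database."""
    return sum(1 for img in photo_library if image_hash(img) in KNOWN_CSAM_HASHES)

def should_flag_account(photo_library: list[bytes]) -> bool:
    """An account is only flagged once the match count crosses the threshold,
    which is the basis of Apple's low-false-positive argument."""
    return count_matches(photo_library) >= MATCH_THRESHOLD
```

The threshold is the load-bearing part of Apple's false-positive argument: a single accidental hash collision does nothing, because review is triggered only by the aggregate count.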

Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although, as Reuters points out, it has not said that it would pull out of a market rather than obey a court order.

Top Rated Comments

stringParameter
10 weeks ago
Obviously the start of something very sinister here. I just didn't expect Apple to be the ones leading the way :/
Score: 81 Votes
dragje
10 weeks ago

> Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although as Reuters points out, it has not said that it would pull out of a market rather than obeying a court order.

Exactly what Reuters rightfully points out. Even if Apple's intentions are 100% good, this system does create a backdoor: under the laws of any given country, Apple could be forced by court order to look for images of protestors, or political symbols, and to filter out political protestors for purposes that are not good.

I'm surprised to see Apple doing this because they seem to be the front runners of this whole privacy mantra. It contradicts everything Apple stands for.

I also find it hard to believe that Apple would pull all of their iPhones out of China if the Chinese government ordered it to search for content like that.
Score: 81 Votes
Grey Area
10 weeks ago

> Again? Wasn't this article already posted?
>
> Anyway, it's pretty much misleading from the start: it's not a backdoor in any technical sense; worse-for-privacy cloud-side scanning is already taking place at other photo library providers; and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are checked.
>
> Perhaps the signatories should read the relevant technical documents and FAQs:
>
> https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
> https://www.apple.com/child-safety/pdf/Expanded_Protections_for_Children_Frequently_Asked_Questions.pdf
> https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf
> https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf

The open letter was published today, so no, this article was not posted earlier.
The open letter was published today, so no, this article was not posted earlier.

Maybe something similar was, and if so, great: more and more organizations are protesting. This will not just go away quietly. I am also glad that these protests come despite the matter involving CSAM, a sensitive topic that normally makes it easy to push through whatever measures are proposed. That so many have the courage to speak out against Apple here indicates that Apple crossed a serious line, and that "think of the children" is wearing thin as an excuse.

The technical documents do not address the core objections in any satisfying way. Many people, including experts, have read these documents and still oppose the new system.
Score: 60 Votes
Agit21
10 weeks ago
„build surveillance capabilities into iPhones, iPads, and other products“

That’s exactly what this new “feature” is, Tim!
Score: 48 Votes
Wildkraut
10 weeks ago
w00t, unbelievable, these “Screeching Voices of the Minority.”

But I’m sure there are still reasons to side with Apple. Apple is never wrong, Daddy Tim just wants our best.
Score: 43 Votes
sanook997
10 weeks ago
Regardless of the outcome, I will never feel the same about security with Apple products as I have previously. They can do anything they want in the cloud, but not on my phone.
Score: 35 Votes
