Apple Employees Internally Raising Concerns Over CSAM Detection Plans

Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out internally about how the technology could be used to scan users' photos for other types of content, according to a report from Reuters.

According to the report, an unspecified number of Apple employees have taken to internal Slack channels to raise concerns over CSAM detection. Specifically, employees are concerned that governments could force Apple to use the technology for censorship by scanning for content other than CSAM. Some employees are also worried that Apple is damaging its industry-leading reputation for privacy.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate are surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.

Ever since its announcement last week, Apple has been bombarded with criticism over its CSAM detection plans, which are still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns mainly center on how the technology could become a slippery slope, opening the door to broader scanning demands from oppressive governments and regimes.

Apple has firmly pushed back against the idea that the on-device technology used for detecting CSAM could be used for any other purpose. In a published FAQ document, the company says it will vehemently refuse any such demand from governments.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
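
The mechanism described in the FAQ boils down to on-device matching of photo hashes against a database of known CSAM hashes, with a match threshold and human review gating any report to NCMEC. The Swift sketch below is only a simplified illustration of that general flow; it is not Apple's implementation, which relies on NeuralHash, private set intersection, and threshold secret sharing, and every type and function name here is hypothetical.

```swift
// Highly simplified illustration only. Apple's real pipeline uses NeuralHash,
// private set intersection, and threshold secret sharing; none of that is
// reproduced here. All type and function names below are hypothetical.

import Foundation

struct SafetyVoucher {
    let photoID: UUID
    let matchedKnownHash: Bool
}

final class CSAMMatcher {
    /// Hashes of known CSAM images supplied by child safety organizations
    /// (in the real system these are blinded and not readable on device).
    private let knownHashes: Set<String>
    /// Number of matches required before anything is surfaced for human review.
    private let reviewThreshold: Int

    private var vouchers: [SafetyVoucher] = []

    init(knownHashes: Set<String>, reviewThreshold: Int = 30) {
        self.knownHashes = knownHashes
        self.reviewThreshold = reviewThreshold
    }

    /// Stand-in for a perceptual hash such as NeuralHash; a real perceptual
    /// hash is designed to survive resizing and recompression.
    private func perceptualHash(of photoData: Data) -> String {
        return String(photoData.hashValue)
    }

    /// Called before a photo is uploaded to iCloud Photos.
    func processForUpload(photoID: UUID, photoData: Data) {
        let hash = perceptualHash(of: photoData)
        let voucher = SafetyVoucher(photoID: photoID,
                                    matchedKnownHash: knownHashes.contains(hash))
        vouchers.append(voucher)
    }

    /// Nothing is reported unless the match count crosses the threshold,
    /// and even then a human reviewer confirms before any report to NCMEC.
    var shouldEscalateForHumanReview: Bool {
        vouchers.filter { $0.matchedKnownHash }.count >= reviewThreshold
    }
}
```

As the FAQ notes, accounts whose photos do not match known CSAM images are never flagged, and no report is filed without human review.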

An open letter criticizing Apple and calling upon the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.

Top Rated Comments

haruhiko, 15 weeks ago:
“We will refuse any such demands… except when it’s made into a law.” [insert whatever country name you hate here] will definitely force Apple to do so or Apple may lose access to that market.
Score: 84 Votes
ItWasNotMe, 15 weeks ago:
The road to hell is littered with good intent
Score: 75 Votes
Xenden, 15 weeks ago:
I really hope Apple reverses course on the CSAM stuff. No one wants child porn, but it’s easy to start with a universally reviled topic, then move on to topics that are controversial but not illegal.
Score: 72 Votes
TheYayAreaLiving ?, 15 weeks ago:
Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please.

Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in government and law enforcement hands.

1. How about CSAM scans Apple's executives' iPhones first? No one wants their privacy to be exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15?

https://www.cultofmac.com/744421/apple-announces-icloud-other-new-privacy-features/

Also, this guy needs to be FIRED from Apple. He is the mastermind behind CSAM. What a joke!

Does anyone here have a game plan on how we can stop this crappy CSAM feature?
Score: 72 Votes
zakarhino, 15 weeks ago:
"We will refuse any such demands... except when the FBI ask us not to encrypt iCloud Backups.

Oh, and when the CCP ask us to hand over iCloud decryption keys"
Score: 53 Votes
urbZZ, 15 weeks ago:
No iOS 15 upgrade for me, that's for sure
Score: 46 Votes
