Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out internally about how the technology could be repurposed to scan users' photos for other types of content, according to a report from Reuters.
The report says an unspecified number of Apple employees have taken to internal Slack channels to raise concerns over CSAM detection. Specifically, employees are concerned that governments could force Apple to use the technology for censorship by finding content other than CSAM, and some worry that Apple is damaging its industry-leading reputation for privacy.
Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.
Apple has firmly pushed back against the idea that the on-device technology used for detecting CSAM could be used for any other purpose. In a published FAQ document, the company says it will vehemently refuse any such demand by governments.
Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
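For readers unfamiliar with what a "hash list" means in broad strokes, here is a minimal Swift sketch of matching photo hashes against a known list with a review threshold. It is an illustration under stated assumptions only: the type and function names (ImageHash, loadKnownHashDatabase) and the threshold value are hypothetical, and Apple's actual system relies on its NeuralHash perceptual hash, a blinded on-device database, and private set intersection rather than a plain set lookup like this.

```swift
import Foundation

// Conceptual sketch only; not Apple's implementation.

/// A stand-in for a perceptual hash of a photo (hypothetical type).
typealias ImageHash = String

/// Placeholder for the database of hashes supplied by NCMEC and other
/// child-safety groups; in reality it ships in blinded, encrypted form.
func loadKnownHashDatabase() -> Set<ImageHash> {
    return []
}

let knownHashes = loadKnownHashDatabase()

/// Hypothetical number of matches required before any human review occurs.
let matchThreshold = 30

/// Counts how many of a user's photo hashes appear in the known-hash list.
func matchCount(for photoHashes: [ImageHash]) -> Int {
    photoHashes.filter { knownHashes.contains($0) }.count
}

/// Only an account whose match count crosses the threshold would ever be
/// escalated for human review; photos that do not match are never flagged.
func shouldEscalateForHumanReview(_ photoHashes: [ImageHash]) -> Bool {
    matchCount(for: photoHashes) >= matchThreshold
}
```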
An open letter criticizing Apple and calling upon the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.
"We will refuse any such demands"… except when it's made into law. [insert whatever country name you hate here] will definitely force Apple to do so, or Apple may lose access to that market.
I really hope Apple reverses course on the CSAM stuff. No one wants child porn, but it's easy to start with a universally reviled topic, then move on to topics that are controversial but not illegal.
Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please.
Where are you, Craig? You said this yourself at WWDC 2021. What happened?
“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”
This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it to government and law enforcement.
1. How about CSAM scanning Apple executives' iPhones first? No one wants their privacy to be exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.
2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost to the release date of iOS 15...