Apple Child Safety Features
By MacRumors Staff
Apple Child Safety Features Articles

Apple Remains Silent About Plans to Detect Known CSAM Stored in iCloud Photos
It has now been over a year since Apple announced plans for three new child safety features, including a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent about its plans for the CSAM...

European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material on Digital Platforms
The European Commission is set to release a draft law this week that could require tech companies like Apple and Google to identify, remove and report to law enforcement illegal images of child abuse on their platforms, claims a new report out today.
According to a leak of the proposal obtained by Politico, the EC believes voluntary measures taken by some digital companies have thus far...

Apple's Messages Communication Safety Feature for Kids Expanding to the UK, Canada, Australia, and New Zealand
Apple is planning to expand its Communication Safety in Messages feature to the UK, according to The Guardian. Communication Safety in Messages was introduced in the iOS 15.2 update released in December, but the feature has been limited to the United States until now.
Communication Safety in Messages is designed to scan incoming and outgoing iMessage images on children's devices for nudity...

Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated]
Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.
Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material...

Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study
More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).
The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called...

EFF Flew a Banner Over Apple Park During Last Apple Event to Protest CSAM Plans
In protest of the company's now delayed CSAM detection plans, the EFF, which has been vocal about Apple's child safety features plans in the past, flew a banner over Apple Park during the iPhone 13 event earlier this month with a message for the Cupertino tech giant.
During Apple's fully digital "California streaming" event on September 14, which included no physical audience attendance in...

EFF Pressures Apple to Completely Abandon Controversial Child Safety Features
The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout.
Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative...

Apple Delays Rollout of Controversial Child Safety Features to Make Improvements
Apple has delayed the rollout of the Child Safety Features that it announced last month following negative feedback, the company has today announced.
The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance...

University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology
Respected university researchers are sounding the alarm bells over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling the technology "dangerous."
Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, as well as Anunay Kulshrestha, a researcher at Princeton University...

Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'
An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters).
"Though these capabilities are intended to protect...

Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection
Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Material (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.
Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was hidden, and he rebuilt it in Python. After he uploaded his...
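The "hash collisions" at issue here stem from how perceptual hashes work: unlike cryptographic hashes, they are designed so that visually similar images produce the same digest, which also makes engineered collisions possible. The following is a minimal illustrative sketch using a simple "average hash" — it is not Apple's NeuralHash (which is built on a neural network), and the function and image data are hypothetical, but it shows the collision-tolerant principle the research community was probing:

```python
# Sketch of a perceptual "average hash": each bit records whether a pixel
# is brighter than the image's mean, so small pixel tweaks usually leave
# the digest unchanged. Illustrative only -- NOT Apple's NeuralHash.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# A tiny 4x4 "image" and a slightly perturbed copy (one pixel nudged by 5).
original = [
    [200, 200,  50,  50],
    [200, 200,  50,  50],
    [ 50,  50, 200, 200],
    [ 50,  50, 200, 200],
]
perturbed = [row[:] for row in original]
perturbed[0][0] = 205  # small change stays on the same side of the mean

# Both images map to the same 16-bit digest: a deliberate "collision".
print(average_hash(original) == average_hash(perturbed))  # True
```

This tolerance is a feature for matching near-duplicate images, but it is also why researchers were able to construct unrelated inputs that collide under the reverse-engineered NeuralHash.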

German Politician Asks Apple CEO Tim Cook to Abandon CSAM Scanning Plans
Member of the German parliament Manuel Höferlin, who serves as the chairman of the Digital Agenda committee in Germany, has penned a letter to Apple CEO Tim Cook, pleading with Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) images later this year.
In the two-page letter (via iFun), Höferlin said that he first applauds Apple's efforts to...

Corellium Launching New Initiative to Hold Apple Accountable Over CSAM Detection Security and Privacy Claims
Security research firm Corellium this week announced it is launching a new initiative that will "support independent public research into the security and privacy of mobile applications," and one of the initiative's first projects will be Apple's recently announced CSAM detection plans.
Since its announcement earlier this month, Apple's plan to scan iPhone users' photo libraries for CSAM or...

Apple Outlines Security and Privacy of CSAM Detection System in New Document
Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.
Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has been particularly controversial and has prompted concerns from...

Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards
Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM).
Federighi admitted that Apple...

Apple Employees Internally Raising Concerns Over CSAM Detection Plans
Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM or child sexual abuse material, reportedly speaking out internally about how the technology could be used to scan users' photos for other types of content, according to a report from Reuters.
According to Reuters, an unspecified number of Apple employees ...

Apple Remains Committed to Launching New Child Safety Features Later This Year
Last week, Apple previewed new child safety features that it said will be coming to the iPhone, iPad, and Mac with software updates later this year. The company said the features will be available in the U.S. only at launch.
A refresher on Apple's new child safety features from our previous coverage:First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac...

Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns
Apple's Head of Privacy, Erik Neuenschwander, has responded to some users' concerns around the company's plans for new child safety features that will scan messages and Photos libraries, in an interview with TechCrunch.
When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got ...

Facebook's Former Security Chief Discusses Controversy Around Apple's Planned Child Safety Features
Amid the ongoing controversy around Apple's plans to implement new child safety features that would involve scanning messages and users' photo libraries, Facebook's former security chief, Alex Stamos, has weighed in on the debate with criticisms of multiple parties involved and suggestions for the future.
In an extensive Twitter thread, Stamos said that there are "no easy answers" in the...

Apple Open to Expanding New Child Safety Features to Third-Party Apps
Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.
As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8.
Apple's New Child ...