
Apple Open to Expanding New Child Safety Features to Third-Party Apps

Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.

As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8.

Apple's New Child Safety Features

First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when receiving or sending sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.
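Apple has not published the implementation of Communication Safety, so the flow it describes can only be sketched with hypothetical names. The toy version below models the described behavior: an on-device classifier scores an incoming image locally, and past a threshold the photo is blurred and a warning is shown, with the photo never leaving the device for analysis.

```python
# Illustrative sketch only: Apple has not published Communication Safety's
# implementation, and every name here is hypothetical. It models the flow
# the article describes: an on-device classifier scores an incoming image,
# and past a threshold the photo is blurred and the child is warned.

from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # hypothetical decision threshold


@dataclass
class IncomingImage:
    data: bytes
    blurred: bool = False
    warning_shown: bool = False


def classify_explicit(data: bytes) -> float:
    """Stand-in for the on-device ML model; returns a score in [0, 1]."""
    # A real implementation would run a trained image classifier locally,
    # so the photo is never sent off-device for analysis.
    return 0.95 if b"explicit" in data else 0.05


def handle_incoming(image: IncomingImage) -> IncomingImage:
    if classify_explicit(image.data) >= EXPLICIT_THRESHOLD:
        image.blurred = True        # blur the photo before display
        image.warning_shown = True  # show the warning to the child
    return image
```

Note the design point the sketch preserves: the classification decision happens entirely on the device, which is central to Apple's privacy framing of the feature.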

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos and not videos.
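Apple has described this system as matching image hashes against a database of known CSAM hashes before upload; the published design uses a perceptual hash (NeuralHash) plus a cryptographic threshold scheme so that individual matches stay hidden until a threshold is crossed. The toy sketch below deliberately drops all of that machinery and uses a plain SHA-256 lookup, purely to illustrate the "match against known hashes at upload time" idea; the database contents are made up.

```python
# Toy sketch of hash-database matching. Apple's actual system uses a
# perceptual hash (NeuralHash) plus a cryptographic threshold scheme so
# individual matches stay hidden; this simplified version uses a plain
# SHA-256 lookup purely to illustrate matching against known hashes at
# upload time. The database contents here are made up.

import hashlib

KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}


def matches_known_database(image_bytes: bytes) -> bool:
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES


def on_icloud_photo_upload(image_bytes: bytes) -> str:
    # Per Apple, only photos being uploaded to iCloud Photos are checked,
    # and videos are not scanned.
    if matches_known_database(image_bytes):
        return "flagged"
    return "uploaded"
```

A key difference from this sketch is that a cryptographic SHA-256 digest changes completely if a single pixel changes, whereas a perceptual hash is designed to match visually identical images across re-encoding and resizing, which is why Apple's system uses the latter.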

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Expansion to Third-Party Apps

Apple said that while it has nothing to announce today, it considers expanding the child safety features to third parties, so that users are even more broadly protected, a desirable goal. Apple did not provide any specific examples, but one possibility could be making the Communication Safety feature available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.

Another possibility is that Apple's known-CSAM detection system could be expanded to third-party apps that upload photos to services other than iCloud Photos.

Apple did not provide a timeframe for when the child safety features could expand to third parties, noting that it still has to complete testing and deployment of the features, and the company also said it would need to ensure that any potential expansion would not undermine the privacy properties or effectiveness of the features.

Broadly speaking, Apple said that expanding features to third parties has been the company's general approach ever since it first opened the platform to third-party apps with the App Store on iPhone OS 2 in 2008.


Top Rated Comments

crawfish963
60 months ago
This is getting worse and worse. No way this will backfire….
Score: 85 Votes
TheYayAreaLiving 🎗️
60 months ago
Third-party apps? Come on now. Apple is doing the MOST now. Imagine if Facebook gets ahold of the photos/information. Doesn't WhatsApp belong to Facebook? SMH.

This is just getting creepier and creepier. What happened to this, Apple?

[Attachment image]
Score: 78 Votes
Naraxus
60 months ago
Apple used to be about privacy and security. Not anymore. Apple has no more high ground to stand on.
Score: 71 Votes
60 months ago
Complete blatant invasion of privacy no matter how you spin the benefits of it. Hope this severely backfires.
Score: 67 Votes
HiVolt
60 months ago
Apple wouldn't decrypt a freakin' terrorist's phone to help the investigation. Yet they are doing this on a massive scale now.
Score: 52 Votes
60 months ago

> Or you know, just don't have iCloud Photos turned on.
> Or be like 99.999% of people, and don't be worried about features that will not ever apply to you.

That's not how this works..... that's not how any of this works!

haha... anyways - this argument is super weak and just begging for exploits/issues. Typical weak-ass defense of mass surveillance. "I'm not hiding anything, why do I care if the police randomly pull me over and throw me out of my car and search it." This thinking rapidly escalates and it's a VERY slippery slope and hard to turn back from.
Score: 43 Votes