Apple Open to Expanding New Child Safety Features to Third-Party Apps

Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.

As a refresher, Apple unveiled three new child safety features coming in future updates to iOS 15, iPadOS 15, macOS Monterey, and watchOS 8.

Apple's New Child Safety Features

First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when receiving or sending sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.
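The flow described above can be sketched as follows. This is a minimal illustration only: the on-device model, its output format, and its threshold are not public, so the classifier score, the `EXPLICIT_THRESHOLD` value, and all names here are hypothetical stand-ins. The key property the sketch preserves is that the decision is made entirely from a local score, with nothing sent off the device.

```python
# Hypothetical sketch of the Communication Safety decision flow.
# The score and threshold are placeholders, not Apple's actual model.

from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # hypothetical confidence cutoff


@dataclass
class IncomingAttachment:
    filename: str
    explicit_score: float  # 0.0-1.0, produced by an on-device model
    blurred: bool = False
    child_warned: bool = False


def screen_attachment(attachment: IncomingAttachment) -> IncomingAttachment:
    """Blur the photo and warn the child if the local model flags it.

    Only the on-device score is consulted; no data leaves the device.
    """
    if attachment.explicit_score >= EXPLICIT_THRESHOLD:
        attachment.blurred = True
        attachment.child_warned = True
    return attachment
```

A photo scoring below the cutoff is displayed unchanged; one at or above it is blurred and the child sees a warning before viewing it.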

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos and not videos.
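The "known CSAM" detection described above is a matching problem: an image's fingerprint is compared against a database of fingerprints of already-identified material, rather than any attempt to classify new images. The toy sketch below illustrates that hash-and-match idea with a simple 64-bit average hash and a Hamming-distance comparison; Apple's actual system uses its NeuralHash perceptual hash plus cryptographic threshold techniques and is far more sophisticated, so every function and parameter here is a simplification for illustration.

```python
# Toy illustration of known-image matching: fingerprint an image, then check
# whether the fingerprint is close to any entry in a database of known hashes.
# NOT Apple's NeuralHash; a simplified "average hash" stand-in.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale image: bit is 1 where pixel > mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known(image_hash, known_hashes, max_distance=5):
    """True if the hash is within max_distance bits of any known hash."""
    return any(hamming(image_hash, h) <= max_distance for h in known_hashes)
```

A near-duplicate of a known image (for example, slightly recompressed) produces a hash only a few bits away and matches, while an unrelated image falls well outside the distance threshold.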

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Expansion to Third-Party Apps

Apple said that while it has no announcement to make today, expanding the child safety features to third parties so that users are even more broadly protected would be a desirable goal. Apple did not provide any specific examples, but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.

Another possibility is that Apple's known CSAM detection system could be expanded to third-party apps that upload photos to services other than iCloud Photos.

Apple did not provide a timeframe for when the child safety features could expand to third parties, noting that it still has to complete testing and deployment of the features, and the company also said it would need to ensure that any potential expansion would not undermine the privacy properties or effectiveness of the features.

Broadly speaking, Apple said that expanding features to third parties has been the company's general approach ever since it added support for third-party apps with the launch of the App Store on iPhone OS 2 in 2008.

Top Rated Comments

crawfish963 Avatar
53 months ago
This is getting worse and worse. No way this will backfire….
Score: 85 Votes (Like | Disagree)
TheYayAreaLiving ?️ Avatar
53 months ago
Third-party Apps? Come on now. Apple is doing the MOST now. Imagine if Facebook gets ahold of the photos/information. Isn't Whatsup App belong to Facebook? SMH.

This is just getting creepier and creepier. What happened to this, Apple?

Score: 78 Votes (Like | Disagree)
Naraxus Avatar
53 months ago
Apple used to be about privacy and security. Not any more. Apple has no more highground to stand on.
Score: 71 Votes (Like | Disagree)
DJJAZZYJET Avatar
53 months ago
Complete blatant invasion of privacy no matter how you spin the benefits of it. Hope this severely backfires.
Score: 67 Votes (Like | Disagree)
HiVolt Avatar
53 months ago
Apple wouldn't decrypt a freakin terrorists phone to help the investigation. Yet they are doing this on a massive scale now.
Score: 52 Votes (Like | Disagree)
millerb7 Avatar
53 months ago
> Or you know, just don't have iCloud photos turned on.
> Or be like 99.999% of people, and don't be worried about features that will not ever apply to you

That's not how this works..... that's not how any of this works!

haha... anyways - this argument is super weak and just begging for exploits/issues. Typical weak-ass argument against mass surveillance. "I'm not hiding anything, why do I care if the police randomly pull me over and throw me out of my car and search it." This thinking rapidly escalates and it's a VERY slippery slope and hard to turn back from.
Score: 43 Votes (Like | Disagree)