Apple Open to Expanding New Child Safety Features to Third-Party Apps

Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.

As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8.

Apple's New Child Safety Features

First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when a child receives or sends sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned.
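Apple has not published the underlying API, so the following is only a minimal Swift sketch of the decision flow described above. The type names, the `explicitScore` function, and the threshold are all illustrative stand-ins, not Apple's implementation.

```swift
import Foundation

// Hypothetical sketch only: Apple has not published this API.
// A stand-in for the on-device machine learning model that scores
// image attachments; a real version would run a Core ML model.
struct ExplicitImageClassifier {
    /// Returns an assumed explicitness score in the range 0...1.
    func explicitScore(for imageData: Data) -> Double {
        return 0.0 // placeholder result
    }
}

enum AttachmentPresentation {
    case showNormally
    case blurWithWarning // blurred thumbnail plus a warning to the child
}

// All analysis happens on-device; the photo is never sent to a server.
// The 0.9 threshold is an arbitrary illustrative value.
func presentation(for imageData: Data,
                  classifier: ExplicitImageClassifier,
                  threshold: Double = 0.9) -> AttachmentPresentation {
    classifier.explicitScore(for: imageData) >= threshold
        ? .blurWithWarning
        : .showNormally
}
```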

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process applies only to photos being uploaded to iCloud Photos, not to videos.
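As a rough illustration of the mechanism, the Swift sketch below filters an upload queue against a set of fingerprints of known images. The `fingerprint` closure and hash set are stand-ins; Apple's actual system uses NeuralHash together with cryptographic safeguards (blinded hash tables, threshold secret sharing, and human review) that are omitted here.

```swift
import Foundation

// Heavily simplified sketch of known-image matching. Real CSAM detection
// uses a perceptual hash (NeuralHash) and cryptographic protocols rather
// than a plain set lookup; this only shows the overall shape of the check.
func photosToFlag(uploadQueue: [Data],
                  knownHashes: Set<String>,
                  fingerprint: (Data) -> String) -> [Data] {
    // Only photos queued for upload to iCloud Photos are checked;
    // nothing is scanned if iCloud Photos is disabled.
    uploadQueue.filter { knownHashes.contains(fingerprint($0)) }
}
```

In Apple's real design, a match is not reported immediately: only after an account crosses a threshold number of matches can Apple decrypt the associated safety vouchers and have a human review them before any report to NCMEC.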

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Expansion to Third-Party Apps

Apple said that while it has nothing to announce today, expanding the child safety features to third parties so that users are even more broadly protected would be a desirable goal. Apple did not provide any specific examples, but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.

Another possibility is that Apple's known CSAM detection system could be expanded to third-party apps that upload photos to services other than iCloud Photos.

Apple did not provide a timeframe for when the child safety features could expand to third parties, noting that it still has to complete testing and deployment of the features. The company also said it would need to ensure that any potential expansion would not undermine the privacy properties or effectiveness of the features.

Broadly speaking, Apple said that expanding features to third parties has been the company's general approach ever since it introduced support for third-party apps with the launch of the App Store on iPhone OS 2 in 2008.

Top Rated Comments

crawfish963
11 weeks ago
This is getting worse and worse. No way this will backfire….
Score: 85 Votes
TheYayAreaLiving
11 weeks ago
Third-party apps? Come on now. Apple is doing the MOST now. Imagine if Facebook gets ahold of the photos/information. Doesn't WhatsApp belong to Facebook? SMH.

This is just getting creepier and creepier. What happened to this, Apple?
Score: 78 Votes
Naraxus
11 weeks ago
Apple used to be about privacy and security. Not anymore. Apple has no high ground left to stand on.
Score: 71 Votes
DJJAZZYJET
11 weeks ago
Complete, blatant invasion of privacy no matter how you spin the benefits of it. Hope this severely backfires.
Score: 67 Votes
HiVolt
11 weeks ago
Apple wouldn't decrypt a freakin' terrorist's phone to help the investigation. Yet they are doing this on a massive scale now.
Score: 52 Votes
millerb7
11 weeks ago
> Or you know, just don’t have iCloud Photos turned on.
> Or be like 99.999% of people, and don’t be worried about features that will not ever apply to you.

That's not how this works… that's not how any of this works!

Haha... anyways, this argument is super weak and just begging for exploits/issues. Typical weak-ass argument against mass surveillance. "I'm not hiding anything, why do I care if the police randomly pull me over and throw me out of my car and search it." This thinking rapidly escalates and it's a VERY slippery slope and hard to turn back from.
Score: 43 Votes
