Apple Remains Committed to Launching New Child Safety Features Later This Year

Last week, Apple previewed new child safety features that it said will be coming to the iPhone, iPad, and Mac with software updates later this year. The company said the features will be available in the U.S. only at launch.

A refresher on Apple's new child safety features from our previous coverage:

First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when they receive or send sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.
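Apple has not published the implementation, but the decision flow it described can be sketched roughly as follows. This is an illustrative sketch only: the classifier, the confidence threshold, and all function names are hypothetical; the key point is that the analysis runs entirely on-device, with nothing sent to Apple's servers.

```python
# Hypothetical sketch of the on-device Communication Safety flow Apple
# described: classify the attachment locally, then blur and warn if it
# is judged sexually explicit. The classifier and threshold are stand-ins.

def handle_incoming_attachment(image_bytes: bytes, classify) -> dict:
    """Decide locally whether to blur an attachment and warn the child."""
    score = classify(image_bytes)   # on-device ML model, runs locally
    if score >= 0.9:                # hypothetical confidence threshold
        return {"display": "blurred", "warn_child": True}
    return {"display": "normal", "warn_child": False}

# Example with a stand-in classifier that flags the image:
result = handle_incoming_attachment(b"...", classify=lambda img: 0.95)
```

In Apple's description, both branches are decided on the device itself; the blur-and-warn outcome never involves transmitting the photo or the classification result off the device.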

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will apply only to photos being uploaded to iCloud Photos, not to videos.
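At a high level, this is hash matching against a known database rather than content classification. Apple's actual system uses a perceptual "NeuralHash" combined with cryptographic threshold techniques; the sketch below is a deliberately simplified stand-in, using an ordinary cryptographic hash and a plain set lookup, with a hypothetical match threshold before anything is flagged for human review.

```python
# Simplified, hypothetical illustration of hash-based CSAM matching.
# A real perceptual hash tolerates small image edits; sha256 here is
# only a stand-in to show the matching structure, not the technique.
import hashlib

def count_matches(photos: list, known_hashes: set) -> int:
    """Count photos whose digest appears in the known-hash database."""
    digests = (hashlib.sha256(p).hexdigest() for p in photos)
    return sum(1 for d in digests if d in known_hashes)

def should_flag_for_review(photos: list, known_hashes: set,
                           threshold: int = 30) -> bool:
    """Flag an account only once matches exceed a threshold (value hypothetical)."""
    return count_matches(photos, known_hashes) >= threshold
```

The threshold step matters to the design Apple described: a single accidental hash collision is not supposed to trigger anything; only an accumulation of matches leads to human review and a possible report to NCMEC.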

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Since announcing the plans last Thursday, Apple has received some pointed criticism, ranging from NSA whistleblower Edward Snowden claiming that Apple is "rolling out mass surveillance" to the non-profit Electronic Frontier Foundation claiming that the new child safety features will create a "backdoor" into the company's platforms.

"All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts," cowrote the EFF's India McKinney and Erica Portnoy. "That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change."

The concerns extend to the general public, with over 7,000 individuals having signed an open letter against Apple's so-called "privacy-invasive content scanning technology" that calls for the company to abandon its planned child safety features.

So far, the negative feedback does not appear to have led Apple to reconsider its plans. We confirmed with Apple today that the company has not changed its timeline for the new child safety features: they are still due later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. With the features not expected to launch for several weeks to months, though, the plans could still change.

Apple sticking to its plans will please several advocates, including Julie Cordua, CEO of the international anti-human trafficking organization Thorn.

"The commitment from Apple to deploy technology solutions that balance the need for privacy with digital safety for children brings us a step closer to justice for survivors whose most traumatic moments are disseminated online," said Cordua.

"We support the continued evolution of Apple's approach to child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute. "Given the challenges parents face in protecting their kids online, it is imperative that tech companies continuously iterate and improve their safety tools to respond to new risks and actual harms."

Top Rated Comments

jarman92 Avatar
11 weeks ago

> Those who are complaining obviously did not read how the technology works.
>
> You have a higher chance of winning the lottery than Apple erroneously looking through your photos.

I understand how it works, and I still hate it.
Score: 91 Votes
turbineseaplane Avatar
11 weeks ago
...and how long before the "scanning iMessages" isn't just for kids under parental controls?

I can't believe people are blind to the bad precedents and paths we are going down here.
Score: 81 Votes
jjack50 Avatar
11 weeks ago

> Those who are complaining obviously did not read how the technology works.
>
> You have a higher chance of winning the lottery than Apple erroneously looking through your photos.

That's not the problem. The big issue is creating a method that allows Apple to review content that's supposed to be private. It opens the possibility for someone to add 'reasons' to review any content they want to accuse a user of creating, saving, or sharing. Once it has been created and is active, the risk increases that others may figure out how to piggyback on that system and use a modification for their own purposes. All privacy is at risk then. This also sets up a situation where a court can then say "see, Apple does have a method to examine contents without permission from the device owners, therefore we can order Apple to allow an investigatory agency permission to access the content."

Not good.
Score: 75 Votes
benh911f Avatar
11 weeks ago
Apple with the most shocking heel turn since Hulk Hogan.
Score: 72 Votes
nvmls Avatar
11 weeks ago

> Those who are complaining obviously did not read how the technology works.
>
> You have a higher chance of winning the lottery than Apple erroneously looking through your photos.

Why are you defending Apple so desperately? Have some dignity, how the technology works is not even the point.
Score: 72 Votes
DHagan4755 Avatar
11 weeks ago
Apple needs to make one of its fancy dancy videos that solidly explains how this is all going to work. Their current roll-out of this is a PR disaster. It sounds creepy & even after reading about how it works, I'm still not enthralled with it.
Score: 56 Votes
