Apple's New Feature That Scans Messages for Nude Photos Is Only for Children, Parental Notifications Limited to Kids Under 13

Apple today announced a series of new child safety initiatives coming alongside the iOS 15, iPadOS 15, and macOS Monterey updates, all aimed at keeping children safer online.

One of the new features, Communication Safety, has raised privacy concerns because it allows Apple to scan images sent and received by the Messages app for sexually explicit content. Apple has confirmed, however, that this is an opt-in feature limited to children's accounts and that it must be enabled by parents through the Family Sharing feature.

If a parent turns on Communication Safety for the Apple ID account of a child, Apple will scan images that are sent and received in the Messages app for nudity. If nudity is detected, the photo will be automatically blurred and the child will be warned that the photo might contain private body parts.

"Sensitive photos and videos show the private body parts that you cover with bathing suits," reads Apple's warning. "It's not your fault, but sensitive photos and videos can be used to hurt you."

The child can choose to view the photo anyway, and for children who are under the age of 13, parents can opt to receive a notification if their child clicks through to view a blurred photo. "If you decide to view this, your parents will get a notification to make sure you're OK," reads the warning screen.

These parental notifications are optional and are only available when the child viewing the photo is under the age of 13. Parents cannot be notified when a child between the ages of 13 and 17 views a blurred photo, though children in that age range will still see the warning about sensitive content if Communication Safety is turned on.

Communication Safety cannot be enabled on adult accounts and is only available for users who are under the age of 18, so adults do not need to worry about their content being scanned for nudity.

Parents need to expressly opt in to Communication Safety when setting up a child's device with Family Sharing, and it can be disabled if a family chooses not to use it. The feature uses on-device machine learning to analyze image attachments, and because the analysis happens entirely on-device, the content of an iMessage is not readable by Apple and remains protected with end-to-end encryption.
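To make the age-gated logic described above concrete, here is a conceptual sketch of the decision flow. This is not Apple's actual implementation: the account model, function names, and the `is_sensitive` flag (standing in for the on-device classifier's verdict) are all hypothetical, chosen only to illustrate the rules reported in this article.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    communication_safety_enabled: bool    # opted in by a parent via Family Sharing
    parental_notifications_enabled: bool  # separate parental opt-in, under-13 only

def handle_incoming_image(account: ChildAccount, is_sensitive: bool,
                          child_chose_to_view: bool = False) -> dict:
    """Return which UI actions would apply for a received image.

    `is_sensitive` stands in for the on-device ML classifier's output;
    in this model no image data ever leaves the device.
    """
    result = {"blur": False, "warn": False, "notify_parent": False}
    # The feature is off by default and never applies to adult accounts.
    if not account.communication_safety_enabled or account.age >= 18:
        return result
    if is_sensitive:
        # Flagged photos are blurred and the child sees a warning.
        result["blur"] = True
        result["warn"] = True
        # Parents are notified only for children under 13, only if they
        # opted in, and only if the child taps through to view anyway.
        if (account.age < 13 and account.parental_notifications_enabled
                and child_chose_to_view):
            result["notify_parent"] = True
    return result
```

Note how a 13-to-17-year-old hits the blur and warning branches but can never trigger the parental notification, matching the behavior Apple described.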

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Ursadorable
11 weeks ago

I feel like those images are as disturbing, if not more disturbing, than unsolicited nudes for young children.
What about pictures of an old man sniffing little girls hair? Those are disturbing, but the media keeps posting them.
Score: 24 Votes (Like | Disagree)
eatrains
11 weeks ago

Looks like Apple is being very transparent about it now.

Apple's warning. "It's not your fault, but sensitive photos and videos can be used to hurt you."

What in the world does that even mean? Don't take sensitive photos/videos?
They were never not transparent. It's not Apple's fault irresponsible and uninformed reporters spread misinformation.
Score: 16 Votes (Like | Disagree)
ipedro
11 weeks ago
A lot of folks suddenly panicking today over Apple giving parents a tool to protect young children from predators.



Score: 15 Votes (Like | Disagree)
jclo
11 weeks ago

Looks like Apple is being very transparent about it now.

Apple's warning. "It's not your fault, but sensitive photos and videos can be used to hurt you."

What in the world does that even mean? Don't take sensitive photos/videos?
Not sure there's any ambiguity in Apple's alert. It's clearly warning children against nude photos sent by or requested by predatory adults.
Score: 14 Votes (Like | Disagree)
antiprotest
11 weeks ago

This makes NO sense to me. This is the kind of feature that has the potential for false positives/negatives and it’s impossible to draw the line. So Apple decided to draw the line at letting 10yos open explicit images but it can alert their parents, but a 13yo can do the same and it will alert nobody. What the hell is going on at Apple? Who is making these decisions and why have they not been fired before any of these stupid announcements?
Well, as the Apple workers said when they complained about going back to the office, they did some of their "best work" while working from home last year. So you're probably seeing some of that "best work". More to come I'm sure.
Score: 11 Votes (Like | Disagree)
InGen
11 weeks ago
Bravo Apple! A lot of the negative comments about privacy from people who didn’t read the fine print on who this applies to and how it applies. This isn’t Apple wanting to go through your private photos or intercepting your private messages. This is a seemingly well implemented utilisation of onboard machine learning and algorithmic pattern recognition done in a very anonymous way to link known questionable content with a warning, that’s all. There’s no event recording or reporting to external sources or databases, it’s simply an on-device hash recognition that triggers a warning within parental features for children accounts, that’s all!
Score: 11 Votes (Like | Disagree)

Related Stories


Apple Remains Committed to Launching New Child Safety Features Later This Year

Tuesday August 10, 2021 10:58 am PDT by
Last week, Apple previewed new child safety features that it said will be coming to the iPhone, iPad, and Mac with software updates later this year. The company said the features will be available in the U.S. only at launch. A refresher on Apple's new child safety features from our previous coverage: First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac...

Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Thursday August 5, 2021 12:00 pm PDT by
Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time. Communication Safety: First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature to warn children...

Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Monday August 9, 2021 1:50 am PDT by
Apple has published an FAQ titled "Expanded Protections for Children" which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features that the company announced last week. "Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of...

Apple Open to Expanding New Child Safety Features to Third-Party Apps

Monday August 9, 2021 11:00 am PDT by
Apple today held a questions-and-answers session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future. As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8. Apple's New Child ...

EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

Monday September 6, 2021 3:18 am PDT by
The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout. Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative...

Apple Delays Rollout of Controversial Child Safety Features to Make Improvements

Friday September 3, 2021 6:07 am PDT by
Apple has delayed the rollout of the Child Safety Features that it announced last month following negative feedback, the company has today announced. The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance...

Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

Friday October 15, 2021 12:23 am PDT by
More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times). The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called...

Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Friday August 13, 2021 6:33 am PDT by
Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photos libraries for Child Sexual Abuse Material (CSAM). Federighi admitted that Apple...

Facebook's Former Security Chief Discusses Controversy Around Apple's Planned Child Safety Features

Tuesday August 10, 2021 5:50 am PDT by
Amid the ongoing controversy around Apple's plans to implement new child safety features that would involve scanning messages and users' photos libraries, Facebook's former security chief, Alex Stamos, has weighed into the debate with criticisms of multiple parties involved and suggestions for the future. In an extensive Twitter thread, Stamos said that there are "no easy answers" in the...

EFF Flew a Banner Over Apple Park During Last Apple Event to Protest CSAM Plans

Friday September 24, 2021 2:06 am PDT by
In protest of the company's now delayed CSAM detection plans, the EFF, which has been vocal about Apple's child safety feature plans in the past, flew a banner over Apple Park during the iPhone 13 event earlier this month with a message for the Cupertino tech giant. During Apple's fully digital "California streaming" event on September 14, which included no physical audience attendance in...