Apple's New Feature That Scans Messages for Nude Photos Is Only for Children, Parental Notifications Limited to Kids Under 13

Apple today announced a series of new child safety initiatives, coming alongside the iOS 15, iPadOS 15, and macOS Monterey updates, that are aimed at keeping children safer online.

One of the new features, Communication Safety, has raised privacy concerns because it allows images sent and received by the Messages app to be scanned for sexually explicit content. Apple has confirmed, however, that this is an opt-in feature limited to children's accounts and that it must be enabled by parents through the Family Sharing feature.

If a parent turns on Communication Safety for a child's Apple ID account, the child's device will scan images sent and received in the Messages app for nudity. If nudity is detected, the photo is automatically blurred and the child is warned that it might contain private body parts.

"Sensitive photos and videos show the private body parts that you cover with bathing suits," reads Apple's warning. "It's not your fault, but sensitive photos and videos can be used to hurt you."

The child can choose to view the photo anyway, and for children under the age of 13, parents can opt to get a notification if their child taps through to view a blurred photo. "If you decide to view this, your parents will get a notification to make sure you're OK," reads the warning screen.

These parental notifications are optional and are available only when the child viewing the photo is under the age of 13. Parents cannot be notified when a child between the ages of 13 and 17 views a blurred photo, though children in that age range will still see the warning about sensitive content if Communication Safety is turned on.

Communication Safety cannot be enabled on adult accounts and is available only for users under the age of 18, so adults do not need to worry about their content being scanned for nudity.
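Taken together, the rules hinge only on the account's age and two parental switches. As a purely illustrative sketch (these type and function names are invented, not Apple API), the behavior described above maps to something like this in Swift:

```swift
// Illustrative sketch of the Communication Safety age rules described above.
// All names here are hypothetical, not part of any Apple framework.

enum SensitiveImagePolicy {
    case unavailable          // 18 and up: Communication Safety cannot be enabled
    case warnOnly             // 13-17: blur and warn, but never notify parents
    case warnAndNotifyParent  // under 13, if the parent opted in to notifications
}

func policy(age: Int,
            enabledByParent: Bool,
            parentalNotificationsOn: Bool) -> SensitiveImagePolicy {
    // The feature only exists for child accounts a parent has opted in.
    guard enabledByParent, age < 18 else { return .unavailable }
    return (age < 13 && parentalNotificationsOn) ? .warnAndNotifyParent : .warnOnly
}
```

Even when notifications are on, the warning always comes first: no parent is alerted unless the child actively chooses to view the blurred photo.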

Parents need to expressly opt in to Communication Safety when setting up a child's device with Family Sharing, and the feature can be disabled if a family chooses not to use it. Communication Safety uses on-device machine learning to analyze image attachments, and because the analysis happens on-device, the content of an iMessage is not readable by Apple and remains protected with end-to-end encryption.
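Apple has not detailed the classifier itself, but the on-device pattern it describes is familiar from the Vision and Core ML frameworks. Here is a minimal sketch of what such a check could look like, assuming a hypothetical Core ML model named NudityClassifier; the key property is that the image is analyzed locally and never leaves the device:

```swift
import CoreML
import Vision

// Hypothetical sketch of on-device sensitive-image detection.
// "NudityClassifier" is an invented Core ML model name; Apple has not
// published the model Communication Safety actually uses.
func imageLikelyContainsNudity(_ image: CGImage,
                               completion: @escaping (Bool) -> Void) {
    do {
        let coreMLModel = try NudityClassifier(configuration: MLModelConfiguration()).model
        let visionModel = try VNCoreMLModel(for: coreMLModel)
        let request = VNCoreMLRequest(model: visionModel) { request, _ in
            let observations = request.results as? [VNClassificationObservation] ?? []
            // Flag the image only at high classifier confidence.
            let flagged = observations.contains {
                $0.identifier == "explicit" && $0.confidence > 0.9
            }
            completion(flagged)
        }
        // All analysis happens locally; the image never leaves the device.
        try VNImageRequestHandler(cgImage: image).perform([request])
    } catch {
        completion(false) // On any failure, fall back to showing the image.
    }
}
```

Because a check like this runs only after Messages has decrypted the attachment locally, it requires no weakening of end-to-end encryption, which matches Apple's claim that message content stays unreadable to the company.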

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Ursadorable
24 months ago

I feel like those images are as disturbing, if not more disturbing, than unsolicited nudes for young children.
What about pictures of an old man sniffing little girls' hair? Those are disturbing, but the media keeps posting them.
Score: 25 Votes
eatrains
24 months ago

> Looks like Apple is being very transparent about it now.
>
> Apple's warning: "It's not your fault, but sensitive photos and videos can be used to hurt you."
>
> What in the world does that even mean? Don't take sensitive photos/videos?

They were never not transparent. It's not Apple's fault that irresponsible and uninformed reporters spread misinformation.
Score: 16 Votes
ipedro
24 months ago
A lot of folks are suddenly panicking today over Apple giving parents a tool to protect young children from predators.
Score: 15 Votes
jclo
24 months ago

> Looks like Apple is being very transparent about it now.
>
> Apple's warning: "It's not your fault, but sensitive photos and videos can be used to hurt you."
>
> What in the world does that even mean? Don't take sensitive photos/videos?

Not sure there's any ambiguity in Apple's alert. It's clearly warning children against nude photos sent or requested by predatory adults.
Score: 14 Votes
antiprotest
24 months ago

> This makes NO sense to me. This is the kind of feature that has the potential for false positives/negatives, and it's impossible to draw the line. So Apple decided to draw the line at letting 10-year-olds open explicit images while alerting their parents, but a 13-year-old can do the same and it will alert nobody. What the hell is going on at Apple? Who is making these decisions, and why have they not been fired before any of these stupid announcements?

Well, as the Apple workers said when they complained about going back to the office, they did some of their "best work" while working from home last year. So you're probably seeing some of that "best work". More to come, I'm sure.
Score: 11 Votes
InGen
24 months ago
Bravo, Apple! A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how it applies. This isn't Apple wanting to go through your private photos or intercept your private messages. This is a seemingly well-implemented use of onboard machine learning and algorithmic pattern recognition, done in a very anonymous way, to link known questionable content with a warning. There's no event recording or reporting to external sources or databases; it's simply on-device hash recognition that triggers a warning within parental features for child accounts, that's all!
Score: 11 Votes
