
iOS 17 Expands Communication Safety Worldwide, Turned On by Default

Starting with iOS 17, iPadOS 17, and macOS Sonoma, Apple is making Communication Safety available worldwide. The previously opt-in feature will now be turned on by default for children under the age of 13 who are signed in to their Apple ID and part of a Family Sharing group. Parents can turn it off in the Settings app under Screen Time.

Communication Safety first launched in the U.S. with iOS 15.2 in December 2021, and has since expanded to Australia, Belgium, Brazil, Canada, France, Germany, Italy, Japan, the Netherlands, New Zealand, South Korea, Spain, Sweden, and the U.K. With the software updates coming later this year, Apple is making the feature available globally.

Communication Safety is designed to warn children when they receive or send photos that contain nudity in the Messages app. Apple is expanding the feature in iOS 17, iPadOS 17, and macOS Sonoma to cover video content, and it will also work for AirDrop content, FaceTime video messages, and Contact Posters in the Phone app.

When the feature is enabled, photos and videos containing nudity are automatically blurred in supported apps, and the child will be warned about viewing sensitive content. The warning also provides children with ways to get help. Apple is making a new API available that will allow developers to support Communication Safety in their App Store apps.
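Apple has not spelled out the developer API in this announcement, but it corresponds to the SensitiveContentAnalysis framework introduced alongside iOS 17. A minimal sketch of how a messaging app might adopt it, assuming the `SCSensitivityAnalyzer` interface from Apple's developer documentation (treat the exact names as illustrative):

```swift
import SensitiveContentAnalysis

/// Checks a received image file for nudity before displaying it,
/// mirroring what Communication Safety does in Messages.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis is only available when the user (or a parent, via
    // Screen Time) has enabled the relevant safety setting.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // All analysis happens on-device; the image never leaves the device.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Fail open or closed per your app's policy; here we blur on error.
        return true
    }
}
```

An app would call this before rendering an incoming attachment and, if it returns true, show a blurred placeholder with a warning similar to the system one.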

Apple says Communication Safety uses on-device processing to detect photos and videos containing nudity, ensuring that Apple and third parties cannot access the content, and that end-to-end encryption is preserved in the Messages app.

iOS 17, iPadOS 17, and macOS Sonoma will be released later this year. The updates are currently available in beta for users with an Apple developer account.



Top Rated Comments

36 months ago
Cue the tons of people who confuse what this feature is and does.
Score: 7 Votes (Like | Disagree)
Apple Fan 2008
36 months ago
Having a nudity blocker be opt-in for kids was weird anyway. Good decision to have it on by default.
Score: 5 Votes (Like | Disagree)
SDJim
36 months ago
As a parent I love these kinds of platform improvements.
Score: 5 Votes (Like | Disagree)
Apple Fan 2008
36 months ago

> I think the feature is fine/good, but the wording is so ... infantile. Or is that only shown for kids?

That’s only for kids.
Score: 3 Votes (Like | Disagree)
CarlJ
36 months ago

> Essentially it is the same machine looking for something; be it sensitive images (whatever that means) or CSAM or union activity, it doesn't really matter, this machine looks for what someone told it to look for. Also, for children a notification is sent to the parents, IIRC, which infringes on the privacy of the children, especially if it is a false positive.
The two mechanisms are completely different. The CSAM scanning mechanism was never machine learning. It was looking for matches to a specific set of images already in the possession of NCMEC (National Center for Missing and Exploited Children), which is the only entity authorized to catalog such images. No “looking for things that look like naughty bits” — it was only looking for a specific set of images. The technical paper (https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf) that explains the mechanism is freely available.

This mechanism is entirely different from the CSAM detection mechanism, and it does look for nudity, with machine learning. If it finds something it thinks might be that, it tells the person holding the phone, right at the point of being about to view the image. The notion of sending messages to the parents was removed very early on, when it was pointed out that some kids might be in unsafe situations (like, say, parents who would harm their kids if they found out their kid was gay). So, it isn't sending a notification to anybody; it's just asking the kid if they really want to see the image, that's all.
Score: 3 Votes (Like | Disagree)
Cinder6
36 months ago
I think the feature is fine/good, but the wording is so ... infantile. Or is that only shown for kids?
Score: 2 Votes (Like | Disagree)