iOS 17 Expands Communication Safety Worldwide, Turned On by Default

Starting with iOS 17, iPadOS 17, and macOS Sonoma, Apple is making Communication Safety available worldwide. The previously opt-in feature will now be turned on by default for children under the age of 13 who are signed in to their Apple ID and part of a Family Sharing group. Parents can turn it off in the Settings app under Screen Time.

Communication Safety first launched in the U.S. with iOS 15.2 in December 2021, and has since expanded to Australia, Belgium, Brazil, Canada, France, Germany, Italy, Japan, the Netherlands, New Zealand, South Korea, Spain, Sweden, and the U.K. With the software updates coming later this year, Apple is making the feature available globally.

Communication Safety is designed to warn children when they receive or send photos containing nudity in the Messages app. In iOS 17, iPadOS 17, and macOS Sonoma, Apple is expanding the feature to cover video content, and it will also work with AirDrop, FaceTime video messages, and Contact Posters in the Phone app.

When the feature is enabled, photos and videos containing nudity are automatically blurred in supported apps, and the child will be warned about viewing sensitive content. The warning also provides children with ways to get help. Apple is making a new API available that will allow developers to support Communication Safety in their App Store apps.
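For developers, the new API is the SensitiveContentAnalysis framework Apple introduced alongside these updates. Below is a minimal sketch of how an app might check an incoming image before displaying it; the `shouldBlur` wrapper and its fail-closed error handling are illustrative choices, not Apple's prescribed usage.

```swift
import SensitiveContentAnalysis  // available on iOS 17, iPadOS 17, and macOS Sonoma

// Sketch: decide whether an incoming image should be blurred before display.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user hasn't enabled Communication Safety or Sensitive Content
    // Warning, the policy is .disabled and no analysis should be performed.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on device; the image never leaves the device.
        let result = try await analyzer.analyzeImage(at: url)
        return result.isSensitive
    } catch {
        // Whether to fail open or closed is the app's call; this sketch blurs.
        return true
    }
}
```

An app would call this before rendering user-received media and, if it returns `true`, show a blurred placeholder with a warning similar to the one Messages presents.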

Apple says Communication Safety uses on-device processing to detect photos and videos containing nudity, ensuring that Apple and third parties cannot access the content, and that end-to-end encryption is preserved in the Messages app.

iOS 17, iPadOS 17, and macOS Sonoma will be released later this year. The updates are currently available in beta for users with an Apple developer account.



Top Rated Comments

mdatwood
34 months ago
Cue the tons of people who confuse what this feature is and does.
Score: 7 Votes
Apple Fan 2008
34 months ago
Having a porn blocker opt-in for kids was weird anyways. Good decision to have it on by default.
Score: 5 Votes
SDJim
34 months ago
As a parent I love these kinds of platform improvements.
Score: 5 Votes
Apple Fan 2008
34 months ago

I think the feature is fine/good, but the wording is so ... infantile. Or is that only shown for kids?
That’s only for kids
Score: 3 Votes
CarlJ
34 months ago

Essentially it is the same machine looking for something; be it sensitive images (whatever that means), CSAM, or union activity, it doesn't really matter: this machine looks for what someone told it to look for. Also, for children a notification is sent to the parents, IIRC, which infringes on the privacy of the children, especially if it is a false positive.
The two mechanisms are completely different. The CSAM scanning mechanism was never machine learning. It was looking for matches to a specific set of images already in the possession of NCMEC (National Center for Missing and Exploited Children), which is the only entity authorized to catalog such images. No “looking for things that look like naughty bits”, it was only looking for a specific set of images. The technical paper ('https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf') that explains the mechanism is freely available.

This mechanism is entirely different from the CSAM detection mechanism, and does look for nudity, with machine learning. If it finds something it thinks might be that, it tells the person holding the phone, right at the point of being about to view the image. The notion of sending messages to the parents was removed, very early on, when it was pointed out that some kids might be in unsafe situations (like, say, parents who would harm their kids if they found out their kid was gay). So it isn't sending a notification to anybody; it’s just asking the kid if they really want to see the image - that’s all.
Score: 3 Votes
Cinder6
34 months ago
I think the feature is fine/good, but the wording is so ... infantile. Or is that only shown for kids?
Score: 2 Votes