Code for Apple's Communication Safety Feature for Kids Found in iOS 15.2 Beta [Updated]

Update: We've learned from Apple that the Communication Safety code found in the first iOS 15.2 beta is not an indication that the feature will ship in that update, and that Apple does not plan to release the feature as it is described in this article.


Apple this summer announced new Child Safety Features that are designed to keep children safer online. One of those features, Communication Safety, appears to be included in the iOS 15.2 beta that was released today. This feature is distinct from the controversial CSAM initiative, which has been delayed.

Code discovered in the iOS 15.2 beta by MacRumors contributor Steve Moser suggests that Communication Safety is being introduced in the update. The code is present, but we have not been able to confirm that the feature is active, because testing it requires sensitive photos to be sent to or from a device set up for a child.

As Apple explained earlier this year, Communication Safety is built into the Messages app on iPhone, iPad, and Mac. It warns children and their parents when sexually explicit photos are received by or sent from a child's device, with Apple using on-device machine learning to analyze image attachments.

If a sexually explicit photo is flagged, it is automatically blurred and the child is warned against viewing it. If a child under 13 taps the photo and views it anyway, the child's parents will be alerted.
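To make that behavior concrete, here is a minimal Swift sketch of the receive-side decision flow described above. Apple has published no API for Communication Safety, so the ChildAccount and IncomingPhotoAction types and the classification flag here are hypothetical stand-ins, not Apple's implementation.

```swift
// A minimal sketch of the receive-side flow described above, not Apple's
// actual code. All types and names here are hypothetical.
struct ChildAccount {
    let age: Int
    let communicationSafetyEnabled: Bool  // turned on by a parent via Family Sharing
}

enum IncomingPhotoAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

/// Decides how Messages presents a received photo, per the behavior the
/// beta code suggests: flagged photos are blurred and the child is warned,
/// and parents of children under 13 are alerted only if the child taps
/// through and views the photo anyway.
func presentation(isFlaggedAsExplicit: Bool, for child: ChildAccount) -> IncomingPhotoAction {
    guard child.communicationSafetyEnabled, isFlaggedAsExplicit else {
        return .showNormally
    }
    // Children 13 and over still see the blur and warning, but viewing
    // the photo does not trigger a parental notification.
    return .blurAndWarn(notifyParentsIfViewed: child.age < 13)
}
```

The notable detail in this flow is that the parental notification hinges on the child's age on the account and fires only if the child chooses to view the flagged photo, not on receipt.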

Code in the iOS 15.2 beta includes some of the wording that children will see:

  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also block this person.
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also leave this conversation or block contacts.
  • Talk to someone you trust if you feel uncomfortable or need help.
  • This photo will not be shared with Apple, and your feedback is helpful if it was incorrectly marked as sensitive.
  • Message a Grownup You Trust.
  • Hey, I would like to talk with you about a conversation that is bothering me.
  • Sensitive photos and videos show the private body parts that you cover with bathing suits.
  • It's not your fault, but sensitive photos can be used to hurt you.
  • The person in this may not have given consent to share it. How would they feel knowing other people saw it?
  • The person in this might not want it seen-it could have been shared without them knowing. It can also be against the law to share.
  • Sharing nudes to anyone under 18 years old can lead to legal consequences.
  • If you decide to view this, your parents will get a notification to make sure you're OK.
  • Don't share anything you don't want to. Talk to someone you trust if you feel pressured.
  • Do you feel OK? You're not alone and can always talk to someone who's trained to help here.

There are specific phrases for both children under 13 and children over 13, as the feature behaves differently for each age group. As mentioned above, if a child over 13 views a nude photo, their parents will not be notified, but if a child under 13 does so, parents will be alerted. All of these Communication Safety features must be enabled by parents and are available for Family Sharing groups; a sketch of that opt-in model follows the list below.

  • Nude photos and videos can be used to hurt people. Once something's shared, it can't be taken back.
  • It's not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust who you send this to now, they can share it forever without your consent.
  • Whoever gets this can share it with anyone-it may never go away. It can also be against the law to share.
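Since Apple hasn't detailed how the parental opt-in works beyond what's described above, here is a minimal Swift sketch of one plausible Family Sharing opt-in model. The FamilyMember and FamilySharingGroup types and the under-18 age bound are our assumptions, not Apple's implementation.

```swift
// An illustrative sketch of the opt-in model described above: Communication
// Safety is off by default, and a parent in the Family Sharing group must
// enable it per child. These types and the under-18 bound are assumptions.
struct FamilyMember {
    let name: String
    let age: Int
    let isParent: Bool
    var communicationSafetyEnabled = false
}

struct FamilySharingGroup {
    var members: [FamilyMember]

    /// Only a parent can enable the feature, and only for minors in the group.
    mutating func enableCommunicationSafety(requestedBy requester: FamilyMember,
                                            forChildNamed name: String) {
        guard requester.isParent else { return }
        for i in members.indices
        where members[i].name == name && members[i].age < 18 {
            members[i].communicationSafetyEnabled = true
        }
    }
}
```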

Apple said in August that these Communication Safety features would be added in updates to iOS 15, iPadOS 15, and macOS Monterey later this year. Because the image analysis happens on device, iMessage conversations remain end-to-end encrypted and are not readable by Apple.

Communication Safety was also announced alongside a separate CSAM initiative that will see Apple scanning iCloud Photos for known Child Sexual Abuse Material. That initiative has been highly controversial and heavily criticized, leading Apple to choose to "take additional time over the coming months" to make improvements before introducing the functionality.

At the current time, there is no sign of CSAM wording in the iOS 15.2 beta, so Apple may introduce Communication Safety first, before implementing the full suite of Child Safety Features.



Top Rated Comments

tzm41
59 months ago
Hopefully no CSAM ever… The system is going to be exploited by some states one way or the other.
Score: 42 Votes
59 months ago
“Sensitive photos and videos show the private body parts that you cover with bathing suits.”

Weird prudish Apple clearly have no idea about cultures outside the USA.
Score: 27 Votes
HappyDude20
59 months ago
Terrible move by Apple
Score: 25 Votes
Wildkraut
59 months ago
Did Ned Flanders get a Job at Apple?
Score: 22 Votes
840quadra
59 months ago
I am hoping that there are far more details and explanations of what Apple is doing on device, and in the cloud for this feature before it is activated or officially offered to consumers. I get what they are trying to do, but for some there is a huge creep factor attached to this type of service / feature.
Score: 22 Votes
frumpy16
59 months ago

Replying to another commenter who wrote: "People need to stop using 'CSAM' to mean 'CSAM detection'. Let's expand the acronym in your sentence: 'Apple must have worked on child sexual abuse material for a long time' - so, what you're basically saying is Apple is dealing in child porn illegally."

Pedantic. It's pretty obvious what people mean.
Score: 19 Votes