Code for Apple's Communication Safety Feature for Kids Found in iOS 15.2 Beta [Updated]

Update: Apple has informed us that the Communication Safety code found in the first iOS 15.2 beta is not a functional feature in that update, and Apple does not plan to release the feature as described in this article.


Apple this summer announced new Child Safety Features that are designed to keep children safer online. One of those features, Communication Safety, appears to be included in the iOS 15.2 beta that was released today. This feature is distinct from the controversial CSAM initiative, which has been delayed.

Code found in the iOS 15.2 beta by MacRumors contributor Steve Moser suggests that Communication Safety is being introduced in the update. The code is present, but we have not been able to confirm that the feature is active, because triggering it requires sensitive photos to be sent to or from a device set up for a child.

As Apple explained earlier this year, Communication Safety is built into the Messages app on iPhone, iPad, and Mac. It will warn children and their parents when sexually explicit photos are received or sent from a child's device, with Apple using on-device machine learning to analyze image attachments.

If a sexually explicit photo is flagged, it is automatically blurred and the child is warned against viewing it. For kids under 13, if the child taps the photo and views it anyway, the child's parents will be alerted.
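The flow described above (flag, blur, warn, and notify parents only for under-13s who view anyway) can be sketched in a few lines. This is a hypothetical illustration of the reported behavior, not Apple's actual implementation; all names here are invented for clarity.

```python
# Hypothetical sketch of the reported Communication Safety decision flow.
# Function and field names are illustrative, not Apple's API.
from dataclasses import dataclass


@dataclass
class Child:
    age: int
    viewed_flagged_photo: bool = False


def handle_incoming_photo(child, is_sexually_explicit, child_taps_through):
    """Return (blurred, parent_notified) for one incoming photo."""
    if not is_sexually_explicit:       # on-device classifier result
        return (False, False)
    blurred = True                     # flagged photos are automatically blurred
    parent_notified = False
    if child_taps_through:             # the child chooses to view it anyway
        child.viewed_flagged_photo = True
        if child.age < 13:             # only under-13s trigger a parent alert
            parent_notified = True
    return (blurred, parent_notified)
```

For example, under this sketch a flagged photo viewed by a 10-year-old would both blur and alert the parents, while the same photo viewed by a 15-year-old would blur without any notification.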

Code in iOS 15.2 includes some of the wording that children will see:

  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also block this person.
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also leave this conversation or block contacts.
  • Talk to someone you trust if you feel uncomfortable or need help.
  • This photo will not be shared with Apple, and your feedback is helpful if it was incorrectly marked as sensitive.
  • Message a Grownup You Trust.
  • Hey, I would like to talk with you about a conversation that is bothering me.
  • Sensitive photos and videos show the private body parts that you cover with bathing suits.
  • It's not your fault, but sensitive photos can be used to hurt you.
  • The person in this may not have given consent to share it. How would they feel knowing other people saw it?
  • The person in this might not want it seen – it could have been shared without them knowing. It can also be against the law to share.
  • Sharing nudes to anyone under 18 years old can lead to legal consequences.
  • If you decide to view this, your parents will get a notification to make sure you're OK.
  • Don't share anything you don't want to. Talk to someone you trust if you feel pressured.
  • Do you feel OK? You're not alone and can always talk to someone who's trained to help here.

There are specific phrases for both children under 13 and children over 13, as the feature behaves differently for each age group. As mentioned above, if a child over 13 views a nude photo, their parents will not be notified, but if a child under 13 does so, parents will be alerted. All of these Communication Safety features must be enabled by parents and are available for Family Sharing groups. Additional strings in the beta warn children against sending sensitive photos:

  • Nude photos and videos can be used to hurt people. Once something's shared, it can't be taken back.
  • It's not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust who you send this to now, they can share it forever without your consent.
  • Whoever gets this can share it with anyone – it may never go away. It can also be against the law to share.

Apple said in August that these Communication Safety features would be added in updates to iOS 15, iPadOS 15, and macOS Monterey later this year. iMessage conversations remain end-to-end encrypted and are not readable by Apple.

Communication Safety was also announced alongside a new CSAM initiative that will see Apple scanning photos for Child Sexual Abuse Material. This has been highly controversial and heavily criticized, leading Apple to choose to "take additional time over the coming months" to make improvements before introducing the new functionality.

There is currently no sign of CSAM-related wording in the iOS 15.2 beta, so Apple may introduce Communication Safety before implementing the full suite of Child Safety Features.


