Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time.

[Image: iPhone Communication Safety feature]

Communication Safety

First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature that warns children and their parents when sexually explicit photos are received or sent. Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned.

When a child attempts to view a photo flagged as sensitive in the Messages app, they will be alerted that the photo may contain private body parts, and that the photo may be hurtful. Depending on the age of the child, there will also be an option for parents to receive a notification if their child proceeds to view the sensitive photo or if they choose to send a sexually explicit photo to another contact after being warned.

Apple said the new Communication Safety feature will be coming in updates to iOS 15, iPadOS 15, and macOS Monterey later this year for accounts set up as families in iCloud. Apple said iMessage conversations will remain protected with end-to-end encryption, leaving private communications unreadable by Apple.
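
Apple has not published the model or the exact logic Messages uses, but as a rough sense of what an on-device check of this kind involves, here is a minimal Swift sketch: a hypothetical classifier scores an attachment, and above an assumed confidence threshold the image is blurred and a warning flag is raised. The classifier protocol, the threshold value, and the blur step are all illustrative assumptions.

```swift
import CoreGraphics
import CoreImage

// Hypothetical classifier interface; Apple has not published the model
// Communication Safety uses, so this is a stand-in for illustration.
protocol ExplicitImageClassifier {
    /// Returns a confidence in [0, 1] that the image is sexually explicit.
    func explicitConfidence(for image: CGImage) -> Double
}

struct AttachmentScreener {
    let classifier: ExplicitImageClassifier
    let threshold = 0.9   // assumed value, not Apple's

    /// Blurs the attachment and flags it for a warning when the on-device
    /// model deems it sensitive; otherwise returns it unchanged.
    func screen(_ attachment: CGImage) -> (display: CIImage, showWarning: Bool) {
        let image = CIImage(cgImage: attachment)
        guard classifier.explicitConfidence(for: attachment) >= threshold else {
            return (image, false)
        }
        return (image.applyingGaussianBlur(sigma: 40), true)
    }
}
```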

Scanning Photos for Child Sexual Abuse Material (CSAM)

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple said its method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it will further transform this database into an unreadable set of hashes that is securely stored on users' devices.

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image, according to Apple.

"The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, while images that are different from one another result in different hashes," said Apple in a new "Expanded Protections for Children" white paper. "For example, an image that has been slightly cropped, resized or converted from color to black and white is treated identical to its original, and has the same hash."

[Image: Apple CSAM detection flow chart]
Before an image is stored in iCloud Photos, Apple said an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher. This voucher is uploaded to iCloud Photos along with the image, and once an undisclosed threshold of matches is exceeded, Apple is able to interpret the contents of the vouchers for CSAM matches. Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing its exact threshold, but says the system ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.
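
Very roughly, the client-side flow described above could be sketched like this. The sketch deliberately glosses over the cryptography: Apple's design uses private set intersection and threshold secret sharing so the device never learns whether an image matched and Apple can read vouchers only after the threshold is crossed, whereas everything below is computed in the clear. All type and function names are hypothetical.

```swift
import Foundation

// Highly simplified, non-cryptographic sketch of the flow Apple describes,
// not Apple's protocol.
struct SafetyVoucher {
    let payload: Data      // in the real design, encrypted match metadata
}

struct PhotoUpload {
    let imageData: Data
    let voucher: SafetyVoucher
}

/// Runs on device before upload: derive the perceptual hash, check it
/// against the (blinded, unreadable) on-device database, and attach a
/// voucher to the photo either way so uploads look indistinguishable.
func prepareUpload(imageData: Data,
                   neuralHash: UInt64,
                   knownHashes: Set<UInt64>) -> PhotoUpload {
    let matched = knownHashes.contains(neuralHash)   // hidden from the device in reality
    let voucher = SafetyVoucher(payload: Data("matched:\(matched)".utf8))
    return PhotoUpload(imageData: imageData, voucher: voucher)
}

/// Server side: human review and an NCMEC report happen only once the
/// number of matching vouchers for an account crosses the threshold.
func shouldEscalate(matchingVouchers: Int, threshold: Int) -> Bool {
    matchingVouchers > threshold
}
```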

Apple said its method of detecting known CSAM provides "significant privacy benefits" over existing techniques:

• This system is an effective way to identify known CSAM stored in iCloud Photos accounts while protecting user privacy.
• As part of the process, users also can't learn anything about the set of known CSAM images that is used for matching. This protects the contents of the database from malicious use.
• The system is very accurate, with an extremely low error rate of less than one in one trillion accounts per year (a rough sense of how a match threshold achieves this is sketched below).
• The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.
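
Apple has not published its per-image false match rate, its threshold, or typical library sizes, so the following numbers are purely assumed. The sketch only illustrates why requiring multiple independent matches pushes the account-level error rate far below the per-image rate.

```swift
import Foundation

// Illustrative only: p (per-image false match rate), t (match threshold),
// and n (library size) are assumed values, not figures Apple has disclosed.
// P[at least t false matches] is bounded above by roughly C(n, t) * p^t.
func falseFlagProbability(librarySize n: Int, perImageRate p: Double, threshold t: Int) -> Double {
    var logChoose = 0.0                       // log C(n, t)
    for i in 0..<t {
        logChoose += log(Double(n - i)) - log(Double(i + 1))
    }
    return exp(logChoose + Double(t) * log(p))
}

// With assumed values n = 100_000 photos, p = 1e-6, and t = 10 matches:
// falseFlagProbability(librarySize: 100_000, perImageRate: 1e-6, threshold: 10)
// ≈ 3e-17, comfortably below one in one trillion.
```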

The underlying technology behind Apple's system is quite complex, and Apple has published a technical summary that goes into more detail.

"Apple's expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," said John Clark, the President and CEO of the National Center for Missing & Exploited Children. "At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known. The reality is that privacy and child protection can co-exist. We applaud Apple and look forward to working together to make this world a safer place for children."

Expanded CSAM Guidance in Siri and Search

[Image: CSAM guidance from Siri on iPhone]
Third, Apple said it will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

The Siri and Search updates are coming later this year in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, according to Apple.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

levitynyc
49 months ago
Not gonna lie...that's kinda creepy.
Score: 71 Votes
Exponent
49 months ago
No, too far, Apple.

What is going to keep you from scanning my library for NeuralHash matches against politics you don’t like? Or criticism of mainland dictatorial China?

if that doesn’t happen in the US, what will keep other countries (read above) from doing just that to their citizens?
Score: 58 Votes
gaximus
49 months ago
What about photos of "Baby's first bath"? Will those users get treated as child exploitation?
Score: 56 Votes
arn
49 months ago

Yeh good luck if say you have small young kids who don't keep their clothes on. Like what, every baby?

This is also creepy asf sorry. Child predators are bad, obviously, but this isn't the way.
The CSAM thing doesn't detect/determine content of images. It checks photos against a database of specific (actively circulating) child abuse images.

Not to say there aren't legitimate concerns, but worrying that it is going to somehow flag your own kid's photos is not one of them.

(The child safety thing does detect, but it seems the worst that does is throw up a warning/blurring if you have it on)
Score: 44 Votes
fenderbass146
49 months ago
every year we go to 1984 a little more.
Score: 37 Votes
Apple_Robert
49 months ago
I am against child abuse and child porn. However, I don't agree with what Apple is going to be doing with user phones.
Score: 34 Votes