Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time.


Communication Safety

First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature that warns children and their parents when sexually explicit photos are received or sent. Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned.

When a child attempts to view a photo flagged as sensitive in the Messages app, they will be alerted that the photo may contain private body parts, and that the photo may be hurtful. Depending on the age of the child, there will also be an option for parents to receive a notification if their child proceeds to view the sensitive photo or if they choose to send a sexually explicit photo to another contact after being warned.

Apple said the new Communication Safety feature will be coming in updates to iOS 15, iPadOS 15, and macOS Monterey later this year for accounts set up as families in iCloud. Apple emphasized that iMessage conversations will remain protected with end-to-end encryption, making private communications unreadable by Apple.

Scanning Photos for Child Sexual Abuse Material (CSAM)

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple said its method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it will further transform this database into an unreadable set of hashes that is securely stored on users' devices.

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image, according to Apple.

"The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, while images that are different from one another result in different hashes," said Apple in a new "Expanded Protections for Children" white paper. "For example, an image that has been slightly cropped, resized or converted from color to black and white is treated identical to its original, and has the same hash."
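NeuralHash itself is proprietary, but the general idea of a perceptual hash can be illustrated with a much simpler scheme. The toy "average hash" below is not Apple's algorithm, and the pixel data is made up; it maps each pixel to a bit depending on whether the pixel sits at or above the image's mean brightness, so a uniformly brightened copy of an image yields the identical hash while a genuinely different image does not:

```python
# Toy "average hash": a far simpler perceptual hash than NeuralHash,
# shown only to illustrate how visually similar images can map to the
# same compact fingerprint.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255). Returns a bit
    string: 1 where the pixel is at or above the image mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p >= mean else "0" for p in pixels)

original = [10, 200, 30, 180, 90, 220, 15, 170, 60]

# A uniformly brightened copy (e.g. a re-encode that lifts every pixel):
# each pixel's position relative to the mean is unchanged, so the hash
# is identical.
brightened = [p + 25 for p in original]
assert average_hash(original) == average_hash(brightened)

# A genuinely different image produces a different hash.
other = [200, 10, 180, 30, 220, 90, 170, 15, 60]
assert average_hash(original) != average_hash(other)
```

Unlike a cryptographic hash, where changing a single pixel scrambles the output, a perceptual hash is deliberately tolerant of small edits, which is what lets the system match cropped or recompressed copies of a known image.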

Before an image is stored in iCloud Photos, Apple said an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher. This voucher is uploaded to iCloud Photos along with the image, and once an undisclosed threshold of matches is exceeded, Apple is able to interpret the contents of the vouchers for CSAM matches. Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing its exact threshold, but says the system ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.
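The match-then-threshold flow can be sketched roughly as follows. The hash values, threshold, and voucher structure here are illustrative stand-ins; in the real system, private set intersection and threshold secret sharing keep individual vouchers cryptographically unreadable below the threshold, rather than relying on the server's good behavior as this plain-Python sketch does:

```python
# Minimal sketch of the voucher-and-threshold flow described above.
# KNOWN_HASHES and THRESHOLD are hypothetical values, not Apple's.

KNOWN_HASHES = {"a3f1", "9c2e", "77b0"}  # stand-in for the known-CSAM hash set
THRESHOLD = 3  # Apple has not disclosed its real threshold

def upload(image_hash, vouchers):
    """On-device step: attach a safety voucher recording whether this
    image's hash matched the known set."""
    vouchers.append({"hash": image_hash, "match": image_hash in KNOWN_HASHES})

def server_can_review(vouchers):
    """Server-side step: voucher contents become interpretable only once
    the number of matching vouchers reaches the threshold."""
    return sum(v["match"] for v in vouchers) >= THRESHOLD

vouchers = []
for h in ["a3f1", "1111", "9c2e"]:
    upload(h, vouchers)
assert not server_can_review(vouchers)  # only 2 matches: still below threshold

upload("77b0", vouchers)
assert server_can_review(vouchers)      # threshold reached: review possible
```

The threshold is what prevents a single false positive from exposing an account: no individual match is actionable on its own.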

Apple said its method of detecting known CSAM provides "significant privacy benefits" over existing techniques:

• This system is an effective way to identify known CSAM stored in iCloud Photos accounts while protecting user privacy.
• As part of the process, users also can't learn anything about the set of known CSAM images that is used for matching. This protects the contents of the database from malicious use.
• The system is very accurate, with an extremely low error rate of less than one in one trillion accounts per year.
• The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.

The underlying technology behind Apple's system is quite complex, and the company has published a technical summary with more details.

"Apple's expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," said John Clark, the President and CEO of the National Center for Missing & Exploited Children. "At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known. The reality is that privacy and child protection can co-exist. We applaud Apple and look forward to working together to make this world a safer place for children."

Expanded CSAM Guidance in Siri and Search

Third, Apple said it will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

The updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, according to Apple.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

levitynyc
13 months ago
Not gonna lie...that's kinda creepy.
Score: 71 Votes

Exponent
13 months ago
No, too far, Apple.

What is going to keep you from scanning my library for NeuralHash matches against politics you don’t like? Or criticism of mainland dictatorial China?

If that doesn’t happen in the US, what will keep other countries (read above) from doing just that to their citizens?
Score: 58 Votes

gaximus
13 months ago
What about photos of "Baby's first bath"? Will those users get treated as child exploitation?
Score: 56 Votes

arn
13 months ago
Quoting an earlier comment: "Yeah, good luck if, say, you have small young kids who don't keep their clothes on. Like what, every baby? This is also creepy asf, sorry. Child predators are bad, obviously, but this isn't the way."

The CSAM thing doesn't detect/determine the content of images. It checks photos against a database of specific (actively circulating) child abuse images.

Not to say there aren't legitimate concerns, but worrying that it is going to somehow flag your own kid's photos is not one of them.

(The child safety thing does detect, but the worst that does is throw up a warning/blurring if you have it on.)
Score: 44 Votes

fenderbass146
13 months ago
Every year we go to 1984 a little more.
Score: 37 Votes

Apple_Robert
13 months ago
I am against child abuse and child porn. However, I don't agree with what Apple is going to be doing with user phones.
Score: 34 Votes

