Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM).

Federighi admitted that Apple had poorly handled last week's announcement of the two new features, which detect explicit content in Messages for children and CSAM content stored in iCloud Photos libraries, and he acknowledged the widespread confusion around the tools:

It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing.

[...]

In hindsight, introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared: what's happening with my messages? The answer is...nothing is happening with your messages.

The Communications Safety feature means that if children send or receive explicit images via iMessage, they will be warned before viewing the image, the image will be blurred, and there will be an option for their parents to be alerted. CSAM scanning, on the other hand, attempts to match users' photos against hashes of known CSAM images before they are uploaded to iCloud Photos. Accounts that have had CSAM detected will then be subject to a manual review by Apple and may be reported to the National Center for Missing and Exploited Children (NCMEC).
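
To make the described flow concrete, here is a minimal Swift sketch of on-device matching before an iCloud Photos upload. It is only an illustration under simplifying assumptions: Apple's actual system uses the NeuralHash perceptual hash and a private set intersection protocol, neither of which is shown here, and the SHA-256 stand-in, the UploadScanner class, and the SafetyVoucher type are hypothetical names invented for this sketch.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: a real pipeline would use a perceptual hash
// (NeuralHash) and blinded/private set intersection, not a plain SHA-256
// lookup. The point is the flow: hash locally, attach a voucher only on a
// match, and produce nothing descriptive about non-matching photos.

struct SafetyVoucher {
    let imageID: UUID
    let matchedHash: String
}

final class UploadScanner {
    // Stand-in for the database of known CSAM hashes shipped with the OS.
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    /// Returns a voucher only if the image's hash matches a known entry;
    /// for every other photo, nothing about its contents is generated.
    func voucher(for imageData: Data, imageID: UUID) -> SafetyVoucher? {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
            ? SafetyVoucher(imageID: imageID, matchedHash: hex)
            : nil
    }
}
```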

The new features have been subject to a large amount of criticism from users, security researchers, the Electronic Frontier Foundation (EFF), Edward Snowden, Facebook's former security chief, and even Apple employees.

Amid these criticisms, Federighi addressed one of the main areas of concern, emphasizing that Apple's system will be protected against abuse by governments or other third parties with "multiple levels of auditability."


Federighi also revealed a number of new details about the system's safeguards, such as the fact that a user will need to hit around 30 matches for known CSAM content in their Photos library before Apple is alerted, whereupon the company will confirm whether those images are genuine instances of CSAM.

If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images. This isn't doing some analysis for did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of some pornography of any other sort? This is literally only matching on the exact fingerprints of specific known child pornographic images.
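
Federighi's point about the threshold can be sketched as a simple gate: until roughly 30 matched vouchers exist for an account, Apple learns nothing. In the real design this is enforced cryptographically with threshold secret sharing rather than a counter, so the code below, including the hypothetical AccountReviewGate type, only models the behavior he describes.

```swift
// Illustrative only: Apple's system enforces the threshold with threshold
// secret sharing, so matched vouchers cannot even be decrypted early. This
// counter just models the observable behavior Federighi describes.

struct AccountReviewGate {
    let threshold = 30              // "on the order of 30" known-CSAM matches
    private(set) var matchedVoucherCount = 0

    mutating func recordMatch() {
        matchedVoucherCount += 1
    }

    /// Human review becomes possible only past the threshold, and even then
    /// only the matched images are visible, not the rest of the library.
    var canEscalateForReview: Bool {
        matchedVoucherCount >= threshold
    }
}
```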

He also pointed out the security advantage of placing the matching process directly on the iPhone, rather than having it occur on iCloud's servers.

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

When asked if the database of images used to match CSAM content on users' devices could be compromised by having other materials inserted, such as political content in certain regions, Federighi explained that the database is constructed from known CSAM images from multiple child safety organizations, with at least two being "in distinct jurisdictions," to protect against abuse of the system.

These child protection organizations, as well as an independent auditor, will be able to verify that the database of images only consists of content from those entities, according to Federighi.
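
That safeguard boils down to a set intersection: a hash only ends up in the on-device database if at least two independent child safety organizations, in distinct jurisdictions, supplied it. The sketch below assumes that reading; the buildShippedDatabase function and the organization labels are placeholders, not Apple's actual implementation.

```swift
// Hedged sketch of the "two distinct jurisdictions" rule: only hashes vouched
// for by two or more independent providers are included, so no single
// government or organization can insert arbitrary content on its own.

func buildShippedDatabase(from providerLists: [String: Set<String>]) -> Set<String> {
    var counts: [String: Int] = [:]
    for (_, hashes) in providerLists {
        for hash in hashes {
            counts[hash, default: 0] += 1
        }
    }
    return Set(counts.filter { $0.value >= 2 }.keys)
}

// Placeholder providers and hashes, purely for illustration.
let shipped = buildShippedDatabase(from: [
    "Org A (jurisdiction 1)": ["h1", "h2", "h3"],
    "Org B (jurisdiction 2)": ["h2", "h3", "h4"],
])
// shipped == ["h2", "h3"]
```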

Federighi's interview is among the biggest PR pushbacks from Apple so far following the mixed public response to the announcement of the child safety features, but the company has also repeatedly attempted to address users' concerns, publishing a detailed FAQ and answering questions in interviews with the media.

Top Rated Comments

thadoggfather
45 months ago
It is confusing... but they are gaslighting us into thinking it is universal confusion when there is a large subset of people with clear understanding coupled with dissent
Score: 127 Votes
AndiG
45 months ago
Federighi just doesn't understand a simple fact. If you don't need a local scanning system, don't build one.
Score: 112 Votes
xxray
45 months ago

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.
How does this change the fact at all that there’s now essentially a new backdoor to be abused that’s installed in iOS 15?

Stop defending and get rid of this BS, Apple.
Score: 107 Votes
6787872
45 months ago
"but think of the children" has been used for decades now to erode privacy.

people seem to think if you are against it, you support it. they know exactly what they are doing.
Score: 97 Votes
JPSaltzman
45 months ago
The more you have to "explain" over and over some new controversial "feature", the more it begins to stink.

(And I don't even use iCloud to store anything important -- especially photos!)
Score: 88 Votes
Mac Fly (film)
45 months ago
So someone who looks at child porn photos stops using iCloud Photos. What about the rest of us who want privacy? What about future governmental interference?
Score: 81 Votes