Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM).

Federighi admitted that Apple had poorly handled last week's announcement of the two new features, which relate to detecting explicit content in Messages for children and detecting CSAM content stored in iCloud Photos libraries, and acknowledged the widespread confusion around the tools:

It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing.

[...]

In hindsight, introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared: what's happening with my messages? The answer is...nothing is happening with your messages.

The Communication Safety feature means that if a child sends or receives an explicit image via iMessage, the image will be blurred, the child will be warned before viewing it, and there will be an option for their parents to be alerted. CSAM detection, on the other hand, attempts to match users' photos against hashes of known CSAM images before they are uploaded to iCloud Photos. Accounts that have had CSAM detected will then be subject to a manual review by Apple and may be reported to the National Center for Missing and Exploited Children (NCMEC).

The new features have attracted a large amount of criticism from users, security researchers, the Electronic Frontier Foundation (EFF), and Edward Snowden, as well as from Facebook's former security chief and even some of Apple's own employees.

Amid these criticisms, Federighi addressed one of the main areas of concern, emphasizing that Apple's system will be protected against abuse by governments or other third parties through "multiple levels of auditability."


Federighi also revealed a number of new details about the system's safeguards, such as the fact that an account will need to accumulate around 30 matches for CSAM content in its Photos library before Apple is alerted, whereupon the company will confirm whether those images appear to be genuine instances of CSAM.

If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images. This isn't doing some analysis for did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of some pornography of any other sort? This is literally only matching on the exact fingerprints of specific known child pornographic images.
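To make the threshold mechanism concrete, here is a heavily simplified sketch of counting matches against a database of known image hashes. The type names, hash function, and database contents below are illustrative assumptions; Apple's actual system uses NeuralHash, blinded on-device hashes, and a cryptographic threshold scheme that reveals nothing to Apple below the threshold, none of which is reproduced here.

```swift
import Foundation

// Heavily simplified illustration of threshold-based matching.
// `CSAMMatcher`, `perceptualHash`, and `knownHashes` are hypothetical names;
// the real pipeline is cryptographic, so neither the device nor Apple learns
// individual match results until the threshold is crossed.
struct CSAMMatcher {
    /// Opaque hashes of known CSAM images supplied by child safety organizations.
    let knownHashes: Set<String>

    /// Roughly the "on the order of 30" figure Federighi cites.
    let matchThreshold = 30

    /// Placeholder perceptual hash; a real system derives this from image content
    /// so that visually identical images map to the same fingerprint.
    func perceptualHash(of imageData: Data) -> String {
        imageData.base64EncodedString()
    }

    /// Counts how many photos queued for iCloud upload match the known database.
    func matchCount(in photos: [Data]) -> Int {
        photos.filter { knownHashes.contains(perceptualHash(of: $0)) }.count
    }

    /// Nothing is escalated for human review until the threshold is crossed.
    func shouldEscalateForReview(photos: [Data]) -> Bool {
        matchCount(in: photos) >= matchThreshold
    }
}
```

The point the sketch captures is structural: matching is against exact fingerprints of specific known images, and no account-level signal exists until roughly 30 of them accumulate.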

He also pointed out the security advantage of performing the matching process on the iPhone itself, rather than on iCloud's servers.

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

When asked if the database of images used to match CSAM content on users' devices could be compromised by having other materials inserted, such as political content in certain regions, Federighi explained that the database is constructed from known CSAM images from multiple child safety organizations, with at least two being "in distinct jurisdictions," to protect against abuse of the system.

These child protection organizations, as well as an independent auditor, will be able to verify that the database of images only consists of content from those entities, according to Federighi.
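Federighi's "distinct jurisdictions" safeguard amounts to shipping only those hashes that at least two independent organizations, operating under different legal systems, both provide. The sketch below illustrates that intersection rule; the provider structure and function names are assumptions for illustration, not Apple's actual pipeline.

```swift
import Foundation

// Illustrative sketch of the "distinct jurisdictions" safeguard: a hash is
// included only if organizations from at least two different jurisdictions
// independently list it. `HashProvider` and `buildShippedDatabase` are
// hypothetical names, not Apple's implementation.
struct HashProvider {
    let name: String          // e.g. a US organization and a non-US counterpart
    let jurisdiction: String
    let hashes: Set<String>
}

/// Returns the hashes that appear in lists from at least `minimumJurisdictions`
/// distinct jurisdictions; everything else is dropped before the database ships.
func buildShippedDatabase(from providers: [HashProvider],
                          minimumJurisdictions: Int = 2) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for provider in providers {
        for hash in provider.hashes {
            jurisdictionsPerHash[hash, default: []].insert(provider.jurisdiction)
        }
    }
    let shipped = jurisdictionsPerHash
        .filter { $0.value.count >= minimumJurisdictions }
        .map { $0.key }
    return Set(shipped)
}
```

Under such a rule, a hash inserted by a single government-aligned organization would never reach devices, and the auditors Federighi mentions could verify the shipped set against the providers' lists.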

Federighi's interview is among Apple's most significant PR pushbacks so far following the mixed public response to the announcement of the child safety features, though the company has also repeatedly attempted to address users' concerns elsewhere, publishing an FAQ and answering questions directly in interviews with the media.


Top Rated Comments

thadoggfather
41 months ago
It is confusing... but they are gaslighting us into thinking it is universal confusion when there is a large subset of people with clear understanding coupled with dissent
Score: 127 Votes
AndiG
41 months ago
Federighi just doesn't understand a simple fact. If you don't need a local scanning system - don't build one.
Score: 112 Votes
xxray
41 months ago

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.
How does this change the fact at all that there’s now essentially a new backdoor to be abused that’s installed in iOS 15?

Stop defending and get rid of this BS, Apple.
Score: 107 Votes
6787872
41 months ago
"but think of the children" has been used for decades now to erode privacy.

people seem to think if you are against it, you support it. they know exactly what they are doing.
Score: 97 Votes
JPSaltzman
41 months ago
The more you have to "explain" over and over some new controversial "feature", the more it begins to stink.

(And I don't even use iCloud to store anything important -- especially photos!)
Score: 88 Votes
Mac Fly (film)
41 months ago
So someone who looks at child porn photos stops using iCloud Photos. What about the rest of us who want privacy? What about future governmental interference?
Score: 81 Votes