Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM).

Federighi admitted that Apple had handled last week's announcement of the two new features poorly. The features are designed to detect sexually explicit content in Messages sent to or from children and CSAM content stored in iCloud Photos libraries, and Federighi acknowledged the widespread confusion around the tools:

It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing.

[...]

In hindsight, introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared: what's happening with my messages? The answer is...nothing is happening with your messages.

The Communication Safety feature means that if a child sends or receives a sexually explicit image via the Messages app, they will be warned before viewing it, the image will be blurred, and there will be an option for their parents to be alerted. CSAM scanning, on the other hand, attempts to match users' photos against hashes of known CSAM images before they are uploaded to iCloud Photos. Accounts that have had CSAM detected will then be subject to a manual review by Apple and may be reported to the National Center for Missing and Exploited Children (NCMEC).

The new features have been subject to a large amount of criticism from users, security researchers, the Electronic Frontier Foundation (EFF), Edward Snowden, Facebook's former security chief, and even some of Apple's own employees.

Amid these criticisms, Federighi addressed one of the main areas of concern, emphasizing that Apple's system will be protected against exploitation by governments or other third parties with "multiple levels of auditability."


Federighi also revealed a number of new details about the system's safeguards, such as the fact that a user will need to accumulate around 30 matches for CSAM content in their Photos library before Apple is alerted, whereupon it will confirm whether those images are genuine instances of CSAM.

If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images. This isn't doing some analysis for did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of some pornography of any other sort? This is literally only matching on the exact fingerprints of specific known child pornographic images.
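To make the threshold idea concrete, here is a minimal, hypothetical sketch in Swift. It is not Apple's implementation: the real system uses NeuralHash perceptual hashes, cryptographic safety vouchers, and a private set intersection protocol, none of which are reproduced here. The type name, hash strings, and helper functions below are invented purely to illustrate that no alert is raised until roughly 30 matches accumulate.

```swift
import Foundation

// Hypothetical illustration only. Apple's actual pipeline relies on NeuralHash,
// safety vouchers, and threshold secret sharing; plain strings stand in for the
// perceptual hashes here.
struct ThresholdMatcher {
    let knownHashes: Set<String>   // hashes of known CSAM supplied by child safety organizations
    let reportingThreshold: Int    // "something on the order of 30", per Federighi

    /// Counts how many of the user's photo hashes appear in the known-CSAM set.
    func matchCount(in userPhotoHashes: [String]) -> Int {
        userPhotoHashes.filter { knownHashes.contains($0) }.count
    }

    /// Apple is only alerted once the match count reaches the threshold.
    func shouldAlert(for userPhotoHashes: [String]) -> Bool {
        matchCount(in: userPhotoHashes) >= reportingThreshold
    }
}

// Example: 25 matching photos stays below the ~30-image threshold, so nothing is reported.
let database = Set((0..<1_000).map { "known-hash-\($0)" })
let userPhotos = (0..<25).map { "known-hash-\($0)" } + ["holiday-photo", "bathtub-photo"]
let matcher = ThresholdMatcher(knownHashes: database, reportingThreshold: 30)
print(matcher.shouldAlert(for: userPhotos))   // false
```

The point of the sketch is simply that matching happens only against exact fingerprints of known images, and that nothing about an account is surfaced until the threshold is crossed.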

He also pointed out the security advantage of placing the matching process on the iPhone directly, rather than it occurring on iCloud's servers.

Because it's on the [phone], security researchers are constantly able to introspect what's happening in Apple's [phone] software. So if any changes were made that were to expand the scope of this in some way — in a way that we had committed to not doing — there's verifiability, they can spot that that's happening.

When asked if the database of images used to match CSAM content on users' devices could be compromised by having other materials inserted, such as political content in certain regions, Federighi explained that the database is constructed from known CSAM images from multiple child safety organizations, with at least two being "in distinct jurisdictions," to protect against abuse of the system.

These child protection organizations, as well as an independent auditor, will be able to verify that the database of images only consists of content from those entities, according to Federighi.
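One way to picture that safeguard, under the assumption that only hashes vouched for by every contributing organization are shipped to devices, is a simple set intersection. The sketch below is hypothetical: the organization names and hash values are placeholders, and the real database construction and independent auditing process is more involved than a single function.

```swift
import Foundation

// Hypothetical sketch: the on-device database is modelled as the intersection of
// hash sets contributed by independent child safety organizations in different
// jurisdictions, so a hash submitted by only one party never reaches devices.
func buildShippedDatabase(from contributions: [Set<String>]) -> Set<String> {
    guard var shipped = contributions.first else { return [] }
    for contribution in contributions.dropFirst() {
        shipped.formIntersection(contribution)   // keep only hashes every organization provides
    }
    return shipped
}

// Placeholder data: "h1" and "h4" are each known to only one organization, so they are dropped.
let organizationA: Set = ["h1", "h2", "h3"]   // e.g. an organization in one jurisdiction
let organizationB: Set = ["h2", "h3", "h4"]   // a second organization in a distinct jurisdiction
print(buildShippedDatabase(from: [organizationA, organizationB]))   // prints ["h2", "h3"] in some order
```

Under this reading, inserting non-CSAM material into the shipped database would require colluding organizations in separate jurisdictions, which is the protection Federighi describes.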

Federighi's interview is among the biggest PR pushbacks from Apple so far following the mixed public response to the announcement of the child safety features, and the company has repeatedly attempted to address users' concerns, publishing an FAQ and speaking to them directly in interviews with the media.

Top Rated Comments

thadoggfather, 15 months ago (Score: 127):
It is confusing... but they are gaslighting us into thinking it is universal confusion when there is a large subset of people with a clear understanding coupled with dissent.

AndiG, 15 months ago (Score: 112):
Federighi just doesn't understand a simple fact: if you don't need a local scanning system, don't build one.

xxray, 15 months ago (Score: 107):
Quoting Federighi: "Because it's on the [phone], security researchers are constantly able to introspect what's happening in Apple's [phone] software. So if any changes were made that were to expand the scope of this in some way — in a way that we had committed to not doing — there's verifiability, they can spot that that's happening."
How does this change the fact at all that there's now essentially a new backdoor to be abused that's installed in iOS 15?

Stop defending and get rid of this BS, Apple.

scheinderrob, 15 months ago (Score: 97):
"But think of the children" has been used for decades now to erode privacy.

People seem to think that if you are against it, you support it. They know exactly what they are doing.

JPSaltzman, 15 months ago (Score: 88):
The more you have to "explain" a controversial new "feature" over and over, the more it begins to stink.

(And I don't even use iCloud to store anything important -- especially photos!)

Mac Fly (film), 15 months ago (Score: 81):
So someone who looks at child porn photos stops using iCloud Photos. What about the rest of us who want privacy? What about future governmental interference?

