Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

In an interview with TechCrunch, Apple's Head of Privacy, Erik Neuenschwander, has responded to users' concerns about the company's plans for new child safety features that will scan messages and Photos libraries.

When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."

Neuenschwander was asked if, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision, to which he responded:

Well, while they are [two] systems they are also of a piece along with our increased interventions that will be coming in Siri and search. As important as it is to identify collections of known CSAM where they are stored in Apple's iCloud Photos service, it's also important to try to get upstream of that already horrible situation.

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:

Now, why to do it is because, as you said, this is something that will provide that detection capability while preserving user privacy. We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.

He was asked whether Apple had created a framework that law enforcement could use to scan for other kinds of content in users' libraries, and whether it undermines Apple's commitment to end-to-end encryption.

It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data... The alternative of just processing by going through and trying to evaluate users data on a server is actually more amenable to changes [without user knowledge], and less protective of user privacy... It's those sorts of systems that I think are more troubling when it comes to the privacy properties — or how they could be changed without any user insight or knowledge to do things other than what they were designed to do.

Neuenschwander was then asked whether Apple could be compelled by laws outside the United States to add non-CSAM content to the database and check for it on-device, to which he explained that there are a "number of protections built-in" to the service.

The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users, and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded, so trying to seek out even a single image from a person's device or set of people's devices won't work, because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity. And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM, and we don't believe that there's a basis on which people will be able to make that request in the U.S.

And the last point that I would just add is that it does still preserve user choice: if a user does not like this kind of functionality, they can choose not to use iCloud Photos, and if iCloud Photos is not enabled, no part of the system is functional.

Neuenschwander continued that for users who are "not into this illegal behavior, Apple gains no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."
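Neuenschwander's description amounts to a threshold scheme: a single hash match reveals nothing to Apple, and an account is only surfaced for human review once the number of matches against the known-CSAM hash list exceeds a threshold. Here is a minimal, purely illustrative sketch of that thresholding logic in Python; the hash values and threshold are made up, and the real system relies on NeuralHash, blinded hashes, and threshold secret sharing, none of which is modeled here:

```python
# Illustrative sketch only. Apple's actual system uses NeuralHash and
# cryptographic private set intersection; this models just the idea that
# nothing is flagged until a match threshold is exceeded.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # hypothetical known-CSAM hash list
THRESHOLD = 30                           # hypothetical match threshold

def count_matches(photo_hashes):
    """Count how many of an account's photo hashes appear in the known list."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def account_flagged(photo_hashes, threshold=THRESHOLD):
    """An account is flagged for manual review only once the match count
    exceeds the threshold; below it, individual matches are not reported."""
    return count_matches(photo_hashes) > threshold
```

Under this model, seeking out a single image fails by construction: one match leaves the count far below the threshold, so the account is never flagged.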

See TechCrunch's full interview with Neuenschwander for more information.


Top Rated Comments

jimbobb24
58 months ago
Shorter: "If you're not breaking the law you have nothing to fear."

I sure am glad governments never change laws, have poorly defined laws, arbitrary enforcement, and executive orders/mandates that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
Score: 75 Votes
LeeW
58 months ago

"If you're not breaking the law you have nothing to fear"
The worst argument ever when it comes to privacy.
Score: 62 Votes
Mebsat
58 months ago
As he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a battering ram to demand access to iCloud Photos. This feature does not preclude that there is CSAM stored in iCloud Photos. All Apple can claim is there is less CSAM in iCloud Photos.

If PR approved this disaster, firings must commence.
Score: 30 Votes
TheYayAreaLiving
58 months ago
Bottom line: they are scanning your iPhone, whatever you store in iCloud.
Score: 25 Votes
Cosmosent
58 months ago
Regardless of how Apple tries to spin it, the chances of an iOS 15 boycott are now real!

I expect the iPhone 13 family to come pre-loaded with iOS 14.8.
Score: 24 Votes
Jonas07
58 months ago
Bunch of ******** to hide the fact they will scan your photos and messages; you have to be stupid to believe it will only be for children between 0 and 12 years old.
Score: 24 Votes