Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

In an interview with TechCrunch, Apple's Head of Privacy, Erik Neuenschwander, has responded to some users' concerns about the company's plans for new child safety features that will scan Messages and iCloud Photos libraries.

When asked why Apple is only now choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."

Neuenschwander was asked if, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision, to which he responded:

Well, while they are [two] systems, they are also of a piece along with our increased interventions that will be coming in Siri and search. As important as it is to identify collections of known CSAM where they are stored in Apple's iCloud Photos service, it's also important to try to get upstream of that already horrible situation.

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:

Now, why to do it is because, as you said, this is something that will provide that detection capability while preserving user privacy. We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.

He was asked whether Apple had created a framework that could be used by law enforcement to scan for other kinds of content in users' libraries, and whether it undermines Apple's commitment to end-to-end encryption, to which he responded:

It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data... The alternative of just processing by going through and trying to evaluate users data on a server is actually more amenable to changes [without user knowledge], and less protective of user privacy... It's those sorts of systems that I think are more troubling when it comes to the privacy properties — or how they could be changed without any user insight or knowledge to do things other than what they were designed to do.

Neuenschwander was then asked whether Apple could be forced to comply with laws outside the United States that might require it to add non-CSAM material to the database and check for it on-device, to which he explained that there are a "number of protections built-in" to the service.

The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity.

And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM, and that we don't believe that there's a basis on which people will be able to make that request in the U.S. And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.

Neuenschwander continued that for users who are "not into this illegal behavior, Apple gains no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."
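To make the interplay of those safeguards concrete, here is a minimal sketch in Swift of the control flow Neuenschwander describes. It is illustrative only, not Apple's implementation: the type names and the threshold value are hypothetical, and the real system's cryptography, which prevents even Apple from learning match counts until the threshold is exceeded, is not modeled.

    // Illustrative sketch only; not Apple's implementation. Type names and the
    // threshold value are hypothetical. The real system relies on on-device
    // perceptual hashing plus cryptographic threshold techniques not shown here.

    /// A perceptual hash of an image, modeled as raw bytes for illustration.
    struct ImageHash: Hashable {
        let bytes: [UInt8]
    }

    enum AccountStatus {
        case clear               // Apple learns nothing about the library
        case pendingHumanReview  // flagged; manual review precedes any referral
    }

    struct SafetyModel {
        // Ships inside the one global OS image, identical for every user.
        let knownHashes: Set<ImageHash>
        // Matches at or below this count reveal nothing (hypothetical value).
        let reviewThreshold: Int

        func evaluate(uploaded: [ImageHash], iCloudPhotosEnabled: Bool) -> AccountStatus {
            // If iCloud Photos is disabled, no part of the system runs.
            guard iCloudPhotosEnabled else { return .clear }
            let matches = uploaded.filter { knownHashes.contains($0) }.count
            // Seeking out a single image cannot work: the threshold must be exceeded.
            return matches > reviewThreshold ? .pendingHumanReview : .clear
        }
    }

Even a pendingHumanReview outcome in this sketch only gates a manual check, mirroring the review stage Neuenschwander says precedes any referral to an external entity.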

See TechCrunch's full interview with Neuenschwander for more information.


Top Rated Comments

59 months ago
Shorter: “If you're not breaking the law you have nothing to fear.”

I sure am glad governments never change laws, have poorly defined laws, arbitrary enforcement, or executive orders/mandates, etc., that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
Score: 75 Votes
LeeW
59 months ago

“If you're not breaking the law you have nothing to fear”
The worst argument ever when it comes to privacy.
Score: 62 Votes
Mebsat
59 months ago
As he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a battering ram to demand access to iCloud Photos. This feature does not guarantee that no CSAM is stored in iCloud Photos. All Apple can claim is that there is less CSAM in iCloud Photos.

If PR approved this disaster, firings must commence.
Score: 30 Votes
TheYayAreaLiving 🎗️
59 months ago
Bottom line: they are scanning your iPhone, whatever you store in iCloud.
Score: 25 Votes
59 months ago
Regardless of how Apple tries to spin it, the chances of an iOS 15 boycott are now real!

I expect the iPhone 13 family to come pre-loaded with iOS 14.8.
Score: 24 Votes
Jonas07
59 months ago
A bunch of ******** to hide the fact that they will scan your photos and messages; you have to be stupid to believe it will only be for children between 0 and 12 years old.
Score: 24 Votes