Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

In an interview with TechCrunch, Apple's Head of Privacy, Erik Neuenschwander, has responded to some users' concerns about the company's plans for new child safety features that will scan messages and Photos libraries.

When asked why Apple is only now choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."

Neuenschwander was asked if, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision, to which he responded:

Well, while they are [two] systems they are also of a piece along with our increased interventions that will be coming in Siri and search. As important as it is to identify collections of known CSAM where they are stored in Apple's iCloud Photos service, it's also important to try to get upstream of that already horrible situation.

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:

Now, why to do it is because, as you said, this is something that will provide that detection capability while preserving user privacy. We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.

He was asked whether Apple had created a framework that law enforcement could use to scan for other kinds of content in users' libraries, and whether it undermines Apple's commitment to end-to-end encryption. He responded:

It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data... The alternative of just processing by going through and trying to evaluate users' data on a server is actually more amenable to changes [without user knowledge], and less protective of user privacy... It's those sorts of systems that I think are more troubling when it comes to the privacy properties — or how they could be changed without any user insight or knowledge to do things other than what they were designed to do.

Neuenschwander was then asked whether Apple could be forced to comply with laws outside the United States that might require it to add non-CSAM material to the database checked on-device, to which he explained that there are a "number of protections built-in" to the service:

The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity.

And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM, and we don't believe that there's a basis on which people will be able to make that request in the U.S.

And the last point that I would just add is that it does still preserve user choice: if a user does not like this kind of functionality, they can choose not to use iCloud Photos, and if iCloud Photos is not enabled, no part of the system is functional.

Neuenschwander continued that for users who are "not into this illegal behavior," Apple "gains no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."
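The threshold protection Neuenschwander describes can be illustrated with a minimal sketch. This is not Apple's implementation: the real system uses NeuralHash perceptual hashes and a cryptographic private set intersection protocol so that sub-threshold matches are invisible even to Apple's own servers. The hash values, the threshold number, and the function names below are all illustrative assumptions.

```python
# Conceptual sketch only. Apple's actual system uses NeuralHash perceptual
# hashes and threshold secret sharing, none of which is modeled here.
# The hash list contents and threshold value are made-up placeholders.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the on-device hash list
MATCH_THRESHOLD = 30  # stand-in value; the real threshold is not published here

def count_matches(photo_hashes):
    """Count how many of a library's photo hashes appear in the known list."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag_for_review(photo_hashes):
    """Flag an account for human review only once the match count exceeds
    the threshold; below it, no individual match is revealed to anyone."""
    return count_matches(photo_hashes) > MATCH_THRESHOLD

# A single matching photo among many non-matching ones stays below the
# threshold, so the account is not flagged.
library = ["a1b2"] + ["ffff"] * 100
assert should_flag_for_review(library) is False
```

The point of the threshold in this sketch is the one Neuenschwander makes: asking the system about any single image yields nothing, because no signal exists until an account crosses the collection threshold, and even then the flag goes to manual review rather than directly to an external entity.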

See TechCrunch's full interview with Neuenschwander for more information.

Top Rated Comments

jimbobb24
38 months ago
Shorter: "If you're not breaking the law you have nothing to fear."

I sure am glad governments never change laws, have poorly defined laws, arbitrary enforcement, and executive orders/mandates etc. that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
Score: 75 Votes
LeeW
38 months ago

"If you're not breaking the law you have nothing to fear"
The worst argument ever when it comes to privacy.
Score: 62 Votes
Mebsat
38 months ago
As he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a battering ram to demand access to iCloud Photos. This feature does not preclude that there is CSAM stored in iCloud Photos. All Apple can claim is there is less CSAM in iCloud Photos.

If PR approved this disaster, firings must commence.
Score: 30 Votes
TheYayAreaLiving
38 months ago
Bottom-line: They are scanning your iPhone. Whatever you store in iCloud.
Score: 25 Votes
Cosmosent
38 months ago
Regardless as to how Apple tries to Spin It, the chances of an iOS 15 Boycott are now real !

I expect the iPhone 13 family to come pre-loaded with iOS 14.8.
Score: 24 Votes (Like | Disagree)
Jonas07
38 months ago
Bunch of ******** to hide the fact they will scan your photos and messages; you have to be stupid to believe it will only be for children between 0-12yo.
Score: 24 Votes