Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection

Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Material (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.

Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was hidden, and rebuilt it in Python. After he uploaded his findings, another user was able to create a collision: a pair of non-matching images that share the same hash. Security researchers have warned about this possibility because collisions could allow the CSAM system to be exploited.
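
To make the collision concept concrete, here is a minimal sketch using a toy average-hash rather than Apple's NeuralHash (the real reverse-engineered model is in the GitHub repo); the file names are placeholders. Like any perceptual hash, it maps images to short bit strings, so two visually unrelated images can land on the same value:

```python
from PIL import Image

def toy_perceptual_hash(path: str, size: int = 8) -> int:
    """Toy 64-bit average hash, standing in for NeuralHash here.
    Downscale to 8x8 grayscale, then set one bit per pixel that is
    brighter than the mean. Visually similar images get similar bits;
    crucially, unrelated images *can* produce the same hash."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Placeholder file names: an ordinary photo and an image crafted to collide.
h1 = toy_perceptual_hash("ordinary_photo.png")
h2 = toy_perceptual_hash("crafted_image.png")
print(f"collision: {h1 == h2} ({h1:016x} vs {h2:016x})")
```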

In a statement to Motherboard, Apple said that the version of NeuralHash that Yvgar reverse-engineered is not the same as the final implementation that will be used with the CSAM system. Apple also said that it made the algorithm publicly available for security researchers to verify, but added that there is a second private server-side algorithm that verifies a CSAM match after the threshold is exceeded, along with human verification.

"The NeuralHash algorithm [... is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described," one of Apple's pieces of documentation reads. Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.

Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple's CSAM system, told Motherboard that if collisions "exist for this function," then he expects "they'll exist in the system Apple eventually activates."

"Of course, it's possible that they will re-spin the hash function before they deploy," he said. "But as a proof of concept, this is definitely valid," he said of the information shared on GitHub.

Because of the human element, though, another researcher, Nicholas Weaver, told Motherboard that all people can do by manipulating non-CSAM images to collide with CSAM hashes is "annoy Apple's response team with garbage images until they implement a filter" to eliminate false positives. Actually fooling Apple's system would require access to the hashes provided by NCMEC, plus the production of more than 30 colliding images, and even then the results would not fool the human reviewers.

Apple is using its NeuralHash system to compare image hashes on user devices against a database of known CSAM image hashes provided by organizations like the National Center for Missing & Exploited Children (NCMEC) to search for known CSAM. The system is designed to match known images, including copies that have been resized or recompressed, and Apple says there is a one in a trillion chance per year that a given iCloud account could be accidentally flagged.
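
The threshold is what pushes the account-level error rate down: even if individual images occasionally collide by chance, an account is only flagged after more than 30 of them do. A back-of-the-envelope binomial calculation shows the effect; the library size and per-image false-match rate below are illustrative assumptions, not Apple's figures, and the calculation assumes matches are independent:

```python
from math import comb

def prob_over_threshold(n: int, p: float, threshold: int = 30,
                        terms: int = 40) -> float:
    """Upper binomial tail P(X > threshold) for X ~ Binomial(n, p).
    For small p the terms shrink rapidly past the threshold, so a
    truncated sum is an accurate approximation."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(threshold + 1, threshold + 1 + terms))

# Illustrative only: 100,000 photos, one-in-a-million per-image false-match rate.
print(prob_over_threshold(100_000, 1e-6))  # on the order of 1e-65
```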

Apple is planning to implement the NeuralHash CSAM system in iOS 15 and iPadOS 15 as part of a suite of child safety features, a decision that has proved hugely controversial, drawing criticism from customers and privacy advocates. Apple has been attempting to reassure customers and security researchers about the system's implementation with additional documentation and executive interviews.

Top Rated Comments

locovaca
15 weeks ago
Make it all public. It’s for the children, right? What do you have to hide, Apple?
Score: 63 Votes
miniyou64
15 weeks ago
People giving Apple the benefit of the doubt here are making a tremendous amount of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!) this is invasive. Someone on Twitter mentioned what happens if someone airdrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom you’re flagged. There’s 1,000,000 ways for this system to go wrong or be exploited or worse ruin innocent peoples lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It’s over for you. This entire thing is very stupid on Apple’s part.
Score: 58 Votes
zakarhino
15 weeks ago
*amateur devs exploit the system within a few hours of discovery*

Apple: "Uhhh guys, this is totally not the finished algorithm! Believe us!"
Score: 47 Votes
dguisinger
15 weeks ago
As Rene Ritchie says on MacBreak Weekly, Apple keeps talking down to us as if we don't understand, and our response is "You don't understand, we understand and do not like this"
Score: 42 Votes
nawk
15 weeks ago
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without user’s authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small morale slip away from being an invasion of privacy on a monumental scale. Give what Snowden revealed the US government has a huge thirst for data collection like this. It’s a short hop to scan for compromising photos of your political rivals, yes?
Score: 42 Votes
LDN
15 weeks ago

Quoting locovaca: "Make it all public. It’s for the children, right? What do you have to hide, Apple?"
Yeah, this is increasingly sounding like its not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.
Score: 39 Votes
