Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection

Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Material (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.

Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was present but hidden, and rebuilt it in Python. After he uploaded his findings, another user was able to create a collision, a case in which two different images produce the same hash. Security researchers have warned that collisions like this could allow the CSAM system to be exploited.
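NeuralHash is a perceptual hash: unlike a cryptographic hash, it is designed to map visually similar images to the same value, which is also what makes collisions between genuinely different images possible. The toy "average hash" below is not Apple's algorithm, just a standard textbook perceptual hash over hypothetical 4x4 grayscale grids, but it illustrates how two different images can end up sharing a hash:

```python
# Toy perceptual hash: each pixel becomes a bit depending on whether it is
# above the image's mean brightness. Lossy by design, so collisions exist.
def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

# Two different "images" (hypothetical pixel data) with the same
# above/below-mean pattern, and therefore the same hash.
img_a = [[200, 50, 50, 50],
         [50, 200, 50, 50],
         [50, 50, 200, 50],
         [50, 50, 50, 200]]
img_b = [[180, 10, 10, 10],
         [10, 180, 10, 10],
         [10, 10, 180, 10],
         [10, 10, 10, 180]]

print(img_a == img_b)                              # False: different pixels
print(average_hash(img_a) == average_hash(img_b))  # True: a collision
```

Real perceptual hashes (including NeuralHash, which uses a neural network) are far more robust than this sketch, but the underlying trade-off is the same: tolerance to small image changes necessarily means distinct images can hash identically.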

In a statement to Motherboard, Apple said that the version of NeuralHash that Yvgar reverse-engineered is a generic version, and not the final implementation that will be used for iCloud Photos CSAM detection. Apple said that it made the algorithm publicly available so that security researchers can verify it, and that a second, private server-side algorithm verifies a CSAM match after the threshold is exceeded, along with human verification.

"The NeuralHash algorithm [... is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described," one of Apple's documents reads. Apple also said that after a user passes the 30-match threshold, the second, non-public algorithm that runs on Apple's servers will check the results.

Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple's CSAM system, told Motherboard that if collisions "exist for this function," then he expects "they'll exist in the system Apple eventually activates."

"Of course, it's possible that they will re-spin the hash function before they deploy," he said. "But as a proof of concept, this is definitely valid," he said of the information shared on GitHub.

Because of the human review element, though, another researcher, Nicholas Weaver, told Motherboard that the most anyone could do by crafting non-CSAM images that collide with CSAM hashes is "annoy Apple's response team with garbage images until they implement a filter" to eliminate false positives. Actually fooling Apple's system would also require access to the hashes provided by NCMEC, as well as the production of more than 30 colliding images, and even then the result would not get past the human oversight.

Apple's NeuralHash system compares hashes of images on user devices against a database of image hashes provided by organizations like the National Center for Missing & Exploited Children (NCMEC) in order to detect CSAM. The system is designed to produce exact matches, and Apple says there is a one in a trillion chance that an iCloud account will be accidentally flagged.
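To see why a per-account threshold drives the accidental-flag rate so low, false matches can be modeled as independent coin flips, making the flag probability a binomial tail. The numbers below are back-of-envelope assumptions for illustration (the per-image false-match probability and photo-library size are invented); only the general threshold mechanism comes from the article, and Apple has not published this exact model.

```python
import math

def flag_probability(n_photos, p_false_match, threshold=30):
    """Probability that more than `threshold` of `n_photos` independently
    false-match, each with probability `p_false_match` (binomial tail)."""
    return sum(math.comb(n_photos, k)
               * p_false_match**k
               * (1 - p_false_match)**(n_photos - k)
               for k in range(threshold + 1, n_photos + 1))

# Assumed numbers: a 1,000-photo library and a one-in-a-million
# per-image false-match rate. Exceeding 30 matches by accident is
# then astronomically unlikely.
print(flag_probability(1_000, 1e-6))
```

The design point the model illustrates: even a comparatively weak per-image hash becomes extremely hard to trip accidentally once dozens of independent matches are required on a single account.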

Apple is planning to implement the NeuralHash CSAM system in iOS 15 and iPadOS 15 as part of a suite of child safety features. The decision has been hugely controversial, drawing criticism from customers and privacy advocates alike, and Apple has been attempting to reassure them about the system's implementation with additional documentation and executive interviews.


Top Rated Comments

locovaca (51 months ago):
Make it all public. It’s for the children, right? What do you have to hide, Apple?

miniyou64 (51 months ago):
People giving Apple the benefit of the doubt here are making a tremendous amount of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!) this is invasive. Someone on Twitter mentioned what happens if someone airdrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom you’re flagged. There’s 1,000,000 ways for this system to go wrong or be exploited or worse ruin innocent peoples lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It’s over for you. This entire thing is very stupid on Apple’s part.

zakarhino (51 months ago):
*amateur devs exploit the system within a few hours of discovery*

Apple: "Uhhh guys, this is totally not the finished algorithm! Believe us!"

dguisinger (51 months ago):
As Rene Ritchie says on MacBreak Weekly, Apple keeps talking down to us as if we don't understand, and our response is "You don't understand, we understand and do not like this"

nawk (51 months ago):
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without user’s authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small morale slip away from being an invasion of privacy on a monumental scale. Give what Snowden revealed the US government has a huge thirst for data collection like this. It’s a short hop to scan for compromising photos of your political rivals, yes?

LDN (51 months ago), replying to locovaca:
"Make it all public. It’s for the children, right? What do you have to hide, Apple?"

Yeah, this is increasingly sounding like its not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.