Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection

Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Materials (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.

Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was hidden, and rebuilt it in Python. After he uploaded his findings, another user was able to create a collision, a case where two non-matching images produce the same hash. Security researchers have warned that such collisions could allow the CSAM system to be exploited.
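To make the idea of a collision concrete, here is a minimal sketch in Python, the language Yvgar used for his rebuild. It stands in a simple "average hash" for NeuralHash, since neither Apple's model nor Yvgar's exact code is reproduced here, and the file names are placeholders.

```python
# Toy illustration of a perceptual-hash collision. This uses a basic
# "average hash," NOT Apple's NeuralHash or Yvgar's reconstruction of it.
from PIL import Image  # Pillow

def average_hash(path, size=8):
    """Downscale to an 8x8 grayscale image and set one bit per pixel,
    depending on whether it is brighter than the mean. Visually similar
    images tend to produce the same 64-bit value."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def is_collision(path_a, path_b):
    """Two non-matching images 'collide' when their hashes are identical."""
    return average_hash(path_a) == average_hash(path_b)

if __name__ == "__main__":
    # Placeholder file names for illustration only.
    print(hex(average_hash("image_a.png")))
    print(is_collision("image_a.png", "image_b.png"))
```

The point of the demonstration on GitHub is that an attacker who knows the hash function can craft a second, visually unrelated image that lands on the same hash value.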

In a statement to Motherboard, Apple said that the version of NeuralHash that Yvgar reverse-engineered is not the same as the final implementation that will be used with the CSAM system. Apple also said that it made the algorithm publicly available for security researchers to verify, but that there is a second, private server-side algorithm that verifies a CSAM match after the threshold is exceeded, along with human verification.

In an email to Motherboard, Apple characterized the version analyzed by users on GitHub as a generic version of the algorithm, rather than the final version that will be used for iCloud Photos CSAM detection.

"The NeuralHash algorithm [... is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described," one of Apple's pieces of documentation reads. Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.

Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple's CSAM system, told Motherboard that if collisions "exist for this function," then he expects "they'll exist in the system Apple eventually activates."

"Of course, it's possible that they will re-spin the hash function before they deploy," he said. "But as a proof of concept, this is definitely valid," he said of the information shared on GitHub.

Because of the human element, though, another researcher, Nicholas Weaver, told Motherboard that the most an attacker could achieve by crafting non-CSAM images that collide with CSAM hashes is to "annoy Apple's response team with garbage images until they implement a filter" to eliminate false positives. Actually fooling Apple's system would require access to the hashes provided by NCMEC, the production of more than 30 colliding images, and a way past the human reviewers who check flagged matches.

Apple is using its NeuralHash system to match images on user devices against a database of image hashes provided by organizations like the National Center for Missing & Exploited Children (NCMEC) in order to detect CSAM. The system is designed to flag only exact matches to known images, not visually similar content, and Apple says there is a one in one trillion chance per year that a given iCloud account is incorrectly flagged.
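The one-in-a-trillion figure is a per-account, per-year estimate that follows from the match threshold: a handful of accidental collisions does nothing, because an account is only flagged after more than 30 of them. Apple has not published the per-image false-positive rate behind its math, so the inputs in the sketch below are assumptions chosen purely to show how a threshold drives the probability down.

```python
# Illustrative arithmetic only: the per-image false-positive rate and photo
# count below are assumptions, not Apple's published figures.
from math import exp

def prob_account_flagged(per_image_fp_rate, photos_per_year, threshold=30, terms=200):
    """Poisson estimate of the chance an account exceeds the match threshold
    purely by accident, i.e. sees more than `threshold` false matches."""
    lam = per_image_fp_rate * photos_per_year  # expected accidental matches
    prob, term = 0.0, exp(-lam)                # term starts at P(X = 0)
    for k in range(1, threshold + 1 + terms):
        term *= lam / k                        # P(X = k) from P(X = k - 1)
        if k > threshold:
            prob += term
    return prob

# Assuming a 1-in-a-million per-image rate and 10,000 photos uploaded a year,
# the chance of exceeding 30 accidental matches is astronomically small.
print(prob_account_flagged(1e-6, 10_000))
```

The calculation says nothing about deliberate attacks with crafted collisions, which is the scenario the researchers above are debating; it only covers accidental matches.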

Apple is planning to implement the NeuralHash CSAM system in iOS and iPadOS 15 as part of a suite of child safety features, and the decision has proven hugely controversial, drawing criticism from customers and privacy advocates. Apple has been attempting to reassure customers and security researchers about the implementation of the system with additional documentation and executive interviews.

Top Rated Comments

locovaca
57 months ago
Make it all public. It’s for the children, right? What do you have to hide, Apple?
Score: 63 Votes
miniyou64
57 months ago
People giving Apple the benefit of the doubt here are making a tremendous amount of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!) this is invasive. Someone on Twitter mentioned what happens if someone airdrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom you’re flagged. There are 1,000,000 ways for this system to go wrong or be exploited or worse ruin innocent people’s lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It’s over for you. This entire thing is very stupid on Apple’s part.
Score: 58 Votes
zakarhino
57 months ago
*amateur devs exploit the system within a few hours of discovery*

Apple: "Uhhh guys, this is totally not the finished algorithm! Believe us!"
Score: 47 Votes
dguisinger
57 months ago
As Rene Ritchie says on MacBreak Weekly, Apple keeps talking down to us as if we don't understand, and our response is "You don't understand, we understand and do not like this"
Score: 42 Votes
nawk
57 months ago
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without user’s authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
Score: 42 Votes
PharaohVision
57 months ago

Make it all public. It’s for the children, right? What do you have to hide, Apple?
Yeah, this is increasingly sounding like it’s not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple is going to have to pull this whole thing.
Score: 39 Votes