Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection

Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Material (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.

Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was hidden, and he rebuilt it in Python. After he uploaded his findings, another user was able to create a collision, an issue where two non-matching images share the same hash. Security researchers have warned about this possibility because the potential for collisions could allow the CSAM system to be exploited.
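To illustrate what a collision means here, consider a toy average hash, a deliberately simplified stand-in and not Apple's neural-network-based NeuralHash, which shows how two visibly different images can produce the same digest:

```python
# Toy average hash to illustrate a collision (NOT Apple's NeuralHash).
# Each pixel is compared to the image's mean brightness, producing one
# bit per pixel, so many different images collapse to the same hash.

def average_hash(pixels):
    """Hash a 2D grayscale image (rows of 0-255 ints) to an integer."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p > mean else '0' for p in flat)
    return int(bits, 2)

img_a = [[10, 200], [200, 10]]   # one tiny "image"
img_b = [[40, 180], [220, 30]]   # a different image, same bright/dark layout

print(average_hash(img_a) == average_hash(img_b))  # -> True: a collision
```

Real perceptual hashes are far more sophisticated, but the failure mode is the same: any function that compresses images into short fingerprints admits collisions, and adversaries can search for them deliberately.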

In a statement to Motherboard, Apple said that the version of NeuralHash that Yvgar reverse-engineered is not the same as the final implementation that will be used with the CSAM system. Apple described the code analyzed by users on GitHub as a generic version, and said that it made the algorithm publicly available so that security researchers can verify it.

"The NeuralHash algorithm [... is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described," one of Apple's pieces of documentation reads. Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results, followed by human verification.

Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple's CSAM system, told Motherboard that if collisions "exist for this function," then he expects "they'll exist in the system Apple eventually activates."

"Of course, it's possible that they will re-spin the hash function before they deploy," he said. "But as a proof of concept, this is definitely valid," he said of the information shared on GitHub.

Because of the human review element, though, another researcher, Nicholas Weaver, told Motherboard that the most attackers could accomplish by manipulating non-CSAM images to match CSAM hashes is to "annoy Apple's response team with garbage images until they implement a filter" to eliminate false positives. Actually fooling Apple's system would also require access to the hashes provided by NCMEC and the production of more than 30 colliding images, and even then the results would not get past human oversight.

Apple's NeuralHash system matches images on user devices against a database of image hashes provided by organizations like the National Center for Missing & Exploited Children (NCMEC) to detect CSAM. The system is designed to produce exact matches, and Apple says there is a one in a trillion chance that an iCloud account could be accidentally flagged.
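Conceptually, the flagging step reduces to counting matches against the hash database and comparing the count to a threshold. The sketch below uses hypothetical names and plain sets; Apple's actual design performs this matching privately (using threshold secret sharing, so the device never learns individual match results), but the threshold logic is the same:

```python
# Sketch of threshold-based flagging (hypothetical; not Apple's implementation).
THRESHOLD = 30  # Apple's stated match threshold

def should_flag(photo_hashes, known_hashes):
    """Flag only if the number of database matches meets the threshold."""
    matches = sum(1 for h in photo_hashes if h in known_hashes)
    return matches >= THRESHOLD

known = set(range(100))                     # stand-in for the NCMEC hash set
print(should_flag(list(range(10)), known))  # -> False: below threshold
print(should_flag(list(range(35)), known))  # -> True: crosses threshold
```

The threshold is why isolated collisions alone are not supposed to trigger review: an account must accumulate more than 30 matches before Apple's server-side check and human review come into play.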

Apple is planning to implement the NeuralHash CSAM system in iOS 15 and iPadOS 15 as part of a suite of child safety features, a decision that has proven hugely controversial, drawing criticism from customers and privacy advocates. Apple has been attempting to reassure both groups about the implementation of the system with additional documentation and executive interviews.

Top Rated Comments

locovaca
13 months ago
Make it all public. It’s for the children, right? What do you have to hide, Apple?
Score: 63 Votes
miniyou64
13 months ago
People giving Apple the benefit of the doubt here are making a tremendous amount of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!) this is invasive. Someone on Twitter mentioned what happens if someone AirDrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom, you’re flagged. There are 1,000,000 ways for this system to go wrong, be exploited, or, worse, ruin innocent people’s lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It’s over for you. This entire thing is very stupid on Apple’s part.
Score: 58 Votes
zakarhino
13 months ago
*amateur devs exploit the system within a few hours of discovery*

Apple: "Uhhh guys, this is totally not the finished algorithm! Believe us!"
Score: 47 Votes
dguisinger
13 months ago
As Rene Ritchie says on MacBreak Weekly, Apple keeps talking down to us as if we don't understand, and our response is, "You don't understand; we understand and do not like this."
Score: 42 Votes
nawk
13 months ago
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without users’ authorization in the name of child welfare?

While I am sure some people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
Score: 42 Votes
LDN
13 months ago

Make it all public. It’s for the children, right? What do you have to hide, Apple?

Yeah, this is increasingly sounding like it’s not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.
Score: 39 Votes

