Corellium Launching New Initiative to Hold Apple Accountable Over CSAM Detection Security and Privacy Claims

Security research firm Corellium this week announced it is launching a new initiative that will "support independent public research into the security and privacy of mobile applications," and one of the initiative's first projects will be Apple's recently announced CSAM detection plans.

Since its announcement earlier this month, Apple's plan to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, has received considerable backlash and criticism. Most concerns revolve around how the technology used to detect CSAM could be repurposed to scan for other types of photos in a user's library, possibly at the request of an oppressive government.

Apple will check for CSAM photos in a user's photo library by comparing hashes of the user's pictures against a database of hashes of known CSAM images. The company has firmly pushed back against the idea that it will allow governments to add images to or remove images from that database, rejecting the possibility that content other than CSAM could get flagged in a user's iCloud Photo Library.
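The matching step described above can be illustrated with a simplified sketch. To be clear, this is not Apple's implementation: the real system uses its NeuralHash perceptual hash together with cryptographic protocols (private set intersection and threshold secret sharing), and the hash function and database below are hypothetical stand-ins. The sketch only shows the core idea of comparing an image's hash against a set of known hashes:

```python
# Simplified illustration of hash-database matching. NOT Apple's actual
# NeuralHash / private-set-intersection protocol; the hash function and
# database contents here are hypothetical.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system would use something
    # like NeuralHash, which tolerates resizing and recompression;
    # SHA-256 matches only byte-identical files.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known CSAM images. In practice the
# device holds only opaque hash values, never the images themselves.
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def is_flagged(image_bytes: bytes) -> bool:
    # An image is flagged only if its hash appears in the database;
    # images not in the database produce no match.
    return image_hash(image_bytes) in known_hashes

print(is_flagged(b"known-image-1"))  # True
print(is_flagged(b"family-photo"))   # False
```

The point Federighi's argument rests on is that the flagging logic depends entirely on the contents of the hash database, which is why researchers want the ability to verify that the database holds only CSAM hashes.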

In an interview with The Wall Street Journal, Apple's senior vice president of software engineering, Craig Federighi, said that the on-device nature of Apple's CSAM detection method, unlike cloud-based approaches such as Google's, allows security researchers to verify the company's claim that the database of CSAM hashes has not been improperly altered.

Security researchers are constantly able to introspect what's happening in Apple's software, so if any changes were made that were to expand the scope of this in some way—in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

Corellium's new initiative, called the "Corellium Open Security Initiative," aims to put Federighi's claim to the test. As part of the initiative, Corellium will award security researchers a $5,000 grant and a full year of free access to the Corellium platform to support their research.

Corellium believes that this new initiative will allow security researchers, hobbyists, and others to validate Apple's claims about its CSAM detection method. The security research firm, which recently settled its long-running legal dispute with Apple, says it applauds Apple's "commitment to holding itself accountable by third-party researchers."

We hope that other mobile software vendors will follow Apple's example in promoting independent verification of security and privacy claims. To encourage this important research, for this initial pilot of our Security Initiative, we will be accepting proposals for research projects designed to validate any security and privacy claims for any mobile software vendor, whether in the operating system or third-party applications.

Security researchers and others interested in being part of the initiative have until October 15, 2021, to apply. More details can be found on Corellium's website.

Top Rated Comments

adib
9 weeks ago
For the first few months of iOS 15, I'm confident that the database will contain only CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agendas and force Apple to include "extra hashes" that are not part of CSAM...
Score: 31 Votes
femike
9 weeks ago
Sadly, as expected, users will just roll over and accept it no matter what Apple is found doing. The public has a short memory. That does not make it any less wrong: it is still an appalling decision that should be rescinded.
Score: 24 Votes
brucewayne
9 weeks ago
The reason Apple has been able to stave off warrant requests in the past is that it could claim "they don't have the key."

The current administration (as well as governments around the world) has been pushing for the ability to access your messages. CSAM detection gives Apple a chance to "create" its own backdoor under noble pretenses (who is going to argue against stopping child abuse?) while creating an opening for governments to eventually exploit. It won't matter what Corellium finds now.

And when it happens, Tim Cook will get up on stage and, in his soothing southern drawl, claim to be the good guy because they had the best of intentions. They won't even lose any customers over it, because most people are oblivious to privacy (Amazon has sold 100 million Alexa-powered products), and the people who do care will have nowhere to go once the precedent is set and Google, Amazon, and Microsoft have joined in.
Score: 23 Votes
Substance90
9 weeks ago
The fact that the analysis is done on device makes it even worse. It means your privacy is invaded even with all network connections turned off.

EDIT: Let me elaborate for the downvoters. If photos are scanned only when they are uploaded to a cloud service, you don't even have to cut your network connection; you just keep your photos on your device and you're safe. If the scanning is done on device, your privacy is not guaranteed whether you keep your photos offline or even cut your network connection entirely.
Score: 12 Votes
brucewayne
9 weeks ago

So you don't think the below applies in this case?

https://yourlogicalfallacyis.com/slippery-slope

I guess we'll have to wait and see, and hopefully Apple will be open about what they add to that hash list. If it can also be monitored by external initiatives such as Corellium, I think that's good.

I think we have 20 years of increasing government intrusion to conclude that if A happens, Z won't be far behind.

Liberty once lost is lost forever.
Score: 12 Votes
bobcomer
9 weeks ago

Likely 18 U.S. Code § 2258 ('https://www.law.cornell.edu/uscode/text/18/2258') - Failure to report child abuse and related laws:
* 18 U.S. Code § 2258A ('https://www.law.cornell.edu/uscode/text/18/2258A') - Reporting requirements of providers
* 18 U.S. Code § 2258B ('https://www.law.cornell.edu/uscode/text/18/2258B') - Limited liability for providers or domain name registrars
* 18 U.S. Code § 2258C ('https://www.law.cornell.edu/uscode/text/18/2258C')
* 18 U.S. Code § 2258D ('https://www.law.cornell.edu/uscode/text/18/2258D') - Limited liability for NCMEC
* 18 U.S. Code § 2258E ('https://www.law.cornell.edu/uscode/text/18/2258E') - Definitions
None of those requires on-device scanning.
Score: 11 Votes

Related Stories

Apple Appeals Corellium Copyright Lawsuit Loss After Settling Other Claims

Tuesday August 17, 2021 7:23 pm PDT
Back in December, Apple lost a copyright lawsuit against security research company Corellium, and today, Apple filed an appeal in that case, reports Reuters. The judge in the copyright case determined that Corellium was operating under fair use terms and that its use of iOS was permissible, throwing out several of Apple's claims. For those unfamiliar with Corellium, the software is designed...
Security Researchers Unhappy With Apple's Bug Bounty Program

Thursday September 9, 2021 10:00 am PDT
Apple offers a bug bounty program that's designed to pay security researchers for discovering and reporting critical bugs in Apple operating systems, but researchers are not happy with how it operates or Apple's payouts in comparison to other major tech companies, reports The Washington Post. In interviews with more than two dozen security researchers, The Washington Post collected a number...
Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

Friday October 15, 2021 12:23 am PDT
More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times). The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called...
Apple and Corellium Agree on Settlement to Bring Lawsuit to an End

Tuesday August 10, 2021 11:36 pm PDT
Apple this week dropped its long-standing lawsuit against Corellium, the security research company that provides security researchers with a replica of the iOS operating system, allowing them to locate possible security exploits within Apple's mobile operating system, The Washington Post reports. Apple filed a lawsuit against Corellium in 2019, claiming the security company was infringing...
Apple Researching Ways to Use iPhone Camera to Detect Childhood Autism

Tuesday September 21, 2021 2:56 am PDT
Apple is reportedly researching ways to use the cameras inside of the iPhone to detect childhood autism, aiming to use data from the camera to observe a child's behavior that could be used for early diagnosis, according to a new report from The Wall Street Journal. According to the report, which echoes previously announced research efforts, Apple wants to be able to use the camera inside of...
Apple Not Trying Hard Enough to Protect Users Against Surveillance, Researchers Say

Friday July 23, 2021 6:46 am PDT
Following the news of widespread commercial hacking spyware on targeted iPhones, a large number of security researchers are now saying that Apple could do more to protect its users (via Wired). Earlier this week, it was reported that journalists, lawyers, and human rights activists around the world had been targeted by governments using phone malware made by the surveillance firm NSO Group...
Apple Apologizes to Researcher for Ignoring iOS Vulnerabilities, Says It's 'Still Investigating'

Monday September 27, 2021 12:55 pm PDT
Last week, security researcher Denis Tokarev made several zero-day iOS vulnerabilities public after he said that Apple had ignored his reports and had failed to fix the issues for several months. Tokarev today told Motherboard that Apple got in touch after he went public with his complaints and after they saw significant media attention. In an email, Apple apologized for the contact delay...
University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology

Friday August 20, 2021 5:48 am PDT
Respected university researchers are sounding the alarm bells over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling the technology "dangerous." Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, as well as Anunay Kulshrestha, a researcher at Princeton University...
Researcher Says Apple Ignored Three Zero-Day Security Vulnerabilities Still Present in iOS 15

Friday September 24, 2021 10:42 am PDT
In 2019, Apple opened its Security Bounty Program to the public, offering payouts up to $1 million to researchers who share critical iOS, iPadOS, macOS, tvOS, or watchOS security vulnerabilities with Apple, including the techniques used to exploit them. The program is designed to help Apple keep its software platforms as safe as possible. In the time since, reports have surfaced indicating...
Security Experts Warn of Apple Pay Express Transit Hack That Enables Large Unauthorized Visa Payments From Locked iPhones

Thursday September 30, 2021 12:14 am PDT
Researchers in the U.K. have demonstrated how large unauthorized contactless payments can be made on locked iPhones by exploiting Apple Pay's Express Transit feature when set up with Visa. Express Transit is an Apple Pay feature that allows for tap-and-go payment at ticket barriers, eliminating the need to authenticate with Face ID, Touch ID, or a passcode. The device does not need to be...