Corellium Launching New Initiative to Hold Apple Accountable Over CSAM Detection Security and Privacy Claims

Security research firm Corellium this week announced it is launching a new initiative that will "support independent public research into the security and privacy of mobile applications," and one of the initiative's first projects will be Apple's recently announced CSAM detection plans.

Since its announcement earlier this month, Apple's plan to scan iPhone users' photo libraries for CSAM, or Child Sexual Abuse Material, has received considerable backlash and criticism. Most concerns revolve around how the technology used to detect CSAM could be repurposed to scan for other types of photos in a user's library, possibly at the request of an oppressive government.

Apple will check for CSAM in a user's photo library by comparing the hashes of the user's pictures against a database of known CSAM images. The company has firmly pushed back against the idea that it will allow governments to add images to or remove images from that database, rejecting the possibility that content other than CSAM could get flagged in a user's iCloud Photo Library.
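Apple's actual system uses a perceptual hash (NeuralHash) combined with a private set intersection protocol, but the basic matching step described above can be illustrated with a deliberately simplified sketch that uses exact cryptographic hashes instead. The database contents and function name below are hypothetical:

```python
import hashlib

# Hypothetical database of hashes of known flagged images. In Apple's
# real system this is a database of NeuralHash values, not SHA-256
# digests, and matching happens via private set intersection.
KNOWN_HASH_DB = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASH_DB

print(is_flagged(b"example-known-image-bytes"))  # True
print(is_flagged(b"some-other-photo"))           # False
```

Note the key limitation of this sketch: an exact hash changes completely if even one pixel changes, which is why Apple uses a perceptual hash that tolerates resizing and re-encoding of an image.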

In an interview with The Wall Street Journal, Apple's senior vice president of software engineering, Craig Federighi, said that the on-device nature of Apple's CSAM detection method, in contrast to approaches like Google's that run in the cloud, allows security researchers to verify the company's claim that the database of CSAM images is not wrongly altered.

Security researchers are constantly able to introspect what's happening in Apple's software, so if any changes were made that were to expand the scope of this in some way—in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

Corellium's new initiative, called the "Corellium Open Security Initiative," aims to put Federighi's claim to the test. As part of the initiative, Corellium will award security researchers a $5,000 grant and free access to the Corellium platform for an entire year to allow for research.

Corellium believes that this new initiative will allow security researchers, hobbyists, and others to validate Apple's claims about its CSAM detection method. The security research firm, which only recently settled its long-running legal dispute with Apple, says it applauds Apple's "commitment to holding itself accountable by third-party researchers."

We hope that other mobile software vendors will follow Apple's example in promoting independent verification of security and privacy claims. To encourage this important research, for this initial pilot of our Security Initiative, we will be accepting proposals for research projects designed to validate any security and privacy claims for any mobile software vendor, whether in the operating system or third-party applications.

Security researchers and others interested in being part of the initiative have until October 15, 2021, to apply. More details can be found on Corellium's website.

Top Rated Comments

adib
24 months ago
For the first few months of iOS 15, I'm confident that the database just contains CSAM image fingerprints. However as time passes (and as Corellium's interest wanes), other authorities will push their agenda and force Apple's compliance to include "extra hashes" that are not part of CSAM....
Score: 31 Votes
femike
24 months ago
Sadly, as expected, users will just roll over and accept it no matter what Apple is found doing. The public has a short memory. That does not make it any less wrong. It is still an appalling decision which should be rescinded.
Score: 24 Votes
brucewayne
24 months ago
Apple has been able to stave off warrant requests in the past by claiming 'they don't have the key.'

The current administration (as well as governments around the world) has been pushing for the ability to access your messages. CSAM gives Apple a chance to 'create' its own backdoor under noble pretenses (who is going to argue against stopping child abuse?) while creating an opening for governments to eventually exploit. It won't matter what Corellium finds now.

And when it happens, Tim Cook will get up on stage and in his soothing southern drawl claim to be the good guy, as they had the best of intentions. They won't even lose any customers over it because most people are oblivious to privacy (Amazon has sold 100 million Alexa-powered products), and the people who do care will have nowhere to go after the precedent is set and Google / Amazon / Microsoft have joined in.
Score: 23 Votes
Substance90
24 months ago
The fact that the analysis is done on device is even worse. That means your privacy is invaded even with all network connections turned off.

EDIT: Let me elaborate for the downvoters - if photos are scanned only when uploaded to some cloud, you don't even have to cut your network connection. You just keep your photos on your device and you're safe. If the scanning is done on device, your privacy is not guaranteed whether you keep your photos offline or even cut your network connection entirely.
Score: 12 Votes
brucewayne
24 months ago

So you don't think the below applies in this case?

https://yourlogicalfallacyis.com/slippery-slope

I guess we'll have to wait and see, and hopefully Apple will be open about what they add to that hash list. If it can also be monitored by external initiatives such as Corellium, I think that's good.
I think we have 20 years of increasing government intrusion to conclude that if A happens Z won't be far behind.

Liberty once lost is lost forever.
Score: 12 Votes
bobcomer
24 months ago

Likely 18 U.S. Code § 2258 ('https://www.law.cornell.edu/uscode/text/18/2258') - Failure to report child abuse and related laws:
* 18 U.S. Code § 2258A ('https://www.law.cornell.edu/uscode/text/18/2258A') - Reporting requirements of providers
* 18 U.S. Code § 2258B ('https://www.law.cornell.edu/uscode/text/18/2258B') - Limited liability for providers or domain name registrars
* 18 U.S. Code § 2258C ('https://www.law.cornell.edu/uscode/text/18/2258C')
* 18 U.S. Code § 2258D ('https://www.law.cornell.edu/uscode/text/18/2258D') - Limited liability for NCMEC
* 18 U.S. Code § 2258E ('https://www.law.cornell.edu/uscode/text/18/2258E') - Definitions
None of those require on device scanning.
Score: 11 Votes
