European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material on Digital Platforms

The European Commission is set to release a draft law this week that could require tech companies like Apple and Google to identify, remove and report to law enforcement illegal images of child abuse on their platforms, claims a new report out today.

European Commission
According to a leak of the proposal obtained by Politico, the EC believes voluntary measures taken by some digital companies have thus far "proven insufficient" in addressing the increasing misuse of online services for the purposes of sharing child sexual abuse content, which is why the commission wants to make detection of such material mandatory.

After months of lobbying, groups representing tech companies and children's rights organizations are said to be waiting to see how stringent the rules will be, and how they will work without requiring tech companies to scan the full range of user content – a practice the Court of Justice of the European Union deemed illegal in 2016.

Beyond how identification of illegal material would operate under the law, privacy groups and tech companies are worried that the EU executive's proposal could result in the creation of backdoors to end-to-end encrypted messaging services, whose contents cannot be accessed by the hosting platform.

The EC's Home Affairs Commissioner Ylva Johansson has said technical solutions exist to keep conversations safe while finding illegal content, but cybersecurity experts disagree.

"The EU shouldn't be proposing things that are technologically impossible," Ella Jakubowska, a policy adviser at European Digital Rights (EDRi), a network of 45 non-governmental organizations (NGOs), told Politico.

"The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented," said Jakubowska.

MEPs are far from aligned on the issue, however. Reacting to the leak of the proposal, centrist Renew Europe MEP Moritz Körner told Politico the Commission's proposal would mean "the privacy of digital correspondence would be dead."

The heated debate mirrors last year's controversy surrounding Apple's plan to search for CSAM (child sexual abuse material) on iPhones and iPads.

Apple in August 2021 announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for CSAM and Communication Safety to warn children and their parents when receiving or sending sexually explicit photos. The latter, and arguably less controversial, feature is already live on Apple's iMessage platform. Apple's method of scanning for CSAM has yet to be deployed.
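The core idea behind this kind of scanning is matching an image against a list of digests of known illegal material, rather than analyzing image content directly. The following is a toy sketch only: Apple's proposed system used a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, whereas the cryptographic hash used here for simplicity only matches byte-identical files; the hash list below is a placeholder, not real data.

```python
import hashlib

# Placeholder "known material" list; this entry is simply the SHA-256 of b"test".
# Real systems would use a database of perceptual hashes supplied by clearinghouses.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_if_known(image_bytes: bytes) -> bool:
    """Return True if the file's digest appears on the known-hash list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(flag_if_known(b"test"))   # True: this digest is on the list
print(flag_if_known(b"other"))  # False: unknown content is not flagged
```

The privacy debate centers on where this lookup runs: on the user's device before upload (Apple's proposal) or server-side on the platform's infrastructure, which is impossible for end-to-end encrypted content.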

Following Apple's announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to dispel some misunderstandings and reassure users by releasing detailed information and sharing interviews with company executives. However, despite these efforts, the controversy didn't go away, and Apple decided to delay the rollout of CSAM detection following the torrent of criticism.

Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

In December 2021, Apple quietly removed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads hung in the balance following significant criticism of its methods.

However, Apple says its plans for CSAM detection have not changed since September, which suggests CSAM detection in some form is still coming in the future.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

contacos Avatar
29 months ago
and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Score: 24 Votes (Like | Disagree)
rme Avatar
29 months ago

and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Europe is once again heading towards a very very dark place. Was always obvious that ever more centralisation of power and ever bigger empire was going to lead to misery.
Score: 17 Votes (Like | Disagree)
AdonisSMU Avatar
29 months ago

Eh- not a fan of this.

Inevitably, "think of the children" always wins.
Also not a fan. EU is doing too much As. Per usual. The EU doesnt know how to leave people alone. The its for a good cause is a terrible argument. Do the people of the EU get to vote for the representation in the EU?
Score: 16 Votes (Like | Disagree)
Mac Fly (film) Avatar
29 months ago

and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Careful with having an opinion around here. MR don’t love that; you may get a temporary ban for such. Live in the EU and hate the EU. Centralised power corrupts. If child porn was the issue we’d know Maxwell’s client list and the court case transcript would be made public. Alas, child porn only matters when the perps are not wealthy and powerful.
Score: 11 Votes (Like | Disagree)
VulchR Avatar
29 months ago
Many posting above assume that Apple's local CSAM-detecting spying software is a response to pressure from governments like the US and EU. Perhaps. However, it is likely that Apple's system has given encouragement to governments that want 24/7 surveillance on our private lives. I can just picture authoritarian legislators now: 'See? Apple has a system that guarantees privacy [sic], so we can move ahead with this requirement for surveillance'.

Criminal investigations should begin with detection of crime. Global surveillance should not be used for the prevention of crime. The cost to liberty is too high.
Score: 11 Votes (Like | Disagree)
Abazigal Avatar
29 months ago
I previously opined that Apple’s implementation was them trying to have their cake and eat it too – find a way to detect illegal material on one’s device without human intervention, thus preserving one’s privacy.

I continue to stand by this statement, and I believe that if Apple were ever to roll out said feature, it would be the least invasive means of scanning for CSAM compared to what the other companies are doing.

We also know now why Apple was exploring such a feature in the first place. Totally makes sense now.
Score: 10 Votes (Like | Disagree)