European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material on Digital Platforms

The European Commission is set to release a draft law this week that could require tech companies like Apple and Google to identify, remove, and report to law enforcement illegal images of child abuse on their platforms, according to a new report out today.

According to a leak of the proposal obtained by Politico, the EC believes voluntary measures taken by some digital companies have thus far "proven insufficient" in addressing the increasing misuse of online services for the purposes of sharing child sexual abuse content, which is why the commission wants to make detection of such material mandatory.

After months of lobbying, groups representing tech companies and children's rights organizations are said to be waiting to see how stringent the rules could be, and how they will work without tech companies having to scan the gamut of user content – a practice deemed illegal by the Court of Justice of the European Union in 2016.

Apart from how identification of illegal material would operate within the law, privacy groups and tech companies are worried that the EU executive's proposal could result in the creation of backdoors to end-to-end encrypted messaging services, whose contents cannot be accessed by the hosting platform.

The EC's Home Affairs Commissioner Ylva Johansson has said technical solutions exist to keep conversations safe while finding illegal content, but cybersecurity experts disagree.

"The EU shouldn't be proposing things that are technologically impossible," said Ella Jakubowska, speaking to Politico. Jakubowska is a policy adviser at European Digital Rights (EDRi), a network of 45 non-governmental organizations (NGOs).

"The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented," said Jakubowska.

MEPs are far from aligned on the issue, however. Reacting to the leak of the proposal, centrist Renew Europe MEP Moritz Körner told Politico the Commission's proposal would mean "the privacy of digital correspondence would be dead."

The heated debate mirrors last year's controversy surrounding Apple's plan to search for CSAM (child sexual abuse material) on iPhones and iPads.

Apple in August 2021 announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for CSAM and Communication Safety to warn children and their parents when receiving or sending sexually explicit photos. The latter, and arguably less controversial, feature is already live on Apple's iMessage platform. Apple's method of scanning for CSAM has yet to be deployed.
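Apple's approach was built around matching photos against a database of digests of known CSAM supplied by child-safety organizations, rather than analyzing image content directly. The core lookup idea can be sketched as follows. This is a deliberately simplified illustration with a made-up `KNOWN_HASHES` set: it uses an exact cryptographic hash, whereas Apple's actual system used a perceptual hash ("NeuralHash") that tolerates resizing and re-encoding, along with a match threshold and cryptographic safeguards before any human review.

```python
import hashlib

# Hypothetical database of digests of known illegal images, as would be
# supplied by a child-safety organization (illustrative values only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_digest(data: bytes) -> str:
    """Exact SHA-256 digest of the raw bytes. A real deployment would use
    a perceptual hash so that trivially re-encoded copies still match."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """True if this image's digest appears in the known-material database."""
    return image_digest(data) in KNOWN_HASHES
```

The key privacy argument in Apple's design was that only membership in a fixed database is tested, so nothing is "learned" about non-matching photos; critics countered that the same matching pipeline could be repointed at any database a government supplies.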

Following Apple's announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to dispel misunderstandings and reassure users by releasing detailed documentation and sharing interviews with company executives. However, despite Apple's efforts, the controversy didn't go away, and Apple decided to delay the rollout of CSAM detection following the torrent of criticism.

Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

In December 2021, Apple quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads hung in the balance following significant criticism of its methods.

However, Apple says its plans for CSAM detection have not changed since September, which suggests CSAM detection in some form is still coming in the future.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

contacos
29 months ago
and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Score: 24 Votes (Like | Disagree)
rme
29 months ago

and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Europe is once again heading towards a very very dark place. Was always obvious that ever more centralisation of power and ever bigger empire was going to lead to misery.
Score: 17 Votes (Like | Disagree)
AdonisSMU
29 months ago

Eh- not a fan of this.

Inevitably, "think of the children" always wins.
Also not a fan. EU is doing too much As. Per usual. The EU doesnt know how to leave people alone. The its for a good cause is a terrible argument. Do the people of the EU get to vote for the representation in the EU?
Score: 16 Votes (Like | Disagree)
Mac Fly (film)
29 months ago

and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Careful with having an opinion around here. MR don’t love that; you may get a temporary ban for such. Live in the EU and hate the EU. Centralised power corrupts. If child porn was the issue we’d know Maxwell’s client list and the court case transcript would be made public. Alas, child porn only matters when the perps are not wealthy and powerful.
Score: 11 Votes (Like | Disagree)
VulchR
29 months ago
Many posting above assume that Apple's local CSAM-detecting spying software is a response to pressure from governments like the US and EU. Perhaps. However, it is likely that Apple's system has given encouragement to governments that want 24/7 surveillance on our private lives. I can just picture authoritarian legislators now: 'See? Apple has a system that guarantees privacy [sic], so we can move ahead with this requirement for surveillance'.

Criminal investigations should begin with detection of crime. Global surveillance should not be used for the prevention of crime. The cost to liberty is too high.
Score: 11 Votes (Like | Disagree)
Abazigal
29 months ago
I previously opined that Apple’s implementation was them trying to have their cake and eat it too - find a way to detect illegal material on one’s device without human intervention, thus preserving one’s privacy.

I continue to stand by this statement, and I believe that if Apple were ever to roll out said feature, it would be the least invasive means of scanning for CSAM compared to what the other companies are doing.

We also know now why Apple was exploring such a feature in the first place. Totally makes sense now.
Score: 10 Votes (Like | Disagree)