German Politician Asks Apple CEO Tim Cook to Abandon CSAM Scanning Plans

Manuel Höferlin, a member of the German parliament and chairman of its Digital Agenda committee, has penned a letter to Apple CEO Tim Cook, urging Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) later this year.

In the two-page letter (via iFun), Höferlin begins by applauding Apple's efforts to address the dangers posed by child sexual abuse and violence, but argues that the approach Apple has chosen to remedy the issue is not the right one, because it violates one of the "most important principles of the modern information society – secure and confidential communication."

The approach chosen by Apple, however – namely CSAM scanning on end devices – is a dangerous one. Regardless of how noble your motives may be, you are embarking on a path that is very risky, and not only for your own company: you would also be damaging one of the most important principles of the modern information society – secure and confidential communication. The price for this will most likely be paid not only by Apple, but by all of us.

Höferlin notably called Apple's CSAM approach "the biggest opening of the floodgates for communication confidentiality since the birth of the internet." The letter speaks out against Apple's plan to scan images in a user's iCloud Photos library for CSAM by checking the hashes of images against a database of known child sexual abuse material.
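To make the mechanism concrete: hash matching compares a fingerprint of each photo against a fixed list of fingerprints of already-known images, so only photos that match a database entry can ever be flagged. Below is a minimal, purely illustrative Swift sketch; the plain SHA-256 digest and the local knownCSAMHashes set are simplifying assumptions of this example, not Apple's implementation, which uses a perceptual "NeuralHash" matched against a blinded, encrypted database and exposes no such API.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known-image fingerprints (hex-encoded).
// In Apple's described system these are blinded perceptual hashes
// supplied by child-safety organizations, not plain SHA-256 digests.
let knownCSAMHashes: Set<String> = [
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9" // demo entry
]

// Hex-encode the SHA-256 digest of an image's raw bytes.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// A photo is flagged only if its fingerprint is already in the database;
// an image the database has never seen cannot produce a match.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownCSAMHashes.contains(hexDigest(of: imageData))
}

// Demo: the bytes of "hello world" hash to the entry above.
print(matchesKnownDatabase(Data("hello world".utf8)))    // true
print(matchesKnownDatabase(Data("something else".utf8))) // false
```

One caveat on the simplification: a cryptographic digest like SHA-256 matches only byte-identical files, while a perceptual hash is designed to survive resizing and recompression; in both cases, though, the matcher can only recognize images already in the database, never classify new ones.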

That feature is entirely different from another feature rolling out later this year, in which iOS will use on-device image analysis to detect potentially sexually explicit images in the Messages app and ask users under the age of 13 if they wish to see the photo. While Höferlin referenced some legitimate concerns over CSAM scanning, he argued that the feature destroys "some of the trust users place in not having their communications secretly monitored." Neither CSAM scanning nor the Child Safety Features in Messages, however, monitor any communication. A sketch of how the two mechanisms differ follows below.
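To underline that distinction, here is a deliberately skeletal Swift sketch of the Messages feature's shape. The ExplicitImageClassifier protocol, the explicitConfidence method, and the 0.9 threshold are hypothetical stand-ins (Apple has not published an API for this); the point is only that a classifier scores brand-new images on device and drives a local prompt, rather than matching against a database or reporting anything off the device.

```swift
import Foundation

// Hypothetical stand-in for an on-device explicit-content classifier;
// Apple has not published an API for this feature.
protocol ExplicitImageClassifier {
    /// Returns a confidence in 0...1 that the image is sexually explicit.
    func explicitConfidence(for imageData: Data) -> Double
}

// Unlike hash matching, a classifier can judge images it has never seen.
// The result stays on the device: it only decides whether to show a
// local "Do you want to see this photo?" prompt on a child's account.
func shouldWarnChild(_ imageData: Data,
                     using classifier: ExplicitImageClassifier,
                     threshold: Double = 0.9) -> Bool {
    classifier.explicitConfidence(for: imageData) >= threshold
}
```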

Apple's senior vice president of software engineering, Craig Federighi, admitted in a recent interview that the conjoined announcement of CSAM detection and improved safety for children within the Messages app has caused confusion. Nonetheless, Höferlin continued in his letter by stating that while he wishes he could believe Apple's reassurance that it will not allow government interference in CSAM detection, he is unable to take the company at its word.

As much as I want to believe your assurances that you will reject all requests for further application of this function, such as the locating of regime critics or the surveillance of minorities, these lack credibility. In every country on Earth – even in my home country, despite our historical experiences – there are political forces for whom confidential communication and encryption are a thorn in the side, and who are engaged in ongoing efforts to replace freedom with surveillance. For people who, unlike us, are not lucky enough to live in Western democracies, this can in the worst case mean a genuine threat to their lives.

Höferlin concluded his letter by pleading with Cook for Apple to abandon its CSAM scanning plans and asking that the company stay on the side of a free and private internet.

That is why my urgent appeal to you is that you abandon your plans for CSAM scanning. This would not only save your own company from many foreseeable problems, but would also protect the Achilles' heel of the modern information society! Please stay on the side of those who defend civilization’s achievement of a free internet!

Since their announcement earlier this month, Apple's plans have drawn criticism, and in response the company has continued its attempts to address concerns by publishing additional documents and an FAQ page. CSAM scanning and the Child Safety Features within the Messages app are still on track to be released later this year.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

antiprotest
4 weeks ago
The only way Apple will abandon this is if China tells them to, and we already know that this is not a feature that China will oppose.
Score: 67 Votes
mzeb
4 weeks ago

German politician has no idea how this CSAM detection works and prefers a less private way of child porn scanning.
Despite the facts that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn't right. This is still digital surveillance of the private population. Worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term "overreach" is often applied to government, and it applies here too. Apple is neither responsible nor accountable for CSAM detection in law enforcement, and no country's citizens have passed a law to give it this mandate. However secure, private, and well intentioned this system may be, it breaches the privacy of the people without the people's permission.
Score: 55 Votes
tonywalker23
4 weeks ago
I don't like that people view child porn. And as a conservative Christian pastor who works full time at a church, I don't want anyone viewing porn. Furthermore, I intentionally don't watch material that has risqué scenes or language that offends me.

However, the same technology that Apple wants us all to accept this fall could one day be the technology that tells a government that I am a conservative Christian pastor. The right thing in this situation is therefore not to catch the people who will simply avoid the feature – it is to not implement a feature that is largely useless against the people it is aimed at... because the day may come when others get caught in a web that was not originally intended for them.
Score: 47 Votes
Expos of 1969
4 weeks ago
You tell him, Herr Höferlin!!
Score: 37 Votes
Khedron
4 weeks ago

Might as well ask Cook to also remove all the cameras in the phone, so that people can avoid being filmed during altercations (altercations that THEY themselves usually cause) and threatened with being put on YouTube or Facebook – which happens almost regardless, with the video subsequently shared many, many times. So much respect for people's privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the source is somebody's phone, and it violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos or third-party apps, just in case a conversation is secretly recorded without other people knowing.
Those are not equivalent to this technology. Apple's new technology is a process running on the user's phone that monitors for illegal activity and reports matches to an authority. So this is like having your camera automatically scan for antisocial behaviour and report your GPS location to the local police.
Score: 35 Votes
Villarrealadrian
4 weeks ago
Now it’s getting serious
Score: 30 Votes

Top Stories


Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Thursday August 5, 2021 1:04 pm PDT
Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC). Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers had already begun...

Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images

Friday August 6, 2021 5:00 am PDT
Apple's plans to scan users' iCloud Photos libraries against a database of child sexual abuse material (CSAM) to look for matches, and children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF). In a series of tweets, the prominent privacy campaigner and whistleblower highlighted concerns...

Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Monday August 9, 2021 1:50 am PDT
Apple has published an FAQ titled "Expanded Protections for Children" which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features that the company announced last week. "Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of...

Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'

Thursday August 19, 2021 1:23 am PDT
An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters). "Though these capabilities are intended to protect...

EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

Monday September 6, 2021 3:18 am PDT
The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout. Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative...

Facebook's Former Security Chief Discusses Controversy Around Apple's Planned Child Safety Features

Tuesday August 10, 2021 5:50 am PDT
Amid the ongoing controversy around Apple's plans to implement new child safety features that would involve scanning messages and users' photo libraries, Facebook's former security chief, Alex Stamos, has weighed in on the debate with criticisms of multiple parties involved and suggestions for the future. In an extensive Twitter thread, Stamos said that there are "no easy answers" in the...

Apple Remains Committed to Launching New Child Safety Features Later This Year

Tuesday August 10, 2021 10:58 am PDT
Last week, Apple previewed new child safety features that it said will be coming to the iPhone, iPad, and Mac with software updates later this year. The company said the features will be available in the U.S. only at launch. A refresher on Apple's new child safety features from our previous coverage: First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac...

Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Friday August 13, 2021 6:33 am PDT
Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM). Federighi admitted that Apple...

Apple Delays Rollout of Controversial Child Safety Features to Make Improvements

Friday September 3, 2021 6:07 am PDT
Apple today announced that it has delayed the rollout of the Child Safety Features it previewed last month, following negative feedback. The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance...

Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Thursday August 5, 2021 12:00 pm PDT
Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time. First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature to warn children...