Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images

Apple's plans to scan users' iCloud Photos libraries against a database of child sexual abuse material (CSAM) to look for matches, and to scan children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF).

In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future.

Snowden also noted that Apple has historically been an industry leader in terms of digital privacy, and even refused to unlock an iPhone used by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."

The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security."

Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor...

It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change.

The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and that Apple's move to scan messages and ‌iCloud Photos‌ could be legally required to encompass additional materials or easily be widened. "Make no mistake: this is a decrease in privacy for all ‌iCloud Photos‌ users, not an improvement," the EFF cautioned. See the EFF's full article for more information.
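At a technical level, the "narrow backdoor" the EFF describes is a client-side matching step: the device computes a fingerprint for each image and checks it against a list of known-bad fingerprints shipped to the device. The Swift sketch below is purely illustrative and makes simplifying assumptions: it substitutes SHA-256 for Apple's NeuralHash perceptual hash, omits the private set intersection and threshold secret-sharing machinery Apple has described, and the function and database names are invented for the example.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for a perceptual hash: SHA-256 of the raw image bytes.
// (Apple's actual system uses NeuralHash; this is only so the sketch runs on its own.)
func imageFingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical on-device database of known-bad fingerprints, downloaded with the OS.
// Nothing in the matching code below depends on what these hashes actually represent.
let knownBadFingerprints: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" // placeholder
]

// The check that would run before an image is uploaded to iCloud Photos.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    knownBadFingerprints.contains(imageFingerprint(imageData))
}

// Example usage with arbitrary bytes standing in for an image.
let sample = Data("not really an image".utf8)
print(shouldFlagBeforeUpload(sample)) // false unless the fingerprint is in the list
```

Read this way, the EFF's argument becomes concrete: the matching loop never inspects what the fingerprints represent, so broadening what gets flagged is a matter of shipping a different hash list or changing which accounts the check runs on, not building a new system.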

The condemnations join a growing number of concerns raised by security researchers and users on social media since Apple announced the changes yesterday, and have prompted petitions urging Apple to roll back its plans and reaffirm its commitment to privacy.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Greenmeenie
11 weeks ago
Hey, nobody is for child porn… but I’m with Snowden on this. Very slippery slope here. And it goes against everything Apple has stood for concerning privacy.
Score: 135 Votes
mazz0
11 weeks ago
Gotta say, I agree with this. I think the slippery slope argument is valid here. In the US and Europe they might just use this for child porn (for now), but once the principle is established it becomes much harder for them to tell the government in China that they can't look for anti-CCP images, for example, and so on.
Score: 97 Votes
Marbles1
11 weeks ago
Awful step by Apple. And the 'alert your parents if you view a nude' is some awful overreach, but typical of Apple's strangely prudish approach.
Score: 91 Votes
Bonte
11 weeks ago
It's the first step in total censorship. If Apple can scan for porn, they surely can scan for other crimes and then it's all over.
Score: 78 Votes
IIGS User
11 weeks ago

What’s all over? Your crimes?
What you say is a crime today is someone else's fight for freedom.

No one, I say again, NO ONE is in favor of seeing children exploited, abused, or harmed. At least no one here, I would hope. But that is not the point.

The point is, this is indeed a slippery slope. Much akin to Apple saying they will unlock a phone for law enforcement via a "back door", which, at present, it is my understanding they won't do because one doesn't exist.

Once the mechanism exists, once the door is installed, or the code made part of the basic building blocks of how the machine operates, it's no longer a question of not being able to do it, but when it will be done. At that point, it's incumbent upon the gatekeepers to decide what is and isn't permitted, or acceptable, or legal.

These are decisions made by human beings. Just as humans are capable of horrible evil acts (like exploitation of children) for their own personal reasons, they can be capable of such evil on a political scale.

Today, child exploitation. Tomorrow, being LGBTQ or pro-democracy someplace where Apple does business. Apple has already proven they will bow to the whims of foreign governments who threaten to cut off their business (and revenue stream).

When countries like China are jailing dissidents for expressing pro democracy viewpoints (see footnote link), one can only question how long it is before this sort of invasiveness is unleashed for nefarious reasons.

This is scary stuff. Apple is wrong on this. One hundred percent wrong. People (good people, with liberal with a small "l" ideals) will suffer and die because of this. I have no doubt.

They say it could never happen here. Wherever "here" is. Well, it can and probably will happen wherever you are. This is one bigger step towards a high tech dystopia.

https://www.bbc.com/news/world-asia-china-58022072
Score: 71 Votes
DanTSX
11 weeks ago

What’s all over? Your crimes?
We’re all criminals now. Wake up.
Score: 46 Votes

Related Stories

Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Thursday August 5, 2021 1:04 pm PDT
Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC). Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers have already begun...

Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

Tuesday August 10, 2021 9:07 am PDT
Apple's Head of Privacy, Erik Neuenschwander, has responded to some of the concerns users have raised about the company's plans for new child safety features that will scan messages and Photos libraries, in an interview with TechCrunch. When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got ...

EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

Monday September 6, 2021 3:18 am PDT
The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout. Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative...

Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

Friday October 15, 2021 12:23 am PDT
More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times). The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called...

Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'

Thursday August 19, 2021 1:23 am PDT
An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters). "Though these capabilities are intended to protect...

Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning

Monday August 9, 2021 1:50 am PDT
Apple has published a FAQ titled "Expanded Protections for Children" which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features that the company announced last week. "Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of...

Facebook's Former Security Chief Discusses Controversy Around Apple's Planned Child Safety Features

Tuesday August 10, 2021 5:50 am PDT
Amid the ongoing controversy around Apple's plans to implement new child safety features that would involve scanning messages and users' photos libraries, Facebook's former security chief, Alex Stamos, has weighed into the debate with criticisms of multiple parties involved and suggestions for the future. In an extensive Twitter thread, Stamos said that there are "no easy answers" in the...

Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off

Thursday August 5, 2021 2:16 pm PDT
Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States. User devices will download an unreadable database of known CSAM image hashes and will do an on-device comparison to the user's own photos, flagging them for known CSAM material before they're uploaded to iCloud...

German Politician Asks Apple CEO Tim Cook to Abandon CSAM Scanning Plans

Wednesday August 18, 2021 6:11 am PDT
Member of the German parliament, Manuel Höferlin, who serves as the chairman of the Digital Agenda committee in Germany, has penned a letter to Apple CEO Tim Cook, pleading with Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) images later this year. In the two-page letter (via iFun), Höferlin said that he first applauds Apple's efforts to...

EFF Flew a Banner Over Apple Park During Last Apple Event to Protest CSAM Plans

Friday September 24, 2021 2:06 am PDT
In protest of the company's now delayed CSAM detection plans, the EFF, which has been vocal about Apple's child safety feature plans in the past, flew a banner over Apple Park during the iPhone 13 event earlier this month with a message for the Cupertino tech giant. During Apple's fully-digital "California streaming" event on September 14, which included no physical audience attendance in...