This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3
— Matthew Green (@matthew_d_green) August 5, 2021
More fodder for the @techmeme outraged: Apple has been scanning the hash of iCloud photos against known child abuse imagery hashes since late 2019. https://t.co/mypu95CLu4
— Charles Arthur (@charlesarthur) August 5, 2021
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them. https://t.co/tylofPfV13
— Matthew Green (@matthew_d_green) August 5, 2021
Porn is always the lead excuse to deploy privacy-crippling technologies. CSAM today, classified material next, and eventually, whatever any power deems "objectionable" tomorrow.
— Տean Mꇃꓰlroy (@SeanMcElroy) August 5, 2021
This playbook is so tired. Not fooled, @Apple https://t.co/IazmnCCXrB
This is deeply concerning. When I saw @matthew_d_green’s tweets about this, I thought, okay, already needed to get off iCloud, but what if this type of image scanning based surveillance is eventually integrated into all cloud services and operating systems at a very low level? https://t.co/8nVO0DCA43
— Joseph Kohlmann (he/him) is forming a union! (@jkohlmann) August 5, 2021
.@tim_cook @Apple Broad, 1984-style surveillance of the masses is wrong, no matter how good your intentions may be. It is a vile betrayal of Apple's very soul. https://t.co/zAAGptDPZZ
— Bryan Jones (@bdkjones) August 5, 2021
The theory is that you will trust Apple to only include really bad images. Say, images curated by the National Center for Missing and Exploited Children (NCMEC). You’d better trust them, because trust is all you have.
— Matthew Green (@matthew_d_green) August 5, 2021
Apple is reportedly planning to autoscan your iPhone for child abuse images. This is insanely disturbing if true
— Benj Edwards (@benjedwards) August 5, 2021
It opens the door for false positives and deep surveillance that can be abused in 1000 different ways https://t.co/lQeXeXyZGm
Some thoughts about the potential Fourth Amendment issues associated with the news that Apple intends to scan iPhones for child sex abuse material. https://t.co/FuiWaanJQQ
— Jeff Kosseff (@jkosseff) August 5, 2021
Once again, this ultimately comes back to a simple question: Do our devices work for us, or for the manufacturers? Do we have a right to expect that they're designed to work in our best interests, not someone else's? https://t.co/bhs8wmp1gj
— nick.eth (@nicksdjohnson) August 5, 2021
I'm not against this by any means but part of scanning your phone for stuff on it rubs me the wrong way https://t.co/7dxUbBzehu
— Max Weinbach (@MaxWinebach) August 5, 2021
This means that, depending on how they work, it might be possible for someone to make problematic images that “match” entirely harmless images. Like political images shared by persecuted groups. These harmless images would be reported to the provider.
— Matthew Green (@matthew_d_green) August 5, 2021
A number of people pointed out that these scanning technologies are effectively (somewhat limited) mass surveillance tools. Not dissimilar to the tools that repressive regimes have deployed — just turned to different purposes.
— Matthew Green (@matthew_d_green) August 5, 2021
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
— Matthew Green (@matthew_d_green) August 5, 2021
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
SESTA AND FOSTA plus madness https://t.co/IK0d6aLMwJ
— Sydette Cosmic Dreaded Gorgon (@Blackamazon) August 5, 2021
hardcore cryptography folks generally believe highest levels of encryption (like WhatsApp's) = best/safest, with least room for the US Govt to come in and snoop around on messages from the public.
— rat king (@MikeIsaac) August 5, 2021
but that comes with tradeoffs — namely, trafficking in illicit/illegal materials
this is the wolf of the total surveillance state wearing the skin of a sympathetic cause https://t.co/8zWavBQqDn
— badidea ? (@0xabad1dea) August 5, 2021
This is bad. For all the reasons in the thread, AND
— all out of cookies and spoons (@DiaKayyali) August 5, 2021
This tool won't be contained to child sexual abuse material. Tools developed to fight it are often inappropriately redeployed to try to detect "terrorist content"- regardless of false positives. https://t.co/tEXPvlVDeO (1/2) https://t.co/CC6f5GtmUI
Apple is managing an embargo of reporters right now. It’s quite a remarkable thing watching them do this; the goal is to make sure everyone reports this when Apple wants and on Apple’s terms.
— Matthew Green (@matthew_d_green) August 5, 2021
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
— Matthew Green (@matthew_d_green) August 5, 2021
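The tweet above describes a threshold flow: hash each photo, compare it against a database of known fingerprints, and report only once enough matches accumulate. Here is a minimal sketch of that kind of logic, using a toy average hash and an illustrative threshold; the hash function, the threshold value, and the reporting rule are assumptions, not Apple's implementation.

```python
# Toy sketch of threshold-based match reporting. The average hash, the
# threshold, and the data layout are illustrative assumptions only.
from typing import Iterable, List, Set

REPORT_THRESHOLD = 3  # illustrative; the real threshold is not public


def toy_average_hash(pixels: List[int]) -> int:
    """One bit per pixel: set if the pixel is brighter than the image mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)


def count_matches(photos: Iterable[List[int]], known_bad: Set[int]) -> int:
    """How many photos hash to a value in the known-bad database."""
    return sum(1 for photo in photos if toy_average_hash(photo) in known_bad)


def should_report(photos: Iterable[List[int]], known_bad: Set[int]) -> bool:
    """Flag the library only once enough photos match."""
    return count_matches(photos, known_bad) >= REPORT_THRESHOLD
```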
A neuralnet trained on the NCMEC photo dataset, used to selectively decrypt your iCloud images. A doubly-accursed model. https://t.co/lqzKkTOggw pic.twitter.com/MbdFSZhZVa
— Deirdre Connolly¹ is in the Hot Zone (@durumcrustulum) August 5, 2021
FOSTA and SESTA were just the beginning. Now sex trafficking is the excuse to upload all your photos to Apple so they can review them for child abuse https://t.co/ZI1DBV50yB
— Mass. Pirate Party (@masspirates) August 5, 2021
Initially Apple is not going to deploy this system on your encrypted images. They’re going to use it on your phone’s photo library, and only if you have iCloud Backup turned on.
— Matthew Green (@matthew_d_green) August 5, 2021
So in that sense, “phew”: it will only scan data that Apple’s servers already have. No problem right?
Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.
— Matthew Green (@matthew_d_green) August 5, 2021
That’s the message they’re sending to governments, competing services, China, you.
Because surely it will ensure this, right? You’d want to ensure that Apple (or someone who hacks Apple’s servers) can’t change the database selectively to target it to you — and have a normal CSAM database for everyone else.
— Matthew Green (@matthew_d_green) August 5, 2021
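One way to make the accountability question concrete: if every client can check a single published digest of the database, the operator cannot silently serve a different database to one targeted user. The sketch below is purely an assumption about what such a consistency check could look like; Apple has not described a mechanism like this.

```python
# Sketch of a database consistency check: clients compare a digest of the
# database they downloaded against one publicly published value.
# Illustrative only; this is not a description of Apple's design.
import hashlib
from typing import List


def database_digest(entries: List[bytes]) -> str:
    """Deterministic SHA-256 digest over the sorted database entries."""
    h = hashlib.sha256()
    for entry in sorted(entries):
        h.update(entry)
    return h.hexdigest()


def verify_database(entries: List[bytes], published_digest: str) -> bool:
    """True only if this client received the same database everyone else can verify."""
    return database_digest(entries) == published_digest
```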
That's one hell of a buried lede, using AI as the smokescreen. In the last sentence of https://t.co/RousU4z7rB: "Apple will enable all the suspect photos to be decrypted and […] passed on to the relevant authorities." Apple can and will remotely decrypt your device on demand.
— @pndc (@pndc) August 5, 2021
Reading through the analysis. This is not… a security review. pic.twitter.com/UKGPKGK07x
— Matthew Green (@matthew_d_green) August 5, 2021
While I am clearly against any sort of protection for people with illegal photos, this really feels like a slippery slope of injecting tech into our personal networks. Once Apple does it, governments will expect it from all companies. https://t.co/83UmmN6z2X
— Shawn Wildermuth ☕ (@ShawnWildermuth) August 5, 2021
don't fall for it. this doesn't stop here. https://t.co/8H0s26gk7K
— ktb (@kevinbaker) August 5, 2021
For the past decade, providers like Apple, WhatsApp/Facebook, Snapchat, and others have been adding end-to-end encryption to their text messaging and video services. This has been a huge boon for privacy. But governments have been opposed to it.
— Matthew Green (@matthew_d_green) August 5, 2021
But ask yourself: why would Apple spend so much time and effort designing a system that is *specifically* designed to scan images that exist (in plaintext) only on your phone — if they didn’t eventually plan to use it for data that you don’t share in plaintext with Apple?
— Matthew Green (@matthew_d_green) August 5, 2021
Hopefully the next review is by an expert in adversarial ML who will explain how they’ve solved some of the hardest open problems in Computer Science.
— Matthew Green (@matthew_d_green) August 5, 2021
If you have auto-save of received images on Whatsapp etc. enabled (the default setting), this means anyone can send you illegal content, and the police may pick you up shortly after https://t.co/Qhz9T0euWE
— (@levelsio) August 5, 2021
As a parent, this might make a lot of sense. On the other hand, Apple’s about to drive a whole generation of teenagers & peer groups away from iMessage, to never come back. This feels like a radioactive feature, opt-in or not https://t.co/g0gPXbG0mU pic.twitter.com/BvHcizbQlk
— Steve Troughton-Smith (@stroughtonsmith) August 5, 2021
remember when apple was like “we’ll never let the FBI in your backdoor”?
— nikki, cyborg cowboy (@beeepbeepboop) August 5, 2021
also, choosing a really serious issue like child abuse as a backdoor is intentional; now, if you have any (valid) security concerns, it’s going to turn into “if you don’t abuse children don’t worry”. EVIL. https://t.co/CApjYDjlJp
(There are some other fancier approaches that split the database so your phone doesn’t even see it — the databases are actually trade secrets in some cases. This doesn’t change the functionality but makes the system even harder to check up on. Apple may use something like this.)
— Matthew Green (@matthew_d_green) August 5, 2021
Call me cautiously in favor, rather than aghast, by Apple's idea of doing CSAM detection on people's phones: https://t.co/uWmlowoFTe
— Nicholas Weaver (@ncweaver) August 5, 2021
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
— Matthew Green (@matthew_d_green) August 5, 2021
No, it’s by Mihir Bellare reviewing the PSI protocol. So no review at all of the important bits.
— Matthew Green (@matthew_d_green) August 5, 2021
Well, let’s see how the PSI protocol ensures accountability, ie that Apple can’t change the database to selectively spy on specific users.
I wasn’t planning to tweet about this paper as it is under review but it’s important to get the word out there now: we evaluated 5 perceptual hashing algorithms and found all of them to be vulnerable to a simple black-box adversarial attack https://t.co/TlyQSvG9WT A thread ⤵️ https://t.co/vRtpA1azh2
— Yves-A. de Montjoye (@yvesalexandre) August 5, 2021
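To make the class of attack in that thread concrete: against a toy average hash, a black-box attacker can randomly nudge pixels and keep only the perturbations that push the hash away from its original value, until the image no longer matches a database entry. The sketch below is illustrative only; the toy hash and the random-search strategy are assumptions, not the paper's method.

```python
# Toy black-box evasion: perturb pixels until the perceptual hash drifts far
# enough from the original that it would no longer match a database entry.
# The toy average hash and random search are illustrative assumptions only.
import random
from typing import List


def toy_average_hash(pixels: List[int]) -> int:
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)


def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


def evade(pixels: List[int], min_distance: int = 8, max_steps: int = 10_000) -> List[int]:
    """Greedy random search: keep only pixel changes that increase hash distance."""
    target = toy_average_hash(pixels)
    image = list(pixels)
    best = 0
    for _ in range(max_steps):
        if best >= min_distance:
            break
        i = random.randrange(len(image))
        old = image[i]
        image[i] = min(255, max(0, old + random.choice((-10, 10))))
        dist = hamming(toy_average_hash(image), target)
        if dist > best:
            best = dist      # keep the perturbation
        else:
            image[i] = old   # revert: no progress toward evasion
    return image
```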
These are bad things. I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends.
— Matthew Green (@matthew_d_green) August 5, 2021
Regarding the Apple and CSAM scanning news, the biggest difference is the extension of this scanning to client photos that haven't been shared anywhere, AND the use of a neural network to detect unknown CSAM.
— Juliet Shen (@Juliet_Shen) August 5, 2021
They've been scanning iCloud for a while https://t.co/3qVykFEuSB
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments. https://t.co/mKdAlaDSts
— Matthew Green (@matthew_d_green) August 5, 2021
Will they deploy this for iPhones in China? What if the govt asks (or passes a law) to know who has politically sensitive photos on their phone? https://t.co/mRqjjIAqK6
— Tom Simonite (@tsimonite) August 5, 2021
That iPhone is Apple's phone, not yours. And this completely betrays the company's pious privacy assurances.
— Dan Gillmor (@dangillmor) August 5, 2021
This is just the beginning of what governments everywhere will demand. All of your data will be fair game. If you think otherwise, you're terminally naive. https://t.co/S1nlAhWCxn
Ending child abuse should be a government priority. Engaging in searches without warrants is not the way to get there. This is how it starts, then it goes to gun images, protest images, protest discussions, intimate discussions, etc. https://t.co/RUeCwHJMgj
— John Hamasaki (@HamasakiLaw) August 5, 2021
“Apple already checks iCloud files against known child abuse imagery, like every other major cloud provider. But the system described here would go further, allowing central access to local storage.”https://t.co/b5sR6aJP1M
— Arielle Duhaime-Ross (@adrs) August 5, 2021
Dumbed-down explanation: Apple's iPhones will soon start secretly calling the police if they find photos on your phone that match fingerprints of photos depicting child abuse and any content eventually deemed objectionable.
— Nadim Kobeissi (@kaepora) August 5, 2021
This can/will be generalized to secure messaging https://t.co/WDfnD6ChS1
Twitter started using a system like this, based on work Microsoft had done, as far back as 2013. This is not a new idea in the slightest, and there’s absolutely no sign of it being applied to E2E. https://t.co/8dKrm2mhj9
— Charles Arthur (@charlesarthur) August 5, 2021
I mean it’s become pretty clear that when Apple says privacy, they don’t mean privacy from Apple. https://t.co/47eDq4pqeD
— st. (@seyitaylor) August 5, 2021
Spent the day trying to figure out if the Apple news is more benign than I thought it was, and nope. It’s definitely worse.
— Matthew Green (@matthew_d_green) August 5, 2021
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
A small update from last night. I described Apple’s matching procedure as a perceptual hash function. Actually it’s a “neural matching function”. I don’t know if that means it will also find *new* content on your device or just known content. https://t.co/OkCgSrApXk
— Matthew Green (@matthew_d_green) August 5, 2021
There are a lot of problems with this idea. Yes — client side scanning (and encrypted images on server) is better than plaintext images and server-side scanning. But to some extent, the functionality is the same. And subject to abuse…
— Matthew Green (@matthew_d_green) August 5, 2021
The way this will work is that your phone will download a database of “fingerprints” for all of the bad images (child porn, terrorist recruitment videos etc.) It will check each image on your phone for matches. The fingerprints are ‘imprecise’ so they can catch close matches.
— Matthew Green (@matthew_d_green) August 5, 2021
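The "imprecise" matching Green mentions is commonly done by comparing perceptual fingerprints with a Hamming-distance cutoff, so near-duplicates (resized, recompressed, lightly edited copies) still land close to a database entry. A minimal sketch of that fuzzy comparison follows, with an assumed 64-bit fingerprint and an illustrative cutoff rather than any known Apple parameter.

```python
# Sketch of fuzzy fingerprint matching: hashes are compared by Hamming
# distance rather than exact equality, so close variants of an image still
# match. The 64-bit fingerprints and the cutoff are illustrative assumptions.
from typing import Iterable


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def matches_database(fingerprint: int, database: Iterable[int], max_distance: int = 6) -> bool:
    """True if the fingerprint is within the cutoff of any downloaded entry."""
    return any(hamming_distance(fingerprint, known) <= max_distance for known in database)
```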
for sure, but I think I’m OK with this tradeoff? FB reported 15.8M cases of CSAM to NCMEC in 2019; Apple reported 205. Seems likely that there is massive under-reporting of very bad stuff on iOS
— Casey Newton (@CaseyNewton) August 5, 2021
Hashes aren't perfect by any means, but AFAIK law enforcement can't charge someone on the basis of a matched hash - they have to find CSAM on a suspect's device. The hash match provides probable cause. https://t.co/e0svk6Cbk1
— Olivia Solon (@oliviasolon) August 5, 2021
This is important. https://t.co/jRCCEgR6fB
— Christina Warren (@film_girl) August 5, 2021
Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n
— EFF (@EFF) August 5, 2021
Louder, for the people in the back: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. https://t.co/vRHRTxH0I8
— Eva (@evacide) August 5, 2021
Opening the door to client-side scanning of private messages will inevitably lead to requests for more expansive censorship and surveillance, especially from authoritarian regimes. Apple likes to Think Different, but, please, not like this. https://t.co/vljbPIMKCr
— Kurt Opsahl (@kurtopsahl) August 5, 2021
You can call it a dog door and insist it's only for dogs, but at the end of the day, IT'S STILL A FUCKING HOLE IN YOUR WALL. That's how my grandparents ended up with a possum in their kitchen. https://t.co/rF8ddpEArX
— Katharine Trendacosta (@k_trendacosta) August 5, 2021
I present to you: Apple's terrible fucking idea https://t.co/rF8ddpEArX
— Katharine Trendacosta (@k_trendacosta) August 5, 2021
iCloud and iMessage will be scanning your photos under the guise of "protecting children", ending their claim to end to end encryption. A tried and true strategy: incite a moral panic and leverage outrage to erode user privacy. https://t.co/DsgewLtxV0
— koush (@koush) August 6, 2021
Apple says to "protect children," they're updating every iPhone to continuously compare your photos and cloud storage against a secret blacklist. If it finds a hit, they call the cops.
— Edward Snowden (@Snowden) August 6, 2021
iOS will also tell your parents if you view a nude in iMessage. https://t.co/VZCTsrVnnc
The last thing I want is to have machine learning algorithms scan my phone's data without my permission.
— hardmaru (@hardmaru) August 6, 2021
At some point there will be more demand for "simple" smart phones—devices with no ML, no walled gardens, no surveillance / tracking / user monitoring. https://t.co/STQA5rd6MW
Backdoors are high-key bad. https://t.co/HrgJrqCBgm
— Ricky Mondello (@rmondello) August 6, 2021
EFF is indignant. "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor. To say that we are disappointed by Apple’s plans is an understatement." https://t.co/wxmJwy3ziz
— Joseph Menn (@josephmenn) August 5, 2021
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts.” https://t.co/Ou87E69IXG
— Paul Mozur 孟建國 (@paulmozur) August 6, 2021