To the general public there is a difference between “keep iCloud backup on because devices break and I don’t want to lose my private photos” and “I want them scanned by Apple.”
— Matthew Green (@matthew_d_green) August 7, 2021
Apple assumed that due to past technical accidents the two were identical to the public. They aren’t.
It's clear that Apple didn't consult any civil society orgs. No civil liberties or human rights input. Privacy, freedom of expression, LGBTQI+ issues, orgs for homeless queer youth, none of it. If they had, they'd be touting that (even if they ignored everything the orgs said).
— Riana Pfefferkorn (@Riana_Crypto) August 9, 2021
Apple’s “Probably not Private for long” iCloud Relay doesn’t sound quite as nice.
— Paul Haddad (@tapbot_paul) August 7, 2021
That's the most explicit Apple has been on the record that if you turn off uploading to iCloud Photos, CSAM detection doesn't happen. Apple wants to be clear it isn't "scanning" your photo library on your device.
— Jason Aten (@JasonAten) August 9, 2021
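A deliberately hypothetical sketch of the gating Aten describes, in Python: the on-device hash check only runs for photos queued for upload to iCloud Photos. Every name below is invented for illustration; this is not Apple’s implementation, and the hash is only a stand-in for NeuralHash.

```python
# Hypothetical sketch only: illustrates "no iCloud Photos upload, no CSAM check."
# Names and structure are invented; this is not Apple's code.
KNOWN_CSAM_HASHES: set[int] = set()   # stand-in for the NCMEC-derived hash list

def perceptual_hash(photo_bytes: bytes) -> int:
    # Stand-in for NeuralHash; any deterministic digest works for the sketch.
    return hash(photo_bytes)

def prepare_for_icloud(photo_bytes: bytes, icloud_photos_enabled: bool):
    if not icloud_photos_enabled:
        return None                    # nothing is hashed, nothing is flagged
    matched = perceptual_hash(photo_bytes) in KNOWN_CSAM_HASHES
    return {"photo": photo_bytes, "matched": matched}  # match info accompanies the upload
```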
Apple's Mistake
— Stratechery (@stratechery) August 9, 2021
While it's possible to understand Apple's motivations behind its decision to enable on-device scanning, the company had a better way to satisfy its societal obligations while preserving user privacy. https://t.co/k9ix4ndDyh
This sounds a lot like “trust us, we could do it but we promise we won’t”. https://t.co/S3JS8OTJSh
— Alan Woodward (@ProfWoodward) August 9, 2021
Asking people to disable iCloud Photos in 2021 is not realistic, and Apple knows this. Everyone depends strongly on iCloud Photos not just for sync, but as a critical backup feature for what is often years and years of important photos.
— Nadim Kobeissi (@kaepora) August 9, 2021
The only photos that are scanned against the hashed database are those in iCloud Photos, which are destined for iCloud. So what exactly is he saying? https://t.co/LVggezE7xR
— John Wilson (@JohnWilson) August 9, 2021
This post from the developer of FotoForensics is the best I’ve read on the valid reasons for, yet problematic technical & legal issues with, Apple’s CSAM detection.
— Dare Obasanjo (@Carnage4Life) August 9, 2021
Apple has chosen to die on the hill of not doing server side scanning and has chosen an invasive alternative https://t.co/k9V8iLC2Fc pic.twitter.com/vlatbg0bXe
The author appears to suggest that privacy-focused Apple should use the solutions employed by [ checks notes ] Facebook and Google. https://t.co/NmE3jFDzZx
— Jay Cuthrell (@JayCuthrell) August 9, 2021
One can assume there are tens to hundreds of millions of CSAM images in iCloud based on extrapolation of how much storage they've announced is used by the service (8M terabytes) and reporting rates from other services (e.g. FotoForensics says 0.056%). So they must do something.
— Dare Obasanjo (@Carnage4Life) August 9, 2021
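Obasanjo’s extrapolation is easy to reproduce as a back-of-envelope calculation. The only figures taken from the tweet are the ~8M terabytes of iCloud storage and FotoForensics’ 0.056% rate; the photo share and average photo size below are assumptions, not reported numbers.

```python
# Back-of-envelope sketch of the extrapolation above; all values marked
# "assumed" are guesses for illustration, not figures Apple has published.
icloud_storage_tb = 8_000_000        # "8M terabytes" (from the tweet)
reporting_rate = 0.00056             # 0.056% (FotoForensics, from the tweet)
share_that_is_photos = 0.10          # assumed fraction of iCloud storage that is photos
avg_photo_mb = 3.0                   # assumed average photo size in MB

photos = icloud_storage_tb * 1_000_000 * share_that_is_photos / avg_photo_mb
flagged = photos * reporting_rate
print(f"~{photos:,.0f} photos -> ~{flagged:,.0f} potential CSAM images")
# With these assumptions: roughly 267 billion photos and ~150 million images,
# i.e. the "tens to hundreds of millions" range the tweet describes.
```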
Apple has also historically made a privacy promise that they can't (actually they won't since they can decrypt your data) look at your content in iCloud. They're between a rock and a hard place. So they've chosen to keep their privacy promise by scanning content on users phones.
— Dare Obasanjo (@Carnage4Life) August 9, 2021
Apple's move to scan the photos *on your phone, with code running on your phone* has ignited a storm of debate. This differs from how FB does things.
— Antonio García Martínez (@antoniogm) August 9, 2021
As PR readers know, this is a general move toward on-device everything, and Apple's plan is in line with that. pic.twitter.com/hfiXDVgpMZ
If Apple had announced that they were scanning text messages sent through their systems, or photo libraries shared with outside users — well, I wouldn’t have been happy with that. But I think the public would have accepted it.
— Matthew Green (@matthew_d_green) August 7, 2021
But they didn’t do that. They announced that they’re going to do real-time scanning of individuals’ *private photo libraries* on their own phones.
— Matthew Green (@matthew_d_green) August 7, 2021
That’s… something different. And new. And uncomfortable.
One thing that really bothers me about the Apple CSAM solution is that it is really easily bypassed. So it will only work for the dumb criminals, but will potentially compromise everyone else. In that sense, it’s like the encryption backdoor situation.
— Yehuda Lindell (@LindellYehuda) August 8, 2021
My suspicion is that Apple are fibbing about their capability or desire to resist the @ukhomeoffice rocking up to them with a #TechnicalCapabilityNotice under the Regulation of Investigatory Powers Act, and thereby being forced to snoop on iPhones. https://t.co/acVxU9FUwz
— Alec Muffett (@AlecMuffett) August 9, 2021
Nothing in this FAQ (PDF: https://t.co/DmElwzVtFl) that Apple has released addresses any of our concerns. It basically boils down to “trust us, don’t worry, it’ll be fine.”
— Aral Balkan (@aral) August 9, 2021
To reiterate what we’re asking: Apple must halt deployment of its content monitoring technology (1/2)
I think these two paragraphs get to the heart of what is so disturbing about Apple’s photo scanning initiative. https://t.co/bCe9Sg8Qmn pic.twitter.com/9jFqhyc3U1
— Matthew Green (@matthew_d_green) August 9, 2021
The question from a privacy perspective is whether Apple keeping the "we won't look at your content on the server" promise by creating a precedent where "we look at content on your phone instead" is better or worse for customer privacy?
— Dare Obasanjo (@Carnage4Life) August 9, 2021
This will be true if and only if iCloud Photos are fully encrypted, and right now, they are not. Apple has policies and systems in place to secure your iCloud images but it does still have the ability to see them if it wants to, or is ordered to pic.twitter.com/qgILugZduV
— John Bergmayer (@bergmayer) August 9, 2021
On why Apple is doing this:
— Jason Aten (@JasonAten) August 9, 2021
"In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."
As I said last week, IF you just look at the system in isolation, assume it’s implemented exactly as intended & frozen in stone, maybe it’s fine. But I don’t think that’s a terribly smart way to think about it.
— Julian Sanchez (@normative) August 9, 2021
In terms of US law, 4th Amendment law about how police can't pay the landlord to let them into a tenant's apartment should apply here to limit the parade of horribles.
— Cathy Gellis (@CathyGellis) August 9, 2021
But (a) it might not, and (b) it's of no use elsewhere in the world where there's no such limit on police. https://t.co/q7OdWzEPNI
This is still not Apple explicitly saying “you can completely turn off local hashing of your photos.” The “system” and “this feature” are intentionally vague descriptors. Don’t connect the dots for them! Make them say it on the record. https://t.co/r6HrcKZQaK
— nilay patel (@reckless) August 9, 2021
"Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?"
— Nadim Kobeissi (@kaepora) August 9, 2021
Again, we get: "No, but actually yes."
The Electronic Frontier Foundation *has already documented* instances where CSAM lists were expanded to target non-CSAM content. pic.twitter.com/ipJH5CzY6a
A frustrating thing about most discussion of the Apple CSAM scanning scheme is the conflation of three different questions:
— matt blaze (@mattblaze) August 8, 2021
- Will it have a significant impact on CSAM?
- What new risks does it expose users to if it works correctly?
- What are the new risks if something fails?
The system scans the photos on the phone. From a technical perspective it does not need to upload the photos to the cloud. Apple has *deployed* it this way, since they assume “turning on backup in case my device breaks” is equivalent to “consent for scanning.” My point: it’s not.
— Matthew Green (@matthew_d_green) August 7, 2021
Apple says it "will not accede to any government’s request to expand" use of its device scanning tech. Easy for them to *say* that, and Apple has resisted such demands in the past, but it's understandable why many are wary of the genie leaving the bottle https://t.co/uR1ozrZLqP
— Martin SFP Bryant (@MartinSFP) August 9, 2021
"Could governments force Apple to add non-CSAM images to the hash list?"
— Nadim Kobeissi (@kaepora) August 9, 2021
"Apple will refuse any such demands."
— except, they won't. Apple *has already dropped plans for encrypting iCloud backups specifically because the FBI complained*: https://t.co/qau08qQZBX pic.twitter.com/Nqi9nlBXQm
Apple commit to challenging requests to expand their CSAM detection to other material. So did UK ISPs, but they lost in court and did it anyway. Will Apple leave a market if put in the same position? https://t.co/KBOysTLT0F h/t @AlexMartin pic.twitter.com/h8e0dkY5SM
— Steven Murdoch (@sjmurdoch) August 9, 2021
If 5% of US Apple users disabled iCloud Photos in the next week, Apple would probably reconsider their plans to enable client side scanning. I wonder what that would take.
— Matthew Green (@matthew_d_green) August 8, 2021
It's important, as we're discussing Apple's CSAM tech, to note that saying it's simple to disable iCloud backup skips over everything we know about the way people use software. This doesn't win or lose the argument any which way but it addresses one of Apple's points.
— Patrick Howell O'Neill (@HowellONeill) August 9, 2021
The chatter around Apple's recent announcement of new child safety protections is rather frustrating, as the amplification/knowledge ratio seems rather high on this one.
— Alex Stamos (@alexstamos) August 7, 2021
Apple's docs: https://t.co/TIcVQ6Zb1J
A discussion with friends: https://t.co/c4IYPVMHUA
This, from @benthompson, is the crux of the matter. https://t.co/vzCr1iXrIp pic.twitter.com/qlw6QQXelE
— nilay patel (@reckless) August 9, 2021
It's hard to overstate the power of defaults. History shows that vanishingly few users mess with most of them: "For most users, the default value is the only value." https://t.co/6swhbLklRc
— Patrick Howell O'Neill (@HowellONeill) August 9, 2021
Literally the graf I copied/pasted to my colleagues covering this https://t.co/F3L4E1RfBX
— Steve Kovach (@stevekovach) August 9, 2021
Apple CSAM FAQ addresses misconceptions and concerns about photo scanning https://t.co/z3POmJdJT9 via @benlovejoy
— Dwight Silverman (@dsilverman) August 9, 2021
There *is* something I can do if I’m that concerned. I can choose to not do business with Apple. If enough people choose that path, Apple would likely change their policy. That’s unlikely to happen. After all, how many Apple users are ok with Facebook & Google? https://t.co/Y41KiSPvRe
— Michael Gartenberg (@Gartenberg) August 9, 2021
Apple just published a FAQ document regarding its content-scanning rollout. Some choice parts:
— Nadim Kobeissi (@kaepora) August 9, 2021
"Does this mean Apple is going to scan all the photos stored on my iPhone?"
"No, but actually yes."?♂️
Full FAQ Here: https://t.co/poAetqdIbK pic.twitter.com/QKeYhZwgh6
Apple's response to 'slippery slope' concerns: "Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it." https://t.co/DaZrBuEONx pic.twitter.com/u0RChN0k5G
— Patrick Howell O'Neill (@HowellONeill) August 9, 2021
The hash function will leak out. And that will leave “the secrecy of NCMEC’s database” as the only remaining technical measure securing Apple’s encryption.
— Matthew Green (@matthew_d_green) August 8, 2021
But I’m going to tell you a secret: the really bad guys already have that database.
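Green’s point is easier to see with a toy example. The sketch below uses a simple average hash as a stand-in for a perceptual hash (it is not NeuralHash) and an invented “leaked” blocklist: anyone who holds both the hash function and the list can test or tweak images offline, which is why the secrecy of the database ends up carrying so much weight.

```python
# Toy average-hash ("aHash") as a stand-in for a perceptual hash. Not NeuralHash;
# it only illustrates why a leaked hash function plus a leaked list matter.
from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale grid, threshold on the mean, pack to an int."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Hypothetical leaked blocklist; the real database is supposed to stay secret.
leaked_blocklist = {0x81C3E7FF00FF183C, 0x0F0F0F0FF0F0F0F0}

def would_match(path: str) -> bool:
    # Exact set membership: with the function and the list in hand, anyone can
    # pre-check an image (or keep perturbing it until it no longer matches).
    return average_hash(path) in leaked_blocklist

# Example: print(would_match("vacation.jpg"))
```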
Apple also says that it won't add image hashes to the database; they have to come from NCMEC: "[hashes] are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes."
— Jason Aten (@JasonAten) August 9, 2021
This is a good one ---> https://t.co/rmacrmLlTC
— Steve Kovach (@stevekovach) August 9, 2021
Apple’s new FAQ on CSAM detection is disappointing. The document uses misleading phrasing to avoid explaining false positives. And the FAQ says little about how Apple will ensure the hashes are only CSAM and the same for all users. This is marketing. https://t.co/nSBf7RCgrk
— Jonathan Mayer (@jonathanmayer) August 9, 2021
The technology can be used for a wide range of scanning and cataloging beyond child abuse (including policing for copyright violations), but Apple pinky swears it would never do that.
— Nash Across the 8th Dimension (@Nash076) August 9, 2021
And as we all know, Apple always keeps its word. https://t.co/BnzlBH0Pe1
On the chance it could be forced to expand the scope of the feature:
— Jason Aten (@JasonAten) August 9, 2021
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
One Bad Apple
— hardmaru (@hardmaru) August 9, 2021
“Apple says that they will scan your Apple device for CSAM material. If they find something that they think matches, then they will send it to Apple. You could have corporate confidential information and Apple may quietly take a copy of it.” https://t.co/XsEQPEiKNL
CN Gov't: "Hey Apple, we need you to do a check of all iPhones and Macs in China to find out who might have shared and been exposed to this horrible image using your great new CSAM tool. Don't forget we're a quarter of your company revenue!" pic.twitter.com/piQfGmMugm
— Bad Hombre (very stable genius) (@chibaolema) August 7, 2021
"Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it." (2/2)
— Tim Bradshaw (@tim) August 9, 2021
One to keep for the record. https://t.co/jM6iWF2BZb
“It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.” https://t.co/Bp84beYJ9o
— hakan (@hatr) August 9, 2021
via @fubits
Apple’s put out a FAQ in response to backlash over their new CSAM photo scanning feature. Points out various ways their current design is privacy protective, which is great, but doesn’t really alleviate my core concerns. https://t.co/eSdL5CYoq8
— Julian Sanchez (@normative) August 9, 2021
Another frustrating aspect of anything involving CSAM is the implication that those warning about unintended consequences or technical failures are unconcerned about the problem or with protecting children.
— matt blaze (@mattblaze) August 8, 2021
In fact, many of us have ourselves been victims of child abusers.
I said this on the Vergecast too. At the end of the day it’s inevitably policy.
— Dieter Bohn (@backlon) August 9, 2021
In this case in particular all the technical layers are designed to make you think it’s technology that’s protecting your privacy. It’s not. It’s just Apple policy. https://t.co/rKV7ieV1w5
Apple responds to "slippery-slope" privacy concerns over CSAM tool: "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future..." (1/2)
— Tim Bradshaw (@tim) August 9, 2021
AFAICT, some civil society folks got a briefing from Apple 1 day before Thursday's announcement, and that was it. They touted the endorsement they got from prominent cryptographers, so the lack of even the usual pat phrase "in consultation with stakeholders from civil society" stands out.
— Riana Pfefferkorn (@Riana_Crypto) August 9, 2021
ICYMI: Apple looks to ease CSAM photo scanning concerns with new FAQ https://t.co/qMy7AT5xdq by @killianbell pic.twitter.com/uAUxpEIaWM
— Cult of Mac (@cultofmac) August 9, 2021