Reading through Apple’s Platform Security docs for iCloud*, it seems like server-side CSAM scanning of iCloud Photos would be cumbersome and expensive for Apple, so I’m betting client-side scanning is an optimization.
— Dino A. Dai Zovi (@dinodaizovi) August 7, 2021
_________________________
* https://t.co/rxs56uGMAs https://t.co/KKSVqv9olD
We are now at over 3,000 screeching voices of the minority! We tried to vet every signature.
— Nadim Kobeissi (@kaepora) August 7, 2021
Our open letter now illustrates a very strong opposition to Apple's new content screening measures.
Thanks @Snowden for keeping me and @georgionic up all night! https://t.co/niSvN1jLXH
Apple has insisted on defining privacy as “we can’t see your data” then criticized Google & Facebook on privacy.
— Dare Obasanjo (@Carnage4Life) August 7, 2021
However, there's also "we protect your data from governments," where their track record is worse than others', which is the elephant in the room for on-device scanning.
I have an entire lecture on live abuse, and I can't replicate the whole thing here, but here is a very stark description of what that can look like.
— Alex Stamos (@alexstamos) August 7, 2021
Content Warning: Really horrible child abuse. https://t.co/rHbitWSKU0
If you must read only one article to see the absurdity of the bombastic headline regurgitation currently being spewed about Apple's new child pron alerting system: https://t.co/rwB250w1AH
— Kontra (@counternotions) August 7, 2021
There's already been a lot of smart commentary on why this is a terrible idea. But I also wanted to add another piece of context:
— Wagatwe Wanjuki (@wagatwe) August 6, 2021
Violence against women and children is often used as an excuse to expand state surveillance without actually helping. https://t.co/ZHWOK2YChV
If you’re still confused by Apple’s CSAM initiative, try looking at it through this lens:
— Rene Ritchie (@reneritchie) August 7, 2021
Apple can’t abide known CSAM images on, or trafficked through, their servers, but they don’t want to scan actual private user libraries to catch them, and this system solves for that
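To make "this system solves for that" a bit more concrete, here is a deliberately simplified sketch of the threshold idea at the heart of the design: fingerprints of photos queued for iCloud upload are compared on-device against a database of known-CSAM fingerprints, and nothing becomes reviewable unless an account crosses a match threshold. The names below are hypothetical, and the real system's cryptography (blinded hash database, safety vouchers, private set intersection) is omitted entirely.

```swift
// Simplified, hypothetical sketch of threshold-gated matching.
// Not Apple's API; the cryptographic machinery is left out.
struct UploadCandidate {
    let photoID: String
    let fingerprint: String   // stand-in for a perceptual-hash digest
}

func candidatesRequiringReview(
    uploads: [UploadCandidate],
    knownFingerprints: Set<String>,
    threshold: Int
) -> [UploadCandidate] {
    // Match only against fingerprints of *known* images supplied in the database.
    let matches = uploads.filter { knownFingerprints.contains($0.fingerprint) }
    // Below the threshold, nothing is reported or reviewable.
    return matches.count >= threshold ? matches : []
}
```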
I doubt 2 would have much effect? Why do you think it would?
— Charles Arthur (@charlesarthur) August 7, 2021
Seems like 1 should have a dramatic effect in identifying offenders, if the proportion holding it is anything like what the FB/WApp numbers would imply.
My guess is that 1 is mainly Apple no longer being willing/able to tolerate CSAM on their servers and this is their engineering solution to that
— Rene Ritchie (@reneritchie) August 7, 2021
Amazing how many sources got this story completely abjectly wrong. The slippery slope arguments are reasonable and worth considering—but if you don’t get the technology, you can’t even begin to plausibly make those arguments. https://t.co/gGuSOZ52la
— Matt Rosoff (@MattRosoff) August 7, 2021
Great overview and take on Apple CSAM stuff by @gruber - Apple's messaging/clarity on this could have been much much better, but it's not entirely a messaging issue. https://t.co/qMChBauYaB
— Arnold Kim (@arnoldkim) August 7, 2021
Unlike most of the alarmist pieces published this week, John @Gruber writes a comprehensive analysis of how Apple's new child safety initiatives actually work. Bottom line: while not perfect, it protects personal privacy a lot more than you think. https://t.co/bT1GVVRFfk
— Carl Howe (@cdhowe) August 7, 2021
are.
— Aral Balkan (@aral) August 7, 2021
I do not say this lightly or take any pleasure whatsoever in saying it.
I cannot stress enough how important it is that this line does not get crossed. https://t.co/dsrOy02lqg (2/2)
I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups. A reasonable target should be scanning shared iCloud albums, which could be implemented server-side.
— Alex Stamos (@alexstamos) August 7, 2021
1/ RE: Apple’s plan to scan every photo in iMessage with machine learning and alert parents to nudity.
— Brianna Wu (@BriannaWu) August 7, 2021
I think a lot of well-meaning Silicon Valley people don’t understand what life is like in the Religious Right South.
Let me share so you can imagine how it will be misused.
As a parent, I am so happy to see companies begin to take child safety more seriously.
— Aphrodite (@Mom_Mykayla) August 6, 2021
Child Safety - Apple https://t.co/XYlTNAQrFz
Very useful/detailed John Gruber explanation. Also points to hasty/wrong/prejudiced WaPo article. https://t.co/s6Yr7qnHZo
— Jean-Louis Gassée (@gassee) August 7, 2021
This is also a good reminder for privacy and security folks to pay more attention to the global regulatory environment. While the PRC has been invoked a lot, I expect that the UK Online Safety Bill and EU Digital Services Act were much more important to Apple's considerations.
— Alex Stamos (@alexstamos) August 7, 2021
Strongly agree, I can live with the design of iCloud as it exists - though what could happen next concerns me. It’s the iMessage component that gives me far more pause.
— Brianna Wu (@BriannaWu) August 7, 2021
That said, I *think* it’s very difficult to change the age of a child’s iCloud account. I forget why I tried that for my son a few years ago, but I gave up. (I wanted to make him “older” for something, but don’t recall the details.)
— John Gruber (@gruber) August 7, 2021
Here's a level-headed and detailed explanation about all the new Apple bad-image-detection features, including when one should actually start to worry, a startling revelation about existing FBI back-doors, and a good ol' take down of a sloppy journalist https://t.co/kDj8ECshrf
— Andrew Burke (@ajlburke) August 7, 2021
So I *think* this is why the Messages feature gives so much control to the kids, even those 12 and under. Control meaning that they always get a clear warning upon receiving or attempting to send a flagged image that proceeding will result in a notification to their parents.
— John Gruber (@gruber) August 7, 2021
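As a rough sketch of the flow Gruber describes (illustrative names only, not Apple's API, written from the August 2021 announcement): the parent notification is gated on the child explicitly choosing to proceed after the warning, and applies only to accounts aged 12 and under.

```swift
// Hypothetical sketch of the Messages warning flow as announced:
// the image arrives blurred, the child is warned, and a parent is
// notified only if a child aged 12 or under chooses to proceed anyway.
enum ChildChoice {
    case declined   // child backs out of viewing or sending the image
    case proceeded  // child views/sends the image after the warning
}

func handleFlaggedImage(childAge: Int, choice: ChildChoice, notifyParents: () -> Void) {
    // If the child backs out, the image stays blurred and nobody is notified.
    guard case .proceeded = choice else { return }
    if childAge <= 12 {
        // The warning explicitly told the child that proceeding does this.
        notifyParents()
    }
    // Teens (13-17) still get the blur and warning, but no parental notification.
}
```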
What’s the net effect on CSAM on iOS of both:
— Benedict Evans (@benedictevans) August 7, 2021
1: on-device scanning of images uploaded to iCloud for matches to known CSAM and
2: iOS internet traffic going through a double-blind encrypted relay, so IP addresses can’t ID users
(And could Apple, politically, do 2 w/out 1?)
Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n
— EFF (@EFF) August 5, 2021
The chatter around Apple's recent announcement of new child safety protections is rather frustrating, as the amplification/knowledge ratio seems rather high on this one.
— Alex Stamos (@alexstamos) August 7, 2021
Apple's docs: https://t.co/TIcVQ6Zb1J
A discussion with friends: https://t.co/c4IYPVMHUA
I learned a helluva lot just now about Apple’s upcoming child safety features by reading this brilliant @GlennF and @rmogull FAQ for TidBITS.
— Steven Aquino (he/him) (@steven_aquino) August 7, 2021
A must-read story.
(h/t @jsnell) https://t.co/q2pDuYZW8X
Good thread. It has occurred to me that while the CSAM iCloud photo detection is the feature garnering the most attention, the Messages image detection for kids has more potential to be problematic *as it exists*.
— John Gruber (@gruber) August 7, 2021
The real number is 236 in 1yr for iPhone. Which is a bit lower than 17m a month.
— thaddeus e. grugq (@thegrugq) August 7, 2021
This was a good conversation on Apple's new child safety announcement. Very smart folks with a variety of expertise and POVs including @alexstamos who has worked in the trenches of trust and safety and talks about the horrifying reality of the harm https://t.co/wk9YhcVhkb
— Patrick Howell O'Neill (@HowellONeill) August 6, 2021
The minority of reports, but the worst, are those that involve the live abuse of a child. Those are often reported to NCMEC and also referred live to the relevant authorities by a combination of the child safety investigator (was part of my team) and law enforcement response (in legal).
— Alex Stamos (@alexstamos) August 7, 2021
3/ Any discussion of sex, certainly any discussion of homosexuality, anything promoting a secular worldview would be censored.
— Brianna Wu (@BriannaWu) August 7, 2021
If you don’t think parents like that are going to change their child’s age in iCloud to create a permanent surveillance state, you’re just wrong.
And yet there are only 250 cases from iPhone, which suggests that something isn’t adding up with that narrative. The CSAM database can only find known CSAM photos that have been added to it. So new content won’t get flagged, only existing content, and that existing content has to come from somewhere.
— thaddeus e. grugq (@thegrugq) August 7, 2021
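A toy illustration of the grugq’s point: matching against a curated fingerprint database can only ever identify images that were previously catalogued, so newly produced material never matches. SHA-256 here is just a stand-in for the perceptual hash Apple actually describes, and the function name is made up for the example.

```swift
import CryptoKit
import Foundation

// Illustrative only: a lookup against a set of known digests.
// A brand-new image hashes to a value that is not in the set,
// so it can never be flagged by this kind of matching.
func isKnownImage(_ imageData: Data, knownDigests: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownDigests.contains(digest)
}
```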
If you’re like me you might have had a hard time cutting through the noise to understand exactly what is going on.
— Curtis Herbert (@parrots) August 7, 2021
This is a very good piece breaking down exactly what Apple is doing, and what they can and cannot see. A good 101 before convos about pros and cons and slopes. https://t.co/9PwI9RNvhQ
This is my point. You can’t think about these systems *AT THEIR BEST.* You have to ask hard questions about how they could be misused.
— Brianna Wu (@BriannaWu) August 7, 2021
As an industry, we massively suck at considering the downside. In this case, the downside is outed, potentially dead children. https://t.co/K3A6kwNjaT
Is CSAM on iPhones a big problem? How do they get it there from the dark web? Don’t pedos believe this capability already exists and avoid iPhones to begin with?
— thaddeus e. grugq (@thegrugq) August 7, 2021
The vast majority of Facebook NCMEC reports are hits for known CSAM, using a couple of different perceptual fingerprints and both NCMEC's and FB's own hash banks. What constitutes a "report", which can have multiple images and IPs, is one of the data challenges.
— Alex Stamos (@alexstamos) August 7, 2021
I’m a big fan of @apple and I use a ton of their products and services
— Olivier Simard-Casanova (@simardcasanova) August 7, 2021
But on top of being creepy, we currently have zero *credible* reassurance that this system won’t be weaponized by authoritarian regimes or used for unrelated purposes
Maybe bc such a reassurance cannot exist https://t.co/8JJ07wHlCI
Probably the best summary I’ve seen of Apple’s recently announced child safety initiatives, including where critics are excessive or plain wrong, as well as genuine concerns for potential abuse in the future. https://t.co/lih9AG5w22
— Patrick Beja (@NotPatrick) August 7, 2021
So the Apple scanning system just dropped. https://t.co/MhkM66aJGM
— Matthew Green (@matthew_d_green) August 5, 2021
Apple unveils changes to iPhone designed to notify parents of child sexual abuse. But some privacy watchdogs are concerned about the implications. https://t.co/PBQD1jgqWR
— NYTimes Tech (@nytimestech) August 6, 2021
The irony of Apple’s theory that any analysis done on the device is automatically private and any analysis in the cloud violates your privacy: they built a CSAM detection system that is *potentially* much worse for your privacy *because* it happens on your device
— Benedict Evans (@benedictevans) August 7, 2021
I agree with Gruber’s take that the 2018 delay on E2E iCloud backup encryption was done in order to incorporate this. The timings all feel right. I think there’s probably been CSAM but it’s been hard to action (can’t be seen to root through iCloud photos). Now it can.
— Charles Arthur (@charlesarthur) August 7, 2021
Apple saying they will “reject any such demands from a government” is laughable at best and a flat out lie at worst.
— Fernando (@fromjrtosr) August 7, 2021
Once any government makes it a law for Apple to use a different set of hashes, what then? Are they stopping business in China? In the US? In the EU?
Ridiculous. https://t.co/njVpZHDCGJ
One of the basic problems with Apple's approach is that they seem desperate to avoid building a real trust and safety function for their communications products. There is no mechanism to report spam, death threats, hate speech, NCII, or any other kinds of abuse on iMessage.
— Alex Stamos (@alexstamos) August 7, 2021
2/ I have a family member who enacted total lockdown on *everything* her child read.
— Brianna Wu (@BriannaWu) August 7, 2021
So when her child would go to the library to check out books, upon returning home she would pass them to my family member.
She would then research *every single* book online and clear it.
These numbers suggest it's common on regular services, which are easily accessible on an iPhone. https://t.co/dEL8Luv9OG
— Pär Björklund (@Paxxi) August 7, 2021
I’d like to know what those Facebook numbers represent. New CSAM? Or people uploading CSAM that gets flagged by the db? And how many users are doing this? I saw one number was 20m, but that can’t be right. That’s like the entire adult male population of Great Britain.
— thaddeus e. grugq (@thegrugq) August 7, 2021
I meant that if Facebook detects that much, that's an indication of how prevalent it is on regular sites, no "dark web" required.
— Pär Björklund (@Paxxi) August 7, 2021
… I pay attention to these. But nothing could prepare anybody for such a sudden announcement of a system shipping in a few weeks, with powerful capabilities but a rather unclear governance/oversight structure. I’m pretty surprised in light of previous messages.
— Lukasz Olejnik (@lukOlejnik) August 7, 2021
What about the on-device nature of it makes it worse? Assuming the cloud implementation scanned the same set of photos…
— Antonio García Martínez (@antoniogm) August 7, 2021
#Apple’s #neuralMatch tool will scan #iPhones for child abuse content #Ransomware #DataBreaches #DarkWeb #CyberSec #infosec #Security #cybercrime #ThreatIntel #hackers #dataprotection #cyberthreats #cybersecurity #cyberattacks #cyberintelligence https://t.co/Wsz2gyjTno
— Jiniba (@JinibaBD) August 7, 2021
Governments can ask them to scan for XYZ you’ve uploaded to iCloud today. Apple has opened the door to scanning content on your device.
— Dare Obasanjo (@Carnage4Life) August 7, 2021
Even Gruber concedes this is new and now depends on how much you trust Apple to say “no” to various governments.
https://t.co/SODQlhhc3m
All due respect here, but that’s not what they are doing, and it’s not what they announced. @gruber has a solid critique of what they will be doing: https://t.co/kxJRCXjlBT
— Dave Loftis (@lofdev) August 7, 2021
They are going to “bust” a whole lot of familial nudists - Apple wants to check your phone for child abuse images – what could possibly go wrong? | Arwa Mahdawi | The Guardian https://t.co/eGRhF4vjma
— David J. Ley PhD (@DrDavidLey) August 7, 2021
Apple wants to check your phone for child abuse images – what could possibly go wrong? On the surface Apple’s new features sound both sensible and commendable – but they also open a Pandora’s box of privacy and surveillance issues
— Alfons López Tena (@alfonslopeztena) August 7, 2021
By @ArwaM https://t.co/HLPNvB1QEi
Apple is being criticized over a new system that detects images related to Child Sexual Abuse Material (CSAM) before they are stored in iCloud. The danger that it could be politically over-broadened and abused has been pointed out, but…
— Megalith IT Alliance (@meg_it_all_JPN) August 7, 2021
(Representative) https://t.co/wCh1jSZQMG
BBC "Apple criticised for system that detects child abuse" https://t.co/fVgbqMSLcc
— とりさん@11月に退職予定 (@biochem_fan) August 7, 2021
"new applications of cryptography" ということは、ハッシュ関数でマッチするのかしら? 当初の目的は妥当でも、例えば中国のような国家で他のコンテンツの検閲にも使える危険な技術であるのは間違いない。
Interesting that Apple is getting criticised by other tech companies who have a very sketchy history with users' privacy, but praised by senior politicians in both the US and UK. https://t.co/D7XAQHWjiJ
— Shaun Jenks (@Shaunjenks) August 7, 2021
https://t.co/pDMRZqvA9P
— 秋吉 健 (@bari5) August 7, 2021
This is finally becoming a big deal overseas, too.
#今日の英語ニュース (today's English-language news)
— 亀の子 (@kamenokoki) August 7, 2021
Apple adds a feature that detects child pornography!
From a child-protection standpoint, this measure seems like a good thing.
Photos on personal devices are subject to detection even before they are uploaded to iCloud.
There are also voices of opposition saying it will become a breeding ground for government surveillance of personal devices…? #英語学習 https://t.co/qAoPRjad4Z
Apple takes a step towards opening the back door https://t.co/RiTfG6eKmX | opinion
— Financial Times (@FT) August 7, 2021