
Apple is making significant changes to the way Siri grading works and, most importantly, it is turning it OFF by default, making it an opt-in process. https://t.co/MXqGwDVRnW
— Matthew Panzarino (@panzer) August 28, 2019
But it remains super frustrating to see this briefed out under embargo to other outlets, while Apple didn’t even do us the courtesy of sending a note to let us know it had gone live. If they’re so sorry about their mistake, why is the Guardian being punished for revealing it?
— alex hern (@alexhern) August 28, 2019
“Privacy Fundamentalism” as a slander will not age well. https://t.co/9dgqd7JWIK
— Can Duruk (@can) August 28, 2019
Apple making Siri recording review opt in, excellent. Going the extra mile to take on liability for employees by bringing them in house, exemplary. Two good moves. https://t.co/DKQbrxOYUB
— Tom Merritt (@acedtect) August 28, 2019
In a nutshell, Apple will once again use humans to listen to a sampling of Siri recordings (in an effort to improve Siri). However, users have to opt-in and Apple employees will be the ones listening.
— Neil Cybart (@neilcybart) August 28, 2019
Apple apologizes for shortcomings in Siri's privacy protection. Three changes:
1) Audio recordings of Siri interactions won't be retained.
2) Users have to opt-in to have Siri recordings sampled.
3) Apple will no longer use contractors for process. https://t.co/VMwuFgpueG pic.twitter.com/RZsK0t4TT6
— Neil Cybart (@neilcybart) August 28, 2019
Guardian reports the contractors who were listening to and grading Siri recordings have been fired https://t.co/dIIjkylDs9 https://t.co/KUBd6OJF1x
— Steve Kovach (@stevekovach) August 28, 2019
Apple’s Siri-grading changes are good, but I’d like to see them go a bit further.
Saving audio should be opt-in, and is.
But saving transcripts is still mandatory if you want to use Siri at all. On-by-default is fine, but it should have an opt-out. https://t.co/XROYRUG9W9 pic.twitter.com/xD0cFrYXZP
— Marco Arment (@marcoarment) August 28, 2019
This Siri privacy thing does feel like the most full-throated apology Apple has made since the Maps debacle back in 2012 (unlike the somewhat passive aggressive apologies for things like Mac keyboards and iPhone battery issues) https://t.co/gXOm5kcUp3
— Tim Bradshaw (@tim) August 28, 2019
Note that if you don’t opt in to Apple saving Siri recordings, they will still store the text transcripts of what Siri heard. So snippets of ‘private’ conversations are still technically accessible to employees. https://t.co/tAe3HsaIAh
— Benjamin Mayo (@bzamayo) August 28, 2019
Apple says it won't record your Siri conversations anymore--but it will, via "opt-in." Most folks fall for "opt-in" ruse. I gave Google approval to monitor me, as feature I wanted not available any other way. Google wasn't straight with me. Will Apple be? https://t.co/BT8c61V330
— jeffersongraham (@jeffersongraham) August 28, 2019
“Siri, call bullshit.” https://t.co/zwcNcqSSI2 pic.twitter.com/kohFiEdt6P
— Chuck Ross (@ChuckRossDC) August 28, 2019
"We’ve decided to make some changes to Siri" in response to privacy concerns, announces @Apple. https://t.co/u4OEZN10HN
— Steve Herman (@W7VOA) August 28, 2019
I'm mostly joking but it feels like Apple watched my video https://t.co/scUpqC41aK — allowing users to "opt in" to help make Siri better instead of an option to "opt out," and no longer will they use 3rd party contractors for analysis. Only internal at Apple
— Tailosive Tech (@tailosivetech) August 28, 2019
Apple published a new Press Release: »Improving Siri’s privacy protections«. https://t.co/r8qwlz52By
— IsTheAppleStoreDown (@IsTheStoreDown) August 28, 2019
Apple's changing Siri's privacy protections...after getting exposed for hearing our conversations. It's the right move, but how many times has Apple ONLY changed BECAUSE they were caught - Battery Gate, Touch Disease, Keyboard, Siri etc. https://t.co/OKsifFlc4m pic.twitter.com/P7QEAbSFVR
— Brian Tong (@briantong) August 28, 2019
“Improving Siri’s privacy protections”
“Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger—” https://t.co/sbU138b65F
— Rene Ritchie (@reneritchie) August 28, 2019
Credit to Apple for a) apologising and b) committing to making it opt-in, not opt-out, once it reinstates the grading program: https://t.co/YWz7mysZrv
— alex hern (@alexhern) August 28, 2019
Apple lede of “At Apple, we believe privacy is a fundamental human right.” is always a precursor to something they did so badly and had to change. This time it’s Siri https://t.co/Ta1gUcgS3v
— Bʀʏᴀɴ (@bry_campbell) August 28, 2019
Apple just apologized for using contractors to listen to Siri audio recordings for "grading" purposes and is making it opt-in and if you opt-in, reviewed by Apple employees https://t.co/KKwOVCWwJT
Are y'all satisfied now?
— Raymond Wong (@raywongy) August 28, 2019
Apple will no longer keep Siri audio recordings by default, makes feature opt-in https://t.co/i3Y9c7rAPv by @campuscodi
— ZDNet (@ZDNet) August 28, 2019
#Apple will no longer keep #Siri audio recordings by default, makes feature opt-in
“What happens on your iPhone stays on our servers”... #CyberSecurity #Privacy https://t.co/VAaL502eII
— ᴊᴏʜᴀɴɴᴇs ᴅʀᴏᴏɢʜᴀᴀɢ (@DrJDrooghaag) August 28, 2019
This is the right move // Apple will no longer keep Siri audio recordings by default, makes feature opt-in https://t.co/Gn2GJimoIb via @ZDNet & @campuscodi
— Jason Cipriani (@MrCippy) August 28, 2019
Apple will no longer keep Siri audio recordings by default, makes feature opt-in https://t.co/7vpqtNOJCZ
— ぱんだ-精神障害2級-病み度;9,999,999,999% (@Panda_Lv6) August 28, 2019
A slew of recent stories revealed privacy failures in which Apple, Amazon, Microsoft, Google, & Facebook all sent users' audio clips to contractors for training voice-to-text AI. Now Apple has announced privacy improvements, and… they're pretty good! https://t.co/35S1EhMs8O
— Arvind Narayanan (@random_walker) August 29, 2019
Siri audio for improving speech recognition - customers must opt-in for human review of audio going forward, and only Apple employees will be doing reviews. https://t.co/VZhADj9urk
— Jules Polonetsky (@JulesPolonetsky) August 28, 2019
Improving Siri’s privacy protections & why wasn’t this the case to begin with? https://t.co/1oUkSbimMZ
— Privacy Matters (@PrivacyMatters) August 28, 2019
Improving Siri's privacy protections - Apple Newsroom https://t.co/JSuEnzvmSD pic.twitter.com/1wnni4FUNA
— Rich Tehrani (@rtehrani) August 28, 2019