I talked with @nytimes for this piece on the new NIST report (thanks to @CadeMetz for reaching out!). In this thread I elaborate a little on my perspective: https://t.co/Hc4CQ86sFw
— Maria De-Arteaga (@mariadearteaga) December 20, 2019
.@BennieGThompson: The report "shows facial recognition systems are even more unreliable and racially biased than we feared. ... The Administration must reassess its plans for facial recognition technology in light of these shocking results" https://t.co/QFA7619jMB
— Drew Harwell (@drewharwell) December 19, 2019
As with much of the facial recognition stuff, the problem is not “wow these products are so cutting edge,” it’s “wow, these products are remarkably bad.” https://t.co/g0pas119do
— Emily Rauhala (@emilyrauhala) December 19, 2019
No surprise here but if we care to preserve any justice in our institutions, this is further warrant for a hard chill on govt and security use of these systems. https://t.co/dL7wxiIQC7
— Shannon Vallor (@ShannonVallor) December 19, 2019
1) NIST has a new evaluation of Facial Recognition software out. Not surprisingly, the software keeps getting better and more accurate, but demographics matter. https://t.co/EYFIMIbzK5
— Albert Gidari (@agidari) December 19, 2019
i think every day about the really stupid tweet i saw a while back about how algorithms cannot be racist because they're math https://t.co/kfnwJL0gHy
— Steven Rich (@dataeditor) December 19, 2019
One false match can lead to missed flights, long interrogations, tense police encounters, false arrests, or worse.
Even government scientists are now confirming this surveillance technology is flawed and biased. https://t.co/BSpInNZuo9
— ACLU (@ACLU) December 19, 2019
“The majority of face recognition algorithms exhibit demographic differentials. A differential means that an algorithm’s ability to match two images of the same person varies from one demographic group to another.”
So, maybe not for policing.... https://t.co/r7R2BfCDoD
— Andrew G. Ferguson (@ProfFerguson) December 19, 2019
“The agency did not test systems from Amazon, Apple, Facebook and Google because they did not submit their algorithms for the federal study” https://t.co/yyPWtyWd6o via @NYTimes
— Lori McGlinchey (@macvie) December 19, 2019
New: Huge federal review of facial-recognition systems finds that white people get more accurate searches than everyone else. With some algorithms, Asian and African American people were *up to 100 times* more likely to be falsely identified than white men https://t.co/QFA760RIV3
— Drew Harwell (@drewharwell) December 19, 2019
“Asian & African American ppl were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Native Americans had the highest false-positive rate of all ethnicities, according to the study.” https://t.co/9KblwSlsr7
— blmohr (@blmohr) December 19, 2019
Facial-recognition technology also had more difficulty identifying women than men. And it falsely identified older adults up to 10 times more than middle-aged adults. https://t.co/C4wg1DquPJ
— NYT Business (@nytimesbusiness) December 20, 2019
It’s almost like race is a pervasive structure that is built into the architecture of the social world. https://t.co/iP2ysoQ0vJ
— Victor Ray (@victorerikray) December 19, 2019
There are two sides to this:
- we should not be developing better, more accurate facial recognition to surveil minorities
- facial recognition in China works incredibly accurately compared to American tech, so the racism is just really, really obvious here https://t.co/s8DGR6gMur
— Emily G (@EmilyGorcenski) December 20, 2019
Detroit - a majority black city - purchased facial recognition tech in 2017 for over $1M. After using the tech more than 600 times, DPD this fall couldn’t tell us how many arrests it’s led to. https://t.co/hzPmMCrwl1 https://t.co/u5pyl2r4lz
— Allie Gross (@Allie_Elisabeth) December 19, 2019
Oh who ever would have thought that face recognition systems have been trained to be technically racist and sexist, which compounds the sexist, racist ways they are deployed? https://t.co/xRJrVG5zl8
— Sasha Costanza-Chock (@schock) December 19, 2019
This NIST study pretty much confirms all the pioneering work done by @jovialjoy @timnitGebru and @rajiinio wrt bias in facial rec. At this point, it would be v difficult to see how vendors could make the case for any LE applications. https://t.co/GdJFvxtmzr
— William Isaac (@wsisaac) December 19, 2019
This will surprise no black people https://t.co/GEPweQEPIr
— Abby D. Phillip (@abbydphillip) December 20, 2019
Federal study: Facial recognition systems most benefit middle-aged white males (by @mrgreene1977) https://t.co/at1LMGbJaP
— TNW (@thenextweb) December 20, 2019
well well well https://t.co/oaK5yuIBpU
— Tajha Sophia Chappellet-Lanier (@TajhaLanier) December 19, 2019
Real problems with facial-recognition technology according to the experts at @NIST. https://t.co/jvP2FlGlQE
— Jessica Rosenworcel (@JRosenworcel) December 19, 2019
A sweeping new study of facial-recognition products looked at 189 software algorithms from 99 developers, the majority of the industry. Alarming takeaways... https://t.co/PpWNcGXDjR
— Jacob Ward (@byjacobward) December 19, 2019
Here is a link to the NIST write up: https://t.co/JRag7cSQCK
— William Isaac (@wsisaac) December 19, 2019
Current facial recognition algorithms are a lot crappier on people of color than white folk, with lots more false positives (wrongly saying, "I think that's the same guy!"). Given the increasing use by police etc., this should be of significant concern. https://t.co/ItkZbF4pKN
— Dave Hill (@Three_Star_Dave) December 19, 2019
New NIST study of face recognition finds "the majority of face recognition algorithms exhibit demographic differentials." #FaceRecognition https://t.co/gWbwHgnyS9
— Jay Stanley (@JayCStanley) December 19, 2019
NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software | NIST https://t.co/rcEuUg0ydX
— Philippe Vynckier - CISSP (@PVynckier) December 19, 2019
Federal study of top facial recognition algorithms finds ‘empirical evidence’ of bias https://t.co/XvGhEv7M9x pic.twitter.com/tgTKbQdHUb
— The Verge (@verge) December 20, 2019
Federal study of top facial recognition algorithms finds ‘empirical evidence’ of bias - The Verge https://t.co/SsYEKKDnlK
— Kyle E. Johnson (@kyleejohnson) December 20, 2019
3-year ban on police use of facial recognition technology in California to start in the new year #UCSBinfosec #ucsb #ITsecurity #cybersecurity #UCCyberStrong https://t.co/EgPPW8v6W4
— UCSB Information Security (@UCSBInfoSec) December 20, 2019
It's a very bad day for #AI
— https://t.co/hF3Z2LZMOJ by @natashanyt @CadeMetz
— https://t.co/Iwmn6eWigK by @willknight @WIRED
— https://t.co/V6D8eZVndi by @cwarzel @stuartathompson
— https://t.co/0MV9Rs3cWs by Max Weiss https://t.co/nBltoxzQBU pic.twitter.com/JgByWjTZT0
— Eric Topol (@EricTopol) December 20, 2019
We might've laughed when we read that the North Wales Police's use of #facialrecognition returned 92% false positives. But "one false match can lead to... lengthy interrogations, watch list placements, tense police encounters, false arrests or worse" #AI https://t.co/GcnHm6WuCb
— Dorothea Baur (@DorotheaBaur) December 20, 2019
“While some biometric researchers and vendors have attempted to claim algorithmic bias is not an issue or has been overcome, this study provides a comprehensive rebuttal” - Joy Buolamwini of MIT. @natashanyt + @CadeMetz explain. https://t.co/eW5HWToSyP
— Steve Lohr (@SteveLohr) December 20, 2019
Many Facial-Recognition Systems Are Biased, Says U.S. Study
Algorithms falsely identified African-American and Asian faces 10 to 100 times more than Caucasian faces, researchers for the National Institute of Standards and Technology found. https://t.co/3NUq962kCL
— PrivacyDigest (@PrivacyDigest) December 20, 2019
New study confirms facial-recognition software has much lower accuracy rates in identifying non-whites. I wrote this summer about how that became a flashpoint in Detroit, whose share of black residents is larger than that of any other big American city. https://t.co/YLquv7X5KZ https://t.co/IWuwK5L6iT
— Amy Harmon (@amy_harmon) December 20, 2019
US study shows most commercial #facialrecognition systems exhibit bias against African-American and Asian faces. https://t.co/EKY8ywJbsc pic.twitter.com/ZeDxPMKP2v
— Access Now (@accessnow) December 20, 2019
Many #FacialRecognition Systems Are #Biased, Says U.S. @NIST Study https://t.co/3R3DY3ZLAx #fintech #insurtech #AI #ArtificialIntelligence #MachineLearning #DeepLearning @natashanyt @CadeMetz @DeepLearn007 @psb_dc @sallyeaves @ahier @KirkDBorne @Fisher85M @Paula_Piccard pic.twitter.com/bb61TOeXKh
— Spiros Margaris (@SpirosMargaris) December 20, 2019
AI facial recognition: recognition accuracy differs by race, U.S. national institute study finds https://t.co/7MNrTJJxL7
— Haruhiko Okumura (@h_okumura) December 21, 2019
NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software https://t.co/lcb2CR66KX
The study evaluated 189 pieces of software; many lose accuracy on non-white faces, presumably because the training data is insufficient.
#AI Bias: @NIST study of 189 software apps found that Asian & African-American ppl were up to 100x more likely to be misidentified in one-to-one matching than white men. In one-to-many matching, faces of African-American women returned most false positives. https://t.co/5bgolfaMZg
— Amy Diehl, PhD (@amydiehl) December 21, 2019
Facial recognition algorithms are biased - 10x-100x more errors for Asian and African-American faces versus Caucasian! @annagines https://t.co/CWBcoHzcEF
— eSteve almirall (@ealmirall) December 20, 2019
One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse. https://t.co/abFAEeOBjB
— NYCLU (@NYCLU) December 20, 2019
Facial recognition systems sold by Cognitec, Megvii, and Microsoft falsely identified African-American and Asian faces 10 to 100 times more than Caucasian faces, according to new U.S. government study https://t.co/CfB5zjE8Qz pic.twitter.com/jSAUna6Vm2
— CorpWatch (@CorpWatch) December 20, 2019
Federal study confirms racial bias of many facial-recognition systems, casts doubt on their expanding use https://t.co/BzF3kJLlJZ
— Rich Tehrani (@rtehrani) December 21, 2019