Siri and Alexa Fuel Sexism, U.N. Finds [www.nytimes.com]
United Nations report says Alexa, Siri and other artificial intelligence voice assistants have gender bias coded in by programmers [www.cbsnews.com]
AI assistants like Siri and Alexa are perpetuating sexist stereotypes, UN says [edition.cnn.com]
How our interactions with voice assistants normalize sexual harassment [thenextweb.com]
Smart assistants need better sleep timers [www.theverge.com]
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices. The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.” https://t.co/GDqsdU5HzB cc @theuniverse
— Tracy Chou (@triketora) May 23, 2019
“But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place.” Well said, @WLinAI! #FightBias https://t.co/fFyxUYcY39
— Lean In (@LeanInOrg) May 23, 2019
Siri and Alexa Reinforce Gender Bias, U.N. Finds https://t.co/YvACm2o43e
— Brie Code (@briecode) May 23, 2019
The one question I always asked... why were they all women? Siri and Alexa Reinforce Gender Bias, U.N. Finds https://t.co/u4YM8GnwCE
— Anne N Kabugi (@njambikabugi) May 23, 2019
Interesting article on gender bias in AI virtual assistants (Siri, Alexa, Google Home)
— Helen Grote (@helengrote) May 23, 2019
‘The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants - and penalised for not being assistant-like’ https://t.co/OgUoB0bzJk
I'd like to see Alexa and Siri spit mandatory lectures on feminist theory in response to any command https://t.co/hgsgIIwfbp
— Alexis Grenell (@agrenell) May 23, 2019
Fantastic to see Keele's Dr Allison Gardner (@AllisonCGardner) interviewed in the New York Times (@nytimes) about how gender stereotypes are reinforced in A.I. technology https://t.co/62vHJA84cX
— Keele Comms Team (@KeeleComms) May 23, 2019
What else do you expect when a whole industry is run by white male tech bros. https://t.co/0iqRWYM6Qp
— Christian Behrens (@c_behrens) May 23, 2019
"More people will speak to a voice assistance machine than to their partners in the next five years, the U.N. says, so it matters what they have to say." @PamelaFalk ( @CBSNews ) write on this subject and an interesting @UNESCO report https://t.co/6Uaaci1YTH
— Herve Verhoosel (@HerveVerhoosel) May 23, 2019
#BigQuestion Are the AI voice assistants like Siri and Alexa objectifying women? https://t.co/0axiN5k44u pic.twitter.com/YGwTVBbnSB
— Pamela Falk (@PamelaFalk) May 23, 2019
#Breaking Is it time for Alexa and Siri to have a "MeToo moment"? https://t.co/0axiN5k44u via @cbsscitech
— Pamela Falk (@PamelaFalk) May 23, 2019
Is it time for #Alexa and #Siri to have a "#MeToo moment"? https://t.co/whL2aHZxQz #fintech #AI #ArtificialIntelligence #MachineLearning #DeepLearning #robotics #bias @PamelaFalk @CBSNews @DianeKazarian @GhelaBoskovich @psb_dc @SabineVdL @efipm @helene_wpli @missdkingsbury pic.twitter.com/d9h0o88bn7
— Spiros Margaris (@SpirosMargaris) May 23, 2019
Hey, Siri and Alexa, y’all need a union. Voice assistants reinforce gender stereotypes. Gender bias coded in by programmers. @PamelaFalk @CBSNews @UN @WACATL #IWD2020 #AI https://t.co/VwsO0wCCG5
— Charles Shapiro (@shapiro_WAC) May 23, 2019
"The voices we speak to are programmed to be submissive and accept abuse as a norm" - @PamelaFalk @CBSNews #AI #gender https://t.co/wQAT2putXN pic.twitter.com/vLsJscnY7q
— Katja Frostell (@KatFrostell) May 23, 2019
United Nations report says Alexa, Siri and other artificial intelligence voice assistants have gender bias coded in by programmers - CBS News https://t.co/R19Udlb1AU
— Rikki Klieman (@rikkijklieman) May 23, 2019
United Nations report says Alexa, Siri and other artificial intelligence voice assistants have gender bias coded in by programmers - CBS News Via @shapiro_WAC https://t.co/dzQhGw7BSB
— Louise Blais (@blais_louise) May 23, 2019
"Siri responded provocatively to requests for sexual favours by men ('Oooh!'; 'Now, now'; 'I'd blush if I could'; or 'Your language!'), but less provocatively to sexual requests from women ('That's not nice' or 'I'm not THAT kind of personal assistant')" https://t.co/pSN5LMua3X
— David Schmitt (@PsychoSchmitt) May 22, 2019
Saudi Arabia sits on the United Nations women's rights commission, but the UN wants you to know that The Real Problem(TM) is Siri and Alexa. https://t.co/80aHrpanmm
— Brian "I'm Popular" Carnell (@brian_carnell) May 22, 2019
Hey Siri, stop perpetuating sexist stereotypes
-- The UN https://t.co/MFeHC9KeGp
— CNN International (@cnni) May 22, 2019
How our interactions with voice assistants normalize sexual harassment https://t.co/FpBtlIUddx
— TNW (@thenextweb) May 23, 2019
Smart assistants have totally slept on sleep timers and I’m tired of it https://t.co/TGUxFPuU7k pic.twitter.com/tpcuwXwrCV
— The Verge (@verge) May 22, 2019
To this excellent and correct @cgartenberg complaint I would add that assistants should let you turn things on or off for a specific duration. "Hey Google turn the fan on for 20 minutes." https://t.co/iRRkMMBJwL
— Dieter Bohn (@backlon) May 22, 2019
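Bohn's suggestion amounts to a one-shot timer wrapped around a device command. A minimal sketch of that logic in Python, assuming a hypothetical set_device_state() smart-home call (the name and signature are invented for illustration; no assistant actually exposes this API):

```python
import asyncio

# Hypothetical device call; real assistants use vendor-specific
# smart-home APIs. This stub just logs the state change.
async def set_device_state(device: str, on: bool) -> None:
    print(f"{device} -> {'on' if on else 'off'}")

async def run_for(device: str, minutes: float) -> None:
    """Turn a device on, hold that state for the duration, then turn it off."""
    await set_device_state(device, on=True)
    await asyncio.sleep(minutes * 60)
    await set_device_state(device, on=False)

# "Hey Google, turn the fan on for 20 minutes."
asyncio.run(run_for("fan", minutes=20))
```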
the observation, idea, and writing of this good take is all @cgartenberg, but the headline, the headline is all me, baby https://t.co/Egp1oXseiR
— dan seifert (@dcseifert) May 22, 2019
Gender stereotypes of deferential young women are baked into “most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system.” https://t.co/2qslxMOiac pic.twitter.com/zF4IK4vewj
— Kenneth Roth (@KenRoth) May 24, 2019
Siri and Alexa Reinforce Gender Bias, U.N. Finds https://t.co/DS5oJdBWSG
— Ada Lovelace Institute (@AdaLovelaceInst) May 24, 2019
AI is a powerful tool, but without by-design diversity protection, it will end up jeopardizing free choice, real diversity and ultimately our true human individuation process of thought and identity. #OnePlanetOneHealth #Openscience https://t.co/4I7gbVGWwd
— Emmanuel Faber (@EmmanuelFaber) May 23, 2019
Much needed push back on gendered AI technologies. These stereotypes are programmed into thousands of homes. https://t.co/dVqU41pEDf
— Liza Minor (@lizaminor) May 23, 2019
Siri and Alexa Reinforce Gender Bias, U.N. Finds https://t.co/6rPlACya2T
— Avery Swartz (@AverySwartz) May 23, 2019
I gotta be honest, I am so happy to be alive to witness this transformation in the way we see gender. I thought it would never happen, but I am here for it! #implicitbias #gendermatters #timesup https://t.co/G38EAr7g4y
— Angela Jarman (@JarmanAF) May 23, 2019
Well, I went on a bit of a rant and the @nytimes wrote precisely what I said about bias in AI and the lack of diversity in AI and how the subject needs to fundamentally change. Glad @WLinAI https://t.co/xjNL7NAmjE
— Dr Allison Gardner (@AllisonCGardner) May 23, 2019
"United Nations -- More people will speak to a voice assistance machine than to their partners in the next five years, the U.N. says..."https://t.co/HfYWk3ivlC
— Lauren Kunze (@laurenkunze) May 23, 2019