YouTube not limiting conspiracy videos after Parkland [www.fastcompany.com]
Are you a robot? [www.bloomberg.com]
Google and YouTube executives ignored warnings on toxic video content, now we're all paying the price [boingboing.net]
YouTube reportedly ignored employee warnings in order to grow faster [www.businessinsider.com]
YouTube execs ignored "toxic" content concerns - Bloomberg [seekingalpha.com]
YouTube Says 'Extreme' Videos Don't Do Well—So What Are These? [gizmodo.com]
YouTube, Like Facebook, Ignored Toxicity Warnings in Favor of 'Engagement' [www.extremetech.com]
Bloomberg: YouTube execs ignored warnings for engagement [9to5google.com]
One of the most telling parts, for me, is Project Bean, the effort that didn't come to pass. Some involved diagnosed it as benign neglect; others cited paralysis and a fear of big changes. YouTube's business is complicated! pic.twitter.com/OsH73yZL6b
— Mark Bergen (@mhbergen) April 2, 2019
To me this is one of the biggest questions about Google/YouTube: they know perfectly well what their algorithms favor and disfavor. Either they don’t care or they see it as a desirable outcome. https://t.co/yl4vEuYRsx
— Robert Cruickshank (@cruickshank) April 2, 2019
YouTube has neglected to take down neo-Nazi propaganda videos on the website, even after we specifically alerted them https://t.co/tmXOeYi3XI
— Motherboard (@motherboard) April 2, 2019
“We may have been hemorrhaging money...but at least dogs riding skateboards never killed anyone.” https://t.co/kQ8jBlMW58
— Brandy Zadrozny (@BrandyZadrozny) April 2, 2019
Infuriating and detailed read on YouTube's repeated rejections of policies designed to combat misinformation in favor of maximizing user engagement. https://t.co/vM6hQWIWdt
— Justin Brookman (@JustinBrookman) April 2, 2019
This is the most damning piece to date on Susan Wojcicki and YouTube's upper management, whose North Star focus on engagement blinded them to some pretty awful behavior. https://t.co/r1709it2BJ
— Ryan Mac (@RMac18) April 2, 2019
This is all outrageous. People in their own company were warning them about this, and yet...
— Louisa (@LouisatheLast) April 2, 2019
Also, it’s insulting to compare YouTube to the library. Libraries have librarians. Librarians CURATE information. https://t.co/5xekIeo1rV
I still can’t believe that google thinks it’s doing anything to filter out radicalization by making people do NSAT surveys on infowars videos https://t.co/HmElqknc3L
— 100% clean soup (@vogon) April 2, 2019
According to this article, if the Alt-Right were a category on YouTube, it would be as popular as Music, Sports & Gaming. https://t.co/UrSAw8M6zo
— Katie Graham (@K80Blog) April 2, 2019
Incredibly damning https://t.co/B0mwFwWyH6
— Dieter Bohn (@backlon) April 2, 2019
As Facebook invests in addressing toxic content and literally begs for regulation, the media is going to increasingly focus on YouTube and their 🤷‍♂️ attitude to extremist content.
— Dare Obasanjo (@Carnage4Life) April 2, 2019
They are where FB was before 2016 election, mindlessly oblivious. https://t.co/RiAo1ufJvd
“Bad actors quickly get very good at understanding where the bright lines are and skating as close to those lines as possible" https://t.co/VhlsVvY6Vl
— Oliver Darcy (@oliverdarcy) April 2, 2019
Must read from @mhbergen here, with some damning details on the failure of YouTube’s top management to stop the spread of conspiracy theories and hate speech from the site. https://t.co/5bc8NQhU72
— Sheera Frenkel (@sheeraf) April 2, 2019
Brutal, important look into YouTube by @mhbergen. Bingo -> “Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.” https://t.co/HmVoQoa2RJ
— Jason Kint (@jason_kint) April 2, 2019
“Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.” https://t.co/hvXeXehwJb
— Donie O'Sullivan (@donie) April 2, 2019
This @mhbergen investigation into YouTube is one of the most important things about the company I’ve ever read. Just chockablock with damning revelations https://t.co/VKqbuby2rt pic.twitter.com/Vm8iW1JeRL
— Casey Newton (@CaseyNewton) April 2, 2019
New story: I spent weeks talking to folks who've worked at YouTube (and Google) about how the company has wrestled with recommendations, conspiracy theories and radicalism. https://t.co/FHmpHPyaz3
— Mark Bergen (@mhbergen) April 2, 2019
I frequently talk to people who think reforming a platform like YouTube is too complicated, that it would require some kind of impossibly difficult internal process. But the reality is more straightforward: it requires hard decisions by leadership that put people before profits. https://t.co/j0C01546tu
— frederick (@fredbenenson) April 2, 2019
It lays out in great detail how YouTube pursued ‘engagement’ at all costs, introducing a fig leaf ‘responsibility’ metric only after years of internal pressure from worried employees went nowhere.
— Casey Newton (@CaseyNewton) April 2, 2019
An incredibly damning piece about how YouTube debated for years about its radicalization problem internally, while still publicly denying its impact even until last week. https://t.co/A9yzE7Ir9O
— Ben Collins (@oneunderscore__) April 2, 2019
My latest: @YouTube acknowledges militant neo-Nazi content online (including Atomwaffen Division propaganda) but does nothing. A huge difference from how it deals with ISIS on its platform, even in the wake of white nationalist terrorism like Christchurch. https://t.co/KWooc20sai
— Ben Makuch (@BMakuch) April 2, 2019
This morning's must-read: https://t.co/Kx8uOoyynT
— Brian Stelter (@brianstelter) April 2, 2019
Comprehensive dive into YouTube, and its internal struggle to contain the worst stuff on the world's largest video site, by @mhbergen https://t.co/om2BPMrBkc
— Peter Kafka (@pkafka) April 2, 2019
I cannot believe Susan Wojcicki was pitching Sundar on an engagement-based payment system for creators as late as 2017 pic.twitter.com/oOxgHIukYi
— Casey Newton (@CaseyNewton) April 2, 2019
Wow YouTube's recommendations are bad for exactly the reason I guessed: seeking engagement over everything, even if that meant recommending infinite garbage. The fact that lower-level employees recognized the problem with that but were shut down is just worse. https://t.co/boQOuwqHUQ
— ultraklystron (@karlrolson) April 2, 2019
Proposals to change recommendations & curb conspiracies on @YouTube were ignored or sacrificed for engagement, "scores" of YouTube & @Google staff told @Business. https://t.co/CLfAQnMXcM Optimizing for views comes at a price when videos are overt misinformation or disinformation. pic.twitter.com/ztNaOQ2p8M
— Alex Howard (@digiphile) April 2, 2019
"The AI did it" is no longer an adequate excuse, never should have been in the first place.
— Christopher Mims (@mims) April 2, 2019
Hold companies accountable for the results of their decision-making systems, not the intent of the engineers who built them.
We should know by now: Algorithms have their own intentions. https://t.co/C6wMBvAFQ3
YouTube has since shifted its model -- the OKRs even! -- to "responsible growth." New figures here: the millions that see its information panels on conspiracies and take satisfaction surveys. But as a good @kevinroose interview shows, this change is hard to grok https://t.co/ADA2bovRzt
— Mark Bergen (@mhbergen) April 2, 2019
YouTube has gotten a lot of shit in the press over the past few years for not curbing destructive/inflammatory/conspiracy videos. It also got a lot of shit internally from employees, who then felt stymied/unheard, @mhbergen found. https://t.co/wRbF9W8hZE pic.twitter.com/I5HizD6dfm
— Amir Efrati (@amir) April 2, 2019
Unbelievably damning indictment of YouTube leadership. People have *died* over this stuff and they still mince words about it. https://t.co/5aI7bWcshb
— Anil Dash (@anildash) April 2, 2019
It's been hard to tell where YouTube's line is for extremist content. Here's one line: alerted to neo-Nazi propaganda, YT demonetized these videos but left them up: https://t.co/O1EMM3I6bW
— ᴅᴇʀᴇᴋ ᴍᴇᴀᴅ (@derektmead) April 2, 2019
YouTube knows exactly who the Nazis are, and allows them to continue using the platform because it’s profitable for them. https://t.co/mZL1xIw9g1
— AntiFash Gordon (@AntiFashGordon) April 2, 2019
"The company spent years chasing one business goal above others: “Engagement,” ...Conversations...reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement."
— ashley ✨ (@AshleyEsqueda) April 2, 2019
Numbers aren't everything and any company who says so is bad. https://t.co/xkoa29Cw5r
"In recent years, scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world's largest video site surfaced and spread." https://t.co/C1OEEQse9l
— Tom Giles (@tsgiles) April 2, 2019
"Don't find the worst shit on our website, because we may get sued if you know about it" is a pretty fucked up incentive pic.twitter.com/NEXvM42JPI
— Ryan Mac (@RMac18) April 2, 2019
The problem with the social internet, IMO, is metrics. They're almost always a false indicator -- shock rather than quality -- but because businesses are built on KPIs, they will always manage by any given numbers, even bad ones.
— Heidi N. Moore (@moorehn) April 2, 2019
https://t.co/peTyXPb6BR
Good look into what went wrong with YouTube’s content and how profit and legal protections have complicated the fixes.
— ⊆ ∀ ® ¬ ⊜ ∫ (@iheartdogfarts) April 2, 2019
“…in the race to one billion hours, a formula emerged: Outrage equals attention.” https://t.co/5boPsWxRLa
NEW from @BMakuch: YouTube continues to leave neo-Nazi propaganda, podcasts, and audiobooks online. When alerted to the existence of these videos by Motherboard, the platform decided to leave them up: https://t.co/PG8WCb2fTC
— Jason Koebler (@jason_koebler) April 2, 2019
Here are the key takeaways from Bloomberg's investigation of YouTube https://t.co/EjhiuOhBv4 via @technology
— Alistair Barr (@alistairmbarr) April 2, 2019
Google and YouTube executives ignored warnings on toxic video content, now we're all paying the price https://t.co/1e4FbQyrOn
— Xeni Jardin (@xeni) April 2, 2019
"Engagement" is not just a metric; it is the raw material for the AI that arranges content. Now each person's front page lays bare exactly where they stand. Responsibility for the low-quality content falls on them as well.
— Goodhyun 김국현 (@goodhyun) April 3, 2019
Platform = media – responsibility?
Naver News, which always envied YouTube and Facebook, decided as of yesterday to follow suit. https://t.co/uBCMWFBj48
YouTube: criticized less often than others, but similarly ignorant. "YouTube's problem is that it allows the nonsense to flourish. (...) Conversations with over 20 people reveal a corporate leadership unable/unwilling to act for fear of throttling engagement." https://t.co/TAsZxlVoeE
— Stefan Ottlitz (@hierprivat) April 3, 2019
Bloomberg: YouTube execs ignored warnings for engagement - 9to5Google https://t.co/JqDS3naR6D
— WhitestoneDome ES (@whitestoneES) April 2, 2019