Mozilla Foundation - YouTube Regrets https://t.co/9CNcCx8I73
— lunamoth (@lunamoth) July 7, 2021
1. Most of the videos people regret watching are recommended videos.
2. YouTube recommends videos that violate its own policies.
3. Non-English speakers are hit the hardest.
4. YouTube Regrets can change people's lives forever.
When confronted with the toxic potential of its recommendation algorithm, @YouTube’s routine response is to deny and deflect.
— Mozilla (@mozilla) July 7, 2021
That’s why we conducted the largest-ever crowdsourced investigation into YouTube’s algorithm.
More about #YouTubeRegrets: https://t.co/RkBgVmoUhH pic.twitter.com/4frB7BR5Tg
Are your @YouTube recommendations sometimes lies? Conspiracy theories? Or just weird as hell?
— Mozilla (@mozilla) July 7, 2021
You’re not alone. That’s why we conducted a study to better understand harmful YouTube recommendations.
This is what we learned about #YouTubeRegrets ⤵️ https://t.co/oSDVb0d63b pic.twitter.com/i1CtnaYN06
YouTube's recommendation algorithm drives 70% of watch-time on the platform - it also suggests hateful, harmful, dangerous content.
— Mozilla (@mozilla) July 7, 2021
You can do better, @YouTube.
Read more at https://t.co/RkBgVmoUhH pic.twitter.com/hFZKrupLLS
Our study shows that @YouTube sometimes recommends videos that violate their own policies. #YouTubeRegrets
— Mozilla (@mozilla) July 7, 2021
Read more at https://t.co/RkBgVmoUhH pic.twitter.com/Vdirr5YknU
YouTube’s powerful recommendation engine continues to direct viewers to videos that they say showed false claims and sexualized content, with the platform’s algorithms suggesting 71% of the videos that participants found objectionable https://t.co/opaTcKVeY9
— Anthony DeRosa (@Anthony) July 7, 2021
“YouTube’s problem with recommending terrible stuff is indeed systemic; a side-effect of the platform’s rapacious appetite to harvest views to serve ads.” https://t.co/5OkePk5ypf
— John Wilander (@johnwilander) July 7, 2021
“…71 percent — of all the reports [of harmful content] came from videos recommended by YouTube’s algorithm, and recommended videos were 40 percent more likely to be reported than intentionally searched-for videos, according to the report.” https://t.co/szEZM6A0JG
— hypervisible dot pdf (@hypervisible) July 7, 2021
NEW
— ADL (@ADL) July 7, 2021
Disturbing report from our #StopHateForProfit partner @mozilla shows @YouTube's recommendation algorithms routinely feed users hateful, "bottom-feeding" content and misinformation. https://t.co/sKdSxTyFF4
Despite YouTube's documented history of pushing people toward extremist content and down harmful rabbit holes, we know so little about its recommendation algorithm. This is because YouTube doesn't want us to know. The latest evidence comes from @mozilla. https://t.co/SXus7C82nv
— Brandy Zadrozny (@BrandyZadrozny) July 7, 2021
The YouTube recommendation engine is like the rogue AI from a sci-fi movie that wants to kill all humans.
— Ron Amadeo (@RonAmadeo) July 7, 2021
Luckily instead of giving it the nuclear launch codes we only gave it some video site, but it's still trying to end humanity with the power it has. https://t.co/o0A15QiJ67
YouTube recommends videos that violate the platform’s own policies, study finds (story by @thomas_macaulay) https://t.co/9J7RGeAxC8
— TNW (@thenextweb) July 7, 2021
YouTube's recommendations still push harmful videos, crowdsourced study finds - @BrandyZadrozny on a @mozilla study: https://t.co/dUhYVpLx2O via @nbcnews
— Justin Hendrix (@justinhendrix) July 7, 2021
Oh YouTube... https://t.co/lsEfhRFEbL
— Liana 'I Like Stuff' Kerzner (@redlianak) July 7, 2021
Google continues to mine misinformation, violence and hate for profit on Youtube. Great to see this Mozilla effort to highlight the problem. https://t.co/gVNQjCUv40 pic.twitter.com/GbRB47cowx
— Jeremy Stoppelman (@jeremys) July 7, 2021