The video algorithm: Facebook vs. YouTube [marketingland.com]
YouTube now bans minors from livestreams [mspoweruser.com]
YouTube Bans Livestreams From 'Younger Minors' [www.thewrap.com]
YouTube Increases Efforts To Protect Minors And Families [deadline.com]
An update on our efforts to protect minors and families [youtube.googleblog.com]
YouTube recommends videos of underage girls [www.cnbc.com]
YouTube pushes children's videos to pedophiles through content recommendation engine [boingboing.net]
Youtube has officially banned kids under 13 from streaming “unless accompanied by an adult,” which makes sense but doesn’t address one of the most ethically uncomfortable parts of YouTube, which is parents & businesses profiting from kids https://t.co/vhek7464hV
— caleb zane huett (@CZaneH) June 3, 2019
Damning story and revealing exchange: kids vids, unlike "borderline content," are too big to get cut off the recommendation machine. https://t.co/SnPxfxY8DR
— Mark Bergen (@mhbergen) June 3, 2019
"The extraordinary view counts — sometimes in the millions — indicated that the system had found an audience for the videos and was keeping that audience engaged." https://t.co/poNOpcfZgy pic.twitter.com/ekpqPg7mf5
— Christopher Ingraham (@_cingraham) June 3, 2019
"Best minds of our generation" have developed machine learning algorithms that connect pedophiles and amplify white-supremacists, misogynists and anti-vaxxers. Piece says YouTube wouldn't turn off recs because the company said "recommendations are the biggest traffic driver." pic.twitter.com/mrCnkTaHjA
— zeynep tufekci (@zeynep) June 3, 2019
Each video might appear innocent on its own, a home movie of a kid in a two-piece swimsuit or a nightie. But each has three common traits:
— Max Fisher (@Max_Fisher) June 3, 2019
• the girl is mostly unclothed or briefly nude
• she is no older than age 8
• her video is being heavily promoted by YouTube’s algorithm
YouTube’s algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together.
— Max Fisher (@Max_Fisher) June 3, 2019
Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn’t say it was.
Advertisers saying this kind of stuff is unacceptable and then continuing to buy ads unchanged is the "thoughts and prayers" of digital media https://t.co/Z5XaYYVa93
— Brian Morrissey (@bmorrissey) June 3, 2019
We talked to one mother, in Brazil, whose daughter had posted a video of her and a friend playing in swimsuits. YouTube’s algorithm found the video and promoted it to users who watched videos of other partly-clothed prepubescent children.
— Max Fisher (@Max_Fisher) June 3, 2019
Within a few days of posting, it had 400,000 views
Can recommendation algorithms powered by ML be saved? Totally unclear to me that they can. https://t.co/i8zlFWI6BN
— Annemarie Bridy (@AnnemarieBridy) June 3, 2019
Here we go again.
— Sleeping Giants (@slpng_giants) June 3, 2019
Advertisers, when will you finally bail on this horrendous platform? When will you demand that they finally clean up their act? https://t.co/fKOTtaDJZg
YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found.
— Max Fisher (@Max_Fisher) June 3, 2019
YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation https://t.co/zNwsd9UsgN
Hard to read this and my story about Soph, the 14-year-old hate vlogger who got started at 9, and not conclude @youtube is a danger to kids https://t.co/Ih9TSGeURl
— Joe Bernstein (@Bernstein) June 3, 2019
we all deserve a better internet, where chasing engagement and clicks regardless of user well-being is no longer the default policy
— ashley ✨#pride (@AshleyEsqueda) June 3, 2019
companies, build better products - you're hurting people (even inadvertently) and it has to change https://t.co/JN0uZEweie
YouTube's algorithm has been compiling videos of young, partially clothed children and serving them up to people who watch sexual content on the platform. https://t.co/poNOpcfZgy
— Christopher Ingraham (@_cingraham) June 3, 2019
I asked YouTube: why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically.
— Max Fisher (@Max_Fisher) June 3, 2019
The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe.
YouTube isn’t just monetizing the malicious slander of women.
— Brianna Wu (@BriannaWu) June 3, 2019
They’re not just radicalizing a generation of young men into extremism.
They are literally turning innocent kids into fodder for pedophiles. https://t.co/0o6olStmli
Congratulations to all executives and software developers at @YouTube. Your recommender algorithm has figured out how to algorithmically curate sexualized videos of children and to progressively recommend them to people who watched other erotic content. https://t.co/eOHScAWomz pic.twitter.com/5oY9ol4Kau
— zeynep tufekci (@zeynep) June 3, 2019
If you have young kids, read this. https://t.co/HehrTs1Jzk
— Amarnath Amarasingam (@AmarAmarasingam) June 3, 2019
whatever happened to the twitter account that just tweeted about how all technology is good and it's silly to worry about malign consequences? https://t.co/ZFybn6lEfc
— Ben Walsh (@BenDWalsh) June 3, 2019
EVEN the Failing New York Times is forced to admit, Youtube/Google have been promoting pedophile content on their platforms.
— Suzy is NOT a Russian bot (@suzydymna) June 3, 2019
AG Barr is really shaking up the perverted Deep State, their time is drawing to a close. https://t.co/FHnMCi1ndp
This is appalling https://t.co/sxrGdBrtUC
— Josh Hawley (@HawleyMO) June 3, 2019
YouTube is a fascinating moderation case study because you could eliminate the source of almost all criticism by changing a single discrete feature that is not part of the core service but is completely off-limits because it drives traffic https://t.co/dug3vBm0ZL
— Adi Robertson (@thedextriarchy) June 3, 2019
On YouTube’s Digital Playground, an Open Gate for Pedophiles ... yet another example of an algorithm out of control. Via The New York Times https://t.co/afuMn3854y
— Ryan Heath (@PoliticoRyan) June 3, 2019
If Twitch did this we would see Fortnite streams decrease by 80%. https://t.co/MdgtdNxp9k
— G2 onscreen (@onscreenlol) June 3, 2019
YouTube has announced today that they are increasing their efforts to protect minors and families https://t.co/dbHVHcA5b9
— Deadline Hollywood (@DEADLINE) June 3, 2019
Here's an overview of everything we are doing to protect minors and families, including limiting recommendations: https://t.co/4gW7qvH9W8
— YouTubeInsider (@YouTubeInsider) June 3, 2019
In response to @nytimes article on @Youtube Child Exploitation, YT posted a new blog entry. They re-affirm their commitment to child safety & go over the new steps they have taken (restricting live features, disabling comments & reducing recommendations) https://t.co/yumFztY4YQ
— Pescatore? (@JoshPescatore) June 3, 2019
YouTube just issued a blog post on the most recent story: https://t.co/wrNLOVFlqR https://t.co/7fNFUqasoX
— Julia Alexander (@loudmouthjulia) June 3, 2019
YouTube pushes children's videos to pedophiles through its awful, horrible, reckless, and highly profitable content recommendation engine. @max_fisher & @amandataub at @nytimes, super important story. https://t.co/ZQXluVcjcg pic.twitter.com/y41G1EpMac
— Xeni Jardin (@xeni) June 3, 2019
YouTube pushes children's videos to pedophiles through content recommendation engine https://t.co/MvtXOuHb5B pic.twitter.com/yQW8Id5V7s
— Masque of the Red Death (@doctorow) June 3, 2019
From the "Holy shit is this wrong!!" files: https://t.co/2TsNgsusa9
— Jason Weisberger (@jlw) June 3, 2019
The video algorithm: Facebook vs. YouTube: https://t.co/GjZ759dgwm #digital #marketing #analytics #alexavery
— Alex Avery ??? (@alexavery) June 4, 2019
The video algorithm: Facebook vs. YouTube https://t.co/WRqDVyzioD pic.twitter.com/X1LHOwx8T1
— John Lincoln (@johnelincoln) June 3, 2019
youtube puts profit ahead of fixing its pedophile problem. it's disgusting. this from @max_fisher & @amandataub: https://t.co/AvJYRyaKMl pic.twitter.com/m95qF8nkxc
— David Farrier (@davidfarrier) June 3, 2019
This story is horrifying. And a clear example of what happens when the creators of algorithms don’t consider negative externalities. https://t.co/GNCw4hhkz5
— Derek Powazek ? (@fraying) June 4, 2019
terrifying. https://t.co/KgFuYQQV2o
— Christina Ginn (@NBChristinaGinn) June 4, 2019
YouTube is changing streaming age requirements, which doesn't shock me since the TOS already requires you to be 13 to have an account. https://t.co/t8q1ujtRbJ #YouTube #account
— Andrew Kan (@AndrewKanFilm) June 3, 2019
More info here! https://t.co/t8q1ujtRbJ https://t.co/3iUIInyDOZ
— Andrew Kan (@AndrewKanFilm) June 3, 2019
Tubes bans minors from streaming. https://t.co/TPJUEcgwt7
— LinuxGameCast (@VennStone) June 3, 2019
Powered by advertising - YouTube pushes children's videos to pedophiles through content recommendation engine https://t.co/Y9g8tNCG2p
— Mass. Pirate Party (@masspirates) June 3, 2019