Facebook pays content reviewers a fraction of median salary: Report [www.cnbc.com]
The human cost of protecting Facebook [davelee.me]
Facebook content moderators speak out about difficulties of job [www.axios.com]
The same FB critics who call on the company to take on responsibility for moderating content (an operational job they don't want, and had to be pressed to perform), will of course be shocked, shocked at the human cost in reviewing billions of pieces of random content. https://t.co/GSzzA6k2Nt
— Antonio García Martínez (@antoniogm) February 25, 2019
This is a strawman. As @sivavaid has explained, Facebook’s impossible moderation problem is of its own making. The firehose of the worst humanity has to offer is not inevitable; it’s a result of Facebook’s architecture. Journalists are 100% right to call out both problems. https://t.co/hP2hwOb5h8
— Blake Reid (@blakereid) February 25, 2019
Facebook’s content moderators are poorly paid and managed, and have PTSD from the extreme content. Maybe most disturbing, some employees are embracing the conspiracies they’re meant to moderate, turning into 9/11 truthers and Holocaust deniers. This whole report is wild. https://t.co/uDuQZuS2Gv
— Brandy Zadrozny (@BrandyZadrozny) February 25, 2019
We are living in a dystopia https://t.co/9t8IMqIZdX pic.twitter.com/DMUrcUlhBP
— Mark Di Stefano (@MarkDiStef) February 25, 2019
Employees can be fired after making just a handful of errors a week, and those who remain live in fear of former colleagues returning to seek vengeance. One man we spoke with started bringing a gun to work to protect himself.
— The Verge (@verge) February 25, 2019
This is really cynical and I'm a cynic. It's possible to both want abusive content to be removed and to treat the people who make those content decisions with respect and humanity. https://t.co/vuqBNPAiSG
— Christina Warren (@film_girl) February 25, 2019
Finally, this specific example sounds more like a case of a facility that was mismanaged and completely out of control.
— J.Sack (@JayTSack) February 25, 2019
Last year when FB announced they were beginning this program, I messaged many recruiters and offered to help. I did not receive even one response.
Moderators cope with seeing traumatic images and videos by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. Moderators are routinely high at work.
— The Verge (@verge) February 25, 2019
i expected this to be bad but it was worse https://t.co/vBLf0AZ1cf
— Ted (@TedOnPrivacy) February 25, 2019
agree. it’s definitely fb critics’ fault facebook’s median salary is 9x that of one of these contractors https://t.co/dPt7CiWoyh
— brian feldman (@bafeldman) February 25, 2019
There's a roko's basilisk quality to conspiracy theories. Some are silly and some have plausible qualities. Stare at them day in and day out and one will worm its way in. https://t.co/Ke7cHa4S18
— Rob Manuel (@robmanuel) February 25, 2019
5| It is perfectly natural to believe some of the 'Conspiracy Theories' because some of them are true/warranted. By seeing these events happen in real-time, you become aware of how information is suppressed, amplified, or misstated by mainstream narratives.
— J.Sack (@JayTSack) February 25, 2019
the other ways platforms address similar needs sound, in this context, very morbid: they might create a new side to the market (a labor auction for mods?) or add, like, tipping. but hiding trauma behind a contractor with NDAs is about as morbid as it gets
— John Herrman (@jwherrman) February 25, 2019
I think the point is that a company as massively profitable as FB could afford to treat its moderators as human beings rather than disposable wage-slaves. https://t.co/820jgoRt1k
— Jeremy Konyndyk (@JeremyKonyndyk) February 25, 2019
I also spoke with employees on the site who told me they like their jobs, despite its challenges, and feel safe and supported at work. Not everyone emerges from this work with lasting trauma.
— Casey Newton (@CaseyNewton) February 25, 2019
I mean, if workers were poisoned cleaning up a toxic waste site, I think we could manage being outraged at both the conditions of the workers and the fact that the site was there in the first place. No one is denying that this is an immensely difficult problem! https://t.co/O03axk8eii
— Madeleine Varner (@tenuous) February 25, 2019
Brutal and important read, particularly as a reminder of how inextricable labor rights are from any meaningful response to the problems the tech behemoths are causing https://t.co/BFRlh8nMrp
— Lindsey Barrett (@LAM_Barrett) February 25, 2019
I also think that anybody writing the standard “I saw something online I don’t like and it should go away” story should consider that:
— Alex Stamos (@alexstamos) February 25, 2019
1) Somebody has to do that moderation work
2) Every content decision brings the possibility of mistakes
3) More moderation by tech == more power
Maybe "software margins" are the wrong goal for social media companies???https://t.co/kcGWjj6oSE
— Hunter Walk (@hunterwalk) February 25, 2019
Great report by @CaseyNewton https://t.co/PLAIN7ehXM
Extremely necessary dive into what it's like to be an underpaid, undersupported contract Facebook content moderator by @CaseyNewton. When I was at Facebook HQ, I took this photo, which reminded FB staff that contractors are, indeed, humans ... https://t.co/vCoIQKL2zs pic.twitter.com/BE1leSZXpD
— Jason Koebler (@jason_koebler) February 25, 2019
Read this and it’s hard to surmise anything but that social networks were a horrible mistake, not just in themselves, but what they’ve done to our culture.
— Sleeping Giants (@slpng_giants) February 25, 2019
Ughhhhhhhhhhhhhhhh. https://t.co/0yW1yfV8zV
Employees at a content-moderating company describe a workplace where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave.
— ProPublica (@ProPublica) February 25, 2019
These are the secret lives of Facebook moderators in America. https://t.co/gn7PGbmmy6
The Facebook mods who don’t quit from the trauma are radicalized by it https://t.co/tHyxuNscu4 pic.twitter.com/beMi9ciPFv
— Andy Campbell (@AndyBCampbell) February 25, 2019
I'm sympathetic to the view that *someone* has to look at horrible stuff on social platforms to ensure it doesn't spread further. But what Facebook is doing to these people is unconscionable. Great reporting, @CaseyNewton. https://t.co/U2vwp8Fxvq
— Kevin Roose (@kevinroose) February 25, 2019
this is super important work from @CaseyNewton
— Tony Romm (@TonyRomm) February 25, 2019
and even reading it horrifies me, i can't even imagine what it's like for those facebook actually employs https://t.co/NRHnF8YyNx
go ahead. keep screaming “fix the problem” at me. it only makes my labor practices Worse https://t.co/fkiHGdYZSc
— Bryan Menegus (@BryanDisagrees) February 25, 2019
But this call-center model — which is also used by Google, Twitter, and others — puts essential questions of speech and security in the hands of folks who are being paid as if they're doing customer service for Best Buy.
— Casey Newton (@CaseyNewton) February 25, 2019
3| Many of us talk about experiencing 'Post Traumatic Growth', in the same way that military personnel can. But this is a problem when returning to 'civilian life' where you're on 10 and everyone else is on 2.
— J.Sack (@JayTSack) February 25, 2019
This is a very good piece.
— Matt Klinman (@mattklinman) February 25, 2019
The internet doesn’t have to be organized this way.
There were once websites that had all the fucked up stuff on them. And, if you wanted that, you went there.
With no such websites, it all leaks onto the platforms. And this is the result. https://t.co/eNj5cjQXnP
I think there’s a lot of good debate around this today, but I have to say… when journalists report on sweatshops, do people say “well sorry that’s capitalism”? Maybe some, but not a lot. https://t.co/fbqV4gRuXj
— Jason Abbruzzese (@JasonAbbruzzese) February 25, 2019
1| The fact that these employees are 'Contracted Part-Time Labor', for $15/hr, is despicable. These people are exposed to the worst parts of humanity on a loop for 9 hours a day, they deserve to be paid like humans, receive benefits, and be able to support a family.
— J.Sack (@JayTSack) February 25, 2019
The most chilling part of this is not the snuff films or office sex. It's the part when Facebook puts up inspirational posters at the site the day before Casey arrives, and parades out employees to tell him how great their jobs are. Straight out of North Korea.
— Kevin Roose (@kevinroose) February 25, 2019
Employees have been found having sex inside stairwells and a room reserved for lactating mothers, in what one employee describes as “trauma bonding.”
— The Verge (@verge) February 25, 2019
2| Joking about the material is a strategy to cope with it. There are certain personality types that are unable to do this, and they will be the most affected by any PTSD symptoms.
— J.Sack (@JayTSack) February 25, 2019
In his latest, The @verge's @CaseyNewton takes a look at the secret lives of @facebook moderators in America.
— Vox Media (@voxmediainc) February 25, 2019
Casey is the first journalist Facebook has allowed to visit an American content moderation site since the company began building dedicated facilities here two years ago. https://t.co/TF1jubFax6
I was a 'Content Analyst' for over 4 years monitoring Twitter and have a few things to say about this.
— J.Sack (@JayTSack) February 25, 2019
Thread: https://t.co/QGgQFMMdpY
In stark contrast to the perks lavished on Facebook employees, team leaders micro-manage content moderators’ every bathroom break. Two Muslim employees were ordered to stop praying during their nine minutes per day of allotted “wellness time.”
— The Verge (@verge) February 25, 2019
Today I want to tell you what it's like to be a content moderator for Facebook at its site in Phoenix, Arizona. It's a job that pays just $28,800 a year — but can have lasting mental health consequences for those who do it. https://t.co/quieXG1Bm9 pic.twitter.com/cwtHXIqgol
— Casey Newton (@CaseyNewton) February 25, 2019
I thought everything that could be reported about Facebook's moderators had been reported but there is more. With Facebook, it seems there is always more. https://t.co/HQs3WQKkGn pic.twitter.com/s9WXM55sAb
— Kashmir Hill (@kashhill) February 25, 2019
This is a dumb argument. If people are drowning at a pool at alarming rates, you can simultaneously criticize the pool's owners and the conditions the overworked lifeguards face. https://t.co/vJwFUVpXIN
— Ryan Mac (@RMac18) February 25, 2019
Tremendous reporting by @CaseyNewton on working conditions at a secret Facebook content moderation facility in Phoenix, and appreciate the courage of the whistleblowers at Cognizant who talked to him. https://t.co/dglgFqOhN3
— Will Oremus (@WillOremus) February 25, 2019
So much messy work these days is done by humans while so much of the discussion around it is about AI. The multibillion dollar question is being answered: will AI capable of regulating platforms arrive before the laws requiring regulation? https://t.co/rDrt9XSigp
— Drew Breunig (@dbreunig) February 25, 2019
Facebook’s army of content moderators churn out the same kind of debilitating mental issues as you’d expect to see in soldiers returning from combat. It pays barely above minimum wage, bathroom breaks are tightly monitored, and the main reprieve is 9 mins/day of “wellness” ☠️ https://t.co/KnmB1RACaB
— DHH (@dhh) February 25, 2019
Moderators in Phoenix will make just $28,800 per year — while the average Facebook employee has a total compensation of $240,000.
— The Verge (@verge) February 25, 2019
It isn't an easy issue. These companies are quasi-governments, and just like governments they have cops, prosecutors, social-workers, judges and others who have to put their hands into the muck of humanity and might suffer for it.
— Alex Stamos (@alexstamos) February 25, 2019
My understanding is that there are legal complications (like ERISA) that encourage companies to hire these kinds of positions as contractors. Would be nice to see some public discussion of this and movement to making them full-time employees.
— Alex Stamos (@alexstamos) February 25, 2019
There is no fix for the business model of monetizing user generated content with personal data on monopoly platforms. There are only unpriced costs to humanity. https://t.co/U6PwA55EhJ
— David Carroll (@profcarroll) February 25, 2019
4| It is extremely isolating to realize that the vast majority of people cannot relate to what you went through, and have no idea what goes on in the world outside of their local environment and the mainstream news narrative. It took a few years for me to realize this.
— J.Sack (@JayTSack) February 25, 2019
Gob-smacking insight into Facebook's content moderation in the US from @CaseyNewton, where trainees have panic attacks and more experienced staff start believing conspiracy theories https://t.co/kUiTxbtwfX pic.twitter.com/b35uasX4Er
— Alastair Reid (@ajreid) February 25, 2019
I think Casey's piece is important. People need to know the human cost of policing online humanity, which I saw myself. FB's contractor arrangement makes caring for these people harder.
— Alex Stamos (@alexstamos) February 25, 2019
AGM has a point, however, about journalists having it both ways. Nice to not be responsible. https://t.co/ccWTdCARai
facebook is doing to boomers what working in a mercury-vapored hat factory did to 19th-century laborers
— drewtoothpaste (@drewtoothpaste) February 25, 2019
Maybe FB should take away the lesson that they cannot do this work on the cheap — that is, they are not paying anywhere near the full cost of curating their platform responsibly https://t.co/c1NbFhooEn
— Gady Epstein (@gadyepstein) February 25, 2019
All of my team members were FTE, and for the people who worked on counter-terrorism and child abuse there were serious long-term impacts. They also volunteered for that work because they wanted to make a difference.
— Alex Stamos (@alexstamos) February 25, 2019
6| The stress and anxiety is real, as managers need to hit 'data-driven numbers', and 'Key Performance Indicators', and don't care about the facts on the ground of your situation. Personally, I was assessing 10,000~ tweets/day, which breaks down to 'many' every 10 seconds.
— J.Sack (@JayTSack) February 25, 2019
Facebook pays the reviewers filtering porn and murder a tiny fraction of its median salary, explosive report says https://t.co/u60uAxPB1g
— Cyrus Farivar (@cfarivar) February 25, 2019
Current and former content moderators for Facebook tell The Verge that their experiences caused PTSD-like symptoms https://t.co/bhpSTaoaib
— Axios (@axios) February 25, 2019