The ideal scenario is to be able to algorithmically detect and eliminate child pornography, without a human looking at any of it.
— Max Nordau (@MaxNordau) August 30, 2022
Unfortunately, algorithms aren’t perfect, so image/video hosting sites end up clearing illegal content but accidentally flagging legal content. https://t.co/krT1w8A2Hi
These seem like valid questions that the article is not clear about. https://t.co/uyA1ST0zOA
— Mike Masnick (@mmasnick) August 30, 2022
We want to know more. Are you a current/former Tweep who used Twitter’s CSE tools, including RedPanda? Or did you work on adult content monetization? DM or email casey@platformer.news
— Casey Newton (@CaseyNewton) August 30, 2022
The fine folks who run this website wanted to monetize porn on it but they didn’t because they figured out it was impractical to moderate the sheer amount of rape and abuse material on the site they already had.
— American Solidarity Party 🧡 (@AmSolidarity) August 30, 2022
Think about that for a moment. https://t.co/JkbnCqPO7C
Ultimately those fears helped to scuttle a project that could have generated billions for Twitter.
— Casey Newton (@CaseyNewton) August 30, 2022
“We have weak security capabilities to keep the products secure,” the red team wrote.
what a headline.
— Matt Navarra (@MattNavarra) August 30, 2022
the damaging body blows just keep on landing at Twitter HQ https://t.co/KcTE38P3vC
Would the existence of a paid adult content section on Twitter make the platform’s existing CSAM worse or would it increase their likelihood of attracting the legal threats OnlyFans and Pornhub face? In other words: is this pro-safety or pro-not paying to deal with more lawsuits?
— Melissa Gira Grant (@melissagira) August 30, 2022
Scoop w/ @CaseyNewton: Earlier this year, Twitter explored monetizing adult content — a feature that would have made it an OnlyFans competitor. But the project was shut down after the company realized it had a problem: child sexual exploitation material. https://t.co/VsblkEsdaT
— Zoë Schiffer (@ZoeSchiffer) August 30, 2022
1. CSAM scanning an adult service with 10-100k producer accounts and a business relationship with each: sensible.
— Matthew Green (@matthew_d_green) August 30, 2022
2. Scanning a social network with 330M pseudonymous accounts: hard and of unknown benefit, but fine.
3. CSAM scanning all private data and messages in the world: 😩
This story is crazy.
— Allie “I’m Still Here” Awesome (@AllieAwesome415) August 30, 2022
How Twitter’s child porn problem ruined its plans for an OnlyFans competitor https://t.co/kAzRN2aMnY via @Verge
As an adult creator, this problem could have been solved with implementing ID/verification for those wanting to sell adult content.
— Gwen Adora ⚡ (@GwenAdora) August 30, 2022
Sex workers have long had solutions for it, and while I don't agree there'd be growing pains, it's sad external factors stopped this project. https://t.co/1Jidazba5y
Gotta imagine Google and Apple's app stores would not have wanted this either, even if Twitter was perfect at removing CSAM. https://t.co/6NTIYFiXHo
— Steve Kovach (@stevekovach) August 30, 2022
monetized twitter porn is an insane move lmao i have to respect whoever on the product team is going balls to the wall this year https://t.co/4AvKBpVdv7
— stephanie (@isosteph) August 30, 2022
NEW: Twitter was readying an OnlyFans competitor this year, until a red team intervened and said it would be irresponsible. The reason: Twitter’s ongoing struggle to remove child sexual exploitation material from the platform https://t.co/ScF3Qh5Xo6 @ZoeSchiffer + me
— Casey Newton (@CaseyNewton) August 30, 2022
One of the big social platforms is gonna come out with an OF competitor eventually and it’s gonna be weird. https://t.co/QTnjLKu3Wa
— Michael Kaplan (@OfficialKappy) August 30, 2022
Twitter is reportedly lagging far behind its peers when it comes to the automated moderation of CSE content.
— Damon Beres 🦇 (@dlberes) August 30, 2022
"Executives are apparently well-informed about the issue, and the company is doing little to fix it." https://t.co/fQkkGOixes
Nothing in the findings should have surprised Twitter executives. An internal report from February 2021 found that the company’s outdated, largely manual systems for tracking CSE were failing to keep pace with a huge spike in prevalence
— Casey Newton (@CaseyNewton) August 30, 2022
I like how we talk about these things as if it’s completely anodyne for one of the most consequential social media platforms to want to get into the porn monetization business, only to be stopped by the rampant child sexual exploitation which flourishes on their platform. https://t.co/MLHB4Imz0S
— Rachel Bovard (@rachelbovard) August 30, 2022
“What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company…cannot accurately detect child sexual exploitation and non-consensual nudity at scale…” https://t.co/FwWJv1uzWV
— Kim Zetter (@KimZetter) August 30, 2022
at least it means twitter won’t be cynically capitalizing on sex workers’ censorship from their own platform by segregating them in a red light district the platform can profit from https://t.co/H9lHcGJhsw
— Melissa Gira Grant (@melissagira) August 30, 2022
Wish more media orgs (some?) would run red teams to see how their processes would respond to various forms of manipulation. https://t.co/fE8kgDNHgy
— Michael Morisy (@morisy) August 30, 2022
Huge story from @ZoeSchiffer and @CaseyNewton: Twitter had plans to monetize adult content with an OnlyFans competitor... but scrapped it after an internal "red team" found that its ability to remove child porn was limited and inconsistent https://t.co/wLcJSNnziW
— nilay patel (@reckless) August 30, 2022
this seems like a good advertisement for having a red team https://t.co/ngpRoSAFY8
— Will Oremus (@WillOremus) August 30, 2022
depressing that twitter has done so little to combat CSAM on its platform. not profitable, not fun and totally ignoring basic safety features. cool company. https://t.co/dNc6TvNlBf
— gabriel dance (@gabrieldance) August 30, 2022
@elonmusk more ammo 🚩 https://t.co/IwFWmYQAMj
— Jerry Capital (@JerryCap) August 30, 2022
This whole article is a bit weird. Policing CSAM on public social networks is good, and Twitter should be doing that (as opposed to scanning private data and messages.) But what does that have to do with verifying customer ages on a paid “OnlyFans” competitor service? https://t.co/sLc3mPfq93
— Matthew Green (@matthew_d_green) August 30, 2022
anyway, the fact that twitter convened a team of 84 to see how dicks would abuse such a platform and then agreed that they would abuse it, killing the launch is *exactly how we want things to work* so good job, twitter! https://t.co/Z315Rvmyfc pic.twitter.com/cXPpg9jHyJ
— Dan Hon (@hondanhon) August 30, 2022
I’ve argued that the combination of Super Followers and the fact that Twitter already allows hardcore pornography made it an obvious move that they compete with OnlyFans.
— Dare Obasanjo (@Carnage4Life) August 30, 2022
Turns out they had plans to do so but couldn’t police CSAM so shut down the effort. https://t.co/mZAcy8HZtF
this is a red herring. twitter didn’t drop adult content monetization, which would require age confirmation & biometric data like any other clip site, bc of child sexual exploitation material.
— 🌬Doctrix Snow (@MistressSnowPhD) August 30, 2022
twitter dropped ACM to avoid financial deplatforming & becoming NCOSE’s next target. https://t.co/DmwkPRWVNE
Twitter researchers had been clamoring about the issue for over a year. But the company ignored it to focus on user growth, according to 50+ pages of leaked internal documents.
— Zoë Schiffer (@ZoeSchiffer) August 30, 2022
In any case I don’t really see why having *paid* producer relationships (allowing manual address verification and other forms of KYC) would make Twitter’s CSAM problems worse than the current mess of anonymous accounts.
— Matthew Green (@matthew_d_green) August 30, 2022
With DSA in place, I imagine Twitter (+others) would have to do the type of analysis this "Red Team" did for CSAM as well as other risks. Regardless of whether or not they roll out a new feature/business model. Regulators+auditors would check assessment.https://t.co/36CtG1ggSF
— Julian Jaursch (@JJaursch) August 30, 2022
Can you imagine Twitter trying to launch an OnlyFans competitor on top of everything else going on at that company? 🫠 https://t.co/iWS2wZtX8l
— Kurt Wagner (@KurtWagner8) August 30, 2022
Intention without strategy is chaos… https://t.co/9n82tdll2D
— Kim Crayton ~ Antiracist Economist ~ She/Her ✊🏾💛 (@KimCrayton1) August 30, 2022
Not to start a debate but this is a legit use of a red team. Red Teams are not only for technical ops. Red Teams are to view problems/solutions from a different perspective (from the point of view of the adversary). @redteamjournal did a great job covering this as did @joevest https://t.co/euKIHjOhi6
— Jorge Orchilles 🦄 (@jorgeorchilles) August 30, 2022
This thread is good at noting the contradictions in this article here.
— Ashley Lake (@AshleyLatke) August 30, 2022
I think antis both outside and apparently inside the company are trying to make it seem allowing porn increases abuse - yet porn sites have less abuse than sites like facebook https://t.co/VkU6GC2L3o
A technology company not launching a product due to ethical concerns identified ahead of time. How novel! https://t.co/8c7eoe8PTv
— Olivia Solon (@oliviasolon) August 30, 2022