This is unbelievably disturbing. @telegram was already a toxic platform, giving the very worst of society, including nazis, terrorists and other extremists, a chance to amplify themselves, but this is another level.
How does @AppStore & @GooglePlay still distribute it? https://t.co/KpWQNVLfnH
— Sleeping Giants (@slpng_giants) October 20, 2020
UPDATE: A VK spokesperson told us, "Such communities have been blocked. We will run an additional check and block inappropriate content and communities."
The bot was also promoted using regular posts, not advertising tools. https://t.co/6ITKeXwvh7
— Jane Lytvynenko (@JaneLytv) October 20, 2020
"The automated service, however, only works on women" https://t.co/ewI1yjwFKJ
— Werewolf Bat Mitzvah (@EmGusk) October 20, 2020
As @wiczipedia said, this shows that deepfakes need to be addressed beyond fears of national security & politics.
The vast majority of deepfakes target women, which is "part and parcel of the broader abuse and harassment that women have to deal with." https://t.co/8Z5UOCZttO
— Jane Lytvynenko (@JaneLytv) October 20, 2020
Deepfake software that creates fake "nudes" of women has turned up on Telegram as monetized chatbots. Users submit photos (mainly of people they know, taken from social media), and the bot spits out an AI-generated nude. Details here: https://t.co/FzZdbLQEC6
— James Vincent (@jjvincent) October 20, 2020
I'm not linking to the bot, because I'm not a dirtbag, but many of the fake nudes are sadly convincing. The original photos came from unsuspecting women's selfies, Instagram and TikTok accounts. And some of those targeted were girls younger than 18 https://t.co/d3rbIb9g5F
— Drew Harwell (@drewharwell) October 20, 2020
Writing this turned my blood cold.
A Telegram bot allows men to create fake nude images of women from a single clothed photo. Over 680,000 women have been affected, with about 104,000 images shared publicly, per new research from @sensityai. https://t.co/8Z5UOCZttO
— Jane Lytvynenko (@JaneLytv) October 20, 2020
An artificial intelligence service freely available on the Web has been used to transform more than 100,000 women’s images into nude photos without the women’s knowledge or consent, triggering fears of a new wave of damaging “deepfakes” https://t.co/G1Xf9TBI1q
— Samwise Gamgee (@Sambannz) October 20, 2020
I found one woman, an art student, who'd had an image of her (in a tank top) taken from her Instagram and turned into a fake nude. She said, “I believe in karma, and what comes around for them won’t be any cleaner than their own actions” https://t.co/d3rbIb9g5F
— Drew Harwell (@drewharwell) October 20, 2020
I have warned about this technology for years. This can happen to any of us or your kids... https://t.co/B0b4SBvEVk
— Harrison Smith (@HarrisonSmith85) October 20, 2020
Nothing contradicts the idea that science is a (blind) search for "truth" more than this: https://t.co/e3Ed7QT7dA
— Rod Graham (@roderickgraham) October 20, 2020
An #AI Service Transformed Thousands Of #Women’s Photos Into #Fake Nudes Without Consent https://t.co/Z2LjXn5c4v #fintech #ArtificialIntelligence #MachineLearning #DeepLearning @anthonyraimond8 @IBTimes pic.twitter.com/V7fkyZDDX1
— Spiros Margaris (@SpirosMargaris) October 21, 2020