Apple’s new acqui-hire shows it’s still gearing up for autonomous driving [www.siliconrepublic.com]
App that can remove women's clothes from images shut down [www.bbc.com]
DeepFake Nudie App Goes Viral, Then Shuts Down [syncedreview.com]
A terrifying app for making any woman appear naked was killed off by its creator [edition.cnn.com]
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline [www.vice.com]
$50 DeepNude app undresses women with a single click [nakedsecurity.sophos.com]
Deepfake app that turned photos of women into nudes yanked after going viral [www.techspot.com]
DeepNudes: The outrageous app that turned women's pictures to nudes pulled down [www.firstpost.com]
This Horrifying App Undresses a Photo of Any Woman With a Single Click [www.vice.com]
DeepNude creator thinks the world 'is not ready' for misogynistic app [memeburn.com]
A programmer created a $50 app that uses neural networks to remove clothing from any woman and make them appear naked.— Lorenzo Franceschi-Bicchierai (@lorenzofb) June 26, 2019
It's called DeepNude and "it swaps clothes for naked breasts and a vulva, and only works on images of women." https://t.co/IdBolhL7qh
after deepfakes broke, each of the biggest social platforms were forced to rethink their rules on revenge porn and misinformation. retroactively, hateful and harmful communities were taken down, too.— Samantha Cole (@samleecole) June 27, 2019
this is never the *goal* but it's something we do think a lot about.
the ONLY upside here I see, is that if you do have nudes exposed (and due to stigma in our society that can deeply harm women on many levels), you can claim it's a deep fake.— Emma Evans (@TrancewithMe) June 28, 2019
Same for women in sex work who are exposed.
Shitty silver lining but :/ https://t.co/QxnbMQxksr
we saw this happen with deepfakes, too: when we reported on that tech, a lot of people were (rightfully) worried that we were amplifying yet-unheard of harm.— Samantha Cole (@samleecole) June 27, 2019
but suddenly, under all that scrutiny, this little hobby isn't so much fun anymore. https://t.co/J1JwTM7ZTN
DeepNude's creator was basically preparing to run a press circuit on this.— Samantha Cole (@samleecole) June 27, 2019
he approached editors. he was on reddit, urging people to try, and share.
setting the tone as early as possible—that this exists, and is potentially deeply harmful—is often really crucial.
Deep Nude app withdrawn from market by makers. Glad it’s down but this is a perfect example of why inclusivity matters in product design. No one at this company seriously considered the immense likelihood of app being weaponized against girls and women. https://t.co/Xp5dyWjtYd— Soraya Chemaly (@schemaly) June 28, 2019
Good that this horrifying app was taken down, but it can still be used by those who have already downloaded it.— NDI Gender & Women (@NDIWomen) June 28, 2019
This is why @RepSpeier's #SHIELD Act to prevent online exploitation is so important. #NotTheCost @mlkrook @schemaly @seyiakiwowo https://t.co/gHqGA4ubmt
#deepfakes have worrying implications for our democracy. But we know who'll get the sharp end of this new technological stick:— Dr. Ann Olivarius (@AnnOlivarius) June 28, 2019
Women and girls. Their harassers can just create their own nudes now. #revengeporn https://t.co/lBfoJFf8sU @daniellecitron @ma_franks @cagoldberglaw
New: an app that uses neural networks to remove clothing from the images of women making them look realistically nude. The $50 app, called DeepNude, "dispenses with the idea that deepfakes were about anything besides claiming ownership over women’s bodies" https://t.co/vKOA3HAJPR— Joseph Cox (@josephfcox) June 27, 2019
Motherboard Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline: An app that algorithmically "undressed" images of women was taken down by its creator, citing server… https://t.co/Oy6dwBNYFo #programming #algorithm #deepfake #deepnude Via @motherboard pic.twitter.com/y1GDTFXmiq— Bradley Jon Eaglefeather (@bjeaglefeather) June 29, 2019
$50 DeepNude app undresses women with a single click: "I'm not a voyeur, I'm a technology enthusiast,” says the creator, who combined deepfake AI with a need for cash to get ka-CHING! https://t.co/dCeeH4VG8j #Fakenews #Governmentsecurity #MachineLearning @Cygnacom pic.twitter.com/ljFYi5sVlp— Debra Baker (@deb_infosec) June 28, 2019