App that can remove women's clothes from images shut down [www.bbc.com]
DeepFake Nudie App Goes Viral, Then Shuts Down [syncedreview.com]
A terrifying app for making any woman appear naked was killed off by its creator [edition.cnn.com]
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline [www.vice.com]
$50 DeepNude app undresses women with a single click [nakedsecurity.sophos.com]
Deepfake app that turned photos of women into nudes yanked after going viral [www.techspot.com]
DeepNudes: The outrageous app that turned women's pictures to nudes pulled down [www.firstpost.com]
This Horrifying App Undresses a Photo of Any Woman With a Single Click [www.vice.com]
DeepNude creator thinks the world 'is not ready' for misogynistic app [memeburn.com]
A programmer created a $50 app that uses neural networks to remove clothing from any woman and make them appear naked.
— Lorenzo Franceschi-Bicchierai (@lorenzofb) June 26, 2019
It's called DeepNude and "it swaps clothes for naked breasts and a vulva, and only works on images of women." https://t.co/IdBolhL7qh
after deepfakes broke, each of the biggest social platforms were forced to rethink their rules on revenge porn and misinformation. retroactively, hateful and harmful communities were taken down, too.
— Samantha Cole (@samleecole) June 27, 2019
this is never the *goal* but it's something we do think a lot about.
the ONLY upside here I see, is that if you do have nudes exposed (and due to stigma in our society that can deeply harm women on many levels), you can claim it's a deep fake.
— Emma Evans (@TrancewithMe) June 28, 2019
Same for women in sex work who are exposed.
Shitty silver lining but :/ https://t.co/QxnbMQxksr
this is simply infuriating: a programmer created an app to algorithmically "undress" women using any clothed image of her.
— Samantha Cole (@samleecole) June 26, 2019
we tested it, and the results are realistic enough. https://t.co/JcihR900cc
the creator of deepnude shut down the app, after @motherboard's report on it—his servers were crushed by the traffic, but he ultimately had a crisis of conscience about how people were using his creation: https://t.co/dQKkZHfq9V pic.twitter.com/P6ppXsD67U
— Samantha Cole (@samleecole) June 27, 2019
Male programmer: so what can I do with my skills?...let's make it easier to make fake nudes of women
— Erna Mahyuni (@ernamh) June 27, 2019
????? https://t.co/YXtlKCIydt
In today's heroes: @samleecole exposed a messed up app and within a day, the creators took it down https://t.co/D2QGTI00wt
— Ankita Rao (@anrao) June 27, 2019
The creator of the sick DeepNude app has shut it down. His response is still so maddening https://t.co/2ID66rZ4v0
— Siobhan Morris (@siomo) June 27, 2019
“Is this right? Can it hurt someone?”
— Ella (@latentexistence) June 26, 2019
“If I don’t do it, someone else will do it in a year.”
That’s a really shitty excuse to justify making something horrifying. https://t.co/ymCoka73wb
we saw this happen with deepfakes, too: when we reported on that tech, a lot of people were (rightfully) worried that we were amplifying yet-unheard of harm.
— Samantha Cole (@samleecole) June 27, 2019
but suddenly, under all that scrutiny, this little hobby isn't so much fun anymore. https://t.co/J1JwTM7ZTN
This is what happens when you don't make an Ethics course a mandatory part of your software programming curriculum or ongoing employee development https://t.co/otOPG3yp0g
— Tom Lommel (@tomlommel) June 27, 2019
anyway I'm done threading, this page is pleasing to see, that is all pic.twitter.com/Fg2ONzv3NR
— Samantha Cole (@samleecole) June 27, 2019
.@motherboard saving the internet once again https://t.co/8p6j1t9dD1
— Aaron W. Gordon (@A_W_Gordon) June 27, 2019
DeepNude's creator was basically preparing to run a press circuit on this.
— Samantha Cole (@samleecole) June 27, 2019
he approached editors. he was on reddit, urging people to try, and share.
setting the tone as early as possible—that this exists, and is potentially deeply harmful—is often really crucial.
Deep Nude app withdrawn from market by makers. Glad it’s down but this is a perfect example of why inclusivity matters in product design. No one at this company seriously considered the immense likelihood of app being weaponized against girls and women. https://t.co/Xp5dyWjtYd
— Soraya Chemaly (@schemaly) June 28, 2019
Good that this horrifying app was taken down, but it can still be used by those who have already downloaded it.
— NDI Gender & Women (@NDIWomen) June 28, 2019
This is why @RepSpeier's #SHIELD Act to prevent online exploitation is so important. #NotTheCost @mlkrook @schemaly @seyiakiwowo https://t.co/gHqGA4ubmt
App that can “undress women” taken offline https://t.co/QMx4gKGONC
— Asher Wolf (@Asher_Wolf) June 28, 2019
App that can remove women's clothes from images shut down:
— Jerome Elam (@JeromeElam) June 28, 2019
One campaigner against so-called revenge porn called the app "terrifying". https://t.co/S0Ldlch974 @BoysAreNot4Sale #everyvictimmatters #MonstersHidingInPlainSight...
I see VICE have stripped their celeb nude imagery from the story. Good.
— Kevin Beaumont (@GossiTheDog) June 28, 2019
Here’s a Beeb piece on it. https://t.co/XXTOrLvSNr
Really terrifying..
— Dr.Arfana Mallah (@Arfanamallah) June 28, 2019
'Terrifying' app that can undress women taken offline https://t.co/AqIon7NUFC
The service seems to have been suspended now because of the excessive attention it drew, but image-processing apps that exploit AI like this will likely keep being produced, so I think countermeasures are necessary. https://t.co/a78MOu8WS1
— いっちー@精神科医 (@ichiipsy) June 28, 2019
Update: developer shuts down the app after Motherboard coverage, backlash https://t.co/fSaYVOUVGT
— Joseph Cox (@josephfcox) June 27, 2019
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline https://t.co/8vdabiWH7x via @vice
— DonDonタイムズ (@dondontimes) June 28, 2019
The creator of DeepNude has killed the app. https://t.co/oafDfz6ML6
— VICE (@VICE) June 28, 2019
The @deepnudeapp saw quick fame and quicker failure! After facing viral backlash, the creator says he does not want to be responsible for the technology. #Deepnudes #deepfakes https://t.co/guSZSHFjQC
— Tech2 (@tech2eets) June 28, 2019
[Content warning] News that a program has appeared that uses deepfake technology to generate nude images from photos of clothed women.
— 堀 正岳 (めほり)@「知的生活の設計」 (@mehori) June 26, 2019
When it's this realistic, it raises ethical problems that can't be brushed aside with "it's a fake image, not the real thing, so it's fine." But which laws would be applied, and how, to crack down on it? https://t.co/O8BjYUF54i
Yet again saddened by the lack of morals and ethics...guys do something for good with your talent https://t.co/hg6RP6a6yt
— Bill Matthews (@Bill_Matthews) June 28, 2019
A vicious application has been developed that can "strip a woman naked" with a single click. Since anyone can easily manipulate photos of women using AI image-processing technology, there is a real risk it will be used for slander and attacks against women. I strongly feel some form of regulation is needed. https://t.co/inn48uq95H
— いっちー@精神科医 (@ichiipsy) June 28, 2019
The DeepNude app shows what deepfakes were always about: claiming ownership over women’s bodies. https://t.co/DUBt4VWBR4
— VICE (@VICE) June 26, 2019
This stupid DeepNude app for desperate virgins, except instead of replacing a clothed woman with an unclothed woman's body, it replaces anyone's body with shirtless Alex Jones.https://t.co/0EoMkr9gV1
— Emily H E X A Crose (@hexadecim8) June 27, 2019
#deepfakes have worrying implications for our democracy. But we know who'll get the sharp end of this new technological stick:
— Dr. Ann Olivarius (@AnnOlivarius) June 28, 2019
Women and girls. Their harassers can just create their own nudes now. #revengeporn https://t.co/lBfoJFf8sU @daniellecitron @ma_franks @cagoldberglaw
News of a service that uses deep learning to generate nude images of women.
— Nakaji Kohki / リリカちゃん (@nkjzm) June 27, 2019
Naturally, questions of morals and ethics come up, but the developer argues:
- The same thing can be done with Photoshop
- It only generates images; how they are used is up to the user
- The technology is available to anyone, so someone will do it eventually
which cuts right to the heart of the matter. https://t.co/wPEAs6fLWN
New: an app that uses neural networks to remove clothing from the images of women making them look realistically nude. The $50 app, called DeepNude, "dispenses with the idea that deepfakes were about anything besides claiming ownership over women’s bodies" https://t.co/vKOA3HAJPR
— Joseph Cox (@josephfcox) June 27, 2019
"This is absolutely terrifying...Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo." https://t.co/RvL5nPBTDE
— Motherboard (@motherboard) June 27, 2019
The blackmail and scandal potential of nudes diminish as high quality fakes proliferate. There's nothing inherently wrong with nudity, only social taboos, which will be exhausted by overexposure. https://t.co/BBwituDCQS
— Jay Graber (@arcalinea) June 28, 2019
The innovation never stops https://t.co/bLkxwgBcxH pic.twitter.com/cB59YHgO7k
— Nick (@NickatFP) June 28, 2019
This is horrific. Like, career ending, relationship ruining, reputation destroying HORRIFIC. How is it legal?? https://t.co/bLQjmvcuOC
— Em Clarkson (@EmilyClarkson) June 27, 2019
The creator's reason for building this vile app, which turns a clothed photo into a nude with one click, was that "I realized AI could make the 'X-ray glasses' from those old ads a reality." It's darkly funny that the danger of AI turns out to be porn rather than a robot uprising, but when an hour in Photoshop becomes one click and 30 seconds, it's hard to even treat it as a crime, so what do we do about this? https://t.co/5EYLsbdqKU pic.twitter.com/KFLAIJNENe
— Podoro (@podoron) June 28, 2019
About to ruin Twitter forever... https://t.co/0lWz7HcjO3
— (((Cozy Tyger))) (@CoziestTyger) June 28, 2019
BBC News - App that can remove women's clothes from images shut down https://t.co/hP215ge05c
— ⚕️#Humanity#5#OneLove ☤ (@DrDiana7) June 29, 2019
Motherboard Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline: An app that algorithmically "undressed" images of women was taken down by its creator, citing server… https://t.co/Oy6dwBNYFo #programming #algorithm #deepfake #deepnude Via @motherboard pic.twitter.com/y1GDTFXmiq
— Bradley Jon Eaglefeather (@bjeaglefeather) June 29, 2019
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline https://t.co/yJz1PiE2AR pic.twitter.com/MaWVG3wV2M
— CarlosAnaC (@CarlosAnaC) June 28, 2019
$50 DeepNude app undresses women with a single click: "I'm not a voyeur, I'm a technology enthusiast,” says the creator, who combined deepfake AI with a need for cash to get ka-CHING! https://t.co/dCeeH4VG8j #Fakenews #Governmentsecurity #MachineLearning @Cygnacom pic.twitter.com/ljFYi5sVlp
— Debra Baker (@deb_infosec) June 28, 2019
This Horrifying App Undresses a Photo of Any Woman With a Single Click - VICE https://t.co/TFdiAIVhnk
— Mad Bitcoins (@MadBitcoins) June 29, 2019
DeepNude. "Anyone could find themselves a victim of revenge porn without ever having taken a nude photo. This technology should not be available to the public." I knew something like this would appear sooner or later. But when it becomes AR, how will it be handled then? > https://t.co/FtdUczkWJG via @vice
— ゆきぞー (@yukizokin) June 27, 2019
Last year DeepFake, which used AI to swap porn actresses' faces for celebrities' faces, swept the internet and became a huge problem. Now an app called DeepNude has been released that can undress a woman in a photo with one click. It costs just $50. This is a big deal. https://t.co/qG5O4BtNOp
— セキ ヤスヒサ (@Campaign_Otaku) June 28, 2019
The violations this enables are horrifying to think about. What an unethical thing to build and release. https://t.co/lB7lYkUcMu
— Robert Zacny (@RobZacny) June 26, 2019