Ever app trained facial recognition tech on users' photos, report says [www.cnet.com]
Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools. [www.nbcnews.com]
Photo App Ever Uses Photos to Train Facial Recognition Tools [www.macobserver.com]
Ever app users uploaded billions of photos, unaware they were being used to build a facial recognition system [www.grahamcluley.com]
Surveillance capitalism’s new normal: lure unsuspecting consumers to give you their photos with warm and fuzzy branding & then use those photos without informed consent to train face recognition tech you market to police and military https://t.co/M2rdl7Slkn pic.twitter.com/f1UYzs5YsB
— Olivia Solon (@oliviasolon) May 9, 2019
“When asked if the company could do a better job of explaining to Ever users that the app’s technology powers Ever AI, Aley said no.”
The company barely mentions it in its privacy policy and flat out hid it from users until NBC called them on it. https://t.co/X83x2w5eFd
— Eric Mortensen (@ericmortensen) May 9, 2019
Journalists get permission before photographing children. But startups think it's okay to use pictures of kids to train facial recognition tech, without bothering to ask? https://t.co/K0v3PkmTvZ
— Ainsley Harris (@ainsleyoc) May 9, 2019
Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools. https://t.co/VpENa155Ff
— RR Apple (@RRalstonAgile) May 9, 2019
Millions of people uploaded photos to the Ever app. Then Ever used their photos to develop facial recognition tools, which it now sells. https://t.co/syFeMiUCyb
— The Tor Project (@torproject) May 9, 2019
“They are commercially exploiting the likeness of people in the photos to train a product that is sold to the military and law enforcement. The idea that users have given real consent of any kind is laughable.”—Prof. @Lawgeek on photo storage app Ever: https://t.co/iisVSyRj71
— NYU Law (@nyulaw) May 9, 2019
Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools. (by @oliviasolon and @cfarivar) https://t.co/4lmPWZd7qU
— Cyrus Farivar (@cfarivar) May 9, 2019
"Photo storage app Ever, launched in 2013, now focuses on selling facial recognition tech, trained using 13B+ stored user photos, to others (NBC News)" https://t.co/xezsK6vg1s
— Chris Heilmann (@codepo8) May 9, 2019
If it is free, it isn't...
This is a privacy violation for the artificial intelligence age. A company uses its customers’ data in a way they would never expect: to build a surveillance system for cops and the military. https://t.co/gtaLAFO7Gh
— Matt Cagle (@Matt_Cagle) May 9, 2019
As Zuboff shows in Age of Surveillance Capitalism, the firm extracts by asserting an asymmetric privacy to refine “data exhaust” into predictive models and derivative products, all while optimizing the opacity of these processes. https://t.co/uSoVpjPANP by @oliviasolon @cfarivar
— David Carroll (@profcarroll) May 9, 2019
Companies shouldn't use people's private information to build surveillance tools without their consent. That would be a serious violation of privacy. https://t.co/Hbq6RiNNQL
— ACLU (@ACLU) May 9, 2019
This is one hell of a graphic from the @oliviasolon and @cfarivar story on a photo app company running a side hustle in training facial recognition AI https://t.co/2CTNP2JzOk pic.twitter.com/T7WB7JW024
— abortions are good (@GavinSchalliol) May 9, 2019
“Make memories” becomes “train drones.” https://t.co/WeeXTMev7E
— One Ring (doorbell) to surveil them all... (@hypervisible) May 9, 2019
“This looks like an egregious violation of people’s privacy,” said the @ACLU's @snowjake. “They are taking images of people’s families, photos from a private photo app, and using it to build surveillance technology. That’s hugely concerning.” https://t.co/M2rdl7Slkn
— Olivia Solon (@oliviasolon) May 9, 2019
If it's free, you are the product.
Doesn't matter how many times you repeat yourself, people just don't get it.
And from this point onwards everything you do or share is to train a machine somewhere to get better at what it does. https://t.co/88fThY6i0C
— Theo (@tprstly) May 9, 2019
“Ever AI promises prospective military clients that it can ‘enhance surveillance capabilities’ and ‘identify and act on threats.’ It offers law enforcement the ability to identify faces in body-cam recordings or live video feeds.” https://t.co/2lxe4clKcA
— Ind.ie (@indie) May 9, 2019
Users have shared the private photos stored in their #email and social networks with Ever, not realising that they were being used to feed a facial recognition system. #cybersecurity #infosec https://t.co/xgGhWnU39I
— Ronald van der Meer (@ronaldvdmeer) May 9, 2019
Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools. https://t.co/FzUiOpNUmZ via @nbcnews
— Magdalena (@elisse1313) May 9, 2019
Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools for private companies, law enforcement and the military. https://t.co/CB1N9bDvaH
— Aryeh Goretsky (@goretsky) May 10, 2019
Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools. https://t.co/KIkC8bThgj via @NBCNews
— Privacy Matters (@PrivacyMatters) May 9, 2019
Everything is about getting your data. Full stop. https://t.co/TwoEkLPqAy
— Molly McKew (@MollyMcKew) May 9, 2019
Ever app users uploaded billions of photos, unaware they were being used to build a facial recognition system https://t.co/By1dwOMoRr via @InfoSecHotSpot pic.twitter.com/FB0wIxjy4K
— Sean Harris (@InfoSecHotSpot) May 9, 2019