DeepNude app that turned photos of clothed women into nudes shuts down The viral app took deepfakes to another level, but its Twitter account now says, "We don't want to make money this way." [cnet.com...]
"The world is not yet ready for DeepNude." (author)
no comment...
JS_Harris
4:55 am on Jul 21, 2019 (gmt 0)
Didn't Facebook ask users to send in nudes so it could protect them in case someone tried to blackmail them? Might have been another company; not the point, though.
Deepfakes are a nightmare for data gatherers, especially with image filters. Most filters embed the fact that they are a filter in the metadata, and sites like Facebook add metadata unique to each account to every picture for tracking purposes, but these fakes are a pain.
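The filter-fingerprint idea above can be sketched as a simple lookup over already-parsed metadata. A minimal sketch in plain Python, assuming the EXIF tags have been extracted into a dict beforehand (real EXIF parsing would need a library such as Pillow); the app names listed are purely illustrative:

```python
# Hypothetical check for filter fingerprints in image metadata.
# Assumes the EXIF tags were already parsed into a plain dict.

FILTER_SIGNATURES = ("FaceApp", "Snapseed", "Instagram")  # illustrative app names

def looks_filtered(tags: dict) -> bool:
    # Filter apps commonly stamp their name into the Software tag.
    software = str(tags.get("Software", ""))
    return any(sig.lower() in software.lower() for sig in FILTER_SIGNATURES)

# An unfiltered camera original usually carries the camera firmware name
# instead, so the check comes back False for it.
```

This only catches filters that are honest about stamping themselves into metadata; a deepfake pipeline that strips or forges the tags defeats it entirely, which is exactly why they're "a pain."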
A Florida police force just canceled an Amazon facial recognition contract over "glitches," which are rumored to be similarities between one person's real image and someone else's filtered image. I'd expect a lot more image-tracking techniques to come into play soon to help keep facial recognition systems free of false positives.
I just saw a fake that merged Taylor Swift's and Justin Bieber's faces that was so well done I could see both of them and couldn't tell which was the base image.
I think that when an image is taken that is believed to be you, there should be a quick check for any other metadata recorded at capture time. For example: was your mobile device nearby when the image was taken? If yes, that boosts the likelihood it's you. Has that device taken pictures of you before? If yes, that boosts the likelihood it's you. And so on. 5G would be needed to cross-check this stuff in real time, I'd imagine.
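The cross-check described above amounts to a simple score accumulator over metadata signals. A minimal sketch in Python, where the signal names and weights are entirely hypothetical, just to show the shape of the heuristic:

```python
# Hypothetical "is this really you?" likelihood score, combining the
# metadata signals described above. Weights are illustrative, not tuned.

def identity_likelihood(device_nearby: bool,
                        device_photographed_you_before: bool,
                        base_score: float = 0.5) -> float:
    score = base_score
    if device_nearby:
        # Your mobile device was near the camera at capture time.
        score += 0.2
    if device_photographed_you_before:
        # This camera has taken pictures of you before.
        score += 0.2
    return min(score, 1.0)

# Both signals present pushes the score well above the 0.5 baseline;
# neither present leaves it at the baseline.
```

Each signal only nudges the score rather than deciding outright, which matches the "boosts the likelihood" framing; a real system would also need negative signals and calibrated weights.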
Dimitri
7:49 am on Jul 21, 2019 (gmt 0)
The next US presidential campaign will be fun with all the deepfake stuff :)