AI-created Gal Gadot XXX video goes viral on Reddit, hints fake celebrity porn is going to flood web soon

And it seems it is going to be doubly dangerous for Hollywood and Bollywood actors, who may find themselves morphed into porn videos.

Artificial Intelligence (AI) is dangerous. And it seems it is going to be doubly dangerous for Hollywood and Bollywood actors, who may find themselves morphed into porn videos. While morphed photos of actors in pornographic images aren't anything new on the web -- blame it on Photoshop -- an AI-powered video app can now apparently create fake porn videos so convincing that they may blur the line between what is imaginary and what is real. Examples of such videos are already available on a Reddit community called Deepfakes (NSFW), where porn videos "featuring" Hollywood celebrities are going viral.

The news originally came out in December, when someone created a fake porn video that "featured" Gal Gadot -- yes, the Wonder Woman -- using an AI-powered app. Apparently the app, which uses neural networks to do its job, is so thorough at face-swapping that it is difficult to tell whether the video is fake or real. As a proof of concept, some people took a porn video and replaced the face of the actor in it with Gal Gadot's. The video was later posted on the web.

Now, over at Deepfakes, some enterprising users have simplified the whole process of creating fake porn videos and have come out with an app that allows almost any tech-savvy web user to create a fake porn video "featuring" almost any celebrity. The Deepfakes community on Reddit has over 15,000 members.

The app, along with a how-to manual, is available on Reddit. It's called FakeApp. "This app is intended to allow users to move through the full deepfake creation pipeline -- creating training data, training a model, and creating fakes with that model -- without the need to install Python and other dependencies or parse code," notes the app creator, who calls himself "deepfakeapp" on Reddit.
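The pipeline the creator describes rests on a well-known idea: train a shared encoder with one decoder per person, then "swap" by decoding one person's encoded face with the other person's decoder. The toy sketch below illustrates only that idea, using tiny linear layers and random arrays standing in for face crops; the sizes, names, and training loop are all hypothetical, and real deepfake tools use deep convolutional networks trained on thousands of actual face images.

```python
import numpy as np

# Toy illustration of the shared-encoder / two-decoder idea behind
# deepfake face swapping. Synthetic random vectors stand in for face
# crops; everything here is a simplified, hypothetical sketch.

rng = np.random.default_rng(0)
DIM, LATENT = 64, 8  # flattened "face" size and bottleneck size

faces_a = rng.normal(size=(100, DIM))  # stand-ins for person A's face crops
faces_b = rng.normal(size=(100, DIM))  # stand-ins for person B's face crops

W_enc = rng.normal(scale=0.1, size=(DIM, LATENT))    # shared encoder
W_dec_a = rng.normal(scale=0.1, size=(LATENT, DIM))  # decoder for person A
W_dec_b = rng.normal(scale=0.1, size=(LATENT, DIM))  # decoder for person B

def step(X, W_dec, lr=1e-3):
    """One gradient-descent step on reconstruction error (MSE)."""
    global W_enc
    Z = X @ W_enc          # encode
    X_hat = Z @ W_dec      # decode
    err = X_hat - X
    loss = np.mean(err ** 2)
    # Gradients of the mean squared error w.r.t. both weight matrices
    grad_dec = Z.T @ err * (2 / err.size)
    grad_enc = X.T @ (err @ W_dec.T) * (2 / err.size)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
    return loss

# "Training a model": each decoder learns to reconstruct its own person,
# while the encoder is shared between both.
for _ in range(200):
    loss_a = step(faces_a, W_dec_a)
    loss_b = step(faces_b, W_dec_b)

# "Creating fakes": encode person A's face, decode with person B's decoder.
fake = faces_a[:1] @ W_enc @ W_dec_b
print(fake.shape)  # (1, 64)
```

Because the encoder sees both people during training, it learns a shared representation of pose and expression, which is what lets one decoder's output track the other person's movements.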

Even as the app goes viral, along with some of the stuff that has come out of it, its creator wants to simplify it even more. "I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks," the man behind FakeApp tells Motherboard. "Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button."

In a world where it is already difficult to tell fake from real, FakeApp points toward a scary future where anyone can find himself or herself in a fake porn video -- or in any kind of video, which may then be used with scary consequences. Imagine the kind of videos that may go viral on the web once this technology is more widely available and easier to use.

The technology to use AI to create "real videos" already exists, and Hollywood uses it all the time in its movies. But now, as more and more people get access to powerful neural networks, it seems the technology is going to end up in the hands of people with enough free time to play with footage and photos of famous people to create fake porn videos.