People Are Now Making Fake Celebrity Porn Using AI Technology, and It’s Frightening

A Motherboard article reports that web users have begun creating fake porn scenes by swapping celebrity faces onto other people’s bodies using artificial intelligence video technology. While this might sound fun, one tech expert says we’re only two years off from the technology being publicly available (and potentially abused).

Here’s how the technology works: Basically, the software uses an artificial neural network (ANN) — a fancy-sounding name for a computer program that can learn to recognize patterns and change its output based on those patterns — to detect and swap out people’s faces on video. If you’ve ever used a face filter on Instagram — those fun filters that put animal ears or funny sunglasses on your face — or if you’ve ever used a face swap app, then you’ve already used similar AI technology.

The AI in this case is fed hours of porn and celebrity footage. The ANN learns to detect faces from different angles and then swaps the porn performer’s face for the celebrity’s, doing its best to match the lighting and angles so the swapped face looks natural and real.
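To give a sense of the “learns patterns from examples” idea described above, here’s a toy sketch: a single artificial neuron that learns a simple pattern (output 1 only when both inputs are 1) by nudging its weights whenever it guesses wrong. This is a vastly simplified illustration, not how deepfake software is actually built — real face-swap networks have millions of such neurons and train on hours of video.

```python
# Toy "neural network": one artificial neuron learning the AND pattern
# from examples. Illustrative only -- real deepfake networks are
# enormously larger and train on video footage, not four data points.

def train_neuron(examples, epochs=20, lr=0.1):
    """Perceptron learning rule: nudge weights toward correct answers."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out          # 0 when the guess was right
            w[0] += lr * err * x1       # adjust weights proportionally
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# The pattern to learn: output 1 only when both inputs are 1.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(examples)
print([predict(w, b, x1, x2) for (x1, x2), _ in examples])  # [0, 0, 0, 1]
```

The neuron starts out knowing nothing and, after repeatedly seeing examples, settles on weights that reproduce the pattern — the same basic principle, scaled up massively, that lets a face-swap network learn what a particular face looks like from every angle.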

RELATED | This Video of Obama Discussing the Pulse Shooting Is Frightening Because It’s Fake

Similar technology has allowed people to create fake videos of former U.S. President Barack Obama saying things he never said, simply by superimposing images of his lips onto footage of him looking at the camera.

Motherboard reports that there’s an app that helps beginners create fake porn videos for free, with step-by-step instructions. Early examples of the technology include actress Jessica Alba’s face on porn performer Melanie Rios’ body and Emma Watson’s face on a woman taking a shower.

Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, said that while the current technology still produces easily detectable fakes and remains too advanced for non-technical users, “We’re on the cusp of this technology being really easy and widespread.”

If everyday web users can make hard-to-detect fake videos changing people’s faces and words, it’ll be harder for viewers to discern fakes from reality, raising considerable potential for widespread misuse and abuse.
