"Deepfakes" Technology and Pornography Laws

“Deepfakes” technology uses machine learning to doctor video and replace the face of one person with the face of another. It condenses the film-editing task into three easy steps: (1) “gather a photoset of a person;” (2) “choose a video to manipulate;” and (3) wait for your computer to do the rest. Several different software programs are available to create Deepfakes; one, released less than a month ago, already has 100,000 downloads. The programs are most commonly used to take celebrities’ faces and put them on iconic characters. A far more insidious use, however, is taking celebrities’ faces and placing them over the faces of adult film actors. And because the software works on anyone’s face, so long as the user can obtain a set of clear photos of the person, a task that sites like Facebook, Instagram, and Snapchat make easier than ever, any person can use it.
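
To make that three-step workflow concrete, the sketch below mirrors its structure in Python. It is purely illustrative: the function names (gather_photoset, choose_video, render_swap) are hypothetical stand-ins, and the bodies are stubs rather than a working face-swap implementation, since actual Deepfakes software automates the model training and frame-by-frame swapping.

```python
# Illustrative sketch of the three-step workflow described above.
# All names are hypothetical and the bodies are stubs; real deepfake
# software trains a machine-learning model (commonly an autoencoder)
# to perform the face swap automatically, frame by frame.

from pathlib import Path
from typing import List


def gather_photoset(photo_dir: str) -> List[Path]:
    """Step 1: collect a set of clear photos of the target's face."""
    return sorted(Path(photo_dir).glob("*.jpg"))


def choose_video(video_path: str) -> Path:
    """Step 2: pick the video whose faces will be replaced."""
    return Path(video_path)


def render_swap(photos: List[Path], video: Path, out_path: str) -> None:
    """Step 3: "wait for your computer to do the rest" -- the software
    trains on the photoset and writes a face-swapped copy of the video.
    Stubbed here; this is the expensive, fully automated part."""
    print(f"Training on {len(photos)} photos, swapping faces in {video},")
    print(f"writing result to {out_path} (stub; no actual processing).")


if __name__ == "__main__":
    photos = gather_photoset("photos_of_target/")
    source = choose_video("source_clip.mp4")
    render_swap(photos, source, "deepfake_output.mp4")
```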

Thus far, no one has taken legal action to have the videos removed, and under current law it is unclear whether they can be considered nonconsensual pornography.

Most people are familiar with the concept of “revenge porn,” defined as posting “[r]evealing or sexually explicit images or videos of a person . . . on the Internet” “without the consent of the subject” and “in order to cause them distress or embarrassment.” Because people also post revealing or nude content without consent for reasons other than revenge, such as profit, thirty-four states and the District of Columbia have passed legislation criminalizing the broader category of “nonconsensual pornography.” These laws provide greater protections to adult victims, who, unlike minors, do not receive additional protection under federal and state laws against child pornography.

Laws prohibiting nonconsensual porn vary by jurisdiction, and because they are fairly new, they contain gaps that allow some acts of nonconsensual porn to escape liability. If, for example, someone were to post a nonconsensual nude photo in which the victim’s private areas were covered with emojis, in many jurisdictions the image “would likely fall outside a criminal statute’s definition of pornography.” Considering that even actual nude photos may fall outside the laws’ scope, the emergence of “Deepfakes” technology is particularly concerning.