Facebook Announces the Use of AI to Detect and Prevent “Revenge Porn”

April 1, 2019

Facebook recently announced the use of new technology to prevent the sharing of non-consensual intimate images, most commonly referred to as “revenge porn.” Facebook will now use “machine learning and artificial intelligence” to “proactively detect near nude images or videos that are shared without permission on Facebook or Instagram,” according to the company’s Global Head of Safety, Antigone Davis. This initiative is currently undertaken out of goodwill rather than legal obligation. However, in light of a recent amendment to federal law, it is foreseeable that initiatives like this will be required in the future and will come burdened with many legal concerns.

The Impacts of Revenge Porn

“Revenge porn” is defined as the “distribution of sexually graphic images of individuals without their consent,” and it includes images that were obtained consensually in the context of a relationship, as well as those obtained through hidden cameras or hacking. One study stated that 4 percent of internet users “have either had sensitive images posted without their permission or had someone threaten to post photos of them.” Victims of revenge porn suffer severe personal and psychological consequences. A study noted that 80 to 93 percent of victims “suffered significant emotional distress…[including] anger, guilt, paranoia, depression, or even suicide.”

Another study indicates that 51 percent of victims have had suicidal thoughts “due to being a victim”; 42 percent have had to “explain the situation to professional or academic supervisors, coworkers, or colleagues”; 55 percent fear that their professional reputation will be negatively impacted in the future; and 82 percent have “suffered significant impairment in social, occupational, or other important areas of functioning due to being a victim.” While both men and women can be victims of revenge porn, this study concluded that 90 percent of victims were women.

Facebook’s Attempt to Combat Revenge Porn

In an attempt to preemptively combat this issue, Facebook is launching an AI tool to detect revenge porn posts before anyone reports them. The tool will recognize near nude content that is paired with “derogatory or shaming text.” It will then flag the content for review by a “specially-trained” member of Facebook’s staff, who will determine whether the content violates the Community Standards, remove it, and potentially disable the account that posted the content without permission. Facebook shared in an email to Gizmodo that the “detection technology was trained on revenge porn in order to better understand what these types of posts would look like.” This most recent initiative is an extension of Facebook’s 2017 pilot program, which invited users to proactively submit their intimate images and videos so that each could receive a “digital fingerprint” allowing Facebook to detect the content and prevent it from ever being shared on the site.
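Facebook has not published how its “digital fingerprint” matching actually works, but fingerprint systems of this kind are commonly built on perceptual hashing: a compact hash that stays nearly identical when an image is resized or recompressed, so a re-uploaded copy can be matched against previously submitted fingerprints. The sketch below is purely illustrative, assuming a simple “average hash” over an 8×8 grayscale grid and a Hamming-distance threshold; none of these details come from Facebook.

```python
# Hypothetical sketch of fingerprint-style image matching, in the spirit of
# Facebook's 2017 pilot program. This is NOT Facebook's algorithm; it shows
# one common technique (a perceptual "average hash") that tolerates minor
# edits such as brightness changes or recompression.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid.

    `pixels` is an 8x8 list of lists of brightness values (0-255).
    Each bit is 1 if that pixel is brighter than the image's mean.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count the bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")

def matches_fingerprint(candidate_hash, known_hashes, threshold=5):
    """Flag a candidate whose hash is within `threshold` bits of any
    previously submitted fingerprint (a hypothetical threshold)."""
    return any(hamming_distance(candidate_hash, h) <= threshold
               for h in known_hashes)
```

Under this scheme, a lightly edited copy of a submitted image hashes to almost the same value as the original and is flagged, while an unrelated image falls well outside the distance threshold.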

The 2017 pilot program received extensive criticism, including skepticism over whether Facebook should be trusted with intimate photos after it was determined that the site had shared private data with third-party providers. Similarly, the announcement of the AI tool has been met with skepticism about its prospects. A major concern is that Facebook has yet to announce exactly how the tool will work. AI is only as good as the information fed into it and carries the biases of its programmers, which is a problem because, according to Rolling Stone, “a sizeable percentage of actual human men don’t actually know what consent is.” It is therefore doubtful that a machine could reliably determine whether a post was shared with permission. Without more information on how the technology is actually “trained” to detect nonconsensual images, the skepticism will continue and legal concerns will likely arise.

The Laws Regarding Revenge Porn and Social Media Liability

Revenge porn has been recognized as a serious threat, prompting most states to enact revenge porn statutes. There is disagreement among states over whether revenge porn should be redressed through criminalization or civil remedies; and because revenge porn does not fall within any of the enumerated categories of unprotected speech, there is also an underlying First Amendment question surrounding laws banning it. North Carolina’s Disclosure of Private Images Act criminalizes “knowingly disclos[ing] an image of another person with the intent to…coerce, harass, intimidate, demean, humiliate, or cause financial loss to the depicted person,” or with the intent to cause others to do any of the same. Along with criminal prosecution, the N.C. law allows civil remedies to be awarded to victims.

While states have begun to address victims’ rights against perpetrators of revenge porn, there hasn’t been as much legal progress in holding websites accountable for the explicit content shared on their platforms. Many websites are protected by the Communications Decency Act §230, which provides a “safe harbor to internet service providers and platforms, exempting them from liability based on the speech and content of their users.” However, in 2018, Congress passed an amendment to Section 230 that allowed enforcement against online service providers that knowingly host content that promotes sex trafficking.

Based on the recent amendment, it is foreseeable that in the near future, social media sites will have a more active duty to prevent the sharing of revenge porn on their platforms. If this happens, Congress will have to decide whether tools like the one Facebook is launching help guard against liability or instead signal that a site knowingly hosts explicit content. Another issue that will need to be addressed is whether removing the content is enough, or whether a site should have a duty to report revenge porn to authorities, particularly where minors are depicted.

As the law currently stands, social media sites are not required to prohibit the sharing of revenge porn or to report the criminal activity to authorities. But as the sharing of nonconsensual intimate images becomes more common, special attention should be paid to whether social media sites should be required to take an active role in preventing it, and what that role should actually be.

Hannah Petersen, 18 March 2019