In response to the global scourge of deepfake pornography, legislators in the UK have introduced a new law criminalizing the creation of non-consensual, sexually explicit deepfake content.
Recently, Channel 4 News host Cathy Newman spoke out about the trauma and dehumanization she experienced after finding a deepfake pornographic video bearing her likeness online. Newman is among more than 250 UK celebrities found to have been targeted by such content. Former Love Island contestant Cally Jane Beech discovered that UK police could not help her after deepfake nude images of her circulated on social media. Kate Isaacs, the activist behind the #NotYourPorn movement, was targeted with deepfake pornography specifically because of her advocacy for victims.
Under the new amendment to the Criminal Justice Bill, creators of this abusive content will face a criminal record and an unlimited fine, with possible jail time depending on how widely their deepfakes have been shared. One of the most important aspects of the law is its focus on creation: making a deepfake is an offense even if the creator never intends to distribute it. Offenders who go on to share the deepfakes may face additional charges and additional jail time. The law applies to all methods of deepfake creation, including the popular faceswap and “nudify” apps.
The Crown Prosecution Service will be able to directly charge any offender who creates and/or spreads non-consensual deepfake pornography in order to “cause alarm, humiliation, or distress to the victim.” The UK’s Minister for Safeguarding, Laura Farris MP, explained the government’s stance: "The creation of deepfake sexual images is despicable and completely unacceptable irrespective of whether the image is shared. This new offense sends a crystal clear message that making this material is immoral, often misogynistic, and a crime."
While the new amendment has been met with widespread approval, some advocacy groups have argued that it hinges on the perpetrator’s intent, which may be hard to prove and could create a loophole. The same groups are also calling on legislators to place greater responsibility on tech companies to moderate deepfake content. Others want details on how the law will be enforced, expressing concern about the authorities’ ability to identify anonymous perpetrators and suggesting that police currently lack the training and resources to handle the sensitive nature of these cases.
Meanwhile, in March, the European Commission proposed a directive to criminalize the non-consensual sharing of sexual images, including deepfake pornography. If passed, the directive would require all EU member states to enact laws similar to the UK amendment.
What Does This Mean for Legislation in the U.S.?
Right now, laws concerning non-consensual deepfake pornography exist only piecemeal, on a state-by-state basis. Despite efforts to introduce legislation at the federal level, progress has been slow. That said, with movement in the UK on this specific issue, we have faith that our elected officials will take inspiration from our friends across the pond and move swiftly to address such a pressing problem.
At the same time, Reality Defender is committed to supporting clients in any efforts to detect non-consensual deepfake pornography. As an organization dedicated to upholding integrity, we welcome organizations and individuals who share our mission of protecting vulnerable populations — both online and offline — and will continue to hold conversations on such collaborative efforts. By working together, we can create a safer and more responsible digital landscape for all.