In the era of “fake news,” a new issue has emerged – deepfakes. You’ve probably heard the term tossed around, since it seems like everyone is talking about them right now, but what exactly are deepfakes? A deepfake uses artificial intelligence (the “deep” comes from deep learning) to superimpose one person’s face or likeness onto existing images and videos. This means videos can be manipulated to make it seem like someone did or said something they never did or said in reality. The unique challenge of deepfakes is that it is becoming increasingly difficult to tell whether a video is real or fake. While the technology can be used for benign, entertaining purposes (hello Nicolas Cage face swaps), there is a darker side that already has, and will continue to have, negative societal consequences.
One major negative use of deepfakes, and one that has already taken hold on the internet, is the creation of deepfake pornography. Using deepfake technology, people have started superimposing celebrities’ faces onto pornography and posting it online; it goes without saying that this can have enormous consequences for victims’ reputations and wellbeing. As a silver lining, celebrities at least have a public relations platform to debunk false videos, just as they have a platform to deny allegations printed in a tabloid, for example. However, if the same technology is used to superimpose a private individual’s face onto pornography that is then made widely available, the effects on that person’s reputation and mental health can be devastating, with little ability for the individual to stop it. Such uses bleed into the realm of identity theft and revenge porn.
The other obvious social consequence of deepfakes in this day and age is the erosion of trust in information. “Alternative facts” and “fake news” have become part of our vernacular, and deepfakes now give people the tools to actually manufacture fake news and manipulate facts and political narratives. Collateral effects of the ability to create these alternative facts and scenarios include individuals’ loss of autonomy over their own image and an increased risk of defamation, infringement of publicity rights, and other reputational damage.
California has already begun to recognize the societal damage deepfakes could cause and has introduced legislation intended to curb some of the darker uses of the technology. Under proposed California Bill 730, a person could face up to a $2,000 fine and a year in county jail for creating a deceptive recording (i.e., a deepfake) with the intent to distribute it, while knowing that the deepfake is likely to deceive viewers and to defame or embarrass the person depicted. The proposed legislation would not apply to satire or parody, and would not cover videos that no reasonable person would believe are real. While the bill has not yet passed into law, it underscores just how big an issue deepfakes already are and could become if left unchecked.
Next week we’ll address the economic consequences of deepfakes.
Pfeiffer Law Corp is a law firm with an emphasis on social media and entertainment law.
Contact Jon and his team today.