Disney and other entertainment and media industry giants have issued letters and memoranda opposing a New York Assembly bill that would outlaw “deepfakes” — videos in which a performer’s face is replaced with a celebrity’s face. The company’s opposition to the bill may seem odd considering that most deepfakes are computer-generated fake celebrity porn, but the concern is that the bill is so broadly worded it might inhibit the company’s ability to tell stories about real-life individuals and events, especially if the characters are computer-generated in any way.
Disney — along with NBCUniversal, Viacom, Warner Brothers and the Motion Picture Association of America (MPAA) — has written against New York Assembly Bill A.8155B, which defines any computer-generated likeness of a living or dead person as a “digital replica.”
The bill says, “Use of a digital replica of an individual shall constitute a violation if done without the consent of the individual if the use is in an audiovisual pornographic work in a manner that is intended to create and that does create the impression that the individual represented by the digital replica is performing.” It would also prohibit the depiction of any living or dead person in “pornographic works” without their written legal consent.
But the media giants worry that the law will restrict their ability to create likenesses of real-life people in their films. In its memorandum against the bill, the MPAA also notes that the bill never defines the term “pornographic works,” leaving it vague and too open to interpretation (and, thus, to possible lawsuits).
Lisa Pitney, Vice President of Government Relations at The Walt Disney Company, writes, “[T]he bill would create entirely unprecedented rights to control the use of ‘digital replicas’ and the use of celebrity images in sexually explicit material, which, while presumably well-intended, threaten expressive activities as a result of undefined, vague or otherwise problematic statutory language.”
Twitter and Reddit have officially banned deepfakes from their social media platforms, citing policies against “involuntary pornography” and sharing “intimate photos or videos of someone that were produced or distributed without their consent.”