Note
50 Mitchell Hamline L. Rev. 596 (2024)

Election Integrity and the First Amendment: A Statutory Analysis of States’ Regulations of Election Deepfakes

By
Steven Carver

In May 2019, a conservative Facebook page posted a video of then-House Speaker Nancy Pelosi that quickly spread across social media. The video, taken from a speech she delivered on a Wednesday, made it appear that Pelosi was drunk. Her words sounded slurred, and her body language seemed sluggish. By that Thursday, the video had registered over two million views and thousands of comments on her appearance. On that Friday, The Washington Post published a story revealing that the video had been edited to play Pelosi’s speech at approximately seventy-five percent of its original speed, with the pitch of her voice altered to mask the slowdown. The article debunked the assertion that Pelosi was intoxicated and observed that “simple, crude manipulations” were used to create the viral content.

Two months later, California Assemblyman Marc Berman referenced this video before the California Senate Elections and Constitutional Amendments Committee. He introduced a bill that would prohibit the dissemination of manipulated media of election candidates in the sixty days before an election. While Berman used the doctored Pelosi video as an illustration, his bill encompassed not only the simple, rudimentary editing seen in that video, but also a significantly more advanced form of synthetic media called “deepfakes.” Deepfakes are realistic, artificial intelligence (AI)-generated depictions of people doing or saying things they never did.

The California News Publishers Association (CNPA) spoke in opposition to Berman’s bill. A representative for the CNPA said, “Deepfakes are an old problem of disinformation dressed up in new clothes, and we already have tools to address that problem. First, we have the oldest solution in time. We have more speech.” The CNPA went on to point out that after the doctored Pelosi video went viral, organizations like The Washington Post debunked the assertion that Pelosi was drunk. The debunking of the Pelosi video is an example of what the United States Supreme Court would call “counterspeech,” or the proposition that “[t]he remedy for speech that is false is speech that is true.”

California enacted the bill later in 2019, becoming the first of four states, as of late 2023, to regulate the use of deepfakes in election contexts. This Note analyzes each of these states’ deepfake statutes under First Amendment doctrines and concludes that the states do not appropriately employ counterspeech techniques to address deepfakes in elections.