States are rapidly adopting laws to grapple with political deepfakes in the absence of comprehensive federal regulation of manipulated media related to elections, according to a new report from the Brennan Center for Justice.

Nineteen states have passed laws regulating deepfakes in elections, and 26 others have considered related bills. But an NBC News review of the laws and a new analysis from the Brennan Center, a nonpartisan law and policy institute affiliated with New York University School of Law, find that most states’ deepfake laws are so narrow that they leave plenty of options for bad actors to use the technology to deceive voters, while a few are so broad that they would face tough court challenges.

“It’s actually quite incredible how many of these laws have passed,” said Larry Norden, vice president of the Brennan Center’s Elections and Government Program and the author of the analysis released Tuesday.

The study found that states introduced 151 different bills this year that addressed deepfakes and other deceptive media meant to fool voters, about a quarter of all state AI laws introduced.

“That’s not something you generally see, and I think it is a reflection of how quickly this technology has evolved and how concerned legislators are that it could impact political campaigns,” he said.

Deepfake technology, which uses artificial intelligence to mimic a person with audio or video, has existed for years. But this year’s election is the first in which the technology is so widely available that practically anyone with a computer can make a convincing deepfake cheaply or for free and post it on social media.

Senate Majority Leader Chuck Schumer, D-N.Y., said in an interview Thursday that he wants to pass federal legislation regulating political deepfakes, but that to pass such bills before the election, lawmakers would most likely need to attach them to a must-pass funding bill at the end of September, barely a month before Election Day.

Many of the state laws that have been passed don’t impose criminal penalties for deepfakes but instead give candidates a path to sue political opponents who use them in political ads without disclosing that the technology was used. Many of those laws apply only within 90 or 60 days of an election.

The laws also don’t account for politically themed deepfakes that don’t depict candidates, such as false videos of violence at polling places meant to persuade voters to stay home on Election Day; because no candidate is depicted, no one would have standing to sue.

Some of the laws also require plaintiffs to prove that someone who made a deepfake acted with “actual malice,” which is far easier to prove about a rival campaign than about a prankster on social media.

A few laws, like Tennessee’s Ensuring Likeness, Voice and Image Security (ELVIS) Act of 2024, go the opposite direction and seek to broadly end a wide swath of potentially misleading deepfakes. People sued under those laws would have a good chance of winning by citing the First Amendment, Norden said. Similar bills are being considered in Illinois and South Carolina.

“The broader bills are going to come under more scrutiny from the courts,” he said.

There have been at least two major instances of a deepfake being used to potentially mislead voters en masse about a political candidate this year. In February, political consultant Steve Kramer engineered a plan in which robocalls played a recording of a deepfaked Joe Biden telling New Hampshire Democrats not to vote in the primary. And this week, X owner Elon Musk, who has endorsed Donald Trump for president, shared a deepfaked video of Democratic presidential nominee Kamala Harris misstating her policy positions. The video’s creator originally labeled it as parody, but the version Musk shared removed the label.

Neither of those instances has been punished under deepfake-specific laws. New Hampshire’s attorney general charged Kramer with 26 criminal counts, all related to existing laws prohibiting voter suppression and impersonating a candidate. New Hampshire didn’t have an AI-specific law at the time, though it has since passed one.

It appears the new law would have covered Kramer’s robocall scam. Its definition of felony deepfakes includes using the technology to knowingly distribute a deepfake to harm the reputation of a specific person in the context of a political campaign without acknowledging the technology has been used.

The Federal Communications Commission, which regulates U.S. telecommunications, fined Kramer $6 million, accusing him of violating existing agency rules against spoofing phone numbers to defraud. The FCC has since adopted a rule declaring it can fine people who use deepfakes to trick people, but only over telephone networks, as it doesn’t have jurisdiction over social media. 

It’s also unlikely that Musk’s video would run afoul of any existing state laws, Norden said, in part because it wasn’t part of an official campaign.

“There is almost always some kind of parody exception, and even without a label, Musk would likely argue this was parody,” he said.


