89.9 FM Live From The University Of New Mexico

NM lawmaker wants the public informed if a political ad is a deepfake


The 2024 election comes as artificial intelligence has not only gotten better, but also much more accessible. A bill moving forward in the New Mexico House tries to catch election policy up with the fast-moving technology, which can create hard-to-spot fakes. The bill would help ensure voters know what they are looking at when AI is used to create disinformation.

Some voters in New Hampshire recently received a robocall featuring a voice that sounded remarkably like President Joe Biden discouraging them from casting a ballot in the state’s primary election last week.

“It’s important that you save your vote for the November election,” the voice said. “Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.”

The state’s Department of Justice is investigating and said it appears to be AI-generated.

Only five states have laws on the books that explicitly require disclosures of misleading AI content like this, or “deepfakes,” according to the National Conference of State Legislatures. State Rep. Gail Chasey (D-Bernalillo) is hoping New Mexico joins them.

“We want to actually make certain that people know what they’re seeing in elections,” she told her colleagues on the House Government, Elections and Indian Affairs Committee Wednesday.

She is sponsoring House Bill 182, which would amend the state’s Campaign Reporting Act to require that any “materially deceptive media” come with a disclaimer that it was made using AI. That way, voters are aware that — no matter how realistic something is — it depicts someone doing or saying something that never happened.

The bill makes it a crime to distribute content like this without such a disclaimer. A first offense would be a misdemeanor punishable by up to 90 days in jail, a $500 fine, or both. A second offense within five years would be a felony, resulting in a five-year sentence, a $1,000 fine, or both.

If the disclosure is included, distributing a deepfake would not in itself be a crime under the bill.

Lindsey Bachman with the Secretary of State’s Office told the panel of lawmakers Wednesday that her office would take complaints about potential violations.

“There would not be a dedicated person responsible for watching all the ads ever,” she explained.

She said the office would refer potential criminal cases to the attorney general or a district attorney. The bill also allows for a person who is falsely depicted, an impacted candidate or an organization representing voters to seek a remedy through the courts.

The bill passed out of the committee on a 7-2 vote. It now heads to the House Judiciary Committee.

Nash Jones (they/them) is a general assignment reporter in the KUNM newsroom and the local host of NPR's All Things Considered (weekdays on KUNM, 5-7 p.m. MT). You can reach them at nashjones@kunm.org or on Twitter @nashjonesradio.