The AI election is here. Regulators can’t decide whose problem it is.

The federal government is facing a dwindling window to regulate the use of artificial intelligence on the campaign trail before the 2024 election. But a brewing turf war between federal agencies is threatening one of the most significant attempts to set new rules for the tools.

The chair of the Federal Communications Commission announced a plan last month to require that politicians disclose AI use in TV and radio ads. But the proposal is facing unexpected opposition from a top official on the Federal Election Commission, which has been considering its own new rules on AI use by campaigns.


The dispute, along with inaction at the FEC and Congress, could leave voters with limited federal protections against those who use AI to mislead the public or mask their political messages during the campaign’s final stretch. New generative AI technologies have already proved capable of creating uncannily realistic images.

“AI has the potential to be very influential in our elections, and right now, there’s a total vacuum of regulation on this issue,” said Ellen Weintraub, the Democratic vice chair of the Federal Election Commission.

Over a dozen states have adopted laws regulating AI use in campaigns, but Congress has yet to step in, despite broad concern on Capitol Hill over the technology’s impact.

Adav Noti, executive director of the Campaign Legal Center and a former FEC associate general counsel, said that given the bureaucratic quagmire, the likelihood of having federal restrictions on AI use in place for campaigning ahead of the November presidential election is “extremely low.”

“The cavalry aren’t coming,” he said.

AI deepfakes have targeted officials and politicians this year. Democratic operative Steve Kramer was indicted last month over an AI-generated robocall impersonating President Biden that instructed New Hampshire residents not to vote early. Soon after, the FCC banned AI-generated voice imitations in robocalls. Last week, a deepfake video emerged purporting to show State Department spokesman Matthew Miller calling the Russian city of Belgorod a potential target for Ukrainian strikes with American weapons.

Any major AI problems in the campaign could cause headaches for the Biden administration, which has made moving rapidly on AI a policy centerpiece. Biden issued an executive order in October that required a range of federal agencies to swiftly formulate regulations on the use of AI technologies.

FCC Chairwoman Jessica Rosenworcel (D) announced plans last month to consider a rule requiring that political advertisers include on-air or written disclosures when they deploy “AI-generated content.”

But this week, a top elections official and a member of the FCC, both Republicans, threw a wrench in those plans by accusing the agency’s Democratic leadership of overstepping its authority.

FEC Chairman Sean Cooksey wrote in a letter to Rosenworcel that the proposal would trample on his agency’s role as the top enforcer of federal campaign law. The FCC’s maneuver could create “irreconcilable conflicts” with the FEC’s potential rules and prompt a legal challenge, Cooksey wrote.

The FCC proposal has not yet been made public, but Rosenworcel said that the move would not prohibit AI use but instead make “clear consumers have a right to know when AI tools are being used in the political ads they see.”

In an interview, Cooksey argued that rolling out disclosure requirements so close to an election could do more harm than good by creating public confusion about the standards.

“That will sow chaos with political campaigns and interfere with the upcoming election,” he said.

Fellow Republicans in Congress and at the FCC balked at Rosenworcel’s plan. The chair of the House Energy and Commerce Committee, Rep. Cathy McMorris Rodgers (R-Wash.), said in a statement the agency “does not have the expertise or authority to regulate political campaigns or AI.”

FCC Commissioner Brendan Carr (R) argued that because the rules would apply only to political ads on TV and radio and not online streaming platforms, such as YouTube TV or Hulu, the sudden addition of AI disclosures in some places but not others would “ultimately be very confusing to the consumers.” He joined Cooksey in calling on the agency to table the matter until after the election, if not indefinitely.

“The FCC needs to, first of all, not introduce a sea change in the regulation of political speech on the eve of a national election,” Carr said.

Rosenworcel said in a statement that the FCC has required campaign ads to disclose sponsors for decades and that adapting these rules to the arrival of new technologies is nothing new.

“The time to act on public disclosure of AI use is now,” she said. “There are benefits to this technology, but we also know that it has the potential to mislead the public and misinform voters with made-up voices and images that impersonate people without their permission.”

With a 3-2 majority, Democrats at the FCC could bypass Carr’s objections and move ahead with the plan before the election, but the specter of a legal challenge may bog down the effort.

Without legislation outlining how AI should be regulated, any federal agency’s actions “will almost certainly be challenged one way or another in court,” Noti said.

Multiple federal initiatives aimed at reining in AI’s impact on the 2024 race face an uncertain fate in Washington, even as officials in both parties warn about the technology’s potential to wreak havoc on the electoral process.

The FEC is considering its own petition on the issue, which would explicitly ban candidates from using AI to deliberately misrepresent opponents in political ads. But Democratic and Republican FEC officials alike have expressed skepticism about the agency’s authority to step into the matter and have instead called on Congress to set new rules.

Unlike the FCC, the FEC is evenly divided between the two major parties with a rotating chair, a setup that has often deadlocked the agency as election reform has become increasingly polarized.

On Capitol Hill, senators have advanced a package of bills that would require political ads generated by AI to feature disclaimers, among other restrictions. Yet despite calls for action on the issue from top congressional leaders, Congress’s window to act before Election Day is rapidly closing.

“While it’s a good thing that federal agencies are looking at the potential for AI to upend campaigns and elections, we cannot wait to put comprehensive guardrails in place to meet these threats head on,” said Sen. Amy Klobuchar (D-Minn.), who is leading the legislative effort.
