Republicans drop Trump-ordered block on state AI laws from defense bill


“A silly way to think about risk”

“Widespread and powerful movement” keeps Trump from blocking state AI laws.

A Donald Trump-backed push has failed to wedge into the National Defense Authorization Act (NDAA) a federal measure that would block states from passing AI laws for a decade.

House Majority Leader Steve Scalise (R-La.) told reporters Tuesday that a faction of Republicans is now “looking at other places” to potentially pass the measure. Other Republicans opposed including the AI preemption in the defense bill, The Hill reported, joining critics who see value in allowing states to quickly regulate AI risks as they arise.

For months, Trump has pressured the Republican-led Congress to block state AI laws that the president claims could bog down innovation as AI firms waste time and resources complying with a patchwork of state laws. But Republicans have continually failed to unite behind Trump’s command, first voting against including a similar measure in the “Big Beautiful” budget bill and then this week failing to negotiate a solution to pass the NDAA measure.

Among Republican lawmakers pushing back this week were Rep. Marjorie Taylor Greene (R-Ga.), Arkansas Gov. Sarah Huckabee Sanders, and Florida Gov. Ron DeSantis, The Hill reported.

According to Scalise, the effort to block state AI laws is not over, but Republicans caved to backlash over including it in the defense bill, ultimately deciding that the NDAA “wasn’t the best place” for the measure “to fit.” Republicans will continue “looking at other places” to advance the measure, Scalise said, emphasizing that “interest” remains high, because “you know, you’ve seen the president talk about it.”

“We MUST have one Federal Standard instead of a patchwork of 50 State Regulatory Regimes,” Trump wrote on Truth Social last month. “If we don’t, then China will easily catch us in the AI race. Put it in the NDAA, or pass a separate Bill, and nobody will ever be able to compete with America.”

If Congress bombs the assignment to find another way to pass the measure, Trump will likely release an executive order to enforce the policy. Republicans in Congress had dissuaded Trump from releasing a draft of that order, requesting time to find legislation where they believed an AI moratorium could pass.

“Widespread” movement blocked Trump’s demand

Celebrating the removal of the measure from the NDAA, a bipartisan group that lobbies for AI safety laws, Americans for Responsible Innovation (ARI), noted that Republicans didn’t just face pressure from members of their own party.

“The controversial proposal had faced backlash from a nationwide, bipartisan coalition of state lawmakers, parents, faith leaders, unions, whistleblowers, and other public advocates,” an ARI press release said.

This “widespread and powerful” movement “clapped back” at Republicans’ latest “rushed attempt to sneak preemption through Congress,” Brad Carson, ARI’s president, said, because “Americans want safeguards that protect kids, workers, and families, not a rules-free zone for Big Tech.”

Senate Majority Leader John Thune (R-SD) called the measure “controversial,” The Hill reported, suggesting that a compromise the White House is currently working on would potentially preserve states’ rights to regulate some areas of AI, since “you know, both sides are kind of dug in.”

$150 million war over states’ rights to regulate AI

Perhaps the clearest sign that both sides “are kind of dug in” is a $150 million AI lobbying war that Forbes profiled last month.

ARI is a dominant group on one side of this war, using funding from “safety-focused” and “effective altruism-aligned” donor networks to support state AI laws that ARI expects can be passed much faster than federal regulations to combat emerging risks.

The major player on the other side, Forbes reported, is Leading the Future (LTF), which is “backed by some of Silicon Valley’s largest investors” who want to block state laws and prefer a federal framework for AI regulation.

Top priorities for ARI and like-minded groups include protecting kids from dangerous AI models, preventing AI from supercharging crime, protecting against national security threats, and getting ahead of “long-term frontier-model risks,” Forbes reported.

But while some Republicans have pushed for compromises that protect states’ rights to pass laws shielding kids or preventing fraud, Trump’s opposition to AI safety laws like New York’s “RAISE Act” seems unlikely to wane as the White House mulls weakening the federal preemption.

Quite the opposite: a Democrat and author of the RAISE Act, Alex Bores, has become LTF’s prime target to defeat in 2026, Politico reported. LTF plans to invest many millions in ads to block Bores’ Congressional bid, CNBC reported.

New York lawmakers passed the RAISE Act this summer, but it’s still waiting for New York’s Democratic governor, Kathy Hochul, to sign it into law. If that happens—potentially by the end of this year—big tech companies like Google and OpenAI will have to submit risk disclosures and safety assessments or else face fines up to $30 million.

LTF leaders Zac Moffatt and Josh Vlasto have accused Bores of pushing “ideological and politically motivated legislation” that would “handcuff” the US and its ability to lead in AI, Forbes reported. But Bores told Ars that even the tech industry groups spending hundreds of thousands of dollars opposing his law have reported that tech giants would only have to hire one additional person to comply with the law. To him, that shows how “simple” it would be for AI firms to comply with many state laws.

To LTF, whose donors include Marc Andreessen and OpenAI cofounder Greg Brockman, defeating Bores would keep a vocal opponent out of Congress, where he could interfere with the industry’s hope that AI won’t be heavily regulated. Scalise argued Tuesday that the AI preemption is necessary to promote an open marketplace, because “AI is where a lot of new massive investment is going” and “we want that money to be invested in America.”

“And when you see some states starting to put a patchwork of limitations, that’s why it’s come to the federal government’s attention to allow for an open marketplace, so you don’t have limitations that hurt innovation,” Scalise said.

Bores told Ars that he agrees that a federal law would be superior to a patchwork of state laws, but AI is moving “too quickly,” and “New York had to take action to protect New Yorkers.”

Why Bores’ bill has GOP so spooked

With a bachelor’s degree in computer science and prior work as an engineer at Palantir, Bores hopes to make it to Congress to help bridge bipartisan gaps and drive innovation in the US. He told Ars that the RAISE Act is not intended to block AI innovation but to “be a first step that deals with the absolute worst possible outcomes” until Congress is done deliberating a federal framework.

Bores emphasized that stakeholders in the tech industry helped shape the RAISE Act, which he described as “a limited bill that is focused on the most extreme risks.”

“I would never be the one to say that once the RAISE Act is signed, we’ve solved the problems of AI,” Bores told Ars. Instead, it’s meant to help states combat risks that can’t be undone, such as bad actors using AI to build “a bioweapon or doing an automated crime spree that results in billions of dollars in damage.” The bill defines “critical harm” as “the death or serious injury of 100 people or at least $1 billion in damages,” setting a seemingly high bar for the types of doomsday scenarios that AI firms would have to plan for.

Bores agrees with Trump-aligned critics who advocate that the US should “regulate just how people use” AI, “not the development of the technology itself.” But he told Ars that Republicans’ efforts to block states from regulating the models themselves are “a silly way to think about risk,” since “there’s certain catastrophic incidents where if you just said, ‘well, we’ll just sue the person afterwards,’ no one would be satisfied by that resolution.”

Whether Hochul will sign the RAISE Act has yet to be seen. Earlier this year, California Governor Gavin Newsom vetoed a similar law that the AI industry worried would rock its bottom line by requiring a “kill switch” in case AI models went off the rails. Newsom did, however, sign a less extreme measure, the Transparency in Frontier Artificial Intelligence Act. And other states, including Colorado and Illinois, have passed similarly broad AI transparency laws providing consumer and employee protections.

Bores told Ars in mid-November that he’d had informal talks with Hochul about possible changes to the RAISE Act, but she had not yet begun the formal process of proposing amendments. The clock is seemingly ticking, though, as Hochul has to take action on the bill by the end of the year, and once it reaches her desk, she has 10 days to sign it.

Whether Hochul signs the law or not, Bores will likely continue to face opposition over authoring the bill, as he runs to represent New York’s 12th Congressional District in 2026. With a history of passing bipartisan bills in his state, he’s hoping to be elected so he can work with lawmakers across the aisle to pass other far-reaching tech regulations.

Meanwhile, Trump may face pressure to delay an executive order requiring AI preemption, Forbes reported, as “AI’s economic impact and labor displacement” are “rising as voter concerns” ahead of the midterm elections. Public First, a bipartisan initiative aligned with ARI, has said that 97 percent of Americans want AI safety rules, Forbes reported.

Like Bores, ARI plans to keep pushing a bipartisan movement that could keep Republicans from ever unifying behind Trump’s message that state AI laws risk throttling US innovation and endangering national security, should a less-regulated AI industry in China race ahead.

To maintain momentum, ARI created a tracker logging opposition to federal preemption of state AI laws. Among those recently logged was Andrew Gounardes, a Democratic state senator in New York, where Bores noted a poll found that 84 percent of residents supported the RAISE Act, 8 percent opposed it, and 8 percent were undecided. Gounardes joined critics on the far right, like Steve Bannon, who warned that federal preemption would be a big gift for Big Tech. AI firms and venture capital lobbyists “don’t want any regulation whatsoever,” Gounardes argued.

“They say they support a national standard, but in reality, it’s just cheaper for them to buy off Congress to do nothing than it is to try and buy off 50 state legislatures,” Gounardes said.

Bores expects that his experience in the tech industry could help Congress avoid that fate while his policies like the RAISE Act could sway voters who “don’t want Trump mega-donors writing all tech policy,” he wrote on X.

“I am someone with a master’s in computer science, two patents, and nearly a decade working in tech,” Bores told CNBC. “If they are scared of people who understand their business regulating their business, they are telling on themselves.”


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.
