Spurred to action by tech industry lobbyists and insiders, Republicans in the Senate appear to be planning to add language to the National Defense Authorization Act (NDAA) that would preempt states from passing laws regulating AI labs.
Two sources with knowledge of the effort tell Fast Company that a small group of GOP lawmakers, staffers, and tech lobbyists worked through the weekend crafting the new language.
Heading into Thanksgiving, much uncertainty hangs over the fate of the state-level moratorium, along with a fair amount of secrecy about how the AI industry and its MAGA allies will try to block states, and Congress, from regulating AI. Democrats and others may not be allowed to see the new language until the vote to pass or reject the NDAA, a so-called "must-pass" bill that funds the military.
Senate Democrats also have no visibility into the scope of the moratorium language that will go in the NDAA. Could it, for example, prevent states from passing any and all kinds of AI laws, including those that focus on consumer protection issues or AI-related unemployment?
“What big tech is trying to do here is an even larger giveaway than Section 230,” says Future of Life Institute’s head of U.S. policy Michael Kleinman. (Section 230 of the Communications Decency Act of 1996 exempted tech platforms from liability for user-generated content.) “You literally have big tech lobbyists meeting with a handful of senior Republicans trying over the course of a holiday weekend to craft legislation that will govern what state governments can do around AI for the future. It’s appalling.”
Not long after Louisiana Republican Steve Scalise, the House Majority Leader, introduced the preemption measure last week, Massachusetts Democratic Senators Elizabeth Warren and Ed Markey penned and sent a letter to their colleagues urging them to oppose adding the state moratorium, which they describe as a “poison pill,” to the NDAA. The bill will need 60 votes to end a filibuster and advance to a final vote. Separately, attorneys general representing 36 states sent a letter to Congressional leadership opposing the state moratorium language.
Congress is not in session because of the Thanksgiving holiday. But Republicans plan to make another push to convince lawmakers to add the state preemption to the NDAA when they return December 1, sources say.
Last week, the White House proposed a route that bypasses Congress, circulating a draft executive order (EO) that proposes pulling back congressionally approved broadband funding from any state enacting new AI laws. The EO also proposed creating a new Department of Justice task force to challenge existing state AI laws. The White House had reportedly planned to release the EO last Friday, but chose to delay it.
Many of the people who would benefit from a state AI moratorium were present at a November 18 White House state dinner hosted by President Donald Trump for Saudi Crown Prince Mohammed bin Salman. These include Elon Musk, Jeff Bezos, Nvidia CEO Jensen Huang, OpenAI’s Greg Brockman, AMD CEO Lisa Su, and Apple CEO Tim Cook. David Sacks, Trump’s “AI and crypto czar” and venture capitalist, was also among the attendees.
Given the importance the industry places on stifling AI regulation whenever and wherever possible, it’s very likely that the state AI law moratorium was discussed while these people were in Washington for the event, one Washington source said.
Texas Republican Senator Ted Cruz tried last summer to tuck the preemption into the so-called One Big Beautiful Bill Act (an appropriations measure), but senators voted 99-1 to remove it. The moratorium idea is unpopular with the public, survey data shows, and unpopular across the political spectrum in Washington, D.C.
Despite broad opposition, tech industry insiders such as Marc Andreessen, Elon Musk, and Sacks have Trump’s ear, and have helped keep the state preemption idea alive in the Capitol.
Big tech’s big opportunity
In a broad sense, the chance to keep government oversight away from what could be the most impactful technology in a generation may explain why tech moguls and opinion leaders threw their support behind Donald Trump before the 2024 election and have continued to praise and appease him.
While the Trump administration rewards its tech industry allies by killing government inquiries and regulation, big U.S. tech companies and financiers are now sinking trillions into building the infrastructure needed to support a massive expansion of generative AI.
The AI industry has been ramping up its lobbying spending over the past two years to stifle AI regulation at both the federal and state levels. It’s also expanding into electoral politics.
This summer a group of AI companies and investors launched a $100 million super PAC called “Leading the Future” that will back “pro-AI” candidates and oppose candidates who favor AI regulation. Backers include a16z, OpenAI President Greg Brockman, Palantir co-founder Joe Lonsdale, Perplexity AI, and angel investor Ron Conway.
On a legal level, some in the tech industry, including the venture capital firm Andreessen Horowitz, argue that state laws should focus on the application, not the development, of AI (for example, preventing or punishing fraud or civil rights violations), while federal law should govern the “national AI market.” AI companies also fear being burdened by a “patchwork” of state AI regulations instead of a single set of federal rules.
Tom Kemp, who directs the California Privacy Protection Agency, explains that many tech policy fights turn on where the boundary lies between issues covered by federal law and those covered by state law. But Congress hasn’t come close to passing a broad AI safety and transparency law, and isn’t likely to.
“The fundamental issue they have is that there’s no federal backstop,” Kemp says. “So the moratorium basically says you just can’t do any laws having to do with AI.”
Innovation versus states’ rights
Many state governors, including Florida Republican Ron DeSantis, and legislators, claim they have not just a right but a responsibility to enact AI laws to protect the public in the absence of a federal law. State lawmakers are very aware of the series of reports about AI chatbots exacerbating mental health problems in users, including younger ones. “There’s a big concern that state legislators cannot protect kids from some of the harms of AI,” Kemp says.
On Monday, a bipartisan group of 280 state lawmakers from across the country sent a letter to lawmakers in the House and Senate opposing the state AI law preemption, saying it would hamstring their efforts to address the impacts of artificial intelligence.
The tech lobby and its Republican allies frame the moratorium as critical to helping the U.S. maintain its lead in AI — technology that will be increasingly used in defense and national security. But even the top players in defense don’t seem convinced.
“You need the four corners of the armed forces committees to be all approved,” says Kemp, who met with lawmakers last week in Washington to discuss the issue. In other words, the chairs and ranking members of both the House and Senate Armed Services Committees have to agree to insert state AI preemption language into the NDAA.
Kemp believes that Alabama Republican Rep. Mike Rogers, the chairman of the House Committee on Armed Services, and Washington Democratic Rep. Adam Smith, the committee’s ranking member, are opposed, as is Senate Committee on Armed Services ranking member Jack Reed. Mississippi Republican Senator Roger Wicker, the chairman of the committee, has yet to announce his position.
It’s possible that new language in the NDAA will go beyond a state-level preemption, and promote some form of broad, but weak, federal AI law that limits oversight by both federal and state regulators. On Monday the Leading the Future PAC launched a $10 million campaign to push Congress to craft a “national AI policy” that would override a patchwork of state laws, reports CNBC.
“What we’re seeing, not just with preemption, but with these big tech super PACs is that big tech will go to any effort to undermine that overwhelming small-D democratic will,” Kleinman says. “All the polling that we have done and that others have done shows that consistently across the board strong majorities of both parties support AI regulation.”