Tech & Innovation

State AI Policy in 2024: What Happened, What Didn’t, and Where Do We Go From Here?


2024 wasn’t just a year of technological breakthroughs; it was a battleground where policymakers and innovators clashed over who got to define the future. Across the country, state legislatures stepped into the spotlight, either embracing innovation or stumbling hard with overregulation. The tension between these two policy approaches – pro-innovation and pro-control – became a defining feature of the year.

States passed 238 tech-related bills in 2024 – a whopping 163% increase over the previous year. Of those 238, more than 100 related to AI. Some were forward-thinking, recognizing the AI race between the US and its adversaries abroad. Others ignored that race altogether, and some simply read like they were drafted by people who think AI runs on magic.

Three states in particular – Utah, California, and Texas – stand out among the rest.

Utah nailed it. 

Early in the year, Utah passed the Artificial Intelligence Policy Act, which became a national blueprint for how to encourage progress without suffocating it. By creating an Office of Artificial Intelligence Policy and using regulatory sandboxes, the state now lets companies experiment responsibly and work alongside regulators to shape the guardrails that should be in place going forward.

Then there was California. 

In classic overreach fashion, its legislature gave us SB 1047, the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, in September. This tech-busting mess would have created a bloated bureaucracy, imposed $30 million fines per offense, and established arbitrary thresholds that would’ve crushed startups. Silicon Valley industry leaders and Congressional leaders alike couldn’t get behind the bill, calling out its absurdity.

Governor Gavin Newsom’s September veto of the bill was a relief, but the fact that it passed both chambers by wide margins signaled a worrying trend that extends beyond California: some state lawmakers are more than willing to tie the hands of innovators even in the face of an international AI race. The fight over SB 1047 left innovators wondering whether the Golden State would remain a hub for progress – or drive it away entirely.

Texas didn’t exactly help either. 

As 2024 came to a close, Texas tried to follow California’s lead but stumbled in its own unique way when considering the Texas Responsible Artificial Intelligence Governance Act (“TRAIGA”). The bill classified broad categories of AI as “high-risk” and imposed hefty compliance requirements – a surefire way to spook businesses. While December’s revisions narrowed the bill’s scope, the core issues – vague definitions and stifling compliance costs – remained. As I’ve argued before, “Companies won’t stick around to learn how to navigate costly new compliance regimes; they’ll pack up and leave for states where AI isn’t a dirty word.”

AI is a game-changer, but the stakes couldn’t be higher. 

Technology lifts people up, creates abundance, and makes life better – but only if we let it. The new administration in D.C. has a golden opportunity to turn the tide, with folks like David Sacks shaping the conversation. Pair that with the rapid pace of AI breakthroughs, and it’s a no-brainer that 2025 could be monumental.

But 2025 could just as easily be the year America turns its back on tech, letting our adversaries outpace us while we hide under a safety blanket made of red tape. Overreach – like we saw with California’s SB 1047 and Texas’s TRAIGA – was a tangible threat to technological progress, and similar bills are sure to follow. That’s why it’s more important now than ever to keep technopanic at bay.

Policymakers should focus on frameworks like Utah’s sandbox model, which fosters experimentation while protecting consumers, rather than rushing into restrictive mandates that stifle growth. As we compete with our adversaries in this AI race, champions of these forward-thinking state policies matter more than ever – and the future depends on their success.