
Will AI rules stifle innovation? Inside the tussle between states, federal government.

Kent Nishimura/Reuters
U.S. President Donald Trump applauds at the "Winning the AI Race" Summit in Washington, D.C., July 23, 2025.

Artificial intelligence is becoming an integral part of daily life for most Americans as they interact with chatbots, read summaries of Google searches, or receive personalized recommendations from social media algorithms.

This widespread use, and the booming industry around it, is raising the question of who gets to shape the rules for the transformative technology: the federal government, which argues that too much regulation could hinder innovation, or individual states, which have begun to pass laws designed primarily to protect AI users from harms such as discrimination or dangerous suggestions from chatbots.

It's a battle playing out in the government right now. Some Republicans in Congress are reportedly trying to insert a moratorium on state AI regulations into a must-pass defense bill that lawmakers aim to approve by the end of the year. At the same time, President Donald Trump has drafted an executive order that would pressure states to back off enforcing any regulations the administration deems burdensome, according to news reports.

Why We Wrote This

Artificial intelligence is showing up more and more in Americans鈥 daily lives, raising questions about who should regulate it, and whether the focus should be on unleashing innovation or protecting the people who use AI.

"Investment in AI is helping to make the U.S. Economy the 'HOTTEST' in the World, but overregulation by the States is threatening to undermine this Major Growth 'Engine,'" President Trump wrote on social media last week.

Punchbowl News reported last week that House Republicans are urging the White House to hold off on an executive order while they try to negotiate a compromise. Commerce Secretary Howard Lutnick has meanwhile said that the administration will act on its own if Congress isn't making progress.

A recent Gallup poll shows that most Americans agree that the U.S. should have more advanced technologies than other countries. But 97% say AI should be subject to regulation.

Across the country, rules are taking shape. Lawmakers in all 50 states have introduced AI-related regulations, and at least 38 states have already enacted AI-related laws this year. Many state lawmakers say it's essential to be able to respond quickly and nimbly to the ever-shifting challenges posed by AI.

Manuel Balce Ceneta/AP
Nvidia CEO Jensen Huang speaks on how AI infrastructure and AI factories that generate intelligence at scale are powering a new industrial revolution, at the Walter E. Washington Convention Center, Oct. 28, 2025, in Washington.

"The states are the laboratories of democracy," says Democrat James Maroney, a Connecticut state senator who sits on a steering committee for a multistate working group on AI. The group includes Republicans and Democrats. "We know the federal government just can't act as quickly, and we know how rapidly these technologies are changing, allowing the states to regulate where they're closer to seeing the harms."

What states are passing

The Trump administration has previously tried to stop states from enforcing AI regulations. A draft of the Republicans' tax-and-spending bill this past summer included a moratorium blocking states from passing such laws for 10 years. That proposed ban drew considerable backlash from both Republican and Democratic governors and state officials, and it was ultimately stripped out in the Senate by a 99-1 vote.

But the debate may not be over.

Republican Sen. Ted Cruz insisted at a Politico AI & Tech Summit in September that the idea of a 10-year moratorium is "not at all dead."

As Congress and the administration weigh other ways to curb, or altogether halt, state regulations, states are moving forward. California Gov. Gavin Newsom, for one, has signed more than a dozen AI bills into law in the last two years, including a landmark transparency bill this September that requires large AI companies to disclose how they're implementing safety protocols so that a chatbot can't, for example, give a user instructions on how to build a chemical weapon.

New York has advanced a similar law this year. An Illinois law imposed sweeping restrictions on AI use in therapy. And Colorado and Texas have passed bills protecting people from discrimination when AI algorithms are used to make consequential decisions, such as those related to housing or employment.

Several experts interviewed by the Monitor said that they expect a significant amount of legislation in the coming year that could establish safety parameters for how bots, such as ChatGPT, interact with users, especially children.

Godofredo A. Vásquez/AP
Chelsea Palacio, public information manager for the city of San Jose, showcases how a small detection camera uses AI to detect road hazards and potholes, in San Jose, California, Nov. 12, 2025.

A "patchwork of indecision"

The lawmakers passing these bills say they're necessary to address challenges that people are already facing from AI technologies. But some experts and lawmakers fear the laws will have unintended side effects.

In a September congressional hearing, Rep. Darrell Issa of California warned of a "patchwork of indecision." If each state passes its own laws, he said, companies could face a tangled web of conflicting rules and regulations.

Many Republicans and technology leaders fear that sorting through these standards could create cost barriers, hindering AI innovation for businesses and developers. That would come at a time when the U.S. is competing closely with China over which country will emerge as the AI leader, with the ability to shape AI algorithms and applications.

"When you have 50 different states creating potentially 50 different sets of rules, some of which could even end up being in conflict with one another, that creates such a regulatory nightmare," says Patrick Hedger, the director of policy at NetChoice, a trade association advocating for free enterprise on the internet.

Travis Hall, the director of state engagement for the nonprofit Center for Democracy and Technology, doesn't see this "patchwork" argument actually playing out, at least so far.

He says many of the laws passed address specific industries, such as health care, that are already regulated under state law. And the ability to target local industries is one of the reasons he believes AI regulations should be handled at the state level.

A recent Tennessee law called the ELVIS Act, for example, protects against the use of AI to impersonate musicians' voices. Mr. Hall says the law makes sense for a state like Tennessee, where Elvis Presley made his home and which prizes its musicians.

John Amis/AP/File
Tennessee Senate Majority Leader Jack Johnson speaks during a news conference at RCA Records announcing the Ensuring Likeness, Voice, and Image Security (ELVIS) Act, a bill updating the state's protection of personal rights law to include protections for songwriters, performers, and music industry professionals' voices from the misuse of artificial intelligence, Jan. 10, 2024, in Nashville, Tennessee.

"I have a hard time imagining a federal bill that would really cover the landscape of things that AI touches," he says.

As states work to pass these laws, though, the process can be messy. American Enterprise Institute economist Will Rinehart points to problems with Colorado's AI law this year as an example of states overlooking the burdens that can fall on businesses.

Although the law protecting against discrimination was signed in 2024, the state's Democratic Gov. Jared Polis, who supported a 10-year moratorium on state regulation, responded to objections from tech groups about its potential impact by delaying implementation until June 2026. Under the initial law, for example, AI developers could be held responsible if their algorithms caused discrimination, regardless of the developers' intent.

Mr. Rinehart, for one, says he supports "the nice, big overall idea of trying to give people protection." But "when you get into the nitty-gritty of the laws," he adds, implementing a specific regulation can be "very difficult."

Senator Maroney in Connecticut is working with lawmakers in other states who are trying to take on such challenges. Their group of over 100 state lawmakers meets twice a month to discuss bills and share resources. Senator Maroney says this idea-sharing helps maintain consistency, avoiding the patchwork problem that many fear.

"The way forward is working together," he says.
