Creating “good rules” for AI
Artificial intelligence’s rapid advances have lawmakers racing to manage risks, alongside its business and social benefits. In the U.S., willing state and federal coordination could point to new models.
People around the world engage with artificial intelligence daily and easily, communicating with chatbot “therapists” and “friends” or creating realistic videos with entirely machine-generated content.
Governments, meanwhile, are racing to keep up with the implications of AI, positive and otherwise, for national security and economic competitiveness as well as for citizen freedoms, privacy, and safety. The challenge centers on whether and how much to regulate this rapidly advancing and lucrative sector, and how to do so without eroding the democratic, free-market values of individual and entrepreneurial autonomy.
Australia is now the first country to ban social media use for children under age 16. In July, the United Kingdom enacted age verification for accessing pornographic sites. And last year, the European Union passed an AI Act to “foster responsible” development, while addressing “potential risks to citizens’ health, safety, and fundamental rights.”
“Good rules ... help prevent disasters,” policy analyst James Lardner noted in a study of 10 landmark regulations in the United States. They arise “out of crisis and struggle, but also ... out of the momentum of accomplishment,” and can channel market forces “in more positive directions,” he observed.
U.S. governors and state legislatures have been busy trying to design such AI rules: In 2025, all 50 states introduced AI bills, and 38 of them passed roughly 100 laws. These moves far outpaced action in Washington, where the House of Representatives and President Donald Trump pushed for a 10-year moratorium on state AI laws. The Senate voted this down 99-1 in July, and a November effort to include the provision in the defense bill also failed.
Mr. Trump has announced he will issue an executive order to preempt or override state rules. “You can’t expect a company to get 50 Approvals every time they want to do something,” he posted on Truth Social. There is some validity to this, as business associations and tech industry leaders point out.
Others, such as Florida Gov. Ron DeSantis, see this as “federal government overreach,” though. “Stripping states of jurisdiction to regulate AI is a subsidy to Big Tech” and hampers efforts to protect children and intellectual property, he wrote on the social platform X.
History shows that state regulations often serve as initial guardrails and provide a template for comprehensive federal legislation.
“State-level action has played a significant role in addressing early risks associated with emerging technologies,” according to George Washington University researcher Tambudzai Gundani. “Because these tools are deployed in specific communities, local officials are often the first to hear complaints, see patterns of harm, and respond.”
To regulate effectively and devise “good rules” for AI, officials will need both a willingness to learn from local experience and an openness to industry and federal concerns.