Advocates say delay of Colorado AI regulations offers breathing room, but not final resolution

Education officials, business leaders and technology groups said they are relieved that a new state law regulating artificial intelligence has been delayed by a few months, but they remain frustrated that many of their broader worries are unresolved.
Last year, Gov. Jared Polis signed the nation’s first law regulating “high-risk” artificial intelligence systems. While he signed the measure, he issued a rare signing statement to the Colorado General Assembly, warning that, without fine-tuning the new regulation before it goes into effect, it would hamper innovation in Colorado, inhibit the state’s ability to compete and put the new technology’s full potential out of reach for residents and businesses here.
The law, sponsored by Senate Majority Leader Robert Rodriguez, D-Denver, and Reps. Brianna Titone, D-Arvada, and Manny Rutinel, D-Commerce City, established standards and requirements for AI developers and deployers to reduce “algorithmic discrimination” and increase transparency. Under the new law, “algorithmic discrimination,” also known as “algorithmic bias,” refers to the process by which calculations produce “unjustified different treatment or impacts disfavoring people based on their race, color, ethnicity, sex, religion, age, national origin, disability, veteran status, genetic information, or any other classification protected by law.”
In his signing statement, Polis thanked the bill’s sponsors for initiating a conversation about “algorithmic discrimination,” but emphasized that the policy could create “a complex compliance regime” for AI developers and “deployers” in Colorado.
“I appreciate the sponsors’ interest in preventing discrimination and prioritizing consumer protection as Colorado leads in this space, and I encourage them to significantly improve on this before it takes effect,” he wrote.
Rodriguez and Titone subsequently created a task force, which included two other lawmakers and representatives of organizations and agencies like the ACLU, the state Attorney General’s Office, and the Colorado Chamber of Commerce. The task force met throughout the interim and into the 2025 legislative session. Its goal was to secure an agreement on changes before the bill went into effect in February 2026, the original timeline.
Some of those changes were outlined in the original version of Senate Bill 004, introduced during this month’s special session by Rodriguez and Rep. Jennifer Bacon, D-Denver. While the bill initially sought to rewrite the 2024 law, the parties ultimately failed to agree on exactly how it should be rewritten, according to Rodriguez.
It marked the parties’ second attempt, and second failure, to reach a deal.
“It became impossible to iron out a path forward that works for everyone,” Rodriguez told his colleagues on the Senate floor. “I believe this is the path forward to build on the progress we’ve made.”
Rodriguez then amended the bill to remove all provisions except one — pushing back the implementation of the law from February to June 30, 2026.
Education, technology and business groups largely welcomed the delay, and they now plan to work with the sponsors to secure an agreement during the 2026 legislative session.
Technically, policymakers and the other parties have about four months, during the regular session from January to May, to iron out changes to the AI regulatory regime. Broadly speaking, however, they could start working on a compromise now, giving them much more runway to craft a deal.
In fact, it’s not unusual for policymakers to reach an agreement before a legislative session begins and then formalize that deal by passing it as new legislation.
“We appreciate Senator Rodriguez’s decision to delay implementation of SB24-205,” said Brittany Morris Saunders, president and CEO of the Colorado Technology Association. “This decision recognizes the complexity of artificial intelligence and the importance of building a framework that protects consumers, supports businesses, and provides clarity for policymakers.”
She added: “By extending the timeline, we now have the opportunity to work collaboratively on practical solutions that strengthen consumer trust, safeguard jobs, and preserve Colorado’s competitiveness. CTA and our members are committed to being active partners in this process to ensure that Colorado leads with both innovation and responsibility.”
Educators ‘agree with the concept’ of AI law, but want to ‘understand things a little better’
Colorado Education Association President Kevin Vick said that like any new technology, AI is a double-edged sword — capable of great things but also of harmful ones.
The union officially took a “monitoring” position on the 2024 law because, according to Vick, while his organization agreed with its intent, the timeline was inadequate.
“It’s such a new technology and it’s advancing so rapidly and being used in a number of ways that I don’t know if we’re really aware of it yet in society, so the potential for unintended harm is pretty high,” he said. “We agree with the idea that if AI is being used to make these sort of decisions in hiring or admissions or other life-changing decisions, individuals have a right to know about that — we agree with that concept, but I think our hesitancy is just that it’s so new that we wanted to learn more and understand things a little bit better.”
Schools all over Colorado use artificial intelligence systems for hiring and admissions, Vick said. The technology is especially useful for sorting through large amounts of data, but that filtration process has the potential to be “discriminatory or harmful to individuals, application processes, and employment systems,” he added.
For now, Vick said the four-month implementation delay is “probably adequate.”
“I think it’s a good balance point between the people who are saying they need more time and the people who are saying we can’t wait,” he said. “By delaying another four months, although it’s leaving people vulnerable for another four months, it should also give an adequate amount of time to have this implemented in an effective manner.”
Businesses appreciate the extra time, say complicated subjects take time to sort out
Colorado Chamber of Commerce executive director Rachel Beck said her members know AI regulation is inevitable and are willing to comply, but, she added, many shared Polis’ apprehensions about the original bill’s potential to stifle innovation in a burgeoning industry.
“Our businesses recognize that innovation in this space is coming, and I think they generally believe that it is necessary,” she said. “But we have to make sure that, as we’re looking to make sure that the systems don’t do harm, we’re also making sure that we’re not restricting their ability to innovate and do things that are more efficient and that humans can’t do.”
Leslie Oliver, vice president of external affairs for the Denver Metro Chamber of Commerce, agreed.
“We don’t want companies taking a pass on Colorado or even considering leaving the state because our policies and regulations are overly broad and unpredictable and make it unaffordable to do business here just so Colorado can have a first-of-its-kind rule on the books,” she said, adding that her group’s members would prefer AI regulations to come from the federal level, rather than from “a patchwork of state-by-state regulations.”
In July, the White House released an “AI Action Plan” that in part called for deregulating AI and removing references to diversity, equity and inclusion, but did not specifically mention “algorithmic discrimination.”
Beck said businesses have three main worries with the law: its broad scope, vague definitions, and unclear liability, the latter of which is a common issue in AI legislation.
Since most businesses use open-source AI systems — made publicly available for others to customize and modify to their needs — there’s a risk that developers could be blamed for instances of algorithmic discrimination caused by changes a “deployer” made.
Conversely, deployers may end up taking the blame for something the developer encoded into the system’s framework that they had no knowledge of.
“The developers don’t want to be responsible for customizations that their customers are making that they have no control over,” said Beck. “Deployers, for their part, have no visibility into the framework and the code that the developer used to set up the system they bought, and I think both of those perspectives are completely legitimate and that has been a huge part of the conversation around AI policy.”
Beck said she hopes stakeholders and lawmakers can work to address the bill’s vagueness in scope and definitions during the 2026 legislative session. While the original task force has completed its work, she believes a second one would be beneficial to get the bill where it needs to be ahead of its implementation date.
Beck said the chamber is also glad Rodriguez decided to delay the implementation of Senate Bill 205 and acknowledged that while there is still much more work to be done, it’s not uncommon for complicated subjects to take multiple sessions to iron out.
“It took us three years to work out data privacy regulations, so when we’re tackling something this complex and this important and this critical to our economy, I don’t think that most of us are surprised that it’s taken us a couple of tries to figure it out,” she said.