Colorado Politics

As parents claim some AI chatbots encourage teen suicide, Colorado legislators eye regulation

Last September, the parents of two teen boys appeared before the U.S. Senate Judiciary Subcommittee on Crime and Terrorism to tell their stories.

Megan Garcia, the mother of 14-year-old Sewell Setzer, said her son took his own life in 2024. After his death, she found out he had been having sexually explicit conversations with a chatbot and had told it he was suicidal.

According to a lawsuit filed by Garcia against the company Character.AI, Setzer’s final exchange with the chatbot included messages from the bot asking him to “come home to me as soon as possible.”

“What if I told you I could come home right now?” Setzer asked in response.

“Please do, my sweet king,” the bot replied.

Shortly after, Setzer shot himself.

“Sewell’s death was not inevitable,” Garcia told the committee. “It was avoidable. These companies knew exactly what they were doing. They designed chatbots to blur the line between human and machine, to ‘love bomb’ users, to exploit psychological and emotional vulnerabilities of pubescent adolescents and keep children online for as long as possible.”

‘A meaningful first step’

In October, California became the first state to pass legislation regulating AI chatbots.

The California law requires developers to disclose to users that they are interacting with a chatbot, not a real person, and to implement certain safety protocols to prevent content related to suicide, self-harm and sexually explicit material.

Lawmakers in Colorado are hoping to do something similar with House Bill 1263, sponsored by Reps. Sean Camacho, D-Denver, Javier Mabrey, D-Denver, and Sens. John Carson, R-Highlands Ranch, and Iman Jodeh, D-Aurora.

The proponents said they are aware of, and are trying to balance, several competing factors, notably children’s wellbeing, privacy rights and technological innovation. They’re also aware, they said, that Gov. Jared Polis is reluctant to sign regulation that could hamper one of America’s fastest-growing sectors.

While the Colorado bill was modeled after California’s law and proposed legislation in other states, advocates said it creates even stronger protections for minor users of conversational AI services.

The bill requires chatbot services to inform users that they are communicating with artificial intelligence, prohibits operators from providing minors with points or rewards that encourage engagement with the service, and requires operators to enact “reasonable measures” to prevent chatbots from producing sexually explicit material or statements that “simulate emotional dependence.”

The bill also requires the tech companies to provide tools for users to manage privacy and account settings.

Additionally, it mandates that chatbot operators implement a protocol for user prompts that mention suicidal ideation or self-harm, and prohibits operators from stating or implying that any information provided by a chatbot is endorsed by or equivalent to services provided by a licensed professional.

Some 72% of teens say they’ve used an AI chatbot at least once, and 30% say they use chatbots daily, according to Healthier Colorado.

Artificial intelligence chatbots are designed to agree with and validate users, which makes them particularly dangerous for those experiencing mental health crises, said Alexis Alltop of Healthier Colorado.

“Rather than directing users to sources of support, they’re encouraging suicidal ideation or ideas for self-harm, which has had an impact on the mental health of some users,” she said.

Rep. Camacho said artificial intelligence is one of his constituents’ biggest worries.

“If you look at it globally, people are starting to think twice about social media and chatbots,” he said. “It’s just the speed and the quantity of ways in which our life is changing from AI.”

While there are many beneficial uses for AI, the speed at which the technology is advancing should give everyone pause, said Healthier Colorado’s executive director Joshua Ewing.

“I think this is part of the foundational conversations that are happening about how you balance individual privacy protections with guardrails that are strong enough to protect our young people,” he said. “Obviously, we don’t want to stifle innovation of artificial intelligence tools, but the reality is that these tech companies know a lot more about us than they let on.”

Polis has been hesitant to sign bills regulating AI and other technology into law, arguing that a federal framework, not a state-by-state one, is the better approach.

Still, Camacho said he’s been speaking with the governor’s office and believes his bill has the administration’s full support.

“We’ve told him what our focus is and that it’s narrow to address this particular problem, and I think what’s helpful to those conversations is it’s not just us. There’s a lot of other states that are looking at this and have already passed a version or are looking to pass a version, so I think that also gives some understanding that this isn’t some weird one-off,” he said.

Absent any significant federal legislation on artificial intelligence, many states have adopted a “patchwork” approach, implementing their own laws to address the issue.

“I don’t think we are optimistic that meaningful reform is coming any time soon at the federal level, so therefore it is incumbent on states to take action, because kids are being harmed today,” he said.

Camacho shared a story he thought illustrated the importance of ensuring children are safe while interacting with artificial intelligence. While giving students a tour of the Capitol, he asked them what they thought of the bill. He fully expected them to oppose it, but, he said, that was not the case.

“I was surprised at how many of them were really concerned about AI and chatbots and felt that some of their colleagues that are using AI for academic advantages are wrong,” he said. “I was really struck by that, that they were fully in support of this concept.”

He added: “Young people are accessing these tools. They have a hard time avoiding these tools, and we have an obligation to make sure they’re able to do it safely.”

House Bill 1263 will be heard by the House Business Affairs and Labor Committee on March 12.

Editor’s note: A previous version of this story included information about 13-year-old Juliana Peralta and her mother, Cynthia. Cynthia has clarified that she is in strong opposition to House Bill 1263.
