A Larimer County lawmaker is proposing new regulations aimed at how online platforms design their services for minors, joining a growing national effort to address children’s safety in digital spaces.
“As a mom of three little boys, I’ve seen the way online gaming in particular and social media market children,” said Yara Zokaie, a Colorado state representative. “You want them to be able to have their online spaces to build community and connections, but we also know that our kids are being viewed as commodities.”
That concern is now shaping House Bill 26-1148, a proposal from Zokaie that would require online platforms used by minors to meet what the bill calls a “duty of care” standard: a legal expectation that companies actively design their systems with children’s safety in mind.
HB26-1148 focuses on businesses that operate online services likely to be accessed by minors, particularly gaming platforms that function as social spaces. Under the proposal, companies that collect user data and determine how it’s used would be required to default minor accounts to the highest privacy settings, delete age-verification data after confirming age and honor account deletion requests within 15 days.
The bill also prohibits platforms from collecting unnecessary data on minors, prompting minors to weaken privacy protections or sending push notifications between midnight and 6 a.m. It bars algorithmic systems from recommending illicit substances to minors and requires transparent pricing for in-game purchases. A 5% fee on add-on transactions made by minors would go to Colorado’s public school fund.
Zokaie said the goal is not to restrict what children can post or search online but to regulate how companies design their systems.
“What is regulated in this bill is the way the company promotes that information to children,” Zokaie said. “Companies are incentivized to show children things that are concerning to them, that cause an emotional response from them.”
The legislation was partly inspired by concerns raised regarding platforms like Roblox, which combine gaming, social interaction and user-generated content. Zokaie said parents frequently approach her about safety risks and addictive design features.
“The overwhelming response that I have received is from parents of little kids,” Zokaie said. “They are so concerned about what they are seeing and approach this bill just like I do as a parent, and we want to make sure our kids can have a social media-like space. … As a parent, you don’t want to take that away from them.”
Her proposal borrows elements from similar laws passed elsewhere, including a Vermont measure that imposed safety requirements on platforms serving minors. That law prompted companies to voluntarily add safeguards such as time-use warnings and limits on notifications, Zokaie said.
Colorado is not alone in trying to set guardrails for young users. Across the country, lawmakers are drafting bills targeting everything from age verification to algorithm transparency. Still, Congress has not passed comprehensive legislation setting uniform standards.
“The federal government currently has shown that they will give a free pass to any giant tech company,” Zokaie said. “And if you look at the CEOs of those companies, they are standing behind Donald Trump as he’s being inaugurated. So I’m not holding my breath for the federal government to take action here when we have giant social media companies exploiting our children for profit.”
Clare Brock, an assistant professor of political science at Colorado State University, said state-level regulations can produce unpredictable outcomes when federal standards are limited.
“When states start trying to do these patchwork regulations, we don’t really know how these big transnational corporations are going to respond to these … state-level regulations,” Brock said. Companies might comply, she explained, or restrict access to users in states that have stricter laws.
She pointed to Colorado’s salary-transparency requirement as an example: Some employers simply stopped listing jobs in the state rather than change their postings.
HB26-1148 is one of several recent Colorado efforts aimed at youth online safety. SB26-051, introduced in January, would require operating systems to provide age-range signals to apps so developers know whether a user is a minor. Violations could bring civil penalties of up to $7,500 per minor affected, depending on the intentionality of the violation.
SB25-086, another bill that passed the General Assembly last year, sought to require large social media companies to remove accounts linked to illegal activity within strict timelines and comply quickly with law enforcement warrants. Despite bipartisan legislative support, Gov. Jared Polis vetoed it, citing concerns about free speech and privacy.
Sponsors say those debates showed how difficult it is to regulate digital spaces without infringing on constitutional rights, a balance Zokaie said she intentionally tried to strike.
“You don’t want to take away somebody’s access to information or infringe on anyone’s First Amendment rights,” Zokaie said. “The way we go about this is really going to matter.”
At the national level, social media companies including Meta and Discord are facing lawsuits and regulatory scrutiny over how their platforms affect minors’ mental health and safety.
The absence of federal standards, Brock said, is part of why states are stepping in, even though technology regulation can be difficult for lawmakers who may not fully understand how platforms work.
“Regulating technology is certainly not impossible,” Brock said. “But it is challenging.”
Zokaie said she has heard from CSU students who support the effort, even if it doesn’t directly regulate their accounts.
“They want our online spaces to be safe,” Zokaie said.
State Sen. Dylan Roberts, a sponsor of the bill, framed the proposal as a middle ground.
“Young Coloradans deserve to grow up in a digital world that respects their privacy, protects their personal data and gives families clear tools,” Roberts said. “We can protect kids without slowing down innovation.”
State Sen. Mike Weissman said his involvement stems from years of consumer protection work.
“My involvement with 1148 comes from a background in consumer protection legislation, which I have worked on a lot over the years,” Weissman said. “There are many facets to the bill, but to me the common element is that we need some legal framework re: safety of newer and rapidly emerging technologies, just like we long have had for more mundane products in the marketplace.”
For Zokaie, the effort is personal as well as political. She said she expects the legislation to evolve as lawmakers, companies and families weigh in, and she plans to keep pushing even if it faces setbacks.
“Social media is a very tricky space to regulate,” Zokaie said. “So I will continue my work there.”
Reach Maci Lesh at news@collegian.com or on social media @RMCollegian.
