
If you or someone you know is experiencing a mental health crisis, you can call or text 988 to reach the 988 Suicide and Crisis Lifeline available 24/7. To reach the 24/7 Crisis Text Helpline, text 4HOPE to 741741.

COLUMBUS, Ohio (WCMH) — Mere days from completing the two-year budget, Ohio’s leaders are focused on a provision of the bill not related to money: limiting children’s use of social media.

At a news conference Monday morning, Gov. Mike DeWine and Lt. Gov. Jon Husted reminded the public of the Social Media Parental Notification Act, proposed by the duo upon release of the executive budget. The law would require social media companies to verify the age of users and seek parental consent before allowing children under 16 on the platforms.

The policy seeks to address what parents, health care providers, researchers and the U.S. Surgeon General have identified as a major contributing factor to the mental health crisis among youth and adolescents: children's unmitigated, unmonitored use of social media. Technology has been a "wonderful thing" in many instances, Husted said, but not in the way it has affected young people's mental and physical health.

“We’ve been conducting an experiment on our children that we know is failing, and we need to act,” Husted said.

Under the proposal, social media companies would have to develop methods to verify users' ages, including by asking for government identification, credit or debit card information, or a digital consent form. For users under 16, companies must obtain "verifiable parental or legal guardian consent" and send written confirmation to parents.

Since announcing the proposed law in February, Husted said he has sought input from social media companies on the feasibility of the mandate, as well as mental health and child development experts on best practices related to children’s social media use. It’s why the executive office landed on age 16 as a cut-off, as opposed to a similar policy in Utah that extends limitations to age 18.

A spokesperson for Meta, which owns platforms Facebook, Instagram and WhatsApp, said Monday the company has developed dozens of ways to limit access to harmful materials and manage children’s privacy, including by showing warnings for sensitive content and automatically setting accounts to private for children under 16.

“We refer to research, feedback from parents, teens, experts, and academics to inform our approach, and we’ll continue evaluating proposed legislation and working with policymakers on these important issues,” the spokesperson said in an email.

In May, the Surgeon General released an advisory on social media and youth mental health, reporting that 95% of teens use at least one social media platform and that more than a third say they use social media "almost constantly." The advisory highlighted research suggesting social media use is correlated with increased anxiety and depression, with nearly two-thirds of adolescents reporting frequent exposure to "hate-based content" and a third of girls of color seeing racist content online at least monthly.

Tony Coder, executive director of the Ohio Suicide Prevention Foundation, said he’s heard from countless parents who lament that they didn’t know what their child experienced online — whether it be harassment, exposure to images promoting self-harm or other harmful content.

“We must help those that have no time, have no energy, and are just trying to find their kids help, so that we can be the voice for those that are struggling,” Coder said.

Natalie Fahmy contributed reporting.