Nearly 10 million people watched an online video recording of a Utah 14-year-old doing something embarrassing in front of a school bathroom mirror.
It took three months to get the video taken down, and it has since resurfaced, Utah Sen. Michael McKell told a session on kids and social media at NCSL’s 2023 Legislative Summit. The boy has dropped out of school and become suicidal.
That’s the kind of real-life harm prompting legislators across the nation to find ways to mitigate the impact social media can have on kids, from California’s Age Appropriate Design Code Act—which seeks to help protect children’s data online—to Utah’s Social Media Regulation Act, which instituted parental controls and age verification.
“The basic design of many of the current products allows behaviors in the digital world that we would never find acceptable in a physical world.”
—Minnesota Rep. Kristin Bahner
“I get super frustrated when I meet with tech (companies) and they say, ‘We’re doing everything we can to protect kids.’ If there’s not an expectation of privacy for a 14-year-old boy in a bathroom, I don’t know where there would be,” says McKell, who sponsored Utah’s bipartisan social media regulation law. Nearly all studies show a direct link between mood disorders and social media use of three to four hours a day, he says. One federal study of 17,000 kids found that 57% of girls were persistently sad, and one-third had contemplated suicide.
“When we ask kids, how many of you think social media is harmful, the vast majority we talk to in our focus groups all think social media is incredibly problematic,” McKell says.
Among other things, Utah’s law prevents social media companies from collecting and selling children’s data, sets up strict age verification requirements, allows parents and legal guardians to have full access to their child’s social media accounts, sets time restrictions and blocks direct messaging to minors by non-friends. “We want to empower parents in our state,” McKell says.
Minnesota Rep. Kristin Bahner, who sponsored an Age Appropriate Design Code bill modeled after one recently enacted in California, says tech companies need to be held accountable for their products.
“Platforms have internal data that shows that not only do their products provide access to harmful content, but that their design choices are perpetuating a cycle of anxiety, depression and harm,” she says. “They know the harm, and they’re choosing profits over kids by failing to act.”
Bahner says parents can’t bear the sole responsibility for a complex system that by its very design takes control out of their hands.
“The basic design of many of the current products allows behaviors in the digital world that we would never find acceptable in a physical world,” she says. “We would never allow children to talk to random strangers in a mall. But online, random adults can message vulnerable youth with ease and without their parents’ consent or knowledge. Our kids deserve better, and we deserve better products.”
Age appropriate design means making safety a part of a tech platform’s design, rather than an afterthought. “It does not rely on expensive techniques, complicated structures or provisions that make it challenging, if not impossible, to implement or enforce,” says Bahner, who has an extensive background in IT. “It’s simple and it’s elegant in its design. It’s not new and it is not novel. We simply ask them to apply the same assessment through the lens of child safety and data privacy. We ask them to do it at the outset in the design phase, which is the most cost-effective time to consider design choices and mitigate risk. It is the best solution for business and the best that I have seen for parents.”
Carl Szabo, vice president and general counsel for NetChoice, which fights tech laws it considers unconstitutional or otherwise unlawful, says he thinks the best approach is the one being tried in Florida.
“They are now adding social media education to the school curriculum. And then it lets it be up to the teachers to decide what’s best for their classroom, up to the school board to decide what’s best for their (community). And then, finally, they require that information be made available to parents,” he says.
“At the end of the day, we can all use education on this, because the kids will find a way online,” Szabo says. “So, let’s teach them to be decent human citizens when it comes to social media. And let’s see how that goes.”
Lisa Ryckman is the associate director of communications at NCSL.