Social Media Experts Are Skeptical About the Power of New State Laws
Lawmakers worry about negative effects on teens, but others raise concerns over free speech.
Ritika Shroff had the typical Gen Z experience with social media. At 13, she signed up for Instagram, then Snapchat. Later, she downloaded TikTok and worked her way through other popular platforms.
But in high school, she began to see downsides, feeling pressure when comparing her number of followers, test scores and experiences with those of her peers online.
"They're doing X, Y and Z with their lives, and I think I got pulled into it," Shroff said.
Today, Shroff, a 19-year-old sophomore at American University in Washington, D.C., still sees the benefits of social media, such as allowing her to stay in touch with hometown friends from Des Moines, Iowa, and family in India. While she thinks there should be more rules around social media, she doesn't think individual state actions, such as a state suing a platform, would make much difference.
"These small things won't make an impact in the broader landscape," Shroff said.
More states are hoping to rein in the harm that social media can do to teens' mental health and privacy by approving laws that require age verification or parental consent, prohibit "addictive feeds" or ban the apps for minors. They also are taking social media companies to court.
But some experts say such efforts won't make social media any safer. Instead, they fear the moves might infringe on people's privacy and First Amendment rights, while potentially making the platforms harder for everyone to use.
"This is global media, and trying to regulate it at the micro level – the fear for a lot of people is that we're going to end up with different rules for different states, which is just going to undercut the whole promise and potential of internet-based media and communication," said Kevin Goldberg of the Freedom Forum, a nonprofit aimed at protecting First Amendment rights.
Some social media disputes are playing out at the federal level. Last week, the U.S. Supreme Court upheld a bipartisan federal law banning TikTok, a popular video-sharing platform, unless its China-based parent company agreed to sell the app. The ban briefly went into effect before President Donald Trump, who had tried unsuccessfully to ban TikTok by executive order in his first term, signed an executive order delaying enforcement for another 75 days.
But absent other federal action to curb social media's effects on young people, many states are considering new legislation. In New York, a law enacted in June prohibits social media platforms from providing so-called addictive feeds to minors without parental consent. New York Attorney General Letitia James, a Democrat, is drafting regulations to enforce the law.
Social media feeds are designed to keep kids scrolling longer and longer to drive up ad revenue, noted state Democratic Sen. Andrew Gounardes, who sponsored the legislation. Kids who are addicted to social media suffer mental health issues, and people who spend more time scrolling tend to struggle to navigate real-life relationships, he argued.
"So social media, for all the positives that might exist, has some real, deeply negative and dark downsides that we are finally seeing manifest, and we have to reconcile it," Gounardes said.
But tech developers are concerned new state laws could weaken privacy protections for users, take away online mental health resources for marginalized communities and restrict the flow of online information, said Paul Lekas, the senior vice president and head of global public policy and government affairs at the Software & Information Industry Association, a trade association representing the digital content industry.
"The bills are all different, so it's hard to say that all of them are good or all of them are bad," Lekas said. "But a lot of concerns come up in a number of these bills."
Age restrictions
Some research suggests that excessive social media use is worsening young people's mental health. Teens who spend the most time on social media are significantly more likely to exhibit negative emotions, such as sadness and anger, according to a 2023 Gallup survey.
A Florida law that went into effect this month prohibits kids who are under 14 from having social media accounts. A user who is 14 or 15 has to get parental consent before starting an account.
Ashley Moody, Florida's Republican attorney general at the time, agreed not to enforce the law while a lawsuit alleging it would restrict minors' freedom of speech plays out. Moody was sworn into the U.S. Senate this week to replace Sen. Marco Rubio, the new U.S. secretary of state.
More measures are expected across the country during 2025 legislative sessions.
A new bill in one state would prohibit anyone under the age of 16 from creating social media accounts without verified parental permission. A similar bill introduced in another state would set the age limit at 18, while a prefiled bill elsewhere would set it at 13.
To verify age, some apps may require all users to upload a photo of their ID. This could be of particular concern for adult users who would have their full legal identity tied to their social media account, said Ash Johnson, a senior policy manager at the Information Technology & Innovation Foundation, a think tank focused on public policy surrounding technology.
Rather than an outright ban on social media accounts for users under a certain age, increasing transparency and accountability measures for social media developers would improve the safety of the apps, Johnson said.
She pointed to California as an example. The state's Age-Appropriate Design Code Act was partially blocked from enforcement by a federal appeals court last year. It would have required companies to ensure that online services likely to be accessed by children are designed to eliminate the risk of harm to them.
Parental controls, Johnson said, also could make it easier for parents to oversee their child鈥檚 media presence by deciding what content they can access.
Instagram's new teen accounts, for example, automatically place teenage users into an account that limits who can contact them and the content they see, and anyone under the age of 16 has to get parental permission before changing any of the safety features.
"It would give children a really customizable experience on social media depending on their individual developmental needs," Johnson said.
Many of the laws around the country are specifically designed to prevent younger people from accessing either certain online content or entire social media platforms, said Goldberg, of the Freedom Forum. Changing the way social media developers control who can and can't have an account could change what people see on their feeds.
"We've seen a lot of this, especially at the state level, which is concerning," he said. "Many of the laws that we are seeing proposed – and even passed – raise First Amendment concerns."
States go to court
States also are turning to lawsuits to address social media effects on young people.
In October, attorneys general in California, Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, New Jersey, New York, North Carolina, Oregon, South Carolina, Vermont, Washington and the District of Columbia sued TikTok, alleging violations of state consumer protection laws.
Led by California Democratic Attorney General Rob Bonta and James of New York, the lawsuits allege that TikTok exploits and harms young users and deceives the public about the social media platform鈥檚 dangers.
Texas Republican Attorney General Ken Paxton filed a similar suit that same month accusing TikTok of violating a state law protecting children online. The law prohibits digital service providers from sharing, disclosing or selling a minor's personal information without permission from a parent.
TikTok has disputed the claims, calling them "inaccurate and misleading" in a statement. The company says its platform is safe for kids and offers time limits and parental controls.
States have also taken aim at Snapchat and Meta. In September, New Mexico Attorney General Raúl Torrez, a Democrat, filed a lawsuit against Snap Inc., Snapchat's parent company, alleging that the app's developers ignored reports of sextortion, failed to implement age-verification rules, maintained features that connect minors with adults, and more.
And in 2023, more than 40 states sued Meta, claiming Instagram and Facebook worsened the youth mental health crisis.
The social media companies need to be held accountable, said Julie Scelfo, founder of Mothers Against Media Addiction.
Scelfo, a career journalist who covered youth mental health for years, said she was disturbed to learn that more and more young children were contemplating suicide as social media became more mainstream.
"Social media can connect people for positive things, but it has also been a very convenient conduit for all of the worst forces in society," Scelfo said.
But tech companies are winning some fights – and going on the offensive.
In addition to the partial block of the Age-Appropriate Design Code Act, a federal judge has blocked until Feb. 1 another California law designed to protect children from addictive feeds. The Protecting Our Kids from Social Media Addiction Act would prevent social media platforms from providing minors with "personalized feeds."
Across the states, companies are challenging dozens of laws restricting social media – and in some cases, they're winning.
"I think that shows that courts are skeptical that either there's no proof behind the goals of the legislators or that they're not being precise enough," Goldberg said. "So, I'm skeptical. I don't think this is going to help because there will always be ways for children to access content on the internet or social media – it's almost impossible to truly enforce."
Stateline is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Stateline maintains editorial independence. Contact Editor Scott S. Greenberger with questions: [email protected].