SUMMARY
Do social media companies try to hook children on their products? Do they fail to adequately protect those children from harmful content, predators, and exploitation? Millions of parents would probably agree with the juries in California and New Mexico that recently answered those questions with a resounding yes. As a result, one young plaintiff was awarded $6 million from YouTube and Meta in one case, and Meta was ordered to pay $374 million in civil penalties in the other. Meta and YouTube have, of course, vowed to appeal. Despite growing awareness of the risks children and teens face online, new legislation on children’s online safety remains stalled in Congress, and a robust regulatory system is nowhere in sight.
ANALYSIS
Thousands of families of distressed teens are suing Meta, TikTok, Google, and other social media companies, alleging that their children were deliberately hooked on platforms that disregarded their safety, subjecting them to exploitation and harmful content.
Social media companies have long ducked responsibility for the content on their platforms, shielded by Section 230 of the Communications Decency Act of 1996, which exempts them from the liability standards applied to actual publishers, who are responsible for what they publish. Two recent suits took a different approach to establishing liability.
In the California case, a young woman claimed she had become hooked on social media at an early age and, as a result, suffered from depression and anxiety. The plaintiff’s evidence focused on how the platforms were designed, showing that the companies knew full well that features like infinite scroll, push notifications, and algorithmic amplification would help hook young people, and that hooking them was precisely the goal.
The youth market generates billions in profits, and advertisers know the easiest way to reach young people is on social media. According to a survey by the Pew Research Center, 36% of U.S. teens report using TikTok, YouTube, Instagram, Snapchat, and/or Facebook “almost constantly.”
A day before the California verdict, a jury in New Mexico found Meta liable for failing to protect young people from online harms such as sexual predators, in breach of the state’s consumer protection laws. Dozens of other states have filed similar cases, and the financial stakes are higher than in individual suits: in the New Mexico case, Meta was ordered to pay $374 million in civil penalties.
Evidence presented by the New Mexico attorney general included internal Meta documents and testimony from former Meta employees demonstrating how Meta’s design features enabled child exploitation by pedophiles and predators. The attorney general also presented evidence of Meta’s intentional strategies to hook young people on its products.
“These products were purposefully designed to harm and addict millions of young people, and lead to lifelong mental health consequences,” commented Sacha Haworth, Executive Director of The Tech Oversight Project, in a press release about the verdict. She urged passage of the Kids Online Safety Act, now making its way through Congress.
Known as KOSA, the bill would establish a “duty of care” requiring platforms to act to prevent harmful activities like cyberbullying and sexual exploitation, strengthen privacy protections for minors, and give parents and children better ways to opt out of addictive algorithmic recommendations.
KOSA is controversial and has attracted the ire of free speech advocates and LGBTQ groups that fear the “duty of care” provisions will result in censorship. The Electronic Frontier Foundation questioned its likely effectiveness, saying, “This bill won’t bother big tech. Large companies will be able to manage this regulation, which is why Apple and X have agreed to support it. In fact, X helped negotiate the text….”
Age limits and parental controls have turned out to be largely ineffective, shifting even more responsibility away from the companies, even as they continue to rake in huge profits from their youngest subscribers. As is often the case, the European Union is considering a much stronger approach with proposals for:
- Complete bans on the most harmful addictive practices for minors, and automatic disabling of addictive features such as infinite scrolling, auto-play, and reward loops,
- Actions to rein in targeted ads, influencer marketing, and addictive design, and
- Bans on engagement-based recommendations for minors, and protection from commercial exploitation by prohibiting the offering of financial incentives to minors, a practice now common among “kidfluencers.”
ENGAGEMENT RESOURCES
- What is Technology Addiction? https://www.psychiatry.org/patients-families/technology-addictions-social-media-and-more/what-is-technology-addiction
- The Tech Accountability Project https://law.yale.edu/mfia/projects/tech-accountability-project
- Kids Online Safety Act https://www.congress.gov/bill/119th-congress/house-bill/7757/text
- The Kids Online Safety Act Will Make the Internet Worse for Everyone by Joe Mullin, May 15, 2025, https://www.eff.org/deeplinks/2025/05/kids-online-safety-act-will-make-internet-worse-everyone