Emily Field
December 26, 2025
Meta Buried Own Research On Youth Harm, Schools Say
AI-made summary
- School districts have filed claims in a multidistrict litigation alleging that Meta suppressed internal research indicating its social media platforms negatively impact young users' mental health, with staff comparing their roles to drug pushers.
- The districts assert that Meta prioritized growth over safety, failed to disclose harmful findings, and inadequately addressed child safety concerns.
- Meta denies the allegations, citing its efforts to protect teens and stating that internal studies did not conclusively demonstrate harm.
- The case is ongoing in the Northern District of California.
School districts are alleging that Meta buried internal research showing that compulsive use of its social media platforms harmed young users' mental health, even as staff likened themselves to drug pushers.
The schools said Friday that Meta and other social media companies knew students were hooked on their platforms, to the detriment of their mental health and with widespread effects on education. Meta researchers said that "we're basically pushers," that "Instagram is addictive" and that time spent on the platform is "having a negative impact on mental health," according to the school districts' unredacted filing.
"What Meta knew — and what it failed to reveal — should shock the conscience," the school districts said. "Internal studies conducted by Meta, both qualitative and quantitative, repeatedly confirmed that Instagram and Facebook use could lead to a panoply of negative outcomes for teens, including addiction, sleep disruption, anxiety, depression, negative appearance comparison and body image problems."
A chat between two user experience researchers excerpted in the brief said: "We're basically pushers" and "the top-down directives drive it all towards making sure people keep coming back for more. That would be fine if [it's] productive but most of the time it isn't. … The majority is just mindless scrolling and ads."
The brief is part of a multidistrict litigation, or MDL, involving personal injury plaintiffs, schools and attorneys general who claim that Meta Platforms Inc., YouTube LLC and other social media giants design their multibillion-dollar revenue-generating social media platforms to be addictive, to the detriment of minors' health and livelihoods.
"Like tobacco, this is a situation where there are dangerous products marketed to kids," plaintiffs co-lead counsel Previn Warren of Motley Rice LLC said in a statement.
A Meta spokesperson told Law360 that the allegations rely on "cherry-picked quotes and misinformed opinions."
"The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens — like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens' experiences," the spokesperson said. "We're proud of the progress we've made, and we stand by our record."
Regarding one internal study, the Meta spokesperson said on Bluesky over the weekend that it found that users who believed Facebook was bad for them stopped using it, which, Meta argued, makes intuitive sense but shows nothing about the actual effects of using the platform.
"Meta researchers were concerned about this going in. They spent months trying to design a study with the specific goal of overcoming what are called 'expectation effects,' the idea that beliefs and expectations influence perception," the spokesperson posted. "A pilot of the study ran. Researchers analyzed the results and found the study didn't overcome those expectation effects."
One of Meta's internal studies was an experiment started in late 2019, dubbed Project Mercury, that asked random users to stop using Facebook and Instagram for a period to see how they fared compared to those who kept using the platforms, according to the brief.
A Meta employee said: "if the results are bad and we don't publish and they leak, is it going to look like tobacco companies doing research and knowing cigs were bad and then keeping that info to themselves? I went … oh."
Which is what Meta did, the schools said.
The pilot results confirmed that people who stopped using Facebook for a week had lower feelings of depression and anxiety. But rather than conducting more research, or raising an alarm about the findings, Meta put a stop to the project on the grounds that participants' feedback was tainted by outside media narratives, according to the brief.
"Meta never publicly acknowledged these findings from its own in-house addiction specialists. Instead — just weeks later — Mark Zuckerberg testified before Congress and denied that Meta profits from creating addictive products."
Meta's top brass knew in 2015 that use among young people had started to decline, and the company chose to sacrifice safety to recapture that profitable audience in line with its motto "Move Fast and Break Things," the schools said.
For example, Zuckerberg directed employees to take part in a "lockdown sprint" to launch the Facebook Live feature, intentionally leaving out parental and teacher safeguards. Zuckerberg warned that telling adults "will probably ruin the product from the start," according to the brief.
After launch, Meta learned that teens were using Live to broadcast suicides and suicide attempts, a danger it had done nothing to prevent or warn about beforehand, the schools said.
"As one former Meta VP conceded, 'it was a growth imperative to make Live Video a product and release it, as opposed to slowing down and really trying to think through all the negative ways it could be used,'" the brief said.
One longitudinal study followed teenagers over the course of a school year, tracking their attentiveness while using Instagram (in other words, their habitual or unintended use) and their perceived ability to control their use, according to the brief.
Early results showed just over a third of the teens had low attentiveness while using Instagram, the school districts said. The study also found that parents and family were not responsible for the problem, according to the brief.
The results of this study were also never disclosed, the school districts said.
Meta has also lied about its "zero tolerance" approach to child sexual abuse material on its platforms, the schools said.
Meta has three AI tools to detect violations of its CSAM policies. But even when those tools determine with 100% confidence that a photo or video violates the policies, the content is not automatically removed, the districts said.
Internal documents show that child safety is not a Meta priority, the schools said, pointing to quotes such as: "Child Safety is explicitly called out as a non-goal in our H2 plans."
According to its second quarter report, Meta took action on five million pieces of child sexual exploitation content, nearly all of which was found before users reported it. Content goes through human review after being flagged by AI tools before being sent to the National Center for Missing and Exploited Children.
In 2021, ex-Facebook employee Frances Haugen famously testified before a Senate panel about Facebook's hidden internal research on the harms of social media on both children and adults.
"But behind closed doors, Mr. Zuckerberg was texting confidants, 'I'm not going to say [child safety] is my personal main focus when I have a number of other areas I'm more focused on like building the metaverse,'" the brief said.
The personal injury plaintiffs and school district plaintiffs are represented by Lexi J. Hazam of Lieff Cabraser Heimann & Bernstein LLP and Previn Warren of Motley Rice LLC.
Meta is represented by Ashley Margaret Simonsen and Paul William Schmidt of Covington & Burling LLP and James P. Rouhandeh of Davis Polk & Wardwell LLP.
The MDL is In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, case number 4:22-md-03047, in the U.S. District Court for the Northern District of California.