Law360 (October 28, 2025, 10:24 PM EDT) -- Attorneys for Meta Platforms, YouTube, Snap and TikTok on Tuesday urged a Los Angeles judge to toss claims against them from an upcoming bellwether trial over the platforms' alleged harm to youth mental health, arguing that Section 230 of the Communications Decency Act should prevent many of the claims from reaching a jury.
During a daylong hearing, Los Angeles Superior Court Judge Carolyn B. Kuhl heard arguments on motions for summary judgment or summary adjudication filed by the companies ahead of a scheduled January bellwether trial focused on three cases in what could be the first of dozens of trials related to over 1,000 consolidated complaints.
The judge previously tossed claims alleging that specific content on the platforms caused harm, finding that Section 230 and the First Amendment barred them, but allowed claims that certain design features of the platforms are harmful to move forward. Attorneys for the companies frequently argued on Tuesday that many of the plaintiffs' claims as alleged still fall within the protections of Section 230.
Ashley Simonsen of Covington & Burling LLP, who represents Meta, showed the judge deposition testimony of a plaintiff identified as K.G.M., where she said content she viewed on some of the platforms caused her eating disorder, and where she agreed with an attorney's question that the disorder came from "the content you received on those sites."
"She attributes her harm in terms of the orthorexia and body dysmorphia to specific content," Simonsen said, adding that a child psychologist serving as an expert for the plaintiffs, Dr. Kara Bagot, said "the same thing."
Simonsen then showed the judge more deposition testimony where K.G.M. attributed her suicidal thoughts and self-harm to content she saw on Instagram and YouTube.
"So we know that of the numerous features this plaintiff is challenging, she cannot bring a claim for content recommendation algorithms," Simonsen said, adding that such claims are barred under Section 230.
K.G.M. also said she suffered harm by engaging in an "endless scroll" on Instagram, but Simonsen argued that her harm cannot be attributed to scrolling alone, but rather to the content she was viewing while scrolling, because you "can't have an endless scroll without content."
Simonsen also argued the judge previously held that Section 230 may not apply to "endless scroll" if that feature operated to addict and harm minor users of the platform, but that K.G.M. "does not claim any harm from endless scroll independent of the content, and neither does Dr. Bagot. So endless scroll ... cannot go to a jury in terms of being a basis for K.G.M.'s claims."
While the defendants made other arguments on Tuesday against the claims, including that some were filed after the statute of limitations had run, Section 230 was repeatedly cited by some of the defendants' attorneys.
Josh Autry of Morgan & Morgan, who represents the plaintiffs, told the judge that the defendants' argument in the K.G.M. case "is that if user content is a partial cause of a harm, then that harm cannot be recovered for. That's a reworking of the but-for test."
He went on to argue that the defendants' desired test is whether harm is partially caused by user content.
"And if so, they believe that Section 230 is an absolute bar," he said. "This provision would give defendants a unique position under the law. It is a position that police officers don't have, prosecutors don't have. They want a type of immunity, absolute immunity, that does not exist in really any other context."
The three cases scheduled for the first bellwether trial are among over 1,000 that have been coordinated in Los Angeles Superior Court, alleging that the social media companies purposely induce young people to compulsively use their products, according to the personal injury master complaint filed in 2023.
Judge Kuhl tossed some claims from the master complaint, but allowed failure-to-warn claims to proceed alongside negligence and concealment allegations.
Similar multidistrict litigation is pending in the U.S. District Court for the Northern District of California.
The plaintiffs are represented by Joseph G. VanZandt and Davis Vaughn of the Beasley Allen Law Firm, Mariana A. McConnell, Paul R. Kiesel and Cherisse H. Cleofe of Kiesel Law LLP, Rachel Lanier of The Lanier Law Firm PC, Josh Autry of Morgan & Morgan and Rahul Ravipudi, Brian J. Panish, Ian Samson and Jesse Creed of Panish Shea Ravipudi LLP.
Meta is represented by Alexander L. Schultz, Paul W. Schmidt, Ashley Simonsen and Michael Imboscio of Covington & Burling LLP.
Snap is represented by Jonathan H. Blavin, Rose Leda Ehler, Victoria A. Degtyareva and Ariel Teshuva of Munger Tolles & Olson LLP.
TikTok is represented by David M. Mattern and Geoffrey M. Drake of King & Spalding LLP.
YouTube is represented by Christopher C. Chiou, Matthew K. Donohue, Samantha Machock, Lauren Gallo White and Brian M. Willen of Wilson Sonsini Goodrich & Rosati PC, Joseph G. Petrosinelli and Ashley W. Hardin of Williams & Connolly LLP, and Yardena R. Zwang-Weissman, Brian Ercole and Stephanie Schuster of Morgan Lewis & Bockius LLP.
The state case is Social Media Cases, case number JCCP 5255, in the Superior Court of the State of California, County of Los Angeles.
For a reprint of this article, please contact reprints@law360.com.