In Landmark Social Media Trial, Big Tech Faces the Big Tobacco Playbook | Analysis

A bellwether case accusing Meta and YouTube of addicting young users and harming their mental health could reverberate through courtrooms for years to come

A scene from "Social Studies", Episode 3 'Peer/Algorithm Pressure' (Credit: Lauren Greenfield/INSTITUTE)

It can feel like everyone is addicted to their phones, scrolling Instagram and TikTok, sometimes oblivious to the world around them. But can it be proven in court that tech giants like Meta and YouTube deliberately hooked young people on their products, fueling anxiety, depression and body dysmorphia?

That’s the argument in a landmark trial now underway in Los Angeles, where Instagram chief Adam Mosseri testified Wednesday that social media use is not a “clinical addiction,” while Meta CEO Mark Zuckerberg and YouTube CEO Neal Mohan are expected to appear in the coming weeks.

This bellwether trial is one of thousands of lawsuits alleging that Meta, the parent of Facebook and Instagram, along with YouTube, TikTok and Snap, has addicted young people through engagement-driven design and caused them serious harm. The plaintiffs are borrowing the playbook used against Big Tobacco in the 1990s, arguing that the companies deliberately hooked young people on products they knew could cause harm. Mark Lanier, who represents the plaintiff in this case, a 20-year-old California woman referred to as Kaley G.M., displayed three children’s blocks in his opening statement on Monday and told jurors: “This case is as easy as ABC. Addicting, brains, children.”

Concerns about what social media is doing to young people have been mounting for years, explored in doc-series like Lauren Greenfield’s “Social Studies” and books such as Jonathan Haidt’s “The Anxious Generation,” the latter helping propel efforts around the country to restrict phones in schools. The issue has also drawn sustained scrutiny on Capitol Hill, where Mosseri and Zuckerberg have faced questions in the past about child safety and mental health. Now, with deep public skepticism of Big Tech reflected in recent polling, those fears are being put before a jury in Los Angeles — and the verdict could reverberate in courtrooms for years to come. 

“Having a jury is obviously a dice roll for the companies,” Kate Klonick, an associate professor of law at St. John’s University, told TheWrap. “A jury could come back and say the standard was high, but the plaintiffs met it.” But if the companies prevail, she added, it could reinforce the idea that even if social media addiction is in the zeitgeist, it may not be a viable civil claim.

Both Snap and TikTok settled with the plaintiff last month, while Meta and YouTube have opted to fight in court, arguing that mental health troubles cannot be directly tied to their products. The case is the first of nine scheduled to go to trial in California state court, and it comes as New Mexico’s attorney general brought a separate case to trial this week against Meta, alleging the company failed to protect young people from sexual exploitation. Federal cases brought by school districts and state attorneys general are also slated to go to trial this summer.

The Los Angeles case also tests the limits of Section 230 of the Communications Decency Act, which has largely shielded tech companies from liability for user-generated content posted on their platforms. But the allegations here focus less on what users shared and more on the companies’ own product design choices, including features like infinite scroll and autoplay.

Klonick said that tech companies have long argued their algorithms reflect editorial judgment, like a newspaper deciding what goes on the front page, and therefore are protected by the First Amendment. While courts have largely accepted that, she said the question is whether platforms can avoid liability over product design choices alleged to be harmful.

“It’s hard to say, this is user speech and simultaneously this is our speech,” Klonick said. “They’re confronting that at this moment.”

The plaintiff’s attorney, Mark Lanier, arrives at Los Angeles County Superior Court this week. (Ethan Swope/Getty Images)

Klonick also stressed that the Los Angeles and New Mexico cases are procedurally and substantively different: one is brought by an individual plaintiff seeking damages for alleged mental health harm, while the other is brought by a state attorney general alleging Meta failed to protect children.

Still, the fact that they’re unfolding simultaneously has been framed as a “split screen of Mark Zuckerberg’s nightmares,” as Sacha Haworth, executive director of the Tech Oversight Project, put it in a statement this week.

“These are the trials of a generation; just as the world watched courtrooms hold Big Tobacco and Big Pharma accountable, we will, for the first time, see Big Tech CEOs like Zuckerberg take the stand,” she said, adding: “The world is watching, Meta’s reckoning has arrived.”

Inside the Los Angeles courtroom

On Monday, Lanier described YouTube and Instagram as “digital casinos” designed to grab children’s attention and keep them swiping, according to the New York Times. “They didn’t just build apps, they built traps,” Lanier said. “They didn’t want users, they wanted addicts.”

Lanier pointed to internal company documents in which executives discussed seeking out young users. “If we want to win big with teens, we must bring them in as tweens,” read a 2018 Meta document. The plaintiff says she started accounts on YouTube at age 8 and Instagram at 9.

Meghann Cuniff, an independent journalist covering the trial for her site, Legal Affairs and Trials, told TheWrap she was particularly interested in the case given the major tech giants involved, as well as the presence of Lanier, who has won major cases against pharmaceutical companies over opioid addiction and defective products.

“It’s really a unique thing in what he’s doing, taking the model for the tobacco addiction cases, the opioid cases, and then applying that to technology and social media,” she said.

In the 1990s, state attorneys general filed lawsuits against the major tobacco companies, efforts that culminated in the 1998 Master Settlement Agreement, which required the companies to pay $206 billion over 25 years to reimburse states for smoking-related health-care costs.

Meta’s lawyer Paul Schmidt acknowledged the plaintiff “experienced mental health struggles,” but questioned whether Instagram was the cause. The Times noted that Schmidt also “presented the jury with health records and text messages that showed [the plaintiff] had been verbally and physically abused by her mother and abandoned by her father,” and that she began seeing a therapist at age 3. 

Meta spokesman Andy Stone told TheWrap that the question for jurors “is whether Instagram was a substantial factor in the plaintiff’s mental health struggles,” and that “the evidence will show she faced many significant, difficult challenges well before she ever used social media.”

Mosseri said in his testimony on Wednesday that “there’s always a trade-off between safety and speech” and that the company tries “to be as safe as possible and censor as little as possible.”

Cuniff reported that the plaintiff, who appeared briefly in the courtroom Monday, will not be present for most of the proceedings, but will testify later in the trial. “You’ll get to hear from her when she comes to testify,” Lanier told the jury.

Instagram head Adam Mosseri testified Wednesday that Instagram is not “clinically addictive.” (Ethan Swope/Getty Images)

YouTube lawyer Luis Li argued in opening statements Tuesday that the video app is designed for entertainment, like Netflix, rather than social media. “It’s not trying to get in your brain and rewire it,” he said. “It’s just asking you what you like to watch.” Li also noted that the plaintiff used YouTube for 29 minutes a day between 2020 and 2024, according to the Times, and spent just over a minute daily watching YouTube Shorts, which features endless scrolling.

“The Plaintiff is not addicted to YouTube and never has been,” Li said in a statement provided to TheWrap. “She, her father, and her doctor all swore to that. Medical records contain no such diagnoses, and the data proves she spent little more than a minute a day using the very features her lawyers claim are addictive.”

A spokesperson for Google, which owns YouTube, said the company has “built services and policies to provide young people with age-appropriate experiences, and parents with robust controls,” and that “allegations in these complaints are simply not true.”

Lanier questioned Mosseri about company decisions, presenting documents from 2019 in which executives urged Mosseri and Zuckerberg not to lift a ban on beauty filters. “We would rightly be accused of putting growth over responsibility,” one former executive told Mosseri, who, along with Zuckerberg, opted to reverse the ban anyway.

Zuckerberg is expected to take the stand on Feb. 18.

The New Mexico case

Meanwhile, another trial kicked off this week in Santa Fe, New Mexico, where state attorney general Raúl Torrez sued Meta in 2023, alleging that the company failed to safeguard children from sexual exploitation.

“New Mexico’s case,” as the AP noted, “is built on a state undercover investigation using proxy social media accounts and posing as kids to document sexual solicitations and the response from Meta.”

In Monday’s opening statements, prosecuting attorney Donald Migliori told jurors that “Meta clearly knew that youth safety was not its corporate priority” and that they would present evidence that Zuckerberg and Mosseri put profits ahead of safety.

Meta attorney Kevin Huff said Meta did not “deceive” users and disclosed risks.

Stone, the Meta spokesman, accused Torrez of running an “ethically compromised” investigation in a lengthy X thread, and said in a statement that the state attorney general has made “sensationalist, irrelevant and distracting arguments.”

“For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most,” Stone added. “We use these insights to make meaningful changes — like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better.” 

Torrez has also been making his case in the court of public opinion, telling CBS News’ Major Garrett that Section 230 would not shield Meta given the allegations.

“We’re not going after the content that’s posted by third-party participants on the platform,” Torrez said. “We’re instead holding them to account for the design features of the product and the misrepresentations that they’ve made about the safety of those products. Section 230 does not immunize them or shield them from liability for lying to the public, for lying to parents and young people, and it also doesn’t shield them from liability for the product choices and design choices that they’ve made in developing these platforms and allowing them to create these manifest dangers.”

“If successful,” Torrez added, “we’ll build a blueprint and a roadmap for policymakers and others who are concerned about this behavior, not only in the United States but around the world. And hopefully, we’ll bring a new era of accountability for Big Tech.”

Social media’s harms have been hotly debated in Congress and school boards for years. But what happens in the jury box could have even greater consequences for how we interact with these platforms in the future.
