Mary Rodee, whose 15-year-old son died by suicide, stands beside a banner naming victims outside Los Angeles Superior Court on March 25, 2026, after a jury found Meta and YouTube liable for designing social media in ways that harmed a young woman and awarded her $6 million. (Frederic J. Brown/AFP via Getty Images)
In 2017, Matthew Herrick sued the dating app Grindr after an ex-boyfriend used fake profiles to harass him and send strangers to his home. His lawyer, Carrie Goldberg, argued the app was a defective product, since Grindr itself claimed it could not stop the abuse. The case was dismissed under Section 230 of the Communications Decency Act, a 1996 federal law that shields online platforms from liability for user-posted content.
Goldberg says she and Herrick lost every appeal. But in the years since, courts have grown more receptive to the idea that companies can be held responsible for how they build and monetize their products — the core argument Goldberg made against Grindr. “Decisions about how apps work and are monetized are things that, in my mind, the platform should be liable for if they get it wrong and injure somebody,” she said.
That product-liability approach mirrors tactics used against Big Tobacco in the 1990s, and it has recently produced notable legal wins. In 2021, Goldberg sued Omegle over child sexual exploitation concerns; the site later shut down as part of a settlement. The same year, an appeals court rejected a Section 230 dismissal and allowed a lawsuit to proceed against Snapchat over a speed filter linked to fatal crashes; Snapchat settled in 2023.
Last week, juries in Los Angeles and New Mexico handed high-profile victories to plaintiffs who argued social platforms harmed children. In Los Angeles, jurors found Meta (owner of Instagram) and Google’s YouTube deliberately engineered addictive features that contributed to a young woman’s mental-health problems and awarded her $6 million. In New Mexico, a jury ordered Meta to pay the state $375 million for failing to protect young users from child predators; a second trial phase in May will consider whether Meta created a public nuisance and could lead to further penalties and orders to change app design.
Advocates and lawyers say these rulings mark a turning point. Sarah Gardner of the Heat Initiative, an online child-safety advocacy group, said the verdicts “have created a different playing field” and could build momentum for change beyond the courtroom. Many expect Meta and Google to appeal, and some foresee the Supreme Court eventually addressing the legal theory behind these cases. Meta argues teen mental health cannot be tied to a single app; Google contends YouTube is not a social network.
Thousands of similar lawsuits are moving through state and federal courts, and new complaints target video games, gambling apps and AI chatbots. Moody’s has identified more than 4,000 pending cases against 166 companies alleging addictive software design. Within days of the LA verdict, a Massachusetts suit accused sports-betting firms DraftKings and FanDuel of fostering gambling addiction through personalized bonuses and nudges to keep users betting. “It’s personalized itself to you,” said Jennifer Hoekstra, who represents the plaintiff and was involved in the LA social-media case. DraftKings said it will “vigorously defend” itself.
Lawyers and advocates hope courtroom wins will spur policy changes that have long stalled. “If you go and look at what really changed the tobacco industry, it wasn’t one thing, it was everything together,” Gardner said, arguing that regulatory action, public pressure and legal liability together shifted industry behavior. Matthew Bergman of the Social Media Victims Law Center, which represented the plaintiff in the LA trial, said the only way to force companies to prioritize safety is to internalize the cost of harm. His firm has also sued OpenAI and other AI-chatbot makers, alleging the tools have contributed to mental-health crises and suicides; OpenAI says the cases are heartbreaking and that it is working with mental-health experts to improve chatbot responses to signs of distress.
Bergman acknowledged that the financial awards so far are small relative to the tech giants’ multitrillion-dollar valuations, but he said the verdicts send a message: hit their pocketbooks and their behavior will follow. Meta and Google plan to appeal the rulings, and legal experts predict a protracted series of appeals that could reach the nation’s highest court, potentially reshaping Section 230 interpretations and companies’ exposure for product design choices.
Relatives of victims, Lori Schott among them, left the Los Angeles courthouse carrying portraits of their loved ones after the decision. Advocates say the litigation could accelerate other efforts to make apps safer for children — from new laws to platform changes — but they also caution that a single verdict won’t change business incentives overnight.
Google is a financial supporter of NPR.