Mary Rodee, whose 15-year-old son died by suicide, stood outside the Los Angeles Superior Court on March 25, 2026, beneath a banner naming victims. She was there after a jury found Meta and YouTube liable for designing features that harmed a young woman and awarded her $6 million.
The verdicts reflect a legal strategy that treats social apps and platforms as products whose design choices can create liability — an approach that has gained traction after decades of broad immunity for tech companies under Section 230 of the Communications Decency Act. That 1996 law generally shields platforms from being sued over user-posted content, a protection that helped doom earlier cases like Matthew Herrick’s 2017 suit against Grindr. Herrick’s lawyer, Carrie Goldberg, argued the dating app had effectively been sold as a defective product because it failed to stop harassment; courts dismissed the case under Section 230 and her appeals failed.
Since then, however, courts have increasingly entertained arguments that companies can be held responsible for how they design and monetize their services. Goldberg and other attorneys have pursued product-liability theories similar to tactics used against Big Tobacco in the 1990s. Those efforts have produced notable results: a 2021 suit against Omegle over child sexual exploitation led to the site’s shutdown as part of a settlement, and an appeals court allowed a lawsuit against Snapchat over an in-app speed filter linked to fatal crashes to proceed, prompting a 2023 settlement.
Last week’s high-profile wins in Los Angeles and New Mexico underscored the momentum. In Los Angeles, jurors concluded Meta and Google’s YouTube built addictive features that contributed to a young woman’s mental-health decline and ordered $6 million in damages. In New Mexico, a jury found Meta liable and ordered the company to pay $375 million for failing to protect children from predators; a second phase of that trial will consider whether Meta created a public nuisance, which could bring additional penalties and design-change orders.
Advocates and lawyers say these rulings could change how platforms operate and catalyze broader policy shifts. Sarah Gardner of the Heat Initiative, an online child-safety group, said the verdicts have “created a different playing field,” combining legal pressure with the prospect of regulatory action and public scrutiny. Matthew Bergman of the Social Media Victims Law Center, who represented the plaintiff in the Los Angeles case, argued that forcing companies to internalize the costs of harm is the only reliable way to make safety a priority.
Meta and Google have signaled they will appeal. Meta contends that teen mental-health harms cannot be causally tied to a single app, while Google has argued that YouTube is not a social network. Legal experts expect extended appeals that could ultimately reach the U.S. Supreme Court and reshape how Section 230 and product-design liability are interpreted.
The litigation wave extends well beyond social media: thousands of related suits are pending against platforms, games, gambling apps and AI chatbots. Moody’s has tracked more than 4,000 cases targeting 166 companies for allegedly designing addictive software. Within days of the Los Angeles verdict, a Massachusetts lawsuit accused sports-betting firms DraftKings and FanDuel of fostering addiction through personalized offers and nudges; DraftKings said it will “vigorously defend” itself. Bergman’s firm has also sued OpenAI and other chatbot makers, alleging their tools have contributed to mental-health crises; OpenAI called the cases heartbreaking and said it is working with mental-health experts to improve chatbot responses.
Lawyers and advocates note that even substantial verdicts may be small relative to the tech giants’ valuations, but they say financial penalties can still change incentives over time. They also caution that one verdict won’t immediately rewrite industry practices; sustained legal pressure, regulation and consumer advocacy together are likelier to drive systemic change, as happened with tobacco.
Relatives of victims, including Lori Schott, left the Los Angeles courthouse carrying portraits of their loved ones after the decision. Whether these trials produce quick reform or a prolonged legal and political struggle, they have already reshaped the debate about platform responsibility and created new legal pathways for holding tech companies accountable.
Disclosure: Google is a financial supporter of NPR.