According to The Wall Street Journal, a landmark personal-injury trial began this week in Los Angeles Superior Court, centered on a young woman known as K.G.M. who claims her use of social media as a teen led to body dysmorphia, suicidal thoughts, anxiety, addiction, and depression. This is the first of over 5,000 similar lawsuits to go to trial, with Meta, TikTok, Snap, and YouTube facing more than 3,000 cases in California state court and another 2,000-plus in federal court. The plaintiffs allege the companies built addictive features like infinite scroll and algorithmic recommendations that harm teens. Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri are expected to testify in the six-week trial, though Snap settled with the plaintiff last week. The jury’s decision could influence a wave of settlements and challenge the legal immunity platforms have long enjoyed.
The Business Model on Trial
Here’s the thing: this case isn’t really about one person. It’s a direct attack on the engagement-driven business model at the core of social media. The plaintiffs’ lawyers argue that features like autoplay and algorithmic feeds aren’t neutral tools; they’re deliberately engineered to be addictive, especially for developing adolescent brains. And the companies’ defense is telling. They’re not just saying “we didn’t do it.” They’re arguing that holding them liable would “make it impossible to have free expression and open communication.” That’s a huge, almost existential claim. It frames any regulation or accountability as a threat to the very idea of social media itself. But is that true, or is it just a convenient shield for a profitable status quo?
The Section 230 End-Run
This is where it gets legally fascinating. For decades, Section 230 of the Communications Decency Act has been a magic shield for internet platforms: it says, in essence, that they aren’t liable for what their users post. The companies tried to get this case thrown out using that shield, but Judge Carolyn Kuhl said no. Why? Because this case isn’t (just) about the content; it’s about the product design. The jury has to decide whether the harm was caused by third-party posts or by the platforms’ own features, like infinite scroll and push notifications. This is a clever, and potentially groundbreaking, legal strategy. If the plaintiffs win here, it blows a massive hole in the traditional Section 230 defense for a whole new class of lawsuits. No wonder one expert called it “their next shot.”
A Reckoning Long Coming
Look, the public mood has shifted. When a WSJ poll shows 71% of respondents support banning most social media for kids under 16, you know the era of blind trust is over. The legal playbook is straight out of the Big Tobacco and opioid litigation files: find internal documents showing the companies knew about the potential for harm. Lead lawyer Mark Lanier says many such documents will be shown at trial for the first time. That’s a big deal. And the fact that Snap settled right before the trial started? That’s not a sign of confidence. It feels like the beginning of a long, ugly, and expensive reckoning. The companies can talk all they want about “safeguards” and “age-appropriate experiences,” but a jury is about to see the raw evidence. That changes everything.
