
A jury says Meta and Google hurt a kid. What now?

Decoder with Nilay Patel · Apr 2, 2026

Landmark verdicts holding Meta and Google liable for platform addiction challenge Section 230 by targeting product design rather than content, sparking a legal and political debate over how social media can be regulated.

User Addiction Is a Core Design Goal for Social Media, Not a Side Effect

Social media platforms treat user addiction as a key performance indicator, employing cognitive scientists to engineer products that maximize engagement. Users who blame themselves for being unable to log off are not in a fair fight; they are playing a "rigged game" designed by experts to capture their attention.

Calls to Repeal Section 230 After Recent Verdicts Are Politically Incoherent

Politicians are using anti-tech verdicts to demand a repeal of Section 230, but the logic is flawed. Abolishing the law would force platforms to become hyper-aggressive in their content moderation to avoid liability, directly contradicting the "free speech" goals these same critics often claim to support.

Regulating Algorithmic Personalization Is Tech's Hardest First Amendment Challenge

While features like autoplay can be separated from speech, algorithmic personalization is much closer to protected editorial discretion. Attempts to regulate how platforms recommend content—the likely cause of many user harms—will face severe First Amendment challenges, making it the thorniest issue for policymakers.

Recent Verdicts Are Bellwether Trials, Unleashing a Wave of Social Media Lawsuits

The wins against Meta and Google are not isolated events but "bellwether" cases that have opened the floodgates for litigation. With this new product liability strategy validated, a massive pipeline of over 1,500 similar lawsuits from individuals, schools, and states is now set to move forward, posing an existential risk to the platforms.

Social Media Lawsuits Target Addictive Product Design to Bypass Section 230 Protections

Recent verdicts against Meta and Google succeed by framing the problem as "defective product design" (like autoplay and infinite scroll) rather than harmful user content. This novel legal strategy circumvents the broad immunity that Section 230 of the Communications Decency Act typically provides to tech platforms.

Section 230 Is Defended Based on a Failed Policy Goal from the AOL Era

The original vision for Section 230 was to foster a competitive marketplace of user-controlled moderation tools, a world that never materialized. Defending the 30-year-old law today means protecting an unrealized policy goal from a completely different technological era, raising questions about its continued relevance.

Defining "Safe Social Media" Is Nearly Impossible, Complicating Any Regulatory Fix

Even if platforms agree to make changes, there's no industry or societal consensus on what constitutes "safe social media." It's unclear if removing specific features like autoplay or infinite scroll would actually improve mental health, making it difficult for companies to address liability or for regulators to craft effective rules.

Tech's Trust & Safety Profession Has Been Demoted from Advocate to Compliance Function

The Trust & Safety field, once a powerful internal voice for user rights and ethical principles, has been systematically weakened. To appease political pressures, tech companies have pushed out vocal advocates, reducing the role to a mere compliance function and leaving platform governance to the whims of their leaders.

Tech Giants Lose in Court Because Jurors Have Personal Experience with Social Media's Harms

Social media addiction lawsuits resonate deeply with jurors because nearly everyone knows someone struggling with compulsive platform use. This shared negative experience makes juries highly sympathetic to plaintiffs' claims that the products are inherently flawed, neutralizing corporate arguments about statistical user satisfaction.
