While a majority of the U.S. public supports banning teen social media use, regulation is blocked by 'whataboutism': a lobbying tactic of raising endless hypothetical objections (e.g., that teens will circumvent bans with VPNs, or that age verification threatens privacy) until legislative paralysis sets in and nothing gets done.
The problem with social media isn't free speech itself, but algorithms that amplify misinformation because it drives engagement. A targeted solution is to remove Section 230 liability protection *only* for content that platforms algorithmically boost, holding them accountable for their editorial choices without resorting to broad censorship.
Pinterest's CEO argues that social media companies should establish common safety standards, akin to crash-test ratings for cars. Such standards would let companies differentiate themselves and build brands around user well-being, turning a regulatory burden into a proactive, market-driven competitive advantage.
Relying solely on parents to manage kids' social media use is flawed. When a single child is taken off platforms like Snapchat, that child isn't protected; they're ostracized from their peer group. Because of this network effect, only collective action through legislation can effectively address the youth mental health crisis.
The core business model of dominant tech and AI companies is not just about engagement; it's about monetizing division and isolation. Trillions of dollars in shareholder value are now directly tied to separating young people from one another and from their families, creating an "asocial, asexual youth," and that dynamic amounts to an existential threat.
Entrepreneurs often see the kids' market as less crowded and thus easier to enter. The reality is the opposite: it's less crowded because it's significantly more complex, with far more laws and regulations (like COPPA) that founders must navigate to survive.