Companies like Facebook and YouTube feign precise control, but their use of blunt instruments—like banning all political ads or disabling all comments on certain videos—proves they can't manage content at a micro level and are struggling with the chaos of their own systems.
Instead of trying to identify and censor specific "bad" content, a more effective strategy is to use non-targeted, "soft" approaches. This involves temporarily deranking any content spreading too quickly and injecting randomness into recommendation algorithms to break up echo chambers and soften feedback loops.
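A minimal sketch of what such a "soft" approach might look like in code. Everything here is hypothetical (the function name, the `share_velocity` signal, and the specific thresholds are illustrative assumptions, not any platform's actual system): items spreading faster than a cap are temporarily deranked without inspecting their content, and a fraction of ranking slots is then randomly shuffled to soften feedback loops.

```python
import random

def soft_rerank(candidates, share_velocity, velocity_cap=5.0,
                explore_rate=0.2, rng=None):
    """Non-targeted 'soft' moderation over a ranked candidate list.

    candidates:     list of (item_id, relevance_score) pairs.
    share_velocity: dict item_id -> shares per hour (a hypothetical signal).
    Items spreading faster than velocity_cap get their score halved,
    regardless of what the content says; then a fraction of slots is
    randomly swapped to inject exploration into the feed.
    """
    rng = rng or random.Random()

    # 1) Derank anything spreading too quickly -- no content inspection.
    damped = [
        (item, score * 0.5 if share_velocity.get(item, 0.0) > velocity_cap
         else score)
        for item, score in candidates
    ]
    ranked = sorted(damped, key=lambda pair: pair[1], reverse=True)

    # 2) Inject randomness: swap a fraction of slots at random
    #    to break up echo-chamber feedback loops.
    n_explore = int(len(ranked) * explore_rate)
    for _ in range(n_explore):
        i, j = rng.randrange(len(ranked)), rng.randrange(len(ranked))
        ranked[i], ranked[j] = ranked[j], ranked[i]

    return [item for item, _ in ranked]
```

Note the design choice: because the deranking keys only on spread velocity, it never has to decide what counts as "bad" content, which is exactly the point of the non-targeted strategy.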
YouTube's content rules change weekly without warning. A sudden demonetization or age-restriction can cripple an episode's reach after it's published, highlighting the significant platform risk creators face when distribution is controlled by a third party with unclear policies.
Unlike historical propaganda, which relied on centralized broadcasts, today's narrative control is decentralized and subtle. It operates through billions of micro-decisions and algorithmic nudges that shape individual perceptions daily, achieving macro-level control without any overt displays of power.
Despite different political systems, the US and Chinese internets have converged because power is highly centralized. Whether it's a government controlling platforms like Weibo or tech oligarchs like Elon Musk controlling X, the result is a small group dictating the digital public square's rules.
Both tech and media are fundamentally about disseminating information. The internet gave tech platforms superior distribution, disrupting media's business model and its role as the primary shaper of public narrative. This created a power struggle over who controls what society sees and thinks.
Platform decay isn't inevitable; it occurred because four historical checks and balances were removed. These were: robust antitrust enforcement preventing monopolies, regulation imposing penalties for bad behavior, a powerful tech workforce that could refuse unethical tasks, and technical interoperability that gave users control via third-party tools.
Tyler Cowen's experience actively moderating his "Marginal Revolution" blog has made him more tolerant of large tech platforms removing content. Seeing the necessity of curation to improve discourse firsthand, he views platform moderation not as censorship but as a private owner's prerogative to maintain quality.
Instead of outright banning topics, platforms create subtle friction—warnings, errors, and inconsistencies. This discourages users from pursuing sensitive topics, achieving suppression without the backlash of explicit censorship.
Societal polarization is not just ideological but algorithmic. Social media platforms are financially incentivized to amplify divisive content because "enragement equals engagement," which drives ad revenue. This creates a distorted, more hostile view of reality than what exists offline.
Internet platforms like Weibo don't merely react to government censorship orders. They often act preemptively, scrubbing potentially sensitive content before receiving any official directive. This self-censorship, driven by fear of punishment, creates a more restrictive environment than the state explicitly demands.