We scan new podcasts and send you the top 5 insights daily.
By slightly altering common words (e.g., "war" to "w-r"), social media accounts can increase engagement. The unusual spelling forces users to pause and reread, signaling to the algorithm that the content is engaging and thereby boosting its visibility, even if the comments are about the censorship itself.
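The character-level alteration described above can be sketched as a trivial string transform. This is purely illustrative, not any platform's actual behavior; the `obfuscate` helper and its vowel-to-dash rule are assumptions chosen to reproduce the "war" to "w-r" example.

```python
def obfuscate(word: str) -> str:
    """Replace interior vowels with dashes, e.g. "war" -> "w-r".

    Hypothetical rule: keep the first letter, dash out vowels after it.
    """
    return word[0] + "".join("-" if c in "aeiou" else c for c in word[1:])

print(obfuscate("war"))  # -> w-r
```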
Unlike historical propaganda which used centralized broadcasts, today's narrative control is decentralized and subtle. It operates through billions of micro-decisions and algorithmic nudges that shape individual perceptions daily, achieving macro-level control without any overt displays of power.
One of the most effective ways to boost online engagement is to make a deliberate, correctable error. The podcast notes that misspelling "Clawd bot" led to a flood of comments from users eager to correct the mistake, demonstrating that the internet's need to be right is a powerful growth hack.
Algorithms optimize for engagement, and outrage is highly engaging. This creates a vicious cycle where users are fed increasingly polarizing content, which makes them angrier and more engaged, further solidifying their radical views and deepening societal divides.
Extremist figures are not organic phenomena but are actively amplified by social media algorithms that prioritize incendiary content for engagement. This process elevates noxious ideas far beyond their natural reach, effectively manufacturing influence for profit and normalizing extremism.
A/B testing on platforms like YouTube reveals a clear trend: the more incendiary and negative the language in titles and headlines, the more clicks they generate. This profit incentive drives the proliferation of outrage-based content, with inflammatory headlines reportedly up 140%.
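The arithmetic behind a headline A/B test can be made concrete. The numbers below are invented for illustration, and the sketch assumes the cited "140%" figure refers to a relative lift in click-through rate between a neutral and an inflammatory variant.

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: fraction of impressions that led to a click."""
    return clicks / impressions

# Made-up numbers: two headline variants shown to 10,000 users each.
neutral = ctr(450, 10_000)
inflammatory = ctr(1_080, 10_000)

# Relative lift of the inflammatory variant over the neutral one, in percent.
lift = (inflammatory - neutral) / neutral * 100
print(f"lift: {lift:.0f}%")  # -> lift: 140%
```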
The addictiveness of social media stems from algorithms that strategically mix positive content, like cute animal videos, with enraging content. This emotional whiplash keeps users glued to their phones, as outrage is a powerful driver of engagement that platforms deliberately exploit to keep users scrolling.
The word "bop," once meaning a good song, was adopted by OnlyFans creators to describe their profession without being censored. This demonstrates "algospeak"—language evolving specifically to circumvent platform moderation, whether real or perceived.
Instead of outright banning topics, platforms create subtle friction—warnings, errors, and inconsistencies. This discourages users from pursuing sensitive topics, achieving suppression without the backlash of explicit censorship.
To avoid demonetization, creators use code words like "unalive" for "dead." This stems from advertisers' brand safety concerns, creating a "comically childish" communication style that is likely ineffective against sophisticated algorithms and frustrating for creators.
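The code-word substitution described above amounts to a lookup table applied before posting. This is a minimal sketch of the creator-side workaround, not any real tool; the `ALGOSPEAK` table and `algospeak` function are hypothetical, seeded with the examples mentioned in the episode.

```python
# Hypothetical algospeak substitution table, modeled on the examples cited.
ALGOSPEAK = {
    "dead": "unalive",
    "war": "w-r",
}

def algospeak(text: str) -> str:
    """Rewrite flagged words before posting; a naive word-by-word pass."""
    return " ".join(ALGOSPEAK.get(w.lower(), w) for w in text.split())

print(algospeak("the war left many dead"))  # -> the w-r left many unalive
```

A real moderation classifier works on far more than exact keywords, which is why the podcast calls this style likely ineffective against sophisticated algorithms.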
By self-censoring simple words like "war," news outlets create confusion. This makes users pause (which algorithms interpret as interest) and comment to ask for clarification. The resulting engagement boosts the post's visibility, even if the comments are about the typo, not the content.