
Signal President Meredith Whittaker explains that "AI" was coined by John McCarthy in the 1950s. It wasn't a precise technical term but a "flashy" brand created to win Cold War-era grant money and to distinguish his work from his rival's field of "cybernetics" — making AI a marketing concept from its inception.

Related Insights

Early AI ads, like OpenAI's first, positioned AI as a monumental step in human history. The next wave is expected to be more pragmatic, focusing on specific, relatable use cases for the average consumer. This marketing evolution reflects the technology's maturation from a conceptual wonder to a practical tool for the mass market.

Naming AI research teams with terms like "AGI" is more about signaling a long-term "north star" and creating "vibes" to attract ambitious talent, rather than reflecting a concrete, step-by-step plan to achieve artificial general intelligence.

In 2015-2016, major tech companies actively avoided the term "AI," fearing it was tainted by previous "AI winters." It wasn't until around 2017 that branding as an "AI company" became a positive signal, highlighting the incredible speed of the recent AI revolution and the shift in public perception.

Dr. Li views the distinction between AI and AGI as largely semantic and market-driven, rather than a clear scientific threshold. The original goal of AI research, dating back to Turing, was to create machines that can think and act like humans. The term "AGI" doesn't fundamentally change this "north star" for scientists.

The term "AI" is a moving target. Technologies like databases or even machine learning were once considered AI but are now just "software." In common usage, AI simply refers to the newest, most novel computational capabilities, and the label will fade as they become commonplace.

The guest argues that over 60% of what's labeled "AI" is the opportunistic application of existing technology for profit, driven by classic capitalist motives like winning market share rather than genuine, process-changing innovation.

The computer industry originally chose a "hyper-literal mathematical machine" path over a "human brain model" based on neural networks, a theory that had existed since the 1940s. The current AI wave represents the long-delayed success of that alternate, abandoned path.

The term "Artificial Intelligence" implies a replacement for human intellect. Author Alistair Frost suggests using "Augmented Intelligence" instead. This reframes AI as a tool that enhances, rather than replaces, human capabilities. This perspective reduces fear and encourages practical, collaborative use.

Sequoia highlights the "AI effect": once an AI capability becomes mainstream, we stop calling it AI and give it a specific name, thereby moving the goalposts for "true" AI. This historical pattern of downplaying achievements is a key reason they are explicitly declaring the arrival of AGI.

In the 2010s, the term "AI" was perceived as hype. To gain serious traction, the field was deliberately rebranded as "Machine Learning." Now, the cycle has reversed, and "AI" is once again the preferred term, highlighting the cyclical and strategic nature of technology branding.

The Term "AI" Originated as a Marketing Ploy to Secure Research Funding | RiffOn