The idea that language creates thought is backwards. Pre-linguistic infants already have a sophisticated understanding of the world (e.g., cause and effect). They learn language by shrewdly guessing a speaker's intent and mapping the sounds they hear onto thoughts they already possess.

Related Insights

While on a career break, the author's deepest anxieties about failure and irrelevance were perfectly articulated by his young son. This reveals a dynamic where children can absorb and voice their parents' unspoken fears, serving as an unwitting mirror to the subconscious.

Research shows children engage in more complex, "authentic communication" when playing with peers because they are constantly negotiating and problem-solving. In contrast, adult-child interactions are often didactic and less challenging, stunting the development of sophisticated language skills.

Andrej Karpathy argues that comparing AI to animal learning is flawed because animal brains possess powerful initializations encoded in DNA via evolution. This allows complex behaviors almost instantly (e.g., a newborn zebra running), which contradicts the 'tabula rasa' or 'blank slate' approach of many AI models.

Effective learning isn't mere data storage. Neuroscientist Mary Helen Immordino-Yang argues that our emotional thought processes become a "hat stand" for information. To retrieve the facts, we re-experience the associated emotion, making subjective engagement central to memory.

The brain connects abstract, learned concepts (like social status) to innate rewards (like shame or pride) via a "steering subsystem." The cortex learns to predict the responses of this more primitive system, effectively linking new knowledge to hardwired emotional and motivational circuits.
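The "steering subsystem" idea can be illustrated with a toy model, sketched below under invented assumptions (the cue, reward values, and learning rule are all illustrative, not from the source): a hardwired circuit emits innate reward, and a learned predictor gradually links an abstract cue to that response.

```python
import random

# Hypothetical toy model: a hardwired "steering subsystem" assigns innate
# reward to a primitive signal (social approval), and a learned predictor
# (standing in for the cortex) learns to anticipate that reward from an
# abstract, learned cue.

def steering_subsystem(social_approval: float) -> float:
    # Innate circuit: approval feels good, disapproval feels bad.
    return 1.0 if social_approval > 0 else -1.0

weight = 0.0  # the cortex starts with no link between cue and reward
lr = 0.1

random.seed(0)
for _ in range(200):
    # Abstract, learned cue (e.g., a status marker), assumed here to
    # perfectly signal approval in this toy world.
    cue = random.choice([1.0, -1.0])
    prediction = weight * cue
    reward = steering_subsystem(cue)
    # Delta-rule update: move the prediction toward the innate reward.
    weight += lr * (reward - prediction) * cue

# After training, the abstract cue alone predicts the innate response.
print(round(weight, 2))
```

With this update rule the weight converges toward 1.0, i.e., the learned cue comes to stand in for the hardwired reward signal.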

Just as crawling is a vital developmental step for babies even though adults don't crawl, some learning processes that AI can automate might be essential for cognitive development. We shouldn't skip steps without understanding their underlying neurological purpose.

Our sense of self isn't an innate property but an emergent phenomenon formed from the interaction between our internal consciousness and the external language of our community (the "supermind"). This implies our identity is primarily shaped not by DNA or our individual brain, but by the collective minds and ideas we are immersed in.

The "Fetus GPT" experiment reveals that a GPT-style model trained on just 15MB of text struggles, while a human child learns language and complex concepts from a similarly small dataset. This highlights the incredible data and energy efficiency of the human brain compared to large language models.
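A rough back-of-envelope calculation makes the efficiency gap concrete. The bytes-per-token ratio and the LLM corpus size below are illustrative assumptions, not figures from the experiment described above:

```python
# Illustrative comparison of a child-scale text corpus vs. an LLM corpus.
child_corpus_bytes = 15 * 1024 * 1024   # ~15 MB, as in the Fetus GPT setup
bytes_per_token = 4                     # common rough estimate for English text
child_tokens = child_corpus_bytes / bytes_per_token

llm_tokens = 1e13                       # assumed ~10T tokens for a frontier LLM

print(f"child-scale corpus: ~{child_tokens / 1e6:.1f}M tokens")
print(f"assumed LLM corpus is ~{llm_tokens / child_tokens:,.0f}x larger")
```

Under these assumptions, the child-scale corpus is on the order of a few million tokens, millions of times smaller than a modern LLM's training set.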

By silently watching animals, one can learn the 'first language' of energy—a pre-verbal understanding of intent and emotional states conveyed through body movement and presence. This non-rational language builds a deep sense of connectivity with all creatures, including humans.

AI models use simple, mathematically clean loss functions. The human brain's superior learning efficiency might stem from evolution hard-coding numerous, complex, and context-specific loss functions that activate at different developmental stages, creating a sophisticated learning curriculum.
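A developmental "loss curriculum" can be sketched as a loss function whose terms switch on at different training stages. The stage boundaries, term names, and weights below are invented for illustration, not a claim about how the brain or any real model works:

```python
# Hypothetical sketch: instead of one fixed, mathematically clean loss,
# different context-specific objectives activate at different stages,
# forming a curriculum over the course of training.

def staged_loss(prediction: float, target: float, step: int) -> float:
    recon = (prediction - target) ** 2   # basic prediction-error term
    sparsity = abs(prediction) * 0.1     # toy "efficiency" pressure
    social = (prediction - 0.5) ** 2     # toy context-specific objective

    if step < 100:       # early stage: raw prediction error only
        return recon
    elif step < 200:     # middle stage: add efficiency pressure
        return recon + sparsity
    else:                # late stage: add the social objective
        return recon + sparsity + 0.5 * social

print(staged_loss(1.0, 0.0, 50))    # early-stage loss
print(staged_loss(1.0, 0.0, 250))   # late-stage loss with all terms active
```

The same prediction is penalized differently depending on the stage, which is the core of the curriculum idea: the objective itself, not just the data, changes over development.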