Unlike computers, human brains have no distinction between hardware and software; every memory physically alters the brain's structure. Furthermore, neurons are not simple on/off transistors; their firing is influenced by a complex chemical bath of hormones and neurotransmitters, making them more analog than digital.
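The "analog, chemically modulated" point can be made concrete with a classic textbook model. Below is a minimal leaky integrate-and-fire neuron sketch (not from the episode; the parameter values are illustrative), where the membrane potential is a continuous quantity and a `gain` term crudely stands in for the neuromodulatory "chemical bath":

```python
def simulate_lif(inputs, gain=1.0, tau=20.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron: a toy stand-in for an analog neuron.

    `gain` crudely models neuromodulation (a hormone or neurotransmitter
    scaling excitability); the membrane potential `v` is a continuous
    quantity, not a bit.
    """
    v = 0.0
    spikes = []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + gain * i_in)  # leaky integration of input
        if v >= threshold:                  # fire only past threshold...
            spikes.append(t)
            v = 0.0                         # ...then reset
    return spikes

# The same input train yields different firing under different "chemical baths":
drive = [0.3] * 50
print(len(simulate_lif(drive, gain=1.0)))  # baseline excitability
print(len(simulate_lif(drive, gain=2.0)))  # neuromodulator doubles the gain
```

The same input produces different spike trains depending on the modulatory state, which is exactly why a neuron is not a simple on/off transistor.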
The human brain admits more possible connection patterns than there are atoms in the observable universe. This immense, dynamic 'configurational space' is the source of its power, not raw processing speed. Silicon chips are fundamentally different and cannot replicate this morphing, high-dimensional architecture.
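The atoms-in-the-universe comparison survives a back-of-envelope check (the neuron and synapse counts below are rough textbook estimates, not figures from the episode):

```python
import math

NEURONS = 8.6e10           # rough estimate of neurons in a human brain
SYNAPSES_PER_NEURON = 1e4  # rough estimate of connections per neuron

# Number of ways just ONE neuron could choose its ~10^4 partners from all
# the others. We work with log10 C(n, k) via log-gamma, because C(n, k)
# itself overflows any float.
n, k = NEURONS, SYNAPSES_PER_NEURON
log10_choose = (math.lgamma(n + 1) - math.lgamma(k + 1)
                - math.lgamma(n - k + 1)) / math.log(10)

print(f"one neuron's wiring options: ~10^{log10_choose:.0f}")
print("atoms in the universe:        ~10^80 (common order-of-magnitude estimate)")
```

Even a single neuron's wiring options dwarf 10^80; the whole brain's configurational space is that number compounded across tens of billions of neurons.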
The brain's hardware limitations, like slow and stochastic neurons, may actually be advantages. These properties seem perfectly suited for probabilistic inference algorithms that rely on sampling—a task that requires explicit, computationally intensive random number generation in digital systems. Hardware and algorithm are likely co-designed.
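A standard illustration of this idea (not necessarily the algorithm discussed in the episode) is Gibbs sampling in a tiny Boltzmann machine: each noisy binary "neuron" fires with a sigmoid probability of its input, and the firing statistics themselves are the inference. The weights and biases below are toy values:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gibbs_sample(weights, biases, steps=10000, seed=0):
    """Stochastic binary 'neurons' sampling from a tiny Boltzmann machine.

    Each neuron fires with probability sigmoid(net input). The randomness
    that digital hardware must generate explicitly is here doing the
    inference itself, by drawing samples from the model's distribution.
    """
    rng = random.Random(seed)
    n = len(biases)
    state = [rng.random() < 0.5 for _ in range(n)]
    counts = [0] * n
    for _ in range(steps):
        i = rng.randrange(n)  # pick a neuron to update
        net = biases[i] + sum(weights[i][j] * state[j]
                              for j in range(n) if j != i)
        state[i] = rng.random() < sigmoid(net)  # stochastic firing
        for j in range(n):
            counts[j] += state[j]
    return [c / steps for c in counts]  # estimated marginal firing rates

# Two mutually excitatory neurons, one with a positive bias:
w = [[0.0, 1.5], [1.5, 0.0]]
b = [1.0, -0.5]
marginals = gibbs_sample(w, b)
print(marginals)
```

The long-run firing rates approximate the model's true marginal probabilities; on hardware with intrinsically noisy neurons, that randomness would come for free.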
To achieve 1000x efficiency, Unconventional AI is abandoning the digital abstraction (bits representing numbers) that has defined computing for 80 years. Instead, they are co-designing hardware and algorithms where the physics of the substrate itself defines the neural network, much like a biological brain.
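One well-known instance of letting the substrate's physics do the math (offered here as an analogy, not as a claim about Unconventional AI's specific design) is the analog crossbar, where Ohm's law performs each multiplication and Kirchhoff's current law performs each addition. A minimal idealized simulation:

```python
def crossbar_mvm(conductances, voltages):
    """Ideal analog crossbar: Ohm's law (I = G*V) does each multiply and
    Kirchhoff's current law does each add. The weight matrix is not a
    table of bits but a grid of physical conductances.
    """
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

# A 2x3 'weight' matrix stored as conductances (siemens), inputs as volts:
G = [[1e-3, 2e-3, 0.5e-3],
     [0.0,  1e-3, 1e-3]]
V = [0.2, 0.1, 0.4]
I_out = crossbar_mvm(G, V)
print(I_out)  # output currents in amperes, one per row wire
```

In real devices the conductances drift and the wires add noise, which is exactly why the algorithm has to be co-designed with the hardware rather than layered on top of a clean digital abstraction.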
The cortex has a uniform six-layer structure and algorithm throughout. Whether it becomes visual or auditory cortex depends entirely on the sensory information plugged into it, demonstrating its remarkable flexibility and general-purpose nature, much like a universal computer chip.
"Amortized inference" bakes slow, deliberative reasoning into a fast, single-pass model. While the brain uses a mix, digital minds have a strong incentive to amortize more capabilities. This is because once a capability is baked in, the resulting model can be copied infinitely, unlike a biological brain.
Digital computers have separate units for processing (CPU) and memory (RAM). In biological computation, this distinction dissolves. The strength and pattern of connections between neurons *is* the memory, and the electrical firing (spiking) across these same connections *is* the processing.
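A Hopfield network is the textbook demonstration of this dissolution (used here as an illustration, not as the episode's example): the connection matrix *is* the stored memories, and activity propagating over those same connections *is* the recall computation — there is no separate RAM to fetch from:

```python
def train_hopfield(patterns):
    """Hebbian storage: the memory IS the connection matrix. No separate
    memory bank holds the patterns; they live in the weights."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=5):
    """Processing IS activity over those same connections: repeated
    threshold updates settle the state into the nearest stored memory."""
    for _ in range(steps):
        state = [1 if sum(wij * s for wij, s in zip(row, state)) >= 0 else -1
                 for row in w]
    return state

stored = [[1, 1, 1, -1, -1, -1], [1, -1, 1, -1, 1, -1]]
w = train_hopfield(stored)
noisy = [1, 1, -1, -1, -1, -1]  # corrupted copy of the first pattern
print(recall(w, noisy))
```

Feeding in a corrupted pattern and letting the dynamics run recovers the stored one — storage and computation are the same physical act.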
The brain doesn't strive for objective, verbatim recall. Instead, it constantly updates and modifies memories, infusing them with emotional context and takeaways. This process isn't a bug; its purpose is to create useful models to guide future decisions and ensure survival.
With roughly ten times as many connections feeding back from the cortex toward the early visual system as feeding forward from the eye, the brain actively predicts reality and uses sensory input primarily to correct errors. This explains phantom sensations, like feeling a stair that isn't there, where the brain's simulation briefly overrides sensory fact.
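The predict-then-correct loop can be sketched in a few lines (the `gain` parameter is an illustrative stand-in for how much the brain trusts the senses over its own simulation):

```python
def predictive_loop(observations, gain=0.3):
    """Perception as prediction plus correction: the internal estimate
    predicts the input, and only the prediction ERROR flows up to revise
    it. A low gain means the simulation dominates the raw senses."""
    estimate = 0.0
    trace = []
    for obs in observations:
        error = obs - estimate    # sensory input corrects, not dictates
        estimate += gain * error  # small errors barely perturb the model
        trace.append(estimate)
    return trace

# A steady world, then an abrupt change (the stair that isn't there):
world = [1.0] * 20 + [0.0] * 5
trace = predictive_loop(world)
print(trace[19], trace[20])  # the estimate lags the change: the
                             # prediction briefly overrides the new input
```

Right after the change, the internal estimate still reports the old world — the computational analogue of your foot "feeling" the missing stair before the error signal catches up.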
While today's computers cannot achieve AGI, it is not theoretically impossible. Creating a generally intelligent system will require a new physical substrate—likely biological or chemical—that can replicate the brain's enormous, dynamic configurational space, something silicon architecture cannot provide.
When we observe neurons, we are not seeing the true substrate of thought. Instead, we are seeing our 'headset's' symbolic representation of the complex conscious agent dynamics that are responsible for creating our interface in the first place.