Digital computers have separate units for processing (CPU) and memory (RAM). In biological computation, this distinction dissolves. The pattern and strength of connections between neurons *are* the memory, and the electrical firing (spiking) across these same connections *is* the processing.

Related Insights

The human brain supports more possible connection configurations than there are atoms in the observable universe. This immense, dynamic 'configurational space' is the source of its power, not raw processing speed. Silicon chips are fundamentally different and cannot replicate this morphing, high-dimensional architecture.

The brain's hardware limitations, like slow and stochastic neurons, may actually be advantages. These properties seem perfectly suited for probabilistic inference algorithms that rely on sampling—a task that requires explicit, computationally intensive random number generation in digital systems. Hardware and algorithm are likely co-designed.
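A minimal sketch of what sampling-based inference looks like in code, using a tiny two-unit Boltzmann-machine model (the weights, biases, and step count are illustrative, not from the source). Note the explicit `rng.random()` call on every update: that is the random-number generation a digital system must pay for, whereas a stochastic neuron's noisy firing would supply the same randomness as a side effect of its physics.

```python
import math
import random

def gibbs_sample(weights, biases, steps=5000, seed=0):
    """Estimate firing rates of binary units in a small Boltzmann machine
    by Gibbs sampling. Each unit fires with probability sigmoid(drive);
    in silicon this needs an explicit RNG draw per update."""
    rng = random.Random(seed)
    n = len(biases)
    state = [rng.randint(0, 1) for _ in range(n)]
    counts = [0] * n
    for _ in range(steps):
        for i in range(n):
            # Total input to unit i from its neighbors plus its bias.
            drive = biases[i] + sum(weights[i][j] * state[j]
                                    for j in range(n) if j != i)
            p_fire = 1.0 / (1.0 + math.exp(-drive))
            # The explicit random draw a stochastic neuron would get "for free".
            state[i] = 1 if rng.random() < p_fire else 0
        for i in range(n):
            counts[i] += state[i]
    return [c / steps for c in counts]

# Two mutually excitatory units with inhibitory biases: the long-run
# firing rates approximate the marginals of the Boltzmann distribution.
w = [[0.0, 1.0], [1.0, 0.0]]
b = [-0.5, -0.5]
print(gibbs_sample(w, b))
```

For this symmetric choice of weights and biases, each unit's true marginal firing probability works out to 0.5, so the sampled rates should hover near that value.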

To achieve 1000x efficiency, Unconventional AI is abandoning the digital abstraction (bits representing numbers) that has defined computing for 80 years. Instead, the company is co-designing hardware and algorithms where the physics of the substrate itself defines the neural network, much like a biological brain.
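One concrete, well-known instance of "the physics defines the network" is the analog resistive crossbar, sketched here in simulation (the conductance and voltage values are invented for illustration). Ohm's law gives each device a current I = G·V, and Kirchhoff's current law sums the currents on each column wire, so a matrix-vector multiply happens with no digital multiply-accumulate loop at all; the loop below only exists because we are simulating the physics.

```python
def crossbar_output(conductances, voltages):
    """Column currents of an idealized resistive crossbar:
    I_j = sum_i G[i][j] * V[i]. In real hardware this sum is performed
    by the wires themselves (Kirchhoff's current law), not by arithmetic."""
    rows = len(conductances)
    cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(rows))
            for j in range(cols)]

# Conductances (siemens) encode the weight matrix; input voltages (volts)
# encode the activation vector. Reading the column currents *is* the
# matrix-vector product.
G = [[1e-6, 2e-6],
     [3e-6, 4e-6]]
V = [0.5, 1.0]
print(crossbar_output(G, V))  # column currents in amperes
```

The digital abstraction would quantize G and V into bits and compute each product explicitly; here the substrate's physical law is the computation, which is the efficiency argument in miniature.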

The primary motivation for biocomputing is not just scientific curiosity; it's a direct response to the massive, unsustainable energy consumption of traditional AI. Living neurons are up to 1,000,000 times more energy-efficient, offering a path to dramatically cheaper and greener AI.

The supply chain for neurons is not the main problem; they can be produced easily. The true challenge and next major milestone is "learning in vitro"—discovering the principles to program neural networks to perform consistent, desired computations like recognizing images or executing logic.

Contrary to sci-fi imagery, the living neurons for biocomputing platforms are not extracted from animals. They are created from commercially available stem cells, which are originally derived from human skin. This process avoids the ethical and practical issues tied to using primary tissue.

Companies like Cortical Labs are growing human brain cells on chips to create energy-efficient biological computers. This radical approach could power future server farms and make personal 'digital twins' feasible by overcoming the massive energy demands of current supercomputers.

While today's computers cannot achieve AGI, AGI itself is not theoretically impossible. Creating a generally intelligent system will require a new physical substrate—likely biological or chemical—that can replicate the brain's enormous, dynamic configurational space, which silicon architecture cannot.

There's a qualitative difference between neurons grown in vitro from stem cells and those found in an adult brain. The scientific community discusses whether lab-grown neurons are less mature, like "infant" neurons, and may lack some receptors. The "perfect" neuron for computation is an open research question.

Biological intelligence has no OS or APIs; the physics of the brain *is* the computation. Unconventional AI's CEO Naveen Rao argues that current AI is inefficient because it runs on layers of abstraction. The future is hardware where intelligence is an emergent property of the system's physics.