Companies like Cortical Labs are growing human brain cells on chips to create energy-efficient biological computers. This radical approach could power future server farms and make personal 'digital twins' feasible by overcoming the massive energy demands of current supercomputers.
The human brain supports more possible connection configurations than there are atoms in the observable universe. This immense, dynamic 'configurational space' is the source of its power, not raw processing speed. Silicon chips are fundamentally different and cannot replicate this morphing, high-dimensional architecture.
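A back-of-envelope calculation makes that claim concrete. The sketch below uses standard order-of-magnitude estimates for synapse and atom counts (my assumptions, not figures from the source):

```python
import math

# Common order-of-magnitude estimates (assumptions, not from the source):
SYNAPSES = 1e14           # ~100 trillion synapses in a human brain
ATOMS_IN_UNIVERSE = 1e80  # rough count of atoms in the observable universe

# Even if each synapse were only "present" or "absent", the number of
# possible wiring configurations is 2**SYNAPSES. That is far too large
# to represent directly, so compare magnitudes in log10 space.
log10_configs = SYNAPSES * math.log10(2)     # ~3.0e13
log10_atoms = math.log10(ATOMS_IN_UNIVERSE)  # 80.0

print(f"log10(brain configurations) ~ {log10_configs:.1e}")
print(f"log10(atoms in universe)    = {log10_atoms:.0f}")
# The configuration space exceeds the atom count by roughly 3e13 orders
# of magnitude: the brain's capacity comes from combinatorics, not speed.
```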
To achieve 1000x efficiency, Unconventional AI is abandoning the digital abstraction (bits representing numbers) that has defined computing for 80 years. Instead, the company is co-designing hardware and algorithms so that the physics of the substrate itself defines the neural network, much as it does in a biological brain.
Digital computing, the standard for 80 years, is too power-hungry for scalable AI. Unconventional AI's Naveen Rao is betting on analog computing, which uses physics to perform calculations, as a more energy-efficient substrate for the unique demands of intelligent, stochastic workloads.
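A minimal sketch of what 'physics performing the calculation' means, using a resistive crossbar, the textbook example of analog matrix math (an illustrative model with made-up values, not Unconventional AI's actual design):

```python
import numpy as np

# In a resistive crossbar, Ohm's law (I = G * V) plus Kirchhoff's current
# law perform a matrix-vector multiply as one physical event: currents
# through each device sum on the output wire. The substrate's physics,
# not a sequence of digital instructions, does the arithmetic.
rng = np.random.default_rng(0)
G = rng.uniform(0, 1e-3, size=(4, 8))  # device conductances (S) = weights
V = rng.uniform(0, 0.5, size=8)        # input voltages (V) = activations

I = G @ V  # output currents (A): the weighted sum, 'computed' by physics

# The price of dropping the digital abstraction is analog imperfection:
# device noise and limited precision, which the algorithms must tolerate.
I_measured = I * (1 + rng.normal(0, 0.05, size=I.shape))  # ~5% device noise
print(np.round(I, 6))
print(np.round(I_measured, 6))
```

The energy argument: a digital accelerator spends much of its power shuttling weights between memory and arithmetic units, while in a crossbar the weights never move, which is where claims of large efficiency gains typically come from.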
The primary bottleneck for scaling AI over the next decade may be the difficulty of bringing gigawatt-scale power online to support data centers. Smart money is already focused on this challenge, which is a harder problem than securing silicon supply.
Musk highlights that the human brain built civilization on roughly 10 watts for its higher functions. That figure is a clear benchmark: current AI supercomputers consume megawatts, leaving a massive, untapped opportunity to improve power efficiency.
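The arithmetic behind that benchmark, with illustrative wattages (assumed orders of magnitude, not sourced figures):

```python
# Assumed orders of magnitude, not sourced figures:
BRAIN_WATTS = 10               # higher brain functions, per the claim above
AI_SUPERCOMPUTER_WATTS = 10e6  # a ~10 MW training cluster

gap = AI_SUPERCOMPUTER_WATTS / BRAIN_WATTS
print(f"efficiency headroom: ~{gap:.0e}x")  # ~1e6, six orders of magnitude
```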
New artificial neurons operate at the same low voltage as biological ones (~0.1 volts). Matching biology's operating voltage could remove the need for external power sources in prosthetics and brain interfaces, paving the way for seamless, self-powered integration of technology with the human body.
Instead of seizing human industry, a superintelligent AI could leverage its understanding of biology to create its own self-replicating systems. It could design organisms to grow its computational hardware, a far faster and more efficient path to power than industrial takeover.
The current 2-3 year chip design cycle is a major bottleneck for AI progress: by the time a chip ships, the software workloads it was designed for have already moved on. Using AI to slash that timeline would enable a proliferation of custom chips, each optimized for a specific at-scale software workload.
DeepMind's Shane Legg argues that human intelligence is not the upper limit because the brain is constrained by biology: a ~20-watt power budget and slow electrochemical signaling. Data centers have orders-of-magnitude advantages in power, bandwidth, and signal speed, which in his view makes superhuman AI a physical certainty.
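Legg's physical argument can be made quantitative with textbook figures (all numbers below are standard estimates I am assuming, not values from the source):

```python
# Standard order-of-magnitude estimates (assumptions, not from the source):
AXON_SPEED = 100.0      # m/s, fast myelinated axons
WIRE_SPEED = 2e8        # m/s, roughly 2/3 the speed of light in fiber
NEURON_RATE = 100.0     # Hz, rough ceiling on sustained neuron firing
TRANSISTOR_RATE = 3e9   # Hz, a commodity 3 GHz clock
BRAIN_WATTS = 20.0      # whole-brain power budget
DATACENTER_WATTS = 1e8  # a ~100 MW data center

print(f"signal speed:   ~{WIRE_SPEED / AXON_SPEED:.0e}x faster")        # ~2e6
print(f"switching rate: ~{TRANSISTOR_RATE / NEURON_RATE:.0e}x faster")  # ~3e7
print(f"power budget:   ~{DATACENTER_WATTS / BRAIN_WATTS:.0e}x larger") # ~5e6
# On every physical axis, engineered substrates have orders of magnitude
# of headroom over biology; that headroom is the core of Legg's argument.
```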
Biological intelligence has no OS or APIs; the physics of the brain *is* the computation. Unconventional AI's CEO Naveen Rao argues that current AI is inefficient because it runs on layers of abstraction, and that the future is hardware where intelligence is an emergent property of the system's physics.