Google DeepMind's AI has expanded the catalog of known stable crystals from roughly 40,000 to over 400,000. These AI-predicted materials are now being tested in the lab and could drive breakthroughs in physics-limited industries, enabling technologies such as better electric-vehicle batteries and superconductors.
The ambitious goal of discovering a high-temperature superconductor isn't just a scientific target; it's a strategic choice. Achieving it requires building numerous subsystems, such as autonomous synthesis and characterization, effectively forcing the creation of a general-purpose "AI for science" platform.
Startups and major labs are focusing on "world models," simulators of physical reality and its cause-and-effect structure. These are seen as the necessary step beyond text-based LLMs toward agents that genuinely understand and act in the physical world, and as a milestone on the path to AGI.
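Loosely, a world model in this sense is a learned simulator: given the current state of a system, it predicts what happens next, so an agent can reason about cause and effect without acting in the real world. The toy sketch below is only illustrative (the falling-object physics and all names are assumptions, not any lab's actual system); it fits a linear dynamics model from observed transitions and then "imagines" a trajectory with it.

```python
# A minimal "world model" sketch: learn the dynamics of a toy physical system
# (a falling object) from observed transitions, then use the learned model to
# imagine future states without touching the real environment.
import numpy as np

DT, G = 0.1, -9.81  # time step and gravity for the toy environment

def real_step(state):
    """Ground-truth physics: state = [position, velocity]."""
    pos, vel = state
    return np.array([pos + vel * DT, vel + G * DT])

# 1. Collect experience from the "real world".
rng = np.random.default_rng(0)
states = rng.uniform(-10, 10, size=(500, 2))
next_states = np.array([real_step(s) for s in states])

# 2. Fit a linear world model s' ~ A s + b via least squares.
X = np.hstack([states, np.ones((len(states), 1))])   # append bias column
W, *_ = np.linalg.lstsq(X, next_states, rcond=None)  # shape (3, 2)

def imagined_step(state):
    """Predict the next state with the learned model instead of real physics."""
    return np.append(state, 1.0) @ W

# 3. Roll the model forward: "imagine" a trajectory and compare with reality.
s_real = s_model = np.array([0.0, 5.0])
for _ in range(20):
    s_real, s_model = real_step(s_real), imagined_step(s_model)
print("real final state:   ", np.round(s_real, 3))
print("modeled final state:", np.round(s_model, 3))
```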
AI models will produce a few stunning, one-off results in fields like materials science. These isolated successes will trigger an overstated hype cycle proclaiming that "science is solved," masking the quieter, longer-term trend: AI's impact on scientific discovery will be profound but incremental.
Early AI models advanced by scraping web text and code. The next revolution, especially in "AI for science," requires overcoming a major hurdle: consolidating the world's vast but fragmented scientific data, scattered across disciplines like chemistry and materials science, into formats suitable for model training.
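The practical shape of that hurdle is schema and unit normalization: records describing the same quantity arrive from different sources under different field names and conventions. The sketch below is a hedged illustration of that step; the source formats, field names, and unit choices are invented, and real chemistry and materials databases are far messier.

```python
# Normalize heterogeneous records from two hypothetical sources into one
# training-ready schema with a single unit convention.
from dataclasses import dataclass

@dataclass
class MaterialRecord:          # unified schema used for training
    formula: str
    band_gap_ev: float
    source: str

def from_lab_a(row: dict) -> MaterialRecord:
    # Hypothetical lab A reports band gaps in eV under the key "Eg".
    return MaterialRecord(row["composition"], float(row["Eg"]), "lab_a")

def from_lab_b(row: dict) -> MaterialRecord:
    # Hypothetical lab B reports band gaps in meV under a different key.
    return MaterialRecord(row["formula"], float(row["gap_meV"]) / 1000.0, "lab_b")

raw = [
    ("lab_a", {"composition": "GaAs", "Eg": "1.42"}),
    ("lab_b", {"formula": "Si", "gap_meV": "1120"}),
]
parsers = {"lab_a": from_lab_a, "lab_b": from_lab_b}

dataset = [parsers[src](row) for src, row in raw]
for rec in dataset:
    print(rec)   # every record now shares one schema and one unit convention
```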
AI models are trained on large lab-generated datasets, then simulate biology and make predictions that are validated back in the lab. This feedback loop accelerates discovery by replacing random experimental "walks" with a more direct, computationally guided route.
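A minimal version of that loop looks like the sketch below: a surrogate model trained on existing measurements scores a large pool of candidates, the most promising few are "validated" and fed back into the training set, and the cycle repeats. Everything here is an illustrative assumption: a synthetic scoring function stands in for the lab and a nearest-neighbour lookup stands in for the AI model.

```python
# Predict -> validate -> retrain loop with stand-ins for the model and the lab.
import numpy as np

rng = np.random.default_rng(1)

def lab_experiment(x):
    """Stand-in for a real (slow, expensive) experiment."""
    return -np.sum((x - 0.7) ** 2, axis=-1)   # best candidates lie near x = 0.7

def surrogate_predict(train_x, train_y, query_x):
    """Tiny surrogate: predict each query's value from its nearest training point."""
    d = np.linalg.norm(query_x[:, None, :] - train_x[None, :, :], axis=-1)
    return train_y[np.argmin(d, axis=1)]

# Start with a small lab-generated dataset.
X = rng.uniform(0, 1, size=(10, 3))
y = lab_experiment(X)

for round_ in range(5):
    candidates = rng.uniform(0, 1, size=(500, 3))     # cheap in-silico candidates
    scores = surrogate_predict(X, y, candidates)      # model predicts outcomes
    best = candidates[np.argsort(scores)[-3:]]        # pick the most promising 3
    results = lab_experiment(best)                    # "validate back in the lab"
    X, y = np.vstack([X, best]), np.concatenate([y, results])  # feed results back
    print(f"round {round_}: best measured value so far = {y.max():.3f}")
```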
AI is developing spatial reasoning that approaches human levels. This will enable it to solve novel physics problems, leading to breakthroughs that create entirely new classes of technology, much as discoveries in the 1940s eventually led to GPS and cell phones.
The ultimate goal isn't just modeling specific systems (like protein folding), but automating the entire scientific method. This involves AI generating hypotheses, choosing experiments, analyzing results, and updating a 'world model' of a domain, creating a continuous loop of discovery.
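A toy, end-to-end version of that loop can be written down directly: maintain a belief (a small "world model") over competing hypotheses, pick the experiment expected to be most informative, observe the result, and update the belief. The setup below, estimating an unknown activation threshold from noisy yes/no experiments, is an invented stand-in for a real scientific domain.

```python
# Hypothesize -> choose experiment -> analyze -> update, as a Bayesian loop.
import numpy as np

rng = np.random.default_rng(2)
true_threshold = 0.63                                   # unknown to the "scientist"

def run_experiment(x):
    """Noisy lab measurement: does the system respond at stimulus level x?"""
    p_respond = 1 / (1 + np.exp(-20 * (x - true_threshold)))
    return rng.random() < p_respond

hypotheses = np.linspace(0, 1, 101)                     # candidate threshold values
belief = np.full_like(hypotheses, 1 / len(hypotheses))  # uniform prior

for step in range(15):
    # 1. Choose the experiment whose outcome is least certain under current belief.
    candidates = np.linspace(0, 1, 50)
    p_pred = np.array([
        np.sum(belief / (1 + np.exp(-20 * (c - hypotheses)))) for c in candidates
    ])
    x = candidates[np.argmin(np.abs(p_pred - 0.5))]

    # 2. Run the experiment, then 3. analyze the result via Bayes' rule.
    responded = run_experiment(x)
    likelihood = 1 / (1 + np.exp(-20 * (x - hypotheses)))
    likelihood = likelihood if responded else 1 - likelihood
    belief = belief * likelihood
    belief /= belief.sum()                              # 4. update the world model

estimate = hypotheses[np.argmax(belief)]
print(f"estimated threshold: {estimate:.2f} (true value {true_threshold})")
```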
Contrary to the idea that AI will make physical experiments obsolete, its real power is predictive. AI can virtually iterate through many potential experiments to identify which ones are most likely to succeed, thus optimizing resource allocation and drastically reducing failure rates in the lab.
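The resource-allocation argument is easy to see in a small simulation: score a large pool of candidate experiments with a predictive model and spend the lab budget only on the top-ranked ones, then compare the hit rate against choosing experiments at random. The scoring model below is a deliberately crude stand-in (a noisy proxy of the true outcome), so the numbers are illustrative only.

```python
# Model-guided selection of experiments versus random selection.
import numpy as np

rng = np.random.default_rng(3)
n_candidates, budget = 10_000, 50

true_quality = rng.normal(size=n_candidates)                  # unknown until tested
predicted = true_quality + rng.normal(scale=1.0, size=n_candidates)  # imperfect model

def hit_rate(chosen):
    """Fraction of chosen experiments that 'succeed' (top 5% of true quality)."""
    threshold = np.quantile(true_quality, 0.95)
    return np.mean(true_quality[chosen] > threshold)

ranked = np.argsort(predicted)[-budget:]                      # model-guided selection
random_pick = rng.choice(n_candidates, size=budget, replace=False)

print(f"model-guided hit rate: {hit_rate(ranked):.2f}")
print(f"random-pick hit rate:  {hit_rate(random_pick):.2f}")
```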
Following the success of AlphaFold in predicting protein structures, Demis Hassabis says DeepMind's next grand challenge is creating a full AI simulation of a working cell. This 'virtual cell' would allow researchers to test hypotheses about drugs and diseases millions of times faster than in a physical lab.
The primary impact of quantum computing won't just be faster calculations; it will be the ability to generate entirely new insights into complex systems such as molecules, knowledge that is currently out of reach. That data can then be fed into AI models, creating a powerful, synergistic loop of discovery.