Public announcements about quantum computing progress often cite high counts of 'physical qubits,' a misleading metric because individual physical qubits are highly error-prone. The crucial, error-corrected 'logical qubits' are what matter for breaking encryption, and their number is orders of magnitude lower, giving a more realistic view of the technology's current state.
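
To make the gap concrete, here is a minimal sketch assuming a surface-code-style overhead of roughly 1,000 physical qubits per logical qubit; that overhead figure and the `physical_per_logical` parameter are illustrative assumptions, not numbers from the discussion.

```python
# Rough illustration of the physical-vs-logical qubit gap. The ~1,000:1
# overhead is an assumption standing in for surface-code-style error
# correction; the real ratio depends on hardware error rates and code distance.

def logical_qubits(physical_qubits: int, physical_per_logical: int = 1_000) -> int:
    """Estimate usable logical qubits from an announced physical-qubit count."""
    return physical_qubits // physical_per_logical

for announced in (100, 1_000, 100_000, 1_000_000):
    print(f"{announced:>9,} physical qubits -> ~{logical_qubits(announced)} logical")
```

Under this assumed overhead, a headline "1,000-qubit" machine corresponds to roughly one usable logical qubit, which is why the two metrics tell very different stories.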

Related Insights

Contrary to the belief that it has no current utility, quantum computing is already being used commercially and generating revenue. Major companies like HSBC and AstraZeneca are leveraging quantum machines via cloud platforms (AWS, Azure) for practical applications like financial modeling and drug discovery, proving its value today.

A quantum-resistant upgrade for Bitcoin creates a major governance dilemma regarding the 20-30% of coins in early, vulnerable addresses (like Satoshi's) that are likely lost. The community must decide whether to allow an attacker to seize these billions, potentially destabilizing the network, or to proactively burn them via a contentious code change.

A "software-only singularity," where AI recursively improves itself, is unlikely. Progress is fundamentally tied to large-scale, costly physical experiments (i.e., compute). The massive spending on experimental compute over pure researcher salaries indicates that physical experimentation, not just algorithms, remains the primary driver of breakthroughs.

The history of AI, such as the 2012 AlexNet breakthrough, demonstrates that scaling compute and data on simpler, older algorithms often yields greater advances than designing intricate new ones. This "bitter lesson" suggests prioritizing scalability over algorithmic complexity for future progress.

Don't trust academic benchmarks. Labs often "hill climb" or game them for marketing, producing scores that don't translate into real-world capability. Many of these benchmarks also contain incorrect answers and messy data, making them an unreliable measure of true AI advancement.

Instead of relying on hyped benchmarks, the truest measure of the AI industry's progress is the physical build-out of data centers. Tracking permits, power consumption, and satellite imagery reveals the concrete, multi-billion dollar bets being placed, offering a grounded view that challenges both extreme skeptics and believers.

Unlike traditional banks, which use 2FA and can roll back fraudulent transactions, Bitcoin's decentralized and immutable design offers no recourse, making it a top target for a quantum attack. It represents a massive, unprotected honeypot: stolen funds cannot be recovered, which elevates its risk profile above that of other financial systems.

Nvidia CEO Jensen Huang's public stance on quantum computing shifted dramatically within months, from a 15-30 year timeline to calling it an 'inflection point' and investing billions. This rapid reversal from a key leader in parallel processing suggests a significant, non-public breakthrough or acceleration is underway in the quantum field.

The primary hurdle for securing Bitcoin against quantum computers isn't just the arrival of the technology, but the massive, multi-year logistical challenge of migrating all existing wallets. Because quantum-resistant signatures are much larger than today's and the network's throughput is limited, this migration could take 10-30 months even under optimistic scenarios.
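
A back-of-the-envelope sketch of why throughput dominates the timeline; every constant below (output count, post-quantum transaction size, share of block space devoted to migration) is an assumed placeholder, not a figure from the discussion.

```python
# Back-of-the-envelope migration estimate. All constants are assumptions
# chosen for illustration; real values depend on the post-quantum signature
# scheme adopted and on how much block space users dedicate to migrating.

VULNERABLE_OUTPUTS    = 50_000_000   # assumed number of outputs to move
PQ_TX_SIZE_VBYTES     = 1_000        # assumed size of one post-quantum-signed tx
BLOCK_CAPACITY_VBYTES = 1_000_000    # roughly 1M vbytes of block space per block
BLOCKS_PER_DAY        = 144          # one block about every 10 minutes
MIGRATION_SHARE       = 0.5          # assume half of block space goes to migration

txs_per_day = BLOCK_CAPACITY_VBYTES * BLOCKS_PER_DAY * MIGRATION_SHARE / PQ_TX_SIZE_VBYTES
months = VULNERABLE_OUTPUTS / txs_per_day / 30
print(f"~{months:.0f} months to migrate under these assumptions")
```

With these placeholder numbers the migration alone consumes roughly two years of sustained block space, which is the kind of constraint behind the 10-30 month estimate.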

A symbiotic relationship exists between AI and quantum computing: AI is used to dramatically speed up the optimization and calibration of quantum machines. By automating the tuning that keeps 'noise' and error rates in check, AI is shortening the development timeline for stable, powerful quantum computers.
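
As a schematic of what automated calibration looks like, here is a toy sketch: the error surface and the random-search tuner are invented for illustration and stand in for the ML-driven calibration loops described, not for any specific lab's method.

```python
import random

# Toy stand-in for a qubit gate whose error depends on two control parameters
# (e.g., pulse amplitude and frequency detuning). Real calibration would query
# hardware; this quadratic error surface is invented purely for illustration.
def measured_gate_error(amplitude: float, detuning: float) -> float:
    return 0.001 + (amplitude - 0.82) ** 2 + 0.5 * (detuning - 0.13) ** 2

# Simple automated tuner: a random local search standing in for the learned
# optimizers that recalibrate devices far faster than manual parameter sweeps.
def auto_calibrate(steps: int = 2_000, seed: int = 0) -> tuple[float, float, float]:
    rng = random.Random(seed)
    best = (0.5, 0.0)
    best_err = measured_gate_error(*best)
    for _ in range(steps):
        candidate = (best[0] + rng.gauss(0, 0.05), best[1] + rng.gauss(0, 0.05))
        err = measured_gate_error(*candidate)
        if err < best_err:
            best, best_err = candidate, err
    return best[0], best[1], best_err

amp, det, err = auto_calibrate()
print(f"calibrated amplitude={amp:.3f}, detuning={det:.3f}, gate error~{err:.4f}")
```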