The Department of Defense (DoD) doesn't need a "wake-up call" about AI's importance; it needs to "get out of bed." The critical failure is not a lack of awareness but deep-seated institutional inertia that prevents the urgent action and implementation required to build capability.
The principle that governments must hold a monopoly on overwhelming force should extend to superintelligence. AI at that level has the power to disorient political systems and financial markets, making its private control untenable. The state cannot be secondary to any private entity in this domain.
In AI-driven cybersecurity, being the first to defend your systems or embed exploits gives a massive but temporary edge. This advantage diminishes quickly as others catch up, creating a "fierce urgency of now" for national security agencies to act before the window closes.
The belief that a future Artificial General Intelligence (AGI) will solve all problems acts as a rationalization for inaction. This "messiah" view is dangerous because the AI revolution is continuous and happening now. Deferring action sacrifices the opportunity to build crucial, immediate capabilities and expertise.
The military lacks the "creative destruction" of the private sector and is constrained by rigid institutional boundaries. Real technological change, like AI adoption, happens only when determined civilian leaders pair with open-minded military counterparts to form a powerful coalition for change.
Viewing AI as just a technological progression or a human assimilation problem is a mistake. It is a "co-evolution." The technology's logic shapes human systems, while human priorities, rivalries, and malevolence in turn shape how the technology is developed and deployed, creating unforeseen risks and opportunities.
The military's career path rewards generalist experience, effectively punishing officers who specialize in critical fields like AI and cyber. Talented specialists are forced to abandon their expertise to get promoted, leading many to leave the service not for money, but to continue doing the work they excel at.
Analogizing AI to electricity is too narrow. A better comparison is the shift from feudalism to market capitalism, which fundamentally restructured society over centuries. AI will have a similarly profound, systemic impact, but compressed into less than a decade, making prediction and preparation incredibly challenging.
Bureaucracies, like AI models, have pre-programmed "weights" that shape decisions. The DoD is weighted toward its established branches (Army, Navy, etc.). Without a dedicated Cyber Force, cybersecurity is consistently de-prioritized in budgets, promotions, and strategic focus, a vulnerability that AI will amplify.
The relationship between governments and AI labs is analogous to that between European powers and chartered firms like the British East India Company, which wielded immense, semi-sovereign power. The Company raised its own army and conquered much of India, a precedent for how today's private tech firms shape new frontiers while wielding opaque power.
