Beyond technical merit, standards can be a geopolitical tool. By creating unique national standards, such as those for electrical plugs or AI reporting, a country can favor domestic manufacturers that are already compliant, erecting a subtle but effective barrier to foreign competitors.
When a company's patented technology becomes essential to an international standard (like 5G), it creates a chokepoint. The US leveraged this by imposing export controls that barred American firms like Qualcomm from supplying Chinese companies like ZTE, preventing ZTE from legally building standard-compliant cell phones for the global market.
Like early electricity, which caused fires and electrocutions, AI is a powerful, frightening, and poorly understood technology. The historical process of making electricity safe through standards for measurement (volts, amperes, ohms) and safety devices (fuses) provides a clear roadmap for governing AI risks.
Early internet users feared online payments until the HTTPS encryption standard made transactions secure and trustworthy. Similarly, broad AI adoption requires process standards for safety and risk management to build the public and enterprise trust necessary for a boom in the AI-enabled economy.
Formal standards development organizations (SDOs) like the ISO operate on a 12-24 month timeline. This deliberate, consensus-based process is too slow to keep pace with the rapid evolution of AI technology, creating a governance gap that requires more agile, iterative approaches.
Instead of waiting for formal bodies, Google DeepMind is developing and open-sourcing its own technical standards for AI agents. This strategy aims to solve immediate interoperability problems and to establish a market-wide de facto standard through rapid, widespread adoption, bypassing slower formal channels.
The EU AI Act mandates compliance with 'harmonized standards' for high-risk AI systems. However, many of these essential standards do not yet exist, creating a high-stakes race for standards bodies to define the rules before the regulation becomes fully enforceable; the law is effectively 'gesturing to things that have not yet been developed'.
