A single 20-mile car trip emits as much CO2 as roughly 10,000 chatbot queries. This means that if AI helps you avoid just one such trip, you have more than offset a year's worth of heavy personal AI usage.
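
As a rough sanity check, the sketch below works backward from that comparison: the ~400 g of CO2 per mile figure (roughly the average for a passenger car) is an assumption for illustration, and the per-query footprint is derived from it rather than measured.

```python
# Back-of-envelope check of the car-trip comparison (illustrative assumptions).
CAR_G_CO2_PER_MILE = 400      # assumed average passenger-car emissions, g CO2 per mile
TRIP_MILES = 20
QUERIES = 10_000

trip_g_co2 = CAR_G_CO2_PER_MILE * TRIP_MILES   # ~8,000 g (~8 kg) CO2 for the trip
implied_g_per_query = trip_g_co2 / QUERIES     # ~0.8 g CO2 per query, implied by the claim

print(f"20-mile trip: ~{trip_g_co2 / 1000:.1f} kg CO2")
print(f"Implied footprint per query: ~{implied_g_per_query:.1f} g CO2")
```

An implied footprint of under one gram of CO2 per query is broadly in line with commonly cited per-prompt estimates, which is why a single avoided trip can offset a year of heavy usage.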

Related Insights

People often object to AI's energy use simply because it represents a *new* source of emissions. This psychological bias distracts from the fact that these new emissions are minuscule compared to massive, existing sources like personal transportation.

A widely circulated media claim that a single chatbot prompt consumes an entire bottle of water is a gross exaggeration based on a flawed study. The actual figure is closer to 2 milliliters, or 1/200th of a typical bottle.

The energy consumed by a chatbot is so minimal that using it almost certainly reduces your net emissions, because it displaces more carbon-intensive activities such as driving a car or even watching TV.

Although 90% of an AI server's financial cost is the upfront hardware purchase, the vast majority (~95%) of its lifetime carbon footprint comes from the electricity used to run it, not from its manufacturing.
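
A minimal sketch of how that carbon split can arise, with every input (server power draw, utilization, lifetime, grid carbon intensity, and embodied carbon) an illustrative assumption rather than a figure from the source:

```python
# Illustrative operational vs. embodied carbon split for an AI server.
SERVER_POWER_KW = 10         # assumed average draw of a multi-GPU server
UTILIZATION = 0.6            # assumed average utilization over its life
LIFETIME_YEARS = 5
HOURS_PER_YEAR = 8760
GRID_KG_CO2_PER_KWH = 0.4    # assumed grid carbon intensity
EMBODIED_T_CO2 = 6           # assumed manufacturing ("embodied") carbon, tonnes

operational_t = (SERVER_POWER_KW * UTILIZATION * LIFETIME_YEARS
                 * HOURS_PER_YEAR * GRID_KG_CO2_PER_KWH / 1000)
total_t = operational_t + EMBODIED_T_CO2

print(f"Operational: ~{operational_t:.0f} t CO2, embodied: {EMBODIED_T_CO2} t CO2")
print(f"Electricity share of lifetime footprint: ~{operational_t / total_t:.0%}")
```

The exact share shifts with utilization, grid mix, and the embodied-carbon estimate, but under any plausible inputs the electricity term dominates.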

Block's CTO quantifies the impact of the company's internal AI agent, Goose: AI-forward engineering teams save 8-10 hours per week, a figure he considers the absolute baseline. He notes that "this is the worst it will ever be," suggesting exponential gains are coming.

While AI's global emissions and water usage are manageable, the most significant danger is localized air pollution from the fossil fuel power plants that supply data centers, which poses immediate and severe health risks to nearby communities.

To contextualize the energy cost of AI inference, a single query to a large language model uses roughly the same amount of electricity as running a standard microwave for just one second.
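
A hedged check of the microwave comparison, assuming a typical ~1,100 W oven (the wattage is an illustrative assumption):

```python
# Energy of running a microwave for one second, as a proxy for one query.
MICROWAVE_WATTS = 1100                         # assumed typical microwave power draw
SECONDS = 1

energy_wh = MICROWAVE_WATTS * SECONDS / 3600   # convert watt-seconds to watt-hours
print(f"One microwave-second: ~{energy_wh:.2f} Wh")
```

That works out to roughly 0.3 Wh, consistent with recent per-prompt estimates for mainstream chatbot models, though estimates vary with model size and prompt length.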

The production of one hamburger requires energy and generates emissions equivalent to 5,000-10,000 AI chatbot interactions. This comparison highlights how dietary choices vastly outweigh digital habits in one's personal environmental impact.
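
A quick consistency check, treating both the beef-burger footprint (2-4 kg CO2e) and the per-query footprint (~0.4 g CO2) as illustrative assumptions:

```python
# Hamburger vs. chatbot queries (illustrative assumptions).
BURGER_KG_CO2E_LOW, BURGER_KG_CO2E_HIGH = 2.0, 4.0   # assumed beef-burger footprint range
QUERY_G_CO2 = 0.4                                    # assumed per-query footprint, g CO2

low_queries = BURGER_KG_CO2E_LOW * 1000 / QUERY_G_CO2    # ~5,000 queries
high_queries = BURGER_KG_CO2E_HIGH * 1000 / QUERY_G_CO2  # ~10,000 queries
print(f"One hamburger ~= {low_queries:,.0f} to {high_queries:,.0f} chatbot queries")
```

The exact range depends heavily on which per-query estimate is used, but the orders of magnitude line up with the 5,000-10,000 figure.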

The projected 80-gigawatt power requirement for the full AI infrastructure buildout, while enormous, translates to a manageable 1-2% increase in global energy demand—less than the expected growth from general economic development over the same period.
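
To see where the 1-2% figure can come from, the sketch below converts 80 GW of continuous capacity into annual terawatt-hours and compares it with global electricity generation; the ~30,000 TWh global figure and the utilization range are assumptions for illustration.

```python
# 80 GW of AI data-center capacity as a share of global electricity use.
AI_CAPACITY_GW = 80
HOURS_PER_YEAR = 8760
GLOBAL_ELECTRICITY_TWH = 30_000     # assumed annual global electricity generation

for utilization in (0.5, 1.0):      # data centers rarely run flat-out
    ai_twh = AI_CAPACITY_GW * HOURS_PER_YEAR * utilization / 1000
    share = ai_twh / GLOBAL_ELECTRICITY_TWH
    print(f"Utilization {utilization:.0%}: ~{ai_twh:,.0f} TWh/yr, ~{share:.1%} of global electricity")
```

Measured against total global energy demand rather than electricity alone, the share would be smaller still.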

The justification for AI's massive energy and capital consumption is weakening as its public-facing applications pivot from world-changing goals to trivial uses like planning vacations or creating anime-style images. This makes the high societal costs of data centers and electricity usage harder for the public to accept.