Crusoe Cloud is partnering with Tesla co-founder JB Straubel's Redwood Materials to use second-life EV batteries for power. By pairing these recycled batteries with solar, they can run a fully off-grid AI data center 24/7 at a lower price than grid power in Northern Virginia, a major data center hub.
Future Teslas will contain powerful AI inference chips that sit idle most of the day, creating an opportunity for a distributed compute network. Owners could opt in to let Tesla use this capacity for external tasks, earning revenue that offsets their electricity costs or even the cost of the car itself.
Crusoe Cloud located a massive AI data center in West Texas because the area has so much wind and solar power that prices frequently go negative. Transmission bottlenecks mean renewable producers must often shut down, creating a unique opportunity for energy-hungry data centers to co-locate and absorb the stranded, ultra-cheap power.
The vast network of consumer devices represents a massive, underutilized compute resource. Companies like Apple and Tesla can leverage these devices for AI workloads while they sit idle, creating a virtual cloud in which the capital expense (CapEx) of the hardware has already been paid by users.
Musk envisions a future where a fleet of 100 million Teslas, each with a kilowatt of inference compute, built-in power, cooling, and Wi-Fi, could be networked together. This would create a massive, distributed compute resource for AI tasks.
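The scale of that vision can be sanity-checked with a quick calculation using the figures from the text (100 million vehicles, roughly a kilowatt of inference compute each). The utilization factor below is an illustrative assumption, not from the source:

```python
# Back-of-envelope estimate of Musk's fleet-compute vision.
# Fleet size and per-car compute come from the text above; the
# idle-fraction assumption is purely illustrative.

FLEET_SIZE = 100_000_000       # hypothetical networked Teslas
COMPUTE_PER_CAR_KW = 1.0       # inference compute per vehicle, in kW
IDLE_FRACTION = 0.9            # assumed share of time a car sits parked

total_gw = FLEET_SIZE * COMPUTE_PER_CAR_KW / 1_000_000   # kW -> GW
usable_gw = total_gw * IDLE_FRACTION

print(f"Aggregate fleet compute: {total_gw:.0f} GW")     # 100 GW
print(f"Usable while idle:       {usable_gw:.0f} GW")    # 90 GW
```

Even under conservative idle assumptions, the aggregate is on the order of 100 GW, far larger than any single data center campus, which is what makes the distributed approach interesting despite its networking and latency challenges.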
To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, "behind-the-meter" power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.
The energy demand from AI can be met by allowing data centers to generate their own power "behind the meter." This avoids burdening the public grid and allows data centers to sell excess power back, potentially lowering electricity costs for everyone through economies of scale.
AI companies are building their own power plants due to slow utility responses. They overbuild for reliability, and within five years this excess capacity is expected to be sold back to the grid, transforming data centers into desirable sources of cheap, local energy for their communities.
The public power grid cannot support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely using natural gas, where data centers generate their own power and only "sip" from the grid during off-peak times.
To circumvent grid connection delays, infrastructure costs, and potential consumer rate impacts, data centers are increasingly opting for energy independence. They are deploying on-site power solutions like gas turbines and fuel cells, which can be faster to implement and avoid burdening the local utility system.
Crusoe's CEO explains their core strategy isn't just finding stranded energy, but actively developing new power sources alongside their AI factories. By building out power capacity to meet peak demand, they create an abundance of energy that can also benefit the surrounding grid, turning a potential liability into an asset.