Starcloud provides core infrastructure—a "box" with power, cooling, and connectivity—but lets customers install their own chips. This makes them an infrastructure provider like Equinix, not a cloud provider like AWS. This strategy offloads the massive capital cost of chips and focuses on their core competency: building satellites.

Related Insights

Space data centers' viability hinges on a breakeven point where launch costs are outweighed by the savings from skipping land acquisition and permitting, eliminating battery backup (orbits with continuous sunlight), and using solar panels that are roughly 8x more productive than on the ground. Starcloud estimates this economic crossover occurs when launch costs drop to around $500 per kilogram.
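The breakeven logic can be sketched as a simple per-kilogram comparison: launch cost versus the terrestrial costs avoided. The sketch below is illustrative only; the savings figure and the sample launch prices are hypothetical placeholders, not numbers from Starcloud's model, with only the ~$500/kg crossover taken from the estimate above.

```python
# Illustrative breakeven sketch for orbital vs. terrestrial data centers.
# Every number except the ~$500/kg crossover target is a hypothetical
# placeholder, not a figure from Starcloud.

def orbital_premium_per_kg(launch_cost_per_kg: float,
                           terrestrial_savings_per_kg: float) -> float:
    """Extra cost (positive) or net savings (negative) of deploying in
    orbit, per kilogram of data-center hardware launched."""
    return launch_cost_per_kg - terrestrial_savings_per_kg

# Hypothetical avoided terrestrial costs (land/permitting, battery
# backup, less productive solar), amortized per kg of hardware.
SAVINGS = 500.0  # $/kg, placeholder consistent with the quoted crossover

for launch_cost in (3000.0, 1500.0, 500.0, 250.0):
    premium = orbital_premium_per_kg(launch_cost, SAVINGS)
    status = "breakeven or better" if premium <= 0 else "too expensive"
    print(f"${launch_cost:>6.0f}/kg launch -> {status}")
```

Under these placeholder inputs, orbit only becomes competitive once the launch price falls to the avoided-cost line, which is the shape of the claim, not its exact values.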

Until launch costs drop, Starcloud's initial customers are operators of military and Earth-observation satellites that are bottlenecked by data-downlink capacity. By processing data in space, Starcloud solves this problem and can charge premium rates, building a sustainable business while waiting for the larger market to become viable.

The next wave of space companies is moving away from the vertically integrated "SpaceX model" where everything is built in-house. Instead, a new ecosystem is emerging where companies specialize in specific parts of the stack, such as satellite buses or ground stations. This unbundling creates efficiency and lowers barriers to entry for new players.

Following predictions from Jeff Bezos and investments from Eric Schmidt, Elon Musk has entered the space-based data center race. He stated that SpaceX will leverage its existing Starlink V3 satellites, which already have high-speed laser links, to create an orbital cloud infrastructure, posing a significant challenge to startups in the sector.

Skepticism around orbital data centers mirrors early doubts about Starlink, which was initially deemed economically unfeasible. However, SpaceX cut satellite launch costs roughly 20x, turning a "pipe dream" into a valuable business. This precedent suggests a similar path to viability exists for space-based AI compute.

A key trend, exemplified by Starfish Space, is the rise of businesses serving other space assets rather than just ground-based consumers. Starfish provides services *to* satellites, indicating the development of a self-sustaining, in-orbit economic ecosystem with its own B2B market.

The company initially explored space-based solar but realized beaming power to Earth is highly inefficient. Since most new energy powers data centers anyway, they pivoted to moving the data centers to the power source in space, eliminating the massive energy loss from transmission.

The primary advantage of orbital data centers isn't cost, but speed to market. Building on Earth involves years of real estate, permitting, and power grid challenges. The space-based model can turn manufactured chips into operational compute within weeks by treating deployment as an industrial manufacturing and launch problem.

On Earth, each new data center is more expensive than the last due to land and energy constraints. In space, manufacturing satellites at scale and declining launch costs (via Starship) mean the marginal cost for each new data center goes down, creating fundamentally different scaling economics.

Starfish Space will own and operate its fleet of "Otter" space tugs, selling services like de-orbiting rather than the hardware itself. This model allows them to continuously improve their software across the entire fleet, capture more value, and align their business with customer outcomes.