We scan new podcasts and send you the top 5 insights daily.
The CEO of AI startup Basis advises against using current compute costs to forecast future profitability. He argues the cost of intelligence is dropping so rapidly that today's margins are not predictive. The focus should be on driving value, confident that the underlying economics will improve dramatically.
The cost for a given level of AI performance halves every 3.5 months, a rate roughly ten times faster than Moore's Law. This exponential improvement means entrepreneurs should pursue ideas that seem financially or computationally infeasible today, as they will likely become practical within 12-24 months.
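The implied arithmetic is worth spelling out. A minimal sketch, taking the 3.5-month halving rate as a stated estimate rather than a law, shows why a 12-24 month horizon is the relevant window:

```python
# Illustrative arithmetic only: assumes the claimed trend that the cost
# of a given level of AI performance halves every 3.5 months.
HALVING_MONTHS = 3.5

def cost_multiplier(months: float) -> float:
    """Fraction of today's cost remaining after `months` months."""
    return 0.5 ** (months / HALVING_MONTHS)

# Roughly 11x cheaper after one year, roughly 116x after two:
print(round(1 / cost_multiplier(12)))  # 11
print(round(1 / cost_multiplier(24)))  # 116
```

Under this assumption, an idea that is 100x too expensive to run today clears its cost barrier in under two years, which is the logic behind building for tomorrow's price curve.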
When evaluating AI companies, focus on customer love (gross retention) and efficient acquisition over gross margins. High margins are less critical initially, as the 99%+ decline in model input costs suggests a clear path to future profitability if the core product is sticky.
Rabois argues that unlike foundational model or infrastructure plays, AI application startups shouldn't need to burn cash on compute. He believes they should be able to pass these costs through to customers and demonstrate healthy unit economics immediately.
For a true AI-native product, extremely high margins might indicate it isn't using enough AI, as inference has real costs. Founders should price for adoption, believing model costs will fall, and plan to build strong margins later through sophisticated, usage-based pricing tiers rather than optimizing prematurely.
The compute-heavy nature of AI makes traditional 80%+ SaaS gross margins impossible. Companies should embrace lower margins as proof of user adoption and value delivery. This strategy mirrors the successful on-premises-to-cloud transition, which ultimately drove massive growth for companies like Microsoft.
While an AI bubble seems negative, the overproduction of compute power creates a favorable environment for companies that consume it. As prices for compute drop, their cost of goods sold decreases, leading to higher gross margins and better business fundamentals.
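To see how falling compute prices flow directly into gross margin, here is a toy calculation. The $100 price and $60 inference-dominated COGS are hypothetical figures chosen for illustration, not numbers from the podcast:

```python
# Hypothetical example: a fixed price with falling inference cost,
# assuming inference dominates cost of goods sold (COGS).
def gross_margin(price: float, cogs: float) -> float:
    """Gross margin as a fraction of price."""
    return (price - cogs) / price

price, inference_cost = 100.0, 60.0

print(gross_margin(price, inference_cost))       # 0.4  (40% margin today)
print(gross_margin(price, inference_cost / 10))  # 0.94 (after a 10x cost drop)
```

With no change to the product or its price, a 10x drop in inference cost moves the margin from 40% to 94%, which is why compute overproduction favors the companies consuming it.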
AI companies operate under the assumption that LLM prices will trend towards zero. This strategic bet means they intentionally de-prioritize heavy investment in cost optimization today, focusing instead on capturing the market and building features, confident that future, cheaper models will solve their margin problems for them.
The cost for a given level of AI capability has decreased by a factor of 100 in just one year. This radical deflation in the price of intelligence requires a complete rethinking of business models and future strategies, as intelligence becomes an abundant, cheap commodity.
During major technology shifts like the move to cloud or AI, the best companies (e.g., hyperscalers, Snowflake) often have terrible early margins. In AI, inference costs are falling so rapidly that a company's margin profile can improve dramatically. Judging an early AI company on SaaS-era margin expectations is a mistake.
In rapidly evolving AI markets, founders should prioritize user acquisition and market share over achieving positive unit economics. The core assumption is that underlying model costs will decrease exponentially, making current negative margins an acceptable short-term trade-off for long-term growth.