The "SaaSpocalypse" looms, but the real shock this earnings season was the sheer scale of hyperscaler capex revisions. At the start of the year, analysts projected the top five US hyperscalers would spend $540 billion. Following recent earnings reports, that estimate has ballooned to $660 billion. This is not just growth; it is a strategy of "deterrence". By spending at such a massive scale, hyperscalers ensure that private challengers like OpenAI and Anthropic cannot compete on infrastructure alone.
This spending wave marks a new, more aggressive phase in the AI investment cycle. We previously noted that hyperscalers could increase capex by 50% annually by tapping operating cash flows and debt markets; they are doing exactly that. Despite record-breaking funding rounds, Anthropic and OpenAI lack the organic capital to match this pace. Hyperscalers are also "crowding out" smaller labs by securing priority supply at TSMC, further strangling their infrastructure efforts.
The path to an IPO remains the primary exit, but the public markets will demand a shift toward self-sufficiency. Investors will eventually seek proof that these companies can fund their own build-outs through operating cash flow rather than perpetual fundraising. The current economics of model releases appear unsustainable; obsolescence often arrives before development costs are recouped. To survive, these firms must either find aggressive new revenue streams or dramatically slow the pace of new, costly releases.
On the demand side, the narrative is shifting from hype to "AI airtime" and P&L reality. Our analysis of 254 recent earnings transcripts shows that headcount reduction has moved from a niche experiment to a broad corporate theme. Cost arbitrage remains the dominant driver of AI adoption, reinforcing our view that AI will continue to shrink the labor force. While more industries now cite AI as a revenue opportunity, it will take significant time for these projections to materialize into actual earnings.
The acceleration of the AI cycle presents three immediate shifts for private market investors.
First, as infrastructure costs climb, application-layer margins face a reckoning. Companies whose gross margins are tethered to LLM token costs must either absorb these overheads or test their pricing power. Low-churn "sticky" products may successfully pass the increase on to end-users, but a firm operating at 40% gross margins lacks the cushion to survive a sustained price hike from its model providers.
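To make the squeeze concrete, a back-of-the-envelope sketch (using purely hypothetical figures: the 40% margin cited above, plus an assumed share of COGS going to token spend) shows how quickly a provider price hike erodes an application-layer gross margin:

```python
def gross_margin_after_hike(price, cogs, token_share, hike):
    """Gross margin after the token-cost slice of COGS rises by `hike`.

    price       -- revenue per unit sold
    cogs        -- total cost of goods sold per unit
    token_share -- fraction of COGS that is LLM token spend
    hike        -- provider price increase, e.g. 0.5 for +50%
    """
    new_cogs = cogs * (1 - token_share) + cogs * token_share * (1 + hike)
    return (price - new_cogs) / price

# Hypothetical firm at a 40% gross margin: $100 of revenue, $60 of COGS,
# half of which is LLM token spend.
before = gross_margin_after_hike(100, 60, 0.5, 0.0)  # 0.40
after = gross_margin_after_hike(100, 60, 0.5, 0.5)   # 0.25
print(f"margin before: {before:.0%}, after a 50% token-price hike: {after:.0%}")
```

Under these assumptions, a 50% token-price hike compresses the margin from 40% to 25%; passing even part of that through to customers restores margin only if churn stays low, which is why "sticky" products hold the advantage.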
Second, the bottleneck moves to efficiency. We expect a surge in capital flowing toward hardware and software innovations that reduce the energy and compute intensity of training and inference. The "brute force" era is hitting physical limits.
Finally, the eventual public exits of Anthropic and OpenAI will trigger a massive redistribution of capital within the VC ecosystem. This liquidity event will not merely reward early backers; it will recapitalize the next generation of scale-ups, fueling a second wave of growth across the broader tech stack.