There are data centres being built in the Mojave Desert. In Norway, where the cold air is free. In Saudi Arabia, where the energy is. Microsoft, Google, Amazon, and Meta have committed somewhere between $630 billion and $690 billion to AI infrastructure in 2026 alone — roughly a 60% increase from the record they set in 2025.[1] Goldman Sachs projects total hyperscaler capex at $1.15 trillion between 2025 and 2027.[2]
That's not a metaphor. It's a physical reality. Hundreds of millions of square feet of compute, running on electricity that, in at least one case, Microsoft can't procure fast enough — they're sitting on $80 billion in unfulfilled Azure orders because they cannot find enough power.
The business model that justifies this investment is tokens on a metre. Units of intelligence sold at subscription or per-use pricing. Throughput is the metric. Whether your specific use case worked is not a problem they're structured to solve.
That's fine. It's rational. But it's worth being clear about what you're buying.
The most dramatic single day
On 23 February 2026, Anthropic demonstrated an AI coding tool automating the exploration and analysis of legacy COBOL code. IBM's stock dropped 13% in a single day. About $40 billion in market cap, gone. It was IBM's worst trading day since October 2000.[3]
COBOL modernisation is a multi-billion-dollar-per-year consulting business. Not because COBOL is interesting — it isn't — but because the systems running on it are irreplaceable, and the pool of people who can work with them has been shrinking for decades. It's a classic scarcity play. And in one demonstration, the scarcity argument became significantly weaker.
The IBM day is a useful reference point because it's precise. A specific capability, demonstrated publicly, on a specific date, with a measurable market response. Most disruption is gradual and ambiguous. This one was sudden and clear.
The SaaSpocalypse
The software industry had a rough 2025 and a rougher start to 2026. The public SaaS index fell 6.5% while the S&P 500 climbed 18%. About $2 trillion in market cap was erased from software stocks.[4] Revenue multiples dropped from over 7x to under 3.5x NTM revenue. The index is down 37% from its Q3 2025 peak.
The market is pricing in a disruption that hasn't fully arrived yet. That's how markets work — they price the expected future, not the current state. What the market is pricing is "seat compression": the mechanism by which AI agents don't just help workers use SaaS tools, they replace the need for the software the worker was using.
The Christensen Institute articulated the frame: AI disruption follows the pattern Christensen documented — entering from below, taking the simpler, cheaper, more accessible workflows first, then moving upmarket. AI isn't going for the most complex, most embedded enterprise SaaS first. It's taking ticket classification, document summarisation, customer support routing. The vendors whose moat was "we automate X" are discovering that X is now a foundation model API call.
Not all SaaS is equally exposed. The question is what the actual core value is. Data network effects — where the product improves as more customers use it — survive. Deep integrations that would take months to replicate survive. Vertical compliance knowledge survives. "We use AI under the hood to classify your support tickets" is a feature, not a moat.
| What survives the shift | What doesn't |
|---|---|
| ✓ Data network effects — product improves with scale | ✗ Thin automation wrappers on accessible APIs |
| ✓ Deep workflow integrations (years of switching cost) | ✗ "We use AI under the hood" as a differentiator |
| ✓ Vertical compliance and regulatory knowledge | ✗ Task-specific AI features built on commodity APIs |
| ✓ Proprietary training data and domain-specific models | ✗ Scarcity plays on skills the model now has |
| ✓ Workflow context that took years to build | ✗ Per-seat pricing where the "seat" can be an agent |
The incentive structure
The hyperscalers have a clean incentive: sell more tokens, build bigger clusters, lower the marginal cost per token. A leading model cost $0.01 per thousand input tokens in 2024. A large model dropped 67% in price between versions. An open-weight model entered the market at $0.14 per million tokens. The commodity trajectory is clear and the race to the bottom is real.
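Those headline prices are quoted in different units, which understates the gap. Normalising everything to dollars per million tokens makes the drop visible; a quick sketch using only the figures quoted above (the helper function is illustrative, not any provider's API):

```python
def per_million(price: float, per_tokens: int) -> float:
    """Normalise a token price to dollars per million tokens."""
    return price * (1_000_000 / per_tokens)

# 2024 leading-model price: $0.01 per thousand input tokens
incumbent = per_million(0.01, 1_000)       # $10.00 per million tokens

# Open-weight entrant: $0.14 per million tokens
entrant = per_million(0.14, 1_000_000)     # $0.14 per million tokens

print(f"incumbent: ${incumbent:.2f}/M")    # incumbent: $10.00/M
print(f"entrant:   ${entrant:.2f}/M")      # entrant:   $0.14/M
print(f"ratio:     {incumbent / entrant:.0f}x cheaper")  # ratio: 71x cheaper
```

Roughly a 70x price gap between the 2024 incumbent price and the open-weight entrant, before the 67% intra-vendor cut is even counted. That is what a commodity curve looks like.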
What this means is that the token providers are not structurally aligned with whether your use case delivers value. They're aligned with consumption. More usage, more revenue, regardless of outcome.
The data on outcomes is not encouraging. Gartner found that only 54% of AI projects make it from pilot to production. Among those, fewer than half have a formal ROI measurement framework. Less than 1% of global executives report significant ROI — defined as ≥20% profitability or cost savings increase.[5] 89% of enterprises have adopted AI tools. Only 23% can accurately measure their ROI.
The starkest single number: executives believe AI saves them 4.6 hours per week. They spend 4 hours and 20 minutes per week validating AI outputs. The net weekly gain is 16 minutes.[6]
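The 16-minute figure is plain unit arithmetic; converting both sides to minutes makes it easy to check. A minimal sketch using only the numbers quoted above:

```python
# Perceived weekly saving from AI tools: 4.6 hours, in minutes
perceived_saving = 4.6 * 60            # 276 minutes

# Weekly time spent validating AI outputs: 4 hours 20 minutes
validation_cost = 4 * 60 + 20          # 260 minutes

net_gain = perceived_saving - validation_cost
print(f"net weekly gain: {net_gain:.0f} minutes")  # net weekly gain: 16 minutes
```

Sixteen minutes a week is the gap between what executives believe they are saving and what the validation overhead gives back.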
Forrester published a note in March 2026 pointing out that a large share of reported AI revenue growth from enterprise software vendors is reclassification — rebranded data clouds, existing tools, and legacy SKUs bundled into "AI-influenced revenue." Even the success stories may be inflated.
The $2 trillion question
Bain estimated in March 2026 that $2 trillion in fresh revenue is needed by 2030 to fund the AI infrastructure growth being built now. Most enterprises remain in experimentation mode with modest productivity gains.
The gap between infrastructure investment and demonstrated returns is the defining tension of the current moment. It resolves in one of a few ways: the returns arrive and justify the investment, the market corrects and the build slows, or the infrastructure finds new uses that weren't anticipated when it was built. Probably some combination.
What's clear is that the token economy is being built at scale regardless of whether enterprise use cases are ready for it. That's the desert in the title — a massive infrastructure build, in physical deserts, powered by infrastructure debt, waiting for the revenue to arrive.
Being smart about where you spend, and what you actually own, is the response that makes sense in that environment. That's what the next article is about.
References
- [1] "Hyperscaler AI infrastructure capex 2026: $630–690B", Data Center Richness / Introl, February 2026.
- [2] "Goldman Sachs projects $1.15 trillion in AI capex 2025–2027", Goldman Sachs Global Investment Research, February 2026.
- [3] "IBM stock drops 13% after Claude Code COBOL demo", Fast Company, February 2026.
- [4] "$2 trillion in SaaS market cap erased in the SaaSpocalypse", Digital Applied / DevInstance, early 2026.
- [5] "Less than 1% of executives report significant AI ROI", Mavvrik AI Research, 2025–2026.
- [6] "Executives gain 16 minutes a week net from AI tools", Project Flux / PR Newswire, 2025.