Where to for the AI datacentre boom? Transformational utilities, and their bubbles.

Prediction: The AI datacentre industry will be another example of a recurring pattern I’ll call a “transformational utility”: an industry which is capital intensive, massively disruptive, and soon indispensable to the rest of the economy, but also undifferentiated. And therefore, for early equity holders, often disappointing.

The old playbook, again

AI is obviously transformative, but it’s not the first technology to rewire society. Let’s look at previous innovations such as canals, railways, steel, electricity, fibre internet, and mobile phone networks to see what we can learn about capital-intensive, society-changing inventions.

When societies reorganise around new infrastructure, the story tends to rhyme:

  1. Breakthrough + capex. A new invention arrives with vast promise, but equally vast capital requirements.
  2. Early scarcity. Capacity lags because capital projects take time to execute.
  3. “Bubble” phase. Those in the lead enjoy massive valuations, as they promise to dominate the revolution.
  4. Commoditisation. The buildout catches up with demand; the lack of differentiation in the underlying product exposes an inability to sustain high prices.
  5. Real growth continues. The sector keeps getting bigger and more valuable to society.
  6. Multiple compression. But the early players cannot maintain pricing power, valuations revert downwards, and many early investors lose money despite the sector’s real-world success.

Canals, railways, steel, electricity, fibre backbones, and mobile networks have all walked this path: The railway barons, US Steel, Edison Electric (later GE), Cisco, and most mobile networks enjoyed boom valuations at some point. Then returns normalised, even as their industries grew to multiples of their prior size.

Steel is a particularly interesting example. It sat at the centre of USSR and later Chinese industrial strategy. But as raw steel capacity became abundant, the USA’s path showed that long-term economic leadership came from differentiated offerings downstream. (There is a geopolitical angle to steel which is re-emerging now: more on that below.)

Of course, there are high-valuation industries (and bubbles) that are NOT transformational utility bubbles, for example:

  • Tulips. Some bubbles centre on things with trivial enduring utility. AI compute isn’t that.
  • iPhones (or Rolexes). Some products sustain premium margins through differentiation and brand. Raw compute is not that either. For example, the mechanical-watch industry (Rolex, etc.) is worth more than ever before, because it has reframed its products as strongly branded status symbols for men, not merely timekeepers.

Why AI data centres behave like utilities

The test for utility economics is interchangeability. If buyers view your product as equivalent across providers, price drifts towards (operating cost + cost of capital). Higher prices just attract new entrants, who gain share until prices converge to the level at which further entry stops being profitable.
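That convergence dynamic can be sketched as a toy simulation. All the numbers, the linear demand curve, and the fixed entry increment below are hypothetical assumptions chosen for illustration, not a model of real compute pricing:

```python
# Toy sketch of commodity pricing under free entry.
# Assumptions (all hypothetical): identical providers, a linear inverse-demand
# curve, and a new entrant adding capacity whenever price exceeds break-even.

OPERATING_COST = 1.00    # cost per unit of compute (made-up number)
COST_OF_CAPITAL = 0.25   # required return per unit (made-up number)
BREAK_EVEN = OPERATING_COST + COST_OF_CAPITAL

DEMAND = 100.0           # fixed demand, in units of compute

def market_price(capacity: float) -> float:
    """Inverse demand: price falls as installed capacity grows."""
    return max(0.0, 5.0 - 4.0 * capacity / DEMAND)

capacity = 20.0          # early scarcity: capacity well below demand
history = []
while market_price(capacity) > BREAK_EVEN:
    history.append((capacity, market_price(capacity)))
    capacity += 5.0      # entry: each period a new provider adds capacity

# Entry stops only once price has fallen to (roughly) break-even,
# so the long-run price is pinned near operating cost + cost of capital.
final_price = market_price(capacity)
```

The point of the sketch is that nothing in the loop rewards the incumbents: entry continues exactly until the price no longer covers a new entrant’s costs, which is the “utility” outcome described above.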

Compute is globally tradeable over networks. A data centre is just a building with electricity, cooling and connectivity, in which lots of matrix multiplications can be done. It may, in fact, be the most tradeable of all the transformational utilities, as there are no technical reasons why we couldn’t put all our compute in one place on the planet.

The major structural reason to deviate from this gravity is geopolitics (data sovereignty, national security, sanctions, energy policy). Governments can and will localise capacity; they can also tax or subsidise it. But that’s political risk, not durable product differentiation, and it drives subsidies, not big financial returns.

So where’s the differentiation (and the excess return)?

Think of where differentiation exists in the AI stack:

  • Chips (e.g., NVIDIA). Moderate. Real technical edge and speed-of-innovation moats, but they’re cyclical and not guaranteed (ask Intel).
  • Data centres / cloud compute. Low (outside geopolitics). Scale and operations matter, but sameness dominates pricing power in the long run.
  • Models (LLMs, core algorithms). Moderate now, lower over time. Capabilities diffuse fast; weights leak; papers ship; open models improve. Most use cases allow users to freely swap between several LLMs.
  • Applications. High variance, real moats available. This is where long-term margin lives — exactly as electricity’s wealth accrued to the things using it, not the grid itself.

What this implies

I’m not predicting a dramatic bubble “burst” tomorrow. Scarcity can continue longer than sceptics expect, and AI is likely to be capacity constrained for a long time. But multiples for compute-heavy businesses should compress as capacity catches up. The companies will be fine; some shareholders won’t. (Cisco still exists; 1999 buyers are still unhappy.)

Infrastructure bets need a clear theory of longevity. OpenAI (and others) tying valuation to data-centre buildout only makes sense if controlling compute during the next few years catapults them into a leading position in a new post-AI world, in a way that never happened for the previous darlings of transformational infrastructure. This might happen (AI is unusual enough to keep minds open), but it’s a high-conviction bet with an unforgiving timing component.

Finally, to be really clear, I’m not predicting that the AI revolution will underwhelm. Far from it! Just that the actual buildout of data centres is something you might want to leave to someone else.
