
Data Centres

  • Dec 22, 2025
  • 4 min read

2025Q3


We don’t claim to have cracked the code on the finances of the AI boom, but let’s play a little game. Imagine I tell you I’ve got something to sell. You say, “Sorry, I don’t have the money, and I can’t buy your product.” Normally, I’d just walk away and try my luck elsewhere.

But instead, I say, “Don’t worry, I’ll give you the money so you can buy my product.” At that point, you’d probably laugh and call me a snake eating its own tail.


And yet, when it’s OpenAI and NVIDIA playing this game, suddenly it’s called “strategic partnership.” Funny how circular funding starts looking like innovation once you slap a trillion-dollar market cap on it.


With that out of our system, we can focus on electricity markets. Renewable developers also want in on the action and are having a FOMO moment. While they might not be getting personal cheques from Jensen Huang or Sam Altman, they have convinced themselves that the future holds buckets of gold: artificial intelligence will spur a new wave of data centre roll-out in Australia, and the 2035 cheques once promised by green hydrogen load will be replaced by much larger ones from data centres (hope springs eternal).


As debt investors, our DNA is configured with scepticism. We have a view of the future that is fundamentally less optimistic than that of equity investors. Therefore, we have decided to investigate how much substance there is in the forecast growth in electricity demand from data centres.

Presently, there are more than 150 data centres in Australia with a combined capacity of more than 1.6 gigawatts (GW). In 2024–25, data centres consumed around 4 terawatt-hours (TWh) of electricity across the NEM, accounting for a mere 2% of grid-delivered supply. But under the latest NEM Electricity Statement of Opportunities, the Australian Energy Market Operator (AEMO) predicts that demand from data centres will surge to 21.4 TWh by 2034–35, equivalent to around 9% of grid-supplied electricity. That’s the kind of jump that gets a solar farm developer out of bed.
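These figures can be sanity-checked with a quick back-of-envelope calculation. The inputs below are the numbers quoted above; the implied grid totals are derived from them, not stated in the AEMO report:

```python
# Back-of-envelope check on the data centre figures quoted above.
# Inputs are from the text; the grid totals are implied, not quoted.

dc_now_twh = 4.0       # data centre demand, 2024-25 (TWh)
dc_now_share = 0.02    # ~2% of grid-delivered supply
dc_2035_twh = 21.4     # AEMO forecast for 2034-35 (TWh)
dc_2035_share = 0.09   # ~9% of grid-supplied electricity

grid_now = dc_now_twh / dc_now_share      # implied NEM supply today
grid_2035 = dc_2035_twh / dc_2035_share   # implied NEM supply in 2034-35
growth = dc_2035_twh / dc_now_twh         # data centre demand multiple

print(f"Implied grid supply now:  {grid_now:.0f} TWh")
print(f"Implied grid supply 2035: {grid_2035:.0f} TWh")
print(f"Data centre demand grows {growth:.1f}x")
```

On these numbers, data centre demand grows roughly 5.4-fold over the decade, while the implied grid total grows only modestly.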


So what is AEMO banking on?


Training large language models?

For context, training GPT-3 consumed approximately 1.3 gigawatt-hours (GWh) of electricity, while GPT-4 is estimated to have taken over 50 GWh. As models become more complex, energy consumption is expected to increase exponentially. But here’s the catch: model training is not happening in Australia. Most models are being trained in the US or China, and Washington and Beijing won’t export model training to Macquarie Park when it underpins national security and global tech dominance.


Prompting and inference?

Most models consume between 0.42 and 1.10 watt-hours (Wh) per query, with energy consumption increasing with prompt length and model architecture. To put this into perspective, a simple Google search consumes approximately 0.30 Wh of electricity, so prompting increases electricity consumption by around 40% at the low end. As a side note, we asked ChatGPT to compare the energy consumption of its system with that of the human brain. The answer? For a one-second query, it estimated the brain uses around 6 joules, while ChatGPT requires 100–1,000 joules for the same query.
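As a rough sketch, the per-query comparison above works out as follows (all figures are the ones quoted in the text, treated as simple range endpoints):

```python
# Per-query energy comparison, using the figures quoted above.
WH_PER_AI_QUERY = (0.42, 1.10)   # range across models (Wh per query)
WH_PER_SEARCH = 0.30             # approximate Google search (Wh)

uplift_low = WH_PER_AI_QUERY[0] / WH_PER_SEARCH - 1   # low-end uplift
uplift_high = WH_PER_AI_QUERY[1] / WH_PER_SEARCH - 1  # high-end uplift
print(f"AI query uses {uplift_low:+.0%} to {uplift_high:+.0%} vs a search")

# Brain vs ChatGPT for a one-second query (ChatGPT's own estimate)
BRAIN_J, CHATGPT_J = 6, (100, 1000)
ratio_low, ratio_high = CHATGPT_J[0] / BRAIN_J, CHATGPT_J[1] / BRAIN_J
print(f"ChatGPT uses {ratio_low:.0f}x to {ratio_high:.0f}x the brain's energy")
```

The 40% figure in the text is the low end of the range; at the high end, a query uses well over three times the energy of a search.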


However, if we use Google as our benchmark and assume six trillion annual searches shift from Google to AI-augmented queries, that’s approximately 3 TWh of incremental global demand. Not all six trillion searches will happen in Australia: we contribute less than 2% of global GDP and make up less than 0.4% of the global population. Even on a generous basis, if 10% of AI-augmented queries occurred in Australia, that’s just 0.3 TWh. On its own, 0.3 TWh won’t move the dial towards AEMO’s forecast of 21.4 TWh per year.
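The search-shift arithmetic above can be reproduced in a few lines. Note the 0.5 Wh per query increment is back-solved from the ~3 TWh figure in the text rather than stated directly, so treat it as an assumption:

```python
# Reproducing the search-shift arithmetic above.
queries = 6e12    # annual searches assumed to shift to AI-augmented queries
incr_wh = 0.5     # assumed incremental Wh per query (back-solved, not quoted)

global_twh = queries * incr_wh / 1e12   # 1 TWh = 1e12 Wh
au_share = 0.10                         # generous Australian share of queries
au_twh = global_twh * au_share

print(f"Global incremental demand: {global_twh:.1f} TWh")
print(f"Australian slice at 10%:   {au_twh:.1f} TWh")
```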


The key to supporting AEMO’s forecast lies in scaling inference. The number of applications embedding AI will be central to growth in energy consumption. However, inference must happen locally, in hyperscalers in Australia, to drive the more than fivefold expansion AEMO is forecasting. The key question is how much latency, security, and sovereignty risks matter for artificial intelligence applications.


Cooling demand from hyperscalers?

One may argue that it is not just prompting and AI: a whole digital ecosystem will be housed in these hyperscalers, and for every megawatt of servers, there’s another megawatt for cooling, backup, and redundancy. Data centres are power-to-heat conversion machines, and in Australia’s climate, cooling loads can rival IT loads.
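The “megawatt for megawatt” rule of thumb above is equivalent to a power usage effectiveness (PUE) of about 2: total facility draw is IT load times PUE. A minimal sketch, assuming that ratio:

```python
# Facility power from IT load via PUE (power usage effectiveness).
# pue=2.0 encodes the "megawatt for megawatt" rule of thumb above.
def facility_mw(it_mw: float, pue: float = 2.0) -> float:
    """Total power at the meter for a given IT load."""
    return it_mw * pue

print(facility_mw(10.0))        # 10 MW of servers -> 20 MW at the meter
print(facility_mw(10.0, 1.4))   # a more efficient facility
```

Modern hyperscale facilities often claim PUEs well below 2, which is one reason rated load overstates typical draw.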


However, most data centres run well below 100% of rated load. Rated load is based on the worst-case combination of a data centre working at maximum activity and environmental conditions at their hottest (air conditioning at maximum capacity). In reality, data centres don’t always run at maximum capacity, as seen in the table below, which highlights the utilisation of NextDC’s data centres:

[Table: utilisation of NextDC’s data centres — not reproduced]


While we may see an influx of gigawatts of new data centres, on average they will operate well below nameplate capacity.
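To illustrate how much utilisation matters, here is a small sketch converting nameplate capacity into annual energy at different average utilisations. The utilisation figures are hypothetical examples, not NextDC’s actual numbers:

```python
# Annual energy from data centre nameplate capacity at a given
# average utilisation. Utilisation values below are illustrative only.
HOURS_PER_YEAR = 8760

def annual_twh(nameplate_gw: float, utilisation: float) -> float:
    """TWh consumed per year at an average utilisation of nameplate."""
    return nameplate_gw * utilisation * HOURS_PER_YEAR / 1000  # GWh -> TWh

for u in (1.0, 0.6, 0.4):   # hypothetical average utilisations
    print(f"1 GW at {u:.0%} utilisation: {annual_twh(1.0, u):.2f} TWh/yr")
```

A gigawatt of nameplate is only worth about 8.8 TWh a year if it runs flat out; at more realistic utilisations, the contribution to demand shrinks proportionally.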


As investors in energy markets, we would love to see growth in electricity demand, but we would rather see the cobra itself than just hear the snake charmer’s music.




© 2023 Infradebt Pty Ltd
