In 2023, that figure grew by 55% to 7.4 GW (one gigawatt is a billion watts).
That would put data centers close to 4% of the entire global power use.
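The growth figure above implies a rough baseline for the previous year. A minimal sketch of that arithmetic (the 2022 value is derived here, not quoted in the article):

```python
# Back-of-the-envelope check on the reported growth figure.
capacity_2023_gw = 7.4            # reported 2023 capacity
growth = 0.55                     # reported 55% year-on-year growth

# Implied 2022 baseline, derived (not stated in the article):
baseline_2022_gw = capacity_2023_gw / (1 + growth)
print(f"Implied 2022 capacity: {baseline_2022_gw:.1f} GW")  # → 4.8 GW
```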
What differentiates AI data centers from conventional hyperscale facilities is their power demands.
To explain why, imagine an AI system developed to automate road traffic in New York. A local AI data center would be needed to reduce latency sufficiently for real-time decisions.
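The latency argument comes down to physics: a signal in optical fibre travels at roughly 200,000 km/s, so round-trip time scales with distance. A minimal sketch, using assumed distances that are not from the article:

```python
# Ideal fibre round-trip time, ignoring routing, queuing, and processing.
# The 200,000 km/s figure is the approximate speed of light in glass
# (about two-thirds of c); distances below are illustrative assumptions.
def fibre_rtt_ms(distance_km: float, speed_km_s: float = 200_000) -> float:
    """Round-trip time in milliseconds over a fibre link of the given length."""
    return 2 * distance_km / speed_km_s * 1000

print(f"Distant data center (~4,000 km): {fibre_rtt_ms(4000):.0f} ms")  # → 40 ms
print(f"Local data center (~50 km): {fibre_rtt_ms(50):.1f} ms")         # → 0.5 ms
```

Even in the ideal case, a cross-country round trip costs tens of milliseconds before any computation happens, which is why latency-sensitive AI workloads favour nearby facilities.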
The caveat to operating at maximum capacity for extended periods is elevated temperatures and voltages.
It will be necessary to regularly assess and potentially enhance the performance and design of cooling and electrical systems.
And, with the regular appearance of new and more efficient AI technology, upgrade cycles will become commonplace.
All the big chip players realise the potential of successful AI chips.
We are literally talking months.
TechRadar Pro created this content as part of a paid partnership with AMD.
The contents of this article are entirely independent and solely reflect the editorial opinion of TechRadar Pro.