This is a two-part series explaining:
- Why data centres need so much energy and water
- The concerns regarding this infrastructure
- How AI entrepreneurs plan to solve the looming energy crisis
AI development currently appears to be constrained by power generation and cooling capacity.
Data centres are huge, warehouse-sized buildings and major consumers of energy, mainly because Graphics Processing Units (GPUs) run many complex searches and calculations at once, demanding ever more power; generally about four times more than a traditional server chip. The specifics vary with technological advances and energy-efficiency improvements. (More about that in our next blog.)
Industry analysts estimate that, because of the surge in AI applications, the amount of power data centres need could triple by 2028. Estimates sit at about 10 gigawatts (according to a Green Economy Coalition article), which is the electricity usage of roughly 7 million US homes.
Former Google CEO Eric Schmidt puts the additional electricity the current AI revolution will need in the US at 92 GW of extra power. One gigawatt is roughly the output of a large nuclear power station.
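As a rough sanity check on the figures above, the "gigawatts to homes" conversion can be worked out directly. This is a back-of-the-envelope sketch: the average annual consumption of a US home is an assumed input (published figures vary, roughly 10,500 to 12,000 kWh per year), and the answer shifts accordingly.

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

# Assumed average annual electricity use of one US home, in kWh.
# This is an illustrative figure; published estimates vary.
KWH_PER_HOME_PER_YEAR = 12_000

# Average continuous draw of one home, in watts.
avg_home_power_w = KWH_PER_HOME_PER_YEAR * 1_000 / HOURS_PER_YEAR

data_centre_demand_w = 10e9  # the 10 GW estimate from the article

homes_powered = data_centre_demand_w / avg_home_power_w
print(f"10 GW continuously powers about {homes_powered / 1e6:.1f} million homes")
```

With a lower consumption figure per home, the same 10 GW stretches to more homes, which is why estimates in this range are quoted as "about 7 million".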
The big US tech companies are looking for a breakthrough. OpenAI, Microsoft, Google, Amazon, xAI and Meta are all in the same boat, and the fear is that they will fall back on fossil fuels, an easily accessible source. Elon Musk is doing exactly that with xAI in Memphis, using gas turbines for its 'Colossus' supercomputer because gas on the grid is an easy solution, reflecting a broader trend within the industry.
The talk now is of using clean energy instead, and it seems as if everybody is buying nuclear capacity right now. Nuclear is available around the clock (unlike solar and wind), but not enough of it exists, and debates continue about nuclear expansion and its role in the energy mix.
Combining sources might be a possible solution, e.g. nuclear fusion plants, smaller nuclear reactors, and batteries that store solar and wind power for around-the-clock use. In a recent podcast Musk emphasized solar power, talking about solar-powered satellites aiming for 100 gigawatts a year. He said, "compared to the sun, all other energy sources are like cavemen throwing twigs into a fire." He went on to say that "there is a lot of energy out there, we just have to capture it. We are not going to run out of it." He also mentioned a theoretical future of orbital data centres, saying "one millionth of the sun's energy will be more than a thousand times as much energy as can be produced on earth."
Other energy saving solutions that could be looked at are:
- Liquid cooling and immersion cooling, replacing traditional air cooling.
- Carbon-aware computing: shifting non-urgent AI training workloads to times and locations where clean energy is most abundant.
- High-efficiency hardware: going for better performance per watt.
- AI for grid management: using AI to improve the integration of renewable energy into the grid.
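The carbon-aware computing idea in the list above can be made concrete with a small scheduling sketch. This is a hypothetical example: the hourly carbon-intensity forecast is made-up data, and the `greenest_window` helper is an illustrative name, not a real API. In practice such figures would come from a grid-data provider.

```python
# Hypothetical hourly carbon-intensity forecast (gCO2/kWh) for one day.
# Low midday values mimic abundant solar generation on the grid.
forecast = [430, 420, 410, 400, 390, 370, 340, 300,
            250, 200, 170, 150, 140, 150, 180, 230,
            290, 350, 400, 430, 450, 460, 450, 440]

def greenest_window(forecast, duration_hours):
    """Return (start_hour, avg_intensity) of the cleanest contiguous window."""
    best_start, best_avg = None, float("inf")
    for start in range(len(forecast) - duration_hours + 1):
        avg = sum(forecast[start:start + duration_hours]) / duration_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Find the cleanest 4-hour slot for a deferrable training job.
start, avg = greenest_window(forecast, duration_hours=4)
print(f"Schedule the 4-hour training job at hour {start} "
      f"(average {avg:.1f} gCO2/kWh)")
```

With this toy forecast, the scheduler lands the job over midday, when the simulated solar supply pushes carbon intensity to its minimum; a real system would apply the same logic across regions as well as hours.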
From the outside, it looks as if these AI companies are turning into energy utility companies in their urgent pursuit of an energy breakthrough.
One is left wondering whether the future currency might be … wattage?
New business models could see individuals and businesses generating surplus energy (e.g. through solar panels) and then selling their excess wattage back to the grid or to other consumers, fostering a decentralised energy market.
Interesting times we live in.
Are your current systems and processes hindering your business from achieving its next growth milestone? Now there is a smarter way to get work done.