The AI Energy Meltdown (continued)

 Introduction

Computer Chips, CPUs and GPUs 

Conversations at the World Economic Forum in Davos this year were dominated by the topic of AI. 

 Jensen Huang, CEO of NVIDIA, told delegates that infrastructure to generate intelligence is being built all over the world:

- Chip factories

- Computer factories 

- AI factories/Data centres

Let’s talk about the first of these and its energy consumption, seeing that data centres and their implications for the world’s energy resources were discussed in the previous blog.

Chip Factories 

The computer chip, also known as an integrated circuit (IC), a microchip, or simply a chip, is the fundamental building block of modern technology. Chips are tiny, thin, flat pieces of silicon, made in factories such as Intel's in Silicon Valley and TSMC's in Taiwan, and each one holds an entire electronic circuit. (Silicon is used because it is a semiconductor, meaning its ability to conduct electricity can be manipulated, making it ideal for controlling electrical flow.)

Within a chip, transistors act as miniature electrical switches that turn a current on or off, creating binary code (0s and 1s) to perform calculations, process data, and store or transmit information, effectively serving as the "brains" or memory of electronic devices. The chips that manage all of a computer's hardware and software operations are called central processing units (CPUs); they are best suited to web browsing, running programs, databases, and general computing.
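To make the "on/off switches become binary code" idea concrete, here is a toy sketch (purely illustrative, not real hardware): each transistor's state is read as a 0 or a 1, and a row of them represents a number.

```python
# Purely illustrative: model four transistor states as on/off bits.
switches = [1, 0, 1, 1]  # most significant bit first

def switches_to_number(bits):
    """Interpret a list of on/off states as a binary number."""
    value = 0
    for bit in bits:
        value = value * 2 + bit  # shift left and add the next bit
    return value

print(switches_to_number(switches))  # 11 (binary 1011)
```

Four switches can represent sixteen values; a modern chip packs billions of such switches, which is how it calculates, stores, and moves data.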

Then came the GPU. A Graphics Processing Unit (a term NVIDIA introduced in 1999) contains thousands of smaller, simpler cores, optimized for handling large and complex volumes of data. GPUs excel at highly parallel tasks like rendering visuals during gameplay, manipulating video data during content creation, and computing results in intensive AI workloads.

The first GPUs were built to speed up 3D graphics rendering, making movie and video game scenes more realistic and engaging. They were quickly deployed in other areas thanks to their high-speed parallel processing. Unlike CPUs, GPUs often run at 100% capacity during rendering, AI training, or gaming, consuming 100 to over 450 W (and 1,000 W or more for AI chips), roughly four times the power of a typical CPU. They need massive amounts of energy and cooling, because they run thousands of small, simultaneous tasks rather than a few complex ones.
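The "thousands of small, simultaneous tasks" point can be sketched with a toy example (illustrative only, no real GPU code): the same simple operation is applied independently to every element, which is exactly the shape of work a GPU's many cores can run at the same time.

```python
# Toy illustration of data parallelism: brightening an image means
# applying the same tiny operation to every pixel independently.
# On a GPU, each pixel could in principle go to a separate core.
def scale_pixels(pixels, factor):
    """Scale each pixel value, capping at the 255 maximum."""
    return [min(255, int(p * factor)) for p in pixels]

brightened = scale_pixels([10, 100, 200], 1.5)
print(brightened)  # [15, 150, 255]
```

A CPU would walk through the pixels a few at a time; a GPU hands each one to its own core, which is why it finishes rendering and AI workloads so much faster, and why it draws so much more power while doing it.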

Almost all the electrical energy a chip consumes is converted into heat by electrical resistance, so cooling is essential; if not cooled properly, a GPU will exceed its safe temperature and "throttle" (slow down) to avoid permanent damage.
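Throttling itself is simple in principle. Here is a hypothetical sketch of the logic (the names and thresholds are made up for illustration, not taken from any real GPU driver): when the temperature sensor crosses a safe limit, the chip drops its clock speed to shed heat.

```python
# Hypothetical throttling logic; thresholds are illustrative only.
MAX_SAFE_TEMP_C = 90        # assumed safe-temperature limit
BASE_CLOCK_MHZ = 2000       # assumed normal clock speed
THROTTLED_CLOCK_MHZ = 1200  # assumed reduced clock speed

def target_clock(temp_c):
    """Pick a clock speed: slow down when the chip runs too hot."""
    if temp_c >= MAX_SAFE_TEMP_C:
        return THROTTLED_CLOCK_MHZ  # shed heat at the cost of speed
    return BASE_CLOCK_MHZ

print(target_clock(75))  # 2000
print(target_clock(95))  # 1200
```

This is why inadequate cooling shows up as sudden slowdowns: the hardware is protecting itself, trading performance for survival.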

AI Factories and Data centres 

Chip manufacturing is an extremely energy-intensive, 24/7 process requiring specialized cleanrooms, high-powered lasers, and precise temperature and humidity control. While a single fab (semiconductor fabrication plant) can use more power than a single data centre, there are far more data centres worldwide, so the data centre sector's aggregate consumption is higher.

In our previous blog we discussed the possible solutions currently on the table for the energy shortage in the USA. Experts estimate that, because of the surge in AI technology, the amount of power needed by US data centres could triple by 2028.

China does not seem to face the same looming crisis: it currently leads both in the speed at which it builds new generation capacity and in control of the renewable energy supply chain.

Conclusion

What is the implication of all of this for you and your small business? As with all things AI, the honest answer is that nobody knows what even the next week holds. Things are changing every day at the speed of light, and we can realistically only predict the next five minutes, but we will keep you updated on a regular basis.

Run your business, your way.

Your business is unique, but your software is off the shelf? Ditch the workarounds and let us build an ERP system that fits your teams.