How Hungry Is AI Technology?

Feb 7, 2024 By Ananya S, Writer Intern

Did you know that AI could soon need as much electricity as an entire country? 

In our rapidly evolving computer age, Artificial Intelligence, or AI, stands out as one of the most significant fields of study.

AI researchers strive to develop "intelligent" machines that are capable of learning and performing tasks to solve problems and achieve goals.

Applications of AI range from image and facial recognition programs to voice-powered virtual assistants such as Alexa and Siri, and language understanding and generation software such as the popular ChatGPT.

While this new field brings a lot of excitement for a brighter future, it also comes with a formidable issue: the powerful hardware behind AI consumes enormous amounts of electricity.

Experts predict that AI servers could use between 85 and 134 terawatt-hours (TWh) of electricity per year by 2027, which is comparable to the annual electricity usage of countries such as Argentina, the Netherlands, or Sweden. This could boost the world’s carbon emissions significantly. On top of that, the massive hardware behind AI systems also requires large amounts of water for cooling.

Regulations and Improvements

Since AI is a relatively new field, regulations to control its environmental impacts are scarce. 

However, last October, California passed new climate disclosure laws that require all large companies doing business in the state, including those that use AI, to disclose how much carbon their operations emit and the climate-related risks they face. These laws are the first of their kind and are expected to affect around 10,000 companies. Many hope that other states will adopt similar laws and that this will eventually become a federal standard.

Meanwhile, AI researchers are also trying to reduce the amount of energy that AI models consume. Researchers at MIT’s Lincoln Laboratory Supercomputing Center (LLSC) noted that training a single large AI model on graphics processing units (GPUs), the hardware that powers AI training, can consume around 1,300 megawatt-hours (MWh) of electricity. That is close to the amount of electricity used by 1,450 U.S. households in a month!
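
Here is a quick back-of-the-envelope check of that comparison (a simple sketch; the roughly 900 kilowatt-hours of monthly usage for an average U.S. household is an assumed figure, not one from the article):

```python
# Back-of-the-envelope check of the household comparison above.
# Assumption: an average U.S. household uses roughly 900 kWh per month.
TRAINING_ENERGY_MWH = 1_300        # energy to train one large AI model (from the article)
HOUSEHOLD_MONTHLY_KWH = 900        # assumed average U.S. household usage per month

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000      # 1 MWh = 1,000 kWh
households = training_energy_kwh / HOUSEHOLD_MONTHLY_KWH

print(f"{training_energy_kwh:,.0f} kWh ≈ {households:,.0f} households for one month")
# Prints roughly 1,444 households -- close to the 1,450 figure in the article.
```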

The researchers discovered that capping the amount of power a GPU is allowed to draw can decrease energy consumption by around 12-15%. Power-capped GPUs also run around 30 degrees Fahrenheit cooler, which means less water is needed to cool the system.
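
On NVIDIA hardware, a power cap like this can be set through the GPU's management library. The snippet below is only a minimal sketch of the idea, assuming the pynvml (nvidia-ml-py) package and an NVIDIA GPU; it is not the LLSC team's actual tool, the choice of the first GPU and an 80% cap are arbitrary examples, and lowering the limit usually requires administrator privileges.

```python
# Minimal sketch of GPU power capping, assuming the pynvml (nvidia-ml-py)
# package and an NVIDIA GPU; not the LLSC team's actual tool.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)            # first GPU (example choice)

current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)   # current limit, in milliwatts
capped_mw = int(current_mw * 0.8)                            # example: cap at 80% of current

# Lowering the limit usually requires administrator privileges.
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, capped_mw)
print(f"Power limit lowered from {current_mw / 1000:.0f} W to {capped_mw / 1000:.0f} W")

pynvml.nvmlShutdown()
```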

Additionally, the LLSC has developed new techniques for AI training, such as a tool that stops underperforming AI models early in the training process. This has been found to decrease the energy used for training models by around 80%.
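
LLSC's tool predicts how well a model will eventually perform and abandons the weak ones. The snippet below is only a generic sketch of that early-stopping idea, not the LLSC system itself, with hypothetical train_one_epoch and evaluate functions standing in for real training code.

```python
# Generic sketch of early termination (not LLSC's actual system):
# stop training a model whose validation score has stopped improving.
def train_with_early_stopping(train_one_epoch, evaluate, max_epochs=50, patience=3):
    """train_one_epoch and evaluate are hypothetical callables supplied by the caller."""
    best_score = float("-inf")
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch()
        score = evaluate()                     # e.g. validation accuracy

        if score > best_score:
            best_score = score
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1

        # Give up early on a model that is no longer improving,
        # instead of burning energy on the remaining epochs.
        if epochs_without_improvement >= patience:
            print(f"Stopped early at epoch {epoch + 1}; best score {best_score:.3f}")
            break

    return best_score
```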

What Next?

These energy-saving techniques not only cut costs in AI development but also encourage companies to look for further strategies to reduce energy consumption.

Recently, even the U.S. Air Force, which operates thousands of data centers, has been looking for ways to reduce its energy usage.

The massive energy usage of the AI industry is prompting many researchers and companies to search for the sweet spot where we can enjoy the benefits of AI while cutting down on its downsides.

Sources: NYT, BBC, MIT, Stanford.edu, Our World in Data