AI already accounts for an estimated 1-2% of the world's energy use. In 2022, data centers consumed 460 terawatt-hours (TWh), roughly equivalent to France's annual electricity demand. By conservative estimates, that consumption is set to at least double by 2026.
The data centers that make AI possible are rapidly accelerating their energy consumption, which will only worsen already disastrous environmental issues.
AI proponents claim the technology will help tackle issues associated with the climate crisis, such as inefficient resource consumption, as well as other complex social problems.
However, AI itself emits a significant amount of CO2, a driving force of climate change. Critics note that this undermines the very goals it claims to advance.
Conservative estimates suggest that AI is already consuming as much energy as some developed nations. Without a clear strategy for mitigating the issue, the problem is set to continue.
AI runs largely in data centers, but it also depends on hardware whose production and use raise their own sustainability concerns.
While a single "inference" (the phase in which a trained model is actually used) doesn't consume a great deal of energy, constant use adds up. In 2022, for example, Facebook's data centers performed trillions of inferences per day.
These inferences come from algorithmic functions such as generating recommendations. Every time one of Facebook's more than three billion users logs on and views their newsfeed, inferences are triggered.
It's not only Facebook. Online platforms that ask users to confirm their "humanness," often by identifying objects in images, trigger the same process.
While app developers do analyze energy consumption in the development stage, studies have shown that consumption is actually much higher in day-to-day use.
The organization Algorithm Watch uses the example of the BLOOM language model. Its development phase alone emitted 24.7 tons of CO2, and accounting for emissions from hardware production and operations brings the total to roughly double that figure.
Another example the organization highlights is PaLM, Google's AI language model. Although PaLM reportedly draws nearly 90% of its energy from carbon-free sources, a single run in its development phase produced 271.43 tons of CO2 emissions.
This amount of CO2 is roughly equivalent to the emissions of 1.5 commercial jet flights in the United States. While that may not sound like much given the thousands of flights crossing the US each day, it is still a significant amount of CO2.
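The comparison above can be sanity-checked with simple arithmetic. The per-flight figure below is merely what the article's own two numbers imply, not an official statistic:

```python
# Back-of-the-envelope check of the PaLM comparison cited above.
# Assumption: "1.5 commercial jet flights" refers to total CO2 per flight.
palm_run_tons = 271.43  # CO2 from a single PaLM development run (tons)
flights_equiv = 1.5     # number of flights the article equates this to

tons_per_flight = palm_run_tons / flights_equiv
print(f"Implied CO2 per flight: {tons_per_flight:.2f} tons")
# Implied CO2 per flight: 180.95 tons
```

A figure in that range is plausible for a large, fully loaded passenger jet on a long domestic route, which is why the comparison is often stated per whole flight rather than per passenger.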
This is especially important to remember when we consider a program like PaLM, which is supposedly a more sustainable model in comparison to some of its other AI competitors.
Beyond emissions, AI depends on enormous computing power, which of course requires energy. The servers run hot and must be cooled constantly to prevent overheating.
This is where water comes in: fresh water is used to cool power plants and AI servers. For language models like GPT-3 and LaMDA, Algorithm Watch estimates that millions of liters are used in the training phase alone.
The growing demand for AI systems means ever more water infrastructure will be needed. The figures speak for themselves: Google's water consumption increased by 20% in a single year, while Microsoft's increased by 34%.
Increasingly popular AI tools such as ChatGPT consume an estimated 500 ml (17 oz) of water for just a few questions and answers. With at least 100 million active users, the amount of water that ChatGPT alone consumes is alarming.
It's clear that AI's actual energy consumption needs to be accurately documented, but it also needs to be assessed and regulated. AI's sustainability footprint alone poses a real risk.
Some critics fear that legal measures and assessments could hinder AI's evolution and innovation, but reducing its power consumption is essential.
According to the Hebrew University, “to lower the power consumption is key,” otherwise the risk is that AI becomes “a rich man’s tool.”
As AI operates on the basis of machine learning, it will always consume a great deal of energy to train and operate the models.
It’s clear that the continuation of AI requires some parameters to ensure we can reap its benefits over the long run. Monitoring mechanisms to understand resource consumption, CO2 emissions, and hardware production are critical elements.
The fear is that any regulations will create conditions so inaccessible that they will produce limitations in terms of who can create and use AI models.
Some experts argue that the AI industry needs to make large-scale architectural changes that respond to environmental concerns, while adopting more efficient approaches to its computing needs.
These adjustments also apply to hardware development. Companies may need to invest in making technical tools like computer chips more efficient.
Google's 2024 environmental report noted that its carbon emissions had risen by nearly half over the previous five years, and Microsoft's by about 30%. A major source of that energy use is AI's constant need to access memory.
Experts argue that the key to reducing energy use is making memory access less energy-intensive and developing more efficient circuits.
This is a lengthy and expensive process that would require a great deal of investment in research. Therefore, experts don’t believe companies will speed toward finding energy solutions now, instead opting to expand their AI services while regulations remain unclear.
Sources: (Nature.com) (Algorithm Watch) (Fierce Network) (Wired)
Artificial intelligence is the future, but is it speeding up unsustainable energy consumption? That's what the numbers imply. According to the International Energy Agency (IEA), in 2022, AI consumed 2% of the world's energy. By 2026, that number is set to increase by up to nearly 130%. To put that figure into context, it is equivalent to Sweden's annual energy consumption. Experts worry about the sheer resource consumption that AI requires, warning that already detrimental environmental issues will only continue to accelerate.
By 2026, data centers will consume as much energy as Sweden
AI's unsustainable energy consumption