





























AI's energy use
- AI already accounts for an estimated 1-2% of the world’s energy use. In 2022, data centers consumed 460 terawatt-hours (TWh), roughly equivalent to France’s annual energy demand. By conservative estimates, that consumption is set to at least double by 2026 (a rough projection is sketched below).
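As a quick sketch of what those growth figures imply, the projection below combines only numbers quoted in this article (the 460 TWh baseline, the "at least double" conservative case, and the "nearly 130%" upper figure cited elsewhere in the piece); it is illustrative arithmetic, not an independent estimate.

```python
# Rough projection using only the figures quoted in this article (not independent estimates).
BASELINE_2022_TWH = 460        # data-center consumption in 2022, per the figure cited above
LOW_GROWTH = 2.0               # "set to at least double" by 2026 (conservative case)
HIGH_GROWTH = 1 + 1.30         # "increase by up to nearly 130%" (upper figure in this article)

low_2026 = BASELINE_2022_TWH * LOW_GROWTH      # ~920 TWh
high_2026 = BASELINE_2022_TWH * HIGH_GROWTH    # ~1,058 TWh
print(f"Projected 2026 range: {low_2026:.0f}-{high_2026:.0f} TWh")
```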
Rapid levels of consumption
- The data centers that make AI possible are increasing their energy consumption at a rapid pace that will only add to already severe environmental problems.
Climate crisis
- Proponents of AI claim that the technology will help tackle problems associated with the climate crisis, such as making resource consumption more efficient, as well as other complex social issues.
CO2 emissions
- However, AI itself emits a significant amount of CO2, a driving force of climate change. Critics note that this undercuts the very goals it claims to serve.
Matching consumption of developed nations
- Conservative estimates suggest that AI’s energy consumption already rivals that of some developed nations. Without a clear way to mitigate the issue, the problem is set to continue.
Sustainability
- AI runs largely in data centers, but it also depends on hardware whose production and use raise sustainability issues of their own.
Inference
- While the “inference” phase of AI use (running a trained model to produce an output) doesn’t consume much energy per request, constant use adds up. In 2022, for example, Facebook’s data centers performed trillions of inferences per day.
Algorithmic
- These inferences come from algorithmic functions such as serving recommendations. Every time one of Facebook’s more than three billion users logs on and views their newsfeed, inferences are triggered, and those tiny costs add up (a rough sketch follows below).
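To see why per-request costs matter at this scale, here is a minimal sketch. The per-inference energy value is purely hypothetical (it does not come from the article or from Meta); only the "trillions per day" order of magnitude comes from the text above.

```python
# Illustrative only: how a small per-inference energy cost accumulates at platform scale.
ENERGY_PER_INFERENCE_J = 0.1     # hypothetical: 0.1 joule per recommendation inference
INFERENCES_PER_DAY = 2e12        # "trillions per day" (order of magnitude from the article)

joules_per_day = ENERGY_PER_INFERENCE_J * INFERENCES_PER_DAY
kwh_per_day = joules_per_day / 3.6e6   # 1 kWh = 3.6 million joules
print(f"~{kwh_per_day:,.0f} kWh per day at these assumed figures")
```

Even with a deliberately small assumed cost per request, the daily total runs to tens of megawatt-hours; real recommendation workloads may cost considerably more per inference.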
Humanness
- It’s not only Facebook. Online platforms that, for example, ask users to confirm they are human, often by identifying objects in images, trigger the same kind of inference workload.
Development
- While app developers do analyze energy consumption in the development stage, studies have shown that consumption is actually much higher in day-to-day use.
Emissions
- The organization Algorithm Watch uses the example of the BLOOM model: its development phase emitted 24.7 tons of CO2, a figure that excludes hardware production and operations; including those brings the total to roughly double the reported amount.
PaLM
- Another example the organization highlights is PaLM, Google’s AI language model. Although PaLM receives nearly 90% of its energy from carbon-free sources, a single run in its development phase resulted in 271.43 tons of CO2 emissions.
Significant CO2
- This amount of CO2 is equivalent to roughly 1.5 commercial jet flights in the United States. That may not sound like much given the thousands of flights that take off in the US each day, but it is still a significant amount of CO2 (the implied per-flight figure is worked out below).
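For scale, the two numbers above can be rearranged to show the per-flight emissions the comparison implies; this is simply the article’s own figures divided, not an independent estimate.

```python
# Implied emissions per flight, derived only from the comparison quoted above.
PALM_RUN_TONS_CO2 = 271.43   # single PaLM development run
EQUIVALENT_FLIGHTS = 1.5     # "equivalent to 1.5 commercial jet flights"

tons_per_flight = PALM_RUN_TONS_CO2 / EQUIVALENT_FLIGHTS
print(f"~{tons_per_flight:.0f} tons of CO2 per flight implied")   # ~181 tons
```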
Comparison
- This is especially worth remembering given that PaLM is supposedly one of the more sustainable models compared with its AI competitors.
Computing power
- Beyond emissions, AI relies on raw computing power, which of course requires energy. The servers running it get hot and must be cooled constantly to prevent overheating.
Water use
- This is where water comes in. Fresh water is used to cool down power plants and AI servers. For language models like GPT-3 and LaMDA, Algorithm Watch claims that millions of liters are used just in the training phase.
More water needed
- As AI systems grow and demand increases, more and more water infrastructure will need to be available. The figures speak for themselves: Google’s water consumption increased by 20% in a single year, while Microsoft’s increased by 34%.
ChatGPT
- Increasingly popular AI tools such as ChatGPT consume around 500 ml (17 oz) of water over just a few questions and answers. With at least 100 million active users, the amount of water that ChatGPT alone consumes is alarming (a rough tally follows below).
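A deliberately rough tally using only the figures above; the assumption that each active user triggers one such 500 ml exchange per day is mine and purely illustrative.

```python
# Back-of-the-envelope water estimate built from the article's figures plus one stated assumption.
WATER_PER_EXCHANGE_L = 0.5       # 500 ml for "a few questions and answers"
ACTIVE_USERS = 100_000_000       # "at least 100 million active users"
EXCHANGES_PER_USER_PER_DAY = 1   # illustrative assumption, not from the article

liters_per_day = WATER_PER_EXCHANGE_L * ACTIVE_USERS * EXCHANGES_PER_USER_PER_DAY
print(f"~{liters_per_day / 1e6:.0f} million liters per day")   # ~50 million liters per day
```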
AI and sustainability
- It’s clear that AI’s actual energy consumption needs to be accurately documented, but it also needs to be assessed and regulated. In terms of sustainability alone, the use of AI already carries emerging risks.
Will a legal framework hinder AI?
- Some critics fear that legal measures and assessments may hinder AI’s evolution and innovation, but reducing its power consumption is essential.
Preventing AI from becoming inaccessible
- According to the Hebrew University, “to lower the power consumption is key,” otherwise the risk is that AI becomes “a rich man’s tool.”
Machine learning
- As AI operates on the basis of machine learning, it will always consume a great deal of energy to train and operate the models.
Parameters needed
- It’s clear that AI’s continued development requires some parameters to ensure we can reap its benefits over the long run. Monitoring mechanisms that track resource consumption, CO2 emissions, and hardware production are critical elements.
Inaccessibility
- The fear is that regulation could create barriers so high that they limit who can create and use AI models.
Architectural changes needed
- Some experts argue that the AI industry needs large-scale architectural changes that respond to environmental concerns, while also embracing more efficient ways of meeting the industry’s needs.
Hardware development
- These adjustments also apply to hardware development. Companies may need to invest in making technical tools like computer chips more efficient.
Accessing memory
- Google’s 2024 environmental report noted that its carbon emissions had risen by roughly half over the previous five years, while Microsoft has reported an increase of about 30%. A major source of AI’s energy use is its constant need to access memory (a simplified sketch follows below).
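The sketch below illustrates why memory access, rather than the arithmetic itself, tends to dominate the energy bill. The per-operation figures are rough, illustrative ballpark values (they do not come from this article or from Google’s report); only the relative gap between compute and off-chip memory access matters here.

```python
# Illustrative only: energy of re-reading data from off-chip memory vs. doing the arithmetic.
ENERGY_PER_MAC_PJ = 1.0          # assumed picojoules for one multiply-accumulate
ENERGY_PER_DRAM_READ_PJ = 600.0  # assumed picojoules to fetch one value from off-chip DRAM

ops = 1_000_000                  # operations in a hypothetical model layer
compute_pj = ops * ENERGY_PER_MAC_PJ
memory_pj = ops * ENERGY_PER_DRAM_READ_PJ   # worst case: every operand fetched from DRAM

print(f"memory energy is ~{memory_pj / compute_pj:.0f}x the compute energy in this worst case")
```

This is why the efficiency efforts described next focus on making memory access cheaper and keeping data closer to the processing circuits, not just on faster arithmetic.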
More efficient circuits
- Experts argue that the key to reducing energy use is to make memory access less energy-intensive and to develop more efficient circuits.
Investment needed
- This is a lengthy and expensive process that would require a great deal of investment in research. Experts therefore don’t expect companies to rush toward energy solutions now, opting instead to expand their AI services while regulations remain unclear.
Sources: (Nature.com) (Algorithm Watch) (Fierce Network) (Wired)
By 2026, data centers will consume as much energy as Sweden
AI's unsustainable energy consumption
Artificial intelligence is the future, but is it accelerating unsustainable energy consumption? The numbers suggest so. According to the International Energy Agency (IEA), AI and the data centers behind it consumed around 2% of the world’s energy in 2022. By 2026, that figure is set to rise by up to nearly 130%, an increase roughly equivalent to Sweden’s annual energy consumption. Experts worry about the sheer resource consumption that AI requires, warning that already serious environmental problems will only accelerate.
Want to learn more about AI's worrying energy consumption? Click on.