Environmental impact of artificial intelligence
From Wikipedia, the free encyclopedia
The environmental impact of the design, training, deployment and use of artificial intelligence includes the greenhouse gas emissions from generating electricity for data centres and computing hardware, operational and upstream water use, and material impacts from hardware manufacturing, mining and electronic waste.[1][2][3][4]


Estimating AI's environmental effects can be difficult because results depend on how impacts are measured, including whether accounting includes only model computation or also data-centre overhead, idle capacity, hardware manufacture, and local electricity supply.[5]
As these issues have received greater attention, governments and regulators have increasingly considered data-centre reporting requirements, energy-efficiency standards, and broader transparency measures for AI-related resource use.[6][7]
Carbon footprint and energy use
AI-related energy use arises at multiple stages, including model training, fine-tuning, inference, storage, networking, and supporting infrastructure such as cooling and power conversion.[2]
Individual level

Published estimates of energy use per AI request vary widely across models, tasks and measurement methods.[5] A benchmark study presented at the 2024 ACM Conference on Fairness, Accountability, and Transparency found substantial differences between task types, with relatively low energy use for text-based tasks and much higher energy use for image generation under the study's test conditions.[9] In that benchmark, simple classification tasks consumed about 0.002–0.007 Wh per prompt on average (about 9% of a smartphone charge for 1,000 prompts), while text generation and text summarisation each used about 0.05 Wh per prompt; image generation averaged 2.91 Wh per prompt, and the least efficient image model in the study used 11.49 Wh per image (roughly equivalent to half a smartphone charge).[9]
First-party measurements in production environments have also been published. A 2025 Google study on Gemini assistant serving reported median per-prompt energy, emissions, and water-use estimates under the authors' accounting framework, while noting that different system boundaries can produce substantially different results.[5] The study reported a median text-prompt estimate of about 0.24 Wh, which is roughly as much energy as watching nine seconds of television. The study also stated that software and infrastructure improvements reduced energy use by a factor of 33 and carbon emissions by a factor of 44 for a typical prompt over one year within the authors' framework.[5]
Researchers at the University of Michigan measured the energy consumption of various Meta Llama 3.1 models released in 2024 and found that smaller language models (8 billion parameters) use about 114 joules (0.03167 Wh) per response, while larger models (405 billion parameters) require up to 6,700 joules (1.861 Wh) per response. This corresponds to the energy needed to run a microwave oven for roughly one-tenth of a second and eight seconds, respectively.[10]
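The unit conversions behind these comparisons are straightforward to reproduce. A minimal sketch, assuming a typical microwave-oven power draw of about 800 W (an illustrative figure, not one stated in the cited study):

```python
JOULES_PER_WH = 3600.0  # 1 Wh = 3600 J

def joules_to_wh(joules: float) -> float:
    """Convert energy in joules to watt-hours."""
    return joules / JOULES_PER_WH

def microwave_seconds(joules: float, microwave_watts: float = 800.0) -> float:
    """Seconds an appliance of the given power would run on this energy."""
    return joules / microwave_watts

# Per-response energy for the measured Llama 3.1 models:
print(round(joules_to_wh(114), 4))    # 8B-parameter model: ~0.0317 Wh
print(round(joules_to_wh(6700), 3))   # 405B-parameter model: ~1.861 Wh
print(round(microwave_seconds(114), 2))   # ~0.14 s of microwave use
print(round(microwave_seconds(6700), 1))  # ~8.4 s of microwave use
```

The "one-tenth of a second" and "eight seconds" figures in the text follow directly from dividing the per-response energy by an assumed microwave wattage in this range.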
Comparisons between AI systems and human labour for specific tasks have produced mixed results and remain sensitive to assumptions about output quality, workload and system boundaries. A 2024 study in Scientific Reports reported 130 to 2900 times lower estimated carbon emissions for selected AI systems than for human writers and illustrators under its assumptions.[11] A later Scientific Reports paper reported a counterexample for programming tasks under its assumptions, finding 5 to 19 times higher estimated emissions for the evaluated AI system than for human programmers on the benchmark used in that study.[12]
System level
Energy use and efficiency

AI electricity intensity depends not only on model architecture but also on hardware and facility efficiency. Data-centre operators commonly report power usage effectiveness (PUE), which measures the ratio of total facility energy to IT equipment energy; a lower PUE indicates less overhead energy for cooling and other supporting infrastructure.[5]
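As a sketch of how the metric works, PUE is a simple ratio; the figures below are hypothetical and not drawn from any operator's disclosure:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    A PUE of 1.0 would mean every kWh delivered to the site reaches
    IT equipment; values above 1.0 reflect cooling and other overhead.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 12 GWh/yr total draw, 10 GWh/yr to IT equipment.
print(pue(12_000_000, 10_000_000))  # 1.2 -> 0.2 kWh of overhead per kWh of IT load
```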
Operators may also publish metrics and case studies on hardware efficiency, cooling systems and power sourcing. In its 2024 environmental report, Google stated that its 2023 total greenhouse gas emissions increased 13% year over year, primarily because of increased data-centre energy consumption and supply-chain emissions, while also reporting lower PUE than industry averages for its own facilities.[16]
The International Energy Agency has also reported that data centres account for a relatively small share of global electricity use overall, but that their local effects can be much more pronounced because demand is geographically concentrated.[1]
Carbon footprint
At system level, AI contributes to rising electricity demand in data centres and related infrastructure. The International Energy Agency estimated that data centres used about 415 TWh of electricity in 2024, or around 1.5% of global electricity consumption, and projected that data-centre electricity use could rise to about 945 TWh by 2030, with AI identified as the main driver of that growth alongside other digital services.[1]
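The global total implied by these two figures can be backed out directly; the result below is derived from the cited estimates rather than independently reported:

```python
dc_twh_2024 = 415.0  # IEA estimate for data-centre electricity use, 2024
share = 0.015        # ~1.5% of global electricity consumption

# Implied global electricity consumption consistent with both figures:
global_twh = dc_twh_2024 / share
print(round(global_twh))  # ~27667 TWh

# Projected 2030 data-centre use as a multiple of 2024:
print(round(945.0 / dc_twh_2024, 2))  # ~2.28x growth over the period
```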
The carbon footprint of AI systems depends strongly on electricity sources, hardware efficiency, utilisation rates, and what stages are included in the accounting. Training large models can require substantial electricity, while total lifecycle impacts also depend on deployment scale and the amount of inference performed after training.[2][17][5]
Early analyses of frontier-model development reported rapid historical growth in training compute for selected systems, although subsequent trends have varied with changes in model design, hardware and efficiency gains.[18]
Accounting methods that include upstream or embodied impacts, such as hardware manufacture and facilities construction, can materially affect estimates of AI-related emissions.[17][4]
Decisions and strategies by individual companies
Large technology companies have reported that the expansion of AI and cloud infrastructure affects their sustainability targets, electricity demand, and resource use. Google, for example, attributed part of its emissions growth in 2023 to increased data-centre energy consumption and supply-chain emissions in its 2024 environmental report.[16]
Cloud and AI companies have also announced measures intended to reduce environmental impacts, including investment in more efficient hardware, low-carbon electricity procurement, alternative cooling systems, and water stewardship programmes. The extent, comparability, and third-party verification of such disclosures vary between firms and jurisdictions.[16][6]
Water usage
Data centres can use water directly for cooling and indirectly through the water used in electricity generation, depending on the local energy mix.[3] Public reporting on data-centre water use has often been inconsistent, making comparisons between operators and regions difficult.[3]
To standardise operational reporting, The Green Grid proposed the metric water usage effectiveness (WUE), defined as annual site water use divided by IT equipment energy use.[19] WUE does not by itself measure local water stress, source sustainability, or all upstream water impacts.[19][3] Studies of AI water use also distinguish between water withdrawal and water consumption[a].[21]
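Like PUE, WUE is defined as a simple ratio; a minimal sketch with hypothetical inputs (not taken from any operator's reporting):

```python
def wue(annual_site_water_litres: float, it_energy_kwh: float) -> float:
    """Water usage effectiveness: annual site water use (L) / IT energy (kWh).

    Expressed in litres per kWh of IT equipment energy. By construction it
    captures onsite operational water only, not upstream or electricity-
    related water use, and says nothing about local water stress.
    """
    return annual_site_water_litres / it_energy_kwh

# Hypothetical site: 18 million litres of cooling water per year
# against 10 GWh (10 million kWh) of annual IT energy.
print(wue(18_000_000, 10_000_000))  # 1.8 L/kWh
```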
Research on AI-specific water use has argued that the water footprint of AI systems can be difficult to observe and may vary substantially by location, cooling design, and electricity source. A 2025 Communications of the ACM article summarised methods for estimating AI water footprints and emphasised the distinction between water withdrawal and water consumption.[21]
Li and colleagues estimated that global AI water withdrawal could reach 4.2–6.6 billion cubic metres in 2027 under the scenarios examined in their article.[21] Using GPT-3, released by OpenAI in 2020, as an example, they estimated that training the model in Microsoft's U.S. data centres could consume about 700,000 litres of onsite water and about 5.4 million litres in total when offsite electricity-related water use was included; they also estimated that 10–50 medium-length GPT-3 responses could consume about 500 mL of water, depending on when and where the model was deployed.[21] Published prompt-level estimates have also varied by system and accounting framework: the 2025 Google study on Gemini assistant serving reported a median text-prompt estimate of about 0.26 mL under its framework.[5]
Location can materially affect the significance of data-centre water use. Research on U.S. data centres found that one-fifth of servers' direct water footprint came from moderately to highly water-stressed watersheds, while nearly half of servers were fully or partially powered by plants located in water-stressed regions.[22] A 2025 Reuters report, citing data from Verisk Maplecroft and NatureFinance, said that an average mid-sized data centre uses about 1.4 million litres of water per day for cooling and that Phoenix would experience a 32% increase in annual water stress if currently planned data centres come online.[23]
Water use also occurs upstream in semiconductor fabrication, which relies on large quantities of ultrapure water.[24]
E-waste
AI systems depend on specialised computing hardware, and rapid turnover in servers and accelerators may contribute to rising e-waste.[4] According to the Global E-waste Monitor 2024, the world generated an estimated 62 million tonnes of e-waste in 2022, and the total was projected to rise to 82 million tonnes by 2030 under its scenarios.[25] The World Health Organization has also identified e-waste as a growing environmental and public-health issue.[26]
A 2024 study in Nature Computational Science estimated that generative AI could add between 1.2 and 5 million tonnes of e-waste by 2030 under the scenarios examined by the authors.[4] In the study's higher-end scenarios, this would represent up to 12% of projected global e-waste by 2030.[4] The authors also estimated that circular-economy strategies along the generative-AI value chain could reduce AI-related e-waste generation by 16–86%.[4]
Mining
AI hardware depends on complex supply chains for metals, minerals and manufactured components. UNCTAD has reported that the expansion of digital infrastructure increases demand for raw materials and raises environmental and distributional concerns linked to extraction, processing and manufacturing.[27]
Specialised chips used in AI systems can depend on supply chains involving critical minerals and other materials whose extraction and processing may have significant environmental and social effects. These impacts are not unique to AI, but may increase as demand for AI-related hardware grows.[27]
Social impact and environmental justice
The environmental effects of AI-related infrastructure are not distributed evenly. Research on U.S. data centres has found that their environmental footprints vary by region and may intersect with local electricity systems, water availability and existing environmental burdens, with a substantial share of servers' direct water use and power supply linked to water-stressed areas.[22]
Concerns have also been raised about local air pollution, permitting and grid stress in communities hosting AI-related facilities and associated power infrastructure. In 2025, civil-rights and environmental groups challenged permits connected to an xAI facility in the Memphis area, arguing that air-pollution burdens could fall disproportionately on historically overburdened neighbourhoods. The dispute has been the subject of regulatory and legal proceedings.[28][29][30]
Climate solutions
Despite concerns about its environmental footprint, AI has been used in environmental and climate-related applications, including weather forecasting, Earth observation, and optimisation in transport and energy systems.
In weather forecasting, peer-reviewed studies have reported strong results for some AI-based forecasting systems under specific evaluation frameworks. A 2023 Nature paper on Pangu-Weather reported strong medium-range forecasting performance relative to a leading numerical weather prediction system in the study's evaluation.[31] AI has also been used in research on extreme weather and climate-event modelling.[32]
AI has also been proposed for mitigation-oriented optimisation. Google's Green Light project, for example, uses traffic data and machine learning to recommend traffic-signal timing adjustments intended to reduce stop-and-go traffic and associated emissions at intersections.[33]
Whether AI produces net environmental benefits at large scale remains an open question, because outcomes depend on deployment choices, rebound effects, additional infrastructure demand and the extent to which electricity and cooling systems are decarbonised.[1]
Debate over the use of AI for environmental research
There is ongoing debate over the balance between the possible environmental benefits of AI applications and the environmental costs of scaling AI systems. This includes discussion of transparency, efficiency, rebound effects, and the extent to which AI-related infrastructure growth may offset environmental gains from specific applications.[2][5]
Policy and regulation
United States
In the United States, proposals have been introduced to study and standardise reporting on AI's environmental impacts. The Artificial Intelligence Environmental Impacts Act of 2024 (S. 3732), introduced in the Senate in February 2024, would require a federal study on the environmental impacts of AI, direct the National Institute of Standards and Technology to convene a consortium on measurement and standards, and establish a voluntary reporting system.[7]
European Union
In the European Union, the Energy Efficiency Directive introduced reporting obligations for large data centres. The European Commission has stated that a European database collects information relevant to the energy performance and water footprint of data centres, and that a delegated regulation sets out the information and key performance indicators for the reporting scheme.[6]
EU member states also maintain national AI strategies, some of which include references to sustainability, energy efficiency, or environmental applications of AI.
France
France's AI strategy documents have discussed AI in relation to ecological transition and environmental applications, including the use of digital infrastructure and data for environmental policy.[34]
Germany
Germany's national AI strategy includes sections on the environmental impacts of AI and on research into energy-efficient and sustainable AI applications.[35]
Italy
Italy's national AI strategy documents include sustainability-related priorities and discuss AI applications in areas such as environment, infrastructure, and sustainable development goals.[36]