
Artificial Intelligence Becomes a Major Consumer of Energy and Water: How the Growth of Neural Networks Impacts Climate and Creates Risks and Opportunities for Investors and the Global Economy
The Rapid Growth of AI and Its Energy Appetite
The demand for AI computing power has soared in recent years. Since the launch of public neural networks like ChatGPT in late 2022, businesses worldwide have accelerated the adoption of artificial intelligence models, requiring vast amounts of data processing. Industry estimates suggest that by 2024 AI already accounted for roughly 15-20% of the total energy consumption of data centers globally. The power needed to run AI systems could reach 23 GW by 2025, comparable to the average electricity demand of a country like the United Kingdom. That figure also exceeds the consumption of the entire Bitcoin mining network, making AI one of the most energy-intensive forms of computation.
This exponential growth is driven by massive investments from technology companies in infrastructure: new data centers open almost weekly, and new production lines for specialized machine-learning chips are launched every few months. The expansion of this infrastructure directly increases the electricity needed to power and cool the thousands of servers that run modern neural networks.
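As a rough illustration of what such a power figure means, the back-of-envelope calculation below (in Python) converts a continuous 23 GW draw into annual energy. The UK and Bitcoin consumption values are approximate public estimates used here only for comparison, not figures from this article.

```python
# Back-of-envelope: what a 23 GW continuous draw means in annual energy terms.
# Figures other than the 23 GW estimate are approximate, for illustration only.

AI_POWER_GW = 23            # projected AI power demand (from the estimate above)
HOURS_PER_YEAR = 8760

ai_energy_twh = AI_POWER_GW * HOURS_PER_YEAR / 1000    # GW * h -> TWh
print(f"AI annual energy: ~{ai_energy_twh:.0f} TWh")   # ~201 TWh

UK_ANNUAL_TWH = 260         # rough UK annual electricity consumption (assumption)
BITCOIN_ANNUAL_TWH = 150    # rough Bitcoin network consumption (assumption)

print(f"Share of UK consumption: ~{ai_energy_twh / UK_ANNUAL_TWH:.0%}")
print(f"Relative to Bitcoin:     ~{ai_energy_twh / BITCOIN_ANNUAL_TWH:.1f}x")
```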
Emissions on the Scale of a Metropolis
Such high energy consumption inevitably leads to substantial greenhouse gas emissions, especially where part of that energy comes from fossil fuels. According to recent research, AI is projected to be responsible for 32-80 million metric tons of CO2 emissions per year by 2025. This effectively puts the carbon footprint of AI on the level of an entire city: annual emissions from New York City, for example, are around 50 million tons of CO2. For the first time, a technology that seemed purely digital is affecting the climate on the scale of a large industrial sector.
It is important to note that these estimates are conservative. They mainly cover emissions from generating the electricity that runs the servers, while the full lifecycle of AI, from manufacturing the hardware (servers and chips) to its disposal, adds a further carbon footprint. If the AI boom continues at its current pace, the associated emissions will grow rapidly. That complicates global efforts to cut greenhouse gases and forces technology companies to reconcile the explosive growth of AI with their carbon-neutrality commitments.
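To see how an energy estimate turns into an emissions range, the simple calculation below multiplies the annual energy implied by the 23 GW figure by two illustrative grid carbon intensities. The intensity values are assumptions chosen to bracket the cited range, not numbers from the underlying research.

```python
# Rough conversion of AI electricity use into CO2 emissions.
# The energy figure follows from the 23 GW estimate; carbon intensities are
# illustrative assumptions spanning cleaner and dirtier grid mixes.

ai_energy_twh = 201                      # ~23 GW running year-round
kwh = ai_energy_twh * 1e9                # TWh -> kWh

for label, kg_co2_per_kwh in [("low-carbon grid mix", 0.16),
                              ("fossil-heavy grid mix", 0.40)]:
    mt_co2 = kwh * kg_co2_per_kwh / 1e9  # kg -> million metric tons
    print(f"{label}: ~{mt_co2:.0f} Mt CO2 per year")
# Spans roughly the 32-80 Mt range cited above.
```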
The Water Footprint of Neural Networks
Another hidden resource "appetite" of AI is water. Data centers consume vast amounts of water to cool servers and equipment: evaporative cooling and air conditioning rely heavily on water. Beyond this direct use, significant volumes are consumed indirectly at power plants, where water cools turbines and reactors while generating the electricity that computing clusters draw. Experts estimate that AI systems alone could consume between 312 and 765 billion liters of water by 2025, comparable to the total volume of bottled water consumed worldwide in a year. Neural networks thus create a colossal water footprint that has so far remained largely invisible to the public.
Official estimates often fail to capture the full picture. The International Energy Agency, for instance, reported that all data centers worldwide used roughly 560 billion liters of water in 2023, but that figure excluded water used at power plants. The true water footprint of AI could therefore be several times higher than the formal estimates. Major players remain reluctant to disclose details: in a recent report on its AI systems, Google explicitly stated that its metrics do not account for water consumed at third-party power stations. That approach has drawn criticism, since a significant share of AI's water footprint comes from generating the electricity it consumes.
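A similar back-of-envelope calculation shows where water estimates of this magnitude can come from: electricity use multiplied by water intensity, counting both on-site cooling and the water consumed at power plants. The intensity values below are illustrative assumptions, not figures from the studies cited above.

```python
# Rough sense of how a water estimate in the hundreds of billions of liters
# arises: electricity use times water intensity (on-site cooling plus indirect
# water consumed in power generation). Intensities are assumed for illustration.

ai_energy_twh = 201
kwh = ai_energy_twh * 1e9

onsite_l_per_kwh = 0.5      # evaporative cooling in the data center (assumption)
offsite_l_per_kwh = 1.5     # cooling at thermoelectric power plants (assumption)

total_liters = kwh * (onsite_l_per_kwh + offsite_l_per_kwh)
print(f"Illustrative water footprint: ~{total_liters / 1e9:.0f} billion liters/year")
# ~400 billion liters with these assumptions, within the 312-765 billion range.
```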
The scale of water consumption is already raising concerns in various regions. In arid parts of the U.S. and Europe, communities are opposing the construction of new data centers, fearing they will siphon scarce water from local sources. Corporations, meanwhile, are noticing the growing "thirst" of their server farms: Microsoft reported that global water consumption by its data centers jumped by 34% in 2022 (to 6.4 billion liters), largely due to the increased load from training AI models. These facts underscore that water is rapidly becoming a central factor in assessing the environmental risks of digital infrastructure.
Lack of Transparency Among Tech Giants
Paradoxically, given the scale of the impact, very little information about the energy and water consumption of AI is publicly available. Major tech companies typically report aggregate figures for emissions and resource use in their sustainability reports, without breaking out the share attributable to AI. Detailed operational data, such as how much energy or water goes specifically into neural-network computations, usually stays inside the companies. Data on "indirect" consumption, such as the water used to generate electricity for data centers, is especially scarce.
As a result, researchers and analysts often have to act like detectives, piecing the picture together from fragments: corporate presentations, estimates of the number of AI server chips sold, data from energy companies, and other indirect indicators. This opacity makes it hard to grasp the full scale of AI's environmental footprint. Experts are calling for stricter disclosure standards: companies should report the energy and water consumption of their data centers broken down by key areas, including AI. Such transparency would let society and investors objectively assess the impact of new technologies and push the industry to look for ways to reduce its environmental burden.
Mounting Environmental Risks
If current trends persist, AI's growing "appetite" could exacerbate existing environmental problems. Tens of millions of additional tons of greenhouse gas emissions each year would complicate meeting the goals of the Paris Agreement on climate, while the consumption of hundreds of billions of liters of freshwater would unfold amid a global water deficit projected to reach 56% by 2030. In other words, without sustainability measures, the expansion of AI risks colliding with the planet's ecological limits.
If no changes are made, such trends could lead to the following negative consequences:
- Acceleration of global warming due to increased greenhouse gas emissions.
- Exacerbation of freshwater shortages in already arid regions.
- Increased stress on energy systems and socio-environmental conflicts over limited resources.
Local communities and authorities are already beginning to respond. In some countries, restrictions are being placed on the construction of "energy-hungry" data centers, with requirements to use water-recycling systems or purchase renewable energy. Experts warn that without significant changes, the AI industry risks turning from a purely digital sector into a source of tangible environmental crises, from local droughts to disrupted climate plans.
Investor Perspective: The ESG Factor
Environmental aspects of AI's rapid development are becoming increasingly important for investors. In an era when ESG (environmental, social, and governance) principles take center stage, the carbon and water footprint of a technology directly affects company valuations. Investors are asking whether the policy shift toward greener regulation will raise costs for companies betting on AI: tighter carbon rules or fees on water use, for instance, could increase expenses for firms whose neural-network services consume large amounts of energy and water.
Conversely, companies that are already investing in mitigating the environmental impact of AI could gain a competitive advantage. Transitioning data centers to renewable energy, improving chips and software for greater energy efficiency, and implementing water reuse systems reduce risks and enhance reputations. The market highly values progress in sustainability: investors worldwide are increasingly incorporating environmental metrics into their business evaluation models. Thus, for technology leaders, the pressing question is how to continue scaling AI capabilities while meeting societal expectations for sustainability. Those who find the balance between innovation and responsible resource management will benefit in the long run—both in terms of their image and their business valuation.
The Path to Sustainable AI
Despite the scale of the problem, the industry has opportunities to channel AI growth toward sustainable development. Leading global technology companies and researchers are already working on solutions to reduce the environmental footprint of AI without stifling innovation. Key strategies include:
- Enhancing the energy efficiency of models and hardware. Developing optimized algorithms and specialized chips (ASICs, TPUs, and the like) that perform machine-learning tasks with lower energy consumption.
- Transitioning to clean energy sources. Powering data centers with low-carbon electricity (solar, wind, hydro, and nuclear) with the aim of driving the carbon emissions of AI operations toward zero. Many IT giants are already signing "green" contracts to procure clean energy for their needs.
- Reducing and recycling water consumption. Deploying cooling systems (liquid or immersion cooling) that need far less water, and reusing technical water. Siting data centers with water availability in mind, preferring regions with cooler climates or ample water resources. Studies show that careful siting and cooling choices can cut a data center's water and carbon footprint by 70-85%.
- Transparency and accounting. Introducing mandatory monitoring and disclosure of the energy and water consumption associated with AI infrastructure. Public accountability pushes companies to manage resources more effectively and lets investors track progress in reducing the burden on ecosystems.
- Leveraging AI for resource management. Paradoxically, artificial intelligence itself can help address the problem. Machine-learning algorithms are already used to optimize data-center cooling, forecast loads, and distribute tasks in ways that minimize peak demand on the grid and improve server utilization; a simplified scheduling sketch follows this list.
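To make the last point concrete, below is a minimal sketch, in Python, of carbon-aware scheduling: flexible AI jobs (for example, non-urgent training runs) are shifted into the hours with the lowest forecast grid carbon intensity. The hourly intensities, job sizes, and the greedy heuristic are illustrative assumptions, not a description of any production system.

```python
# Minimal sketch: schedule flexible AI jobs into the hours with the lowest
# forecast grid carbon intensity. Hourly intensities (gCO2/kWh) and job sizes
# are made-up values; real systems use live grid data and far richer models.

from typing import List, Tuple

def schedule_jobs(jobs_kwh: List[float],
                  hourly_intensity: List[float],
                  hourly_capacity_kwh: float) -> List[Tuple[int, float]]:
    """Greedily place each job's energy into the cleanest hours with spare capacity."""
    capacity = [hourly_capacity_kwh] * len(hourly_intensity)
    # Hours sorted from cleanest to dirtiest.
    order = sorted(range(len(hourly_intensity)), key=lambda h: hourly_intensity[h])
    placements = []
    for job in jobs_kwh:
        remaining = job
        for h in order:
            if remaining <= 0:
                break
            used = min(remaining, capacity[h])
            if used > 0:
                capacity[h] -= used
                remaining -= used
                placements.append((h, used))
    return placements

# Illustrative 6-hour window: intensity dips when wind and solar output is high.
intensity = [450, 380, 210, 190, 300, 420]      # gCO2 per kWh (assumed forecast)
jobs = [120.0, 80.0, 60.0]                      # flexible training jobs, kWh each
plan = schedule_jobs(jobs, intensity, hourly_capacity_kwh=100.0)

emissions_g = sum(kwh * intensity[h] for h, kwh in plan)
print(f"Scheduled emissions: ~{emissions_g / 1000:.1f} kg CO2")
```

Real schedulers weigh latency, hardware availability, and electricity prices alongside carbon intensity, but the basic idea of moving deferrable work to cleaner hours is the same.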
The next few years will be decisive for embedding sustainability principles at the core of the rapidly growing AI sector. The industry stands at a crossroads: it can continue on its current trajectory and risk colliding with ecological limits, or it can turn these problems into catalysts for new technologies and business models. If transparency, innovation, and responsible resource management become integral to AI strategies, the "digital mind" can evolve hand in hand with care for the planet. That is the balance investors and society expect from the new technological era.