How much water and energy does AI actually use?
Separating facts from misconceptions
AI’S GROWING RESOURCE APPETITE IN FOCUS
Artificial intelligence has seen explosive growth, and with it has come rising concern about its environmental footprint. In the United States, tech companies are investing billions in new data centers to train and run AI models. These facilities consume large amounts of electricity and water, prompting warnings from environmental groups, scrutiny from policymakers, and particular alarm among Gen-Z. In late 2025, over 230 environmental organizations even wrote to Congress claiming that AI data centers “threaten Americans’ economic, environmental, climate and water security.” The AI industry, for its part, has pushed back: an op-ed by an industry coalition claimed AI’s water usage is “minimal and often recycled.” Such polarized messaging has led to confusion. How much power and water does AI actually use, and what are the real impacts?
Here, we’ll take a critical, fact-based look at AI’s energy and water use, drawing on the latest US data and research. I’ll explain why AI consumes so much electricity and cooling water, put its footprint in context, and debunk some common misconceptions. These are things I think a lot of people, content creators especially, should educate themselves on before using their platforms to potentially spread misinformation.
WHY AI CONSUMES POWER AND WATER
Running advanced AI models is a resource-intensive process. Training a state-of-the-art generative AI model like GPT-5 entails trillions upon trillions of computations that demand staggering amounts of electricity. Even after deployment, providing AI services to millions of users (and continuously fine-tuning models) draws large power loads long after the initial training. All this electricity doesn’t just vanish into the cloud; it turns into heat. AI data centers packed with high-performance processors run hot, and without cooling the servers would overheat and fail. That is where water comes in. Many data centers use water-based cooling: circulating water absorbs heat from servers and then evaporates in cooling towers, carrying the heat away. In effect, AI runs hot, and facilities often rely on water evaporation (much like sweating) to stay cool.
Critically, this means AI’s energy footprint and water footprint are intertwined. The more electricity a data center draws, the more heat there is to remove and, typically, the more water is needed for cooling. In fact, on average, a data center requires on the order of 2 liters of water for cooling per kilowatt-hour of energy consumed. In other words, every unit of power used has a direct water cost for cooling. Using water for cooling is often more energy-efficient than using only air conditioning, which is why companies opt for it despite the water loss. Location plays a role too. In cooler or water-rich climates, some data centers rely more on air cooling, whereas in hot or arid regions, water cooling is favored to avoid massive electric chiller loads.
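To make the energy-to-water relationship concrete, here is a minimal back-of-the-envelope sketch, assuming the roughly 2 liters per kilowatt-hour average cited above; the 100 MW facility size is purely hypothetical, and real facilities vary widely with climate and cooling design.

```python
# Back-of-the-envelope estimate of on-site cooling water from electricity use.
# The 2 L/kWh factor is the rough industry average cited above; real values
# vary widely with climate, cooling technology, and time of year.

LITERS_PER_KWH = 2.0  # assumed average evaporative-cooling water intensity

def cooling_water_liters(energy_kwh: float, liters_per_kwh: float = LITERS_PER_KWH) -> float:
    """Estimate on-site cooling water (liters) for a given electricity draw (kWh)."""
    return energy_kwh * liters_per_kwh

# A hypothetical 100 MW facility running flat out for one day: 2,400,000 kWh.
daily_kwh = 100_000 * 24
water = cooling_water_liters(daily_kwh)
print(f"Estimated cooling water: {water:,.0f} L/day (~{water / 3.785:,.0f} gallons)")
```

Under those assumptions, the sketch works out to roughly 4.8 million liters (about 1.3 million gallons) per day, which lands within the per-facility range discussed below.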
Another reason AI pushes energy use so high is the computing hardware involved. Modern AI models run on specialized chips (GPUs and AI accelerators) that perform trillions of math operations per second. These chips consume far more power than traditional servers and thus generate more heat. As demand for AI surges, companies are installing millions of power-hungry GPUs in new data centers. The rapid turnover of AI models also plays a part: unlike traditional software, AI models may be retrained or replaced within months, meaning the energy invested in training one generation can be quickly followed by an even larger training cycle for the next. In short, AI’s cycle of intense computation → high electricity use → excess heat → heavy cooling creates a significant environmental footprint in terms of both energy consumption and water consumption.
ENERGY DEMAND
It’s difficult to precisely disentangle how much of the tech sector’s electricity use is attributable to AI, since data centers handle many kinds of workloads. What is clear is that data center energy use is rising sharply, largely due to the AI boom. In 2024, US data centers consumed about 183 terawatt-hours (TWh) of electricity, over 4% of all US power use that year. To put that in perspective, that’s roughly as much electricity as the entire country of Pakistan uses in a year, and the trajectory is steeply upward. By 2030, US data centers’ power draw is projected to more than double to around 426 TWh, barring major efficiency breakthroughs. The Department of Energy anticipates data centers’ share of US electricity could reach 6.7 to 12% by 2028, up from about 4.4% in 2023.
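For a sense of what that projection implies, here is a quick sketch of the compound annual growth rate between the two figures above (183 TWh in 2024, 426 TWh projected for 2030); the projection itself carries wide uncertainty.

```python
# Implied compound annual growth rate (CAGR) of US data center electricity use,
# using the figures cited above: ~183 TWh in 2024 -> ~426 TWh projected in 2030.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a number of years."""
    return (end / start) ** (1 / years) - 1

rate = cagr(start=183, end=426, years=2030 - 2024)
print(f"Implied growth: {rate:.1%} per year")  # roughly 15% per year
```

In other words, the projection assumes data center electricity demand keeps compounding at roughly 15% per year for the rest of the decade.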
Much of this growth is attributed to AI and high-performance computing. A recent analysis by the International Energy Agency noted that a single large AI-focused “hyperscale” data center can use as much electricity in a year as 100,000 US homes; even larger AI facilities under construction could draw 20× that amount. Because tech companies often cluster many data centers in certain regions, the local grid impacts are even more striking.
Globally, AI’s energy footprint is already significant. One peer-reviewed estimate suggests that the electricity used to run AI systems worldwide in 2025 may generate on the order of 30 to 80 million tons of CO₂, a carbon footprint comparable to that of a city like New York. In July 2024, Google revealed that its own carbon emissions had jumped 48% from the previous year, a spike the company largely attributed to the rise of AI workloads and data center expansion. This is despite Google’s aggressive investments in renewable energy; the scale of new AI servers simply outpaced those efforts, at least in the short term. Historically, efficiency gains in hardware and cooling kept data center energy use relatively stable (one study famously found that between 2010 and 2018, global data center computing output rose 5× while energy use rose only ~6%). Today, that trend is changing. The old assumption that data centers would keep getting efficient enough to hold their energy use roughly flat no longer holds true in the AI era. Without major interventions, AI’s electricity appetite is poised to become a much larger slice of our energy pie, with attendant climate impacts.
WATER DEMAND
Pictured: an electrician works inside a large data center that uses extensive cooling infrastructure. Water-based cooling is common to keep AI servers from overheating, and companies are now exploring advanced cooling methods (like liquid immersion) to curb water use.
Electricity is only half of the story. AI also has a significant water footprint. Data centers withdraw water from local supplies (often municipal water or groundwater) to dissipate heat through evaporation. Let’s start with the big-picture numbers. In 2023, US data centers directly consumed an estimated 17 to 18 billion gallons of water for cooling. That sounds like a lot, and it is. It’s roughly equivalent to the annual water use of a US city of a few hundred thousand people. Even so, it represented only about 0.3% of the total US public water supply. In broad national terms, that’s a relatively modest share. For comparison, agriculture, power plants, and other industries account for far higher fractions of water withdrawals. However, those aggregate figures can be misleadingly small, because water use is highly local. Data centers tend to be built in clusters and often in places already prone to water stress. Recent reports show about two-thirds of US data center projects built since 2022 are in regions of high water stress (think of arid parts of the Southwest). In such areas, a single facility can indeed make a big difference.
Consider that a large AI data center can require between 1 and 5 million gallons of water per day for cooling. On the high end, that’s equivalent to the daily water use of 50,000 local residents. If that facility draws from the same aquifer or water system as the community, conflicts can arise immediately. In Mesa, Arizona (a desert city), officials and residents have opposed new data centers partly out of fear for the city’s future water supply amid a megadrought. In The Dalles, Oregon, a legal battle forced Google to disclose its data center water usage, revealing the company’s servers were consuming nearly a quarter of the city’s entire water supply. And in Newton County, Georgia, proposals for AI data centers sought more water per day than all county residents combined, prompting tough questions for local authorities. These examples underscore that where AI data centers operate is key: in water-rich regions, pulling in millions of gallons from a river or lake may go unnoticed, but in a drought-prone or small community, it can be a huge strain. So when people talk about AI over-consuming water, they are often right and wrong at the same time: the claim holds in some places and not in others, and many people repeating it don’t actually know which case they’re describing.
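As a rough sanity check on the “50,000 residents” comparison, here is a small sketch assuming about 100 gallons of water per resident per day, a common planning figure I'm using as an assumption; actual per-capita use varies considerably by city and season.

```python
# Rough equivalence between a data center's daily cooling water and household
# use, assuming ~100 gallons per resident per day (an assumed planning figure;
# actual per-capita use varies by city and season).

GALLONS_PER_RESIDENT_PER_DAY = 100  # assumed

def resident_equivalent(facility_gallons_per_day: float) -> float:
    """How many residents' daily water use a facility's cooling draw matches."""
    return facility_gallons_per_day / GALLONS_PER_RESIDENT_PER_DAY

for gal_per_day in (1_000_000, 5_000_000):
    print(f"{gal_per_day:,} gal/day ~ {resident_equivalent(gal_per_day):,.0f} residents")
# 1,000,000 gal/day ~ 10,000 residents; 5,000,000 gal/day ~ 50,000 residents
```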
It’s also important to understand indirect water use. The majority of water associated with running AI is not the cooling water you see on site; it’s the water used by power plants to generate the electricity that feeds the data center. Thermoelectric power plants (coal, natural gas, nuclear) withdraw large volumes of water for cooling as they produce electricity, and even some renewables (like hydropower) have water footprints. Researchers have estimated that this indirect water can make up 80% or more of a data center’s total water footprint. So if an AI model is using a lot of electricity, it is likely driving water use upstream at the power source as well.
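A simple way to see what that 80% figure implies: if indirect water is four-fifths of the total, the total footprint is roughly five times the on-site cooling water. A minimal sketch, assuming the 80% share cited above (the real split depends heavily on the local grid mix), with an illustrative on-site volume:

```python
# If indirect (power plant) water is ~80% of the total footprint, the total is
# roughly 5x the on-site cooling water. The 0.8 share is the research estimate
# cited above; the actual split depends heavily on the local grid mix.

def total_water(onsite_liters: float, indirect_share: float = 0.8) -> float:
    """Scale on-site cooling water up to a total footprint, given the fraction
    of the total that occurs upstream at power plants."""
    return onsite_liters / (1 - indirect_share)

onsite = 1_000_000  # liters of on-site cooling water (illustrative only)
print(f"Estimated total footprint: ~{total_water(onsite):,.0f} L")  # ~5,000,000 L
```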
What about per-task water usage? It has been widely reported that chatbot AI queries have a hidden water cost. Indeed, researchers calculated that writing a 100-word email with AI could consume around 500 mL of water (about one typical drinking bottle’s worth) when you account for both data center cooling and power generation. Hearing that “each ChatGPT query guzzles a bottle of water” understandably raised alarms for some people. In truth, that figure, while backed by analysis, needs context. For one, it’s an average estimate under certain conditions. The actual water per query varies widely depending on the data center location, cooling method, and time of day. In absolute terms, half a liter of water is not a huge amount; other everyday activities easily dwarf that (producing a single hamburger, for instance, can require 400+ gallons of water through the full supply chain). The real concern is scale: If millions or billions of AI queries are happening, those half-liters add up fast, potentially stressing water supplies if concentrated in one area.
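To illustrate how those half-liters scale, here is a rough sketch using the ~500 mL per AI-written email estimate above; the per-query figure is an average, and the query volumes below are purely illustrative assumptions, not reported traffic numbers.

```python
# Scaling the ~500 mL per AI-written email estimate to large query volumes.
# The per-query figure is an average; actual values depend on data center
# location, cooling method, and time of day. Query volumes are illustrative.

ML_PER_QUERY = 500  # assumed average, per the estimate discussed above

def daily_water_liters(queries_per_day: int, ml_per_query: float = ML_PER_QUERY) -> float:
    """Total daily water (liters) for a given query volume."""
    return queries_per_day * ml_per_query / 1000

for q in (1_000_000, 100_000_000):
    print(f"{q:,} queries/day -> ~{daily_water_liters(q):,.0f} L/day")
# 1M queries/day ~ 500,000 L; 100M queries/day ~ 50,000,000 L (~13 million gallons)
```

The takeaway is the same as in the text: the per-query number is small, but multiplied across a large, geographically concentrated user base it becomes a volume local utilities have to plan around.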
EFFICIENCY AND ALTERNATIVES
The interplay between energy and water leads to tricky trade-offs. Using more water for evaporative cooling reduces the need for power-hungry air conditioning, which in turn cuts carbon emissions. Google’s water strategist notes that the right combination of cooling methods depends on where the center is located. In a region with scarce water but greener electricity, it might be better to use air cooling (save water, emit a bit more CO₂). In a region with ample water but a coal-heavy grid, water cooling might actually be environmentally preferable (using water to keep power demand and emissions down). These trade-offs mean blanket judgments can be misleading, something I often see from creators on TikTok who take specific metrics out of context, leaving millions of viewers with a misleading picture of how much water every ChatGPT prompt uses. The AI industry often emphasizes that overall, US data center water withdrawal is a small fraction compared to sectors like agriculture. People ask: is using water for AI worthwhile? Those skeptical of AI’s social benefits are more likely to resent even small water diversions, whereas those who see AI’s value might accept far larger uses.
On the positive side, there are emerging solutions to reduce AI’s resource footprint. Companies are investing in renewable energy to power data centers, and exploring ways to make AI computations more energy-efficient. Advanced cooling technologies are also being adopted to cut water usage. Some hyperscale operators have started using recycled wastewater instead of potable water for cooling, to avoid tapping drinking supplies. Others are shifting to “liquid cooling” systems that don’t rely on evaporation at all. Also, a number of US states from California to Virginia have proposed laws to require data centers to disclose their electricity and water usage and to use a certain share of renewable energy. While there is still no federal requirement for AI companies to report their environmental impact, public pressure and incentives are pushing the industry toward greater transparency.
COMMON MISCONCEPTIONS
Understanding the facts allows us to address a few common misconceptions surrounding AI’s resource usage:
Misconception 1: “Each AI query uses an insanely large amount of water/energy.”
Reality: There is a kernel of truth here: even a single AI task is not free. For example, a single medium-length AI-generated response might use on the order of a few Wh (watt-hours) of energy and hundreds of milliliters of water when all is accounted for. That sounds wild (a chat with a bot = a bottle of water), but perspective matters. The per-query resource use is on par with many trivial everyday actions. For instance, two AI-generated emails consume roughly as much electricity as fully charging one smartphone, and the water for one AI prompt (500 mL) is about what’s used to grow just a few bites of food or to produce a single sheet of paper. The point is not that AI’s resource usage is negligible, but that an individual using ChatGPT a few times is not committing an environmental crime. The aggregate impact is where scrutiny is deserved. Claims that “AI will drain our reservoirs dry one question at a time” are incredibly misleading. The real issues are scaling and concentration.
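For the curious, here is the arithmetic behind the smartphone comparison, spelled out as a sketch. Both inputs are rough assumptions consistent with the figures above, a few watt-hours per AI-generated email and a typical phone battery of around 14 Wh, not measured values.

```python
# The smartphone comparison, spelled out. Both inputs are rough assumptions:
# a few watt-hours per AI-generated email and a ~14 Wh phone battery.

WH_PER_EMAIL = 7.0          # assumed: "a few Wh" per medium-length response
WH_PER_PHONE_CHARGE = 14.0  # assumed: typical smartphone battery capacity

def phone_charge_equivalent(num_emails: int) -> float:
    """How many full smartphone charges a batch of AI-written emails equals."""
    return num_emails * WH_PER_EMAIL / WH_PER_PHONE_CHARGE

print(f"2 emails ~ {phone_charge_equivalent(2):.1f} phone charge(s)")       # ~1.0
print(f"1,000 emails ~ {phone_charge_equivalent(1000):.0f} phone charges")  # ~500
```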
Misconception 2: “AI’s water use is universally catastrophic.”
Reality: AI’s water footprint varies greatly by location and context. In many water-abundant areas, a new data center’s cooling water might be a drop in the bucket. However, in water-scarce regions or small communities, AI projects can indeed become a flashpoint. It’s neither correct to say “AI is drying up America” nor “AI’s water use is nothing to worry about.” The truth lies in between. Where water is plentiful, there is usually enough for cities, industries and some data centers. Where water is scarce, even one large data center can spark conflicts. So, blanket statements about AI water use can be overblown, but it is fair to question the wisdom of building water-intensive server farms in the middle of a historic drought.
Misconception 3: “Tech companies will handle the issue. They’re all going green, so we don’t need to worry.”
Reality: It’s true that the big AI players (Google, Microsoft, Meta, OpenAI, etc.) have made bold pledges about sustainability. Many have goals to reach carbon-neutral or water-positive operations in coming years. They are pouring money into renewables and novel cooling techniques, and even siting data centers near clean power sources. Microsoft, for example, announced plans for zero-water cooling systems in the future. These efforts are commendable and will help. However, so far the actual results are mixed. Google’s latest environmental report showed a huge jump in emissions and that the company only replenished 18% of the water it consumed. In other words, even the most proactive firms are struggling to keep up with AI’s growth. Also, not every data center is run by a tech giant with deep pockets. Many are operated by third-party providers or smaller firms that may not have the same incentives to invest in efficiency. Transparency is another issue. Outside of a few leaders, many data center operators still do not publicly report detailed data on energy or water use. So while we should encourage and applaud industry sustainability programs, we still need to keep an eye on the numbers ourselves.
A SUSTAINABLE AI FUTURE
AI’s impact on energy and water resources is a complex picture with plenty of nuance. On one hand, it’s clear that using AI, whether it’s chatting with an assistant, generating code, or training a machine-learning model, is not free of environmental costs. It draws on an electrical grid that may still be powered by fossil fuels and uses cooling infrastructure that can consume substantial water. The rapid expansion of AI capabilities has measurably increased the load on power grids and, in some locales, started to strain water supplies. These are challenges that warrant serious attention.
On the other hand, some doomsday narratives are too simplistic. AI is not single-handedly wrecking the environment. It remains a small fraction of total energy or water use when viewed globally. The worst-case scenarios can often be averted with sensible actions, like smarter siting of data centers, investment in renewables and grid upgrades, adoption of efficient cooling and chip technologies, and policies to improve transparency and resource planning. In many cases, solutions are already in progress. The key is ensuring that the AI revolution and the sustainability revolution go hand in hand. This means companies and regulators must treat energy efficiency and water conservation as first-class priorities alongside raw AI performance.
Perhaps the biggest misconception is that we face an either/or choice: either embrace AI and swallow huge environmental damage, or reject AI to save the planet. In reality, there is a path forward to reap AI’s benefits while managing its resource footprint. It will require innovation and likely new standards or laws to guide responsible deployment. It will also require a candid, factual dialogue, free of hyperbole, about what trade-offs we’re willing to make for the services AI provides. By dispelling myths and understanding the real data, we can have that dialogue and make informed decisions. AI does consume water and energy, but it doesn’t have to run our wells dry or crash our grids. The future of AI should be not only intelligent, but sustainable. I know, I know. That’s easier said than done.