The Real Cost of Running AI: Energy and Environment

Somewhere in rural Iowa, a building the size of six football fields hums around the clock. There are no windows. No signs out front. Just rows upon rows of servers stacked floor to ceiling, bathed in blue LED light and cooled by enough water to sustain a small town. This is a hyperscale data center, and it exists for one purpose: to keep artificial intelligence running.
We talk about AI in abstract terms. Productivity gains, creative breakthroughs, the future of work. But behind every ChatGPT query, every AI-generated image, every automated customer service interaction, there is a physical reality that rarely makes headlines. Electricity gets consumed. Water gets evaporated. Carbon gets emitted. And the scale of it all is growing faster than almost anyone predicted.
The Electricity Problem No One Wants to Talk About
The International Energy Agency dropped a figure in late 2025 that should have stopped people in their tracks: global data center electricity consumption was on pace to double by 2028, reaching roughly 945 terawatt-hours per year. To put that in perspective, that is more electricity than the entire nation of Japan uses annually.
AI is the primary driver. Training a single large language model like GPT-4 consumed an estimated 50 gigawatt-hours of electricity, enough to power 4,600 average American homes for an entire year. But training is only the beginning. Inference, the process of actually using a trained model to answer questions and generate content, accounts for roughly 80 to 90 percent of an AI system's lifetime energy use. Every time you ask an AI to summarize an article or generate code, a cluster of GPUs somewhere draws power.
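Those figures are easy to sanity-check with a rough back-of-envelope calculation. In the Python sketch below, the per-home consumption value (about 10,800 kWh per year, a commonly cited U.S. average) is an assumption for illustration, not a number from any provider.

```python
# Back-of-envelope check of the training and inference figures above.
# Assumed: ~10,800 kWh/year for an average U.S. home (illustrative value).

TRAINING_ENERGY_KWH = 50_000_000   # ~50 GWh estimated for a GPT-4-class training run
HOME_KWH_PER_YEAR = 10_800         # assumed average annual household consumption

homes = TRAINING_ENERGY_KWH / HOME_KWH_PER_YEAR
print(f"Training energy ≈ {homes:,.0f} home-years of electricity")   # ~4,600

# If inference is 80-90% of lifetime energy, training is only 10-20% of the total.
for inference_share in (0.80, 0.90):
    lifetime_gwh = TRAINING_ENERGY_KWH / (1 - inference_share) / 1e6
    print(f"inference at {inference_share:.0%} -> lifetime ≈ {lifetime_gwh:,.0f} GWh")
```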
Google reported that its total energy consumption rose 17 percent in 2025, with AI workloads cited as the leading contributor. Microsoft's emissions climbed 30 percent above its 2020 baseline despite its pledge to become carbon negative by 2030. Meta's data center buildout in the American Midwest has required new transmission lines capable of delivering hundreds of megawatts to single facilities.
The uncomfortable truth is that the efficiency gains in chip design, the shift to renewable energy, and the optimization of cooling systems have not kept pace with the explosive growth of AI demand. We are running faster just to stay in the same place.
Water: The Hidden Resource Drain
If electricity is the visible cost of AI, water is the invisible one. Data centers need cooling, and the most common method involves evaporating massive volumes of water to dissipate heat. A single mid-sized data center can consume between 3 and 5 million gallons of water per day, roughly equivalent to the daily water use of a city of 50,000 people.
A 2024 study published in Nature estimated that a single conversation of 20 to 50 queries with a large language model consumes approximately 500 milliliters of water. That is a standard water bottle evaporated into the atmosphere for what amounts to a few minutes of chat.
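The water comparisons can be checked the same way. In the sketch below, the per-person daily use figure (about 80 gallons) is an assumption chosen only to test the "city of 50,000" comparison.

```python
# Rough check of the water comparisons above; all inputs are illustrative.

# A mid-sized data center (3-5 million gallons/day) vs. a city of 50,000 people,
# assuming roughly 80 gallons per person per day.
for daily_gallons in (3_000_000, 5_000_000):
    people = daily_gallons / 80
    print(f"{daily_gallons:,} gal/day ≈ daily water use of {people:,.0f} people")

# ~500 mL per 20-to-50-query conversation, expressed per query.
for queries in (20, 50):
    print(f"{queries} queries -> ≈ {500 / queries:.0f} mL per query")
```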
The geographic dimension makes this worse. Many of the largest data centers are located in regions already facing water stress. Northern Virginia, home to the densest concentration of data centers on Earth, has seen growing tensions between tech companies and local water authorities. In Arizona and Nevada, new facilities compete directly with agriculture and residential use for increasingly scarce groundwater.
Google disclosed that its global water consumption for data centers reached 6.1 billion gallons in 2025. Microsoft consumed 2.5 billion gallons the same year. These numbers continue to climb, and they do not account for the water used in electricity generation itself, which adds substantially to the total footprint.
The Carbon Equation Gets Complicated
The carbon story is perhaps the most nuanced, and the most contested.
On one hand, the major cloud providers have made ambitious commitments. Google claims to match 100 percent of its electricity use with renewable energy purchases. Microsoft has invested heavily in carbon capture and removal technologies. Amazon has become the largest corporate buyer of renewable energy in the world.
On the other hand, the accounting is more complicated than it appears. "Matching" renewable energy on an annual basis does not mean a data center runs on clean power around the clock. When the sun goes down and the wind stops blowing, those servers still draw from the grid, and in many regions, that grid is still powered substantially by natural gas and coal. The concept of 24/7 carbon-free energy, where every hour of consumption is matched with clean generation in the same grid region, remains an aspiration rather than a reality for most facilities.
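The gap between annual matching and 24/7 carbon-free energy is easy to illustrate with toy numbers. The hourly profiles below are invented for illustration; the point is only that a facility can be "100 percent matched" on paper while still drawing fossil power for much of the day.

```python
# Toy illustration: annual renewable "matching" vs. hourly (24/7) carbon-free coverage.
# Both hourly profiles are invented for illustration only.

load  = [100] * 24                                   # flat data center load, MWh per hour
clean = [0] * 6 + [60, 130, 210, 260, 290, 300,      # solar-heavy clean generation, MWh
                   300, 290, 260, 210, 130, 60] + [0] * 6

annual_match   = sum(clean) / sum(load)
hourly_covered = sum(min(l, c) for l, c in zip(load, clean)) / sum(load)

print(f"Annual matching:        {annual_match:.0%}")    # ~104% -- "100% renewable" on paper
print(f"Hourly (24/7) coverage: {hourly_covered:.0%}")  # ~47% -- the rest comes from the grid
```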
Researchers at the University of Massachusetts Amherst estimated that training a single large AI model can produce over 600,000 pounds of carbon dioxide equivalent, roughly five times the lifetime emissions of an average car, including its manufacture. As models grow larger and more numerous, this figure has only increased.
There is also the embodied carbon of the hardware itself to consider. Manufacturing the specialized GPUs and TPUs that power AI workloads is an energy-intensive process involving rare earth mining, semiconductor fabrication, and global shipping. NVIDIA shipped over 3.7 million data center GPUs in 2025 alone. Each one carries a carbon debt before it processes a single computation.
The Rebound Effect: When Efficiency Backfires
History offers a cautionary lesson here. The Jevons paradox, first observed in 1865 with coal consumption, describes how improvements in the efficiency of a resource's use tend to increase rather than decrease total consumption. Make something cheaper and more efficient, and people use more of it.
AI is following this pattern precisely. More efficient models and cheaper inference costs have not reduced total energy consumption. Instead, they have enabled entirely new categories of use. Companies that would never have deployed AI at older price points are now integrating it into every product and workflow. The volume of AI queries globally has grown an estimated 10x between 2024 and 2026.
Each individual query uses less energy than it did two years ago. The total energy consumed by AI has nonetheless skyrocketed. Efficiency improvements are being overwhelmed by demand growth, and there is no indication this trajectory will change.
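The rebound arithmetic fits in a few lines. The per-query energy values below are assumptions for illustration; only the 10x growth in query volume comes from the paragraph above.

```python
# Rebound effect in two lines: per-query energy falls, total consumption still rises.
# Per-query energy values are assumptions; the 10x volume growth is the estimate above.

queries_2024, wh_per_query_2024 = 1e9,  3.0   # assumed: 1B queries/day at 3 Wh each
queries_2026, wh_per_query_2026 = 1e10, 1.0   # 10x the volume at a third of the energy

total_2024 = queries_2024 * wh_per_query_2024 / 1e6   # MWh per day
total_2026 = queries_2026 * wh_per_query_2026 / 1e6

print(f"2024: {total_2024:,.0f} MWh/day")   # 3,000
print(f"2026: {total_2026:,.0f} MWh/day")   # 10,000 -- efficiency tripled, total up anyway
```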
What Is Actually Being Done
Not all the news is grim. Genuine innovations are emerging, even if they are not yet operating at sufficient scale.
Immersion cooling, where servers are submerged in non-conductive liquid, can reduce cooling energy by up to 95 percent and virtually eliminate water consumption. Microsoft, Google, and a growing number of smaller operators have begun piloting these systems, though they remain a fraction of total installed capacity.
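To see why a 95 percent cut in cooling energy matters at the facility level, here is a small worked example. The IT load and the conventional cooling overhead are assumed values, not figures from any operator.

```python
# Illustrative effect of a 95% cut in cooling energy at one facility.
# The IT load and the conventional cooling overhead are assumed values.

it_load_mw = 10.0                # assumed server (IT) load
evaporative_cooling_mw = 4.0     # assumed cooling overhead for evaporative cooling
immersion_cooling_mw = evaporative_cooling_mw * 0.05   # 95% reduction

for name, cooling_mw in [("evaporative", evaporative_cooling_mw),
                         ("immersion", immersion_cooling_mw)]:
    total = it_load_mw + cooling_mw
    print(f"{name:>11}: {total:.1f} MW total, cooling = {cooling_mw / it_load_mw:.0%} of IT load")
```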
Nuclear energy is reentering the conversation as the only zero-carbon baseload power source capable of meeting data center demand at scale. Microsoft signed a deal to restart a unit at the Three Mile Island nuclear plant specifically to power AI workloads. Amazon and Google have invested in small modular reactor companies. Whether these projects can be built fast enough to matter remains an open question.
On the software side, techniques like model distillation, quantization, and sparse inference are producing smaller, more efficient models that deliver comparable performance at a fraction of the energy cost. The trend toward specialized models optimized for specific tasks rather than monolithic general-purpose systems is promising.
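Of those techniques, quantization is the easiest to put numbers on. The sketch below estimates weight memory for a hypothetical 70-billion-parameter model at several precisions; the parameter count, and the rough idea that energy per token tracks the amount of data moved, are illustrative assumptions rather than vendor figures.

```python
# Rough weight-memory footprint of a hypothetical 70B-parameter model at different
# precisions. The parameter count is illustrative; real savings depend on the model,
# and energy per token does not scale exactly with bytes moved.

PARAMS = 70e9   # hypothetical 70-billion-parameter model

for precision, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{precision}: ≈ {gib:,.0f} GiB of weights")
```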
Some companies have begun publishing detailed environmental impact reports for their AI products. This transparency, while imperfect, creates accountability and allows researchers and regulators to track progress.
The Questions We Should Be Asking
The environmental cost of AI is not a reason to abandon the technology. Climate modeling, drug discovery, grid optimization, and agricultural efficiency all benefit enormously from AI capabilities. The question is not whether to use AI but how to use it responsibly.
That starts with honesty. The tech industry's tendency to emphasize efficiency gains while downplaying absolute consumption growth creates a misleading picture. A 50 percent improvement in energy efficiency means nothing if workloads grow by 300 percent.
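The arithmetic behind that last sentence is worth making explicit. The baseline of 100 units is arbitrary; the 50 percent gain and 300 percent growth come from the paragraph above.

```python
# A 50% efficiency gain against 300% workload growth (arbitrary baseline units).

baseline_tasks, baseline_energy_per_task = 100, 1.0
new_tasks = baseline_tasks * 4                          # workloads grow by 300%
new_energy_per_task = baseline_energy_per_task * 0.5    # 50% efficiency improvement

print("before:", baseline_tasks * baseline_energy_per_task)   # 100.0
print("after: ", new_tasks * new_energy_per_task)             # 200.0 -- total still doubles
```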
It continues with accountability. Consumers, businesses, and regulators deserve clear, standardized metrics for the environmental cost of AI services. How much energy does a query consume? How much water? What is the carbon intensity of the specific grid powering the response? These numbers should be as accessible as a nutrition label.
And it requires a willingness to ask an uncomfortable question: does every AI application justify its environmental cost? An AI system that helps manage a power grid or accelerates medical research has a fundamentally different value proposition than one that generates novelty images or automates tasks that did not need automating.
The servers keep humming. The water keeps evaporating. The meters keep spinning. The real cost of running AI is not measured in dollars per query. It is measured in megawatts, in gallons, in tons of carbon, and ultimately in the kind of planet we are choosing to build. The least we can do is look at the bill.