
AI Series 06: Beyond Prompts: The Hidden Costs of AI Tools

The Illusion of Digital AI


Generative AI seems effortless—just a chat box or an added feature in your favorite app.

But behind the scenes?

It’s an enormous industrial system consuming electricity, water, chips, and rare materials, all housed in massive data centers.


We talk a lot about what AI can do, but rarely about what it costs to make it work.


This article in our AI Series reflects on this overlooked question:

What does it really cost to make AI work every time you hit “send” on a prompt?



The Hidden Costs of AI


Physical Footprint


We often imagine “the cloud” as something abstract—weightless, floating somewhere in cyberspace.

In reality, the cloud is made of massive, resource-hungry data centers: enormous buildings the size of football fields, packed with tens of thousands of servers, blinking lights, and heavy-duty cooling systems.

Every AI interaction—typing a prompt, generating an image, asking a chatbot a question—travels through a chain of physical hardware.

Your request is processed in racks of high-performance GPUs, stored on solid-state drives, and sent back through fiber-optic cables.


The bigger the AI model, the more power-hungry the process:


  • Larger models = more computation → more electricity, more cooling, and more infrastructure.

  • AI is no longer just “software” but a full-scale industrial system tied to energy grids, global supply chains, and water networks.


Energy Use

  • Global data center electricity consumption could more than double by 2030 to 945 TWh (IEA, 2025)—more than Japan’s current electricity use and just under 3% of total global electricity consumption*.

  • By 2035, demand could hit 1,200 TWh, with AI as the primary driver*.

  • In the U.S., data centers could account for half of all new electricity demand by 2030*.

  • Data centers consumed about 4.4% of total U.S. electricity in 2023 and are projected to consume 6.7% to 12% by 2030—potentially doubling their share of national electricity demand in just seven years**.


Training energy


  • Training GPT-3 used about 1,287 MWh—enough to power 120 U.S. homes for a year.

  • GPT-4 may have required 25,000+ GPUs.
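The "120 homes" comparison above is simple division. A quick sketch (the ~10.7 MWh average annual U.S. household figure is an assumption for illustration, not a number from this article):

```python
# Back-of-the-envelope check of the "120 U.S. homes" comparison.
# Assumption: average U.S. household electricity use of ~10.7 MWh/year.

GPT3_TRAINING_MWH = 1_287          # reported training energy for GPT-3
AVG_US_HOME_MWH_PER_YEAR = 10.7    # assumed average annual household use

homes_powered_for_a_year = GPT3_TRAINING_MWH / AVG_US_HOME_MWH_PER_YEAR
print(f"~{homes_powered_for_a_year:.0f} homes for a year")  # ~120
```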


Water Use


  • Cooling systems often rely on local water sources.

  • Google’s water use rose 20% in one year, largely because of AI, affecting ecosystems in heat-stressed areas.


Query Energy Use


  • ChatGPT queries now use approximately 0.3 watt-hours per query—10× less than earlier estimates of 3 watt-hours, thanks to more efficient models and hardware improvements.

  • Even so, each query consumes roughly as much energy as a classic Google search, despite the far greater computation involved***.
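Small per-query figures compound quickly at scale. A toy calculation using the 0.3 Wh figure above and a hypothetical (assumed, not sourced) volume of one billion queries per day:

```python
# Scale the per-query estimate to fleet level.
# Assumed inputs: 0.3 Wh per query (cited above) and a hypothetical
# volume of 1 billion queries per day.

WH_PER_QUERY = 0.3
QUERIES_PER_DAY = 1_000_000_000

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1_000                    # MWh -> GWh
print(f"{daily_mwh:.0f} MWh/day, ~{yearly_gwh:.0f} GWh/year")
```

At that assumed volume, "tiny" per-query costs sum to hundreds of megawatt-hours every day.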


Sources 


In Europe


  • In the European Union, data centers accounted for about 3% of electricity use, and this share could double by 2026 as AI adoption grows (Data Center Dynamics).

  • Across the broader European region (EU, Norway, Switzerland, UK), data center power demand is expected to more than double—from ~62 TWh today to over 150 TWh by 2030. That jump could raise their share of total electricity use from 2% to ~5% (Reuters; McKinsey & Company).



Hardware Manufacturing


  • High-performance AI chips (like Nvidia’s H100) require rare earth elements such as neodymium, terbium, and dysprosium for magnets, and metals like cobalt, tantalum, and palladium for electronics.

  • Mining these materials is energy-intensive, environmentally disruptive, and often concentrated in regions with significant social and ecological challenges.

  • Chip fabrication uses huge amounts of ultra-pure water (up to 8 million liters per day in some fabs) and chemicals, generating hazardous waste.

  • The semiconductor supply chain stretches across continents—silicon wafers from one country, assembly in another, packaging somewhere else—each step consuming energy and resources.



Financial Costs


Training state-of-the-art AI is one of the most expensive computing tasks on Earth.


Training Costs

  • GPT-3: ~$4.6 million in compute costs (2020 est.)

  • GPT-4: estimates range from $50–100+ million to train, depending on architecture and parallel experiments.

  • Google's Gemini 1.0 Ultra: ~$192 million to train (Visual Capitalist, 2024)

  • Industry estimates suggest that "close to a billion dollars will already be spent on a single training run in 2024," according to Anthropic CEO Dario Amodei (Hyperbolic).


Inference Costs (Ongoing Use)


  • Once a model is trained, every user query (“inference”) requires expensive GPU time. 

  • Serving millions of queries daily can cost tens of millions per month in electricity, cooling, and cloud hosting.


Hardware Investment


  • Training GPT-4 likely used 25,000+ GPUs (Nvidia A100 or H100), each costing $25k–$40k, plus data center space and networking gear.


Barrier to Entry


  • Only a few companies—OpenAI (Microsoft), Google DeepMind, Anthropic, xAI—can afford to train and operate models at the cutting edge.

  • This creates a winner-takes-most market where compute power, not just talent, determines who can innovate at scale.


The “cloud” might feel weightless from your laptop or phone—but it’s built on a foundation of steel, silicon, power lines, water pipes, and entire buildings humming with machines 24/7.

It's now a global industrial system and an elite capital race, shaping who can build the AI of tomorrow.



The Impact — Local and Global


AI’s hunger for resources isn’t evenly spread—it creates intense local strain and broad global consequences.


Local


  • Power Grids

    Many U.S. data centers are concentrated in just five regions. This clustering puts huge pressure on local electricity grids, sometimes competing with residential and industrial needs.

  • Electricity consumption

    In Ireland, the situation is particularly acute: data centers already consume 21% of national electricity. This rise has prompted regulatory pauses on new data center builds around Dublin to prevent grid overload (AP).

  • Water Systems

    Cooling AI servers can draw millions of liters annually from nearby rivers, reservoirs, or aquifers—especially damaging in drought-prone regions such as southern Europe.

Imagine you're a farmer in a drought-prone area. For generations, your community has relied on a local reservoir. Now, a new data center has moved in, and it's drawing millions of liters of water to cool its AI servers. Suddenly, the abstract cost of AI becomes a very real competition for a life-giving resource.

  • Community Effects

    Rising energy and water costs, land use competition, and noise/light pollution from large facilities.


Global


  • Carbon Footprint

    Data centers already account for about 4.4% of U.S. electricity use, and AI is projected to push global data center demand toward 1,200 TWh by 2035 (IEA).

  • Supply Chain Pressures

    Mining rare earths, manufacturing chips, and shipping hardware across continents expands the industry’s environmental toll.

  • Concentration of Capacity

    The majority of AI compute power is owned by a handful of companies in a few countries, raising questions about resilience, access, and geopolitical leverage.


Environmental & Policy Implications


  • If left unchecked, AI could become one of the fastest-growing industrial sources of emissions.

  • Governments may need to step in with energy-use reporting, water-use limits, or sustainable infrastructure mandates—similar to how we regulate other heavy industries.



The Future of AI’s Energy Economy


As AI grows, so does its demand for energy and resources.

This is pushing the industry toward new ways to manage power and efficiency.


Dedicated AI Energy Infrastructure


  • Some companies are building special data centers designed just for AI.

  • These centers use advanced cooling, renewable energy, and smart power management to handle the heavy load more sustainably.


Competition for Compute as a Resource


  • Just like oil or metals, computing power is becoming a valuable resource.

  • Companies compete to access the fastest chips and biggest data centers. This drives innovation but also concentrates control in the hands of a few.


Efficiency Innovations

  • The future will see smarter, smaller AI models that use less power but still deliver great results.

  • New chip designs and software improvements are already making AI less energy-hungry.


Measuring Your AI Footprint


  • Right now, it’s hard to know exactly how much energy your single prompt uses.

  • But soon, we might have tools like “prompt meters” that track the environmental cost of each AI request—similar to how some apps track your carbon footprint.
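A "prompt meter" could be as simple as a per-token estimator. The sketch below is purely illustrative: every constant (per-token energy, output weighting) is a made-up assumption, since real per-prompt energy depends on the model, hardware, and data center load.

```python
# Hypothetical "prompt meter": a toy estimator of per-request energy.
# All constants are assumptions for illustration, not measured figures.

def estimate_prompt_wh(input_tokens: int, output_tokens: int,
                       wh_per_1k_tokens: float = 0.15) -> float:
    """Rough energy estimate: scale an assumed per-token cost by tokens
    processed. Output tokens are weighted heavier on the assumption that
    generation dominates compute."""
    weighted_tokens = input_tokens + 3 * output_tokens
    return weighted_tokens / 1_000 * wh_per_1k_tokens

# Example: a short question with a paragraph-long answer.
print(f"{estimate_prompt_wh(50, 250):.2f} Wh")
```

A real meter would need measured per-model figures, but even a crude estimate like this could make the invisible cost of each request visible.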



Awareness Is Power


AI isn’t just “digital magic.”

It’s a real industrial system with real costs—energy, water, materials, and money.

Users, companies, and governments need to understand these costs and work together to make AI more sustainable.

The first step is awareness.

When we know the true price of AI, we can push for smarter choices and cleaner technology.


What Can You Do?


  • Be Mindful: Use powerful AI tools deliberately, not for simple tasks a search engine can handle.

  • Ask Questions: Encourage your favorite AI companies to be transparent about their energy sources and water usage.

  • Support Efficiency: Choose the right tool for the job; smaller, task-appropriate models often suffice.

  • Share This Knowledge: The first step to change is collective awareness. Share this article with your network.








