Generative AI has transformed how people search, learn, create, and even invest. Yet, behind its polished interfaces lies a growing environmental and financial cost that’s rarely discussed. While debates over plagiarism, job displacement, and privacy dominate headlines, the toll of training and running large AI models on the planet—and on the companies providing them—is starting to look just as disruptive.
A Gold Rush with a Green Shadow
Since OpenAI’s ChatGPT kicked off the generative AI race in late 2022, companies have poured billions into building and deploying ever-larger models. But powering these systems demands enormous amounts of electricity and water.
Training a large language model (LLM) can consume staggering energy, generating significant carbon emissions. The massive data centers that host these models also rely on vast amounts of water for cooling—sometimes straining local ecosystems. And for end users—whether consumers, corporations, or governments—the constant upgrades in high-performance hardware carry their own environmental burden, from mining rare materials to shipping, packaging, and eventual e-waste.
How Big Is the Carbon Footprint?
Generative AI platforms, from conversational LLMs to image-generation tools, require heavy computation even for day-to-day use. Analysts estimate that a single generative AI prompt uses about 3 watt-hours (Wh) of electricity. For comparison, a typical refrigerator uses about 1–2 kilowatt-hours (kWh) per day. Put differently, one day of fridge use—roughly 1.5 kWh—equals about 500 AI prompts.
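The fridge comparison is simple arithmetic, and a quick sketch makes the scaling explicit. The figures below are the article's rough estimates (3 Wh per prompt, 1.5 kWh per fridge-day); the one-billion-prompts number is purely illustrative, not a reported usage statistic:

```python
# Back-of-the-envelope check of the figures above (rough estimates, not measurements):
# ~3 Wh per generative AI prompt vs. ~1.5 kWh per day for a typical refrigerator.
WH_PER_PROMPT = 3           # analyst estimate cited above
FRIDGE_WH_PER_DAY = 1500    # 1.5 kWh/day, midpoint of the 1-2 kWh range

prompts_per_fridge_day = FRIDGE_WH_PER_DAY / WH_PER_PROMPT
print(f"One fridge-day of electricity covers about {prompts_per_fridge_day:.0f} prompts")

# The same arithmetic at scale: a hypothetical one billion prompts per day
daily_prompts = 1_000_000_000
daily_gwh = daily_prompts * WH_PER_PROMPT / 1e9  # Wh -> GWh
print(f"{daily_prompts:,} prompts/day would draw roughly {daily_gwh:.0f} GWh/day")
```

At these assumed rates, a billion daily prompts would draw on the order of a few gigawatt-hours per day, which is why per-prompt efficiency matters once usage reaches global scale.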
Individually, that may seem trivial. Multiplied by billions of users and repeated daily interactions, the numbers balloon. It’s so resource-intensive that OpenAI CEO Sam Altman recently admitted that people typing “please” and “thank you” to ChatGPT costs “tens of millions of dollars” in electricity.
Most of this power still comes from fossil fuels, amplifying the climate impact.
“People are turning to ChatGPT and other platforms for everything instead of going on Wikipedia or opening a book and so that really adds up,” says Dr. Sasha Luccioni, AI and Climate Lead at Hugging Face, a New York City–based company focused on democratizing AI.
“Large language models use about 30 times more energy than [websites] because it’s generating results to your prompts from scratch based on their training data, and that process takes a lot more computation,” adds Dr. Luccioni. “We’re going further and further into resource consumption without actually seeing the cost; users aren’t seeing a bill, it’s free or almost free, and so I think that we don’t really have a connection to the environmental impact.”
Industry Giants Double Down
Despite the environmental concerns, AI investment shows no sign of slowing. Meta CEO Mark Zuckerberg recently announced plans to spend “hundreds of billions of dollars” on AI products and is building massive new data centers to support that growth. The Prometheus facility, expected online next year, will be Meta’s first multi-gigawatt data center. Another, Hyperion, could scale up to 5 gigawatts of computational power over the coming years.
Meta declined to comment for this story, but the sheer size of these facilities underscores how quickly AI’s infrastructure demands are accelerating.
Other firms are trying to at least quantify their footprint. Mistral AI, which develops LLMs, published a blog post, “Our contribution to a global environmental standard for AI,” detailing a comprehensive study of its models’ environmental impact.
“We have recently conducted a first-of-its-kind comprehensive study to quantify the environmental impacts of our LLMs, [a] report that aims to provide a clear analysis of the environmental footprint of AI, contributing to set a new standard for our industry,” said Audrey Herblin-Stoop, Mistral AI’s VP of Public Affairs. She added that the footprint of a model correlates strongly with its size, so offering smaller or case-specific models can help. “This is why, at Mistral AI, we offer our customers a broad range of model sizes, starting with our smallest model, Ministral 3B.”
Meanwhile, Google agreed in August to curb power use for AI data centers to ease strain on the grid during demand surges. Its Gemini platform is already among the most widely used generative AI tools.
The Consumer Role
While companies race to scale up, experts argue that everyday users can also play a role in reducing AI’s footprint.
Dr. Luccioni poses a simple question: “Do you really need to generate a cookie recipe from scratch, or can you look it up in the recipe book or online?” She notes that people now use AI as a calculator, encyclopedia, and even therapist—yet thinking critically about what truly needs an AI-generated response could reduce unnecessary resource consumption.
AI as Part of the Solution
It’s important to note that AI itself can help fight climate change. Models are already being used to improve resource efficiency, manage energy distribution, optimize waste streams, and forecast climate impacts.
“The most-used AI models for fighting climate change are not the models that are the most harmful,” Dr. Luccioni explains. “Typically, the models that are used for, say, climate prediction and biodiversity monitoring, and for things like designing new generations of batteries and what have you, are quite efficient and can run locally on a laptop, and so these aren’t the problem in terms of environmental impact.”
However, she cautions that we’re seeing a Jevons paradox: technological progress increases efficiency but also drives higher overall consumption. “Many tech companies are improving efficiency, including Nvidia with each generation of hardware, so you can do more and more, but people are still using it more as well, which cancels out any kind of efficiency gain,” she says. “All this demand for genAI right now, it really, really adds up—and so we’re going to need to find meaningful solutions.”
Why This Matters for Investors
Generative AI’s hidden costs aren’t just environmental—they’re financial. As companies like OpenAI, Meta, Google, and Mistral pour capital into infrastructure and energy, the long-term profitability of these services may hinge on efficiency gains, new pricing models, or regulatory pressure. For investors, understanding both the upside (AI-driven productivity) and the downside (soaring energy costs, carbon taxes, and public backlash) will be critical to evaluating AI-related equities and supply chains.