Many environmentally aware individuals grapple with a sense of unease when using AI tools like ChatGPT, a sentiment highlighted in an analysis by Hannah Ritchie for “Sustainability by numbers.” Ritchie notes, “My sense is that a lot of climate-conscious people feel guilty about using ChatGPT. In fact it goes further: I think many people judge others for using it, because of the perceived environmental impact.”
Her article, which she notes aligns with detailed work by other analysts such as Andy Masley, aims to reframe this by suggesting that for typical text-based interactions, an individual’s carbon footprint is surprisingly small. This perspective, however, is set against the backdrop of the substantial, collective energy demands of the artificial intelligence sector and an urgent call for greater transparency from tech companies about their power usage.
The Individual User’s Environmental Echo
Ritchie’s examination often starts with a common estimate: a single ChatGPT query uses approximately 3 Watt-hours (Wh) of electricity. When compared to the UK’s average daily per capita electricity use of 12,000 Wh, ten such queries (30 Wh) account for only about 0.2% of that daily total. This figure rises to around 2% if a user makes 100 queries. For users in the United States, where individual electricity consumption is higher, these percentages are even lower, at approximately 0.09% for ten queries and 0.9% for one hundred.
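The percentages above can be checked with a few lines of arithmetic. A sketch, using the 3 Wh per-query figure and the UK’s ~12 kWh daily per capita use from the article; the US daily figure is back-calculated from the article’s own percentages and is my assumption, not a number Ritchie states:

```python
# Per-user share of daily electricity use, per the article's figures.
WH_PER_QUERY = 3.0        # the common per-query estimate cited
UK_DAILY_WH = 12_000      # UK average daily per capita electricity use
US_DAILY_WH = 33_000      # assumed; consistent with the ~0.09% / ~0.9% shares

def share_of_daily_use(queries: int, daily_wh: float) -> float:
    """Fraction of one person's daily electricity consumed by `queries` queries."""
    return queries * WH_PER_QUERY / daily_wh

print(f"UK, 10 queries:  {share_of_daily_use(10, UK_DAILY_WH):.2%}")   # ~0.25%
print(f"UK, 100 queries: {share_of_daily_use(100, UK_DAILY_WH):.2%}")  # ~2.5%
print(f"US, 10 queries:  {share_of_daily_use(10, US_DAILY_WH):.2%}")   # ~0.09%
```

The exact computed shares (0.25% and 2.5% for the UK) round to the “about 0.2%” and “around 2%” the article reports.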
In terms of carbon output, one query is estimated by some sources to produce around 2 to 3 grams of CO2, a figure that includes the amortized emissions from the energy-intensive process of training AI models. Ritchie calculates that ten daily queries over a year would add about 11 kilograms of CO2 to an individual’s footprint.
This represents a fractional increase—around 0.16% for an average UK resident (whose energy and industry footprint is about 7 tonnes) and 0.07% for an American. To further contextualize this, Ritchie, referencing analysis by Andy Masley, points out that abstaining from 50,000 ChatGPT questions—a volume representing roughly 14 years of ten daily interactions—would yield less CO2 savings than everyday actions like recycling or using reusable bags.
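The annual-footprint arithmetic follows directly from those per-query figures. A quick check, taking the upper end of the cited 2–3 g range:

```python
# Annual CO2 from ten daily queries, versus a UK per-person footprint.
G_CO2_PER_QUERY = 3.0    # grams, upper end of the article's 2-3 g range
QUERIES_PER_DAY = 10
UK_FOOTPRINT_KG = 7_000  # ~7 tonnes per person (energy and industry)

annual_kg = QUERIES_PER_DAY * G_CO2_PER_QUERY * 365 / 1000
print(f"Annual footprint: {annual_kg:.1f} kg CO2")                  # ~11 kg
print(f"Share of UK footprint: {annual_kg / UK_FOOTPRINT_KG:.2%}")  # ~0.16%
```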
Adding another layer to this, Ritchie highlights more recent research from Epoch AI suggesting that the energy per query might be as low as 0.3 Wh, a tenfold decrease from the older 3 Wh estimate.
If this more recent figure holds, the individual environmental impact shrinks further; ten daily queries would then represent just 0.02% of a UK individual’s electricity consumption. “A typical query uses far less energy than a standard lightbulb, or even just running your laptop for 5 minutes,” Ritchie observes.
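If the Epoch AI estimate is right, the same calculation shrinks by a factor of ten. A sketch, again assuming ~12 kWh of daily UK per capita use:

```python
# Ten daily queries at the revised 0.3 Wh Epoch AI estimate.
WH_PER_QUERY_REVISED = 0.3
UK_DAILY_WH = 12_000

share = 10 * WH_PER_QUERY_REVISED / UK_DAILY_WH
print(f"Ten daily queries: {share:.3%} of UK daily electricity use")  # ~0.025%
```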
Her advice to users of text-based AI is direct: “For the regular or even relatively high user of text-based LLMs: stop stressing about the energy and carbon footprint. It’s not a big deal, and restraining yourself from making 5 searches a day is not going to make a difference.” She does, however, caution that this perspective may not extend to users generating extensive high-quality video or audio content.
The Colossal Thirst Of AI Infrastructure
While individual queries might be light on resources, Ritchie is firm in distinguishing this from the overall energy profile of the AI industry. “I am not saying that AI energy demand, on aggregate, is not a problem,” she clarifies in her article, which follows her earlier piece on AI’s broader energy use. “It is, even if it’s ‘just’ of a similar magnitude to the other sectors that we need to electrify, such as cars, heating, or parts of industry. It’s just that individuals querying chatbots is a relatively small part of AI’s total energy consumption.”
The training phases for these large language models are particularly demanding; for instance, training OpenAI’s GPT-3 reportedly consumed 1,287 MWh, while the significantly larger GPT-4 required an estimated 62,318 MWh.
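Training costs look enormous in isolation, but as noted earlier they are amortized across every query a model ever serves. An illustrative sketch of that amortization, using the reported GPT-4 training figure; the lifetime query count is a made-up assumption for illustration, not a number from the article:

```python
# Amortizing GPT-4's reported training energy over an assumed query volume.
TRAINING_MWH = 62_318                # reported GPT-4 training energy
ASSUMED_LIFETIME_QUERIES = 100e9     # hypothetical: 100 billion queries

# 1 MWh = 1,000,000 Wh
per_query_wh = TRAINING_MWH * 1_000_000 / ASSUMED_LIFETIME_QUERIES
print(f"Amortized training energy: {per_query_wh:.2f} Wh per query")
```

Under that assumption, training adds well under 1 Wh per query, which is why per-query estimates can fold training in without changing their order of magnitude.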
This broader concern is vividly demonstrated by the situation unfolding in Memphis, Tennessee, where Elon Musk’s xAI is developing its “Colossus” AI supercomputer. The facility has ignited community protests over air pollution. The core of the issue lies in xAI’s use of numerous methane gas turbines, allegedly operating without standard emission controls or the necessary Clean Air Act permits, to power the energy-intensive operations.
This is occurring in South Memphis, a predominantly Black, lower-income community already burdened with poor air quality and the highest rates of asthma-related emergency room visits in Tennessee. The area is also home to over 17 other industrial facilities registered with the EPA’s Toxics Release Inventory.
The “Colossus” project’s power demands are immense. It launched with over 100,000 Nvidia Hopper GPUs in June 2024—these are advanced processors optimized for AI workloads, delivering substantial computational power but also consuming considerable energy. By December 2024, xAI announced plans for a tenfold expansion to over 1 million GPUs.
Musk has stated that Colossus will ultimately need 2,000 megawatts (MW) of power, far exceeding the 150 MW currently approved by the Tennessee Valley Authority (TVA). In August 2024, the site reportedly had access to only 8 MW of grid power, requiring the use of mobile generators.
Environmental groups estimate xAI could become a leading emitter of nitrogen oxides (NOx)—gases that contribute to smog and can worsen respiratory ailments—in Shelby County, potentially releasing 1,200 to 2,000 tons annually. State Representative Justin Pearson stated to ABC24 Memphis, “They are saying they have a permit for 15 gas turbines. Right now, we know that they have 35 gas turbines at that facility, and we know that 33 of those are operating currently.”
He also pointed to potential fines tied to daily emissions including “130 tons of nitrogen oxide and 17.2 tons of formaldehyde.” Beyond air quality, the Memphis xAI facility is also reported to be consuming 5 million liters of water per day for cooling, adding another layer to local resource strain. This reflects a broader trend, as global AI demand could lead to water withdrawals of 4.2–6.6 billion cubic meters in 2027.
Local Voices And Regulatory Questions
Residents near the Memphis facility have reported tangible health effects. “How come I can’t breathe at home and y’all get to breathe at home?” asked Boxtown resident Alexis Humphreys at an April 25, 2025, public hearing, displaying her asthma inhaler.
xAI has characterized the gas turbines as a temporary measure. Shannon Lynn, an environmental consultant for xAI, suggested a regulatory exemption for temporary sources, a claim disputed by environmental law experts. Bruce Buckheit, a former EPA air enforcement director, told Politico, “There needs to be a permit beforehand. You don’t just get that first year for free.”
On April 9 the Southern Environmental Law Center (SELC) formally requested that the Shelby County Health Department (SCHD) compel xAI to cease turbine operations until a major source air permit is secured. The Greater Memphis Chamber of Commerce has actively supported xAI’s presence, with Chamber President Ted Townsend remarking to Politico, “It doesn’t always require a community to say, ‘Well, we are OK with that or we are not.’ It’s a capitalistic thing. Companies come in and they operate.”
The Unseen Cost And The Path Forward
The Memphis scenario throws into sharp relief the environmental considerations tied to the burgeoning AI field. While an individual’s interaction with an AI like ChatGPT might seem inconsequential from an energy standpoint, the cumulative infrastructure required to deliver these services globally tells a different story. Major tech companies are seeing their environmental metrics affected; Google’s greenhouse gas emissions rose by 13% in 2023 largely due to AI data centers, and Microsoft’s electricity consumption per dollar of revenue increased by over 50% from 2020 to 2023 for similar reasons. The International Energy Agency projects that electricity demand from data centers worldwide could more than double by 2030, with AI as a primary driver.
Ritchie’s article strongly emphasizes the challenge of getting clear data. “I mentioned this in my previous article, but let me say again how crazy I think it is that we’re left debating the order-of-magnitude energy use of LLMs,” she writes. “We’re not just talking about whether it’s 3, 3.5 or 4 Wh. We’re talking about whether our current calculations are ten times too high. Of course, tech companies do know what the right number is; it’s just that a lack of transparency means the rest of us are left bumbling around, wasting time.”
This demand for clarity from the technology sector is echoed by communities like those in Memphis, who are on the front lines of AI’s expanding energy and resource footprint. While some research, such as a new training method from the Technical University of Munich claimed to be significantly more energy-efficient, offers potential technological mitigations, the overall trend points to a growing need for responsible development and transparent accounting of AI’s environmental costs.