scottcha
wmf
Voluntary conservation was only working by accident and guilt tripping never works. The grid needs to become clean so that we can have new industries.
XorNot
Yep, this is the real answer. It's also the only answer. The big fiction was everyone getting hopped up on the idea that "karma" was going to be real, and that people's virtue would be correctly identified by overt environmentalism rather than action.
Fossil fuel companies won, and they won around the mid-2000s, when BP paid an advertising firm to come up with "personal carbon footprint" as a meaningful metric. It basically destroyed environmentalism since...well, I'll let you know when it stops.
s1mplicissimus
It's a false dichotomy to say "either systemic change or individual change" - the two have always gone, and will always go, hand in hand, influencing each other in the process. To say only systemic change is required leaves out the individual responsibility of those who have the means to choose. To say only individual change is required leaves out the fact that people can only choose within the reality of their situation, which is clearly defined by the outcome of the system they are in.
andymasley
I made a point in the post to say that it's better to mostly ignore your personal carbon footprint and focus on systematic change, but that I was writing the post for people who still wanted to reduce their consumption anyway
fulafel
Emissions are a collective action problem. Guilt tripping works poorly on behaviour directly, but it works on awareness and public discourse -> voting -> policy.
Cf how we addressed the ozone hole, acid rain, slavery, etc.
fmbb
The grid being clean means not having any fossil power. We can only get there by shutting down all fossil fuel power plants.
We can not get there by adding new power generation.
celticninja
Well you need the latter to replace the former. So you need to add new power generation to allow you to shut down fossil fuel plants.
And to be honest, what we need to do is replace them with nuclear power stations to manage the base load of a nation's power requirements. Either that, or much better power storage is required.
caseyy
> We can only get there by shutting down all fossil fuel power plants. We can not get there by adding new power generation.
https://knowyourmeme.com/photos/1433498-no-take-only-throw
"No add new power plants, only transform our grid to greener".
shafyy
Even if the grid was 100% renewable, this does not mean that there's no environmental cost to producing electricity. As a society, we need to decide what is important and try to minimize energy consumption for things that are not important.
And shoving LLMs into every nook and cranny of every application, so just tech giants who run the data centers can make more money and some middle managers get automatic summaries of their unnecessary video calls and emails is, I would argue, not important.
But once again, the fundamental issue is late-stage capitalism.
mritterhoff
What's the upside of moralizing energy consumption, especially once it's 100% renewable. Why not just let the market decide? If I'm paying for it, why does anyone else get a say in how I use it?
curvaturearth
Having LLMs everywhere hasn't helped me much; they just get in the way.
YetAnotherNick
Why do you believe this? Datacenters use just 1-1.3 percent of electricity from the grid, and even if you suppose AI doubled that usage (which I really doubt), the number would still be tiny.
Also, AI training is the easiest workload to regulate, as you can train only when you have cheaper green energy.
kolinko
I also had doubts, but asked chat and it confirms it’s an issue - including sources.
https://chatgpt.com/share/678b6b3e-9708-8009-bcad-8ba84a5145...
The issue is that they are often localised, so even if it’s just 1% of power, it can cause issues.
Still, by themselves, grid issues don't mean climate issues. And to be reliable, any argument complaining about a CO2 cost should also consider the cost of the alternative. Even if AI were causing 1% or 2% or 10% of energy use, the real question is how much it saves by making society more efficient. And even if it weren't, it's again more a question about energy companies polluting with CO2.
Microsoft, which hosts OpenAI, is famously ambitious about its CO2 emissions - so far it has gone well beyond what other companies are doing.
baobun
ChatGPT didn't "confirm" anything there. It is not a meaningful reference.
YetAnotherNick
What do you mean by confirms the issue? What's the issue exactly?
seanmcdirmid
Is that true, though? Data centers can be placed anywhere in the USA; they could be placed near hydro or wind resources on the western grid, which has little coal anyway outside of one line from Utah to SoCal. The AI doesn't have to be located anywhere near where it's used, since fiber is probably easier to run than a high-voltage power line.
wmf
That was already done years ago and people are predicting that the grid will be maxed out soon.
seanmcdirmid
Build new data centers near sources of power, and grid capacity isn't going to be a problem. Heck, American industry used to work that way (garment factories were built on fast-moving rivers before electricity was much of a thing; Boeing grew up in the Northwest thanks to cheap hydro-powered aluminum). Why is AI somehow different from an airplane?
scottcha
There are a large number of reasons the AI datacenters are geographically distributed--just to list a few off the top of my head which come up as top drivers: latency, data sovereignty, resilience, grid capacity, renewable energy availability.
Karrot_Kream
Why does latency matter for a model that responds in 10s of seconds? Latency to a datacenter is measured in 10s or 100s of milliseconds, which is 3-4 orders of magnitude less.
fulafel
The root problem there is that fossil energy is very cheap and the state sponsors production of fossil fuels. Consequently the energy incentives are weak and other concerns take priority.
This is coupled with low public awareness, many people don't understand the moral problem in using fossils so the PR penalty from a fossil powered data center is low.
getwiththeprog
This is a great article for discussion. However, articles like this must link to references. It is one thing to assert, another to prove. I do agree that heating/cooling, car and transport use, and diet play massive roles in climate change that should not be subsumed by other debates.
The flip side to the author's argument is that LLMs are not only used by home users doing 20 searches a day. Governments and Mega-Corporations are chewing through GPU hours on god-knows-what. New nuclear and other power facilities are being proposed to power their use; this is not insignificant. Schneider Electric predicts 93 GW of power demand from AI by 2028. https://www.powerelectronicsnews.com/schneider-electric-pred...
simonw
The question this is addressing concerns personal use. Is it ethical to use ChatGPT on a personal basis? A surprising number of people will say that it isn't because of the energy and water usage of those prompts.
strogonoff
I would be surprised if many people said it is unethical to use LLMs like ChatGPT for environmental reasons, as opposed to ethical principles such as encouraging unfair use of IP and copyright violation.
Still, not all LLM queries are equal. The environmental justification does not account for models querying other services, like the famous case where a single ChatGPT query resulted in thousands of HTTP requests.
simonw
I see people complaining that ChatGPT usage is unethical for environmental reasons all the time. Here's just the first example I found from a Bluesky search (this one focuses on water usage): https://bsky.app/profile/theferocity.bsky.social/post/3lfckq...
"the famous case where a single ChatGPT query resulted in thousands of HTTP requests"
Can you provide more information about that? I don't remember hearing about that one - was it a case of someone using ChatGPT to write code and not reviewing the result?
minimaxir
> I would be surprised if many people said it is unethical to use LLMs like ChatGPT for environmental reasons, as opposed to ethical principles such as encouraging unfair use of IP and copyright violation.
Usually they complain about both.
fulafel
I feel it's great that people have gotten invested in energy use this way, even if it's a bit lopsided. We should use it in a positive way to get public opinion and the political Overton window behind rapid decarbonization and closure of oil fields.
BeetleB
> Governments and Mega-Corporations are chewing through GPU hours on god-knows-what.
The "I don't know so it must be huge" argument?
dwattttt
Not knowing what it's being spent on is separate from knowing whether it's being spent.
chefandy
I’m not an expert, but I’ve seen multiple reports that predict very large increases in electricity demand.
https://www.goldmansachs.com/insights/articles/AI-poised-to-...
jonas21
> However articles like this must link to references.
There are links to sources for every piece of data in the article.
blharr
Where?
One of the most crucial points "Training an AI model emits as much as 200 plane flights from New York to San Francisco"
This seems to come from this blog https://icecat.com/blog/is-ai-truly-a-sustainable-choice/#:~....
which refers to this article https://www.technologyreview.com/2019/06/06/239031/training-...
which is talking about models like *GPT-2, BERT, and ELMo* -- _5+ year old models_ at this point.
The keystone statement is incredibly vague, and likely misleading. What is "an AI model"? From what I found, this is referring to GPT-2.
andymasley
I'm the author. The 200 flights number is taken from posts made by environmentalists and seems to match public numbers given (about 50 GWh) https://www.forbes.com/sites/arielcohen/2024/05/23/ai-is-pus....
If you think the numbers I used are wildly off I'd really appreciate any source saying so and I'll update the post with the correct amounts.
gloflo
Just 200 flights? I would have expected a number at least 100 times that. 200 flights of that range are what, 0.1% of a single day of global air traffic?
All of that is crazy in terms of environmental destruction but this makes AI training seem nothing to focus on to me.
mmoskal
I assume this comes from the 60 GWh figure, which does translate to about 200 flights (assuming the energy density of gasoline; in actual CO2 emissions it was probably less, since the training likely ran on cleaner energy than planes burn).
moozilla
The link the article uses to source the 60 GWh claim (1) appears to be broken, but all of the other sources I found give similar numbers, for example (2) which gives 50 GWh. This is specifically to train GPT-4, GPT-3 was estimated to have taken 1,287 MWh in (3), so the 50 GWh number seems reasonable.
I couldn't find any great sources for the 200 plane flights number (and as you point out the article doesn't source this either), but I asked o1 to crunch the numbers (4) and it came up with a similar figure (50-300 flights depending on the size of the plane). I was curious if the numbers would be different if you considered emissions instead of directly converting jet fuel energy to watt hours, but the end result was basically the same.
[1] https://www.numenta.com/blog/2023/08/10/ai-is-harming-our-pl...
[2] https://www.ri.se/en/news/blog/generative-ai-does-not-run-on...
[3] https://knowledge.wharton.upenn.edu/article/the-hidden-cost-...
[4] https://chatgpt.com/share/678b6178-d0e4-800d-a12b-c319e324d2...
KTibow
If I understand TFA correctly that's a claim it's covering and arguing against, not arguing for.
crakhamster01
One miss in this post is that the author tries to make their point by comparing energy consumption of LLMs to arbitrary points of reference. We should be comparing them to their relevant parallels.
Comparing a ChatGPT query to an hour long Zoom call isn't useful. The call might take up ~1700 mL of water, but that is still wildly more efficient than what we used to do prior - travel/commute to meet in person. The "10x a Google search" point is relevant because for many of the use cases mentioned in this post and others like it (e.g. "try asking factual questions!"), you could just as easily get that with 1 Google search and skimming the results.
I have found use for LLMs in software development, but I'd be lying if I said I couldn't live without it. Almost every use case of an LLM has a simple alternative - often just employing critical thinking or learning a new skill.
It feels like this post is a long way of saying "yes, there are negative impacts, but I value my time more".
andymasley
I basically do think that at some threshold it's important to weigh your time against negative impacts. I personally avoid taking flights whenever I can because of the climate and think that's worth my time relative to the emissions saved, but I also never worry about optimizing the energy use of my digital clock because that would take too much time relative to the emissions I could save. ChatGPT exists somewhere between those two things, and my argument in the post is that it's much closer to the digital clock.
Retric
On a logarithmic scale, it’s closer to the flight.
Flying 1000 miles commercially only represents about 10 gallons of fuel per passenger.
andymasley
10 gallons of fuel's worth of energy could be used to ask ChatGPT 100,000 questions (assuming 3 Wh per question) or power a digital clock (1-2 W) for 35 years. If you assume you ask ChatGPT 8 questions per day, it's using exactly as much energy as a digital clock. Personal use of ChatGPT is much closer to the clock.
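A quick sanity check of that arithmetic, as a Python sketch (the ~33.4 kWh of energy per gallon of gasoline is an assumed figure; the 3 Wh/question and 1 W clock numbers are the comment's own):

```python
# Back-of-envelope check: how far does 10 gallons of fuel go?
# Assumed: ~33.4 kWh of chemical energy per gallon of gasoline.
fuel_wh = 10 * 33.4 * 1000              # 10 gallons in watt-hours (~334,000 Wh)
questions = fuel_wh / 3                 # at 3 Wh/question: roughly 111,000
clock_years = fuel_wh / (1 * 24 * 365)  # a 1 W clock: roughly 38 years
chatgpt_daily_wh = 8 * 3                # 8 questions/day -> 24 Wh/day
clock_daily_wh = 1 * 24                 # a 1 W clock -> 24 Wh/day
```

So at 8 questions a day, ChatGPT use and a 1 W clock both draw 24 Wh/day, which is the equivalence the comment is pointing at.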
maeil
The section on training feels weak, and that's what the discussion is mainly about.
Many companies are now trying to train models as big as GPT-4. OpenAI is training models that may well be even much larger than GPT-4 (o1 and o3). Framing it as a one-time cost doesn't seem accurate - it doesn't look like the big companies will stop training new ones any time soon, they'll keep doing it. So one model might only be used half a year. And many models may not end up used at all. This might stop at some point, but that's hypothetical.
blharr
It briefly touches on training, but uses a seemingly misleading statistic derived from models vastly smaller than GPT-4.
This article [1] says that 300 [round-trip] flights are similar to training one AI model. Its reference for "an AI model" is a study done on 5-year-old models like BERT (110M parameters), Transformer (213M parameters), and GPT-2. Considering that models today may be more than a thousand times larger, it is not a credible comparison.
Similar to the logic of "1 mile versus 60 miles in a massive cruise ship"... the article seems to be ironically making a very similar mistake.
[1] https://icecat.com/blog/is-ai-truly-a-sustainable-choice/#:~....
mmoskal
737-800 burns about 3t of fuel per hour. NYC-SFO is about 6h, so 18t of fuel. Jet fuel energy density is 43MJ/kg, so 774000 MJ per flight, which is 215 MWh. Assuming the 60 GWh figure is true (seems widely cited on the internets), it comes down to 279 one-way flights.
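The same arithmetic as a runnable sketch (all inputs are the comment's own estimates: ~3 t/h burn rate, ~6 h flight, 43 MJ/kg jet fuel, 60 GWh for training):

```python
# NYC-SFO in a 737-800: fuel burned, its energy content, and the
# flight count implied by a 60 GWh training run.
fuel_kg = 3_000 * 6            # ~3 t/h for ~6 h -> 18,000 kg of fuel
flight_mj = fuel_kg * 43       # jet fuel at 43 MJ/kg -> 774,000 MJ
flight_mwh = flight_mj / 3600  # 3600 MJ per MWh -> ~215 MWh per flight
flights = 60_000 / flight_mwh  # 60 GWh = 60,000 MWh -> ~279 flights
```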
blharr
Thanks, I missed that 60 GWh figure. I got confused by the quotes around the statement, so I looked it up and couldn't find the quote. I realize now that he's quoting himself making that statement (and it's quite accurate).
I am surprised that, somehow, the statistic didn't change from the GPT-2 era to GPT-4. Did GPUs really get that much more efficient? Or does that study have some problems?
devmor
I am sure that’s intentional, because this article is the same thing we see from e/acc personalities any time the environmental impact is brought up.
Deflection away from what actually uses power and pretending the entire system is just an API like anything else.
andymasley
I am, to put it mildly, not an e/acc, and I referenced being very worried about other risks from advanced AI in the article.
devmor
Then I would certainly be interested to know why you spent so much time making the same argument e/acc AI proponents make ad nauseam.
As it stands, the majority of your article reads like a debate against a strawman that is criticizing something they don't understand, rather than a refutation of any real criticism of environmental impact from the generative AI industry.
If your aim was to shut down bad faith criticism of AI from people who don't understand it, that's admirable and I'd understand the tone of the article, but certainly not the claim of the title.
Liquix
~90% of the plastic debris in the ocean comes from ten rivers [0]. eight are in china/SEA. millions and billions of single-use items are sitting in warehouses and on store shelves wrapped in plastic. even before the plastic is discarded, the factories these items are produced in dump metric tons of waste into the oceans/soil with little repercussion.
point is, none of our "personal lifestyle decisions" - not eating meat, not mining bitcoin, not using chatgpt, not driving cars - amount to more than a drop in the bucket compared to standard-practice overseas manufacturing.
us privileged folks could "just boycott", "buy renewable", "vote with your wallet", etc, but sales will move to a less developed area and the pollution will continue. this is not to say that the environment isn't important - it's critically important. it's just to say that until corporations are forced to do things the right way, it's ludicrous to point fingers at each other and worry that what we do day-to-day is destroying the planet.
saagarjha
That's definitely not true. Let's take Americans, for example, driving their cars to work. Americans are about 15% of the world's emissions, of which 25% or so is transportation, of which well over half is cars. So you not driving to work is making direct impact on 2-3% of the world's overall emissions. Likewise, your decisions on all the other things, if taken in aggregate, will have a significant impact on overall emissions.
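The chain of shares in that estimate can be made explicit (the exact percentages are rough assumptions, as in the comment itself):

```python
# Rough decomposition of US car commuting's share of global emissions.
us_share = 0.15         # US share of global emissions (comment's figure)
transport_share = 0.25  # transportation's share of US emissions
car_share = 0.55        # "well over half" of transport is cars (assumed 55%)
world_share = us_share * transport_share * car_share  # ~0.02, i.e. ~2%
```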
idle_zealot
"Driving to work" is hardly a "vote with your wallet" style consumer choice. Our housing, building, and transportation policies have been geared towards encouraging car-dependence for nearly a century. In places with better public transit and bike lanes, people spontaneously choose to use those modes of transport. Just like with companies dumping as much plastic waste/CO2 as they can get away with, this is a policy problem, plain and simple. No amount of pro-environment metal straw campaigns will solve it. At best environmentally-conscious messaging could encourage changes in voting behavior which influence policy. At worst people could be convinced that they're "doing their part" and fail to consider systemic changes.
hmottestad
Regular voting is usually what affects things such as the transportation infrastructure in your country or city. It's a slow process, though.
Here in Oslo there has been a lot of investment in bike lanes, but just because one part of the local government builds more bike lanes doesn't mean that other parts of the government will follow suit. The police still don't care about cars illegally blocking the bike lanes. The people ploughing snow see bike lanes as the last thing that needs ploughing, preferably no earlier than 2 weeks after it snowed. A dedicated bike path I use to get to work is supposed to be ploughed within 2 hours of snowfall, but it took a week before anything was done, and now, three weeks later, it's still not up to the standard the government has set.
dijit
I would agree with you, but Americans intentionally reinforce car dependence whenever it's discussed.
It's bad enough that even non-US people regurgitate those talking points, despite them being significantly less true for them, because they see them so much online.
saagarjha
See, my point is that everyone first goes “it’s not me”, then they understand it is them and go “but it’s not my policies” and then they vote in the policies which are the problem. It’s totally fine to go “we need collective action to fix this”. But you have to actually join the collective action. You think billionaires are getting rich by committing environmental arbitrage? Then don’t oppose attempts to make the costs appropriate, even if you must now pay your fair share too.
irishloop
Meat and dairy specifically account for around 14.5% of global greenhouse gas emissions, according to the UN's Food and Agriculture Organization (FAO).
If people collectively just ate a bit less meat and dairy, it would go a long way. Don't even have to be perfect. Just show a little bit of restraint.
throwaway314155
Right just a little bit of restraint. On an unprecedented scale of coordination by hundreds of millions to billions of people - a scale of cooperation that has probably never occurred in human history (and there's no reason to believe it will any time soon).
But sure, if people "just" did a "little", it would go a long way. Just a _little_ restraint from the entire population all at once in perpetuity. No big deal.
mossTechnician
The US government could help by ending subsidies towards meat and dairy production, which will prevent those products from being artificially underpriced.
https://www.ewg.org/news-insights/news/2024/10/usda-livestoc...
llmthrow102
Greenhouse gas emissions are only a fraction of the terrible things humans are inflicting on the environment. Meat and dairy are nutritious foods that meet real dietary requirements, and if not eaten they need to be replaced by something else that will also cause greenhouse gas emissions (i.e., a 10% reduction in meat consumption does not equal a 1.45% reduction in greenhouse gas emissions).
I think it's kind of crazy to place the burden of environmental destruction on individual buying habits, rather than the people in power who actually have the ability to make sweeping changes that might actually move the needle.
Let's start with not incentivizing, then disincentivizing the mass production and importation of plastic garbage waste and e-waste that not only create greenhouse gas emissions but pollute the environment in other, irreversible ways.
And if your government and leaders don't make this a priority, and regardless of who you vote in, big-name corpo donors get their way instead, then maybe it's time for a new government.
starspangled
Not encouraging population growth anywhere - but particularly in the highest per-capita consuming and polluting countries - and instead allowing populations to naturally level off and even gradually decline would go a much longer way. It would enable significant reductions in emissions, and in all the other environmental impacts of consumption, without impacting quality of life.
Eating bugs and living in pods sounds great and all, but if the end result is just allowing the ruling class to pack more drones and consumers in like sardines then it's not really solving anything.
teaearlgraycold
I was able to cut out 95% of meat without it being much trouble.
TiredOfLife
Since my birth the population of earth has almost doubled.
Brystephor
How much of Americans driving to work is because they choose to, though? Amazon's 5-day RTO policy is a good example. How many of the people now going to an office 5 days a week would've done so without the mandate? I see the traffic every day, and I saw the same area before the mandate, so I can tell you with confidence that there are many more cars on the road as a result of this commute. This all funnels back to the corporate decision to mandate 5 days in office.
josephcsible
Exactly. IMO, any politician who's serious about saving the environment or reducing the number of cars should be proposing bills to heavily tax employers for every unnecessary commute they require of their employees (maybe $100-$500 per employee per unnecessary day required in the office).
netcan
> if taken in aggregate, will have a significant impact
This is a good sentiment. But, in context, it is a fallacy. A harmful one.
Consumer action on transport and whatnot, assuming a massive and persistent global awareness effort... has the potential of adding up to a rounding error.
Housing policy, transport policy, urban planning... these are what affects transport emissions. Not individual choices.
Look at our environmental history. Consumer choice has no wins.
It's propaganda. Role reversal. Something for certain organizations to do. It is not an actual effort to achieve environmental benefit.
We should be demanding governments clean up. Governments and NGOs should not be demanding that we clean up.
fastball
This assumes all emissions / externalities are created equal, which they are not.
eru
You are right. Though for CO2 that simplification comes pretty close to true.
smcin
Could you say more?
Are you talking about comparing CO2 to N2O to CH4 to fluorocarbons, for example?
aio2
The emissions from vehicles are different from plastics produced by factories.
Also, while important, 2-3% of world emissions is a drop in the bucket compared to the other 97%. Let's consider the other causes and how we can fix them.
Think about this: for many people, not driving to work is a big deal. If people collectively decide to do that, that's a lot of effort and inconvenience just for 2-3%.
ido
while 3% might sound like a drop in the bucket, there isn't any single specific chunk of the rest of the 97% that will immediately cut, say, 30-40% of emissions (also remember that 2-3% is the super specific "Americans not driving cars", not "everyone in the world not driving cars").
saagarjha
There isn’t really a magic wand we can wave and get 50% back for free and without inconvenience. The other 97% involves things like individually figuring out where our electricity generation goes. Or figuring out which farms to shut down, or what manufacturing we don’t like anymore. All of this must happen. It will be inconvenient. I picked a slice that is immediately relevant to a lot of people here. But there are a lot of axes to look at this.
photonthug
> That's definitely not true. Let's take Americans, for example, driving their cars to work.
Even an example like this that is carefully chosen to make consumers feel/act more responsible falls short. You want people to change their lives/careers to not drive? Ok, but most people already want to work from home, so even the personal “choice” about whether to drive a car is basically stuck like other issues pending government / corporate action, in this case to either improve transit or to divest from expensive commercial real estate. This is really obvious isn’t it?
Grabbing back our feeling of agency should not come at the expense of blaming the public under the ridiculous pretense of “educating” them, because after years of that it just obscures the issues and amounts to misinformation. Fwiw I’m more inclined to agree with admonishing consumers to “use gasoline responsibly!” than say, water usage arguments where cutting my shower in half is supposed to somehow fix decades of irresponsible farming, etc. But after a while, people mistrust the frame itself where consumers are blamed, and so we also need to think carefully about the way we conduct these arguments.
saagarjha
I didn't really carefully choose this, it was just what I came up with. As others have mentioned, meat is another big one. FWIW I have no disagreement with letting people work from home, or pushing for other changes to make them less car-dependent.
citrin_ru
I think many Americans driving to work would be happy to work from home if not for RTO mandates (encouraged by the government, at least on a local level).
petesergeant
This massively lets the Philippines off the hook. China has a gazillion people, and so does India, and the rest of SE Asia is bad for pollution, but the Philippines — with 1.5% of the world’s population — is an incredible 36% of ocean plastic pollution.
Also a call-out to Malaysia who are an upper-middle income country and contribute far too much per capita given their income situation, but again, they are a drop in the ocean compared to the (much, much poorer) Philippines.
Having spent half my life in South-East Asia, there’s a cultural problem that needs fixing there.
A pretty graph that makes it clear just how bad the most egregious polluters are comparatively: https://ourworldindata.org/grapher/ocean-plastic-waste-per-c...
mnsc
What are they doing in the Philippines? Dumping household waste straight into the rivers?
petesergeant
Pretty much. Buying everything in sachets and then not having any real waste disposal; https://earth.org/philippines-plastic/
deepsun
I think they are fishing commercially a lot. Most of the ocean trash comes from the fishing industry (e.g. abandoned nets).
ternnoburn
Per capita beef consumption in the US is down about 44% since the 70s, from 62 kg/person/year to 35.
Beef produces ~100 kg of CO2e per kg of meat. That's a per-capita reduction of about 2,700 kg of CO2e per year.
That's not nothing. By simply reducing beef consumption by 1 kilogram a month, you can prevent more than a metric ton of CO2 a year. If 5% of Americans cut 1 kilo of beef a month, that'd knock out close to 20 million tons of CO2.
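Checking those figures with a quick sketch (the ~100 kg CO2e/kg beef figure and a ~330 million US population are assumptions; with these inputs the per-capita annual reduction works out to about 2,700 kg and the 5% scenario to roughly 20 million tonnes):

```python
# Beef arithmetic: per-capita drop, a 1 kg/month cut, and 5% of the US.
CO2E_PER_KG_BEEF = 100                            # assumed kg CO2e per kg of beef
per_capita_drop = (62 - 35) * CO2E_PER_KG_BEEF    # 2,700 kg CO2e/year
one_kg_month_cut = 12 * CO2E_PER_KG_BEEF          # 1,200 kg/year, over a tonne
five_pct_tonnes = 0.05 * 330e6 * one_kg_month_cut / 1000  # ~20 million tonnes
```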
Small changes can have an impact on aggregate, and just because someone else is not making those changes doesn't excuse us looking at ourselves and saying, "I could choose something else this time".
nameless_me
I have always felt this way too. Our personal choices do not move the needle on fossil fuels and plastics. One could embrace aversion to these out of a sense of sustainability, to signal virtue, but let's not pretend it will save the planet. It won't. Restricting aviation, stopping wars, and minimizing the dirty fuel used in maritime freight would do much more. But the world will not do it.
kalaksi
While I agree in general, my opinion is that customer choices do also matter and can move the needle, slowly, with larger cultural change.
Personally, trying to make better choices, big or small, isn't about "virtue signalling". It's about acknowledging the issues and living according to one's values.
rolftheperson
This line of thinking is what undermines democracies and ruins the environment. Your choice might just be a drop in the ocean, but guess what the ocean is made out of.
yodsanklai
> it's just to say that until corporations are forced to do things the right way
But this isn't going to happen by itself. We need to vote for people who believe in regulating these corporations (rather than deregulating them).
eviks
> "vote with your wallet", etc, but sales will move to a less developed area and the pollution will continue.
But voting with your wallet is literally moving sales to a more developed area with less pollution?
wisty
I think this is wrong.
Descriptively / "objectively" if you make your demand cleaner, you decrease demand for dirty consumption. You can't say individuals don't matter by comparing them to the world, that's invalid.
Normatively, is it a useful lie? Maybe, to some extent. People are lazy, selfish, and stupid. Peter Singer points out that we might be nice to people nearby, but we don't give money to people starving in other countries even if we think it will make a real difference. And no human can really know how even a pencil is made, so we make poor decisions. A carbon tax would unleash the free market on the problem. But saying individuals can't act is not good leadership: if even the people who say they want to fix the issue won't make personal sacrifices, why should the average voter?
simonw
The absolute best thing I've read on this subject is this article here: https://about.bnef.com/blog/liebreich-generative-ai-the-powe...
It talks at great length about data center trends relating to generative AI, from the perspective of someone who has been deeply involved in researching power usage and sustainability for two decades.
I made my own notes on that piece here (in case you don't have half an hour to read the original): https://simonwillison.net/2025/Jan/12/generative-ai-the-powe...
strogonoff
I find the following to be a great point regarding what we ought to consider when adapting our lifestyle to reduce negative environmental impact:
> In deciding what to cut, we need to factor in both how much an activity is emitting and how useful and beneficial the activity is to our lives.
The further example with a hospital emitting more than a cruise ship is a good illustration of the issue.
Continuing this line of thought, when thinking about your use of an LLM like ChatGPT, you ought to weigh not merely its emissions and water usage, but also the larger picture of how it benefits human society.
For example: Was this tech built with ethically sound methods[0]? What are its foreseeable long-term effects on human flourishing? Does it damage the livelihoods of many people while increasing the wealth gap with the tech elites? Does it negatively impact open information sharing (willingness to run self-hosted original-content websites or communities open to the public, or even the feasibility of doing so[1][2]), motivation and capability to learn, creativity? And so forth.
[0] I’m not going to debate utilitarianism vs. deontology here, will just say that “the ends justify the means” does not strike me as a great principle to live by.
OlivOnTech
Hello Simon,
You mention that
> Google, Microsoft, Meta and Amazon all have net-zero emission targets which they take very seriously, making them "some of the most significant corporate purchasers of renewable energy in the world". This helps explain why they're taking very real interest in nuclear power.
Nuclear is indeed (more or less) zero-emission, but it's not renewable.
Thank you for the synthesis and link to the original article, it's a good read!
yapyap
Such a stupid post, I know people on HN don’t like absolute descriptors like that and sorry for that.
Obviously LLMs like ChatGPT don't use the most energy when answering your question; they churn through insane amounts of water and energy when being trained - so much so that big tech companies do not disclose those amounts and try to obscure them as much as possible.
You aren’t destroying the environment by using it RIGHT NOW, but you are telling the corresponding company that owns the LLM you use “there is interest in this product”, en masse. With these interest indicators they will plan for the future and plan for even more environmental destruction.
nick__m
It's not like they are mixing that water with oil and pumping it into the aquifer. Water evaporates, turns into clouds that precipitate into rain, which falls on the ground and into water bodies, where it can be used again. So what's the problem with datacenter water usage? Has the water cycle stopped and I was not informed?
ternnoburn
Fresh water is finite. Infinite in reuse, but we can only take so much from a river before that river ceases to be. If you have a megabit connection, it doesn't matter that your cloud backups have infinite storage; you are limited by bandwidth.
Water vapor stays aloft for a while, so there's no guarantee it enters the same watershed it was drawn from.
It's also a powerful greenhouse gas, so even though it's removed quickly, raising the rate we produce it results in more insulation.
It's not an infinite resource, so we need to be judicious and wise in how we allocate it.
namesbc
[flagged]
simonw
Plenty of companies have revealed exactly how much energy and CO2 they used training a model. Just off the top of my head, I've seen those numbers for Meta's Llama models, Microsoft's Phi series, and DeepSeek's models - including their impressive DeepSeek v3, which trained for less than $6m - a huge reduction compared to other similar models, and a useful illustration of how much more efficient this stuff can get on the training side of things.
fulafel
Anyone care to have a go at back of the envelope number for training energy use amortized per query for ChatGPT's models? Is the training or the inference going to dominate?
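A rough sketch of that amortization is easy to write down. Every input below is an assumption for illustration, not a disclosed figure: a GPT-4-class training run of ~50 GWh, ~1 billion queries per day, a one-year serving lifetime, and the commonly cited ~0.3 Wh per query for inference.

```python
# Back-of-envelope: amortized training energy per query vs. inference energy.
# ALL inputs are rough assumptions for illustration, not disclosed numbers.

TRAINING_ENERGY_WH = 50e9       # ~50 GWh for a GPT-4-class training run (assumed)
QUERIES_PER_DAY = 1e9           # ~1 billion queries/day (assumed)
MODEL_LIFETIME_DAYS = 365       # model served for ~1 year (assumed)
INFERENCE_WH_PER_QUERY = 0.3    # ~0.3 Wh per query (commonly cited estimate)

total_queries = QUERIES_PER_DAY * MODEL_LIFETIME_DAYS
training_wh_per_query = TRAINING_ENERGY_WH / total_queries

print(f"Amortized training energy: {training_wh_per_query:.3f} Wh/query")
print(f"Inference energy:          {INFERENCE_WH_PER_QUERY:.3f} Wh/query")
print(f"Inference/training ratio:  {INFERENCE_WH_PER_QUERY / training_wh_per_query:.1f}x")
```

Under these assumptions inference dominates by roughly 2x, but the answer flips easily: halve the query volume or double the training run and training wins.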
jna_sh
Similar feelings about the repeated references to the apparently agreed consensus that individual action is pointless vs systematic change like switching to a renewable energy system. Jevons Paradox would like a word.
monero-xmr
[flagged]
ben_w
> Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great.
> And it is elites only as poors don’t give a shit
The poor people are also consumers; raising prices of energy for that group is a fantastic way to get kicked out of office even if you're an actual literal dictator.
People are complex.
The screeds you're objecting to are part of the political process of telling governments to do something, even if that something ends up being a mix of what you suggest plus subsidies for the poor, or something completely different - in any case, to avoid being defenestrated.
andymasley
Hey all I wrote this post. To clear up a few points:
I meant this post to tell individuals that worrying about the emissions they personally cause using ChatGPT is silly, not that AI more broadly isn't using a lot of energy.
I can't really factor in how demand for ChatGPT is affecting the future of AI. If you don't want to use ChatGPT because you're worried about creating more demand, that's more legit, but worrying about the emissions associated with individual searches right now, on their own, is a silly distraction.
One criticism is that I didn't talk about training enough. I included a section on training in the emissions and water sections, but if there's more you think I should address or change I'm all ears. Please either share them in the comments on the post or here.
I saw someone assumed I'm an e/acc. I'm very much not and am pretty worried about risks from advanced AI. Had hoped the link to an 80,000 Hours article might've been a clue there.
Someone else assumed I work for Microsoft. I actually exclusively use Claude but wanted to write this for a general audience and way fewer people know about Claude. I used ChatGPT for some research here that I could link people to just to show what it can do.
zdragnar
The title does not match the content.
A more appropriate title is "Emissions caused by chatgpt use are not significant in comparison to everything else."
But, given that title, it becomes somewhat obvious that the article itself doesn't need to exist.
9rx
> "Emissions caused by chatgpt use are not significant in comparison to everything else."
Emissions directly caused by Average Joe using ChatGPT are not significant compared to everything else. 50,000 questions is a lot for an individual using ChatGPT casually, but nothing for the businesses using ChatGPT to crunch data; 50,000 "questions" will be lucky to get you through the hour.
Those businesses aren't crunching data just for the sake of it. They are doing so ultimately because that very same aforementioned Average Joe is going to buy something that was produced out of that data crunching. It is the indirect use that raises the "ChatGPT is bad for the environment" alarm. At the very least, we don't have a good handle on what the actual scale is. How many indirect "questions" am I asking ChatGPT daily?
jonas21
> given that title, it becomes somewhat obvious that the article itself doesn't need to exist.
Why? I regularly hear people trying to argue that LLMs are an environmental disaster.
_ache_
Because LLMs are an environmental disaster.
It's not about any individual usage. It's the global technology that has yet to prove itself useful and that is already bad for the environment.
Any new usage should be free of impact on the environment.
(Note: The technology of LLM itself is not an environmental disaster, but how it is put in use currently isn't the way).
c0redump
> yet to prove to be useful
I don’t understand this perspective. It should be abundantly clear at this point that these systems are quite useful for a variety of applications.
Do they have problems? Sure. Do the AI boosters who breathlessly claim that the models are super intelligent make me cringe? Sure.
But saying that they’re not useful is just downright crazy.
satvikpendem
> It's the global technology that is yet to prove to be useful
Useful for whom, by what definition? I personally find it very useful for my day to day work, whether it be helping me write code, think through ideas, or otherwise.
simonw
The article needs to exist because the idea that ChatGPT usage is environmentally disastrous really has started to make its way into the human hive mind.
I'm glad someone is trying to push back against that - I see it every day.
deepsun
Training a new model (like GPT-4) is way more costly than running it.
originalvichy
Where in the world are you getting the numbers for how much energy video streaming uses? I am quite sure that, just as with LLMs, most of the energy goes into the initial encoding of the video, and nowadays any rational service encodes videos to several bitrates to avoid JIT transcoding.
Networking can’t take that much energy, unless perhaps we are talking about purely wireless networking with cell towers?
oneplane
LLM Inference is still quite power-hungry, Video decoding with hardware acceleration is much more efficient.
But we can do some estimates, heck, we can even ask GPT for some numbers.
Say you want to do 30 minutes of video (H.265) or 30 minutes of LLM inferencing on a generic consumer device; ignoring the source of the model or of the encoded video, you get about a 4x difference:
Energy usage for 30 minutes of H.265 decoding: ~15–20 Wh.
Energy usage for 30 minutes of Llama3 inference: ~40–60 Wh.
This is optimised already, so a working hardware H.265 decoder is assumed, and for inferencing, something on the level of an RTX 3050 (but it could also be a TPU or NE). While not the most scientific comparison, it's perhaps good to know that video decoding is practically always local, and for streaming services it will use whatever is available and might even switch codecs (i.e. AV1, H.265, H.264, depending on what is available and what licenses are used). And if you have older hardware, some codecs won't even exist in hardware, to the point where you start doing software decoding (very inefficient).
AI inferencing is mostly remote (at least the heavy loads) in a datacenter because local availability of hardware is pretty hit and miss, models are pretty big and spinning one up every time you just wanted to ask something is not very user friendly. Because in a datacenter you tend to pay for amperage per rack, you spec your AI inferencing hardware to eat that power since you're not saving any money or hardware life when you don't use it. That means that efficiency is important (more use out of a rack) but scaling/idling isn't really that big of a deal (but it has slowly dawned on people that burning power 'because you can' is not really a great model). That AI inferencing in a datacenter is more power-hungry as a result, because they can, because it is faster, and that's what attracts users.
I would estimate that the local llama3 inferencing uses less power than when done in a datacenter, because there simply is less power available locally (try finding an end-user device that is used mass-market with enough power available, you won't; only small markets like gaming PCs and workstations will do).
semiquaver
20 Wh for 30 minutes of hardware accelerated h265 decoding is an order of magnitude too high at any bitrate. Please cite your sources.
oneplane
As I wrote in my reply, I don't have "sources".
Pure decode excluding any other requirements is probably pretty low, but running a decoder isn't all you need. There's network, display, storage and RAM so your OS can run etc. There will probably be plenty of variation (brightness, environment, how you get your stream in since a 5G modem is probably going to be different energy-wise compared to WiFi or Ethernet), and if you have something like a decoder in the CPU or in the GPU and if that GPU is separate, more PCIe involvement etc. But we can still estimate:
Hardware decoding (1080p video): ~5–15 W for the CPU/GPU
Overall system power usage (screen, memory, etc.): ~25–45 W for a typical laptop.
Duration (30 minutes): If we assume an average of 35 W total system power, the energy consumption is:
Energy = 35 W × 0.5 hours = 17.5 Wh
We can do a similar one for inference, also recognising you'll have variations either way:
CPU inference: ~50 W. GPU inference: ~80 W. Overall system power usage: ~70–120 W for a typical laptop during LLM inference.
Duration (30 minutes): Assuming an average of 100 W total system power:
Energy = 100 W × 0.5 hours = 50 Wh
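The two estimates above boil down to Energy (Wh) = average power (W) × duration (h). A tiny script, using the rough laptop wattages assumed above (they are guesses, not measurements):

```python
# Energy (Wh) = average system power (W) x duration (h).
# The wattages are the rough laptop estimates assumed above, not measurements.

def energy_wh(avg_power_w: float, hours: float) -> float:
    """Return energy in watt-hours for a constant average power draw."""
    return avg_power_w * hours

video_wh = energy_wh(35, 0.5)       # ~35 W system power while hardware-decoding H.265
inference_wh = energy_wh(100, 0.5)  # ~100 W system power during local LLM inference

print(f"30 min video decode:  {video_wh} Wh")       # 17.5 Wh
print(f"30 min LLM inference: {inference_wh} Wh")   # 50.0 Wh
print(f"Ratio: {inference_wh / video_wh:.1f}x")     # ~2.9x under these assumptions
```

Swapping in different power draws (e.g. the sub-1 W dedicated decoders mentioned below) only widens the gap in inference's disfavour.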
We could pretend that our own laptop is very good at some of these tasks, but we're not talking about the best possible outcome; we're talking about the fact that there is a difference between decoding a video stream and doing LLM inference, and that the difference is big enough to make someone's point that video streaming is somehow 'worse' or 'as bad as' LLM usage moot. Because it's not. LLM training and LLM inference eat way more energy.
Edit: looking at some random search engine results, you get a bunch of reddit posts with screenshots from people asking where the power consumption goes on their locally running LLM inferencing: https://www.reddit.com/r/LocalLLaMA/comments/17vr3uu/what_ex...
It seems their local usage hovers around 100 W. Other similar posts hover around the same, but it seems to be throttle-based, as machines with faster chips also throttle around the same power target while delivering better performance. Most local setups use a quantised model, which is less resource-hungry; the cloud-hosted models tend to be much larger (and thus more hungry).
Edit2: looking at some real-world optimised decoding measurements, it appears you can decode VP9 and H.265 on 1 year old hardware below 200mW. So not even 1W. That would mean LLM inferencing is orders of magnitude more power hungry than video decoding. Either way: LLM power usage > Video Decode power usage, so the article trying to put them in the same boat is nonsense.
simonw
"I would estimate that the local llama3 inferencing uses less power than when done in a datacenter, because there simply is less power available locally"
Is this taking into account the fact that datacenter resources are shared?
Llama 3 on my laptop may use less power, but it's serving just me.
Llama 3 in a datacenter or more expensive, more power-hungry hardware is potentially serving hundreds or thousands of users.
liontwist
Luckily we don’t have to do such a calculation. All this energy use will be factored into cost which tells us which is using more resources.
bdndndndbve
Ah yes high tech, an industry where there's famously no weird distorting influence from VCs subsidizing unprofitable business models to grab market share.
liontwist
It doesn't matter what you are paying for; someone is paying for it, and that economizing force is always putting pressure on it.
changoplatanero
If using ChatGPT somehow saves you from making one trip to the doctor in your car, it can offset an entire year's worth of ChatGPT usage in terms of CO2 impact.
yapyap
if your use of chatgpt saves you from a trip to the doctor I would be very concerned
kaonwarb
Early days, but not as crazy as it sounds: https://jamanetwork.com/journals/jamanetworkopen/fullarticle...
"The LLM alone scored 16 percentage points (95% CI, 2-30 percentage points; P = .03) higher than the conventional resources group."
asmor
Would be much more interesting if this ranked based on severity of misdiagnosis. An LLM that is 50% better at diagnosing a common cold but missed sepsis 10% more often would not be an overall improvement.
dragonwriter
ChatGPT is probably adequate to provide a slightly more user-friendly, but also slightly less reliable, replacement for a reliable consumer-oriented medical reference book or website, for the task of determining whether self-care without seeing a doctor, or seeing a doctor, is appropriate for symptoms not obviously posing an immediate emergency.
ekianjo
Most doctor visits are for benign matters...
BobaFloutist
The point of the doctorate is for them to make that determination.
croes
If ChatGPT somehow makes you eat more burgers, it could make water consumption worse.
hexage1814
To me the "ChatGPT is destroying the environment "card always felt somewhat like bad faith arguing from the anti-AI crowd trying to find any excuse for being against AI. Like, the same people who complained about "using AI is destroying environment" seemed to have no issue with boarding a plane which would emit a bunch of CO2 so that they can have a vacation in Europe or the like. Selective environmentalism.
drawfloat
Who is this person you’re constructing? Being concerned about plane emissions and travel is an incredibly common thing and people are adjusting their lifestyles accordingly - lots of overnight sleeper train lines are reopening due to demand.
noirbot
I mean, it's a literal net new expenditure of power and water. I also deeply doubt they have "no issue" with plane travel. You're just assuming the worst and most hypocritical position to someone, which seems deeply bad faith as well.
It's literally true that most of the AI vendors and their data center partners are writing off energy and water conservation targets they'd had for the near future because of LLM money. That is actually bad in the short and likely long term, especially as people use LLMs for increasingly frivolous things. Is it really necessary to have an LLM essentially do a "I'm Feeling Lucky" Google Search for you at some multiple of that environmental cost? Because that's what most of my friends and coworkers use ChatGPT for. Very rarely are they using it to get anything more complex than just searching wikipedia or documentation you could have had bookmarked.
A person has a choice of if they take a flight and if it's worth it for them. They have no power except for raising a complaint in public on if OpenAI or Google or whoever spends vast amounts of money and power to train some new model. If your bar is that no one is allowed to complain about a company burning energy unless they live a totally blameless life farming their own food without electricity then random companies will get to do any destructive act they want.
minimaxir
It's less bad faith, more a meme that has become so prevalent that it's impossible to dispel, as it's something too nuanced for social media. I've seen more than a few social media posts asking "do they really cut down a rainforest every time someone generates an AI image?"
rainonmoon
I mean we can go back and forth all day with credulous clowns on both sides. There are certainly plenty of misguided AI advocates who think what they're using is magic.
croes
You are talking about hypocrites.
What about the people who take the protection of the environment seriously?
They have now got a setback, because not only did we fail to reach our previous goals on lowering energy consumption, but we have put new consumption on top of that. Just because the existing consumption is worse doesn't make the new one good.
There is a reason why MS missed its CO2 targets and why everyone is in search of more energy sources.
They all create more CO2.
Yet we also see that hyperscale cloud emissions targets have been reversed due to AI investment, that datacenter growth is hitting grid capacity limits in many regions, and that peaker plants and other non-renewable resources on the grid are being deployed more to handle this specific growth from AI. By qualifying the claims as being about "ChatGPT", the author can maybe make them stick, but I don't believe the larger argument holds for AI as a whole, or once you convert the electricity use to emissions.
I'm personally on the side that the ROI will probably work out in the long run, but not by minimizing the potential impact; we should keep the focus on how we can make this technology (currently in its infancy) more efficient. [edit: wording]