this post was submitted on 30 Jul 2024
174 points (94.4% liked)

Fuck AI

One million Blackwell GPUs would suck down an astonishing 1.875 gigawatts of power. For context, a typical nuclear power plant only produces 1 gigawatt of power.

Fossil fuel-burning plants, whether that's natural gas, coal, or oil, produce even less. There's no way to ramp up nuclear capacity in the time it will take to supply these millions of chips, so much, if not all, of that extra power demand is going to come from carbon-emitting sources.
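
As a rough sanity check of the excerpt's numbers, here's a back-of-envelope sketch in Python. The 1.875 GW and 1 GW figures come from the article; the per-GPU draw and annual energy are derived from them, not independently sourced.

```python
# Back-of-envelope check of the figures above (derived, not independently sourced).
GPU_COUNT = 1_000_000
TOTAL_POWER_GW = 1.875      # article's figure for one million Blackwell GPUs
NUCLEAR_PLANT_GW = 1.0      # article's "typical" nuclear plant output

watts_per_gpu = TOTAL_POWER_GW * 1e9 / GPU_COUNT     # implied draw per GPU
plants_equiv = TOTAL_POWER_GW / NUCLEAR_PLANT_GW     # nuclear plants' worth of power
annual_twh = TOTAL_POWER_GW * 24 * 365 / 1000        # GW -> GWh/year -> TWh/year

print(f"Implied draw per GPU: {watts_per_gpu:,.0f} W")    # ~1,875 W
print(f"Equivalent nuclear plants: {plants_equiv:.2f}")   # ~1.88
print(f"Energy if run 24/7: {annual_twh:.1f} TWh/year")   # ~16.4 TWh
```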

top 28 comments
[–] breadsmasher@lemmy.world 109 points 3 months ago (1 children)

NVidia designing, building and selling these sorts of cards with astronomical power usage? I get it. They want to stay at the top.

But those buying these cards at least need to be taxed, charged, regulated, whatever, to make sure the huge additional power they require is paid for by those companies, comes only from green/renewable energy sources, and doesn't use the clean drinking water communities need for cooling.

If companies want to run massive amounts of hardware like this, it should be prohibitively expensive unless they build their own GREEN power stations, and find ways to cool without using drinking water from any community.

At the moment, taxes and government money go into power stations which these DCs then use. All the cost is pushed right down onto the everyday taxpayer and consumer. But all the profit flows upwards.

Make them pay for what they use. Make them pay to make these cards efficient, clean, and safe for our environment. It's not like these trillion dollar companies couldn't pay for it all and make the world a better place.

[–] pennomi@lemmy.world 32 points 3 months ago (1 children)

Using tons of energy isn’t a problem, as long as it’s carbon neutral (or negative). The problem is that we are simply not there yet. Taxing carbon is a great solution and would nearly immediately fix the problem (on the scale of years, not decades).

[–] ptz@dubvee.org 40 points 3 months ago* (last edited 3 months ago) (1 children)

Using tons of energy isn’t a problem, as long as it’s carbon neutral (or negative).

That energy should go to more useful-to-society purposes, first. If all the "AI" datacenters are running on green power and the rest of us are still burning coal, then that's green power that's still wasted. It's even more of a slap in the face if taxpayer funds go toward the costs of building any of those single-purpose green energy projects.

[–] Kecessa@sh.itjust.works 19 points 3 months ago (1 children)

Bingo, it's the same argument I've had to bring up with crypto bros: if it's using green energy that could instead power essential things currently running on fossil fuels, then it's green energy we're wasting. All these projects should be put on hold until we're running on 100% green energy and producing enough surplus that we can afford to use it for non-essentials.

[–] bizarroland@fedia.io 2 points 3 months ago (1 children)

Seems like the smart move would be a mandate that any AI data center that wants to come online has to provide its own power from wind and solar, or at least build or contribute to an existing wind and solar plant on the same grid to offset its estimated power consumption watt for watt.

[–] Kecessa@sh.itjust.works 5 points 3 months ago (1 children)

Same issue: why shouldn't we use that capacity to replace fossil fuels elsewhere instead? The land they'd use to produce green energy could produce that same energy for essential needs instead.

[–] bizarroland@fedia.io 2 points 3 months ago (1 children)

I see what you're saying, and I get that I'm in the fuck AI magazine.

[–] Kecessa@sh.itjust.works 1 points 3 months ago

Funny, I hadn't even noticed myself 😂

[–] FlyingSquid@lemmy.world 39 points 3 months ago (1 children)

I've had multiple people on Lemmy tell me that the amount of energy LLMs use will be trivial. They always base it on the amount of energy used to train the LLMs, not the millions (billions? trillions?) of calculations those LLMs have to do every second they're used by who knows how many people 24 hours a day.

Then you bring up the water wasting and the best they can do is say something like, "okay, that's a problem... but only in some places!"

(Some places including much of the United States. Guess where lots of the data centers are?)

[–] sunstoned@lemmus.org 0 points 3 months ago* (last edited 3 months ago) (1 children)

I don't disagree, but it is useful to point out there are two truths in what you wrote.

The energy use of one person running an already trained model on their own hardware is trivial.

Even the energy use of many many people using already trained models (ChatGPT, etc) is still not the problem at hand (probably on the order of the energy usage from a typical search engine).

The energy use in training these models (the appendage measuring contest between tech giants pretending they're on the cusp of AGI) is where the cost really ramps up.

[–] FlyingSquid@lemmy.world 1 points 3 months ago (1 children)

(probably on the order of the energy usage from a typical search engine).

I find that hard to believe. Search engines just regurgitate what is in a database. LLMs have to do calculations to create the sentences they produce. That takes more energy.

[–] sunstoned@lemmus.org 1 points 3 months ago* (last edited 3 months ago)

Believe what you will. I'm not an authority on the topic, but as a researcher in an adjacent field I have a pretty good idea. I also self host Ollama and SearXNG (a metasearch engine, to be clear, not a first party search engine) so I have some anecdotal inclinations.

Training even a teeny tiny LLM or ML model can run a typical gaming desktop at 100% for days. Sending a query to a pretrained model hardly even shows up on HTop unless it's gigantic. Even the gigantic models only spike the CPU for a few seconds (until the query is complete). SearXNG, again anecdotally, spikes my PC about the same as Mistral in Ollama.
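
For anyone who wants to turn that kind of eyeballing into a number, here's a minimal sketch of the estimate, assuming a made-up average package power (proper power counters or a wattmeter would be the honest way to measure it):

```python
# Hypothetical rough estimate: wall-clock time of one query times an assumed
# average package power. Not a real measurement, just the kind of eyeballing
# described above, turned into a number.
import time

ASSUMED_AVG_POWER_W = 65.0   # assumption: average CPU/GPU draw while the query runs

def estimate_query_energy_wh(run_query) -> float:
    """Crude energy estimate in watt-hours for a single query."""
    start = time.perf_counter()
    run_query()               # e.g. a call to a local model or a metasearch query
    elapsed_s = time.perf_counter() - start
    return ASSUMED_AVG_POWER_W * elapsed_s / 3600.0

# A 5-second local-model response at ~65 W works out to roughly 0.09 Wh.
```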

I would encourage you to look at more explanations like the one below. I'm not just blowing smoke, and I'm not dismissing the very real problem of massive training costs (in money, energy, and water) that you're pointing out.

https://www.baeldung.com/cs/chatgpt-large-language-models-power-consumption

[–] kn0wmad1c@programming.dev 29 points 3 months ago (1 children)

We should start celebrating efficiency. Let's name a planet after the scientist who discovers how to power these cards using 60% less energy.

[–] Chocrates@lemmy.world 4 points 3 months ago (3 children)

Room temperature superconductors will save us, if they exist.

[–] ICastFist@programming.dev 14 points 3 months ago (1 children)

I'm fully expecting AI bros to say AI will create a working room-temp superconductor design

[–] bizarroland@fedia.io 11 points 3 months ago (1 children)

I'm sure that if you asked any AI they would give you recipes for room temperature superconductors.

I asked an AI what lava would feel like if you took the heat out of it, and it told me, but then it asked if I would be interested in some delicious lava recipes.

So I said yes.

[–] teamevil@lemmy.world 4 points 3 months ago

And? I need to know if I'm preparing my lava correctly.

Magma

[–] Fermion@feddit.nl 2 points 3 months ago

Resistive heating is not the dominant energy loss mechanism in modern computing. Since the advent of field effect transistors, switching losses dominate. Room temperature superconductors could be relevant in power generation, distribution, and manufacturing, but would not radically alter the power requirements for computing.
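
For a rough sense of why: CMOS dynamic power scales roughly as P = α·C·V²·f, which depends on switched capacitance, supply voltage, and clock frequency rather than wire resistance. A toy sketch with made-up numbers:

```python
# Toy illustration of CMOS dynamic (switching) power: P = alpha * C * V^2 * f.
# Every value here is made up for the example; none are real chip figures.
alpha = 0.1        # activity factor: fraction of the capacitance switched per cycle
c_farads = 1e-9    # total switched capacitance (1 nF, assumed)
v_volts = 0.8      # supply voltage
f_hertz = 2e9      # clock frequency (2 GHz)

dynamic_power_w = alpha * c_farads * v_volts**2 * f_hertz
print(f"Dynamic power: {dynamic_power_w:.2f} W")   # ~0.13 W; no resistance term involved
```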

I personally don't think any possible room temperature superconductors would be economical to produce at a large enough scale to make a real difference in energy demands. Researchers have pretty thoroughly investigated the classes of materials that are easy to manufacture, which suggests a room temperature superconductor would be prohibitively expensive to produce.

[–] shalafi@lemmy.world 1 points 3 months ago (1 children)

Been waiting since the 90s, about given up hope.

[–] Chocrates@lemmy.world 1 points 3 months ago

The one last summer broke me. I have a healthy skepticism of any announcement, but that one seemed so credible I bought in.

[–] user1234@lemmynsfw.com 20 points 3 months ago (1 children)

You only need 1.21 gigawatts to go back to the future in a DeLorean.

[–] neclimdul@lemmy.world 6 points 3 months ago (2 children)

I think you mean 1 point 21 jigowatts

[–] tektite 2 points 3 months ago

Jigga what?

[–] taxon@lemmy.world 2 points 3 months ago (1 children)

"I always figured that the word “Jigawatt”was made up just for the movie and meant to sound like a really large amount. It wasn’t unit I started researching this flux capacitor replica project that I stumbled across a few references to the actual term. It turns out that the original pronunciation of “Giga” was with the “j” sound (really a soft “g”)." Here Check out Merriam Webster's pronunciation

[–] neclimdul@lemmy.world 2 points 3 months ago

Yeah pretty neat. I spelled it out because I doubt many people in this day and age would pronounce it that way normally.

[–] fubarx@lemmy.ml 11 points 3 months ago (1 children)

Nvidia could announce a side-gig dedicated to fossil-free, local power generation. Get your money coming and going.

[–] BallsandBayonets@lemmings.world 3 points 3 months ago (1 children)

Boil water with the heat coming off their GPUs, use the steam to turn turbines, use the generated electricity to power the GPUs! Free unlimited power! Why hasn't anyone gotten in on this yet?

/s