
The GPU shortage is over

261 comments · July 1, 2022

liketochill

>Reportedly, one big reason used GPU prices are tanking is because crypto miners are flooding the market with cards that are no longer profitable due to the ongoing crypto crash, where the total market cap of all crypto assets has fallen by two-thirds since its peak of $3 trillion last November. (Miners have been switching away from GPUs for some time, though.)

What to do with all that computing power now. I guess AI.

baal80spam

I'm not an expert but I wouldn't want to buy a second-hand GPU that was stressed by crypto mining.

moffkalast

The damage is actually comparable to or lower than that from normal usage, since they run at a constant temperature, which is less stressful, and tend to be undervolted for a better hashrate per dollar, so the interconnects shouldn't be eroded much. At least according to that one LinusTechTips test.

torginus

Yeah, I don't necessarily agree. I assume many miners didn't run these cards in temperature controlled server rooms, but in places like abandoned warehouses in south-east Asia, with high ambient temps and humidity, which can destroy electrical components.

And even if they did, the VRMs on these cards are full of electrolytic capacitors, which degrade over time due to heat, chemical reactions between the plates and the electrolyte, and evaporation of the electrolyte. They not only age, they age faster over time as the degraded capacitors are put under increasing stress.

Besides, the assumption that these cards were run underclocked and undervolted might not hold - if the folks running them obtained the electricity illegally, or incredibly cheaply, it might have made more sense to run the cards hotter.

Everything considered, these second-hand GPUs are not even that cheap: I saw 5700 XTs going for $250 on eBay, while a comparable brand-new 6600 costs about $300.

Edit: The fans also have suffered serious wear and tear and are likely to fail. Considering these are often custom, you might not even be able to replace them.

xxs

The GPU core doesn't get stressed much, of course. Memory, however, is another issue - the RTX 3090 being an absolute disaster (effective cooling on only half of the memory chips). HBM cards are likely in a similar boat.

The (large) GPU mining operations also run at rather high ambient temperatures.

ineedasername

I don't know anything about component wear patterns, so a question: would a peak constant temp cause less wear than moderate fluctuations? (And is it fast heating/cooling cycles that cause physical stress or is it some other mechanism of action that causes the wear?)

more_corn

But they’re put together by amateurs, in non-static safe environments, operated with questionable cooling…

tehbeard

I've seen some posts/tweets about cards exiting the crypto market where failed RAM chips have just been yanked off the board, since crypto just needs cores, not lots of VRAM. I wouldn't trust one unless it were very, very cheap, to lower my risk.

fortran77

I really wouldn't trust them to have been run with adequate cooling, or to not have been run at higher clockrates, etc.

KennyBlanken

There are different problems.

Thermal cycling, common in gaming use, causes mechanical failures within chips, between chips and the board, and in the board itself - but electrolytic capacitor lifetime drops dramatically as temperature goes up.

There's not much thermal cycling in desktop systems that see mostly "productivity" use and have proper ventilation.

runnerup

Crypto miners actually often run the cards at low wattage and low heat, because the top 5% of performance requires something like 30% of the energy. So you can reduce your marginal cost (the electricity bill) by 30% while only decreasing your revenue (mining output) by 5%.

Gamers are the ones destroying GPU's, not professional miners who tune their systems properly.
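
A toy calculation, with entirely made-up numbers, shows the shape of that trade-off (the rates and prices below are hypothetical, not from any real rig):

    # Hypothetical figures only, to illustrate the undervolting trade-off described above.
    def daily_profit(hashrate_mhs, power_watts, usd_per_mhs_day=0.03, usd_per_kwh=0.10):
        """Mining revenue minus the electricity bill, per day."""
        revenue = hashrate_mhs * usd_per_mhs_day
        energy_cost = (power_watts / 1000) * 24 * usd_per_kwh
        return revenue - energy_cost

    stock = daily_profit(hashrate_mhs=100, power_watts=300)      # full power limit
    undervolt = daily_profit(hashrate_mhs=95, power_watts=210)   # -5% hashrate, -30% power

    print(f"stock:     ${stock:.2f}/day")      # $2.28/day
    print(f"undervolt: ${undervolt:.2f}/day")  # $2.35/day, with far less heat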

gaudat

But they stress the hell out of the VRAM, because mining is limited by memory bandwidth. There are reports of second-hand video cards from miners getting rendering artifacts after only a few days of gaming.

caymanjim

I don't think anyone is suggesting you should buy a used one. The fact that crypto miners are dumping them likely indicates that they aren't going to buy up the new ones. It could mean they're just rotating stock, but given crypto prices, it more likely means they're giving up and dumping assets. This should free up new stock for the rest of us.

I'd buy a used one if the price were commensurate with the risk. Say, 20% of retail.

RedShift1

Is there any evidence that chips degrade in such a way? I've got servers that are 8 years old, ran full bore, and still work fine.

magicalhippo

> Is there any evidence that chips degrade in such a way?

As far as I can see, the answer is yes[1]:

> The relationship between integrated circuit failure rates and time and temperature is a well established fact. The occurrence of these failures is a function which can be represented by the Arrhenius Model. Well validated and predominantly used for accelerated life testing of integrated circuits, the Arrhenius Model assumes the degradation of a performance parameter is linear with time and that MTBF is a function of temperature stress.

> However, the dramatic acceleration effect of junction temperature (chip temperature) on failure rate is illustrated in a plot of the above equation for three different activation energies in Figure 2. This graph clearly demonstrates the importance of the relationship of junction temperature to device failure rate. For example, using the 0.99 ev line, a 30° rise in junction temperature, say from 130°C to 160°C, results in a 10 to 1 increase in failure rate.

Wikipedia has an overview of the Arrhenius equation[2].

Now, as you probably know, most complex chips like GPUs and CPUs have built-in thermal management which prevents the junction temperature from rising above some limit, which one would assume is set at a point that reasonably guarantees a decent lifetime.

However, according to this, chips that have experienced less heat should, on average, live longer.

[1]: https://www.ti.com/lit/an/snva509a/snva509a.pdf

[2]: https://en.wikipedia.org/wiki/Arrhenius_equation
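
A minimal sketch of the acceleration factor implied by the Arrhenius model, using the 0.99 eV activation energy quoted above (standard formula; the exact factor depends on which constants you plug in):

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def arrhenius_acceleration(t_low_c, t_high_c, activation_energy_ev):
        """Ratio of failure rates between two junction temperatures (in Celsius)."""
        t_low_k = t_low_c + 273.15
        t_high_k = t_high_c + 273.15
        return math.exp(activation_energy_ev / BOLTZMANN_EV * (1 / t_low_k - 1 / t_high_k))

    # The 30-degree rise cited above (130 C -> 160 C) at Ea = 0.99 eV
    print(arrhenius_acceleration(130, 160, 0.99))  # ~7x with these constants; same ballpark as the quoted 10:1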

nikau

From my experience, pretty much all failures in modern electronics are due to one of:

  * Dried-up electrolytic capacitors

  * Bad solder joints, exacerbated by lead-free solder

  * Failed high-current transistors (typically regulation MOSFETs, or triacs in appliances)

Failure of low-current-handling chips is pretty rare.

walrus01

More concerning is the wear and tear on the fan bearings; also, the thermal interface between heatsink and board will often be dried out and in poor condition.

Some GPUs are rather hard to get a replacement fan for, because it's not like they use an off-the-shelf modular 60 or 80 mm 12 VDC fan.

digitallyfree

The servers are built and rated for 24/7 operation, unlike consumer GPUs. This includes mechanical as well as power components.

pclmulqdq

They do, mostly due to thermal expansion and contraction, fan wear, and thermal paste degradation on the card. Even so, this has a very small effect.

derefr

Why not? Embarrassingly-parallel problems (of the kind you'd use a GPU for) are a perfect use-case for enhancing fault-tolerance through redundancy (i.e. running your program on three cards at once and only accepting the output if they all agree.)

The only place this doesn't apply is gaming, where you're trying to get realtime interactive results while economizing on both cost and energy. But for every other use of a GPU, you can't go wrong with "throwing more compute at the problem."
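
A rough sketch of that agreement check (run_on_gpu here is a hypothetical stand-in for whatever actually launches your computation, and results are assumed to be hashable, e.g. the bytes of the output buffer):

    from collections import Counter

    def run_redundantly(job, run_on_gpu, device_ids=(0, 1, 2)):
        """Run the same job on several possibly-flaky cards and keep the majority answer."""
        results = [run_on_gpu(job, d) for d in device_ids]
        winner, votes = Counter(results).most_common(1)[0]
        if votes < 2:  # no two cards agree: treat the batch as suspect
            raise RuntimeError(f"GPUs disagree: {results!r}")
        return winner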

jayd16

I mean, if the price is right, why not? I wouldn't want to pay anywhere near full price, though.

pojzon

If the price is like 20% of retail, why not.

If it's higher, or close to retail, why buy a used GPU?

Let the miners literally drown in debt.

wyager

Constant-on is typically less harmful to the chips than on-and-off. Electromigration or whatever is less of a risk than solder joints going bad.

WithinReason

Lots of info on this here:

https://www.youtube.com/watch?v=1T0npiqjEWQ

Apparently GDDR6X and HBM memory might degrade from Ethereum mining.

pjc50

> where the total market cap of all crypto assets has fallen by two-thirds since its peak of $3 trillion last November.

This is why it's important to keep battering the crypto advocates any time they promote anything linked to proof-of-waste. Cryptocurrency is the "paperclip maximiser" that will consume scarce energy and hardware manufacturing resources, and turn them into "grey goo" which people have been conned into believing is worth something.

JonathanFly

>What to do with all that computing power now. I guess AI.

For AI, the lack of memory is pretty limiting.

Having a ton of 8 GB or 10 GB GPUs (3070/3080) is annoying and often just not even possible to use effectively without designing specifically for them. If they get really, really cheap, maybe there will be tools or models built to leverage scenarios like that - it's not impossible - but for now it's probably just the cheap 3090s (24 GB of memory) that will benefit AI.

dannyw

I picked up a cheap ex-mining 3090 last week. Amazing for an ML hobbyist. It works out so much cheaper over time than paying for cloud GPUs.

A couple weeks of training and I break even.
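
Back-of-the-envelope version of that break-even, with assumed prices (neither the card price nor the cloud rate below comes from the comment above):

    # All figures hypothetical: an assumed used-3090 price and a typical on-demand cloud GPU rate.
    used_3090_usd = 700            # assumed second-hand price
    cloud_gpu_usd_per_hour = 2.00  # assumed on-demand rate for a comparable instance
    electricity_usd_per_hour = 0.35 * 0.12  # ~350 W at $0.12/kWh

    breakeven_hours = used_3090_usd / (cloud_gpu_usd_per_hour - electricity_usd_per_hour)
    print(f"break even after ~{breakeven_hours:.0f} hours of training")  # ~360 h, i.e. a couple weeks of 24/7 use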

teeray

Where did you shop for used ex-mining cards?

MonkeyMalarky

Feels like distributed model training would be the ultimate yak-shaving exercise for training ML models at home. Sure you can do it, the cost of the lower-memory cards is super attractive, and you'll definitely learn lots of cool things - and, after all that effort, you will have made exactly no progress on the original task you set out to do. Kind of like the choice between making a video game or a game engine.

somethoughts

Seems like it'd be a decent time to start some version of an indie "cloud gaming as a service" company similar to Nvidia GeForce Now or Google Stadia.

It perhaps isn't the end of the world if your gaming session errors out due to the service being based on used crypto mining GPU cards - especially as the service subscription cost approaches $0/month.

fomine3

Nvidia disallows use of GeForce cards in datacenters.

hyperhopper

Doesn't that violate first sale doctrine?

keithalewis

Ooh, ooh. T-cool. Let's call the game Nero's Fiddle.

alexklarjr

>AI startups are flooding the market with cards that are no longer profitable due to the ongoing hype fade, now back to VR news…

myself248

I should buy some Fuzzy Logic domain names in case that particular hype cycle comes back around.

Sakos

What about gaming? People have been waiting to be able to buy affordable GPUs for a while now.

Bigpet

The DIY desktop gaming tower market has got to be pretty small by now.

Margins on pre-builts have been shrinking to almost nothing. I build my own PCs because I see it as a hobby, but I could hardly recommend it nowadays to someone who's looking for a machine, not a new hobby.

Sakos

There are a lot of people looking to upgrade their prebuilts, for example. It's far cheaper than buying a new system.

You say it like people didn't buy GPUs before crypto. What do you think happened to all those GPUs AMD and nVidia have made every year for the past decade or so?

mkoubaa

Computational physics. We have a lot of atom based engineering to work out this decade.

maxmalysh

> What to do with all that computing power now. I guess AI.

CEO of Kryptex here. We have an answer:

https://www.kryptex.com/en/rent-cloud-gpus

chrisco255

Not mentioned in the article: Ethereum will be transitioning to proof of stake in the next few months and the large majority of GPU-based mining will go away entirely. Probably will be some great deals on video cards later this year.

DarmokJalad1701

> Ethereum will be transitioning to proof of steak in the next few months

Just in time for the beef shortage!

tuankiet65

I'd love to be paid to hoard steaks.

copperx

Why? You would never be able to eat them.

benreesman

ETH2 has been 6 months away for 6 years.

googlryas

Sure, but when asset prices have crashed is probably when resistance to PoS is weakest? So now is the time to spring it upon them.

dereg

ETH 2.0 hasn't been held off due to resistance to PoS. It's been held off because there's no ETH 2.0 to ship yet.

chrisco255

There is no meaningful resistance to proof of stake in Ethereum. Users, devs, etc all on board. It has been part of the roadmap for years. Changing consensus is not a light upgrade for a blockchain. The chain needs to maintain 100% uptime in spite of the switch.

thorw73m

welcome to software development

rowanG077

Ethereum transitioning to proof of stake "soon" has been said for a long, long time now. I'll believe it when I see it.

DennisP

Fair enough, but none of the following was true until recently:

The production proof-of-stake network has been running in parallel for a year and a half, with 10% of all ETH locked up in it.

What's left is the merge with the rest of the network. They've tested the merge on every combination of five execution clients and five staking clients.

They've done about a dozen large test merges on copies of the live network.

They've done the merge on a large long-running testnet used by applications, block explorers, exchanges, etc. Essentially it was a practice run by the whole ecosystem. There were a few minor glitches but it worked well enough to be a success if it'd been the live network.

There are two major testnets left, and if those merge successfully then the next step is production.

candiddevmike

Finally we'll get to see what plutocracy as code looks like

brobinson

There's a countdown and difficulty bomb: https://wenmerge.com/

swinglock

There have been many.

alexklarjr

Hasn't it been doing so since 2019?

ck_one

The first testnets have transitioned to PoS already. It's gonna happen soon.

yourad_io

Did they have a successful test? The last news I caught had 1/3 failures or something like that

lmm

2015 was the originally announced date I think.

chrisco255

It was just part of the roadmap back then. It wasn't an imminent upgrade for which live testnet merges were being conducted.

swalsh

They've merged on the test net, so the August timeline is sounding realistic.

redox99

Actually the ideal timeline is for late September

CodeBeater

I heard that it was the plan from its inception.

razemio

This. Every single article I have read so far is missing this VERY important piece of information. I am still mining ETH and it is still profitable. I mine with 4 1080s and I do not pay for power. However, if I had 200+ GPUs I would slowly start selling before prices tank due to the merge. Also, more dedicated mining hardware is hitting the market for ETH, which is far more efficient than GPU mining.

bootloop

> it is still profitable

> I do not pay for power

I guess you don't run your own solar farm, so I assume someone else is paying for that without knowing it.

Faaak

To play devil's advocate, I indeed have solar panels at home (7 kW). I use my graphics cards only when it is sunny. However, I don't mine ETH but instead contribute to Folding@Home.

drumhead

>Do not pay for power

Hmmmm, I detect a flaw in your plan....

walrus01

If anyone does buy a mining-used 3060 or something, I would strongly recommend learning how to re-paste the thermal interface between the heatsink and the chips mounted on the PCB. There are some good YouTube examples out there for people who've never done it before.

A decent syringe of heatsink paste is $7 on Newegg or Amazon.

nyanpasu64

Does removing and reinstalling a heatsink damage the thermal pads, requiring buying new ones ($10 of thermal pads doesn't last as long as $10 of thermal paste)?

CodeBeater

In my opinion, thermal pads are better suited for components that need just that little extra bit of heat dissipation to perform adequately, i.e. components that are < 10 °C from being able to be passively cooled. Voltage regulators come to mind.

So in that instance, if the thermal pads are reasonably well installed, the components should be just fine, since we're working with much looser tolerances.

When you need active cooling, the manufacturing complexity increases to the point where the convenience doesn't really make sense, since you have to get it right, so thermal paste is the way to go.

exmadscientist

Pads are preferred over paste for two main reasons: low assembly cost and ability to fill gaps. Both are very important. If you have an application using a pad, and it's flat enough to replace with (good) paste, you will always see a performance improvement.

I do not ever re-use pads. I have a used pad on my desk at work that shows an indent of the chip that was pressed into its other side... and you can read the part number and lot code laser marked into the chip. If the pad conforms to that but then can't relax back... it's never going to make good contact again. Full stop. (Granted, this was an extremely expensive thick gap-filler pad [TG-A9000, I think] for a particular application, and performance will vary.)

walrus01

Thermal pads in general are worse than properly applied paste, if you learn to apply paste smoothly and accurately.

There's a reason you won't see thermal pads as the interface between a CPU's heat spreader and the underside of a skived copper heatsink in a high-wattage-per-socket, dual-CPU 1U or 2U server, where the heatsink has to be really efficient... it'll be factory-applied paste.

ollien

I think the question was more about the thermal pads on things like the VRMs. My understanding (though full disclosure, I've never disassembled a GPU myself) is that as long as you take care not to tear them, they should be fine.

KennyBlanken

For over a month the local microcenter has had hundreds of RTX 3xxx series GPUs, and that's just from the public numbers, which are capped at quantity 25.

Nobody is buying these cards. One, inflation. Two, everyone who really wanted them already bought. Three, people have realized that they can just be careful about graphics quality settings and save themselves hundreds of dollars.

The newer cards are a shitty deal, frankly, at the low end. A new 3060 at retail costs twice as much as a used 1070 Ti that will perform nearly identically and even has nearly the same power draw (you really have to hand it to NVIDIA for having nearly the same performance per watt across a half-decade spread).

Is DLSS nice? Yeah. Is RTX pretty? Yeah. Is it twice-the-cost nice/pretty? Nope.

NVIDIA is going to have to trickle the 4000 series cards out so slowly if they want to have any hope of not severely boning their resellers. My guess is that we'll see 4080's first to get the whales, then 4090's to get the whales again...and then the lower cards announced but released from storage in very, very small quantities.

cyber_kinetist

I checked to see if your claims were true (3060 = 1070 Ti)… and you're absolutely right!

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-Nvi...

Yeah, there doesn't seem to be any good reason to upgrade if you aren't aiming for the highest specs (which is most people). Even ignoring the crypto craze, there wasn't really that much tangible advance in GPU specs (and even when there was, it came with trade-offs in size, power consumption, and heat). Before, you were able to buy a GTX 970 for $400 to play all the latest games at the best settings and that was enough… and even to this day it's still a solid entry-level GPU.

jms703

Four, some people gave up trying.

kfarr

True that. I used to upgrade graphics cards every few years; now I've had mine for about 6 or 7, can't even remember. Can't even remember the model, some Nvidia thing. But it plays 90% of games fine, and that's more than a lifetime's worth of entertainment. I'll upgrade when my rig dies someday.

copperx

I'm just hoping 1050TIs become affordable again so I can build a really cheap gaming PC.

epakai

The cards I've been watching (Radeon Pro WX 2100/4100, and the RX 400/500 series) are just hitting the prices of two years ago on the used market. They were already used then. I'm just hoping prices settle to something acceptable by the end of the year. It's been crazy watching the price go up for these old models, though.

hexo

Just looked at prices in Germany... the 3090ti is still more than 2k euros, same for the Czech Republic (saw one for about 2700 EUR a few days ago) and Slovakia; it's very similar in France. So, no, I don't really see prices getting back to normal. Where are they? How can I get a new 3090ti for 1400 euros in Europe?

Aeolun

> How can I get a new 3090ti for 1400 euros in Europe?

Buy in the US, ship to Europe? Shipping and import tax combined would still seem less.

beebeepka

Not worth it. 1 year warranty is also a major issue

hexo

Well, not feasible at all; our customs officers are well known for holding your mail for a random amount of time, which can easily be 6 months or more.


exikyut

What if legitimate travelers entering the country were to bring the items in as part of their checked luggage?

I feel like there's a whole untapped market here where people could rent out X amount of space in their suitcases - with the obvious proviso that the item(s) would need to be completely opened (and maybe independently xrayed) first, to eliminate surprises.

razemio

https://www.mindfactory.de/product_info.php/24GB-MSI-GeForce...

1800€, and a very good one. Of course not yet at 1400, but it's getting there.

confident_inept

The GPU shortage is at the initial stages of being over, largely due to multiple facets of the cryptomining implosion that are right around the corner.

I still can't yet walk into a Best Buy or local electronics store and get one off the shelf, and online prices are still jacked above MSRP.

We'll see real action 6 months to a year from now.

7speter

Best Buy has weekly Nvidia Founders Edition drops on Thursday mornings, and last night they had a 3070 FE drop because it doesn't seem like enough people bought them on Thursday. You might not be able to get one off the shelf, but you can order one for pickup.

carom

You totally can buy them off the shelf right now. I saw them a few months ago in a Best Buy in Rancho Cucamonga; I just checked the site and I could pick up a 3060 in LA tomorrow.

Nursie

Here in Australia you pretty much can now, which is good.

alexklarjr

Last time, used mining cards were selling at 20-30% of MSRP, so just wait till prices hit bottom. The pandemic is over; nobody wants/can play games instead of life anymore.

yardie

I volunteer as a coach and we’re facing an enrollment drop due to the pandemic being over. When schools were closed we were considered one of the safer sports due to social distancing. Now, we’re losing kids to indoor online gaming and contact sports.

happyopossum

What sport do you coach?

yardie

Youth sailing.

We had kids flying in from all over the world to learn and compete. Other youth sports such as basketball, soccer, and football were effectively shut down.

zagrebian

People played games because life sucked even before the pandemic.

4gotunameagain

If your attitude towards life is that it sucks so I'll play computer games instead, it sounds like a negative feedback loop :)

turbonaut

I’m not sure if I’m reading it correctly, but if the reader has the inference / bias that playing games makes life suck, that would be a positive feedback loop (albeit with negative effects).

Negative feedback loops are self correcting. So if games make life better it would indeed be one.

netmare

Well, negative feedback loops are generally more useful than positive ones. At least in electronics...

Jowsey

> nobody wants/can play games instead of life anymore.

Steam alone is still peaking at 25,000,000 concurrent logged-in users https://steamdb.info/graph/

lmm

Right, and how does that number compare to 6 months or a year ago?

wolfgang42

If I’m reading the “Lifetime concurrent users on Steam” chart on https://steamdb.info/app/753/graphs/ correctly, it’s a slight increase: today 28.0M, 6mo ago 27.9M, 1y ago 25.2M

unethical_ban

I can't tell if you're being facetious or if you think no one games anymore.

copperx

OP is probably excluding children and teenagers.

ShamelessC

And adults born in the past 30-40 years?

patrulek

> Pandemic is over

Are you sure? The government started telling us (in Poland) that it will come back after summer, so the pandemic card is still on the table.

tehjoker

It's not actually over, but if your hobby is getting COVID over and over until your immune system blows out you do you.

qwerpy

Is an immune system more like a tire or a muscle when it comes to wear and tear? I’m an engineer so I have zero medical expertise, but I thought that antibodies make you more resilient to a particular disease and getting sick is the usual way to get antibodies.

tehjoker

The particular biology of SARS-CoV-2 is concerning. It has a superantigen, suppresses MHC-I, and is very fast and stealthy. Long COVID is very real. A recent study found vaccination only protects against it by 15%. Another major study found serial reinfection incurs significant risks with each infection.

A. Leonardi hypothesizes that naive T cells are protective. Older people have fewer. Hyperstimulation due to antigen will cause the process from naive to effector to proceed and over time deplete that population.

In any case, COVID is slightly better controlled with vaccination to the point where it doesn't outright kill you, but that doesn't mean that with a seatbelt I'm looking to crash my car on the reg. You don't walk away healthier each time.

thorncorona

It's endemic at this point. 2 years in we have vaccines, treatments, and the virus itself has evolved.

Anecdotally over the last couple months most of the people I know have caught it and recovered a week later.

bcatanzaro

All of us are going to need a lot of therapy to deal with the PTSD from this pandemic. It's very difficult to know how to weigh small but measurable risks. Meanwhile the unseen risks of years of lockdowns will reverberate through generations.


lamontcg

Wait until the recession really hits.

GiorgioG

Cards need to be $500 at the top end. That's when the real shortage is over.

tempoponet

Digital Foundry notes that price per transistor isn't really going down anymore, and die shrinks aren't enough to improve performance or price (at least for GPUs). The result is cards that are bigger, more expensive, hotter, and more power hungry. The last two mean more expensive cooling. The Nvidia 3000 series was a nice price improvement compared to the 2000 series, but top-end MSRP will approach $2k in the next generation.

Nursie

It was a price improvement at MSRP but until recently it wasn’t a price improvement at all due to the insane markups and stock shortages.

dannyw

No, I don't want top-end cards to cap out at $500.

Imagine being a hobbyist videographer and having to jump to Quadros or professional cards for video editing.


benreesman

Most of the comments seem focused on the risks of used GPUs.

I’m just loving buying an FE at Best Buy for MSRP.

shepherdjerred

Wow. I read your comment, went to Best Buy's website, and after a few minutes of waiting in a virtual "line" I bought an RTX 3070 for MSRP ($500).

Finally I can upgrade my 6 year old GTX 1070.

alliao

GTX 970 reporting for duty...

tehbeard

Do it.

I made the jump and it's so nice having the VRAM alone for modern titles.

Chyzwar

I am now replacing my 8-year-old GTX 750 Ti with an RX 6700 XT.

andrewmcwatters

The shortage is over, but the prices are still not justifiable. Why would anyone pay MSRP for how old these cards are with new product lines being announced this month?

lol

Obi_Juan_Kenobi

Depends on the card and situation. If 3060ti performance is plenty for you, how much would you expect to save on a 4050? Probably not much, and those cards won't be out until next year sometime.

For a higher end card? Yeah it could go either way.

hwers

New product lines are always on the horizon

0xcde4c3db

In some sense, sure, but in this case it means an actual new generation of high-end ("9"/"8"/"7" class) chips likely launching in a few months, not a "refresh" sometime next year.

slickdork

Perhaps, but nvidia is 23 months into a 24 month release cycle.


jacooper

Totally agree; I've been looking for an RX 6600 XT, but the price is just not justifiable for how old the card really is.

drumhead

Let's not forget, it's bad news for the card manufacturers if consumers decide to buy the flood of used cards on the market and not new ones. It's really not in their interest for us to be buying them at all.

bush-bby

Why is that?


oars

Lol thanks for sharing.