swisstraeng t1_j09a5ge wrote

Yeah, and the main issue is that when you add layers on top of layers, the surface gets less and less flat. At some point you're off by a whole layer, so you have to run long and expensive planarization steps to flatten the thing again.
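Roughly the idea, as a toy sketch (all numbers are made up for illustration, not real process figures): every layer adds a bit of non-flatness, and once the accumulated error approaches a whole layer thickness you have to stop and flatten before continuing.

```python
# Toy model only: each deposited layer adds some surface non-flatness,
# and once the accumulated error reaches a full layer thickness you have
# to stop and planarize. Both constants are hypothetical.
LAYER_THICKNESS_NM = 50.0   # hypothetical thickness of one layer
ERROR_PER_LAYER_NM = 4.0    # hypothetical flatness error added per layer

deviation = 0.0
for layer in range(1, 41):
    deviation += ERROR_PER_LAYER_NM            # error accumulates layer by layer
    if deviation >= LAYER_THICKNESS_NM:
        print(f"layer {layer}: off by a full layer -> expensive flattening step")
        deviation = 0.0                        # planarization resets the surface
```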

Cooling is partially an issue, but that's also because CPU/GPU manufacturers push their chips to the limit to make them look better in benchmarks. So they end up selling stuff like the RTX 4090, which is clocked way too high and can eat 600W, when it could deliver about 90% of the performance at 300W. But hey, they're not the ones paying the power bill.
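A quick back-of-the-envelope way to see that tradeoff in perf-per-watt terms (the numbers below are just the claim above, not measurements):

```python
# Toy numbers only: compares efficiency of a stock vs power-limited card,
# assuming ~90% of the performance at half the power, as claimed above.
stock = {"power_w": 600, "perf": 100}    # hypothetical stock operating point
limited = {"power_w": 300, "perf": 90}   # hypothetical power-limited point

for name, p in (("stock", stock), ("power-limited", limited)):
    print(f"{name}: {p['perf'] / p['power_w']:.2f} perf points per watt")
# stock: 0.17 perf points per watt
# power-limited: 0.30 perf points per watt
```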

32

orincoro t1_j0c7tzb wrote

I wonder how much electricity globally is consumed by needlessly overclocked GPUs.

1

swisstraeng t1_j0ei32s wrote

Surprisingly not much, if we only look at industry-grade hardware. Consumers? Yeah, a lot is wasted there.

All the server and industrial stuff is actually not too bad. For example, the chip used in the RTX 4090 is also used in a Quadro-class workstation card.

It's the AD102 chip, which also powers the RTX 6000 Ada GPU. That card has only a 300W TDP, compared to the RTX 4090's 450W, which sometimes gets pushed to 600W. Or worse, the 800W rumored for an RTX 4090 Ti.

We're talking about the same chip and a 300W versus 800W difference.

Anyone running an RTX 4090 Ti like that would be dumping an extra 500W into a tiny bit of extra computing power.

But hey, a kWh costs about 0.25 euros in the EU, depending on where you live. That means roughly 1 euro for every 8 hours of use of such a card (500W × 8h = 4 kWh) that could be saved just by downclocking it.
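The same arithmetic, written out (the 500W is the claimed extra draw versus a power-limited operating point, the price is a rough EU average):

```python
# Worked version of the arithmetic above, with the figures from the comment.
extra_power_kw = 0.5        # 500 W of extra draw
price_eur_per_kwh = 0.25    # rough EU average electricity price
hours = 8

cost = extra_power_kw * hours * price_eur_per_kwh
print(f"{cost:.2f} EUR wasted per {hours} h")   # 1.00 EUR wasted per 8 h
```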

1