
orincoro t1_j0c7tzb wrote

I wonder how much electricity globally is consumed by needlessly overclocked GPUs.

1

swisstraeng t1_j0ei32s wrote

Surprisingly not much, if we only look at industry-grade hardware. Consumers? Yeah, a lot is wasted.

All the server and industrial stuff is actually not too bad. For example, the chip used in the RTX 4090 is also used in a Quadro-class workstation card.

It's the AD102 chip, used in the RTX 6000 Ada GPU, which has only a 300W TDP, compared to the RTX 4090's 450W (sometimes pushed to 600W), or worse, the 800W of the rumored RTX 4090 Ti.

We're talking about the same chip, with a 300W versus 800W power budget.

Anyone using an RTX 4090 Ti would be burning an extra 500W for a bit of extra computing power.

But hey, a kWh costs about 0.25 euros in the EU, depending on where you live. That means you pay about 1 euro for every 8 hours of use of an RTX 4090 Ti (0.5 kW extra x 8 h x 0.25 euros/kWh) that could be saved by downclocking the card.
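For the curious, here's that arithmetic as a minimal Python sketch. The 500W gap and the 0.25 euros/kWh price are just the figures from above; both are assumptions that vary by card and by country:

    # Extra running cost of a 500 W power gap (800 W vs 300 W, same AD102 chip),
    # at an assumed electricity price of 0.25 EUR/kWh.
    extra_watts = 800 - 300        # assumed power gap between the two cards
    price_per_kwh = 0.25           # EUR; varies across the EU
    hours = 8
    extra_kwh = extra_watts / 1000 * hours
    print(f"{extra_kwh:.1f} kWh extra -> {extra_kwh * price_per_kwh:.2f} EUR per {hours} h")
    # prints: 4.0 kWh extra -> 1.00 EUR per 8 h

And if you'd rather cap the card than undervolt by hand, nvidia-smi has a power-limit flag (something like "nvidia-smi -pl 300"; it needs admin rights and gets clamped to the card's supported range).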

1