Bitcoin Forum

Other => CPU/GPU Bitcoin mining hardware => Topic started by: _Vince_ on February 22, 2012, 01:52:35 PM



Title: GPU's temperature and its relationship with power consumption
Post by: _Vince_ on February 22, 2012, 01:52:35 PM
We all know that the higher the temperature, the higher the current leakage in the transistors. But the question is: how much higher?

Suppose the same GPU with constant clock, fan, workload, etc.; at 80°C it consumes more energy than at 70°C. This is because current leakage is much higher at higher temperatures, and VRM efficiency decreases as VRM temperature rises.

There is an interesting article here:

http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html

"so for every °C that the card runs hotter it needs 1.2W more power to handle the exact same load."


Have you ever measured your card to see how much additional power it takes when the temperature increases by 1°C?
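
A rough back-of-the-envelope sketch of what that slope would imply over a 10°C swing (only the 1.2 W/°C figure comes from the article; the 200 W baseline is an assumption, not a measurement):

Code:
# Back-of-the-envelope only: the 1.2 W/degC slope is from the article above,
# but the 200 W baseline and the 70C/80C pair are assumptions.
watts_per_degree = 1.2               # extra draw per degC (from the GTX 480 article)
temp_cool, temp_hot = 70.0, 80.0     # degC
base_power = 200.0                   # assumed draw at 70 degC, card-dependent

extra = watts_per_degree * (temp_hot - temp_cool)
print(f"{extra:.1f} W extra at {temp_hot:.0f}C vs {temp_cool:.0f}C "
      f"(~{100 * extra / base_power:.0f}% of the assumed {base_power:.0f} W baseline)")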



Title: Re: GPU's temperature and its relationship with power consumption
Post by: BookLover on February 22, 2012, 02:03:32 PM
(marking)


Title: Re: GPU's temperature and its relationship with power consumption
Post by: cpt_howdy on February 22, 2012, 02:07:39 PM
If you want to maximise the efficiency of your cooled mining card, there'll be an equilibrium point where a 1W increase in power to your cooling solution (probably a fan) would result in a 1W reduction in the power draw of your card. At this point it's not worth cooling the card down further (unless you go for a more efficient cooling technology)
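
A minimal numerical sketch of that equilibrium, assuming the ~1.2 W/°C slope quoted above, a roughly cubic fan-power law topping out around 6 W, and a purely illustrative temperature-vs-fan-speed curve (with these made-up numbers the combined draw bottoms out near 80% fan):

Code:
# Everything here is illustrative: the only sourced number is ~1.2 W/degC.
def fan_power(speed_pct):
    """Assumed fan draw in watts (roughly cubic in duty cycle, ~6 W at 100%)."""
    return 6.0 * (speed_pct / 100.0) ** 3

def card_power(speed_pct):
    """Assumed GPU draw: 200 W at 70 degC plus 1.2 W per degC above that,
    with temperature dropping (diminishing returns) as the fan speeds up."""
    temp_c = 60.0 + 600.0 / speed_pct     # made-up temp-vs-fan-speed curve
    return 200.0 + 1.2 * (temp_c - 70.0)

best = min(range(20, 101), key=lambda s: fan_power(s) + card_power(s))
print(f"Combined draw is lowest near {best}% fan "
      f"({fan_power(best) + card_power(best):.1f} W total)")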


Title: Re: GPU's temperature and its relationship with power consumption
Post by: DeathAndTaxes on February 22, 2012, 02:13:18 PM
If you want to maximise the efficiency of your cooled mining card, there'll be an equilibrium point where a 1W increase in power to your cooling solution (probably a fan) would result in a 1W reduction in the power draw of your card. At this point it's not worth cooling the card down further (unless you go for a more efficient cooling technology)

True, but one can also run a card cooler by lowering clocks. Lower the clock enough and you can also lower the voltage.

I can run a 5970 @ 40% fan and <60°C, but only at 535MHz and 0.7V :)

Right now my power costs even with "hot GPUs" (~70°C) are only about 1/3rd of the revenue, so increased efficiency is mostly academic. However, as the network becomes more efficient (7900 series cards, FPGAs, etc.), things like lower temps, undervolting, and underclocking can be used to extend the "effective economic lifespan". When my 12 GH/s farm is no longer economical, I can "convert it" to a 6 GH/s farm which is economical and grind out maybe another year's worth of revenue.
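
Purely as an illustration of that break-even logic (none of the hashrates, wattages, or the $0.10/kWh price below are real figures from this farm):

Code:
# All numbers assumed for illustration, not DeathAndTaxes' actual farm.
power_price = 0.10                      # $/kWh (assumed)
configs = {
    "stock":      (12.0, 3600.0),       # (GH/s, wall watts) at full clocks
    "downvolted": (6.0,  1200.0),       # e.g. ~535 MHz @ 0.7 V per card
}

for name, (ghs, watts) in configs.items():
    daily_cost = watts / 1000.0 * 24 * power_price     # $/day for power
    breakeven = daily_cost / ghs                       # $/GH/s/day to break even
    print(f"{name:>10}: stays economical while revenue > ${breakeven:.2f} per GH/s per day")

With these assumed numbers the downvolted farm earns half as much, but it keeps breaking even at revenue levels that would already have made the stock configuration unprofitable.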


Title: Re: GPU's temperature and its relationship with power consumption
Post by: cpt_howdy on February 22, 2012, 02:18:48 PM
If you want to maximise the efficiency of your cooled mining card, there'll be an equilibrium point where a 1W increase in power to your cooling solution (probably a fan) would result in a 1W reduction in the power draw of your card. At this point it's not worth cooling the card down further (unless you go for a more efficient cooling technology)

True, but one can also run a card cooler by lowering clocks. Lower the clock enough and you can also lower the voltage.

I can run a 5970 @ 40% fan and <60°C, but only at 535MHz and 0.7V :)

Right now my power costs even with "hot GPUs" (~70°C) are only about 1/3rd of the revenue, so increased efficiency is mostly academic. However, as the network becomes more efficient (7900 series cards, FPGAs, etc.), things like lower temps, undervolting, and underclocking can be used to extend the "effective economic lifespan". When my 12 GH/s farm is no longer economical, I can "convert it" to a 6 GH/s farm which is economical and grind out maybe another year's worth of revenue.

I agree! There are much greater efficiency gains to be had by undervolting and underclocking, but if you really want to shave off the last few watts, then you can ramp up those fans. I personally keep the temps acceptable and the fans low, just so my secret rigs in the cupboards don't get discovered ;)


Title: Re: GPU's temperature and its relationship with power consumption
Post by: _Vince_ on February 22, 2012, 02:26:10 PM
If any of you have a kill-a-watt and some spare time, please help by running a test:

-With cgminer, set the target temp to 65°C with auto fan, and record the wattage (averaged over 1-2 minutes)

-Repeat with a target temp of 75°C, then work out the W/°C slope (see the sketch below)
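
A minimal sketch of turning those two readings into a W/°C figure (the wattages below are placeholders, not real measurements):

Code:
# Placeholder readings; substitute what the kill-a-watt actually shows.
temp_low,  watts_low  = 65.0, 255.0   # averaged wall draw at 65 C target
temp_high, watts_high = 75.0, 267.0   # averaged wall draw at 75 C target

slope = (watts_high - watts_low) / (temp_high - temp_low)
print(f"~{slope:.2f} W of extra wall draw per degC")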