Bitcoin Forum

Alternate cryptocurrencies => Mining (Altcoins) => Topic started by: Meatball on June 11, 2017, 08:03:43 PM



Title: What drives power consumption for mining?
Post by: Meatball on June 11, 2017, 08:03:43 PM
So, a random question as I jump back into coin mining and try to build the most efficient rigs: what drives a card's power consumption down the most? Lowering the power limit, the core clock, or the memory clock? In particular, I've got some 1070s I'm trying to mine ETH with, and while I can get them to 30-31 MH/s, I can't seem to get the cards below a 110 W draw or so.
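
For reference, those figures work out to the efficiency below. A quick back-of-the-envelope calculation in Python, using only the 30 MH/s and 110 W numbers quoted above:

Code:
# Rough efficiency from the figures quoted in the question.
hashrate_mhs = 30.0   # reported ETH hashrate for one GTX 1070, in MH/s
power_w = 110.0       # reported card power draw, in watts

efficiency = hashrate_mhs / power_w       # ~0.27 MH/s per watt
joules_per_mh = power_w / hashrate_mhs    # ~3.7 joules per MH
print(f"{efficiency:.2f} MH/s per W, {joules_per_mh:.1f} J per MH")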


Title: Re: What drives power consumption for mining?
Post by: Vann on June 11, 2017, 08:56:13 PM
TDP > Core clock > Memory clock in terms of power usage.
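
If the power limit (TDP) is the biggest lever, the quickest experiment is to cap it and watch how the hashrate responds. A minimal sketch, assuming an NVIDIA card whose driver accepts software power-limit changes through nvidia-smi and a shell with sufficient privileges; the 110 W value is just the figure from the question, not a recommendation:

Code:
# Minimal sketch: cap one GPU's software power limit (TDP) via nvidia-smi.
# Assumes nvidia-smi is on the PATH and the driver allows the change
# (usually needs root/administrator rights).
import subprocess

def set_power_limit(gpu_index: int, watts: int) -> None:
    """Set the software power limit for a single GPU, in watts."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_power_limit(gpu_index=0, watts=110)  # then step down and re-check hashrate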


Title: Re: What drives power consumption for mining?
Post by: nerdralph on June 12, 2017, 01:18:55 AM
Core volts are what EAT watts.

And voltage scales (in a not exactly linear manner) with core clock.
Power is roughly proportional to voltage squared, and if voltage is held constant, power scales linearly with frequency.
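
Written out, the usual approximation for dynamic (switching) power is P ≈ C · V² · f, which makes the relative saving from an undervolt or a clock drop easy to estimate. A minimal sketch of that relation in Python, with illustrative numbers only (leakage power is ignored):

Code:
# Rough dynamic-power model: P ~ C * V^2 * f (switching power only, no leakage).
def relative_power(v_new: float, f_new: float,
                   v_ref: float = 1.0, f_ref: float = 1.0) -> float:
    """Power at (v_new, f_new) relative to a reference voltage and clock."""
    return (v_new / v_ref) ** 2 * (f_new / f_ref)

# Illustrative: dropping from 1.00 V to 0.85 V at the same clock leaves ~72%
# of the switching power; shaving another 10% off the clock leaves ~65%.
print(relative_power(0.85, 1.0))   # ~0.72
print(relative_power(0.85, 0.9))   # ~0.65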


Title: Re: What drives power consumption for mining?
Post by: KaydenC on June 12, 2017, 02:06:10 AM
My rule of thumb is:

Power scales to the square of voltage.
Power scales to the square of clock speed.
Clock speeds scale linearly with voltage.

Mean time to failure gets exponentially worse with increased voltage.
Mean time to fan failure gets exponentially worse as fan speed increases.

Because my farm is in a remote location, I just run everything at 0.85 V, Nvidia and AMD GPUs alike, but I'll push them as hard as possible with clock speeds and dual mining at that voltage.