So recently I've been seeing a lot of people talking about how high they can clock their cards for mining, and how much voltage they are pushing to hit those clocks.
I decided to run a test on my own system, which has 3 reference 6970s in it. The results are rather surprising.
All wattages are measured at the wall with my UPS while running GUIMiner on all 3 GPUs.
System idle wattage is about 350 watts from the wall.
Default voltage (1175 mV); 880 MHz is the stock core clock. Core/memory clocks:
880/1375: 905 watts
880/685: 835 watts
880/340: 782 watts
~385 MH/s per GPU at all three memory clocks. Note that underclocking the memory costs no hash rate at all, but going from 1375 down to 340 saves 123 watts across the three cards.
Default voltage (1175 mV), highest overclock without changing voltage:
940/340: 815 watts
~420 MH/s per GPU
1235 mV, 970/340: 915 watts
~430 MH/s per GPU
1300 mV, 1000/340: 1050 watts
~450 MH/s per GPU
As you can see from the above results, raising the voltage greatly increases power usage. So much so that the extra electricity cost can actually cut into your net BTC earnings, even though you are producing more MH/s. For my 6970s the sweet spot is the highest overclock that doesn't require raising the core voltage, and I'd expect the same to hold for other GPUs.
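If you want to sanity-check this with your own numbers, here is a quick back-of-the-envelope Python script. It just plugs in the figures above (the run labels and the variable names are mine; idle draw is subtracted so only the added mining load is compared):

# MH/s per watt from the wall readings above.
# Idle draw is subtracted so we compare only the added mining load;
# per-GPU hash rates are multiplied by the 3 cards in the system.

IDLE_W = 350  # system idle draw from the wall, in watts

# (core/mem clocks @ core voltage, per-GPU MH/s, wall watts while mining)
runs = [
    ("880/1375 @ 1175 mV", 385, 905),
    ("880/685  @ 1175 mV", 385, 835),
    ("880/340  @ 1175 mV", 385, 782),
    ("940/340  @ 1175 mV", 420, 815),
    ("970/340  @ 1235 mV", 430, 915),
    ("1000/340 @ 1300 mV", 450, 1050),
]

for label, mhs_per_gpu, wall_w in runs:
    total_mhs = 3 * mhs_per_gpu  # three 6970s
    mining_w = wall_w - IDLE_W   # power added by mining
    print(f"{label}: {total_mhs} MH/s over {mining_w} W "
          f"-> {total_mhs / mining_w:.2f} MH/s per watt")

Running it, 940/340 at stock voltage comes out on top at about 2.7 MH/s per watt, while the 1300 mV run drops to about 1.9: the extra 30 MH/s per card costs an extra 235 watts at the wall.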
Now, one more thing: my video cards are water cooled, which means they run much cooler than they would on air, and cooler power components leak less current and so draw less power than hot ones. If you are on air cooling, your power draw from overclocking/overvolting could be even higher than what I measured.
System Specs:
Intel Xeon W3520 @ 4 GHz
Asus P6T7
3x reference 6970s
12 GB 2000 MHz RAM
15 case fans
Corsair AX1200
2x 120 GB OCZ Agility SSDs
3x WD 1 TB Blacks
Asus Xonar STX