bathrobehero has a valid point: this stuff deserves its own thread, so that we don't flood sp_'s ccminer thread.
So, here we go--->
To get this thing started, I will quote/cross-post from the previous thread:
970 (GV-N970WF3OC-4GD - 250w OC edition instead of 145w):
stock - 2.75 mh/s at 187W
oc - 3.0 mh/s at 208W (+185/0 - 1501mhz)
I had assumed the 970's TDP would be within the 145W Nvidia spec unless OCed. It seems that
assumption was way off. I didn't find an actual TDP spec for this card, just PSU and connector reqs.
Is it really 250W? This changes the balance of power (bad pun) and makes me wonder about the 980
rated at 165W.
I am by no means going to sit here and write as if I am an authority on this stuff.
However, I have taken note of something that I think deserves mention and discussion.
TDP is the term we all seem to use (myself included) to refer to, and to estimate, the power consumption of a GPU.
But TDP is short for Thermal Design Power, commonly defined as:
"TDP is the average power a device can dissipate when running real applications." (aka "normal" apps that an average user would run)
TDP, then, is NOT exactly equal to the device's maximum power consumption, nor is it necessarily measured at 100% load.
TDP is self-reported by manufacturers. Several years ago, AMD and Intel used different percentages of processor load to measure and report their CPUs' TDP:
AMD used ~100% load and Intel something like 80-85%. Intel's method seems to have become the norm,
which is what led to the definition above: average power dissipated when running "real applications."
So, under intense loading situations, a device can definitely consume more power and dissipate more heat than its TDP would indicate.
Of course, overclocking raises the amount of power consumed even further. And mining intensive algorithms is certainly not what anyone would call "real applications."
From
http://www.cpu-world.com/Glossary/M/Minimum_Maximum_power_dissipation.html:
Maximum power dissipation is the maximum power dissipated by the CPU under the worst conditions - at the maximum core voltage, maximum temperature and maximum signal loading conditions.
Maximum Power dissipation is always higher than Thermal Design Power.
I believe that when mining the more intensive algos, these GTX 970 and 980 video cards are in fact consuming somewhat more power than their Nvidia-stated TDP might indicate.
EDIT: Using an AC/DC clamp meter, I measured the actual current through the two 6-pin PCIe connectors on my stock BIOS Zotac GTX 970 w/ moderate overclock, while mining Quark algo with sp_mod release version 52.
One connector showed a relatively consistent 4.5A.
The other connector fluctuated between 2.5A and ~7A. I think it is safe to average it to 4.5A.
9A @ 12V = 108W, plus an assumed 75W through the PCIe slot = 183 watts total
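For anyone who wants to redo the math, here's a minimal sketch of that estimate. The 12V rail and the 75W slot draw are assumptions (the slot current was not measured, 75W is just the PCIe spec limit):

```python
# Estimate total board power from clamp-meter readings on the PCIe connectors.
# Assumptions: 12 V on the 6-pin connectors, and the PCIe slot supplying its
# full 75 W spec limit (slot current was not measured, so this is worst-case).

RAIL_VOLTAGE = 12.0      # volts on the PCIe 6-pin connectors
PCIE_SLOT_WATTS = 75.0   # assumed draw through the motherboard slot

def board_power(connector_amps, slot_watts=PCIE_SLOT_WATTS, volts=RAIL_VOLTAGE):
    """Sum connector power (I * V per connector) plus the assumed slot draw."""
    return sum(a * volts for a in connector_amps) + slot_watts

# Two 6-pin connectors at ~4.5 A each (the second averaged from its 2.5-7 A swings)
print(board_power([4.5, 4.5]))  # -> 183.0 watts
```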
Yes, it's not how much a card can pull, but TDP does correlate fairly well to that amount (at least it did until the 900 series) and we don't have better figures from the specs. (Linus' explanation:
https://www.youtube.com/watch?v=yDWO177BjZY )
I downloaded some random stock BIOSes from here and opened them in Maxwell II BIOS Tweaker to check their maximum TD.. I mean power consumption, using this image to translate the figures. Here are the results (in watts):
BIOS                           Flavor          Maximum  Target
Asus.GTX970.4096.141028        Strix OC        250      163.46-193.152
EVGA.GTX970.4096.141020        FTW             250      170-187
Galaxy.GTX970.4096.140912      EXOC            200      200-250
Gigabyte.GTX970.4096.141105    Windforce OC    250      250-280
Gigabyte.GTX970.4096.141910_1  G1 Gaming       250      250-280
MSI.GTX970.4096.141029         Gaming          250      200-220
NVIDIA.GTX970.4096.140826      Reference?      250      151.2-160.3
Palit.GTX970.4096.140903       Standard        250      151.2-160.3
Palit.GTX970.4096.140910       JetStream OC    250      180-200
PNY.GTX90.4096.140912          VCGGTX9704XPB   250      151.2-160.3
Zotac.GTX970.4096.141024       Standard        196      151.2-160.3
Zotac.GTX970.4096.141910       AMP Omega       350      325-345
...
Gigabyte GTX 980               G1 Gaming       250      300-366
Gigabyte GTX 980               Windforce OC    250      270-300
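To put the table in perspective, here's a quick sketch comparing a few of the BIOS maximum limits above against Nvidia's official board-power figures (145W for the 970, 165W for the 980). The numbers are copied from the table; the dict names are mine:

```python
# Compare Nvidia's rated board power against the BIOS "Maximum" power limit
# read out with Maxwell II BIOS Tweaker (values taken from the table above).

RATED_TDP = {"GTX 970": 145, "GTX 980": 165}  # Nvidia spec sheet, watts

bios_max = {  # (model, flavor) -> BIOS maximum power limit in watts
    ("GTX 970", "NVIDIA reference"):      250,
    ("GTX 970", "Gigabyte Windforce OC"): 250,
    ("GTX 970", "Zotac AMP Omega"):       350,
    ("GTX 980", "Gigabyte G1 Gaming"):    250,
}

for (model, flavor), watts in bios_max.items():
    headroom = watts - RATED_TDP[model]
    print(f"{model} {flavor}: BIOS max {watts} W = rated TDP + {headroom} W")
```

Even the reference 970 BIOS allows 105W over the rated TDP, which fits what the clamp meter showed.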
I added two 980s (CBA to add more) to the list to show the figures do seem to be correct - at least compared to a stress test from a
Tom's Hardware review:
PS: we should really have more threads instead of flooding this one with everything