jajamelony (OP)
Newbie
Offline
Activity: 8
Merit: 0
July 06, 2017, 12:17:11 AM
Hi,
Can someone tell me how to calculate the total power drawn by a GPU?
I'm logging various GPU metrics with HWiNFO64 and would like to know which values constitute the total wattage drawn by a GPU. In the log file I'm seeing the following (these were captured while the card was mining):
GPU Core Voltage (VDDC) [V] 1.194 GPU Core Current [A] 78.313 GPU Core Power [W] 93.486 GPU Chip Power [W] 129.578 GPU VRM Voltage Out (VOUT/VID) [V] 1.193 GPU VRM Voltage In (VIN/+12V) [V] 12.125 GPU VRM Current In (IIN) [A] 6.438 GPU VRM Current Out (IOUT) [A] 56.5 GPU VRM Power Out (POUT) [W] 67.25 GPU VRM Power In (PIN) [W] 78
Is the total the card drawing the Core Power + Chip Power? What about VRM? Is there a way to validate the wattage drawn vs Power In/Current In?
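One way to answer the validation question is to cross-check the logged voltage/current pairs against the reported wattages, since power should roughly equal volts times amps on each rail. A minimal sketch (the function name and tolerance are illustrative, not part of HWiNFO64):

```python
def matches_reported(voltage_v, current_a, reported_w, tolerance=0.05):
    """True if V * I agrees with the reported wattage within tolerance."""
    return abs(voltage_v * current_a - reported_w) / reported_w <= tolerance

# Values from the log above:
print(matches_reported(1.194, 78.313, 93.486))  # GPU core: 1.194 * 78.313 ≈ 93.5 W
print(matches_reported(12.125, 6.438, 78.0))    # VRM input (12 V side): ≈ 78.1 W
print(matches_reported(1.193, 56.5, 67.25))     # VRM output: ≈ 67.4 W
```

All three pairs agree to within a fraction of a percent, and POUT/PIN ≈ 67.25/78 ≈ 86% is a plausible VRM efficiency, so the log values are at least internally consistent.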
I think understanding these values would be really helpful in deciding how to approach overclocking/underclocking a specific card (based on historic data) instead of taking a shot in the dark, right?
thanks jaja
bathrobehero
Legendary
Offline
Activity: 2002
Merit: 1051
ICO? Not even once.
July 06, 2017, 12:49:32 AM
Measure at the wall. Software can't possibly be accurate.
Not your keys, not your coins!
MATHReX
July 06, 2017, 06:20:29 AM
Quote from: jajamelony
Hi,
Can someone tell me how to calculate the total power drawn by a GPU?
I'm logging various GPU metrics with HWiNFO64 and would like to know which values constitute the total wattage drawn by a GPU. In the log file I'm seeing the following (these were captured while the card was mining):
GPU Core Voltage (VDDC) [V] 1.194 GPU Core Current [A] 78.313 GPU Core Power [W] 93.486 GPU Chip Power [W] 129.578 GPU VRM Voltage Out (VOUT/VID) [V] 1.193 GPU VRM Voltage In (VIN/+12V) [V] 12.125 GPU VRM Current In (IIN) [A] 6.438 GPU VRM Current Out (IOUT) [A] 56.5 GPU VRM Power Out (POUT) [W] 67.25 GPU VRM Power In (PIN) [W] 78
Is the total the card drawing the Core Power + Chip Power? What about VRM? Is there a way to validate the wattage drawn vs Power In/Current In?
I think understanding these values would be really helpful in deciding how to approach overclocking/underclocking a specific card (based on historic data) instead of taking a shot in the dark, right?
thanks jaja
You can start by investing in a Kill-A-Watt. Plug your system into it and note the power consumption at idle, then run it at full load and record the consumption again. You can then get an approximate per-GPU value: (total power at full load minus total power at idle) divided by the number of GPUs you have. Secondly, the values you listed have no bearing on how well a card will undervolt or overclock; the parameter that matters for that is the ASIC quality of your GPU.
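The arithmetic above can be sketched as follows (all wattage numbers are invented for illustration):

```python
def watts_per_gpu(full_load_w, idle_w, num_gpus):
    """Approximate per-GPU draw: (full-load wall power - idle wall power) / GPU count."""
    return (full_load_w - idle_w) / num_gpus

# Hypothetical wall readings: 850 W while mining, 100 W at idle, 6 cards.
print(watts_per_gpu(850, 100, 6))  # -> 125.0
```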
jajamelony (OP)
Newbie
Offline
Activity: 8
Merit: 0
July 06, 2017, 12:44:25 PM
I do have a Kill-A-Watt plug, but for in-depth analysis it is not good: I can't go and see day-by-day power draw. Plus, there are stories of it burning up under high-wattage use, so I am afraid of using it with my rig.
So are we saying that the on-board sensors in cards are not accurate?
thanks jaja
jmigdlc99
July 06, 2017, 12:58:45 PM
Onboard sensors are usually accurate enough for ballpark figures. Of course, measuring actual draw at the wall is best. If you want deeper analysis, there are power meters out there with daily consumption readings; I bought mine from Amazon.
0xacBBa937A57ecE1298B5d350f40C0Eb16eC5fA4B
64dimensions
July 06, 2017, 01:35:54 PM
Using the Kill-A-Watt meter is the way to go.
A method is as follows:
1) Set up the rig running with its nominal mining-software settings and the Kill-A-Watt installed. Start with the full complement of cards and take a measurement with all N cards.
2) Power down the system and completely remove one card. Set up and run the rig with N - 1 cards.
3) Repeat 2) until you are down to one card.
4) Plot power on the y-axis versus card count on the x-axis. Fit a straight line (ax + b).
5) b, the y-intercept, is the non-GPU power consumed. Subtract b from the power draw of the full N-card system, divide by N, and you have the power draw per GPU.
To save time, I just do this once for a particular motherboard/GPU/processor combination. The important part is getting the b measurement; as a working assumption, I take b to be constant going forward. For an ASUS Z87 Plus with a Celeron, which runs 5 cards, b is about 25 watts.
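Steps 4) and 5) amount to an ordinary least-squares line fit. A self-contained sketch, with invented wall readings:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

cards = [1, 2, 3, 4, 5]            # number of cards installed
watts = [150, 275, 400, 525, 650]  # hypothetical wall readings
a, b = fit_line(cards, watts)
per_gpu = (watts[-1] - b) / cards[-1]  # subtract intercept b, divide by N
print(a, b, per_gpu)  # -> 125.0 25.0 125.0
```

Here the slope a and the per-GPU figure coincide because the invented data is perfectly linear; with real measurements they will differ slightly, which is a useful sanity check on the fit.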
bathrobehero
Legendary
Offline
Activity: 2002
Merit: 1051
ICO? Not even once.
July 06, 2017, 01:42:52 PM
Quote from: 64dimensions
Using the Kill-A-Watt meter is the way to go.
A method is as follows:
1) Set up the rig running with its nominal mining-software settings and the Kill-A-Watt installed. Start with the full complement of cards and take a measurement with all N cards.
2) Power down the system and completely remove one card. Set up and run the rig with N - 1 cards.
3) Repeat 2) until you are down to one card.
4) Plot power on the y-axis versus card count on the x-axis. Fit a straight line (ax + b).
5) b, the y-intercept, is the non-GPU power consumed. Subtract b from the power draw of the full N-card system, divide by N, and you have the power draw per GPU.
To save time, I just do this once for a particular motherboard/GPU/processor combination. The important part is getting the b measurement; as a working assumption, I take b to be constant going forward. For an ASUS Z87 Plus with a Celeron, which runs 5 cards, b is about 25 watts.
That's kind of overdoing it, and if you want super-accurate numbers it's also skewed: GPU mining requires some CPU too, so with cards removed you get less CPU consumption if the CPU is downclocked. I just assume each rig uses 60 watts and work around that. And since the rig will always be running, it's better to measure the wall power with all the cards idle (mining software not running), then measure again after mining has been running for a while. The difference is your true GPU power consumption.
Appuned
Newbie
Offline
Activity: 68
Merit: 0
July 06, 2017, 01:43:31 PM
Quote from: jajamelony
I do have a Kill-A-Watt plug, but for in-depth analysis it is not good: I can't go and see day-by-day power draw. Plus, there are stories of it burning up under high-wattage use, so I am afraid of using it with my rig.
So are we saying that the on-board sensors in cards are not accurate?
thanks jaja
With a Kill-A-Watt, you can estimate the power draw of a single GPU by the replacement method.
jajamelony (OP)
Newbie
Offline
Activity: 8
Merit: 0
July 06, 2017, 03:32:30 PM
Quote from: 64dimensions
Using the Kill-A-Watt meter is the way to go.
A method is as follows:
1) Set up the rig running with its nominal mining-software settings and the Kill-A-Watt installed. Start with the full complement of cards and take a measurement with all N cards.
2) Power down the system and completely remove one card. Set up and run the rig with N - 1 cards.
3) Repeat 2) until you are down to one card.
4) Plot power on the y-axis versus card count on the x-axis. Fit a straight line (ax + b).
5) b, the y-intercept, is the non-GPU power consumed. Subtract b from the power draw of the full N-card system, divide by N, and you have the power draw per GPU.
To save time, I just do this once for a particular motherboard/GPU/processor combination. The important part is getting the b measurement; as a working assumption, I take b to be constant going forward. For an ASUS Z87 Plus with a Celeron, which runs 5 cards, b is about 25 watts.

Quote from: bathrobehero
That's kind of overdoing it, and if you want super-accurate numbers it's also skewed: GPU mining requires some CPU too, so with cards removed you get less CPU consumption if the CPU is downclocked. I just assume each rig uses 60 watts and work around that. And since the rig will always be running, it's better to measure the wall power with all the cards idle (mining software not running), then measure again after mining has been running for a while. The difference is your true GPU power consumption.

Ok, that sounds like a good approach. Thank you! jaja
PcChip
July 06, 2017, 05:08:37 PM
Kill-a-watt.
Legacy signature from 2011: All rates with Phoenix 1.50 / PhatK 5850 - 400 MH/s | 5850 - 355 MH/s | 5830 - 310 MH/s | GTX570 - 115 MH/s | 5770 - 210 MH/s | 5770 - 200 MH/s
coin123123
July 06, 2017, 08:15:55 PM
Only a socket meter. Tested on my last rig with two 280Xs:
GPU at 1050 mV, ambient temp 25 °C, GPU temps 70-80 °C = 550-600 W from the wall
GPU at 1050 mV, ambient temp 20 °C, GPU temps 60-70 °C = 450-500 W from the wall
Meaning that with the same settings, just a change in room temperature can lower consumption, so without a live meter you can't know how much you use.
Appuned
Newbie
Offline
Activity: 68
Merit: 0
July 26, 2017, 11:59:51 AM
Quote from: coin123123
Only a socket meter. Tested on my last rig with two 280Xs:
GPU at 1050 mV, ambient temp 25 °C, GPU temps 70-80 °C = 550-600 W from the wall
GPU at 1050 mV, ambient temp 20 °C, GPU temps 60-70 °C = 450-500 W from the wall
Meaning that with the same settings, just a change in room temperature can lower consumption, so without a live meter you can't know how much you use.
I also notice that power draw increases when the temperature is higher.
Aura
July 26, 2017, 12:03:25 PM
You can measure the GPU's power consumption with a multimeter. There are cheap multimeters available online that can give you a decent measurement.
NobodyIsHome
Jr. Member
Offline
Activity: 74
Merit: 1
July 26, 2017, 06:29:14 PM
Like others have said, Kill-a-watt devices work.
Also, a digital UPS or PDU can measure load as well.
xxcsu
July 27, 2017, 03:23:27 AM
Quote from: jajamelony
Hi,
Can someone tell me how to calculate the total power drawn by a GPU?

So are you asking about the total GPU power draw? For that you can use GPU-Z; that software will give you a very accurate reading of the total GPU power draw. If you are asking about the total power draw of your graphics card, including the GPU, memory modules, voltage regulators, cooling fans, etc., that is a different story.

Quote from: Aura
You can measure the GPU's power consumption with a multimeter. There are cheap multimeters available online that can give you a decent measurement.

Could you do a demonstration or write a how-to? How would you measure the power draw of a graphics card with a multimeter?

Quote from: Appuned
I also notice the power increase when the temperature is higher.

When the temperature is higher, the card's cooling fans work much harder.
igotfits
July 27, 2017, 03:42:44 AM
I was looking for the same answers, but everyone kept saying to get one of these: https://www.amazon.com/gp/product/B000RGF29Q/ref=oh_aui_detailpage_o04_s00?ie=UTF8&psc=1
So what I did was run the computer with no GPU, then add them one by one, logging how much more wattage was being drawn. The weird thing is that when adding a certain number of GPUs the wattage would jump extremely high (example: with 2 GPUs I would get 300 W at the wall, so around 100 W per card; then when adding a 3rd I would get 450 W!). So IMO it will never be accurate per card, only for the whole machine running, taking the wattage at the wall.
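The one-card-at-a-time log reduces to per-step increments, which makes jumps like the one described easy to spot. A sketch with invented readings that mirror the example:

```python
# Wall readings with 0, 1, 2, 3 GPUs installed (hypothetical numbers).
wall_watts = [100, 200, 300, 450]

# Increase in wall draw from each added card.
increments = [after - before for before, after in zip(wall_watts, wall_watts[1:])]
print(increments)  # -> [100, 100, 150]  (the third card "jumps")
```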
NameTaken
July 27, 2017, 04:10:58 AM
Corsair Link shows my AC input, DC output, efficiency, temperature, fan speed, etc.
Appuned
Newbie
Offline
Activity: 68
Merit: 0
August 25, 2017, 03:11:11 PM
Quote from: NameTaken
Corsair Link shows my AC input, DC output, efficiency, temperature, fan speed, etc.
I also have the Corsair PSU, but it is expensive.
fanatic26
August 25, 2017, 04:33:43 PM
I am surprised so many people are recommending the Kill-A-Watt when, in testing, they prove to be inaccurate; readings can be skewed by ±10%. Specifically with PC power supplies, there are issues with it reading wrong and showing 100% efficiency. The proper way to measure your power is with an amp clamp on your power cord: measure the amperage, then multiply it by the voltage to get wattage.
Say, for example, your rig is pulling 7.3 amps at 116 volts. With those numbers your rig is drawing about 847 watts.
Test with 1 card, then 2, and so on, to see what your cards are actually pulling.
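The amp-clamp arithmetic above as a one-liner (note that volts times amps gives apparent power; a clamp-only measurement ignores power factor):

```python
def wall_watts(amps, volts):
    """Apparent wall draw: watts = amps * volts (ignores power factor)."""
    return amps * volts

print(round(wall_watts(7.3, 116)))  # -> 847
```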
Stop buying industrial miners, running them at home, and then complaining about the noise.