Bitcoin Forum
Author Topic: How to figure out total power draw of a GPU?  (Read 2613 times)
jajamelony (OP)
Newbie | Activity: 8 | Merit: 0
July 06, 2017, 12:17:11 AM  #1

Hi,

Can someone tell me how to calculate the total power drawn by a GPU?

I'm logging various GPU metrics with HWiNFO64 and would like to know which values constitute the total wattage drawn by a GPU. In the log file I'm seeing the following (these were captured while the card was mining):

GPU Core Voltage (VDDC) [V] 1.194
GPU Core Current [A] 78.313
GPU Core Power [W]   93.486
GPU Chip Power [W]  129.578
GPU VRM Voltage Out (VOUT/VID) [V] 1.193
GPU VRM Voltage In (VIN/+12V) [V] 12.125
GPU VRM Current In (IIN) [A] 6.438
GPU VRM Current Out (IOUT) [A] 56.5
GPU VRM Power Out (POUT) [W] 67.25
GPU VRM Power In (PIN) [W] 78

Is the total the card is drawing Core Power + Chip Power? What about the VRM? Is there a way to validate the wattage drawn against Power In/Current In?
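For reference, these readings can be cross-checked under the assumption that each reported power figure is just the matching voltage times current (P = V * I); a minimal sketch in Python:

Code:
# Sanity check: each reported power figure should be close to the
# matching voltage * current product (P = V * I).
readings = {
    "core":    {"V": 1.194,  "A": 78.313, "W": 93.486},  # GPU core rail
    "vrm_out": {"V": 1.193,  "A": 56.5,   "W": 67.25},   # VRM output side
    "vrm_in":  {"V": 12.125, "A": 6.438,  "W": 78.0},    # VRM input (+12V)
}

for name, r in readings.items():
    print(f"{name}: reported {r['W']:.2f} W, V*I = {r['V'] * r['A']:.2f} W")

# POUT / PIN is the VRM's conversion efficiency (~86% here).
print(f"VRM efficiency ~ {readings['vrm_out']['W'] / readings['vrm_in']['W']:.0%}")

Core power lines up with VDDC times core current almost exactly, and Chip Power being larger than Core Power suggests it already includes the core rail rather than adding to it.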

I think understanding these values would be really helpful in deciding how to approach overclocking/underclocking a specific card (based on historical data) instead of taking a shot in the dark, right?

thanks
jaja
smg1902
Newbie | Activity: 7 | Merit: 0
July 06, 2017, 12:35:02 AM  #2

Software readings aren't accurate unless you have one of the digital power supplies that reports readings directly from the PSU. If you don't have one of those, you'll need external hardware to measure it. Something like this is cheap and fairly accurate: https://www.amazon.com/P3-International-P4460-Electricity-Monitor/dp/B000RGF29Q
bathrobehero
Legendary | Activity: 2002 | Merit: 1051
"ICO? Not even once."
July 06, 2017, 12:49:32 AM  #3

Measure at the wall. Software can't possibly be accurate.

Not your keys, not your coins!
MATHReX
Sr. Member | Activity: 861 | Merit: 281
July 06, 2017, 06:20:29 AM  #4

Quote from: jajamelony on July 06, 2017, 12:17:11 AM
Can someone tell me how to calculate the total power drawn by a GPU? [...] Is there a way to validate the wattage drawn against Power In/Current In?


You can start by investing in a kill-a-watt. Plug your system through it and check the power consumption at system idle, then run it at full load and record the consumption again. You can then get an approximate value for one GPU's power consumption: total power at full load minus total power at idle, divided by the number of GPUs you have.

Secondly, the values you listed have no bearing on how well a card will undervolt/overclock. The parameter that comes into play for that is the ASIC quality of your GPU.
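In code form, that arithmetic looks like this (a minimal sketch; the wattage figures are hypothetical stand-ins for your own kill-a-watt readings):

Code:
def per_gpu_watts(full_load_w, idle_w, n_gpus):
    """Approximate per-GPU draw from two wall readings."""
    return (full_load_w - idle_w) / n_gpus

# Hypothetical kill-a-watt readings for a 6-card rig:
print(per_gpu_watts(full_load_w=820, idle_w=100, n_gpus=6))  # -> 120.0 W per GPU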
jajamelony (OP)
Newbie | Activity: 8 | Merit: 0
July 06, 2017, 12:44:25 PM  #5

I do have a kill-a-watt plug, but it's not good for in-depth analysis; I can't go and see day-by-day power draw. Plus there are stories of them burning up under high-wattage use, and for that reason I am afraid of using it with my rig.

So are we saying that the on-board sensors in cards are not accurate?

thanks
jaja
jmigdlc99
Sr. Member | Activity: 784 | Merit: 282
July 06, 2017, 12:58:45 PM  #6

Onboard sensors are usually accurate enough for ballpark figures. Of course, measuring the draw at the wall is best. If you want deeper analysis, there are power meters out there with daily consumption readings. Bought mine from Amazon.

64dimensions
Hero Member | Activity: 578 | Merit: 508
July 06, 2017, 01:35:54 PM  #7

Using the Kill-a-watt meter is the way to go.

A method is as follows:

1) Have the rig set up and running with its nominal mining-software settings and the kill-a-watt installed. Start with the full complement of cards. Take a measurement on your N cards.

2) Power down the system and completely remove one card. Set up and run the rig with N - 1 cards.

3) Repeat 2) until you get down to one card.

4) Plot power on the y-axis versus card count on the x-axis. Fit a straight line (ax + b).

5) b, the y-intercept, will be the non-GPU power consumed. Subtract b from the power draw of the full N-card system, divide by N, and you have the power draw per GPU.

To save time, I just do this once for a particular MB/GPU/processor combination. The important part is getting the b measurement; as a working assumption, I take b to be constant going forward. For an ASUS Z87-Plus with a Celeron, which runs 5 cards, b is about 25 watts.
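The fit in step 4 is a one-liner with numpy; a minimal sketch (the five wall readings are hypothetical, chosen to land near the ~25 W intercept mentioned above):

Code:
import numpy as np

# Wall readings from the kill-a-watt at each card count (hypothetical numbers).
cards = np.array([1, 2, 3, 4, 5])
watts = np.array([160.0, 295.0, 430.0, 560.0, 695.0])

a, b = np.polyfit(cards, watts, 1)  # least-squares fit of watts = a*cards + b
print(f"slope a = {a:.1f} W per card, intercept b = {b:.1f} W of non-GPU draw")

# Step 5: subtract b from the full-N reading and divide by N.
n = cards[-1]
print(f"per-GPU draw ~ {(watts[-1] - b) / n:.1f} W")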





bathrobehero
Legendary | Activity: 2002 | Merit: 1051
"ICO? Not even once."
July 06, 2017, 01:42:52 PM  #8

Quote from: 64dimensions on July 06, 2017, 01:35:54 PM
Using the Kill-a-watt meter is the way to go. [...] As a working assumption, I take b to be constant going forward.

That's kind of overdoing it, and if you want super-accurate numbers it's also skewed, since GPU mining requires some CPU too, so without cards you get lower consumption if the CPU downclocks.

I just assume each rig itself uses 60 watts and work around that.

And since the rig will always be running, it's best to measure the power consumption at the wall with all the cards idle (mining software not running), and again once mining has been running for a while. The difference is your true GPU power consumption.

Not your keys, not your coins!
Appuned
Newbie | Activity: 68 | Merit: 0
July 06, 2017, 01:43:31 PM  #9

Quote from: jajamelony on July 06, 2017, 12:44:25 PM
I do have a kill-a-watt plug, but it's not good for in-depth analysis. [...] So are we saying that the on-board sensors in cards are not accurate?

With a kill-a-watt, you can estimate the power draw of a single GPU by the replacement method.
jajamelony (OP)
Newbie | Activity: 8 | Merit: 0
July 06, 2017, 03:32:30 PM  #10

Quote from: 64dimensions on July 06, 2017, 01:35:54 PM
Using the Kill-a-watt meter is the way to go. [...]

Quote from: bathrobehero on July 06, 2017, 01:42:52 PM
[...] measure the power consumption at the wall with all the cards idle (mining software not running), and again once mining has been running for a while. The difference is your true GPU power consumption.

Ok, that sounds like a good approach. Thank you!

jaja
PcChip
Sr. Member | Activity: 418 | Merit: 250
July 06, 2017, 05:08:37 PM  #11

Kill-a-watt.

Legacy signature from 2011: 
All rates with Phoenix 1.50 / PhatK
5850 - 400 MH/s  |  5850 - 355 MH/s | 5830 - 310 MH/s  |  GTX570 - 115 MH/s | 5770 - 210 MH/s | 5770 - 200 MH/s
coin123123
Sr. Member | Activity: 252 | Merit: 250
July 06, 2017, 08:15:55 PM  #12

Only a socket meter.

Tested on my last rig with two 280Xs:

GPU at 1050 mV, ambient temp 25 °C, GPU temps 70-80 °C = 550-600 W from the wall
GPU at 1050 mV, ambient temp 20 °C, GPU temps 60-70 °C = 450-500 W from the wall

Meaning that with the same settings, a change in room temperature alone can lower consumption, so without a live meter you can't know how much you're using.
Appuned
Newbie | Activity: 68 | Merit: 0
July 26, 2017, 11:59:51 AM  #13

Quote from: coin123123 on July 06, 2017, 08:15:55 PM
[...] with the same settings, a change in room temperature alone can lower consumption, so without a live meter you can't know how much you're using.

I also notice that the power draw increases when the temperature is higher.
Aura
Sr. Member | Activity: 518 | Merit: 268
July 26, 2017, 12:03:25 PM  #14

You can measure the GPU's power consumption with a multimeter. There are cheap multimeters available online; these can give you a decent measurement.
NobodyIsHome
Jr. Member | Activity: 74 | Merit: 1
July 26, 2017, 06:29:14 PM  #15

Like others have said, Kill-a-watt devices work.

Also, a digital UPS or PDU can measure load as well.
xxcsu
Hero Member | Activity: 1498 | Merit: 597
July 27, 2017, 03:23:27 AM  #16

Quote from: jajamelony on July 06, 2017, 12:17:11 AM
Can someone tell me how to calculate the total power drawn by a GPU?

So are you asking about the total GPU power draw? For that you can use GPU-Z; that software will give you a very accurate reading of the GPU chip's own power draw.

If you are asking about the total power draw of your graphics card, including the GPU, memory modules, voltage regulators, cooling fans, etc., that is a different story :) ;)

Quote from: Aura on July 26, 2017, 12:03:25 PM
You can measure the GPU's power consumption with a multimeter. [...]

Could you do a demonstration or write a how-to? How would you measure the GPU power draw on the graphics card with a multimeter? ??? :D

Quote from: Appuned on July 26, 2017, 11:59:51 AM
I also notice that the power draw increases when the temperature is higher.

When the temp is higher, the card's cooling fans work much harder :)

Learn about Merit & new rank requirements, learn how to use MERIT, and make this community better.
If you like the answer you got for your question from any member, or find any post useful or informative, use the +Merit button.
igotfits
Full Member | Activity: 298 | Merit: 100
July 27, 2017, 03:42:44 AM  #17

I was looking for the same answers, but everyone kept saying to get one of these: https://www.amazon.com/gp/product/B000RGF29Q/ref=oh_aui_detailpage_o04_s00?ie=UTF8&psc=1
So what I did was run the computer with no GPU, then add cards one by one, logging how much more wattage was being drawn.
The weird thing is that when adding a certain number of GPUs, the wattage would jump extremely high (example: with 2 GPUs I would get 300 W at the wall, so around 100 watts per card; then when adding a 3rd I would get 450 watts!).
So IMO it will never be accurate per card, only for the whole machine running, taking wattage at the wall.
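Logging the marginal increase after each added card makes that jump easy to spot; a rough sketch (the 2- and 3-card figures are from this post, the 0- and 1-card baselines are made up):

Code:
# Wall readings logged after each card is added (2- and 3-card figures from
# the post above; the 0- and 1-card baselines are hypothetical).
readings = {0: 100, 1: 200, 2: 300, 3: 450}  # cards -> watts at the wall

prev = None
for cards, watts in sorted(readings.items()):
    if prev is not None:
        print(f"card {cards}: +{watts - prev} W")  # marginal draw of the added card
    prev = watts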
NameTaken
Hero Member | Activity: 630 | Merit: 502
July 27, 2017, 04:10:58 AM  #18

Corsair Link shows my AC input, DC output, efficiency, temperature, fan speed, etc.
Appuned
Newbie | Activity: 68 | Merit: 0
August 25, 2017, 03:11:11 PM  #19

Quote from: NameTaken on July 27, 2017, 04:10:58 AM
Corsair Link shows my AC input, DC output, efficiency, temperature, fan speed, etc.

I also have a Corsair PSU. But it is expensive.
fanatic26
Hero Member | Activity: 756 | Merit: 560
August 25, 2017, 04:33:43 PM  #20

I am surprised so many people are recommending the kill-a-watt when, in testing, they prove to be inaccurate. The numbers can be skewed by ±10% with a kill-a-watt, and specifically with PC power supplies there are issues with it reading wrong and showing 100% efficiency. The proper way to measure your power is with an amp clamp on your power cord: you measure the amperage, then multiply it by the voltage to get wattage.

Say, for example, your rig is pulling 7.3 amps at 116 volts. With those numbers your rig is drawing about 847 watts.

Test with 1 card, then 2, etc., to see what your cards are actually pulling.
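That multiplication in code, for reference (a trivial sketch; note that for AC this strictly gives apparent power in volt-amps, and true watts also depend on the power factor):

Code:
def wall_watts(amps, volts):
    """Wall draw from a clamp-meter reading (strictly apparent power, VA)."""
    return amps * volts

print(wall_watts(7.3, 116))  # -> 846.8, the ~847 W from the example above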

Stop buying industrial miners, running them at home, and then complaining about the noise.