Bitcoin Forum
December 09, 2016, 03:44:10 PM *
Author Topic: How much power would i be using?  (Read 1388 times)
klapeck
Newbie
*
Offline Offline

Activity: 21

www.btcmart.net


View Profile
June 21, 2011, 03:01:43 PM
 #1

How much power would I be using monthly? I'm in the USA, Florida.



If I built this rig (3 of them):

2 x Radeon HD 5970 Video Card
1 x ASUS Rampage III Extreme Motherboard
1 x Thermaltake VM400M1W2Z V9 BlacX Edition Mid-Tower Case - ATX, µATX, Dual SATA HDD Dock, USB 3.0, 230mm Silent Fan
1 x Thermaltake TPG-850M ToughPower Grand Power Supply - 850W, 80 Plus Gold, 140mm Fan, CrossFireX Certified, SLI Certified
1 x Corsair XMS3 TW3X4G1333C9AG 4GB Dual Channel DDR3 RAM - PC10666, 1333MHz, 4096MB (2x 2048MB), 240 Pin, Dual-Channel
1 x Seagate ST31000520AS Barracuda LP Hard Drive - 1TB, 5900RPM, 32MB, SATA-3G
1 x AMD HDX920XCJ4DGI Phenom II X4 920 Quad Core OEM Processor - 2.80GHz, 6MB Cache, 1800MHz (3600 MT/s) FSB, Deneb, Quad-Core, OEM, Socket AM2+
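For a rough sense of what this parts list might draw at full load, the worst-case board/TDP figures can simply be summed. This is a minimal Python sketch; the wattages are approximate manufacturer specs (not measurements from this thread), and the board/RAM/HDD allowance is a guess:

```python
# Rough full-load draw estimate for the listed build, summing
# approximate maximum board power / TDP figures. Values are
# ballpark manufacturer specs, not measurements.
parts_watts = {
    "2 x Radeon HD 5970": 2 * 294,   # ~294 W max board power each
    "Phenom II X4 920": 125,         # 125 W TDP
    "board + RAM + HDD + fans": 75,  # rough allowance (assumption)
}
total = sum(parts_watts.values())
print(total)  # → 788
```

Actual mining draw is usually well below a worst-case TDP sum, since the load sits mostly on the GPUs, which fits the ~650 W estimates given later in the thread.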
Salain
Newbie
*
Offline Offline

Activity: 18


View Profile
June 21, 2011, 04:46:43 PM
 #2

Quote
.10 cents per amp. Each video card uses 2.4 amps, so 4.8 amps just on the video cards and maybe another 1 amp on the PC itself. So figure 5.8 amps.

5.8 amps x .10 = .58 cents per day = 17.40 cents per month.

Electricity use is charged in watts (which accounts for the voltage the amps are pushed at), not amps. You'd be using maybe 650 W, or 0.65 kWh every hour. That's about 468 kWh per month; at 13 cents per kWh that would be $60.84 per month. Probably a conservative guess on the high side.
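Salain's arithmetic can be checked in a few lines. A minimal Python sketch, using the post's assumptions of a steady 650 W draw and $0.13 per kWh:

```python
# Monthly electricity cost for a steady load, using the post's
# assumed figures: 650 W continuous draw at $0.13 per kWh.
def monthly_cost(watts, rate_per_kwh, hours=24 * 30):
    kwh = watts / 1000 * hours  # watt-hours -> kilowatt-hours
    return kwh * rate_per_kwh

print(round(monthly_cost(650, 0.13), 2))  # → 60.84
```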

Sharecoin: SUtRDGuqYGJHevMWTLnPaurFDdzJqJcDvg
kajung2k
Newbie
*
Offline Offline

Activity: 4


View Profile
June 21, 2011, 05:01:39 PM
 #3

650 watts x 24 hours = 15,600 watt-hours used.
Convert to kilowatt-hours: 15,600 / 1000 = 15.6 kWh used.
Electricity is $0.13 per kWh...

So your bill will be about $2.03 per day ($2.028 -- I rounded up).
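Note that kajung2k's figure is the cost per day, not the monthly bill. A small Python check using the same assumptions (650 W, 24 h/day, $0.13/kWh), extended to a 30-day month:

```python
# Daily energy and cost for the figures above: 650 W for 24 h at $0.13/kWh.
wh_per_day = 650 * 24             # 15,600 watt-hours
kwh_per_day = wh_per_day / 1000   # 15.6 kWh
daily_cost = kwh_per_day * 0.13   # $2.028, about $2.03 per day
print(round(daily_cost, 2))       # → 2.03
print(round(daily_cost * 30, 2))  # → 60.84 (per 30-day month)
```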

Zman0101
Member
**
Offline Offline

Activity: 98


View Profile
June 21, 2011, 05:11:52 PM
 #4

My bad, KlaPeck... I calculated it wrong... Use this site:

http://www.electricity-usage.com/Electricity-Usage-Calculator.aspx?Device=&Watts=850&CostPerKWH=0.10&HoursPerDay=24


But also take into consideration: you aren't using the full 850 watts. Watts = volts x amps.

Ohm's Law.

If you are in the US, volts is 120 on a standard outlet.

http://www.angelfire.com/pa/baconbacon/page2.html

With the amps I gave you, at full load you are using 648 watts.

So you are looking at $43.55 per month.
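The watts-from-amps step can be sketched directly. A minimal Python check, assuming 120 V US mains; the 5.4 A total is the figure implied by the post's 648 W, not a number stated in the thread:

```python
# P = V x I: power drawn at the wall from an AC current figure.
# 120 V US mains; 5.4 A is the total implied by the post's 648 W.
volts = 120
amps = 5.4
watts = volts * amps
print(watts)  # → 648.0
```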
Salain
Newbie
*
Offline Offline

Activity: 18


View Profile
June 21, 2011, 05:41:51 PM
 #5


Quote from: Zman0101
But also take into consideration: you aren't using the full 850 watts. Watts = volts x amps. ... With the amps I gave you, at full load you are using 648 watts. So you are looking at $43.55 per month.

The AC voltage at the wall doesn't matter; your computer's power supply converts it into lower-voltage DC. As far as I know, a 12 V rail supplies a lot of the power to a video card, so a good graphics card would probably pull over 25 A at 12 V DC. At the end of the day, that isn't something you need to worry about, since computer equipment manufacturers report the maximum wattage requirements to you. Those parts can then be paired with a power supply whose capacity is also simplified to watts on the "big label". If you're building a computer it's good to know the amperage at each voltage, but it isn't a necessary consideration for this discussion.
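Salain's rail-vs-wall point checks out numerically: the same power corresponds to very different amperages at 12 V DC and 120 V AC. A sketch using a hypothetical ~300 W card, ignoring PSU conversion losses for simplicity:

```python
# Same power, different amps depending on which voltage you measure at.
# A roughly 300 W graphics card (the "over 25 A at 12 V DC" example),
# ignoring PSU conversion losses for simplicity.
power_w = 300
rail_amps = power_w / 12     # 25.0 A on the PSU's 12 V DC rail
wall_amps = power_w / 120    # 2.5 A drawn from the 120 V AC outlet
print(rail_amps, wall_amps)  # → 25.0 2.5
```

Multiplying the 120 V wall voltage by the 12 V rail's amperage mixes the two sides of the power supply, which is why that pairing produces impossible numbers.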

Sharecoin: SUtRDGuqYGJHevMWTLnPaurFDdzJqJcDvg
Zman0101
Member
**
Offline Offline

Activity: 98


View Profile
June 21, 2011, 06:21:21 PM
 #6


Quote from: Salain
The AC voltage at the wall doesn't matter; your computer's power supply converts it into lower-voltage DC. ... A good graphics card would probably pull over 25 A at 12 V DC. ... it isn't a necessary consideration for this discussion.

Salain... OK, so let's think about this... If you have a power supply that puts out 850 watts max, right... You still with me? And one video card = 25 amps... and we know Ohm's Law, that's been around for hundreds of years: to determine the amount of watts a device uses, watts = volts x amps.

You still with me, bra...? 120 volts x 25 amps = 3000 watts = Salain is an idiot! Go on Newegg and find me a power supply that puts out 3000 watts and maybe you'll have a valid argument. "What a LLAMA"
Salain
Newbie
*
Offline Offline

Activity: 18


View Profile
June 21, 2011, 06:52:42 PM
 #7

Quote from: Zman0101
Salain... Ok so lets think about this... And one video card = 25 amps... 120 volts x 25 amps = 3000 Watts = Salain is an idiot! Go on Newegg and find me a powersupply that puts out 3000 watts...

And there I was, trying to keep it constructive.  It's your loss - there is a lot I could teach you if you weren't so afraid of being wrong that you're not willing to listen.

Sharecoin: SUtRDGuqYGJHevMWTLnPaurFDdzJqJcDvg
BitcoinDealer
Newbie
*
Offline Offline

Activity: 28


View Profile
June 21, 2011, 07:05:16 PM
 #8

A lot of power.

Bitcoin will fail (unfortunately)

Look at my previous posts for explanation

Just wait for Bitcoin v2
Zman0101
Member
**
Offline Offline

Activity: 98


View Profile
June 21, 2011, 07:13:57 PM
 #9

Bro, I have a rig with 3 XFX 6950s running... This is one of many. On a single 20 amp circuit with nothing else on it but that one rig, it draws 7.4 amps AT FULL LOAD. The circuit was tested with a Fluke meter. What is it that you can possibly teach me that makes this different? The original question is how much the machine is going to cost to run. Who gives a shit that it converts AC to DC? The power company just wants to know what you're drawing.

watts = 120 volts x 7.4 amps = 888 watts. This rig has a 1200 watt PSU. Not using all of the 1200 watts.

0.888 kW x $0.10 per kWh ≈ 9 cents per hour

9 cents per hour x 24 hours in a day ≈ $2.16 per day

$2.16 per day x 30 days = $64.80


Riddle me that one, Batman? You're not right... You will be a noob forever.
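Zman0101's measured-draw arithmetic, as a quick Python check using his 7.4 A clamp-meter reading and an assumed $0.10/kWh. Note the post rounds the hourly cost up to 9 cents before multiplying, which is where $2.16 and $64.80 come from; carrying full precision gives slightly lower figures:

```python
# Cost from a measured wall-side draw: 7.4 A at 120 V, $0.10 per kWh.
watts = 120 * 7.4                    # 888.0 W at full load
cost_per_hour = watts / 1000 * 0.10  # ≈ $0.0888 per hour
print(round(cost_per_hour * 24, 2))  # → 2.13 (per day)
print(round(cost_per_hour * 720, 2)) # → 63.94 (per 30-day month)
```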
Bitslizer
Newbie
*
Offline Offline

Activity: 14


View Profile
June 21, 2011, 08:00:23 PM
 #10

Quote from: Zman0101
...it draws 7.4 amps AT FULL LOAD. ... watts = 120 volts x 7.4 amps = 888 Watts. ... $2.16 per day x 30 days = $64.80

Doesn't sound right...

I've got 1 i3 + 6850, 1 C2D + 5770, 1 C2D desktop, and 2 C2D laptops, previously doing F@H CPU+GPU, and my electricity cost for those is only about $30-$40 a month at $0.14/kWh.


Zman0101
Member
**
Offline Offline

Activity: 98


View Profile
June 21, 2011, 08:35:38 PM
 #11

Well, tell the Fluke people they are idiots, and the person who wrote Ohm's Law. Math is math, bro.
RevolutionMaster
Full Member
***
Offline Offline

Activity: 126


View Profile
June 21, 2011, 09:36:30 PM
 #12

I don't worry about the specifics, and instead follow these simple rules.

Assume that your power consumption will never be more than 90% of your PSU's rated load.
If actual usage is less, that's awesome, but always plan for worst-case scenarios.
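RevolutionMaster's headroom rule is easy to encode. A minimal sketch; the 90% figure is from the post above, and the example wattages (648 W build, 850 W PSU) are the ones discussed earlier in the thread:

```python
# PSU sizing heuristic: keep the estimated full-load draw within
# 90% of the supply's rated wattage, per the rule of thumb above.
def psu_ok(load_watts, rated_watts, headroom=0.90):
    return load_watts <= rated_watts * headroom

print(psu_ok(648, 850))  # → True  (648 W fits a 765 W budget)
print(psu_ok(800, 850))  # → False (over the 90% line)
```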