klapeck
Newbie
Offline
Activity: 21
Merit: 0


June 21, 2011, 03:01:43 PM 

How much power would I be using, monthly? I'm in the USA (Florida).
If I built this rig, only 3 of them:
2 x Radeon HD 5970 Video Card
1 x ASUS Rampage III Extreme Motherboard
1 x Thermaltake VM400M1W2Z V9 BlacX Edition Mid-Tower Case (ATX, µATX, dual SATA HDD dock, USB 3.0, 230mm silent fan)
1 x Thermaltake TPG850M ToughPower Grand Power Supply (850W, 80 Plus Gold, 140mm fan, CrossFireX certified, SLI certified)
1 x Corsair XMS3 TW3X4G1333C9AG 4GB Dual Channel DDR3 RAM (PC10666, 1333MHz, 4096MB as 2x 2048MB, 240-pin, dual channel)
1 x Seagate ST31000520AS Barracuda LP Hard Drive (1TB, 5900RPM, 32MB cache, SATA 3G)
1 x AMD HDX920XCJ4DGI Phenom II X4 920 Quad Core OEM Processor (2.80GHz, 6MB cache, 1800MHz / 3600 MT/s FSB, Deneb, quad core, OEM, Socket AM2+)







Salain
Newbie
Offline
Activity: 18
Merit: 0


June 21, 2011, 04:46:43 PM 

$0.10 per amp. Each video card uses 2.4 amps, so that's 4.8 just on the video cards, and maybe another 1 amp on the PC itself. So figure 5.8 amps.
5.8 amps x $0.10 = $0.58 per day = $17.40 per month.
Electricity use is charged by the watt-hour (which includes the voltage the amps are pushed at), not by the amp. You'd be drawing maybe 650 W, i.e. 0.65 kW. That's about 468 kWh per month; at 13 cents a kWh that would be $60.84 per month. Probably a conservative guess on the high side.
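For anyone following along at home, the watts-based math can be written out in a few lines of Python. The 650 W draw and $0.13/kWh rate are just the example numbers from this post, not universal figures:

```python
# Monthly electricity cost for a constant load.
# watts -> kWh per month -> dollars, using the post's example numbers.
def monthly_cost(watts, rate_per_kwh, hours=24, days=30):
    kwh_per_month = watts / 1000 * hours * days
    return kwh_per_month * rate_per_kwh

cost = monthly_cost(650, 0.13)
print(f"${cost:.2f} per month")  # 650 W -> 468 kWh -> $60.84 per month
```

Swap in your own wattage and your utility's rate to get a real estimate.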




kajung2k
Newbie
Offline
Activity: 4
Merit: 0


June 21, 2011, 05:01:39 PM 

650 watts x 24 hours = 15,600 watt-hours used per day. Convert to kilo: 15,600 / 1000 = 15.6 kWh used. Electricity is $0.13 a kWh,
so your bill will be $2.03 per day ($2.028, I rounded up).
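The key unit step here is power (watts) times time (hours) giving energy (watt-hours), which is what the meter actually bills. A quick sketch with the same numbers:

```python
# Power (W) x time (h) = energy (Wh); divide by 1000 for kWh, the billed unit.
power_w = 650
energy_wh = power_w * 24        # 15,600 Wh in one day
energy_kwh = energy_wh / 1000   # 15.6 kWh
daily_cost = energy_kwh * 0.13  # at $0.13 per kWh
print(f"{energy_kwh} kWh/day -> ${daily_cost:.2f}/day")  # 15.6 kWh/day -> $2.03/day
```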





Salain
Newbie
Offline
Activity: 18
Merit: 0


June 21, 2011, 05:41:51 PM 

But also take into consideration... you aren't using the full 850 watts. Watts = Volts x Amps (Ohm's law). If you are in the US, volts is 120 on a standard outlet: http://www.angelfire.com/pa/baconbacon/page2.html With the amps I gave you, at full load you are using 648 watts. So you are looking at $43.55 per month.

The AC voltage at the wall doesn't matter; your computer's power supply converts that into lower-voltage DC. As far as I know, a 12 V rail supplies a lot of the power to a video card, so a good graphics card could pull over 25 amps at 12 V DC. At the end of the day, that isn't something you need to worry about, since computer equipment manufacturers report the maximum wattage requirements to you. Those parts can then be paired with a power supply, which is also rated simply in watts in the "big label" literature. If you're building a computer it's good to know the amperage at each voltage, but it isn't a necessary consideration for this discussion.
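To make the 12 V rail point concrete: amps only turn into watts at the voltage they're delivered at. The 25 A figure and the ~85% PSU efficiency below are illustrative assumptions, not measured values:

```python
# Power on a DC rail: P = V * I. 25 A at 12 V is 300 W --
# multiplying rail amps by the 120 V wall voltage would be wrong.
rail_volts = 12
rail_amps = 25                  # illustrative GPU draw on the 12 V rail
dc_watts = rail_volts * rail_amps
print(dc_watts)                 # 300

# What the wall outlet sees is the DC load divided by PSU efficiency
# (assuming ~85% as an example figure).
efficiency = 0.85
ac_watts = dc_watts / efficiency
print(round(ac_watts))          # 353
```

Same amps, wildly different watts depending on which voltage you multiply by, which is the whole disagreement in this thread.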




Zman0101
Member
Offline
Activity: 98
Merit: 10


June 21, 2011, 06:21:21 PM Last edit: June 21, 2011, 06:31:33 PM by Zman0101 

But also take into consideration... you aren't using the full 850 watts. Watts = Volts x Amps (Ohm's law). If you are in the US, volts is 120 on a standard outlet: http://www.angelfire.com/pa/baconbacon/page2.html With the amps I gave you, at full load you are using 648 watts. So you are looking at $43.55 per month. The AC voltage at the wall doesn't matter; your computer's power supply converts that into lower-voltage DC. As far as I know, a 12 V rail supplies a lot of the power to a video card, so a good graphics card could pull over 25 amps at 12 V DC. At the end of the day, that isn't something you need to worry about, since computer equipment manufacturers report the maximum wattage requirements to you. Those parts can then be paired with a power supply, which is also rated simply in watts in the "big label" literature. If you're building a computer it's good to know the amperage at each voltage, but it isn't a necessary consideration for this discussion.

Salain... OK, so let's think about this. If you have a power supply that puts out 850 watts max, right... you still with me? And one video card = 25 amps... and we know Ohm's law, that's been around for hundreds of years: to determine the amount of watts a device uses, Watts = Volts x Amps. You still with me, bra? 120 volts x 25 amps = 3000 watts = Salain is an idiot! Go on Newegg and find me a power supply that puts out 3000 watts and maybe you'll have a valid argument. "What a LAMA"




Salain
Newbie
Offline
Activity: 18
Merit: 0


June 21, 2011, 06:52:42 PM 

Salain... OK, so let's think about this. If you have a power supply that puts out 850 watts max, right... you still with me? And one video card = 25 amps... and we know Ohm's law, that's been around for hundreds of years: to determine the amount of watts a device uses, Watts = Volts x Amps.
You still with me, bra? 120 volts x 25 amps = 3000 watts = Salain is an idiot! Go on Newegg and find me a power supply that puts out 3000 watts and maybe you'll have a valid argument. "What a LAMA"
And there I was, trying to keep it constructive. It's your loss: there is a lot I could teach you if you weren't so afraid of being wrong that you're not willing to listen.




BitcoinDealer
Newbie
Offline
Activity: 28
Merit: 0


June 21, 2011, 07:05:16 PM 

A lot of power.




Zman0101
Member
Offline
Activity: 98
Merit: 10


June 21, 2011, 07:13:57 PM 

Bro, I have a rig with 3 XFX 6950s running... this is one of many. On a single 20 amp circuit with nothing else on it but that one rig, it draws 7.4 amps AT FULL LOAD. The circuit was tested with a Fluke meter. What is it that you can possibly teach me that makes this different? The original question is how much the machine is going to cost to run. Who gives a shit that it changes AC to DC; the power company just wants to know what you're drawing.
Watts = 120 volts x 7.4 amps = 888 watts. This rig has a 1200 watt PSU; it's not using all of the 1200 watts.
888 watts x $0.10 per kWh = roughly 9 cents per hour
9 cents per hour x 24 hours in a day = $2.16 per day
$2.16 per day x 30 days = $64.80
Riddle me that one, Batman? You're not right... you will be a noob forever.
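The measured-amps approach above can be written out directly. The post rounds the hourly figure up to 9 cents, which gives $64.80/month; kept unrounded it comes out a bit lower:

```python
# Monthly cost from AC amps measured at the wall: W = V * A, then $/kWh.
volts = 120
amps = 7.4                  # full-load draw measured with a clamp meter
watts = volts * amps        # 888 W
rate = 0.10                 # $ per kWh
hourly = watts / 1000 * rate
daily = hourly * 24
monthly = daily * 30
print(f"${hourly:.3f}/hr, ${daily:.2f}/day, ${monthly:.2f}/month")
# $0.089/hr, $2.13/day, $63.94/month
```

Measuring actual amps at the wall like this captures the PSU's efficiency losses automatically, which is why it's a more trustworthy number than adding up component ratings.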




Bitslizer
Newbie
Offline
Activity: 14
Merit: 0


June 21, 2011, 08:00:23 PM 

Bro, I have a rig with 3 XFX 6950s running... this is one of many. On a single 20 amp circuit with nothing else on it but that one rig, it draws 7.4 amps AT FULL LOAD. The circuit was tested with a Fluke meter. What is it that you can possibly teach me that makes this different? The original question is how much the machine is going to cost to run. Who gives a shit that it changes AC to DC; the power company just wants to know what you're drawing.
Watts = 120 volts x 7.4 amps = 888 watts. This rig has a 1200 watt PSU; it's not using all of the 1200 watts.
888 watts x $0.10 per kWh = roughly 9 cents per hour
9 cents per hour x 24 hours in a day = $2.16 per day
$2.16 per day x 30 days = $64.80
Riddle me that one, Batman? You're not right... you will be a noob forever.
Doesn't sound right... I've got 1 i3 + 6850, 1 C2D + 5770, 1 C2D desktop, and 2 C2D laptops, previously doing F@H CPU+GPU, and my electricity cost for those is only about $30 to $40 a month at $0.14/kWh.




Zman0101
Member
Offline
Activity: 98
Merit: 10


June 21, 2011, 08:35:38 PM 

Well, tell the Fluke people they are idiots, and the person who wrote Ohm's law too. Math is math, bro.




RevolutionMaster


June 21, 2011, 09:36:30 PM 

I don't worry about the specifics, and instead follow this simple rule:
assume that your power consumption will never be more than 90% of your PSU's rated load. If actual usage is less, that's awesome, but always plan for worst-case scenarios.
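That rule of thumb is easy to apply to the 850 W PSU from the original post. The 90% headroom factor is just this poster's heuristic, not a spec:

```python
# Worst-case planning: budget 90% of the PSU's rated output as the draw.
def planned_draw(psu_rated_watts, headroom=0.90):
    return psu_rated_watts * headroom

worst = planned_draw(850)            # 765.0 W for an 850 W PSU
kwh_month = worst / 1000 * 24 * 30   # energy if it ran flat out all month
print(f"{worst} W worst case, {kwh_month:.1f} kWh/month")
# 765.0 W worst case, 550.8 kWh/month
```

Multiply that kWh figure by your local rate for an upper-bound monthly bill.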




