It's 1 kWh, not 1 kW/h. (Energy = power × time: kW × hours = kWh.)
Yes all electrical power used by a computer gets converted into heat.
Technically it isn't exactly 100%. Some gets converted into infrared light and, if your computer is near a window, could "escape" your house. For all intents and purposes, though, 1 kWh of load = 1 kWh of thermal energy added to your house. 1 kWh ≈ 3,400 BTU if that helps you think about it in easier terms.
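The energy-to-heat conversion above is simple enough to sketch in a couple of lines (3,412 BTU/kWh is the standard conversion factor, often rounded to ~3,400):

```python
# 1 kWh = 3,412 BTU (the "~3400" figure above is this, rounded)
BTU_PER_KWH = 3412

def kwh_to_btu(kwh):
    """Convert electrical energy in kWh to thermal energy in BTU."""
    return kwh * BTU_PER_KWH

# A rig pulling 1 kW for one hour dumps ~3,412 BTU into the room
print(kwh_to_btu(1))   # 3412
print(kwh_to_btu(24))  # 81888 BTU for a 1 kW rig running all day
```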
|
|
|
"I have an old disk with Windows on it "
As in a hard drive with Windows already installed on another machine? That is NEVER going to work. A Windows install has lots of low-level system drivers tied to the original hardware. REINSTALL a fresh copy of Windows.
If you didn't mean a hard drive, then I have no clue, because as the post above me indicated your post contained no real details. "I have this car and it has these parts but it doesn't go. What is wrong?"
|
|
|
"it doesn't go through the POST sequence, goes directly to FF"
Try removing both video cards and power up the system (a video card isn't required for POST). If it still skips the POST sequence and goes straight to FF, I would RMA the MB first. That is simply not normal; I've never seen a MB do that. 50 is when the displays come on, BTW. I had enough tense moments getting my first 3x 5970 water-cooled system working that I think I have the LED POST sequence memorized.
|
|
|
nope...
it's in drivers AMD writes...
In Linux, though? So the drivers are fine with me BIOS-flashing my 6970s as I please, but there's some special case for 6990s? wtf. The 6990 has two BIOSes, one for each GPU. I don't have a link, but I remember seeing a thread on the overclockers forum about it. If you only flash one of the two BIOSes (i.e. for GPU 0 but not GPU 1), the drivers hang.
|
|
|
There isn't that much of a "multiplier" in trading other currencies for bitcoin. I believe at most you can make ~20% more by mining other coins and trading them for bitcoins.
If a CPU is useless for bitcoins it is useless for alternative currencies also.
|
|
|
Very nice. Even though it's not economical for me, I may just have to buy one anyway.
|
|
|
"Power supply is a Silverstone Strider 1500W 80 Plus Silver. All GPUs have memory clocked down to 250 MHz; affinity is set to the first core, done in a batch file."
Well that is just WEIRD! Widely divergent power draws with roughly the same settings and equipment.
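For anyone wondering how the "affinity set in a batch file" part is usually done on Windows, a minimal sketch (`miner.exe` is a placeholder name for whatever miner binary you run):

```shell
:: Windows batch: /affinity takes a hex CPU mask; 1 = core 0 only
start /affinity 1 miner.exe
```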
|
|
|
"When I measure power consumption at the wall, I read about 1050W. What does this number represent? Is it the power draw of the power supply, after which 924W are provided to the system?" Yes, this is correct. "Or is it the actual consumption of the system after taking into consideration its efficiency (1050 / 88% = 1193W)?"
No, that is incorrect. If your power supply is 88% efficient, then 1050W AC at the wall means the power supply pulls in 1050W. It converts 88% of that (924W) into useful DC electricity and the remaining 126W into heat. Remember, though, efficiency varies depending on load. Your power supply is well matched: generally you want the load to be 60% to 80% of peak, because that is where a power supply is most efficient.
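The arithmetic above, as a sketch:

```python
def psu_power_split(ac_watts, efficiency):
    """Split wall (AC) draw into useful DC output and waste heat."""
    dc_out = ac_watts * efficiency
    heat = ac_watts - dc_out
    return dc_out, heat

dc, heat = psu_power_split(1050, 0.88)
print(round(dc), round(heat))  # 924 W to the system, 126 W lost as heat in the PSU
```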
|
|
|
It varies from card to card. The best thing to do is experiment yourself.
I have a 3x 5970 water-cooled rig. With memory at full speed, temps are 30C over ambient and it pulls 1080W at the wall. With the core overclocked to 900MHz (from 735MHz) and memory underclocked to 200MHz, temps are 24C over ambient and it pulls 960W at the wall.
Use a wall meter like a Kill-a-watt, and set the fan to a static speed (say 70%) so temps are comparable.
Pick one setting, run the miner for 3 hours (for a good sample size), and record the temp and kWh used. Then adjust the memory clock and repeat.
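The bookkeeping for those test runs boils down to hashes of work per kWh at the wall; a sketch, with illustrative numbers rather than measurements:

```python
def mh_per_kwh(hashrate_mhs, wall_watts):
    """MH of work done per kWh at the wall: (MH/s * 3600 s/h) / (kWh/h)."""
    mh_per_hour = hashrate_mhs * 3600
    kwh_per_hour = wall_watts / 1000
    return mh_per_hour / kwh_per_hour

# Hypothetical runs at two memory clocks, same hashrate
print(mh_per_kwh(370, 1080))  # memory at full clock
print(mh_per_kwh(370, 960))   # memory underclocked: more MH per kWh
```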
|
|
|
"I have a Sapphire ATI Radeon HD 5870 running at out of the box settings, 850 MHz / 1200 MHz, and I get around 370 Mhash/sec hashing speed." Even if you don't want to overclock, KILL THAT MEMORY SPEED. 1GB or 2GB at 1200MHz gulps the juice down, and all that electricity becomes heat. If nothing else, your card will run cooler, quieter, and use less electricity for the same number of hashes. If you do overclock the core, having a lower memory clock gives you some headroom.
|
|
|
"ya i'm pulling between 1100 to 1200w"
That seems like a lot for four 6970s. I've got a rig with four 6950s that only pulls about 800W, and they're basically the same chipset. The 6970 has a slightly higher stock voltage (1.175V), but that doesn't seem like enough to account for a 300W difference. Three potential sources of the higher load. Power supply: one that is 75% efficient vs 90% efficient will add roughly 175W of extra AC draw on the same 800W DC load. Memory clock: at full clock the memory on these cards sucks down some juice; in my experience downclocking from 1375MHz to 300MHz saves 30W a card. CPU: on a quad core, if you have the 100% CPU bug you will be burning 10W-30W extra, and setting affinity to a single core can reduce that by 70%.
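The power-supply part of that comparison is easy to check: convert the same DC load back to AC draw at each efficiency. A sketch, using the 800W load mentioned above:

```python
def ac_draw(dc_watts, efficiency):
    """AC wall draw needed to deliver a given DC load."""
    return dc_watts / efficiency

# Same 800W DC load on a 75%- vs 90%-efficient power supply
delta = ac_draw(800, 0.75) - ac_draw(800, 0.90)
print(round(delta))  # ~178 W extra at the wall for the less efficient unit
```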
|
|
|
Looks like they extended it to 10/4.
Not sure why the 5970 doesn't get more love. It kills any other card in terms of power efficiency, and with the right cooling it can be solidly overclocked. 90% of the performance of a 6990 for almost half the cost, with more MH per kWh thrown in as a bonus.
|
|
|
Buy a kill-a-watt. Every miner should have one.
I don't think you will get another 5970 though.
A 3x 5970 system draws 945W at the wall; overclocked they draw 1020W.
I would guess 4x 5970 are drawing ~1100W, maybe 1200W overclocked. Unless you want efficiency to fall off a cliff (wasted money and more heat), you don't want to load the power supply to more than 80%. 1500W @ 80% = 1200W. You likely could squeeze one more in, but the PSU will be running very hot and inefficiently.
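The 80%-loading rule of thumb above, sketched:

```python
def safe_dc_load(psu_rating_watts, max_load_fraction=0.8):
    """Maximum DC load that keeps a PSU inside its efficient range."""
    return psu_rating_watts * max_load_fraction

print(safe_dc_load(1500))  # 1200.0 W budget for a 1500 W unit
```

At ~1100-1200W for four overclocked 5970s, a fifth card would push past that budget, which is the "hot and inefficient" warning above.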
|
|
|
Fair enough. You're right that I wasn't figuring the power of the computer the FPGA boards are connected to (though I would think that this could be pretty low - a netbook perhaps). I wouldn't want to drop my voltage/clock as low as you suggest, but I see your point that by doing this I could come closer to the efficiency of FPGAs. This increases the $/Mh, but this ratio is high for FPGAs anyway. BTW, do you know anyone who mines at such a low voltage?
No, but very few people mine with FPGAs either, for the same reason. Underclocking a GPU lowers the electrical cost per hash but raises the capital cost (the same card produces fewer hashes). Given the high risk in mining (who knows what bitcoin will be worth in 6, 12, 24 months), that tradeoff isn't currently worth it for most miners. It is more profitable to run miners clocked to the max (and gulping power), as bitcoins are still selling for 3x power costs. Still, for miners who have already bought hardware, underclocking would be a means to "stay in the game" longer if bitcoin prices fell significantly.
|
|
|
"That probably explains my high stale rate... I have 9 rigs on hubs that are on 200 feet of cat5 from the router"
You are kidding, right? Good Ethernet cable transmits at about 70% of the speed of light. 200ft of extra cable adds maybe 0.0000003 seconds to transmission time.
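A quick sanity check on the cable-delay claim (0.7c is a typical velocity factor for twisted pair):

```python
SPEED_OF_LIGHT = 299_792_458  # m/s
VELOCITY_FACTOR = 0.7         # typical for Cat5 twisted pair
FEET_TO_METERS = 0.3048

def cable_delay_seconds(feet):
    """One-way signal propagation delay over a copper run."""
    meters = feet * FEET_TO_METERS
    return meters / (SPEED_OF_LIGHT * VELOCITY_FACTOR)

print(cable_delay_seconds(200))  # ~2.9e-07 s, i.e. ~0.3 microseconds
```

Sub-microsecond delay is many orders of magnitude below anything that could cause stale shares; those come from network latency to the pool, not cable length.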
|
|
|
You want a utility which will launch a program (in this case the miner) when the computer has been idle for x amount of time. Here is one example: http://appsapps.info/idlestart.php Note: I haven't used it, tested it, or even downloaded it; it is just an example. Likely there are other similar utilities which do the same thing, and that is what you want to look for. Triggering off the screensaver or monitor power-off is a dead end. Just look for a utility which will start an app after x minutes of idle time.
|
|
|
You should go solo just for the thrill. It's like a free casino.
"Free" as in large electric bills wagered against potentially no return.
|
|
|
You can do far better than that if you undervolt and underclock. If you want efficiency, try cutting the clock IN HALF and cutting voltage down 20%. Although you still won't be getting to 12 MH/s per watt.
Still, comparing an LX150-2 to an ENTIRE COMPUTER is silly; the LX150 requires a computer too. Lastly, I think the OP was indicating that a GPU miner where electricity is cheap could underclock/undervolt and have a lower PRODUCTION COST (not raw efficiency) than an FPGA miner where electricity is expensive.
Given some parts of the US have 6 cent electricity and some parts of Europe have 20+ cent electricity, that is certainly possible. The OP's claim is one extreme and yours is the other; the reality is somewhere in the middle.
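The production-cost comparison above can be sketched like this. The hashrates and power draws below are illustrative placeholders, not measurements of any real card or FPGA; only the 6¢/20¢ electricity prices come from the post:

```python
def usd_per_mh_day(wall_watts, mh_s, price_usd_per_kwh):
    """Electricity cost per day per MH/s of sustained hashrate."""
    kwh_per_day = wall_watts / 1000 * 24
    return kwh_per_day * price_usd_per_kwh / mh_s

# Hypothetical: underclocked GPU on cheap US power vs FPGA on pricey EU power
gpu  = usd_per_mh_day(wall_watts=120, mh_s=200, price_usd_per_kwh=0.06)
fpga = usd_per_mh_day(wall_watts=20,  mh_s=100, price_usd_per_kwh=0.20)
print(f"GPU: ${gpu:.4f}/MH/day, FPGA: ${fpga:.4f}/MH/day")
```

With these made-up numbers the GPU's production cost per hash comes out lower despite its far worse raw W/MH, which is the point being made: electricity price can dominate hardware efficiency.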
|
|
|
cgminer is very useful. The bad news is it doesn't overvolt properly, and to get the most out of a 5970 you need to raise the voltage. Hell, it doesn't even report voltage properly.
cgminer has made things simpler, but I still need to use Afterburner to modify voltage. The annoying thing about Afterburner is it won't lower the mem clock below 500, while cgminer can lower it to 200 (likely lower than that, but hashrate suffers).
To the OP: sadly I haven't found any utility which works 100% of the time to modify clock, memory, and voltage. It seems like a pretty easy thing to do, but apparently it isn't.
There is a command line utility, barelyclocked, but it won't overvolt 5970s beyond 1.1V.
|
|
|
It doesn't fix the 100% CPU bug for multiple GPUs. If you have a single GPU it fixes the 100% bug, but then again AMD created that bug in 11.7.
The 100% CPU bug with multiple GPUs has been around for 2.5 years now and still hasn't been fixed by AMD, despite being an outstanding issue, reported on their forum and re-confirmed with every single driver release.
|
|
|
|