Title: Mhash/watt - gpus vs fpgas Post by: BCMan on March 12, 2012, 11:35:11 AM

Wondering how big the gap between them is. Is it possible to get close to FPGA efficiency with GPU downclocking/undervolting? Should I really care about it if I pay $0.057 per kWh and already have my GPUs with mem underclocked by 300 MHz and max undervolted?
Title: Re: Mhash/$/watt - gpus vs fpgas Post by: DeathAndTaxes on March 12, 2012, 12:34:14 PM

I am assuming you mean 5.7 cents per kWh, because watts are a measure of power and electricity is paid for per unit of energy.
watt = power
kW = power
watt-hours = energy
kWh = energy

If you pay 5.7 cents per kWh then FPGAs are a non-issue (at least in the near term). The true metric is total cost of ownership over the expected lifespan: capital cost plus all energy cost, divided by the number of hashes produced in, say, 18 months or 3 years. With very low power rates you can keep a low TCO without FPGAs.

TL;DR version: all that matters is the total cost (equipment + electricity + repairs + labor/management) to produce 1 PH (1 quadrillion hashes) over the lifetime (either actual or economic) of the equipment (or the life of under-warranty replacements).

To answer your direct question (some ballpark figures):

Code:
CPU mining:         0.01 to 0.25 MH/W (totally uncompetitive at this point)
28nm FPGA:          ~40 MH/W
LargeCoin (sASIC):  ~200 MH/W
Custom ASIC:        ~500 MH/W+

Title: Re: Mhash/$/watt - gpus vs fpgas Post by: BCMan on March 12, 2012, 12:49:09 PM

Quote: "I am assumming you mean 5.7 cents per kWh because watts is a measure of energy and electricity is paid per unit of power."

Ah yeah. Sorry, messed up the numbers, had a few bottles of strong beer here.

Quote: "28nm FPGA ~ 40 MH/W"

Is this actual stats of already-tested units, or another promo BS? Hard to believe these numbers.

Quote: "LargeCoin (sASIC) ~200 MH/W / Custom ASIC ~ 500 MH/W+"

Title: Re: Mhash/watt - gpus vs fpgas Post by: Global BTC on March 12, 2012, 12:54:45 PM

Quote: "watt = energy / KW = energy / watt hours = power / kWh = power"

You've mixed it up. Power is energy per time; 1 watt is 1 joule per second. Watt is a measure of power, kWh is a measure of energy.

Title: Re: Mhash/watt - gpus vs fpgas Post by: DeathAndTaxes on March 12, 2012, 12:57:18 PM

LargeCoin's figure is as advertised, not tested, so take it with a grain of salt: it could be a scam (not a very good one if it is), or it could be an over-optimistic estimate (remember BFL's 1.05 GH/s @ 19.8 W). Still, it is at least plausible that a current-gen sASIC could get ~200 MH/W. I would have thought 100 MH/W was more likely.
The "Custom ASIC" listed isn't a product offered by anyone.... yet. :) Think of it more as the upper limit of what is possible with current silicon. I took the wattage and speed of the SHA "testbed" chip which was used for testing the various candidate algorithms, one of which eventually became SHA-256, and quadrupled it to account for two die shrinks (the "testbed" processor was at 130nm). A 45nm custom ASIC (a chip that does nothing but SHA-256 hashing) should be in the ballpark of 500 MH/W. Beyond that you are only getting more efficiency by making the chip smaller (die shrink): 32nm would be ~2x the performance per watt, 22nm would be 4x, 16nm (next-gen Intel, looking at first production in 2015) would be 8x, etc. So one could use the "Custom ASIC" number as a "how close to perfect am I" metric. I added some caveats to the table for clarification.

Title: Re: Mhash/watt - gpus vs fpgas Post by: DeathAndTaxes on March 12, 2012, 12:58:04 PM

Quote: "You've mixed it up. Power is energy per time, 1 watt is 1 joule per second. Watt is a measure of power, kWh is a measure of energy."

You earn 1 BurnCoin. Ouch, can't believe I did that, and in a "correction". Fixed.

Title: Re: Mhash/$/watt - gpus vs fpgas Post by: BCMan on March 13, 2012, 02:02:42 AM

Quote: "I am assuming you mean 5.7 cents per kWh because watts is a measure of power and electricity is paid per unit of energy. ... If you pay 5.7 cents per kWh then FPGAs are a non-issue (at least in the near term)."

That's the info I needed. Thanks!
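The electricity term of DeathAndTaxes' cost-per-PH framing can be sketched numerically. This is an illustrative calculation only, using the poster's 5.7 cents/kWh rate and the ballpark MH/W figures from the thread's table; it ignores the capital-cost term of TCO entirely:

```python
# Electricity cost to produce 1 PH (10^15 hashes) at a given efficiency.
# MH/W figures are the thread's ballpark numbers; illustrative only.

def kwh_per_petahash(mh_per_watt: float) -> float:
    """kWh of energy needed to compute 10^15 hashes at mh_per_watt MH/W."""
    joules = 1e15 / (mh_per_watt * 1e6)  # 1 MH/W = 1e6 hashes per joule
    return joules / 3.6e6                # 1 kWh = 3.6e6 joules

RATE_USD_PER_KWH = 0.057  # the poster's electricity rate

for name, mhw in [("CPU (0.25 MH/W)", 0.25),
                  ("28nm FPGA (~40 MH/W)", 40.0),
                  ("sASIC (~200 MH/W)", 200.0),
                  ("Custom ASIC (~500 MH/W)", 500.0)]:
    kwh = kwh_per_petahash(mhw)
    print(f"{name:24s} {kwh:10.1f} kWh -> ${kwh * RATE_USD_PER_KWH:.2f} per PH")
```

At this rate the 28nm FPGA works out to roughly $0.40 of electricity per PH versus about $63 for a best-case CPU, which is why D&T calls FPGAs a non-issue at cheap power: the remaining gap is dominated by hardware cost per MH/s, not by the electricity term.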