Bitcoin Forum
Author Topic: Purchasing a Bitforce unit = betting on stagnation?  (Read 4782 times)
chickenado (OP)
Hero Member
*****
Offline Offline

Activity: 1036
Merit: 500



View Profile
December 08, 2011, 12:03:43 PM
Last edit: December 08, 2011, 12:50:21 PM by chickenado
 #1

At current difficulty, a Butterfly Labs Bitforce unit takes about 8 months to earn back the 230 BTC that it cost to buy.  Even if the number of miners remains equal, I would expect difficulty to go up as a result of miners switching to Bitforce units.  So a more realistic amortization time is about 12 months.
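The payback arithmetic in this post can be sketched in a few lines of Python (the 230 BTC cost and the ~8-month flat-difficulty figure are from the post; the monthly difficulty growth rate is an illustrative assumption, not a thread figure):

```python
# Sketch: months for a 230 BTC unit to earn itself back if income starts
# at 230/8 BTC/month (the ~8-month figure above) and shrinks as difficulty
# grows. The 5% monthly growth rate is an assumption for illustration.

def payback_months(cost_btc=230.0, start_income_btc=230.0 / 8,
                   monthly_difficulty_growth=0.05, max_months=60):
    earned, income = 0.0, start_income_btc
    for month in range(1, max_months + 1):
        earned += income
        if earned >= cost_btc:
            return month
        income /= 1 + monthly_difficulty_growth  # income falls as difficulty rises
    return None  # never breaks even within max_months

print(payback_months(monthly_difficulty_growth=0.0))   # flat difficulty: 8 months
print(payback_months(monthly_difficulty_growth=0.05))  # 5%/month growth: 10 months
```

Even a modest sustained difficulty rise stretches the 8-month figure toward the 12 months the post estimates.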

This doesn't seem like a very good bet to me.  12 months is like a century in the bitcoin universe.  My prediction is that in the next 12 months, bitcoin will either a) experience another huge growth spurt, perhaps by 1-2 orders of magnitude, or b) continue to shrink until only a handful of diehard enthusiasts use it.   I don't see very much room for stagnation.

If case a) materializes, the Bitforce unit is a bad investment because it will never earn back the 230 BTC.

If case b) materializes, it's also a bad investment because then you can make more money by selling those 230 BTC now and buying them back when the price hits 0.1 USD.

Even if stagnation materializes, it's a risky investment because the unit could be made obsolete by superior ASICs much sooner than Jan 2013 when it finally starts to return profits.

If you are one of the buyers, I am curious: What are your motivations?
DeepBit
Donator
Hero Member
*
Offline Offline

Activity: 532
Merit: 501


We have cookies


View Profile WWW
December 08, 2011, 12:48:45 PM
 #2

This doesn't seem like a very good bet to me.  12 months is like a century in the bitcoin universe.  My prediction is that in the next 12 months, bitcoin will either a) experience another huge growth spurt, perhaps by 1-2 orders of magnitude, or b) continue to shrink until only a handful of diehard enthusiasts use it.   I don't see very much room for stagnation.
1. 12 months is a very good return time in the real world. Yes, breaking even in 1-2 months is very cool, but usually much more suspicious.

2. "continue to shrink" - oh, really? As I see it, the number of different BTC businesses and merchants is now much greater than before and continues to grow. Also, the price is more stable.

Even if stagnation materializes, it's a risky investment because the unit could be made obsolete by superior ASICs much sooner than Jan 2013 when it finally starts to return profits.
Working on this.

Welcome to my bitcoin mining pool: https://deepbit.net ~ 3600 GH/s, Both payment schemes, instant payout, no invalid blocks !
Coming soon: ICBIT Trading platform
ElectricMucus
Legendary
*
Offline Offline

Activity: 1666
Merit: 1057


Marketing manager - GO MP


View Profile WWW
December 08, 2011, 12:51:58 PM
 #3

Mining is going to remain unprofitable until transaction fees kick in. Oh I nearly forgot: Isn't there a design flaw?
Something which makes it unprofitable to share transactions with high reward?
finway
Hero Member
*****
Offline Offline

Activity: 714
Merit: 500


View Profile
December 08, 2011, 12:55:36 PM
 #4

Yeah, you're right. Only a big rise will make FPGAs make sense.

caston
Hero Member
*****
Offline Offline

Activity: 756
Merit: 500



View Profile WWW
December 08, 2011, 01:11:15 PM
 #5

A lot of transaction fees are lost to exchanges in the form of "trade commissions". So although Magical Tux is making a nice profit, a lot of miners are struggling. At the same time, traders are being gouged more in fees by conventional services such as bank transfers, credit cards, PayPal and so on than they otherwise would be. Ironic that the very same bitcoin that is supposed to be saving people fees is actually increasing them. If you want to develop bitcoin further, look at applications that return more transaction fees to miners and give less to banks and middlemen.

bitcoin BTC: 1MikVUu1DauWB33T5diyforbQjTWJ9D4RF
bitcoin cash: 1JdkCGuW4LSgqYiM6QS7zTzAttD9MNAsiK

-updated 3rd December 2017
rph
Full Member
***
Offline Offline

Activity: 176
Merit: 100


View Profile
December 08, 2011, 10:12:47 PM
Last edit: December 08, 2011, 10:24:00 PM by rph
 #6

12 months is a very good return time in real world.

Exactly. Those complaining about 1-2 year payoff periods must live in some alternative universe where
it's easy to earn 50-100% a year, with a small amount of startup capital, from anywhere in the world.

A miner that produces its own cost in BTC over 1 year is still worth at least 50% of its initial cost after that year.
So: it's earning a 50%+ return on investment.

For people with a certain skill set willing to tolerate a certain level of risk -
that looks very attractive compared to 5% corporate bonds, 0.9% FDIC savings accounts, etc.
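rph's framing can be written out as a one-line calculation (the earned-back fraction and the ~50% resale value are the post's assumptions):

```python
# One-line version of the framing above: a unit that earns its own cost
# back in BTC over a year and still resells for ~50% of its price has
# returned 150% of the outlay, i.e. a 50% net gain.

def first_year_roi(cost=1.0, earned_fraction=1.0, resale_fraction=0.5):
    ending_value = cost * earned_fraction + cost * resale_fraction
    return (ending_value - cost) / cost

print(first_year_roi())  # 0.5, i.e. the "50%+ return" in the post
```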

-rph

Ultra-Low-Cost DIY FPGA Miner: https://bitcointalk.org/index.php?topic=44891
nmat
Hero Member
*****
Offline Offline

Activity: 602
Merit: 501


View Profile
December 08, 2011, 10:28:24 PM
 #7

A 12-month payback is great if it actually happens. Depending on your electricity price, you may never break even. Even if prices remain the same, difficulty will probably rise until the average production cost on an FPGA matches the market price.
Mousepotato
Hero Member
*****
Offline Offline

Activity: 896
Merit: 1000


Seal Cub Clubbing Club


View Profile
December 08, 2011, 10:30:40 PM
 #8

To put things in real world perspective let's pit a BFL Bitforce FPGA against an AMD 5970.

Price:
5970: $300-400
BFL: $700

MH/s:
5970: 840MH/s
BFL: 1000MH/s

Power Consumption:
5970: 350W
BFL: 20W

Cost of operation @ $.10/KWh:
5970: $26.04/mo
BFL: $1.55/mo


Money saved by staying home during the week playing WoW/BF3/MW3 instead of going out:
5970: $500-800/mo
BFL: N/A


So yeah, the BFL Bitforce FPGA might save a little power in the long run, but because you can't game on it, is it really worth it? Tongue
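For what it's worth, the cost-of-operation figures above can be reproduced with a simple watts-to-dollars conversion (the 5970 figure matches a 31-day month exactly; on that basis the 20 W BFL comes out at $1.49, slightly below the quoted $1.55):

```python
# Reproducing the cost-of-operation lines above: watts -> $/month at
# $0.10/kWh, assuming a 31-day month (which the 5970 figure implies).

def monthly_power_cost(watts, usd_per_kwh=0.10, days=31):
    return watts / 1000 * 24 * days * usd_per_kwh

print(round(monthly_power_cost(350), 2))  # 5970: 26.04
print(round(monthly_power_cost(20), 2))   # BFL:  1.49
```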

Mousepotato
runeks
Legendary
*
Offline Offline

Activity: 980
Merit: 1008



View Profile WWW
December 08, 2011, 11:04:45 PM
 #9

Mining is going to remain unprofitable until transaction fees kick in. Oh I nearly forgot: Isn't there a design flaw?
Something which makes it unprofitable to share transactions with high reward?
Mining is supposed to always fluctuate at the border between unprofitable and profitable; that's how users of Bitcoin get the lowest possible fees. Mining profitability will never get back to the level it was at 6 months ago. Back then, all Bitcoin users were financing miners through the inflation of bitcoins.
TL;DR: paying 50 BTC per block (as we're doing now) is a huge amount when there are no more than 50 transactions in a block. In essence, every Bitcoin user right now is, through inflation, sharing a per-transaction fee of 1 BTC for everyone's transactions.

Mining right now - assuming Bitcoin doesn't disappear - is probably more profitable than it ever will be again.
mtminer
Member
**
Offline Offline

Activity: 86
Merit: 10


View Profile
December 09, 2011, 06:12:33 AM
 #10

Everyone seems to forget that in a year the 50-coin reward is going to go down to 25 per block. That makes it pretty difficult to put hard-earned money out, when you can't control price or difficulty, knowing that your gross cash flow is going to get cut in half.

What is the reliability going to be like on these home-brewed FPGA designs?
sadpandatech
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500



View Profile
December 09, 2011, 01:50:30 PM
 #11

To put things in real world perspective let's pit a BFL Bitforce FPGA against an AMD 5970.

Price:
5970: $300-400
BFL: $700

MH/s:
5970: 840MH/s
BFL: 1000MH/s

Power Consumption:
5970: 350W
BFL: 20W

Cost of operation @ $.10/KWh:
5970: $26.04/mo
BFL: $1.55/mo

Price:
5970: $300-400 (resale to miners @6-12months 0% due to 0 ROI from power usage)(resale to gamers with 7xxx available??)
BFL: $700 (resale to miners 100% pre asic) (cost to repurpose and value unknown)

MH/s:
5970: 840MH/s (@925 MHz using phatk mod?)
BFL: 900MH/s  (so far)

Power Consumption:
5970: 374w (@ 925MHz, 840MH/s) (Does not account for non-idle cpu, mobo, PSU heat usage)
BFL: 60w (@ ~900MH/s) (numbers not official but is known to be more than 20w, less than 80w for now) (Does not account for ~idle comp usage)

Cost of operation @ $.10/KWh:
5970: $26.93/mo
BFL: $4.32/mo

Earnings after electricity @ $3.00/BTC @ 1.1mil difficulty:
5970: $43.12/mo (70.05 gross - 26.93)
BFL: $70.73/mo (75.05 gross - 4.32)


  You were kidding about the gaming/going out thing, right? ;p If you're a gamer, you're not going to be gaming on your mining 5970s; else you earn 0 and they still cost electricity. With BFL, you can still run your games at the same time on a much more efficient vid setup.  Grin Thus saving money by staying home more. Waiting on ASICs, and assuming their creators have no desire to mark up prices with their dev costs in consideration, seems haphazard to me. Look at the mark-up on existing FPGAs: there is a necessity to recover as much of the dev costs as quickly as possible before the next competing product comes out. And from an investment perspective, sitting on your cash loses you money.

DISCLAIMER: My statements are not an endorsement of BFL. I am and will remain of the mindset: I'll believe it when I see it...
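The gross earnings in the revised comparison follow from the standard expected-reward formula, BTC/s = hashrate × block_reward / (difficulty × 2^32). A sketch (the $3.00/BTC price, 1.1M difficulty, and hash rates are the post's figures; month length is an assumption, and the post's numbers sit between a 30- and 31-day month):

```python
# Expected-reward sketch: BTC/s = hashrate * reward / (difficulty * 2**32).
# A 30-day month is assumed here, which lands within ~1.5% of the post.

def gross_usd_per_month(hashrate_hps, difficulty, usd_per_btc=3.00,
                        block_reward=50, days=30):
    btc_per_sec = hashrate_hps * block_reward / (difficulty * 2 ** 32)
    return btc_per_sec * 86400 * days * usd_per_btc

print(round(gross_usd_per_month(840e6, 1.1e6), 2))  # 5970: 69.13 (post: 70.05)
print(round(gross_usd_per_month(900e6, 1.1e6), 2))  # BFL:  74.07 (post: 75.05)
```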

If you're not excited by the idea of being an early adopter 'now', then you should come back in three or four years and either tell us "Told you it'd never work!" or join what should, by then, be a much more stable and easier-to-use system.
- GA

It is being worked on by smart people.  -DamienBlack
fred0
Sr. Member
****
Offline Offline

Activity: 349
Merit: 250


View Profile
December 11, 2011, 01:28:50 AM
Last edit: December 12, 2011, 10:34:17 PM by fred0
 #12

.... There is a necessity to recover as much of dev costs as quickly as possible before next competing product comes out.  And from an investment perspective, sitting on your cash loses you money.
I used even more conservative figures ($2/BTC, $0.20/kWh electricity) and came up with the hardware paid off in just under a year.

The real challenge is forecasting longer than a year.

Block reward halves in Dec 2012 (+/- 1 month)
Greater adoption of bitcoin
Development of ASIC hashing tech

Actually, I am optimistic about the whole thing.

The block reward halving implies the cost to produce a bitcoin will double. If bitcoin's price is based on the cost to manufacture, bitcoin would double. I doubt this is likely, but some people really believe it. I think that supply and demand trumps all. Since the block reward halving implies 50% of all the bitcoins that will ever exist are in circulation, we should start to see the deflationary effects of bitcoin kick in: not major, but likely minor, nonetheless stabilizing the bitcoin price.

Greater adoption of bitcoin implies greater demand for bitcoin. Greater demand, greater price.

Development of ASIC hashing tech could lower the cost to produce a bitcoin, but I really think the manufacturing cost to produce a bitcoin is not the major factor in the price. If, tomorrow, we developed a technology to mine gold that costs $0.01 per ounce, would the price of gold plummet? I think not, since most of the gold has already been mined. Likewise, unless ASIC technology comes out really soon, it's not likely to have an effect on the bitcoin price. If it comes out in a year, we will have already hit the 50% mark and the beginning of the deflationary period of the bitcoin lifecycle. It could still be detrimental to existing miners, but hopefully the hardware has already been paid for. A definite risk factor.

Since we are really in an early adoption phase, I think that bitcoin price will rise enough to allow anyone to invest in FPGA tech (BFL or other) and recover their hardware costs in under a year.

Heaven knows, I'm not an economist; these are just my speculations and the logic I used to arrive at them.

So stop with all the gloom and doom!
sadpandatech
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500



View Profile
December 11, 2011, 01:56:07 AM
 #13

  I agree with most of what you're saying, Fred0. I still don't see where ASICs would affect the earnings of an FPGA unit in operation now, unless we assume FPGA builders will not lower their prices as more competition comes in and ASIC units are sold at no mark-up. The FPGAs are already so low on energy costs that a further reduction there from ASICs would not be enough by itself to make ASICs instantly more economical than FPGAs. The only real factor will be initial $/MH, and it's near impossible to speculate what the $/MH of ASICs will be. My speculation would be that they will have a higher mark-up over component build costs due to the massive dev costs involved. This should be more than enough to give FPGA miners time to make up the starting cost of the unit's price/MH.

  Time will tell. I for one will be watching very closely what is happening behind the scenes with ASIC development. But for now, my money is on FPGA for at least the next 6-9 months. And of course super-efficient GPUs, if the price stays up and difficulty does not jump too much. A little note from my hunt for LX150s leads me to believe a lot of batches have sold in the last few months that would not normally have done so, according to the distributors anyhow. There was of course no mention of who bought them or for what purpose.

  Cheers

  Edit: Reading over your post again, I am pretty sure we're seeing it about the same way. Not sure what I interpreted at first that we disagreed on. ;p

DeepBit
Donator
Hero Member
*
Offline Offline

Activity: 532
Merit: 501


We have cookies


View Profile WWW
December 11, 2011, 02:03:40 AM
 #14

I agree with most of what you're saying, Fred0. I still don't see where asics would affect the earnings of an fpga unit in operation now. Unless we assume fpga builders will not lower their prices as more competition comes in and asic units are sold at no mark up. The fpgas are already so low on energy costs that a further reduction there from asics would not be enough by itself to make asics instantly more economical than fpgas.
May ASIC adoption cause difficulty to rise, lowering FPGA earnings?

I for one will be watching very closely what is available behind the scenes with asic development.
Do you know something about what happens there, behind the scenes? Care to tell us? :)

sadpandatech
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500



View Profile
December 11, 2011, 02:10:35 AM
 #15

I agree with most of what you're saying, Fred0. I still don't see where asics would affect the earnings of an fpga unit in operation now. Unless we assume fpga builders will not lower their prices as more competition comes in and asic units are sold at no mark up. The fpgas are already so low on energy costs that a further reduction there from asics would not be enough by itself to make asics instantly more economical than fpgas.
May ASICs adoption cause difficulty to rise, lowering FPGA earnings ?
That would reduce ASIC earnings almost equally as well, since the electricity costs will not be that much lower.
I for one will be watching very closely what is available behind the scenes with asic development.
Do you know something about what happens there, behind the scenes ? Care to tell us ? Smiley
Hehe, I wish I knew something worth sharing, or had what little information I do monitor in a form that was worth sharing. =)


  Cheers

DeepBit
Donator
Hero Member
*
Offline Offline

Activity: 532
Merit: 501


We have cookies


View Profile WWW
December 11, 2011, 02:48:33 AM
 #16

I agree with most of what you're saying, Fred0. I still don't see where asics would affect the earnings of an fpga unit in operation now. Unless we assume fpga builders will not lower their prices as more competition comes in and asic units are sold at no mark up. The fpgas are already so low on energy costs that a further reduction there from asics would not be enough by itself to make asics instantly more economical than fpgas.
May ASICs adoption cause difficulty to rise, lowering FPGA earnings ?
That would almost equally reduce asic earnings as well, since the electricity costs will not be that much lower.
But the ASICs' price per MH/s may be lower, compared to FPGA.

fred0
Sr. Member
****
Offline Offline

Activity: 349
Merit: 250


View Profile
December 11, 2011, 02:50:09 AM
Last edit: December 11, 2011, 03:52:06 AM by fred0
 #17

 I agree with most of what you're saying, Fred0. I still don't see where asics would affect the earnings of an fpga unit in operation now. Unless we assume fpga builders will not lower their prices as more competition comes in and asic units are sold at no mark up. The fpgas are already so low on energy costs that a further reduction there from asics would not be enough by itself to make asics instantly more economical than fpgas. The only real factor will be initial $/MH and it's near impossible to speculate what the $/MH of asics will be. My speculation would be that they will have a higher mark-up over component build costs due to the massive dev costs involved. This should be more than enough to give fpga miners time to make up the starting costs of the units price/MH.
Well, I was thinking that if ASICs can produce bitcoin at $0.05 per bitcoin, it would be tough to compete using BFL Singles at a cost of $0.65 per bitcoin. That doesn't mean we'd be operating at a loss, but it would reduce profitability. I think it's unlikely near term, but after the block reward halving it might bite us in the ass. Just a risk that I don't want to leave off my radar screen.
 Time will tell. I for one will be watching very closely what is available behind the scenes with asic development. But for now, my money is on fpga for atleast the next 6-9 months. And of course super efficient GPU's if the price stays up and dif does not jump to much.
I think the FPGA is the most promising also. They blow all current GPUs away on energy use, and BFL also competes with GPUs on hash power.

We need to keep an eye on GCN processors. While many seem to think that for mining they will follow NVIDIA's path and be a yawn, it seems that algorithms can be rewritten to take advantage of the quad threads in each stream processor. I really would not be surprised to see a much greater hash-power boost from the 79xx series.

While it is easy to succumb to thoughts that bitcoin could flounder and die altogether, I think that this is an unreasonable reaction to the "bubble" bursting.

A nice article http://www.avc.com/a_vc/2011/11/bitcoin.html
sadpandatech
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500



View Profile
December 11, 2011, 04:33:14 AM
 #18

I agree with most of what you're saying, Fred0. I still don't see where asics would affect the earnings of an fpga unit in operation now. Unless we assume fpga builders will not lower their prices as more competition comes in and asic units are sold at no mark up. The fpgas are already so low on energy costs that a further reduction there from asics would not be enough by itself to make asics instantly more economical than fpgas.
May ASICs adoption cause difficulty to rise, lowering FPGA earnings ?
That would almost equally reduce asic earnings as well, since the electricity costs will not be that much lower.
But the ASICs' price per MH/s may be lower, comparing to FPGA.

  Likely they will be. But we really do not know yet. The important part is that it does not change the earnings potential. Only the cost to operate and the electricity cost per BTC will affect earnings.

  If ASICs end up being a huge percentage faster to pay off the initial investment, then it is quite likely the difficulty will adjust rather quickly to compensate. That would affect anything bought for mining and its time to pay back the investment, including the ASICs. But that is a never-ending ladder. The same thing was true for CPU < GPU < FPGA. The key difference between the tech on that ladder is that CPU is ~90 W/MH, GPU is ~0.44 W/MH, and FPGA is ~0.042 W/MH. That's a huge jump in efficiency from CPU to GPU, roughly 200x, then only about 10.5x from GPU to FPGA (5970 v. Ztex). Without knowing the actual ASIC power consumption, I would speculate it to be about ~2x more efficient, using 20 W/GH for the ASIC, which may be optimistic imho.

  I do see where you are coming from, though. Even as cheap as an FPGA is to power, if the difficulty goes up enough, then it becomes obviously more profitable to run an ASIC. But just how much, at 2x (my speculated number, since we lack hard data), would difficulty need to go up? We need to chart or graph it out, I think. My math skills are really pretty basic, so I am not sure whether difficulty would need to increase by the same factor as the efficiency difference between CPU/GPU, CPU/FPGA, GPU/FPGA, or what. We could use the historical difficulty to surmise the growth % from CPU to GPU, but it would be hard to pin down the point where GPU not only took the majority share of the hash rate but where that would intersect with stale earnings for CPU. We would of course have to normalize the price/difficulty data. Even lacking good FPGA global hash data, we could get pretty close to speculating its difficulty apex. We would need to compare the CPU-to-GPU difficulty apex slope in relation to their efficiency, then apply that formula to the GPU-to-FPGA difficulty in relation to efficiency. I can probably pen-and-paper it, but it will take me considerably longer to trial-and-error the proper method. Maybe one of the more proficient academics here can lend a hand?

  The questions then are: how much will the ASICs' $/MH be? How much cheaper can FPGAs be made? I believe the ASIC $/MH will not be enough of a leap below FPGA build costs to make FPGA payoff time unreasonable. To make this speculation I am considering an FPGA cost of $1/MH or less, which is very doable now. LX150-N3 are street-priced at $141, a cheap board and components cost $35, and assembly can be done for as low as $17. Total for ~200 MH = $193. And the new series of Spartan are due out soon.

  On that note, has anyone had access to any of the Spartan-7 early-release chips? And just how many MH can an ASIC achieve? What would be the estimated power usage of a 1 GH ASIC? (Using 'ASIC' as a blanket term for all variations: sASIC, f-ASIC, etc.)

  Thanks for poking me more about this fpga to asic thing, Deepbit. If I have time I will try to apply more than my instincts to giving a proper answer.

  Cheers,
   Derek
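As a sanity check, the W/MH ladder quoted in the post reproduces the stated ratios (all figures are the post's; the 5970 draw and hash rate come from the earlier comparison):

```python
# Sanity-checking the W/MH efficiency ladder from the post.
cpu_w_per_mh = 90          # post's CPU estimate
gpu_w_per_mh = 374 / 840   # 5970: 374 W / 840 MH/s ~ 0.445 W/MH
fpga_w_per_mh = 0.042      # Ztex FPGA figure from the post

print(round(cpu_w_per_mh / gpu_w_per_mh))      # ~202x, the "roughly 200x"
print(round(gpu_w_per_mh / fpga_w_per_mh, 1))  # 10.6x, the "about 10.5x"
```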

sadpandatech
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500



View Profile
December 11, 2011, 04:46:09 AM
 #19

Well, I was thinking that if asics can produce bitcoin at $0.05 per bitcoin, that it would be tough to compete if we are using BFL singles at a cost of $0.65 per bitcoin.  That doesn't mean that we'll be operating at a loss, but that would reduce profitability.  I think that it is unlikely near term, but after the block reward halving, it might bite us in the ass. Just a risk, that I don't want to leave off my radar screen.
 Time will tell. I for one will be watching very closely what is available behind the scenes with asic development. But for now, my money is on fpga for atleast the next 6-9 months. And of course super efficient GPU's if the price stays up and dif does not jump to much.
I think the FPGA is the most promising also.  They blow all current GPUs away for energy use. BFL also competes with hashpower with GPUs.  

We need to keep an eye on GCN processors, while many seem to think that for mining they wiil follow NVIDIA's path and be a yawn, it just seems that algorithms can be rewritten to take advantage of the quad threads in each stream processor.  I really would not be surprised to see a much greater hash power boost from the 79xx series.

While it is easy to succumb to thoughts that bitcoin could flounder and die altogether, I think that this is an unreasonable reaction to the "bubble" bursting.

A nice article http://www.avc.com/a_vc/2011/11/bitcoin.html
  So 4.6w for 900MH? I wish I knew, as having that kind of data, or even something close, about a potential ASIC would make the math a ton easier. At least for me.  Tongue

Yea, I certainly won't turn a blind eye to the 79xx until it has been tested. Sadly, I am not capable of writing any kind of code that could attempt to utilize its new architecture. Or any other code, for that matter, really. :/

Yea, the whole bubble thing did not bother me. Look at silver, from $24 to $47 in 6 months. And how long has it been around? I believe as Bitcoin gets into more and more hands the 'bubbles' will have less and less impact. Especially since it is not subject to the 'paper' sword that entities like JP Morgan took to silver's throat in order to make a profit.

Thanks for the article and the convo. The back-and-forth debating helps open up angles one may not have thought of otherwise.

  Cheers,
   Derek

DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
December 11, 2011, 02:49:55 PM
Last edit: December 11, 2011, 03:05:24 PM by DeathAndTaxes
 #20

Without knowing the actual asic power consumption I would speculate it to be about ~2x more efficient. Using 20w/GH for the asic. Which may be optimistic imho.

  I do see where you are coming from though. Even as cheap as an fpga is to power, if the difficulty goes up enough then it becomes obviously more profitable for asic. But, just how much @2x(my speculated number since we lack hard data) would difficulty need to go up? We need to chart or graph it out I think. My math skills are really pretty basic so I am not sure whether diffulty would need to increase the same as the efficiency difference between CPU/GPU, CPU/FPGA, GPU/FPGA or what.? We could use the historical dificulty to summize the growth % from cpu to gpu but it would be hard to pin down the point where gpu not only took majority share of the hash rate but where that would intersect with stale earnings for CPU. We would of course have to normalize the price/difficulty data. Even lacking good FPGA global hash data we could get pretty close to speculating it's difficulty apex. We would need to compare the CPU to GPU difficulty apex slope in relation to their efficiency. Then applying that formula to the GPU to FPGA difficulty in relation to efficiency. I can probably pen and paper it but it will take me considerably longer to trial and error the proper method. Maybe one of the more proficient academics here can lend a hand?

  The questions then are how much will asics $/MH be? How much cheaper can FPGAs be made? I believe the asic $/MH will not be enough of a leap lower verses FPGA build costs to make FPGA payoff time unreasonable. To make this speculation I am considering an FPGA cost of 1/MH or less. Which is very doable now. LX150-n3 are street priced at $141, a cheap board and components costs $35 and assembly can be done for as low as $17. Total for ~200MH = $193  And the new series of Spartan are due out soon.

A couple of concepts which might enlighten you (or maybe muddy the waters even more).
sASICs (structured ASICs) are roughly 2x to 3x more efficient per watt than FPGAs and have a per-unit cost of ~1/2 to 1/5th, depending on volume (5K units to 50K units).
ASICs are more like 5x to 20x more efficient per watt than FPGAs and can have a per-unit cost as low as 1/10th that of an FPGA, but really only make sense in volumes of hundreds of thousands of units or more.

So it isn't that a sASIC would be more efficient BUT more expensive. It would be that a sASIC could be 2x as efficient per watt and half the cost. A true ASIC (even cell-based) could be in the <$0.20 per GH and 100 MH/W range.

Now, I find it beyond unlikely we will see sASICs anytime in the next couple of years. Startup capital is in the hundreds of thousands of dollars. We are talking about months of talent/salary, IP licensing, high-end design software, FPGA prototyping (@ $2000+ per chip), test runs, and contracted (and at a minimum partially prepaid) production runs, etc. An established player could do it cheaper, but no fab is going to trust a startup with anything less than full prepayment for a 10K-unit run.

True ASICs are even more unlikely, as they require even more customization, and that means more talent, more testing, and, unless you want development times measured in years, even more licensing of IP. Startup capital is likely in the low millions for a current-gen (45nm) ASIC.

So I think any FPGA bought today is safe from the threat of sASIC or cell-based ASIC "future" designs for at least 3-5 years. Bitcoin would need to see significant stabilization and growth before it attracts the kind of capital necessary for those kinds of designs.


Still, remember FPGAs are subject to Moore's law. 28nm FPGAs are very scarce right now and priced off the chart, but in time they will be mundane. They will deliver ~2x the performance per watt and per $ (slightly less, but using 2 as the multiplier is fine). That will be the true threat to current-gen FPGAs, but even there it will affect new sales and resale value more than profitability for a long time.



Quote
And, just how many MH can an asic achieve?

This is a meaningless metric. Say you have a design which gets x GH. If you quadruple the size of the chip you could get 4x the performance, so performance per chip isn't relevant. A chip with 4x the surface area will generally have lower yields, so at some point there is a "magic" size where the cost of a multi-chip design balances the additional cost of a larger chip.

If you could get a 1 GH board @ 15W for $100, would you really care if it was made up of 1, 2, or 4 chips? All that matters is performance per watt and performance per $, right?

Still, to get a very loose ballpark figure:
Current FPGAs get about 1 MH per square mm.
On a 45nm process, a completely custom ASIC could maybe achieve ~20 MH per square mm, so on a 100mm^2 chip we are talking ~2 GH/s. Of course, there is no reason one would need to stop at 100mm^2; CPUs/GPUs come as large as 500mm^2, and a chip that large could achieve maybe 10 GH/s. However, larger chips = lower yields, so likely first-gen ASICs will be designed small.
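The die-area ballpark above in code form (both MH/mm^2 density figures are the post's rough guesses, not measured values):

```python
# Ballpark: hash rate = density (MH/mm^2, the post's guesses) * area (mm^2).

def ghs(mh_per_mm2, area_mm2):
    return mh_per_mm2 * area_mm2 / 1000  # MH -> GH

print(ghs(20, 100))  # 2.0 GH/s: 100 mm^2 custom ASIC, as in the post
print(ghs(20, 500))  # 10.0 GH/s: GPU-sized 500 mm^2 die
print(ghs(1, 100))   # 0.1 GH/s: same area of current FPGA fabric
```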


Quote
What would be the estimated power usage of a 1GH asic? using 'asic' as a blanket term for all variations, s-sasic, f-asic, etc.

Well, you can't lump all ASICs together, as they get vastly higher efficiency as you move up the cost ladder.

sASIC: lowest upfront cost, highest per-unit cost. Still roughly 2x the efficiency of an FPGA (in performance per watt and performance per $).
Cell-based ASIC: higher upfront cost, significant risk, much lower per-unit cost.
Custom ASIC: huge risk, massive upfront cost, "negligible" per-unit cost.
