organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
October 13, 2013, 07:29:15 AM |
|
I don't think you understand what I'm trying to calculate here. I'm trying to figure out the hashrate where mining costs and income are in balance (for a given BTC exchange rate and for the current block reward). That's just a number that depends on a lot of variables, but mostly electricity cost, energy efficiency and, to some extent, investment horizon and hardware production costs. There is no timeline on when we will approach this, and there is no historical data to check against.
This has already happened at least once previously, when difficulty levelled out for a year or so until the exchange rate increased. Does your calculator indicate that would have happened?
|
|
|
|
freedomno1
Legendary
Offline
Activity: 1806
Merit: 1090
Learning the troll avoidance button :)
|
|
October 13, 2013, 07:46:57 AM |
|
A parabolic bubble for mining seems like a weaker argument, but I can see something like that occurring: a balancing out of difficulty over time.
|
Believing in Bitcoins and its ability to change the world
|
|
|
Puppet (OP)
Legendary
Offline
Activity: 980
Merit: 1040
|
|
October 13, 2013, 07:59:00 AM Last edit: April 26, 2014, 10:50:10 PM by Puppet |
|
I don't think you understand what I'm trying to calculate here. I'm trying to figure out the hashrate where mining costs and income are in balance (for a given BTC exchange rate and for the current block reward). That's just a number that depends on a lot of variables, but mostly electricity cost, energy efficiency and, to some extent, investment horizon and hardware production costs. There is no timeline on when we will approach this, and there is no historical data to check against.
This has already happened at least once previously, when difficulty levelled out for a year or so until the exchange rate increased. Does your calculator indicate that would have happened?
What happened in 2012 was different. We mined with GPUs, and GPU pricing is not dependent on bitcoin profitability. AMD (and nVidia) price their products mostly for gamers. They didn't charge huge premiums when GPU mining was highly profitable, and when bitcoin mining stopped being profitable, it didn't cause AMD to lower its prices. ASIC pricing will behave very differently, since ASICs serve no other market besides mining. Still, let's see what we get. Let's take the January 2012 BTC exchange rate of ~$5 and assume most people were doing GPU mining. Let's take a 5870 @ 350 MH/s @ 200 W at the wall, and say it cost $250. If I plug those numbers in, I get a network speed of 5000 GH. edit: I forgot the block reward was double back then, so I'd get 10,000 GH. In reality it was ~8000 GH. If you take into account FPGA mining, and given the range of possible outcomes of my current spreadsheet, that's close enough in my book.
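For anyone who wants to reproduce that check, here is a minimal sketch of the break-even calculation behind it; the one-year payback horizon is an assumption on my part (the spreadsheet's actual horizon isn't stated here), and the $0.12/kWh electricity price is the figure organofcorti mentions below.

```python
# Break-even network hashrate: the point where mining income equals mining cost.
# A minimal sketch of the spreadsheet logic; the one-year payback horizon is an
# assumption, as is the $0.12/kWh electricity price mentioned later in the thread.

def breakeven_hashrate_ghs(btc_price, block_reward, hw_cost_per_gh,
                           watts_per_gh, elec_price_kwh, horizon_days=365):
    """Network hashrate (GH/s) at which income and cost balance."""
    daily_network_revenue = 144 * block_reward * btc_price         # USD/day, 144 blocks/day
    daily_elec_per_gh = watts_per_gh * 24 / 1000 * elec_price_kwh  # USD/day per GH/s
    daily_hw_per_gh = hw_cost_per_gh / horizon_days                # amortised hardware
    return daily_network_revenue / (daily_elec_per_gh + daily_hw_per_gh)

# January 2012 GPU example from the post: 5870 @ 350 MH/s, 200 W, $250,
# BTC ~$5, block reward 50 BTC.
ghs = breakeven_hashrate_ghs(
    btc_price=5.0,
    block_reward=50,
    hw_cost_per_gh=250 / 0.35,   # USD per GH/s
    watts_per_gh=200 / 0.35,     # W per GH/s
    elec_price_kwh=0.12,
)
print(f"break-even network speed: {ghs:,.0f} GH/s")  # ~10,000 GH/s vs ~8,000 GH/s observed
```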
|
|
|
|
organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
October 13, 2013, 08:19:58 AM |
|
I don't think you understand what I'm trying to calculate here. I'm trying to figure out the hashrate where mining costs and income are in balance (for a given BTC exchange rate and for the current block reward). That's just a number that depends on a lot of variables, but mostly electricity cost, energy efficiency and, to some extent, investment horizon and hardware production costs. There is no timeline on when we will approach this, and there is no historical data to check against.
This has already happened at least once previously, when difficulty levelled out for a year or so until the exchange rate increased. Does your calculator indicate that would have happened?
What happened in 2012 was different. We mined with GPUs, and GPU pricing is not dependent on bitcoin profitability. AMD (and nVidia) price their products mostly for gamers. They didn't charge huge premiums when GPU mining was highly profitable, and when bitcoin mining stopped being profitable, it didn't cause AMD to lower its prices. ASIC pricing will behave very differently, since ASICs serve no other market besides mining. Still, let's see what we get. Let's take the January 2012 BTC exchange rate of ~$5 and assume most people were doing GPU mining. Let's take a 5870 @ 350 MH/s @ 200 W at the wall, and say it cost $250. If I plug those numbers in, I get a network speed of 5000 GH. edit: I forgot the block reward was double back then, so I'd get 10,000 GH. In reality it was ~8000 GH. If you take into account FPGA mining, and given the range of possible outcomes of my current spreadsheet, that's close enough in my book.
Points taken: miners were more heterogeneous then, so average miner profitability can't easily be determined, especially since those with the largest hashrates were FPGAs. Even so, your result was at least the correct order of magnitude. If that's as accurate as it gets, that's still good enough to plan for. Another question: in your example you've used average miner electricity costs of $0.12 per kWh. Do you still think this is a reasonable estimate? I suppose you assume that miners will move rigs to the lowest-cost areas, overseas if necessary?
|
|
|
|
Puppet (OP)
Legendary
Offline
Activity: 980
Merit: 1040
|
|
October 13, 2013, 08:24:51 AM |
|
Another question: in your example you've used average miner electricity costs of $0.12 per kWh. Do you still think this is a reasonable estimate? I suppose you assume that miners will move rigs to the lowest-cost areas, overseas if necessary?
There is a reason I put that variable on an axis of the chart. In the long run, yes, I assume mining will migrate to where electricity is cheaper. The cheapest rates I found are in Kuwait, at an astounding $0.01 per kWh. I'm not sure anyone will want to invest in a huge bitcoin mine located there, but for sure it will move to Russia, China and certain US states where prices are lowest.
|
|
|
|
niothor
|
|
October 13, 2013, 10:39:24 AM |
|
Another question: in your example you've used average miner electricity costs of $0.12 per kWh. Do you still think this is a reasonable estimate? I suppose you assume that miners will move rigs to the lowest-cost areas, overseas if necessary?
There is a reason I put that variable on an axis of the chart. In the long run, yes, I assume mining will migrate to where electricity is cheaper. The cheapest rates I found are in Kuwait, at an astounding $0.01 per kWh. I'm not sure anyone will want to invest in a huge bitcoin mine located there, but for sure it will move to Russia, China and certain US states where prices are lowest.
You've quoted that 2.5 cent price for Russia from Wikipedia, but be aware that it is the cost per kWh for a certain plan in a certain time frame (23:00 to 07:00). Overall costs are much higher.
|
|
|
|
Puppet (OP)
Legendary
Offline
Activity: 980
Merit: 1040
|
|
October 13, 2013, 11:08:29 AM |
|
Overall or average rates per country don't really matter. In the US, price differences are also really big between different states or areas, but mining will just move to where it's cheaper. Once you get close to marginal profitability, miners with higher rates will just shut down or sell their gear. Whatever latitude that leaves for miners with cheaper electricity will be filled either by them buying the used gear or by deploying new gear. And yeah, that might lead to centralization.
|
|
|
|
Nagle
Legendary
Offline
Activity: 1204
Merit: 1002
|
|
October 15, 2013, 06:14:59 AM |
|
I think your calculations should include ancillary PCB, power, and cooling costs, as even if these items aren't sold as a package, miners still have to buy them and will factor the costs into the equation. At some point I'm sure the extra stuff will cost more than the ASIC chips.
As a rule of thumb, a system with ICs in it costs about 4x the price of the ICs alone. (The multiplier is less for very high-volume items, more for low-volume items.) Cooling cost varies with climate and cooling system, but multiplying electricity cost by 1.5 to 3 is reasonable. You also have to account for cost of capital, floor space, and staffing if you're operating beyond the back-bedroom level. That's how you price it out as a business. What does it look like with reasonable business-type cost assumptions?
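A small sketch of what those rules of thumb do to the cost side of the model, assuming Nagle's 4x system multiplier and a 2x cooling multiplier; the chip cost, efficiency and electricity rate plugged in below are placeholders, not numbers from the thread.

```python
# Rough "business-type" cost per GH/s per day, using Nagle's rules of thumb:
# system cost ~ 4x the IC cost, and cooling multiplies electricity by 1.5-3x.
# The chip cost, efficiency and electricity rate below are placeholder values.

def daily_cost_per_gh(ic_cost_per_gh, watts_per_gh, elec_price_kwh,
                      system_multiplier=4.0, cooling_multiplier=2.0,
                      horizon_days=365):
    hardware = ic_cost_per_gh * system_multiplier / horizon_days
    electricity = watts_per_gh * 24 / 1000 * elec_price_kwh * cooling_multiplier
    return hardware + electricity

# Example with placeholder numbers: $5/GH of silicon, 1 W/GH, $0.08/kWh.
print(f"${daily_cost_per_gh(5.0, 1.0, 0.08):.4f} per GH/s per day")
```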
|
|
|
|
mtminer
Member
Offline
Activity: 86
Merit: 10
|
|
October 15, 2013, 10:59:19 PM |
|
I don't think you understand what I'm trying to calculate here. I'm trying to figure out the hashrate where mining costs and income are in balance (for a given BTC exchange rate and for the current block reward). That's just a number that depends on a lot of variables, but mostly electricity cost, energy efficiency and, to some extent, investment horizon and hardware production costs. There is no timeline on when we will approach this, and there is no historical data to check against.
This has already happened at least once previously, when difficulty levelled out for a year or so until the exchange rate increased. Does your calculator indicate that would have happened?
What happened in 2012 was different. We mined with GPUs, and GPU pricing is not dependent on bitcoin profitability. AMD (and nVidia) price their products mostly for gamers. They didn't charge huge premiums when GPU mining was highly profitable, and when bitcoin mining stopped being profitable, it didn't cause AMD to lower its prices. ASIC pricing will behave very differently, since ASICs serve no other market besides mining. Still, let's see what we get. Let's take the January 2012 BTC exchange rate of ~$5 and assume most people were doing GPU mining. Let's take a 5870 @ 350 MH/s @ 200 W at the wall, and say it cost $250. If I plug those numbers in, I get a network speed of 5000 GH. edit: I forgot the block reward was double back then, so I'd get 10,000 GH. In reality it was ~8000 GH. If you take into account FPGA mining, and given the range of possible outcomes of my current spreadsheet, that's close enough in my book.
All in, with new equipment (PSU, CPU, RAM, cards, fans and power distribution), 1 gigahash cost ~$1,000. You could do it cheaper with used cards, but not in bulk.
|
|
|
|
motoglen
Newbie
Offline
Activity: 3
Merit: 0
|
|
October 16, 2013, 07:26:47 AM |
|
Brilliant analysis. Puts a framework around conclusions I came to a few weeks ago, and I totally agree, except maybe with the assumption that miners will be completely rational.
Another analysis I recently read projected the possibility that, depending on power efficiency, miners might shut down in "efficiency blocks", reducing the hash rate and leaving only the most power-efficient to continue mining, but at a substantially reduced network rate. This could cause a reduction in difficulty if no profits exist for chip manufacturers to continue to pour hash rate into the network. At that point, less efficient equipment might become productive again, so it will be turned on. I guess that could lead to a permanently oscillating difficulty. However, as everyone always knew, power efficiency (and cost) rule. 10 nm structures coming? Quantum computers?
|
|
|
|
fluidjax
|
|
October 19, 2013, 09:02:38 PM |
|
In winter, miners make good heaters, so they can keep running even when much less profitable if you also take into account the reduced heating bills. As energy is so expensive in the UK, and it can get quite cold, the benefits are greater than in a country where electricity is cheap and the climate is hot.
|
|
|
|
rampalija
|
|
October 31, 2013, 12:10:55 AM |
|
nice one
|
|
|
|
jmumich
|
|
March 04, 2014, 06:35:32 AM |
|
Where did you get your numbers for $36 per 28nm ASIC manufacturing costs?
I'd expect that to be behind several NDAs...
The silicon cost is calculated based on the price per wafer and the number of candidates per wafer. It's not like TSMC or GF have price lists on their websites, but there is plenty of industry analysis literature out there that gives an idea. I'm using $4000 per processed 300mm 28nm wafer, which is last year's average price. I don't have a public source for you for that, but this may show the ballpark is at least correct: http://www.xbitlabs.com/news/other/display/20110912192619_TSMC_Reportedly_Hikes_Pricing_on_28nm_Wafers_Due_to_Increased_Demand.html Note the article is from 2011. Prices may have come down further since 2012, and I strongly suspect bitcoin ASICs use fewer layers than average (making them cheaper), but on the other hand $4000 is a volume price that may be out of reach of bitcoin ASIC vendors today. Since this is an endgame calculator, that doesn't matter much. To get to the above numbers, bitcoin ASICs would have to become fairly high volume anyway. As for the other costs, chip packaging is typically calculated per ball, with $0.003 per ball being a good rule of thumb. That works out to ~$3 per chip. The additional $4 per chip I used for testing and handling is probably way too much.
At some point, would the number of ASICs produced increase the price of their raw materials (silicon, PCB, power supplies) as well as the price of electricity? Or is the supply of these things so great that the increase in demand won't materially change the prices of those components?
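For reference, the arithmetic behind the quoted cost breakdown can be sketched as below; the good-dies-per-wafer and ball-count inputs are hypothetical (the posts give only the $4000 wafer price, the $0.003-per-ball rule and the $4 test/handling charge), chosen here so the total lands near the $36 figure being asked about.

```python
# Per-chip manufacturing cost as described in the quoted post:
# wafer cost spread over the good dies, plus packaging at $0.003 per ball,
# plus a flat testing/handling charge. Dies-per-wafer and ball count below
# are hypothetical; the posts give only the $4000 wafer price and the
# per-ball and test/handling figures.

def chip_cost(wafer_cost, good_dies_per_wafer, balls_per_chip,
              cost_per_ball=0.003, test_and_handling=4.0):
    die = wafer_cost / good_dies_per_wafer
    package = balls_per_chip * cost_per_ball
    return die + package + test_and_handling

# Illustrative: $4000 wafer, 140 good dies, ~1000-ball package.
print(f"~${chip_cost(4000, 140, 1000):.0f} per packaged, tested chip")
```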
|
|
|
|
Puppet (OP)
Legendary
Offline
Activity: 980
Merit: 1040
|
|
March 04, 2014, 07:01:09 AM |
|
At some point, would the number of ASICs produced increase the price of their raw materials (silicon, PCB, power supplies) as well as the price of electricity? Or is the supply of these things so great that the increase in demand won't materially change the prices of those components?
For some perspective, the industry ships 10 billion ARM-based chips per year. Yes, billion with a B. While those are much smaller and far lower power, they still use the same fabs, still need PCBs, still need to be packaged and so on, so no, bitcoin ASICs will never be so high volume as to cause systemic shortages anywhere. The only place I can imagine where you might see temporary shortages is (water) cooling and high-end PSUs, but those vendors should have no real problem ramping up production given a bit of time. After all, the PC industry ships a million PCs per day, so the supply chain and infrastructure are in place (and increasingly running idle as PC shipments dwindle).
|
|
|
|
jmumich
|
|
March 04, 2014, 03:34:31 PM |
|
At some point, would the number of ASICs produced increase the price of their raw materials (silicon, PCB, power supplies) as well as the price of electricity? Or is the supply of these things so great that the increase in demand won't materially change the prices of those components?
For some perspective, the industry ships 10 billion ARM-based chips per year. Yes, billion with a B. While those are much smaller and far lower power, they still use the same fabs, still need PCBs, still need to be packaged and so on, so no, bitcoin ASICs will never be so high volume as to cause systemic shortages anywhere. The only place I can imagine where you might see temporary shortages is (water) cooling and high-end PSUs, but those vendors should have no real problem ramping up production given a bit of time. After all, the PC industry ships a million PCs per day, so the supply chain and infrastructure are in place (and increasingly running idle as PC shipments dwindle).
Makes sense for the materials, but would there be an impact on electricity prices, particularly if ASICs were centralized in a low-rate area? Particularly in the case of places like Kuwait, where someone pointed out that electricity rates are extremely low ... in many of these areas they are low because the region is energy-rich and the government subsidizes the cost of electricity. I doubt those same governments would subsidize a massive, for-profit ASIC farm. They can run the farm themselves, but then their cost of electricity is what they could otherwise sell that electricity for, not the cost minus the subsidy. In other areas, prices may be low due to weak demand. What would adding 15 MW (roughly the draw for 10 PH/s, right?) do to the electricity market in some of the low-cost areas in Russia, for example (where prices vary greatly, I understand)? I may be off in my calculation, or the amount of electricity may be similarly small compared to the size of the overall market. Otherwise, it is best to use a rate of electricity where there would be no marginal impact on price from adding the additional power consumption - I don't know what that rate is, though I would guess it is closer to the rates seen in the US and Europe than elsewhere.
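As a sanity check on the 15 MW for 10 PH/s figure, the conversion is just hashrate times efficiency; the 1.5 W/GH efficiency below is an assumed value for hardware of that era, not something stated in the thread.

```python
# Convert a hashrate and an efficiency figure into continuous power draw.
# The 1.5 W/GH efficiency is an assumption for ~2014 ASICs.

def power_draw_mw(hashrate_phs, watts_per_gh):
    ghs = hashrate_phs * 1_000_000          # 1 PH/s = 1,000,000 GH/s
    return ghs * watts_per_gh / 1_000_000   # watts -> megawatts

print(f"{power_draw_mw(10, 1.5):.0f} MW")   # 10 PH/s at 1.5 W/GH ~ 15 MW
```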
|
|
|
|
novello
|
|
March 04, 2014, 10:27:16 PM |
|
Where did you get your numbers for $36 per 28nm ASIC manufacturing costs?
I'd expect that to be behind several NDAs...
Very good question, as the gross margins TSMC and GlobalFoundries charge for Hashfast-type volumes are closer to 70%. An 85% yield on a device of this size, at this point on the learning curve, is totally unrealistic - more like 60%. Taking these two together, the price per die should be closer to $100, with testing and packaging on top of that. It's not likely to change much in the near future unless huge volumes (Qualcomm scale) of wafers are ordered.
|
|
|
|
Puppet (OP)
Legendary
Offline
Activity: 980
Merit: 1040
|
|
March 04, 2014, 10:58:07 PM |
|
Very good question, as the gross margins TSMC and GlobalFoundries charge for Hashfast-type volumes are closer to 70%. An 85% yield on a device of this size, at this point on the learning curve, is totally unrealistic - more like 60%. Taking these two together, the price per die should be closer to $100, with testing and packaging on top of that.
It's not likely to change much in the near future unless huge volumes (Qualcomm scale) of wafers are ordered.
What do TSMC margins have to do with anything? The cost I'm projecting is the processed wafer cost for the fab customer. Feel free to disbelieve my estimates, but as yet another public reference: http://www.soiconsortium.org/pdf/Economic_Impact_of_the_Technology_Choices_at_28nm_20nm.pdf For a 100mm² chip, die cost is estimated to be around $7, and that includes a *very* low yield estimate (probably because the document is a few years old; 28nm has matured tremendously since). Moreover, bitcoin ASICs are so simple and so redundant that yields will be far higher, probably close to 95% after harvesting chips with a few bad cores.
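The ~$7 estimate is easy to reproduce with the standard gross-die-per-wafer approximation; the 90% yield used below is an illustrative assumption, not a figure from the linked document.

```python
import math

# Standard gross-die-per-wafer approximation, then die cost at a given yield.
# Wafer price ($4000, 300 mm) and die area (100 mm^2) are from the posts above;
# the 90% yield is an assumption for illustration.

def gross_dies(wafer_diameter_mm, die_area_mm2):
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_cost(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_fraction):
    return wafer_cost / (gross_dies(wafer_diameter_mm, die_area_mm2) * yield_fraction)

print(f"~${die_cost(4000, 300, 100, 0.90):.2f} per good 100 mm^2 die")
```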
|
|
|
|
jimmothy
|
|
March 05, 2014, 01:39:14 AM |
|
BTW, googling for electricity prices, Wikipedia shows rates in Russia can be as low as 2.4 cents per kWh. That gives this result: in Kuwait it's only 1 cent, which would allow the network to reach 1 exahash (1000 PH) if you can solve the cooling problem. Free electricity would bottom out around 1.7 EH.
Don't forget about chip improvements. AM supposedly will release a 0.2 W/GH 40nm chip, so I would assume 0.1 W/GH is possible with some fine-tuning of a 20/28nm chip. Also, with immersion cooling you can have very high densities, which, combined with cheap electricity and energy arbitrage, means very low running costs. I would guess we hit the point where we are finally limited by electricity costs at 5000 PH. This also assumes the BTC exchange rate stays constant.
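A simplified variant of the same break-even formula gives the electricity-only ceiling, i.e. the hashrate at which the entire block subsidy is spent on power with hardware treated as sunk; the exchange rate, efficiency and electricity price below are assumptions to play with, and the result sits above the thread's figures because those also account for hardware cost.

```python
# Electricity-only ceiling on network hashrate: with hardware treated as sunk,
# mining keeps expanding until the whole block subsidy is spent on power.
# Exchange rate, efficiency and electricity price are assumptions to play with.

def hashrate_ceiling_phs(btc_price, block_reward, watts_per_gh, elec_price_kwh):
    daily_revenue = 144 * block_reward * btc_price                 # USD/day
    daily_elec_per_gh = watts_per_gh * 24 / 1000 * elec_price_kwh  # USD/day per GH/s
    return daily_revenue / daily_elec_per_gh / 1_000_000           # GH/s -> PH/s

# Illustrative: $600/BTC, 25 BTC reward, 0.5 W/GH, $0.02/kWh.
print(f"~{hashrate_ceiling_phs(600, 25, 0.5, 0.02):,.0f} PH/s")
```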
|
|
|
|
Puppet (OP)
Legendary
Offline
Activity: 980
Merit: 1040
|
|
March 05, 2014, 07:49:45 AM |
|
Makes sense for the materials, but would there be an impact on electricity prices, particularly if ASICs were centralized in a low-rate area?
Sounds far-fetched. Most of those cheap-electricity areas have cheap electricity because there is an abundance of, e.g., hydroelectric power. There are enough such regions that I can't see bitcoin mining making a difference there, not at today's exchange rate anyway. You speak of dozens of MW, but the larger hydroelectric installations have a capacity in the thousands of MW (and are often under-utilized). The entire world uses 20,000 TWh per year; just how much difference can you imagine bitcoin mining will make?
|
|
|
|
jmumich
|
|
March 05, 2014, 01:15:07 PM |
|
Makes sense for the materials, but would there be an impact on electricity prices, particularly if ASICs were centralized in a low-rate area?
Sounds far-fetched. Most of those cheap-electricity areas have cheap electricity because there is an abundance of, e.g., hydroelectric power. There are enough such regions that I can't see bitcoin mining making a difference there, not at today's exchange rate anyway. You speak of dozens of MW, but the larger hydroelectric installations have a capacity in the thousands of MW (and are often under-utilized). The entire world uses 20,000 TWh per year; just how much difference can you imagine bitcoin mining will make?
I don't think bitcoin mining will make any difference in worldwide energy usage - you're right, it is too small. I am talking just about localized usage in areas where rates are low - I do think dozens of MW can make a difference in price in local areas where the price is low and supply is in the thousands of MW. For one, the energy usage for bitcoin mining is continuous - once you have a 15 MW datacenter in place, it will draw 15 MW 24/7/365, or as close to it as possible, by design. Data centers in the US of comparable size already struggle with varying power costs, enough that researchers are proposing algorithms that would allow data companies to shift server load to data centers where the price is low at the moment ( http://www2.ece.ohio-state.edu/~xwang/papers/icpp12_datacenter.pdf ). Bitcoin mining does not have that option - it has to run constantly. It's not hard to find articles discussing concern about an increase in energy prices in areas where companies propose to build data centers that would draw dozens of MW. My point is only that while Bitcoin miners can locate themselves where power costs are lower than average, they cannot necessarily count on power prices that are dramatically lower than average, at least not on a large scale.
|
|
|
|
|