My assumption is that ASIC miners optimize for the "hashes per joule" metric, just as GPU miners did over CPU miners.
It would seem that after some time, the decision to mine Bitcoins would come down to whether you could source energy cheaply enough.
When do we reach that point?
I'm not a hardware guy, so I'm curious what the upper-bound limits of ASIC mining are from a technical perspective. How much more can we optimize?
Thanks!
ASICs managed to jump quite a few process geometries to get to 28 nm and 20 nm, but from this point things get really hard. There's probably a 2x to 4x improvement every 2 years from now on, dictated largely by the semiconductor technology (http://hashingit.com/9-where-next-for-bitcoin-mining-asics, although I now suspect I was too optimistic about the 16 nm transition).
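To make the "2x to 4x every 2 years" claim concrete, here's a minimal sketch of how that compounds over time. The function name and starting figure are illustrative assumptions, not real device data:

```python
def project_efficiency(start_gh_per_j, years, improvement_per_2y=2.0):
    """Project hashing efficiency forward, assuming a fixed
    improvement factor every 2 years (2x is the low end of the
    2x-4x range suggested above; both bounds are guesses)."""
    return start_gh_per_j * improvement_per_2y ** (years / 2.0)

# Illustrative: starting from 1 GH/J, after 4 years at the low end...
low = project_efficiency(1.0, 4, improvement_per_2y=2.0)   # 4 GH/J
# ...and at the high end:
high = project_efficiency(1.0, 4, improvement_per_2y=4.0)  # 16 GH/J
```

Even the optimistic end of that range is a far cry from the orders-of-magnitude jumps seen when ASICs were catching up across several process nodes at once.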
Power efficiency is certainly not going to improve faster than the hash rate now, though, unless someone identifies a brand-new technology to use. The hash rate will continue to climb (albeit much more slowly than it has), so it's inevitable that this will push mining towards lower-cost energy (wherever in the world that may be found) and lower hardware margins (though more hardware burns more power). Each improvement, however, simply defers the time at which the majority of the mining reward goes towards energy costs.
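That last point can be sketched with some back-of-the-envelope arithmetic. This is a hedged illustration only: every input number below is an assumption, not real network data, and `energy_cost_fraction` is just a name I made up for the calculation:

```python
def energy_cost_fraction(efficiency_j_per_gh, elec_price_per_kwh,
                         revenue_per_gh_day):
    """Fraction of mining revenue consumed by electricity.

    efficiency_j_per_gh: miner efficiency in joules per GH (assumed)
    elec_price_per_kwh:  electricity price in USD per kWh (assumed)
    revenue_per_gh_day:  revenue in USD per GH/s per day (assumed)
    """
    # A 1 GH/s miner at E J/GH draws E watts continuously,
    # so over one day it consumes E * 24 / 1000 kWh.
    kwh_per_day = efficiency_j_per_gh * 24 / 1000.0
    daily_energy_cost = kwh_per_day * elec_price_per_kwh
    return daily_energy_cost / revenue_per_gh_day

# Illustrative numbers: 1 J/GH, $0.10/kWh, $0.01 revenue per GH/s per day
# -> 24% of revenue goes to electricity.
frac = energy_cost_fraction(1.0, 0.10, 0.01)
```

As the hash rate climbs and per-GH revenue falls, that fraction creeps towards (and past) 50% — a doubling of efficiency halves it again, which is exactly the deferral described above.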