There is a lot of talk about how Bitcoin mining incentivises the creation of cheaper energy, but what is not often talked about is how it also incentivises the creation of cheaper computation.
I've been thinking a lot about theoretical limits to the efficiency of computation as it relates to Bitcoin.
I had assumed that Bitcoin miners would soon approach the theoretical limits of computational efficiency for hashing.
Within an order of magnitude or so, I thought.
As it turns out, we are nowhere near the theoretical limit of computational efficiency:
https://en.wikipedia.org/wiki/Landauer%27s_principle
I'm sure there are practical limits I am not aware of - e.g. the lower limits of semiconductor feature size probably impose a relatively hard ceiling on computational efficiency.
Someone here probably knows the calcs for the practical limits of computational efficiency with conventional semiconductor technology.
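In the meantime, here's a back-of-envelope sketch of the theoretical side. It computes the Landauer limit (the minimum energy to erase one bit at room temperature) and compares it against a modern mining ASIC. The ASIC efficiency figure (~20 J/TH) and the count of irreversible bit operations per double-SHA-256 hash are my own rough assumptions, purely for illustration:

```python
import math

# Boltzmann constant (J/K) and an assumed room temperature of 300 K
K_B = 1.380649e-23
T = 300.0

# Landauer limit: minimum energy to erase one bit of information
landauer_j_per_bit = K_B * T * math.log(2)  # ~2.87e-21 J

# Assumed efficiency of a modern mining ASIC: ~20 joules
# per terahash (10^12 double-SHA-256 hashes)
asic_j_per_hash = 20 / 1e12

# Rough guess at irreversible bit operations per double-SHA-256
# hash (an assumption for illustration, not a measured figure)
bit_ops_per_hash = 100_000

# Landauer floor for one hash, and the headroom above it
landauer_j_per_hash = landauer_j_per_bit * bit_ops_per_hash
headroom = asic_j_per_hash / landauer_j_per_hash

print(f"Landauer limit:   {landauer_j_per_bit:.2e} J/bit")
print(f"Landauer floor:   {landauer_j_per_hash:.2e} J/hash")
print(f"Headroom vs. floor: ~{headroom:,.0f}x")
```

Even with generous assumptions about how many bits a hash has to erase, today's hardware sits several orders of magnitude above the thermodynamic floor.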
In any case, it is reassuring to know that we have a long way to go yet.
To think that Bitcoin could be a strong driving force to improve the efficiency of computer hardware is very exciting to me.
Just a thought I wanted to share with you folks.