Let's say you flip a coin 10 times.

You mark a cross on paper if the first 3 flips give you heads; in that case, if flips 4 through 7 also come up heads, you put a circle around the cross you just marked.

Now do this several billion times, then divide the number of circles by the number of crosses. It should be rather close to 1/16, since a circle requires 4 more heads in a row: (1/2)^4 = 1/16. That's the idea.
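The cross/circle experiment above is easy to sanity-check with a Monte Carlo sketch (`simulate` is just an illustrative helper name; only the first 7 flips matter, so the remaining 3 of the 10 are skipped):

```python
import random

def simulate(trials: int) -> float:
    """Monte Carlo version of the cross/circle experiment above."""
    crosses = circles = 0
    for _ in range(trials):
        flips = [random.random() < 0.5 for _ in range(7)]
        if all(flips[:3]):           # first 3 flips are heads -> mark a cross
            crosses += 1
            if all(flips[3:7]):      # flips 4-7 are also heads -> circle it
                circles += 1
    return circles / crosses         # should approach (1/2)**4 = 1/16

ratio = simulate(200_000)
print(ratio)  # close to 0.0625
```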

I think this model has a number of flawed assumptions... but please correct me if I'm wrong:

1. The difficulty is not constant. For the first 32,255 blocks the difficulty remained at 1; that's roughly 2^15 of your "crosses". You'd have to retroactively count which "blocks" of 10 coin flips had 3 leading "heads", which would reduce the "current number of blocks solved" (i.e., the crosses) significantly. The OP based his claim on the current number of blocks solved across all difficulties.
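A quick arithmetic check of the 2^15 figure (the 32,255 count is taken from the text above):

```python
import math

difficulty_1_blocks = 32_255  # difficulty-1 blocks, per the figure quoted above
print(math.log2(difficulty_1_blocks))  # ~14.98, i.e. roughly 2^15 "crosses"
```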

2. SHA256 is a deterministic function - it does not produce random output. Given an infinite set of inputs, it maps each one to one of 2^256 values. Over an infinite set of inputs one might assume the outputs are evenly distributed, but...
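The determinism point in one snippet: the same input always yields the same digest, so "random-looking" is not the same as random.

```python
import hashlib

# Deterministic: identical input always produces the identical 256-bit output.
a = hashlib.sha256(b"block header bytes").hexdigest()
b = hashlib.sha256(b"block header bytes").hexdigest()
print(a == b)  # True
```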

3. There is not an infinite set of inputs. Based on the block hashing algorithm, the header is 80 bytes x 8 = 640 "bits" of possible "input". About 40 bytes (half) are almost guaranteed to be the same for all miners, and at the same positions. That leaves roughly 320 bits that can be toggled "randomly" (about 2^320 possible inputs) before being fed into the SHA256 function. Because half the total input bits are static, the inputs themselves are not evenly distributed.
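To make the byte-counting concrete, here is a sketch of the 80-byte header layout; the field values are placeholders, not a real block, and the "same for all miners" notes follow the argument above:

```python
import struct

# Rough sketch of the 80-byte Bitcoin block header (sizes in bytes).
version     = struct.pack("<I", 1)            #  4 - same for all miners
prev_hash   = bytes(32)                       # 32 - same for all miners
merkle_root = bytes(32)                       # 32 - varies (coinbase tx differs)
timestamp   = struct.pack("<I", 1310000000)   #  4 - nearly identical across miners
bits        = struct.pack("<I", 0x1A44B9F2)   #  4 - same (encodes the difficulty)
nonce       = struct.pack("<I", 0)            #  4 - varies

header = version + prev_hash + merkle_root + timestamp + bits + nonce
print(len(header), len(header) * 8)  # 80 bytes, 640 bits
```

Version, previous hash, and bits alone account for 40 of the 80 bytes, which is where the "half the input is static" figure comes from.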

4. SHA256 isn't as "fair" as one might assume: http://www.femto-second.com/papers/SHA256LimitedStatisticalAnalysis.pdf. I'll admit this paper is above my head... so feel free to take advantage of that and tell me it doesn't say what I think it says.

5. The original SHA256 output is hashed again with SHA256, so the maximum number of inputs to the final iteration is 2^256, as a best-case scenario. The input was skewed once by the structure of the block header, skewed again by the imperfect nature of the SHA256 algorithm, and is now skewed yet again by a second iteration of SHA256.
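The double hash itself is short to write down; this sketch uses Python's hashlib, and the comment restates the point about the second pass:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA256 applied twice."""
    # The second pass only ever sees 32-byte digests, so at most 2^256
    # distinct values can reach it, regardless of the original input space.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

print(double_sha256(b"example header bytes").hex())
```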

6. Has anyone proven mathematically that each and every value from 0 to 2^256 - 1 is actually possible as an output of SHA256?

7. Has it also been proven that SHA256 can produce all 2^256 outputs given only the inputs from 0 to 2^256 - 1?

To me, the OP's claim failed right at #1. As I said:

C: What is this magical theorem that says "the log base 2 of the **number of blocks found** is the number of leading 0's that might be found exceeding the network difficulty in a double sha256 hash of an essentially random input"? I don't think it exists.

"Number of blocks found" != "number of blocks found at X difficulty". The OP was claiming the former; you're claiming the latter, which at least makes sense.

For what it's worth, there will always be 2016 blocks solved at a given difficulty before the next one is chosen. That's roughly 2^11. Within those 2016 blocks, someone found an answer with 12 extra leading 0's. Assuming completely random inputs (which they aren't), assuming SHA256 is fair (it isn't), and assuming a 2nd iteration of SHA256 can still produce all 2^256 outputs (who knows?), it still seems that block 125552 was statistically significant. And you can't really count very many blocks after those 2016, because the difficulty has changed again... in coin terms, you're now requiring 4 heads in a row for a cross but still only 7 for a circle, which doubles the probability of a "circle".
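Whether those 12 extra leading 0's are bits or hex digits matters a lot here. As a back-of-envelope check under the very idealized assumptions questioned above (uniform, independent hashes), treating them as bits:

```python
# Probability that at least one of 2016 blocks overshoots the target by
# k extra leading zero BITS, assuming each block's hash is an independent
# uniform draw (exactly the assumptions being questioned above).
k = 12
blocks = 2016
p_at_least_one = 1 - (1 - 2**-k) ** blocks
print(p_at_least_one)  # ~0.39
```

If the 12 zeros were instead hex digits (48 bits), the same formula gives roughly 2016 * 2^-48, on the order of 10^-11, which would be striking indeed.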

Thoughts?