Anders (OP)
|
|
July 08, 2014, 06:56:18 PM |
|
Randomness is a tricky issue. Are the Bitcoin proof of work solutions truly randomly distributed in the search space? If not, then just incrementing the nonce may be a suboptimal approach and a random search more efficient. I'm unclear about how many possible combinations there are to test in Bitcoin. Surely there must be more than 2^32?
There is an infinite number of possible inputs. Any input is equally probable to produce the desired hash (unless SHA-256 is cryptographically weak, of which there is no indication at the current time). Barring some cryptographic weakness, it doesn't matter where you start. The only reason to use a subset of the possible inputs would be if you knew that some range of inputs has a higher than normal probability of producing a solution than other inputs. If that is the case you might only hash certain inputs, but even there picking randomly would be pointless. The goal is to attempt as many candidates as possible in the shortest amount of time. You can't think "nonce = rand.GetNext" is going to be faster than "nonce++".
Yes, nonce++ would definitely be faster than any pseudorandom algorithm, except perhaps with some clever hardware implementation. But as you say, using random numbers wouldn't improve the proof of work speed.
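To make the nonce++ point concrete, here is a minimal sketch of the inner search loop, assuming OpenSSL's SHA256() is available for the double hash; the 80-byte header layout is real, but the target comparison is simplified for illustration and this is not a full validity check.

```cpp
#include <openssl/sha.h>

#include <array>
#include <cstdint>
#include <cstring>
#include <optional>

// Double SHA-256, as used for Bitcoin block hashing.
static std::array<uint8_t, 32> sha256d(const uint8_t* data, size_t len) {
    std::array<uint8_t, 32> first{}, second{};
    SHA256(data, len, first.data());                     // first pass
    SHA256(first.data(), first.size(), second.data());   // second pass
    return second;
}

// Sweep all 2^32 nonce values for one fixed header (the "nonce++" strategy).
// Returns the winning nonce, or std::nullopt if the whole range fails.
static std::optional<uint32_t> scan_nonces(std::array<uint8_t, 80> header,
                                           const std::array<uint8_t, 32>& target) {
    for (uint64_t n = 0; n <= 0xFFFFFFFFULL; ++n) {
        const uint32_t nonce = static_cast<uint32_t>(n);
        std::memcpy(header.data() + 76, &nonce, sizeof(nonce)); // nonce = last 4 header bytes
        const auto hash = sha256d(header.data(), header.size());
        // Simplified byte-wise comparison against the target; real consensus
        // code interprets the hash and target as 256-bit integers.
        if (std::memcmp(hash.data(), target.data(), hash.size()) <= 0)
            return nonce;
    }
    return std::nullopt; // exhausted: change the timestamp or merkle root and retry
}
```

Swapping the `++n` increment for a PRNG draw would not change the expected number of hashes needed; it would only add work per attempt.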
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
July 08, 2014, 06:56:21 PM |
|
Literally infinite? I would have thought the variables could be enumerated and each has a finite set of possible values.
In theory you could have an infinite number of merkle trees, but since the hash is a finite value you would eventually get a hash collision, as the merkle tree is compressed down to a 256-bit value (the merkle root hash). So you are right, there is a finite number of possible inputs (barring a protocol change). The header is 640 bits, so there are 2^640 possible inputs, although not all of them would be valid under the current protocol at any given point in time (i.e. a timestamp >2 hours from now or a version >2 would be invalid right now, although they may be valid in the future).
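For reference, here is a rough sketch of the 80-byte (640-bit) header being described, with the field widths that add up to 640 bits; the field names follow common usage and serialization details such as endianness are omitted.

```cpp
#include <array>
#include <cstdint>

// Rough sketch of the 80-byte (640-bit) block header discussed above.
struct BlockHeader {
    int32_t                  version;      //  32 bits
    std::array<uint8_t, 32>  prev_hash;    // 256 bits: hash of the previous block header
    std::array<uint8_t, 32>  merkle_root;  // 256 bits: root hash of the transaction merkle tree
    uint32_t                 timestamp;    //  32 bits: Unix time
    uint32_t                 bits;         //  32 bits: compact encoding of the target
    uint32_t                 nonce;        //  32 bits: the field miners increment
};                                         // total: 640 bits = 80 bytes
```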
|
|
|
|
Anders (OP)
|
|
July 08, 2014, 07:00:30 PM |
|
For any given block header, there are exactly 2^32 possible values to test.
If you don't find a solution within those 2^32 values, it is necessary to give up on the block you are working on and try a new block. This can be done by modifying the timestamp or by modifying the merkle root.
Ah! That explains it. Several timestamps or modifications may have to be tried if no solution was found in the 2^32 combinations.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
July 08, 2014, 07:03:31 PM |
|
Well, all the txns are distilled down to a single merkle root hash, which only has 256 bits. Originally I was thinking of an infinite number of merkle trees, but that doesn't matter because there is still a finite number of merkle root hashes. Still, even with nothing else, 256 bits for the merkle root hash + 32-bit nonce + 32-bit timestamp = 320 bits. Infinite or finite, as you pointed out, unless some subset of the values is more likely to produce a solution it doesn't matter where you start.
|
|
|
|
DannyHamilton
Legendary
Offline
Activity: 3486
Merit: 4851
|
|
July 08, 2014, 07:03:57 PM |
|
For any given block header, there are exactly 2^32 possible values to test.
If you don't find a solution within those 2^32 values, it is necessary to give up on the block you are working on and try a new block. This can be done by modifying the timestamp or by modifying the merkle root.
Ah! That explains it. Several timestamps or modifications may have to be tried if no solution was found in the 2^32 combinations. And with multiple processors (multiple ASIC threads), multiple timestamp/merkle-root variations are generally being tried simultaneously. Essentially, every thread across the entire bitcoin mining network is working on its own variation of timestamp and merkle root. As soon as it finishes its 2^32 attempts, it starts on a new unique timestamp/merkle-root combination. This continues until the block is solved, at which time there is a new "previous block hash" and the process starts over.
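Continuing the earlier sketch, that outer loop might look roughly like this; rebuild_merkle_root() is a hypothetical helper standing in for bumping the coinbase extraNonce and recomputing the merkle tree, and scan_nonces() is the 2^32-nonce sweep from the earlier sketch (neither is a real Bitcoin Core API).

```cpp
#include <algorithm>
#include <array>
#include <cstdint>
#include <optional>

// Assumed helpers for this sketch (not real Bitcoin Core APIs):
std::array<uint8_t, 32> rebuild_merkle_root(uint64_t extra_nonce);   // new root after bumping extraNonce
std::optional<uint32_t> scan_nonces(std::array<uint8_t, 80> header,  // the 2^32 sweep from the earlier sketch
                                    const std::array<uint8_t, 32>& target);

void mine_block(std::array<uint8_t, 80> header, const std::array<uint8_t, 32>& target) {
    for (uint64_t extra_nonce = 0; ; ++extra_nonce) {
        // Each extraNonce value yields a fresh merkle root, i.e. a new 2^32 nonce space.
        const auto root = rebuild_merkle_root(extra_nonce);
        std::copy(root.begin(), root.end(), header.begin() + 36); // merkle root = header bytes 36..67
        if (scan_nonces(header, target)) {
            return; // block found; a real miner would assemble and broadcast it here
        }
        // All 2^32 nonces failed for this root: bump extraNonce (or the timestamp) and retry.
    }
}
```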
|
|
|
|
gmaxwell
Moderator
Legendary
Offline
Activity: 4284
Merit: 8808
|
|
July 08, 2014, 07:05:52 PM |
|
Ah! That explains it. Several timestamps or modifications may have to be tried if no solution was found in the 2^32 combinations.
Several indeed: On average the network tries 16,818,461,371 modifications (each allowing 2^32 hash attempts) per block solved these days.
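As a rough back-of-the-envelope check on that figure (a sketch, not exact protocol math): each modification opens a fresh 2^32-nonce range, a block takes about difficulty × 2^32 hashes on average, so the quoted modification count is roughly the network difficulty itself.

```cpp
#include <cstdio>

int main() {
    // Figures taken from the post above; the quoted modification count is
    // roughly the network difficulty at the time.
    const double modifications_per_block = 16818461371.0;  // quoted above
    const double hashes_per_modification = 4294967296.0;   // 2^32 nonce values each
    const double hashes_per_block = modifications_per_block * hashes_per_modification;
    const double implied_hashrate = hashes_per_block / 600.0; // ~one block per 600 s

    std::printf("hashes per block  : %.3e\n", hashes_per_block);      // ~7.2e19
    std::printf("implied hash rate : %.3e H/s\n", implied_hashrate);  // ~1.2e17 H/s (~120 PH/s)
    return 0;
}
```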
|
|
|
|
DannyHamilton
Legendary
Offline
Activity: 3486
Merit: 4851
|
|
July 08, 2014, 07:06:18 PM |
|
Still, even with nothing else, 256 bits for the merkle root hash + 32-bit nonce + 32-bit timestamp = 320 bits.
2^320 = 2.14 × 10^96. Let's round up and call it a googol.
|
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
July 08, 2014, 07:07:53 PM |
|
Well, all the txns are distilled down to a single merkle root hash, which only has 256 bits. Originally I was thinking of an infinite number of merkle trees, but that doesn't matter because there is still a finite number of merkle root hashes. Still, even with nothing else, 256 bits for the merkle root hash + 32-bit nonce + 32-bit timestamp = 320 bits. Infinite or finite, as you pointed out, unless some subset of the values is more likely to produce a solution it doesn't matter where you start.
I don't think there can be an infinite number of merkle trees with a 1MB block size limit. While we are on the topic, why does Bitcoin use a double hash for the Merkle root?
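For context on the double hash being asked about, here is a minimal sketch of the merkle-root construction, assuming OpenSSL for SHA-256: each pair of child hashes is concatenated and hashed twice, and an odd node at any level is paired with itself. (Why Satoshi chose double hashing is a separate question; this only shows what gets computed.)

```cpp
#include <openssl/sha.h>

#include <algorithm>
#include <array>
#include <cstdint>
#include <vector>

using Hash = std::array<uint8_t, 32>;

// Hash a concatenated pair of child nodes with SHA-256 applied twice.
static Hash sha256d_pair(const Hash& left, const Hash& right) {
    uint8_t buf[64];
    std::copy(left.begin(), left.end(), buf);
    std::copy(right.begin(), right.end(), buf + 32);
    Hash first{}, second{};
    SHA256(buf, sizeof(buf), first.data());              // first pass
    SHA256(first.data(), first.size(), second.data());   // second pass (the "double hash")
    return second;
}

// Reduce the transaction hashes level by level until one root remains.
// Assumes at least one transaction hash; an odd node is paired with itself.
static Hash merkle_root(std::vector<Hash> level) {
    while (level.size() > 1) {
        if (level.size() % 2 == 1)
            level.push_back(level.back());               // duplicate the last node if odd
        std::vector<Hash> next;
        for (size_t i = 0; i < level.size(); i += 2)
            next.push_back(sha256d_pair(level[i], level[i + 1]));
        level = std::move(next);
    }
    return level.front();
}
```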
|
|
|
|
DannyHamilton
Legendary
Offline
Activity: 3486
Merit: 4851
|
|
July 08, 2014, 07:14:07 PM |
|
I don't think there can be an infinite number of merkle trees with a 1MB block size limit.
True. It can't be any more than 2^8,000,000 possible merkle tree variations (8 bits per byte, 1 million bytes). Want to convert that into base 10 for me? My calculator won't do it.
|
|
|
|
Anders (OP)
|
|
July 08, 2014, 07:20:11 PM |
|
I don't think there can be an infinite number of merkle trees with a 1MB block size limit.
True. It can't be any more than 2^8,000,000 possible merkle tree variations (8 bits per byte, 1 million bytes). Want to convert that into base 10 for me? My calculator won't do it.
Decimal approximation: 7.5612130194946271264814592984191480373499754082086168... × 10^8428839 (http://www.wolframalpha.com/input/?i=2^28%2C000%2C000)
|
|
|
|
DannyHamilton
Legendary
Offline
Activity: 3486
Merit: 4851
|
|
July 08, 2014, 07:27:27 PM |
|
I don't think there can be an infinite number of merkle trees with a 1MB block size limit.
True. It can't be any more than 2^8,000,000 possible merkle tree variations (8 bits per byte, 1 million bytes). Want to convert that into base 10 for me? My calculator won't do it.
Decimal approximation: 7.5612130194946271264814592984191480373499754082086168... × 10^8428839 (http://www.wolframalpha.com/input/?i=2^28%2C000%2C000)
Given that there are an estimated 10^80 particles in the known universe, the number of grains of sand that the entire known universe could hold (given the average size of a grain of sand) is estimated at 10^90, and the number of Planck volumes that would fit in the known universe is approximately 10^183, I think we've reached the point where the number of possibilities is indistinguishable from infinite.
|
|
|
|
|
Anders (OP)
|
|
July 08, 2014, 07:37:57 PM |
|
Given that there are an estimated 10^80 particles in the known universe, the number of grains of sand that the entire known universe could hold (given the average size of a grain of sand) is estimated at 10^90, and the number of Planck volumes that would fit in the known universe is approximately 10^183, I think we've reached the point where the number of possibilities is indistinguishable from infinite.
I typed in the wrong number. Here is the correct number: 9.2323412683466475285638791370210766478370544416421428... × 10^2408239 (http://www.wolframalpha.com/input/?i=2^8%2C000%2C000). Still beyond astronomical, though, lol.
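A quick way to check that magnitude without a big-number calculator is to take logarithms: the decimal exponent of 2^8,000,000 is 8,000,000 × log10(2). A tiny sketch:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Decimal exponent of 2^8,000,000 via logarithms.
    const double exponent = 8000000.0 * std::log10(2.0);    // ~2408239.97
    std::printf("2^8,000,000 is about 10^%.2f\n", exponent); // i.e. ~9.2 x 10^2408239
    return 0;
}
```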
|
|
|
|
Anders (OP)
|
|
July 08, 2014, 07:39:56 PM |
|
Yours doesn't make sense, Anders: how can 2^y > 10^y?
I did some sloppy typing and had one extra 2 in the exponent by mistake.
|
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
July 08, 2014, 07:43:40 PM |
|
Given that there are an estimated 10^80 particles in the known universe, the number of grains of sand that the entire known universe could hold (given the average size of a grain of sand) is estimated at 10^90, and the number of Planck volumes that would fit in the known universe is approximately 10^183, I think we've reached the point where the number of possibilities is indistinguishable from infinite.
I typed in the wrong number. Here is the correct number: 9.2323412683466475285638791370210766478370544416421428... × 10^2408239 (http://www.wolframalpha.com/input/?i=2^8%2C000%2C000). Still beyond astronomical, though, lol.
Still much smaller than a googolplex... so not very impressive as far as "large numbers" go, hehe. Also still wondering why Satoshi decided we should be using double hashes in the Merkle tree.
|
|
|
|
Anders (OP)
|
|
July 08, 2014, 08:16:45 PM |
|
One idea that came to mind is that perhaps the Bitcoin proof of work solutions are non-randomly distributed. A complex and even chaotic distribution, sure, or else someone would have found patterns by now, yet still maybe not truly random. In that case certain number generators could produce faster results than merely incrementing the nonce, and with statistical analysis good number generators could be discovered.
|
|
|
|
DannyHamilton
Legendary
Offline
Activity: 3486
Merit: 4851
|
|
July 08, 2014, 08:19:30 PM |
|
One idea that came to mind is that perhaps the Bitcoin proof of work solutions are non-randomly distributed. A complex and even chaotic distribution, sure, or else someone would have found patterns by now, yet still maybe not truly random. In that case certain number generators could produce faster results than merely incrementing the nonce, and with statistical analysis good number generators could be discovered.
Good luck. If you can figure it out, you'll be quite wealthy. Some of the best minds in mathematics have studied SHA-256. I'm not aware of anyone finding any discernible pattern in the output.
|
|
|
|
xenon481
|
|
July 09, 2014, 12:47:55 AM |
|
Still much smaller than a googolplex,...so not very impressive as far as "large numbers" go. hehe.
And also much much unfathomably smaller than Graham's Number.
|
Tips Appreciated: 171TQ2wJg7bxj2q68VNibU75YZB22b7ZDr
|
|
|
Anders (OP)
|
|
July 09, 2014, 02:55:09 PM |
|
One idea that came to mind is that perhaps the Bitcoin proof of work solutions are non-randomly distributed. A complex and even chaotic distribution, sure, or else someone would have found patterns by now, yet still maybe not truly random. In that case certain number generators could produce faster results than merely incrementing the nonce, and with statistical analysis good number generators could be discovered.
Good luck. If you can figure it out, you'll be quite wealthy. Some of the best minds in mathematics have studied SHA-256. I'm not aware of anyone finding any discernible pattern in the output.
I was thinking of a brute-force test where numerous number generators with different distributions are used for finding Bitcoin nonces. Even genetic algorithms could be used, where mutations of the best generators are tested in a tree-like structure. It will probably fail to find any serious weakness in SHA-256. Here is a comment similar to yours: "Cryptographic experts have so far failed to crack SHA256 and only a small number of defects (possible collision scenarios) have been declared which although reducing the theoretical security (of a brute force attack) do not compromise SHA256 in practical use." (http://bitcoin.stackexchange.com/questions/17132/is-bitcoin-mining-itself-compromising-the-security-of-sha256) Still, it would be fun to try. It seems that the experts take more mathematical approaches to finding collisions etc.; my approach here is more of a brute-force, shoot-from-the-hip, millions-of-bullets approach.
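As a toy version of that brute-force idea (a sketch only, with the difficulty drastically reduced so it finishes quickly), one could compare how many attempts a sequential counter versus a PRNG needs to find an input whose double SHA-256 starts with a couple of zero bytes, assuming OpenSSL for the hashing; if SHA-256 behaves randomly, both strategies should need roughly the same number of attempts on average.

```cpp
#include <openssl/sha.h>

#include <cstdint>
#include <cstdio>
#include <random>

// Toy target: does the double SHA-256 of an 8-byte input start with k zero bytes?
static bool meets_toy_target(uint64_t candidate, int zero_bytes) {
    unsigned char first[32], second[32];
    SHA256(reinterpret_cast<const unsigned char*>(&candidate), sizeof(candidate), first);
    SHA256(first, sizeof(first), second);                 // double hash, as in Bitcoin
    for (int i = 0; i < zero_bytes; ++i)
        if (second[i] != 0) return false;
    return true;
}

int main() {
    const int zero_bytes = 2;            // toy difficulty: ~65,536 attempts expected
    std::mt19937_64 rng(12345);          // fixed seed so the run is reproducible

    uint64_t seq_attempts = 0;           // "nonce++" strategy
    for (uint64_t n = 0; ; ++n) {
        ++seq_attempts;
        if (meets_toy_target(n, zero_bytes)) break;
    }

    uint64_t rnd_attempts = 0;           // "nonce = rand.GetNext" strategy
    while (true) {
        ++rnd_attempts;
        if (meets_toy_target(rng(), zero_bytes)) break;
    }

    std::printf("sequential: %llu attempts, random: %llu attempts\n",
                static_cast<unsigned long long>(seq_attempts),
                static_cast<unsigned long long>(rnd_attempts));
    return 0;
}
```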
|
|
|
|
Cubic Earth
Legendary
Offline
Activity: 1176
Merit: 1020
|
|
July 10, 2014, 06:52:24 AM |
|
Probably not. Remember the nonce range is only 2^32. A 1 GH/s chip will complete that in a matter of seconds; a 10 GH/s chip in a few hundred milliseconds. If you have multiple chips, they all start at a nonce of zero; they are just working on different work. In hindsight it would have been better if Satoshi had made the nonce range larger (at least 64 bits), but that is water under the bridge. I did point out a way that the block header could be changed to support this. It would require a hard fork, but it could be done in a manner which is compatible with existing hardware.
I see what you're saying: having a larger nonce space would make mining more efficient. But since mining is proof of work, I don't see how making it more efficient for everyone would change anything at all. In fact, if searching the SHA-256 space were too efficient, we would have to switch the proof of work to something harder. It's highly doubtful we would ever face such a problem, but my point is that proof of work need not be made easier.
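As a quick check on the exhaustion times quoted above, here is a tiny sketch computing how long a full 2^32-nonce sweep takes at a few hash rates (illustrative numbers only):

```cpp
#include <cstdio>

int main() {
    const double nonce_space = 4294967296.0;   // 2^32 nonce values per header
    const double rates[] = {1e9, 10e9, 1e12};  // 1 GH/s, 10 GH/s, 1 TH/s
    for (const double rate : rates)
        std::printf("%14.0f H/s -> %.4f s to exhaust the nonce range\n",
                    rate, nonce_space / rate); // ~4.29 s, ~0.43 s, ~0.004 s
    return 0;
}
```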
|
|
|
|
|