Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 10:31:25 AM 

If I understood correctly, in most situations where hashing occurs, the Bitcoin protocol applies double hashing: sha256(sha256(input)) rather than sha256(input).
Why is this? Doesn't it actually reduce entropy?
As far as I know, we cannot guarantee that there aren't many different inputs x and y which have different hashes but the same double hashes. That is, sha256 is not a one-to-one mapping on the 256-bit space, right?
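For reference, the double hash in question is just one hash fed into another. A minimal Python sketch (hashlib is the standard library module; the input string is arbitrary, not a Bitcoin-specific value):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin-style double hash: sha256(sha256(data))."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# The outer hash sees only the 32-byte inner digest, never the original input:
single = hashlib.sha256(b"hello").digest()
double = double_sha256(b"hello")
```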










kokjo
Legendary
Offline
Activity: 1050
You are WRONG!


June 11, 2012, 10:48:37 AM 

That is correct, but the design decision behind double sha256 was, I think, to slow mining down; using double sha256 all around is just for consistency.
It won't be a problem as long as you don't search the whole 256-bit space for a collision (which you don't want to do).
DISCLAIMER: I'm not a cryptographer.

"The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." Bertrand Russell



kokjo
Legendary
Offline
Activity: 1050
You are WRONG!


June 11, 2012, 11:18:37 AM 

If I understood correctly, in most situations where hashing occurs, the Bitcoin protocol applies double hashing: sha256(sha256(input)) rather than sha256(input).
Why is this? Doesn't it actually reduce entropy?
As far as I know, we cannot guarantee that there aren't many different inputs x and y which have different hashes but the same double hashes. That is, sha256 is not a one-to-one mapping on the 256-bit space, right?

First, do we actually *know* that sha256 is *not* a one-to-one mapping on the 256-bit space? If it turns out to be, then you've got nothing. I don't know the answer, I'm not a professional cryptographer, but looking at the code for SHA256, there doesn't seem to be an obvious dropping of bits within the transform step itself, but then I am too lazy to analyze it in depth. Would love for someone more knowledgeable than I to comment.

Second, if SHA256 does indeed somehow drop information for a 256-bit input, IMO, if there's a reduction in entropy, it's likely to be negligible compared to the additional work needed to untangle the complexity added by the second round.

Finally, what someone said: the likely intent of the team who designed Bitcoin was to slow mining down, not to add a layer of security there. Arguably, they failed because they didn't foresee the lengths to which people would go to mine coins (first GPUs, then FPGAs, then dedicated ASICs). Had they realized, they would have added an scrypt-like round to the hash step.

Satoshi encouraged people to mine with GPUs; he did foresee this.

"The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." Bertrand Russell



Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 11:25:00 AM 

Sure, I get that double hashing was intended to slow down mining, as well as to vex brute forcing. But perhaps sha256(sha256(input)+input) would have been better.

First, do we actually *know* that sha256 is *not* a one-to-one mapping on the 256-bit space? If it turns out to be, then you've got nothing. I don't know the answer, I'm not a professional cryptographer, but looking at the code for SHA256, there doesn't seem to be an obvious dropping of bits within the transform step itself, but then I am too lazy to analyze it in depth.

Can't tell for sure, but obviously sha256 is designed to be irreversible (besides the mathematical fact that for inputs larger than 256 bits it's bound to lose information). For example, if it does something like X=A+B (with + being binary addition modulo 2^32), it "drops bits" in the sense that a 1-bit in X could have come from either A or B. Similarly for X=(A<<n)+(B>>m) etc., albeit for different bits.

Second, if SHA256 does indeed somehow drop information for a 256-bit input, IMO, if there's a reduction in entropy, it's likely to be negligible compared to the additional work needed to untangle the complexity added by the second round.

Could be, or not; I really don't see why or why not that would be negligible. I hope someone with in-depth knowledge about hashing can confirm or deny this? (Hopefully confirm.)

Finally, what someone said: the likely intent of the team who designed Bitcoin was to slow mining down, not to add a layer of security there. Arguably, they failed because they didn't foresee the lengths to which people would go to mine coins (first GPUs, then FPGAs, then dedicated ASICs). Had they realized, they would have added an scrypt-like round to the hash step.

You mean bcrypt? (I'm not familiar with scrypt; perhaps you mean something similar. I think bcrypt would have been a very good, flexible, future-proof choice for Bitcoin.)
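Reading the '+' in the proposed sha256(sha256(input)+input) as byte concatenation (which the pseudocode later in the thread suggests), the variant can be sketched in a few lines of Python. This is an illustration of the idea under that assumption, not anything the protocol actually uses:

```python
import hashlib

def alt_double_hash(data: bytes) -> bytes:
    """sha256(sha256(input) + input): because the original input is fed
    into the outer hash again, two inputs whose inner hashes collide
    still present different messages to the outer round, so the second
    round cannot lose entropy that the first round preserved."""
    inner = hashlib.sha256(data).digest()
    return hashlib.sha256(inner + data).digest()
```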




pusle


June 11, 2012, 11:27:42 AM 

The way I understand it is that adding more rounds makes it harder to find a shortcut/weakness. The fixed length of the input to the second sha256 stage also removes one type of "attack" often used against such algorithms.




Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 11:35:29 AM 

The way I understand it is that adding more rounds makes it harder to find a shortcut/weakness. The fixed length of the input to the second sha256 stage also removes one type of "attack" often used against such algorithms.

OK, nonetheless I don't see any demerits (yet it does overcome certain risks) in using the alternative double hashing scheme I mentioned above. Or more generally:

    Hash(input, depth) :=  Hash(input)                            if depth == 1
                           Hash(Hash(input) + input, depth - 1)   otherwise

(where 'Hash' can be any regular hashing function, such as sha256)




kokjo
Legendary
Offline
Activity: 1050
You are WRONG!


June 11, 2012, 11:38:36 AM 

Hash(input, depth) :=  Hash(input)                            if depth == 1
                       Hash(Hash(input) + input, depth - 1)   otherwise

(where 'Hash' can be any regular hashing function, such as sha256)

Don't write recursive code. It's bad, for the stack and for people's minds. Also, your code is kind of broken... what does Hash do with only one input?

"The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." Bertrand Russell



Ukigo


June 11, 2012, 11:56:05 AM 

You mean bcrypt? (not familiar with scrypt, perhaps you mean something similar, I think bcrypt would have been a very good, flexible, futureproof choice for Bitcoin)
No, he means this : http://www.tarsnap.com/scrypt.html

"...Enemies are everywhere ! Angka is all rage ! Be a good soldiers, blow everything... " < Pol Pot (C)



Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 12:06:05 PM 

Don't write recursive code. It's bad, for the stack and for people's minds.

Well, it was pseudocode, just to get the general idea. In more detail (this does something different than the example above):

    function DeepHash(input, depth) {
        h = '';
        while (depth-- > 0)
            h = Hash(h + input);  // where Hash(x) is a regular hashing function, e.g. sha256
        return h;
    }

DeepHash(input,1) is the same as Hash(input)
DeepHash(input,2) gives Hash(Hash(input)+input)
DeepHash(input,3) gives Hash(Hash(Hash(input)+input)+input)
etc.

Also, your code is kind of broken... what does Hash do with only one input?

It was a polymorphic function: a two-parameter alternative to the already existing Hash function with one input (i.e. regular sha256).
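For the record, the same loop is runnable in Python, assuming '+' means byte concatenation (as the string-based h = '' in the pseudocode suggests) and sha256 as the underlying Hash:

```python
import hashlib

def deep_hash(data: bytes, depth: int) -> bytes:
    """Iterative DeepHash: each round hashes the previous digest
    concatenated with the original input."""
    h = b""
    for _ in range(depth):
        h = hashlib.sha256(h + data).digest()
    return h
```

With depth=1 this is plain sha256(input); with depth=2 it is sha256(sha256(input)+input), matching the examples above.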




Meni Rosenfeld


June 11, 2012, 12:15:53 PM 

I did a calculation which says that every application of SHA256 reduces entropy by about 0.5734 bits. I have no idea if that's correct. The reason for this sacrifice is almost certainly to prevent cracks in SHA256 from being immediately translated into an attack on Bitcoin hashing.

First, do we actually *know* that sha256 is *not* a one-to-one mapping on the 256-bit space? If it turns out to be, then you've got nothing. I don't know the answer, I'm not a professional cryptographer, but looking at the code for SHA256, there doesn't seem to be an obvious dropping of bits within the transform step itself, but then I am too lazy to analyze it in depth.

SHA256, as a cryptographic hash function, aspires to be indistinguishable from random. If it were in fact random, the number of preimages of every 256-bit element would follow the Poisson distribution: about 36% would have no preimage, 36% would have one, 18% two, 6% three, and so on. So I'd say it's almost certain that it's not a 1-1 mapping.

Finally, what someone said: the likely intent of the team who designed Bitcoin was to slow mining down, not to add a layer of security there.

What could that mean? The difficulty controls the mining rate. If a hash function half as hard were chosen, the difficulty would double and you'd have the same generation rate.

Arguably, they failed because they didn't foresee the lengths to which people would go to mine coins (first GPUs, then FPGAs, then dedicated ASICs).

Of course they foresaw all of this, if not the timing of its advent.

Had they realized, they would have added an scrypt-like round to the hash step.

The hash function should be easy to verify: each application should be fast, but block generation requires many applications. Choosing a slow hash function would be counterproductive.

Satoshi encouraged people to mine with GPUs; he did foresee this.

Eh. I remember hearing the opposite. I probably remember wrong. I know of one comment Satoshi made about GPUs, and it wasn't an encouragement: "We should have a gentleman's agreement to postpone the GPU arms race as long as we can for the good of the network. It's much easier to get new users up to speed if they don't have to worry about GPU drivers and compatibility. It's nice how anyone with just a CPU can compete fairly equally right now."
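The Poisson claim is easy to check empirically at toy scale. A sketch using SHA-256 truncated to 16 bits as a stand-in "random" function (the expected fractions assume randomness; they are not a measured property of full 256-bit SHA-256):

```python
import hashlib
from collections import Counter

# Hash all 2^16 16-bit inputs with SHA-256 truncated to 16 bits, then
# count how many 16-bit outputs have 0, 1, 2, ... preimages. For a
# random function these fractions follow Poisson(1): ~36.8% with no
# preimage, ~36.8% with one, ~18.4% with two, and so on.
N = 2 ** 16
preimages = Counter()
for x in range(N):
    h = hashlib.sha256(x.to_bytes(2, "big")).digest()
    preimages[int.from_bytes(h[:2], "big")] += 1

counts = Counter(preimages.values())
frac_no_preimage = (N - len(preimages)) / N  # expected ~ 1/e = 0.368
frac_one = counts[1] / N                     # expected ~ 1/e = 0.368
frac_two = counts[2] / N                     # expected ~ 1/(2e) = 0.184
```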




Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 12:24:36 PM 

I did a calculation which says that every application of SHA256 reduces entropy by about 0.5734 bits. I have no idea if that's correct.

Ah, interesting. Well, if that's true, the problem seems insignificant, although it could have been prevented entirely by an approach like the one I posted above (just an example; obviously there are numerous alternatives), while still covering the vulnerability to potential cracks in sha256.




kokjo
Legendary
Offline
Activity: 1050
You are WRONG!


June 11, 2012, 12:32:36 PM 

I did a calculation which says that every application of SHA256 reduces entropy by about 0.5734 bits. I have no idea if that's correct.

Ah, interesting. Well, if that's true, the problem seems insignificant, although it could have been prevented entirely by an approach like the one I posted above.

No! The number is wrong: if it were true, the entropy after 512 rounds of sha256 would be below 0, which is not good or insignificant. It's a huge error. An acceptable reduction would be around 10^-80 bits per round. Sorry dude, you did your math wrong.

"The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." Bertrand Russell



TangibleCryptography


June 11, 2012, 12:33:41 PM 

There are lots of "quirks" in the protocol, but the core of it remains solid and sometimes surprisingly inclusive.
Rather than iterative uses of a single hashing function, using two different hashing functions would have provided more resistance in the event that SHA256 is significantly degraded, i.e.:

    RIPEMD160(SHA256(input) + input)

Ironically, a modified version of this structure is used for creating addresses from public keys, but not for hashing.
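The two-function structure is easy to sketch. Python's hashlib does not reliably ship RIPEMD-160 on every build, so the sketch below substitutes SHA3-256 as an arbitrary stand-in second function; the point is only the shape outer(inner(input) + input), not the specific algorithms:

```python
import hashlib

def chained_hash(data: bytes) -> bytes:
    """outer(inner(data) + data) with two unrelated hash functions:
    SHA-256 inside, SHA3-256 outside (stand-in for RIPEMD-160).
    A break of either function alone does not break the pair."""
    inner = hashlib.sha256(data).digest()
    return hashlib.sha3_256(inner + data).digest()
```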




Pieter Wuille


June 11, 2012, 12:48:03 PM 

I did a calculation which says that every application of SHA256 reduces entropy by about 0.5734 bits. I have no idea if that's correct.

Assuming SHA256 is a random function (mapping every input to a uniformly random, independent output), you will end up having (on average) (1-1/e)*2^256 different outputs. This indeed means a loss of entropy of about half a bit. Further iterations map a smaller space into an output space of 2^256, and the loss of entropy of each further application drops very quickly. It's certainly not the case that you lose any significant amount by doing 1000 iterations or so.
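Both claims (a first-round loss of a factor ~(1-1/e), and much smaller losses afterwards) can be watched directly on a small space. A sketch using SHA-256 truncated to 16 bits as a toy random function:

```python
import hashlib

def f(x: int) -> int:
    """SHA-256 truncated to 16 bits: a toy random function on 2^16 values."""
    d = hashlib.sha256(x.to_bytes(2, "big")).digest()
    return int.from_bytes(d[:2], "big")

# Iterate f and watch the image shrink. The first round keeps roughly
# (1 - 1/e) ~ 63.2% of the space; each later round keeps a larger
# fraction of what is left, so the per-round loss shrinks quickly.
space = set(range(2 ** 16))
sizes = []
for _ in range(5):
    space = {f(x) for x in space}
    sizes.append(len(space))

first_round_ratio = sizes[0] / 2 ** 16        # expected ~ 0.632
second_round_ratio = sizes[1] / sizes[0]      # expected larger than the first
```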

aka sipa, core dev team
Tips and donations: 1KwDYMJMS4xq3ZEWYfdBRwYG2fHwhZsipa



Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 12:53:28 PM 

No! The number is wrong: if it were true, the entropy after 512 rounds of sha256 would be below 0, which is not good or insignificant. It's a huge error. An acceptable reduction would be around 10^-80 bits per round. Sorry dude, you did your math wrong.

Eh no, it doesn't work that way. Decreasing entropy from 256 bits to (256 - 0.5734) = 255.4266 bits means a reduction by a factor of 255.4266/256 = 0.997760156250, and 0.997760156250^512 ≈ 0.317243314, so after 512 rounds about 31.7%, or 81.2 bits, of entropy remains.

FYI: the standard Bitcoin-Qt client applies many tens of thousands of hashing rounds (or sometimes hundreds of thousands, depending on your CPU speed) to create the master key for wallet encryption. Well, rest assured, all wallets do not have the same master key.
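The arithmetic above checks out, with the caveat that a fixed proportional loss per round is this post's modelling assumption, not an established property of SHA-256 (later replies argue the per-round loss actually shrinks):

```python
# Per-round factor under the proportional-loss assumption, applied 512 times.
factor = (256 - 0.5734) / 256               # = 0.997760156250
remaining_fraction = factor ** 512          # ~ 0.3172, i.e. ~31.7%
remaining_bits = 256 * remaining_fraction   # ~ 81.2 bits
```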




Meni Rosenfeld


June 11, 2012, 12:57:56 PM 

I did a calculation which says that every application of SHA256 reduces entropy by about 0.5734 bits. I have no idea if that's correct.

Mmmh. Does that mean that sha256^N(some 256-bit input), for N sufficiently large, would always converge to the same value, independent of the actual input?

No, because the assumptions I made become less true the more rounds are done (maybe they're not even accurate enough after one round). The set of all possible images of SHA256^N becomes smaller for larger N until it converges to a fixed set (which is probably very large). Then SHA256 is a permutation (a one-to-one mapping) on this set. (This is true for every function from a space to itself.)

First, do we actually *know* that sha256 is *not* a one-to-one mapping on the 256-bit space? If it turns out to be, then you've got nothing. I don't know the answer, I'm not a professional cryptographer, but looking at the code for SHA256, there doesn't seem to be an obvious dropping of bits within the transform step itself, but then I am too lazy to analyze it in depth.

SHA256, as a cryptographic hash function, aspires to be indistinguishable from random. If it were in fact random, the number of preimages of every 256-bit element would follow the Poisson distribution: about 36% would have no preimage, 36% would have one, 18% two, 6% three, and so on. So I'd say it's almost certain that it's not a 1-1 mapping.

Very interesting observation, and you're probably correct. Even more interesting would be to find a way to measure whether that is indeed the case (even if the verification is in a Monte Carlo sense).

That's probably as hard as determining whether the function is broken.

Finally, what someone said: the likely intent of the team who designed Bitcoin was to slow mining down, not to add a layer of security there.

What could that mean? The difficulty controls the mining rate. If a hash function half as hard were chosen, the difficulty would double and you'd have the same generation rate.

You're right, but then I'm not entirely convinced by the 'this buys us time the day sha256 gets cracked' argument either. If that was their concern, why not combine two wildly different 256-bit hash algorithms and XOR the results?

That probably could work too, but it also doesn't guarantee improved security. Maybe the two hash functions are actually connected in some unexpected way and the XOR is actually weaker than either. SHA256 itself is multiple iterations of a basic function. I'm guessing it is assumed that the basic function itself is OK and that the more times you apply it, the harder it is to attack.
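The XOR-combination being discussed is simple to sketch. SHA3-256 serves here as an arbitrary stand-in for a "wildly different" second 256-bit algorithm; as the reply notes, this construction is not automatically stronger than either function alone:

```python
import hashlib

def xor_hash(data: bytes) -> bytes:
    """XOR of two 256-bit digests from unrelated hash designs
    (SHA-256 and SHA3-256). Only a sketch of the idea discussed
    above, not a recommended construction."""
    a = hashlib.sha256(data).digest()
    b = hashlib.sha3_256(data).digest()
    return bytes(x ^ y for x, y in zip(a, b))
```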




kokjo
Legendary
Offline
Activity: 1050
You are WRONG!


June 11, 2012, 12:58:14 PM 

I should really stop posting in this thread; I'm embarrassing myself.

"The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." Bertrand Russell



Kazimir
Legendary
Offline
Activity: 1036


June 11, 2012, 01:05:15 PM 

No, because the assumptions I made become less true the more rounds are done (maybe they're not even accurate enough after one round). The set of all possible images of SHA256^N becomes smaller for larger N until it converges to a fixed set (which is probably very large). Then SHA256 is a permutation (a one-to-one mapping) on this set. (This is true for every function from a space to itself.)

Ah right, yeah. It would be interesting to know (or at least have a reasonable estimate of) how large this convergence set is. Intuitively I'd say it's quite sufficient to hold as a secure hash for the foreseeable future, although I don't have any reasonable arguments for this.




TangibleCryptography


June 11, 2012, 01:58:22 PM 

Do people who design hash functions target these kinds of properties, or are they a sort of after-the-fact implication of stronger sought-after properties of the function?

It is a property they are targeting. The ideal hashing function is described as a random oracle: http://en.wikipedia.org/wiki/Random_oracle

No hashing function to date is a perfect random oracle, but that is the ideal they are striving for.




Meni Rosenfeld


June 11, 2012, 03:52:32 PM 

I did a calculation which says that every application of SHA256 reduces entropy by about 0.5734 bits. I have no idea if that's correct.

Assuming SHA256 is a random function (mapping every input to a uniformly random, independent output), you will end up having (on average) (1-1/e)*2^256 different outputs. This indeed means a loss of entropy of about half a bit. Further iterations map a smaller space into an output space of 2^256, and the loss of entropy of each further application drops very quickly. It's certainly not the case that you lose any significant amount by doing 1000 iterations or so.

I took it one step further and considered the distribution among the outputs, which is not uniform; the result for the amount of entropy lost is (1/e)*sum(log_2 n / (n-1)!). But this is probably little more than a nice exercise, as SHA256 is likely too far from random for this calculation to be meaningful. Also, I mistakenly input ln rather than log_2 to the software, so the value I wanted really should be 0.827.




