Bitcoin Forum
Author Topic: What if SHA-256 is a poor random oracle?  (Read 4792 times)
Anders (OP)
September 13, 2014, 02:06:23 PM  #1

As the Bitcoin mining power increases, the target for the SHA-256 hash value becomes smaller and thereby more difficult to reach. This maintains the average time for new blocks at 10 minutes.

A possible flaw could potentially be discovered when the target becomes smaller. For a hash function to work as intended it needs to behave like a random oracle.

"In cryptography, a random oracle is an oracle (a theoretical black box) that responds to every unique query with a (truly) random response chosen uniformly from its output domain. If a query is repeated it responds the same way every time that query is submitted." -- http://en.wikipedia.org/wiki/Random_oracle

Evidently, SHA-256 has behaved pretty much like a random oracle for Bitcoin so far, but what if this property starts to degrade at lower target values? A worst-case scenario would be that, for a certain target value, a new block becomes impossible to mine. Bitcoin mining depends on SHA-256 values being distributed randomly; if the randomness is reduced too much for some target value, the worst-case scenario becomes a possibility.
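Under the random-oracle assumption, each mining attempt is an independent Bernoulli trial with success probability target/2^256. A toy sketch in Python (the random 80-byte "header" here is just a stand-in, not a real serialized block header):

```python
import hashlib
import os

def mining_attempt(target: int) -> bool:
    """One mining attempt: double-SHA-256 a random 80-byte stand-in
    'header' and compare the hash, read as an integer, to the target."""
    header = os.urandom(80)  # placeholder, not a real serialized header
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "big") < target

# If SHA-256 behaves like a random oracle, the success rate is simply
# target / 2**256, no matter how small the target gets.
target = 2 ** 248          # requires 8 leading 0-bits, so p = 1/256
trials = 200_000
rate = sum(mining_attempt(target) for _ in range(trials)) / trials
print(rate)                # should come out close to 1/256 ≈ 0.0039
```

The worry in this thread is precisely that, for very small targets, the observed rate might stop matching target/2^256.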
Quokka
September 13, 2014, 02:56:41 PM  #2

It's a bad situation, but fortunately SHA-256 has held up well thus far, so it's unlikely to happen. Of course, that doesn't answer your question of what would happen if it DID. I'd imagine it would probably end in a hard fork to a new algorithm. That would be hugely disruptive and would put a lot of ASIC miners out of business, but it probably wouldn't be the absolute end of Bitcoin.

And then, of course, it becomes a question of whether or not the new algorithm is any more random than SHA-256.
jl2012
September 13, 2014, 02:59:44 PM  #3

I don't believe it would happen, but if it did we would have no choice but to hard-fork.

To make sure existing ASICs still work, we could change the requirement from

Code:
HASH256(version|prev_block|merkle_root|timestamp|bits|nonce) < target

to

Code:
HASH256(version|prev_block|merkle_root|timestamp|bits|nonce) < target

AND

HASH256(HASH256(version|prev_block|merkle_root|timestamp|bits)|nonce) < target


Let's say that, due to a defect in SHA-256, it is impossible to find a block when more than 160 leading 0-bits are required. With the new scheme, each of the two targets would become 80 leading 0-bits, while the overall probability of success remains 1/2^160.

(Currently, the lowest hash found is 80 leading 0-bits: https://bitcointalk.org/index.php?topic=29675.msg8131780#msg8131780)
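The dual check could be sketched like this in Python (the field serialization and function names are illustrative only, not consensus-accurate):

```python
import hashlib

def hash256(data: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def check_pow(fields: bytes, nonce: bytes, target: int) -> bool:
    """Both the ordinary header hash and a second hash that re-commits
    to the nonce must beat the target. If the two conditions were
    independent and each passed with probability 2^-80, the joint
    probability of success would be 2^-160."""
    h1 = int.from_bytes(hash256(fields + nonce), "big")
    h2 = int.from_bytes(hash256(hash256(fields) + nonce), "big")
    return h1 < target and h2 < target
```

With `target = 2**256` every attempt passes, and with `target = 0` none does; in between, each condition individually halves in probability for every extra required 0-bit.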

Donation address: 374iXxS4BuqFHsEwwxUuH3nvJ69Y7Hqur3 (Bitcoin ONLY)
LRDGENPLYrcTRssGoZrsCT1hngaH3BVkM4 (LTC)
PGP: D3CC 1772 8600 5BB8 FF67 3294 C524 2A1A B393 6517
jl2012
September 13, 2014, 04:53:39 PM (last edit: September 14, 2014, 04:40:40 PM)  #4

Actually, I don't think it would become a real problem.

Let's assume that it is impossible to have more than 100 leading 0-bits in an SHA-256 hash. When the target reaches 97 leading 0-bits, 1/8 of the apparently eligible hashes are actually impossible, so the actual difficulty becomes 1.14x the apparent difficulty. It is 1.33x for 98 bits, 2x for 99 bits, 3.41x for 99.5 bits, 5.33x for 99.7 bits, 14.93x for 99.9 bits, etc. This would seriously hinder the growth of the apparent difficulty, and we might never hit the 100-bit boundary (unless someone with massive hashing power suddenly pushes it over).
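Those multipliers follow from a one-line formula: the fraction of apparently-eligible hashes that actually exist is 1 - 2^-(cutoff - target), and the real difficulty is inflated by the inverse. A quick check (assuming a hypothetical hard cutoff at 100 leading 0-bits):

```python
def actual_over_apparent(target_bits: float, cutoff_bits: float = 100) -> float:
    """Ratio of actual to apparent difficulty when no hash can have
    more than cutoff_bits leading 0-bits: the reachable fraction of
    eligible hashes is 1 - 2^-(cutoff - target)."""
    return 1.0 / (1.0 - 2.0 ** -(cutoff_bits - target_bits))

for t in (97, 98, 99, 99.5, 99.7, 99.9):
    print(t, round(actual_over_apparent(t), 2))
# 97 → 1.14, 98 → 1.33, 99 → 2.0, 99.5 → 3.41, 99.7 → 5.33, 99.9 → 14.93
```

The multiplier diverges as the target approaches the cutoff, which is why the apparent difficulty would stall below the boundary rather than cross it.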

Willisius
September 14, 2014, 01:16:00 AM  #5

Are you kidding? SHA256 was made by the NSA. They would never promote a faulty standard, right? ...right?
Anders (OP)
September 14, 2014, 03:14:23 AM  #6

Quote from: Willisius
Are you kidding? SHA256 was made by the NSA. They would never promote a faulty standard, right? ...right?

A random oracle is just an ideal theoretical model. In reality, SHA-256 will at best be an approximation of a random oracle, if the claim in the following quote is correct:

"... This property proves that SHA-256 is not a random oracle. Yet, it does not endanger in any way the resistance of SHA-256 to collisions or preimages. Therefore, being a random oracle seems to be strictly harder than being a secure hash function.

It has actually been shown (by Canetti, Goldreich and Halevi) that random oracles cannot exist "in all generality" in the following sense: it is possible to build pathological signature and asymmetric encryption schemes, which are secure when they internally use a random oracle, but which are insecure whenever an actual computable function is used instead of the mythical gnome-in-the-box.

Summary: proofs in the random oracle model are fine, but are never complete enough to cover a practical implementation: we know that any function we will use in lieu of the random oracle will not be a random oracle; so security relies on the fervent hope that the parts where the actual function is not a random oracle do not impact security. This justifies a bit of mistrust. Still, a proof in the random oracle model is much better than no proof at all." -- http://crypto.stackexchange.com/questions/879/what-is-the-random-oracle-model-and-why-is-it-controversial
Anders (OP)
September 14, 2014, 08:01:46 AM  #7

Hey! I just thought of a way to measure SHA-256's random-oracle behavior so far: take the statistics of the Bitcoin hash rate over time and compare them with the target difficulty. If SHA-256 is a good random oracle, those two metrics should remain proportional over time; deviations (when averaged over longer periods) would indicate that SHA-256 is a poorer random oracle.

Of course, even if the Bitcoin statistics so far show good random-oracle behavior, deviations could still appear later on as the target is reduced and the difficulty increases, unless it can be mathematically proven that the behavior will remain consistent over time. That could be tricky to prove if there are very complex nonlinearities in the distribution of SHA-256 hash values.
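The proportionality can be made concrete. Under the random-oracle assumption, finding a block takes roughly difficulty × 2^32 hashes on average (the standard approximation, since difficulty 1 corresponds to a target near 2^224), so hash rate and difficulty should track each other:

```python
def implied_hashrate(difficulty: float, avg_block_seconds: float = 600.0) -> float:
    """Network hash rate implied by a difficulty level under the
    random-oracle model: ~difficulty * 2**32 hashes per block, one
    block found per avg_block_seconds on average."""
    return difficulty * 2.0 ** 32 / avg_block_seconds

# At difficulty 1 and 10-minute blocks, about 7.16 MH/s keeps pace.
print(implied_hashrate(1.0))
```

A persistent gap between the measured hash rate and this implied value, averaged over long periods, would be the kind of deviation being proposed as a test here.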
Dabs
September 14, 2014, 01:00:53 PM  #8

I've run SHA-256 a few billion times. It looks good: the outputs appear uniformly distributed for whatever inputs I gave it. Check out your favorite bitcoin casino; they do a few million SHA-256 hashes every day.
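A uniformity check along these lines can be sketched with a chi-squared statistic on the first output byte (the counter inputs here are arbitrary; a casino would use its own seeds):

```python
import hashlib

# Bucket the first byte of SHA-256 over sequential counter inputs;
# under the random-oracle hypothesis every bucket is equally likely.
N = 256_000
counts = [0] * 256
for i in range(N):
    counts[hashlib.sha256(i.to_bytes(8, "big")).digest()[0]] += 1

expected = N / 256
chi2 = sum((c - expected) ** 2 / expected for c in counts)
# With 255 degrees of freedom, chi2 should land near 255; values far
# above ~310 (roughly the 99th percentile) would be suspicious.
print(round(chi2, 1))
```

Of course, passing such tests only rules out gross non-uniformity; it says nothing about behavior at extreme targets, which is the scenario the OP is asking about.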

Razick
September 14, 2014, 04:25:00 PM  #9

Quote from: jl2012
Actually, I suspect that it would become a real problem.

Let's assume that it is impossible to have more than 100 leading 0-bits in an SHA-256 hash. When the target becomes 97 leading 0-bits, 1/8 of the eligible hashes are actually impossible. Therefore, the actual difficulty will become 1.14x of the apparent difficulty. 1.33x for 98 bits, 2x for 99 bits, 3.41x for 99.5 bits, 5.33x for 99.7bits, 14.93x for 99.9bits, etc. This will seriously hinder the growth of the apparent difficulty and we may never hit the boundary of 100 bits (unless someone with massive hashing power suddenly push it over the boundary).

Perhaps I am missing something, but didn't you just demonstrate why it might NOT be a problem? The actual difficulty would rise above the apparent difficulty, but the apparent difficulty would never rise to the point where a valid hash is impossible, allowing the network to continue functioning.

If someone with massive hashing power pushed it over the boundary, blocks would stop until the next difficulty adjustment, after which mining would resume. Am I right?

jl2012
September 14, 2014, 04:42:20 PM  #10

Quote from: Razick
Actually, I suspect that it would become a real problem.

Let's assume that it is impossible to have more than 100 leading 0-bits in an SHA-256 hash. When the target becomes 97 leading 0-bits, 1/8 of the eligible hashes are actually impossible. Therefore, the actual difficulty will become 1.14x of the apparent difficulty. 1.33x for 98 bits, 2x for 99 bits, 3.41x for 99.5 bits, 5.33x for 99.7bits, 14.93x for 99.9bits, etc. This will seriously hinder the growth of the apparent difficulty and we may never hit the boundary of 100 bits (unless someone with massive hashing power suddenly push it over the boundary).

Perhaps I am missing something, but didn't you just demonstrate why it might NOT be a problem? The actual difficulty would rise above the apparent difficulty, but the apparent difficulty would never rise to the point where a valid hash would be impossible, thus allowing the network to continue functioning.

If someone with massive hashing power could push it over the boundary, then blocks would stop until the next difficulty adjustment after which mining blocks would resume. Am I right?

Sorry, I meant "I don't think it would become a real problem."

However, if someone with massive hashing power could push it over the boundary, then blocks would stop, and stop forever: the next difficulty adjustment will only happen after 2016 more blocks are found, which will never happen.

Remember: difficulty adjustment happens every 2016 blocks, not every 2 weeks.

Quokka
September 14, 2014, 10:47:38 PM  #11

Quote from: jl2012
Sorry, I meant "I don't think it would become a real problem."

However, if someone with massive hashing power could push it over the boundary, then blocks would stop, and stop forever. The next difficulty adjustment will happen after 2016 blocks are found, which will never happen.

Remember: difficulty adjustment happens every 2016 blocks, not every 2 weeks.

That's true, but (correct me if I'm wrong) it's nothing that couldn't be fixed by a fork. In fact, a fork would be even easier, since the "original" branch of the network would be at a standstill. Granted, the fork would have to include some way of preventing the same situation from happening again, and that may not be feasible.
🏰 TradeFortress 🏰
September 15, 2014, 01:02:54 AM  #12

Quote from: jl2012
Sorry, I meant "I don't think it would become a real problem."

However, if someone with massive hashing power could push it over the boundary, then blocks would stop, and stop forever. The next difficulty adjustment will happen after 2016 blocks are found, which will never happen.

That's under this assumption of yours:

Quote
Let say due to defect in SHA256 it is impossible to find a block if more than 160 leading 0-bits are required.

I don't see any basis for that assumption. Why would there be such a threshold?
jl2012
September 15, 2014, 03:42:15 AM  #13

Sorry, I meant " I don't think it would become a real problem."

However, if someone with massive hashing power could push it over the boundary, then blocks would stop, and stop forever. The next difficulty adjustment will happen after 2016 blocks are found, which will never happen.

That's under this assumption of yours:

Quote
Let say due to defect in SHA256 it is impossible to find a block if more than 160 leading 0-bits are required.

I don't see any basis for the assumption? Why might there be a certain threshold?

It was just a purely academic discussion based on the OP's idea.

Anders (OP)
September 18, 2014, 07:19:22 AM  #14

I found what looks like possible poor random-oracle behavior of SHA-256 in these two charts:

Transaction times: https://blockchain.info/charts/avg-confirmation-time?timespan=2year&showDataPoints=false&daysAverageString=1&show_header=true&scale=0&address=

Difficulty: https://blockchain.info/charts/difficulty?timespan=2year&showDataPoints=false&daysAverageString=1&show_header=true&scale=0&address=
spin
September 18, 2014, 08:07:37 AM  #15

I found what looks like something that could be poor random oracle behavior of SHA-256:
....
blockchain.info
....

One problem is that blockchain.info is not always very reliable. I've seen transactions show as unconfirmed on blockchain.info that were quickly confirmed on other block-explorer sites; just search the tech-support section for examples.

Another problem is that their confirmation times depend on when they saw the transaction vs. when they saw it confirm, so the numbers can be influenced by network problems, latency, etc.

My statistics are rusty, but you probably want a database of all block hashes, and then to analyse the conditional distribution and check that it is uniform given the target at the time of each block.

A simple example: check the percentage of valid block hashes that are lower than 50% of the target. This should be 50%. You could run this check over different periods to see if something is changing over time.
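That 50% check might look like this in Python (simulated hashes from random inputs stand in for real block-header hashes, and a fixed target stands in for the per-block network target):

```python
import hashlib
import os

def fraction_below_half_target(hashes, target):
    """Among hashes that beat the target, the share below target/2
    should be ~50% if hash values are uniformly distributed."""
    winners = [h for h in hashes if h < target]
    return sum(1 for h in winners if h < target // 2) / len(winners)

# Simulated 'block hashes'; a real test would use actual block hashes
# and the network target in force when each block was found.
target = 2 ** 252   # 4 leading 0-bits required, for a quick demo
hashes = [int.from_bytes(hashlib.sha256(os.urandom(16)).digest(), "big")
          for _ in range(100_000)]
frac = fraction_below_half_target(hashes, target)
print(frac)         # should come out close to 0.5
```

The same function could be run on slices of the real block database, by period or by difficulty epoch, to look for drift.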


If you liked this post buy me a beer.  Beers are quite cheap where I live!
bc1q707guwp9pc73r08jw23lvecpywtazjjk399daa
jl2012
September 18, 2014, 12:07:49 PM  #16


That chart shows transaction confirmation time, not block time. Transaction volume burst during the last bubble, so it took longer to confirm a transaction.

Anders (OP)
September 18, 2014, 01:37:09 PM  #17

Quote from: jl2012
It is transaction confirmation time, not block time. The transaction volume burst during last bubble so it takes longer to confirm a transaction

It's true that there is an increase in the number of transactions around that time, but the increase in confirmation time (November 2013) happens a bit before the increase in transaction volume: https://blockchain.info/charts/n-transactions?timespan=2year&showDataPoints=false&daysAverageString=1&show_header=true&scale=0&address=
Anders (OP)
September 18, 2014, 02:12:12 PM  #18

Notice the green line here, which shows that the increase in confirmation times happened before the increase in transaction volume:

spin
September 18, 2014, 05:07:15 PM  #19

I'm not sure why you are looking at transaction confirmation times.

Anders (OP)
September 18, 2014, 05:15:58 PM  #20

Quote from: spin
Not sure why you are looking at transaction confirmation times?

But the blockchain.info graph shows an average of about 10 minutes, so isn't that really the block time?