Bitcoin Forum
Author Topic: A Proposal to Reduce Variation in Block Creation Times  (Read 1223 times)
Shatosi Makanoto (OP)
Newbie
*
Offline Offline

Activity: 28
Merit: 0



View Profile
May 08, 2015, 10:43:09 AM
 #1

One of the characteristics of bitcoin's mining algorithm that exacerbates the block size problem is that the time it takes to "discover" a new block follows an exponential distribution (block discoveries form a Poisson process): although the average time is ten minutes, sometimes it will be 30 seconds and other times it will be an hour. It is the long times that are problematic, since they "load up" the block.

What if the mining algorithm were modified, so that instead of rewarding a block to the first miner that discovers a hash at difficulty D, the reward is awarded to the first miner to discover ten hashes at difficulty D/10? This would tighten up the window by a factor of ten, and alleviate some of the pressure on the block size.

The two disadvantages I see are:

    The ten hashes would have to be recorded in the new block. This would increase the block size, but since hashes are only 32 bytes (or 64 bytes, I don't remember which) each, this would be a trivial overhead.
    The ten hashes would have to be calculated by anyone wishing to verify the block's validity. Once again, I don't think this would be significant.

This would have other important benefits. The greatest benefit would be that the time required to wait for validation of the transaction by the sender and receiver would be much, much closer to ten minutes, almost all the time.

I chose the number ten arbitrarily, for purposes of discussion. Perhaps the optimum value would be 5, 20, or 200: let's discuss.

You may respond that this technique does not guarantee that there won't be hour-long waits, and that's true. But when this method sees an hour-long wait, the wait under the current algorithm would be ten hours!

Your thoughts?

[EDIT] I think that there is a problem with this algorithm as stated above. Pools and individuals with greater hashing power would have a tremendous advantage. The likelihood of finding the ten solutions would not be proportional to the miner/pool's hashing power H, but to H^10. Let me see if any way can be found to remove that advantage (help me out if you can).

[EDIT #2] Is this a solution? A further restriction can be placed on the ten hashes that comprise the winning set: all ten must be generated sequentially (or within a specific range); that is, their nonces must be ten sequential integers (or ten integers that are all within a specific range), with no changes in the rest of the block (of course, the difficulty D would have to be adjusted accordingly). This would eliminate the exponential advantage that larger pools have and make the larger pool have an advantage that is proportional to its hashing power H, just as it currently is.
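Setting aside the centralisation issue raised in the first edit, the variance claim itself can be sanity-checked with a quick simulation (a sketch in Python; the 10-minute target and the factor of ten are the figures from the post, everything else is illustrative):

```python
import random

random.seed(1)

N = 100_000     # simulated blocks
MEAN = 600.0    # target mean block time, seconds

# Current scheme: one solution at difficulty D -> one exponential wait.
single = [random.expovariate(1 / MEAN) for _ in range(N)]

# Proposed scheme: ten solutions at difficulty D/10 -> the sum of ten
# exponential waits, each with a 60-second mean (an Erlang distribution).
tenfold = [sum(random.expovariate(10 / MEAN) for _ in range(10))
           for _ in range(N)]

def stats(xs):
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return mean, var ** 0.5   # (mean, standard deviation)

print(stats(single))   # mean ~600 s, std ~600 s
print(stats(tenfold))  # mean ~600 s, std ~600/sqrt(10), about 190 s
```

Splitting one exponential wait into ten smaller ones keeps the same mean but shrinks the standard deviation by a factor of sqrt(10), which is exactly the "tightening" the post describes.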
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1004



View Profile
May 08, 2015, 11:13:49 AM
 #2

You hit the nail on the head with your first edit.  This proposal would give slightly faster miners a huge advantage.  This would quickly lead to almost perfect centralisation.

Your second edit does not rescue the situation as far as I can see.  Depending on what you mean by restricting the nonce to a particular range, I believe you will either fail to remedy the tremendous advantage of the faster miners or reintroduce the original variance.
medUSA
Legendary
*
Offline Offline

Activity: 952
Merit: 1003




View Profile WWW
May 08, 2015, 11:31:46 AM
 #3

Whatever arbitrary value you set, you are lowering the network difficulty. When blocks are found more easily, block time will speed up. Then difficulty ramps up in the next difficulty change and we are back to square one.
Shatosi Makanoto (OP)
Newbie
*
Offline Offline

Activity: 28
Merit: 0



View Profile
May 08, 2015, 11:33:19 AM
 #4

Here is a calculation to demonstrate what the difficulty would have to be if the ten solutions had to be sequential:

Currently, the hash rate (https://blockchain.info/charts/hash-rate) is 350 terahash/sec, or 350 * 10^12. A solution should be found in 10 minutes, or 600 seconds, average. So currently, it takes 210 exahash, or 210 * 10^15 hashes, to find a solution. In order to find ten solutions in a row, each solution would have to be found in the tenth root of 210 * 10^15, which is a mere 54. This is obviously far too small to be practical, since the mining program would have to keep track of all these solutions as it is generating billions or trillions of hashes per second.

This can be corrected by allowing the solutions to be contained in a range, instead of requiring them to be strictly sequential. For example, the ten solutions could be required to be such that the difference between the lowest and highest nonce values is no more than, say, 100 or 1000. The larger the range is, the more difficult it can be made to find each successful hash. I would say that the range should be adjusted so that a successful hash requires something like a billion attempts to find.

I'm working on determining how to set the range to achieve this...
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
May 08, 2015, 11:58:28 AM
 #5

The only way to make what you are proposing work is to allow multiple people to work together to produce the hash.

That is already what the block chain is designed to do.  You get what you are after by dropping the block time to 1 minute and having 9 sub-blocks for every full block.  A sub-block would just contain a coinbase transaction (maybe just paying to 1 address).  The minting fee would be 10 times lower.  The 1 minute block time should be OK, since it is mostly just sending headers.  It is only every 10th block that gets rewards.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Shatosi Makanoto (OP)
Newbie
*
Offline Offline

Activity: 28
Merit: 0



View Profile
May 08, 2015, 12:12:56 PM
 #6

Quote
Your second edit does not rescue the situation as far as I can see.  Depending on what you mean by restricting the nonce to a particular range, I believe you will either fail to remedy the tremendous advantage of the faster miners or reintroduce the original variance.

It seems to me that my proposal would reduce the standard deviation in the Poisson distribution of the solution sets, so that solutions would be found in a much tighter range around the ten minute period.

To take an extreme example, consider if a solution set must contain a million successful hashes within a specified window. Then, successful hashes would be streaming out of the generator at a high rate, and the algorithm would be watching a sliding window of hashes to see if a million of them were contained within a specified-size range. This means that there are a million times as many opportunities for the hashes to achieve the range requirement, and the likelihood of any individual hash being the successful millionth hash would be smaller. The time would still average out to ten minutes, but the individual times would be very close to the average (i.e., the standard deviation would be much smaller).

Compare this to gas molecules in a box. Suppose we have only one molecule in the box, and success is defined as finding that single molecule in a tiny corner of the box. The corner size could be adjusted so that, on average, it took ten minutes before that molecule encountered the corner. The time that it took on any given attempt would vary over a range as described by a Poisson distribution (http://en.wikipedia.org/wiki/Poisson_distribution). If, however, X molecules are in the box and we require that the same-size corner be encountered X times, the standard deviation in the time that it takes for those X events to occur would be much smaller than the standard deviation for the single molecule.
jl2012
Legendary
*
Offline Offline

Activity: 1792
Merit: 1097


View Profile
May 08, 2015, 01:05:56 PM
 #7

Forget it. I proposed it last year: https://bitcointalk.org/index.php?topic=572816.0

It's a very bad idea.

Donation address: 374iXxS4BuqFHsEwwxUuH3nvJ69Y7Hqur3 (Bitcoin ONLY)
LRDGENPLYrcTRssGoZrsCT1hngaH3BVkM4 (LTC)
PGP: D3CC 1772 8600 5BB8 FF67 3294 C524 2A1A B393 6517
btcbobby
Newbie
*
Offline Offline

Activity: 47
Merit: 0


View Profile
May 08, 2015, 01:10:38 PM
 #8

Have you paid attention to all the drama involved in increasing Bitcoin's block size? Imagine that times 1000 if you suggest changes to the hashing process.
Shatosi Makanoto (OP)
Newbie
*
Offline Offline

Activity: 28
Merit: 0



View Profile
May 08, 2015, 01:46:46 PM
 #9

Quote
Have you paid attention to all the drama involved in increasing Bitcoin's block size? Imagine that times 1000 if you suggest changes to the hashing process.

That's one reason I'm suggesting this. The reason that the block size is an urgent issue is because of the extreme variability in block creation time.

In addition, it would be wonderful for a host of other reasons to have the block creation time more constant.
Shatosi Makanoto (OP)
Newbie
*
Offline Offline

Activity: 28
Merit: 0



View Profile
May 08, 2015, 01:57:23 PM
 #10

Quote
Forget it. I proposed it last year: https://bitcointalk.org/index.php?topic=572816.0
It's a very bad idea.

I think that if you look at my proposal closely, you will see that it is quite different from yours. In my proposal, miners do not broadcast anything until they have accumulated all ten successful hashes within a specified range window. These ten hashes will be found in an average of ten minutes, with a much tighter variance (standard deviation). Then, a new block is assembled that contains all ten hashes, and that is broadcast over the network.

A miner (or pool) with a larger hashing rate will only have an advantage proportional to its hash rate H, just as things are now, but finding ten successful hashes will have a much smaller deviation from ten minutes.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1004



View Profile
May 08, 2015, 11:19:42 PM
 #11

Quote
Here is a calculation to demonstrate what the difficulty would have to be if the ten solutions had to be sequential:

Currently, the hash rate (https://blockchain.info/charts/hash-rate) is 350 terahash/sec, or 350 * 10^12. A solution should be found in 10 minutes, or 600 seconds, average. So currently, it takes 210 exahash, or 210 * 10^15 hashes, to find a solution. In order to find ten solutions in a row, each solution would have to be found in the tenth root of 210 * 10^15, which is a mere 54. This is obviously far too small to be practical, since the mining program would have to keep track of all these solutions as it is generating billions or trillions of hashes per second.

From this calculation it is clear that you intend for miners to discard solutions that cannot be used due to a range requirement.  You've effectively replaced the rare event of finding a single 1-in-(210 * 10^15) hash with the rare event of finding ten 1-in-54 hashes in a row.  There is no reduction in variance, even with wider ranges.

Note: your link reports about 350 000 000 gigahashes/second = 350 petahashes/second (350 * 10^15 hashes/second).  This works out to 210 exahashes (210 * 10^18 hashes) in 10 minutes.  The tenth root of 210 * 10^18 is 108 (3 s.f.).
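Both arithmetic paths are easy to re-run (a quick check, using only the figures already quoted in this thread):

```python
# teukon's reading of the chart: 350 petahash/s sustained for 10 minutes.
hashrate = 350e15            # hashes per second
per_block = hashrate * 600   # hashes per 10-minute block
print(per_block)             # 2.1e20, i.e. 210 exahashes

# Per-solution work if ten solutions had to be found "in a row":
print(per_block ** 0.1)      # ~107.7, i.e. 108 to 3 s.f.

# The OP's figure of 54 comes from using 350 terahash/s instead:
print((350e12 * 600) ** 0.1) # ~54
```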
gmaxwell
Moderator
Legendary
*
expert
Offline Offline

Activity: 4172
Merit: 8419



View Profile WWW
May 09, 2015, 01:25:22 AM
Last edit: May 09, 2015, 01:41:17 AM by gmaxwell
 #12

Greetings.  The use of multiple smaller hashcashes is described in the original hashcash publications. It is undesirable here.

The hashcash "lottery" in mining serves an essential purpose of making the times between blocks variable (exponentially distributed).

As a thought experiment, imagine that we had a function that always took 10 minutes to use as our proof of work.  Eventually there would be a network fork, and those forks would then remain tied forever; because they'd all keep producing blocks in lockstep.  There would be no convergence. With some but small variation there can be long runs of ties and then eventually convergence happens and causes a huge reorg on one side.

So it's this unpredictability that ensures one chain will eventually overtake another (even if they have equal hashpower).  If you reduce variance, this has roughly the same effect on convergence as just lowering the expected time between blocks to a new value with the same variance.  For a given topology and set of latencies there is an optimal variance where the convergence rate (and thus the speed of security against reorgs) is maximized.  The function looks something like the graph below (a napkin drawing; I don't have simulation results handy), which basically shows there is a wide basin of acceptable values, but you want to avoid being too low: relative to the latency in the network, being a little too fast can have a very negative effect.



Another angle where this has problems is that the N-hashcashes approach is not progress free. Once you have one you're more likely to be successful, instead of each try being independent. The result of this is that higher hashpower miners (or attackers) have a disproportionally high chance of winning, instead of a fair proportional distribution.  Again, back to the thought experiment: imagine you have a POW that always takes an exact number of operations. The single fastest miner would always win, instead of winning proportionally to their performance.  This would be a powerful centralization pressure.
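The progress-freedom point can be illustrated with a toy race (an illustrative sketch, not a model of any real miner): two miners race to collect k small solutions, with miner A hashing twice as fast as miner B. The time to collect k solutions is gamma-distributed, and A's win rate grows far beyond its 2/3 hash-rate share as k increases.

```python
import random

random.seed(7)

def race_win_rate(k, trials=20_000):
    """Fraction of races won by miner A (2x the hash rate of miner B)
    when the winner must collect k sub-solutions first."""
    wins = 0
    for _ in range(trials):
        # Each sub-solution is an independent exponential wait;
        # a doubled hash rate means a doubled arrival rate.
        t_a = sum(random.expovariate(2.0) for _ in range(k))
        t_b = sum(random.expovariate(1.0) for _ in range(k))
        if t_a < t_b:
            wins += 1
    return wins / trials

print(race_win_rate(1))   # ~0.67: wins proportional to hash rate (2/3)
print(race_win_rate(10))  # well above 2/3: a disproportionate share
```

With k = 1 each attempt is independent and the win rate matches the hash-rate share; with k = 10 the variance collapses around each miner's mean finishing time, so the faster miner almost always finishes first — the centralization pressure described above.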

Others can probably suggest some threads where this has been discussed in the past.
gatherc
Newbie
*
Offline Offline

Activity: 53
Merit: 0


View Profile
May 09, 2015, 08:56:39 AM
 #13

Thanks for explaining the benefit of variance. This probably puts to bed an idea I had where miners wait 9 minutes after the last block and then turn their miners on (ignoring for the moment that there's no universal time in bitcoin, plus dishonest miners); this would make mining less costly.
 
What you're saying is that this scheme would have the effective variance of 1-minute blocks and thus be undesirable?
If anyone has a link to some 'non-napkin' graphs on this, thanks in advance!



Shatosi Makanoto (OP)
Newbie
*
Offline Offline

Activity: 28
Merit: 0



View Profile
May 09, 2015, 03:51:35 PM
 #14

Thanks to everyone for your replies... I can see that I was wrong.

Quite an education.
jl2012
Legendary
*
Offline Offline

Activity: 1792
Merit: 1097


View Profile
May 09, 2015, 03:57:11 PM
 #15

Thanks to everyone for your replies... I can see that I was wrong.

Quite an education.

You may notice that the exponential distribution is the ONLY memoryless continuous probability distribution: http://en.wikipedia.org/wiki/Exponential_distribution

This is a rule of the universe that we can't change.

jl2012
Legendary
*
Offline Offline

Activity: 1792
Merit: 1097


View Profile
May 09, 2015, 04:02:00 PM
 #16

Quote
Thanks for explaining the benefit of variance. This probably puts to bed an idea I had where miners wait 9 minutes after the last block and then turn their miners on (ignoring for the moment that there's no universal time in bitcoin, plus dishonest miners); this would make mining less costly.


You gain nothing by doing this. Mining is a Poisson Process. The expected time until next block is ALWAYS 10 minutes. The time since last block is completely irrelevant. If it is 9 minutes after the last block, you expect the next block will appear at 19 minutes after the last block.
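The memorylessness described here is easy to verify numerically (an illustrative sketch; the 600-second target is the figure from the thread):

```python
import random

random.seed(3)

MEAN = 600.0   # target mean block time, seconds
N = 200_000

waits = [random.expovariate(1 / MEAN) for _ in range(N)]

# Overall average block time:
print(sum(waits) / N)  # ~600 s

# Among blocks that have already taken at least 9 minutes (540 s),
# how much LONGER do they take on average?  Memorylessness says:
# another ~600 s, not the ~60 s a "due" block would suggest.
late = [w - 540 for w in waits if w > 540]
print(sum(late) / len(late))  # ~600 s
```

Conditioning on 9 minutes having already passed leaves the remaining wait distributed exactly as the original: the expected time to the next block is always 10 minutes.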
