Author Topic: An analysis of bitcoin blockchain height divergence from its standard behavior  (Read 268 times)
aliashraf (OP)
Legendary
*
Offline Offline

Activity: 1456
Merit: 1174

Always remember the cause!


View Profile WWW
August 16, 2018, 11:04:05 AM
Last edit: August 19, 2018, 12:27:48 PM by aliashraf
Merited by Welsh (5), dbshck (5), ABCbits (2), DdmrDdmr (2), Piggy (1), danda (1)
 #1

How far can the bitcoin blockchain height possibly diverge from the ideal one-block-per-10-minutes measure?

Motivation
During a discussion with @gmaxwell about a hypothetical DoS vulnerability, in response to my suggestion to blacklist peers that send unreasonably large block locators, he objected that an honest node may be maliciously bootstrapped and fooled into committing to a very long chain with trivial difficulty, so banning it wouldn't help. I responded with a more moderate version, ... but after further assessment I realized that the whole situation is somewhat abnormal: the possibility of honest nodes holding an unreasonable perception of chain height.

In bitcoin, the 10-minute block interval is not imposed synchronously or integrally. The protocol adjusts the difficulty every 2016 blocks to keep the generation pace at the 10-minute target, but a steady growth in network hash power makes it possible for the chain height to exceed the ideal number implied by a 10-minute generation rate. In fact, the maximum block height of the network as of this writing is #535461 while it is supposed to be ~504300, a +6% divergence.
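As a quick sanity check of that figure, using only the two numbers quoted above (the expected height is just the minutes elapsed since genesis divided by 10), the divergence is actual/expected - 1:

Code:
#include <iostream>

int main()
{
    // Numbers from the paragraph above (August 2018).
    const double actualHeight   = 535461.0;
    const double expectedHeight = 504300.0; // elapsed minutes since genesis / 10

    std::cout << "divergence = "
              << (actualHeight / expectedHeight - 1.0) * 100.0 << "%\n"; // ~6.2%
}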

Not satisfied with the situation, I asked myself:
Why should we accept a proposed chain with an unreasonable length, say 10 times or 1000 times longer than normal, in the first place?
Why shouldn't we simply reject such proposals and shut down the peers who made them? Putting difficulty requirements aside, is it even possible to have a chain orders of magnitude longer than normal?

In the malicious block locator scenario, a peer node, very commonly an SPV client, sends us a getheaders message whose payload carries hundreds of thousands of bogus hashes as a block locator, and instead of rejecting and banning it we hopefully and exhaustively try to locate them one by one.

At first, @gmaxwell didn't consider it an attack vector because the processing involved is CPU bound, but he eventually made a pull request out of that discussion and it was merged into the source. Bitcoin Core now enforces a hardcoded MAX_LOCATOR_SZ equal to 101. So the block locator problem is fixed now.

A block locator in bitcoin is a special data structure that sparsely represents a node's view of the block headers in its chain. The way the locator is generated guarantees its length to be O(log(n)), where n is the maximum block height in the chain; it is roughly 10 + log2(n). For the current block height of 535461 a legitimate block locator should carry at most about 29 hashes, and a node sending 30 hashes is claiming a chain of height around 1,048,576, almost twice as long; it would be about 2 million times longer if the locator held 50 hashes. Yet we were worried about block locators holding thousands of hashes, which would represent chains with astronomical heights.
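To make the logarithmic length concrete, here is a minimal sketch (not the actual CBlockLocator construction code in Bitcoin Core; the function name is an illustrative assumption) of the usual height-selection scheme: step back one block at a time for the first ten entries, then double the step each time until genesis.

Code:
#include <cstdint>
#include <iostream>
#include <vector>

// Illustrative sketch: pick the block heights a locator would reference,
// starting from the tip, stepping by 1 for the first 10 entries and then
// doubling the step until genesis is reached.
std::vector<int64_t> locatorHeights(int64_t tipHeight)
{
    std::vector<int64_t> heights;
    int64_t step = 1;
    for (int64_t h = tipHeight; h > 0; h -= step) {
        heights.push_back(h);
        if (heights.size() >= 10) step *= 2; // exponential back-off
    }
    heights.push_back(0); // always include genesis
    return heights;
}

int main()
{
    std::cout << "locator size at height 535461: "
              << locatorHeights(535461).size() << '\n'; // ~29 entries
}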

Although the issue is fixed now, the underlying theoretical question didn't get the attention it deserves: a bound on the divergence of the actual blockchain height from what we could trivially calculate by dividing the minutes elapsed since an accepted checkpoint block's timestamp by 10 (the normal block interval).

At the time of this writing the divergence is a bit over 6%, but we have had no measure by which to rule out a 60%, 600%, or even 6,000,000,000% divergence as infeasible, until now.

In this article I try to establish a mathematical framework for further research on this problem by introducing a relation between the hash power increase required for a given divergence within a relatively large window of time.

I also found it useful to include a background section for interested readers; if you are familiar with these subjects, just skip it.

Background

Difficulty adjustment: How bitcoin keeps the pace
Two rules are applied in the bitcoin protocol to regulate the rate at which blocks are generated:
1- Increase/decrease the difficulty every 2016 blocks by comparing the actual time elapsed with the expected two-week period.
2- Never increase/decrease the difficulty by a factor greater than 4.

The latter constraint is commonly overlooked because such a large retargeting is very unlikely given the huge inertia of the currently installed hash power, but since we are studying extreme conditions, the 4x factor matters a great deal here.
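As an illustration of both rules, here is a simplified retarget sketch (not the exact Bitcoin Core code, which operates on the compact-encoded 256-bit target in nBits): the target is scaled by the ratio of actual to expected epoch time, with the actual timespan clamped to a factor of 4 in either direction.

Code:
#include <algorithm>
#include <cstdint>

// Simplified retarget sketch: newTarget = oldTarget * actual/expected,
// with the actual timespan clamped to [expected/4, expected*4].
double retarget(double oldTarget, int64_t actualTimespan)
{
    const int64_t expected = 14 * 24 * 60 * 60;             // two weeks in seconds
    actualTimespan = std::clamp(actualTimespan,
                                expected / 4, expected * 4); // rule 2: 4x bound
    return oldTarget * static_cast<double>(actualTimespan)
                     / static_cast<double>(expected);        // rule 1
}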

Longest Chain Rule: How Bitcoin chooses between forks
In bitcoin and its clones, the longest chain rule is not interpreted naively as selecting the chain with the greatest height as the main chain; instead the main chain is defined as the one that required the most work to generate.
To calculate the accumulated work of each chain:
1- The work for each block is calculated as: floor(2^256 / (target + 1))
This is done in chain.cpp of the bitcoin source code by the GetBlockProof function:
Code:
arith_uint256 GetBlockProof(const CBlockIndex& block)
{
    arith_uint256 bnTarget;
    bool fNegative;
    bool fOverflow;
    bnTarget.SetCompact(block.nBits, &fNegative, &fOverflow);
    if (fNegative || fOverflow || bnTarget == 0)
        return 0;
    // We need to compute 2**256 / (bnTarget+1), but we can't represent 2**256
    // as it's too large for an arith_uint256. However, as 2**256 is at least as large
    // as bnTarget+1, it is equal to ((2**256 - bnTarget - 1) / (bnTarget+1)) + 1,
    // or ~bnTarget / (bnTarget+1) + 1.
    return (~bnTarget / (bnTarget + 1)) + 1;
}

2- While loading/adding a block to the in-memory block index, not only is the block's work computed by calling the above function, but an accumulated chain work is also aggregated into nChainWork with this code:
Code:
 pindex->nChainWork = (pindex->pprev ? pindex->pprev->nChainWork : 0) + GetBlockProof(*pindex);
which is executed both in LoadBlockIndex and AddToBlockIndex.

Forks are compared by the nChainWork of the block index entry of their respective last block, and once a chain is found to be heavier than the currently active chain, a reorg is performed, mainly by updating the UTXO set and pointing to the new chain as the active fork.

Between two difficulty adjustments, this most-difficult-chain-is-the-best-chain approach is identical to the longest-chain-is-the-best-chain rule originally proposed by Satoshi, but when we have to choose between two forks spanning at least one difficulty adjustment (typically in both forks) there is no guarantee that the two rules give identical results.
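To see how the two rules can disagree, consider a toy comparison sketch with hypothetical numbers, where per-block work is approximated as proportional to difficulty rather than computed with GetBlockProof: a shorter fork mined at a higher difficulty can carry more accumulated work than a longer, easier fork.

Code:
#include <iostream>
#include <vector>

// Toy fork-choice sketch: accumulated work approximated as the sum of
// per-block difficulties (real nodes sum floor(2^256 / (target+1))).
double chainWork(const std::vector<double>& blockDifficulties)
{
    double work = 0.0;
    for (double d : blockDifficulties) work += d;
    return work;
}

int main()
{
    std::vector<double> forkA(2016, 1.0); // 2016 blocks at difficulty 1
    std::vector<double> forkB(1008, 3.0); // 1008 blocks at difficulty 3

    // Fork A is longer, but fork B is heavier, so fork B wins.
    std::cout << "work A = " << chainWork(forkA)
              << ", work B = " << chainWork(forkB) << '\n';
}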

Block timestamp: How bitcoin keeps track of time
A few rules in bitcoin govern block timestamps and the notion of current time. This is an important topic for the purpose of this article, defining an upper bound on the height of forks, because to impose such a bound a node eventually needs a measure of time, and for obvious reasons it cannot simply be its own system clock.

1- Bitcoin defines a concept named network-adjusted time: the median of the times reported by all the peers a node is connected to.

2- In any case, network-adjusted time may not offset the node's local time by more than 70 minutes.

3- A block's timestamp must be greater than the median of the previous 11 blocks' timestamps and less than 2 hours after the node's network-adjusted time.

These constraints make it very hard for an adversary to manipulate block times and forge fake chains with fake difficulty and height, which is our concern here.
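A minimal sketch of rule 3, assuming we already have the last 11 block timestamps and the node's network-adjusted time at hand (the function names here are illustrative, not Bitcoin Core's):

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

// Median of the previous 11 block timestamps ("median time past").
int64_t medianTimePast(std::vector<int64_t> last11)
{
    std::sort(last11.begin(), last11.end());
    return last11[last11.size() / 2];
}

// Rule 3: a candidate block timestamp must exceed the median time past
// and be at most 2 hours ahead of the node's network-adjusted time.
bool timestampAcceptable(int64_t blockTime,
                         const std::vector<int64_t>& last11,
                         int64_t networkAdjustedTime)
{
    return blockTime > medianTimePast(last11) &&
           blockTime <= networkAdjustedTime + 2 * 60 * 60;
}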


Abstract
The short-range difficulty adjustment policy of bitcoin allows the chain height to diverge from the expected height (expected from the ideal 10-minute block time) when hashpower increases during each epoch. We show that this divergence is exponentially difficult to achieve by proving a functional dependency between the ratio by which hash power increases and the ratio by which the height diverges: F = (n/N)^n

To prove this relation, we first show that the maximum divergence caused by introducing any given amount of new hashpower to the network within a specific period of time is achieved when its distribution across epochs follows a geometric progression.

The 4x maximum threshold imposed by the network's difficulty adjustment algorithm is deliberately ignored to avoid complicating the discussion. It is a justifiable simplifying assumption in the context of this article because a 4x-or-more per-epoch increase in hashpower is not feasible anyway, as we briefly discuss at the end of the article.

After proving the exponential dependency, we briefly illustrate and discuss its relevance to the block locator size problem and other interesting possibilities for dealing with similar problems in general.

Lemma:
Suppose a difficulty adjustment epoch ending at a block we have already chosen as a checkpoint finishes at time t0, that we have calculated an average hashpower C as the current network hashpower, and that we begin to introduce a relatively large amount of new hashpower Q over a period of time t greater than or equal to one ideal difficulty adjustment epoch time T0. The number of epochs n that occur in the period [t0, t0+t] is at its maximum when Q is distributed so that the network hashrate goes through a geometric progression with:

C_i = C*q^i

and Sum(C_i - C_{i-1}) = Q, the total increase in hash power.

Proof:
Start from a network that is in equilibrium with hashpower C at the end of a given epoch, and consider an arbitrary distribution of the new power Q over n epochs. We get the sequence
C, C*k_1, (C*k_1)*k_2, ..., (C*k_{i-1})*k_i, ..., (C*k_{n-1})*k_n
defined recursively as:
C_i = C_{i-1}*k_i,  C_0 = C
Observe that for 0 < i < n+1:
C_i = C*k_1*k_2*...*k_i
and that at the end of the application of the power Q, the total hashpower C+Q equals the last term of the sequence.
We have to prove that k_1 = k_2 = ... = k_n = q for the n epochs to occur in the least possible actual time. Note that at the end of each epoch the difficulty adjustment algorithm of bitcoin (the 4x limit temporarily being ignored) resets the block generation pace to T0 for future epochs, so the duration of epoch i, in units of T0, is:
t_i = C_{i-1}/C_i = (C*k_1*k_2*...*k_{i-1}) / (C*k_1*k_2*...*k_i) = 1/k_i
For the total time T elapsed during the n epochs, we have:
T = T0 + T0*(1/k_1) + T0*(1/k_2) + ... + T0*(1/k_n)
Then:
T/T0 = 1 + 1/k_1 + 1/k_2 + ... + 1/k_n
     = 1 + f(k)/(k_1*k_2*...*k_{n-1}*k_n)
     = 1 + f(k)/((C+Q)/C)
where f(k) is the sum of the sequence
a_i = (k_1*k_2*...*k_n)/k_i
and we need this sum to be at its minimum. The product of the terms is
prod(a_i) = (k_1*k_2*...*k_n)^(n-1) = ((C+Q)/C)^(n-1)
where (C+Q)/C and n-1 are constants, so by the AM-GM inequality the sum is minimal when a_1 = a_2 = ... = a_n, hence:
k_1 = k_2 = ... = k_n
This completes the proof, because for the minimum time to elapse the power sequence must be written as:
C_i = C_{i-1}*q = C_0*q^i
where C_0 = C and 0 < i < n+1, which is a geometric progression.
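As a quick numeric sanity check of the lemma (hypothetical numbers): suppose the hashpower quadruples over two epochs, i.e. k_1*k_2 = 4. An even split (k_1 = k_2 = 2) spends 1/2 + 1/2 = 1 unit of T0 in the affected epochs, while an uneven split (k_1 = 4, k_2 = 1) spends 1/4 + 1 = 1.25 units; the equal, geometric split is indeed the fastest. The sketch below just automates that comparison.

Code:
#include <iostream>
#include <vector>

// Time (in units of T0) spent in the affected epochs, given the per-epoch
// growth factors k_i (difficulty catches up only after each epoch ends).
double relativeTime(const std::vector<double>& k)
{
    double t = 0.0;
    for (double ki : k) t += 1.0 / ki;
    return t;
}

int main()
{
    std::cout << "even split   (2, 2): " << relativeTime({2.0, 2.0}) << " epochs\n"; // 1.0
    std::cout << "uneven split (4, 1): " << relativeTime({4.0, 1.0}) << " epochs\n"; // 1.25
}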

Theorem
The minimum hashpower growth factor F = (Q+C)/C needed for the bitcoin blockchain* to grow by an abnormal ratio n/N is determined by F = (n/N)^n

where C is the current hashpower of the network and Q is the absolute increase during a period of N*T0.

*Note: Bitcoin's difficulty adjustment algorithm applies a 4x maximum threshold that is deliberately ignored here for practical purposes.

Proof
For a given F = (Q+C)/C, using the geometric progression lemma above we have:
t = T0*n/q  ==>  t/T0 = N = n/q  ==>  n/N = q
and
Q = C*(q^n - 1)  ==>  (Q+C)/C = F = q^n
Eliminating q:
F = (n/N)^n
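To put numbers on the theorem using a scenario from the discussion below (N = 24 epochs, i.e. 48 weeks, and a 6-epoch divergence, so n = 30), evaluating F = (n/N)^n gives roughly 800; the snippet below is just that arithmetic.

Code:
#include <cmath>
#include <iostream>

// Minimum hashpower growth factor F = (n/N)^n from the theorem.
double growthFactor(double n, double N)
{
    return std::pow(n / N, n);
}

int main()
{
    // 6 extra epochs within 48 weeks (N = 24 normal epochs, n = 30 actual).
    std::cout << "F = " << growthFactor(30.0, 24.0) << '\n'; // ~808
}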

Discussion
The exponential relationship between the ratio of power increase and the ratio by which the blockchain height diverges from the normal N = t/T0 is an important key to understanding how impractical it would be for a network like bitcoin, with an inertia (C) of tens of thousands of peta hashes, to diverge from its normal behavior in terms of length (chain height).
Let's illustrate this function graphically. We replace n with N+d, where d is the number of abnormally added epochs, so F = ((N+d)/N)^(N+d).
This form of the equation shows how much F (the relative hashpower increase) must grow to produce N+d epochs instead of the N normal ones.

Figure 1: F/d diagram for various N values, with F up to a 30x hashpower increase
Figure 1 illustrates how hard it is to achieve a divergence of 2-3 epochs within 3, 6, 12, or 24 standard epoch times (two weeks each); e.g., to reach a 3-epoch divergence within 24 weeks (12 epochs), even a 40x increase in network hashpower doesn't suffice.

Figure 2 (below) illustrates the same function at a larger scale (up to a 1000x hashpower increase). To diverge by 6 epochs within 48 weeks we need roughly an 800x increase in network hash power, and within 24 weeks the divergence can't approach 6 even with a 1000x increase.
Figure 2: F/d diagram for various N values, with F up to a 1000x hashpower increase

This exponential behavior is very interesting and can potentially be used as a basis for confronting malicious bootstrap attacks and issues like the one mentioned at the beginning of this article: the bogus block locator issue.

As for the 4x maximum threshold bitcoin uses for difficulty adjustment: a 4x-or-more increase per epoch is obviously infeasible for the network, especially over large windows of time like 1-2 years, since it would compound into exponentially and impossibly large total hash power. Hence, ignoring that threshold practically won't affect this analysis.
danda
Full Member
***
Offline Offline

Activity: 201
Merit: 157


View Profile WWW
August 19, 2018, 04:32:50 AM
 #2

bump for comments from theoretical types.

mybitprices.info - wallet auditing   |  hd-wallet-derive - derive keys locally |  hd-wallet-addrs - find used addrs
lightning-nodes - list of LN nodes  |  coinparams - params for 300+ alts  |  jsonrpc-cli - cli jsonrpc client
subaddress-derive-xmr - monero offline wallet tool
domob
Legendary
*
Offline Offline

Activity: 1135
Merit: 1161


View Profile WWW
August 19, 2018, 07:04:39 AM
 #3

This sounds similar to a paper I wrote in 2014, so you might be interested in that: https://link.springer.com/article/10.1007/s12083-015-0347-x (without paywall: https://www.domob.eu/research/DifficultyControl.pdf).

Use your Namecoin identity as OpenID: https://nameid.org/
Donations: 1domobKsPZ5cWk2kXssD8p8ES1qffGUCm | NMC: NCdomobcmcmVdxC5yxMitojQ4tvAtv99pY
BM-GtQnWM3vcdorfqpKXsmfHQ4rVYPG5pKS | GPG 0xA7330737
aliashraf (OP)
Legendary
*
Offline Offline

Activity: 1456
Merit: 1174

Always remember the cause!


View Profile WWW
August 19, 2018, 09:54:15 AM
Last edit: August 19, 2018, 12:19:54 PM by aliashraf
 #4

This sounds similar to a paper I wrote in 2014, so you might be interested in that: https://link.springer.com/article/10.1007/s12083-015-0347-x (without paywall: https://www.domob.eu/research/DifficultyControl.pdf).
Thanks for the link; I've downloaded your paper and will check it.

At first glance, it seems related to this work, but you've employed more sophisticated techniques and targeted more general issues, including a suggested difficulty adjustment method to help with the divergence issue.

My work is more focused: I treat the introduction of new hashpower to the network like an attack meant to generate more blocks, and I want to show that large divergence ratios are infeasible because of exponential resource requirements.

My technique is straightforward: first I prove that such an attack is most effective when the new power is distributed so that the network's hashrate goes through a geometric progression, i.e. C_i = C_0*q^i, then I simply deduce: (C_0+Q)/C_0 = (n/N)^n, where C_0 is the network's hashpower at a given checkpoint, N is the expected, normal block height gain in a given interval, and n is the maximum possible height gain after inserting Q more hashpower in that interval.

I'm not much of a mathematician, and I always find myself lost in academic papers that lean heavily on mathematics to make their point. I don't think I'm alone in the community in this regard, and I wish people like you would try harder to use simpler, more elegant techniques and avoid excessive generalization, so that people like me can catch up with your ideas more easily.

I'll check and compare your results with mine and find out to what extent I've reinvented the wheel  Cheesy
domob
Legendary
*
Offline Offline

Activity: 1135
Merit: 1161


View Profile WWW
August 19, 2018, 11:13:36 AM
 #5

Don't get me wrong, I didn't mean to imply that you reinvented stuff that I already did - I just thought my paper might be interesting to you, given that it is about a similar problem (but indeed not the same!).

I think that I also came to the conclusion that exponentially rising hash rate is "worst" in terms of diverging from the desired block time, though - it is interesting that you found the same result based on a different analysis.

Use your Namecoin identity as OpenID: https://nameid.org/
Donations: 1domobKsPZ5cWk2kXssD8p8ES1qffGUCm | NMC: NCdomobcmcmVdxC5yxMitojQ4tvAtv99pY
BM-GtQnWM3vcdorfqpKXsmfHQ4rVYPG5pKS | GPG 0xA7330737
aliashraf (OP)
Legendary
*
Offline Offline

Activity: 1456
Merit: 1174

Always remember the cause!


View Profile WWW
August 19, 2018, 12:19:18 PM
 #6

No worries, I didn't mean it that way; reinventing the wheel is just always a possibility, especially when it comes to new subjects.

By the way, I edited my original post to fix some errors in my proof of the lemma; now it seems to be solid.