payb.tc


March 12, 2012, 08:31:34 AM 

i was wondering, how does the next difficulty actually get estimated?
after searching for a while, all i can find are sites which give an (the?) estimate, but nothing that explains how that figure was calculated.
eg. blockexplorer.com, bitcoinwatch.com, bitcoindifficulty.com...
how's it worked out?








Global BTC


March 12, 2012, 09:25:25 AM 

From Wikipedia: Bitcoin changes the difficulty of finding a valid block every 2016 blocks. Each node in the network adjusts the difficulty so the distribution mean is λ = 2016 blocks per two weeks, so that there are roughly ten minutes between the creation of new blocks on average (the wait times between events in a Poisson process follow an exponential distribution). The network sets the difficulty to the value that would have most likely caused the prior 2016 blocks to take two weeks to complete, given the same computational effort (according to the timestamps recorded in the blocks). http://en.wikipedia.org/wiki/Bitcoin#Difficulty
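The retarget rule quoted above can be sketched in a few lines. This is a simplified illustration, not the real client code (actual clients work on the compact 256-bit target rather than the "difficulty" number, and apply clamps discussed later in the thread):

```python
# Simplified sketch of the 2016-block retarget rule described above.
TARGET_TIMESPAN = 14 * 24 * 60 * 60  # two weeks, in seconds

def next_difficulty(current_difficulty, actual_timespan):
    """Scale difficulty so the next 2016 blocks would take ~two weeks
    if the hashrate stays the same as over the previous 2016 blocks."""
    return current_difficulty * TARGET_TIMESPAN / actual_timespan

# If the last 2016 blocks took exactly two weeks, difficulty is unchanged:
print(next_difficulty(1_000_000, TARGET_TIMESPAN))      # 1000000.0
# If they took only one week, difficulty doubles:
print(next_difficulty(1_000_000, TARGET_TIMESPAN / 2))  # 2000000.0
```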




payb.tc


March 12, 2012, 09:41:06 AM 

From Wikipedia: Bitcoin changes the difficulty of finding a valid block every 2016 blocks. Each node in the network adjusts the difficulty so the distribution mean is λ = 2016 blocks per two weeks, so that there are roughly ten minutes between the creation of new blocks on average (the wait times between events in a Poisson process follow an exponential distribution). The network sets the difficulty to the value that would have most likely caused the prior 2016 blocks to take two weeks to complete, given the same computational effort (according to the timestamps recorded in the blocks). http://en.wikipedia.org/wiki/Bitcoin#Difficulty

thanks, i'd read that before but hadn't taken it in properly, i guess. i've highlighted the part which says how it's calculated. i always thought the 'ten minutes' part was hardcoded and that caused '2 weeks' to be a rough estimate, but if i'm understanding the above correctly, it's '2 weeks' which is hardcoded, causing '10 minutes' to be a rough estimate.




Global BTC


March 12, 2012, 10:05:16 AM 

Yes, that's pretty much it. 2 weeks is hardcoded, but it's still based on past performance.




payb.tc


March 12, 2012, 10:41:37 AM 

so, as an example calculation:
say difficulty is 1,000,000
last 2016 blocks took exactly 1 week.
does that mean next difficulty is 2,000,000 ...or is it an order of magnitude thing, like next diff 10,000,000?
(sorry, i don't really know how the total network hash rate might affect things like that)




riikka
Newbie
Offline
Activity: 12
Merit: 0


March 12, 2012, 10:33:13 PM 

so, as an example calculation:
say difficulty is 1,000,000
last 2016 blocks took exactly 1 week.
does that mean next difficulty is 2,000,000 ...or is it an order of magnitude thing, like next diff 10,000,000?
(sorry, i don't really know how the total network hash rate might affect things like that)
It will be 2,000,000. Also, difficulty cannot change more than fourfold in a single retarget (that is, it cannot become more than 4 times higher or lower at once). The code that calculates the new difficulty is in the main.cpp file, function GetNextWorkRequired. Code on Github
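The fourfold clamp riikka mentions can be sketched as follows. In the real client the clamp is applied to the measured timespan before scaling, which is what bounds the difficulty change; this is a simplified illustration in terms of the difficulty number:

```python
# Sketch of the retarget with the 4x clamp: the measured timespan is
# limited to [TARGET/4, TARGET*4] before the difficulty is scaled.
TARGET_TIMESPAN = 14 * 24 * 60 * 60  # two weeks, in seconds

def retarget(difficulty, actual_timespan):
    # Clamping the timespan bounds the difficulty change to at most
    # 4x up or 4x down per retarget.
    clamped = min(max(actual_timespan, TARGET_TIMESPAN // 4),
                  TARGET_TIMESPAN * 4)
    return difficulty * TARGET_TIMESPAN / clamped

# One week instead of two -> difficulty doubles:
print(retarget(1_000_000, TARGET_TIMESPAN // 2))  # 2000000.0
# One day instead of two weeks -> capped at a 4x increase:
print(retarget(1_000_000, 24 * 60 * 60))          # 4000000.0
```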




payb.tc


March 13, 2012, 12:12:57 AM 

cheers, very helpful.
i was trying to think of ways one could introduce the next estimate more smoothly, instead of having the wildly inaccurate estimate that occurs right after a difficulty change.
next estimate could be introduced more gradually, if that makes sense.
if it doesn't make sense, here's an example:
say X is the current difficulty and Y is the wildly inaccurate estimate that occurs right after a difficulty change.
Z will be the 'smoother', hopefully more accurate, guess of next difficulty...
1st day after difficulty change, Z is 13/14 of X + 1/14 of Y; 2nd day after difficulty change, Z is 12/14 of X + 2/14 of Y; ... 13th day after difficulty change, Z is 1/14 of X + 13/14 of Y; 14th day after difficulty change, Z = Y (which will be much more accurate at this point).
also, instead of doing this in 14 chunks, you could make it even more fine-grained, even down to the millisecond if you wish.
tl;dr purely out of a nerdy love of maths, i'd love to discuss possible algorithms for a (much) more accurate next-difficulty estimate.
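The blending scheme in the post above amounts to linear interpolation between X and Y over the retarget period. A minimal sketch (the names X, Y, Z follow the post; the fraction can be made as fine-grained as you like, not just daily chunks):

```python
# Linearly blend from the current difficulty x toward the naive
# extrapolation y as `elapsed` goes from 0 to `period` (days here).
def smoothed_estimate(x, y, elapsed, period=14.0):
    frac = min(elapsed / period, 1.0)
    return (1 - frac) * x + frac * y

# Day 1 after a change: mostly the current difficulty (13/14 X + 1/14 Y).
day1 = smoothed_estimate(1_000_000, 2_000_000, 1)
# Day 14: entirely the extrapolated estimate, which is accurate by then.
print(smoothed_estimate(1_000_000, 2_000_000, 14))  # 2000000.0
```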




Revalin


March 13, 2012, 12:24:41 AM 

Rather than smoothing jumps you can use a sliding window to update the difficulty on every block. More on that here: https://bitcointalk.org/index.php?topic=64048

War is God's way of teaching Americans geography. -- Ambrose Bierce
Bitcoin is the Devil's way of teaching geeks economics. -- Revalin
165YUuQUWhBz3d27iXKxRiazQnjEtJNG9g



payb.tc


March 13, 2012, 12:26:50 AM 

great idea... thanks, i hadn't seen that thread. edit: just read that thread... it's more about the current difficulty and the bitcoin code, not really about the next-difficulty estimate, which one could play with (eg, on a webpage) without touching the source code whatsoever.




DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1003
Gerald Davis


March 13, 2012, 12:56:50 AM 

The same concept applies. At any point you can adjust the next difficulty by looking at the PRIOR 2016 blocks.
Most sites don't do this because they are using difficulty and future difficulty to project the network hashrate, and they don't want to complicate the calculations with multiple difficulties.
So they do this: at 1 block after a difficulty adjustment they use 1 block. ... At 100 blocks after a difficulty adjustment they use 100 blocks.
Instead you can do this: at 1 block after a difficulty adjustment you use the last 2016 blocks. ... At 100 blocks after a difficulty adjustment you still use the last 2016 blocks.
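The two window choices contrasted above can be sketched side by side. This is a hypothetical illustration assuming per-block solve times are available (the function and data are made up for the example):

```python
# Project the next difficulty from the average solve time in a window.
TARGET_SPACING = 600  # seconds per block the network aims for

def estimate_from_window(solve_times, difficulty):
    avg = sum(solve_times) / len(solve_times)
    return difficulty * TARGET_SPACING / avg

# "Growing window" (what most sites do): 1 block after the adjustment,
# only 1 solve time is available, so a single fast block skews wildly.
noisy = estimate_from_window([45], 1_000_000)
# "Sliding window": always use the most recent 2016 blocks, so one fast
# block right after an adjustment barely moves the estimate.
stable = estimate_from_window([600] * 2015 + [45], 1_000_000)
```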




payb.tc


March 13, 2012, 01:04:20 AM 

The same concept applies. At any point you can adjust the next difficulty by looking at the PRIOR 2016 blocks.
Most sites don't do this because they are using difficulty and future difficulty to project network hashrate and to don't want to complicate the calculations with multiple difficulties.
So they do this. At 1 block after difficulty adjustment they use 1 block. ... At 100 blocks after difficulty adjustment they use 100 blocks.
Instead you can do this At 1 block after difficulty adjustment you use the last 2016 blocks. ... At 100 block after difficulty adjustment you use the last 2016 blocks.
at that point (1 block after), wouldn't using the last 2016 blocks give a better 'projection of network hashrate'?




DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1003
Gerald Davis


March 13, 2012, 01:10:53 AM 

The same concept applies. At any point you can adjust the next difficulty by looking at the PRIOR 2016 blocks.
Most sites don't do this because they are using difficulty and future difficulty to project the network hashrate, and they don't want to complicate the calculations with multiple difficulties.
So they do this: at 1 block after a difficulty adjustment they use 1 block. ... At 100 blocks after a difficulty adjustment they use 100 blocks.
Instead you can do this: at 1 block after a difficulty adjustment you use the last 2016 blocks. ... At 100 blocks after a difficulty adjustment you still use the last 2016 blocks.
at that point (1 block after), wouldn't using the last 2016 blocks give a better 'projection of network hashrate'?

Of course, but the math gets more complicated because each block needs to be adjusted by the difficulty at the time of that block. For some sites simply looking to provide a quick snapshot of "current hashpower", they opt for the simpler calculation of a flat difficulty, which means lots of error right after a difficulty adjustment.
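The per-block adjustment described above can be sketched as a difficulty-weighted estimate: when the 2016-block window spans a retarget, each block's contribution to the work total uses the difficulty in force when it was mined. The block data here is hypothetical (difficulty, seconds-to-mine) pairs:

```python
# Difficulty-weighted hashrate estimate over a window spanning a retarget.
HASHES_PER_DIFFICULTY = 2**32  # expected hashes per unit of difficulty

def estimated_hashrate(blocks):
    """blocks: iterable of (difficulty, seconds) for each block.
    Each block's expected work is weighted by its own difficulty."""
    total_work = sum(d * HASHES_PER_DIFFICULTY for d, _ in blocks)
    total_time = sum(t for _, t in blocks)
    return total_work / total_time  # hashes per second

# 1000 blocks at difficulty 1e6, then 1016 at 1.5e6, each taking 600 s:
window = [(1_000_000, 600)] * 1000 + [(1_500_000, 600)] * 1016
rate = estimated_hashrate(window)
```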




