Bitcoin Forum
Author Topic: Calculating Estimated Diff Change  (Read 851 times)
gateway (OP)
Hero Member | Activity: 552 | Merit: 500
January 08, 2014, 10:38:41 PM  #1

Hey everyone, I think this has been covered before, but I couldn't find a solid answer to the question. I'm querying my bitcoind to get various details out of it, and then I want to calculate the estimated difficulty change, or find the best possible way to do it.

Here is the current data I'm collecting. I hope I also got the first and last blocks of the current difficulty period right:

Quote
Estimated formula: 1418481395.2626 * (600 / ((1389220457 - 1388624318) / 1170))
Estimated New Difficulty = 1,670,372,077
Estimated Change = 1.1775777125805
Current Block = 279378
Current Difficulty = 1,418,481,395
Current Block Hash = 0000000000000000eea976a669c940e408e269d4984370ad1c3059db09dad86a
Current Block Hash Time = 1389220457 (January 8, 2014, 2:34 pm)
First Block at current Difficulty = 278208
Last Block at current Difficulty = 280223
Next Difficulty at Block = 280224
Number of Blocks until next Difficulty = 846
Number of Blocks posted since Difficulty started = 1170
Next Difficulty in 5 days, 21 hours

So what I'm using to estimate the next difficulty is the following:

Quote
current difficulty * (600 / ((time of latest block - time of first block at this difficulty) / number of blocks since first block at this difficulty))
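In code, that estimate can be sketched as follows. This is a minimal Python sketch using the numbers quoted above; in practice the values would come from bitcoind RPC calls such as getblockcount, getblock, and getdifficulty.

```python
# A minimal sketch of the difficulty estimate above. The constants are
# Bitcoin's 600-second target block spacing and 2016-block retarget interval.

TARGET_SPACING = 600      # seconds Bitcoin targets between blocks
RETARGET_INTERVAL = 2016  # blocks between difficulty adjustments

current_difficulty = 1418481395.2626
first_block_time = 1388624318    # timestamp of first block at this difficulty
latest_block_time = 1389220457   # timestamp of the latest block
blocks_elapsed = 1170            # blocks found since the last retarget

# Average solve time over the window so far; blocks arriving faster than
# 600 s means the difficulty will rise proportionally at the next retarget.
avg_solve_time = (latest_block_time - first_block_time) / blocks_elapsed
change_factor = TARGET_SPACING / avg_solve_time
estimated_next_difficulty = current_difficulty * change_factor
blocks_remaining = RETARGET_INTERVAL - blocks_elapsed

print(f"change factor: {change_factor:.6f}")
print(f"estimated next difficulty: {estimated_next_difficulty:,.0f}")
print(f"blocks until retarget: {blocks_remaining}")
```

This reproduces the ~1.1776 change factor and the 846 blocks remaining from the figures quoted above.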

Does that look right? Can anyone shed some light on what I might be doing wrong or missing?

Cheers!
gateway (OP)
Hero Member | Activity: 552 | Merit: 500
January 09, 2014, 09:31:01 PM  #2

Anyone? I'm kinda stuck at the moment. :)
minerva
Full Member | Activity: 120 | Merit: 100
January 09, 2014, 09:33:29 PM  #3

http://bitcoincharts.com/bitcoin/
This does it for you.

Tip-Jar: 15NN2YwMGAntKopJgAsFBJvfuCARkV62xo
gateway (OP)
Hero Member | Activity: 552 | Merit: 500
January 09, 2014, 09:48:46 PM  #4

Yeah, I know there are a few sites that do this; I was more looking for the formula they use to calculate the estimated difficulty change.
minerva
Full Member | Activity: 120 | Merit: 100
January 09, 2014, 09:58:03 PM  #5

Quote from: gateway
Yeah, I know there are a few sites that do this; I was more looking for the formula they use to calculate the estimated difficulty change.
https://en.bitcoin.it/wiki/Difficulty

Tip-Jar: 15NN2YwMGAntKopJgAsFBJvfuCARkV62xo
cp1
Hero Member | Activity: 616 | Merit: 500
Stop using brainwallets
January 09, 2014, 10:00:07 PM  #6

You could look at the solve time of the last n blocks and calculate the hash rate then convert that to difficulty.
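That approach might be sketched like this. The helper names are my own, and the conversion rests on the standard relation that one block at a given difficulty takes about difficulty × 2^32 hashes on average; the input values are the figures quoted in the opening post.

```python
# Sketch: estimate the network hash rate from the solve times of the last
# n blocks, then convert that hash rate back into an implied difficulty.

def estimate_hashrate(difficulty, first_time, last_time, n_blocks):
    """Hashes per second implied by n_blocks solved between two timestamps.

    A block at a given difficulty requires ~difficulty * 2**32 hashes on
    average, so hashrate ~= expected hashes per block / average solve time.
    """
    avg_solve_time = (last_time - first_time) / n_blocks
    return difficulty * 2**32 / avg_solve_time

def hashrate_to_difficulty(hashrate, target_spacing=600):
    """Difficulty at which this hash rate solves one block per target_spacing."""
    return hashrate * target_spacing / 2**32

# Using the figures from the opening post:
rate = estimate_hashrate(1418481395.2626, 1388624318, 1389220457, 1170)
print(f"estimated hash rate: {rate / 1e12:.0f} TH/s")
print(f"implied next difficulty: {hashrate_to_difficulty(rate):,.0f}")
```

Note that going difficulty → hash rate → difficulty with the same window gives the same number as the one-line formula above; the detour is useful mainly when you want the hash rate itself, or want to compute it over a different window of blocks.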

Guide to armory offline install on USB key:  https://bitcointalk.org/index.php?topic=241730.0
Rannasha
Hero Member | Activity: 728 | Merit: 500
January 09, 2014, 10:28:56 PM  #7

The formula you posted is correct for the most basic estimate of the next difficulty. Its disadvantage is that the estimate can fluctuate considerably just after a difficulty change and it will only become more accurate as the next difficulty change approaches. Another approach is to always look at the time spent on the last N blocks, where a larger N gives a more stable prediction, but less accurate in times of quickly changing hashrate.

An even more advanced technique would be to assume that the network hashrate is, at least locally, increasing exponentially. You can then take several datapoints with the time spent on solving the last N blocks, the time spent on solving the last 2N to N blocks, 3N to 2N, etc... and fit an exponential curve to this data.
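As an illustration, the exponential-fit idea might look like the sketch below. The window solve times here are made-up example numbers, not real chain data; real averages would come from block timestamps via bitcoind.

```python
# Sketch: assume hash rate grows exponentially, so fit a line to
# log(1 / solve_time) across successive windows of N blocks.
import math

# Average solve time (seconds) of the last N blocks, the 2N..N before
# them, and so on -- newest window first. Falling solve times going
# forward in time imply a rising hash rate. (Illustrative numbers.)
window_avg_times = [505.0, 520.0, 535.0, 551.0]

# Hash rate is proportional to 1 / solve_time, so fit log(1/t) = a + b*x
# with ordinary least squares, where x counts windows back in time.
xs = list(range(len(window_avg_times)))
ys = [math.log(1.0 / t) for t in window_avg_times]
n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
    sum((x - x_mean) ** 2 for x in xs)

# b is negative because x runs backwards in time; the forward growth
# rate per window is therefore exp(-b).
growth_per_window = math.exp(-b)
print(f"estimated hash rate growth per window: {growth_per_window:.4f}x")
```

Extrapolating that fitted growth over the blocks remaining until the retarget then gives a difficulty estimate that anticipates the rising hash rate instead of assuming it stays flat.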
gateway (OP)
Hero Member | Activity: 552 | Merit: 500
January 10, 2014, 02:27:08 AM  #8

Quote from: Rannasha
The formula you posted is correct for the most basic estimate of the next difficulty. Its disadvantage is that the estimate can fluctuate considerably just after a difficulty change and it will only become more accurate as the next difficulty change approaches. Another approach is to always look at the time spent on the last N blocks, where a larger N gives a more stable prediction, but less accurate in times of quickly changing hashrate.

An even more advanced technique would be to assume that the network hashrate is, at least locally, increasing exponentially. You can then take several datapoints with the time spent on solving the last N blocks, the time spent on solving the last 2N to N blocks, 3N to 2N, etc... and fit an exponential curve to this data.

Ahh, OK. Great, thanks for the info. I need to think about whether I want to go that route. Cheers!