Bitcoin Forum
Author Topic: Why is difficulty a float number?  (Read 2179 times)
MaxDZ8 (OP)
Hero Member
Activity: 672, Merit: 500
February 16, 2015, 08:04:18 AM
#1

Foreword: I'm not entirely sure how difficulty works in BTC at the network level. This question stems mainly from my work at the stratum level, so I'm not sure it belongs here, but this is perhaps the only section of the forum where I have a chance of getting some real, solid information.

It is my understanding that difficulty is "the number of bits in the resulting hash". That seems like a nice, well-defined number to me.

For some reason I never fully understood, however, it is often expressed as a float, capable of representing really big numbers. I think this was because it gives a better idea of how many hashes must be tried on average to find a solution, whereas the bit count would be non-linear in nature. It's not as if I find <big number> particularly expressive either.

I assume I'm missing something with the truediffone constant and the various multiplications leading to the target bits.

Why was the floating-point representation preferred?

Further elaborations on the subject are welcome.
gmaxwell
Moderator, Legendary
Activity: 4158, Merit: 8382
February 16, 2015, 08:19:52 AM
#2

It isn't: in the Bitcoin p2p protocol and blockchain there is no "difficulty"; difficulty is just a display convention. What the network uses is the 'bits' number, which is a compressed representation of a 256-bit integer target that the block hash is compared to. To be valid, the hash must be less than the target.

The difficulty number is a relative number which is easier to compare than the huge target or the cryptically encoded bits field. Now that it's over 4 billion it's not as useful a unit as it was back when difficulty was 100k.

There are some ignorantly constructed mining tools which try to use the for-humans 'difficulty' number for actually important purposes. There be dragons. (Among other things, errors related to converting block hashes to 'effective difficulty' have been blamed for miners discarding valid blocks in the past.)

Quote from: MaxDZ8 on February 16, 2015, 08:04:18 AM
It is my understanding that difficulty is "the number of bits in the resulting hash". That seems like a nice, well-defined number to me.
That's not correct, nor would it be very useful: if it were a straight leading-zero-bit check, difficulty could only change in huge doubling/halving increments, which could not adequately control the rate of blocks.
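
A minimal sketch of how that compact 'bits' field expands into the full 256-bit target (the encoding is mantissa × 256^(exponent − 3); the function names here are illustrative, not Bitcoin Core's actual code, and exponents below 3 are not handled):
Code:
#include <array>
#include <cstdint>
#include <cstdio>

// Expand the compact "bits" field into a 32-byte big-endian target:
// target = mantissa * 256^(exponent - 3). Illustrative sketch only.
std::array<uint8_t, 32> ExpandCompactBits(uint32_t nBits)
{
    std::array<uint8_t, 32> target{};             // 256-bit target, most significant byte first
    const int exponent = nBits >> 24;             // size of the target in bytes
    const uint32_t mantissa = nBits & 0x007fffff; // low 23 bits (bit 23 is a sign flag)

    for (int i = 0; i < 3; ++i) {
        const int pos = 32 - exponent + i;        // byte position of mantissa byte i, from the big end
        if (pos >= 0 && pos < 32)
            target[pos] = (mantissa >> (8 * (2 - i))) & 0xff;
    }
    return target;
}

int main()
{
    // 0x1d00ffff is the well-known difficulty-1 bits value.
    const auto target = ExpandCompactBits(0x1d00ffff);
    for (uint8_t b : target)
        std::printf("%02x", b);
    std::printf("\n"); // prints 00000000ffff0000...0000
    return 0;
}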
Remember remember the 5th of November
Legendary
Activity: 1862, Merit: 1011
February 16, 2015, 10:45:36 AM
#3

Quote from: gmaxwell on February 16, 2015, 08:19:52 AM
It isn't: in the Bitcoin p2p protocol and blockchain there is no "difficulty"; difficulty is just a display convention. What the network uses is the 'bits' number, which is a compressed representation of a 256-bit integer target that the block hash is compared to. To be valid, the hash must be less than the target.

The difficulty number is a relative number which is easier to compare than the huge target or the cryptically encoded bits field. Now that it's over 4 billion it's not as useful a unit as it was back when difficulty was 100k.

There are some ignorantly constructed mining tools which try to use the for-humans 'difficulty' number for actually important purposes. There be dragons. (Among other things, errors related to converting block hashes to 'effective difficulty' have been blamed for miners discarding valid blocks in the past.)

It is my understanding that difficulty is "the number of bits in the resulting hash". That seems like a nice, well-defined number to me.
That's not correct, nor would it be very useful: if it were a straight leading-zero-bit check, difficulty could only change in huge doubling/halving increments, which could not adequately control the rate of blocks.
Huh, wait, it isn't?

So one cannot do uint64_t *hash = (uint64_t *)endianswap(block); // should be sufficient for quite a while

and then check if hash <= target? That's not correct?

teukon
Legendary
Activity: 1246, Merit: 1002
February 16, 2015, 12:18:48 PM
#4

Quote from: MaxDZ8 on February 16, 2015, 08:04:18 AM
Foreword: I'm not entirely sure how difficulty works in BTC at the network level. This question stems mainly from my work at the stratum level, so I'm not sure it belongs here, but this is perhaps the only section of the forum where I have a chance of getting some real, solid information.

It is my understanding that difficulty is "the number of bits in the resulting hash". That seems like a nice, well-defined number to me.

For some reason I never fully understood, however, it is often expressed as a float, capable of representing really big numbers. I think this was because it gives a better idea of how many hashes must be tried on average to find a solution, whereas the bit count would be non-linear in nature. It's not as if I find <big number> particularly expressive either.

I assume I'm missing something with the truediffone constant and the various multiplications leading to the target bits.

Why was the floating-point representation preferred?

Further elaborations on the subject are welcome.

It seems you are assuming that "difficulty" is primitive and that things like "target" are calculated from it.  In fact, target is primitive, and it is the compact form of the target, called the "bits" field, which is adjusted by the protocol every 2 weeks and stored in the header of every block.  As gmaxwell said, "difficulty" is just for the humans.  You might compare it with the "bitcoin" base unit, which is also an arbitrary scale added to make things user-friendly (at the protocol level, amounts are stored in satoshi).

https://en.bitcoin.it/wiki/Difficulty explains all this well.  You might also find some related calculations I did a few months ago illuminating.  I explain why an apparently approximate 4-fold difficulty increase was in fact precisely a 4-fold increase.
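
To tie this back to the float in the question: the displayed difficulty is just the ratio between the difficulty-1 target and the current target, and it can be computed directly from the bits field. A sketch along the lines of Bitcoin Core's GetDifficulty helper (not a verbatim copy):
Code:
#include <cstdint>
#include <cstdio>

// Difficulty as a float: (difficulty-1 target) / (current target), derived
// straight from the compact bits encoding. Modelled on Bitcoin Core's
// GetDifficulty helper; not a verbatim copy.
double DifficultyFromBits(uint32_t nBits)
{
    int shift = (nBits >> 24) & 0xff;                                 // exponent byte
    double diff = (double)0x0000ffff / (double)(nBits & 0x00ffffff);  // mantissa ratio vs difficulty 1
    // The difficulty-1 bits value 0x1d00ffff has exponent 0x1d = 29,
    // so scale by 256 for every byte the exponent differs from 29.
    while (shift < 29) { diff *= 256.0; ++shift; }
    while (shift > 29) { diff /= 256.0; --shift; }
    return diff;
}

int main()
{
    std::printf("%f\n", DifficultyFromBits(0x1d00ffff)); // 1.000000
    std::printf("%f\n", DifficultyFromBits(0x1b0404cb)); // ~16307.42, the example on the wiki page
    return 0;
}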
DeathAndTaxes (Gerald Davis)
Donator, Legendary
Activity: 1218, Merit: 1079
February 16, 2015, 03:37:42 PM
#5

Quote from: Remember remember the 5th of November
and then check if hash <= target? That's not correct?

Target is not difficulty.

As gmaxwell said, the network doesn't use difficulty at all.  It computes how small the next hash must be when the network adjusts.  That "how small" measure is expressed as a target.  You are right that if hash <= target then it is a valid block, but the question was about difficulty.
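
For completeness, the validity check itself needs neither floats nor truncation to 64 bits: with the target expanded to 32 bytes, a plain byte-wise comparison does the job, as long as hash and target are in the same byte order. A sketch (names are illustrative; the double-SHA256 result is assumed to be in the little-endian order Bitcoin stores it in, so it is reversed before comparing):
Code:
#include <algorithm>
#include <array>
#include <cstdint>
#include <cstring>

// Check a block hash against the full 256-bit target with a byte-wise
// comparison; no floating point, no truncation to 64 bits.
// hash_le:   32-byte double-SHA256 as stored (little-endian).
// target_be: expanded target, most significant byte first.
bool HashMeetsTarget(const std::array<uint8_t, 32>& hash_le,
                     const std::array<uint8_t, 32>& target_be)
{
    std::array<uint8_t, 32> hash_be;
    std::reverse_copy(hash_le.begin(), hash_le.end(), hash_be.begin());

    // With both buffers big-endian, memcmp orders them exactly like the
    // underlying 256-bit integers, so "hash <= target" is simply:
    return std::memcmp(hash_be.data(), target_be.data(), 32) <= 0;
}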


MaxDZ8 (OP)
Hero Member
Activity: 672, Merit: 500
February 16, 2015, 05:25:18 PM
#6


Quote from: gmaxwell on February 16, 2015, 08:19:52 AM
There are some ignorantly constructed mining tools which try to use the for-humans 'difficulty' number for actually important purposes. There be dragons. (Among other things, errors related to converting block hashes to 'effective difficulty' have been blamed for miners discarding valid blocks in the past.)

...

It is my understanding that difficulty is "the number of bits in the resulting hash". That seems like a nice, well-defined number to me.
That's not correct, nor would it be very useful: if it were a straight leading-zero-bit check, difficulty could only change in huge doubling/halving increments, which could not adequately control the rate of blocks.
Yeah, I guess mine is one of those tools. I'm not sure how I could have managed to miss this for so long... is it just me, or does this sound like I can just memcmp it against the hash?

I assume the situation at the network level is better defined in that regard. Between stratum and the pool operators manipulating difficulty multipliers, I am very confused.

Quote from: teukon on February 16, 2015, 12:18:48 PM
It seems you are assuming that "difficulty" is primitive and that things like "target" are calculated from it.  In fact, target is primitive, and it is the compact form of the target, called the "bits" field, which is adjusted by the protocol every 2 weeks and stored in the header of every block.  As gmaxwell said, "difficulty" is just for the humans.
It's the only thing I get from stratum servers (let me tell you, I don't hold the protocol in high regard). I can indeed see some computations I am familiar with. I'll probably have to really consolidate this information.
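
On the truediffone point from the opening post: in cgminer-derived miners the convention (a mining-software convention, not anything the Bitcoin protocol defines, and pools may round slightly differently) is that a stratum share difficulty D corresponds to a share target of roughly truediffone / D, where truediffone is the difficulty-1 target written out as a double. A sketch of that convention:
Code:
#include <cstdio>

// Mining-software convention only: convert a stratum share difficulty back
// into an approximate share target. truediffone is the difficulty-1 target
// 0x00000000FFFF0000...0000 expressed as a double, as used in cgminer/sgminer.
static const double truediffone =
    26959535291011309493156476344723991336010898738574164086137773096960.0;

double ShareTargetFromDifficulty(double share_difficulty)
{
    // Higher share difficulty -> proportionally smaller target.
    return truediffone / share_difficulty;
}

int main()
{
    std::printf("%.6g\n", ShareTargetFromDifficulty(1.0));    // the difficulty-1 target itself
    std::printf("%.6g\n", ShareTargetFromDifficulty(1024.0)); // a 1024x smaller target
    return 0;
}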
Sergio_Demian_Lerner
Hero Member
Activity: 549, Merit: 608
February 17, 2015, 03:02:43 AM
#7

There is a curious fact about the "bits" field:
In the first private release of Bitcoin, the "bits" field actually counted the number of zero bits the hash would need to have as prefix, and that's why it got named "bits".

teukon
Legendary
Activity: 1246, Merit: 1002
February 17, 2015, 11:09:10 AM
#8

Quote from: Sergio_Demian_Lerner on February 17, 2015, 03:02:43 AM
There is a curious fact about the "bits" field:
In the first private release of Bitcoin, the "bits" field actually counted the number of zero bits the hash would need to have as a prefix, and that's why it got named "bits".

Cool!  Where might I find this code?
Sergio_Demian_Lerner
Hero Member
Activity: 549, Merit: 608
February 18, 2015, 01:10:57 AM
#9

Somebody posted it on bitcointalk some time ago. Search for the post. If you cannot find it, send me a direct message with your e-mail and I'll send it to you.
Remember remember the 5th of November
Legendary
Activity: 1862, Merit: 1011
February 18, 2015, 02:04:42 AM
#10

Quote from: Sergio_Demian_Lerner on February 18, 2015, 01:10:57 AM
Somebody posted it on bitcointalk some time ago. Search for the post. If you cannot find it, send me a direct message with your e-mail and I'll send it to you.
I respect you a lot!

But why not just post the link here instead of asking for a PM with an email? It's a bit (just a bit) like the XY problem.

MaxDZ8 (OP)
Hero Member
Activity: 672, Merit: 500
February 18, 2015, 10:56:32 AM
#11

Seconded.
If the whole thing is already public, I would like it to be discussed openly... as far as discussing a historical legacy makes sense.
Even better if it could be traced through commits.
teukon
Legendary
Activity: 1246, Merit: 1002
February 18, 2015, 12:35:17 PM
#12

I think Sergio was just trying not to derail the thread.  Poor guy's just too damned interesting for his own good.

I'm not so considerate.

Searching the forum yielded a topic started by Cryddit about a year ago where he posted the source he received in private correspondence with Satoshi about 2 months before Bitcoin's first release.

One can see that indeed the "bits" field (variable name: "nBits") represented the number of leading zero bits required of a block's hash.  At the beginning of the block-generating (mining) code we see that hashTarget is set to the binary number consisting of (nBits) '0's followed by (256 - nBits) '1's.  We then see the loop which performs a double-hash and checks the result against this target.
Code:
        //
        // Search
        //
        uint256 hashTarget = (~uint256(0) >> pblock->nBits);
        uint256 hash;
        while (nTransactionsUpdated == nTransactionsUpdatedLast)
        {
            BlockSHA256(&tmp.block, nBlocks0, &tmp.hash1);
            BlockSHA256(&tmp.hash1, nBlocks1, &hash);

            if (hash <= hashTarget)
            {

In the function GetNextWorkRequired we see that blocks were to be spaced 15 minutes apart, while the target was to be readjusted roughly once every 30 days.  nBits could only be increased or decreased by 1 (corresponding to the difficulty precisely doubling or halving), and it would only change if the target was considered off by more than a factor of 2 (i.e. if the 2880 blocks between re-targets arrived in fewer than 15 days or more than 60 days according to the timestamps).
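
In other words, roughly (a reconstruction of the rule just described, for illustration only, not the original source):
Code:
// Reconstruction, for illustration only, of the adjustment behaviour described
// above (not Satoshi's original code): nBits moves by at most one, and only if
// the 2880-block span was off by more than a factor of two.
unsigned int NextNBits(unsigned int nBits, long long nActualTimespanSeconds)
{
    const long long nTargetTimespan = 2880LL * 15 * 60; // 2880 blocks x 15 min = 30 days

    if (nActualTimespanSeconds < nTargetTimespan / 2)
        return nBits + 1; // blocks came too fast: one more leading zero bit (difficulty doubles)
    if (nActualTimespanSeconds > nTargetTimespan * 2)
        return nBits - 1; // blocks came too slow: one fewer leading zero bit (difficulty halves)
    return nBits;         // otherwise leave the target unchanged
}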

Even I'm starting to feel guilty now, so I'll throw the OP a bone.  "bits" was indeed originally a measure of "difficulty" (a logarithmic measure), and it was an integer.  The header of main.cpp contained:
Code:
///static const unsigned int MINPROOFOFWORK = 40; /// need to decide the right difficulty to start with
static const unsigned int MINPROOFOFWORK = 20;  /// ridiculously easy for testing
MaxDZ8 (OP)
Hero Member
Activity: 672, Merit: 500
February 18, 2015, 01:58:28 PM
#13

So basically the misconception has a basis in truth which is now largely obsolete.

Thanks!