Bitcoin Forum
Author Topic: What EXACTLY means "longest" chain ?  (Read 2720 times)
Pieter Wuille (Legendary)
July 12, 2011, 11:13:15 AM  #21

The weight of a block comes from its difficulty - namely the hash value it had to beat, not the actual value it had.

Since the difficulty is adjusted only every 2016 blocks, all blocks within a 2016-block section have the same difficulty.

If there is A->B, and someone finds a B', it will almost certainly (except on the 2016 block boundary) have the same difficulty as B, and not force anyone to switch. The switch only happens when a C would be found as successor to B' before a successor to B is found.

The point is simply this: if 95% of the network (measured by computing power) thinks B is the successor to A, B has a 95% chance to be extended first, and thus a 95% chance to become part of the final chain.

I do Bitcoin stuff.
TierNolan (Legendary)
July 12, 2011, 12:39:34 PM  #22

Quote
The weight of a block comes from its difficulty - namely the hash value it had to beat, not the actual value it had.

Well, the suggestion is that the value of the hash be used as a tie-break when there are two competing blocks at the head of the chain.

The winning block in the case of a tie wouldn't be determined by network propagation delays.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Pieter Wuille (Legendary)
July 12, 2011, 12:50:03 PM  #23

The value of the block's hash is not a good measure for the amount of work that was necessary to produce it. The target is.

TierNolan (Legendary)
July 12, 2011, 02:53:28 PM  #24

Quote
The value of the block's hash is not a good measure for the amount of work that was necessary to produce it. The target is.

I don't agree.  The amount of work required to produce a hash is 1 hash worth of processing power, no matter what the value.

The question is how much processing power would be required to create a hash that is at least as good as the current one.

If you define "good" as lower hash value, then the value directly maps to how hard it is to create a hash that is at least as good.

However, it does mean that a miner can hit the jackpot: on average, one block in every 100 will have an effective difficulty 100 times higher than the target. That would naturally generate lock-in points on the chain.
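A quick sketch (not from the thread; target and trial count are arbitrary) checking the jackpot claim: conditional on a hash being below the target, it is uniform on [0, target), so the chance it also beats target/100 is 1/100.

```python
import random

# Simulate winning hashes: each is uniform on [0, TARGET), since every
# attempt below the target is equally likely to be any such value.
random.seed(42)
TARGET = 2**200          # arbitrary target for the simulation
BLOCKS = 100_000

jackpots = 0
for _ in range(BLOCKS):
    h = random.randrange(TARGET)
    if h < TARGET // 100:        # "jackpot": 100x better than required
        jackpots += 1

rate = jackpots / BLOCKS         # expect roughly 0.01
```

So roughly 1% of valid blocks carry 100x the effective work under a hash-value metric, matching the claim above.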

With the current system, you could in theory form an alternative chain by forking back to when the difficulty was very low and pretending that it stayed low for lots of blocks. This is why the client locks in (checkpoints) old blocks.

Pieter Wuille (Legendary)
July 12, 2011, 03:23:21 PM  #25

Each hash is, by itself, as hard as any other. The number of tries you need is proportional to the difficulty at which you were hashing.


TierNolan (Legendary)
July 12, 2011, 03:56:30 PM  #26

Quote
Each hash on itself is as hard as any other. The number of tries you need is proportional to the difficulty you had while hashing.

There is no need for an in-advance definition of difficulty.

Work for a block = 2^256 / (hash value)

Work for a chain = sum of the work for each block

To produce a chain with a greater proof of work would require carrying out the same number of hashes.
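A minimal sketch (mine, not from the thread; the example target is hypothetical) contrasting this per-hash work metric with Bitcoin's target-based chainwork rule:

```python
def work_from_hash(block_hash: int) -> int:
    """TierNolan's proposal: credit each block with 2^256 / (hash value)."""
    return 2**256 // block_hash

def work_from_target(target: int) -> int:
    """Bitcoin's rule: credit the expected number of hashes, 2^256 / target."""
    return 2**256 // target

target = 0xffff * 2**208      # difficulty-1 target
lucky_hash = target // 100    # a "jackpot" hash, 100x better than required

# Under the hash-based metric this one block counts as 100 blocks' work;
# under the target-based metric every valid block counts the same.
assert work_from_hash(lucky_hash) >= 100 * work_from_target(target)
```

The disagreement in the posts that follow is exactly about which of these two functions should score a chain.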


With a protocol change, difficulty would only be required to determine the coinbase reward for the block.

Coinbase = 50 * (target difficulty) / (hash value), capped at 50 per block.

With a variable coinbase, miners with weak blocks would still earn partial rewards, which would reduce the need for mining pools.

Nodes could refuse to accept blocks with hashes greater than a threshold, in order to reduce spam.


I wasn't suggesting throwing away the rule that the longest chain is simply the one with the most blocks.  I was suggesting to use the hash value as a tie breaker when a fork happens.  This doesn't require a protocol change, just a change to how nodes forward blocks and how miners decide which block to mine.

Pieter Wuille (Legendary)
July 12, 2011, 04:10:47 PM  #27

Quote
I wasn't suggesting throwing away the rule that the longest chain is simply the one with the most blocks.  I was suggesting to use the hash value as a tie breaker when a fork happens.  This doesn't require a protocol change, just a change to how nodes forward blocks and how miners decide which block to mine.

There is no rule that the best chain is the one with the most blocks. It's the one with the most statistically expected number of hashes to have been performed by it.

The average number of hashes that have been performed for a given block is 2^48/65535*difficulty, or 2^256/target. Whether it is a hash much below the target or very close to it doesn't matter. Each attempt for that block had an independent chance of target/2^256 of being good enough, and good enough is all that counts.
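A small check (mine, not from the post) of that arithmetic: with Bitcoin's difficulty-1 target of 0xffff * 2^208, the expected hashes per block, 2^256 / target, works out to exactly (2^48 / 65535) * difficulty.

```python
DIFF1_TARGET = 0xffff * 2**208   # Bitcoin's difficulty-1 target

def expected_hashes(difficulty: float) -> float:
    """Expected number of hash attempts per block at a given difficulty."""
    target = DIFF1_TARGET / difficulty
    return 2**256 / target

# The two formulas from the post agree across a range of difficulties.
for d in (1, 1_000, 1_000_000):
    assert abs(expected_hashes(d) - (2**48 / 65535) * d) < 1e-6 * expected_hashes(d)
```

Note the expected count depends only on the target, not on how far below it the winning hash landed, which is the whole point of the argument.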

Yes, you could move to another definition where you count 1/hash as value of a block, but the only thing this will cause is much more reorganisations, as a block that the majority had already agreed upon may be reverted.

Now there are two interpretations of your proposal: if you simply count the value of a chain as the sum of (1/hash) for each block in it, you could relatively easily create a block that reverts the past 100 blocks, causing much more reorganisations. The other interpretation is still letting the chain with the highest expected number of hashes win, but look at the 1/hash for the last block only for deciding which to pick. This is much more reasonable, but it doesn't gain you anything: instead of letting the majority of the network decide which side of the split wins, it becomes essentially random.
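The second interpretation can be sketched in a few lines; this is my illustration, and the `Tip` type is made up for it. Chains are still compared by total expected work, and only an exact tie is broken by the lower tip hash.

```python
from dataclasses import dataclass

@dataclass
class Tip:
    chainwork: int   # sum of 2^256 // target over the whole chain
    tip_hash: int    # hash value of the last block

def best_tip(a: Tip, b: Tip) -> Tip:
    # Normal rule: the chain with more expected work wins.
    if a.chainwork != b.chainwork:
        return a if a.chainwork > b.chainwork else b
    # Tie-break: prefer the lower (luckier) tip hash, so every node picks
    # the same side regardless of which block reached it first.
    return a if a.tip_hash < b.tip_hash else b

a = Tip(chainwork=1000, tip_hash=5)
b = Tip(chainwork=1000, tip_hash=9)
assert best_tip(a, b) is a
```

As the post argues, this makes the outcome deterministic across nodes but essentially random with respect to which side the hashrate majority was extending.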

TierNolan (Legendary)
July 12, 2011, 06:36:57 PM  #28

Quote
There is no rule that the best chain is the one with the most blocks. It's the one with the most statistically expected number of hashes to have been performed by it.

I think the client takes it as most blocks.  In most cases, it's identical.

You make a good point about 1/hash.  If someone had 1% of the network and mined 5 blocks behind the head of the chain, then once every 500 blocks, they could reverse the 5 blocks ahead of them.

Quote
Now there are two interpretations of your proposal: if you simply count the value of a chain as the sum of (1/hash) for each block in it, you could relatively easily create a block that reverts the past 100 blocks, causing much more reorganisations. The other interpretation is still letting the chain with the highest expected number of hashes win, but look at the 1/hash for the last block only for deciding which to pick. This is much more reasonable, but it doesn't gain you anything: instead of letting the majority of the network decide which side of the split wins, it becomes essentially random.

The second one was what I was suggesting.  The gain is that all honest nodes would (subject to network latency) agree on the current head of the chain.  The disadvantage is that 1-confirmation security would be weaker, since the attacker only needs to find one block instead of two.
