Bitcoin Forum

Bitcoin => Development & Technical Discussion => Topic started by: CIYAM on April 24, 2013, 04:13:37 AM



Title: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 04:13:37 AM
One of the concerns that has been raised several times about a >50% attack is that blocks might not be immediately published but instead kept secret whilst the attacker continues to mine ahead, finally publishing all of the blocks at once to form a new longer chain, invalidating all transactions that occurred since the secret chain was started (which could be all the way back to the last checkpoint).

Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?
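The attack being described can be sketched in a few lines (illustrative only; Bitcoin actually compares cumulative work rather than raw block counts, which is equivalent here under constant difficulty):

```python
# Sketch (not Bitcoin code) of how "longest chain wins" lets a secretly
# mined chain displace the public one. All names are illustrative.

def best_chain(chains):
    # Bitcoin compares cumulative proof of work; with constant
    # difficulty that reduces to chain length, as assumed here.
    return max(chains, key=len)

# Public chain: blocks 0..100 mined in the open.
public_chain = ["block%d" % h for h in range(101)]

# Attacker forks privately at height 50 and, with >50% of the hash rate,
# eventually mines a longer branch without publishing any of it.
secret_chain = public_chain[:51] + ["attack%d" % h for h in range(51, 103)]

# On publication the longer secret chain wins, and every transaction
# confirmed only in blocks 51..100 of the public chain is reversed.
winner = best_chain([public_chain, secret_chain])
assert winner is secret_chain
print("reorg depth:", len(public_chain) - 51)  # -> reorg depth: 50
```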


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: Etlase2 on April 24, 2013, 04:28:37 AM
Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?


It could but it doesn't because "there can be only one". While this could be an attack using hashpower, it could also be an attack on (or mishap in) the internet infrastructure that has separated parts of the mining power for some time. When they rejoin, your solution would cause a fork that would have to be resolved by the users instead of by a computer.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 04:40:23 AM
While this could be an attack using hashpower, it could also be an attack on (or mishap in) the internet infrastructure that has separated parts of the mining power for some time. When they rejoin, your solution would cause a fork that would have to be resolved by the users instead of by a computer.

In the scenario above the fork has already happened *before* trying to apply my solution (i.e. as soon as the miners became separated you have forked) - but yes, with my solution those forks could never be rejoined (I think if they were big enough you'd still have a hell of a mess so the current system doesn't really help that much).


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: Etlase2 on April 24, 2013, 04:47:39 AM
(I think if they were big enough you'd still have a hell of a mess so the current system doesn't really help that much).

A hell of a mess that has a dead simple solution - longest chain wins.

Is it the best solution? It certainly is an inelegant one, but it does fix the problem. I have suggested using a bitcoin days destroyed mechanic in addition to longest chain wins under attack-like scenarios, which doesn't even require a soft fork and would only be an incompatibility issue if the problem occurs, but people around here aren't too interested in straying from the satoshicode.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 04:54:22 AM
I understand (and respect) the need to be conservative, and I guess in the end it may not be much different to having checkpoints every now and again, but it just seems that disallowing a new longest chain (due to it containing blocks that are too old) would be a more elegant (and automatically ongoing) way to prevent such a major reorg from occurring.

Also if I were (well actually I am) in China and had been running on a separate fork for the last year or so then I don't think I'd want to see my fork merged at all (so it may as well stay forked forever).


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: Etlase2 on April 24, 2013, 05:03:21 AM
"Too old" is a subjective measure and will be disagreed upon. Not everyone is going to see everything at the same time. And disqualifying "too old" means you are accepting a permanent fork if there has been a network split for more than the "too old" amount of time.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 05:09:11 AM
Well, consider coinbase rewards - you can spend them only after a certain number (is it 100 or 120?) of further blocks are added to your chain, but if you were mining on the losing fork then those blocks are going to be discarded - if you have already *spent* those funds in the meantime then someone is going to be rather unhappy.

If Bitcoin thinks that 100/120 is the *safe* point to allow spending from coinbase then I would be proposing a figure that would be closely related to that (making it no more subjective than the limit already in place).
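For reference, a minimal sketch of the maturity rule in question (the consensus constant is 100 blocks; I believe the original client's GUI waited until around 120 confirmations before listing the reward as spendable, which is the likely source of the 100-vs-120 confusion):

```python
# Sketch of Bitcoin's coinbase maturity rule. The consensus constant is
# 100; the historic 120 figure was only a GUI display threshold.

COINBASE_MATURITY = 100

def coinbase_spendable(coinbase_height, spend_height):
    # Mirrors the consensus check: the spending block must sit at least
    # COINBASE_MATURITY blocks above the block containing the coinbase.
    return spend_height - coinbase_height >= COINBASE_MATURITY

print(coinbase_spendable(1000, 1099))  # False: only 99 blocks above
print(coinbase_spendable(1000, 1100))  # True: 100 blocks above
```

If the fork carrying those 100 blocks is later discarded, the coinbase itself vanishes along with anything spent from it - which is exactly the unhappiness described above.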


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: kjj on April 24, 2013, 11:14:34 AM
Right now, a new chain takes over as long as the embedded difficulty is at least one hash more than the currently known chain.

I have proposed many times that reorgs beyond a trivial depth should require exponential difficulty increases.  For example, we could say that a reorg of more than 10 blocks requires 1.02 times as much work per block past 10, or ΔD = 1.02^(blks-10).

This would force any potential history rewriting attacker to move quickly and make their intentions obvious to all, lest they find themselves fighting that exponential.
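A minimal sketch of this exponential penalty (the numbers are from the post; this is a proposal, not Bitcoin's actual rule):

```python
# Sketch of the proposed exponential reorg penalty: a competing chain
# must carry extra work to force a reorg deeper than the threshold.
# Parameters (10 blocks, factor 1.02) are from the proposal above.

def required_work_multiplier(reorg_depth, threshold=10, base=1.02):
    """Work multiplier a competing chain needs for a reorg this deep."""
    if reorg_depth <= threshold:
        return 1.0  # shallow reorgs: plain most-work-chain rule applies
    return base ** (reorg_depth - threshold)

# The multiplier grows without bound, so a hidden-chain attacker is
# racing the clock: the longer the secret fork, the more work needed.
for depth in (10, 50, 100, 500):
    print(depth, round(required_work_multiplier(depth), 2))
```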

My scheme, like all such schemes to add state, has some very serious downsides.  For starters, it makes network convergence not automatic.  I have argued a bunch of times that the trade-off could be worthwhile, but I still have to accept that the burden of proof for messing with such a core concept is very high.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 11:23:19 AM
My scheme, like all such schemes to add state, has some very serious downsides.  For starters, it makes network convergence not automatic.  I have argued a bunch of times that the trade-off could be worthwhile, but I still have to accept that the burden of proof for messing with such a core concept is very high.

Your scheme sounds interesting (and is actually better than my idea, I must admit) - the automatic convergence is something I don't really see as being a good thing at all (as I stated before, if you have been using a fork for 100s or more importantly 1000s of blocks then such convergence, whilst able to occur automatically, would not occur without a hell of a lot of complaints).


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: TierNolan on April 24, 2013, 11:26:47 AM
Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?

Some of the proof of stake rules do that kind of thing.  The checkpoint system is a manual version to a certain extent.

An extreme version would be that you multiply by age.  Block proof of work is (Block POW) * (time since the node first added the block to the chain).

This is already used for tie breaking chains.  If you have 2 equal POW chains, then you go with the one that was extended first.

You could add a maximum bonus (say 60 days old).  This would allow the chain to heal eventually.
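A rough sketch of this age-weighted scoring with the 60-day cap (the cap is from the post; the scoring function itself is illustrative):

```python
# Sketch of age-weighted chain scoring: each block's proof of work is
# multiplied by how long this node has known the block, with the bonus
# capped at 60 days so the chain can eventually heal. Illustrative only.

MAX_BONUS_SECONDS = 60 * 24 * 60 * 60  # 60-day cap on the age bonus

def chain_score(blocks, now):
    """blocks: list of (work, first_seen) pairs as recorded by this node."""
    score = 0.0
    for work, first_seen in blocks:
        age = min(now - first_seen, MAX_BONUS_SECONDS)
        score += work * max(age, 1)  # brand-new blocks still count a little
    return score

now = 1_000_000
# An established 30-block chain beats a slightly longer chain of
# just-announced blocks, because the latter has earned no age bonus yet.
established = [(1.0, now - 86_400 * d) for d in range(30, 0, -1)]  # 30 blocks
surprise    = [(1.0, now - 10) for _ in range(35)]                 # 35 blocks
assert chain_score(established, now) > chain_score(surprise, now)
```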


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: Peter Todd on April 24, 2013, 11:29:12 AM
You can use https://twitter.com/blockheaders as your blockchain information source and create a secure Bitcoin wallet that has no concept of the P2P network or the blockchain itself. If you have a few other sources of block header information you aren't even trusting any one entity; information is easy to spread and difficult to stifle.

Any attempt to change the rules for what constitutes a valid blockchain to something other than longest wins has the ugly consequence that SPV clients no longer work. You can do it - in an emergency we may have no choice at all - but remember that it has ugly consequences.

Anyway, the issue has been discussed to death before. Taking priority into account is something Gavin mentioned on his blog last year, and that was just to make sure the public realized there are last ditch options available.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: TierNolan on April 24, 2013, 11:35:08 AM
If Bitcoin thinks that 100/120 is the *safe* point to allow spending from coinbase then I would be proposing a figure that would be closely related to that (making it no more subjective than the limit already in place).

The rule could be something like, auto-checkpoint if
- the block is on the longest chain
- the block is at least 2160 blocks deep
- the block was received by the node more than 30 days previous
- During the last 30 days,
-- the daily hashing for the chain has never been less than 50% of the median for the 30 days
-- the chain has received more than 90% of the total hashing power across all forks being mined

Auto-checkpoints could be discarded if the fork gets long enough, so they are soft checkpoints.  They would get harder the older they are.
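The checks listed above could be sketched roughly as follows (the thresholds are from the post; the surrounding data structures are invented for illustration):

```python
# Sketch of the proposed soft auto-checkpoint rule. Thresholds (2160
# blocks, 30 days, 50% of median, 90% share) are from the proposal; the
# data structures are illustrative.

from dataclasses import dataclass
import statistics

DAY = 86_400

@dataclass
class BlockInfo:
    height: int
    first_seen: float  # unix time this node first saw the block

def eligible_for_auto_checkpoint(block, tip_height, now,
                                 daily_hashes_30d, fork_share_30d):
    """Caller guarantees the block is on this node's longest chain.
    daily_hashes_30d: this chain's hash totals for each of the last 30 days.
    fork_share_30d: fraction of all observed hashing spent on this chain."""
    if tip_height - block.height < 2160:          # at least 2160 blocks deep
        return False
    if now - block.first_seen < 30 * DAY:         # known locally for 30+ days
        return False
    median = statistics.median(daily_hashes_30d)
    if min(daily_hashes_30d) < 0.5 * median:      # no day under 50% of median
        return False
    return fork_share_30d > 0.9                   # >90% of all forks' hashing

now = 100 * DAY
old_block = BlockInfo(height=1_000, first_seen=now - 31 * DAY)
steady = [100.0] * 30
print(eligible_for_auto_checkpoint(old_block, 4_000, now, steady, 0.95))  # True
print(eligible_for_auto_checkpoint(old_block, 2_000, now, steady, 0.95))  # False: too shallow
```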


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 11:38:18 AM
I must admit I hadn't thought about SPV clients although if all headers of all blocks (including those of a fork) were available then couldn't an SPV client also decide to ignore new headers that it decides are too old (i.e. they still have the timestamps for every header they are using don't they)?

Anyway, the issue has been discussed to death before. Taking priority into account is something Gavin mentioned on his blog last year, and that was just to make sure the public realized there are last ditch options available.

Any link to where this has been discussed in depth before (maybe I am not searching on the right thing)?


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: Zeilap on April 24, 2013, 04:21:48 PM
I must admit I hadn't thought about SPV clients although if all headers of all blocks (including those of a fork) were available then couldn't an SPV client also decide to ignore new headers that it decides are too old (i.e. they still have the timestamps for every header they are using don't they)?

Anyway, the issue has been discussed to death before. Taking priority into account is something Gavin mentioned on his blog last year, and that was just to make sure the public realized there are last ditch options available.

Any link to where this has been discussed in depth before (maybe I am not searching on the right thing)?


http://gavintech.blogspot.ca/2012/05/neutralizing-51-attack.html


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 04:27:59 PM
http://gavintech.blogspot.ca/2012/05/neutralizing-51-attack.html

Thanks for the link - well it does seem that although not so worried about it (as perhaps I am) Gavin has thought that this might be something that should be addressed.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: gmaxwell on April 24, 2013, 04:43:16 PM
Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?
If you make the longest chain decision stateful and not a pure function of the universe equally visible to all nodes then you replace a consensus change with an even more devastating consensus _failure_.

As an example, an oft-repeated suggestion is "just refuse to make any reorg greater than 50 blocks". Great, so now an attacker who can outpace the network can produce a fork 49 blocks back and then mine two more blocks— one on the real branch, one on the fork— and concurrently announce them each to half of the network ... and from one currency you have two: nodes are forever split and will never converge.  ... Or more simply, he makes his longer chain and all new nodes will accept it while all old nodes reject it.

Of course, if you make the fork far back enough then "okay, it'll never happen"— indeed, but if it'll never happen, what value is it?
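The permanent split described above can be demonstrated with a toy simulation (all structures here are illustrative):

```python
# Sketch of the failure mode: a hard "no reorgs deeper than 50 blocks"
# rule lets an attacker who briefly outpaces the network split it
# permanently. Illustrative code, not any real client's logic.

MAX_REORG = 50

def common_prefix_len(a, b):
    n = 0
    while n < len(a) and n < len(b) and a[n] == b[n]:
        n += 1
    return n

class Node:
    def __init__(self, chain):
        self.chain = list(chain)

    def offer(self, candidate):
        # Follow a longer chain only if it forks off recently enough.
        fork = common_prefix_len(self.chain, candidate)
        if len(candidate) > len(self.chain) and len(self.chain) - fork <= MAX_REORG:
            self.chain = list(candidate)

honest = [("H", h) for h in range(100)]
# Attacker forks 49 blocks back and mines a slightly longer secret branch.
attack = honest[:51] + [("A", h) for h in range(51, 102)]

left, right = Node(honest), Node(honest)
left.offer(attack)        # announced to one half only: depth 49, accepted

# Both halves keep mining on their own tips for a few blocks...
left.chain  += [("A", h) for h in range(102, 106)]
right.chain += [("H", h) for h in range(100, 105)]

# ...and now each half is more than MAX_REORG past the fork point, so
# neither will ever adopt the other's chain: one currency has become two.
right.offer(left.chain)
left.offer(right.chain)
assert left.chain[60][0] == "A" and right.chain[60][0] == "H"
```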



Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 24, 2013, 04:48:28 PM
So basically I think what you are saying is that if anyone gets >50% we are screwed no matter what (so therefore why try and mitigate anything) - correct?

(am willing to accept that there may be nothing we can do about it but it of course does leave some concern if we simply have no defense at all)


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: gmaxwell on April 25, 2013, 04:08:15 AM
So basically I think what you are saying is that if anyone gets >50% we are screwed no matter what (so therefore why try and mitigate anything) - correct?
(am willing to accept that there may be nothing we can do about it but it of course does leave some concern if we simply have no defense at all)
There are things that can be done, but they depend on the specifics of the attacker and the attack... and if the attacker knows about them they will be less effective. You can be confident that Bitcoin wouldn't go down without a fight.

But fundamentally: The security assumption of Bitcoin is that the honest users control the majority.  If it could be stronger, it would be— but at least so far as I've seen, the proposals to strengthen it within the algorithm end up trading off one weakness for a worse one. If you break the majority assumption then no algorithm can protect you— but people, acting externally to the system and adapting it with the consent of the honest users, still can.  People can make value judgements ("this chain is good, that chain is an attack") which are very hard for an algorithm to make, especially when the attacker can see the algorithm.  Those value judgements are a liability— they're part of why traditional monies are untrustworthy— but if Bitcoin's security assumptions get broken by an overt attack I expect there would easily be universal consensus for some kind of manual intervention.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: Kupsi on April 25, 2013, 10:31:39 PM
Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?

I started a discussion on this back in February.

https://bitcointalk.org/index.php?topic=140695.0


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 26, 2013, 02:53:57 AM
I started a discussion on this back in february.

https://bitcointalk.org/index.php?topic=140695.0

Thanks for the link - don't know how I missed that.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: roalwe on April 28, 2013, 04:18:08 PM
Personally, I am not convinced it is needed.

There have been, so far, about two "emergencies" with the blockchain (the "generate a lot of bitcoins" bug, and the doublespend in the recent fork), neither of which was exploited by a real malicious party.

As far as I can tell, many major banks have a less stellar record.

It doesn't seem to me that bitcoin's "the hashiest chain wins" approach is "broken", so maybe we should refrain from "fixing" it.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: CIYAM on April 28, 2013, 04:30:25 PM
I do agree that we haven't seen such a threat - but it always is a nagging concern (damn it Satoshi are you sure you got it right?).

;D


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: roalwe on April 28, 2013, 04:56:34 PM
I do agree that we haven't seen such a threat - but it always is a nagging concern (damn it Satoshi are you sure you got it right?).

;D


Unless you run a bitcoin service with aspirations to eventual greatness (which  I do ;) ), doublespends that require immense hashrates should not be a concern for you at all (unless you happen to be good at making enemies among major pool operators :D )


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: TierNolan on April 28, 2013, 05:26:00 PM
It doesn't seem to me that bitcoin's "the hashiest chain wins" approach is "broken", so maybe we should refrain from "fixing" it.

I think the main point is that no scheme can prevent a complete reversal from the genesis block.  If you have 2 chains that fork at the genesis block, then you can only compare the total POW.

However, if they fork at a later point, you could use something like proof of stake, or burn, or whatever.  Only stakes from before the fork would count though.

If the coin value before the fork is distributed, then this is distributed checkpointing.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: roalwe on April 28, 2013, 06:43:42 PM
Errr. I thought dev-hardcoded checkpoints prevent a complete reversal even in the face of stupidly overwhelming adversary...


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: TierNolan on April 28, 2013, 06:57:11 PM
Errr. I thought dev-hardcoded checkpoints prevent a complete reversal even in the face of stupidly overwhelming adversary...

Yes, I mean if you didn't want central checkpointing.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: kjj on April 29, 2013, 03:16:39 AM
Personally, I am not convinced it is needed.

There have been, so far, about two "emergencies" with the blockchain (the "generate a lot of bitcoins" bug, and the doublespend in the recent fork), neither of which was exploited by a real malicious party.

As far as I can tell, many major banks have a less stellar record.

It doesn't seem to me that bitcoin's "the hashiest chain wins" approach is "broken", so maybe we should refrain from "fixing" it.

My issue was always that I didn't like the idea of a hidden chain attack.  If someone has a bunch of hashing power (very unlikely) and generates a chain offline, then publishes it, overturning a huge number of blocks, we pretty much just have to sit and watch.  Of course, we can then intervene after the fact to put it back.

But it seems like a better way would be to devise a scheme where the attacker would be unable to keep their longer chain secret.  That is why I like the exponential difficulty method.  Under ordinary circumstances, and even honest chain forks, the network would operate as usual.  But a high powered attacker is fighting against the clock, and people can look at the number of blocks protecting their transaction, calculate the exponential, and evaluate the risk with something more closely approaching certainty.

The cost is, however, that the notion of correctness gets a little fuzzy.  I still think that it is better to protect the network as a whole, in exchange for individual nodes needing manual intervention during some attacks.  Gmaxwell gives the opposing view, which is widely shared.  It is a really critical part of bitcoin, and won't be tweaked lightly, if ever.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: TierNolan on April 29, 2013, 08:55:33 AM
What about just defining a fork more than 10k blocks from the main chain as just that, a fork.  Have the client consider both alt chains and tell the user he needs to check the internet as to which one the community considers the real one.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: gmaxwell on April 29, 2013, 09:22:33 AM
What about just defining a fork more than 10k blocks from the main chain as just that, a fork.  Have the client consider both alt chains and tell the user he needs to check the internet as to which one the community considers the real one.
So, how do you prevent that from basically being the same as "everyone flip a coin"? I'm not sure what kind of guidance you could usefully give there.

I might, with enough effort and thinking into how you could do something useful with the instructions there, be convinced that there could be something useful along those lines— or at least, no more harmful than any other rearrangement of the Titanic's deckchairs— but then there is the issue that if someone has enough computing power to create a 10k reorg, they could constantly reorg 9999 blocks over and over again. "This isn't any better".

If it really is only a matter of picking which color tie the corpse of Bitcoin will wear— well, the software to implement anything like that itself has a cost (size, review bandwidth, vulnerability surface)— and I can't get too excited about something costly that doesn't make a real improvement. And I also think that's a general issue for these sorts of last-resort things: if they're only attractive in an already doomed outcome, it's hard to justify their cost.


Title: Re: Limits to accepting a new longest chain to prevent >50%
Post by: TierNolan on April 29, 2013, 09:36:39 AM
So, how do you prevent that from basically being the same as "everyone flip a coin"? I'm not sure what kind of guidance you could usefully give there.

"Everyone" would know which is the real chain.  It is saying that the rule in that case is that it requires manual discussion.  

You need to go to forums etc to say which is the "real" chain.

Quote
I might, with enough effort and thinking into how you could do something useful with the instructions there, be convinced that there could be something useful along those lines— or at least, no more harmful than any other rearrangement of the Titanic's deckchairs— but then there is the issue that if someone has enough computing power to create a 10k reorg, they could constantly reorg 9999 blocks over and over again. "This isn't any better".

Maybe they are willing to do it once, but not over and over. In fact, maybe even 10k is too high; perhaps the node could auto-checkpoint all blocks that are at least 24 hours old.

A 24 hour reversal inherently requires manual intervention.

You connect daily and your client tells you a reversal has happened and manual intervention is required.

95% of users would be on the historical chain.  The reversal would have to be short to not trigger the warning.
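This 24-hour soft-checkpoint behaviour might be sketched as follows (illustrative only, not any actual client's code):

```python
# Sketch of the 24-hour soft-checkpoint idea: the node refuses to
# reorganise past any block it has known for more than a day, and
# instead warns that manual intervention is needed.

DAY = 86_400

def handle_candidate(node_chain, first_seen, candidate, now):
    """first_seen[h] is when this node first saw its block at height h."""
    fork = 0
    while (fork < len(node_chain) and fork < len(candidate)
           and node_chain[fork] == candidate[fork]):
        fork += 1
    if len(candidate) <= len(node_chain):
        return "keep"        # not longer: nothing to do
    if any(now - first_seen[h] > DAY for h in range(fork, len(node_chain))):
        return "warn"        # reversal past a day-old block: ask the user
    return "reorg"           # recent, shallow reversal: follow automatically

now = 10 * DAY
chain = ["b0", "b1", "b2", "b3"]
seen  = [now - 3 * DAY, now - 2 * DAY, now - 3600, now - 60]
# A longer chain rewriting only the last hour's blocks is followed
# silently; one rewriting a two-day-old block triggers the warning.
print(handle_candidate(chain, seen, ["b0", "b1", "x2", "x3", "x4"], now))  # reorg
print(handle_candidate(chain, seen, ["b0", "y1", "y2", "y3", "y4"], now))  # warn
```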