Bitcoin Forum
Author Topic: On The Longest Chain Rule and Programmed Self-Destruction of Crypto Currencies  (Read 17953 times)
Cryddit
Legendary
*
Offline

Activity: 924
Merit: 1132


View Profile
July 17, 2014, 03:27:43 PM
 #101

I believe transactions-as-proof-of-stake (the heaviest subtree model) is probably the best alternative to proof-of-work - and it isn't all that good.

The basic idea is that the "finite resource" available for deciding to prefer one chain over another, is the set of unspent txouts that exist at the point of the chains' divergence from each other.  If transactions must give the hash of a very recent block (parent or grandparent to the block they'll be in) then they can be counted as a "vote" for a chain including that block. 

In practice, this makes it possible for an attacker to spend ten coins in one chain, then support a different chain by spending a thousand coins (probably to himself) there, and if the second chain is accepted it 'unspends' his ten coins.  Obviously this only works as a double spend if he does it before everybody else in the course of regular spending puts the first chain a thousand coins ahead. 

But it gets worse than that, because at any given moment there may be dozens or even hundreds of crooks looking for a chance to double spend, and if two competing chains appear, their efforts to make a small initial expenditure in the apparent leading chain and then dump a huge transaction into the second chain all reinforce each other. 

On one hand, if everybody understands the security requirement of transactions-as-proof-of-stake and regularly transacts their coins several times a day (which you can arrange with a proof-of-stake interest/security payment for each transaction), the crooks shouldn't be able to overwhelm that traffic with their timing games.  On the other, that would generate an absolutely enormous blockchain and impose a high communications overhead.

So in the short run, it doesn't work.  In the long run, it can provide an absolute security guarantee given enough time: once more than half of all the coins in txouts that existed before a block was created have been spent, that block becomes absolutely irrevocable, no matter what proof-of-work anybody pours on or what manipulations they do with spending and transactions.
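
To make that last rule concrete, here is a minimal sketch of the finality check (illustrative Python only; the function, the data structures, and the toy numbers are my own assumptions, not part of any actual implementation):

Code:
    def is_irrevocable(utxos_at_block, spent_since_block):
        """A block is irrevocable once more than half of the coin value that
        existed in unspent txouts when it was created has since been spent."""
        total_stake = sum(utxos_at_block.values())
        spent_stake = sum(value for txo, value in utxos_at_block.items()
                          if txo in spent_since_block)
        return 2 * spent_stake > total_stake

    # Toy example: three pre-existing outputs worth 10, 25 and 65 coins;
    # only the 65-coin output has been spent since the block appeared.
    utxos = {"txo_a": 10, "txo_b": 25, "txo_c": 65}
    print(is_irrevocable(utxos, {"txo_c"}))   # True, since 65 > 100 / 2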
DeathAndTaxes
Donator
Legendary
*
Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
July 17, 2014, 03:43:33 PM
Last edit: July 17, 2014, 09:11:05 PM by DeathAndTaxes
 #102

I believe transactions-as-proof-of-stake (the heaviest subtree model) is probably the best alternative to proof-of-work - and it isn't all that good.

Agreed.  One issue is that it makes risk analysis difficult: the simple rule of "wait for x confirmations and you are safe (unless the attacker has a majority of the hashrate)" no longer applies.
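
For comparison, that hashrate-based rule can at least be quantified.  A quick sketch (illustrative only, not anyone's production code) of the attacker catch-up probability from section 11 of the Bitcoin whitepaper:

Code:
    from math import exp, factorial

    def attacker_success_probability(q, z):
        """Chance that an attacker with hashrate fraction q ever catches up
        from z blocks behind (the calculation in section 11 of the whitepaper)."""
        p = 1.0 - q
        if q >= p:
            return 1.0                  # a majority attacker always catches up
        lam = z * (q / p)
        never_catches_up = sum((lam ** k) * exp(-lam) / factorial(k)
                               * (1.0 - (q / p) ** (z - k))
                               for k in range(z + 1))
        return 1.0 - never_catches_up

    print(attacker_success_probability(0.10, 6))   # roughly 0.0002 at 6 confirmations

With transactions-as-proof-of-stake there is no equally clean formula to plug confirmations into, which is exactly the risk-analysis problem.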

Quote
In the long run, it can provide an absolute security guarantee given enough time: once more than half of all the coins in txouts that existed before a block was created have been spent, that block becomes absolutely irrevocable, no matter what proof-of-work anybody pours on or what manipulations they do with spending and transactions.

One problem is that a large number of outputs have never been spent, and may not be spent for years or decades, so it could be a long time before a block became absolutely irrevocable.  The large amount of old unspent outputs creates uncertainty.  One variant would be to include only outputs which are below a certain age at the time of the block.  For example, you could say that for the purpose of block scoring, outputs older than one block month (4,320 blocks) aren't included in the score.  This would reduce the requirement to a majority of only the outputs less than a month old.
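
A rough sketch of that scoring variant (hypothetical structures, just to pin the idea down):

Code:
    BLOCK_MONTH = 4320   # roughly one month of blocks at 10 minutes per block

    def stake_score(block_height, spent_txouts, creation_height, value):
        """Sum the value of spent txouts that were under one block month old
        at block_height; older outputs contribute nothing to the score."""
        score = 0
        for txo in spent_txouts:
            age = block_height - creation_height[txo]
            if 0 <= age < BLOCK_MONTH:
                score += value[txo]
        return score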

Still, it will require some careful analysis to avoid unexpected weaknesses.  For as complex as Bitcoin is in implementation, it is rather simple (maybe elegant is a better word) in design, and even that simple design has nuances and gotchas.  More complexity may not be the answer.
jonald_fyookball
Legendary
*
Offline

Activity: 1302
Merit: 1008


Core dev leaves me neg feedback #abuse #political


View Profile
July 17, 2014, 07:46:19 PM
 #103

The heaviest subtree model?   What's that? 
I've not heard that one yet.

AnonyMint
Hero Member
*****
Offline

Activity: 518
Merit: 521


View Profile
July 17, 2014, 10:21:27 PM
Last edit: July 17, 2014, 10:33:31 PM by AnonyMint
 #104

If I have misinterpreted his writings, he will I assume point that out.
You have, but I've given up responding.

Nice to see technical discussion has been reduced to politics. Smells like the typical "not invented here, so ignore it" phenomenon of vested interests (or I don't want to help the competition).

What is the point of technical discussion if you are not going to make the necessary clarifications?

I believe my interpretation is complete. It is up to you to show otherwise, or give up if you can't/won't.

unheresy.com - Prodigiously Elucidating the Profoundly Obtuse | THIS FORUM ACCOUNT IS NO LONGER ACTIVE
AnonyMint
Hero Member
*****
Offline

Activity: 518
Merit: 521


View Profile
July 17, 2014, 10:30:46 PM
Last edit: July 17, 2014, 10:51:50 PM by AnonyMint
 #105

I believe transactions-as-proof-of-stake (the heaviest subtree model) is probably the best alternative to proof-of-work - and it isn't all that good.

Agreed.  One issue is that it makes risk analysis difficult: the simple rule of "wait for x confirmations and you are safe (unless the attacker has a majority of the hashrate)" no longer applies.

I don't know if I have missed some discussion that would have changed the understanding I formed, but I pointed out egregious flaws in the original Transactions as Proof-of-Stake proposal.

The fundamental math problem with using any metric from the block chain (or any consensus voting such as proof-of-stake) is that the input entropy can be gamed deterministically, unlike proof-of-work, which is a randomized process; i.e. the input entropy is not orthogonally unbounded as it is in the randomization of proof-of-work.

unheresy.com - Prodigiously Elucidating the Profoundly Obtuse | THIS FORUM ACCOUNT IS NO LONGER ACTIVE
AnonyMint
Hero Member
*****
Offline

Activity: 518
Merit: 521


View Profile
July 17, 2014, 11:57:02 PM
 #106

Academics have produced nothing but perfect nonsense on the topic of Bitcoin. This is one of the worst.

Nice "not invented here" bravado but incorrect.


He's also right about the effects of block reward halving on hash power allocation.

No he isn't, or at least his conclusions about what "will" happen are just speculation.

...

a) continue to mine bitcoin for half the revenue
b) sell the hardware to a miner with lower costs (namely cheaper/free electricity and cool climate)
c) mine an altcoin.

The author jumps right to c.

The author might have the wrong justification for the conclusion, but the "Programmed Self-Destruction" conclusion is not speculation: it is self-evident that investors spend only a small fraction of their income, so if you don't redistribute the currency, its use dies [1].

[1] https://bitcointalk.org/index.php?topic=597878.msg7900846#msg7900846

unheresy.com - Prodigiously Elucidating the Profoundly Obtuse | THIS FORUM ACCOUNT IS NO LONGER ACTIVE
Cryddit
Legendary
*
Offline

Activity: 924
Merit: 1132


View Profile
July 21, 2014, 11:08:29 PM
 #107

The heaviest subtree model?   What's that? 
I've not heard that one yet.

In any decision about which of two potential branches to accept, the transactions-as-proof-of-stake method (aka heaviest-subtree model) prefers the branch whose transactions have spent the greatest proportion of the txouts that existed at the moment when those two branches diverged. 

In order for this to work, transactions must belong clearly to one branch or the other.  So the transaction itself has to have a block ID embedded in it, and it counts as "support" for the branch that includes that block ID. 

So the 'finite resource' that must be used up to support a branch and which, if used, cannot also be used to support another branch, is coins in txouts, not hashing power.  And the people who deserve the coins for helping to secure the blockchain are everybody who made a transaction using their stake, rather than whoever came up with the winning hash. 
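
A sketch of that fork-choice rule (hypothetical data structures, purely illustrative):

Code:
    def branch_support(branch_txs, utxos_at_fork):
        """Coin value of fork-point txouts spent by the transactions in a branch.
        Assumes branch_txs contains only transactions whose embedded block ID
        lies in this branch, so they count as votes for it."""
        return sum(utxos_at_fork[txo]
                   for tx in branch_txs
                   for txo in tx["spends"]
                   if txo in utxos_at_fork)

    def prefer_branch(branch_a_txs, branch_b_txs, utxos_at_fork):
        """Prefer the branch whose transactions spent the larger share of the
        txouts that existed at the point of divergence."""
        a = branch_support(branch_a_txs, utxos_at_fork)
        b = branch_support(branch_b_txs, utxos_at_fork)
        return "A" if a >= b else "B"

The 'finite resource' is the coin value in utxos_at_fork: a txout spent in one branch cannot also add support to the other.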

andikarl
Newbie
*
Offline

Activity: 10
Merit: 0


View Profile
September 08, 2014, 07:01:00 PM
 #108

@all

A very interesting paper, but I don't think that cryptocurrencies will self-destruct if we can make the right adjustments.
I have thought about a way to guarantee the decentralization of a cryptocurrency, or of Bitcoin, in the long term. I have written an article and would like to hear what you think about it:
http://techreports2014.wordpress.com/2014/09/07/fundamentals-of-a-possible-new-bitcoin-fork-bitcoin-2-0/

Hope it can help, and I would love to discuss it with you guys.

Greetings.
Andrew
andytoshi
Full Member
***
Offline

Activity: 179
Merit: 151

-


View Profile
September 08, 2014, 09:22:03 PM
 #109

A very interesting paper, but I don't think that cryptocurrencies will self-destruct if we can make the right adjustments.
I have thought about a way to guarantee the decentralization of a cryptocurrency, or of Bitcoin, in the long term. I have written an article and would like to hear what you think about it:
http://techreports2014.wordpress.com/2014/09/07/fundamentals-of-a-possible-new-bitcoin-fork-bitcoin-2-0/

Hope it can help, and I would love to discuss it with you guys.

Hi Andrew,

In future it is better to create a new thread rather than resurrecting an old one, especially one as vivacious as this one.

As to the content of your article, I briefly skimmed it. A few comments: first, your concerns about ASIC monopolies are largely addressed in my ASICs and Decentralization FAQ; second, the "anti-monopoly" scheme by Sirer and Eyal is seriously and fundamentally broken by being progress-free. It seems to me that these authors are more concerned with promoting themselves with doomsday headlines than with getting the fundamentals of what they write about correct, and it's best for the Bitcoin world if they are not given attention.

Andrew
andikarl
Newbie
*
Offline

Activity: 10
Merit: 0


View Profile
September 09, 2014, 01:06:14 PM
 #110



Hi Andrew,

In future it is better to create a new thread rather than resurrecting an old one, especially one as vivacious as this one.

As to the content of your article, I briefly skimmed it. A few comments: first, your concerns about ASIC monopolies are largely addressed in my ASICs and Decentralization FAQ; second, the "anti-monopoly" scheme by Sirer and Eyal is seriously and fundamentally broken by being progress-free. It seems to me that these authors are more concerned with promoting themselves with doomsday headlines than with getting the fundamentals of what they write about correct, and it's best for the Bitcoin world if they are not given attention.

Andrew


Hi Andrew,

thank you for your reply, and thank you for your FAQ; it explains a lot of things pretty well, especially why ASIC miners are so important for validating the blockchain in the long term. On the other hand, in my article I have not ruled out the use of ASIC miners; it is only possible future monopolies controlling the network that I see as a threat to Bitcoin. Please read the part of my article which deals with the position stamp for guaranteeing the decentralization of Bitcoin. By decentralization I do not mean that in the future there should not be any mining pools or big miners. On the contrary, I mean a safeguard implemented into the blockchain to counter attacks and to detect monopolies. My suggestions would also affect the flow of hardware, but that we would have to discuss in detail.

I think I will open a new thread about this topic.

Cheers,
Andrew
UnunoctiumTesticles
Full Member
***
Offline

Activity: 154
Merit: 100


View Profile
November 29, 2014, 08:49:31 PM
 #111

If I have misinterpreted his writings, he will I assume point that out.
You have, but I've given up responding.

Btw, a couple of months ago I solved the selfish-mining problem and proved to myself that gmaxwell was wrong. Here I quote a snippet from my private design document:

Code:
   3. Sharing revenue proportionally with the fork(s) of lower cumulative
      difficulty entirely defeats selfish mining (positive relative revenue) for
      any α. All blocks in the merged forks receive their full block reward
      adjusted by relative difficulty.

         r_others = p_0(1-α) + p_1(1-α) + p_2(1-α) + p[k>2](1-α), cases (e), (f), (g), and (h)
         r_pool = p_1(1-α)/2 + p_2(1-α)2 + p[k>2](1-α)/2, cases (f), (g), and (h)

         r_others = p_1(1-α)(1/α + 1 + α/(1-α) + (α-1)/(2α-1) - 1 - α/(1-α))
         r_pool = p_1(1-α)(1/2 + 2α/(1-α) + (α-1)/(4α-2) - 1/2 - α/(2-2α))

         R_pool = (3α/(2-2α) + (α-1)/(4α-2))/(1/α  + 3α/(2-2α) + 3(α-1)/(4α-2))

      Plot the following at wolframalpha.com.

         (3α/(2-2α) + (α-1)/(4α-2))/(1/α  + 3α/(2-2α) + 3(α-1)/(4α-2)),  α, 0 < α < 0.5

      However, merging forks of lower cumulative difficulty would remove the
      economic incentive to propagate forks as fast as possible; and issuing a
      double-spend would be trivial. The solution is to only merge forks of
      lower cumulative difficulty when they are known by the majority before
      receiving a fork of higher cumulative difficulty. Thus an attacker or
      network hiccup that creates a hidden higher cumulative difficulty fork
      has an incentive to propagate it before that fork falls behind the
      majority. A node assumes it is part of the majority if it is participating
      non-selfishly; it will attempt to merge any lower cumulative difficulty
      forks it was aware of upon receiving a higher cumulative difficulty fork.
      If it is not part of the majority, then its attempt will not be accepted.
      If there are double-spends at stake, each node will continue to try for
      the longest fork miners wish to be able to merge, i.e. longer forks with
      double-spends won't be merged so the market will decide the value of the
      two forks. If there are no double-spends at stake then to defeat selfish
      mining, each node will continue to try during the duration of the longest
      fork probable for an attacker α < 0.5[6]. If a node selfishly forsakes its
      obligation and joins the attacker, that node attacks the value of the
      block rewards it earned.

      Also, forfeiting double-spends defeats any incentive to temporarily rent
      greater than 50% of the network hashrate because the persistent honest
      miners will forfeit the double-spends from the attacker’s fork upon it
      relinquishing control. Compared to Bitcoin, there is no increased
      incentive to double-spend, because miners will not place into any block a
      double-spend they’ve already seen; double-spends can only appear in forks.
      In Bitcoin one fork wins so one transaction from each of the double-spend
      is forfeited so the honest recipient loses when the attacker’s fork wins.
      In this new algorithm both of the transactions from each double-spend are
      forfeited, so both the honest recipient and the attacker lose.

      Defeating short-term rented hashrate attacks in conjunction with only
      accepting inputs to a transaction which are as old as the longest duration
      fork that will be merged, proves the probability of a double-spend[6] to
      be determined solely by the number of block confirmations no matter how
      fast the block period is.

      Instead of monolithically choosing a winning fork, forfeiting double-
      spends deletes selective downstream transactions. Thus fully opaque block
      chains such as Zerocash are thought to be incompatible because they not
      only obscure which transaction consumed its inputs but also hide any coin
      identifier that could correlate spends on separate forks. Cryptonote’s
      one-time ring signatures are more flexible because mixes could choose to
      mix only with sufficiently aged transaction outputs.

      Each block of the Proof Chain contains a tree of block hashes, which are
      the forks that were merged and a branch in a subsequent block may continue
      a branch in a prior block. The restricted merging rule, double-spend
      forfeiture and proof trees avoid the risks of Ethereum’s proposed
      approximation[5]. Miners have to maintain transaction history for the
      longest fork they wish to merge. And downstream consumers of inputs must
      wait for the longest fork they anticipate[6].


   [5] https://blog.ethereum.org/2014/07/11/toward-a-12-second-block-time/
       Now, we can’t reward all stales always and forever; that would be a
       bookkeeping nightmare (the algorithm would need to check very diligently
       that a newly included uncle had never been included before, so we would
       need an “uncle tree” in each block alongside the transaction tree and
       state tree) and more importantly it would make double-spends cost-free...
       Specifically, when only the main chain gets rewarded there is an
       unambiguous argument...Additionally, there is a selfish-mining-esque
       attack against single-level...The presence of one non-standard strategy
       strongly suggests the existence of other, and more exploitative,
       non-standard strategies...True. We solved the selfish mining bugs of
       traditional mining and 1-level GHOST, but this may end up introducing
       some new subtlety; I hinted at a couple potential trouble sources in the
       post.
   [6] Meni Rosenfeld. Analysis of hashrate-based double-spending.
       Section 5 Graphs and analysis.
       http://arxiv.org/pdf/1402.2009v1.pdf#page=10
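
For reference, the R_pool expression quoted above can also be evaluated directly rather than at wolframalpha.com; this throwaway sketch simply transcribes that formula and makes no claim beyond it:

Code:
    def r_pool(a):
        """Relative pool revenue R_pool(α) exactly as written in the quoted
        design document, as a function of the pool's hashrate fraction α."""
        num = 3 * a / (2 - 2 * a) + (a - 1) / (4 * a - 2)
        den = 1 / a + 3 * a / (2 - 2 * a) + 3 * (a - 1) / (4 * a - 2)
        return num / den

    for a in (0.10, 0.25, 0.33, 0.45):
        print(f"alpha = {a:.2f}  R_pool = {r_pool(a):.4f}")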