jonny1000 (OP)
Member
Offline
Activity: 129
Merit: 14
|
|
December 24, 2015, 04:30:00 PM Last edit: December 26, 2015, 11:33:18 AM by jonny1000 |
|
A new software implementation of Bitcoin has been released, called “Bitcoin Unlimited” ( http://www.bitcoinunlimited.info ). Admittedly I do not fully understand the proposal, which aims to allow the blocksize limit to move based on signals about the ability to accept larger blocks. The relevant quotes are included below:

3. Addition of an Unlimited Dialog / command line option to change the default block “accept” size. Blocks larger than this (excessive blocks) will only be accepted if they are N deep in the blockchain. This will be 16MB by default.
4. Addition of an Unlimited Dialog / command line option to set the excessive block accept depth. This is the N parameter in the description of #3. The default value will be 4.
Source: https://bitco.in/forum/threads/buip001-unlimited-inspired-extensions-to-the-bitcoin-client.222

Within the Bitcoin Unlimited proposal I will talk about three key numbers:
- L = the blocksize limit at which miners produce blocks; the default is 1MB
- M = the blocksize limit for incoming blocks, which will be accepted as valid if the lead is large enough; the default is 16MB
- N = the lead required over the chain with blocksize limit L in order to accept the larger blocks as valid; the default is 4
The above elements of the Bitcoin Unlimited proposal seem to undermine the principle of a confirmation and partially invalidate the longest-chain-rule principle behind Bitcoin. At the same time I do not see how this proposed methodology converges on a single version of the blockchain. Let me consider some scenarios below, all of which set aside the fact that both N and the default blocksize limit are locally customizable, creating even more divergence about the one true chain.

Scenario 1 – Upgrade to larger blocks

In this example, L=1MB, M=16MB and N=4. There is then a desire to produce a 2MB block. A miner therefore produces a 2MB block B. Most nodes by default reject 2MB block B as it only has one confirmation. Two chains are created and, in the example image below, the 1MB block B receives 4 confirmations before the 2MB chain eventually gets a 4-block lead; since N=4, this chain is now considered valid. Transactions which had received 4 confirmations can now be double spent.

However, this upgrade system is not convergent at all; there is no reason why this battle between the 2MB chain and the 1MB chain should be resolved quickly. It could go on for many blocks (more than 4). The two chains in this example could contain conflicting transactions, and this "upgrade" process can encourage double spends, with the meaning of a confirmation hugely depreciated.

Scenario 2 – Failed upgrade to larger blocks

Consider the above example again, except the next block to be found is a 1MB block G. Once this block is found, nodes would reject the 2MB G, H, I & J blocks, since the 2MB chain no longer has a 4-block lead. This means that finding only one block can invalidate 4 confirmations. This attack could be very attractive for those wishing to do double spends. The whole consensus property of Bitcoin is destroyed.

Unless anyone can tell me why this analysis is incorrect, I would recommend the Bitcoin Unlimited team remove this "excessive block depth" idea from the software and proposal.
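To check that I am reading points 3 and 4 correctly, here is a minimal sketch of the acceptance rule as I understand it. The names are my own invention, not Bitcoin Unlimited's actual code, and in my scenarios above I am effectively assuming nodes have set their accept size down to 1MB:

Code:
# A minimal sketch (not Bitcoin Unlimited's actual code) of the
# "excessive block" rule quoted in points 3 and 4 above.
# accept_size corresponds to M and accept_depth to N as defined in this post.

def block_is_acceptable(block_size, blocks_built_on_top,
                        accept_size=16_000_000,   # M (bytes)
                        accept_depth=4):          # N
    """A block no larger than the accept size is valid immediately.
    An "excessive" block (larger than the accept size) is only treated
    as valid once at least N blocks have been built on top of it."""
    if block_size <= accept_size:
        return True
    return blocks_built_on_top >= accept_depth

# In Scenario 1, a node that has set accept_size down to 1MB would get
# block_is_acceptable(2_000_000, 0, accept_size=1_000_000) -> False,
# and only flip to True once the 2MB chain is 4 blocks ahead.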
|
|
|
|
Zangelbert Bingledack
Legendary
Offline
Activity: 1036
Merit: 1000
|
|
December 24, 2015, 08:11:01 PM |
|
My response from there:

My first impression is that your points are based on a misapprehension of how consensus on blocksize is intended to be reached in BU. I think you have seen N (the oversized-block acceptance depth) as the mechanism for reaching consensus,* when it's really more of a failsafe for miners who don't keep abreast of the situation on the network as well as they should if they want maximum profits. Miners who are paying attention should have their cap set substantially above the normal maximum blocksize as seen on the network; I see N as a bit of a "gimme" to small-block advocates who want to make some sacrifices to discourage large blocks, as N allows them to be a little bit riskier about it (they can safely set their max acceptance size M a bit lower) while still tracking consensus in most situations.

However, I think you may have misinterpreted the point of this: N wasn't really necessary for BU to work. The original idea was to have blocksize be unlimited, then user-selectable blocksize was added as a concession to small-block advocates who worry about big-block attacks, and N was largely a way to make life a little easier for those small-block advocates (BU engineers please correct me if I'm wrong).

"However, this upgrade system is not convergent at all, there is no reason why this battle between the 2MB chain and 1MB chain should be resolved quickly. This could occur over many blocks (greater than 4)."

Such a long-running duel assumes something near a 50/50 split of hash power between the chains, or else they would resolve more quickly, correct? It seems the situation with Core is the same if Core were to try to upgrade to 2MB blocks when miner support was near a 50/50 split on the issue. I know, they won't do this, instead waiting for "overwhelming consensus" like 95% or something, thus avoiding the terrible 50/50 outcome. However, here I think you are seeing the illusion that the tail is wagging the dog. You apparently view Core's policy of waiting for 95% consensus as the means by which a 50/50 split is avoided, as if miners are all a bunch of robots.

Since miners aren't robots but in fact people who are dead-set on maximizing their profits, they certainly would not mine or build on an over-limit block (blocksize > L) unless they believed that would be a profitable choice. That would only be a profitable choice if it were unlikely to result in their block being orphaned. If a miner is rational, which is the governing assumption of Bitcoin in the first place, this profit/loss calculation will of course take into account the very same 95% that Core is looking for, except the miner is free from that market intervention** and can choose his own threshold percentage while also balancing it with whatever other factors he deems relevant to the profit/loss calculation in his own individual situation - with his own unique power costs and connectivity and the transaction fees in the mempool at the time - to dynamically determine the point at which he himself has an expected positive return on mining a bigger block (see the toy calculation at the end of this post). The first miner to mine a bigger block can be assumed to have performed such a calculation as a profit-maximizing agent, and then others may follow suit by raising their limits.

Why would miners be monitoring the network that closely? Perhaps they aren't now, but they will have to in the future if they want to stay profitable, because their competitors will. They will do so when blockspace becomes limited enough and fees become high enough that their expected return on mining a block with some extra juicy fees overshadows the orphan probability based on various factors, including the XX% miner signaling.

Centrally planning the switchover at 95% is economically suboptimal and unnecessary. The Core devs imagine that without their paternalistic setting of that 95% threshold, miners would be unable to make the calculation for themselves - even though they are privy to the very same information, and in real time at that! And again, Core devs cannot know the profit/loss calculations that apply to each miner's individual situation. This is akin to Soviet-style price fixing and falls afoul of the Economic Calculation Problem as explained by Ludwig von Mises, or the problems with the use of knowledge in society as explained by F.A. Hayek: no central planner can know all the individual valuations and tradeoffs each person in the economy will want to make. The one-size-fits-all approach destroys the whole idea of the division of labor, the mainspring of human progress for the past 6000 years.

Core's one-size-fits-all decrees are NOT where consensus comes from; consensus comes from miners not being idiots.*** The miners are the dog and the devs are the tail. Just like governments, they do something like require seatbelts in cars right as the auto industry is putting in seatbelts on its own, and imagine that they - the wise overseers - are the source of auto safety. The tail imagines it wags the dog. Everyone can see the decree and the effect, but few can see that the same thing - or probably something far less clunky - would have happened without the decree. If a miner has a positive expected return on an oversize block, he will mine it even if it does end up orphaned. Other miners can see this and adjust accordingly, especially once they start getting outcompeted by accepting fewer fees than they could be. Miners stay in consensus because they are econo-rational, not because of developer nannyism.

I would therefore expect blocksize to creep up very conservatively, and barring miner agreement on a flag-day upgrade - which is always a possibility even with BU, except they can do it without messing around with the Core devs - this will only happen when there are enough fees to warrant the orphan risk, as well as a high percentage of miners signalling support for a bigger blocksize. It would probably move up in some kind of Schelling-point increments like 2MB.

* Unless you think BU is good and your point is simply that BU shouldn't have the "excessive block depth." That may indeed be, and I would have to think about your scenarios more.
** Which is, after all, nothing more than the inconvenience of adjusting that 95% threshold in the Core code oneself or finding someone to do it for you.
*** Some might say that miners really are idiots, but this violates the basic assumption of Bitcoin, that miners are economically rational. Sure, some miners now might not have to be very smart about their operations, but in the future this will have to change. Right now being dumb in some ways as a miner doesn't make you economically irrational, but in the future where forking and emergent coordination must occur, it will. Miners that refuse to monitor the network and make prudent profit/loss calculations will be outcompeted by those who do.
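To spell out the kind of profit/loss calculation I mean, here is a toy sketch - my own illustrative model and numbers, not anything taken from actual mining software:

Code:
# Toy expected-value calculation for mining an over-limit block,
# as described above. Numbers are illustrative only.

def marginal_gain_from_bigger_block(block_reward, extra_fees, orphan_probability):
    """Expected extra revenue (can be negative) from mining a bigger
    block that carries `extra_fees` more in fees but risks being
    orphaned with probability `orphan_probability`, versus mining a
    conservative block that everyone accepts."""
    expected_big = (block_reward + extra_fees) * (1.0 - orphan_probability)
    expected_safe = block_reward
    return expected_big - expected_safe

# With a 25 BTC subsidy and 0.8 BTC of extra fees:
#   at a 5% estimated orphan risk the gain is about -0.49 BTC (don't do it),
#   at a 2% estimated orphan risk it is about +0.28 BTC (mine the big block).
# Each miner plugs in his own orphan estimate, which depends on how much
# hash power he believes will accept the larger block, his connectivity, etc.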
|
|
|
|
BitcoinNewsMagazine
Legendary
Offline
Activity: 1806
Merit: 1164
|
|
December 24, 2015, 09:42:39 PM |
|
jonny1000, thanks for starting a thread on Bitcoin Unlimited. I too am still reading all the relevant posts I can find. It appears to be run by Andrew Stone (theZerg) with help from fewer than 20 others, and is XT code with BIP 101 removed. I would like to hear gmaxwell weigh in. Roger Ver added Bitcoin Unlimited to the list of full node wallets at www.bitcoin.com, so expect more discussion.
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
December 24, 2015, 10:18:15 PM |
|
FWIW, as someone who has been sitting on the fence concerning the Core/XT dispute, the argument presented by Peter R in his paper "A Transaction Fee Market Exists Without a Block Size Limit" has won me over. Bitcoin Unlimited is the way to go forward, and I hope to run a node soon.
|
“God does not play dice"
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 25, 2015, 04:58:19 AM |
|
BU's Articles of Federation have some very positive energy, and the fee market regulated by orphan rate is worth debating. I don't like the President stuff, though.
|
|
|
|
RoadTrain
Legendary
Offline
Activity: 1386
Merit: 1009
|
|
December 25, 2015, 10:53:11 AM |
|
And the fee market regulated by orphan rate is worth debating.
And it has been debated, and the theory promoted by Peter R didn't hold well. Spherical cows stuff. I didn't check if he revised his paper since then, though.
|
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 25, 2015, 01:41:28 PM |
|
And the fee market regulated by orphan rate is worth debating.
And it has been debated, and the theory promoted by Peter R didn't hold well. Spherical cows stuff. I didn't check if he revised his paper since then, though.

I think Peter R did not consider the fact that the block size limit is first and foremost a spam filter, and attackers never follow market-based behavior.
|
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 25, 2015, 02:19:56 PM |
|
It seems the whole bitcoin ecosystem is like a country: you cannot fork at will (that would immediately destroy the weaker fork economically). So some of the concepts of managing a country can be borrowed.

Separation of powers: you vote for legislation (BIPs), someone else ensures the implementation (coding and commits), and someone else works as judge (miners).
https://docs-of-freedom.s3.amazonaws.com/uploads/image/attachment/37/Ch_1_-_separation_of_power.jpg

The key here is that the BIPs must be based on a user vote, not decided by programmers. It is similar in today's IT industry: you receive change requests or trouble reports from end users, hold meetings to decide whether they are acceptable, and then assign them to programmers to implement.

So, if lots of users complain about high fees and low transaction capacity, corresponding BIPs can be drafted and put to a user vote - not like today, where anyone writes their own BIP and no one else cares.
|
|
|
|
BitcoinNewsMagazine
Legendary
Offline
Activity: 1806
Merit: 1164
|
|
December 25, 2015, 06:12:30 PM |
|
Based on this thread, Bitcoin Unlimited seems to have the purpose of keeping XT alive.
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
December 25, 2015, 07:48:47 PM |
|
And the fee market regulated by orphan rate is worth debating.
And it has been debated, and the theory promoted by Peter R didn't hold well. Spherical cows stuff. I didn't check if he revised his paper since then, though.

I've shown that the fee market exists if:
1. Bitcoin's inflation rate is nonzero;
2. More than one miner or mining pool exists;
3. Large blocks take longer to propagate than smaller blocks.

I believe I will be able to remove #1 based on the current research I'm doing. #2 will always exist (Bitcoin is always susceptible to 51% attack). (A stripped-down sketch of how #1 and #3 produce the fee market follows the references below.) There is empirical evidence for #3:
- Stone, G. A. “An Examination of Bitcoin Network Throughput Via Analysis of Single Transaction Blocks.” No Publisher (2015). http://www.bitcoinunlimited.info/1txn
- “Bitcoin Network Capacity Analysis – Part 6: Data Propagation.” TradeBlock Blog (23 June 2015). https://tradeblock.com/blog/bitcoin-network-capacity-analysis-part-6-data-propagation
- Decker, C. and Wattenhofer, R. “Information Propagation in the Bitcoin Network.” 13th IEEE International Conference on Peer-to-Peer Computing, Trento, Italy, September 2013.
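For anyone who wants the gist without reading the paper, here is a stripped-down numerical sketch of how #1 and #3 combine to give block space a nonzero price. It is a toy version of the model with made-up numbers, not the full derivation in the paper:

Code:
import math

def orphan_probability(block_size_bytes, propagation_rate_bytes_per_s,
                       block_interval_s=600.0):
    """Toy model: with Poisson block arrivals, a block is orphaned if a
    competitor is found while it is still propagating, giving
    P_orphan = 1 - exp(-tau / T), where tau is the propagation time."""
    tau = block_size_bytes / propagation_rate_bytes_per_s
    return 1.0 - math.exp(-tau / block_interval_s)

def marginal_cost_of_tx(tx_size_bytes, block_size_bytes, block_reward_btc,
                        propagation_rate_bytes_per_s):
    """Expected BTC lost by making the block `tx_size_bytes` larger:
    the increase in orphan risk times the reward at stake. A rational
    miner only includes the transaction if its fee exceeds this."""
    p_before = orphan_probability(block_size_bytes, propagation_rate_bytes_per_s)
    p_after = orphan_probability(block_size_bytes + tx_size_bytes,
                                 propagation_rate_bytes_per_s)
    return (p_after - p_before) * block_reward_btc

# Example (made-up propagation rate): a 500-byte transaction added to a
# 1 MB block with 25 BTC at stake and ~100 kB/s effective propagation
# costs roughly 0.0002 BTC in extra orphan risk, so the miner demands at
# least that much in fees - block space has a nonzero price even with no limit.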
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
December 25, 2015, 07:52:01 PM |
|
I think Peter R did not consider the fact that the block size limit is first a spam filter, and the attackers never follow the market based behavior
This is exactly what I considered in Section 9 (page 11), where I calculated the cost of spam attacks at various values of the network propagation impedance. Recent research by Dr. Nicolas Houy and G. Andrew Stone has shown that those estimates are actually conservative (the cost of large spam blocks is higher still if you use a game-theoretic model rather than the functional model I used).
|
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 25, 2015, 08:46:03 PM |
|
I think Peter R did not consider the fact that the block size limit is first a spam filter, and the attackers never follow the market based behavior
This is exactly what I considered in Section 9 (page 11). I was able to calculate the cost of spam attacks at various values for the network propagation impedance:

It has nothing to do with cost. An attacker has only one purpose: to disable the network so that no transactions can be done at all. To achieve this, they are more likely to use the coinwallet.eu spam trick - combining thousands of small outputs, linked to thousands of small outputs, in one single transaction, and making that transaction as large as the block. In that case, a 3.2MB block would take 10 minutes for a decent server to verify, and an 8MB block would take several hours, which means the nodes would fail to keep up with new block generation and the network would split into segments.

Currently there is a 1MB limit, so the worst-case scenario is about 30 seconds, which still won't cause a panic in the network, but a blocksize without limit would definitely cause serious problems.
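A rough sketch of why the verification time blows up like that (a simplified back-of-the-envelope model, not a benchmark - the real numbers depend on hardware and transaction structure):

Code:
# With the legacy signature-hash scheme, verifying each input re-hashes
# almost the entire transaction, so for one giant transaction the total
# hashing work grows roughly with the square of its size.

def relative_verification_work(tx_size_mb, bytes_per_input=200.0):
    """Rough proxy for verification work: (number of inputs) x (bytes
    hashed per input), with each input hashing ~the whole transaction."""
    tx_bytes = tx_size_mb * 1_000_000
    num_inputs = tx_bytes / bytes_per_input
    return num_inputs * tx_bytes

base = relative_verification_work(1.0)
for size_mb in (1.0, 3.2, 8.0):
    ratio = relative_verification_work(size_mb) / base
    print(f"{size_mb} MB tx -> ~{ratio:.0f}x the work of a 1 MB tx")

# Prints ~1x, ~10x and ~64x: whatever a worst-case 1MB block costs to
# verify, an 8MB one costs dozens of times more, on top of taking longer
# to propagate.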
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
December 25, 2015, 09:49:13 PM |
|
I think Peter R did not consider the fact that the block size limit is first a spam filter, and the attackers never follow the market based behavior
This is exactly what I considered in Section 9 (page 11). I was able to calculate the cost of spam attacks at various values for the network propagation impedance:

It has nothing to do with cost An attacker has only one single purpose that is to disable the network so that no transaction can be done at all....

If an attacker had unlimited funds to spend, then I would agree that he might eventually succeed in injecting a large block if there were no block size limit (and nodes attempted to process all blocks regardless).

I'm not sure if people realize this, but there is a block size limit in Bitcoin Unlimited. It's just that it is an emergent phenomenon of the network and the decisions of node operators and miners. I'm getting a lot of questions about how this works, so I hope to put together some convincing visuals and documentation over the coming months.

Bitcoin Unlimited believes that the protocol should be governed through a "bottom up" organic process. On the other hand, Bitcoin Core believes that the protocol should be governed by a select group of developers through a "top down" process.
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
December 25, 2015, 10:07:22 PM |
|
I'm not sure if people realize this, but there is a block size limit in Bitcoin Unlimited. It's just that it is an emergent phenomenon of the network and the decisions of node operators and miners. I'm getting a lot of questions about how this works, so I hope to put together some convincing visuals and documentation over the coming months.
I take it that it emerges as a consequence of the key variables L, M and N of the BU proposal, M being the most prominent factor. I'm very curious to see if this can be formalized and boiled down to an equation (I've put a rough first attempt at the end of this post). Bitcoin Unlimited believes that the protocol should be governed through a "bottom up" organic process. On the other hand, Bitcoin Core believes that the protocol should be governed by a select group of developers through a "top down" process.
There needs to be a sufficient degree of decentralization in order for this to be true. And allowing the block limit to float freely will intensify the centralizing tendency. But it is clear that the "developers as rulemakers" model is even more at odds with decentralization.
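Here is the rough first attempt at a formalization I mentioned above - purely my own toy model, nothing from the BU specification. If you knew each miner's hash power share and the excessive-block size each one accepts, the emergent limit could be read off as the largest size that a majority of hash power still builds on:

Code:
def emergent_block_limit(miners, hashpower_threshold=0.5):
    """Toy formalization of the "emergent" limit.
    miners: list of (hashpower_share, accept_limit_bytes) tuples.
    Returns the largest block size that at least `hashpower_threshold`
    of the hash power is willing to accept and build on."""
    candidate_sizes = sorted({limit for _, limit in miners}, reverse=True)
    for size in candidate_sizes:
        accepting_share = sum(share for share, limit in miners if limit >= size)
        if accepting_share >= hashpower_threshold:
            return size
    return 0

# Example: 40% of hash power accepts up to 2 MB, 35% up to 8 MB, 25% only 1 MB.
# emergent_block_limit([(0.40, 2_000_000), (0.35, 8_000_000), (0.25, 1_000_000)])
# returns 2_000_000: 75% of hash power accepts 2 MB blocks but only 35% accepts
# 8 MB, so blocks above 2 MB would usually be orphaned. In reality L, N, node
# policy and fees all feed back into where miners set these values.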
|
“God does not play dice"
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 26, 2015, 11:55:26 PM |
|
I think Peter R did not consider the fact that the block size limit is first a spam filter, and the attackers never follow the market based behavior
This is exactly what I considered in Section 9 (page 11). I was able to calculate the cost of spam attacks at various values for the network propagation impedance:

It has nothing to do with cost An attacker has only one single purpose that is to disable the network so that no transaction can be done at all....

If an attacker had unlimited funds to spend, then I would agree that he might eventually succeed in injecting a large block if there were no block size limit (and nodes attempted to process all blocks regardless). I'm not sure if people realize this, but there is a block size limit in Bitcoin Unlimited. It's just that it is an emergent phenomenon of the network and the decisions of node operators and miners. I'm getting a lot of questions about how this works, so I hope to put together some convincing visuals and documentation over the coming months. Bitcoin Unlimited believes that the protocol should be governed through a "bottom up" organic process. On the other hand, Bitcoin Core believes that the protocol should be governed by a select group of developers through a "top down" process.

You don't need unlimited funds, just a bit more than what coinwallet.eu spent, and then you will really cause some trouble if the block size has no limit.

I agree that the protocol should have change management: no change goes in without a formal approval process, and the approval should be based on a vote by the majority of the stakeholders. But in practice, how do we vote? Does everyone sign a vote with their node? With a private key that contains 1 bitcoin? Sign the vote with hash power? It seems the rich guys will always buy the most votes, so bitcoin will serve the interests of enterprises and capitalists in the end.
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
December 27, 2015, 01:28:21 AM |
|
Hey Johnyj: I don't come here nearly as much since Theymos began his reign of censorship and shut down our discussion thread. I'd like to invite you to join the discussion at https://bitco.in/forum/ . "Gold Collapsing. Bitcoin UP" is our usual chit-chat thread, and there is also a sub-forum dedicated to Bitcoin Unlimited. There are also a lot more people than just me who would like to try to answer your questions. Best, Peter
|
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 27, 2015, 02:16:38 AM |
|
Hey Johnyj: I don't come here nearly as much since Theymos began his reign of censorship and shut down our discussion thread. I'd like to invite you to join the discussion at https://bitco.in/forum/ . "Gold Collapsing. Bitcoin UP" is our usual chit-chat thread, and there is also a sub-forum dedicated to Bitcoin Unlimited. There are also a lot more people than just me that would like to try to answer your questions. Best, Peter

I don't support censorship, but I think the current political differences are already too much. We need a more shared and united view, not people working in different directions; it does not help us reach consensus.
|
|
|
|
iCEBREAKER
Legendary
Offline
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
|
|
January 03, 2016, 03:54:37 AM |
|
Z0MG SENSOR SHIP Best, Peter Why won't Evil Thermos stop sensor shipping us? Won't somebody please think of the childrens?
|
|
| "The difference between bad and well-developed digital cash will determine whether we have a dictatorship or a real democracy." David Chaum 1996 "Fungibility provides privacy as a side effect." Adam Back 2014
|
|
|
|
DumbFruit
|
|
January 04, 2016, 03:23:16 PM |
|
Sounds interesting, let me take a look at this thing: http://www.bitcoinunlimited.info/downloads/feemarket.pdf

"Not unexpectedly, we showed that the cost of block space was proportional to both Bitcoin's inflation rate..."

You seem to be confusing inflation with block size, but whatever, it is true in either case. It's a cost that's minimized by having the least amount of nodes.

"...and the amount of time it takes per uncompressed megabyte to propagate block solutions to the other miners,"

This cost is minimized by centralization.

"More interestingly, however, we showed that the orphan cost is not static, but rather increases exponentially with the block size, 𝑄, demanded:"

The probability of a block being orphaned is minimized by having the most hashpower. Therefore, since costs are minimized when the least amount of nodes has the majority of hashpower, that is the tendency of the system in a free market. I don't understand the rationale of your paper. The fact that some form of fee market can exist in the absence of a hard block limit doesn't address the centralization concerns that people like Gmaxwell, LukeJR et al. have with a nonexistent hardcoded block size limit. In other words, it just looks like an elaborate straw man.

You also do something very misleading. You define an "unhealthy fee market" as having infinite block sizes and then write that "an unhealthy fee market is not physically possible." Any time the block size is finite is defined as "healthy". You can therefore conclude, since you defined the unhealthy condition to be impossible, that the fee market is always healthy regardless of the block size. That's like saying, "I define fat people as being infinitely fat, therefore fat people are physically impossible. Since every other condition, other than not existing, is not fat, I can conclude everyone is not fat." That's a very dishonest debate tactic.
|
By their (dumb) fruits shall ye know them indeed...
|
|
|
|