johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 31, 2015, 05:48:37 PM |
|
Bitfury's paper here: http://bitfury.com/content/4-white-papers-research/block-size-1.1.1.pdf

"The table contains an estimate of how many full nodes would no longer function without hardware upgrades as average block size is increased. These estimates are based on the assumption that many users run full nodes on consumer-grade hardware, whether on personal computers or in the cloud. Characteristics of node hardware are based on a survey performed by Steam [19]; we assume PC gamers and Bitcoin enthusiasts have a similar amount of resources dedicated to their hardware. The exception is RAM: we assume that a typical computer supporting a node has no less than 3 GB RAM, as a node requires at least 2 GB RAM to run with margin [15]. For example, if block size increases to 2 MB, a node would need to dedicate 8 GB RAM to the Bitcoin client, while more than half of PCs in the survey have less RAM."

Based on their estimation, raising the block size to 4 MB would drop 75% of nodes from the network.
|
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 31, 2015, 05:50:18 PM |
|
But simulation by bitfury already indicated that we will have a severe performance problem with 4MB blocks on average home computer
I'd like to see that report. Got a link? With only several thousand running full nodes now, it seems to me that 'average home computer' is not the limiting issue.

I don't remember exactly, but just google it, and also another statistic by Mark at the Montreal conference showing that even a 1MB block can take over 30s to verify on a mining node.

Today we have the mining relay network, the block is relayed in milliseconds. What you are saying is factually incorrect.

As I understand it, Matt Corallo's bitcoin relay network is a private company, with a similar "a phone call to shut it down" risk.

The free market coming up with solutions on its own; if it did get shut down, another one could simply be set up.

Not when you are already heavily dependent on that one; you cannot come up with a solution overnight.

The Bitcoin relay network is also open source.

It does not help if you don't have servers at major internet backbones. And if you do, those servers will be censored in a similar way.
|
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
December 31, 2015, 05:50:18 PM |
|
Thank you, I appreciate your criticism. I am sorry if I came across a bit strong; I am too used to dealing with some of the trolls on this thread. Your observations are accurate, and we can assume shortcuts for profit and mistakes will be made despite the miners' best intentions. I even suspect that some of these problems we will need to learn the hard way. The nice thing about BU is that we could all decide on a two megabyte limit, for instance, and increase it again when required, allowing us more time to understand and further analyze its effects.
I am sincere and appreciate your hard work. I am working on a project now that will back up my words of support with actual action, supporting both the Core devs and other implementations. Let us all rise above attacking each other and bikeshedding.
|
|
|
|
VeritasSapere
|
|
December 31, 2015, 05:55:26 PM |
|
Can someone explain to me how not raising the block size limit is a good thing? All transactions should be able to go through in a semi-timely manner, and if the limit is reached in a block then that transaction is just cancelled? That doesn't make sense to me; I don't see a good reason NOT to switch to Bitcoin XT. Secondly, why wasn't this thought of originally in the making of bitcoin? Seems odd.
From the beginning the block size limit has worked as a spam filter (even this year it successfully resisted the spam attack by coinwallet.eu during July and September). And now people have realized it can also work as a means to prevent centralization (as long as blocks are small, average people with a little IT knowledge can run a full node at home, thus increasing the level of decentralization).

In the schedule outlined in BIP101 the majority of people will still be able to run full nodes out of their homes. We now also have Bitcoin Unlimited, which can be seen as a more conservative approach in regards to the blocksize.

This has nothing to do with transaction capacity, but is a life or death question.

The way you are phrasing this is hyperbole. You claim that this is a life or death question. Can you seriously claim that an increase to two megabytes would destroy Bitcoin? Especially as our technology improves, this limit becomes more and more arbitrary. Just increasing the limit does not actually increase the blocksize, as the history of Bitcoin and the altcoins proves. It would also make these types of spam attacks much more expensive and less effective. This has everything to do with transaction capacity. The blocksize limit as an anti-spam measure is meant to be set at a much higher level than the actual level of transactions; those were the conditions under which it was set up. If we decide to use the blocksize limit to block the stream of transactions, then it becomes a tool of economic policy; in the case of Core it is centralized economic planning. It would be better to allow the free market to determine the size of the blocks instead.

If the blocks are huge and average people cannot run a node, so they all run in large data centers, then a couple of phone calls to ISPs could disable the bitcoin network.
By making blocks small and portable, you can run it on almost any device; thus it becomes unlikely you can disable the bitcoin network unless you shut down the whole internet.

This was never the intention or the security model that was intended for Bitcoin. Full nodes are destined to be run on servers and desktop computers; I do not see anything wrong with that. Furthermore, if we had billions of people using Bitcoin, which such large blocksizes would imply, then we would also have hundreds of thousands of nodes run in data centers across the world in different jurisdictions. I would consider this to be highly decentralized and desirable; this was always the intention and design plan of Bitcoin. Security through mass adoption, not obscurity. The original vision of Satoshi was that the majority of users would not be running a full node; he had thought about the problems of scaling and what the solutions should be. He said that we should allow it to grow as big as it needed to be, since doing otherwise would effectively block the stream of transactions, hurting adoption. Pushing transactions off chain is not a solution to scaling up the main Bitcoin blockchain itself, not to mention the terrible user experience this would be when compared to just using Bitcoin directly. Bitcoin needs to have a high volume of transactions to pay for its security while maintaining low fees, which promotes adoption, which in turn increases its security. I'm sure that in 20 years there will either be very large transaction volume or no volume.
|
|
|
|
VeritasSapere
|
|
December 31, 2015, 05:56:39 PM |
|
While I don't think Bitcoin is practical for smaller micropayments right now, it will eventually be as storage and bandwidth costs continue to fall. If Bitcoin catches on on a big scale, it may already be the case by that time. Another way they can become more practical is if I implement client-only mode and the number of network nodes consolidates into a smaller number of professional server farms. Whatever size micropayments you need will eventually be practical. I think in 5 or 10 years, the bandwidth and storage will seem trivial.

Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification (section 8) to check for double spending, which only requires having the chain of block headers, or about 12KB per day. Only people trying to create new coins would need to run network nodes. At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware.

The eventual solution will be to not care how big it gets. But for now, while it's still small, it's nice to keep it small so new users can get going faster. When I eventually implement client-only mode, that won't matter much anymore.

The current system where every user is a network node is not the intended configuration for large scale. That would be like every Usenet user runs their own NNTP server. The design supports letting users just be users. It can be phased in, like:
if (blocknumber > 115000) maxblocksize = largerlimit
It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.
When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade. The threshold can easily be changed in the future. We can decide to increase it when the time comes. It's a good idea to keep it lower as a circuit breaker and increase it as needed. If we hit the threshold now, it would almost certainly be some kind of flood and not actual use. Keeping the threshold lower would help limit the amount of wasted disk space in that event. Bitcoin users might get increasingly tyrannical about limiting the size of the chain so it's easy for lots of users and small devices.
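The phased-in rule quoted above can be sketched as a small function. This is a hypothetical illustration only: the 115000 cutoff comes from the quote, while the old and larger limits are placeholder values I have chosen for the sketch.

```cpp
#include <cstdint>

// Height-triggered limit, as in the quoted sketch: the larger value is
// shipped in clients well before the cutoff, and activates only once the
// chain passes that block number. OLD_LIMIT and LARGER_LIMIT are
// placeholder values for illustration.
const int64_t CUTOFF_HEIGHT = 115000;
const uint64_t OLD_LIMIT    = 1000000;  // 1 MB
const uint64_t LARGER_LIMIT = 2000000;  // e.g. 2 MB

uint64_t MaxBlockSize(int64_t blocknumber) {
    if (blocknumber > CUTOFF_HEIGHT)
        return LARGER_LIMIT;
    return OLD_LIMIT;
}
```

Old clients that lack this rule would reject any block larger than the old limit, which is exactly why the change has to ship long before the cutoff height is reached.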
|
|
|
|
VeritasSapere
|
|
December 31, 2015, 05:59:00 PM |
|
Bitfury's paper here: http://bitfury.com/content/4-white-papers-research/block-size-1.1.1.pdf

"The table contains an estimate of how many full nodes would no longer function without hardware upgrades as average block size is increased. These estimates are based on the assumption that many users run full nodes on consumer-grade hardware, whether on personal computers or in the cloud. Characteristics of node hardware are based on a survey performed by Steam [19]; we assume PC gamers and Bitcoin enthusiasts have a similar amount of resources dedicated to their hardware. The exception is RAM: we assume that a typical computer supporting a node has no less than 3 GB RAM, as a node requires at least 2 GB RAM to run with margin [15]. For example, if block size increases to 2 MB, a node would need to dedicate 8 GB RAM to the Bitcoin client, while more than half of PCs in the survey have less RAM."

Based on their estimation, raising the block size to 4 MB would drop 75% of nodes from the network.

I will read this paper and respond later; I am going to head off now and celebrate the new year. I wish everyone here a happy new year.
|
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
December 31, 2015, 06:05:23 PM |
|
Just a friendly word of advice... It isn't a good idea to create a cult of personality and make persistent appeals to authority. Satoshi was/is a genius, and while it is easy to revere his opinion, he also made multiple mistakes, and judging from his programming history he wasn't the most competent developer either. Many of the current core developers are far more competent and technically proficient than Satoshi... and we shouldn't view them as infallible or make appeals to their authority either. This is yet another reason why we need to support multiple implementations, as the core developers could learn from the others or correct a mistake as well.

I will read this paper and respond later, going to head off now and celebrate the new year. I wish everyone here a happy new year.

Happy New Year to everyone. Don't drink and drive, and if you can avoid the roads by sleeping with the host/hostess of the party, then take the opportunity.
|
|
|
|
VeritasSapere
|
|
December 31, 2015, 06:10:30 PM Last edit: December 31, 2015, 07:05:27 PM by VeritasSapere |
|
Just a friendly word of advice... It isn't a good idea to create a cult of personality and make persistent appeals to authority. Satoshi was/is a genius, and while it is easy to revere his opinion, he also made multiple mistakes and from his programming history wasn't the most competent developer either, so he could easily have missed many technical nuances. Many of the current core developers are far more competent and technically proficient than Satoshi... and we shouldn't view them as infallible or make appeals to their authority either.

I agree, and in that spirit, let me state clearly here that just because Satoshi said these things it does not mean he was right; he could have been wrong, as some of the Core developers are now saying. I might agree with the original vision of Satoshi, but we do have to be critical and always question ourselves. I like to believe that is what Satoshi would have wanted us to do. It is true that I often fall into the trap of viewing him as a mythological figure, which is fun, but I do need to check myself on that sometimes.
|
|
|
|
jbreher
Legendary
Offline
Activity: 3052
Merit: 1665
lose: unfind ... loose: untight
|
|
December 31, 2015, 06:43:03 PM |
|
Bitfury's paper here: http://bitfury.com/content/4-white-papers-research/block-size-1.1.1.pdf

"The table contains an estimate of how many full nodes would no longer function without hardware upgrades as average block size is increased. These estimates are based on the assumption that many users run full nodes on consumer-grade hardware, whether on personal computers or in the cloud. Characteristics of node hardware are based on a survey performed by Steam [19]; we assume PC gamers and Bitcoin enthusiasts have a similar amount of resources dedicated to their hardware. The exception is RAM: we assume that a typical computer supporting a node has no less than 3 GB RAM, as a node requires at least 2 GB RAM to run with margin [15]. For example, if block size increases to 2 MB, a node would need to dedicate 8 GB RAM to the Bitcoin client, while more than half of PCs in the survey have less RAM."

Based on their estimation, raising the block size to 4 MB would drop 75% of nodes from the network.

Thanks for the link. I'll read it in its entirety. However, from my perspective, that latter statement is nothing but a rather weak assumption. According to bitnodes, there are currently about 5650 nodes in operation. Do you really think that a significant percentage of them have less than 8 GiB RAM? Even if they do today (which I do not for a minute believe), it currently costs well under 0.1 BTC to purchase 8 GiB of memory. Your assumption that requiring this much RAM will knock a significant percentage of nodes off the network is simply incredible, in the literal sense of the word.
|
Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.
I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
|
|
|
brg444 (OP)
|
|
December 31, 2015, 07:01:33 PM |
|
But simulation by bitfury already indicated that we will have a severe performance problem with 4MB blocks on average home computer
I'd like to see that report. Got a link? With only several thousand running full nodes now, it seems to me that 'average home computer' is not the limiting issue.

I don't remember exactly, but just google it, and also another statistic by Mark at the Montreal conference showing that even a 1MB block can take over 30s to verify on a mining node.

Today we have the mining relay network, the block is relayed in milliseconds. What you are saying is factually incorrect.

There is absolutely no relation between the time it takes to relay a block and the time it takes for nodes to verify it. As usual, you have no clue what it is you are talking about.
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
BlindMayorBitcorn
Legendary
Offline
Activity: 1260
Merit: 1116
|
|
December 31, 2015, 07:06:18 PM |
|
Huh? The author in question was reporting hearsay from discussions with some people from BTCC and is not representing BTCC himself.

"To clarify, /u/jtoomim did speak with us so he's not making things up or lying. He's been investing a great deal of time and effort to really talk to everyone and gather real data. He should be applauded. Sorry to post this here, but would rather not post in /r/bitcoin." (submitted 10 hours ago by btcc_samson)

https://www.reddit.com/r/btc/comments/3yw49p/toomim_btcc_comment/
|
Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
December 31, 2015, 07:06:49 PM |
|
Jeff and Gavin's statement recently: "However, in the short term, we have a disappointing situation where a subset of dev consensus is disconnected from the oft-mentioned desire to increase block size on the part of users, businesses, exchanges and miners. This reshapes bitcoin in ways full of philosophical and economic conflicts of interest."

I'm more interested in the philosophical conflicts of interest here; unfortunately they did not mention what they are. I believe bitcoin's long term success will depend on its core philosophy. For example, everyone can, without permission, set up a full node and start to mine bitcoins (p2pool), thus becoming a part of a decentralized global banking system. It is this freedom of entry, all the way into money creation, that attracted so many enthusiasts. So I think permissionlessness is the core philosophy of bitcoin.

Following this philosophy, if you have too high a barrier to entry (too high a requirement on hardware and mining investment for individuals), then it is not a permissionless system any more. Similarly, everyone should be able to use the blockchain to do transactions without permission.

So that is the conflict of interest here: if you want everyone to be able to set up a full node at home and mine bitcoins, then you should keep the block size small, and thus not everyone will be able to use the blockchain to do transactions. There should be a balance between these two considerations.

I just had another idea: if you set up a full node, then you can do transactions cheaply. Easy to explain, difficult to implement.
|
|
|
|
brg444 (OP)
|
|
December 31, 2015, 07:07:24 PM Last edit: December 31, 2015, 07:27:30 PM by brg444 |
|
This problem is far worse if blocks were 8MB: an 8MB transaction with 22,500 inputs and 3.95MB of outputs takes over 11 minutes to hash. If you can mine one of those, you can keep competitors off your heels forever, and own the bitcoin network… Well, probably not. But there’d be a lot of emergency patching, forking and screaming…
And this is with the initial optimizations completed to speed up verification. This means that if we hardforked a 2MB MaxBlockSize increase on the main tree and we softforked/hardforked in SepSig, we would essentially have up to an 8MB limit (3.5MB to 8MB), in which an attack vector could be opened up with heavy-output and multisig transactions that would crash nodes.

What you are saying here is completely factually inaccurate; the number of transactions does not increase hashing time.

BitUsher may have muddled things with the term "hash", but what he says is factually correct.

"Even a single-line change such as increasing the maximum block size has effects on other parts of the code, some of which are undesirable. For example, right now it's possible to construct a transaction that takes up almost 1MB of space and which takes 30 seconds or more to validate on a modern computer (blocks containing such transactions have been mined). In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems."

https://bitcoin.org/en/bitcoin-core/capacity-increases-faq
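The quadratic blow-up the FAQ describes comes from legacy signature hashing: validating each input re-serializes and hashes roughly the whole transaction, so total bytes hashed scale with inputs times transaction size. Here is a toy cost model of that behavior, an assumption for illustration and not Core's actual validation code:

```cpp
#include <cstdint>

// Toy model of pre-SegWit signature-hashing cost: checking each input
// hashes roughly the entire serialized transaction once, so the total
// work is about num_inputs * tx_size_bytes run through SHA256.
uint64_t SighashBytes(uint64_t num_inputs, uint64_t tx_size_bytes) {
    return num_inputs * tx_size_bytes;
}
```

Under this model, doubling a transaction built this way doubles both factors and quadruples the hashing work, which is why validation time jumps from roughly 30 seconds for a 1MB transaction to over 10 minutes for a 2MB one rather than merely doubling.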
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
VeritasSapere
|
|
December 31, 2015, 07:17:42 PM |
|
Jeff and Gavin's statement recently: "However, in the short term, we have a disappointing situation where a subset of dev consensus is disconnected from the oft-mentioned desire to increase block size on the part of users, businesses, exchanges and miners. This reshapes bitcoin in ways full of philosophical and economic conflicts of interest."

I'm more interested in the philosophical conflicts of interest here; unfortunately they did not mention what they are. I believe bitcoin's long term success will depend on its core philosophy. Following this philosophy, if you have too high a barrier to entry, then it is not a permissionless system any more. So that is the conflict of interest here: if you want everyone to be able to set up a full node at home and mine bitcoins, then you should keep the block size small, and thus not everyone will be able to use the blockchain to do transactions. There should be a balance between these two considerations.

I presume that you did not read the posts I linked earlier that I wrote. In there I explained how increasing the blocksize does not increase the barrier to entry of mining whatsoever. Miners do not even run full nodes themselves for the purpose of mining, which is why the increased difficulty of running a full node does not increase the difficulty or barrier to entry of mining.

I am a miner myself; solo mining is only feasible if you are a huge industrial operation, not exactly contributing to decentralization. P2P mining unfortunately is not good enough, both due to the increased complexity and incompatibility with certain ASIC miners; this is reflected in the hashrate. More than seventy percent of the mining power is presently inside of public pools, and this is a good thing. Pools promote decentralization compared to the alternatives; pools are like a form of representative democracy for the miners.

https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-203#post-7395
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-208#post-7550
|
|
|
|
brg444 (OP)
|
|
December 31, 2015, 07:23:30 PM |
|
I presume that you did not read the posts I linked earlier that I wrote. In there I explained how increasing the blocksize does not increase the barrier to entry of mining whatsoever. Miners do not even run full nodes themselves for the purpose of mining, which is why the increased difficulty of running a full node does not increase the difficulty or barrier to entry of mining. I am a miner myself; solo mining is only feasible if you are a huge industrial operation, not exactly contributing to decentralization. P2P mining unfortunately is not good enough, both due to the increased complexity and incompatibility with certain ASIC miners; this is reflected in the hashrate. More than seventy percent of the mining power is presently inside of public pools, and this is good.
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-203#post-7395
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-208#post-7550

Larger blocks increase the chances of block orphans, which public pools are more at risk of than larger private miners. I know it pleases you to live in your own little lunacy and pretend that you understand these things, but this is not really debatable, since this is pretty much what came out of the discussion between miners and developers at the most recent Scaling Bitcoin conference.
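The orphan-risk argument can be made concrete with a standard back-of-the-envelope model (my own assumption, not something stated in this thread): blocks arrive as a Poisson process with a 600-second mean interval, so a block that takes t seconds to reach the rest of the hash power is orphaned with probability roughly 1 - exp(-t / 600).

```cpp
#include <cmath>

// Poisson back-of-the-envelope: if a block needs t seconds to propagate,
// the chance a competing block is found during that window is
// 1 - exp(-t / 600), since blocks arrive on average every 600 seconds.
double OrphanProbability(double propagation_seconds) {
    return 1.0 - std::exp(-propagation_seconds / 600.0);
}
```

By this estimate, six seconds of extra propagation costs about 1% orphan risk. Larger blocks widen that window, and miners relying on the public network feel it before those with fast private links do, which is the asymmetry being argued about here.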
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
tl121
|
|
December 31, 2015, 07:35:36 PM |
|
But simulation by bitfury already indicated that we will have a severe performance problem with 4MB blocks on average home computer
I'd like to see that report. Got a link? With only several thousand running full nodes now, it seems to me that 'average home computer' is not the limiting issue. The primary limiting technological factor for blocksize today is bandwidth and latency.

Just a nit, but there is a subtle point. The primary limiting technological factor for transaction rate is bandwidth and latency. (This changes the wording, so as to eliminate counting games such as SegWit.)
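The bandwidth point is easy to quantify with a first-order model; the numbers below are my own illustrative assumptions, not figures from the thread. Per-hop relay time is roughly the link latency plus the block size divided by the slower peer's bandwidth:

```cpp
// First-order relay model: seconds to move a block one hop is the link
// latency plus the transfer time (size converted to megabits, divided
// by bandwidth in megabits per second).
double RelaySeconds(double size_mb, double bandwidth_mbps, double latency_s) {
    double size_megabits = size_mb * 8.0;
    return latency_s + size_megabits / bandwidth_mbps;
}
```

On an 8 Mbps home uplink with 100 ms latency, a 4 MB block takes about 4.1 seconds per hop, and hops compound across the network; this is the sense in which bandwidth and latency, rather than CPU or RAM, bound the achievable transaction rate.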
|
|
|
|
coinmaster222
|
|
December 31, 2015, 07:39:52 PM |
|
As I said earlier, when looking at block size you have to look at transaction fees. Do we force larger transaction fees on people, or do we just let the Chinese tell us what they want? I was told this was in the future, but sorry, the future is now.
|
|
|
|
VeritasSapere
|
|
December 31, 2015, 07:40:39 PM |
|
I presume that you did not read the posts I linked earlier that I wrote. In there I explained how increasing the blocksize does not increase the barrier to entry of mining whatsoever. Miners do not even run full nodes themselves for the purpose of mining, which is why the increased difficulty of running a full node does not increase the difficulty or barrier to entry of mining. I am a miner myself; solo mining is only feasible if you are a huge industrial operation, not exactly contributing to decentralization. P2P mining unfortunately is not good enough, both due to the increased complexity and incompatibility with certain ASIC miners; this is reflected in the hashrate. More than seventy percent of the mining power is presently inside of public pools, and this is good.
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-203#post-7395
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-208#post-7550

Larger blocks increase the chances of block orphans, which public pools are more at risk of than larger private miners. I know it pleases you to live in your own little lunacy and pretend that you understand these things, but this is not really debatable, since this is pretty much what came out of the discussion between miners and developers at the most recent Scaling Bitcoin conference.

I will try explaining it in a way in which you might understand: http://www.bitcoinunlimited.info/index.html
|
|
|
|
brg444 (OP)
|
|
December 31, 2015, 07:45:17 PM |
|
I presume that you did not read the posts I linked earlier that I wrote. In there I explained how increasing the blocksize does not increase the barrier to entry of mining whatsoever. Miners do not even run full nodes themselves for the purpose of mining, which is why the increased difficulty of running a full node does not increase the difficulty or barrier to entry of mining. I am a miner myself; solo mining is only feasible if you are a huge industrial operation, not exactly contributing to decentralization. P2P mining unfortunately is not good enough, both due to the increased complexity and incompatibility with certain ASIC miners; this is reflected in the hashrate. More than seventy percent of the mining power is presently inside of public pools, and this is good.
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-203#post-7395
https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-208#post-7550

Larger blocks increase the chances of block orphans, which public pools are more at risk of than larger private miners. I know it pleases you to live in your own little lunacy and pretend that you understand these things, but this is not really debatable, since this is pretty much what came out of the discussion between miners and developers at the most recent Scaling Bitcoin conference.

I will try explaining it in a way in which you might understand:

I can see Taek made a valiant effort at explaining it to you over there at the mental ward, but as usual you were too dense to understand and kept mischaracterizing his arguments and straight up making strawmen out of them. Hopefully you plan to be a little more honest and gracious and less disingenuous in 2016! Bitcoin Unlimited: let's hand it all off to miners and call it the free market!
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
Lauda
Legendary
Offline
Activity: 2674
Merit: 2965
Terminated.
|
|
December 31, 2015, 07:47:08 PM |
|
BU is anything but an elegant solution. It doesn't even have the proper code that it needs (source: Gavin).

Just a nit, but there is a subtle point. The primary limiting technological factor for transaction rate is bandwidth and latency. (This changes the wording, so as to eliminate counting games such as SegWit.)
What about propagation delay?
|
"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks" 😼 Bitcoin Core ( onion)
|
|
|
|