LiteCoinGuy (OP)
Legendary
Offline
Activity: 1148
Merit: 1014
In Satoshi I Trust
August 06, 2015, 05:13:38 PM
What's the main reason the Core devs don't want to allow bigger blocks, please, somebody?
I'm not a technical expert; I've read Gavin's & Mike's ideas on why we need bigger blocks, and I've witnessed the stress testing, but why are the Core devs so hesitant to comply?
Some of the devs are worried about bandwidth, but that is not a problem, even with 5, 7 or 10 MB blocks. Hundreds of millions of people around the world have enough bandwidth to run a full node: http://www.nngroup.com/articles/law-of-bandwidth/ No problem here.
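As a rough back-of-the-envelope check of that claim (a minimal sketch assuming one block every ten minutes and completely full blocks; transaction relay overhead, which roughly doubles the figures, is ignored):

```python
# Daily bandwidth needed just to download full blocks of a given size.
# Assumes ~144 blocks/day (one every ~10 minutes) and completely full
# blocks; real relay overhead would roughly double these figures.
BLOCKS_PER_DAY = 24 * 60 // 10  # 144

for block_mb in (5, 7, 10):
    daily_mb = block_mb * BLOCKS_PER_DAY
    print(f"{block_mb} MB blocks: {daily_mb} MB/day, "
          f"~{daily_mb * 30 / 1000:.0f} GB/month")
```

Even at 10 MB per block that is roughly 43 GB a month of block data, which is within reach of a typical broadband connection.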
bitcoinmasterlord
Legendary
Offline
Activity: 1148
Merit: 1006
August 08, 2015, 01:07:53 PM
Bitcoin doesn't scale, so why act like raising the block size would change anything?

Um... it would make these spam attacks stop instantly, at least. These spammers might very well be miners, maybe a group of them, and they seem to be testing out the best way to raise fees permanently. At the moment they spam often, so that nobody knows whether they need to attach a high fee or not, and everyone raises their fees overall to avoid the risk. I don't know where you got the idea that Bitcoin doesn't scale. The max block size has been arbitrary from the start. I'm not sure what you're talking about.
bitcoinmasterlord
Legendary
Offline
Activity: 1148
Merit: 1006
August 08, 2015, 01:11:01 PM
Bitcoin doesn't scale, so why act like raising the block size would change anything?

Wrong. While it may not do so efficiently, Bitcoin does scale. Something similar applies to video quality: just because a raw 4K movie easily takes up more than 100 GB of space doesn't mean we won't be using it. People have complained about storage, but that will soon be solved once pruning is fully implemented (right now it is only partially). If we assume we have only 10 MB blocks, and that they are full (which should be enough for some time), that's only 1,440 MB of bandwidth per day - less than what I spend just using Chrome in a single day. Current pruning requires one to keep the last 255 blocks, which means only 2.55 GB of storage would be needed (I'm pretty sure that everyone has this). Now if we factor in the potential of the Lightning Network and sidechains, it is going to scale much better. However, these things need time, as we are sailing in unexplored waters.

Yes, Bitcoin might be able to scale in the future with the help of the Lightning Network and other tools. So why the urgent need for a hard fork when it doesn't solve anything? You said it yourself: "these things need time". So why not wait for the Lightning Network to be operational before we go ahead with a dangerous fork?

So we need to wait for a third-party network to make Bitcoin work. Seriously? Really... it's no wonder that so many bitcoiners think the Bitcoin Core developers behind the Lightning Network have their own agenda in this. And what dangerous fork? Are you spreading FUD here?
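For what it's worth, the arithmetic in that first reply checks out if you take the post's own figures (10 MB blocks, a 255-block pruning window) at face value - a minimal sketch:

```python
# Check the figures quoted above: daily bandwidth and pruned storage.
BLOCK_MB = 10          # assumed block size from the post
BLOCKS_PER_DAY = 144   # one block every ~10 minutes
PRUNE_WINDOW = 255     # blocks a pruned node keeps, per the post

print(f"Bandwidth: {BLOCK_MB * BLOCKS_PER_DAY} MB/day")        # 1440 MB/day
print(f"Pruned storage: {BLOCK_MB * PRUNE_WINDOW / 1000} GB")  # 2.55 GB
```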
bitcoinmasterlord
Legendary
Offline
Activity: 1148
Merit: 1006
August 08, 2015, 01:17:12 PM
I like that quote: I'm not as connected as some people are to the reddits and githubs and IRC rooms where a lot of the bigwigs talk about this stuff. Does anyone know if consensus is anywhere close on this block size increase issue? It doesn't seem off topic here to ask given the direct relation between the attacks/tests and the potential increase. I know the debate has been rancorous, I wonder if anyone has a gauge on where we stand at the moment on this.
No sign of consensus as of yesterday. The "old devs" (Gavin, Mike Hearn, Jeff Garzik) want to increase the limit before traffic gets close to saturation. The "new devs" (Adam Back, Greg Maxwell, Pieter Wuille, Luke-Jr, and others - most of them working for Blockstream Inc.) want instead to see the network saturate so that a "fee market" develops. Since it is a discrete (binary, boolean) choice, no compromise is possible. Gavin backed down from his original 20 MB proposal to 8 MB, starting early next year, with automatic increases afterwards. Jeff proposed a "compromise" plan with 2 MB blocks. Pieter Wuille made a counter-proposal to increase the limit only in 2017 - that is, no compromise.

I don't trust either side. Gavin has shown some extortion-like behaviour I don't like, and Hearn has dangerous ideas. But Maxwell, Luke-Jr and others have their own private business reasons not to increase blocks. That's like politicians who work for banks or big companies - you can't expect them to make sane decisions. For me that means we need to do it ourselves. We need bigger blocks, since that's the only way to let Bitcoin survive. 1 MB blocks will hinder adoption and make Bitcoin obsolete over time. So we need to choose bigger blocks while we still can, and deny all the other crap developers want to push onto us.
forevernoob
August 09, 2015, 10:18:19 PM
Um... it would make these spam attacks stop instantly, at least. These spammers might very well be miners, maybe a group of them, and they seem to be testing out the best way to raise fees permanently. At the moment they spam often, so that nobody knows whether they need to attach a high fee or not, and everyone raises their fees overall to avoid the risk.

The spam attacks will stop once we reach an acceptable level of fees. Currently it's possible to make transactions without paying any fees. That is not good for Bitcoin, as it makes it easy to bloat the blockchain.

I don't know where you got the idea that Bitcoin doesn't scale. The max block size has been arbitrary from the start. I'm not sure what you're talking about.

Please explain how the block size will scale with a hard fork increasing the block size to 8 MB.

So we need to wait for a third-party network to make Bitcoin work. Seriously? Really... it's no wonder that so many bitcoiners think the Bitcoin Core developers behind the Lightning Network have their own agenda in this. And what dangerous fork? Are you spreading FUD here?

So you are saying Bitcoin doesn't work now? I wonder who is spreading the FUD now?
DooMAD
Legendary
Offline
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 10, 2015, 09:00:45 AM
So we need to wait for a third-party network to make Bitcoin work. Seriously? Really... it's no wonder that so many bitcoiners think the Bitcoin Core developers behind the Lightning Network have their own agenda in this. And what dangerous fork? Are you spreading FUD here? So you are saying Bitcoin doesn't work now? I wonder who is spreading the FUD now?

It's pretty clear that's not what he's saying. He's alluding to the absurdity of imposing an artificial limit on the system just to bypass it with a third-party network built on top. Bitcoin will cope just fine as long as it can support enough transactions to remain sustainable. However, there are still plenty of people who are skeptical that the Lightning Network by itself would be sufficient to achieve this scalability. Just about every opponent of larger blocks is completely fixated on micro-payments and thinks that once those are swept under the carpet, the scalability issue is magically solved forever. That's just plain wrong. Larger blocks will become a necessity as usage increases, and prolonging the inevitable is only going to result in more disruption later.

I trust the original, unhindered Bitcoin implementation to scale far better than an artificially crippled implementation with a new bit bolted on top. Saying you support a 1 MB limit and the Lightning Network is like saying you want to limit the amount of material you can pass in a bowel movement and fit a colostomy bag to deal with the rest, because that's somehow better. Get rid of the artificial bottleneck and let nature take its course.
Lauda
Legendary
Offline
Activity: 2674
Merit: 3000
Terminated.
August 10, 2015, 11:12:40 AM
Yes, Bitcoin might be able to scale in the future with the help of the Lightning Network and other tools. So why the urgent need for a hard fork when it doesn't solve anything? You said it yourself: "these things need time". So why not wait for the Lightning Network to be operational before we go ahead with a dangerous fork?

It does not solve anything? That is an understatement. 8 MB blocks should help us gain the needed time; who knows exactly when something like the Lightning Network or sidechains will be fully functional. The developers aren't really saying much about a timeline. Bigger blocks will be needed in the future even with other solutions, so why should we wait when it is much easier to do the fork now? The fork is not dangerous at all if we use the proper consensus rules.

-snip- Who knows more about Bitcoin than the CORE devs? No-one. That's who. Go with the heart; I follow Mike + Gavin.

Wrong. They are just people who know how to properly code Bitcoin; that does not mean they have the ultimate knowledge of cryptography, cryptocurrencies and Bitcoin. Following your heart is a common misconception, unless you think that you feel things with your heart.

I don't trust either side. Gavin has shown some extortion-like behaviour I don't like, and Hearn has dangerous ideas. But Maxwell, Luke-Jr and others have their own private business reasons not to increase blocks. That's like politicians who work for banks or big companies - you can't expect them to make sane decisions. For me that means we need to do it ourselves. We need bigger blocks, since that's the only way to let Bitcoin survive. 1 MB blocks will hinder adoption and make Bitcoin obsolete over time. -snip-

Exactly. While Gavin definitely did a few things that I do not like, at least he's not acting like a politician. The sole reason people related to Blockstream are going to do everything to stop the increase is profit.
"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks" 😼 Bitcoin Core ( onion)
AtheistAKASaneBrain
August 10, 2015, 12:07:38 PM
Um... it would make these spam attacks stop instantly, at least. These spammers might very well be miners, maybe a group of them, and they seem to be testing out the best way to raise fees permanently. At the moment they spam often, so that nobody knows whether they need to attach a high fee or not, and everyone raises their fees overall to avoid the risk.

The spam attacks will stop once we reach an acceptable level of fees. Currently it's possible to make transactions without paying any fees. That is not good for Bitcoin, as it makes it easy to bloat the blockchain. I don't know where you got the idea that Bitcoin doesn't scale. The max block size has been arbitrary from the start. I'm not sure what you're talking about.

Please explain how the block size will scale with a hard fork increasing the block size to 8 MB. So we need to wait for a third-party network to make Bitcoin work. Seriously? Really... it's no wonder that so many bitcoiners think the Bitcoin Core developers behind the Lightning Network have their own agenda in this. And what dangerous fork? Are you spreading FUD here? So you are saying Bitcoin doesn't work now? I wonder who is spreading the FUD now?

The idea is incremental increases starting at 8 MB, in step with Moore's law. That way we should be able to scale up as far as we need to. The fees must stay low, though... I'm not saying they shouldn't be a bit higher if needed, but they definitely must feel LOW, or else it will be a mass failure.
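That roughly matches Gavin's BIP 101 proposal as commonly described: an initial 8 MB cap that doubles every two years, tracking a Moore's-law-style growth curve. The sketch below is a simplification - the actual proposal also interpolates linearly between doubling points, which this ignores:

```python
# Simplified BIP 101-style schedule: an 8 MB cap doubling every two
# years, capped after ten doublings (~8 GB). The real proposal also
# interpolates linearly between doublings; this sketch does not.
def max_block_mb(year, start_year=2016, start_mb=8, max_doublings=10):
    doublings = min(max((year - start_year) // 2, 0), max_doublings)
    return start_mb * 2 ** doublings

for year in (2016, 2018, 2020, 2030, 2036):
    print(year, max_block_mb(year), "MB")  # 8, 16, 32, 1024, 8192 MB
```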
LiteCoinGuy (OP)
Legendary
Offline
Activity: 1148
Merit: 1014
In Satoshi I Trust
August 10, 2015, 03:11:36 PM
If you want Bitcoin on the Moon - support bigger blocks. There is no other way. Satoshi himself never wanted 1 MB blocks - he had 32 MB blocks! Satoshi knew the 1 MB limit was a risk to his long-term vision of a global competitor to the likes of VISA, but he made massive code changes, and frequently; he just assumed the limit would be easily increased or removed when the time was right. At the moment the only way to support bigger blocks is Bitcoin XT: http://xtnodes.com/
thejaytiesto
Legendary
Offline
Activity: 1358
Merit: 1014
August 10, 2015, 03:35:47 PM
The main problem is that fewer and fewer people will be able to run nodes. Hard disk space is definitely not a problem - it keeps getting cheaper. Bandwidth, though, is another story: I don't think Moore's law holds there, and bandwidth development is not as steady as HDD development.
LiteCoinGuy (OP)
Legendary
Offline
Activity: 1148
Merit: 1014
In Satoshi I Trust
August 10, 2015, 03:49:21 PM
The main problem is that fewer and fewer people will be able to run nodes. Hard disk space is definitely not a problem - it keeps getting cheaper. Bandwidth, though, is another story: I don't think Moore's law holds there, and bandwidth development is not as steady as HDD development.

Look at Nielsen's Law: http://www.nngroup.com/articles/law-of-bandwidth/ We are not talking about 100 MB blocks - we are talking about 6, 8 or 16 MB blocks in a few years (when they are actually full). Hundreds of millions of people can support such nodes today. No problem.
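As an illustration of what Nielsen's Law (roughly 50% yearly growth in a high-end user's bandwidth, per the linked article) would imply for block downloads - the 50 Mbit/s starting connection is an assumption for the sketch, not a figure from the article:

```python
# Project a high-end user's connection under Nielsen's Law (~50%/yr)
# and show how many MB it could download per 10-minute block interval.
# The 50 Mbit/s starting point in 2015 is an illustrative assumption.
speed_mbit = 50.0
for year in range(2015, 2021):
    mb_per_interval = speed_mbit / 8 * 600  # MB per 10 minutes
    print(f"{year}: {speed_mbit:7.1f} Mbit/s -> "
          f"~{mb_per_interval:,.0f} MB per block interval")
    speed_mbit *= 1.5
```

Even the 2015 starting point can pull thousands of MB per block interval, so raw download capacity is not the constraint for 6 to 16 MB blocks; upload, latency and validation are separate questions.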
forevernoob
August 10, 2015, 08:44:23 PM
I trust the original, unhindered Bitcoin implementation to scale far better than an artificially crippled implementation with a new bit bolted on top.

The original? Aren't we running the original now?

Saying you support a 1 MB limit and the Lightning Network is like saying you want to limit the amount of material you can pass in a bowel movement and fit a colostomy bag to deal with the rest, because that's somehow better. Get rid of the artificial bottleneck and let nature take its course.

That's a bad analogy. We need to have limits, otherwise the blockchain would grow so massive that most people would not be able to run full nodes. And what's the point of risking that when we can have third-party solutions?

It does not solve anything? That is an understatement. 8 MB blocks should help us gain the needed time; who knows exactly when something like the Lightning Network or sidechains will be fully functional. The developers aren't really saying much about a timeline. Bigger blocks will be needed in the future even with other solutions, so why should we wait when it is much easier to do the fork now? The fork is not dangerous at all if we use the proper consensus rules.

What needed time? I haven't noticed any urgent issues with full blocks. If blocks fill up, the market will adapt and either pay higher fees or develop alternatives like the Lightning Network. And so what if 90% think we should have bigger blocks? I still don't think it's the best possible option. That's like saying that just because a majority voted for Obama, he must be the best possible president.
DooMAD
Legendary
Offline
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 10, 2015, 09:57:59 PM
I trust the original, unhindered Bitcoin implementation to scale far better than an artificially crippled implementation with a new bit bolted on top.

The original? Aren't we running the original now?

The original never had a 1 MB cap to begin with, and the cap was never intended to be permanent. It was something done in haste that should have been reconsidered long ago. It was never part of the plan that the network should run with full blocks for the rest of forever.

Saying you support a 1 MB limit and the Lightning Network is like saying you want to limit the amount of material you can pass in a bowel movement and fit a colostomy bag to deal with the rest, because that's somehow better. Get rid of the artificial bottleneck and let nature take its course.

That's a bad analogy. We need to have limits, otherwise the blockchain would grow so massive that most people would not be able to run full nodes. And what's the point of risking that when we can have third-party solutions?

The 1 MB cap wasn't introduced because of concerns over the bandwidth requirements for nodes; it was intended as an anti-spam measure, and apparently it failed miserably at that goal. The blockchain didn't "grow huge" before the cap was put in place. Also, it's not a "solution" at all if it doesn't solve the problem of the cap limiting the amount of fees the miners can collect. Nodes won't be any use at all without sufficiently incentivised miners to secure the network, and in order to generate more fees for miners it will be necessary to accommodate more transactions over time.

Users want to get their transactions on a blockchain. If Bitcoin isn't the simplest and most convenient way to do that, because you're funneling transactions into your lightning colostomy bag network, another coin will come along that hasn't been artificially crippled and doesn't rely on a third party bolted on top. If that coin turns out to be a more attractive proposition for users, what's their reason to continue using Bitcoin? And if users are transacting on another network, why would miners stick around to secure this one?

Third-party layers should be strictly reserved for micropayments and transactions that don't include a fee. As much as small-block proponents obsess about micropayments, they really aren't the issue. Even after they're swept under the carpet, Bitcoin still needs to support more than two-thirds of a floppy disk every ten minutes. If it doesn't, something else will.
Alley
Legendary
Offline
Activity: 910
Merit: 1000
August 10, 2015, 11:12:48 PM
Are the Core devs even doing anything about this, or is Gavin the only one being proactive? Is the Core devs' policy to wait until the mempool is permanently backlogged and then rush out a fix?
Soros Shorts
Donator
Legendary
Offline
Activity: 1617
Merit: 1012
August 11, 2015, 04:01:01 AM
The main problem is that fewer and fewer people will be able to run nodes. Hard disk space is definitely not a problem - it keeps getting cheaper. Bandwidth, though, is another story: I don't think Moore's law holds there, and bandwidth development is not as steady as HDD development.

Look at Nielsen's Law: http://www.nngroup.com/articles/law-of-bandwidth/ We are not talking about 100 MB blocks - we are talking about 6, 8 or 16 MB blocks in a few years (when they are actually full). Hundreds of millions of people can support such nodes today. No problem.

Tell me, how long does it take to initialize your full node today on your particular hardware? I am talking about a fresh install. How long does it take to re-index the blocks that are already on disk if your Bitcoin Core happens to shut down ungracefully? If you think these times are not a problem, then I'd like to know what kind of ninja hardware you are using.
jdbtracker
August 11, 2015, 04:20:28 AM
A precarious situation indeed. If we do not change the structure of the blocks, we will end up with serious problems when the network is unable to validate transactions. Just imagine a day when a massive blackout caused by a solar storm backs the network up into millions of transactions, causing delays for days - we have seen this before. A larger block size would clear the backlog in minutes.
Does it scale? Yes, it will, but not in its current form. Either we improve the cryptography, the file compression, restructure the network, or increase the block size... either way we will be forced to choose, and we will cause a fork within the community.
The businesses relying on cash flow from customers will choose bigger blocks to keep momentum up... they cannot afford not to.
The purists will improve the cryptographic compression to allow equal participation by the lowliest nodes in the network.
Others will partition the network into complementary chains working in tandem: off-chain transactions.
All of these are viable, and they prove the worth of the Bitcoin network... it's forcing us to adapt.
Me? I would personally prefer that bandwidth be kept to a minimum, but all choices are on the table. It's not just Bitcoin's network that is at stake, but the thousands of complementary chains that will eventually be built on top of and around Bitcoin. The ecosystem may become so saturated that there could come a time when all available bandwidth is used up across the globe, causing delays that no programmer will be able to solve. We would bog down the internet.
Increasing the size of the blocks is a band-aid solution... we need a solution that we have yet to think of.
If you think my efforts are worth something, I'll keep on keeping on. I don't believe in IQ, only in determination.
Mickeyb
August 11, 2015, 04:25:09 AM
All of this is very worrisome, and a solution needs to be found ASAP. I, for example, am waiting to see how this unfolds before investing more, and I am sure many other people are doing the same. I mean, why would I buy at $265 if I can buy much, much lower when the shit hits the fan and the community divides into two sides? This is completely obvious.
EternalWingsofGod
August 11, 2015, 04:34:37 AM
The main problem is that fewer and fewer people will be able to run nodes. Hard disk space is definitely not a problem - it keeps getting cheaper. Bandwidth, though, is another story: I don't think Moore's law holds there, and bandwidth development is not as steady as HDD development.

In a sense it is at least manageable with today's PCs; up to the late 1990s Bitcoin would have been extremely concentrated. Unfortunately you're right about the amount of bandwidth each user needs to run a node, and it will be a significant problem unless bandwidth development accelerates at a pace similar to HDD development.
Lauda
Legendary
Offline
Activity: 2674
Merit: 3000
Terminated.
August 11, 2015, 07:36:25 AM
That's a bad analogy. We need to have limits, otherwise the blockchain would grow so massive that most people would not be able to run full nodes. And what's the point of risking that when we can have third-party solutions?

I do not agree with either part of your post. Filling up a block costs money, and I doubt that someone would be able to keep producing huge blocks without it costing them a lot. Even if that were the case, a simple higher cap could be implemented to prevent it. There is no risk with a proper hard fork. Why risk using third-party solutions, i.e. stop relying on the protocol itself?

What needed time? I haven't noticed any urgent issues with full blocks. If blocks fill up, the market will adapt and either pay higher fees or develop alternatives like the Lightning Network.

You have not? What about those recent spam attacks? A lot of people were complaining about their transactions not going through because the blocks were full. The market did not adapt that time, nor do I understand why people think that higher fees are going to be beneficial to Bitcoin.

And so what if 90% think we should have bigger blocks? I still don't think it's the best possible option. That's like saying that just because a majority voted for Obama, he must be the best possible president.

Wrong. Nobody is saying that it is the best possible option. If the majority vote for X, then you go with X regardless of what is best. Is this not the way decentralized consensus was supposed to work around here? Obviously people have different opinions about the best possible option, so there is no single one.
"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks" 😼 Bitcoin Core ( onion)
fryarminer
August 11, 2015, 02:18:48 PM
The main problem is that fewer and fewer people will be able to run nodes. Hard disk space is definitely not a problem - it keeps getting cheaper. Bandwidth, though, is another story: I don't think Moore's law holds there, and bandwidth development is not as steady as HDD development.

Look at Nielsen's Law: http://www.nngroup.com/articles/law-of-bandwidth/ We are not talking about 100 MB blocks - we are talking about 6, 8 or 16 MB blocks in a few years (when they are actually full). Hundreds of millions of people can support such nodes today. No problem.

Tell me, how long does it take to initialize your full node today on your particular hardware? I am talking about a fresh install. How long does it take to re-index the blocks that are already on disk if your Bitcoin Core happens to shut down ungracefully? If you think these times are not a problem, then I'd like to know what kind of ninja hardware you are using.

My question is, what's the rush? If you're gonna run a node, you're gonna run a node. It's going to be running from now until forever, so what's the rush? It's not like you're waiting for GIMP to load so you can work.