Problem being, once the blocksize is increased, there's effectively no going back. So I do understand Core's conservative blocksize philosophy.
That's only a problem if we're talking about a static blockweight, though. And for the life of me, I can't figure out why, as a community, we still think in such limited, primitive terms. If you make it adjustable by algorithm, then it can either increase or decrease depending on set parameters. You can also make those parameters as restrictive as you like to prevent any sudden, large increases. You can even work out the maths so it effectively caps increases per year at a level the community would be comfortable with. There's no limit to how "conservative" you can make it. But, for whatever reason, people still can't seem to get behind the idea in any considerable number.

Everyone needs to get a clue and understand that blockweight doesn't have to be defined as an integer. Why can't the maximum blockweight increase by 0.01MB or 0.04MB during busy periods to ease the congestion, and then come back down again when the legitimate traffic subsides? Done responsibly, the blocks would still easily be far smaller than BCH's maximum after 4 or more years.
There's being "
conservative" and then there's being "
unreasonable and myopic". I think I've used up nearly all the benefit of the doubt I'm prepared to give and now feel that both developers and certain vocal minorities within this community are drifting dangerously close to the latter. These people need to show some nous and soon.
I fully comprehend the reasons why people are so concerned with the overall size of the blockchain and how rapidly it grows. Decentralisation is vital for the network, and it's likely that forks like BCH haven't paid nearly enough attention to that aspect. But what we have at the moment is unsustainable. It completely shifts the emphasis of what Bitcoin was intended to be. In all my time on these boards, I've steadfastly maintained that the users won't support a network that doesn't support them, yet that's the direction we appear to be heading in. If Bitcoin is going to redefine itself as a decentralised settlement layer for the elites and a bunch of idiots who think they should still be able to run a node on a Raspberry Pi, don't expect everyone to play along. There will be further splits and forks if we go down that path. The market won't abide completely abandoning the concept of digital cash for the internet.
But wouldn't larger blocks make it more difficult to run a node? The chain would grow too fast.
Not an issue if we're sensible about it. Small (and I mean small) increases over time, in direct correlation with observable demand, are preferable to sudden floods of spare capacity that could be abused by spammers. At all times, we'd retain full control over the maximum increase over any given period. If we miscalculated at first and the chain was still growing too quickly, it's technically only a softfork to make the rules more restrictive (although that could end in a hardfork if some choose to let the limit keep growing under the existing rules while others elect to cap it off, so it's better to get it right on the first attempt).
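To put some rough numbers on the chain-growth worry, here's a back-of-envelope comparison. It assumes every single block is stuffed to the limit (which never actually happens) and treats the limit as a plain size cap, ignoring the witness discount for simplicity:

# Worst-case chain growth per year at various block size limits,
# assuming ~144 blocks/day and every block completely full.
BLOCKS_PER_YEAR = 144 * 365   # ~52,560

for limit_mb in (1.0, 1.25, 1.5, 2.0, 8.0):
    growth_gb = limit_mb * BLOCKS_PER_YEAR / 1000
    print(f"{limit_mb:.2f} MB limit -> at most ~{growth_gb:.0f} GB/year")

So even if an adaptive rule crept up to 1.25MB in its first year, the absolute worst case is roughly 66GB of growth against about 53GB at 1MB, whereas jumping straight to 2MB opens the door to around 105GB, and an 8MB limit to over 400GB.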
A block size of 2MB should fix the fees for a while.
Doubling is unnecessary and too reckless. Stop thinking in terms of whole numbers. There's nothing wrong with fractions. If you can handle 8 decimal places in the currency itself, you can manage a couple in the blockweight as well. This is exactly the kind of thing I'm referring to. Less blunt and simplistic, more subtle and considered, please.
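And for what it's worth, a fractional step expressed in "MB" is still an exact integer once you write it in weight units, assuming SegWit-style accounting (a non-witness byte counts as 4 weight units, with the current ceiling at 4,000,000 of them), so there isn't even a precision problem to fuss over:

# Fractional "MB" steps are exact integers in weight units, assuming
# SegWit-style accounting (1 non-witness byte = 4 WU, so 1MB of
# non-witness data corresponds to 4,000,000 WU).
WU_PER_MB = 4_000_000

for step_mb in (0.01, 0.04, 1.00):
    print(f"+{step_mb:.2f} MB -> +{int(round(step_mb * WU_PER_MB)):,} weight units")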