Bitcoin Forum
June 21, 2024, 02:59:52 PM *
Pages: « 1 [2]  All
Author Topic: Seriously devs, segwit adoption and LN needs more time  (Read 223 times)
penig
Member
**
Offline

Activity: 266
Merit: 13


December 22, 2017, 02:36:35 PM
 #21

Decentralization is always the highest priority for Core.  Increasing the block size directly threatens this, since it highly favors the larger miners.

That ship sailed a loooong time ago, when the first ASIC went to production.


But wouldn't larger blocks make it more difficult to run a node?  The chain would grow too fast.

Absolutely, long term it's not a great solution.  However, it's not an immediate problem for existing full nodes (they'd only need to keep up with the larger new blocks).  Today the immediate and major problem is that blocks are full and fees are high.  Fix the block size later.  Or we can continue to do neither and let the BCH option gain momentum.
DooMAD
Legendary
*
Offline

Activity: 3822
Merit: 3160


Leave no FUD unchallenged


December 22, 2017, 03:51:18 PM
 #22

Problem being, once the blocksize is increased, there's effectively no going back. So I do understand Core's conservative blocksize philosophy.

That's only a problem if we're talking about a static blockweight, though.  And for the life of me, I can't figure out why, as a community, we still think in such limited, primitive terms.  If you make it adjustable by algorithm, then it can either increase or decrease depending on set parameters.  You can also make those parameters as restrictive as you like to prevent any sudden, large increases.  You can even work out the maths so it effectively caps increases per year at a level the community would be comfortable with.  There's no limit to how "conservative" you can make it.  But, for whatever reason, people still can't seem to get behind the idea in any considerable number.

Everyone needs to understand that blockweight doesn't have to be defined as an integer.  Why can't the maximum blockweight increase during busy periods by 0.01MB or 0.04MB to ease the congestion?  And then why not allow it to reduce again when the legitimate traffic dies down?  If it's done responsibly, the blocks would easily still be far smaller than BCH's maximum after 4 or more years.
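To make the idea above concrete, here's a minimal sketch of what such an adjustable blockweight rule could look like.  Everything here is hypothetical and illustrative — the step size, thresholds, yearly cap, and adjustment cadence are made-up parameters, not any real or proposed consensus rule:

```python
# Hypothetical sketch of an adaptive max-blockweight rule: small fractional
# adjustments driven by observed demand, with a hard cap on total growth per
# year.  All figures are illustrative, not a real proposal.

BASE_WEIGHT_MB = 4.0        # floor: never shrink below the current limit (illustrative)
STEP_MB = 0.04              # maximum change per adjustment period
MAX_YEARLY_GROWTH = 1.10    # cap: at most +10% relative to the year's starting limit

def adjust_weight(current_mb, avg_fullness, year_start_mb):
    """Nudge the limit up slightly when recent blocks are congested,
    back down when demand subsides, never exceeding the yearly cap."""
    if avg_fullness > 0.90:        # sustained congestion: nudge up
        proposed = current_mb + STEP_MB
    elif avg_fullness < 0.50:      # demand died down: nudge back down
        proposed = max(BASE_WEIGHT_MB, current_mb - STEP_MB)
    else:                          # normal load: leave the limit alone
        proposed = current_mb
    # Clamp against the allowed growth since the start of the year.
    return min(proposed, year_start_mb * MAX_YEARLY_GROWTH)

limit = adjust_weight(4.0, 0.95, 4.0)   # congested period: limit rises by one step
```

The point of the clamp is exactly the "cap increases per year" property described above: no matter how many congested periods occur in a row, the limit can never run away faster than the community-chosen bound.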

There's being "conservative" and then there's being "unreasonable and myopic".  I think I've used up nearly all the benefit of the doubt I'm prepared to give and now feel that both developers and certain vocal minorities within this community are drifting dangerously close to the latter.  These people need to show some nous and soon.

I fully comprehend the reasons why people are so concerned with the overall size of the blockchain and how rapidly it grows.  Decentralisation is vital for the network, and it's likely that forks like BCH haven't paid nearly enough attention to that aspect.  But what we have at the moment is unsustainable.  It completely shifts the emphasis of what Bitcoin was intended to be.  In all my time on these boards, I've steadfastly maintained that the users won't support a network that doesn't support them.  But it appears to be heading in that direction.  If Bitcoin is going to redefine itself as a decentralised settlement layer for the elites and a bunch of idiots who think they should still be able to run a node on a Raspberry Pi, don't expect everyone to play along.  There will be further splits and forks if we go down that path.  The market won't abide completely abandoning the concept of digital cash for the internet.


But wouldn't larger blocks make it more difficult to run a node?  The chain would grow too fast.

Not an issue if we're sensible about it.  Small (and I mean small) increases over time, in direct correlation with observable demand, are preferable to sudden floods of spare room that could be abused by spammers.  At all times, we'd retain full control over the maximum increase over any given period of time.  If we happened to miscalculate at first and the chain was still growing too quickly, it's technically only a softfork to make the rules more restrictive (although it could potentially lead to a hardfork if some decide to let it keep growing as per the existing rules while others elect to cap it off, so it's better to get it right on the first attempt).


Block size of 2MB should fix the fees for a while.

Doubling is unnecessary and too reckless.  Stop thinking in terms of whole numbers.  There's nothing wrong with fractions.  If you can handle 8 decimal places in the currency itself, you can manage a couple in the blockweight as well.  This is exactly the kind of thing I'm referring to.  Less blunt and simplistic, more subtle and considered, please.
HeRetiK
Legendary
*
Offline

Activity: 2968
Merit: 2111



December 23, 2017, 05:26:47 PM
 #23

Problem being, once the blocksize is increased, there's effectively no going back. So I do understand Core's conservative blocksize philosophy.

That's only a problem if we're talking about a static blockweight, though.  And for the life of me, I can't figure out why, as a community, we still think in such limited, primitive terms.  If you make it adjustable by algorithm, then it can either increase or decrease depending on set parameters. [...]

Thanks for the link! Seeing how there have been multiple proposals regarding dynamic increases of the maximum blocksize, I've actually been wondering why there hasn't been any hardfork yet trying to implement one of them -- unless there has been and I sort of lost track.

Nonetheless, I'd probably just go for a straightforward static periodic block size increase every year or halving period instead of an algorithm based on network traffic. Assuming the latter can be gamed one way or another (I still need to let your proposal sink in a bit, but I'm not yet fully convinced it can't be exploited to force the maximum block size up anyway), this would at least skip the extra step of trying to anticipate transaction workload.
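The static alternative described above is even simpler to sketch: a fixed, scheduled increase per halving epoch, with no dependence on traffic at all.  The base size and growth factor here are illustrative assumptions, not any real proposal:

```python
# Sketch of a static, scheduled max-block-size rule: the limit grows by a
# fixed factor each halving epoch, independent of network traffic.
# Base size and growth factor are illustrative, not a real proposal.

HALVING_INTERVAL = 210_000   # blocks between Bitcoin block-reward halvings

def max_block_size_mb(height, base_mb=1.0, growth_per_halving=1.25):
    """Return the max block size in MB at a given block height."""
    epoch = height // HALVING_INTERVAL
    return base_mb * growth_per_halving ** epoch
```

Because the schedule depends only on block height, every node computes the same limit deterministically, and there is nothing for spammers to game -- at the cost of the limit not responding to actual demand.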
aznboy84
Full Member
***
Offline

Activity: 137
Merit: 100



December 23, 2017, 08:48:08 PM
 #24

I don't understand why a lot of exchanges aren't even using SegWit right now; that is the most important thing they could do.
It's the only way they could lower fees at the moment, before LN gets implemented.
