Bitcoin Forum
Author Topic: Pros/cons for dynamic blocksize BIP?  (Read 714 times)
neurotypical (OP)
Hero Member
*****
Offline Offline

Activity: 672
Merit: 502


View Profile
October 21, 2015, 03:48:30 PM
 #1

I remember reading about a BIP that basically proposed that the block size would grow or shrink based on parameters such as current hashrate and demand (usage, as in transactions per second or however it's calculated). My question is: this sounds great and all, but I'm sure it has some serious problems, or it would be the leading BIP.

So what are the pros and cons compared to other alternatives, such as raising the blocksize at a fixed rate, or keeping it relatively small + Blockstream, and so on?
DooMAD
Legendary
*
Offline Offline

Activity: 3822
Merit: 3160


Leave no FUD unchallenged


View Profile
October 21, 2015, 09:30:14 PM
 #2

Quote from: neurotypical on October 21, 2015, 03:48:30 PM
I remember reading about a BIP that basically proposed that the block size would grow or shrink based on parameters such as current hashrate and demand (usage, as in transactions per second or however it's calculated). My question is: this sounds great and all, but I'm sure it has some serious problems, or it would be the leading BIP.

So what are the pros and cons compared to other alternatives, such as raising the blocksize at a fixed rate, or keeping it relatively small + Blockstream, and so on?

The main critique I got when pushing for such a proposal was the potential for miners to game the system and nudge it towards a blocksize that benefits them.  I'm still convinced that's an improvement over BIP100, where the miners straight up choose whatever blocksize they deem most profitable, but I'm still trying to think of ways to minimise that risk.  The other critique was that if the limit doesn't have a "hard" cap and is based primarily on demand, the blocksize could grow too large and jeopardise decentralisation.

It seems the trick is finding the right balance between what works for miners, nodes and users, but the difficulty is that bandwidth limitations vary widely from one region to another.  What's been talked about so far seems to strike a balance between miners and users, but the nodes and their requirements are more challenging to factor in:

The ideal solution is one that doesn't create a blocksize too large for full nodes to cope with, but at the same time, one that doesn't force a large number of people off chain.  Even doubling to 2MB in one go is quite high when you think about it, so we should aim to increase (or decrease) in smaller increments more often, if needed.  One possible route is to take the best elements of BIP100 and BIP106.  BIP100 only considers what benefits the miners and not the users.  BIP106 only considers what benefits the users and not the miners.  So neither is particularly balanced on its own.  If we can find a way of allowing half of the "vote" to go to the miners and half to an automated, algorithmic vote based on traffic volumes, then we maintain some kind of equilibrium that should (in theory, at least) benefit the network as a whole.

Code:
Miners vote by encoding 'BV'+BlockSizeRequestValue into the coinbase scriptSig to:
    raise the blocksize limit by 12.5%,
    lower the blocksize limit by 12.5%,
    or remain at the current blocksize limit.

This vote, however, only counts for half of the total vote and the other half is determined by algorithm based on network traffic:

If more than 50% of the blocks found in the first 1008 of the last difficulty period are larger than 90% of MaxBlockSize
    Network votes for MaxBlockSize = MaxBlockSize + 12.5%

Else if more than 90% of the blocks found in the first 1008 of the last difficulty period are smaller than 50% of MaxBlockSize
    Network votes for MaxBlockSize = MaxBlockSize - 12.5%

Else
    Network votes for keeping the same MaxBlockSize

The 12.5% part is open to negotiation, some think 10% is more reasonable (i.e. BIP105).  If every 1008 blocks is too fast, we could (for example) increase that to 2016 blocks, approximately every two weeks.  Tweaks are inevitable, but I feel it's something that could work if it's not too complex to code.

(the above is largely based on upal's BIP106 proposal with an adjustment by juiceayres and then me bundling it with a hint of BIP100 style voting)
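The combined mechanism described above can be sketched in Python. This is a rough illustration of the thread's pseudocode only: the function names, the +1/0/-1 vote encoding, and the equal 50/50 weighting are my own framing, not code from BIP100, BIP105 or BIP106.

```python
STEP = 0.125  # 12.5% adjustment step; some (e.g. BIP105) prefer 10%

def network_vote(block_sizes, max_block_size):
    """Automated half of the vote: +1 (raise), -1 (lower) or 0 (keep),
    based on how full the sampled 1008 blocks were."""
    n = len(block_sizes)
    full = sum(1 for s in block_sizes if s > 0.9 * max_block_size)
    light = sum(1 for s in block_sizes if s < 0.5 * max_block_size)
    if full > 0.5 * n:       # more than 50% of blocks were over 90% full
        return 1
    if light > 0.9 * n:      # more than 90% of blocks were under 50% full
        return -1
    return 0

def next_max_block_size(max_block_size, block_sizes, miner_vote):
    """Give half the weight to the miners' BV-style vote (+1/0/-1) and
    half to the traffic-based network vote, then apply one step."""
    combined = 0.5 * miner_vote + 0.5 * network_vote(block_sizes, max_block_size)
    if combined > 0:
        return int(max_block_size * (1 + STEP))
    if combined < 0:
        return int(max_block_size * (1 - STEP))
    return max_block_size
```

For example, with a 1MB cap, 1008 consecutive near-full blocks and miners also voting to raise, the cap would step up to 1,125,000 bytes; if the two halves disagree, they cancel out and the cap stays put, which is the intended check on miner self-interest.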

I thought smaller increases or decreases (not like all the "doubling" talk in most of the other proposals), applied here and there as required, would be more appealing to those who seem concerned only about the ability to run a node, but they're still not going for it.  Yet they seem inclined to go for the whole 2-4-8 increase, which could potentially result in larger blocks than we actually need.  I was convinced that if people wanted to apply pressure to test their theories about a fee market, a flexible limit would be ideal.  We would always be close to the limit and, over short periods, might briefly brush up against it, so the need to include a reasonable fee to prioritise processing would be far more constant than if we had an average blocksize of 5MB but an 8MB cap.  But even that doesn't get the small-blockians interested.  I'm genuinely stumped by that.
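As a side note on the adjustment cadence discussed earlier: a quick back-of-envelope calculation (my own, not from any of the BIPs) shows why the hard-cap crowd worries about compounding if every single 1008-block period (roughly one week) voted for the full raise:

```python
# Hypothetical worst case: every ~weekly adjustment period votes to raise.
weeks_per_year = 52
for step in (0.125, 0.10):  # the thread's 12.5% step vs the BIP105-style 10%
    growth = (1 + step) ** weeks_per_year
    print(f"{step:.1%} step: up to {growth:.0f}x growth per year")
```

Even the gentler 10% step compounds to roughly 140x in a year if unopposed, which is why the traffic-based half of the vote, or a slower 2016-block cadence, matters as a brake.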
brg444
Hero Member
*****
Offline Offline

Activity: 644
Merit: 504

Bitcoin replaces central, not commercial, banks


View Profile
October 21, 2015, 09:34:17 PM
 #3

Quote from: DooMAD on October 21, 2015, 09:30:14 PM
But even that doesn't get the small-blockians interested.  I'm genuinely stumped by that.

To be fair, a flex cap proposal might be interesting in the future as a way for miners to maximize fee pressure, but any talk of it without a solid hard cap enforced by miners is largely useless.

"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
DooMAD
Legendary
*
Offline Offline

Activity: 3822
Merit: 3160


Leave no FUD unchallenged


View Profile
October 22, 2015, 08:01:34 PM
 #4

Quote from: brg444 on October 21, 2015, 09:34:17 PM
Quote from: DooMAD on October 21, 2015, 09:30:14 PM
But even that doesn't get the small-blockians interested.  I'm genuinely stumped by that.
To be fair, a flex cap proposal might be interesting in the future as a way for miners to maximize fee pressure, but any talk of it without a solid hard cap enforced by miners is largely useless.

But again, the only way to select a hard cap is to guesstimate, unless someone comes up with a more inventive solution.  And if we guess wrong, there could be implications.  All we're doing is picking a number that "sounds safe".  It just seems so clumsy, blunt and inelegant.  It certainly doesn't feel like a valid answer for a technology as revolutionary as this one.  Surely we can do better than a shot in the dark.  Maybe some clever coder-type could find some way to algorithmically measure network decentralisation and factor it into a formula that determines what volume the network can handle at any given time and have a dynamic blocksize cap based on that?
brg444
Hero Member
*****
Offline Offline

Activity: 644
Merit: 504

Bitcoin replaces central, not commercial, banks


View Profile
October 22, 2015, 08:14:44 PM
 #5

Quote from: DooMAD on October 22, 2015, 08:01:34 PM
But again, the only way to select a hard cap is to guesstimate, unless someone comes up with a more inventive solution.  And if we guess wrong, there could be implications.  All we're doing is picking a number that "sounds safe".  It just seems so clumsy, blunt and inelegant.  It certainly doesn't feel like a valid answer for a technology as revolutionary as this one.  Surely we can do better than a shot in the dark.  Maybe some clever coder-type could find some way to algorithmically measure network decentralisation and factor it into a formula that determines what volume the network can handle at any given time and have a dynamic blocksize cap based on that?

The proper measurement of the cost of the option to run a node (decentralization) is external to the system and cannot be fitted into any algorithm.

On the other hand, it can certainly be calculated given available data on the different costs involved. The observation that the existing 1MB limit is already pushing average systems to their limit is not a guess. Any further decision should be grounded in that reality.


"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1004



View Profile
October 23, 2015, 12:58:21 PM
 #6

The problem I have with the dynamic blocksize proposals is that they are demand-based.  The proposals seem to assume that "real demand" for blockspace will always remain comfortably below what can be handled by a well-decentralised network.

Certainly, all else equal, prediction is inferior to algorithmic measurement.  However, I would much prefer a limit given by a prediction of what will be possible to one given by an algorithmic measurement of what we would like to be possible.