Bitcoin Forum
Author Topic: BIP 106: Dynamically Controlled Bitcoin Block Size Max Cap  (Read 9341 times)
Carlton Banks (Legendary)
August 20, 2015, 11:47:47 PM  #21

While the last 2 make sense, the first one is out of proportion IMO. The increment step could be debated over and over, but I think a more straightforward solution is to peg it to difficulty, i.e. if an increase is triggered, the block size limit should be readjusted in the same proportion that the difficulty changed:

- If the difficulty increased 20% and a block size limit increase is triggered, the limit would be increased by 20%.
- If the difficulty only increased by 5%, so would the block size limit.
- If the difficulty increased but the block limit increase was not triggered, stay as is.
- If the difficulty was reduced, in every case reduce the block limit by that same proportion.

How does a difficulty change affect the size of blocks found today? Is there any correlation between difficulty and block size? If not, then IMO it won't be wise to make difficulty a parameter for changing the max block size cap.

Why would the fact that difficulty and block size are not related today preclude that relationship from helping to solve a network problem? Explain why.
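The difficulty-pegging rules quoted above can be sketched in a few lines. This is an illustrative sketch in Python, not consensus code; the function name and the `increase_triggered` flag are my own placeholders:

```python
def next_block_size_limit(current_limit, old_difficulty, new_difficulty,
                          increase_triggered):
    """Peg block size limit changes to the relative difficulty change."""
    ratio = new_difficulty / old_difficulty
    if ratio < 1.0:
        # Difficulty fell: in every case, shrink the limit by the same proportion.
        return current_limit * ratio
    if increase_triggered:
        # Difficulty rose AND an increase was triggered: grow by the same %.
        return current_limit * ratio
    # Difficulty rose but no increase was triggered: stay as is.
    return current_limit
```

So a 20% difficulty rise with a triggered increase turns a 1,000,000-byte limit into 1,200,000 bytes, while the same rise without a trigger leaves it untouched.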

Vires in numeris
skang (Sr. Member)
August 20, 2015, 11:57:59 PM  #22

I don't know why people are calling this a good proposal. Either they don't understand the problem at hand or it's me. I'd gladly accept that it's me if you can explain, please.

Let us try to simulate this proposal.
Let us say the number of transactions is rising and blocks are regularly around 1 MB. This algo will increase the cap accordingly.
Now, with global adoption, let us say the number of transactions rises further. This algo raises the cap further.
Let us say there are 25 MB worth of transactions now. This algo raises the cap to 25 MB, but does that work?

Due to practical limits, a big block takes a long time to propagate across the network. During this time, maybe another miner also successfully solves a block, only to realize after a while that he wasn't the first, thus producing orphans. The second, and very important, effect is that the winner gets a headstart: he starts working on the next block while the rest of the world is still working on the previous one, waiting to download the successful solution. As mining has now gone big, the effect of this headstart is huge, and it increases with the mining power a miner or a pool has.
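The headstart penalty described here can be put into rough numbers. A common back-of-the-envelope model (my assumption, not from this thread) treats block discovery as a Poisson process with a 600-second mean, so the chance that a competitor finds a block while yours is still propagating for T seconds is about 1 - exp(-T/600):

```python
import math

def orphan_risk(block_size_mb, bandwidth_mbps=8.0):
    """Rough probability that a competing block appears while this block
    propagates.  Assumes a single transfer at the given bandwidth and
    Poisson block arrivals with a 600 s mean - an illustrative model only."""
    transfer_seconds = block_size_mb * 8 / bandwidth_mbps  # MB -> megabits
    return 1 - math.exp(-transfer_seconds / 600)

for size_mb in (1, 8, 25):
    print(f"{size_mb:>2} MB block: ~{orphan_risk(size_mb):.1%} orphan risk")
```

On this toy model a 25 MB block at 8 Mbit/s carries roughly a 4% orphan risk per hop, and the risk compounds with every relay hop, which is exactly why the penalty hits poorly-connected miners hardest.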

tl121 made this exact point.

There has to be a maximum block size limit for bitcoin nodes to work.  The limit is not just a program variable needed for block chain consensus, it has real world implications in terms of storage, processing and bandwidth resources.  If a node doesn't have sufficient resources it will not be able to work as a properly functioning node. These resources have to be provisioned and managed by node operators who have to plan in advance to acquire the needed resources.  That is the reason for BIP 101 having a schedule for changes to the limits. A dynamic algorithm can not magically instantiate the needed resources.

The counter to this position, given by Gavin, is that his simulations show that this headstart does not have any effect.
But the counter's counter from the other side is that his simulations do not take internet latency into account.

People, the problem is not 'what' the limit should be and 'how' to reach it. The problem is that large blocks will kill Bitcoin, so large blocks are not an option; what to do then is the question. How do we make Bitcoin scalable?

"India is the guru of the nations, the physician of the human soul in its profounder maladies; she is destined once more to remould the life of the world and restore the peace of the human spirit.
But Swaraj is the necessary condition of her work and before she can do the work, she must fulfil the condition."
Trolololo (Sr. Member)
August 21, 2015, 12:08:33 AM  #23

I had this same idea today and created a new thread proposing it, only to find that you had already created a thread and developed the idea some days ago.

I give it my 100% support, because the max block size should be dynamically recalculated (based on the previous 2016 block sizes), just as difficulty is dynamically recalculated every 2016 blocks.

Go on with it!!!
goatpig (Legendary, Armory Developer)
August 21, 2015, 12:19:47 PM  #24

How does a difficulty change affect the size of blocks found today? Is there any correlation between difficulty and block size? If not, then IMO it won't be wise to make difficulty a parameter for changing the max block size cap.

It does not, but I would turn this question back to you: how is a doubling/halving of the block cap representative of the actual market growth/contraction that triggered the change? Difficulty variations are built into the blockchain and provide a very realistic perspective on the economic progression of the network, as they are a marker of profitability.

Keep in mind that my proposal evaluates total fee progression over difficulty periods as well, so if a new chip is released that largely outperforms previous generations and the market quickly invests in it, that event on its own would not be enough to trigger a block size increase, as there is no indication fees would also climb in the same fashion.

The idea is to keep the block size limit high enough to support organic market growth, while progressing in small enough increments that each increment won't undermine the fee market. I think difficulty progression is an appropriate metric to achieve that goal.

I've always thought fees should be somehow inversely pegged to difficulty to define the baseline of a healthy fee market. This is a way to achieve it.
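goatpig's dual condition, where a hashrate surge alone must not trigger growth, could look like this (the 5% threshold is purely a placeholder of mine):

```python
def cap_increase_allowed(difficulty_growth, fee_growth, threshold=0.05):
    """Allow a max-cap increase only when BOTH difficulty and total fees
    over the difficulty period grew by at least `threshold`.
    A new-chip hashrate jump (high difficulty_growth, flat fee_growth)
    therefore does not qualify on its own."""
    return difficulty_growth >= threshold and fee_growth >= threshold
```

Requiring both signals is what protects the fee market: block space only grows when demand (fees) is rising alongside mining investment.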

Quote
People, the problem is not 'what' the limit should be & 'how' to reach it. The problem is that large blocks will kill bitcoin, so large blocks are not an option, what to do then is the question? How to make bitcoin scalable?

The problem is that blocks larger than the network's baseline resources will magnify centralization. This is a bad thing, but the question is not "how to make Bitcoin scalable". My understanding of scalability (please share yours if it differs from mine) is of a piece of software that attempts to consume as many resources as are made available. An example of a scalable system is Amazon's EC2: the more physical machines support it, the more powerful it gets. Another is BitTorrent, where the more leechers show up, the more bandwidth the torrent totals (i.e. bandwidth is not defined by seed boxes alone).

I would say the current issue with Bitcoin and big blocks isn't scalability but rather efficiency. We don't want to use more resources; we want to use the same amount of resources in a more efficient manner. Block size is like a barrier to entry: the bigger the blocks, the higher the barrier. Increasing efficiency in block propagation and verification would in return reduce that barrier, allowing for an increase in size while keeping the network healthy. I am not familiar with the Core source, but I believe there are a few low-hanging fruits we can go after when it comes to block propagation.

Also, I believe the issue isn't truly efficiency, but rather centralization. Reducing the barrier to entry increases participants and thus decentralization, but the real issue is that there are no true incentives to run nodes or to spread mining across smaller clusters. I understand these are non-trivial problems, but that is what the September workshop should be about, rather than scalability.

If there is an incentive to run full nodes and an incentive to spread mining, then block size will no longer be a metric that affects centralization on its own. Keep in mind that it currently is one partly because it is one of the last few metrics set to a magic number. If it were controlled by a dynamic algorithm tracking economic factors, we wouldn't be wasting sweat and blood on this issue today and could instead be looking at how to make the system more robust and decentralized.

KNK (Hero Member)
August 21, 2015, 12:40:59 PM (last edit: August 21, 2015, 02:15:04 PM)  #25

Sorry for the long post ...
TL;DR: +1 for dynamic block size. I hope it is not too late for the right change.

A dynamic algorithm can not magically instantiate the needed resources.
It doesn't need to! If properly implemented, it will be the other way around (see below {1})

The reason I feel OP's proposal is beautiful is because it requires users to fill up nodes with high Tx volumes and then miners to fill up blocks from mempool.
Exactly - here is what should be used:
 {1}
  • Hard limit size - calculated by some algorithm for the entire network (see below {2})
  • Client limit size - configured by the client (miner/full node) based on its hardware and bandwidth limitations or other preferences

Each node may set its own limit on how big the blocks it sends to the network are, but should accept blocks up to the Hard limit

I hate to rain on the parade, but full blocks are an essential feature going into the future. Any proposal that tries to avoid ever having full blocks must also address how transaction fees are going to replace inflation as it diminishes.
If not, then there will be no funding for the highly redundant network that exists now, and it will necessarily atrophy to a handful of nodes, hardly less subject to coercion, malpractice, and discrimination than our financial system today.
This is probably where consensus will be hard to achieve if it is hard-coded and not dynamic - cheaper transactions or bigger fees? Some want the first, others the second, and the truth is in the middle after both sides make some compromise, so this should also be kept in mind when planning the dynamic algorithm.

What I may suggest for the calculation of the Hard limit is:
{2}
 When calculating the new target difficulty, do the same for the block size:
 
  • Get the average size of the last 4000 non-empty blocks = AvgSize
  • Set the new block size limit to 150% of AvgSize, moving by no more than a factor of two up or down from the previous limit

    How it is expected to work:
     The Hard limit targets 66%-full blocks on roughly a one-month moving average, recalculated at each difficulty change.
     BUT it depends on the Soft limit chosen by the miners, so:
     
    • If bandwidth is an issue (as it is for most private pools and those in China), they will send smaller blocks and thus vote for their preferred size with their work
    • If there is a need for much bigger blocks but the current state of the hardware (CPU or HDD) does not allow it, no increase will take place, because clients won't send bigger blocks than configured
    • If there are not enough transactions to make bigger blocks, the size will be reduced

    EDIT: An option in the mining software to ignore blocks above the Soft limit puts a control switch in each miner's hands, in addition to the pools'
EDIT 2: If you look at the average block size chart, you will see that the current average size is far from the 1MB limit once you ignore the stupid stress tests of the last month or two - and even then the average is around 80% - so a 2/3-full (66%) block size target is a good one IMHO
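KNK's retarget rule can be sketched directly. The `EMPTY_THRESHOLD` used to decide which blocks count as non-empty is my own assumption; everything else follows the two bullet points above:

```python
EMPTY_THRESHOLD = 1_000  # bytes; treat coinbase-only blocks as empty (assumption)

def new_hard_limit(block_sizes, current_limit):
    """At each difficulty retarget, set the hard limit to 150% of the average
    size of the last 4000 non-empty blocks, moving by at most a factor of
    two in either direction (illustrative sketch, not consensus code)."""
    nonempty = [s for s in block_sizes if s > EMPTY_THRESHOLD][-4000:]
    if not nonempty:
        return current_limit
    avg_size = sum(nonempty) / len(nonempty)
    proposed = avg_size * 1.5  # aim for ~66%-full blocks on average
    return max(current_limit / 2, min(proposed, current_limit * 2))
```

With blocks averaging 600 kB under a 1 MB limit, the next limit would be 900 kB; the factor-of-two clamp keeps a sudden flood or drought of transactions from moving the cap too violently in one period.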

Mega Crypto Polis - www.MegaCryptoPolis.com
BTC tips: 1KNK1akhpethhtcyhKTF2d3PWTQDUWUzHE
upal (OP) (Full Member)
August 21, 2015, 08:05:24 PM  #26

Thanks to everyone for providing good arguments for improving the proposal. I have derived a second proposal and updated the OP accordingly. If you have any counter-argument to this proposal, feel free to post it here or in the comment section of the article - http://upalc.com/maxblocksize.php
skang (Sr. Member)
August 22, 2015, 04:38:27 AM  #27

The problem is that blocks larger than the network's baseline resources will magnify centralization. This is a bad thing, but the question is not "how to make Bitcoin scalable". My understanding of scalability (please share yours if it differs from mine) is of a piece of software that attempts to consume as many resources as are made available. An example of a scalable system is Amazon's EC2: the more physical machines support it, the more powerful it gets. Another is BitTorrent, where the more leechers show up, the more bandwidth the torrent totals (i.e. bandwidth is not defined by seed boxes alone).

Your understanding is correct, but Bitcoin is unlike anything in history. In the traditional sense, as in the examples you give, if a resource is getting fully utilized you add more of it, and the key resources are the ones that make that technology possible.

Although disk space is a resource, it is not a key resource enabling torrenting, in the sense that disk space existed before the invention of the internet, yet that alone does not allow torrents to exist.
Inter-networking is the key resource that allows torrenting to exist. Now, what do you do if the network is fully occupied? You add more of it; problem solved.

With Bitcoin, the network is a resource, but it is not the key resource, in the sense that networks existed before Bitcoin.
The blockchain is the key resource that allows Bitcoin to exist. Now what do you do if blocks are full? You add more blocks. Ding! Not allowed, mate!

Blocks are essentially a list of transactions per unit time. So when we say we need to increase blocks, we mean we need to increase the rate of transaction throughput.

There are only 3 things in this equation that we can tweak:
1. Increase the number of blocks. Not allowed, by definition of Bitcoin: 1 per 10 minutes.
2. Decrease the time. Not allowed, by definition of Bitcoin: each block comes out in 10 minutes.
3. Increase the block size. Allowed, but the practical limits of technology come in. With each kB added, the download time for a block increases by milliseconds, and the miner who found that block now has a headstart of that many milliseconds. The bigger the miner, the more headstarts he gets; thus the smaller miner leaves, and this cycle continues until only big miners are left. Complete centralization! Not an option.

To people looking at it in the traditional way, miners might look like resources, so that more miners ought to mean more transaction throughput. But it does not, for the same reason that more hard disks do not mean better torrent speeds.
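Point 3 is where all the pressure concentrates, because throughput is just block size divided by transaction size and the fixed 10-minute interval. A quick arithmetic sketch (the 500-byte average transaction size is an assumption of mine):

```python
AVG_TX_BYTES = 500        # assumed average transaction size
BLOCK_INTERVAL_S = 600    # fixed by Bitcoin's design: one block per 10 minutes

def throughput_tps(block_size_bytes):
    """Transactions per second implied by a given block size."""
    return block_size_bytes / AVG_TX_BYTES / BLOCK_INTERVAL_S

print(f"1 MB blocks: ~{throughput_tps(1_000_000):.1f} tx/s")
print(f"8 MB blocks: ~{throughput_tps(8_000_000):.1f} tx/s")
```

So even an 8x cap increase only buys ~27 tx/s under these assumptions, which is skang's point: tweaking block size alone cannot deliver orders-of-magnitude scaling.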

I would say the current issue with Bitcoin and big blocks isn't scalability but rather efficiency. We don't want to use more resources; we want to use the same amount of resources in a more efficient manner. Block size is like a barrier to entry: the bigger the blocks, the higher the barrier. Increasing efficiency in block propagation and verification would in return reduce that barrier, allowing for an increase in size while keeping the network healthy. I am not familiar with the Core source, but I believe there are a few low-hanging fruits we can go after when it comes to block propagation.

Also, I believe the issue isn't truly efficiency, but rather centralization. Reducing the barrier to entry increases participants and thus decentralization, but the real issue is that there are no true incentives to run nodes or to spread mining across smaller clusters. I understand these are non-trivial problems, but that is what the September workshop should be about, rather than scalability.

If there is an incentive to run full nodes and an incentive to spread mining, then block size will no longer be a metric that affects centralization on its own. Keep in mind that it currently is one partly because it is one of the last few metrics set to a magic number. If it were controlled by a dynamic algorithm tracking economic factors, we wouldn't be wasting sweat and blood on this issue today and could instead be looking at how to make the system more robust and decentralized.

Bitcoin got to where it is today - I mean so much publicity and usage - because everything was taken care of, even the incentive for running full nodes.
What is that incentive, you ask? That incentive is Bitcoin's survival.
The way to get something done is not always reward; sometimes it is punishment.
Here the punishment is Bitcoin's death.

The reason for running nodes is the same as the reason for feeding a goose that lays golden eggs.
But the problem here is that the people feeding the goose (people running nodes) are not the same as the people collecting the eggs (the miners).
People have difficulty understanding indirect influences, but they need to realize that it is they who are consuming the gold, not the collector.
The miners don't even necessarily use bitcoin; they might only be doing it for fiat money.
So the people feeding the goose must realize they need to keep doing so, because although it directly looks like the collector is getting rich, indirectly it is the feeders who want the gold.

I would go a step further and say anybody not running a full node is not a Bitcoin user in the true sense. Why?
Because the fact that your coins got transferred is only guaranteed by the history of those coins. And you don't have a copy of that history!
You are depending on someone else to supply a copy of the history.
If it's your brother in the family who runs the full node you access, then it's fine. But for everything else, you are better off with banks.

There are counterpoints to this. And these counterpoints are only validly made by people who are happy using banks and trusting them but find other benefits in Bitcoin, namely three:
1. Bitcoin is pseudonymous.
2. Bitcoin has no geographical limit. Bitcoin has no monetary limit.
3. Bitcoin is 24x7 - more than 3 times banks' opening hours.

Now these use cases are huge and bring with them a lot of these people who trust others with their money, because the fire in the jungle hasn't reached their home yet.
They will keep running light wallets and enjoy these benefits until the banks simply tidy up and make themselves 24x7 and without limits.
Then all of these users will leave happily, because banks have always kept free candy on the counter.
So I don't care about people who don't run full nodes, and neither should anyone who cares about Bitcoin.

vane91 (Member)
August 22, 2015, 06:36:07 AM  #28

IMHO, the first proposal is good if we target, for example, x% of average block capacity.

For example, if on average blocks are 50% full and we target 66%, then reduce the block size; if blocks are 70% full, then increase block capacity. Let it run and see how it affects the fee market.

The best thing about this is that we can now target an average fee per block:

if we are targeting 1 BTC per block in fees and fees rise too much, lower the %-full target; if fees decline, raise the target.

There you go! Now people can vote for a block increase simply by including higher fees!
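vane91's two nested feedback loops can be sketched as follows; all the step sizes and bounds are illustrative choices of mine, not part of his post:

```python
def adjust_cap(cap, avg_fullness, target_fullness):
    """Inner loop: nudge capacity so average fullness moves toward target."""
    if avg_fullness > target_fullness:
        return cap * 1.1   # blocks fuller than target -> raise capacity
    if avg_fullness < target_fullness:
        return cap * 0.9   # blocks emptier than target -> shrink capacity
    return cap

def adjust_target(target_fullness, avg_fee_btc, fee_goal_btc=1.0):
    """Outer loop (vane91's rule): if fees per block overshoot the goal,
    lower the fullness target, letting capacity grow and fees fall;
    if fees undershoot, raise the target."""
    if avg_fee_btc > fee_goal_btc:
        return max(target_fullness - 0.05, 0.05)
    if avg_fee_btc < fee_goal_btc:
        return min(target_fullness + 0.05, 0.95)
    return target_fullness
```

Note how the two loops interact: lowering the fullness target makes the inner loop more willing to grow capacity, which is what relieves fee pressure.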
Carlton Banks (Legendary)
August 22, 2015, 09:41:24 AM  #29

IMHO, the first proposal is good if we target, for example, x% of average block capacity.

For example, if on average blocks are 50% full and we target 66%, then reduce the block size; if blocks are 70% full, then increase block capacity. Let it run and see how it affects the fee market.

The best thing about this is that we can now target an average fee per block:

if we are targeting 1 BTC per block in fees and fees rise too much, lower the %-full target; if fees decline, raise the target.

There you go! Now people can vote for a block increase simply by including higher fees!

I liked the points about handling the kind of inertia that will manifest itself in a dynamic re-sizing scheme, particularly when the limit is sidling around in a narrow band. Any scheme should be able to respond to those circumstances in a way that promotes a healthy fee market yet simultaneously disincentivises spammers. A decay function sounds like a good idea on that basis.

KNK (Hero Member)
August 22, 2015, 06:24:53 PM  #30

Thanks to everyone for providing good arguments for improving the proposal. I have derived a second proposal and updated the OP accordingly. If you have any counter-argument to this proposal, feel free to post it here or in the comment section of the article - http://upalc.com/maxblocksize.php
I don't think it is a good idea to include TX fees in the calculation.

See these charts: {1}, {2} and {3}

Now consider an old miner consolidating his coins and a spammer attacking the network - both will cause an increased volume of transactions (more or less visible in {1}) for a short period. But to succeed in the attack, the spammer (see 10 July and after) will include larger fees {3} and have fewer days destroyed {2}, while the old miner may 'donate' larger fees (as on 27 April) or use the fact that he may transfer without fees (end of November), because his coins are old enough.

With your proposal of including fees in the calculation, the block size after 10 July would increase, thus helping the attacker even more, as it would keep increasing the block size (even now) just because others add more fees to mitigate the attack and prioritise their own transactions.

upal (OP) (Full Member)
August 23, 2015, 02:56:23 PM  #31

Thanks to everyone for providing good arguments for improving the proposal. I have derived a second proposal and updated the OP accordingly. If you have any counter-argument to this proposal, feel free to post it here or in the comment section of the article - http://upalc.com/maxblocksize.php
I don't think it is a good idea to include TX fees in the calculation.

See these charts: {1}, {2} and {3}

Now consider an old miner consolidating his coins and a spammer attacking the network - both will cause an increased volume of transactions (more or less visible in {1}) for a short period. But to succeed in the attack, the spammer (see 10 July and after) will include larger fees {3} and have fewer days destroyed {2}, while the old miner may 'donate' larger fees (as on 27 April) or use the fact that he may transfer without fees (end of November), because his coins are old enough.

With your proposal of including fees in the calculation, the block size after 10 July would increase, thus helping the attacker even more, as it would keep increasing the block size (even now) just because others add more fees to mitigate the attack and prioritise their own transactions.

Thanks for your input. I was thinking the same, and hence modified Proposal 2. This time the max cap increase depends on block size increase, but Tx fees are still taken care of, so that miners can be compensated for the decreasing block reward. Please check both Proposal 1 & 2 and share your opinion. It would be good if someone could simulate Proposal 1 & 2 from Block 1 and share the results for both against the last difficulty change.
KNK (Hero Member)
August 23, 2015, 03:42:31 PM  #32

Have you considered my suggestion here about a 66%-full-blocks target on a 1-month moving average, with a Soft limit configured from the client?

I don't like the idea of forcing some fixed compensation for the fee - the network should choose that, and it is enough to give it the (right) triggers to do so.
I will use tl121's sentence here:
A dynamic algorithm can not magically instantiate the needed resources.
- just change that to 'increased fees can not ...'

Having an (easy to set) Soft limit allows the miners and pools to hold back block size growth in case of technical limitations. Yes, usage will be limited (and more expensive) too, but if/until fees cover the expenses for the bandwidth, space and CPU power required, it is better to limit the network than to crash it completely with overwhelming requirements.

CounterEntropy (Full Member)
August 23, 2015, 06:04:11 PM  #33

Have you considered my suggestion here about a 66%-full-blocks target on a 1-month moving average, with a Soft limit configured from the client?
Why 66% and not 75% or 80%? Where is this magic figure coming from? It is like Gavin's 8MB: he first suggested 20MB and then, to get the Chinese miners' support, agreed to 8MB. These magic figures should not be the pillars of a robust system that can scale with time.

I don't like the idea of forcing some fixed compensation for the fee - the network should choose that, and it is enough to give it the (right) triggers to do so.
As I can see, as per the OP's Proposal 2, the network is choosing everything. Where did you find fixed compensation in Proposal 2? (Proposal 1 does not take care of mining fees, so the question of compensation does not arise.)
KNK (Hero Member)
August 23, 2015, 06:42:46 PM (last edit: August 23, 2015, 07:05:05 PM)  #34

Why 66% and not 75% or 80%? Where is this magic figure coming from?
Good question - I will explain it in my next post.

As I can see, as per the OP's Proposal 2, the network is choosing everything. Where did you find fixed compensation in Proposal 2? (Proposal 1 does not take care of mining fees, so the question of compensation does not arise.)
The network is choosing, but based on fixed rules, which can easily be cheated in both cases. Half and double for Proposal 1 may seem OK for now, but think about 10MB and 20MB blocks: that's 40k+ transactions per block added - you ruin the need for fees for quite a while, or cause the size to flip-flop between 10MB and 20MB each time.

KNK (Hero Member)
August 23, 2015, 06:57:38 PM (last edit: August 23, 2015, 07:23:47 PM)  #35

Now where 66% came from ...
See these charts again - {1}, {2} and {3} - and also {4}

  • Pick any two nearby min and max from {1} - it's close to a 2:3 proportion
  • Check July's attack on {1} - it's close to 1:2, and the fees on {3} are close to 1:2 too, but the actual average size in {4} is still 2:3

Including days destroyed ({2}) to ignore short-lived extreme volumes would be a good idea, but I have no idea how to do it properly, so better not to do it at all.

99Percent (Full Member)
August 24, 2015, 08:29:02 PM  #36

I was about to post a similar suggestion too. I think this is a great idea.

I hope the core devs take a serious look at it.

CounterEntropy (Full Member)
August 25, 2015, 12:06:06 AM  #37

Now where 66% came from ...
See these charts again - {1}, {2} and {3} - and also {4}

  • Pick any two nearby min and max from {1} - it's close to a 2:3 proportion
  • Check July's attack on {1} - it's close to 1:2, and the fees on {3} are close to 1:2 too, but the actual average size in {4} is still 2:3

Including days destroyed ({2}) to ignore short-lived extreme volumes would be a good idea, but I have no idea how to do it properly, so better not to do it at all.

Deriving a number from a previous chart might work well for the short term, but it probably is not a good solution for the long run, because the chart will behave absolutely differently for a bubble, a spam attack or a tech innovation. In fact, that is the reason I do not like BIP 101, though I support bigger blocks. Gavin derived 20MB and then 8MB from previous statistics. The better way, in my opinion, is to take signals from the network itself, as proposed by the OP.
KNK (Hero Member)
August 25, 2015, 07:06:38 AM  #38

The better way, in my opinion, is to take signals from the network itself, as proposed by the OP.
That's what my proposal for the Soft limit does too - and not just signals, but a way for the miners to even shrink the block size if they have consensus on that with a large enough hashrate:
For example, if 50% of the miners want to keep the current block size, they mine ~40%-full blocks, and the size will not change even if the rest of the network mines full blocks and there are enough transactions to fill them all.
If they want it lower, they mine even smaller blocks (non-empty, as empty ones are simply ignored), knowing that they forgo some fees. With 10%-full blocks from 50% of the miners, the new size is guaranteed to be at most 83% of the current size; with 1%-full blocks, 75% of the current.
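Those percentages follow directly from the 150%-of-average retarget proposed earlier in the thread, and are easy to check. Assuming one group of miners with half the hashrate mines at a given fullness while the rest mine full blocks:

```python
def next_limit_fraction(fullness_a, fullness_b=1.0, weight_a=0.5):
    """New hard limit as a fraction of the current one under the
    150%-of-average-size retarget, when a weight_a share of blocks is
    fullness_a full and the rest are fullness_b full."""
    avg_fullness = weight_a * fullness_a + (1 - weight_a) * fullness_b
    return 1.5 * avg_fullness

print(next_limit_fraction(0.40))  # ~1.05: size roughly unchanged
print(next_limit_fraction(0.10))  # ~0.83: at most 83% of the current size
print(next_limit_fraction(0.01))  # ~0.76: about 75% of the current size
```

So mining ~40%-full blocks with half the hashrate holds the limit steady, exactly as claimed, and the floor of the scheme is 75% per period (half the hashrate mining near-empty blocks).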

Swordsoffreedom (Legendary)
August 25, 2015, 09:56:57 AM  #39

I like these proposals, as they factor in growth alongside transaction fees.

A dynamically adjusted max cap is the true solution to the block size problem: it allows a right-size-fits-all outcome in all cases and puts the issue to rest. It is the best middle ground, in my opinion, and will attract a lot less polarization, as it makes sense to me (and likely others) that block size growth should match demand, with a dynamic margin that grows or shrinks alongside usage in the future.

That said, in addition to a dynamic max cap, a recommended minimum for client installations would address concerns about nodes not having the needed resources, along with a warning if a node approaches a limit where it would no longer be able to function optimally.

DooMAD (Legendary)
August 25, 2015, 10:12:35 AM  #40

While this proposal is my second preference, I'd suggest getting a move on and coding it into existence if it is your first choice for how the network should be run. Public support seems to be rallying around other proposals, namely BIP101 and BIP100. If you want a dynamic block size, you need to get this out in the open pretty quickly to make it viable.
