Bitcoin Forum
Author Topic: Block size limit automatic adjustment  (Read 14496 times)
caveden (OP) · Legendary · Activity: 1106 · Merit: 1004
November 20, 2010, 11:33:48 PM · #1

Hello all,

I recently posted in another thread to express my concern about this subject, but I think it deserves a topic of its own.

This block size rule is really "dangerous" for the protocol. Rules like that become almost impossible to change once many clients implement the protocol. Take SMTP as an example: several improvements could be made to it, but how? It's impractical to synchronize the change.

And if we ever want to scale, such a limit will have to grow. I really think we should address this problem while there is only one client used by everyone and changes to the protocol are still feasible, because in the future we may not be able to make them.

As far as I understand, one purpose of this block size limit was to avoid flooding. Another, as mentioned here, is to keep transaction fees from becoming "too small", in order to preserve an incentive for block generation once coin production isn't that interesting anymore (if only a limited number of transactions can enter a block, those with the smallest fees won't be processed quickly).

So, if we really need a block size limit, and we also need it to scale, why not make the limit adjust itself to the transaction rate, just as the generation difficulty adjusts itself to the generation rate?

Some of the smart guys in this forum could come up with an adjustment formula that takes into consideration the total size of all transactions in the latest X blocks and calculates what the block size limit should be for the next X blocks, just like the difficulty factor. This way we avoid this "dangerous" constant in the protocol.
One of the things to decide is how rigorous the adjustment should be. Should it always leave enough room for all the transactions in the next block, or should blocks be kept "tight" enough that some transactions have to wait, thus pushing transaction fees up?

Okay, I do realize it would allow flooders to slowly increase the limit, but to what end? As long as generators aren't accepting 0-fee transactions, a flooder would have to pay to carry out his attack.

So, what do you think?
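
For illustration, here is a minimal Python-style sketch of what such a difficulty-style adjustment could look like. The window, headroom factor and floor are made-up values, not actual protocol parameters:

Code:
# Recompute the block size limit every ADJUSTMENT_WINDOW blocks from the
# observed transaction volume, the same way difficulty is recomputed from
# observed block times. All three parameters are illustrative only.
ADJUSTMENT_WINDOW = 2016        # blocks per adjustment period
HEADROOM = 1.5                  # allow 50% growth over observed demand
FLOOR = 1_000_000               # never drop below the current 1 MB limit

def next_block_size_limit(recent_block_sizes):
    """recent_block_sizes: total transaction bytes in each of the last
    ADJUSTMENT_WINDOW blocks."""
    average = sum(recent_block_sizes) / len(recent_block_sizes)
    return max(FLOOR, int(average * HEADROOM))

How rigorous the adjustment is then comes down to the headroom factor: close to 1.0 keeps blocks "tight" and pushes fees up, while well above 1.0 always leaves room for every pending transaction.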
RHorning · Full Member · Activity: 224 · Merit: 141
November 21, 2010, 12:20:31 AM · #2

Quote from: caveden
So, if we really need a block size limit, and we also need it to scale, why not make the limit adjust itself to the transaction rate, just as the generation difficulty adjusts itself to the generation rate?

Some of the smart guys in this forum could come up with an adjustment formula that takes into consideration the total size of all transactions in the latest X blocks and calculates what the block size limit should be for the next X blocks, just like the difficulty factor. This way we avoid this "dangerous" constant in the protocol.
One of the things to decide is how rigorous the adjustment should be. Should it always leave enough room for all the transactions in the next block, or should blocks be kept "tight" enough that some transactions have to wait, thus pushing transaction fees up?

Okay, I do realize it would allow flooders to slowly increase the limit, but to what end? As long as generators aren't accepting 0-fee transactions, a flooder would have to pay to carry out his attack.

So, what do you think?

I also think this is a useful idea to follow up on.  In this case, it might be nice to have a "floating average" of, say, the previous 2000 blocks plus some constant or constant percentage.  I'm suggesting perhaps the mean + 50%, to give it some flexibility to expand.  We could quibble over the exact amount of expansion room (perhaps allow 100% or 200% of the mean), but some sort of limit certainly sounds like a good idea and is something very easy and quick to compute.  It could also be calculated independently by every client to quickly accept or reject a particular block.

A genuine and sustained increase in transactions over the long duration would be accepted into the network at the increased rate and wouldn't put too much "push back" as the network adjusts to the new level.

Besides, we can run the current chain through the algorithm (whatever we come up with) and see how the network would have adjusted based on "real world" data.  It might be fun to see just how that would have worked out, too.
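
A rough sketch of that "floating average" rule, using the 2000-block window and mean + 50% suggested above (the names and the backtest helper are only illustrative):

Code:
# The limit for the next block is the mean size of the previous 2000 blocks
# plus 50% expansion room. Window and percentage are the figures suggested
# above, not settled values.
WINDOW = 2000
EXPANSION = 1.5   # mean + 50%

def size_limit(previous_sizes):
    window = previous_sizes[-WINDOW:]
    return int(sum(window) / len(window) * EXPANSION)

# "Run the current chain through the algorithm": replay historical block
# sizes and see what limit each block would have faced.
def backtest(historical_sizes):
    return [size_limit(historical_sizes[:height])
            for height in range(WINDOW, len(historical_sizes))]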
MoonShadow · Legendary · Activity: 1708 · Merit: 1007
November 21, 2010, 12:42:00 AM · #3

I think that having a floating block size limit is likely to affect block generation by having a not-so-small percentage of generated blocks rejected by the network.

Perhaps a better way is to have a max size for free transactions, perhaps derived as a percentage of the fee-paying portion of the block, with 1 MB as the starting point.

Say that all fee-paying transactions, no matter how small, can be included in the current block, and then 20% of that total block size may be added in free transactions if the generator's policy provides for that.  If the end calculation is below 1 MB, then more free transactions can be included up to that point.  This allows the block size to grow as needed, without uncapping the restraint on spamming, while also allowing backlogs of free transactions to clear during off-peak periods.

This adds yet another calculation that the client must perform on a block to check its validity, but it changes the max block limit from the hard limit it is now into one that can 'stretch' to meet demand and still perform its primary function.  It also allows future client upgrades to have a higher 'open max' or 'min max' (what the hell would we call this rule?) than older clients without it being a destructive change.  Said another way, a future client development group decides that it's time to raise the 'min max' from 1 MB to 3 MB, so they allow their client to accept 3 MB blocks that don't adhere to the 20%-or-less free rule, but don't yet produce them.  Other development groups might agree or disagree, or might move at a different pace, but the change would only really take effect once more than half of the network had upgraded; then generated blocks built against a 3 MB 'min max' would be accepted into the main chain, forcing the rest of the network to comply or fork, but without harm to the interim processing of the network.
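
A minimal sketch of that validity check, assuming the 20% ratio and 1 MB floor described above (names are illustrative, not an actual client rule):

Code:
# Fee-paying transactions are never limited. Free transactions may fill 20%
# of the fee-paying bytes, and if that still leaves the block under 1 MB
# they may fill up to 1 MB instead.
FREE_RATIO = 0.20
MIN_MAX = 1_000_000   # the 'min max' floor, currently 1 MB

def block_respects_free_limit(fee_paying_bytes, free_bytes):
    free_allowance = max(fee_paying_bytes * FREE_RATIO,
                         MIN_MAX - fee_paying_bytes)
    return free_bytes <= free_allowance

Raising the 'min max' later only requires changing MIN_MAX, which is exactly the non-destructive upgrade path described above.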

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
theymos · Administrator · Legendary · Activity: 5180 · Merit: 12884
November 21, 2010, 01:18:52 AM · #4

The main reason for the block size limit is disk space. At 1MB, an attacker can force every generator to permanently store 53GB per year. At 10MB, an attacker can force every generator to permanently store 526GB per year. Even a small change in block size makes a big difference on the load that generators must bear. It must not be changed until the network is ready for it, and this time can not be predicted reliably.

If the block size limit is too high, an attacker can destroy the network by making it impossible for anyone to be a generator. If the block size is too low, fees get higher until the problem is fixed. Automatic adjustments would still carry the risk of adjusting to a limit that is too low.
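
For reference, those yearly figures follow directly from one block every ten minutes:

Code:
# 6 blocks/hour * 24 hours * 365 days = 52,560 blocks per year
BLOCKS_PER_YEAR = 6 * 24 * 365

print(1 * BLOCKS_PER_YEAR / 1000)    # 1 MB blocks:  ~52.6 GB of chain per year
print(10 * BLOCKS_PER_YEAR / 1000)   # 10 MB blocks: ~525.6 GB of chain per year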

caveden (OP) · Legendary · Activity: 1106 · Merit: 1004
November 21, 2010, 01:37:03 AM · #5

Quote from: theymos
Even a small change in block size makes a big difference on the load that generators must bear.

Generators must bear whatever the network demands. If we ever reach a "professional" level of hundreds of transactions per minute, generators will have to bear that.

Quote from: theymos
If the block size limit is too high, an attacker can destroy the network by making it impossible for anyone to be a generator. If the block size is too low, fees get higher until the problem is fixed.

That's why an automatic adjustment would be important: it wouldn't allow "too high" or "too low" limits to persist for long.

You do realize that if this limit is a constant, it will be really hard to change it when needed, right?

Quote from: theymos
Automatic adjustments would still carry the risk of adjusting to a limit that is too low.

Yes, as with the difficulty adjustment, there might be periods where it's not that precise. But it'd be much better than a constant value, wouldn't it?
RHorning · Full Member · Activity: 224 · Merit: 141
November 21, 2010, 01:51:27 AM · #6

Quote from: MoonShadow
I think that having a floating block size limit is likely to affect block generation by having a not-so-small percentage of generated blocks rejected by the network.

I'm curious: how would the floating block size limit reject a block that is following the rules?  I'm not convinced this is an issue here. The expected maximum size of the block would be known before the block is created, and a block size limit is something already in the network.  All that is being asked here is that this size become a variable rather than a constant, and a variable which can grow with the activity of the network as a whole, rather than throwing in a whole bunch of exceptions.

Rejected blocks would only arise if this rule change produced blocks larger than the current maximum block size, which would be rejected by earlier clients following the older rules without the variable block size.  I am suggesting that would be rare and likely wouldn't happen until well after the whole network (or at least 51%+ of the processing power) switches to this new rule, if it is adopted, whatever rule comes out of this idea.

Quote from: theymos
The main reason for the block size limit is disk space. At 1MB, an attacker can force every generator to permanently store 53GB per year. At 10MB, an attacker can force every generator to permanently store 526GB per year. Even a small change in block size makes a big difference on the load that generators must bear. It must not be changed until the network is ready for it, and this time can not be predicted reliably.

If the block size limit is too high, an attacker can destroy the network by making it impossible for anyone to be a generator. If the block size is too low, fees get higher until the problem is fixed. Automatic adjustments would still carry the risk of adjusting to a limit that is too low.

I realize this is meant to limit the potential damage from a malicious attacker who might try to force a whole bunch of miscellaneous data onto the network in an inefficient manner.  If anything, an algorithm using this concept might actually reduce the amount of data miners and coin generators have to handle under such an attack, at least while the number of transactions is still quite small on average.  The question is how to set up the formula so that such an attack is essentially futile and only results in forcing participants to add fees to their transactions.

Such an attack would have to be persistent and prolonged, something I think those contributing to Bitcoin would pick up on well before it even remotely started to cause much damage in terms of consuming permanent storage.  On the other hand, this approach would give us flexibility, so that special exceptions wouldn't have to be made a couple of years from now if or when a genuinely large number of "legitimate" transactions really does require that much data storage on average.  It is a problem we are going to face eventually, and keeping the constant just puts off a problem we know is coming.
caveden (OP) · Legendary · Activity: 1106 · Merit: 1004
November 21, 2010, 01:53:25 AM · #7

Quote from: MoonShadow
I think that having a floating block size limit is likely to affect block generation by having a not-so-small percentage of generated blocks rejected by the network.

If this rule is defined and documented while we still can, it becomes a protocol rule. Generators would know it, and would know how to adapt to it so as not to lose their blocks.
theymos · Administrator · Legendary · Activity: 5180 · Merit: 12884
November 21, 2010, 04:49:11 AM · #8

Quote from: caveden
Generators must bear whatever the network demands. If we ever reach a "professional" level of hundreds of transactions per minute, generators will have to bear that.

If no generators are capable of storing the 10TB block chain or whatever, then there will be no generators and Bitcoin will die. Limit adjustments won't make the chain smaller once it has already grown to a gigantic size.

Quote from: caveden
You do realize that if this limit is a constant, it will be really hard to change it when needed, right?

It will not be hard to change. It will cause a certain amount of disruption, but it's not difficult. A certain group will change, and the change will either catch on with everyone else or the people changing will realize their new coins are becoming worthless and change back.

MoonShadow · Legendary · Activity: 1708 · Merit: 1007
November 21, 2010, 05:58:04 AM · #9

Quote from: caveden
Generators must bear whatever the network demands. If we ever reach a "professional" level of hundreds of transactions per minute, generators will have to bear that.

Quote from: theymos
If no generators are capable of storing the 10TB block chain or whatever, then there will be no generators and Bitcoin will die. Limit adjustments won't make the chain smaller once it has already grown to a gigantic size.


It's not actually necessary for generators to keep the entire blockchain.  And even if the blockchain were to outpace growth in storage (something I question), there isn't really a need for most generators to keep a local copy of the blockchain at all.  There is no technical reason that prevents a specialized generation client from contracting with some online shared storage service that keeps the archive of the blockchain older than a year, in return for a donation of 1% of generated coins.  A new client would still be able to verify the chain and then keep just the block headers, fetching block data older than a year only when needed, which would be very rare.  A thousand different generating users sharing one well-protected read-only copy of the blockchain would render your concerns moot, whether they were all in one datacenter owned by one financial institution or individuals spread across the Internet.

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
caveden (OP) · Legendary · Activity: 1106 · Merit: 1004
November 21, 2010, 02:06:18 PM · #10

Quote from: theymos
If no generators are capable of storing the 10TB block chain or whatever, then there will be no generators and Bitcoin will die. Limit adjustments won't make the chain smaller once it has already grown to a gigantic size.

I think creighto said all that needed to be said about this. Generators can find a way to bypass this issue, if it ever becomes an issue. And, again, with a proper interface to add transaction fees, there would be no incentive for a flood attack. So, blocks would only grow if they really need to grow - if people really are transacting that much.

Quote from: caveden
You do realize that if this limit is a constant, it will be really hard to change it when needed, right?
Quote from: theymos
It will not be hard to change. It will cause a certain amount of disruption, but it's not difficult. A certain group will change, and the change will either catch on with everyone else or the people changing will realize their new coins are becoming worthless and change back.

I have to disagree here. You don't easily change a protocol constant. That can only be done while just a few pieces of software implement the protocol; once it is well diffused throughout the Internet, it's almost impossible to change.
What you are proposing is like a fork of the project, since the chains wouldn't be compatible. Having to fork the project just because the value of a constant became obsolete is way too radical. People wouldn't do it, and the bad constant would remain, causing issues like excessively high transaction fees.
ribuck · Donator · Hero Member · Activity: 826 · Merit: 1039
November 21, 2010, 03:27:09 PM · #11

Quote from: caveden
You don't easily change a protocol constant.

Agreed.

If we can find a workable algorithmic block size, it makes sense to adopt it earlier rather than later.

I have never understood the argument that when transaction numbers rise you can pay a transaction fee for priority, or use free transactions which will get processed "eventually". It makes no sense. If the average number of transactions per hour is more than six blocks' worth, the transaction queue will grow and grow without bound, transaction fees or not.
FreeMoney · Legendary · Activity: 1246 · Merit: 1014
November 21, 2010, 06:49:44 PM · #12

Quote from: ribuck
Quote from: caveden
You don't easily change a protocol constant.
Agreed.

If we can find a workable algorithmic block size, it makes sense to adopt it earlier rather than later.

I have never understood the argument that when transaction numbers rise you can pay a transaction fee for priority, or use free transactions which will get processed "eventually". It makes no sense. If the average number of transactions per hour is more than six blocks' worth, the transaction queue will grow and grow without bound, transaction fees or not.

Something like making the max block size increase to 110% of the average size of the last 2016 blocks seems good.

I thought at first that spam would grow without limit, but it won't if even a few percent of generators refuse to include it. Generators themselves don't have an incentive to bloat blocks, both because it will cost them in future disk space and because bigger blocks reduce the equilibrium fee.

Should max block size ever decrease? I don't think so, but way way down the road it might be a problem.
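
A small sketch of that rule as stated, with the "never decrease" question left as an explicit choice (the window and percentage come from this post; everything else is illustrative):

Code:
# Max block size becomes 110% of the average size of the last 2016 blocks.
WINDOW = 2016
GROWTH = 1.10

def next_max_block_size(previous_sizes, current_max):
    window = previous_sizes[-WINDOW:]
    proposed = int(sum(window) / len(window) * GROWTH)
    return max(proposed, current_max)   # drop the max() if the limit is ever allowed to shrink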

db · Sr. Member · Activity: 279 · Merit: 261
November 21, 2010, 07:45:47 PM · #13

How about having the transaction fees decide the block size? Perhaps by a rule like this: The total fees of the least expensive half of the transactions in a block must be bigger than half the total fees of the most expensive half. (All transactions get to add their fair share of the newly minted coins to their fee.)

Such a scheme has several benefits.

* Prohibits flooding
* Allows unlimited numbers of transactions if there is real demand
* Makes transaction fees depend on the demand for transactions
* No constants
* No guesses about the future market
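
A minimal sketch of how a client might check that rule, assuming each transaction's "fair share" of the newly minted coins is simply subsidy/n (function and variable names are illustrative, not a worked-out spec):

Code:
def block_is_valid(fees, block_subsidy):
    """fees: list of per-transaction fees in the candidate block."""
    n = len(fees)
    if n < 2:
        return True
    # Each transaction adds its fair share of the newly minted coins to its fee.
    effective = sorted(f + block_subsidy / n for f in fees)
    cheap_half = effective[:n // 2]
    expensive_half = effective[n - n // 2:]   # middle tx ignored when n is odd
    # The cheaper half must pay more than half as much, in total,
    # as the more expensive half.
    return sum(cheap_half) > 0.5 * sum(expensive_half)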
FreeMoney · Legendary · Activity: 1246 · Merit: 1014
November 21, 2010, 08:33:32 PM · #14

Quote from: db
How about having the transaction fees decide the block size? Perhaps by a rule like this: The total fees of the least expensive half of the transactions in a block must be bigger than half the total fees of the most expensive half. (All transactions get to add their fair share of the newly minted coins to their fee.)

Such a scheme has several benefits.

* Prohibits flooding
* Allows unlimited numbers of transactions if there is real demand
* Makes transaction fees depend on the demand for transactions
* No constants
* No guesses about the future market


But fees could get very large and very even, no?

I think we need to know what the purpose of limiting the block size is.

If it is only to stop spam, then it should be set to grow with consistent use near the max. Spam cannot push this up, because at least a few generators will not include spam, especially given that their future pricing power (weak as it may be anyway) would be further reduced by allowing the block size to increase. On the other hand, without collusion, generators will not refuse legitimate transactions with fees.

If it is to keep the size of the chain down, then we need to somehow weigh that against the benefit of cheap transactions. I don't know how this can be settled at all. Everyone bears the tiny cost of remembering the transaction for potentially a long time, but only one generator gets the payment for putting it in there.

db · Sr. Member · Activity: 279 · Merit: 261
November 21, 2010, 08:56:21 PM · #15

Quote from: FreeMoney
But fees could get very large and very even, no?
No, if all fees are large and even, there is plenty of room for cheap transactions at the bottom.

Quote from: FreeMoney
I think we need to know what the purpose of limiting the block size is.
The most important and most difficult purpose is to keep transaction fees both reasonable and high enough to give generators an incentive to provide the unrelated public good of hashing difficulty.
FreeMoney · Legendary · Activity: 1246 · Merit: 1014
November 21, 2010, 09:11:11 PM · #16

Quote from: db
Quote from: FreeMoney
But fees could get very large and very even, no?
No, if all fees are large and even, there is plenty of room for cheap transactions at the bottom.

Quote from: FreeMoney
I think we need to know what the purpose of limiting the block size is.
The most important and most difficult purpose is to keep transaction fees both reasonable and high enough to give generators an incentive to provide the unrelated public good of hashing difficulty.


Maybe I am confused; imagine a max block size of about 10 transactions and this schedule of people's willingness to pay:

.42BTC, .41BTC, .41BTC, .41BTC, .41BTC, .4BTC, .4BTC, .4BTC, .39BTC, .39BTC, .37BTC, .36BTC, .36BTC, .34BTC, .33BTC, .33BTC, .33BTC, .33BTC, .32BTC, .31BTC, .3BTC, .3BTC...

and so on for maybe hundreds or thousands more. Only the top 10 very similar payments will actually be in blocks and available for determining whether to increase the max block size, so it will not be increased. Maybe my list is not a realistic structure, but I can't see why the top and the bottom would always have to carry very different fees, especially once block size becomes a limiting factor.

But it does occur to me that in the "very slowly decreasing desire to pay fees" scenario, generators may actually want to increase the block size. It would lower average fees, but probably not total fees. I guess this has to do with the elasticity of demand for sending a transfer.

db · Sr. Member · Activity: 279 · Merit: 261
November 21, 2010, 09:26:03 PM · #17

Quote from: FreeMoney
Maybe I am confused; imagine a max block size of about 10 transactions and this schedule of people's willingness to pay.

Ah, sorry if this was not clear: there is no maximum block size being adjusted. The size of a block is determined by the most profitable rule-abiding set of transactions that can go into it, completely independent of previous block sizes.
FreeMoney · Legendary · Activity: 1246 · Merit: 1014
November 21, 2010, 10:05:28 PM · #18

Quote from: db
Quote from: FreeMoney
Maybe I am confused; imagine a max block size of about 10 transactions and this schedule of people's willingness to pay.
Ah, sorry if this was not clear: there is no maximum block size being adjusted. The size of a block is determined by the most profitable rule-abiding set of transactions that can go into it, completely independent of previous block sizes.


Wait, so what stops an attacker from generating a block with one million or more spam transactions?

db · Sr. Member · Activity: 279 · Merit: 261
November 21, 2010, 10:35:22 PM · #19

Quote from: FreeMoney
Wait, so what stops an attacker from generating a block with one million or more spam transactions?

Nothing, but the rule stops attackers from drowning legitimate transactions in junk inside normal blocks. Are gargantuan phony entire blocks really a problem? They would be expensive to produce and wouldn't be long-lived, as they are extremely hard not to spot, and no generator in their right mind would continue building the chain from one of them; they would lose the income from any subsequent blocks when everyone else ditches the offending block. So that should take care of itself through generator self-interest.
asdf · Hero Member · Activity: 527 · Merit: 500
November 21, 2010, 10:49:58 PM · #20

It seems to me that the spam issue and the tx-fee issue are related. Some want to limit the block size to stop spam, and some want to limit it to create an artificial scarcity that drives up tx fees.

The problem, as I see it, is that there is NO incentive to NOT accept a fee-paying transaction unless it's ridiculously small. Once a generator has established his infrastructure, it costs a negligible amount to process a transaction. If you can impose some sort of protocol rule on blocks that makes smaller-fee transactions less desirable, this would solve both problems.

Automatically adjusting the block size is a solution, if you can find an algorithm that scales appropriately with economic activity. If it's set too high, there will be too much spam and transactions will be too cheap, and generators will leave. If it's set too low, transactions will become very expensive and people will stop using Bitcoin.

Also, there is the idea of restricting the distribution of transaction fees in each block, like mandating that the frequency distribution of fees fit a linear scale. I don't know if this is workable; I'm just throwing ideas around.

So, I think we need an incentive for generators to NOT accept fee-paying transactions as they get smaller. In particular, build in some sort of fixed cost for processing a transaction that adjusts with the market.