Bitcoin Forum
Author Topic: Block size limit automatic adjustment  (Read 14548 times)
ribuck
May 11, 2011, 01:28:11 PM
 #61

Quote
How about: The maximum block size equals the higher of: (a) the current hard-coded maximum block size, and (b) 'difficulty' bytes.

That's awesome...

EDIT: Rule (b) might have to be some agreed-upon multiple of difficulty, however. If the block size does not naturally increase until difficulty is over one million, I'm afraid that we really would have some scalability issues.

What? Difficulty will be above one million real soon now. Two to three months, probably.

Quote
And Rule (a) should be reduced by half at least.

Why risk compatibility with existing software, just for the sake of a minor tweak that will only be relevant for the next two or three months?
caveden (OP)
May 11, 2011, 01:49:06 PM
 #62

Quote
I think non-miners don't need to check the block size even if they are full nodes

I'm not really convinced of that...

There are some arbitrary rules regarding what a valid block is which are of interest to the entire bitcoin community, not only miners. And I'm not talking about obvious rules like no double-spending or signature validation. I mean rules like the difficulty factor or block rewards, for example. These two concern inflation control, which is of interest to every bitcoin user.

Of course, miners who disagree with the current rules could always try to change them. But if users reject their blocks, the result of their mining may be worth much less, as it would be a fork used by few.
So, when users validate blocks, they create a strong incentive for miners to obey the consensus of the entire user base. If instead users accept any block that miners decide to build upon, then it's up to miner consensus alone to decide these kinds of rules. Even if miners change the rules to something which is not really in the interest of the entire user base, users will passively accept it.

I think that the maximum block size is a rule of this kind. It's not only about spam; it's about creating an artificial scarcity too.
It's true that miners may come to a good agreement on their own, since this artificial scarcity is good for them, but still, it sounds dangerous to me for the entire user base to give miners carte blanche to decide that entirely on their own... don't you think?
caveden (OP)
May 11, 2011, 01:51:13 PM
 #63

Quote
Automatic adjustments based on difficulty assume difficulty will scale with traffic. I'm not convinced that relationship will hold.

Neither am I. Using the size of the last X blocks seems more reasonable.
Gavin Andresen
May 11, 2011, 01:53:29 PM
 #64

I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)
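
In rough C++, the two candidate rules might look like this (a minimal sketch, not actual client code; the constant and function names are invented):

Code:
#include <cstdint>
#include <numeric>
#include <vector>

static const int64_t BASE_MAX_BLOCK_SIZE = 1000000; // current 1 MB limit, in bytes

// Rule 1: grow the limit with difficulty. Adding the 1 MB base keeps
// every historical block valid under the new rule.
int64_t MaxSizeFromDifficulty(double difficulty)
{
    return BASE_MAX_BLOCK_SIZE + (int64_t)difficulty;
}

// Rule 2: grow the limit with recent traffic. recentSizes holds the sizes
// (in bytes) of the last N blocks in the best chain, e.g. N = 144.
int64_t MaxSizeFromAverage(const std::vector<int64_t>& recentSizes)
{
    if (recentSizes.empty())
        return BASE_MAX_BLOCK_SIZE;
    int64_t total = std::accumulate(recentSizes.begin(), recentSizes.end(), int64_t(0));
    return BASE_MAX_BLOCK_SIZE + total / (int64_t)recentSizes.size();
}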

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?

ribuck
May 11, 2011, 02:40:52 PM
 #65

Quote
Automatic adjustments based on difficulty assume difficulty will scale with traffic.

Here's how it scales automatically:

If blocks are getting full, people pay higher fees to get their transactions in the block. Increased mining profitability causes increased mining which causes increased difficulty.
MoonShadow
May 11, 2011, 02:55:06 PM
 #66

Quote
Automatic adjustments based on difficulty assume difficulty will scale with traffic.

Here's how it scales automatically:

If blocks are getting full, people pay higher fees to get their transactions in the block. Increased mining profitability causes increased mining which causes increased difficulty.

I agree with this perspective.  This simple rule maintains scarcity, prevents scalability issues, and is likely to find its own equilibrium via transaction-fee price discovery.

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
znGoat
May 11, 2011, 04:10:24 PM
 #67

In the long run, miners are all going to have their own rules on fee schedules; the best we can do is set the default rules with the expectation that one day they will be ignored.

It will be in the big miners' interest to make the most profit (the sum of all fees). That might mean a smaller number of transactions, each paying a large fee, or many, many small-fee transactions.


I propose that the fee schedule is:

A: (optional) The first 100KB is open to any transactions. This is not adjusted no matter the block size/fees; the miner can optionally include no free transactions at all.

B: (recommended) The next 100KB goes to the highest-fee transactions. A miner must include up to 100KB of the highest-fee transactions. (I don't know if you could enforce this.)

C: (enforced max) Based upon the average of part B over the last 100 blocks, a miner can accept transactions up to:

Max size of Section C = (total fees in B section) / (average fees of last 100 B sections) * 100KB

Total Max: must not be over 100x the average size of the last 6 blocks. (This can grow very large very quickly, if those making the transactions are willing to pay for it.) A rough sketch of this schedule in code follows below.
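
In rough C++, my reading of the schedule looks like this (a sketch only; all names are invented, sizes are in bytes, and fees are in whatever unit the client uses):

Code:
#include <cstdint>
#include <vector>

static const int64_t SECTION_A_MAX = 100000; // 100 KB: free/any transactions (optional)
static const int64_t SECTION_B_MAX = 100000; // 100 KB: highest-fee transactions

// Section C cap: (total fees in this block's B section) divided by the
// average B-section fees over the last 100 blocks, times 100 KB.
int64_t SectionCMaxSize(int64_t feesB, int64_t avgFeesB100)
{
    if (avgFeesB100 <= 0)
        return 0; // no fee history yet, so no C section
    return feesB * SECTION_B_MAX / avgFeesB100;
}

// Overall cap: never more than 100x the average size of the last 6 blocks.
int64_t TotalMaxSize(const std::vector<int64_t>& last6Sizes)
{
    if (last6Sizes.empty())
        return 0;
    int64_t total = 0;
    for (int64_t s : last6Sizes)
        total += s;
    return 100 * (total / (int64_t)last6Sizes.size());
}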


Why I propose the above schedule:

1.  It has a no-cost but limited-size area for any transactions of the miner's choice, e.g. the miner can choose to include transactions from his buddies with no transaction fee. (Section A)

2.  Top-priority transactions have a dedicated place in every block to compete for. (Section B)

3.  If there is strong demand for fee-paying transactions, then the blocks will scale quite large very quickly (aka Christmas shopping).

4.  The total fees must always be significantly more than the average for very large blocks.


I have put quite a bit of thought into this fee schedule; I would love the forum's comments on it.

Overall, whatever we decide will not matter, as one day the big miners will decide for themselves... This is just my best guess about what will fit the natural economics of bitcoin.
jimbobway
May 11, 2011, 04:19:55 PM
 #68

Quote
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?


I think averaging the "last N blocks in the best chain" is good, but there may be a better way.  How about we try to predict the size of the next block?  We take the last N blocks and determine whether the trend is linear, exponential, or polynomial.  Then we solve the fitted equation to determine the N+1 point.  Basically, this method attempts to predict the size of the next block.

We can start off simple and just use y = mx + b.
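
For the linear case, a least-squares fit over the sizes of the last N blocks would look something like this (a sketch; I'm assuming sizes in bytes with x = 1..N, and this covers only the straight-line model, not the exponential or polynomial cases):

Code:
#include <cstddef>
#include <vector>

// Fit y = m*x + b to the last N block sizes and evaluate at x = N + 1.
double PredictNextBlockSize(const std::vector<double>& sizes)
{
    size_t n = sizes.size();
    if (n < 2)
        return n == 1 ? sizes[0] : 0.0; // not enough points for a line

    double sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
    for (size_t i = 0; i < n; ++i) {
        double x = (double)(i + 1);
        sumX += x;
        sumY += sizes[i];
        sumXY += x * sizes[i];
        sumXX += x * x;
    }
    double m = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
    double b = (sumY - m * sumX) / (double)n;
    return m * (double)(n + 1) + b; // the predicted N+1 point
}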
caveden (OP)
May 11, 2011, 04:40:39 PM
 #69

Quote
How about we try to predict the size of the next block?  We take the last N blocks and determine whether the trend is linear, exponential, or polynomial.  Then we solve the fitted equation to determine the N+1 point.  Basically, this method attempts to predict the size of the next block.

That starts to get more complex than it needs to be, IMHO. As long as the readjustment period is short (24h, for example, as Gavin suggested), any formula which slightly increases the last average size should be fine. Maybe just making the increase relative instead of absolute would help with commercial holidays.
MoonShadow
May 11, 2011, 06:18:19 PM
 #70

I was thinking about all this on my commute to work, and I have a proposal.

max block size = 1,000,000 + (difficulty * K) bytes

wherein K = some factor high enough that the max block size is never really an issue, say K = 2. But some analysis is due on that.

But here is another change to the default fee schedule, granted that individual miners are likely to have tighter requirements themselves than this...

First, the max block size calculated above becomes the basis metric for a set of soft max block sizes.  I suggest 16 tiers of equal size.

0 to one-sixteenth of max block size: no special requirements; miners can include whatever valid transactions they desire up to this point.

One to two sixteenths: at least one transaction paying a fee equal to or greater than the minimum fee required for unusual transactions must be present.  That transaction can be one wherein the fee was required or not.  As long as at least one is present, miners can include whatever else they desire up to this limit.

Two to three sixteenths: at least one transaction paying a fee double the fee for the above class must be present.

Three to four sixteenths: at least one transaction paying at least double the fee of the rule above this one must be present.

And so on, so the fee paid by the highest-fee-paying transaction sets the bar for the block, and then the miner can include whatever other transactions it sees fit.  This not only encourages the use of -sendtomany whenever possible, which is more efficient for the network anyway; most of the fee-paying transactions (and free transactions) are then competing for the fill-in space left by the one transaction that is paying for the bandwidth.  It also sets up a method of ongoing price discovery: any client can look at the last block and its own transaction queue and predict how much it will have to pay in order to get into the next block (probably equal to or higher than the highest single fee in the last block if the queue is steady, slightly more if it is growing, slightly less if it is shrinking).  And it establishes a bidding mechanism for the 'median' transaction to be included in a block in the near future: all the other transactions besides the high one are bidding for the remaining space, each looking at the queue of transactions, guessing which will be the high transaction (and therefore the size of the space available), and looking to outbid the second-highest if it wishes to be included in the next block.

In this way, the well-heeled senders set the bar.  Imagine if Wal-Mart, which has half a million employees to pay each week, were to compile that entire paylist into a single -sendtomany transaction.  They would be able to definitively determine the minimum fee they would have to offer just to be considered, based solely on the actual size of the transaction, and then guess how much more they should offer based upon how many large senders there were in the previous several blocks.  Say this transaction had a million outputs (probably 10 million inputs) and was 3.2 MB once done.  The difficulty was 2 million at the last adjustment, so Wal-Mart knows that the max block size is 5 MB.  In order to fit their 3.2 MB single transaction into the block, they have to offer a fee at least 16 times the minimum fee (5/8 = 0.625; 3.2/0.625 = 5.12, so the 6th tier; the first tier is free, the second is equal to the minimum fee, so the 6th tier is 4 doublings of the minimum fee).  If the minimum is 0.01, then Wal-Mart pays at least 0.16 just to qualify.

EDIT: Somewhere I switched my numbers in my head from 16 tiers to only eight, so my numbers above are wrong, but hopefully I conveyed the idea.
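
For what it's worth, here is the 16-tier version in rough C++ (my reading of the scheme above; all names are invented, and K = 2 and the 0.01 minimum fee are just the example values from this post):

Code:
#include <cmath>
#include <cstdint>

static const int NUM_TIERS = 16;

// max block size = 1,000,000 + (difficulty * K) bytes
int64_t MaxBlockSize(double difficulty, double k = 2.0)
{
    return 1000000 + (int64_t)(difficulty * k);
}

// Minimum qualifying fee for a block of blockSize bytes: the first
// sixteenth is free, the second requires the minimum fee, and each
// tier after that doubles the previous requirement.
double RequiredHighFee(int64_t blockSize, double difficulty, double minFee)
{
    int64_t tierSize = MaxBlockSize(difficulty) / NUM_TIERS;
    int tier = (int)((blockSize + tierSize - 1) / tierSize); // 1-based tier
    if (tier <= 1)
        return 0.0;
    return minFee * std::pow(2.0, tier - 2); // one doubling per tier
}

With the numbers from the example (difficulty 2,000,000, K = 2, minimum fee 0.01), a 3,200,000-byte transaction lands in the 11th of 16 tiers, so the qualifying fee works out to 0.01 * 2^9 = 5.12.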

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
gim
May 11, 2011, 06:20:36 PM
 #71

Quote
max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24 hours of transactions)

With this formula, asymptotically, block size cannot increase by more than 2 MB per 24 hours.
That is roughly 300,000 transactions a day.
(What about Visa spikes? Probably similar.)

This is a hard limit, so if bitcoins are still in use in a hundred years, maybe it would be better to scale exponentially. For example:
Quote
max block size = 1000000 + 1.01 * (average size of last N blocks in the best chain)
and block size would scale up to (about) 2% per 24 hours.

Yes, that is one more random constant :p
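
Treating each 24-hour average as one step, as above, the recurrence is easy to simulate (a throwaway sketch assuming every block is completely full; the per-step growth is 1% plus 1 MB divided by the current size, so it falls toward 1% as blocks get large):

Code:
#include <cstdio>

int main()
{
    double s = 1000000.0; // start at the current 1 MB limit
    for (int day = 1; day <= 10; ++day) {
        s = 1000000.0 + 1.01 * s; // the proposed rule, applied once per day
        std::printf("day %2d: max block size ~ %.0f bytes\n", day, s);
    }
    return 0;
}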
MoonShadow
May 11, 2011, 06:20:37 PM
 #72

Quote
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?


I think averaging the "last N blocks in the best chain" is good but there may be a better way.  How about we try to predict the size of the next block?  We take the last N blocks and determine if it is linear, exponential, or polynomial.  Then we solve the linear or polynomial equation to determine the N+1 point.  Basically, this method is attempting to predict the size of the next block.

We can start off simple and just use y = mx + b.

How does this do anything but grow?

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
Mike Hearn
May 11, 2011, 07:17:41 PM
 #73

Visa handles around 8,000 transactions per second during holiday shopping and has burst capacity up to 10,000 tps.

Of course, MasterCard also handles quite a bit. I don't have figures for them, but I'd guess it's in the same ballpark.

I don't believe artificial scarcity is a good plan, nor necessary in the long run, so requiring end-user software to enforce these sorts of rules makes me nervous. I don't plan on adding max-size checks to BitCoinJ, at least; they aren't even enforceable, as future SPV clients probably won't request full blocks.
FreeMoney
May 11, 2011, 07:59:05 PM
 #74

Quote
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?

I like it. Don't worry about Christmas, I'm pretty sure that's a bubble.

jimbobway
May 11, 2011, 08:35:04 PM
 #75

Quote
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?


I think averaging the "last N blocks in the best chain" is good but there may be a better way.  How about we try to predict the size of the next block?  We take the last N blocks and determine if it is linear, exponential, or polynomial.  Then we solve the linear or polynomial equation to determine the N+1 point.  Basically, this method is attempting to predict the size of the next block.

We can start off simple and just use y = mx + b.

How does this do anything but grow?

Not sure if I am answering your question, but y = mx + b is the high-school algebra equation for a line on a graph.  Using this equation, or some other polynomial equation, to predict the size of the next block shouldn't be too hard.  Just plug in values for m, x, and b and solve for y.

http://www.math.com/school/subject2/lessons/S2U4L2DP.html

I think Gavin is right in that we need some data, maybe plotted on a graph, to determine which method/equation best fits the trend.

Just my two millicoins.
MoonShadow
May 11, 2011, 08:50:39 PM
 #76


Quote
How does this do anything but grow?

Not sure if I am answering your question but y=mx + b is a high school algebra equation for a line on a graph.  Using this equation or some other polynomial equation to predict the size of the next block shouldn't be too hard.  Just plug in values for m, x, and b and solve for y.


That wasn't really what I was asking.  I'm not a math geek, I'm an econo-geek (and a radio geek, but that's not relevant).  I think that a simple equation that predicts the trend in order to set the block size has the incentives wrong, and almost certainly trends toward infinity, because both those paying for transactions to be processed and the miners have an incentive for every transaction to be included in every block.  Then we truly do have a 'tragedy of the commons' situation: the block size shoots to the moon, senders no longer have an incentive to pay anything over a token fee, and miners start dropping out because the fees can't cover the cost of bandwidth and electricity, resulting in a difficulty level that is too low to defend the network as the block reward is reduced.  There needs to be some mechanism that resists arbitrary growth of the block size, even if only a little.  Tying the max block size to the difficulty in some linear fashion is a smooth way to do this.  I'm not married to the details, but the implementation just seems smooth to me.  I have no concept of how difficult that would be to implement in the code, because I'm not a coder, but I imagine it would still be easier than a rolling average or a predictive algorithm, because it's just linear math.

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
MoonShadow
May 11, 2011, 08:52:54 PM
 #77

Quote
Visa handles around 8,000 transactions per second during holiday shopping and has burst capacity up to 10,000 tps.


What's the average size of a simple transaction?

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
Mike Hearn
May 11, 2011, 09:37:34 PM
 #78

Why would it matter? BitCoin is the only financial system I know of that cares about wire size. Any serious processing company just builds a few datacenters and is done. They don't even have to be very big.

And the only reason BitCoin cares about wire size is that we're afraid of scaling up the system, pretty much.
MoonShadow
May 11, 2011, 09:42:51 PM
 #79

Quote
Why would it matter? BitCoin is the only financial system I know of that cares about wire size. Any serious processing company just builds a few datacenters and is done. They don't even have to be very big.

And the only reason BitCoin cares about wire size is that we're afraid of scaling up the system, pretty much.

Wire size?

Scaling isn't really an issue if the system is suited to compensate the network for the resources; that's what I'm concerned about.  If we choose an algorithm that just permits limitless growth, then we might as well remove the block size limit altogether and cross our fingers, because the result is the same.  We don't have the option of "just build a few datacenters," because this is the method by which we must pay for those datacenters, and ours need to be bigger and faster than any others.

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
xf2_org
May 11, 2011, 09:43:26 PM
 #80

Quote
And the only reason BitCoin cares about wire size is that we're afraid of scaling up the system, pretty much.

Well, it's still pretty cheap to dump tons of useless data into the block chain.

Satoshi didn't seem to think the block size limit should be changed... until it needed to be.  Right now, we are nowhere near the limit, so his rationale still seems sound.
