Author Topic: The MAX_BLOCK_SIZE fork  (Read 35591 times)
Jeweller (OP) | Newbie | January 31, 2013, 07:23:52 AM | #1

I’d like to discuss the scalability of the bitcoin network, specifically the current maximum block size of 1 megabyte.  The bitcoin wiki states:
Quote
Today the Bitcoin network is restricted to a sustained rate of 7 tps by some artificial limits. … Once those limits are lifted, the maximum transaction rate will go up significantly.
… and then goes on to theorize about transaction rates many orders of magnitude higher.  Certainly from a software engineering point of view, medium-term scalability is a trivial problem. An extra zero in the
Code:
static const unsigned int MAX_BLOCK_SIZE = 1000000;
line would be fine for a good while.  But I think dismissing the block size issue, as the wiki and many others have done, is a serious mistake.
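To make it concrete, here is a rough sketch of the kind of consensus check that constant feeds into when a node validates a block. This is illustrative only, not the actual client code; the names CheckBlockSize and serializedSize are made up for the example.
Code:
// Illustrative sketch only -- not the actual Bitcoin validation code.
// It shows the kind of consensus check the constant feeds into: any block
// whose serialized size exceeds MAX_BLOCK_SIZE is rejected outright.
#include <cstddef>
#include <iostream>

static const unsigned int MAX_BLOCK_SIZE = 1000000; // 1 MB, as in the current protocol

// Hypothetical helper: 'serializedSize' stands in for the size of the block
// as it would appear on the wire or on disk.
bool CheckBlockSize(std::size_t serializedSize)
{
    // A block larger than the hard limit is invalid no matter who mined it.
    return serializedSize <= MAX_BLOCK_SIZE;
}

int main()
{
    std::cout << std::boolalpha
              << CheckBlockSize(300000)  << "\n"   // ~300 KB block: valid
              << CheckBlockSize(1500000) << "\n";  // 1.5 MB block: rejected
}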

Some background on the arguments can be found in this thread and others.

Changing this limit needs to be discussed now, before we start hitting it.  Already a quick glance at the blockchain shows plenty of blocks exceeding 300KB.  Granted, most of that is probably S.Dice, but nobody can really dispute that bitcoin is rapidly growing and will hit the 1MB ceiling fairly soon.

So... what happens then?  What is the method for implementing a hard fork?  No precedent, right?  Do we have a meeting?  With who?  Vote?  Ultimately it’s the miners that get to decide, right?  What if the miners like the 1MB limit, because they think the imposed scarcity of blockchain space will lead to higher transaction fees, and more bitcoin for them?  How do we decide on these things when nobody is really in charge?  Is a fork really going to happen at all?

Personally I would disagree with any pro-1MB miners, and think that it’s in everyone’s interest, miners included, to expand the limit.  I think any potential reductions in fees would be exceeded by the increased value of the block reward as the utility of the network expands.  But this is a source of significant uncertainty for me -- I just don’t know how it’s going to play out.  I wouldn’t be surprised if we are in fact stuck with the 1MB limit simply because we have no real way to build a consensus and switch.  Certainly not the end of bitcoin, but personally it would be disappointing.  A good analogue would be the 4-byte addresses of IPv4... all over again.  You can get around it (NAT), and you can fix it (IPv6) but the former is annoying and the latter is taking forever.

So what do you think?  Will we address this issue?  Before or after every block ≈ 1,000,000 bytes?
flower1024 | Legendary | January 31, 2013, 07:29:13 AM | #2

I guess most miners won't like this change.
They are speculating that if it's harder to place a transaction in a block (e.g. because of the size limit), people will pay higher transaction fees.
notme | Legendary | January 31, 2013, 07:30:25 AM | #3

The first thing you need to understand is that it's not just a matter of the majority of miners for a hard fork.... it's got to be pretty much everybody.  Otherwise, you will have a blockchain split with two different user groups both wanting to call their blockchain "bitcoin".  Unspent outputs at the time of the fork can be spent once on each new chain.  Mass confusion.

flower1024 | Legendary | January 31, 2013, 07:32:08 AM | #4

Quote from: notme on January 31, 2013, 07:30:25 AM
The first thing you need to understand is that it's not just a matter of the majority of miners for a hard fork.... it's got to be pretty much everybody.  Otherwise, you will have a blockchain split with two different user groups both wanting to call their blockchain "bitcoin".  Unspent outputs at the time of the fork can be spent once on each new chain.  Mass confusion.

+1

But I am sure that there will be a need for a hard fork in the future (more digits or bigger blocks). The earlier the better... but it's always hard to predict the future ;)
Jeweller (OP) | Newbie | January 31, 2013, 07:47:12 AM | #5

Quote from: notme on January 31, 2013, 07:30:25 AM
The first thing you need to understand is that it's not just a matter of the majority of miners for a hard fork.... it's got to be pretty much everybody.

Quite true.  In fact even more so, because "old" protocol nodes will only accept small blocks, while the "new" protocol nodes will accept either small (<1MB) or large (>1MB) blocks.  Thus all blocks produced by old miners will be accepted by the new ones as valid, even when there's an extra 500KB of transactions waiting in line to be published.

You'd need something like a >90% simultaneous switch to avoid total chaos.  In that case substantially all the blocks published would be >1MB, and the old-protocol miners wouldn't be able to keep up.  If normal nodes switched at the same time, they would start pushing transactions that old-protocol clients and miners would lose track of.  It seems very likely that when/if the change takes place, blocks will have been at the 1MB limit for some time, and the end of the limit would immediately result in 1.5MB blocks, so it would have to be coordinated well in advance.
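To make the asymmetry concrete, here is a toy sketch (illustrative names and a made-up 2MB figure, not real client code): every block that satisfies the old rule also satisfies the new one, but not the other way around, which is why the split only bites once someone actually mines an oversized block.
Code:
// Sketch of the asymmetry described above: "old" nodes enforce the 1 MB rule,
// "new" nodes enforce a hypothetical 2 MB rule, so every old-rule block is also
// valid under the new rules, but not vice versa.
#include <cstddef>
#include <iostream>

static const std::size_t OLD_MAX_BLOCK_SIZE = 1000000;  // current hard limit
static const std::size_t NEW_MAX_BLOCK_SIZE = 2000000;  // hypothetical raised limit

bool OldNodeAccepts(std::size_t blockSize) { return blockSize <= OLD_MAX_BLOCK_SIZE; }
bool NewNodeAccepts(std::size_t blockSize) { return blockSize <= NEW_MAX_BLOCK_SIZE; }

int main()
{
    const std::size_t sizes[] = { 900000, 1500000 };
    std::cout << std::boolalpha;
    for (std::size_t s : sizes)
        std::cout << s << " bytes -> old node: " << OldNodeAccepts(s)
                  << ", new node: " << NewNodeAccepts(s) << "\n";
    // The 1.5 MB block is what splits the network: new nodes extend it,
    // old nodes treat any chain containing it as invalid forever.
}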

da2ce7 | Legendary | January 31, 2013, 07:55:37 AM | #6

This has been discussed again and again.  This is a hard limit in the protocol; changing it is as hard as changing the total number of coins... i.e. virtually impossible.

Many people have invested in Bitcoin on the understanding that the hard limits of the protocol do not change.

Even if a super-majority wanted the change, a significant number of people (myself included) would reject the new chain, thus creating a fork.

notme | Legendary | January 31, 2013, 07:55:49 AM | #7

Quote from: Jeweller on January 31, 2013, 07:47:12 AM
Quote from: notme on January 31, 2013, 07:30:25 AM
The first thing you need to understand is that it's not just a matter of the majority of miners for a hard fork.... it's got to be pretty much everybody.

Quite true.  In fact even more so, because "old" protocol nodes will only accept small blocks, while the "new" protocol nodes will accept either small (<1MB) or large (>1MB) blocks.  Thus all blocks produced by old miners will be accepted by the new ones as valid, even when there's an extra 500KB of transactions waiting in line to be published.

You'd need something like a >90% simultaneous switch to avoid total chaos.  In that case substantially all the blocks published would be >1MB, and the old-protocol miners wouldn't be able to keep up.  If normal nodes switched at the same time, they would start pushing transactions that old-protocol clients and miners would lose track of.  It seems very likely that when/if the change takes place, blocks will have been at the 1MB limit for some time, and the end of the limit would immediately result in 1.5MB blocks, so it would have to be coordinated well in advance.

It's not that bad.  If the larger-block miners are >50%, they will build off the longest valid chain, so they will ignore the smaller-block miners' blocks since they have lower total difficulty.  If the smaller-block miners are >50%, they will always have the longest chain and no large blocks will ever survive reorganization for more than a couple of confirmations.  Block headers contain the hash of the previous block, so once the chain forks, the blocks built after the first split block are not compatible with the other chain.
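To illustrate that last point, here is a toy sketch (not the real data structures, and with a stand-in hash function) of why a block built on one side of the split can never be attached to the other side.
Code:
// Toy illustration: every header commits to the hash of its parent, so a block
// built on fork A can never be attached to fork B after the chains diverge.
#include <cstdint>
#include <functional>
#include <iostream>
#include <string>

struct BlockHeader {
    std::string   hashPrevBlock;   // commitment to the parent block
    std::uint32_t nVersion;
    // (real headers also carry a merkle root, timestamp, bits and nonce)
};

// Stand-in for the real double-SHA256 block hash.
std::string ToyHash(const BlockHeader& h)
{
    return std::to_string(std::hash<std::string>{}(h.hashPrevBlock + std::to_string(h.nVersion)));
}

bool ConnectsTo(const BlockHeader& child, const BlockHeader& parent)
{
    return child.hashPrevBlock == ToyHash(parent);
}

int main()
{
    BlockHeader split  { "(ancestor hash)", 2 };   // last block both sides agree on
    BlockHeader forkA  { ToyHash(split), 2 };      // small-block chain
    BlockHeader forkB  { ToyHash(split), 3 };      // large-block chain
    BlockHeader afterA { ToyHash(forkA), 2 };      // builds on fork A only

    std::cout << std::boolalpha
              << ConnectsTo(afterA, forkA) << "\n"    // true
              << ConnectsTo(afterA, forkB) << "\n";   // false: useless to the other chain
}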

gmaxwell | Moderator, Legendary | January 31, 2013, 07:57:13 AM (last edit: 08:17:12 AM) | #8

Quote from: notme on January 31, 2013, 07:55:49 AM
It's not that bad.  If the larger-block miners are >50%, they will build off the longest valid chain, so they will ignore the smaller-block miners' blocks since they have lower total difficulty.  If the smaller-block miners are >50%, they will always have the longest chain and no large blocks will ever survive reorganization for more than a couple of confirmations.  Block headers contain the hash of the previous block, so once the chain forks, the blocks built after the first split block are not compatible with the other chain.

No— "longest valid chain": all of the nodes which have not adopted your Bitcoin-prime will reject the >50% hashpower's "invalid chain"; to the 'true' Bitcoin network those miners will simply stop existing. From one currency you will have two. It is a maximally pessimal outcome at a near-50% split, and it would be against the interest of any Bitcoin user to accept a non-trivial risk of that outcome, no matter what the benefit.
gmaxwell | Moderator, Legendary | January 31, 2013, 08:01:14 AM (last edit: 08:16:54 AM) | #9

Opinions differ on the subject. The text on the Wiki largely reflects Mike Hearn's views.

Here are my views:

Without a sharp constraint on the maximum blocksize there is currently _no_ rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.

Bitcoin is valuable because of scarcity. One of the important scarcities is the limited supply of coins, another is the limited supply of block-space: Limited blockspace creates a market for transaction fees, the fees fund the mining needed to make the chain robust against hostile reorganization.  I have not yet seen any suggestion as to how Bitcoin is long term viable without this except ones that argue for cartel or regulatory behavior (both of which I don't consider viable: they moot the decentralized purpose of Bitcoin).

Even going beyond fee funding— as Dan Kaminsky argued so succinctly— with gigabyte blocks Bitcoin would not be functionally decentralized in any meaningful way: only a small self-selecting group of some thousands of major banks would have the means and the motive to participate in validation (much less mining), just as some thousands of major banks are the primary drivers of the USD and other major world currencies. An argument that Bitcoin can simply scale directly like that is an argument that the whole decentralization thing is a pretext: and some have argued that it's evidence that bitcoin is just destined to become another centralized currency (with some "bonus" wealth redistribution in the process, which they suggest is the real motive— that the decentralization is a cynical lie).

Obviously decentralization can be preserved at increased scale with technical improvements, and those should be done— but if decentralization doesn't come first I think we would lose what makes Bitcoin valuable and special...  and I think that would be sad. (Though, to be frank— Bitcoin becoming a worldwide centrally controlled currency could quite possibly be the most profitable outcome for me— but I would prefer to profit by seeing the world be a diverse place with many good and personally liberating choices available to people.)

Perhaps the proper maximum size isn't 1MB but some other value which is also modest and still preserves decentralization— I don't have much of an opinion beyond the fact that there is some number of years in the future where— say— 10MB will be no worse than 1MB today. It's often repeated that Satoshi intended to remove "the limit" but I always understood that to be the 500k maximum generation soft limit... quite possibly I misunderstood, but I don't understand why it would be a hardforking protocol rule otherwise. (And why the redundant soft limit— and why not make it a rule for which blocks get extended when mining instead of a protocol rule? ... And if that protocol rule didn't exist, I would never have become convinced that Bitcoin could survive... so where are the answers to long term survival?)
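To spell out the distinction between the two limits, here is a rough sketch. The constant names mirror the hard consensus limit and the ~500k generation soft limit discussed above, but treat this as an illustration rather than a quotation of the client source.
Code:
// Sketch of the distinction: the hard 1 MB rule is applied when *validating*
// anyone's block, while the ~500 KB "generation" limit only shapes the blocks
// a miner *creates* for itself. Relaxing the latter needs no fork; the former does.
#include <cstddef>
#include <iostream>

static const unsigned int MAX_BLOCK_SIZE     = 1000000;              // consensus rule
static const unsigned int MAX_BLOCK_SIZE_GEN = MAX_BLOCK_SIZE / 2;   // miner-side soft limit

// Consensus: every node rejects blocks over the hard limit.
bool IsValidBlockSize(std::size_t size) { return size <= MAX_BLOCK_SIZE; }

// Policy: a miner simply stops adding transactions once its own template is "full".
bool MinerWouldKeepFilling(std::size_t templateSize) { return templateSize < MAX_BLOCK_SIZE_GEN; }

int main()
{
    std::cout << std::boolalpha
              << IsValidBlockSize(750000)      << "\n"   // valid for everyone
              << MinerWouldKeepFilling(750000) << "\n";  // but a default miner wouldn't build it
}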

(In any case, the worst thing that can possibly happen to a distributed consensus system is that it fails to achieve consensus. A substantial persistently forked network is the worst possible failure mode for Bitcoin: spend all your own coins twice!  No hardfork can be tolerated that wouldn't result in a thoroughly dominant chain with near-certain probability.)

But before we can even have a discussion about increasing it, I think there must be evidence that the transaction load has gone over the optimum level for creating a market for fees (e.g. we should already be at some multiple of saturation and still see difficulty increasing, or at least holding steady).  This would also have the benefit of further incentivizing external fast payment networks, which I think must exist before any blockchain increase: it would be unwise to argue that an increase is an urgent emergency because we've painted ourselves into a corner by using the system stupidly and not investing in building the infrastructure to use it well.

Quote
You can get around it (NAT), and you can fix it (IPv6) but the former is annoying and the latter is taking forever

It's not really analogous at all.  Bitcoin has substantial limits that cannot be fixed within the architecture, unrelated to the artificial* block-size cap. The blockchain is a worldwide broadcast medium and will always scale poorly (even if rocket boosters can be strapped to that pig), the consensus it provides takes time to converge with high probability— you can't have instant confirmations,  you can't have reversals for anti-fraud (even when the parties all desire and consent to it),  and the privacy is quite weak owing to the purely public nature of all transactions.

(*artificial doesn't mean bad, unless you think that the finite supply of coin or the limitations on counterfeiting, or all of the other explicit rules of the system are also bad...)

It's important to distinguish Bitcoin the currency from Bitcoin the payment network.  The currency is worthwhile because of the highly trustworthy extreme decentralization which we only know how to create through a highly distributed and decentralized public blockchain.  But the properties of the blockchain that make it a good basis for an ultimately trustworthy worldwide currency do _not_ make it a good payment network.  Bitcoin is only as much of a payment network as it must be in order to be a currency and in order to integrate other payment networks.

Or, by analogy— gold may be a good store of value, but it's a cruddy payment system (especially online!).  Bitcoin is a better store of value— in part because it can better integrate good payment systems.

See retep's post on fidelity bonded chaum token banks for my personal current favorite way to produce infinitely scalable trustworthy payments networks denominated in Bitcoin.

Cheers,
notme | Legendary | January 31, 2013, 08:13:08 AM | #10

Quote from: gmaxwell on January 31, 2013, 07:57:13 AM
Quote from: notme on January 31, 2013, 07:55:49 AM
It's not that bad.  If the larger-block miners are >50%, they will build off the longest valid chain, so they will ignore the smaller-block miners' blocks since they have lower total difficulty.  If the smaller-block miners are >50%, they will always have the longest chain and no large blocks will ever survive reorganization for more than a couple of confirmations.  Block headers contain the hash of the previous block, so once the chain forks, the blocks built after the first split block are not compatible with the other chain.

No— "longest valid chain": all of the nodes which have not adopted your Bitcoin-prime will reject the >50% hashpower's "invalid chain"; to the 'true' Bitcoin network those miners will simply stop existing. From one currency you will have two. It is a maximally pessimal outcome at a near-50% split, and it would be against the interest of any Bitcoin user to accept a non-trivial risk of that outcome, no matter what the benefit.


I'm not sure I understand the "No".  As far as I can tell you are agreeing with me, but your notation is confusing me.

I was just refuting his claim that bitcoin prime miners would accept the blocks of the bitcoin classic miners by explaining that blocks wouldn't be compatible between chains since they have to include the hash of the previous block.

Jeweller (OP) | Newbie | January 31, 2013, 08:18:30 AM | #11

Wow - thanks for the quick, extremely well crafted responses.

da2ce7 - sorry if this is an old topic; I think my confusion stems from the wiki -- it strongly implies a consensus that the size limit will be lifted.

gmaxwell - thanks; I hadn't thought through how the blockchain would actually fork.  Yeah, you really would immediately get two completely separate chains.  Yikes.

In general I agree, the block size needs to be limited so that tx fees incentivize mining.  Overly high limits mean someone, somewhere, will mine for free, allowing people to low-ball transactions, and ruining mining incentives in general.

What I meant by the IPv4 thing is that... 1MB?  That's it?  That's something like 500,000 tx a day.  If only they had said 100MB, that wouldn't really have made any difference in the long run, and then millions of people could get their transactions in there every day.  Which is what I've often thought about with IP addresses: if only they'd done 6 bytes like a hardware MAC address, then maybe we wouldn't have to worry about it...

So, the wiki should be changed, right?  I'd say, just reading this thread, that anyone holding bitcoins would, from a conservative perspective, want to avoid the chaos of a split blockchain at all costs and not consider changing the protocol numbers.  I had been under the impression, and I think many others are, that the network (not just the currency) would in fact be scaling up enormously in the future.

As for centralization: the decentralization of the bitcoin transaction network will then suffer in a way.  Right now, anyone can send their bitcoins wherever they wish.  Years from now, when people are bidding against each other for space in the constantly over-crowded blockchain, no normal people will be able to make on-chain, published transactions...
da2ce7 | Legendary | January 31, 2013, 08:47:48 AM | #12

Quote from: gmaxwell on January 31, 2013, 08:01:14 AM
Without a sharp constraint on the maximum blocksize there is currently _no_ rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.


This is a very interesting game-theory question.

I have done some preliminary analysis of the problem and have found that gmaxwell may not be correct in this assertion.  (I used to support the removal of the max block size limit; however, I now reject it on moral grounds: the same moral grounds on which I would reject any change to the number of coins, since it would mean that past uses of the protocol happened under a false pretence.)

Now I will try and explain why a bitcoin-like protocol could be secure without the max-block-size limit.

The core issue is mixing up 'cost to attack' with 'active hash rate', when in the long run they are quite independent quantities.  Because the vast majority of miners' income currently comes from mining blocks for new bitcoin, there happens to be a very strong connection between the two; but this connection doesn't need to hold.

The first thing that we can define is the 'damage cost of double spends'.  This cost can be modelled by the equation:
cost = time × value
where time is a real number between 0 and 1.
A double spend a long time after a transaction, for a large amount, is very costly (up to the full value of the tx).
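A toy reading of that model, as code (this is one interpretation of the post, not a standard formula):
Code:
// Toy reading of the damage model above: 'time' is a finality factor in [0,1]
// -- how final the recipient already considered the payment -- and the damage
// approaches the full value of the transaction as that factor approaches 1.
#include <iostream>

double DoubleSpendDamage(double time /* 0..1 */, double valueBtc)
{
    return time * valueBtc;
}

int main()
{
    std::cout << DoubleSpendDamage(0.1, 1.0)   << " BTC\n"   // quick reversal of a small payment
              << DoubleSpendDamage(1.0, 100.0) << " BTC\n";  // late reversal of a large one: full value
}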


In the free market, in the long term, the market will sustain any fraud rate that is less than the cost of reducing it (i.e., it is cheaper to buy insurance than to increase the difficulty of an attack).
I see no reason why a bitcoin-like protocol wouldn't be subject to the same principles: the network will spend the absolute minimum on maintaining its security (either via hashing or via insurance).

So what is the cheapest form of network security?  Dynamic response to threats.
Bitcoin miners incur virtually no cost unless they are actively mining.  Bitcoin insurance companies could amass huge collections of bitcoin miners and turn them on whenever it is cheaper for them to out-mine the double-spend than to pay the insurance out.

The bitcoin network will look quite easy to attack... well, until you try to attack it.
This will raise the COST TO ATTACK (that is a constant), while the COST TO DEFEND is at a minimum: only the minimum number of miners are turned on to defend the chain when it is attacked.
Otherwise a background mining operation will be run by bitcoin companies for 'general network health'.

theymos | Administrator, Legendary | January 31, 2013, 08:59:57 AM | #13

Quote from: gmaxwell on January 31, 2013, 08:01:14 AM
It's often repeated that Satoshi intended to remove "the limit" but I always understood that to be the 500k maximum generation soft limit... quite possibly I misunderstood, but I don't understand why it would be a hardforking protocol rule otherwise.

Satoshi definitely intended to increase the hard max block size. See:
https://bitcointalk.org/index.php?topic=1347.0

I believe that Satoshi expected most people to use some sort of lightweight node, with only companies and true enthusiasts being full nodes. Mike Hearn's view is similar to Satoshi's view.

I strongly disagree with the idea that changing the max block size is a violation of the "Bitcoin currency guarantees". Satoshi said that the max block size could be increased, and the max block size is never mentioned in any of the standard descriptions of the Bitcoin system.

IMO Mike Hearn's plan would probably work. The market/community would find a way to pay for the network's security, and it would be easy enough to become a full node that the currency wouldn't be at risk. The max block size would not truly be unlimited, since miners would always need to produce blocks that the vast majority of full nodes and other miners would be able and willing to process in a reasonable amount of time.

However, enforcing a max block size is safer. It's not totally clear that an unlimited max block size would work. So I tend to prefer a max block size for Bitcoin. Some other cryptocurrency can try the other method. I'd like the limit to be set in a more decentralized, free-market way than a fixed constant in the code, though.

Quote from: Jeweller on January 31, 2013, 08:18:30 AM
So, the wiki should be changed, right?

It's not yet known how this issue will be handled. The wiki describes one possibility, and this work shouldn't be removed.

flower1024 | Legendary | January 31, 2013, 09:04:52 AM | #14

What do you think about a dynamic block size based on the number of transactions in the last blocks?
theymos | Administrator, Legendary | January 31, 2013, 09:12:09 AM | #15

Quote from: flower1024 on January 31, 2013, 09:04:52 AM
What do you think about a dynamic block size based on the number of transactions in the last blocks?

That's easily exploited. The limit shouldn't depend entirely on the block chain.

Here's one idea:
The block size limit doesn't need to be centrally-determined. Each node could automatically set its max block size to a calculated value based on disk space and bandwidth: "I have 100 GB disk space available, 10 MB per 10 minutes download speed and 1 MB per 10 minutes upload speed, so I'll stop relaying blocks [discouraging them] if they're near 1/8 MB [enough for each peer] and stop accepting them at all if they're over 2MB because I'd run out of disk space in less than a year at that rate". If Bitcoin ends up rejecting a long chain due to its max block size, it can ask the user whether he wants to switch to a lightweight mode.
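A rough sketch of that per-node calculation, using the numbers from the example above; the structure, names, and the yearly block count are illustrative, not a proposal for actual client code.
Code:
// Each node derives its own "relay" and "accept" thresholds from its upload
// bandwidth, peer count, and how long its disk should last at the current rate.
#include <iostream>

struct NodeResources {
    double diskFreeBytes;          // e.g. 100 GB
    double uploadBytesPer10Min;    // e.g. 1 MB per block interval
    int    peers;                  // peers we relay blocks to
    double targetDiskLifeYears;    // how long the disk should last at this rate
};

struct LocalLimits {
    double relayLimit;   // stop relaying blocks near this size
    double acceptLimit;  // stop accepting blocks at all above this size
};

LocalLimits ComputeLocalLimits(const NodeResources& r)
{
    // Relay limit: the upload budget per block interval, split across peers.
    double relay = r.uploadBytesPer10Min / r.peers;

    // Accept limit: the block size at which the chain would fill the disk
    // within the target lifetime (~52560 ten-minute intervals per year).
    double blocksPerYear = 52560.0;
    double accept = r.diskFreeBytes / (blocksPerYear * r.targetDiskLifeYears);

    return { relay, accept };
}

int main()
{
    // The example above: 100 GB disk, 1 MB/10min upload, 8 peers, ~1 year of disk.
    NodeResources r{ 100e9, 1e6, 8, 1.0 };
    LocalLimits lim = ComputeLocalLimits(r);
    std::cout << "relay near " << lim.relayLimit / 1e6 << " MB, "
              << "reject above " << lim.acceptLimit / 1e6 << " MB\n";
    // -> relay near 0.125 MB, reject above roughly 1.9 MB, matching the post.
}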

Users could also specify target difficulty levels that they'd like the network to have and reduce their max block size when the network's actual difficulty level drops below that. A default target difficulty level could maybe be calculated based on how fast the user's computer is -- as users' computers get faster, you'd expect mining to also get faster.

MPOE-PR | Hero Member | January 31, 2013, 09:19:23 AM | #16

Quote from: notme on January 31, 2013, 07:30:25 AM
Unspent outputs at the time of the fork can be spent once on each new chain.  Mass confusion.

No, this is actually great insurance for Bitcoin users. Practically it says that if you get Bitcoins now and Bitcoin later forks, you will have your Bitcoins in each and every individual fork. You can never "lose" your Bitcoins for being "on the wrong side" of the fork, because you'll be on all sides.

This incidentally also offers a very efficient market mechanism for handling the issue: people will probably be interested in selling fork-x Bitcoins they own to buy more fork-y Bitcoins if they believe fork-y is good or fork-x bad. This imbalance of offer/demand will quickly bring the respective price ratios into a position where continuing the "bad" fork is economically unfeasible (sure, miners could continue mining forever from a technical standpoint, but in reality people with infinite bank accounts are rare).

Quote from: gmaxwell on January 31, 2013, 08:01:14 AM
Without a sharp constraint on the maximum blocksize there is currently _no_ rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.

Bitcoin is valuable because of scarcity. One of the important scarcities is the limited supply of coins, another is the limited supply of block-space: Limited blockspace creates a market for transaction fees, the fees fund the mining needed to make the chain robust against hostile reorganization.

This is actually true.

Quote from: gmaxwell on January 31, 2013, 08:01:14 AM
(And the worst thing that can possibly happen to a distributed consensus system is that it fails to achieve consensus. A substantial persistently forked network is the worst possible failure mode for Bitcoin: spend all your own coins twice!  No hardfork can be tolerated that wouldn't result in a thoroughly dominant chain with near-certain probability.)

This is significantly overstated.

Surely from an "I want to be THE BITCOIN DEV!!!" perspective that scenario is the very avatar of complete and unmitigated disaster. The fact is however that most everyone currently propping their ego and answering the overwhelming "what is your point in this world and what are you doing with your life" existentialist questions with "I r Bitcoin Dev herp" will be out before the decade is out, and that includes you. Whether Bitcoin forks persistently or not, you still won't be "in charge" for very much longer.

Knowing that I guess you can view the matter a little closer to what it is: who cares? People do whatever they want. If they want eight different Bitcoin forks, more power to them. It will be even more decentralized that way, it will be even more difficult for "government" to "stop it" - heck, it'd be even impossible to know what the fuck anyone's talking about anymore. That failure mode of horror can very well be a survival mode of greatness, in the end. Who knows? Not me. Not you either, for that matter.

Quote from: gmaxwell on January 31, 2013, 08:01:14 AM
It's important to distinguish Bitcoin the currency from Bitcoin the payment network.  The currency is worthwhile because of the highly trustworthy extreme decentralization which we only know how to create through a highly distributed and decentralized public blockchain.  But the properties of the blockchain that make it a good basis for an ultimately trustworthy worldwide currency do _not_ make it a good payment network.  Bitcoin is only as much of a payment network as it must be in order to be a currency and in order to integrate other payment networks.

This is also very true. Bitcoin is not a payment network any more than a girl that went to Stanford and graduated top of her class is a cook: for that limited interval where she's stuck with it. Course I've been saying that for a year now and pretty much everyone just glazes over and goes into derpmode. I guess it's a distinction whose time has not yet come or something.

solex | Legendary | January 31, 2013, 09:53:14 AM (last edit: 10:43:27 AM) | #17

The max block size seems to me to be a very important issue, because 1MB is certainly too small to support a global currency with a significant user base, even if bitcoin just has a core function as a currency rather than being an all-singing, all-dancing payment system.  Like everyone here I would very much like to see bitcoin one day replace the disastrously managed fiat currencies.

My question is: Does increasing the max block size really need to be a hard fork?

Couldn't the block versioning be used as already described below regarding the introduction of version 2?

"As of version 0.7.0, a new block version number has been introduced. The network now uses version 2 blocks, which include the blockheight in the coinbase, to prevent same-generation-hash problems. As soon as a supermajority, defined as 95% of the last 1000 blocks, uses this new block version number, this version will be automatically enforced, rejecting any new block not using version 2."  (source http://blockorigin.pfoe.be/top.php)

Let's say a block size solution is determined, such as a variable limit or a simple increase to a new fixed value, and that it is planned for block version 3.

The new software change could be inactive until a supermajority of the last 1000 blocks are version 3; then the change to the max block size becomes active.  The result is close to a "soft fork" with minimum risk and disruption.  This would prevent some of the worst blockchain forking scenarios described above.
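A sketch of what that supermajority trigger could look like; the 95%/1000-block figures come from the version-2 rollout quoted above, while the version number, constants, and function names are illustrative only.
Code:
// Count how many of the last 1000 blocks carry version >= 3, and only start
// enforcing the larger size limit once 95% of them do.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

static const unsigned int OLD_MAX_BLOCK_SIZE = 1000000;   // current rule
static const unsigned int NEW_MAX_BLOCK_SIZE = 10000000;  // hypothetical v3 rule

bool SupermajorityReached(const std::vector<int>& lastBlockVersions,
                          int minVersion = 3, std::size_t window = 1000, double threshold = 0.95)
{
    std::size_t n = std::min(window, lastBlockVersions.size());
    const std::ptrdiff_t upgraded =
        std::count_if(lastBlockVersions.end() - n, lastBlockVersions.end(),
                      [&](int v) { return v >= minVersion; });
    return n == window && upgraded >= static_cast<std::ptrdiff_t>(threshold * window);
}

unsigned int EffectiveMaxBlockSize(const std::vector<int>& lastBlockVersions)
{
    return SupermajorityReached(lastBlockVersions) ? NEW_MAX_BLOCK_SIZE : OLD_MAX_BLOCK_SIZE;
}

int main()
{
    std::vector<int> versions(1000, 2);
    std::fill(versions.begin() + 40, versions.end(), 3);   // 960 of the last 1000 are v3
    std::cout << EffectiveMaxBlockSize(versions) << "\n";  // 10000000: new limit active
}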

caveden | Legendary | January 31, 2013, 09:55:26 AM | #18

Quote from: Jeweller on January 31, 2013, 07:23:52 AM
Changing this limit needs to be discussed now, before we start hitting it.

This has been discussed for a while.

I used to support the idea of an algorithm to recalculate the limit, as is done for the difficulty.  But currently I just think miners should be able to create their own limits together with multiple "tolerance levels", like "I won't accept chains containing blocks larger than X unless it's already N blocks deeper than mine".  Each miner should set their own limits.  That would push towards a consensus.  Miners with limits too different from the average would end up losing work.  The point is that this way the consensus is achieved through "spontaneous order" (decentralized), and not via a top-down decision.
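A sketch of one way such a per-miner tolerance policy could be expressed (one reading of the idea, with illustrative names and numbers):
Code:
// An oversized-block chain is only accepted once it has buried the miner's own
// chain by at least N blocks; otherwise the normal longest-chain rule applies.
#include <cstddef>
#include <iostream>

struct TolerancePolicy {
    std::size_t maxBlockSize;   // X: my personal size limit
    int         depthToYield;   // N: how far behind I must fall before giving in
};

bool AcceptCompetingChain(const TolerancePolicy& p,
                          std::size_t largestBlockInChain,
                          int competingHeight, int myHeight)
{
    if (largestBlockInChain <= p.maxBlockSize)
        return competingHeight > myHeight;                 // normal longest-chain rule
    return competingHeight >= myHeight + p.depthToYield;   // oversized: only if N deeper
}

int main()
{
    TolerancePolicy p{ 1000000, 6 };
    std::cout << std::boolalpha
              << AcceptCompetingChain(p, 1500000, 101, 100) << "\n"   // 1 ahead: ignore it
              << AcceptCompetingChain(p, 1500000, 106, 100) << "\n";  // 6 deeper: yield
}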

That said, I do have the feeling that this change will only be scheduled once we start hitting the limit.
MPOE-PR | Hero Member | January 31, 2013, 10:22:57 AM | #19

Quote from: caveden on January 31, 2013, 09:55:26 AM
Quote from: Jeweller on January 31, 2013, 07:23:52 AM
Changing this limit needs to be discussed now, before we start hitting it.

This has been discussed for a while.

I used to support the idea of an algorithm to recalculate the limit, as is done for the difficulty.  But currently I just think miners should be able to create their own limits together with multiple "tolerance levels", like "I won't accept chains containing blocks larger than X unless it's already N blocks deeper than mine".  Each miner should set their own limits.  That would push towards a consensus.  Miners with limits too different from the average would end up losing work.  The point is that this way the consensus is achieved through "spontaneous order" (decentralized), and not via a top-down decision.

That said, I do have the feeling that this change will only be scheduled once we start hitting the limit.

Probably the most sensible approach.

flower1024 | Legendary | January 31, 2013, 10:34:27 AM | #20

Quote from: theymos on January 31, 2013, 09:12:09 AM
Quote from: flower1024 on January 31, 2013, 09:04:52 AM
What do you think about a dynamic block size based on the number of transactions in the last blocks?

That's easily exploited. The limit shouldn't depend entirely on the block chain.

Here's one idea:
The block size limit doesn't need to be centrally-determined. Each node could automatically set its max block size to a calculated value based on disk space and bandwidth: "I have 100 GB disk space available, 10 MB per 10 minutes download speed and 1 MB per 10 minutes upload speed, so I'll stop relaying blocks [discouraging them] if they're near 1/8 MB [enough for each peer] and stop accepting them at all if they're over 2MB because I'd run out of disk space in less than a year at that rate". If Bitcoin ends up rejecting a long chain due to its max block size, it can ask the user whether he wants to switch to a lightweight mode.

Users could also specify target difficulty levels that they'd like the network to have and reduce their max block size when the network's actual difficulty level drops below that. A default target difficulty level could maybe be calculated based on how fast the user's computer is -- as users' computers get faster, you'd expect mining to also get faster.

I don't like that approach very much, because I think it gives too much influence to nodes.
What about this one: the block size limit is determined by the median transaction fee?

This is not very easy to game (unless you are a big pool, which would want to reduce the block size anyway, so there is no incentive).
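One possible reading of that proposal, sketched out with illustrative constants (the reference fee and the clamp range are arbitrary choices for the example, not part of the suggestion):
Code:
// Scale the next block's size limit by how the median fee in recent blocks
// compares to a reference fee, so sustained fee pressure buys more block space.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

static const std::size_t BASE_MAX_BLOCK_SIZE = 1000000;   // 1 MB baseline
static const double      REFERENCE_FEE       = 0.0005;    // BTC; arbitrary reference point

double MedianFee(std::vector<double> fees)
{
    std::sort(fees.begin(), fees.end());
    return fees.empty() ? 0.0 : fees[fees.size() / 2];
}

std::size_t NextMaxBlockSize(const std::vector<double>& recentFees)
{
    double scale = MedianFee(recentFees) / REFERENCE_FEE;
    // Clamp so the limit can only drift gradually (between 1x and 2x here).
    scale = std::max(1.0, std::min(2.0, scale));
    return static_cast<std::size_t>(BASE_MAX_BLOCK_SIZE * scale);
}

int main()
{
    std::vector<double> fees = { 0.0004, 0.0005, 0.0008, 0.001, 0.0009 };
    std::cout << NextMaxBlockSize(fees) << " bytes\n";   // median 0.0008 -> ~1.6 MB
}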