Bitcoin Forum
Author Topic: Please do not change MAX_BLOCK_SIZE  (Read 13023 times)
piotr_n (OP)
Legendary
Activity: 2053
Merit: 1354
aka tonikt
May 31, 2013, 03:06:02 PM
Last edit: May 31, 2013, 03:35:49 PM by piotr_n
Merited by ABCbits (1)
#1

I watched Gavin's presentation from the San Jose conference and I learned that it is actually planned to increase the MAX_BLOCK_SIZE within the next 10 or 20 months.

Please don't do it.

As I have read it many times before, and I completely agree with it: Bitcoin is not designed for micro-transactions.

The network does not scale, we have a worldwide economic crisis, and Moore's law no longer seems to apply; our internet connections are stuck at what DSL copper can do, and CPUs are not gaining speed the way they were 10 years ago.

As an average bitcoin user, but also a developer who understands all the pros and cons behind increasing MAX_BLOCK_SIZE, I beg you: don't do it! Instead, do the other thing that Gavin mentioned in his presentation: implement a proper fee discovery mechanism into the client, so anyone would be able to decide how much fee he needs to pay to have his tx mined within the next N hours...

Please let the fee market work. Transaction fees are a great feature invented by Satoshi - don't break it, take advantage of it instead. They will take the load off the network, and the network needs it.
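
For what it's worth, here is a minimal sketch of what such a fee-discovery heuristic could look like, assuming the client can observe the fee rates and confirmation times of recently mined transactions (the bucketing, names and sample data below are hypothetical, not anything in the reference client):

Code:
# Hypothetical client-side fee-discovery heuristic (illustration only).
# Given observations of (fee rate, hours until confirmation) taken from
# recently mined transactions, return the cheapest fee-rate bucket whose
# transactions confirmed within the target window reliably enough.

def estimate_feerate(observations, target_hours, reliability=0.9):
    buckets = {}
    for feerate, hours in observations:
        bucket = round(feerate, 4)                    # coarse fee-rate buckets (BTC/kB)
        total, hits = buckets.get(bucket, (0, 0))
        buckets[bucket] = (total + 1, hits + (1 if hours <= target_hours else 0))
    for bucket in sorted(buckets):                    # cheapest qualifying bucket wins
        total, hits = buckets[bucket]
        if hits / total >= reliability:
            return bucket
    return None                                       # no data: fall back to a default fee

# Fabricated example data: (fee rate in BTC/kB, hours to confirm)
observed = [(0.0001, 12.0), (0.0001, 8.0), (0.0005, 0.7), (0.0005, 1.5), (0.001, 0.2)]
print(estimate_feerate(observed, target_hours=3))     # -> 0.0005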

Check out gocoin - my original project of full bitcoin node & cold wallet written in Go.
PGP fingerprint: AB9E A551 E262 A87A 13BB  9059 1BE7 B545 CDF3 FD0E
"If you don't want people to know you're a scumbag then don't be a scumbag." -- margaritahuyan
DeathAndTaxes
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
May 31, 2013, 03:10:32 PM
Merited by ABCbits (2)
#2

Increasing MAX_BLOCK_SIZE to a larger but reasonable limit will still allow a fee economy and most users will still be able to run a full node.  Personally I believe block size needs to be limited, it just doesn't need to be limited at 1MB.  Raising it to some larger limit (5MB, 10MB, 20MB) would provide "breathing space" and time to develop a more comprehensive strategy for scaling bitcoin.

I think there are lots of differing views on MAX_BLOCK_SIZE and some try to make it a binary choice.  These are just the views I have seen expressed:
a) "1MB now, 1MB tomorrow, 1MB forever"
b) Don't raise the limit "now" (where now can be anywhere from this year to the next couple of years)
c) Raise the limit but don't remove it
d) Remove the cap and hope the "free market" (which Bitcoin mining isn't) adapts without destroying this $1B experiment.
e) Implement some algorithm for raising the block size deterministically over time.

In theory I prefer e over c, but it should be discussed, analyzed, peer reviewed, and tested over an extended period of time, thus in the short term I prefer c.  I would also point out that despite the block size limit being 1MB, most blocks (even with unconfirmed tx waiting) are much smaller, which means major miners are taking a cautious approach to raising the actual block size they create.
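
For option (e), the rule could be as simple as a hard-coded growth schedule; a minimal sketch, assuming a made-up activation height and doubling interval (none of these numbers are a concrete proposal):

Code:
# Sketch of a deterministic block-size schedule (option e above).
# The starting size, activation height, and doubling interval are
# arbitrary illustration values, not a concrete proposal.

GENESIS_LIMIT_BYTES = 1_000_000      # 1 MB, the current MAX_BLOCK_SIZE
SCHEDULE_START_HEIGHT = 250_000      # hypothetical activation height
BLOCKS_PER_DOUBLING = 210_000        # hypothetical: roughly every four years

def max_block_size(height):
    """Return the consensus block-size limit at a given block height."""
    if height < SCHEDULE_START_HEIGHT:
        return GENESIS_LIMIT_BYTES
    doublings = (height - SCHEDULE_START_HEIGHT) // BLOCKS_PER_DOUBLING
    return GENESIS_LIMIT_BYTES * (2 ** doublings)

print(max_block_size(249_999))   # 1000000
print(max_block_size(460_000))   # 2000000 (one doubling later)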


As for low bandwidth residential users what I think really needs to happen is for the "wallet" and "node" functions of bitcoind to be separated*.  This would allow someone to run a full node on a VPS and their local wallet would communicate securely with their node.  The link between nodes won't get any lighter but the link between the user's wallet and their own full node will.  There are also alternatives like electrum, future SPV clients, eWallets, etc but I think the ability for a user to run a full node in the "cloud" and then connect securely to it would be powerful especially as the demands for a full node continue to rise. 


* Yes devs I have looked at the code and refactoring it will be a nightmare but it is something which needs to be done IMHO for a lot of reasons.  I would gladly contribute towards a bounty if someone with extensive knowledge is interested.
Sukrim
Legendary
Activity: 2618
Merit: 1006
May 31, 2013, 03:10:44 PM
#3

Fighting over the few transactions per second that are possible right now is, in my opinion, too limiting. I would agree to not having 1 TB as MAX_BLOCK_SIZE, but 10-100 MB should still be possible.

A 56k modem can still just about keep up with ~4 MB blocks. Roll Eyes

https://www.coinlend.org <-- automated lending at various exchanges.
https://www.bitfinex.com <-- Trade BTC for other currencies and vice versa.
fornit
Hero Member
Activity: 991
Merit: 1008
May 31, 2013, 03:12:33 PM
#4

It's not about microtransactions. In a year or two, 1MB won't be enough even for normal commerce with transactions only >$10.


Btw, the plan is actually to let the market decide the block size, not to allow unlimited blocks.
etotheipi
Legendary
expert
Activity: 1428
Merit: 1093
Core Armory Developer
May 31, 2013, 03:14:36 PM
Merited by ABCbits (2)
#5

I watched Gavin's presentation from the San Jose conference and I learned that it is actually planned to increase the MAX_BLOCK_SIZE within the next 10 or 20 months.

Please don't do it.

As I have read it many times before, and I completely agree with it: Bitcoin is not designed for micro-transactions.

The network does not scale, we have a worldwide economic crisis, and Moore's law no longer seems to apply; our internet connections are stuck at what DSL copper can do, and CPUs are not gaining speed the way they were 10 years ago.

As an average bitcoin user, but also a developer who understands all the pros and cons behind increasing MAX_BLOCK_SIZE, I beg you: don't do it! Instead, do the other thing that Gavin mentioned in his presentation: implement a proper fee discovery mechanism into the client, so anyone would be able to decide how much fee he needs to pay to have his tx mined within the next N hours...

Please let the fee market work. Transaction fees are a great feature invented by Satoshi - don't break it, take advantage of it instead. They will take the load off the network, and the network needs it.

This isn't about microtransactions.  It's about a currency system that is supposed to be global, but can only handle 6 tx/sec.  That's just not enough, even if we somehow limit all transactions to 250B and >10 BTC.

There's no question that blocksize should be limited.  The question is whether 1 MB was the correct answer.  I think the answer is a resounding "no."  Bitcoin can't do what it was supposed to do at 1MB.  All other properties of the system can be maintained with a higher blocksize limit, but Bitcoin can't grow with the current one.
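
The 6-7 tx/sec figure follows directly from the constants involved; a quick back-of-envelope check, assuming ~250-byte transactions and the 10-minute block target:

Code:
# Rough throughput ceiling implied by a 1 MB block size limit.
MAX_BLOCK_SIZE = 1_000_000   # bytes
AVG_TX_SIZE    = 250         # bytes, the small-transaction figure used above
BLOCK_INTERVAL = 600         # seconds (10-minute target)

tx_per_block = MAX_BLOCK_SIZE / AVG_TX_SIZE          # ~4000
tx_per_second = tx_per_block / BLOCK_INTERVAL        # ~6.7
print(f"{tx_per_second:.1f} tx/sec")                 # -> 6.7 tx/sec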

Founder and CEO of Armory Technologies, Inc.
Armory Bitcoin Wallet: Bringing cold storage to the average user!
Only use Armory software signed by the Armory Offline Signing Key (0x98832223)

Please donate to the Armory project by clicking here!    (or donate directly via 1QBDLYTDFHHZAABYSKGKPWKLSXZWCCJQBX -- yes, it's a real address!)
DeathAndTaxes
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
May 31, 2013, 03:20:09 PM
#6

This isn't about microtransactions.  It's about a currency system that is supposed to be global, but can only handle 6 tx/sec.  That's just not enough, even if we somehow limit all transactions to 250B and >10 BTC.  There's no question that blocksize should be limited.  The question is whether 1 MB was the correct answer.  I think the answer is a resounding "no."  Bitcoin can't do what it was supposed to do at 1MB.  All other properties of the system can be maintained with a higher blocksize limit, but Bitcoin can't grow with the current one.

Well said.  To put it into context, PayPal is about 50 tps.  I think a short-term goal of Bitcoin eclipsing PayPal in transaction volume would be a pretty impressive milestone; that would require an average block size of ~10MB, and given the transaction volume distribution by hour of the day (hmm, maybe interesting to chart that out) it might require a max block size of ~20MB.
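
Working backwards from a throughput target gives roughly those block sizes; a quick sketch, assuming an average transaction size of ~350 bytes (that figure is an assumption, not from the post):

Code:
# Back-of-envelope: what average block size does a target throughput imply?
def avg_block_size_mb(target_tps, avg_tx_bytes=350, block_interval_s=600):
    return target_tps * block_interval_s * avg_tx_bytes / 1e6

print(avg_block_size_mb(50))       # ~10.5 MB average block to sustain ~50 tps (PayPal-scale)
print(avg_block_size_mb(50) * 2)   # ~21 MB max, allowing headroom for peak-hour bursts
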
piotr_n (OP)
Legendary
Activity: 2053
Merit: 1354
aka tonikt
May 31, 2013, 03:20:57 PM
#7

Everyone here seems to agree that sooner or later we will get to the point where we have to say 'enough is enough'.

So why not just keep this point at 1MB, as Satoshi originally designed?

If you don't increase MAX_BLOCK_SIZE, people will naturally start using BTC payment processors, which will take the load off the network - and that has to happen eventually anyway.

Check out gocoin - my original project of full bitcoin node & cold wallet written in Go.
PGP fingerprint: AB9E A551 E262 A87A 13BB  9059 1BE7 B545 CDF3 FD0E
DeathAndTaxes
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
May 31, 2013, 03:21:01 PM
#8

Is Gavin's talk available on video?  I saw it at the conference but it might help in the discussion if the OP could link to the talk.
Shevek
Sr. Member
Activity: 252
Merit: 250
May 31, 2013, 03:21:40 PM
#9


As I have read it many times before, and I completely agree with it: Bitcoin is not designed for micro-transactions.


So, propaganda fails.

Again.

Proposals for improving bitcoin are like asses: everybody has one
1SheveKuPHpzpLqSvPSavik9wnC51voBa
fornit
Hero Member
Activity: 991
Merit: 1008
May 31, 2013, 03:33:02 PM
#10

This isn't about microtransactions.  It's about a currency system that is supposed to be global, but can only handle 6 tx/sec.  That's just not enough, even if we somehow limit all transactions to 250B and >10 BTC.

Worse still, hitting the block size limit early, particularly before the clients start handling fees and pending transactions in a more graceful manner, might severely slow down Bitcoin adoption. Gavin thinks there are still around 12 months before we hit 1MB blocks. I think that's a very conservative estimate of Bitcoin's growth; imho, it might well happen within 3-6 months.

Quote
There's no question that blocksize should be limited.  The question is whether 1 MB was the correct answer.  I think the answer is a resounding "no."  Bitcoin can't do what it was supposed to do at 1MB.  All other properties of the system can be maintained with a higher blocksize limit, but Bitcoin can't grow with the current one.

Plus, current average computers and network connections can easily work with, say, 10MB blocks. There is no real risk or assumption about future computer capabilities involved. A moderate increase in block size is a safe bet, and it doesn't make any judgement about the necessity or viability of some parallel or overlay technology to limit the number of on-chain transactions and blockchain size long-term.


Everyone here seems to agree that sooner or later we will get to the point where we have to say 'enough is enough'.

So why not just keep this point at 1MB, as Satoshi originally designed?

If you don't increase MAX_BLOCK_SIZE, people will naturally start using BTC payment processors, which will take the load off the network - and that has to happen eventually anyway.

Don't make a practical discussion ideological. 1MB is much lower than necessary and might hurt Bitcoin severely in the short term.
piotr_n (OP)
Legendary
Activity: 2053
Merit: 1354
aka tonikt
May 31, 2013, 03:34:54 PM
#11

Is Gavin's talk available on video?  I saw it at the conference but it might help in the discussion if the OP could link to the talk.
http://www.youtube.com/watch?v=JfF5mJDgZWc

Check out gocoin - my original project of full bitcoin node & cold wallet written in Go.
PGP fingerprint: AB9E A551 E262 A87A 13BB  9059 1BE7 B545 CDF3 FD0E
DeathAndTaxes
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
May 31, 2013, 03:35:25 PM
Last edit: May 31, 2013, 03:46:09 PM by DeathAndTaxes
#12

Everyone here seems to agree that sooner or later we will get to the point where we have to say 'enough is enough'.

Agreed, however there is a debate on whether 1MB is "enough".  Most people would say it isn't, and that likely means "enough" will be moved to a higher number at some point in the future, in some manner.  It would be smarter to focus on the how & when, given that an eventual increase is almost a certainty.

Quote
So why not just keep this point at 1MB, as Satoshi originally designed?

Bitcoin was never designed with a 1MB limit.  You can check: it doesn't exist in the early versions of the source code.  It was added later as a safety limit, to prevent an early attacker from massively bloating the blockchain and thus killing off the project.  Imagine if in 2010 there had been no block limit and you had to download a 5TB blockchain just to start using this experimental currency with very little actual value or use.  Most people wouldn't, and the "ecosystem" might have died in the crib.  1MB limited the growth of the blockchain to no more than ~52GB per year.  That bound was high, and luckily early volume was much lower, but it provided an upper limit while Bitcoin was young.  When the average block has 2 to 8 transactions, it doesn't make sense to let a single bad actor add GBs worth of transactions to hinder future users.  Bitcoin is far more developed now, so the time to take the training wheels off is likely near.

Quote
If you don't increase MAX_BLOCK_SIZE, people will naturally start using BTC payment processors, which will take the load off the network - and that has to happen eventually anyway.

Sure.  However the debate then becomes at what point, and how much, goes off blockchain.  In a perfect world I would say MAX_BLOCK_SIZE should be large enough to allow anyone to run a full node with "reasonable" resources.  "Reasonable" being loosely defined as the resources (storage, computation, and bandwidth) available to a dedicated user willing to pay to be a peer in a global network (in other words, not everyone will run a full node, but most "could").  Yeah, that is a very gray term, but it is more important to look at what MAX_BLOCK_SIZE does conceptually to the idea of centralization:

Let's look at two extreme futures where Bitcoin is massively adopted (say 100x current usage in terms of users, merchants, and tx volume, both on and off blockchain):
- average block size = 1MB: max annual on-blockchain tx volume = ~3.6 million transactions; tx fees relatively high; the overwhelming majority of tx occur off blockchain; the blockchain becomes a sort of open interbank settlement system *
- average block size = 5GB: max annual on-blockchain tx volume = ~18 billion transactions; tx fees relatively low; most non-micro tx remain on blockchain; the blockchain in theory can be used by anyone, however the cost of a full node excludes most **

* At a 1MB average block, while the cost of running a full node is relatively trivial, the cost of transactions would exclude all but the largest bulk transactions.  Remember, as the subsidy declines the tx fees pay for the cost of securing the network.  So either Bitcoin is popular and thus fees are high (because only ~3.6 million tx can occur annually), or Bitcoin is unpopular and thus becomes less and less secure as the subsidy declines.

** At 5GB, just about anyone running a full node can directly interact with the blockchain at low transaction cost, however the resources for a full node would be on the order of:
- 262TB per year in storage requirements
- 500 Mbps connectivity (bidirectional); this could likely be reduced by up to 80% with optimal tx and block header sharing, but it would still be high
- a memory pool of ~2 blocks' worth of tx would be ~26 million transactions, thus RAM requirements (to avoid painfully slow disk lookups during validation) would be something like 32GB
- the UTXO set is likely very large, as the number of independent direct users of the blockchain is high.  It is hard to estimate, but we can expect the UTXO set to be large, and efficient validation requires at least a significant portion of it to be in high-speed memory.

Both extremes result in centralization.
The low limit results in a centralization of transactions: it becomes too limited and expensive to transact on the blockchain, so most tx occur off blockchain.
The high limit results in a centralization of nodes: the extreme cost of running a node means there will be fewer of them.

The "optimal" blocksize would be one that perfectly balances the centralization of transactions against the centralization of nodes.   Now 1MB obviously isn't that perfect limit and whatever the limit is raised to likely isn't either but it certainly moving in the right direct.  In other words a 10MB limit is closer to optimal than 1MB is.
ktttn
Full Member
Activity: 126
Merit: 100
Capitalism is the crisis.
May 31, 2013, 03:44:29 PM
#13

Recalling the 5430 dust limit controversy, can the 1MB limit be customized by each miner to suit their needs?

Wit all my solidarities,
-ktttn
Ever see a gutterpunk spanging for cryptocoins?
LfkJXVy8DanHm6aKegnmzvY8ZJuw8Dp4Qc
BitcoinAshley
Sr. Member
Activity: 448
Merit: 250
May 31, 2013, 03:49:28 PM
Last edit: May 31, 2013, 04:09:36 PM by BitcoinAshley
#14

The "not designed for microtransactions" argument is irrelevant in this context, unless you consider $7.50 worth of bitcoins to be a microtransaction. As was stated earlier in the thread, a 1MB limit won't even allow for affordable MACRO transactions if we keep it where it is.

Current hardware could easily support 5-10-20 MB block sizes and we aren't even at a point where we need blocks that large. Stop making something out of nothing, folks.

In addition, a full-time dev has been 100% crowdfunded for the next 3-6 months to develop a lite node that is fully validating and requires 0 trust of any central server. This would make hardware issues moot for "average joe users" who don't want to rely on an online wallet if the full blockchain is too much for their 2008 box.

There is really no issue here aside from folks not doing their research and freaking out about problems that have already been solved.
DeathAndTaxes
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
May 31, 2013, 03:49:55 PM
#15

Recalling the 5430 dust limit controversy, can the 1MB limit be customized by each miner to suit their needs?

That would essentially be an unlimited blocksize.  It is possible and I don't want to say it shouldn't be done but I would point out that the reality is mining is heavily centralized.  If the x largest pools (where x is >51% of hashing power) decide to use a 100MB block it doesn't really matter what you think as virtually all miners will accept their blocks or risk (due to >51% control) falling permanently off the longest chain.  

To avoid a continual set of hard forks, any limit imposed on blocks created by other miners would need to be a soft limit.  As an example, the node would not consider the chain with blocks greater than its limit to be the longest until it is some x blocks (also decided by the node) ahead of the longest "compliant" chain.  This would result in soft forks that over time would be resolved, and in theory it could work.  In reality I personally believe it would add a lot of complexity with no real benefit.  Game theory says that it would be optimal for the largest players (say the top 3 or 4 mining pools) to agree on a certain size to avoid orphans between each other.  Those major pools attract hashing power by keeping their revenue high, and that means keeping orphans low.  Agreeing on a set block size would ensure the pools aren't competing against each other.  Other pools, even if not in the official agreement, will have little choice but to follow it, since blocks larger than the agreement won't be extended by the largest pools and have a mathematical certainty of always being orphaned eventually.  The settings of every other node will be meaningless, as the limit is soft and they will adopt the larger blocks once they are x blocks ahead (which will always happen eventually).
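
A rough sketch of the soft-limit rule described above, purely to make the mechanics concrete (the data structures and thresholds are invented, not an existing client feature):

Code:
# Per-node "soft limit" chain selection (illustrative sketch).
# A chain containing blocks over this node's limit is only adopted once it
# pulls at least x blocks ahead of the best fully "compliant" chain.

def choose_chain(chains, my_limit_bytes, x):
    """chains: candidate chains, each a list of block sizes (bytes) from the fork point."""
    def compliant(chain):
        return all(size <= my_limit_bytes for size in chain)

    best_compliant = max((c for c in chains if compliant(c)), key=len, default=[])
    best_overall = max(chains, key=len, default=[])

    # Follow an over-limit chain only once it is x or more blocks ahead.
    if len(best_overall) >= len(best_compliant) + x:
        return best_overall
    return best_compliant

# Example: a chain of 2 MB blocks is only followed once it is 6 blocks ahead.
small = [900_000] * 10
big   = [2_000_000] * 16
print(choose_chain([small, big], my_limit_bytes=1_000_000, x=6) is big)   # True
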
piotr_n (OP)
Legendary
Activity: 2053
Merit: 1354
aka tonikt
May 31, 2013, 03:57:58 PM
#16

Thanks @DeathAndTaxes
You make some good points here that I had not considered.
Still, I don't quite understand the argument why 10MB is "better" than 1MB.
Better for whom? And can you prove it? Wink

Someone asked me whether $7.50 is a micro-transaction...
Well: if it's lower than the remaining 2000+ txs that went into the block - then yes.
Obviously, whether something is micro or not is a relative term.
If you are aiming for Bitcoin to become the new global currency, then even $1000 would be a micro-transaction.

Check out gocoin - my original project of full bitcoin node & cold wallet written in Go.
PGP fingerprint: AB9E A551 E262 A87A 13BB  9059 1BE7 B545 CDF3 FD0E
fornit
Hero Member
Activity: 991
Merit: 1008
May 31, 2013, 04:31:29 PM
#17

Someone asked me whether $7.50 is a micro-transaction...
Well: if it's lower than the remaining 2000+ txs that went into the block - then yes.
Obviously, whether something is micro or not is a relative term.
If you are aiming for Bitcoin to become the new global currency, then even $1000 would be a micro-transaction.

Circular logic alarm!

Bitcoin is not for microtransactions
-> the 1MB limit will stop microtransactions
-> everything that doesn't fit into 1MB is a microtransaction
-> Bitcoin is not for microtransactions
piotr_n (OP)
Legendary
Activity: 2053
Merit: 1354
aka tonikt
May 31, 2013, 04:38:08 PM
#18

Circular logic alarm!

Bitcoin is not for microtransactions
-> the 1MB limit will stop microtransactions
-> everything that doesn't fit into 1MB is a microtransaction
-> Bitcoin is not for microtransactions

I just mentioned that I read it all the time: "bitcoin is not for microtransactions" - and I agree with it.
But it was not my main point - rather a by-the-way argument.

My point was: if it doesn't scale further, just take advantage of its fine design, which has built-in mechanisms to solve it.
Though, on the other hand, I understand the politics and the need to balance resource consumption against making the system available to anyone, in order for it to become successful.
So yeah, why not 10MB Smiley

Check out gocoin - my original project of full bitcoin node & cold wallet written in Go.
PGP fingerprint: AB9E A551 E262 A87A 13BB  9059 1BE7 B545 CDF3 FD0E
justusranvier
Legendary
Activity: 1400
Merit: 1009
May 31, 2013, 05:24:08 PM
#19

That would essentially be an unlimited blocksize.  It is possible and I don't want to say it shouldn't be done but I would point out that the reality is mining is heavily centralized.  If the x largest pools (where x is >51% of hashing power) decide to use a 100MB block it doesn't really matter what you think as virtually all miners will accept their blocks or risk (due to >51% control) falling permanently off the longest chain.
This is the real problem.

There's no reason to be concerned about blocks getting too big in the absence of a protocol limit unless you're also worried about some entity gaining majority control of the network, and in the case of an entity gaining majority control they can interfere with the network in other ways with or without a protocol-specified maximum block size.

I think the number of non-mining full nodes is going to increase drastically as more businesses begin to adopt Bitcoin, especially as we see node implementations become heterogeneous with the emergence of Bits of Proof and btcd. The obvious solution, if you're worried about mining pool misbehavior, is to give all full nodes tools for specifying the maximum block sizes they will relay. If a large pool tries to make an excessively large block, it won't do them any good if the rest of the network refuses to relay it and the other miners refuse to build on it.
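
That per-node relay policy is easy to picture; a hypothetical sketch (this is not an actual bitcoind option, just an illustration of the idea):

Code:
# Hypothetical per-node relay policy for block size: each node operator
# decides the largest block it is willing to forward, independent of the
# consensus MAX_BLOCK_SIZE.

RELAY_MAX_BLOCK_BYTES = 2_000_000   # chosen by the node operator

def should_relay_block(block_size_bytes, relay_limit=RELAY_MAX_BLOCK_BYTES):
    # Oversized blocks may still be valid by consensus; this node simply
    # refuses to forward them, so a miner producing them risks slow propagation.
    return block_size_bytes <= relay_limit

print(should_relay_block(900_000))       # True
print(should_relay_block(100_000_000))   # False: a 100 MB block is not relayed
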
Saturn7
Full Member
Activity: 147
Merit: 100
May 31, 2013, 05:27:39 PM
#20

This puppy ain't gonna fly with a 7 Transactions per second limit, even if it was only reserved for large transactions.




First there was Fire, then Electricity, and now Bitcoins Wink