Bitcoin Forum
Poll
Question: What high-level approach should be taken regarding the hard limit of a one megabyte block size  (Voting closed: March 07, 2013, 11:33:20 AM)
Leave as one megabyte - 20 (21.5%)
Let Core Dev decide - 34 (36.6%)
Agree to increase - 39 (41.9%)
Total Voters: 93

Author Topic: Max Block Size Limit: the community view [Vote - results in 14 days]  (Read 1938 times)
solex (OP)
Legendary
*
Offline

Activity: 1078
Merit: 1006


100 satoshis -> ISO code


View Profile
February 21, 2013, 11:33:20 AM
 #1

Satoshi Nakamoto decided upon a one-megabyte block size limit for Bitcoin. He viewed it as a temporary measure, but because it has remained fixed for so long, some now reasonably regard it as permanent. It means the Bitcoin blockchain currently cannot grow faster than 1 MB every 10 minutes.

If the limit remains fixed then Future A will be the road ahead. If the limit is raised or set by an algorithm, then Future B lies ahead.

Future A: Bitcoin will never handle more than 7 transactions per second. This is a tiny fraction of the volume handled by major payment systems. Eventually 99+% of transactions will need to be handled by third-party services (such as Coinbase), bitcoin "banks", trusted transfer layers or even alt-chains. Only large transactions (with high fees) get included on the main blockchain. Bitcoin never makes large payment systems (Visa, Mastercard, PayPal) obsolete, but full nodes can be run by individuals with moderate computing power and bandwidth. People can personally verify the integrity of the bitcoin network, even though it operates analogously to an internet backbone, but for currency, and people's transactions and holdings are held elsewhere. Mining the main blockchain will remain a niche business.

Future B: Bitcoin grows to handle hundreds or thousands of transactions per second. Perhaps Moore's Law can keep up so that some individuals can run full nodes, although lightweight nodes will be necessary for nearly all. Big companies run nodes. All transactions which people want to have on the main blockchain (with low fees) are supported. Third-party services do not have to be used unless they add value, faster confirmations, etc. Existing large payment systems will wither. Mining the main blockchain will become a big business.

It is fair to say that this subject has been exhaustively discussed in many threads. There are many arguments for and against, so people are invited to wade through the history of the debate before deciding to vote.

Any change will require a hard fork, i.e. everyone will need to upgrade their software. Based upon transaction growth rates, it will likely be necessary before the end of 2013. Of course, if no change is decided upon, then the limit will become a real constraint: fees will rise, zero-fee transactions will no longer be processed, and third-party services must fill the gap.

Jgarzik > [PATCH] increase block size limit
https://bitcointalk.org/index.php?topic=1347.0

caveden >  Block size limit automatic adjustment
https://bitcointalk.org/index.php?topic=1865.0

barbarousrelic > Max block size and transaction fees
https://bitcointalk.org/index.php?topic=96097.0

Jeweller > The MAX_BLOCK_SIZE fork
https://bitcointalk.org/index.php?topic=140233.0

Misterbigg > Why do people pay fees? Why are free transactions accepted by miners?
https://bitcointalk.org/index.php?topic=144421.0

Retep  > How a floating blocksize limit inevitably leads towards centralization
https://bitcointalk.org/index.php?topic=144895.0

notig > The fork
https://bitcointalk.org/index.php?topic=145072.0

hazek  > Why the Bitcoin rules can't change (reading time ~5min)
https://bitcointalk.org/index.php?topic=145475.0

solex (OP)
Legendary
*
Offline

Activity: 1078
Merit: 1006




View Profile
February 21, 2013, 11:36:10 AM
Last edit: February 21, 2013, 08:58:28 PM by solex
 #2

Bitcoin is now a US$300 million enterprise. You have a seat at the top table and are being asked for an opinion on a strategic direction, a high-level approach. That is why a minimal set of options is presented.

"Leaving as one megabyte" could also be a rejection of any future hard-fork proposals which are not unanimously accepted by those running full nodes.

"Let Core Dev decide" could also be an abstention.

"Agree to increase" is simply consent that those with technical knowledge can go away and work out the detail, which could be anything - but it is a change. If you have technical knowledge you might want to suggest ideas, but that is jumping the gun and out of scope of this high-level discussion.

Future X is what is actually going to happen. No one knows what this will look like in detail; all we can say is that because changing the max block size is a binary decision (yes/no), either Future A or Future B will more closely resemble Future X.

In the commercial space, "senior user" opinion is very important. All successful enterprises need to take their feedback seriously:

https://bitcointalk.org/index.php?topic=145498.msg1543934#msg1543934

If other senior user opinions come to light then I will link them here too.


cypherdoc
Legendary
*
Offline

Activity: 1764
Merit: 1002



View Profile
February 21, 2013, 12:54:20 PM
 #3

Thanks for this.
Jeweller
Newbie
*
Offline

Activity: 24
Merit: 1


View Profile
February 21, 2013, 01:01:42 PM
 #4

My answer is none of the above.
Might I suggest another choice:

"Whatever everyone else is doing."

This position could be criticized as sheep-like, certainly: unthinking, just following. But that is exactly what I think is best. Honestly, I don't really think a 1MB block limit is The End of Bitcoin. And a 100MB block limit wouldn't be either. Some kind of well-designed variable limit based on fees over the last 2016 blocks, or difficulty, or something smart, sure, that'd be OK too.

You know what WOULD be The End of Bitcoin though?  If half the people stick to 1MB blocks, and the other half go to 10MB.  Then that's it, game over man.  MTGox price per bitcoin would plummet, right?  Because... MTGox price per WHICH bitcoin?  Etc.

So I'll go with what everyone else is doing.  And everyone else should too.  (There may be some logical feedback loop there...)  If there is a fork, it needs to come into effect only after substantially all of the last few weeks' worth of transactions are with a big-block-compatible client, miners and all.  Only then should any change be made.
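[Editor's sketch] Jeweller's "only after substantially everyone has upgraded" condition can be sketched as a supermajority check over recent blocks. The window, threshold, and version numbering below are illustrative assumptions, not anything proposed in the thread:

```python
# Toy sketch: activate bigger blocks only once an overwhelming majority of
# recent blocks were produced by upgraded (big-block-capable) software.
WINDOW = 2016        # roughly two weeks of blocks (hypothetical choice)
THRESHOLD = 0.95     # "substantially all" (hypothetical choice)

def big_blocks_active(recent_block_versions, new_version=2):
    """recent_block_versions: version fields of the last WINDOW blocks."""
    signalling = sum(1 for v in recent_block_versions if v >= new_version)
    return signalling / len(recent_block_versions) >= THRESHOLD

# 1900 of 2016 upgraded blocks (~94.2%) would not yet trigger the fork.
print(big_blocks_active([2] * 1900 + [1] * 116))
```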
hazek
Legendary
*
Offline

Activity: 1078
Merit: 1003


View Profile
February 21, 2013, 01:19:02 PM
 #5

My answer: None of the above.

I will agree to, even advocate for, any rule change which solves a technical problem but leaves the core principles of Bitcoin, immediately and into the foreseeable future, intact, the most important of which is what I call my Bitcoin sovereignty.

My personality type: INTJ - please forgive my weaknesses (Not naturally in tune with others feelings; may be insensitive at times, tend to respond to conflict with logic and reason, tend to believe I'm always right)

If however you enjoyed my post: 15j781DjuJeVsZgYbDVt2NZsGrWKRWFHpp
Meni Rosenfeld
Donator
Legendary
*
Offline

Activity: 2058
Merit: 1054



View Profile
February 21, 2013, 01:26:36 PM
 #6

You are presenting a false dichotomy between "payments are transactions in the blockchain" and "payments are processed by traditional service providers".

Bitcoin is a powerful technology which allows more advanced applications.

If you make multiple payments to a given merchant you can use payment channels. And if you want more flexibility you can add a third party to the mix, but with an absolutely minimal trust requirement and dependency, which is nothing like traditional providers.


Anyway, the block size limit should eventually be increased, but not by an algorithm.

1EofoZNBhWQ3kxfKnvWkhtMns4AivZArhr   |   Who am I?   |   bitcoin-otc WoT
Bitcoil - Exchange bitcoins for ILS (thread)   |   Israel Bitcoin community homepage (thread)
Analysis of Bitcoin Pooled Mining Reward Systems (thread, summary)  |   PureMining - Infinite-term, deterministic mining bond
Sukrim
Legendary
*
Offline

Activity: 2618
Merit: 1007


View Profile
February 21, 2013, 01:40:59 PM
 #7

According to https://en.bitcoin.it/wiki/Scalability#Network an average transaction is ~0.5 kB (let's say 500 bytes).

Max block size = 1 million bytes, every 10 minutes. This means 2,000 transactions per 10 minutes, 200 per minute and 3.333... transactions per second.
The minimum transaction size is ~0.2 kB --> that's where the famous 7 transactions per second come from.

One question is then also what to do with more complex transactions if only 7 "minimal" transactions are possible per second (e.g. can we "dumb down" smarter transactions?) and so on.

By the way, at the moment we're only close to bumping against a quarter(!) of this limit; the reason this is being discussed now is that Bitcoin might continue to gain traction and hit the current hard limit of 1 million bytes per block in much less than another 12 years.
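[Editor's sketch] Sukrim's arithmetic can be checked with a few lines of Python. The byte figures are the wiki's rough estimates, not protocol constants:

```python
# Back-of-the-envelope throughput ceiling implied by the 1 MB block limit,
# using the size estimates from the Scalability wiki page:
#   average transaction ~500 bytes, minimal transaction ~200 bytes.
MAX_BLOCK_BYTES = 1000000
BLOCK_INTERVAL_SECONDS = 600  # one block every ~10 minutes on average

def tx_per_second(tx_bytes):
    """Transactions per second if every block is packed with same-size txs."""
    tx_per_block = MAX_BLOCK_BYTES // tx_bytes
    return tx_per_block / BLOCK_INTERVAL_SECONDS

print(tx_per_second(500))  # average-size txs: ~3.33 tx/s
print(tx_per_second(200))  # minimal txs: ~8.33 tx/s (the oft-quoted ceiling)
```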


https://www.coinlend.org <-- automated lending at various exchanges.
https://www.bitfinex.com <-- Trade BTC for other currencies and vice versa.
Timo Y
Legendary
*
Offline

Activity: 938
Merit: 1001


bitcoin - the aerogel of money


View Profile
February 21, 2013, 01:53:51 PM
 #8

I'm unhappy with either future.

Intuitively, a block size limit of 1MB seems wrong, but allowing blocks of arbitrary size seems wrong too.

I don't think this needs to be a dilemma. I think we can have it both ways: a network that handles thousands of transactions per second AND where a normal user can still contribute towards storing the blockchain.

Whether this will require blockchain pruning, swarm nodes, storage fees, or some other solution, I don't know, but in any case it's a major project that will take years to implement.

In the meantime, we need a stopgap solution, and the limit needs to be increased to some new value X. I trust the developers to decide what that X should be.

GPG ID: FA868D77   bitcoin-otc:forever-d
Rampion
Legendary
*
Offline

Activity: 1148
Merit: 1018


View Profile
February 21, 2013, 02:03:20 PM
 #9

I want a blocksize that allows the average user, with an average connection and an average computer (let's say 5 years old computer) to run a full node.

The magic of bitcoin is its decentralized nature: make it possible only for super-companies to run full nodes, and you will have a new central bank... and bitcoin will die.

Sukrim
Legendary
*
Offline

Activity: 2618
Merit: 1007


View Profile
February 21, 2013, 02:06:27 PM
 #10

I want a blocksize that allows the average user, with an average connection and an average computer (let's say 5 years old computer) to run a full node.

Could you please clarify this a bit more? How does a user with a 5 year old computer and an "average" (what is "average"?) connection benefit the network by operating a full node? How will "average" computers and connections look in 20 years?

Akka
Legendary
*
Offline

Activity: 1232
Merit: 1001



View Profile
February 21, 2013, 02:12:45 PM
 #11

I'm unhappy with either future.

Intuitively, a block size limit of 1MB seems wrong, but allowing blocks of arbitrary size seems wrong too.

I don't think this needs to be a dilemma. I think we can have it both ways: a network that handles thousands of transactions per second AND where a normal user can still contribute towards storing the blockchain.

Whether this will require blockchain pruning, swarm nodes, storage fees, or some other solution, I don't know, but in either case it's a major project that will take years to implement.

In the meantime, we need a stopgap solution, and the limit needs to be increased to some new value X. I trust the developers to decide what that X should be.

I also don't think just increasing the block size to a new max would be a solution; then we would eventually end up with this problem again.

A completely free block size would also be dangerous.

We already have a self-adjusting process in place for the creation time of blocks, which ensures that, no matter how the technology develops, blocks are (on average) found at a constant rate.

I'm sure a self-adjusting max block size function is possible, too.

A function that ensures that transaction space always remains scarce and limited, but at the same time ensures that there will be enough space that it remains possible to get a transaction through.

For example: one that adjusts the block size (possibly at each difficulty adjustment) so that during peak hours (the 14 biggest blocks of the last 144?) not all transactions can be included immediately, but they eventually are during the low-traffic times (the 14 smallest blocks of the last 144?). Also include a limited increase/decrease at each change.
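[Editor's sketch] Akka's retargeting idea can be sketched in Python. The fullness thresholds and the 10% step below are illustrative assumptions, not part of his proposal:

```python
# Toy sketch of a difficulty-style block size retarget: look at the 14
# biggest (peak) and 14 smallest (off-peak) of the last 144 blocks and
# nudge the limit by a bounded step, so that space stays scarce at peak
# but backlogs can still clear off-peak.
MAX_STEP = 0.10  # never move the limit by more than 10% per adjustment

def next_block_size_limit(current_limit, last_144_sizes):
    """last_144_sizes: byte sizes of the last 144 blocks (~one day)."""
    ordered = sorted(last_144_sizes)
    trough = sum(ordered[:14]) / 14    # average of the 14 smallest blocks
    peak = sum(ordered[-14:]) / 14     # average of the 14 biggest blocks
    if trough > 0.9 * current_limit:
        # even off-peak blocks are nearly full: the backlog never clears
        return current_limit * (1 + MAX_STEP)
    if peak < 0.5 * current_limit:
        # even peak blocks have lots of slack: shrink to keep space scarce
        return current_limit * (1 - MAX_STEP)
    return current_limit
```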

All previous versions of currency will no longer be supported as of this update
Rampion
Legendary
*
Offline

Activity: 1148
Merit: 1018


View Profile
February 21, 2013, 02:15:26 PM
 #12

I want a blocksize that allows the average user, with an average connection and an average computer (let's say 5 years old computer) to run a full node.

Could you please clarify this a bit more? How does a user with a 5 year old computer and an "average" (what is "average"?) connection benefit the network by operating a full node? How will "average" computers and connections look in 20 years?

Average users being able to run full nodes means average users being able to verify and sign. It means average users being able to prevent a few super-nodes from changing the rules at will.

In 20 years, "average computers" and "average connections" will have grown. The rule is simple: for me, too big is whatever an average computer cannot handle. What about an "average connection"? If you cannot run a full node through Tor, then it's not good. Don't forget that btc could be attacked by governments and other de facto powers.

Sukrim
Legendary
*
Offline

Activity: 2618
Merit: 1007


View Profile
February 21, 2013, 02:28:36 PM
 #13

I want to see concrete numbers: How many bits per second in transfer and how many transaction verifications per second can an average computer do?

Keep in mind that if I count the average in my household, the average PC has ~3 cores with ~2 GHz, ~6 GB RAM and about 2 TB of HDD space. In other areas of the world that might not even be reality in 5 years...

Mike Hearn
Legendary
*
Offline

Activity: 1526
Merit: 1134


View Profile
February 21, 2013, 02:35:11 PM
 #14

Do you really think this decision will be made based on a forum vote?

It doesn't work that way, sorry.
ShadowOfHarbringer
Legendary
*
Offline

Activity: 1470
Merit: 1006


Bringing Legendary Har® to you since 1952


View Profile
February 21, 2013, 02:37:37 PM
 #15

My answer is: none of the above.
I think that the block size should be dynamic and recalculated every X blocks using some well-balanced algorithm.

Of course, results of this poll are not going to force anybody into a certain decision.

Piper67
Legendary
*
Offline

Activity: 1106
Merit: 1001



View Profile
February 21, 2013, 02:39:57 PM
 #16

My answer is: none of the above.
I think that the block size should be dynamic and recalculated every X blocks using some well-balanced algorithm.

Of course, results of this poll are not going to force anybody into a certain decision.

I agree... same approach as with difficulty, though perhaps more often.
lunarboy
Hero Member
*****
Offline

Activity: 544
Merit: 500



View Profile
February 21, 2013, 02:44:18 PM
 #17

Agreed. You should have a 4th option: "dynamically change the block size"
depending on some fixed and pre-agreed algorithm.
ShadowOfHarbringer
Legendary
*
Offline

Activity: 1470
Merit: 1006




View Profile
February 21, 2013, 02:45:30 PM
 #18

Agreed. You should have a 4th option: "dynamically change the block size"
depending on some fixed and pre-agreed algorithm.

Therefore, this poll is totally broken.

I will make a new one.

ShadowOfHarbringer
Legendary
*
Offline

Activity: 1470
Merit: 1006




View Profile
February 21, 2013, 02:50:09 PM
 #19

Here is the proper poll, please come and vote:

https://bitcointalk.org/index.php?topic=145636.0

Rampion
Legendary
*
Offline

Activity: 1148
Merit: 1018


View Profile
February 21, 2013, 03:18:18 PM
 #20

I want to see concrete numbers: How many bits per second in transfer and how many transaction verifications per second can an average computer do?

Keep in mind that if I count the average in my household, the average PC has ~3 cores with ~2 GHz, ~6 GB RAM and about 2 TB of HDD space. In other areas of the world that might not even be reality in 5 years...

I'm sorry, but I'm a non-technical user. Anyhow, it's not hard to see what the "average computer" is. Just check the specs of a 5-year-old Vaio, Apple laptop, or personal-use computer. Of course many people in Africa could not run a super node due to spec/network limitations. But the whole point of this is to prevent super-companies from being the only players able to run a full node. Something that can run on a standard, 5-year-old, "personal use" computer by Dell, Apple, Sony, etc. would definitely be OK.

If you want me to be more specific: I would say that right now we would be looking at an Intel Core 2 at 2 GHz, 2 GB RAM and 250 GB of HDD space.
