Topic: A Scalability Roadmap  (Read 14927 times)
#1 | jonny1000 (OP), Member | October 06, 2014, 06:02:49 PM (last edit: October 06, 2014, 09:17:55 PM)

Please see Gavin's writeup below:
https://bitcoinfoundation.org/2014/10/a-scalability-roadmap/

I think this is a very good and interesting read with some fantastic points; however, it is likely to be considered controversial by some in this community. In particular, a fixed schedule for increasing the block size limit over time is a significant proposal.

Is Gavin saying this should grow at 50% per year because bandwidth has been increasing at this rate in the past? Might it not be safer to choose a rate lower than historic bandwidth growth? Also, how do we know this high growth in bandwidth will continue?

Gavin mentioned that this is "similar to the rule that decreases the block reward over time"; however, the block reward decreases by 50%, and an increase of 50% is somewhat different. A 50% fall every 4 years implies that there will never be more than 21 million coins, whereas 50% growth in the block size limit implies exponential growth forever. Perhaps once the 21 million coins are reached Bitcoin will stop growing, so if one wants the comparison to hold, the block size limit's growth rate could halve every 4 years, reaching zero growth when the 21 million coins are reached. I do not know the best solution to this, though. Can anyone explain why exponential growth is a good idea?
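To make the asymmetry concrete, here is a quick back-of-the-envelope Python sketch (illustrative numbers only, starting from the 2014 block reward of 25 BTC and the 1 MB limit):

Code:
# Halving block reward vs. 50%/year block size limit growth, in 4-year steps.
reward = 25.0  # BTC per block in 2014, halving every 4 years
limit = 1.0    # block size limit in MB, assumed to grow 50% per year

for year in range(0, 21, 4):
    print(f"year {year:2d}: reward {reward:8.4f} BTC, limit {limit:8.1f} MB")
    reward /= 2        # one halving per 4-year step
    limit *= 1.5 ** 4  # four years of compounded 50% growth

The reward converges towards zero while the limit grows without bound (over 3,000 MB after 20 years), which is why I do not think the two rules are really comparable.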

In my view, should volumes increase above the 7-transactions-per-second level in the short term, a quick fix like doubling the block size limit should be implemented. A longer-term solution, like an annual increase in the block size limit, could require more research into transaction fees and the impact this could have on incentivising miners. Ultimately we may need a more dynamic system, where the block size limit is determined in part by transaction volume, the network difficulty and transaction fees in some way, as well as potentially a growth rate. A more robust theoretical understanding of this system may be required before we reach that point, though.
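To illustrate the kind of dynamic rule I mean, a toy Python sketch; the function, the double-the-median target and the caps are all invented for illustration, not a concrete proposal:

Code:
from statistics import median

def next_block_size_limit(recent_block_sizes, current_limit,
                          max_growth=1.5, min_limit=1_000_000):
    """Hypothetical rule: track double the median of recent block sizes,
    capped at 50% growth per adjustment and floored at the 1 MB limit."""
    target = 2 * median(recent_block_sizes)
    capped = min(target, current_limit * max_growth)
    return max(capped, min_limit)

# Example: blocks averaging ~600 kB would push the limit toward 1.2 MB.
print(next_block_size_limit([550_000, 600_000, 650_000], 1_000_000))

A real rule would presumably also need to consider fees and difficulty, as above, so that miners could not stuff blocks to game the limit.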
 
Many thanks
#2 | Gavin Andresen, Legendary (Chief Scientist) | October 06, 2014, 06:25:02 PM

Quote from: jonny1000
Is Gavin saying this should grow at 50% per year because bandwidth has been increasing at this rate in the past? Might it not be safer to choose a rate lower than historic bandwidth growth? Also, how do we know this high growth in bandwidth will continue?

Yes, that is what I am saying.

"Safer" : there are two competing threats here: raise the block size too slowly and you discourage transactions and increase their price. The danger is Bitcoin becomes irrelevant for anything besides huge transactions, and is used only by big corporations and is too expensive for individuals. Hurray, we just reinvented the SWIFT or ACH systems.

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

So I'm saying: the future is uncertain, but there is a clear trend. Let's follow that trend, because it is the best predictor we have of what will happen.

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).


RE: a quick fix like doubling the size:

Why doubling? Please don't be lazy; at least do some back-of-the-envelope calculations to justify your numbers (to save you some work: the average Bitcoin transaction is about 250 bytes). The typical broadband home internet connection can support much larger blocks today.
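For example, a minimal Python version of that envelope calculation, using the ~250-byte average transaction and the ~600-second average block interval:

Code:
# Transactions per second supported by various block size limits.
TX_BYTES = 250        # average transaction size, per the figure above
BLOCK_INTERVAL = 600  # average seconds between blocks

for limit_mb in (1, 2, 4, 8, 20):
    tx_per_block = limit_mb * 1_000_000 // TX_BYTES
    tps = tx_per_block / BLOCK_INTERVAL
    print(f"{limit_mb:3d} MB limit: {tx_per_block:6d} tx/block, ~{tps:5.1f} tx/s")

At 1 MB that is ~4,000 transactions per block, or roughly the 7 transactions per second discussed in this thread.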

How often do you get the chance to work on a potentially world-changing project?
#3 | Cubic Earth, Legendary | October 06, 2014, 07:14:13 PM

I like the block size scaling idea.

It:

1) Grows the on-chain transaction limit.

2) Should keep the network within reach of hobbyists, and therefore as decentralized as it is now.

3) Is extremely simple, and everyone should be able to understand it.

4) Provides some certainty going forward.  Since bitcoin is a foundation upon which many things are built, having that certainty sooner is better.


A question: when would the 50% increases start?  Could the progression be kick-started by jumping directly to, say, 2 MB or 4 MB, and then doubling thereafter?  Or would that put too much strain on the network?
#4 | jonny1000 (OP), Member | October 06, 2014, 07:51:48 PM (last edit: October 07, 2014, 09:05:23 PM)

This is what 50% growth per annum looks like.  How will miners earn income when the block reward is low and the block size limit is increasing at such an exponential rate that transaction fees will also be low, even if demand grows at, say, 40% per annum?

[Chart: the block size limit under 50% annual growth]
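As a rough Python sketch of the worry (the growth rates are illustrative assumptions): if capacity grows 50% per annum while demand grows only 40% per annum, utilisation of block space, and with it competition for space, falls steadily:

Code:
# Capacity vs. demand for block space, both starting from a full 1 MB block.
capacity = demand = 1.0

for year in range(0, 21, 5):
    print(f"year {year:2d}: capacity {capacity:8.1f}, demand {demand:7.1f}, "
          f"utilisation {demand / capacity:6.1%}")
    capacity *= 1.5 ** 5  # 50% per annum
    demand *= 1.4 ** 5    # 40% per annum

After 20 years utilisation is down to about a quarter. With spare capacity compounding like this there is never scarcity of block space, and it is hard to see what would hold transaction fees up.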
#5 | elendir, Newbie | October 06, 2014, 08:00:31 PM

I wonder what the miners think about the block size proposal. I believe they can choose the size of a block to be anywhere between 0 and 1 MB today. Should the 1 MB limit be raised, the final decision will remain with the miners, right?
#6 | Cubic Earth, Legendary | October 06, 2014, 08:27:44 PM

Quote from: elendir
I wonder what the miners think about the block size proposal. I believe they can choose the size of a block to be anywhere between 0 and 1 MB today. Should the 1 MB limit be raised, the final decision will remain with the miners, right?

Correct.  Miners do not have to include any transactions in their blocks if they so choose.

Also, miners do not have to build on any particular block.  Let's say a miner starts issuing blocks filled to the brim with 1-satoshi transactions.  The other miners could all agree (or collude, if you see it that way) to reject the 'spam' block and build on the previous one.
#7 | alpet, Legendary | October 07, 2014, 12:49:15 PM

Hi all!
Why not allow partial nodes alongside full nodes? For example: I could deploy partial nodes on some office computers and set a 10 GB disk quota for each node. Some of these nodes would serve the 2011-2012 part of the blockchain, others the 2013-2014 part. I think this solution is a little more flexible and spreads the load across many users.
P.S.: This text was automatically translated from Russian.

Novacoin we trust!
https://svcpool.io - PoS staking and NVC/BTC exchange.
#8 | jonny1000 (OP), Member | October 07, 2014, 01:02:14 PM (last edit: October 07, 2014, 08:58:48 PM)

I have tried to analyze what is going on graphically.  As Gavin said, typically there is a price boom, which causes higher demand for transactions.  This is represented by a shift to the right in the demand curve below.  In order to keep the transaction price low, a similar shift to the right may be required in the supply curve, which could be caused by an increase in the block size limit.  I think it might be a bit presumptuous to assume demand will continue to grow exponentially, especially at such a high rate.

[Chart: current supply & demand curves for space in blocks]

[Chart: shift in the supply & demand curves]
Note: figures for illustrative purposes only.
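In case the images do not survive, a minimal matplotlib sketch of what the two figures show: downward-sloping demand curves for block space (purely illustrative shapes) against a vertical supply curve fixed by the block size limit, with both curves shifted right:

Code:
import numpy as np
import matplotlib.pyplot as plt

q = np.linspace(0.25, 4, 200)  # block space demanded, MB per block
plt.plot(q, 2.0 / q, label="demand (now)")
plt.plot(q, 3.5 / q, label="demand (after a price boom)")
plt.axvline(1.0, linestyle="--", label="supply: 1 MB limit")
plt.axvline(2.0, linestyle=":", label="supply: raised limit")
plt.xlabel("block space (MB per block)")
plt.ylabel("fee (price of block space)")
plt.legend()
plt.show()

Supply is drawn vertical because, up to the limit, it is the cap rather than miners' marginal cost that binds.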
#9 | delulo, Sr. Member | October 07, 2014, 05:43:21 PM (last edit: October 07, 2014, 08:54:31 PM)

Am I interpreting this quote right, that it refers to barriers to entry for running a node?

Quote from: Gavin's roadmap
Imposing a maximum size that is in the reach of any ordinary person with a pretty good computer and an average broadband internet connection eliminates barriers to entry that might result in centralization of the network.

#10 | instagibbs, Member | October 07, 2014, 06:18:47 PM

Quote from: alpet
Hi all! Why not allow partial nodes alongside full nodes? For example: I could deploy partial nodes on some office computers and set a 10 GB disk quota for each node. Some of these nodes would serve the 2011-2012 part of the blockchain, others the 2013-2014 part. I think this solution is a little more flexible and spreads the load across many users.
P.S.: This text was automatically translated from Russian.

https://github.com/bitcoin/bitcoin/pull/4701

It's being worked on.
#11 | Peter R, Legendary | October 08, 2014, 04:26:47 AM (last edit: October 08, 2014, 02:11:59 PM)

Originally, I imagined a floating block size limit based on demand, but after reading Gavin's roadmap I support his recommendation.  The limit should be increased (in a codified way) at a constant yearly percentage based on historical growth rates for internet bandwidth.  It's important that the block size limit be known a priori, in order to give innovators more confidence to build on top of our network.

On a somewhat related note, here's a chart that shows Bitcoin's market cap overlaid with the daily transaction volume (excluding popular addresses).  The gray line extrapolates when in time, and at what market cap, we might begin to bump into the current 1 MB limit.

[Chart: market cap overlaid with daily transaction volume, with a gray extrapolation line to the 1 MB limit]
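For anyone who wants to redo the extrapolation, a minimal Python sketch; the starting volume and growth rate below are stand-in assumptions, not the fitted values from the chart:

Code:
import math

current_tx_per_day = 80_000        # assumed present daily volume
capacity_tx_per_day = 4_000 * 144  # ~4,000 tx/block * 144 blocks/day at 1 MB
annual_growth = 1.5                # assumed 50%/year volume growth

years = (math.log(capacity_tx_per_day / current_tx_per_day)
         / math.log(annual_growth))
print(f"~{years:.1f} years until the 1 MB limit binds, on these assumptions")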
Run Bitcoin Unlimited (www.bitcoinunlimited.info)
#12 | Cubic Earth, Legendary | October 08, 2014, 04:50:23 AM

I was expecting Gavin's roadmap to be hotly debated, but this thread has been relatively quiet.  Is the debate unfolding somewhere else?  Or is there just not much debate about it?

Nice charts, Peter R.
#13 | theymos, Administrator, Legendary | October 08, 2014, 05:37:37 AM

Quote from: Gavin Andresen
If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).

Lowering the limit afterward wouldn't be a soft-forking change if the majority of mining power was creating too-large blocks, which seems possible.

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.
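To put numbers on that concern, a small Python sketch assuming a hypothetical residential ISP whose bandwidth improves only 20% per year against a limit growing 50% per year:

Code:
# Bandwidth a full node needs vs. what a slow-growing ISP delivers.
need = have = 1.0  # normalised to today's requirement and connection

for year in range(0, 16, 5):
    print(f"year {year:2d}: need {need:7.1f}x, have {have:5.1f}x, "
          f"shortfall {need / have:5.1f}x")
    need *= 1.5 ** 5  # 50%/year limit growth
    have *= 1.2 ** 5  # assumed 20%/year ISP improvement

After 15 years such customers would need roughly 28 times the connection they actually have, which is exactly the gradual, easy-to-ignore loss of full nodes described above.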

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
#14 | jonald_fyookball, Legendary | October 08, 2014, 07:27:56 AM

I'm not really qualified to comment on the merits of Gavin's plan, but on the surface it sounds like a thoughtful proposal.  I must say, it is exciting to see solutions to scalability being proposed, and I'm sure it is encouraging to the greater Bitcoin community at large.  Just the fact that a plan is on the table should be a nice jab at the naysayers/skeptics who have been "ringing the alarm bell" on this issue.

Although, in a sense, they are correct that these issues require action.  I would like to thank Gavin and the other developers for all the great work they've done and continue to do for Bitcoin.

Hats off to you, sir.

#15 | spin, Sr. Member | October 08, 2014, 10:25:25 AM

The post is a great read on the direction of ongoing development.  Such posts are really helpful for hobbyists like myself to get an idea of where things are headed.  I'm keen on testing and supporting some of the new stuff; I'd love to test out headers-first, for example.

Quote from: Gavin's roadmap
After 12 years of bandwidth growth that becomes 56 billion transactions per day on my home network connection — enough for every single person in the world to make five or six bitcoin transactions every single day. It is hard to imagine that not being enough; according the the Boston Federal Reserve, the average US consumer makes just over two payments per day.

I have no idea, but the average consumer is not the only one making transactions; there are also businesses.  So the two-payments-per-day stat is not all that's relevant.  But 5 to 6 transactions per person per day should cover that also :)

Small typo:
Quote from: Gavin's roadmap
I expect the initial block download problem to be mostly solved in the next relase or three of Bitcoin Core. The next scaling problem that needs to be tackled is the hardcoded 1-megabyte block size limit that means the network can suppor only approximately 7-transactions-per-second.


If you liked this post buy me a beer.  Beers are quite cheap where I live!
bc1q707guwp9pc73r08jw23lvecpywtazjjk399daa
#16 | cbeast, Donator, Legendary | October 08, 2014, 10:51:55 AM

Is there a relationship between hashrate and bandwidth? If increasing the block size increases bandwidth usage, would that eat into the bandwidth available for hashing? For example, if a peta-miner maxes out their bandwidth with hashing alone, then increasing the block size would lower their hashrate, and they would have to buy more bandwidth, if it is available. That might favor miners living where there is better internet connectivity rather than where there is cheap electricity or a cold climate. Enlarging the block size could thereby help decentralize mining.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
#17 | TierNolan, Legendary | October 08, 2014, 03:07:06 PM

What is the plan for handling the 32MB message limit?

Would the 50% per year increase be matched by a way to handle unlimited block sizes?

Blocks would have to be split over multiple messages (or the message limit increased).
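A minimal Python sketch of the splitting idea (this framing is invented for illustration; it is not the actual P2P protocol):

Code:
MAX_MSG = 32 * 1024 * 1024  # current per-message limit, in bytes

def split_block(raw_block: bytes, max_msg: int = MAX_MSG):
    """Yield (index, total, chunk) triples a peer could reassemble."""
    chunks = [raw_block[i:i + max_msg]
              for i in range(0, len(raw_block), max_msg)]
    for i, chunk in enumerate(chunks):
        yield i, len(chunks), chunk

# Example: a hypothetical 70 MB block becomes three messages.
block = bytes(70 * 1024 * 1024)
print([(i, n, len(c)) for i, n, c in split_block(block)])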

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
#18 | Skoupi, Sr. Member | October 08, 2014, 03:13:43 PM

Quote from: Gavin Andresen
Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

Ordinary people aren't supposed to run full nodes anyway :P
#19 | Mrmadden, Newbie | October 08, 2014, 03:58:47 PM

50% is conservative when set against extrapolated storage and computing-power cost efficiencies, which have been improving by roughly 100% and 67% annually.

50% is very risky when set against extrapolated bandwidth costs, which have been improving by only about 50% annually.

I would dial it back from 50% to 40%.  Hobbyists will want to download the full chain over residential connections, and 50% is just too close for comfort.
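The arithmetic behind that, treating the improvement rates above as assumed constants (Python):

Code:
# Headroom left after 10 years of 50%/year limit growth, by resource.
for name, rate in [("storage", 2.00), ("CPU", 1.67), ("bandwidth", 1.50)]:
    headroom = (rate / 1.5) ** 10
    print(f"{name:9s}: {headroom:5.1f}x headroom after 10 years")

Storage and CPU leave a comfortable margin; bandwidth leaves exactly none, which is why even a small shortfall in bandwidth growth makes 50% risky.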
#20 | andytoshi, Full Member | October 08, 2014, 05:08:34 PM

Quote from: Skoupi
Ordinary people aren't supposed to run full nodes anyway :P

This is absurd and false. Bitcoin is deliberately a publicly verifiable system.