Bitcoin Forum
Author Topic: A Scalability Roadmap  (Read 14800 times)
jonny1000 (OP)
Member
October 06, 2014, 06:02:49 PM
Last edit: October 06, 2014, 09:17:55 PM by jonny1000
 #1

Please see Gavin's writeup below:
https://bitcoinfoundation.org/2014/10/a-scalability-roadmap/

I think this is a very good and interesting read with some fantastic points; however, it is likely to be considered controversial by some in this community.  In particular, a fixed schedule for increasing the block size limit over time is a significant proposal.

Is Gavin saying this should grow at 50% per year because bandwidth has been increasing at this rate in the past?  Might it not be safer to choose a rate lower than historic bandwidth growth?  Also how do we know this high growth in bandwidth will continue?

Gavin mentioned that this is "similar to the rule that decreases the block reward over time"; however, the block reward decreases by 50%, and an increase of 50% is quite different.  A 50% fall every 4 years implies that there will never be more than 21 million coins, whereas 50% growth in the block size limit implies exponential growth forever.  Perhaps after 21 million coins is reached Bitcoin will stop growing; therefore, if one wants to make the comparison, the block size limit's growth rate could halve every 4 years, reaching zero growth when 21 million coins are reached.  Although I do not know the best solution to this.  Can anyone explain why exponential growth is a good idea?

In my view, should volumes increase above the 7-transactions-per-second level in the short term, a quick fix like doubling the block size limit should be implemented.  A more long-term solution like an annual increase in the block size limit would require more research into transaction fees and the impact this could have on incentivising miners.  Ultimately we may need a more dynamic system where the block size limit is determined in part by transaction volume, network difficulty and transaction fees, as well as potentially a growth rate.  A more robust theoretical understanding of this system may be required before we reach that point.
 
Many thanks
Gavin Andresen
Legendary
Chief Scientist
October 06, 2014, 06:25:02 PM
 #2

Is Gavin saying this should grow at 50% per year because bandwidth has been increasing at this rate in the past?  Might it not be safer to choose a rate lower than historic bandwidth growth?  Also how do we know this high growth in bandwidth will continue?

Yes, that is what I am saying.

"Safer": there are two competing threats here.  Raise the block size too slowly and you discourage transactions and increase their price.  The danger is Bitcoin becomes irrelevant for anything besides huge transactions, is used only by big corporations, and is too expensive for individuals.  Hurray, we just reinvented the SWIFT or ACH systems.

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

So I'm saying: the future is uncertain, but there is a clear trend.  Let's follow that trend, because it is the best predictor we have of what will happen.

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).


RE: a quick fix like doubling the size:

Why doubling? Please don't be lazy, at least do some back-of-the-envelope calculations to justify your numbers (to save you some work: the average Bitcoin transaction is about 250 bytes big). The typical broadband home internet connection can support much larger blocks today.
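Those back-of-the-envelope numbers are easy to check (a quick sketch; the 10 Mbit/s home-downlink figure is an assumption for illustration, not from the post):

```python
# Rough capacity math: ~250-byte average transaction, 1 MB block
# limit, one block every ~600 seconds.
AVG_TX_BYTES = 250
BLOCK_LIMIT_BYTES = 1_000_000
BLOCK_INTERVAL_S = 600

tx_per_block = BLOCK_LIMIT_BYTES // AVG_TX_BYTES   # 4000 transactions
tps = tx_per_block / BLOCK_INTERVAL_S              # ~6.7 tx/s

# An assumed 10 Mbit/s home downlink moves 750 MB per block interval,
# i.e. room for blocks hundreds of times larger than 1 MB.
downlink_bytes = 10 * 1_000_000 // 8 * BLOCK_INTERVAL_S
headroom = downlink_bytes / BLOCK_LIMIT_BYTES      # 750x

print(tx_per_block, round(tps, 1), headroom)
```

This is where the often-quoted "approximately 7 transactions per second" figure comes from.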

jonny1000 (OP)
Member
October 06, 2014, 06:54:59 PM
Last edit: October 06, 2014, 07:07:54 PM by jonny1000
 #3

Dear Gavin

Thanks for your reply.  When I said “Safer” I meant the risk that average bandwidth speeds grow more slowly than historic rates in the future.  I totally agree that reducing the block size limit would only be a soft fork, rather than the hard fork needed to increase the limit; I hadn’t thought of that.  I guess you could be right; maybe the best option is to follow trends.

Raise the block size too slowly and you discourage transactions and increase their price.  The danger is Bitcoin becomes irrelevant for anything besides huge transactions, and is used only by big corporations and is too expensive for individuals. Hurray, we just reinvented the SWIFT or ACH systems.

This is a good point, however what about all the people like Andreas Antonopoulos, who constantly say things like, “Bitcoin is not a faster, cheaper, more efficient way of shopping; thinking of it this way misses the point. It’s more than that: it’s a distributed platform for open, permissionless, trustless, blah blah blah…… innovation.”
This is also an interesting point of view, and I think it would be good to somehow find a balance between it and faster, cheaper transactions.  But I think you are right: somehow the block size limit needs to increase.
 
Yes, I admit I was being lazy by saying doubling.  A normal home internet connection can easily keep up; however, is there not a potential danger of block propagation times growing larger, which also needs to be considered, at least for now while IBLT hasn’t been “fully” implemented?
Cubic Earth
Legendary
October 06, 2014, 07:14:13 PM
 #4

I like the block size scaling idea.

It:

1) Grows the on-chain transaction limit.

2) Should keep the network within reach of hobbyists, and therefore, as decentralized as now.

3) Is extremely simple, and everyone should be able to understand it.

4) Provides some certainty going forward.  Since bitcoin is a foundation upon which many things are built, having that certainty sooner is better.


A question: when would the 50% increases start?  Could the progression be kick-started by jumping directly to, say, 2 MB or 4 MB, and then doubling thereafter?  Or would that put too much strain on the network?
jonny1000 (OP)
Member
October 06, 2014, 07:51:48 PM
Last edit: October 07, 2014, 09:05:23 PM by jonny1000
 #5

This is what 50% growth per annum looks like.  How will miners earn income when the block reward is low and the block size limit is increasing at such an exponential rate that transaction fees will also be low, even if demand grows at, say, 40% per annum?
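For reference, the compounding in question (a sketch; the 20-year horizon and the 40% demand-growth figure are just the hypotheticals from the post):

```python
# 50%/year growth in the block size limit versus a hypothetical
# 40%/year growth in demand, compounded over 20 years from 1 MB.
years = 20
supply_mb = 1.0 * 1.5 ** years    # ~3325 MB of block space
demand_mb = 1.0 * 1.4 ** years    # ~837 MB of demand

# Supply outgrows demand roughly 4x, so fee pressure would stay low.
print(round(supply_mb), round(demand_mb), round(supply_mb / demand_mb, 1))
```

Even a 10-percentage-point gap between supply growth and demand growth compounds into a large surplus of block space, which is the core of the fee concern.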

elendir
Newbie
October 06, 2014, 08:00:31 PM
 #6

I wonder what the miners think about the blocksize proposal. I believe they can choose the size of their blocks to be anywhere between 0 and 1 MB today. Should the 1 MB limit be raised, the final decision will remain with the miners, right?
Cubic Earth
Legendary
October 06, 2014, 08:27:44 PM
 #7

I wonder what the miners think about the blocksize proposal. I believe they can choose the size of their blocks to be anywhere between 0 and 1 MB today. Should the 1 MB limit be raised, the final decision will remain with the miners, right?

Correct.  Miners do not have to include any transactions in their blocks if they so choose.

Also, miners do not have to build off any particular block.  Let's say a miner starts issuing blocks filled to the brim with 1-satoshi transactions.  The other miners could all agree (or collude, if you see it that way) to reject the 'spam' block and build off the previous one.
alpet
Legendary
October 07, 2014, 12:49:15 PM
 #8

Hi All!
Why not allow partial nodes alongside full nodes? For example, I could deploy partial nodes on some office computers and give each node a 10 GB disk quota. Some of these nodes would serve the blockchain for 2011-2012, others for 2013-2014. I think this solution is a little more flexible and easier to distribute across many users.
P.S.: This text was automatically translated from Russian.

jonny1000 (OP)
Member
October 07, 2014, 01:02:14 PM
Last edit: October 07, 2014, 08:58:48 PM by jonny1000
 #9

I have tried to analyze what is going on graphically.  As Gavin said, typically there is a price boom which causes higher demand for transactions.  This is represented by a shift to the right in the demand curve below.  In order to keep the transaction price low, a similar shift to the right may be required in the supply curve, which could be caused by an increase in the block size limit.  I think it might be a bit presumptuous to assume demand will continue to grow exponentially, especially at such a high rate.

Current supply & demand curves for space in blocks


Shift in supply & demand curves

Note: figures for illustrative purposes only.
delulo
Sr. Member
October 07, 2014, 05:43:21 PM
Last edit: October 07, 2014, 08:54:31 PM by delulo
 #10

Am I interpreting this
Quote
Imposing a maximum size that is in the reach of any ordinary person with a pretty good computer and an average broadband internet connection eliminates barriers to entry that might result in centralization of the network.
right, in that it refers to barriers to entry for running a full node?

instagibbs
Member
October 07, 2014, 06:18:47 PM
 #11

Hi All!
Why not allow partial nodes alongside full nodes? For example, I could deploy partial nodes on some office computers and give each node a 10 GB disk quota. Some of these nodes would serve the blockchain for 2011-2012, others for 2013-2014. I think this solution is a little more flexible and easier to distribute across many users.
P.S.: This text was automatically translated from Russian.

https://github.com/bitcoin/bitcoin/pull/4701

It's being worked on.
Peter R
Legendary
October 08, 2014, 04:26:47 AM
Last edit: October 08, 2014, 02:11:59 PM by Peter R
 #12

Originally, I imagined a floating blocksize limit based on demand, but after reading Gavin's roadmap I support his recommendation.  The limit should be increased (in a codified way) at a constant yearly % based on historical growth rates for internet bandwidth.  It's important that the blocksize limit be known a priori in order to give innovators more confidence to build on top of our network.

On a somewhat related note, here's a chart that shows Bitcoin's market cap overlaid with the daily transaction volume (excluding popular addresses).  The gray line extrapolates when in time and at what market cap we might begin to bump into the current 1 MB limit.    


Cubic Earth
Legendary
October 08, 2014, 04:50:23 AM
 #13

I was expecting Gavin's roadmap to be hotly debated, but this thread has been relatively quiet.  Is the debate unfolding somewhere else?  Or is there just not much debate about it?

Nice charts, PeterR.
theymos
Administrator
Legendary
October 08, 2014, 05:37:37 AM
 #14

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).

Lowering the limit afterward wouldn't be a soft-forking change if the majority of mining power was creating too-large blocks, which seems possible.

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

jonald_fyookball
Legendary
October 08, 2014, 07:27:56 AM
 #15

I'm not really qualified to comment on the merits of Gavin's plan, but on the surface it sounds like a thoughtful proposal.  I must say, it is exciting to see solutions to scalability being proposed, and I'm sure it is encouraging to the greater Bitcoin community at large.  Just the fact that a plan is on the table should be a nice jab to the naysayers/skeptics who have been "ringing the alarm bell" on this issue.

Although, in a sense, they are correct that these issues require action.  I would like to thank Gavin and the other developers for all the great work they've done and continue to do for Bitcoin.

Hats off to you sir.

spin
Sr. Member
October 08, 2014, 10:25:25 AM
 #16

The post is a great read on the direction of ongoing development. Such posts are really helpful for hobbyists such as myself to get an idea of where things are headed.  I'm keen on testing and supporting some of the new stuff; I'd love to test out headers-first, for example.

Quote
After 12 years of bandwidth growth that becomes 56 billion transactions per day on my home network connection — enough for every single person in the world to make five or six bitcoin transactions every single day. It is hard to imagine that not being enough; according the the Boston Federal Reserve, the average US consumer makes just over two payments per day.

I have no idea, but the average consumer is not the only one making transactions.  There are also businesses, so the 2-per-day stat is not all that's relevant.  But 5 to 6 transactions per person per day should cover that also Smiley

Small typo:
Quote
I expect the initial block download problem to be mostly solved in the next relase or three of Bitcoin Core. The next scaling problem that needs to be tackled is the hardcoded 1-megabyte block size limit that means the network can suppor only approximately 7-transactions-per-second.


cbeast
Donator
Legendary
October 08, 2014, 10:51:55 AM
 #17

Is there a relationship between hashrate and bandwidth? Would increasing the blocksize, and thereby bandwidth usage, eat into the bandwidth available for hashing? For example, if a peta-miner maxes out their bandwidth with hashrate alone, then increasing the blocksize would lower their hashrate; they would have to buy more bandwidth if it is available. It might favor miners living where there is better internet connectivity rather than cheap electricity or a cold climate. Enlarging the blocksize could thus help decentralize mining.

TierNolan
Legendary
October 08, 2014, 03:07:06 PM
 #18

What is the plan for handling the 32MB message limit?

Would the 50% per year increase be matched by a way to handle unlimited block sizes?

Blocks would have to be split over multiple messages (or the message limit increased).

Skoupi
Sr. Member
October 08, 2014, 03:13:43 PM
 #19

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

Ordinary people aren't supposed to run full nodes anyway  Tongue
Mrmadden
Newbie
October 08, 2014, 03:58:47 PM
 #20

50% is conservative based on extrapolated storage and computing-power costs, which have been improving at roughly 100% and 67% annually.

50% is very risky based on extrapolated bandwidth costs, which have been improving at only about 50% annually.

I would dial it back from 50% to 40%.  Hobbyists will want to download full nodes remotely, and 50% is just too close for comfort.
andytoshi
Full Member
October 08, 2014, 05:08:34 PM
 #21

Ordinary people aren't supposed to run full nodes anyway  Tongue

This is absurd and false. Bitcoin is deliberately a publicly verifiable system.
Gavin Andresen
Legendary
Chief Scientist
October 08, 2014, 05:36:07 PM
 #22

Lowering the limit afterward wouldn't be a soft-forking change if the majority of mining power was creating too-large blocks, which seems possible.

When I say "soft-fork" I mean "a majority of miners upgrade and force all the rest of the miners to go along (but merchants and other fully-validating, non-mining nodes do not have to upgrade)."

Note that individual miners (or sub-majority cartels) can unilaterally create smaller blocks containing just higher-fee transactions, if they think it is in their long-term interest to put upward pressure on transaction fees.

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?


Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

Anybody know economists who specialize in this sort of problem? Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely. ("...elect a Board of Blocksize Governors that decides on an Optimal Size based on market supply and demand conditions as measured by an independent Bureau of Blocksize Research....")

Alpaca Bob
Full Member
October 08, 2014, 07:06:56 PM
 #23

raise the block size too slowly and you discourage transactions and increase their price. The danger is Bitcoin becomes irrelevant for anything besides huge transactions, and is used only by big corporations and is too expensive for individuals. Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all though: it's incredibly clunky, and only works with government-issued currencies.

If anything, it'd be like reinventing the gold standard, but a digital, cryptographically verifiable, lightning fast and relatively cheap to use version. (In many implementations of the gold standard, gold wasn't actually used in day-to-day trade.)

Furthermore, SWIFT is not decentralized, and certain transfers (to specific countries for instance) can technically be censored. Nor is it anonymous or even pseudonymous.

See:

http://www.swift.com/news/press_releases/SWIFT_disconnect_Iranian_banks
http://www.bloomberg.com/news/2014-08-29/u-k-wants-eu-to-block-russia-from-swift-banking-network.html
http://www.spiegel.de/international/world/spiegel-exclusive-nsa-spies-on-international-bank-transactions-a-922276.html

Quote
Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely.

LOL Cheesy

(Perhaps mathematicians, though?)

RodeoX
Legendary
October 08, 2014, 07:13:32 PM
 #24

...Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all...

I think what he meant was that it would be like SWIFT in that it would mostly be for large international transfers. A fork like this will have to happen sooner or later.

coindate
Newbie
October 08, 2014, 07:16:37 PM
 #25

This is what 50% growth per annum looks like.  How will miners earn income when the block reward is low and the block size limit is increasing at such an exponential rate, that transaction fees will also be low, even if demand grows at say 40% per annum?


My opinion:
Larger blocks mean more transactions.
The fee per transaction stays low, but total fee revenue can grow along with the block size.
Cubic Earth
Legendary
October 08, 2014, 07:19:06 PM
 #26

Is there a relationship between hashrate and bandwidth? Would increasing the blocksize, and thereby bandwidth usage, eat into the bandwidth available for hashing? For example, if a peta-miner maxes out their bandwidth with hashrate alone, then increasing the blocksize would lower their hashrate; they would have to buy more bandwidth if it is available. It might favor miners living where there is better internet connectivity rather than cheap electricity or a cold climate. Enlarging the blocksize could thus help decentralize mining.

There is a subtle but important relationship between hash power and bandwidth, best looked at in terms of orphan blocks.  A miner needs adequate bandwidth to make orphan blocks unlikely.  What is adequate?  What is unlikely?  Assuming 10-minute blocks, each second there is a 0.16% chance a miner will discover a block.  If you're a miner and you find a block, you obviously need to broadcast it to the network ASAP.  A 6-second delay in publishing would mean a 1% chance that someone else finds a block in the meantime.

Diminishing returns help keep the bandwidth 'race' to a minimum.  Having enough bandwidth to keep your orphan rate under 1% would probably be good enough.  If your mining operation was big enough that 1% was a substantial sum, perhaps you would buy bandwidth to push the orphan rate down to 0.1%, but at some point it no longer makes financial sense to invest in more bandwidth.  Consider that being able to push out blocks infinitely fast would mean paying for infinite bandwidth, but at best would yield a 1% increase in returns over the miner who took 6 seconds to publish a block.

Also, as miners are not required to include any transactions in the blocks they publish, they have another way to compensate for bandwidth bottlenecks: publish empty blocks (this works because, currently, the block reward is so much bigger than the transaction fees).  Since orphan blocks divert hash power away from the main chain, they are a security consideration, and if the system design encourages miners to publish empty blocks, that certainly doesn't help the network thrive either.

The block-propagation-race issue could certainly be impacted by larger block sizes.  Fortunately a protocol is being (has been?) implemented that allows blocks to essentially be pre-constructed in real time, by each miner, as transactions flow across the network.  If that system were perfectly successful, the miner who finds a new block would just need to transmit the block header, and the block header does not scale with transaction volume.  Another way of thinking about it: currently blocks are pushed out all at once, creating massive peak loads on miner bandwidth.  The new design spreads the load over the whole 10 minutes between blocks (or however long).

Miners would always need enough bandwidth to comfortably process the entire transaction load on the network.  I currently have a 20 Mbps down / 5 Mbps up cable connection here in Washington State, USA.  It costs me $45 per month.  That is 1,500 MB down and 375 MB up every 10 minutes, which equates to 10,000 TPS down and 2,500 TPS up, respectively (at ~250 bytes per transaction).
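The orphan-rate and bandwidth figures above can be reproduced with a little arithmetic (a sketch assuming one block per 600 seconds and ~250-byte transactions, as elsewhere in the thread):

```python
# Orphan risk: with 10-minute blocks, each second carries roughly a
# 1/600 chance that some other miner finds a block first.
p_per_second = 1 / 600
p_orphan_6s = 1 - (1 - p_per_second) ** 6      # ~1% for a 6 s delay

# Bandwidth: a 5 Mbit/s uplink moves 375 MB per 10-minute interval,
# or ~2500 transactions per second at 250 bytes each.
uplink_bytes_10min = 5 * 1_000_000 / 8 * 600   # 375 MB
tps_up = uplink_bytes_10min / 250 / 600        # 2500 tx/s

print(round(p_orphan_6s * 100, 2), tps_up)
```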

Cubic Earth
Legendary
October 08, 2014, 08:24:16 PM
 #27

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?

I made a chart showing how some different slopes and y-intercepts compare over time.  As you might guess from the chart, I am partial to the idea of jump-starting the process with an initial increase of a few MB, then having a slightly more conservative growth rate going forward.  We know 10 MB blocks (70 TPS) can be supported by many of today's home internet connections.  I would argue it is quite a bit less certain whether 3,000 MB blocks would be realistic in 20 years.  Put another way: I think we are behind the curve.  Making a steeper curve to compensate could really throw us off in the future; adjusting the y-intercept to put us back on track, and then making a good guess about future growth, would be better.
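A sketch of that trade-off (the 4 MB start and 40% rate are illustrative stand-ins for "jump-start, then grow more slowly", not values read off the chart):

```python
# Two hypothetical schedules: stay on the original curve (1 MB at
# 50%/yr) versus jump-starting the y-intercept (4 MB at 40%/yr).
def limit_mb(start_mb, yearly_rate, years):
    return start_mb * (1 + yearly_rate) ** years

for years in (0, 5, 10, 20):
    steep = limit_mb(1, 0.50, years)   # steeper slope, low intercept
    jump = limit_mb(4, 0.40, years)    # higher intercept, gentler slope
    print(years, round(steep, 1), round(jump, 1))
```

Under these illustrative numbers the two curves meet after roughly 20 years, since (1.5/1.4)^20 is about 4; until then the jump-started schedule provides more capacity.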



jonny1000 (OP)
Member
October 08, 2014, 08:31:25 PM
Last edit: October 08, 2014, 08:45:29 PM by jonny1000
 #28

Would 40% initial size and growth make you support the proposal?

I don’t think the distinction between 50% and 40% matters that much; the issue is whether there should be permanent exponential growth in the blocksize limit or some other model.  Nothing grows exponentially forever: consumer bandwidth speeds won’t, and neither will demand for Bitcoin transactions, so why require permanent exponential growth?  Why not consider a model where, say, the blocksize grows by a fixed percentage each year, but the rate of increase falls by 50% whenever the block reward drops?

Consider the example below:

Year   Blocksize limit (MB)   Growth rate
2015           1.0               100%
2016           2.0               100%
2017           3.0                50%
2018           4.5                50%
2019           6.8                50%
2020          10.1                50%
2021          12.7                25%
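A sketch of the rule behind this table, on one reading: the limit grows by a fixed yearly percentage, and that percentage halves in each block-reward-halving year (taken here as 2016 and 2020):

```python
# Growth rate halves in each reward-halving year (assumed 2016, 2020).
HALVING_YEARS = {2016, 2020}
limit_mb, rate = 1.0, 1.00        # 1 MB and 100% growth in 2015
rows = [(2015, 1.0)]
for year in range(2016, 2022):
    limit_mb *= 1 + rate
    rows.append((year, round(limit_mb, 1)))
    if year in HALVING_YEARS:
        rate /= 2                 # 100% -> 50% -> 25%

print(rows)  # ends (2021, 12.7), matching the table
```

Unlike open-ended 50%/year growth, this schedule's growth rate tends to zero as the block reward does.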




Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

Anybody know economists who specialize in this sort of problem? Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely. ("...elect a Board of Blocksize Governors that decides on an Optimal Size based on market supply and demand conditions as measured by an independent Bureau of Blocksize Research....")


Theymos, I agree that the maximum blocksize is also an economic/game-theory problem; that’s why I drew those supply and demand curves, to try to analyse this using an economic framework.  If the blocksize limit increases and transaction fees fall, this can increase the velocity of money, boost inflation and stimulate the economy.  In contrast, if the blocksize limit falls, the velocity of money can fall, causing deflation and an economic slowdown.  I agree with Gavin that having a “blocksize policy committee” to manage the economy is not very consistent with Bitcoin’s values.  There should not be a problem here, as all the economic variables (volume of transactions, transaction fee data) are in the blockchain, and therefore economic policy could be automated.  Perhaps a simple fixed 50% increase per year is the best solution, or maybe a more complicated economic formula is required.  However, consumer bandwidth speeds obviously cannot be obtained from blockchain data, so it could be important to act with caution and keep blocksize growth somewhat restricted.  Permanent exponential growth should be avoided for this reason.
jonny1000 (OP)
Member
October 08, 2014, 08:34:32 PM
 #29

Put another way: I think we are behind the curve.  Making a steeper curve to compensate could really throw us off in the future; adjusting the y-intercept to put us back on track, and then making a good guess about future growth, would be better.

+1

A correction now, and then less aggressive growth going forward, could be a better idea.
PRab
Member
October 08, 2014, 11:10:42 PM
 #30

I agree that we need to increase the block size to prevent transactions getting stuck without a confirmation, but I strongly disagree with the rate at which it would be increased. I would support any of the following:

  • 1MB increase per year. - Simple to implement and doesn't lead to exponential growth.
  • 10% increase per year. - Exponential, but with a much smaller constant than has been discussed in this post.
  • Based on fullness of last years blocks (Max 20% increase). - Exponential, but more tightly tied to actual usage. Allows for block size decreases. Miners essentially get to vote on blocksize.

Basically, my hope is that it gets easier to run a full node over time. The total number of transactions that the worlds population is making is not increasing exponentially so (IMO), the blocksize shouldn't either.
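For a sense of scale, here is an illustrative comparison (my own sketch; the numbers are not from the thread) of how the first two proposals diverge from a 1 MB start over ten years, with the third option's 20% ceiling shown alongside:

```python
# Sketch comparing the proposed growth schedules, starting from 1 MB.
# The usage-based option depends on actual demand, so only its 20%
# worst-case ceiling is plotted here.

def linear_1mb(year):
    return 1.0 + year                 # 1 MB added per year

def exponential(year, rate):
    return 1.0 * (1 + rate) ** year   # fixed percentage growth per year

print("year  +1MB/yr  +10%/yr  20%-cap")
for y in range(0, 11, 2):
    print(f"{y:4d}  {linear_1mb(y):7.1f}  {exponential(y, 0.10):7.2f}"
          f"  {exponential(y, 0.20):7.2f}")
```

After a decade the linear rule gives 11 MB, 10%/year gives about 2.6 MB, and the 20% ceiling about 6.2 MB, so even the most aggressive of these stays far below 50%/year compounding.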
mmeijeri
Hero Member
*****
Offline Offline

Activity: 714
Merit: 500

Martijn Meijering


View Profile
October 08, 2014, 11:15:32 PM
 #31

Tree chains with a fixed block size could also work. Has anyone looked at the pros and cons compared to what's being proposed here?

ROI is not a verb, the term you're looking for is 'to break even'.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 09, 2014, 12:05:23 AM
 #32

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?

40%/year = 96%/(2 years).  I hope that gets rounded to "double once every 2 years (105 000 blocks)".

Also, do you propose this growth be open-ended or terminate at some block-size commensurate with the transaction volume of an existing, mature payment system such as Visa?  Given an open-ended proposal I'd echo Theymos' concern.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2347


Eadem mutata resurgo


View Profile
October 09, 2014, 12:34:27 AM
 #33

Put another way:  I think we are behind the curve.  Making a steeper curve to compensate could really throw us off in the future, but adjusting the Y-intercept to put us back on track, and then making a good guess about the future would be better.

+1

A correction now, and then less aggressive growth going forward, could be a better idea.

... or begin with 50% growth then have a halving of blocksize growth every 4 years?

hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 09, 2014, 04:16:27 AM
 #34

Let's keep in mind some important factors:

1) It will be impossible to limit the block size in the future, if bitcoin is very successful.  Governments and the ultra rich will control all of the nodes that are powerful enough to mine blocks.  They will also (due to their power) have indirect influence over the other nodes.
 
If the 10000 bitcoin enthusiasts decide that the block size is too big it won't matter; they don't control any of the 10 nodes capable of mining blocks.  If the enthusiasts decide to start rejecting blocks that are too big they will find themselves on a fork that no one cares about.  We need to assume that the miners of the future will be hostile to the principle of decentralization.

If bandwidth is the bottleneck, the big players can simply take action to slow the rate of bandwidth increase.  No one will notice if bandwidth increases by 49% per year instead of 50%.  One by one nodes will drop out until there are only a few left and then the crisis will be resolved, by implementing a Central Bitcoin Bank to determine the block size.

2) It will be possible to increase the block size in the future, if it has to be done.  The big players will press for it, and the small players will be convinced that they need to go along for the good of the system.

3) Trends don't continue forever, and even if they do it isn't always relevant.  Right now bandwidth is the foreseeable bottleneck.  Perhaps the growth of bandwidth will slow.  Perhaps it will continue, but something else (that we aren't thinking of right now) will become the bottleneck.

So putting these three principles together here is what I see:

increase the block size by 2X% per year, where X is the block reward.  So we'd have a couple more years at 50%, then four at 25%, then four at 12.5%, and so on.  This is still an astounding growth rate.

marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2347


Eadem mutata resurgo


View Profile
October 09, 2014, 04:33:51 AM
 #35

Quote
So putting these three principles together here is what I see:

increase the block size by 2X% per year, where X is the block reward.  So we'd have a couple more years at 50%, then four at 25%, then four at 12.5%, and so on.  This is still an astounding growth rate.

... cute and simple, but isn't that what I just said?

It is an interesting solution too, in that it locks the two scarce resources bitcoin provides (block space and coins) into the same release schedule. In this way, the decrease of block reward to miners might be replaced by a commensurate increase in fees from more competition for blockspace.

redHeadBlunder
Member
**
Offline Offline

Activity: 81
Merit: 10


View Profile
October 09, 2014, 04:34:30 AM
 #36

I agree that we need to increase the block size to prevent transactions getting stuck without a confirmation, but I strongly disagree with the rate that it would be increased. I would support any of the following:

  • 1MB increase per year. - Simple to implement and doesn't lead to exponential growth.
I don't think this would be an option. It may work well in the first few years, but over time it will become less effective, as the number of TXs per unit of time will (most likely) increase at an exponential rate.
  • 10% increase per year. - Exponential, but with a much smaller constant than has been discussed in this post.
I think this is too slow, especially at first, as it is less than the rate of TX growth we are seeing.
  • Based on the fullness of last year's blocks (max 20% increase). - Exponential, but more tightly tied to actual usage. Allows for block size decreases. Miners essentially get to vote on blocksize.
I am not sure about this one. A better solution might be to use your second option (but with a higher increase per year) up to a certain year, with the issue revisited later.
Basically, my hope is that it gets easier to run a full node over time. The total number of transactions that the world's population is making is not increasing exponentially, so (IMO) the blocksize shouldn't either.
I don't think this will happen. We will likely see fewer nodes run in households and more nodes run by corporations and on VPSes (or hosted servers). The fact is that there is no financial benefit to running a node, and over time people will be less inclined to have one running in their home.
tucenaber
Sr. Member
****
Offline Offline

Activity: 337
Merit: 250


View Profile
October 09, 2014, 09:11:33 AM
 #37

Quote
So putting these three principles together here is what I see:

increase the block size by 2X% per year, where X is the block reward.  So we'd have a couple more years at 50%, then four at 25%, then four at 12.5%, and so on.  This is still an astounding growth rate.

... cute and simple, but isn't that what I just said?

It is an interesting solution too, in that it locks the two scarce resources bitcoin provides (block space and coins) into the same release schedule. In this way, the decrease of block reward to miners might be replaced by a commensurate increase in fees from more competition for blockspace.

Just to clarify. This means a doubling from today's 1MB to eventually 2MB, and it will take decades to get there. Seems pointless to me. This will still not solve the problem (if there is one).

Are there economic reasons for keeping the block size limited to keep it artificially scarce?
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 09, 2014, 10:10:59 AM
Last edit: October 09, 2014, 10:23:51 AM by jonny1000
 #38

My Opinion:
Larger blocks means more transactions.
The fee per transaction stays low, but the net fee can grow along with block size.

Coindate, you could well be correct, larger blocks could mean more transactions and therefore if the fees fall then miners can be compensated by higher transaction volumes.

However, let’s make some assumptions about how the network may operate in the future:

1.      IBLT O(1) block propagation has been successfully implemented, and therefore the marginal costs to miners of including additional transactions are close to zero.

2.      Bitcoin mining is competitive, such that there are many miners driving transaction fees down to the point where marginal cost is close to marginal revenue.

These are both reasonable and desirable assumptions.  Now let’s consider the implications for transaction fees.  I propose this would result in a scenario in which fees are very low and then suddenly sharply increase as we approach the blocksize limit. For example in the following two illustrative charts the supply curve should be very close to the x-axis and then sharply increase at a tipping point, before being very close to the blocksize limit line.  In the charts the orange area represents mining revenue.

Healthy supply and demand curve for space in blocks

 
Scenario in which the blocksize limit has increased too fast


Demand for Bitcoin transactions is unchanged in both scenarios, however in the first chart mining revenue is far higher than in the second chart, as the orange boxes demonstrate.  This is because, despite the increase in volume, transaction fees are too low: the blocksize limit has increased too fast, resulting in almost zero transaction fees.  Therefore it is important to keep in mind that the blocksize needs to be small enough to generate fees that compensate miners when the block reward becomes too small.  Reducing the rate of growth in the blocksize when the reward falls may be a good idea for this reason.  Somebody must pay for the miners; distributed consensus is not free.  I hope the cost per transaction is very low, but it needs to be sufficient.
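The tipping-point shape of those curves can be made concrete with a toy numeric model (my own illustration; the demand curve and all numbers here are hypothetical, not from the charts):

```python
# Toy model: with per-transaction marginal cost near zero, the market fee
# collapses to that cost unless the blocksize limit binds, at which point
# the fee jumps to whatever clears the constrained supply.

def clearing_fee(intercept, slope, marginal_cost, limit_txs):
    """Fee and volume where a linear demand curve (fee = intercept - slope*q)
    meets a flat marginal-cost supply curve, truncated at the block limit."""
    q = (intercept - marginal_cost) / slope   # volume if the limit never binds
    if q <= limit_txs:
        return marginal_cost, q               # limit slack: fee ~ marginal cost
    return intercept - slope * limit_txs, limit_txs  # limit binds: fee rises

# Same demand and near-zero marginal cost, two different limits (txs/block):
fee_loose, q_loose = clearing_fee(10.0, 0.002, 0.01, 5000)  # limit too high
fee_tight, q_tight = clearing_fee(10.0, 0.002, 0.01, 2000)  # limit binds
print(fee_loose * q_loose, fee_tight * q_tight)  # miner fee revenue per block
```

With identical demand, the looser limit leaves fees pinned at the tiny marginal cost, while the binding limit yields orders of magnitude more fee revenue — the same contrast as the two orange boxes above.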
Alpaca Bob
Full Member
***
Offline Offline

Activity: 153
Merit: 100


View Profile
October 09, 2014, 10:33:39 AM
 #39

...Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all...

I think what he meant was that it would be like SWIFT in that it would mostly be for large international transfers...

No, I got that. What I'm saying is that it is a false comparison, and that "reinventing" SWIFT, the gold standard, or - as Jon Matonis put it (comments) - a 'wholesale' instrument for global settlement, might not necessarily be a bad thing.

The Times 03/Jan/2009 Chancellor on brink of second bailout for banks
kingcolex
Legendary
*
Offline Offline

Activity: 2366
Merit: 1258



View Profile
October 09, 2014, 06:31:03 PM
 #40

Would 40% initial size and growth make you support the proposal?

I don’t think the distinction between 50% and 40% matters that much; the real issue may be whether there should be permanent exponential growth in the blocksize limit or some other model.  Nothing grows exponentially forever; consumer bandwidth speeds won’t, and neither will demand for Bitcoin transactions, so why require permanent exponential growth?  Why not consider a model where, say, the blocksize grows by a fixed percentage each year, but the rate of increase falls by 50% whenever the block reward drops?

Consider the example below:

Year    Blocksize limit (MB)    Growth rate
2015             1.0               100%
2016             2.0               100%
2017             3.0                50%
2018             4.5                50%
2019             6.8                50%
2020            10.1                50%
2021            12.7                25%
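This example schedule can be reproduced with a short sketch (the years at which the rate halves are my reading of the table, not stated explicitly):

```python
# Sketch: blocksize growth that is cut in half at each block-reward halving,
# taken here as taking effect in 2017 and 2021 to match the example table.

def schedule(start_mb, start_year, end_year, halving_years, rate=1.0):
    size, limits = start_mb, {start_year: start_mb}
    for year in range(start_year + 1, end_year + 1):
        if year in halving_years:
            rate /= 2              # growth rate halves when the reward halves
        size *= 1 + rate
        limits[year] = size
    return limits

limits = schedule(1.0, 2015, 2021, {2017, 2021})
for year, mb in limits.items():
    print(year, mb)
```

This yields 2.0 MB in 2016, 3.0 in 2017, 10.125 in 2020 and about 12.66 in 2021, matching the rounded figures above.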




Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

Anybody know economists who specialize in this sort of problem? Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely. ("...elect a Board of Blocksize Governors that decides on an Optimal Size based on market supply and demand conditions as measured by an independent Bureau of Blocksize Research....")


Theymos, I agree that the maximum blocksize is also an economic/game theory problem; that’s why I drew those supply and demand curves, to try to analyse this using an economic framework.  If the blocksize limit increases and transaction fees fall, this can increase the velocity of money, boost inflation and stimulate the economy.  In contrast, if the blocksize limit falls, the velocity of money can fall, causing deflation and an economic slowdown.  I agree with Gavin that having a “Blocksize policy committee” to manage the economy is not very consistent with Bitcoin’s values.  There should not be a problem here, as all the economic variables (volume of transactions, transaction fee data) are in the blockchain, and therefore economic policy could be automated.  Perhaps a simple fixed 50% increase per year is the best solution, or maybe a more complicated economic formula is required.  However, consumer bandwidth speeds obviously cannot be obtained from blockchain data, so it could be important to act with caution and keep blocksize growth somewhat restricted.  Permanent exponential growth should be avoided for this reason.

I like this increase plan. The one thing we all seem to agree on is that the limit NEEDS to grow, and I think it will take a big name like Gavin to make the change.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 09, 2014, 07:00:52 PM
 #41

My Opinion:
Larger blocks means more transactions.
The fee per transaction stays low, but the net fee can grow along with block size.

Coindate, you could well be correct, larger blocks could mean more transactions and therefore if the fees fall then miners can be compensated by higher transaction volumes.

However, let’s make some assumptions about how the network may operate in the future:

1.      IBLT O(1) block propagation has been successfully implemented, and therefore the marginal costs to miners of including additional transactions are close to zero.

2.      Bitcoin mining is competitive, such that there are many miners driving transaction fees down to the point where marginal cost is close to marginal revenue.

These are both reasonable and desirable assumptions.  Now let’s consider the implications for transaction fees.  I propose this would result in a scenario in which fees are very low and then suddenly sharply increase as we approach the blocksize limit. For example in the following two illustrative charts the supply curve should be very close to the x-axis and then sharply increase at a tipping point, before being very close to the blocksize limit line.  In the charts the orange area represents mining revenue.

Healthy supply and demand curve for space in blocks

 
Scenario in which the blocksize limit has increased too fast


Demand for Bitcoin transactions is unchanged in both scenarios, however in the first chart mining revenue is far higher than in the second chart, as the orange boxes demonstrate.  This is because, despite the increase in volume, transaction fees are too low: the blocksize limit has increased too fast, resulting in almost zero transaction fees.  Therefore it is important to keep in mind that the blocksize needs to be small enough to generate fees that compensate miners when the block reward becomes too small.  Reducing the rate of growth in the blocksize when the reward falls may be a good idea for this reason.  Somebody must pay for the miners; distributed consensus is not free.  I hope the cost per transaction is very low, but it needs to be sufficient.
Sorry to be blunt, but your supply and demand curves are nonsense.

Miners will not automatically create the largest possible block they can regardless of their revenue.

The supply curve has no relationship with the maximum block size allowed by the protocol - it's determined by the costs involved with producing larger blocks.
hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 09, 2014, 07:12:37 PM
 #42

Quote
So putting these three principles together here is what I see:

increase the block size by 2X% per year, where X is the block reward.  So we'd have a couple more years at 50%, then four at 25%, then four at 12.5%, and so on.  This is still an astounding growth rate.

... cute and simple, but isn't that what I just said?

It is an interesting solution too, in that it locks the two scarce resources bitcoin provides (block space and coins) into the same release schedule. In this way, the decrease of block reward to miners might be replaced by a commensurate increase in fees from more competition for blockspace.

Just to clarify. This means a doubling from today's 1MB to eventually 2MB, and it will take decades to get there. Seems pointless to me. This will still not solve the problem (if there is one).


No, it does not mean that.  It means this:

2015: 1.5MB (block reward = 25)
2016: 2.25MB (block reward = 25)
2017: 2.8125MB (block reward = 12.5)
2018: 3.52...  (block reward = 12.5)
2019: 4.39...  (block reward = 12.5)
2020: 5.49...  (block reward = 12.5)
2021: 6.18...  (block reward = 6.25)
2022: 6.95...  (block reward = 6.25)
2023: 7.82...  (block reward = 6.25)

and so on.  That seems like a schedule that won't kill off too many nodes.
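The "2X% per year, where X is the block reward" rule is easy to check with a short sketch (the reward-drop years are taken from the list above):

```python
# Sketch: grow the blocksize limit by 2X% per year, where X is the block
# reward, assuming the reward drops to 12.5 in 2017 and to 6.25 in 2021.

def reward(year):
    if year <= 2016:
        return 25.0
    if year <= 2020:
        return 12.5
    return 6.25

size = 1.0  # MB at the end of 2014
for year in range(2015, 2024):
    size *= 1 + 2 * reward(year) / 100.0   # e.g. reward 25 -> +50% per year
    print(year, round(size, 4))
```

This reproduces the figures above: 1.5, 2.25, 2.8125, ... up to about 7.82 MB.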

An extremely large block size would mess up the economics of mining eventually.

jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 09, 2014, 07:36:49 PM
 #43

Sorry to be blunt, but your supply and demand curves are nonsense.

Miners will not automatically create the largest possible block they can regardless of their revenue.

The supply curve has no relationship with the maximum block size allowed by the protocol - it's determined by the costs involved with producing larger blocks.

Dear justusranvier

Thank you for your comments about the supply and demand curves.  They may well be nonsense, as this is an emerging field and I may not have strong knowledge in this area; the curves are also just theoretical and in practice the situation may be different.  I merely propose that they could provide a useful framework for analysing the dynamic between transaction fees and the blocksize limit.

I agree that the supply curve is "determined by the costs involved with producing larger blocks", although of course the limit still matters in the sense that supply cannot go above it.  The point I was trying to make is that the costs "involved with producing larger blocks" are very low.  I think of it in terms of the marginal cost to a miner of including one more transaction in the block.  There are only two costs I can think of:
1. The very small amount of processing required to verify the transaction and include its hash in the block's merkle tree,
2. The cost associated with the larger blocksize increasing the probability of an orphan.

The supply curve I made was assuming successful IBLT O(1) block propagation, which should reduce the marginal increase in orphan risk to almost zero, leaving only the small amount of computer processing as the marginal cost of including a transaction in a block.  This small cost is why the supply curve looks the way it does.  Miners include as many transactions as they can because their cost of doing so is minimal, and therefore the market-driven transaction fee will be very low.  Then, if demand increases such that there is insufficient space in blocks, supply is constrained by the limit and the price increases.  This is the area of the curve the network may need to be at to ensure mining revenue is sufficient.
 
Many thanks
Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652
Merit: 1939


Chief Scientist


View Profile WWW
October 09, 2014, 07:51:30 PM
 #44

An extremely large block size would mess up the economics of mining eventually.

I'm working on a follow-up blog post that talks about economics of the block size, but want to get it reviewed by some real economists to make sure my thinking is reasonably correct. But I'm curious: why do you think an extremely large block size will mess up the economics of mining?  What do you think would happen?

RE: geometric growth cannot go on forever:  true, but Moore's law has been going steady for 40 years now. The most pessimistic prediction I could find said it would last at least another 10-20 years; the most optimistic, 600 years.

I'd be happy with "increase block size 40% per year (double every two years) for 20 years, then stop."

Because if Bitcoin is going gangbusters 15 years from now, and CPU and bandwidth growth is still going strong, then either the "X%" or the "then stop date" can be changed to continue growing.

I did some research, and the average "good" broadband Internet connection in the US is 10Mbps speed. But ISPs are putting caps on home users' total bandwidth usage per month, typically 250 or 300GB/month. If I recall correctly, 300GB per month was the limit for my ISP in Australia, too.

Do the math, and 40% of a 250GB connection works out to 21MB dedicated to Bitcoin every ten minutes. Leave a generous megabyte for overhead, that would work out to a starting point of maximum-size-20MB blocks.

(somebody check my math, I'm really good at dropping zeroes)
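Taking up the invitation to check the math, a quick sketch (assuming GB = 1000 MB and a 30-day month):

```python
# Share of a capped home connection available to Bitcoin per 10-minute
# block interval, from a monthly data cap and a dedicated fraction.

def mb_per_block(cap_gb_per_month, share, days_per_month=30):
    blocks_per_month = days_per_month * 24 * 6   # one block every 10 minutes
    return cap_gb_per_month * 1000 * share / blocks_per_month

print(round(mb_per_block(250, 0.40), 1))  # closer to 23 MB than 21 MB
```

So 40% of a 250 GB/month cap works out to roughly 23 MB per block interval; the 21 MB figure above is slightly conservative, which doesn't change the 20 MB starting-point conclusion.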


How often do you get the chance to work on a potentially world-changing project?
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 09, 2014, 08:00:47 PM
 #45

The supply curve I made was assuming successful IBLT O(1) block propagation, which should reduce the marginal increase in orphan risk to almost zero, leaving only the small amount of computer processing as the marginal cost of including a transaction in a block.  This small cost is why the supply curve looks the way it does.  Miners include as many transactions as they can because their cost of doing so is minimal, and therefore the market-driven transaction fee will be very low.  Then, if demand increases such that there is insufficient space in blocks, supply is constrained by the limit and the price increases.  This is the area of the curve the network may need to be at to ensure mining revenue is sufficient.
Markets don't need quotas in order to prevent producers from operating at a loss.

If the cost of processing transactions drops due to improvements in technology this is a good thing - it means that Bitcoin users get more transactions at a lower price.

You don't have to worry about miners going out of business because they aren't going to add transactions to a block if doing so results in a net loss for them.

By the way: the curves you are looking for are here:

http://econpage.com/301/practice/mt1-s.htm

Right now, the supply curve for Bitcoin transaction processing is like S2 on that graph, except that at the moment the equilibrium point is to the left of the quota, so the quota isn't yet affecting the price.

Without the block size limit, the curve would look like S1.

Technologies which lower the cost of producing and transmitting blocks move the supply curve to the right (though the quota does not move).
odolvlobo
Legendary
*
Online Online

Activity: 3724
Merit: 2663



View Profile
October 09, 2014, 08:22:39 PM
 #46

There are only two costs I can think of:
1. The very small amount of processing required to verify the transaction and include its hash in the block's merkle tree,
2. The cost associated with the higher blocksize increasing the probability of orphan.

I believe that these per-transaction costs are real and that they will eventually determine the minimum transaction fee, assuming there is no artificial limit on the block size. Even if these costs are very low they are not 0, and an increased block size will allow more transactions and more income from fees.

Buy stuff on Amazon with BTC or convert Amazon points to BTC here: Purse.io
Join an anti-signature campaign: Click ignore on the members of signature campaigns.
PGP Fingerprint: 6B6BC26599EC24EF7E29A405EAF050539D0B2925 Signing address: 13GAVJo8YaAuenj6keiEykwxWUZ7jMoSLt
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 09, 2014, 08:58:53 PM
 #47

I agree the current supply curve may look like S2 on http://econpage.com/301/practice/mt1-s.htm.

However, perhaps the reason fees are not that low now is the following:
1.   Mining is not currently perfectly competitive, and thus miners are not lowering the offer prices of their services to marginal cost
2.   Orphan risk is currently significant, and the marginal cost of including a transaction is actually high

The problems I am talking about do not apply now when there is a large block reward, but could occur in the future when the block reward is low and miners need to be incentivised by the transaction fee.

In a competitive market price should equal marginal cost; in that environment mining profit should fall to zero and there will be limited incentives to mine.  I propose that the system needs arbitrary limits on the blocksize to “manipulate” the transaction fee market, so that mining incentives are high enough.

In the current network an analogous situation occurs with respect to the difficulty and the Bitcoin price.  For example, at any given Bitcoin price, miners enter or leave the market, profit margins tend to zero and the system self-adjusts.  This is not a problem: if the Bitcoin price falls, miners exit, the difficulty falls, mining profits increase and we are fine.

Let’s consider the situation when the block reward is low and transaction fees compensate miners, then the dynamics are different.  If supply is not constrained, transaction fees fall to the marginal cost, mining profit falls and then miners exit and the difficulty falls.  The remaining miners can then find blocks more easily, but they don’t necessarily get compensated more for this, because the fees would still be low.

If the cost of processing transactions drops due to improvements in technology this is a good thing - it means that Bitcoin users get more transactions at a lower price.

Yes, in a normal market this is true: Bitcoin users getting more transactions at a lower price seems good.  However, the market for Bitcoin transactions is not a normal market.  The system requires a large number of miners to provide a large hashrate for security purposes.

For example, let’s consider the music industry: improvements in technology are reducing the cost of distributing music, and this is a good thing; it means more people get more music at a lower price.  If the marginal cost of distributing music is zero, then in a competitive market the price should be zero and the consumer wins.  The record industry or CD-production industry may suffer and shrink, which many people would argue is fine; that’s just market forces at work.  With respect to miners it is different: users need miners to be around.  Improvements in technology which reduced the cost of processing transactions to zero could damage network security.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 09, 2014, 09:46:18 PM
Last edit: October 10, 2014, 01:32:29 PM by teukon
 #48

An extremely large block size would mess up the economics of mining eventually.

I'm working on a follow-up blog post that talks about economics of the block size, but want to get it reviewed by some real economists to make sure my thinking is reasonably correct. But I'm curious: why do you think an extremely large block size will mess up the economics of mining?  What do you think would happen?

RE: geometric growth cannot go on forever:  true, but Moore's law has been going steady for 40 years now. The most pessimistic prediction I could find said it would last at least another 10-20 years; the most optimistic, 600 years.

I'd be happy with "increase block size 40% per year (double every two years) for 20 years, then stop."

Because if Bitcoin is going gangbusters 15 years from now, and CPU and bandwidth growth is still going strong, then either the "X%" or the "then stop date" can be changed to continue growing.

I did some research, and the average "good" broadband Internet connection in the US is 10Mbps speed. But ISPs are putting caps on home users' total bandwidth usage per month, typically 250 or 300GB/month. If I recall correctly, 300GB per month was the limit for my ISP in Australia, too.

Do the math, and 40% of a 250GB connection works out to 21MB dedicated to Bitcoin every ten minutes. Leave a generous megabyte for overhead, that would work out to a starting point of maximum-size-20MB blocks.

(somebody check my math, I'm really good at dropping zeroes)

Yeah, 40% of a 250 GB connection works out to about 23 MB depending on how you define month.  May I ask what would happen regarding TOR?

If 1 MB blocks give us, say, 3 transactions per second, then 20 years of "double every 2 years" growth starting from 20 MB would leave us with about 60 million transactions per second.  That's about 25 transaction per hour per human (assuming a world population of 8.5 billion in 20 years time).

This sounds a bit excessive to me but then again I've not thought seriously about how such a volume of transactions could be utilised.  https://en.bitcoin.it/wiki/Scalability doesn't speculate beyond a few hundred thousand transactions per second.  I'd certainly appreciate a link if a discussion on the utility of millions of transactions per second exists.


Edit: Oops.  I miscalculated "double every year for 20 years".  Starting from 1 MB blocks worth, say, 3 transactions per second, then 20 years of "double every 2 years" growth starting from 20 MB will yield about 60 000 transactions per second.  That's about 4 transactions per week per human (assuming a world population of 8.5 billion in 20 years time).
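The corrected estimate can be redone in a few lines (a sketch using the figures as stated: 3 tx/s per MB, a 20 MB start, doubling every 2 years for 20 years, and 8.5 billion people):

```python
# Redoing the corrected throughput estimate from the post above.

start_mb = 20
doublings = 20 // 2                        # double every 2 years for 20 years
tx_per_sec = 3 * start_mb * 2 ** doublings # per-MB tx rate scales with size
per_human_per_week = tx_per_sec * 7 * 24 * 3600 / 8.5e9
print(tx_per_sec, round(per_human_per_week, 1))
```

This gives roughly 61,000 transactions per second, or about 4 transactions per human per week, consistent with the corrected figure.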

Looking forward to the block-size economics blog post.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 09, 2014, 09:54:34 PM
 #49

I agree the current supply curve may look like S2 on http://econpage.com/301/practice/mt1-s.htm.

However, perhaps the reason fees are not that low now is the following:
1.   Mining is not currently perfectly competitive, and thus miners are not lowering the offer prices of their services to marginal cost
2.   Orphan risk is currently significant, and the marginal cost of including a transaction is actually high

The problems I am talking about do not apply now when there is a large block reward, but could occur in the future when the block reward is low and miners need to be incentivised by the transaction fee.

In a competitive market price should equal marginal cost; in that environment mining profit should fall to zero and there will be limited incentives to mine.  I propose that the system needs arbitrary limits on the blocksize to “manipulate” the transaction fee market, so that mining incentives are high enough.

In the current network an analogous situation occurs with respect to the difficulty and the Bitcoin price.  For example, at any given Bitcoin price, miners enter or leave the market, profit margins tend to zero and the system self-adjusts.  This is not a problem: if the Bitcoin price falls, miners exit, the difficulty falls, mining profits increase and we are fine.

Let’s consider the situation when the block reward is low and transaction fees compensate miners, then the dynamics are different.
1. Miners are not pricing transactions very well because of the huge block subsidy. Like all subsidies, it adversely affects the market. Our only recourse is to wait until it diminishes.
2. Orphan risk affects the slope of the curve, not its basic shape. That adding transactions to a block creates costs for the miner in the form of orphan risk is an argument for why no quota is necessary to keep the block size reasonable.

Price discovery for transaction inclusion will get better when there is no subsidy and all miner revenue is derived from transaction fees. At that point there's nothing "special" about this industry compared to every other service industry, which does a fine job of using price discovery to match supply and demand.

If supply is not constrained, transaction fees fall to the marginal cost, mining profit falls and then miners exit and the difficulty falls.  The remaining miners can then find blocks more easily, but they don’t necessarily get compensated more for this, because the fees would still be low.
If there are fewer miners competing for the same amount of transaction fees, then each miner's revenue has increased. The process you describe will continue until the oversupply of miners is corrected and equilibrium is restored.
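The exit dynamic described here can be illustrated with a toy model; every number below is hypothetical (fixed total fee revenue shared equally, identical per-miner costs):

```python
# Toy model of the exit dynamic: total fee revenue is fixed, miners share it
# equally, and unprofitable miners leave one at a time (difficulty falling as
# they go) until revenue per miner covers cost. All numbers are hypothetical.
total_fees = 100.0    # fee revenue per period, shared among miners
cost = 4.0            # per-miner cost per period
miners = 50           # initial oversupply: revenue/miner = 2.0 < cost

while miners > 1 and total_fees / miners < cost:
    miners -= 1       # an unprofitable miner exits

print(miners, total_fees / miners)  # 25 4.0 -- equilibrium: revenue = cost
```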

For example, let’s consider the music industry: improvements in technology are reducing the cost of distributing music, and this is a good thing; it means more people get more music at a lower price.  If the marginal cost of distributing music is zero, then in a competitive market the price should be zero and the consumer wins.  The record industry or CD production industry may suffer and shrink, which many people would argue is fine; that’s just market forces at work.  With respect to miners, it is different: users need miners to be around.  Improvements in technology which reduced the cost of processing transactions to zero could damage network security.

Markets that never could have existed in the first place except for government-granted monopoly rights are not appropriate as models of what we should do.
Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652
Merit: 1939


Chief Scientist


View Profile WWW
October 09, 2014, 10:14:22 PM
 #50

Yeah, 40% of a 250 GB connection works out to about 23 MB depending on how you define month.  May I ask what would happen regarding TOR?

Thanks for checking my math!  I used 31-day months, since I assume that is how ISPs do the bandwidth cap.
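For reference, the arithmetic behind the ~23 MB figure (one block per 10 minutes and a 31-day month, as assumed above):

```python
# 40% of a 250 GB monthly cap spent on block data, one block per 10 minutes,
# 31-day month (assumptions quoted in the post).
budget_mb = 0.40 * 250 * 1024       # 102400 MB of block data per month
blocks_per_month = 31 * 24 * 6      # 4464 blocks at 10 minutes each
print(round(budget_mb / blocks_per_month, 1))  # 22.9 -- "about 23 MB"
```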

RE: what happens with Tor:

Run a full node (or better, several full nodes) that is connected to the network directly-- not via Tor.

But to keep your transactions private, you broadcast them through a Tor-connected SPV (not full) node. If you are mining, broadcast new blocks the same way.

That gives you fully-validating-node security plus transaction/block privacy. You could run both the full node and the SPV-Tor-connected node on a machine at home; to the rest of the network your home IP address would look like a relay node that never generated any transactions or blocks.

If you live in a country where even just connecting to the Bitcoin network is illegal (or would draw unwelcome attention to yourself), then you'd need to pay for a server somewhere else and administer it via Tor.

How often do you get the chance to work on a potentially world-changing project?
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 10, 2014, 12:09:43 AM
 #51

Yeah, 40% of a 250 GB connection works out to about 23 MB depending on how you define month.  May I ask what would happen regarding TOR?

Thanks for checking my math!  I used 31-day months, since I assume that is how ISPs do the bandwidth cap.

Ah, that makes sense.  No problem.

RE: what happens with Tor:

Run a full node (or better, several full nodes) that is connected to the network directly-- not via Tor.

But to keep your transactions private, you broadcast them through a Tor-connected SPV (not full) node. If you are mining, broadcast new blocks the same way.

That gives you fully-validating-node security plus transaction/block privacy. You could run both the full node and the SPV-Tor-connected node on a machine at home; to the rest of the network your home IP address would look like a relay node that never generated any transactions or blocks.

If you live in a country where even just connecting to the Bitcoin network is illegal (or would draw unwelcome attention to yourself), then you'd need to pay for a server somewhere else and administer it via Tor.

Thank you, this clears things up for me.  All I don't understand here is the notion of broadcasting newly generated blocks through a Tor-connected SPV node but I imagine I can look this up.
tucenaber
Sr. Member
****
Offline Offline

Activity: 337
Merit: 250


View Profile
October 10, 2014, 02:07:41 AM
 #52

No, it does not mean that.  It means this:

2015: 1.5MB (block reward = 25)
2016: 2.25MB (block reward = 25)
2017: 2.8125MB (block reward = 12.5)
2018: 3.52...  (block reward = 12.5)
2019: 4.39...  (block reward = 12.5)
2020: 5.49...  (block reward = 12.5)
2021: 6.18...  (block reward = 6.25)
2022: 6.95...  (block reward = 6.25)
2023: 7.82...  (block reward = 6.25)

and so on.  That seems like a schedule that won't kill off too many nodes.

I see. I misunderstood you. But it still amounts to basically the same thing: it will level off to a fixed limit eventually. Besides, it is completely ad hoc. Gavin's formula, as I understand it, is basically to raise the limit as fast as possible without endangering decentralization too much, which makes sense. Yours seems just taken out of thin air.
Quote
An extremely large block size would mess up the economics of mining eventually.

No it won't.
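For concreteness, the quoted schedule can be reproduced by halving the annual growth rate at each reward halving; a sketch of one reading of the numbers, not a proposal:

```python
# Reproducing the quoted schedule: annual growth of 50% while the reward is
# 25 BTC, 25% while it is 12.5, and 12.5% while it is 6.25 (one reading of
# the quoted numbers; the era boundaries are illustrative).
size = 1.0  # MB, starting from today's limit
for year, rate in zip(range(2015, 2024),
                      [0.50, 0.50, 0.25, 0.25, 0.25, 0.25, 0.125, 0.125, 0.125]):
    size *= 1 + rate
    print(year, round(size, 4))
# 2015 1.5 ... 2017 2.8125 ... 2023 7.8213, matching the quoted figures
```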
Syke
Legendary
*
Offline Offline

Activity: 3878
Merit: 1187


View Profile
October 10, 2014, 03:39:35 AM
 #53

If 1 MB blocks give us, say, 3 transactions per second, then 20 years of "double every 2 years" growth starting from 20 MB would leave us with about 60 million transactions per second.  That's about 25 transaction per hour per human (assuming a world population of 8.5 billion in 20 years time).

This sounds a bit excessive to me but then again I've not thought seriously about how such a volume of transactions could be utilised.  https://en.bitcoin.it/wiki/Scalability doesn't speculate beyond a few hundred thousand transactions per second.  I'd certainly appreciate a link if a discussion on the utility of millions of transactions per second exists.

I like this line of thinking. What TPS are we shooting for and when? That's what will determine what size blocks we need and how to grow to that target.

Simple growth rates like "50% increase per year" are guaranteed to end up with blocks that are too large, which will require another hard fork. Hard forks are bad, mkay?

Buy & Hold
hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 10, 2014, 06:17:48 AM
 #54


But I'm curious: why do you think an extremely large block size will mess up the economics of mining?  What do you think would happen?


Limiting block size creates an inefficiency in the bitcoin system.  Inefficiency = profit.  This is a basic law of economics, though it is usually phrased in such a way as to justify profits by pointing out that they eliminate inefficiencies.  I am taking the other position, that if we want mining to be profitable then there needs to be some artificial inefficiency in the system, to support marginal producers.  Of course that profit will attract more hashing power thus reducing/eliminating the profit, but at a higher equilibrium.  However, I am not too worried about this aspect of large block sizes.  It is a fairly minor problem and one that is a century away.


RE: geometric growth cannot go on forever:  true, but Moore's law has been going steady for 40 years now. The most pessimistic prediction I could find said it would last at least another 10-20 years; the most optimistic, 600 years.


Well I found several predictions saying that it was only going to continue for about 7 more years.  However, that was about 12 or so years ago, so obviously those predictions didn't come true.

The problem with Moore's Law predictions is that they don't take into account game theory.  They assume that nearly everyone is either working to make better chips, buying better chips, enjoying better chips, or simply having nothing to do with better chips.

We need to imagine a world where the miners, bankers, and governments work to suppress computing technology.  Not because they want to destroy bitcoin, but because they want to be the dominant players.  If bitcoin is wildly successful, Moore's law will have an opponent.


I'd be happy with "increase block size 40% per year (double every two years) for 20 years, then stop."


That would probably work pretty well.  It would end (hopefully) before bitcoin is too big of a deal.

The goal should be to get to a situation where it is simply socially unacceptable to suggest changes to the bitcoin protocol.  This needs to be the case before mass acceptance.  Once mass acceptance happens, conversations like the one we are having now will be held behind closed doors by central banks (central banks will always be with us).

2112
Legendary
*
Offline Offline

Activity: 2128
Merit: 1055



View Profile
October 10, 2014, 06:47:03 AM
 #55

Limiting block size creates an inefficiency in the bitcoin system.  Inefficiency = profit.  This is a basic law of economics, though it is usually phrased in such a way as to justify profits by pointing out that they eliminate inefficiencies.  I am taking the other position, that if we want mining to be profitable then there needs to be some artificial inefficiency in the system, to support marginal producers.  Of course that profit will attract more hashing power thus reducing/eliminating the profit, but at a higher equilibrium.  However, I am not too worried about this aspect of large block sizes.  It is a fairly minor problem and one that is a century away.
It is a fairly common misconception that the only way to pay for space in a mined Bitcoin block is with fees denominated in bitcoins. But this is not true when a miner is integrated with an exchange, because an exchange can shave commissions on both sides of its trades.

Imagine for a moment that Bitfury branches out into Consolidated Furies and spawns Hryvnafury, Roublefury, Eurofury, DollarFury, etc.; all of them being exchanges. It can then easily outcompete pure Bitcoin miners because it can directly funnel fiat commissions into electric utility bills without having to go twice through the fiat<->coin exchange.

Edit: In fact opportunities for integration are not limited to mining + coin exchange. Imagine for example Marijuanafury which does two things demanding lots of electricity: Bitcoin mining and indoor marijuana growing. If only somebody could come up with new optical ASIC technology that is pumped with energy via photons at the same wavelength that stimulate photosynthesis...

Please comment, critique, criticize or ridicule BIP 2112: https://bitcointalk.org/index.php?topic=54382.0
Long-term mining prognosis: https://bitcointalk.org/index.php?topic=91101.0
painlord2k
Sr. Member
****
Offline Offline

Activity: 453
Merit: 254


View Profile
October 10, 2014, 01:08:32 PM
 #56

Just exposing some ideas:

Gavin's plan appears to me to be VERY conservative (maybe too much).

To process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks would need to go up ~1,000x at least to accommodate so many transactions.
And we don't just want to take on VISA's burden; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take 20 years to take over VISA alone, never mind the unbanked and DACs.

With a 100% increase every year, it would take 11 years to take over VISA. That brings us to 2025, when the inflation rate will be around 1% and the reward will be 3.125 BTC/block.

If we suppose the income of miners stays the same as now once the block reward has become irrelevant and Bitcoin has the same number of transactions as VISA, we need about $0.01 per transaction (200 million transactions × $0.01 ≈ $2M/day, like today).

We must keep in mind that the low cost of Bitcoin transactions will cause greater use of them, not lower use.
Today it is uneconomic to bill people daily because VISA costs 20 cents + 2% per transaction; tomorrow, with a deployed BTC infrastructure, people could pay daily for a lot of things they currently pay for weekly or monthly.

We must also keep in mind that BTC transaction fees will be what keeps the network working.  We could compensate with larger fees (10 cents per transaction?), but this would weed out marginal transactions and push them to some other coin.

I would suggest some schedule to go from 1 to 32 MB in the next 5 years at most (better in two), with a plan to keep increasing exponentially from there until at least 1 GB blocks.

Because as merchants and adopters reach a critical mass, there will be an explosion of transactions.
Today, the great majority of BTC holders have little chance to spend their bitcoins or be paid in bitcoins in their daily lives.
But as BTC becomes popular and widely adopted, they will start to use it more frequently and the number of transactions will explode. We have seen nothing yet: growth was pretty linear over the last two years.


The big problem with Gavin's plan is that exponential growth is tricky to manage.
Too slow, and it initially fails to keep up with demand.
Too fast, and the resources available will not keep up.
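The growth-rate arithmetic above can be sketched as follows (taking VISA as roughly a 2,000x capacity multiple, per the post):

```python
import math

# Years needed for a ~2,000x capacity increase at the growth rates discussed.
target = 2000
for rate in (0.50, 1.00):
    years = math.log(target) / math.log(1 + rate)
    print(f"{rate:.0%}/yr: {years:.1f} years")
# 50%/yr: 18.7 years (roughly the "20 years" above); 100%/yr: 11.0 years
```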
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 10, 2014, 01:51:05 PM
 #57

Gavin's plan appears to me to be VERY conservative (maybe too much).

To process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks would need to go up ~1,000x at least to accommodate so many transactions.
And we don't just want to take on VISA's burden; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take 20 years to take over VISA alone, never mind the unbanked and DACs.

Thank you.  You awakened me to a calculation error I made earlier.

I believe the proposal involves an initial jump in block size followed by temporary exponential growth with fixed parameters.  If the block size were increased to, say, 20 MB and then grown at 50% per year, we'd be up to 2 GB blocks in 11-12 years.  At 40% (double every 2 years) it would take 13-14 years.
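A sketch of those two timelines (20 MB to 2 GB is roughly a 100x increase):

```python
import math

# Years for 20 MB blocks to reach 2 GB at the two growth rates discussed.
factor = 2048 / 20  # about a 102x increase
for rate in (0.50, 0.40):
    years = math.log(factor) / math.log(1 + rate)
    print(f"{rate:.0%}/yr: {years:.1f} years")
# 50%/yr: 11.4 years; 40%/yr: 13.8 years
```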
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 10, 2014, 01:53:41 PM
 #58

Can people stop talking about increasing the block size?

It's the block size limit that needs to increase or be abolished.

A higher limit does not imply suddenly larger blocks, just the possibility for larger blocks to be created when the need exists and the price is right.
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1014


View Profile
October 10, 2014, 02:19:23 PM
 #59

A higher limit does not imply suddenly larger blocks, just the possibility for larger blocks to be created when the need exists and the price is right.

Right.  The majority of the miners, working together, will always have the ability to set a lower block limit.

The block limit embedded in the reference client is there to prevent massive blocks that cannot be handled in a distributed way.  Large enough blocks mean that miners who don't have a high speed connection can't keep up.

As long as the average VPS can handle the block size, then centralisation risk is low.

Miners might still decide to keep the block size small to push up fees.  There is an inherent cost for larger blocks, since they take longer to distribute (though that depends on low latency optimisations).
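That inherent cost can be sketched with a common simplification (my assumption, not from the post): if block intervals are exponential with a 600-second mean, a block that takes t seconds to propagate is orphaned with probability about 1 - exp(-t/600):

```python
import math

# Rough orphan-risk model: block intervals are exponential with mean 600 s,
# so a block taking t seconds to propagate is orphaned with probability
# about 1 - exp(-t / 600). The propagation speed is a made-up assumption.
def orphan_prob(block_mb, mb_per_sec=1.0):
    t = block_mb / mb_per_sec
    return 1 - math.exp(-t / 600)

for size_mb in (1, 20, 100):
    print(size_mb, round(orphan_prob(size_mb), 4))
# bigger blocks -> longer propagation -> higher chance of being orphaned,
# so their fees must cover that expected loss
```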

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 10, 2014, 02:45:17 PM
 #60

If 1 MB blocks give us, say, 3 transactions per second, then 20 years of "double every 2 years" growth starting from 20 MB would leave us with about 60 million transactions per second.  That's about 25 transaction per hour per human (assuming a world population of 8.5 billion in 20 years time).

This sounds a bit excessive to me but then again I've not thought seriously about how such a volume of transactions could be utilised.  https://en.bitcoin.it/wiki/Scalability doesn't speculate beyond a few hundred thousand transactions per second.  I'd certainly appreciate a link if a discussion on the utility of millions of transactions per second exists.

I like this line of thinking. What TPS are we shooting for and when? That's what will determine what size blocks we need and how to grow to that target.

Simple growth rates like "50% increase per year" are guaranteed to end up with blocks that are too large, which will require another hard fork. Hard forks are bad, mkay?

Apologies.  I miscalculated: the figure should be 60 000 transactions per second, not 60 million (so about 4 transactions per human per week).

More meaningfully, the maximum block size would rise to a final value of 20 GB.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 10, 2014, 02:46:11 PM
 #61

Right.  The majority of the miners, working together, will always have the ability to set a lower block limit.

The block limit embedded in the reference client is there to prevent massive blocks that cannot be handled in a distributed way.  Large enough blocks mean that miners who don't have a high speed connection can't keep up.

As long as the average VPS can handle the block size, then centralisation risk is low.

Miners might still decide to keep the block size small to push up fees.  There is an inherent cost for larger blocks, since they take longer to distribute (though that depends on low latency optimisations).
Limits are still the wrong word to use.

Each miner will decide which transactions to include in their block. The point at which they decide to stop adding transactions will depend on their own best guess of where it is no longer profitable to do so.

This is not a limit - it's an equilibrium.

The "centralization risk" everybody keeps talking about is an artifact of the P2P network lacking price discovery and operating entirely via donated bandwidth. Fix that problem and we'd never need to have these debates ever again.
gtraah
Sr. Member
****
Offline Offline

Activity: 420
Merit: 250



View Profile
October 11, 2014, 10:21:31 AM
 #62

Great idea, Gav. I know it is ridiculously hard to choose appropriate protocol changes when there are MANY different people thinking differently. I like that you guys took your time, discussed it with each other, and came to the conclusion that the pros outweigh the cons.

I think BTC is growing, and this will encourage innovation and more on-chain transactions and take BTC to the next level.
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1005

Let's talk governance, lipstick, and pigs.


View Profile
October 11, 2014, 10:26:36 AM
 #63

Has the dust attack threat been abated? Block size was an issue at one time.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 11, 2014, 06:25:11 PM
Last edit: October 11, 2014, 06:45:35 PM by jonny1000
 #64

Limiting block size creates an inefficiency in the bitcoin system.  Inefficiency = profit.  This is a basic law of economics, though it is usually phrased in such a way as to justify profits by pointing out that they eliminate inefficiencies.  I am taking the other position, that if we want mining to be profitable then there needs to be some artificial inefficiency in the system, to support marginal producers.  Of course that profit will attract more hashing power thus reducing/eliminating the profit, but at a higher equilibrium.  However, I am not too worried about this aspect of large block sizes.  It is a fairly minor problem and one that is a century away.

+1

Very good point hello_good_sir.  I was trying to say this but you put it in a far more articulate way. I think we may need some artificial inefficiency at some point.



If supply is not constrained, transaction fees fall to the marginal cost, mining profit falls and then miners exit and the difficulty falls.  The remaining miners can then find blocks more easily, but they don’t necessarily get compensated more for this, because the fees would still be low.
If there are fewer miners competing for the same amount of transaction fees, then each miner's revenue has increased. The process you describe will continue until the oversupply of miners is corrected and equilibrium is restored.

Yes, but a key factor to consider is: what equilibrium?  Will it be at a high enough difficulty, and if not, do we need to manipulate the market?  Users pay transaction fees for their transactions to be included in a block; users are not directly paying for network security or network consensus.  After the block reward falls, the incentive for network consensus can be considered an indirect consequence of users paying for their transactions to be included in blocks, and therefore a pure unrestricted competitive market may not be an effective mechanism for determining transaction fees.  Getting a transaction included in a block and the network reaching consensus about the longest chain are two slightly different things.  There is a mismatch here which I think some people miss.  This could be somewhat analogous to the classic tragedy-of-the-commons problem.
LiteCoinGuy
Legendary
*
Offline Offline

Activity: 1148
Merit: 1010


In Satoshi I Trust


View Profile WWW
October 11, 2014, 07:28:58 PM
 #65

...Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all...

I think what he meant was that it would be like SWIFT in that it would mostly be for large international transfers. A fork like this will have to happen sooner or later.


Sooner please. Let's do the major changes now and build the rest on top of bitcoin in other layers. There will (hopefully) be no more changes once we reach a market cap of 100 or 500 billion.

IIOII
Legendary
*
Offline Offline

Activity: 1154
Merit: 1012



View Profile
October 11, 2014, 08:51:19 PM
 #66

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I imagine that a dynamic mechanism would be much better at encouraging responsible (resource preserving) network use.

I'm very sceptical of a fixed-percentage increase, because there is zero assurance that Moore's "law" will remain true in the future. As you know, past performance is no indicator of future results, and we're quickly approaching the atomic level in storage solutions, for example. Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.
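A minimal sketch of what such a self-regulating mechanism might look like; the retarget window, 75% target fullness, clamped steps, and 1 MB floor are all hypothetical parameters, not a concrete proposal:

```python
# Hypothetical demand-driven retarget: look at average block fullness over a
# window, move the limit toward whatever would put usage at 75% of the limit,
# and clamp each step (as difficulty retargeting does) so the limit can't be
# yanked around. All parameters are illustrative.
def adjust_limit(limit_mb, recent_sizes_mb, target_fullness=0.75, max_step=2.0):
    usage = sum(recent_sizes_mb) / len(recent_sizes_mb)
    desired = usage / target_fullness               # limit putting usage at 75%
    ratio = min(max_step, max(1 / max_step, desired / limit_mb))
    return max(1.0, limit_mb * ratio)               # never below the 1 MB floor

print(adjust_limit(1.0, [0.9] * 2016))   # sustained full blocks raise the limit
print(adjust_limit(1.0, [0.1] * 2016))   # light usage: the floor holds at 1 MB
```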
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 11, 2014, 10:24:21 PM
 #67

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.

There's risk in everything and nothing is absolute.  This attitude would yield the obvious answer: "Don't ever raise the block limit at all".
hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 12, 2014, 03:47:25 AM
 #68

Just exposing some ideas:

Gavin's plan appears to me to be VERY conservative (maybe too much).

To process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks would need to go up ~1,000x at least to accommodate so many transactions.
And we don't just want to take on VISA's burden; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take 20 years to take over VISA alone, never mind the unbanked and DACs.

You're not thinking about safety.  Yes, it would be nice for bitcoin to be able to handle 2,000 times as many transactions as it can now, but that is not as important as keeping bitcoin decentralized.  Let's keep in mind why bitcoin was created: to create a digital gold standard so that people could protect their assets from central banks.  If bitcoin also becomes a ubiquitous payment system, that would be great, but not if it comes at the expense of decentralization.

hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 12, 2014, 03:50:13 AM
 #69

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.

IIOII
Legendary
*
Offline Offline

Activity: 1154
Merit: 1012



View Profile
October 12, 2014, 04:03:48 PM
 #70

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

I've read the initial post of that thread several times and I think its headline is a bit misleading. Essentially, what Peter Todd is saying is that a large block size limit in general encourages miners to drive out low-bandwidth competition. He is actually opposing Gavin's plan as well:

I primarily want to keep the limit fixed so we don't have a perverse incentive. Ensuring that everyone can audit the network properly is secondarily.

If there was consensus to, say, raise the limit to 100MiB that's something I could be convinced of. But only if raising the limit is not something that happens automatically under miner control, nor if the limit is going to just be raised year after year.

According to Peter Todd it is essential that miners do not control the block size limit. He argues based on the assumption of a rolling-average mechanism that takes its data from previously observed block sizes. But that's not an argument against a dynamic block size limit (increase/decrease) in general. The point is that the dynamic block size limit should not be able to be (substantially) influenced by miners, but instead by the transacting parties. So if it were possible to determine the dynamic block size limit based on the number of transactions multiplied by a fixed "reasonably large" size constant plus a safety margin, you would get rid of the problem.


Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.

There's risk in everything and nothing is absolute.  This attitude would yield the obvious answer: "Don't ever raise the block limit at all".

I'd better say: "Only raise the block size limit if required, and by the minimum amount necessary."
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 12, 2014, 04:27:55 PM
 #71

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
They can only do this as long as network bandwidth is donated and the consumers of it do not pay the suppliers.

Fix that problem and we'll never need to have this debate again.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 12, 2014, 06:01:13 PM
 #72

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

I've read the initial post of that thread several times and I think its headline is a bit misleading. Essentially, what Peter Todd is saying is that a large block size limit in general encourages miners to drive out low-bandwidth competition. He is actually opposing Gavin's plan as well:

It was simply because many heavy-hitters were expressing opposing views that I found the thread informative.

According to Peter Todd it is essential that miners do not control the block size limit. He argues based on the assumption of a rolling-average mechanism that takes its data from previously observed block sizes. But that's not an argument against a dynamic block size limit (increase/decrease) in general. The point is that the dynamic block size limit should not be able to be (substantially) influenced by miners, but instead by the transacting parties. So if it were possible to determine the dynamic block size limit based on the number of transactions multiplied by a fixed "reasonably large" size constant plus a safety margin, you would get rid of the problem.

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Peter linked an idea from Gavin of determining an appropriate block size from the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

I'd better say: "Only raise the block size limit if required, and by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1005

Let's talk governance, lipstick, and pigs.


View Profile
October 12, 2014, 07:04:53 PM
 #73

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
They can only do this as long as network bandwidth is donated and the consumers of it do not pay the suppliers.

Fix that problem and we'll never need to have this debate again.
It doesn't need to be fixed; it needs to be offered by suppliers. If there is demand, they will supply it. They are not supplying it because vendors and consumers are not aware of the issue. Education is what's needed. If they can be shown the profitability, they will fill the niche.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
minime
Hero Member
*****
Offline Offline

Activity: 588
Merit: 500



View Profile
October 13, 2014, 11:12:53 AM
 #74

just do it...
IIOII
Legendary
*
Offline Offline

Activity: 1154
Merit: 1012



View Profile
October 13, 2014, 12:37:50 PM
 #75

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Linked by Peter was an idea from Gavin of determining an appropriate block size by the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

My aim is a broad call for a (re)consideration of a dynamic "demand-driven" block size limit mechanism. The best adjustment estimators have yet to be determined. I think the concept should not be prematurely dismissed, because it could be highly beneficial in terms of resource preservation and hence decentralization.

The problem that selfish large miners create millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also apply a statistical correction based on IP addresses (those that frequently submit only blocks with an excessively large number of transactions get less weight).
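A small numerical illustration of why the median resists this kind of manipulation (the transaction counts below are made up):

```python
import statistics

# A few miners publishing bloated blocks drag the mean up sharply,
# while the median barely moves. Numbers are illustrative only.

honest = [900, 950, 1000, 1050, 1100]    # typical tx counts per block
attacked = honest + [100000, 100000]     # two bloated blocks injected

print(statistics.mean(attacked))    # pulled far above honest usage
print(statistics.median(attacked))  # stays at 1050, near honest usage
```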


I'd better say: "Only raise block size limit if required by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off-the-chain. However what "small" means is open to debate.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 13, 2014, 05:27:01 PM
 #76

The problem that selfish large miners create millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also apply a statistical correction based on IP addresses (those that frequently submit only blocks with an excessively large number of transactions get less weight).

This is starting to sound hairy to me.  I can easily imagine that 60% of the largest miners would benefit sufficiently from the loss of the weakest 20% of miners that it's profitable for them to all include some number of plausible-looking transactions between their addresses (thereby causing an inflated median).  I feel that anything involving IP addresses is prone to abuse and much worse than the admittedly ugly fixed-growth proposal.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off-the-chain. However what "small" means is open to debate.

My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage".  However, if you can find appealing notions of necessity or smallness, or some alternative method of balancing centralisation risk against utility that involves fewer magic numbers and less uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 13, 2014, 08:56:30 PM
Last edit: October 13, 2014, 09:28:36 PM by jonny1000
 #77

I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y bitcoin or more (or less).

For example:
If the average aggregate transaction fees in the last 100,000 blocks is 1 Bitcoin per block, or more, then there could be a 20% increase in the block size limit.  

Advantages of this methodology include:
  • This algorithm would be relatively simple
  • The limit is determined algorithmically from historic blockchain data and therefore there will be a high level of agreement over the block size limit
  • The system ensures sufficient fees are paid to secure the network in a direct way
  • It would be difficult and expensive to manipulate this data, especially if mining is competitive and decentralized
  • The limit would relate well to demand for Bitcoin usage and real demand based on transaction fees, not just volume
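A minimal sketch of the rule, with N, X and Y as placeholder values (the concrete numbers are examples for exposition, not proposals):

```python
# Hedged sketch of the fee-targeting rule: every N blocks, compare the
# average fee per block against a target Y and step the limit by X% in
# the indicated direction. All constants are illustrative.

TARGET_FEE_PER_BLOCK = 1.0   # Y, in BTC
STEP = 0.20                  # X = 20%
WINDOW = 100_000             # N blocks

def adjust_limit(current_limit, total_fees_in_window):
    avg_fee = total_fees_in_window / WINDOW
    if avg_fee >= TARGET_FEE_PER_BLOCK:
        return int(current_limit * (1 + STEP))   # fees above target: grow
    else:
        return int(current_limit * (1 - STEP))   # fees below target: shrink

print(adjust_limit(1_000_000, 120_000))  # 1.2 BTC/block average -> 1200000
print(adjust_limit(1_000_000, 80_000))   # 0.8 BTC/block average -> 800000
```

Because the input is aggregate fees over a long window of historic blockchain data, every node computes the same limit deterministically.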

I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

It could be in miners' interests to keep the block size limit small, to make the resource they are “selling” more scarce and improve profitability.  The assumption that miners would try to manipulate the block size limit upwards is not necessarily true; it depends on the balance at the time between bandwidth constraints and the need for artificial scarcity.  If Moore’s law holds then eventually the artificial-scarcity argument will become overwhelmingly more relevant than the bandwidth issues and miners may want smaller blocks.  Miners could manipulate it both ways depending on the dynamics at the time.

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However why would miners in this scenario want to manipulate the limit upwards?
Cubic Earth
Legendary
*
Offline Offline

Activity: 1176
Merit: 1018



View Profile
October 13, 2014, 10:27:44 PM
 #78

It could be in miners' interests to keep the block size limit small, to make the resource they are “selling” more scarce and improve profitability.  The assumption that miners would try to manipulate the block size limit upwards is not necessarily true; it depends on the balance at the time between bandwidth constraints and the need for artificial scarcity.  If Moore’s law holds then eventually the artificial-scarcity argument will become overwhelmingly more relevant than the bandwidth issues and miners may want smaller blocks.  Miners could manipulate it both ways depending on the dynamics at the time.

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However why would miners in this scenario want to manipulate the limit upwards?

It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down'.  They could all issue empty blocks if they wanted.  And yes, miners can also 'manipulate' the block size up.  So the lower bound for the 'manipulation' is zero.  The upper bound is the block size limit, currently at 1MB.  We all agree miners can do whatever they want within those limits.  Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes from which they may choose to make a block.  An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization.  Miners can create their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it.  But cartel or not, a max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.

My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage". 

Totally agree. 

However, if you can find appealing notions of necessity or smallness, or some alternative method of balancing centralisation risk against utility that involves fewer magic numbers and less uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.

I think MaxBlockSize will remain a magic number, and I think that is okay.  It is a critical variable that needs to be adjusted for environmental conditions, balancing, exactly as you put it teukon, [de]centralization against utility.  As computing power grows, it becomes easier to keep computational activities concealed and "decentralized".

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

So I'm saying: the future is uncertain, but there is a clear trend. Lets follow that trend, because it is the best predictor of what will happen that we have.

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).

The more accurate the projection of computing / bandwidth growth is, the less often the magic number would need to be changed.  If we project very accurately, the magic number may never need to be adjusted again.  That being said, it is safer to err on the side of caution, as Gavin has done, to make sure any MaxBlockSize formula does not allow blocks to grow bigger than the hobbyist / interested individual's ability to keep up.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 13, 2014, 10:54:15 PM
 #79

It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down'.  They could all issue empty blocks if they wanted.  And yes, miners can also 'manipulate' the block size up.  So the lower bound for the 'manipulation' is zero.  The upper bound is the block size limit, currently at 1MB.  We all agree miners can do whatever they want within those limits.  Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes of which they may choose to make a block.  An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization.  

I apologise that I was not being very clear.  I was talking about miners manipulating the block size limit upwards or downwards in the hypothetical scenario that the block size limit is determined dynamically by an algorithm, for example the one I mention above linking the block size limit to aggregate transaction fees.  What do you think of this proposal?

Miners can create their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it.  But cartel or not, a max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.

Why do you say that miners can create their own cartel to create artificial scarcity?  Perhaps they can do this, but a healthy Bitcoin network has a competitive and diverse mining industry where this may not be possible.  If miners collude together in this way then Bitcoin has more serious problems than this scalability issue.

I agree that a max block size is also helpful to keep the network decentralised and “available to the interested individual, both financially and practically speaking” as you say, however I postulate that the max size is also necessary for another reason:  

Artificial scarcity in block space -> higher aggregate transaction fees -> higher equilibrium mining difficulty -> more secure network

No scarcity in block space -> lower aggregate transaction fees (yes a higher volume, but no "artificial" profit) -> lower equilibrium mining difficulty -> less secure network
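As a toy illustration of the second chain (my own assumption of a simple downward-sloping demand curve for block space, not anything from the thread):

```python
# Toy model: if users' willingness to pay per transaction falls as block
# space grows, total fee revenue -- the future security budget -- can be
# higher under a binding cap than with effectively unlimited space.
# The demand curve below is invented purely for illustration.

def fee_revenue(capacity):
    """Marginal fee (in satoshis) falls as capacity (tx per block) grows."""
    fee = max(100 - capacity // 50, 0)   # toy linear demand curve
    return capacity * fee

print(fee_revenue(2000))  # scarce block space: 120000
print(fee_revenue(4900))  # abundant block space: 9800
```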
Cubic Earth
Legendary
*
Offline Offline

Activity: 1176
Merit: 1018



View Profile
October 14, 2014, 03:23:33 AM
 #80

I apologise that I was not being very clear.  I was talking about miners manipulating the block size limit upwards or downwards in the hypothetical scenario that the block size limit is determined dynamically by an algorithm, for example the one I mention above linking the block size limit to aggregate transaction fees.  What do you think of this proposal?

I think I did understand what you were saying.  I was trying to point out that miners already have control of the size of blocks they publish, and therefore - collectively - miners have control over how fast the blockchain grows.  But that freedom is not absolute.  There are upper and lower limits.  Since a block with a size less than zero is an absurd concept, we can safely put just a few bytes as the smallest possible block.  The biggest possible block size is what we are discussing here.  It basically serves as a check that full nodes can use against the miners: nodes can audit the service the miners are providing and otherwise connect and communicate about the state of the network.  Any proposal that gives the miners some automated way to influence the MaxBlockSize could be used to make blocks so big as to promote centralization of the nodes.  Individuals would lose their ability to audit the network.

Miners currently do influence the MaxBlockSize variable, but that influence is based on human communication, persuasion, and lobbying within the ranks of the Bitcoin community.  If MaxBlockSize were algorithmically controlled, with the formula taking as input conditions the miners had some form of control over, then MaxBlockSize could be raised or lowered by the miners directly, without the consensus of full nodes.  It would no longer be a check.



Why do you say that miners can create their own cartel to create artificial scarcity?  Perhaps they can do this, but a healthy Bitcoin network has a competitive and diverse mining industry where this may not be possible.  If miners collude together in this way then Bitcoin has more serious problems than this scalability issue.

I agree that a max block size is also helpful to keep the network decentralised and “available to the interested individual, both financially and practically speaking” as you say, however I postulate that the max size is also necessary for another reason:  

Artificial scarcity in block space -> higher aggregate transaction fees -> higher equilibrium mining difficulty -> more secure network

No scarcity in block space -> lower aggregate transaction fees (yes a higher volume, but no "artificial" profit) -> lower equilibrium mining difficulty -> less secure network

That's why I said cartel, not collude.  Perhaps I should have used the word 'association' to describe miners working together in a constructive fashion.  Miners collaborating is itself not a problem.  In fact, they work together all the time, and the shared computational output is the blockchain.  If at some future point a majority of the miners start behaving badly, the community will respond.  If the MaxBlockSize were very large, and the dynamics of the bitcoin system were causing the hashrate to fall, I would expect miners to get together and solve the problem.  That could include a miner-only agreement to only publish blocks of a certain size, to drive up fee requirements.  This is not a proposal to raise MinBlockSize.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 14, 2014, 08:23:09 AM
Last edit: October 14, 2014, 10:30:04 AM by jonny1000
 #81

Game theory suggests that under certain conditions, these types of agreements or associations are inherently unstable, as the behaviour of the members is an example of a prisoner's dilemma. Each member would be able to make more profit by breaking the agreement (producing larger blocks or including transactions at lower prices) than it could make by abiding by it.

There are several factors that will affect the miners' ability to monitor the association:

1.         Number of firms in the industry – High in a competitive mining market, with low barriers to entry and exit for potentially anonymous miners -> association difficult

2.         Characteristics of the products sold by the firms – Homogeneous -> association is possible

3.         Production costs of each member – Differing and low costs -> association difficult

4.         Behaviour of demand – Transaction volume demand is highly volatile in different periods -> association difficult
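The dominant-strategy logic behind the prisoner's dilemma claim can be made concrete with an illustrative payoff table (all numbers below are my own, purely for exposition):

```python
# Illustrative payoff check: in a two-miner cartel that agrees to restrict
# block space, each miner earns more by defecting (publishing bigger blocks
# at lower fee rates) whatever the other does -- the classic prisoner's
# dilemma that makes such agreements unstable. Payoffs are invented.

# payoffs[(me, other)] = my revenue; 'C' = abide by cartel, 'D' = defect
payoffs = {
    ('C', 'C'): 10,  # both restrict supply: high fees, shared volume
    ('C', 'D'): 2,   # I restrict, rival grabs the extra volume
    ('D', 'C'): 14,  # I defect against a restrained rival
    ('D', 'D'): 5,   # both defect: fees collapse
}

# Defection strictly dominates cooperation for each member...
for other in ('C', 'D'):
    assert payoffs[('D', other)] > payoffs[('C', other)]
# ...even though mutual defection pays less than mutual cooperation.
assert payoffs[('C', 'C')] > payoffs[('D', 'D')]
print("defection dominates")
```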
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1001


Gresham's Lawyer


View Profile WWW
October 14, 2014, 04:38:08 PM
 #82

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Linked by Peter was an idea from Gavin of determining an appropriate block size by the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

My aim is a broad call for a (re)consideration of a dynamic "demand-driven" block size limit mechanism. The best adjustment estimators have yet to be determined. I think the concept should not be prematurely dismissed, because it could be highly beneficial in terms of resource preservation and hence decentralization.

The problem that selfish large miners create millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also apply a statistical correction based on IP addresses (those that frequently submit only blocks with an excessively large number of transactions get less weight).


I'd better say: "Only raise block size limit if required by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off-the-chain. However what "small" means is open to debate.
QFT
Let's use measurement and math over extrapolations where possible.  Balance the risk to decentralization against the ease of higher transaction volume, erring in favor of decentralization: it is difficult to recover from centralizing effects.
If block bloat by conspiring miners is a concern then there can be growth caps on top of a dynamic scalability protocol too.

We have no crystal ball to tell us the future.  All we know is that we don't know.

And I'll just leave this here:
http://xkcd.com/605/



FREE MONEY1 Bitcoin for Silver and Gold NewLibertyDollar.com and now BITCOIN SPECIE (silver 1 ozt) shows value by QR
Bulk premiums as low as .0012 BTC "BETTER, MORE COLLECTIBLE, AND CHEAPER THAN SILVER EAGLES" 1Free of Government
Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652
Merit: 1939


Chief Scientist


View Profile WWW
October 14, 2014, 10:12:07 PM
 #83

I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.




How often do you get the chance to work on a potentially world-changing project?
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1001


Gresham's Lawyer


View Profile WWW
October 15, 2014, 12:42:59 AM
 #84

This blocksize increase effort is to support the interests of the merchant service companies, Circle et al.  I sympathize with their plight, but Bitcoin is not made for these first.  Bitcoin is for everyone.  There are parts of the planet (some of which have the greatest need for Bitcoin) that have very limited bandwidth today and cannot be expected to see rapid improvement.

We do need a path forward.  We need a way to scale up.  What I can't abide is the notion of picking a number based on historical data, extrapolating, and applying it to the future.  Whatever we guess, we are guaranteed to be wrong.  It's wrong now, and (since we are not facing any imminent existential crisis) unless we can do better than still being wrong, we aren't ready to contemplate hard forks.

Isn't it worth it to the future generations of Bitcoiners to get this right?  At the moment we have the luxury of time, and other developments that will further mitigate this issue are coming to give us even more time.

So... Either let the large(ish) companies that are pushing for this (through TBF) make the best use of this time to give us a path forward that will be a lasting one... or wait until the decentralized brains come up with something more future proof than a guess based on historical data.

Essentially... good work Gavin, for raising the issue and making a proposal, but more research is needed.  I have faith that you'll be able to win me over on this (as well as the others opposing it in its current form).  It's just not there yet.  I don't know the answer, and I don't think anyone else does yet, but with all of us working toward it (again thanks to you for raising the issue), we may find it.

We need to be better than the Central Bankers who get together with their economic advisers and pick numbers arbitrarily.  We need automated future-proof solutions written into open protocols that will still be working when we are long dead.  It is our responsibility being alive and here now at the beginning, to see it done right.  To do less than our best is shameful.

justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 15, 2014, 01:42:28 AM
 #85

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.
Price discovery of bandwidth is the solution.

Users want their transaction to be relayed to miners.
Miners want the transaction to reach them so they can earn the fees associated with those transactions.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

Relay nodes provide a service which everybody wants. Build a competitive open marketplace for relay nodes to offer connectivity to users on both sides of the network and then price discovery can occur. The relay nodes will get compensated for the resources they are providing, and the price signal will automatically ensure we have the right number of relay nodes.

Then we never need to have these unresolvable debates again. When block sizes and tx rates increase, there will automatically be a mechanism in place to make sure the relay nodes receive additional income which they can use to defray their rising expenses.
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1005

Let's talk governance, lipstick, and pigs.


View Profile
October 15, 2014, 01:48:21 AM
 #86

I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.




Elements that make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of any of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

acoindr
Legendary
*
Offline Offline

Activity: 1050
Merit: 1001


View Profile
October 15, 2014, 04:09:23 AM
 #87

I sympathize with their plight, but Bitcoin is not made for these first.  Bitcoin is for everyone.  There are parts of the planet (some of which have the greatest need for Bitcoin) that have very limited bandwidth today and can be expected to not see rapid improvement.

You know I'm starting to think it doesn't matter. We win either way.

In the worst case, say we overshoot and Bitcoin becomes completely centralized by powerful miners which then emulate the current SWIFT system, blocking and regulating transactions. What would happen next? Would we curse and shout CRAP! We were this close. If only we'd ratcheted down our numbers a tiny bit. Well everyone go home. Nothing more to see here.

LOL of course not. We'd move to the next alt-coin not co-opted and continue on, having learned from our mistakes. In a post I wrote long ago, which seems to have come true, I talked about how alt-coins gave a value to our community that Bitcoin by itself never could: an alternative.

The people who still say there can be only one will always be wrong. Alt-coins are not going anywhere. Most will have low market caps or blow up and deservedly die horrible deaths, but Bitcoin won't ever be all by itself. Won't happen. And if the free market demands a coin with fixed or less-than-bitcoin block size limit then that's what it will get, and value and usage will flow there.

The converse is also true. Say we are unable to gain consensus for raising the size limit, causing a collapse in price as people perceive Bitcoin as unable to serve the base they thought it would; or we proceed with a messy hard fork, creating a rift in the community and a price crash as people become confused about the future of Bitcoin and what to do next. Cryptocurrency would still go on, eventually, because that cat is out of the bag and people will continue working on it. Of course, I'd rather see the first scenario (a need to adopt an alt-coin) than the second, as I'm less certain about recovering well from the second, since cryptocurrency ultimately has no backing other than overall confidence in its viability.

Either way I see Bitcoin as providing the world with education. It's teaching the world the possibilities of decentralization with currency and that's where the real value is, because Bitcoin isn't the only thing which can work in that model.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 15, 2014, 09:29:52 AM
 #88

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.
Price discovery of bandwidth is the solution.

A bandwidth market can lead to an efficient use of bandwidth but may do nothing to address the potential tragedy of the commons concerning decentralisation.

Users want their transaction to be relayed to miners.
Miners want the transaction to reach them so they can earn the fees associated with those transactions.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

I submit that pursuit of just these policies would actually encourage centralisation.  A small number of large miners will consume fewer resources than a decentralised mass.  A single trusted data centre could be even more efficient.

Elements that make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of any of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

A market entity is not restricted to a single geographical location.  McDonald's have locations all over the world.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 15, 2014, 10:31:39 AM
 #89

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.
Price discovery of bandwidth is the solution.

A bandwidth market can lead to an efficient use of bandwidth but may do nothing to address the potential tragedy of the commons concerning decentralisation.

Users want their transaction to be relayed to miners.
Miners want the transaction to reach them so they can earn the fees associated with those transactions.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

I submit that pursuit of just these policies would actually encourage centralisation.  A small number of large miners will consume fewer resources than a decentralised mass.  A single trusted data centre could be even more efficient.
Can you define decentralization/centralization in this context?
jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 15, 2014, 10:47:43 AM
Last edit: October 15, 2014, 11:00:11 AM by jonny1000
 #90

I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.


With the current mining dynamics my proposal would not be suitable, for the reasons you suggest.  I merely suggest it as an eventual objective for when the block reward becomes low and mining hopefully becomes more decentralised, competitive and fee driven.  If mining doesn't develop this way then Bitcoin may not be sustainable in the long run anyway.  We could still keep an outer maximum-of-maximums block size limit based on bandwidth considerations, and this transaction-fee-targeting limit system could operate within it.

Whatever happens to the hash rate, total mining revenue represents the “economic value” of network security.  For example, the security of the network can currently be considered as 25 bitcoin per block, regardless of the large hash rate increases, as in theory 25 bitcoin per 10 minutes is the cost of mining.  In the future the value of the total transaction fees will represent the network security, and therefore the dynamics which determine the fees will be vital.  Having “supply” potentially grow exponentially forever may not be appropriate.

The above proposal could be a good framework for a discussion on how the dynamics of transaction fees could be determined in the future.  The system is a kind of aggregate transaction fee targeting scheme.  For example, a target of 1 bitcoin per block is around 50,000 bitcoin per annum, or 0.24% of the eventual total supply per annum.  Deciding if this is a suitable level would be difficult.  Is 0.24% high enough to secure the network, or should it be 1%?  What if the number is too high and we create an arbitrarily high amount of environmental damage?
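The per-annum arithmetic behind that ~0.24% figure can be checked directly (plain arithmetic only; the 1-bitcoin-per-block target is the post's own example):

```python
# One block every 10 minutes gives the annual fee total for a
# 1-bitcoin-per-block target, then its share of the 21M coin cap.
blocks_per_year = 6 * 24 * 365                  # 52,560 blocks
annual_fees_btc = 1.0 * blocks_per_year         # ~52,560 BTC, roughly the 50,000 quoted
share_of_supply = annual_fees_btc / 21_000_000  # ~0.25% per annum
```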
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1005

Let's talk governance, lipstick, and pigs.


View Profile
October 15, 2014, 11:25:33 AM
 #91

Elements to make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of either of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

A market entity is not restricted to a single geographical location.  McDonald's have locations all over the world.
That's not the point. McDonald's is also not a cartel. Geography plays a part in where you can have beef or pork sandwiches as well. My point is that nobody has every competitive edge in everything. Anyone that can gain a competitive edge over certain resources will attempt to exploit them. This merely addresses profitability. If you want to realize an actual profit, you will need customers, and customers need incentives.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 15, 2014, 11:43:16 AM
 #92

Users want their transaction to be relayed to miners.
Miners want the transaction to reach them so they can earn the fees associated with those transactions.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

I submit that pursuit of just these policies would actually encourage centralisation.  A small number of large miners will consume fewer resources than a decentralised mass.  A single trusted data centre could be even more efficient.

Can you define decentralization/centralization in this context?

Sure.  By centralisation here I'm referring to the gradual reduction in the number of block-generating entities.  To be clear, I claim: absent a block-size limit, this centralisation process would occur naturally and that a good relay bandwidth market would accelerate this process.

Always happy to be proven wrong; just want to give you something more concrete to work with.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 15, 2014, 11:51:57 AM
 #93

Sure.  By centralisation here I'm referring to the gradual reduction in the number of block-generating entities.  To be clear, I claim: absent a block-size limit, this centralisation process would occur naturally and that a good relay bandwidth market would accelerate this process.

Always happy to be proven wrong; just want to give you something more concrete to work with.
The number of individuals who control hashing equipment has been increasing since 2008, during the time in which the block size limit is effectively non-existent (because tx volume is too low to be affected by the limit).

Why are you predicting that this trend would reverse instead of continue?
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 15, 2014, 12:00:54 PM
 #94

Elements to make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of either of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

A market entity is not restricted to a single geographical location.  McDonald's have locations all over the world.
That's not the point. McDonald's is also not a cartel. Geography plays a part in where you can have beef or pork sandwiches as well. My point is that nobody has every competitive edge in everything. Anyone that can gain a competitive edge over certain resources will attempt to exploit them. This merely addresses profitability. If you want to realize an actual profit, you will need customers, and customers need incentives.

Certainly, McDonald's is not a monopoly; it is a single player in a competitive market.  I merely wished to highlight that geographical diversity of the factors of production alone is not sufficient to ensure decentralisation of mining.

The point that "nobody has every competitive edge in everything" is what I'm worried about.  To be clear, I'm not saying that free-market forces cause an industry to converge to monopoly, far from it.  I'm suggesting that comparing Bitcoin mining with a typical industry requires care because the miners have some influence over the very nature of Bitcoin itself.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 15, 2014, 12:12:29 PM
 #95

The number of individuals who control hashing equipment has been increasing since 2008, during the time in which the block size limit is effectively non-existent (because tx volume is too low to be affected by the limit).

Why are you predicting that this trend would reverse instead of continue?

Because I believe these trends are dependent on the negligible cost of handling blocks, costs which will become significant soon enough if the block-size growth trend continues unabated.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1006



View Profile
October 15, 2014, 12:43:39 PM
 #96

Because I believe these trends are dependent on the negligible cost of handling blocks, costs which will become significant soon enough if the block-size growth trend continues unabated.
Blocks don't just magically grow in size for no reason.

Blocks get larger because of an increased demand for Bitcoin transactions.

A larger number of bitcoin transactions means more aggregate fee revenue, which means as the block size grows the miners are competing for an expanding market.

Expanding markets attract new entrants into the market, and in the case of Bitcoin mining there is no way for incumbents to exclude competitors who produce valid proof of work.
painlord2k
Sr. Member
****
Offline Offline

Activity: 453
Merit: 254


View Profile
October 15, 2014, 01:17:03 PM
 #97

...
To be able to process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks would need to go up at least ~1,000x to accommodate so many transactions.
And we don't just want to take on VISA's burden; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take 20 years to take over VISA alone, never mind the unbanked and DACs.

You're not thinking about safety.  Yes, it would be nice for bitcoin to be able to handle 2,000 times as many transactions as it can now, but that is not as important as keeping bitcoin decentralized.  Let's keep in mind why bitcoin was created: to create a digital gold standard so that people could protect their assets from central banks.  If bitcoin also becomes a ubiquitous payment system that would be great, but not if it comes at the expense of decentralization.

Mining will need to be self-sufficient on fees alone, without block rewards, in the long run.
So, if we take the capability of the VISA network as a goal, we need at least a $0.01 fee per transaction to give the miners the same amount of money they get today. And the minimum block size needed to do so is very large, accommodating at least 4,000 transactions/second (VISA processes 8,000 transactions per second during high-traffic days).

Decentralization is the main goal, because it makes the network censorship-resistant.
But what makes it possible to maintain decentralization is openness: anyone, no matter who, can connect and join the network if they have the resources to do so.

Miners need to distribute their blocks first to other miners and then to other nodes. If they distribute their blocks to other miners first, those miners will start mining the next block earlier. If a miner were able to flash his blocks to the core nodes of the network instantly (e.g. broadcast by DVB-T), but not to the (hash rate) majority of the miners, his ability to communicate would not count for much, because the core nodes would shortly be exposed to a longer competing chain. The core nodes just need to download new blocks from the miners and/or offer them to other core nodes in order to check them; they don't mine, so they don't need new blocks within seconds of discovery.
As with many P2P networks, the system works better if different nodes organize themselves into different roles/strata.

Right now, miners (usually mining pools) make large investments in mining rigs and related hardware, software, labor, power and real estate. They can afford to buy enough bandwidth to disperse blocks between themselves in a matter of seconds. More so if there is a reverse lookup table and they mainly need to exchange references to the included transactions rather than the actual data.



The rationale for growing the block size faster now is that when the block reward is halved, miners' revenues will halve unless the price at least doubles.
Individual fees cannot grow 100x in just a few months to compensate for the reduced reward. Transactions must be able to grow in number to compensate for the diminishing reward, not become more costly for the consumer.
At the very least, there must be room to allow this growth to happen.

Right now a transaction costs $18 in miner income (reward + fees). It is obviously impossible to sustain this cost if it came directly out of users' pockets.
https://blockchain.info/it/charts/cost-per-transaction

The network must be able to handle 1,000 times the current transaction volume to bring the cost down from $18 to 1.8 cents. Currently we have around 300 KB blocks, so the block size would need to increase to 300 MB (at least). Make it 600 MB to bring the cost down to about 1 cent, and 1 GB to compete with VISA.

The network will grow proportionally faster now than in the future, when it will have captured a larger part of the payments market. The curve will resemble an asymptote toward some level (e.g. 10x VISA) or a linear increase (VISA + x transactions/year). But the steepest part will happen now and in the near future (4-6 years, no more).
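The growth and cost figures in this post are easy to sanity-check (a sketch only; the $18 miner income per transaction and the 1,000x scale factor are the post's own inputs, taken as given):

```python
import math

# Years of 50%-per-year growth needed for a ~2,000x capacity increase.
years_to_visa = math.ceil(math.log(2000) / math.log(1.5))  # ~19, matching "20 years"

# Spreading today's per-transaction miner income over 1,000x the volume.
cost_now = 18.0                  # dollars of miner income per transaction
cost_at_1000x = cost_now / 1000  # $0.018, i.e. the 1.8 cents above
```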
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1002



View Profile
October 15, 2014, 02:44:24 PM
 #98

Expanding markets attract new entrants into the market, and in the case of Bitcoin mining there is no way for incumbents to exclude competitors who produce valid proof of work.

Expanding markets do attract new entrants in general, but in Bitcoin's case growth also increases the barrier to entry, because each participant must be capable of handling all Bitcoin transactions.  The cost to the network of processing a transaction grows at least as quickly as the number of miners (assuming constant technology for simplicity).

Also, you're right that in a reasonably decentralised environment, large mining companies cannot exclude competitors through voluntary means alone.  However, with Bitcoin, there is the 50% threshold to worry about too, something absent in most markets.  In a typical free-market, if an entity accumulates more than 50% of the business then they'll keep that monopoly if and only if they continue to outperform their competition.  This entity will begin to lose market-share if they even cease to innovate, let alone try to abuse their position.  With Bitcoin, there are different incentives involved and a monopolist may well stand to gain from excluding small miners by dropping their blocks.
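The barrier-to-entry point can be made concrete with a toy cost model (purely illustrative; the linear per-transaction cost and the function name are assumptions for the sketch, not anything from the protocol):

```python
# Every validating participant must process every transaction, so the
# network-wide validation bill scales with both factors at once.
def network_validation_cost(tx_per_block, n_participants, cost_per_tx=1.0):
    return tx_per_block * n_participants * cost_per_tx
```

Doubling both transaction volume and the number of participants quadruples the aggregate cost, which is why an expanding market can raise the entry barrier even as it expands fee revenue.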
painlord2k
Sr. Member
****
Offline Offline

Activity: 453
Merit: 254


View Profile
October 15, 2014, 03:39:16 PM
 #99

Expanding markets attract new entrants into the market, and in the case of Bitcoin mining there is no way for incumbents to exclude competitors who produce valid proof of work.

Expanding markets do attract new entrants in general, but in Bitcoin's case growth also increases the barrier to entry, because each participant must be capable of handling all Bitcoin transactions.  The cost to the network of processing a transaction grows at least as quickly as the number of miners (assuming constant technology for simplicity).

Also, you're right that in a reasonably decentralised environment, large mining companies cannot exclude competitors through voluntary means alone.  However, with Bitcoin, there is the 50% threshold to worry about too, something absent in most markets.  In a typical free-market, if an entity accumulates more than 50% of the business then they'll keep that monopoly if and only if they continue to outperform their competition.  This entity will begin to lose market-share if they even cease to innovate, let alone try to abuse their position.  With Bitcoin, there are different incentives involved and a monopolist may well stand to gain from excluding small miners by dropping their blocks.

The costs of entering the mining market grow with the number of transactions and the hashing power of the network, but there are ways to collaborate that lower the cost of entry (pools).
Only the pool hub needs to pump out blocks; the individual miners do not need large bandwidth to start mining.

It is true a 51% pool could drop other miners' blocks, but it would lose a lot of support from its associated miners if it did so, because that would be detrimental to the long-term health of the network.
A large miner could invest heavily, gain 51%, and mount an attack (just dropping others' blocks), but he would jeopardize his investment in doing so, because he would become the controller and the central point of failure of the network. Essentially he would paint a big target on his back for law enforcement and assorted nasties. I bet that when someone makes an investment on the order of $180M (plus assorted ancillary costs) and needs a 150 MW power line, he consults lawyers about his business plan.
Syke
Legendary
*
Offline Offline

Activity: 3878
Merit: 1187


View Profile
October 15, 2014, 05:17:12 PM
 #100

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.

Sounds like an irrational fear to me. Big centralized miners couldn't care less about little miners. Little miners are insignificant. And what are these big centralized miners going to use to fill up these huge blocks? Fake transactions? That makes no sense. Real transactions? Then there's a real need for such huge block sizes.

Buy & Hold
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1005

Let's talk governance, lipstick, and pigs.


View Profile
October 16, 2014, 01:49:27 AM
 #101

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.

Sounds like an irrational fear to me. Big centralized miners couldn't care less about little miners. Little miners are insignificant. And what are these big centralized miners going to use to fill up these huge blocks? Fake transactions? That makes no sense. Real transactions? Then there's a real need for such huge block sizes.

Yes, they can make fake transactions to fill blocks. That would partially shut out some miners with small bandwidth: the occasional very large block might cause the smaller miners' blocks to not propagate and thus get orphaned. The low-bandwidth miners would need to make up for it by increasing their hashrate, if they have other advantages like cheap power or government subsidies. It's just another competition vector.

jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
October 16, 2014, 02:02:28 PM
Last edit: October 16, 2014, 02:19:04 PM by jonny1000
 #102

Gavin has now written part 2, Blocksize economics
https://bitcoinfoundation.org/2014/10/blocksize-economics/

New thread
https://bitcointalk.org/index.php?topic=825601.new#new
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2347


Eadem mutata resurgo


View Profile
June 23, 2015, 12:36:25 AM
Last edit: June 23, 2015, 01:55:49 AM by marcus_of_augustus
 #103

So do we have a roadmap, or is more research needed?

BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
June 23, 2015, 01:24:07 AM
 #104

So do we have a roadmap, or is more research needed?

Looks like we have multiple "roadmaps" or proposals to consider now and much more testing ahead, starting with today. Peter Todd had an interesting write up concerning todays stress test - https://gist.github.com/petertodd/8e87c782bdf342ef18fb

We Have Garzik's BIP 100 - http://gtf.org/garzik/bitcoin/BIP100-blocksizechangeproposal.pdf
Gavin's Proposal - https://github.com/gavinandresen/bitcoinxt/commit/821e223ccc4c8ab967399371761718f1015c766b

A few other proposals that are being discussed and miners looking to form consensus with developers -

Quote from: Evan Mo, CEO of Huobi
“The pool operators could actually make such a change themselves without proposing that the core developers do such a thing. Instead, we would like to express our views and concerns to the core developers, and let the community form a discussion rather than rudely cast a divergence. We are happy to see the consensus on the final improvement plan. After all, a 'forked' community is not what we are chasing after.”


Quote from: F2Pool admin Wang Chung
"We do not necessarily consider an 8 MB block size limit a temporary solution, as we cannot predict what will happen years into the future. But we do think 8 MB is enough for the foreseeable future, presumable at least for the next three years. An increase to 20 MB however, is too risky, and we do not like the proposed Bitcoin-Xt alternative either. We do, on the other hand, support BIP 100 as proposed by Jeff Garzik."



It appears 8 may be a lucky number, but only after more discussion and testing is done. The exact specifics of which proposal will be selected aren't yet clear. From reviewing the two above, I have concerns with both of them.

Acidyo
Hero Member
*****
Offline Offline

Activity: 560
Merit: 500


Will Bitcoin Rise Again to $60,000?


View Profile
June 23, 2015, 07:03:13 AM
 #105

I'm siding with Gavin's proposal all day long. We certainly need the changes he is proposing. They were running a stress test today on the blockchain and it proved that the core needs help in a number of ways.
scarsbergholden
Hero Member
*****
Offline Offline

Activity: 686
Merit: 500



View Profile
June 23, 2015, 03:00:00 PM
 #106


Thanks, good read. I got to see his perspective on transaction growth: Bitcoin should be ready for a future where an entire nation could make one transaction a day each and the system would still keep up.
Now that's the power of a good vision for the future.

jonny1000 (OP)
Member
**
Offline Offline

Activity: 127
Merit: 12



View Profile
June 23, 2015, 09:22:46 PM
Last edit: June 27, 2015, 11:34:03 AM by jonny1000
 #107

The game theory idea that miners will keep including more transactions in blocks, which is in each miner's own selfish interest but against the interests of the industry as a whole, is not just a theory; it is happening today in many commodity markets, like oil.

http://www.docdroid.net/14gw5/oil.pdf.html

This will eventually destroy the mining industry if the block size is too large.  The move by Gavin to push ahead with 8 GB blocks at this point is very dangerous.  Who knows what demand will be like in the 2030s?
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1001


Gresham's Lawyer


View Profile WWW
June 23, 2015, 10:02:56 PM
 #108

I'm siding with Gavin's proposal all day long. We certainly need the changes he is proposing. They were running a stress test today on the blockchain and it proved that the core needs help in a number of ways.

Is there some way you think Gavin's proposal is superior to BIP 100?

FREE MONEY1 Bitcoin for Silver and Gold NewLibertyDollar.com and now BITCOIN SPECIE (silver 1 ozt) shows value by QR
Bulk premiums as low as .0012 BTC "BETTER, MORE COLLECTIBLE, AND CHEAPER THAN SILVER EAGLES" 1Free of Government
Cubic Earth
Legendary
*
Offline Offline

Activity: 1176
Merit: 1018



View Profile
June 26, 2015, 07:12:32 AM
 #109

I'm amazed this thread was started over 8 months ago!  I am glad to see Gavin forging ahead, and I hope the other CoreDevs eventually come around and lend their support to the chain with larger blocks.  They may not ever agree it was the best strategic move, but Bitcoin has many other fronts that need work and development, and their extensive knowledge of crypto-systems will be needed moving forward.  I think the "keep-it-small" group is correct, in a very technical sense.  But Bitcoin is more than just technology; it is also a political movement that needs broad support to reach its maximum effect, and I think that Gavin is more savvy, by far, to that side of things than most of the other devs.  Bitcoin does not necessarily need to be designed to withstand large-scale government-style attack.  It could be designed so, and that threat of 'could' alone is a powerful deterrent.  Yes, we have the technology to hide blocks in Tor, with everything encrypted and miners in secret locations, etc, etc, etc.  But the more widely Bitcoin moves into the mainstream, the less such techniques will be required.  Even proof-of-work mining may be altered.  It is a means to a secure system, not an end.  If more efficient ways of having a secure system are devised, perhaps the amount of energy used to secure the chain can be safely reduced.

Bitcoin is not secure in the abstract, it is secure against certain threats from certain directions.  As the threat model changes, the security engineering of Bitcoin can, and should, change as well.  Reaching 'mass adoption' quickly is, I believe, a valid way of making Bitcoin more secure.  Not via a 'technical' approach, but through a 'social' one.
LiteCoinGuy
Legendary
*
Offline Offline

Activity: 1148
Merit: 1010


In Satoshi I Trust


View Profile WWW
June 27, 2015, 09:19:04 AM
 #110


NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1001


Gresham's Lawyer


View Profile WWW
June 27, 2015, 02:24:33 PM
 #111

I'm amazed this thread was started over 8 months ago!  I am glad to see Gavin forging ahead, and I hope the other CoreDevs eventually come around and will lend their support to the chain with larger blocks. 

If you haven't been keeping up, Gavin has a BIP for Bitcoin XT and there is another BIP for Bitcoin Core.

So both clients have a proposal in progress.

http://gtf.org/garzik/bitcoin/BIP100-blocksizechangeproposal.pdf

https://github.com/gavinandresen/bitcoinxt/tree/blocksize_fork

TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1014


View Profile
June 27, 2015, 03:17:59 PM
 #112


Technically, they are both potentially proposals for core.

BIP-100 isn't actually being pushed, I think, it is just a suggestion.

There is no proposal that is accepted by all core devs.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
GingerAle
Legendary
*
Offline Offline

Activity: 1246
Merit: 1008


View Profile WWW
July 01, 2015, 04:02:16 AM
 #113

I don't see why you guys don't work on researching blockrope.

JeromeL
Member
**
Offline Offline

Activity: 554
Merit: 11

CurioInvest [IEO Live]


View Profile
July 03, 2015, 09:36:44 PM
 #114

So do we have a roadmap, or is more research needed?

Looks like we have multiple "roadmaps" or proposals to consider now and much more testing ahead, starting with today. Peter Todd had an interesting write up concerning todays stress test - https://gist.github.com/petertodd/8e87c782bdf342ef18fb

We Have Garzik's BIP 100 - http://gtf.org/garzik/bitcoin/BIP100-blocksizechangeproposal.pdf
Gavin's Proposal - https://github.com/gavinandresen/bitcoinxt/commit/821e223ccc4c8ab967399371761718f1015c766b

A few other proposals that are being discussed and miners looking to form consensus with developers -

Quote from: Evan Mo, CEO of Huobi
“The pool operators could actually make such a change themselves without proposing that the core developers do such a thing. Instead, we would like to express our views and concerns to the core developers, and let the community form a discussion rather than rudely cast a divergence. We are happy to see the consensus on the final improvement plan. After all, a 'forked' community is not what we are chasing after.”


Quote from: F2Pool admin Wang Chung
"We do not necessarily consider an 8 MB block size limit a temporary solution, as we cannot predict what will happen years into the future. But we do think 8 MB is enough for the foreseeable future, presumable at least for the next three years. An increase to 20 MB however, is too risky, and we do not like the proposed Bitcoin-Xt alternative either. We do, on the other hand, support BIP 100 as proposed by Jeff Garzik."



It appears 8 may be a lucky number, but only after more discussion and testing is done. The exact specifics of which proposal will be selected aren't yet clear. From reviewing the two above, I have concerns with both of them.



And Greg Maxwell said he was going to produce a BIP (?) soon on the promising flexcap proposal. Can't wait to see it.
