Bitcoin Forum
Author Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive  (Read 14267 times)
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 02:38:11 PM
 #141

Are you simply unaware of the other ways scalability is limited and only focused on this one?
We can go into it if you like.
I was looking to keep this discussion on the more narrow issue.

Start a new thread.  HAVE you read my Scalability Roadmap blog post?

I read it; I offered some criticisms in the thread by that title a while back.

It is nice and theoretical.  There are practical things it misses (such as the zero-cost mining that does occur in the real world from time to time, when the equipment is not owned by the person controlling it).
There are also non-economic actors who do things for reasons other than money, and actors working in larger economies (in which Bitcoin is only a minor part) with different agendas entirely.

It was a nice blog post and explained things in a simple way under ideal conditions.  I would refer people who need a primer on the matter to it.


In basic physics we give students problems that assume they are operating in a vacuum.  Basic economics also does this.  The real world is more complex.

naplam
Sr. Member
****
Offline Offline

Activity: 252
Merit: 250

Coin Developer - CrunchPool.com operator


View Profile WWW
October 23, 2014, 02:47:56 PM
 #142


I don't know why there's so much discussion about the max block size when the real issue should be how to increase adoption (more people willing to do more transactions and pay more fees to keep the network secure) so that Bitcoin is sustainable once the block subsidy is much lower or zero.

btchris
Hero Member
*****
Offline Offline

Activity: 672
Merit: 504

a.k.a. gurnec on GitHub


View Profile WWW
October 23, 2014, 02:48:13 PM
 #143

NewLiberty, I have a quick question for you which will hopefully clarify your position in my mind.

Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?
amaclin
Legendary
*
Offline Offline

Activity: 1260
Merit: 1019


View Profile
October 23, 2014, 02:49:38 PM
 #144

Quote
It is certainly true that nobody can predict the future with 100% accuracy.

I can.
Bitcoin will die in 5 months.
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 02:55:56 PM
 #145

NewLiberty, I have a quick question for you which will hopefully clarify your position in my mind.

Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?

Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 02:57:50 PM
Last edit: October 23, 2014, 03:10:19 PM by NewLiberty
 #146


I don't know why there's so much discussion about the max block size when the real issue should be how to increase adoption (more people willing to do more transactions and pay more fees to keep the network secure) so that Bitcoin is sustainable once the block subsidy is much lower or zero.

There are WAY more people working on the adoption issue than there are on this one.  If your point is that I should go do something else: granted.  If this is done properly, I would certainly be doing something else.  But consider that doing it properly is itself accretive to adoption.

In later days, Bitcoin will be supported by its transaction fees.  Currently, fees make up about 1/300th of the miner payment.
We are doing about 1 tx per second or so, and the limit is about 7 tx per second, so now is the time to address this.

It takes time to do things right.  The alternative is that we just patch it and move on, sweeping the problem under the rug until the next time it needs to be patched.  I think that outcome is likely, and that we (or our children) may regret it.
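
A rough back-of-envelope check of those figures (a minimal sketch; the ~250-byte average transaction size and the fee level are assumptions for illustration, not measurements from this thread):

Code:
# Rough check of the throughput and fee figures above.
# Assumed values: 1 MB limit, ~250-byte average transaction (assumption),
# 10-minute block target, 25 BTC subsidy, ~0.08 BTC of fees per block (assumption).
MAX_BLOCK_BYTES = 1_000_000
AVG_TX_BYTES = 250
BLOCK_INTERVAL_SECONDS = 600

txs_per_block = MAX_BLOCK_BYTES / AVG_TX_BYTES              # ~4000 txs per block
tx_per_second_limit = txs_per_block / BLOCK_INTERVAL_SECONDS
print(f"capacity limit: ~{tx_per_second_limit:.1f} tx/s")   # ~6.7 tx/s

subsidy_btc = 25.0
fees_btc = 0.08
fee_fraction = fees_btc / (subsidy_btc + fees_btc)
print(f"fees are ~1/{1 / fee_fraction:.0f} of the miner payment")  # ~1/314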

btchris
Hero Member
*****
Offline Offline

Activity: 672
Merit: 504

a.k.a. gurnec on GitHub


View Profile WWW
October 23, 2014, 03:01:06 PM
 #147

Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 03:07:00 PM
 #148

Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can reduce the data actually transferred and stored; the block size is measured after decompression.

btchris
Hero Member
*****
Offline Offline

Activity: 672
Merit: 504

a.k.a. gurnec on GitHub


View Profile WWW
October 23, 2014, 03:13:42 PM
 #149

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can reduce the data actually transferred and stored; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 03:20:16 PM
 #150

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can reduce the data actually transferred and stored; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?

The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.

btchris
Hero Member
*****
Offline Offline

Activity: 672
Merit: 504

a.k.a. gurnec on GitHub


View Profile WWW
October 23, 2014, 03:29:15 PM
 #151

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can reduce the data actually transferred and stored; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?

The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.

Very well.

No metric that can be gleaned from the blockchain has a causal relationship with "the bandwidth and disk space an average enthusiast can afford", and therefore any such predictor has a high danger of being either too restrictive or not restrictive enough.

Using Nielsen's Law also has a danger of being inaccurate; however, given that it has at least been historically accurate, I find this danger much lower.

Do you disagree? (let's leave ossification out of this just for the moment, if you don't mind)
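
For concreteness, a minimal sketch of what a fixed, Nielsen's-law-style schedule looks like, using the 50%-per-year figure from the thread title; the starting size and start year below are assumptions, not part of any specific proposal:

Code:
# Fixed exponential schedule in the spirit of Nielsen's law.
# Starting size and start year are assumptions for illustration.
START_YEAR = 2015
START_MAX_MB = 20.0
ANNUAL_GROWTH = 0.50   # 50% per year

def scheduled_max_block_mb(year):
    """Max block size implied by a fixed annual growth rate."""
    return START_MAX_MB * (1 + ANNUAL_GROWTH) ** max(0, year - START_YEAR)

for y in (2015, 2020, 2025, 2035):
    print(y, f"{scheduled_max_block_mb(y):,.0f} MB")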
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 04:37:28 PM
 #152

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can reduce the data actually transferred and stored; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?

The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.

Very well.

No metric that can be gleaned from the blockchain has a causal relationship with "the bandwidth and disk space an average enthusiast can afford", and therefore any such predictor has a high danger of being either too restrictive or not restrictive enough.

Using Nielsen's Law also has a danger of being inaccurate; however, given that it has at least been historically accurate, I find this danger much lower.

Do you disagree? (let's leave ossification out of this just for the moment, if you don't mind)

Thank you.  You saved yourself a lot of time.  I had enough of the Socratic method in law school.  And we'll set aside ossification for your benefit, even though it cuts against your position here.

Yes, I disagree. 
Both block size and transaction fee may be better tools than Nielsen's law, and the combination may be better still.  Dismissing inquiry on the matter is a missed opportunity.

Having worked in multinational telcos for a few decades designing resilient, scalable systems serving 193+ countries, managing teams of security software engineers, and holding responsibility for security and capacity management, the concepts are not so foreign.  The ability of something like the block chain to provide consolidated data for rightsizing applications over time for their audience is a ripe fruit.


Nielsen's law is less fit for purpose.
1) It has measured fixed-line connections.
- Usage demographics have changed over the period of history it covers.  More connections are mobile now than previously, and telco resources and growth have shifted.  There are other shifts to come.  These are not accommodated in the historical averages, nor are they factored into the current ones under Nielsen.

2) It is not a measure of the average enthusiast.
- It measures a first world enthusiast, whose means have improved with age, in a rich nation with good infrastructure in time of peace.  This is not average with respect to place and time through history.

3) Following bandwidth growth is not the only function of max block size, though tying it to the average enthusiast's capabilities (if that were possible) would be a suitable way of addressing other functions.
- Ultimately it must accommodate transactions carrying sufficient fees to maintain the network, and it must not constrain reasonable commerce.  These will be business decisions which may depend on the capacity and cost of the Bitcoin network and its associated fees, and they may radically bend the curve one way or the other.  A fixed, non-responsive rate cannot be flexible to a changing environment, and requiring central decision makers to accommodate changes (or not) puts perverse incentives on Bitcoin developers.

I get that the core devs (and former core devs) do have to deal with a lot of crazies.  But what is not needed is the "either you agree with me or you are stupid, crazy, or lazy" dismissal of doing real science instead of mere technicians' work.  Science is hard, but it is often worth it.

I recall Gavin's talk in San Jose in 2013 being a lot more nuanced on this matter, and it looked like there were real solutions coming, with a future-proof, market-sensitive approach.  That conference was better in many ways than the one TBF did this year in Amsterdam.

That earlier stance was optimistic and well founded, yet it was abandoned.  The explanations for why it was abandoned don't seem compelling at all.


In my first proposal in this thread, I mirrored Gavin's Nielsen's-law approach with a simple algorithm that replicated it in effect but took its cues from the block chain (so growth would stop or accelerate if real-world circumstances changed).  This was simply an exercise to show that it would be easy enough to do.
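
For illustration only, here is a hypothetical sketch of the general shape of a rule that takes its cues from the block chain; it is not the proposal referred to above, and every parameter in it is an assumption:

Code:
from statistics import median

# Hypothetical demand-responsive cap, recomputed every difficulty period.
# HEADROOM, MAX_STEP_UP and the no-shrink rule are all assumptions.
BLOCKS_PER_PERIOD = 2016
HEADROOM = 2.0          # keep the cap roughly 2x observed usage
MAX_STEP_UP = 1.016     # ~50%/year when compounded over ~26 periods

def next_max_block_size(current_max, recent_block_sizes):
    """Cap for the next period, driven by observed block sizes."""
    target = HEADROOM * median(recent_block_sizes)
    return int(min(max(target, current_max), current_max * MAX_STEP_UP))

# Example: blocks averaging ~300 kB under a 1 MB cap leave the cap unchanged.
print(next_max_block_size(1_000_000, [300_000] * BLOCKS_PER_PERIOD))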

jonny1000
Member
**
Offline Offline

Activity: 129
Merit: 13



View Profile
October 23, 2014, 06:08:28 PM
 #153

Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.

Gavin
Thanks for being so responsive on this issue, although I am still not fully convinced by the blocksize economics post.

Suppliers of "commodities" need to make a profit, and in this case, if mining is competitive, the difficulty will adjust and miners' profit will reach a new equilibrium level.  The question is: what is the equilibrium level of difficulty?  Letting normal market forces work means the price reaches some level; however, this market has a "positive externality", which is network security.  Using an artificially low blocksize limit could be a good, effective and transparent way of manipulating the market to ensure network security.

Network security can be looked at in two ways:
1.   The network hashrate
2.   Aggregate mining revenue per block (as in theory at least, the cost of renting the bitcoin network’s hashrate to attack it could be related to mining revenue)

Mining revenue is therefore an important factor in network security.  Please try to consider this carefully when considering the maximum blocksize issue.  To be clear, I am not saying it shouldn't increase above 1MB; I think it should.  However, please consider mining incentives once the block reward falls as one of the factors.  Bandwidth and the related technical issues should not be the only consideration.
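
To make the revenue point concrete, a small sketch; the flat fee level is purely an assumption, chosen only to show how the fee share of revenue grows as the subsidy halves:

Code:
# Aggregate mining revenue per block = subsidy + fees.
# Halving years are approximate; the fee level is an assumption.
SUBSIDY_BY_ERA = {2014: 25.0, 2016: 12.5, 2020: 6.25, 2024: 3.125}
ASSUMED_FEES_PER_BLOCK = 0.1   # BTC, held flat purely for illustration

for year, subsidy in SUBSIDY_BY_ERA.items():
    revenue = subsidy + ASSUMED_FEES_PER_BLOCK
    print(f"{year}: {revenue:.3f} BTC/block, "
          f"fees = {ASSUMED_FEES_PER_BLOCK / revenue:.1%} of revenue")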
btchris
Hero Member
*****
Offline Offline

Activity: 672
Merit: 504

a.k.a. gurnec on GitHub


View Profile WWW
October 23, 2014, 06:39:25 PM
Last edit: October 23, 2014, 07:10:56 PM by btchris
 #154

NewLiberty, thanks again for taking the time to explain your point of view.

The reason, by the way, I was asking the earlier questions was because I actually didn't know the answers. In particular, this answer (happily) surprised me:

Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?

Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

Regarding Nielsen's law:
Yes, I disagree.  
Both block size and transaction fee may be better tools than Nielsen's law, and the combination may be better still.  Dismissing inquiry on the matter is a missed opportunity.

I don't disagree that Nielsen's law is inaccurate; however, I remain quite skeptical that there's something in the blockchain that can more accurately predict grandma's computing resources. Having said that, I think I'm misunderstanding your goal here (and I'm maybe OK with that): it seems as though you're not interested in using grandma's computing resources as a block size limit; you'd prefer a much lower bound at times when transaction volume isn't growing.

My biggest concern with the alternatives discussed in this thread isn't the potential for unchecked growth, but rather the potential for miners creating forced artificial scarcity (hence my first question, for which I expected a different response).

For example, in the first algorithm you suggested, a majority mining cartel could artificially limit the max block size, preventing a mining minority from including transactions. It's this lack of free-market choice that I'd disagree with.

If the difference between average block size and max block size were an order of magnitude or two, I'd find it much more agreeable.

My (ideal) goals, in particular, would be to (1) never kick out grandma, and (2) never prevent a miner from including a legitimate transaction. (edited to add: those are in priority order)
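
To illustrate that concern with a toy example (a made-up vote-style rule, not any specific proposal from this thread):

Code:
from statistics import median

def cap_from_miner_votes(votes_bytes):
    """Toy rule: next cap = median of sizes 'voted' by recent blocks."""
    return int(median(votes_bytes))

honest = [1_000_000] * 49   # 49% of blocks vote for a 1 MB cap
cartel = [300_000] * 51     # a 51% cartel votes to keep blocks small
print(cap_from_miner_votes(honest + cartel))   # 300000 -- the cartel wins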
acoindr
Legendary
*
Offline Offline

Activity: 1050
Merit: 1002


View Profile
October 23, 2014, 06:49:30 PM
 #155

NewLiberty, you seem to be ignoring me.

Your sticking point, in my mind, is less about solving this issue than it is that you feel people are not taking adequate time to find an input-based solution to "fix it right".

As I said before, my goal isn't to be right. It's to find a solution which can pass the community so we're not stuck. Ideally it also meets Bitcoin's promises of decentralization and global service. I made a bullet-point list outlining my thinking on the two proposals, but please note I didn't refer to any specific plan from you. I said any input-based solution, which includes any that take accurate measurements too - a lack of consideration in uncovering such measurements isn't relevant. I fundamentally think that approach wouldn't work as well, for reasons I outlined.

Would you make a bullet point list of your likes and dislikes on the two proposed paths so we can at least see in a more granular way where our beliefs differ?
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 23, 2014, 07:07:35 PM
 #156

NewLiberty, you seem to be ignoring me.

Your sticking point, in my mind, is less about solving this issue than it is that you feel people are not taking adequate time to find an input-based solution to "fix it right".

As I said before, my goal isn't to be right. It's to find a solution which can pass the community so we're not stuck. Ideally it also meets Bitcoin's promises of decentralization and global service. I made a bullet-point list outlining my thinking on the two proposals, but please note I didn't refer to any specific plan from you. I said any input-based solution, which includes any that take accurate measurements too - a lack of consideration in uncovering such measurements isn't relevant. I fundamentally think that approach wouldn't work as well, for reasons I outlined.

Would you make a bullet point list of your likes and dislikes on the two proposed paths so we can at least see in a more granular way where our beliefs differ?
Oh?  And I thought you were ignoring me.

I understand your goal, and your ossification fears.  I don't mean to be ignoring you; I only thought this was already fully addressed.

If your ossification fears are justified (and they may be), then (I would argue) it is more important to do it right than to do it fast, as the ossification would be progressive and changes more difficult in years to come.
I understand your position to be that a quick fix to patch this element is needed, that we are at a crisis, and it may be now or never.
I disagree.  If it were a crisis (even in an ossified state), consensus would be easy, and even doing something foolish would be justified and accepted broadly.

Unless you are Jeremy Allaire, I probably want this particular issue fixed even more than you do, but I would rather see it fixed for good and all than continuously twiddled with over the decades to come.

To your bullet point assignment...  maybe.
One of my publishers has been pestering me for a paper so I will likely 'write something'.  I'll try not to point to it and say "but didn't you read this" as if it were the definitive explanation of everything, because it surely will not be that.

BitMos
Full Member
***
Offline Offline

Activity: 182
Merit: 123

"PLEASE SCULPT YOUR SHIT BEFORE THROWING. Thank U"


View Profile
October 23, 2014, 07:44:24 PM
 #157

Finally, good news; even Satoshi said the 1MB limit was temporary.

But easier would be to square the max block size at every block reward halving, to keep Bitcoin simple...

50 BTC = 1 MB
25 BTC = 2 MB
12.5 BTC = 4 MB
6.25 BTC = 16 MB
3.125 BTC = 256 MB


and so on.

(edit by me in bold)

you're welcome.

money is faster...
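
A minimal sketch reproducing the quoted schedule; the 1 MB to 2 MB first step is taken from the quote as written, since squaring 1 would leave it at 1:

Code:
subsidy_btc, max_block_mb = 50.0, 1.0
for _ in range(5):
    print(f"{subsidy_btc:g} BTC -> {max_block_mb:g} MB")
    subsidy_btc /= 2
    # square the cap each halving; the first step doubles, per the quoted table
    max_block_mb = 2.0 if max_block_mb == 1.0 else max_block_mb ** 2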
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
October 23, 2014, 09:02:06 PM
 #158

I dunno; here I am watching for blocks at or near the 1MB limit and along comes this ... https://blockchain.info/block-height/326639 -- it just seems strange to me.  Apparently the miner couldn't be bothered to include even one transaction other than the coinbase in the block?  Could the pool have been empty from his point of view?

Miner algorithm: listen for a block to be broadcast and immediately begin searching for the next block with only their coinbase transaction in it, ignoring all other transactions.  Is there some sort of advantage to ignoring the other transactions?
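
A sketch of the behaviour being described, in pseudocode; the helper names are hypothetical, and this is not claimed to be any particular pool's implementation:

Code:
# Hypothetical sketch: start hashing on a new tip immediately with only a
# coinbase transaction, skipping validation and transaction selection.
def on_new_block_header(prev_block_hash, my_address):
    coinbase = make_coinbase_tx(pay_to=my_address)        # hypothetical helper
    template = build_block(prev_block_hash, [coinbase])   # hypothetical helper
    start_hashing(template)                               # hypothetical helper
    # Advantage: no time spent validating the previous block or picking
    # transactions, so hashing on the new tip starts a little sooner.
    # Cost: the block forfeits all transaction fees (small in 2014 terms).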
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
October 23, 2014, 09:05:43 PM
 #159

Hmm, it came only 19 seconds (if the timestamps can be trusted) after the previous one; lucky guy.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
October 23, 2014, 09:13:09 PM
 #160

I'm trying to build a JMT queuing model of Bitcoin.  What is the measured distribution (http://en.wikipedia.org/wiki/Probability_distribution) of times between blocks?  The points at https://blockchain.info/charts/avg-confirmation-time are averaged over 24 hours, which isn't helping me see it.  I know the target is 10 minutes, but it's clear that is not being achieved consistently.
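
For what it's worth, with roughly constant hashrate the inter-block times should be approximately exponential with a ~600-second mean, which a quick simulation illustrates (illustrative only, not measured data):

Code:
import random

# Simulate inter-block gaps as exponential with mean 600 s (constant hashrate).
random.seed(1)
gaps = [random.expovariate(1 / 600) for _ in range(100_000)]
print(f"mean gap ~{sum(gaps) / len(gaps):.0f} s")
print(f"P(gap < 20 s) ~{sum(g < 20 for g in gaps) / len(gaps):.1%}")   # ~3%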