Bitcoin Forum
Author Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive  (Read 14288 times)
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 17, 2014, 05:20:54 PM
Last edit: October 17, 2014, 05:53:20 PM by NewLiberty
 #61

It would seem that there could be a simple mathematical progressive increase/decrease, based on the factual block chain needs and realities of the time, that can work forever into the future.

Here is an example that can come close to Gavin's first proposal of 50% increase per year.

If the average block size of the last 2 weeks is 60-75% of the maximum, increase the maximum 1%; if >75%, increase 2%.
If the average block size of the last 2 weeks is 25-40% of the maximum, decrease the maximum 1%; if <25%, decrease 2%.

Something like this would have no external dependencies, would adjust based on whatever future events may come, and won't expire or need to be changed.

These percentage numbers are ones that I picked arbitrarily.  They are complete guesses, and so I don't like them any more than any other number.  This is just to create a model of the sort of thing that would be better than extrapolating.  To do even better, we can run a regression analysis of previous blocks to see where we would be now and tune it further from there.
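To make the shape of that rule concrete, here is a minimal sketch (Python, written for this post; the thresholds and step sizes are the same arbitrary sample numbers as above, not tuned values):

```python
def adjust_max_block_size(avg_size: float, max_size: float) -> float:
    """One two-week adjustment of the block size cap.

    `avg_size` is the average block size over the last two weeks and
    `max_size` the current cap; thresholds and step sizes are the
    arbitrary sample numbers from the post, not a finished spec.
    """
    utilization = avg_size / max_size
    if utilization > 0.75:
        return max_size * 1.02   # blocks nearly full: grow the cap 2%
    if utilization >= 0.60:
        return max_size * 1.01   # moderately full: grow 1%
    if utilization < 0.25:
        return max_size * 0.98   # nearly empty: shrink 2%
    if utilization <= 0.40:
        return max_size * 0.99   # lightly used: shrink 1%
    return max_size              # 40-60%: leave the cap unchanged
```

Because each step looks only at chain data from the previous period, the rule needs no external inputs and never expires.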

This may be manipulable:  miners with good bandwidth can start filling the blocks to capacity, to increase the max and push miners with smaller bandwidth out of competition.

Agreed.  And thank you for contributing.

It is offered as an example of the sort of thing that can work, rather than a finished product.
It is merely "better" not best.  I don't think we know of something that will work yet.
By better, I mean that Gavin gets his +50%/year, iff it is needed, and not if it isn't.  And if circumstances change, so does the limit.

If it is 100% manipulated, it is only as bad as Gavin's first proposal. (+4% or so)
That of course could only happen if miners with good bandwidth win all blocks and also want to manipulate.

If we fear manipulation, we can add anomaly dropping and exclude the 10% of blocks most extreme relative to the standard variance (so that fully padded and empty blocks are dropped out of the calculations).
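One possible shape for that anomaly dropping, as a hedged sketch (the 10% figure is the sample number from above, and distance-from-median stands in for "outside of standard variance"):

```python
from statistics import median

def trimmed_average(block_sizes, drop_fraction=0.10):
    """Average block size after dropping the most extreme outliers.

    Discards the `drop_fraction` of blocks furthest from the median,
    so fully padded and completely empty blocks fall out of the cap
    calculation instead of dragging the average around.
    """
    m = median(block_sizes)
    by_distance = sorted(block_sizes, key=lambda s: abs(s - m))
    keep = by_distance[: max(1, int(len(by_distance) * (1 - drop_fraction)))]
    return sum(keep) / len(keep)
```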

It would be good to avoid creating any perverse incentives entirely wherever possible.

And again, the percentages chosen here are samples only, arbitrarily chosen.  A regression analysis of the block chain ought to be employed to determine where we would be with this sort of thing, as well as how it would affect the path forward.


The point here is to allow market forces to dictate.  If some miners want to shrink block size to make transactions more precious and extract fees, others will want to get those fees and increase block size.  We want something that can work in perpetuity, not a temporary fix which may get adjusted centrally whenever the whim arises.

Our guide must be math and measurement, not central committees, no matter how smart they may be.

FREE MONEY1 Bitcoin for Silver and Gold NewLibertyDollar.com and now BITCOIN SPECIE (silver 1 ozt) shows value by QR
Bulk premiums as low as .0012 BTC "BETTER, MORE COLLECTIBLE, AND CHEAPER THAN SILVER EAGLES" 1Free of Government
acoindr
Legendary
*
Offline Offline

Activity: 1050
Merit: 1002


View Profile
October 17, 2014, 05:55:38 PM
 #62

I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).

It is offered as an example of the sort of thing that can work, rather than a finished product.

This is the problem.

People don't seem to realize Gavin's proposal may be the best we can do. I happen to think it is. If anyone had a better idea we'd have heard it by now. We, the entire community, have brooded on this issue for months if not years now. Here is a spoiler alert: nobody can predict the future.

Did anyone ever stop to think Bitcoin couldn't work? I mean, I have; not for reasons technological, but for reasons of solving issues via consensus. Have you ever watched a three-legged race, where one person's leg gets tied to the leg of another? The reason they're funny is that it's hard to coordinate two separate thinking entities with different ideas on how to move forward, the result being slow or no progress and falling over. That may be our fate, and progress gets harder the more legs get tied in. That's the reason for taking action sooner rather than later.

I've posted it before, but I'll say it again. I think a big reason Satoshi left is that he took Bitcoin as far as he could. With Gavin and other devs coming on board, he saw there was enough technical expertise to keep Bitcoin moving forward. I don't think he thought he had any more ironclad, valuable ideas to give Bitcoin. Its fate would be up to the community/world he released it into. Bitcoin is an experiment. People don't seem to want to accept that, but it is. What I'd love to see is somebody against Gavin's proposal offer an actual debatable alternative. Don't just say: sorry, it has to be 1MB blocks, and as for what else, that's not our problem. And don't just say: no, we don't want Gavin's proposal because it doesn't matter-of-factly predict the future, and as for what else, we don't know.

Come up with something else or realize we need to take a possibly imperfect route, but one which could certainly work, so that we take some route at all.
calim
Newbie
*
Offline Offline

Activity: 17
Merit: 0


View Profile
October 17, 2014, 07:33:57 PM
 #63

I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 17, 2014, 08:55:34 PM
 #64

I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.

This.


To imagine I (or anyone) can predict the future would be engaging in hubris.
Thanks to Satoshi we do not have to predict anything, because the block chain will be there, in the future, telling us what is needed.

I've offered one option.  Heard one good criticism and responded to that with a modification that I think will resolve the concern.
Then outlined one research task to help further refine this option (regression testing with the block chain).

There is more work to be done here, that much is clear.  There are graphs to be plotted, data to be crunched, and code to be written.  There are a LOT of smart folks engaged in this; who else can step up with a critique, or spare some cycles to work on the data?

jgarzik
Legendary
*
qt
Offline Offline

Activity: 1596
Merit: 1100


View Profile
October 17, 2014, 11:51:46 PM
 #65

It would seem that there could be a simple mathematical progressive increase/decrease, which is based on the factual block chain needs and realities of the time that can work forever into the future.

This can be easily gamed by stuffing transactions into the blockchain, shutting out smaller players prematurely.


Jeff Garzik, Bloq CEO, former bitcoin core dev team; opinions are my own.
Visit bloq.com / metronome.io
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 18, 2014, 12:03:47 AM
Last edit: October 18, 2014, 03:26:12 AM by NewLiberty
 #66

It would seem that there could be a simple mathematical progressive increase/decrease, which is based on the factual block chain needs and realities of the time that can work forever into the future.

This can be easily gamed by stuffing transactions into the blockchain, shutting out smaller players prematurely.
Thank you for contributing.
This was already mentioned earlier; you may have missed it.  Yes, it can possibly be gamed in the way you mention; it is just unlikely, unprofitable, and ineffective to do so.

The effect of such an "attack" is limited by:
1) Anomaly dropping
2) The % of blocks won
3) The disadvantage to those that do so by requiring transmission of larger blocks
4) Even if this "attack" is performed with 100% success by all miners, the max size grows only a bit over 50% per year anyway (with the proposed numbers; so in the worst-case scenario, it is about the same as Gavin's proposal)
5) A counter-balance: other miners may want to shrink the limit and make inclusion in a block more valuable
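A quick arithmetic check on point 4, assuming the sample 2% maximum step and a 2016-block (~two-week) adjustment period; on those numbers the compounded worst case lands in the same general band as Gavin's +50%/year:

```python
# Worst case: every ~two-week retarget triggers the maximum +2% step.
adjustments_per_year = 365 * 144 / 2016   # ~26.1 retargets per year
worst_case = 1.02 ** adjustments_per_year
print(f"worst-case annual cap growth: {worst_case:.2f}x")  # ~1.68x
```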

If you think that these factors are insufficient disincentive, and the benefits of such an attack are still worth it, please help us better understand why.

I maintain that I do not think we have the best answer yet, so these criticisms are valuable.  This is better than other proposals we have seen so far simply because it accommodates an unpredictable future, but IMHO it is not yet good enough for implementation.  It still needs regression testing against the previous block chain and some more game-theory analysis.

hello_good_sir (OP)
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 18, 2014, 03:14:55 AM
 #67

Decreasing the block limit (note, not required block size) in the future would not be a hard fork, it would be a soft fork.

It won't be a soft fork; it will be an impossibility.  The miners of the future will be few in number and hostile to the ideas of bitcoin.  This is the reality that we need to design for.  Entities in control of a node will want to keep the price of maintaining a node as high as possible, so that they can control access to the information in the blockchain.

tdryja
Newbie
*
Offline Offline

Activity: 6
Merit: 0


View Profile
October 18, 2014, 03:47:35 AM
 #68

My 2uBTC on this issue:
Instead of guessing the costs of the network via extrapolation, code in a constant-cost negative feedback mechanism.  For example, similar to difficulty adjustments, if mean non-coinbase block reward > 1 BTC, increase max size.  If mean block reward < 1 BTC, decrease max size (floor of 1MB).

Here's why I think this is a long term solution.  With Bitcoin, "costs" and "value" have a very interesting relationship; currently with mining, the costs to run the network are determined by the exchange value of a bitcoin.  Long term, the block size constrains both the cost and value of the network.  By "long term", I mean 100 years from now.  Long term, there's no more coinbase reward.  So miners compete for transaction fees.  Limited block size causes transactors to compete for space in the block, driving up the fees.  An unlimited block size would, without other costs, tend to drive fees to near-zero, and then there's not enough incentive for miners to bother, and the security of the system is compromised.  That's the death spiral idea anyway, which may not actually happen, but it's a legitimate risk, and should be avoided.  The value and utility of bitcoin today has a lot to do with the probability that it will have value in 100 years.

Max block sizes doubling every two years makes them pretty much unlimited.  Capping after 20 years is also a big guess, and it extrapolates Moore's law for potentially longer than the law keeps going.  Gigabit ethernet is what, 15 years old?  That's what every PC has now; I've never seen 10G over copper ethernet.  Reliance on everything else becoming awesome is a very fragile strategy.

An issue I have with exponentially increasing block size, or static block size, is that there's no feedback: it can't respond to changes in the system.  The block size in many ways determines the value of the network.  All else being equal, a network that can handle more transactions per day is more useful and more valuable.

I think that similar to the current system of mining costs determined by bitcoin value, block propagation, verification and storage should be determined by how much people are willing to pay.  If transaction fees are high, block space is scarce, and will expand.  If transaction fees are low, block space is too cheap, and the max block size will expand.

This fixes a cost independent of the mining coinbase reward, allowing for sustainable, predictable mining revenue.  The issue is we would have to come up with a number.  What should it cost to run the bitcoin network?  1% of M0 per year?  That would be 210,000 coins per year in transaction fees to miners, or about 4BTC per block.

0.5% of M0 annually would be about 2BTC per block, and so on.  This would be a ceiling cost; it could cost less, if people didn't make too many transactions, or most things happened off-blockchain, and the blocks tended back towards the 1MB floor.  It would effectively put a ceiling on the maintenance cost of the network, however: if blocks were receiving 8BTC in fees against that 4BTC target, the size would double at the next difficulty adjustment, which would tend to push total fees down.

If you wanted to get fancy you could have hysteresis and non-linearity and stuff like that but if it were up to me I'd keep it really simple and say that max block size is a linear function of the previous epoch block rewards.
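A minimal sketch of that linear rule (Python; the 1 BTC target and 1 MB floor are the post's illustrative numbers, and the function itself is mine, for illustration only):

```python
TARGET_FEES_PER_BLOCK = 1.0   # BTC; the post's illustrative round number
SIZE_FLOOR = 1_000_000        # bytes; the 1 MB floor

def next_max_block_size(current_max: int, mean_epoch_fees: float) -> int:
    """Recompute the cap once per difficulty epoch.

    Max block size is a linear function of the previous epoch's mean
    non-coinbase block reward: fees above target mean block space is
    scarce, so the cap grows; fees below target shrink it, but never
    below the floor.
    """
    scaled = int(current_max * mean_epoch_fees / TARGET_FEES_PER_BLOCK)
    return max(SIZE_FLOOR, scaled)
```

An epoch averaging twice the target doubles the cap at the next adjustment, which then tends to push fees back down.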

This can be "gamed" in 2 ways.  It can be gamed to a limited extent by miners who want to push up the max block size.  They can pay a bunch of fees to themselves and push up that average.  I can't think of a clean way to get rid of that, but hopefully that's OK; isn't it the miners who want smaller blocks anyway?  If miners are competing for larger blocks, why would the non-mining users complain?  The only issue is one miner who wants larger blocks, and everyone else wants smaller ones.  Maybe use median instead of mean to chop out malicious miners or fat-finger giant transaction fees.

It can also be gamed the other way.  Your transaction fee is 0, but you have some off-channel account with my mining group which includes all your txs for a flat monthly rate.  This also seems unlikely; if it were more expensive that way, transactors would stop using the off-channel method and just go to the open market for transaction inclusion.  If it were cheaper, why would the miner forgo that revenue?

So if I ran this whole Bitcoin thing (which would defeat the point... :)), that's what I would do.  The question is how much it should cost.  1BTC per block sounds OK; it's a nice round number.  That's 50K BTC per year for the miners.

I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 503



View Profile
October 18, 2014, 05:26:42 AM
 #69

We should anticipate governments becoming miners; if they aren't already.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 503



View Profile
October 18, 2014, 05:33:59 AM
 #70

A government with a strong enough military/police can potentially take over a miner's equipment by force/violence, all in the name of supposed social good, while calling it eminent domain (http://en.wikipedia.org/wiki/Eminent_domain).
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 503



View Profile
October 18, 2014, 06:14:25 AM
 #71

My 2uBTC on this issue:
...
I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.
I realize transactions can come in a wide variety of sizes so my back-of-the-envelope calculations need to be taken with a big grain of salt;

https://blockchain.info/charts/n-transactions-per-block shows a peak around 3-Mar-2014 of 618 transactions in a block (averaged over 24 hours), and https://blockchain.info/block-index/477556 is a 396KB block with 710 transactions in it.

1BTC/710txn ~= 0.0014BTC/txn, or about $0.53 at the current exchange rate; so much for micro-transactions.  Also, 396KB/710txn ~= 558B/txn, so a 1MB block holds 1MB / 558B/txn ~= 1792txn.  Even then, 1BTC/1792txn * $377.79/BTC ~= $0.21/txn.  I think maybe 0.1BTC/block would be nice.  If the exchange rate climbs to $2000/BTC and the block size is still 1MB, then 0.1BTC/1792txn * $2000/BTC ~= $0.11/txn; but if the block size were 2MB, the per-transaction fee drops to $0.055 or so.

As legit transaction rates climb, presumably so does the exchange rate.  At what point does BTC decouple from fiat?
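The same back-of-the-envelope figures in runnable form, using only the sample numbers above (the 396KB/710-transaction block, the $377.79 rate, and the hypothetical $2000 rate):

```python
# Per-transaction fee implied by a fixed fee-per-block budget.
BYTES_PER_TX = 396_000 / 710                   # ~558 B/tx from the sample block
TXS_PER_MB_BLOCK = 1_000_000 / BYTES_PER_TX    # ~1792 tx in a full 1MB block

def fee_per_tx_usd(btc_per_block: float, usd_per_btc: float,
                   block_mb: float = 1.0) -> float:
    """Dollar fee per transaction if a full block must collect `btc_per_block`."""
    return btc_per_block / (TXS_PER_MB_BLOCK * block_mb) * usd_per_btc

print(round(fee_per_tx_usd(1.0, 377.79), 2))     # ~0.21: 1 BTC/block today
print(round(fee_per_tx_usd(0.1, 2000.0), 2))     # ~0.11: 0.1 BTC/block at $2000
print(round(fee_per_tx_usd(0.1, 2000.0, 2), 3))  # ~0.056: same, with 2MB blocks
```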
wachtwoord
Legendary
*
Offline Offline

Activity: 2338
Merit: 1136


View Profile
October 18, 2014, 10:04:48 AM
 #72

I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).

It is offered as an example of the sort of thing that can work, rather than a finished product.

This is the problem.

People don't seem to realize Gavin's proposal may be the best we can do. I happen to think it is. If anyone had a better idea we'd have heard it by now. We, the entire community, have brooded on this issue for months if not years now. Here is a spoiler alert: nobody can predict the future.

New Liberty's unpolished prototype is already far superior to Gavin's nonsense, so this is easily debunked.
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 18, 2014, 10:54:10 AM
Last edit: October 18, 2014, 11:16:31 AM by NewLiberty
 #73

My 2uBTC on this issue:
...
I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.

Thank you for this.

At first look I very much like this feedback mechanism.  I'd also considered using the non-coinbase transaction fees initially for the source data, but had abandoned the idea, perhaps prematurely.  It may be a better place to look for a mechanism to determine this.

I had dropped it for two main reasons.  
1)  I looked at the historical charts. Number of transactions per block was the closest representation I could swiftly find to approximate block size (although it ignores the effects of 2.0 transactions, which are larger; there are few of these now).  The fee chart shows much greater variation, and less of the rise which I see as needed as a means of enabling adoption.
2)  I wasn't able to reconcile a way around needing an external value for BTC, to get at the mining cost.

Your proposal, tdryja, shows that both my initial reasons are insufficient to abandon the idea of using fees paid as the means of sensing the appropriate block size from chain data.

I like this proposal foremost because it draws on the data within the block chain to self-correct for the unpredictable future.  I also like it for its simplicity, and I like that it uses a directly financial metric.
After thinking about it a bit more, I may have some useful criticisms.



trout
Sr. Member
****
Offline Offline

Activity: 333
Merit: 252


View Profile
October 18, 2014, 01:30:18 PM
 #74

My 2uBTC on this issue:
Instead of guessing the costs of the network via extrapolation, code in a constant-cost negative feedback mechanism.  For example, similar to difficulty adjustments, if mean non-coinbase block reward > 1 BTC, increase max size.  If mean block reward < 1 BTC, decrease max size (floor of 1MB).


A miner can include in his block a transaction with an arbitrarily large fee (which he gets back, of course), throwing the mean off the chart.
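Toy numbers make the attack, and the median fix tdryja already suggested, easy to see (the epoch length and fee values here are illustrative only):

```python
from statistics import mean, median

# One difficulty epoch: 2015 honest blocks paying ~0.2 BTC in fees, plus a
# single miner block carrying a self-paid 10,000 BTC fee (the miner gets the
# fee straight back, so the attack costs nothing but the orphan risk).
epoch_fees = [0.2] * 2015 + [10_000.0]

print(round(mean(epoch_fees), 2))   # ~5.16 BTC: the mean-based signal is hijacked
print(median(epoch_fees))           # 0.2 BTC: the median ignores the outlier
```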
Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652
Merit: 2301


Chief Scientist


View Profile WWW
October 18, 2014, 03:16:46 PM
 #75

I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.

The answer cannot be in the blockchain, because the problem being addressed (resource usage rising too quickly so only people willing to spend tens of thousands of dollars can participate as fully validating nodes) is outside the blockchain.

You will go down the same path as the proof-of-stake folks, coming up with ever more complicated on-blockchain solutions to a problem that fundamentally involves something that is happening outside the blockchain. In this case, real-world CPU and bandwidth growth. In the POS case, proof that some kind of real-world effort was performed.

How often do you get the chance to work on a potentially world-changing project?
2112
Legendary
*
Offline Offline

Activity: 2128
Merit: 1073



View Profile
October 18, 2014, 03:28:19 PM
Last edit: October 18, 2014, 04:49:49 PM by 2112
 #76

A miner can include in his block a transaction with an arbitrarily large fee (which he gets back, of course), throwing the mean off the chart.
What about the following modification:

a1) fold a modified p2ppool protocol into the mainline protocol
a2) require that transactions mined into the mainline blockchain have to be seen in the majority of p2ppool blocks
a3) p2ppool then has an additional function of proof-of-propagation: at least 50% of miners have seen the tx
a4) we can then adjust the fees and incentives individually for:
a4.1) permanent storage of transactions (in the mainline blockchain)
a4.2) propagation of transactions (in the p2ppool blockchain, which is ephemeral)
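A sketch of the (a2)/(a3) gate, treating recent p2ppool share blocks as sets of seen transaction ids (the representation and function are mine, for illustration):

```python
def propagation_proven(txid: str, recent_share_blocks: list,
                       threshold: float = 0.5) -> bool:
    """Proof-of-propagation check per (a2)/(a3).

    A transaction may be mined into the mainline chain only if it
    appeared in more than `threshold` of the recent p2ppool share
    blocks, i.e. the majority of (share-weighted) miners have seen it.
    `recent_share_blocks` is a list of sets of txids, one per share.
    """
    seen = sum(1 for share in recent_share_blocks if txid in share)
    return seen / len(recent_share_blocks) > threshold
```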

Right now the problem is that miners receive all fees due, both for permanent storage and for network propagation.

Another idea in the similar vein:

b1) make mining a moral equivalent of a second-price auction: the mining fees of block X accrue to the miner of block X+1
b2) possibly even replace 1 above with a higher, constant natural number N.
Late edit:
b3) reduce the coinbase maturity requirement by N
Later edit:
b4) since nowadays the fees are very low compared to the subsidy, (b3) would imply a temporary gap in global mining income: the subsidy of block X accrues to the miner of X, while the fees of block X accrue to the miner of block X+N.
End of edits.

Both proposals above aim to incentivize and enforce propagation of the transactions on the network and discourage self-mining of non-public transactions and self-dealing on the mining fee market.
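The (b) series payout rule fits in a few lines; this is a sketch with `shift` standing in for the constant N (names and types are mine, for illustration):

```python
def miner_income(height: int, subsidy: float, fees_by_height: dict,
                 shift: int = 1) -> float:
    """Income for the miner of block `height` under the fee-shift rule.

    Per (b1)/(b2)/(b4): the subsidy of block X pays the miner of X, but
    the fees collected in block X pay the miner of block X + shift, so a
    miner gains nothing by stuffing its own block with self-paid fees.
    `fees_by_height` maps height -> total fees collected in that block.
    """
    fee_source = height - shift   # fees earned here were paid N blocks back
    earned_fees = fees_by_height.get(fee_source, 0.0) if fee_source >= 0 else 0.0
    return subsidy + earned_fees
```

Self-dealing stops paying because the stuffed fees go to whoever mines N blocks later, most likely a competitor.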

Please comment, critique, criticize or ridicule BIP 2112: https://bitcointalk.org/index.php?topic=54382.0
Long-term mining prognosis: https://bitcointalk.org/index.php?topic=91101.0
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 18, 2014, 04:06:13 PM
Last edit: October 18, 2014, 04:43:55 PM by NewLiberty
 #77

I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.

The answer cannot be in the blockchain, because the problem being addressed (resource usage rising too quickly so only people willing to spend tens of thousands of dollars can participate as fully validating nodes) is outside the blockchain.

You will go down the same path as the proof-of-stake folks, coming up with ever more complicated on-blockchain solutions to a problem that fundamentally involves something that is happening outside the blockchain. In this case, real-world CPU and bandwidth growth. In the POS case, proof that some kind of real-world effort was performed.


Thank you for your contribution and criticism.

Since the difficulty adjustment already effectively assesses real-world CPU growth, I'm unready to assume the impossibility of real-world assessment with respect to bandwidth, as there is evidence of both in the block chain awaiting our use.
Analogies to PoS are also no proof of a negative.  

The answer may be in the block chain, and it seems the best place to look, as the block chain will be there in the future providing evidence of bandwidth usage if we can avoid breaking the Bitcoin protocol today.

I don't need anyone to be right or wrong here so long as in the end we get the best result for Bitcoin.  I am very happy to be wrong if that means an improvement can be made.

Gavin, I remain grateful for your raising the issue publicly, and for keeping engaged in the discussion.  I do not agree that discussion on the matter ought to end, and I think we can do better through continuing.

Wherever we can squeeze out arbitrary human decision through math and measurement, it is our duty to the future to do so.  The alternative is to commit our progeny to the whims and discretion of whomever is in authority in the decades to come.  As David Rabahy pointed out a few posts ago, we may not be pleased with that result.

NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 18, 2014, 04:41:45 PM
 #78

A miner can include in his block a transaction with an arbitrarily large fee (which he gets back, of course), throwing the mean off the chart.
What about the following modification:
...
Both proposals above aim to incentivize and enforce propagation of the transactions on the network and discourage self-mining of non-public transactions and self-dealing on the mining fee market.

These are interesting propositions in their own right.
There is a virtue in simplicity, in that it is less likely to create perverse incentives (Gavin alludes to this in his critique).
For example, adding a p2ppool dependency may carry complexity risks we don't foresee, so by that metric the (b) series may be better than the (a).
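For concreteness, the deferred-fee accounting of the (b) series can be sketched in a few lines. This is a toy model under my own assumptions (miner names, subsidy, and fee values are all hypothetical), not an implementation:

```python
from collections import defaultdict

def payouts(blocks, N=1):
    """Toy model of proposal (b): the subsidy of block X pays the miner
    of X immediately (per b4), while the fees of block X pay the miner
    of block X+N, second-price style (per b1/b2).
    `blocks` is a list of (miner, subsidy, fees) tuples."""
    income = defaultdict(float)
    for x, (miner, subsidy, fees) in enumerate(blocks):
        income[miner] += subsidy            # subsidy accrues right away
        if x + N < len(blocks):             # fees mature N blocks later
            income[blocks[x + N][0]] += fees
    return dict(income)

# Three blocks: miner A, then B, then A again.
chain = [("A", 25.0, 0.5), ("B", 25.0, 0.2), ("A", 25.0, 0.3)]
print(payouts(chain, N=1))
```

Note how A's 0.5 in fees goes to B (the next block's miner), which is exactly why a miner gains nothing by stuffing his own block with a huge self-paid fee.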


acoindr
Legendary
October 18, 2014, 06:27:33 PM
 #79

As I see it, Bitcoin is like the U.S. government: it has made too many promises to keep. I agree with Gavin that Bitcoin has been sold as being able to serve the world's population. At the same time it has been sold as being effectively decentralized. These two things can't happen at the same time with today's technology, because bandwidth (primarily) doesn't scale to global transaction volume. They will work eventually, but they don't today.

The question is how to get from today to the future day when Bitcoin can handle the world's transaction needs while remaining decentralized down to technology available to average people.

We have effectively three choices.

- Do nothing and remain at 1MB blocks
- Gavin's proposal to grow transaction capacity exponentially, possibly fitting in line with Bitcoin adoption numbers
- Some algorithmic formula to determine block size which is probably more conservative than exponential growth, but less predictable
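For scale, here is a toy comparison (my own illustrative numbers, not a proposal) of the second option's flat 50%/year path against the upper bound of the retarget-style rule suggested earlier in the thread, where the cap can rise at most 2% per two-week period when blocks run full:

```python
# Two growth paths for the max block size, starting from 1 MB:
# (1) a flat 50%/year increase, and (2) the ceiling of an adaptive
# retarget rule that adds at most +2% per two-week period.

PERIODS_PER_YEAR = 365.25 / 14   # two-week retarget periods in a year

def fixed_growth(years, rate=0.5, start_mb=1.0):
    """Cap after `years` of compounding a flat yearly increase."""
    return start_mb * (1 + rate) ** years

def adaptive_ceiling(years, step=0.02, start_mb=1.0):
    """Fastest the adaptive rule can grow: +2% every single period,
    i.e. blocks run full for the whole stretch."""
    periods = int(years * PERIODS_PER_YEAR)
    return start_mb * (1 + step) ** periods

for y in (1, 5, 10):
    print(f"year {y:2d}: fixed {fixed_growth(y):8.1f} MB, "
          f"adaptive ceiling {adaptive_ceiling(y):8.1f} MB")
```

Interestingly, at +2% per period the adaptive rule's ceiling compounds to roughly 67%/year, faster than the flat 50%/year; the difference is that the adaptive cap only grows when blocks are actually full, and shrinks when they are not.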

I think doing nothing is unrealistic.

I like Gavin's proposal because it can solve the issue while also being predictable. Predictability has value when it comes to money. I agree that some other algorithm using real-world inputs is safer, but I wonder at what expense. In the worst case, under Gavin's proposal, there may be some risk of heavy-hitting players taking market share from lesser miners, maybe even to the extent of becoming centralized cartels. I don't think there is a good chance of that happening, but I agree it's in the realm of possibility. In that case, though, nobody would be forced to continue using Bitcoin, since it's a voluntary currency. It's easy to move to an alternative coin. Free market forces, in my mind, would solve the problem.

If we try to be as cautious as possible, seeking inputs along the way, we can probably rest assured centralization won't happen with Bitcoin. At the same time, though, the market would have to continually reassess Bitcoin's transaction capacity, and therefore its value. I'm not sure how that would play out.

My question is: can a majority of the community (say 70-80%) be convinced to choose one of the last two options?
trout
Sr. Member
October 18, 2014, 07:05:39 PM
 #80

a miner can include in his block a transaction with an arbitrarily large fee (which he gets back, of course), throwing the mean off the chart.
What about the following modification:

a1) fold a modified p2ppool protocol into the mainline protocol
a2) require that transactions mined into the mainline blockchain have to be seen in the majority of p2ppool blocks
a3) p2ppool then has an additional function of proof-of-propagation: at least 50% of miners have seen the tx
a4) we can then adjust the fees and incentives individually for:
a4.1) permanent storage of transactions (in the mainline blockchain)
a4.2) propagation of transactions (in the p2ppool blockchain, which is ephemeral)

Right now the problem is that miners receive all fees due, both for permanent storage and for network propagation.

Another idea in a similar vein:

b1) make mining a moral equivalent of a second-price auction: the mining fees of block X accrue to the miner of block X+1
b2) possibly even replace the 1 above with a higher constant natural number N.
Late edit:
b3) reduce the coinbase maturity requirement by N
Later edit:
b4) since fees are currently very low compared to the subsidy, (b3) would imply a temporary gap in global mining income. The subsidy of block X accrues to the miner of X; the fees of block X accrue to the miner of block X+N.
End of edits.

Both proposals above aim to incentivize and enforce propagation of the transactions on the network and discourage self-mining of non-public transactions and self-dealing on the mining fee market.


a) is vulnerable to Sybil attacks
b) smothers the incentive to include any transactions in blocks: why should I (as a miner) include a tx if the fee will go to someone else?

Also, it seems both are too disruptive to be implemented in Bitcoin.
Anything this different would need an altcoin to be tried.
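The fee-manipulation concern quoted above is easy to see numerically. A toy example (invented fee values): one miner mining his own transaction with a huge self-paid fee drags the mean fee far upward, while the median barely notices:

```python
from statistics import mean, median

# 99 ordinary transactions paying a small fee, plus one self-paid
# outlier the miner includes in his own block (he gets it back).
honest_fees = [0.0001] * 99
skewed_fees = honest_fees + [50.0]

print(mean(honest_fees), median(honest_fees))   # both tiny
print(mean(skewed_fees), median(skewed_fees))   # mean blows up, median doesn't
```

This is why any rule keyed to the mean fee is gameable by a single miner, whereas a median (or the propagation requirement in proposal (a)) is far harder to skew unilaterally.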
