dEBRUYNE
Legendary
Offline
Activity: 2268
Merit: 1141
|
|
June 05, 2015, 06:15:33 PM |
|
This is correct, and I hadn't given enough thought to this problem prior to posting.
Now that I've given it more thought, I think it can be significantly alleviated by making collection from the pool span a longer period, on the time scale of years. Relative hashrate is likely to change over such a period, so it may not be the best plan to publish excessively large blocks hoping to reclaim the fee (and publishing typical-sized blocks does not give big miners an advantage). Also, with a different function you can change the balance between the marginal and total penalty so that the actual penalty is small, nullifying the effect (it will require the cap to be a bit harder, but still more elastic than what we have now).
I agree that this calls for more analysis. A longer time period over which the reward is given doesn't help, as the larger nodes or entities will still get a larger ratio of the rolled-over fees, by definition. Actually, making the rollover fees extend over only a couple of blocks would more likely mitigate the problem, but if you roll over the fee for about 3 blocks or so, then it might be worth it for a miner to hold blocks and release 2 at a time, depending on fee-over-time. This, in turn, might exacerbate the selfish miner exploit [1]. The natural monopoly condition that already exists in Bitcoin [2] seems to be exacerbated either way. Getting around this would be tricky, if it's possible.

[1] http://fc14.ifca.ai/papers/fc14_submission_82.pdf
[2] https://bitcointalk.org/index.php?topic=176684.msg9375042#msg9375042

An examination of the prior art is warranted.
Pointing to Monero as an examination of prior art is asking a bit much. Are you expecting us to dig through the Monero source code? How do they get around the problem? This is not very helpful:

"The Basics: A special type of transaction included in each block, which contains a small amount of monero sent to the miner as a reward for their mining work." https://getmonero.org/knowledge-base/moneropedia/coinbase

Did you miss this link? -> https://github.com/monero-project/bitmonero/blob/c41d14b2aa3fc883d45299add1cbb8ebbe6c9ed8/src/cryptonote_core/blockchain.cpp#L2230-L2244
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
June 07, 2015, 12:45:09 PM Last edit: June 07, 2015, 07:42:00 PM by Meni Rosenfeld |
|
I'll try to repeat the calculations with a different demand curve, to demonstrate my point. But this will take some time, and Shabbat comes in soon, so that will have to wait.
Let's assume the demand curve - the number of transactions demanded as a function of the fee, per 10 minutes - is d(p) = 27/(8000p^2). It's safe to have d(p) -> infinity as p -> 0 because supply is bounded (if there were no bound on supply, we'd need a more realistic bound on demand to have meaningful results). The behavior below is the same for other reasonable demand curves, as long as demand diminishes superlinearly with p (sublinear decay is less reasonable economically, and results in very different dynamics).

We'll assume 4000 transactions fit in a MB, and that T = 1MB. So the penalty, as a function of the number n of transactions, is f(n) = max(n-4000,0)^2 / (4000*(8000-n)).

We'll also assume that transactions are in no particular rush - users will pay the minimal fee that gives them a good guarantee of having the tx accepted in reasonable time (where this time is long enough to include blocks from the different miner groups). So there is a specific fee p for which the tx demand clears with the average number of txs per block (the number of txs can change between blocks). It would have been more interesting to analyze what happens when probabilistic urgency premiums enter the scene, but that's not relevant to the issue of mining centralization.

Scenario 1: 100 1% miners.

Each miner reclaims 1% of the penalty. If the optimal strategy is to have n txs per block, resulting in a fee of p, then n = d(p) and the marginal penalty (derivative of f) at n, corrected for the reclaiming, must equal p (so that adding another transaction generates no net profit). In other words, 0.99 f'(d(p)) = p.

Solving this gives p = 0.7496 mBTC, n = 6007. The penalty is 0.5053 BTC, so the pool size is 50.53 BTC. Miners get 4.5027 BTC per block (6007 * 0.0007496 from txs + 0.5053 collection - 0.5053 penalty). 6007 txs are included per block.

Scenario 2: One 90% miner and 10 1% miners.

The market clears with a tx fee of p, with the 90% miner including n0 txs per block and the 1% miners including n1 txs per block. The average #txs/block must equal the demand, so 0.9n0 + 0.1n1 = d(p). Every miner must have 0 marginal profit per additional transaction, correcting for reclaiming. So

0.1 f'(n0) = p
0.99 f'(n1) = p

Solving all of this results in:

n0 = 7251
n1 = 5943
p = 0.6885 mBTC (lower than in scenario 1)

Penalty paid by 1% miners: f(5943) = 0.4589 BTC
Penalty paid by the 90% miner: f(7251) = 3.5294 BTC
Average penalty: 0.9*3.5294 + 0.1*0.4589 = 3.2223 BTC
Pool size: 322.23 BTC
Reward per block for a 1% miner: 5943 * 0.0006885 + 3.2223 - 0.4589 = 6.8552 BTC (more than in scenario 1)
Reward per block for the 90% miner: 7251 * 0.0006885 + 3.2223 - 3.5294 = 4.6852 BTC (less than the 1% miners in this scenario; more than the miners in scenario 1)
Average number of txs per block: 0.9 * 7251 + 0.1 * 5943 = 7120, more than in scenario 1.

Miners are happy - big or small, they gain more rewards. Users are happy - more of their transactions are included, at a lower fee. Nodes are not happy - they have to deal with bigger blocks. Exactly as with the previously discussed demand curve. Over time, difficulty will go up, nullifying the extra mining reward; and whoever is in charge of placing the checks and balances will make the function tighter (or hold off on making it looser), to keep the block sizes at the desired level.

There is another issue at play here - the ones who benefit the most from the big miner's supersized blocks are the small miners. 
The big miner could threaten to stop creating supersized blocks if the small miners don't join and create supersized blocks themselves. Forming such a cartel is advantageous over not having supersized blocks at all - however, I think the big miner's bargaining position is weak, and small miners will prefer to call the bluff and mine small blocks, avoiding the penalty and enjoying the big miner's supersized blocks. This is classic tragedy of the commons, but in a sort of reverse way - usually, TotC is discussed in this context when the mining cartel wants to exclude txs, not include them.
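For anyone who wants to check the numbers: the two equilibria above can be reproduced numerically. A minimal sketch (scipy's brentq root finder is an assumed dependency; units follow the text, p in BTC, n in txs per block):

Code:
# Reproduces the two scenarios above numerically (a sketch, not part of
# the proposal itself).
from scipy.optimize import brentq

T = 4000  # txs that fit in T = 1 MB; the penalty region is (T, 2T)

def d(p):
    """Demand: txs per 10 minutes at fee p (BTC)."""
    return 27 / (8000 * p**2)

def f(n):
    """Penalty (BTC) for a block of n txs."""
    return max(n - T, 0)**2 / (T * (2*T - n))

def fp(n):
    """Marginal penalty f'(n) = (n-T)(3T-n) / (T*(2T-n)^2) for n > T."""
    if n <= T:
        return 0.0
    return (n - T) * (3*T - n) / (T * (2*T - n)**2)

def n_star(p, share):
    """Txs included by a miner reclaiming `share` of the penalty:
    solve (1 - share) * f'(n) = p."""
    return brentq(lambda n: (1 - share) * fp(n) - p, T + 1e-9, 2*T - 1e-9)

# Scenario 1: 100 1% miners, so n = d(p) and 0.99 * f'(d(p)) = p.
p1 = brentq(lambda p: 0.99 * fp(d(p)) - p, 7e-4, 5e-3)
print(p1 * 1e3, d(p1), f(d(p1)))  # ~0.7496 mBTC, ~6007 txs, ~0.5053 BTC

# Scenario 2: one 90% miner and ten 1% miners; the market clears when
# 0.9*n0 + 0.1*n1 = d(p).
p2 = brentq(lambda p: 0.9 * n_star(p, 0.90) + 0.1 * n_star(p, 0.01) - d(p),
            6.6e-4, 2e-3)
print(p2 * 1e3, n_star(p2, 0.90), n_star(p2, 0.01))  # ~0.6885, ~7251, ~5943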
|
|
|
|
MayDee
Member
Offline
Activity: 84
Merit: 10
|
|
June 07, 2015, 06:12:09 PM |
|
I really like this idea! Keep up the great work, Meni Rosenfeld.
|
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
June 07, 2015, 07:00:47 PM |
|
I really like this idea! Keep up the great work, Meni Rosenfeld.

I like it, too. Thinking about the next steps, I re-skimmed the OP (pretending to be someone just being introduced to the idea), and I think the introduction of (and reference to) the 'rollover fee pool' is very misleading. I know it is explained right after that it's really a 'rollover size penalty', but I fear it might lead people onto the wrong track and make it harder than necessary for them to grasp the idea. Maybe it'd be less confusing and easier to understand the concept if that part was removed?

I have a feeling this idea is a hard sell, mainly because it isn't what many might expect: it's neither...

- a way to dynamically set the block size cap nor
- a solution for scaling nor
- a rollover fee pool
It concerns itself with a different (although related) issue, namely the way the system behaves when approaching the transaction throughput limit. I personally think this is a very important issue, and my expectation of the current behavior and the ramifications thereof regarding user experience and media coverage is one of the reasons I'm for Gavin's simple 20MB kicking-the-can-down-the-road proposal. With the rollover penalty in place I might be willing to wait longer and let some pressure build on developing scaling solutions. I'm not opposed to seeing how a fee market would develop; I'm also not opposed to seeing business opportunities for entities working on scaling solutions. I just don't want to hit a brick wall, as Meni so aptly put it... it would do much damage and could potentially set us back years, I fear.

So what are people's ideas of how a roadmap could look, what kind of funds we might need, and how we could organize enough (monetary and political) support?
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
dexX7
Legendary
Offline
Activity: 1106
Merit: 1026
|
|
June 07, 2015, 07:40:34 PM |
|
So far I like this proposal very much, too.

I'm for Gavin's simple 20MB kicking-the-can-down-the-road proposal. With the rollover penalty in place I might be willing to wait longer and let some pressure build on developing scaling solutions.

The elastic cap, penalty fee pool and the hard limit could be addressed separately, although it probably makes little sense to introduce this mechanism with the current block size limit. In this context I'd like to add some visibility to TierNolan's post on the second page:

... increasing the block size requires a hard fork, but the fee pool part could be accomplished purely with a soft fork.

It seems viable to set a high hard limit, and start with a lower-than-max elastic cap, which could be increased further at some point in the future.
|
|
|
|
vane91
Member
Offline
Activity: 133
Merit: 26
|
|
June 07, 2015, 08:02:49 PM |
|
The key here is how T is set. If T is fixed, then 2T becomes the hard limit and the problem remains. If T is set based on some average of previously mined blocks, then this may address the problem.
this

actually, just use twice the average blocksize of the last two weeks. And you don't really need any of this complicated system.
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
June 07, 2015, 08:21:15 PM |
|
The key here is how T is set. If T is fixed, then 2T becomes the hard limit and the problem remains. If T is set based on some average of previously mined blocks, then this may address the problem.
this

actually, just use twice the average blocksize of the last two weeks. And you don't really need any of this complicated system.

1. Floating block limits have their own set of problems, and may result in a scenario where there is no effective limit at all.

2. Even if you do place a floating block limit, it doesn't really relieve us of the need for an elastic system. Whatever the block limit is, tx demand can approach it, and then we have a crash landing scenario. We need a system that gracefully degrades when approaching the limit, whatever it is.
|
|
|
|
klondike_bar
Legendary
Offline
Activity: 2128
Merit: 1005
ASIC Wannabe
|
|
June 08, 2015, 12:58:54 AM |
|
The key here is how T is set. If T is fixed, then 2T becomes the hard limit and the problem remains. If T is set based on some average of previously mined blocks, then this may address the problem.
this

actually, just use twice the average blocksize of the last two weeks. And you don't really need any of this complicated system.

1. Floating block limits have their own set of problems, and may result in a scenario where there is no effective limit at all.

2. Even if you do place a floating block limit, it doesn't really relieve us of the need for an elastic system. Whatever the block limit is, tx demand can approach it, and then we have a crash landing scenario. We need a system that gracefully degrades when approaching the limit, whatever it is.

1) I beg to differ, so long as the timespan is sufficient that only a LONG-lasting spam attack or other growth could cause massive block caps. Personally, I think as long as it averages over at least 1-2 weeks, that's sufficient to prevent any sort of rampant spamming.

2) If the cap is set at double the recent volumes, it should provide enough room for fuller blocks so long as we don't see 5x network growth within less than a 1-2 month timespan. Even then, the cap would grow over time, and lower-priority transactions may just be pushed back a few blocks. Fees take priority until everything balances out after a few days/weeks.

(I suggest 2.5x the average of 40 days, i.e. 6000 blocks, OR ((2x the average of the last 6000 blocks) + (0.5x the average of the last 400 blocks)). The second allows for slightly faster growth if there's sudden demand for room.)
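For concreteness, the second rule could look something like this - a hypothetical sketch, where `sizes` is assumed to be a list of recent block sizes in bytes, newest last:

Code:
# Hypothetical sketch of the second floating-cap rule above.
def floating_cap(sizes):
    avg = lambda xs: sum(xs) / len(xs)
    # 2x the 6000-block (~40 day) average plus 0.5x the 400-block
    # (~2.8 day) average: the short window reacts to sudden demand.
    return 2.0 * avg(sizes[-6000:]) + 0.5 * avg(sizes[-400:])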
|
|
|
|
onelineproof
|
|
June 08, 2015, 11:51:49 AM |
|
I think a better solution would be to require miners to do more work for larger block sizes. Instead of hashing just the header of a block, miners have to hash something more: perhaps something proportional to the block size. So if a header is 80 bytes, it takes up 80/1000000=8e-05 of the whole block. So for any block size x > 1 MB, require a miner to hash the first (8e-05)x of the block in order for it to be valid. This will make Bitcoin automatically scale to the power of computers in the future, as big blocks will only be plentiful if computers (ASICs) are fast enough that it is worth taking the extra transaction fees with bigger blocks. Any problems with this?
|
|
|
|
dexX7
Legendary
Offline
Activity: 1106
Merit: 1026
|
|
June 08, 2015, 11:59:28 AM |
|
I think as long as it averages over at least 1-2 weeks, that's sufficient to prevent any sort of rampant spamming.

I'm not sure "how long can spam last" covers the whole picture. I'd also like to ask "how fast can miners deploy new hardware/adjust to an increased cap?"
|
|
|
|
klondike_bar
Legendary
Offline
Activity: 2128
Merit: 1005
ASIC Wannabe
|
|
June 08, 2015, 12:31:19 PM |
|
I think as long as it averages over at least 1-2 weeks, that's sufficient to prevent any sort of rampant spamming.

I'm not sure "how long can spam last" covers the whole picture. I'd also like to ask "how fast can miners deploy new hardware/adjust to an increased cap?"

Mining hardware has little to do with the issue, besides the fact that if large blocks are slow to download, it could allow large miners to [unintentionally] start creating small forks while the slower miners create orphans. The mining process is virtually unchanged.
|
|
|
|
dexX7
Legendary
Offline
Activity: 1106
Merit: 1026
|
|
June 08, 2015, 12:44:12 PM |
|
Mining hardware has little to do with the issue, besides the fact that if large blocks are slow to download, it could allow large miners to [unintentionally] start creating small forks while the slower miners create orphans. The mining process is virtually unchanged.

Sorry, poor choice of words. I wasn't thinking about mining hardware, but about deploying additional bandwidth/adjusting hosting plans/[...] and the like, to handle larger blocks.
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
June 08, 2015, 02:14:35 PM |
|
I think a better solution would be to require miners to do more work for larger block sizes. Instead of hashing just the header of a block, miners have to hash something more: perhaps something proportional to the block size. So if a header is 80 bytes, it takes up 80/1000000=8e-05 of the whole block. So for any block size x > 1 MB, require a miner to hash the first (8e-05)x of the block in order for it to be valid. This will make Bitcoin automatically scale to the power of computers in the future, as big blocks will only be plentiful if computers (ASICs) are fast enough that it is worth taking the extra transaction fees with bigger blocks. Any problems with this?
That's the basic idea behind Greg's proposal. I've yet to examine it in detail; I think it was actually what I thought about first, before eschewing it in favor of a deductive penalty. I think there are errors in your description of how to implement this. It's not about what you hash, it's about what your target hash is. Also, you need to carefully choose the function that maps block size to mining effort.
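To make the distinction concrete, here is a minimal sketch of what a size-dependent target could look like - my own illustration with a deliberately naive linear cost function, not Greg's actual proposal:

Code:
# My own illustration of a size-dependent target, NOT Greg's proposal:
# the required target shrinks as the block grows, so bigger blocks need
# more expected work. The linear cost function here is a placeholder;
# as noted above, it must be chosen carefully.
def required_target(base_target, block_size, T=1_000_000):
    if block_size <= T:
        return base_target
    work_factor = 1 + (block_size - T) / T  # e.g. a 2 MB block needs 2x work
    return int(base_target / work_factor)   # lower target = more work

# A block would be valid only if int(block_hash_hex, 16) is at most
# required_target(base_target, len(serialized_block)).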
|
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
June 08, 2015, 04:34:37 PM |
|
I think a better solution would be to require miners to do more work for larger block sizes. Instead of hashing just the header of a block, miners have to hash something more: perhaps something proportional to the block size. So if a header is 80 bytes, it takes up 80/1000000=8e-05 of the whole block. So for any block size x > 1 MB, require a miner to hash the first (8e-05)x of the block in order for it to be valid. This will make Bitcoin automatically scale to the power of computers in the future, as big blocks will only be plentiful if computers (ASICs) are fast enough that it is worth taking the extra transaction fees with bigger blocks. Any problems with this?
That's the basic idea behind Greg's proposal. I've yet to examine it in detail; I think it was actually what I thought about first, before eschewing it in favor of a deductive penalty. I think there are errors in your description of how to implement this. It's not about what you hash, it's about what your target hash is. Also, you need to carefully choose the function that maps block size to mining effort.

Can you link Greg's proposal? I haven't seen it.
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
June 08, 2015, 07:14:13 PM |
|
I think a better solution would be to require miners to do more work for larger block sizes. Instead of hashing just the header of a block, miners have to hash something more: perhaps something proportional to the block size. So if a header is 80 bytes, it takes up 80/1000000=8e-05 of the whole block. So for any block size x > 1 MB, require a miner to hash the first (8e-05)x of the block in order for it to be valid. This will make Bitcoin automatically scale to the power of computers in the future, as big blocks will only be plentiful if computers (ASICs) are fast enough that it is worth taking the extra transaction fees with bigger blocks. Any problems with this?
That's the basic idea behind Greg's proposal. I've yet to examine it in detail; I think it was actually what I thought about first, before eschewing it in favor of a deductive penalty. I think there are errors in your description of how to implement this. It's not about what you hash, it's about what your target hash is. Also, you need to carefully choose the function that maps block size to mining effort.

Can you link Greg's proposal? I haven't seen it.

Currently I have this link - http://sourceforge.net/p/bitcoin/mailman/message/34100485/. It's not the original proposal but some followup discussion. I still need to dig and trace it back to the source.
|
|
|
|
goatpig
Legendary
Online
Activity: 3752
Merit: 1364
Armory Developer
|
|
June 08, 2015, 08:16:04 PM |
|
I'm getting slammed at work so I won't be able to come up with a properly crafted response to your new model this week.
|
|
|
|
tacotime
Legendary
Offline
Activity: 1484
Merit: 1005
|
|
June 08, 2015, 10:47:24 PM |
|
Thanks for that, it was my reading also. Thus TX fees that are not in the block but paid out of band are not subject to penalty...
It's an additive deduction, not multiplicative. You seem to be thinking the miner's reward is:

(1 - penalty) * (minted coins + tx fees + collection from pool)

Where it is really:

minted coins + tx fees + collection from pool - penalty

Having more fees in the included txs doesn't increase the penalty. There is no difference between adding 1 mBTC to the fee and paying 1 mBTC out-of-band.

I don't see how this differs from Monero? In Monero, adding txs up to the median blocksize is free. As you surpass the median blocksize, a quadratic penalty is applied to the subsidy of the coinbase, but amounts obtained from tx fees are untouched. The subsidy of the coinbase initially depends on the number of coins in existence, and so takes into account the penalties applied to the coinbases of previously generated blocks (comparable to your "pool" method). The miner is then free to add transactions up to some economic equilibrium that maximizes their overall income when taking into account the blocksize penalty to the coinbase subsidy. So, it's like this:

(1 - penalty) * (minted coins) + tx fees

where the penalty depends on the size of the block above the median size, according to the formulas found in the CN whitepaper. gmaxwell criticizes this as promoting out-of-band transactions, but the fact remains that to permanently and securely transfer money you must use the blockchain and have the tx included in a block somewhere. Thus, I never thought it was much of an issue.
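For reference, the subsidy penalty described here can be sketched as follows - my paraphrase of the CN whitepaper formula, not the exact code behind the blockchain.cpp link earlier in the thread:

Code:
# Sketch of the CryptoNote-style reward: fees untouched, subsidy
# penalized quadratically once the block exceeds the median size.
def block_reward(base_reward, block_size, median_size, tx_fees):
    if block_size <= median_size:
        return base_reward + tx_fees  # no penalty up to the median
    if block_size > 2 * median_size:
        raise ValueError("block rejected: over 2x the median size")
    excess = block_size / median_size - 1  # in (0, 1]
    return base_reward * (1 - excess**2) + tx_fees  # fees untouched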
|
XMR: 44GBHzv6ZyQdJkjqZje6KLZ3xSyN1hBSFAnLP6EAqJtCRVzMzZmeXTC2AHKDS9aEDTRKmo6a6o9r9j86pYfhCWDkKjbtcns
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
June 09, 2015, 04:18:59 AM |
|
Thanks for that, it was my reading also. Thus TX fees that are not in the block but paid out of band are not subject to penalty...
It's an additive deduction, not multiplicative. You seem to be thinking the miner's reward is:

(1 - penalty) * (minted coins + tx fees + collection from pool)

Where it is really:

minted coins + tx fees + collection from pool - penalty

Having more fees in the included txs doesn't increase the penalty. There is no difference between adding 1 mBTC to the fee and paying 1 mBTC out-of-band.

I don't see how this differs from Monero? In Monero, adding txs up to the median blocksize is free. As you surpass the median blocksize, a quadratic penalty is applied to the subsidy of the coinbase, but amounts obtained from tx fees are untouched. The subsidy of the coinbase initially depends on the number of coins in existence, and so takes into account the penalties applied to the coinbases of previously generated blocks (comparable to your "pool" method). The miner is then free to add transactions up to some economic equilibrium that maximizes their overall income when taking into account the blocksize penalty to the coinbase subsidy. So, it's like this:

(1 - penalty) * (minted coins) + tx fees

where the penalty depends on the size of the block above the median size, according to the formulas found in the CN whitepaper. gmaxwell criticizes this as promoting out-of-band transactions, but the fact remains that to permanently and securely transfer money you must use the blockchain and have the tx included in a block somewhere. Thus, I never thought it was much of an issue.

That's not the part that differs from Monero. In fact, that's the part that NewLiberty claimed was worse than Monero, and I claimed is the same. (See also context.)

Do you have a reference for Greg's criticism? It doesn't make sense to me. I've explained above why out-of-band payment is not a problem in my suggestion (as you said - inclusion in the blockchain is what you need to secure the tx, so you have to pay the penalty anyway, and it doesn't matter if you get payment in or out of band), and since Monero is similar in this regard, it shouldn't be a problem for Monero either.

It's possible he means that miners will make an out-of-band commitment to include a tx in a later, less crowded block. But... that's a problem with any other system (e.g. the current hard cap in Bitcoin), and it's not a problem - we want txs to be taken out of big blocks and into smaller ones. I don't think users will actually agree to such a contract, but if they do, it's perfectly fine.
|
|
|
|
NewLiberty
Legendary
Offline
Activity: 1204
Merit: 1002
Gresham's Lawyer
|
|
June 09, 2015, 11:22:16 AM |
|
One difference with the Monero model is that the amount of the fee is not a factor of the penalty, nor is the size of the coinbase reward. The penalty is not multiplied by the block reward. Additive, not multiplicative.
I'd missed that detail at first pass because it seemed impossible for anyone to advocate such a function, when on its face it seems to fail the 'future-proof' test.
Without knowing what the fees will be in the future, how can the penalty be the same for every transaction? Fee amount tends to adjust with how much milk and bread a bitcoin can purchase. The penalty may thus become overly burdensome (if bitcoin value increases) or meaningless (if it falls). We can know the coinbase reward, but we should not always expect that to be 99% of the total as it is now. This will matter later.
The advantage of the Monero multiplicative method is that it does not have to guess, it works at all fee levels, block rewards, valuation, etc.
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
June 09, 2015, 11:56:42 AM |
|
One difference with the Monero model is that the amount of the fee is not a factor of the penalty, nor is the size of the coinbase reward. The penalty is not multiplied by the block reward. Additive, not multiplicative.
I'd missed that detail at first pass because it seemed impossible for anyone to advocate such a function, when on its face it seems to fail the 'future-proof' test.
Without knowing what the fees will be in the future, how can the penalty be the same for every transaction? Fee amount tends to adjust with how much milk and bread a bitcoin can purchase. The penalty may thus become overly burdensome (if bitcoin value increases) or meaningless (if it falls). We can know the coinbase reward, but we should not always expect that to be 99% of the total as it is now. This will matter later.
The advantage of the Monero multiplicative method is that it does not have to guess, it works at all fee levels, block rewards, valuation, etc.
As I've explained multiple times, this is exactly the reason for choosing a hyperbolic function. The marginal penalty is the derivative of f, f'. f' ranges from 0 to infinity as the block size ranges from T to 2T. Whatever the current Bitcoin price etc., there is some x for which f'(x) is equal to the typical fee. So the block sizes stretch and shrink to accommodate the changing fees. The same happens with the quadratic function in Monero, but much more slowly, giving us less control over the block sizes. Because of this, the quadratic function actually assumes more about the fees than the hyperbolic one.

The penalty itself (as opposed to its derivative) is less important in my proposal, because it more or less cancels out with the collection from the penalties of past blocks. But still, we prefer to keep the penalty as low as possible, meaning that we are interested in the ratio f'/f. A hyperbolic function, because it grows faster, achieves a better ratio - and we can improve it further with parameter choice.

Furthermore, if the reward is (1 - penalty) * (minted coins) + tx fees, then in the future, when the minted coins go down to 0, there is no penalty at all. That's the opposite of future-proof.
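A quick numeric illustration, using the penalty function from my earlier calculation (a sketch; f(x) = (x-T)^2 / (T*(2T-x)) in units where T = 4000 txs):

Code:
# How the marginal penalty f' stretches across fee levels.
T = 4000
f = lambda x: (x - T)**2 / (T * (2*T - x))
fp = lambda x: (x - T) * (3*T - x) / (T * (2*T - x)**2)  # f'(x)

for x in (4400, 6000, 7600, 7960):
    print(x, fp(x), fp(x) / f(x))
# f'(x) climbs from ~6e-5 through ~7.5e-4 and ~0.025 to ~2.5: it spans
# every fee level, so some block size always matches the going fee.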
|
|
|
|
|