Author Topic: Elastic block cap with rollover penalties  (Read 24015 times)
TierNolan (Legendary)
June 03, 2015, 09:13:12 AM  #21

Quote from: goatpig
It also requires modifying, or at least amending, consensus rules, something the majority of the Core team has been trying to keep to a minimum. I believe there is wisdom in that position.

Obviously increasing the block size requires a hard fork, but the fee pool part could be accomplished purely with a soft fork. 

The coinbase transaction of the block must pay <size penalty> BTC to OP_TRUE as its first output.  Even if there is no size penalty, the output needs to exist but pay zero.

The second transaction must be the fee pool transaction.

The fee pool transaction must have two inputs: the coinbase OP_TRUE output from 100 blocks previously, and the OP_TRUE output from the fee pool transaction in the previous block.

The transaction must have a single output that is 99% (or some other value) of the sum of the inputs paid to OP_TRUE.


By ignoring fees paid in the block, it protects against miners using alternative channels for fees.
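A rough sketch of these rules, assuming a simplified, hypothetical block/transaction object model (not Bitcoin Core's):

Code:
# Illustrative pseudocode for the three rules above; real consensus
# code would use integer satoshis rather than floats.
OP_TRUE = "OP_TRUE"

def check_fee_pool_rules(block, chain, size_penalty, keep_fraction=0.99):
    # Rule 1: the coinbase's first output pays the size penalty to
    # OP_TRUE (the output must exist even if the penalty is zero).
    cb_out = block.txs[0].outputs[0]
    if cb_out.script != OP_TRUE or cb_out.value != size_penalty:
        return False

    # Rule 2: the second transaction is the fee pool transaction,
    # spending the coinbase OP_TRUE output from 100 blocks back and
    # the pool output from the previous block.
    pool_tx = block.txs[1]
    matured = chain.block(block.height - 100).txs[0].outputs[0]
    prev_pool = chain.block(block.height - 1).txs[1].outputs[0]
    if pool_tx.spends != [matured, prev_pool]:
        return False

    # Rule 3: a single output paying 99% of the inputs back to OP_TRUE.
    # The unspent 1% is left over as a transaction fee, which is
    # presumably how each block's miner collects a share of the pool.
    outs = pool_tx.outputs
    return (len(outs) == 1 and outs[0].script == OP_TRUE
            and outs[0].value == keep_fraction * (matured.value + prev_pool.value))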

HCLivess (Legendary)
June 03, 2015, 12:07:14 PM  #22

are you suggesting we drop btc and pick up vtc?

btcdrak (Legendary)
June 03, 2015, 12:35:14 PM  #23

Quote from: gmaxwell
My proposal (on bitcoin-development and previously on the forum) is effectively (and explicitly credited to) the monero/bytecoin behavior, but rather than transferring fees/subsidy it changes the cost of being successful at the work function.

This is the most attractive concept I have seen yet for dynamic scaling: it places a penalty on miners building >1MB blocks by increasing their required difficulty target. Do you have any idea of how that penalty could be calculated? I assume it would scale according to the percentage size increase above MAX_BLOCK_SIZE. I believe this would work because miners would not be incentivised to build bigger blocks unless there was a need to, since prematurely doing so would put them at a disadvantage. This would also help in building fee pressure, which will become more and more important as the subsidy decreases.

thezerg (Legendary)
June 03, 2015, 12:48:09 PM (Last edit: June 03, 2015, 01:03:15 PM by thezerg)  #24

An elastic supply is very important, but I think it can be accomplished more simply, without a pool.

Allow blocks to be expanded beyond their "nominal" size with high fee transactions.  The higher the fee, the further it can appear in the block.  Formally, define a function fee = T(x), where x is the location in the block.  If a transaction's fee is >= T(x), it can be placed in the block at location x.  T(x) = 0 for all x < 8MB (say) and increases super-linearly from there.
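A minimal sketch of that placement rule; the nominal size and the curve past it are illustrative choices, not part of the proposal:

Code:
NOMINAL = 8_000_000  # bytes; T(x) = 0 below this

def T(x):
    """Minimum fee (satoshis) required to occupy byte-offset x."""
    if x < NOMINAL:
        return 0
    return ((x - NOMINAL) // 1000) ** 2  # super-linear past the nominal size

def can_place(tx_fee, offset):
    return tx_fee >= T(offset)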


Note that this proposal does NOT look at the fees in aggregate -- e.g., a rule like max block size <= S(sum(fees)), where S is some super-linear function.  That does not work because a miner could create a dummy transaction that pays himself a very large fee, thereby increasing the block size to allow space for a lot of low fee transactions.

Meni may have added the idea of a pool to solve the above problem.  But I believe that it is more easily solved by not looking at fees in aggregate.


EDIT: the biggest problem with this class of proposal is sizing the fee.  Especially given bitcoin's volatility.  However, if the fee function chosen starts at 1 satoshi, a high bitcoin price will tighten the elasticity of supply (in practice) but not entirely remove it.  At the same time, we STILL need to grow the "nominal" block size: i.e. 8MB + 20% per year, or risk pricing out personal transactions as adoption increases.  However, this class of proposal allows the network to react in a classic supply/demand fashion.  This reduces the pain when supply is exceeded, meaning that a "last-minute" hard fork as proposed by many of Gavin's opponents would be a lot less damaging to the network (block size increases could trail adoption rather than precede it).

dexX7 (Legendary)
June 03, 2015, 02:54:31 PM  #25

If T=3MB it's like a 6MB limit with pressure to keep blocks smaller than 3MB unless there are enough transactions paying fees so it's worth including them?

I think T should scale over time as bandwidth is growing. 42 transactions per second is still a low limit for a global payment network.
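(For scale: 6 MB blocks every ~600 seconds, at an average transaction size of roughly 240 bytes, work out to 6,000,000 / 240 / 600 ≈ 42 transactions per second, which is presumably where that figure comes from.)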

As far as I can see, and given that:

Quote from: TierNolan
Obviously increasing the block size requires a hard fork, but the [penalty] fee pool part could be accomplished purely with a soft fork.

Then it would be possible to raise the block size limit to 6, 20, 40, ... MB, but introduce a soft cap and a penalty mechanism for "large" blocks. The penalty function (and thus the soft cap) may be freely adjusted over time, as long as the resulting block size doesn't exceed the hard limit.

Quote from: Meni Rosenfeld
The process will resemble climbing a hill rather than running into a brick wall.

Very well put, I like it.

klondike_bar (Legendary)
June 03, 2015, 03:02:28 PM  #26


Quote
The key here is how T is set. If T is fixed then 2T becomes the hard limit and the problem remains. If T is set based on some average of previously mined blocks then this may address the problem.

Quote from: Meni Rosenfeld
We still need some way to determine the optimal block size, but we have much more leeway. The wrong choice will not cause catastrophic failure, rather gradually increasing fees which will indicate that a buff is needed. The flexibility will make it easier to reach community consensus about changing hardcoded parameters.

Reusing what I wrote to Gavin in a private exchange - I don't believe in having a block limit calculated automatically based on past blocks. Because it really doesn't put a limit at all. Suppose I wanted to spam the network. Now there is a limit of 1MB/block so I create 1MB/block of junk. If I keep this up the rule will update the size to 2MB/block, and then I spam with 2MB/block. Then 4MB, ad infinitum. The effects of increasing demand for legitimate transactions are similar. There's no real limit and no real market for fees.

Perhaps we can find a solution that uses an automatic rule for short-term fluctuations, and hardcoded parameters for long-term trends. If a good automatic cap rule can be found, it will be compatible with this method.


+1 to that.  I think a max that's determined as either:
T = 2.50*(average(last 8000 blocks))         #T is set from the average block size over the last ~2 months. Plenty of room for slow and steady growth, and too great a timespan to attack the blockchain with spam. Keep in mind that transactions at night will probably be 1/5th the volume of those during business hours.
or
T = (2.00*(average(last 8000 blocks))) + (0.50*(average(last 144 blocks)))     #This would allow short-term fluctuations that take a day or two to develop. Could be susceptible to a spam attack that lasts longer than 3 days.
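The same two rules as illustrative Python (sizes = recent block sizes, newest last):

Code:
def cap_long_term(sizes):
    return 2.50 * sum(sizes[-8000:]) / 8000

def cap_blended(sizes):
    return (2.00 * sum(sizes[-8000:]) / 8000
          + 0.50 * sum(sizes[-144:]) / 144)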

Personally, I think a block limit that's set based on the average volume of the last 1-3 months would be fine. It would be flexible if the number of transactions increases very quickly, and could grow to 3-8x the maximum within a year if there's substantial volume. Combined with your proposal above it could be extremely flexible. However...

I'm EXTREMELY cautious of altering how fees are created and distributed, as any changes made will directly impact miners and could lead to bribery and corruption of the bitcoin code to better pay the centralised mining companies. Any code changes that are implemented should not involve factors or values that will need to be adjusted down the road, or it will simply lead to a 'democracy' of core-qt 'improvement'

24" PCI-E cables with 16AWG wires and stripped ends - great for server PSU mods, best prices https://bitcointalk.org/index.php?topic=563461
No longer a wannabe - now an ASIC owner!
Meni Rosenfeld (OP) (Donator Legendary)
June 03, 2015, 03:30:07 PM  #27

Quote from: Zangelbert Bingledack
Quote from: Meni Rosenfeld
But short-term, if I have a transaction I'm set on sending right now (e.g. a restaurant tab), I'll be willing to pay very high fees for it if I must. So fees are not effective in controlling the deluge of transactions.

This part seems a bit off. At any given time, some people will have an urgent need for settlement, but many/most won't. So we get smooth scaling for quite a while from a purely economic perspective. Now once we reach a point in adoption where there are so many urgent transactions that they fill the blocks on their own, that kicks up the frustration to unacceptable levels, but even then some will be willing to outbid others and again it's a gradual increase in pain, not a deluge.

Insofar as the prices miners charge do rise properly and users have an easy way of getting their transactions in at some price, fees will limit transactions even in the short term. All you're really describing here is reaching a point that is pretty far along that smooth pain curve, after all the less important transactions have been priced out of the market.

Overall this is a great idea, though!
It's difficult to know exactly how the quantitative factors will play out. The inelasticity is not total, but I believe it is significant, and contributes to the phenomenon. Even if things will not be as catastrophic as Mike describes, I believe they can get rather ugly, so any change that alleviates it is welcome.


Quote from: HCLivess
are you suggesting we drop btc and pick up vtc?
Not familiar with it.


Quote from: thezerg
An elastic supply is very important, but I think it can be accomplished more simply, without a pool.

Allow blocks to be expanded beyond their "nominal" size with high fee transactions.  The higher the fee, the further it can appear in the block.  Formally, define a function fee = T(x), where x is the location in the block.  If a transaction's fee is >= T(x), it can be placed in the block at location x.  T(x) = 0 for all x < 8MB (say) and increases super-linearly from there.
This could work, but:

1. I'm not convinced it's actually simpler. If I understand it correctly, it requires, among other things, sorting the transactions by fee. Verification also requires examining each individual transaction in a rather elaborate way.
2. I think it's much harder to analyze how it will play out economically; and my initial thought is that it will be less stable. In my suggestion, the fee will be more or less consistent over txs, for any given load level. Here, some txs will be accepted with 0 fee and some will require very high fees; it will be difficult for each transaction to decide where it wants to go, and they can oscillate wildly between states.


Quote from: thezerg
EDIT: the biggest problem with this class of proposal is sizing the fee.  Especially given bitcoin's volatility.  However, if the fee function chosen starts at 1 satoshi, a high bitcoin price will tighten the elasticity of supply (in practice) but not entirely remove it.  At the same time, we STILL need to grow the "nominal" block size: i.e. 8MB + 20% per year, or risk pricing out personal transactions as adoption increases.  However, this class of proposal allows the network to react in a classic supply/demand fashion.  This reduces the pain when supply is exceeded, meaning that a "last-minute" hard fork as proposed by many of Gavin's opponents would be a lot less damaging to the network (block size increases could trail adoption rather than precede it).
This is the reason I chose a hyperbolic function rather than a polynomial one. Being hyperbolic means a wide range of marginal costs is covered with a relatively small span of block sizes. So whatever the reasonable fee should be, the system will find a block size that matches it.
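For illustration, one hyperbola with this qualitative behavior (not necessarily the exact function from the opening post): zero up to T, diverging as the size approaches 2T.

Code:
def penalty(size, T, alpha=1.0):
    # alpha scales the cost; the exact constant is a tuning choice
    if size <= T:
        return 0.0
    assert size < 2 * T, "blocks of size >= 2T are invalid"
    return alpha * (size - T) / (2 * T - size)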

Meni Rosenfeld (OP) (Donator Legendary)
June 03, 2015, 04:06:52 PM  #28

My email correspondence with Gavin so far:

Quote from: Meni Rosenfeld
Hi Mike,

As I was reading your posts about the block size limit, I came to the realization that the problem isn't really that the block size is too low. It's that the protocol lacks a mechanism for graceful degradation.

People expect that as the size limit is approached, fees will elastically adapt. I agree with your arguments that it doesn't work this way, but I think that it should work this way; and if it doesn't now, we should solve that problem. Once we do, the worst that could happen with a block limit that is too low is that fees will be too high. Of course, that could require some significant protocol changes.

I've been thinking along the following lines: A miner can create a block of arbitrary size, however, he must pay a penalty for large blocks. This penalty will be deducted from his coinbase transaction, and added to a rollover fee pool, to be collected by future miners (as in https://bitcointalk.org/index.php?topic=80387.0). The penalty will be a hardcoded function of the block size.

The function should be superlinear; it can optionally be 0 for block sizes up to a given threshold; and it could have a bend around some agreed upon value (1MB, 20MB, whatever) to encourage the size to be around this value. An optimal miner will include a transaction only if the marginal penalty is lower than the fee. As the block size increases, the marginal penalty per kB will be higher, requiring a higher fee.

This is superior to a hard cap in several ways. First, it's always possible for all txs to clear, as long as users are willing to pony up; with a hard cap, even if all users agree to pay more, you still can't include all of their transactions, creating a backlog. Second, the overall behavior of the fees is smoother over time. Instead of the marginal cost per transaction being roughly 0 in low-traffic times and approaching infinity in high-traffic times, it varies continuously with the current traffic. This makes it easier to gather statistics, and to choose the fee to pay accordingly. And you still have a market that adapts to actual economic incentives.
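A rough sketch of that optimal-miner behavior, greedy by feerate (a heuristic, not an exact optimizer; penalty() stands for any superlinear penalty function of block size):

Code:
def build_block(mempool, penalty):
    chosen, size = [], 0
    for tx in sorted(mempool, key=lambda t: t.fee / t.size, reverse=True):
        marginal = penalty(size + tx.size) - penalty(size)
        if tx.fee > marginal:  # include only if the fee covers the marginal penalty
            chosen.append(tx)
            size += tx.size
    return chosen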

Of course there's more I can say about the analysis of this suggestion, but that's the basic idea. I might post about this somewhere more public, not sure exactly where though...

Meni

Quote from: Gavin Andresen
Mike's on vacation, don't expect a response (and Mike, you're on vacation, you shouldn't be thinking about this stuff....)

My knee-jerk reaction is:  cool, write up a patch for Bitcoin Core (with tests, please) so we can see how extensive the changes would be.  It is easy to have an idea, but there are so many ideas I need a filter to winnow down the number of possibilities or it is impossible to carefully consider them all.  "Go write some code we can look at" is a very good filter.

Other people's knee-jerk reactions will be:  this won't work when the subsidy goes away, so it is a non-starter.  See Greg Maxwell's proposal for "require more mining (higher nBits) to produce bigger blocks" for a scheme that might work when the subsidy goes away.

On a higher level:  I agree that graceful degradation is much better than a hard crash-- that is why I implemented 'smart fees' for Bitcoin Core.

Quote from: Meni Rosenfeld
Hi Gavin,

1. That's a fair request; unfortunately, writing code is not where my comparative advantage is. I might be able to persuade others to write the code, though.

There's never a shortage of ideas, of course - but not all ideas are born equal, some are bad, some are good; and some ideas are so obviously bad you don't even need to test them.

2. As I've argued in the past, and in an interview posted today (http://bit-post.com/bitcoiners/interview-with-meni-rosenfeld-the-block-size-limit-and-mining-fee-structure-6105), funding miners when the subsidy goes away is a completely different problem which needs completely different solutions, which have nothing to do with block size.

Anyway, I'm not sure what exactly you mean by "it won't work" - in case you meant that without subsidy there will be nowhere to take the penalty from, of course the penalty can be taken out of tx fees, and the block is illegal if the total penalty is higher than the total fee. So miners will still only accept txs with sufficiently high fees.

Quote from: Meni Rosenfeld
FYI, I've posted about this suggestion - https://bitcointalk.org/index.php?topic=1078521.
Meni

Quote from: Gavin Andresen
Interesting.  How do we decide what "T" should be?

My knee-jerk reaction: I bet a much simpler rule would work, like:

   max block size = 2 * average size of last 144 blocks.

That would keep the network at about 50% utilization, which is enough to keep transaction fees from falling to zero, just due to people having a time preference for having transactions confirmed in the next 1/2/3 blocks (see http://hashingit.com/analysis/34-bitcoin-traffic-bulletin ).

I think this simple equation is very misleading:
  Bigger blocks -> Harder to run a node -> Less nodes -> More centralization

People are mostly choosing to run SPV nodes or web-based wallets because:

  Fully validating -> Less convenience -> Less nodes -> More centralization

Node count on the network started dropping as soon as good SPV wallets were available, I doubt the block size will have any significant effect.


Also: Greg's proposal:
  http://sourceforge.net/p/bitcoin/mailman/message/34100485/

Quote from: Meni Rosenfeld
Hi Gavin,

1. a. I don't believe in having a block limit calculated automatically based on past blocks. Because it really doesn't put a limit at all. Suppose I wanted to spam the network. Now there is a limit of 1MB/block so I create 1MB/block of junk. If I keep this up the rule will update the size to 2MB/block, and then I spam with 2MB/block. Then 4MB, ad infinitum. The effects of increasing demand for legitimate transactions are similar. There's no real limit and no real market for fees.

b. I'll clarify again my goal here is not to solve the problem of what the optimal block limit is - that's a separate problem. I want to prevent a scenario where a wrong block limit creates catastrophic failure. With a soft cap, any parameter choice creates a range of legitimate block sizes.

You could set T = 3MB now, and if in the future we see that tx fees are too high and there are enough blocks, increase it.

2. I have described one causal path. Of course SPV is a stronger causal path but it's also completely irrelevant, because SPV clients are already here and we don't want them to go away. They are a given. Block size, however, is something we can influence; and the primary drawback of bigger blocks is, as I described, the smaller number of nodes.

You can argue that the effect is insignificant - but it is still the case that

    Many people currently do believe the effect is significant, and
    This argument will be easier to discuss once we don't have to worry about crash landing.

3. Thanks, I'll try to examine Greg's proposal in more detail.

Meni

Quote from: Gavin Andresen
On Tue, Jun 2, 2015 at 5:37 PM, Meni Rosenfeld wrote:

    1. a. I don't believe in having a block limit calculated automatically based on past blocks. Because it really doesn't put a limit at all. Suppose I wanted to spam the network.


Who are "you" ?

Are you a miner or an end-user?

If you are a miner, then you can produce maximum-sized blocks and influence the average size based on your share of hash rate. But miners who want to keep blocks small have equal influence.

If you are an end-user, how do you afford transaction fees to spam the network?

----------------------

If you are arguing that transaction fees may not give miners enough reward to secure the network in the future, I wrote about that here:
   http://gavinandresen.ninja/block-size-and-miner-fees-again
and here:
   https://blog.bitcoinfoundation.org/blocksize-economics/

And re: "there is no real limit and no real market for fees", see
  http://gavinandresen.ninja/the-myth-of-not-full-blocks

There IS a market for fees, even now, because there is demand for "I want my transaction to confirm in the next block or three."

Quote from: Meni Rosenfeld
1. I'm an end user.

If there are hard coded rules for tx fees and spam prevention, then that is what is ultimately keeping the block size in check, not the block limit.

If there are none, and the only source of fees is competition over the limited block size, then there will be no real competition (for the reason I mentioned - the limit keeps increasing), and I will not have to pay any fees.

In both cases, the floating block limit doesn't do much.

2. I argue, as I always do, that funding miners for the hashing should not have anything to do with the data size of transactions and blocks.

In the current argument I'm not talking about the amortized cost of hashing. I'm talking about paying for the marginal cost of handling transactions (which does depend on size), and that the fees will make their way to the nodes bearing these costs. Under that assumption, I want to make sure people are actually paying fees for the resources consumed - and for that, I want to keep supply in check.

3. There is indeed a fee market, when the variability in the rates of clearing and adding txs exceeds the difference between the block limit and the global average tx rate. However, at low-traffic times, rational markets will not require significant fees. As a spammer I can use that time to create spam and trick the recalibration mechanism. As a legitimate user, I could use this time to send non-urgent txs. This reduces variability and works to stretch the limit.

Perhaps automatic calibration can work with a good enough mechanism, but I think it's more dangerous than occasionally updating a hardcoded value.

chmod755 (Legendary)
June 03, 2015, 04:58:31 PM  #29

Quote
max block size = 2 * average size of last 144 blocks.

That's not a real limitation. It could easily grow to more than 100 MB in a single week.

I think it should be more like:
max block size = 1.2 * average block size of last 1008 blocks
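A back-of-envelope illustration, assuming miners always fill blocks to the cap:

Code:
# cap = 2 * average(last 144 blocks): 144 blocks is ~1 day, so the
# ceiling can roughly double per day.
cap = 1.0                  # MB
for day in range(7):
    cap *= 2
print(cap)                 # 128.0 MB after one week

# cap = 1.2 * average(last 1008 blocks): 1008 blocks is ~1 week, so
# the ceiling can grow by at most ~1.2x per week.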

binaryFate (Legendary)
June 03, 2015, 05:10:07 PM  #30

Quote from: chmod755
Quote
max block size = 2 * average size of last 144 blocks.

That's not a real limitation. It could easily grow to more than 100 MB in a single week.

I think it should be more like:
max block size = 1.2 * average block size of last 1008 blocks

Using the median instead of the average would make the scheme less prone to the influence of just one or a few blocks.
Again, we are describing precisely the scheme used in Monero, proposed in the CryptoNote whitepaper (see Sec. 6.2.2): https://cryptonote.org/whitepaper.pdf
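A minimal sketch of a median-based cap in that spirit (constants illustrative; see the whitepaper for the real scheme):

Code:
import statistics

def size_cap(recent_sizes, window=1008, factor=1.2):
    return factor * statistics.median(recent_sizes[-window:])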

kazuki49 (Sr. Member)
June 03, 2015, 05:28:49 PM  #31

Quote from: Meni Rosenfeld
My email correspondence with Gavin so far:


lol he just called you an ideas man, your efforts are futile. Bitcoin is the El Dorado and has no flaws, no dev will ever adopt something made by other coins, it would imply Satoshi was slightly mistaken in his first attempt at a blockchain Cheesy
nutildah (Legendary)
June 03, 2015, 06:14:55 PM  #32

Why would a mod delete this comment from this thread? It was the second comment and it was completely related to the topic.

Quote
A reply of yours, quoted below, was deleted by a Bitcoin Forum moderator. Posts are most frequently deleted because they are off-topic, though they can also be deleted for other reasons. In the future, please avoid posting things that need to be deleted.

Quote
Brilliant!

kazuki49 (Sr. Member)
June 03, 2015, 06:18:07 PM  #33

Quote from: nutildah
Why would a mod delete this comment from this thread? It was the second comment and it was completely related to the topic.

A reply of yours, quoted below, was deleted by a Bitcoin Forum moderator. Posts are most frequently deleted because they are off-topic, though they can also be deleted for other reasons. In the future, please avoid posting things that need to be deleted.

Quote
Brilliant!


lol man I remembered you when I saw this thread on reddit: https://www.reddit.com/r/Bitcoin/comments/389pq6/elastic_block_cap_with_rollover_penalties_my/
Tongue
unamis76 (Legendary)
June 03, 2015, 06:21:10 PM  #34

This is quite the idea... It definitely has legs for the long run.

What bothers me is really how to implement it... I'm also not knowledgeable in code, but the pool seems a pretty complex thing to set up. The funds would have to reside in an address. Who would hold such a private key?
kazuki49 (Sr. Member)
June 03, 2015, 06:23:44 PM  #35

Quote from: unamis76
What bothers me is really how to implement it... I'm also not knowledgeable in code, but the pool seems a pretty complex thing to set up. The funds would have to reside in an address. Who would hold such a private key?

Gavin ofc, he owns Bitcoin now like his ideological fellow at darkcoin/dash.
nutildah (Legendary)
June 03, 2015, 06:28:10 PM  #36


Holy Cow I was unaware of this thread, thanks I have some reading to do at work now.

And the initial reason I commented on this post is because nobody else had and it would have been a shame to see it fall off the first page with no responses.

Meni Rosenfeld (OP) (Donator Legendary)
June 03, 2015, 06:55:06 PM  #37

Quote from: kazuki49
Quote from: Meni Rosenfeld
My email correspondence with Gavin so far:
lol he just called you an ideas man, your efforts are futile. Bitcoin is the El Dorado and has no flaws, no dev will ever adopt something made by other coins, it would imply Satoshi was slightly mistaken in his first attempt at a blockchain Cheesy
That's not really what he said. I am mostly an idea man though, and happy to be one.


Quote from: nutildah
Holy Cow I was unaware of this thread, thanks I have some reading to do at work now.

And the initial reason I commented on this post is because nobody else had and it would have been a shame to see it fall off the first page with no responses.
Well, I think there is some policy about writing comments that don't add new content of their own and just support previous content, or something. I was happy to see your feedback. As I recall, you posted your comment quite early, before there was a real need for bumping, but I appreciate the intent.


Quote from: unamis76
This is quite the idea... It definitely has legs for the long run.

What bothers me is really how to implement it... I'm also not knowledgeable in code, but the pool seems a pretty complex thing to set up. The funds would have to reside in an address. Who would hold such a private key?
No, the funds don't reside in an address. That's like saying that the 6.8 million bitcoins that haven't been mined yet reside in an address.

The funds just exist as a feature of the network, and the protocol defines how they are paid out to future miners (in the same way that the protocol dictates that each miner currently gets 25 BTC).

I don't believe the implementation is that complicated, but people more familiar with the codebase are in a better position to comment on that.

Zangelbert Bingledack (Legendary)
June 03, 2015, 06:58:45 PM  #38

Quote from: Meni Rosenfeld
It's difficult to know exactly how the quantitative factors will play out. The inelasticity is not total, but I believe it is significant, and contributes to the phenomenon. Even if things will not be as catastrophic as Mike describes, I believe they can get rather ugly, so any change that alleviates it is welcome.

Rusty Russell did some modeling on that today: What Transactions Get Crowded Out If Blocks Fill?
goatpig (Legendary)
June 03, 2015, 08:22:10 PM (Last edit: June 03, 2015, 08:48:49 PM by goatpig)  #39

Quote from: Meni Rosenfeld
This is similar to the idea of eschewing a block limit and simply hardcoding a required fee per tx size.

I assume you are referring to the debate on "hard block size limit + organic fees" versus "no block size limit + hard fees", the third option (no block limit and organic fees) being a non-solution. Obviously an "organic block size limit + organic fees" is the ideal solution, but I think the issue is non-trivial, and I have no propositions to achieve it. I don't even know if it's philosophically possible.

In this light, a "pseudo elastic block size limit + organic fees" is the better and most accessible solution at the moment, and I will argue that my proposal cannot be reduced to "no block size limit + hard fees", and that it actually falls under the same category as yours. Indeed, like your proposal, mine relies on an exponential function to establish the ratio of fee expended to block size. Essentially the T-2T range remains, where any block below T needs no fees to be valid, and the total fee grows exponentially from T to 2T.

In this regard, my proposal uses the same soft-hard cap range mechanics as yours. As I said, ideally I'd prefer a fully scalable solution (without any artificial hard cap), but for now this kind of elastic soft-hard cap mechanic is better than what we have and simple enough to review and implement. The fact that my solution has caps implies there will be competition for fees as long as the seeding constants of the capping function are tuned correctly. On this front it behaves neither worse nor better than your idea.

Since I believe fees should be pegged to difficulty, fees wouldn't be hardcoded either. Rather, the baseline would progress inversely to network hashrate, while leaving room for competition over scarce block room.

Quote
The main issue I have with this kind of ideas is that it doesn't give the market enough opportunity to make smart decisions

I will again argue to the contrary. As a matter of fact, I believe your solution offers no room for such adaptive market behavior, while mine does. To take both your examples in order:

Quote
such as preferring to send txs when traffic is low

With T being the soft cap and 2T the hard cap, your solution proposes to penalize all miners creating blocks larger than T. This pegs blockchain space to fees, the same as my proposal: the more txs waiting in the mempool, the higher the fee you need to get included in the next block. Inversely, the less items in the mempool, the more likely you are to have your low/zero fee tx mined right away, which creates an incentive to emit transactions during low traffic periods.

While your approach supports emitting txs during low traffic by imposing extra costs on high traffic, mine simply does without the extra cost. That doesn't mean emitting transactions during low traffic is NOT cheaper. As a matter of fact, it is, but the difference between low and high traffic isn't as significant.

The true difference between my solution and yours is that while mine allows miners to exceed the T soft cap as long as there are enough fees to go by, yours penalizes all blocks outgrowing T, which effectively locks all blocks at size T.

Indeed a selfish miner would have no incentive to build blocks beyond T, and they will also benefit by not including 0/low fee transactions. Indeed, by leaving all 0/low fee transactions in the mempool and only creating small blocks, where the block size is defined as min(total size of high fee txs, T), a large selfish miner can drain the mempool of all high fee transactions, leaving the rest of the network to pick up the slack.

Other miners are left with the choice to fill blocks up to T, but not further. Due to the selfish miners' actions (who are pumping high fee, small size blocks), there are not enough fees to be redeemed from the mempool, and the penalties for including transactions past T would come out of the good-willed miners' coinbase rewards. You may have "benevolent" miners who would rather empty the mempool than follow game theory, but they only stand to earn less money than everybody else. On the other hand, selfish miners still qualify to get a cut of the fee pool, to which they make a point not to contribute, effectively siphoning revenue from good-willed and "benevolent" miners.

The true effect on the network is that no one will bother creating blocks larger than T, and we will still have a de facto hardcoded block size cap.

My proposal offers to allow miners to take in extra fees as long as they remain below the curve defined by the capping function. Selfish miners no longer have an opportunity to vampirize good-willed miners, so while the entire behavior (high fee, low size blocks) is not deterred, it is at least not encouraged.

Please keep in mind that this analysis of your system relies on my current understanding of it. I'm still not 100% clear how the "fee pool" functions. I'm assuming it is either only funded by penalties, with all fees paid to miners directly, or that all fees are pooled and distributed equally per block. The latter assumption seems pointless since it can be easily bypassed, so I'm using the former as the basis of my critique of your proposal.

Quote
or to upgrade hardware to match demand for txs

Again I will argue that my proposal supports hardware improvement while yours doesn't. In your case, T will act as a de facto hard cap on the block size, so there is no incentive for miners to be able to handle more traffic and create blocks beyond that value. As long as miners won't output blocks larger than T, there is no reason for the rest of the network to upgrade either.

With my solution, as long as there are enough fees to go by, up to the 2T block size (or whatever the hard cap ends up being), miners are motivated to include transactions paying fees beyond the soft cap, which justifies hardware improvement to handle the extra load, with the consequences this has on the rest of the network.

Quote
Another issue with this is miner spam - A miner can create huge blocks with his own fee-paying txs, which he can easily do since he collects the fees.

Both in the current state and the solution you propose, malevolent miners can disturb the network by mining either empty blocks, or blocks full of "useless transactions", sending their own coins back to themselves. In my solution malevolent miners also have the added opportunity to mine large blocks by paying fees to themselves. Let's analyze what counter there is to these 3 disturbance attacks.

1) In case of empty blocks, all good-willed and selfish miners should simply ignore them. It increases their revenue and impedes the attack.

2) In case of "useless transactions", either the transactions were never made public, in which case the blocks are easy to identify (full of txs that never hit the mempool) and can be ignored for the same reason as above, or the transactions have been published publicly, and the attacker is making a point of mining only these. At this point you can't really distinguish this miner from good-willed ones with either solution.

3) With my solution, malevolent miners can pay themselves fees and enlarge blocks. However that only holds true if they are keeping the transactions private. In this case other miners can identify such blocks as malevolent and ignore them entirely (2T wide blocks full of large fee txs that never hit the mempool). If the attacking miner makes the transactions public, he can't expect to rake in all fees, so the solution to this attack is no different from 1 & 2.

4) With your solution, what is to stop a malevolent miner from maxing out blocks with 0 fee transactions? Sure he would give up the coinbase reward in the form of penalties, but now these are available for the taking by other miners. Good-willed miners may ignore such blocks, but selfish miners probably won't.

With my solution, there is an incentive for both good-willed and selfish miners to ignore blocks outright constructed to bloat the network. With yours, there is a built-in cost to such disturbance, so selfish miners can choose to ignore the disturbance for the benefit of the reward. In my system, the only viable economic option is to ignore the blocks.

Quote
Using difficulty to determine the size/fee ratio is interesting. I wanted to say you have the problem that difficulty is affected not only by BTC rate, but also by hardware technology. But then I realized that the marginal resource costs of transactions also scales down with hardware. The two effects partially cancel out, so we can have a wide operational range without much need for tweaking parameters.

2 years ago I would have opposed pegging fees and block size to difficulty, because ASIC technology was catching up to current manufacturing processes and as such was growing much faster than every other hardware supporting the network. That would require too many manual adjustments of the pegging function to be acceptable. As time passes by, that criticism loses ground, and now is not a bad time to consider it.

I would be interested to see if you have room to factor difficulty into your current function.

Quote from: Meni Rosenfeld
Quote from: goatpig
I expect a fee pool alone will increase block verification cost.
It would not, in any meaningful way.

I try not to be so quick with drawing such conclusions. I'm not savvy with the Core codebase, but my experience with blockchain analysis has taught me that the less complicated a design is, the more room for optimization it has. You can't argue that adding a verification mechanic will simplify code or reduce verification cost, although the magnitude of the impact is obviously relevant. I'm not in a position to evaluate that, but I would rather remain cautious.

The point still remains, you don't need a fee pool to establish a relationship between fee, block size, and possibly difficulty.

Don't get me wrong, I believe the idea has merits. What I don't believe is that these merits apply directly to the issue at hand. It can fix other issues, but other issues aren't threatening to split the network. I also don't think this idea is mature enough.

As Gavin says, without an implementation and some tests it is hard to see how the system will perform. If we are going to “theorycraft”, I will attempt to keep it as lean as possible.

Quote from: TierNolan
Quote from: goatpig
It also requires modifying, or at least amending, consensus rules, something the majority of the Core team has been trying to keep to a minimum. I believe there is wisdom in that position.

Obviously increasing the block size requires a hard fork, but the fee pool part could be accomplished purely with a soft fork.  

The coinbase transaction of the block must pay <size penalty> BTC to OP_TRUE as its first output.  Even if there is no size penalty, the output needs to exist but pay zero.

The second transaction must be the fee pool transaction.

The fee pool transaction must have two inputs: the coinbase OP_TRUE output from 100 blocks previously, and the OP_TRUE output from the fee pool transaction in the previous block.

The transaction must have a single output that is 99% (or some other value) of the sum of the inputs paid to OP_TRUE.


By ignoring fees paid in the block, it protects against miners using alternative channels for fees.

It seems your implementation pays the fee pool in full to the next block. That partly defeats the pool's purpose. The implementation becomes more complicated when you have to gradually distribute pool rewards to “good” miners, while you keep raking in penalties from larger blocks.

Otherwise, a large block paying high penalties could be followed right away by another large block, which will offset its penalties with the fee pool reward. The idea here is to penalize miners going over the soft cap and reward those staying under it. If you let the miners going over the cap get a cut of the rewards, they can offset their penalties and never care for the whole system.

As a result you need a rolling fee pool, not just a 1 block lifetime pool, and that complicates the implementation, because you need to keep track of the pool size across a range of blocks.
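A minimal sketch of such a rolling pool (the payout fraction is an illustrative parameter):

Code:
PAYOUT_FRACTION = 0.01  # share of the accumulated pool paid per block

class RolloverPool:
    def __init__(self):
        self.balance = 0.0

    def apply_block(self, penalty_paid):
        """Add this block's penalty; return the miner's pool payout."""
        self.balance += penalty_paid
        payout = PAYOUT_FRACTION * self.balance
        self.balance -= payout   # the rest rolls over to future blocks
        return payout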

nutildah (Legendary)
June 03, 2015, 09:36:35 PM  #40

Well, I think there is some policy about writing comments that don't add new content of their own and just support previous content, or something. I was happy to see your feedback. As I recall, you posted your comment quite early, before there was a real need for bumping, but I appreciate the intent.

Hey no problem. And I always feel slightly smarter after having read one of your theses, so thank you for that. As no doubt others have told you in the past, you have quite a gift for thinking outside the box in a non-profiteering sort of way, a quality which the cryptocurrency community is in dire need of.

Cheers from Hawaii,

Nutildah the Hungry
