gmaxwell
Staff
Legendary
Offline
Activity: 4284
Merit: 8808
|
|
January 26, 2016, 02:35:58 AM |
|
them utilising the methods Greg describes for sending blocks.
The greatest irony is that after that dozen-email back-and-forth in which I was completely unable to communicate to Peter R that miners can completely remove the communication delay from the time of mining, amid his irrelevant poetic waxing about information theory... he went on to submit a paper to Ledger describing the idea, without attribution. It incorporated most of the elements I'd already described in public, save one main difference: it screwed up the design so that participants would take orphan risk for new transactions, because the only way to add them to the pre-consensus is to attempt to put them in a Bitcoin block. Of course, rational miners could easily overlay on top of it the more efficient scheme of committing to and pre-forwarding new transactions _before_ actually including them, so it's no great harm... the only real effect of this design flaw is that it seemingly serves the desperate attempt at perpetuating some residue of the "natural fee market" myth.

We probably have spent too much time here thinking about these fancy schemes: the mechanism miners have historically used to mitigate orphaning is to centralize around larger pools. It's easy, has no startup cost, no development risks, and can be incrementally -- even accidentally -- deployed.

In my view these more efficient schemes are important not because they show that the "natural fee market" fails the game theory test, but because they're useful tools to cut latency (and thus progress) out of mining, and thus have a positive use in making mining more fair and less centralized. The fact that they also blow up any remaining claim that orphaning risk will act as a natural, if too limited to be at all effectual, control on block size is regrettable but unavoidable.
|
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
January 26, 2016, 02:49:31 AM |
|
Quote from: gmaxwell
In my view these more efficient schemes are important not because they show that the "natural fee market" fails the game theory test, but because they're useful tools to cut latency (and thus progress) out of mining... The fact that they also blow up any remaining claim that orphaning risk will act as a natural, if too limited to be at all effectual, control on block size is regrettable but unavoidable.

Isn't there a little bit of 'chicken-and-egg' here? Why would miners use these new 'efficient schemes' in the first place if doing so would render the fee market ineffectual, unless they were already relying on something else (a block size limit) to support a fee market?
|
|
|
|
BlindMayorBitcorn
Legendary
Offline
Activity: 1260
Merit: 1116
|
|
January 26, 2016, 02:51:35 AM |
|
@Gmax I did some sleuthy reconnaissance for you here.
|
Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
|
|
|
gmaxwell
Staff
Legendary
Offline
Activity: 4284
Merit: 8808
|
|
January 26, 2016, 03:04:39 AM |
|
Isn't there a little bit of 'chicken-and-egg' here?
Why would miners use these new 'efficient schemes' in the first place if doing so would render the fee market ineffectual, unless they were already relying on something else (a block size limit) to support a fee market?
These schemes reduce orphaning and allow miners to make a greater profit. It's not hypothetical: Matt's fast block relay protocol is the simplest and least effective of this class of scheme, and it is used almost universally today to carry blocks between pools. (A good thing too: otherwise there would likely be only a single mining pool now...)

Quote from: BlindMayorBitcorn
@Gmax I did some sleuthy reconnaissance for you here. And deploy a system for democratic governance that is better than ours.

About that: https://i.imgur.com/5kUV5d3.jpg

But really, no one can 'neuter' Bitcoin Classic but themselves; and they've done that just fine from day one, in most of the same ways as Bitcoin XT.
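For intuition about the bandwidth claim: the following is a minimal sketch (Python) of the general idea behind this class of relay scheme. It is not Matt's actual wire protocol, and the 6-byte short-ID construction is invented purely for illustration. If a peer already holds nearly every transaction of a new block in its mempool, the block can be announced as a header plus short references, and full transaction bytes only need to cross the wire for whatever the peer is missing.

Code:
import hashlib

def short_id(txid: bytes) -> bytes:
    # Illustrative 6-byte reference; real protocols derive short IDs differently.
    return hashlib.sha256(txid).digest()[:6]

def announce_block(header: bytes, block_txs: dict, peer_mempool: set) -> bytes:
    """Compact announcement: header + short IDs, with full bytes only for unknown txs."""
    msg = bytearray(header)
    for txid, raw_tx in block_txs.items():
        if txid in peer_mempool:
            msg += short_id(txid)   # peer rebuilds the transaction from its own mempool
        else:
            msg += raw_tx           # only genuinely new data travels at block time
    return bytes(msg)

# With ~2000 transactions of ~500 bytes each and a peer that already has 99% of
# them, the announcement is a few tens of kilobytes instead of roughly 1 MB.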
|
|
|
|
BlindMayorBitcorn
Legendary
Offline
Activity: 1260
Merit: 1116
|
|
January 26, 2016, 03:18:55 AM |
|
You gave up your commit access. What does that mean? Is it because we're toxic?
|
Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 26, 2016, 03:51:50 AM Last edit: January 26, 2016, 04:16:04 AM by Peter R |
|
Quote
If a fee market breaks down because miners are not acting rationally, it seems that that could happen with or without a blocksize limit, and generally to the same degree.

Right. Miners being net econo-rational is probably an axiom for Bitcoin to work in the first place.

A story about imperfect models and not-perfectly-realistic assumptions: a few years ago, I hired a new electronics technician (from Russia) to work at my non-bitcoin company. We needed to increase the bandwidth of a sensor circuit, and I noticed him messing around with a bunch of capacitors, soldering them on and off as though by trial and error. This seemed really strange to me, so I showed him how to calculate the bandwidth based on the RC filters used and the GBWP of the amplifier. He told me that calculations like that "only work in theory" and that he preferred trial and error instead. Anyway, I showed him how to calculate the correct values for the resistors and capacitors, and then we went to look at the results with the oscilloscope. He was surprised at how close we were able to get to the behaviour he wanted without any trial and error at all. But I was surprised at how far off the bandwidth was from my theoretical calculation, and I ended up making a slight "trial and error" change anyway! [I missed a couple of nuances about that circuit that I now understand.]

The point of the story is that theoretical models only need to be useful--but we must understand that they are often not perfect. By ignoring theory, you'll spin your wheels and progress will be slow; by overstating the validity of a theory, reality will kick you in the butt and show you what you don't know.

With the academic work regarding the transaction fee market, what we're doing is considering different lenses through which to view the problem, so that we can make incremental progress in our understanding. We know that these models are imperfect, but we ask what happens if the assumptions hold, so that we can gain intuition about the more complicated real problem.

For example, if we assume that perfect competition exists AND that the marginal cost of block space is zero (the top left square above), then indeed fees go to zero and the blockchain fills with spam. I believe Dr. Nicolas Houy was the first to study the problem through that lens in his 2014 paper "The economics of Bitcoin transaction fees." Although the model was obviously simplistic, this paper revealed the interesting insight that a block size limit is economically equivalent to a fixed-fee requirement.

If we assume that miners communicate a non-zero amount of information about the transactions in the block at block-solution time, but still assume perfect competition, then the marginal cost of block space is nonzero and a transaction fee market exists without a block size limit. I believe I was the first to formally analyze this problem in my paper from August 2015 (the top right square in the table above). This is obviously simplistic as well, because the market won't be perfectly competitive--miners might have some pricing power.

If we assume monopoly conditions (the two bottom squares in the table above), then the "mining cartel" will enforce a block size limit that maximizes the "producer surplus," as shown below.

Reality is somewhere in the middle of all these models. By doing research and asking ourselves what happens given certain simplifying assumptions, we slowly build a deep understanding of the complex real phenomena.
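As an aside, the back-of-the-envelope calculation described in the story might look something like this (a sketch with invented component values; it deliberately ignores the second-order effects mentioned above):

Code:
from math import pi

GBWP = 10e6        # amplifier gain-bandwidth product, Hz (made-up value)
NOISE_GAIN = 100   # closed-loop noise gain of the stage (made-up value)
R, C = 10e3, 1e-9  # RC low-pass on the signal path, ohms and farads (made-up values)

amp_bw = GBWP / NOISE_GAIN    # closed-loop bandwidth set by the op-amp
rc_bw = 1 / (2 * pi * R * C)  # -3 dB corner of the RC filter

# To first order the slower pole dominates the sensor circuit's bandwidth.
print(f"amplifier pole ~{amp_bw / 1e3:.0f} kHz, RC pole ~{rc_bw / 1e3:.0f} kHz, "
      f"estimate ~{min(amp_bw, rc_bw) / 1e3:.0f} kHz")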
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 26, 2016, 04:00:38 AM Last edit: January 26, 2016, 06:40:56 AM by Peter R |
|
Quote from: gmaxwell
The greatest irony is that after that dozen-email back-and-forth in which I was completely unable to communicate to Peter R that miners can completely remove the communication delay from the time of mining... he went on to submit a paper to Ledger describing the idea, without attribution... it screwed up the design so that participants would take orphan risk for new transactions, because the only way to add them to the pre-consensus is to attempt to put them in a Bitcoin block.

Indeed you did inspire me to work on subchains, Greg, for two reasons:

1. I didn't buy your claim that things like orphaning risk couldn't contribute to proof-of-work security.

2. I wanted to better understand how pre-consensus through weak blocks might affect the fee market (I could see that the fee market wouldn't collapse, but I wanted a better understanding of how the math would play out).

Quote from: gmaxwell
...without attribution...
I actually did cite you, Greg. Here's a screen shot:

I had originally included another citation, but an internal reviewer suggested that pointing out more of your economic misunderstandings might come across as unnecessarily hostile.
|
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
January 26, 2016, 04:02:41 AM |
|
Isn't there a little bit of 'chicken-and-egg' here?
Why would miners use these new 'efficient schemes' in the first place if doing so would render the fee market ineffectual, unless they were already relying on something else (a block size limit) to support a fee market?
Quote from: gmaxwell
These schemes reduce orphaning and allow miners to make a greater profit. It's not hypothetical: Matt's fast block relay protocol is the simplest and least effective of this class of scheme, and it is used almost universally today to carry blocks between pools. (A good thing too: otherwise there would likely be only a single mining pool now...)

I don't understand. If it's 'used almost universally', then how can it be a competitive advantage that would result in higher profits? Also, whatever the miners are doing today seems like a completely different scenario (block subsidies are still high, and there is also a block size limit).
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 26, 2016, 04:07:56 AM Last edit: January 26, 2016, 04:18:51 AM by Peter R |
|
Isn't there a little bit of 'chicken-and-egg' here?
Why would miners use these new 'efficient schemes' in the first place if doing so would render the fee market ineffectual, unless they were already relying on something else (a block size limit) to support a fee market?
More efficient block propagation results in higher profits for the miner, so miners are incentivized to improve their communication with the network's hash power. As the network as a whole gets better at block propagation, the "supply curve" falls, resulting in lower fees per kilobyte (measured in BTC) and higher average block sizes, for a given demand curve. This is a good thing, as it is what will allow miners to produce much larger blocks in the future, as well as allow fees per kilobyte to continue to fall (especially important should the price of a bitcoin rise dramatically).
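To make the falling supply curve concrete, here is a toy calculation in the spirit of the orphan-risk fee model (all numbers are invented for illustration, and this is a simplification rather than the paper's exact formulation): as effective propagation speeds up, the minimum fee density a rational miner needs in order to include one more kilobyte falls proportionally.

Code:
from math import exp

BLOCK_REWARD = 25.0   # BTC at stake if the block is orphaned (invented round number)
T = 600.0             # average block interval, seconds

def min_fee_per_kb(propagation_rate_kb_per_s: float) -> float:
    """Smallest fee density (BTC/kB) at which one more kB pays for its added orphan risk."""
    extra_delay = 1.0 / propagation_rate_kb_per_s   # seconds of delay added per kB
    extra_orphan_prob = 1 - exp(-extra_delay / T)   # added chance the block is orphaned
    return BLOCK_REWARD * extra_orphan_prob

for rate in (10.0, 100.0, 1000.0):   # effective block propagation, kB/s
    print(f"{rate:6.0f} kB/s -> minimum fee ~{min_fee_per_kb(rate):.6f} BTC/kB")

Under these invented numbers, a 100x improvement in propagation drops the break-even fee density by roughly 100x, which is the "supply curve falling" effect described above.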
|
|
|
|
gmaxwell
Staff
Legendary
Offline
Activity: 4284
Merit: 8808
|
|
January 26, 2016, 04:29:11 AM Last edit: January 26, 2016, 06:56:26 AM by gmaxwell |
|
Quote from: Peter R
Indeed you did inspire me to work on subchains,

A fact your work failed to disclose, along with the fact that nearly the totality of the design--including soft confirmations and the effect on orphaning--had already been described by me previously (e.g. and in many other places going back some time), including painstaking explanation in direct correspondence with you. Considering the over-the-top incivility you've expressed towards me in the past--as well as towards my company and the community-maintained reference implementation of Bitcoin (consider your current message signature here: "The fall of Blockstream Core draws near.")--and the vigor with which you disputed the ability of pre-consensus to eliminate size-proportional orphaning, it's difficult to believe this was an honest omission.

Quote from: Peter R
I actually did cite you, Greg. Here's a screen shot:

Quote from: Peter R's paper
Certain investigators have argued that fees that result from orphaning risk do not contribute to network security. For example, Maxwell argued, "the fact that verifying and transmitting transactions has a cost isn't enough, because all the funds go to pay that cost and none to the POW 'artificial' cost." With a simple diagram, we prove this line of reasoning false.

That isn't a citation for my work; it's a petty dispute of an unrelated criticism. A citation for my work would have been something like "Maxwell proposed a general class of second-order consensus techniques for eliminating size-proportional orphaning risk and achieving fast estimation of confirmation [], in this paper we flesh out and formalize an instance of this approach, which we call subchains, and analyze its scaling properties and incentives."

Yet again it seems this is another case where I offer up novel inventions for true scalability in the Bitcoin system, only to have them first claimed impossible and then, when their truth can no longer be denied, falsely attributed to other people, who then turn around and argue that I don't understand or care about scalability, puffing up a narrative that I do not attempt solutions to the concerns I raise. Instead of providing even the most basic credit for my invention, you've accused me of being a "technician" and argue that "it's become increasingly clear that [Greg Maxwell] actually has a fairly superficial understanding of large swaths of computer science, information theory, physics and mathematics"; perhaps to placate your conscience for ripping off my work wholesale? I'd find it funny that people who think my efforts are of such low value seem so eager to attach their own names to them, if not for the fact that the effort to falsely elevate your expertise above others is an activity with a risk of seriously negative consequences for the bitcoin ecosystem.
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 26, 2016, 04:47:27 AM Last edit: January 31, 2016, 10:33:41 PM by Peter R |
|
I actually did cite you, Greg. Here's a screen shot:
Quote from: gmaxwell
That isn't a citation for my work; it's a petty dispute of an unrelated criticism.

It is a citation to your past statement where you claim that things like orphaning risk cannot contribute to PoW security. I believe I showed that those claims are false. It's okay to be wrong sometimes. Everyone is. Just admit to it once in a while... haha, you might even help to create a more positive environment around here!

In regard to your comment about using pre-consensus techniques to completely eliminate block-size-dependent orphaning risk, you were wrong about that too, as I demonstrated in the subchain paper.

For the lurkers out there: Greg and I used to get along fine, but things took a bad turn when I began researching topics related to on-chain scaling for Bitcoin. I can trace the hostility between us back to this (IMO) unnecessarily spiteful public comment Greg made after I posted some ideas (that later inspired this work, by the way) in Cypherdoc's thread:

Quote from: gmaxwell
This post seems to be filled with equations and graphs which may baffle the non-technical while actually making some rather simple and straight-forward claims that are, unfortunately, wrong on their face. ... what your post (and this reddit thread) have shown is that someone can throw a bunch of symbolic markup and mix in a lack of understanding and measurement and make a pseudo-scientific argument that will mislead a lot of people, and that you're willing to do so or too ignorant to even realize what you're doing.

https://www.reddit.com/r/Bitcoin/comments/3c579i/yesterdays_fork_suggests_we_dont_need_a_blocksize/cssghn6
|
|
|
|
r0ach
Legendary
Offline
Activity: 1260
Merit: 1000
|
|
January 26, 2016, 06:49:16 AM |
|
Another day, another Blockstream vs Peterstream thread.
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
January 26, 2016, 07:29:33 AM |
|
You have two ways to deal with the situation: Raise block size to mega centralization levels so only datacenters can run nodes which makes Bitcoin useless but at least you can pay lower fees. Or keep block size small and decentralized, let a market fee develop, and run spammy transactions through LN.
Choose one.
Both are forms of centralization, since in the small-block solution some royal personage has to decide what block size is small enough to create scarcity.

I'd rather try to work towards some other solution -- for instance, one where nodes express their block size preferences individually, in a way that reflects their actual computing resources, and we arrive at consensus from there, by negotiation that transcends the purely algorithmic domain. A prerequisite for that is to make full nodes relevant to miners, and that can happen if the nodes can form a superior relay network which does away with the pressure for miner cartelization.

Even if it turns out that in that situation the orphaning cost is too small to enable a natural market, a block size limit will still exist, and furthermore, nodes can set a lower limit on the fees they are willing to accept, as alluded to here.
|
"God does not play dice"
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
January 26, 2016, 03:04:14 PM |
|
TL/DR: A transaction fee market exists without a block size limit assuming miners act rationally. . . . Miners being net econo-rational is probably an axiom for Bitcoin to work in the first place.
This is the biggest assumption of various schools of economics. However, the economy is always political: economic policy is decided by the ruling class to manage the majority of poor people, so the models are aimed at average households that can barely live without the next paycheck. For these people, being econo-rational means short-term profit seeking.

However, the definition of econo-rational is different for different people, depending on their income, their time frame, and their risk tolerance level. If you apply those theories to bankers and large capitalists, you will clearly see that their behavior does not follow these models. As we know, the miners and pool owners are the bankers and capitalists of the bitcoin ecosystem, so they won't seek short-term profit like an average household; their concerns are much larger and longer-term.
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 26, 2016, 06:28:17 PM |
|
TL/DR: A transaction fee market exists without a block size limit assuming miners act rationally. . . . Miners being net econo-rational is probably an axiom for Bitcoin to work in the first place.
Quote from: johnyj
...the definition of econo-rational is different for different people, depending on their income, their time frame, and their risk tolerance level... the miners and pool owners are the bankers and capitalists of the bitcoin ecosystem, so they won't seek short-term profit like an average household; their concerns are much larger and longer-term.

Yes, I completely agree. Like I said up-thread, with this academic work regarding the transaction fee market, what we're doing is considering different lenses through which to view the problem, so that we can make incremental progress in our understanding. We know that these models are imperfect, but we ask what happens if the assumptions hold, so that we can gain intuition about the more complicated real problem. More here: https://bitcointalk.org/index.php?topic=1274102.msg13678877#msg13678877

And YarkoL's post was great: https://bitcointalk.org/index.php?topic=1274102.msg13680077#msg13680077
|
|
|
|
TooDumbForBitcoin
Legendary
Offline
Activity: 1638
Merit: 1001
|
|
January 27, 2016, 01:47:50 AM |
|
These schemes reduce orphaning and allow miners to make a greater profit. It's not hypothetical, Matt's fast block relay protocol is the simplest and least effective of this class of scheme and it is used almost universally today to carry blocks between pools. (A good thing too: otherwise there likely would only be a single mining pool now...).
In the last week or ten days, Matt posted that he was looking for someone to (paraphrasing here) "take over the running of the relay network server". If the relay network server goes unmaintained, what will be the consequences? Is this server a point of centralization?
|
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
January 27, 2016, 02:33:50 AM |
|
You have two ways to deal with the situation: Raise block size to mega centralization levels so only datacenters can run nodes which makes Bitcoin useless but at least you can pay lower fees. Or keep block size small and decentralized, let a market fee develop, and run spammy transactions through LN.
Choose one.
Quote from: YarkoL
Both are forms of centralization, since in the small-block solution some royal personage has to decide what block size is small enough to create scarcity. I'd rather try to work towards some other solution... A prerequisite for that is to make full nodes relevant to miners, and that can happen if the nodes can form a superior relay network which does away with the pressure for miner cartelization.

I like the general direction of thought here. Intuitively, I don't like artificial scarcity either.

I am wondering if there is some possible configuration where the difficulty of solving a block diminishes as fees increase. Could that be done in such a way that it's easier for miners to build blocks that have fees, while mitigating any attack vectors that would arise out of such a scheme (such as an attacker sending huge fees to himself and mining endless empty blocks)?
|
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
January 27, 2016, 08:07:48 AM |
|
I am wondering if there is some possible configuration where the difficulty of solving a block diminishes as fees increase. Could that be done in such a way that it's easier for miners to build blocks that have fees, while mitigating any attack vectors that would arise out of such a scheme (such as an attacker sending huge fees to himself and mining endless empty blocks)?
Bitcoin behaves in the opposite way. When mining profits increase, they attract more hashpower, which drives up the difficulty. Otherwise you'd end up with a congestion of blocks.
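For reference, Bitcoin's retargeting rule (simplified here, ignoring the clamping of extreme adjustments) is roughly:

Code:
TARGET_SPACING = 600       # seconds per block Bitcoin aims for
RETARGET_INTERVAL = 2016   # blocks between difficulty adjustments

def next_difficulty(old_difficulty: float, actual_timespan: float) -> float:
    """Scale difficulty so the next 2016 blocks take ~two weeks (clamping omitted)."""
    expected = TARGET_SPACING * RETARGET_INTERVAL
    return old_difficulty * expected / actual_timespan

# If higher profits attract 25% more hash power, the 2016 blocks arrive in 80% of
# the expected time, and the next adjustment raises difficulty by 25%.
print(next_difficulty(1.0, 0.8 * TARGET_SPACING * RETARGET_INTERVAL))  # -> 1.25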
|
"God does not play dice"
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
January 27, 2016, 12:51:57 PM |
|
I am wondering if there is some possible configuration where the difficulty of solving a block diminishes as fees increase. Could that be done in such a way that it's easier for miners to build blocks that have fees, while mitigating any attack vectors that would arise out of such a scheme (such as an attacker sending huge fees to himself and mining endless empty blocks)?
Quote from: YarkoL
Bitcoin behaves in the opposite way. When mining profits increase, they attract more hashpower, which drives up the difficulty. Otherwise you'd end up with a congestion of blocks.

Well, sort of. Mostly because the BTC price valued in dollars rises; actual subsidies remain steady until they halve. In both cases we are talking about long-term profits, but I guess I'm thinking more about short-term variations from one block to the next.
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 27, 2016, 04:51:08 PM |
|
I am wondering if there is some possible configuration where the difficulty of solving a block diminishes as fees increase.
We suspect this will sort of happen naturally as the fee/reward ratio increases. It won't be that the difficulty decreases as available fees increase, but that the hash rate increases instead. Since the expected time to solve a block is proportional to the difficulty/hashrate ratio, the effect of the hash rate increasing is similar to the difficulty decreasing.

The reason we suspect this will happen is that if there are very few fees available, then only the miners with the lowest electricity costs will mine. As more and more fee-paying transactions pile up in the mempool, more and more miners will turn on their equipment, thereby increasing the network hash rate. Block times will no longer follow an exponential distribution, but something more like a second-order gamma distribution instead.

The really cool thing about this is that it means bitcoin transactions will always remain essentially free if you are willing to wait a long time to get your transaction mined.
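A toy simulation of this effect (all parameters invented; the single step in hash rate is a crude stand-in for miners switching on as fees accumulate) shows how block times stop looking exponential once hash power responds to the fee backlog:

Code:
import random

D = 600.0                  # scale chosen so a relative hash rate of 1.0 gives 600 s blocks
H_LOW, H_HIGH = 0.2, 1.5   # relative hash rate before/after extra miners switch on (invented)
T_SWITCH = 400.0           # seconds until the fee backlog attracts the extra hash power (invented)

def block_time() -> float:
    """One block interval when hash rate steps up partway through the interval."""
    t = random.expovariate(H_LOW / D)   # only the lowest-cost miners active at first
    if t <= T_SWITCH:
        return t
    # Fee backlog has built up: more hash power comes online, block finding speeds up.
    return T_SWITCH + random.expovariate(H_HIGH / D)

samples = [block_time() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# A pure exponential would give var/mean^2 == 1; a value below 1 indicates the more
# peaked, gamma-like block-time distribution described above.
print(f"mean block time ~{mean:.0f} s, var/mean^2 = {var / mean**2:.2f}")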
|
|
|
|
|