adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
|
|
September 02, 2015, 08:03:56 PM |
|
In 12 months, average blocksize will be about 1mb on the current trajectory, never mind spikes, unforeseen circumstances, and the many months' time needed for consensus and rollout.
How long do you think we should wait?
~11.5 months
|
|
|
|
poeEDgar
|
|
September 02, 2015, 08:05:49 PM |
|
poeEDgar,
you make some fair points.
Still, I think it is outrageous that the devs refuse to accept even Bip 102 with its 2 MB limit. It's clear to me they do not want main chain scaling and that it is counter to their business interests.
I think that in order to establish that point about their "business interests", you need to prove that their objections to the solutions proposed to date are without merit. (I also wouldn't mind an explanation as to how Blockstream's business model works and why they are incentivized against increasing block size. I have done a moderate level of research, but I would like to hear from BIP101 supporters why they are so confident that there is a clear conflict of interest here.)
I don't think it is unreasonable that devs don't explicitly support one proposal or another at this point. I think these discussions need to be further fleshed out. The most responsible approach is to poke as many holes as you can in every proposal and see if the result is robust enough to move forward. If you look at how this debate unfolded, it should be clear why devs aren't joining "Team BIP100" or "Team BIP102" or "Team BIP103." Doing so just results in more sectarian debate, which tends not to be about technical merit. I.e., this gets us nowhere.
The situation is not nearly as urgent as people are making it out to be. We won't be pushing the limit for a year or more, and even then, a temporary fee market is preferable to rolling out an update when there may be legitimate concerns about how it may adversely affect network security.
I think BIP102 is the most reasonable solution put forth thus far. I think any exponential scaling is dangerous (certainly when it is not warranted by real capacity needs), as it puts the protocol into untested conditions where new problems will surely arise. An incremental, gradual process is far superior, as it allows us to a) move forward with scaling without b) jeopardizing network security to an unprecedented extent. However, I don't consider myself to be on "Team BIP102" because I don't want to be biased against potentially better solutions that may arise, and I want to see more rigorous debate, with more emphasis on the technical issues of scaling. Less politics and accusations of nefarious intention, more discussion of technical merits.
Core is the problem, nothing else.
Core should clearly revise their priorities then. Their prioritizations are an obvious failure.
Like this.
|
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.
|
|
|
brg444
|
|
September 02, 2015, 08:06:59 PM |
|
In 12 months, average blocksize will be about 1mb on the current trajectory, never mind spikes, unforeseen circumstances, and the many months' time needed for consensus and rollout.
How long do you think we should wait?
If a solution is obviously needed, consensus will be easy to establish once we've taken the time to lay out the proper alternatives for implementation. It is absolutely reasonable to stay as long as possible under the current limit, and reaching it so far has caused nothing but healthy fee pressure. Understand that there is still a ton of spam transactions on the network daily which could be better handled off-chain, as they absolutely don't need the security of Bitcoin.
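To make the "fee pressure" point concrete, here is a rough sketch (illustrative Python only, not Bitcoin Core's actual template code, which also weighs ancestor packages): with a hard cap, miners fill the limited space highest-feerate first, so the cheapest transactions are the ones left waiting.
Code:
# Toy sketch only: how a fee market emerges when blocks are full.
# Miners fill the capped block space highest-feerate first, so the
# lowest-fee "spam" is the first to be left waiting in the mempool.
from dataclasses import dataclass

MAX_BLOCK_BYTES = 1000000  # the current 1 MB consensus limit

@dataclass
class Tx:
    txid: str
    size: int  # bytes
    fee: int   # satoshis

    @property
    def feerate(self) -> float:
        return self.fee / self.size  # satoshis per byte

def build_block_template(mempool):
    """Greedy feerate ordering: returns (included, left_waiting)."""
    included, used = [], 0
    for tx in sorted(mempool, key=lambda t: t.feerate, reverse=True):
        if used + tx.size <= MAX_BLOCK_BYTES:
            included.append(tx)
            used += tx.size
    left_waiting = [tx for tx in mempool if tx not in included]
    return included, left_waiting
The sketch says nothing about whether rationing by fee is desirable; it only shows that once the cap binds, fees become the rationing mechanism.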
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
September 02, 2015, 08:11:51 PM |
|
poeEDgar,
you make some fair points.
Still, I think it is outrageous that the devs refuse to accept even Bip 102 with its 2 MB limit. It's clear to me they do not want main chain scaling and that it is counter to their business interests.
I think that in order to establish that point about their "business interests", you need to prove that their objections to the solutions proposed to date are without merit. (I also wouldn't mind an explanation as to how Blockstream's business model works and why they are incentivized against increasing block size. I have done a moderate level of research, but I would like to hear from BIP101 supporters why they are so confident that there is a clear conflict of interest here.)
I don't think it is unreasonable that devs don't explicitly support one proposal or another at this point. I think these discussions need to be further fleshed out. The most responsible approach is to poke as many holes as you can in every proposal and see if the result is robust enough to move forward. If you look at how this debate unfolded, it should be clear why devs aren't joining "Team BIP100" or "Team BIP102" or "Team BIP103." Doing so just results in more sectarian debate, which tends not to be about technical merit. I.e., this gets us nowhere.
The situation is not nearly as urgent as people are making it out to be. We won't be pushing the limit for a year or more, and even then, a temporary fee market is preferable to rolling out an update when there may be legitimate concerns about how it may adversely affect network security.
I think BIP102 is the most reasonable solution put forth thus far. I think any exponential scaling is dangerous (certainly when it is not warranted by real capacity needs), as it puts the protocol into untested conditions where new problems will surely arise. An incremental, gradual process is far superior, as it allows us to a) move forward with scaling without b) jeopardizing network security to an unprecedented extent. However, I don't consider myself to be on "Team BIP102" because I don't want to be biased against potentially better solutions that may arise, and I want to see more rigorous debate, with more emphasis on the technical issues of scaling. Less politics and accusations of nefarious intention, more discussion of technical merits.
Core is the problem, nothing else.
Core should clearly revise their priorities then. Their prioritizations are an obvious failure.
Like this. I started another thread, "Is Greg Maxwell wrong about the blocksize?", where I give my thoughts on his blocksize comments. The conflict of interest, while relevant, isn't critical to showing a clear failure to act. In other words, regardless of their motives, the core team hasn't come to a consensus, despite some very reasonable proposals (BIP 102), and has in fact argued nothing should be done until there is a backlog. While you may feel Gavin is on the deep end of the pool, Greg is certainly on the other extreme.
|
|
|
|
poeEDgar
|
|
September 02, 2015, 08:18:14 PM |
|
In 12 months, average blocksize will be about 1mb on the current trajectory, never mind spikes, unforeseen circumstances, and the many months' time needed for consensus and rollout.
How long do you think we should wait?
As long as it takes to ensure that we are approaching network security responsibly. The prospect of full blocks and a fee market pales in comparison to the potential losses that will be incurred when exponential scaling causes unexpected threats to security. If BIP101 supporters came around to a more responsible, incremental approach that is in line with real transaction growth -- not transaction growth that they want to happen -- this issue would be a lot less controversial.
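For anyone who hasn't looked at the numbers, here is a back-of-the-envelope comparison (illustrative Python; the BIP101 schedule is simplified and the "demand-driven" growth rate is a made-up assumption, not an actual proposal) of what "exponential" versus "incremental" means over twenty years.
Code:
# Simplified BIP101 schedule: 8 MB at activation, doubling every two years
# for twenty years (the real spec interpolates linearly between doublings;
# that detail is ignored here). The "incremental" curve is a hypothetical
# demand-tracking schedule starting from a BIP102-style 2 MB bump.
BIP101_START_MB = 8.0
DOUBLING_PERIOD_YEARS = 2.0

def bip101_limit_mb(years_after_activation):
    return BIP101_START_MB * 2 ** (years_after_activation / DOUBLING_PERIOD_YEARS)

def incremental_limit_mb(years, assumed_growth_rate=0.15):
    # 15%/year is an arbitrary stand-in for "real transaction growth".
    return 2.0 * (1 + assumed_growth_rate) ** years

for y in (0, 4, 10, 20):
    print(y, round(bip101_limit_mb(y)), round(incremental_limit_mb(y), 1))
# year 20: ~8192 MB under the exponential schedule vs ~33 MB under the
# made-up 15%/year demand-tracking schedule.
The disagreement is essentially over which of those two curves the network should commit to in advance.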
|
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
September 02, 2015, 08:20:22 PM |
|
If BIP101 supporters came around to a more responsible, incremental approach that is in line with real transaction growth -- not transaction growth that they want to happen -- this issue would be a lot less controversial.
I agree with that statement... but as I have little to zero contact or influence with Gavin (although he might be reading this), there's not much I can personally do.
|
|
|
|
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
|
|
September 02, 2015, 08:22:12 PM |
|
If BIP101 supporters came around to a more responsible, incremental approach that is in line with real transaction growth -- not transaction growth that they want to happen -- this issue would be a lot less controversial.
I agree with that statement... but as I have little to zero contact or influence with Gavin (although he might be reading this), there's not much I can personally do.
The miners, too, agree on an "approach that is in line with real transaction growth": https://www.blocktrail.com/BTC/pools (>60% hashing for BIP100).
|
|
|
|
poeEDgar
|
|
September 02, 2015, 08:33:41 PM |
|
In other words, regardless of their motives, the core team hasn't come to a consensus, despite some very reasonable proposals (BIP 102), and has in fact argued nothing should be done until there is a backlog.
While you may feel Gavin is on the deep end of the pool, Greg is certainly on the other extreme.
Like I said earlier, doing nothing is preferable to jeopardizing security. I'm also not convinced that waiting until there is a consistent backlog is necessarily wrong. That may have been Satoshi's intent when he said we can change the limit when we need to, in response to the suggestion in 2010 that the block size limit be removed.
I think we all know that this debate really began in earnest following the first stress test and the release of the XT client. Not much time has passed at all. Like I said, I'm not sure it's appropriate for devs to "pick a team" just yet. I think that may simply cause more devolution in this debate.
I have found Greg's approach to be pretty compelling, as he gives considerable forethought to the issues and potential issues that come with scaling. This is in contrast to Gavin's approach, which I believe boils down to this and little more:
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.
Sure, technology gets better. That's not a very good basis to support the extent of scaling suggested in BIP101. And his cavalier attitude towards the potential problems that it may cause is a bit insulting.
|
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.
|
|
|
forevernoob
|
|
September 02, 2015, 08:44:07 PM |
|
Anyone know if Mike Hearn calls himself a libertarian? I know Gavin does.
Not that it matters but it's interesting to me. Mike & Gavin are really pushing the regulatory agenda.
|
|
|
|
jonald_fyookball
Legendary
Offline
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
|
|
September 02, 2015, 09:22:19 PM |
|
Like I said earlier, doing nothing is preferable to jeopardizing security.
Again, most people agree BIP102 doesn't jeopardize security, but the devs would rather do nothing. Doing nothing until there is a crisis and then rushing an emergency fix seems more dangerous than going with a reasonable approach now and testing the hell out of it. And this is preferable to jeopardizing adoption too.
|
|
|
|
brg444
|
|
September 02, 2015, 09:25:26 PM |
|
Like I said earlier, doing nothing is preferable to jeopardizing security.
Again, most people agree BIP102 doesn't jeopardize security, but the devs would rather do nothing. Doing nothing until there is a crisis and then rushing an emergency fix seems more dangerous than going with a reasonable approach now and testing the hell out of it. And this is preferable to jeopardizing adoption too.
If we only get the opportunity to fork the network once, this is what we should aim for. There is still no hurry to adopt BIP102 or any other proposal until every alternative has been worked through.
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
poeEDgar
|
|
September 02, 2015, 09:25:57 PM Last edit: September 02, 2015, 09:49:26 PM by poeEDgar |
|
Like I said earlier, doing nothing is preferable to jeopardizing security.
Again, most people agree BIP102 doesn't jeopardize security, but the devs would rather do nothing. Doing nothing until there is a crisis and then rushing an emergency fix seems more dangerous than going with a reasonable approach now and testing the hell out of it. And this is preferable to jeopardizing adoption too.
I believe I have already addressed this. I agree that a simple doubling of the block size is unlikely to jeopardize security. But I'm not sure that it is wise for devs to force the debate at this point by throwing their weight behind this or that proposal. The urgency is overstated; if we are still at this impasse in 9 months or a year, I will agree with you, assuming we are actually pushing the 1MB block limit.
However, I don't consider myself to be on "Team BIP102" because I don't want to be biased against potentially better solutions that may arise, and I want to see more rigorous debate, with more emphasis on the technical issues of scaling. Less politics and accusations of nefarious intention, more discussion of technical merits.
|
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.
|
|
|
agath
|
|
September 03, 2015, 12:33:04 AM |
|
A Proposal for the Consensus Building Process:
1) All BIPs have equal weight in the voting process.
2) Miners vote with each newly mined block.
3) BIPs are revised and amended according to arguments and reservations submitted, so that they are molded into the best possible versions that everyone is happy with.
4) Once 75% of blocks mined within the last 1000 blocks are in support of one BIP, an alert is sent out to everyone indicating that they should switch over to supporting this BIP.
5) Once 90% of blocks mined within the last 1000 blocks indicate support of the BIP, the debate is over and the new BIP is implemented; all blocks that do not include the new BIP protocol changes are rejected after that point.
Historically the threshold has been 95%, such as with BIP66, but I suggest lowering that to 90%. Agree? Disagree?
A few of us are working on creating an Open Letter to the Devs from The Bitcoin Community which will include this proposal. If you would like to help us shape this document PM me and I will send you a link to the Google Doc so you can help us write the letter.
What if someone implements a BIP that "establishes that consensus should be acknowledged with only 100 out of 1000 blocks", and this BIP gets >90% of the last blocks mined? :-)
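For what it's worth, the 75%/90% rule in the quoted proposal is easy to state in code. This is only a sketch (Python, with the way a block "signals" a BIP left abstract, e.g. a version bit or coinbase tag); note that it simply applies whatever thresholds are configured and takes the signals at face value, which is exactly the kind of loophole pointed out above.
Code:
# Rolling-window tally over the last 1000 blocks, as in the proposal:
# alert at 75% support for a single BIP, lock it in at 90%.
# Each block is represented only by the BIP id it signals, or None.
from collections import Counter

WINDOW = 1000
ALERT_THRESHOLD = 0.75
LOCK_IN_THRESHOLD = 0.90

def leading_bip(last_blocks):
    """Return (bip_id, support_fraction) for the most-signalled BIP."""
    window = last_blocks[-WINDOW:]
    votes = Counter(b for b in window if b is not None)
    if not votes:
        return None
    bip, count = votes.most_common(1)[0]
    return bip, count / len(window)

def status(last_blocks):
    result = leading_bip(last_blocks)
    if result is None:
        return "no signalling"
    bip, share = result
    if share >= LOCK_IN_THRESHOLD:
        return "%s locked in (%.0f%%): reject non-supporting blocks" % (bip, share * 100)
    if share >= ALERT_THRESHOLD:
        return "alert: %s at %.0f%%, switch-over recommended" % (bip, share * 100)
    return "leading: %s at %.0f%%" % (bip, share * 100)
Whatever the exact thresholds, the harder problem is defining what counts as a valid signal, which this sketch deliberately leaves out.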
|
|
|
|
HostFat
Staff
Legendary
Offline
Activity: 4270
Merit: 1209
I support freedom of choice
|
|
September 03, 2015, 12:45:11 AM |
|
https://www.reddit.com/r/Bitcoin/comments/3j8rg1/an_open_letter_to_the_bitcoin_community_from_the/cuo15pt
I'd be happy if they come to some form of agreement at the second workshop. The website does not suggest that this will happen - the statement that absolutely no decisions are made at workshops (plural) seems definitive. Still, it may be that whoever wrote the website FAQ isn't really representative of what the attendees will do or want. From the outside looking in, it doesn't seem like a great start. But perhaps things will happen anyway, despite the public statements of some of the attendees.
Regardless, the underlying problems remain. Gavin was having private discussions with some of the signers of this letter months ago, or trying to. So was I. Back in April I asked Wladimir privately what his proposed timeline was for a block size increase, and on that same thread Gavin proposed a one-time bump to 20mb with no further increases (as he'd tested that already). Wladimir didn't even bother replying to us. Gavin also tried talking to Gregory Maxwell in private. So have I. He simply doesn't respond to our emails either.
There is just no way this project can be credible when key players are signing open letters saying they want some kind of amazing consensus building process... and yet refuse to even reply to emails from developers who have been contributing to Bitcoin for longer than they have even been around.
|
|
|
|
squatter
Legendary
Offline
Activity: 1666
Merit: 1196
STOP SNITCHIN'
|
|
September 03, 2015, 12:59:09 AM |
|
https://www.reddit.com/r/Bitcoin/comments/3j8rg1/an_open_letter_to_the_bitcoin_community_from_the/cuo15pt
I'd be happy if they come to some form of agreement at the second workshop. The website does not suggest that this will happen - the statement that absolutely no decisions are made at workshops (plural) seems definitive. Still, it may be that whoever wrote the website FAQ isn't really representative of what the attendees will do or want. From the outside looking in, it doesn't seem like a great start. But perhaps things will happen anyway, despite the public statements of some of the attendees.
Regardless, the underlying problems remain. Gavin was having private discussions with some of the signers of this letter months ago, or trying to. So was I. Back in April I asked Wladimir privately what his proposed timeline was for a block size increase, and on that same thread Gavin proposed a one-time bump to 20mb with no further increases (as he'd tested that already). Wladimir didn't even bother replying to us. Gavin also tried talking to Gregory Maxwell in private. So have I. He simply doesn't respond to our emails either.
There is just no way this project can be credible when key players are signing open letters saying they want some kind of amazing consensus building process... and yet refuse to even reply to emails from developers who have been contributing to Bitcoin for longer than they have even been around.
He is right as far as this: Wladimir and Gregory should have given them a courtesy response to the effect of, "No, we are not implementing a 20x or an 8000x capacity increase. The small minority of developers that supports such drastic measures -- irrespective of real growth in transactions -- have shown no forethought as to why the drastic nature of these changes is necessary, nor have they accounted for these various security-related issues (x, y, z, ab, cd, ef, gh, etc.) that may arise under such conditions."
|
|
|
|
brg444
|
|
September 03, 2015, 01:03:13 AM |
|
on that same thread Gavin proposed a one-time bump to 20mb with no further increases (as he'd tested that already). Wladimir didn't even bother replying to us.
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
brg444
|
|
September 03, 2015, 01:05:56 AM |
|
https://www.reddit.com/r/Bitcoin/comments/3j8rg1/an_open_letter_to_the_bitcoin_community_from_the/cuo15pt
I'd be happy if they come to some form of agreement at the second workshop. The website does not suggest that this will happen - the statement that absolutely no decisions are made at workshops (plural) seems definitive. Still, it may be that whoever wrote the website FAQ isn't really representative of what the attendees will do or want. From the outside looking in, it doesn't seem like a great start. But perhaps things will happen anyway, despite the public statements of some of the attendees.
Regardless, the underlying problems remain. Gavin was having private discussions with some of the signers of this letter months ago, or trying to. So was I. Back in April I asked Wladimir privately what his proposed timeline was for a block size increase, and on that same thread Gavin proposed a one-time bump to 20mb with no further increases (as he'd tested that already). Wladimir didn't even bother replying to us. Gavin also tried talking to Gregory Maxwell in private. So have I. He simply doesn't respond to our emails either.
There is just no way this project can be credible when key players are signing open letters saying they want some kind of amazing consensus building process... and yet refuse to even reply to emails from developers who have been contributing to Bitcoin for longer than they have even been around.
He is right as far as this: Wladimir and Gregory should have given them a courtesy response to the effect of, "No, we are not implementing a 20x or an 8000x capacity increase. The small minority of developers that supports such drastic measures -- irrespective of real growth in transactions -- have shown no forethought as to why the drastic nature of these changes is necessary, nor have they accounted for these various security-related issues (x, y, z, ab, cd, ef, gh, etc.) that may arise under such conditions."
A simple "Gavin's 'tests' are broken, and we pointed out to you why" could've done the trick as well.
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
BayAreaCoins
Legendary
Offline
Activity: 4004
Merit: 1250
Owner at AltQuick.com
|
|
September 03, 2015, 02:33:42 AM |
|
on that same thread Gavin proposed a one-time bump to 20mb with no further increases (as he'd tested that already). Wladimir didn't even bother replying to us.
|
|
|
|
brg444
|
|
September 03, 2015, 02:52:13 AM |
|
on that same thread Gavin proposed a one-time bump to 20mb with no further increases (as he'd tested that already). Wladimir didn't even bother replying to us.
There is nothing sensible about pushing for a 20mb increase under current circumstances, further increases planned or not.
|
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
|
|
|
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
September 03, 2015, 02:55:38 AM |
|
It's about how to reach a consensus on a complex topic.
You don't need any vote on the validity of "1+1=2"; you always reach 100% consensus. The reason you need to vote is that the topic is so complex that the majority of participants cannot grasp it.
However, if a topic is too difficult for the majority to understand, then you have a complexity problem: the complexity makes participants lose their decision-making ability, so they have to rely on political practice (to be more precise, dice casting) to make a decision.
In such a case, I guess simulation is a better way to see into the future: suppose an airplane's crew members are all dead after fighting terrorists and none of the survivors has ever flown an airplane; your best bet is to let the one who has played a lot of flight simulators land the airplane.
|
|
|
|
|