brg444 (OP)
December 15, 2015, 08:53:46 PM
Quote: Transactions being non-reversible is a feature, not a bug. Under the schedule outlined in BIP101, Bitcoin will be able to compete with payment processors both now and ten years from now as transaction volume increases. I do not have anything against off-chain solutions as long as people are not "forced" to use them in order to transact cheaply in Bitcoin because of an arbitrarily and unnecessarily low blocksize limit.

Quote: I agree, but tell that to your regular Joe consumer. Under the proposed Lightning schedule, Bitcoin will be able to compete with payment processors by sometime next year while maintaining the highest standards of security and decentralization.

Quote: If we increase the blocksize then we can compete with payment processors now; you cannot expect the market to just wait for this without negative consequences, especially as Bitcoin fails to handle the transactional demand. I also think that increasing the blocksize would lead to more security and decentralization over the long run, since a high volume of low-fee transactions would be better than a low volume of high-fee transactions.

Quote: No we can't. Do you really think a meager 8x increase can have us compete with Visa or PayPal!?

Quote: As long as blocks do not fill up we can; this depends on the rate of adoption.

No it doesn't. It's about capacity. Realistically only Lightning can get us there. Raising the blocksize is not a scaling solution, as we've repeatedly tried to tell you over the last 200 pages or so.
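For rough numbers, a back-of-the-envelope sketch of that capacity gap (the average transaction size and the Visa figure are assumed ballparks, not numbers from this thread):

Code:
# Rough throughput at a given block size, assuming ~500-byte
# transactions and one block every ten minutes (ballpark assumptions).
AVG_TX_BYTES = 500
BLOCK_INTERVAL_SECONDS = 600

def tps(block_mb: float) -> float:
    """Transactions per second for a given block size in megabytes."""
    return block_mb * 1e6 / AVG_TX_BYTES / BLOCK_INTERVAL_SECONDS

print(f"1 MB blocks: ~{tps(1):.0f} tps")   # ~3 tps
print(f"8 MB blocks: ~{tps(8):.0f} tps")   # ~27 tps
# Visa averages on the order of 2,000 tps, so an 8x increase still
# leaves roughly two orders of magnitude of gap on-chain.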
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
VeritasSapere
December 15, 2015, 09:22:01 PM
Implementing LN without an increase in the blocksize would lead to massive centralization, because of the limited number of off-chain solutions that would be able to operate within such a limit. Even your precious Core developers have admitted that an increase in the blocksize is still necessary for global adoption, even with LN.
Are you really trying to say that increasing the blocksize does not increase capacity and scaling? Have you ever heard of the term doublespeak?
forevernoob
December 15, 2015, 09:30:00 PM
Quote: Implementing LN without an increase in the blocksize would lead to massive centralization, because of the limited number of off-chain solutions that would be able to operate within such a limit. Even your precious Core developers have admitted that an increase in the blocksize is still necessary for global adoption, even with LN. Are you really trying to say that increasing the blocksize does not increase capacity and scaling? Have you ever heard of the term doublespeak?
So what about SW? Is that increase not enough?
BitUsher
December 15, 2015, 09:33:31 PM
Quote: Implementing LN without an increase in the blocksize would lead to massive centralization, because of the limited number of off-chain solutions that would be able to operate within such a limit. Even your precious Core developers have admitted that an increase in the blocksize is still necessary for global adoption, even with LN. Are you really trying to say that increasing the blocksize does not increase capacity and scaling? Have you ever heard of the term doublespeak?

Quote: So what about SW? Is that increase not enough?

I don't know of any developer who thinks it's enough; that is why they are discussing a long list of options to address scalability. SegWit will essentially increase capacity by about 75% (like a 1.75 MB block limit), and more only if heavy multisig becomes more popular.
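The ~75% figure follows from discounting witness bytes; a back-of-the-envelope sketch of that arithmetic using the BIP141 weight rule (the witness fractions below are illustrative assumptions):

Code:
# SegWit capacity estimate: witness bytes are discounted 75%.
# BIP141 weight rule: weight = 3 * base_size + total_size <= 4,000,000.
WEIGHT_LIMIT = 4_000_000

def max_block_bytes(witness_fraction: float) -> float:
    """Max total block bytes when `witness_fraction` of all bytes are witness.

    total = base + witness, so weight = 4*base + witness
          = (4 - 3 * witness_fraction) * total.
    """
    return WEIGHT_LIMIT / (4 - 3 * witness_fraction)

print(max_block_bytes(0.0))    # 1,000,000 -> no witness data, the old 1 MB
print(max_block_bytes(0.55))   # ~1,700,000 -> typical spends, the "~75%" figure
print(max_block_bytes(0.7))    # ~2,100,000 -> heavy multisig, larger witnesses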
forevernoob
December 15, 2015, 09:39:55 PM
Quote: I don't know of any developer who thinks it's enough; that is why they are discussing a long list of options to address scalability. SegWit will essentially increase capacity by about 75% (like a 1.75 MB block limit), and more only if heavy multisig becomes more popular.
I was thinking more like SW + Lightning Network?
VeritasSapere
December 15, 2015, 09:42:11 PM
Quote: Implementing LN without an increase in the blocksize would lead to massive centralization, because of the limited number of off-chain solutions that would be able to operate within such a limit. Even your precious Core developers have admitted that an increase in the blocksize is still necessary for global adoption, even with LN. Are you really trying to say that increasing the blocksize does not increase capacity and scaling? Have you ever heard of the term doublespeak?

Quote: So what about SW? Is that increase not enough?

Quote: I don't know of any developer who thinks it's enough; that is why they are discussing a long list of options to address scalability. SegWit will essentially increase capacity by about 75% (like a 1.75 MB block limit), and more only if heavy multisig becomes more popular.

It also still needs to be implemented across the entire ecosystem: wallets, block explorers, and many other services will all need to modify their own custom software to adopt this change, and that will take time. Furthermore, as was already said, the actual increase in throughput is rather minimal.
VeritasSapere
December 15, 2015, 09:52:19 PM
Quote: I don't know of any developer who thinks it's enough; that is why they are discussing a long list of options to address scalability. SegWit will essentially increase capacity by about 75% (like a 1.75 MB block limit), and more only if heavy multisig becomes more popular.

Quote: I was thinking more like SW + Lightning Network?

I do not consider the Lightning Network an alternative to increasing the blocksize; this also comes down to ideology, however. I do not consider moving the majority of transactions off the Bitcoin blockchain a solution to scaling the Bitcoin blockchain itself. A solution to creating a Peer-to-Peer Electronic Cash System already exists: it's called Bitcoin. We just need to allow it to operate at the scale it was always intended to.
BitUsher
December 15, 2015, 09:55:59 PM
Quote: I don't know of any developer who thinks it's enough; that is why they are discussing a long list of options to address scalability. SegWit will essentially increase capacity by about 75% (like a 1.75 MB block limit), and more only if heavy multisig becomes more popular.

Quote: I was thinking more like SW + Lightning Network?

The Lightning Network won't be very useful with SegWit alone in the long term: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html
What we need is SegWit + LN + relay network improvements + a 2+4+8 MB limit increase + more off-chain solutions, and then further layers on top for scalability.

Quote: A solution to creating a Peer-to-Peer Electronic Cash System already exists: it's called Bitcoin. We just need to allow it to operate at the scale it was always intended to.

Satoshi is human and, despite being a genius, made many mistakes that need to be corrected. It isn't surprising that most developers regard BIP101 and XT as too aggressive: they are familiar with the limitations of the technology, how it can scale, and the inherent tradeoffs.
forevernoob
December 15, 2015, 10:03:59 PM
Quote: I do not consider the Lightning Network an alternative to increasing the blocksize; this also comes down to ideology, however. I do not consider moving the majority of transactions off the Bitcoin blockchain a solution to scaling the Bitcoin blockchain itself. A solution to creating a Peer-to-Peer Electronic Cash System already exists: it's called Bitcoin. We just need to allow it to operate at the scale it was always intended to.

Hate to break it to ya, but Bitcoin doesn't scale. I didn't get the 2+4+8 MB part. Are you talking about a hard fork?
VeritasSapere
December 15, 2015, 10:08:17 PM
Quote: I do not consider the Lightning Network an alternative to increasing the blocksize; this also comes down to ideology, however. I do not consider moving the majority of transactions off the Bitcoin blockchain a solution to scaling the Bitcoin blockchain itself. A solution to creating a Peer-to-Peer Electronic Cash System already exists: it's called Bitcoin. We just need to allow it to operate at the scale it was always intended to.

Quote: Hate to break it to ya, but Bitcoin doesn't scale. I didn't get the 2+4+8 MB part. Are you talking about a hard fork?

This meme that Bitcoin cannot scale is false; it is perpetuated by Core developers who commit the nirvana fallacy in engineering. Bitcoin does not scale efficiently, but it certainly can scale: we just need to increase the blocksize. To answer your question, yes, this does require a hard fork; hard forks are the only way to increase the blocksize. Hard forks are also an important governance mechanism within Bitcoin, allowing us to resolve fundamental differences and disagreements, even if that might mean splitting Bitcoin. At least that is better than the tyranny of the majority, or in this case the tyranny of a minority.
forevernoob
December 15, 2015, 10:11:28 PM
Quote: This meme that Bitcoin cannot scale is false; it is perpetuated by Core developers who commit the nirvana fallacy in engineering. Bitcoin does not scale efficiently, but it certainly can scale: we just need to increase the blocksize. To answer your question, yes, this does require a hard fork; hard forks are the only way to increase the blocksize. Hard forks are also an important governance mechanism within Bitcoin, allowing us to resolve fundamental differences and disagreements, even if that might mean splitting Bitcoin. At least that is better than the tyranny of the majority, or in this case the tyranny of a minority.

I thought BitcoinXT maxes out at 8 GB per block? How is that scaling?
marcus_of_augustus
December 15, 2015, 10:14:27 PM
Quote: Bitcoin does not scale efficiently, but it certainly can scale: we just need to increase the blocksize. To answer your question, yes, this does require a hard fork; hard forks are the only way to increase the blocksize. Hard forks are also an important governance mechanism within Bitcoin, allowing us to resolve fundamental differences and disagreements, even if that might mean splitting Bitcoin. At least that is better than the tyranny of the majority.

... These are your talking points, and the limit of your understanding of what you think you know about Bitcoin, let alone of the technology. You are doing an admirable job of banging on incessantly about these points, in the way of propaganda, while adding nothing else and contributing no solutions. Go invent your own altcoin if you hate the Core developers with such venom. Engineering projects are inevitably worse off when politicians and lawyers try to politicise the technology. If you want to use a monetary system that has been politicised and legislated to death, use the US Federal Reserve's debt-note system.
VeritasSapere
December 15, 2015, 10:16:54 PM
Quote: This meme that Bitcoin cannot scale is false; it is perpetuated by Core developers who commit the nirvana fallacy in engineering. Bitcoin does not scale efficiently, but it certainly can scale: we just need to increase the blocksize. To answer your question, yes, this does require a hard fork; hard forks are the only way to increase the blocksize. Hard forks are also an important governance mechanism within Bitcoin, allowing us to resolve fundamental differences and disagreements, even if that might mean splitting Bitcoin. At least that is better than the tyranny of the majority, or in this case the tyranny of a minority.

Quote: I thought BitcoinXT maxes out at 8 GB per block? How is that scaling?

This is an example of the nirvana fallacy at work. An 8000x increase in throughput twenty years from now is something I would certainly consider a significant increase in the scale of Bitcoin, compared to leaving the blocksize at one megabyte. It is worth thinking about the practical difference even a small increase would make for the economy as a whole. Some of these engineers seem too academic on these questions and not actually grounded in the practical reality of the world.
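For reference, the 8000x figure is just the BIP101 doubling schedule run to its end (years rounded for the sketch):

Code:
# BIP101 schedule: 8 MB starting in 2016, doubling every two years.
limit_mb = 8
for year in range(2016, 2037, 2):
    print(year, f"{limit_mb} MB")
    limit_mb *= 2
# Ends at 8 * 2**10 = 8192 MB (~8 GB) in 2036: roughly 8000x the 1 MB limit.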
theymos
December 15, 2015, 10:16:58 PM
When I heard yesterday that btcdrak was made a moderator of /r/btc, my first thoughts were "Maybe /r/Bitcoin finally has a bit of competition" and "I wonder how long that'll last". The answer: probably not very long. As someone (iCEBREAKER?) predicted, these people will just keep forking themselves into oblivion.
1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
BitUsher
December 15, 2015, 10:18:28 PM
Quote: I didn't get the 2+4+8 MB part. Are you talking about a hard fork?

Yes, the devs are weighing complex solutions like flex cap, Adam Back's 2-4-8 proposal, and other hard fork options. The LN devs need over 8 MB to make the caching-layer approach to scaling practical or worthwhile. To out-compete Visa's tps they discussed 133 MB blocks with LN. Hopefully those sizes are never needed, or we live in a future where new tech makes them "small": https://lightning.network/lightning-network.pdf

Quote: When I heard yesterday that btcdrak was made a moderator of /r/btc, my first thoughts were "Maybe /r/Bitcoin finally has a bit of competition" and "I wonder how long that'll last". The answer: probably not very long. As someone (iCEBREAKER?) predicted, these people will just keep forking themselves into oblivion.

Shame, btcdrak is a really smart and prolific guy. https://www.youtube.com/watch?v=kjP6hezfpUI
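That 133 MB figure follows from simple arithmetic in the paper; a rough reproduction (the ~500-byte channel-transaction size is an assumed ballpark, not a number from this post):

Code:
# Back-of-envelope reproduction of the Lightning paper's 133 MB estimate:
# 7 billion users making 2 on-chain channel transactions per year.
users = 7_000_000_000
tx_per_user_per_year = 2
blocks_per_year = 144 * 365            # one block every ~10 minutes
tx_bytes = 500                         # assumed channel-transaction size

tx_per_block = users * tx_per_user_per_year / blocks_per_year
print(f"{tx_per_block:,.0f} tx per block")                   # ~266,000
print(f"{tx_per_block * tx_bytes / 1e6:.0f} MB per block")   # ~133 MB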
forevernoob
December 15, 2015, 10:25:19 PM
Quote: I didn't get the 2+4+8 MB part. Are you talking about a hard fork?

Quote: Yes, the devs are weighing complex solutions like flex cap, Adam Back's 2-4-8 proposal, and other hard fork options. The LN devs need over 8 MB to make the caching-layer approach to scaling practical or worthwhile. To out-compete Visa's tps they discussed 133 MB blocks with LN. Hopefully those sizes are never needed, or we live in a future where new tech makes them "small": https://lightning.network/lightning-network.pdf

Well, I'm not entirely against hard forks, but I don't think we should rush any hard fork decisions at this point. Hopefully, as you say, future tech will make LN work without huge blocks.
forevernoob
December 15, 2015, 10:29:57 PM
Quote: This is an example of the nirvana fallacy at work. An 8000x increase in throughput twenty years from now is something I would certainly consider a significant increase in the scale of Bitcoin, compared to leaving the blocksize at one megabyte. It is worth thinking about the practical difference even a small increase would make for the economy as a whole. Some of these engineers seem too academic on these questions and not actually grounded in the practical reality of the world.

With sidechains the possibilities for scaling Bitcoin are endless. Why risk a dangerous fork if it doesn't solve the scaling issue? What you describe is an increase of the blocksize, not a scaling solution.
VeritasSapere
December 15, 2015, 10:31:08 PM
Quote: I didn't get the 2+4+8 MB part. Are you talking about a hard fork?

Quote: Yes, the devs are weighing complex solutions like flex cap, Adam Back's 2-4-8 proposal, and other hard fork options. The LN devs need over 8 MB to make the caching-layer approach to scaling practical or worthwhile. To out-compete Visa's tps they discussed 133 MB blocks with LN. Hopefully those sizes are never needed, or we live in a future where new tech makes them "small": https://lightning.network/lightning-network.pdf

Under the BIP101 schedule it would take more than twelve years to even reach 133 MB blocks, and the presumption is that within twelve years the technology will have advanced to the point where this is considered "small". I have even suggested changing BIP101 so that it starts at 2 MB: that way you would have 2-4-8 within a six-year period, it would take another twenty years to reach 133 MB blocks, and it would lower the potential maximum blocksize as well. I figured some of the people here at least would like such a solution. I bring it up only as a compromise, though it does seem that Core is completely unwilling to compromise with the other side of this debate, so this is unlikely to happen; I suspect we will see Bitcoin split.

It is also good to point out that the blocksize limit does not actually determine the size of the blocks: blocks are only filling up now, yet we have had this limit for a long time. The limit was even set at 32 MB earlier in Bitcoin's history, and that was not a problem either; the blocksize limit was only added as a temporary anti-spam measure. I would add that after twenty years the blocksize can of course be changed again, though I suspect the landscape will be very different by then.
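A minimal sketch of that 2 MB-start variant as described (the years are illustrative):

Code:
# The suggested variant: BIP101's doubling cadence, started at 2 MB.
limit_mb, year = 2, 2016
while limit_mb <= 8:
    print(year, f"{limit_mb} MB")     # 2016: 2 MB, 2018: 4 MB, 2020: 8 MB
    limit_mb *= 2
    year += 2
# "2-4-8" is reached within six years; the doublings then continue
# on the usual BIP101 cadence toward the larger sizes discussed above.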
BitUsher
December 15, 2015, 10:32:11 PM
Quote: Well, I'm not entirely against hard forks, but I don't think we should rush any hard fork decisions at this point. Hopefully, as you say, future tech will make LN work without huge blocks.

Don't worry, there is definitely not going to be a rush, with SegWit and the relay network improvements being a soft fork; expect six months to a year. This is why Gavin was freaking out. We aren't at capacity now, but realistically will be sometime in 2016, and a fee market will start to take hold, raising prices. Smart business people should consider this and start creating trustless "off-the-chain" solutions, like ChangeTip + CLTV contracts, as workarounds for when these capacity limits are hit. It is coming; my guess is that around March to May 2016 we will start to see tx fees increasing.
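For the CLTV workaround mentioned, a minimal sketch of a BIP65-style micropayment-channel script: cooperative 2-of-2 spends before expiry, unilateral refund to the payer after. The opcode assembly and encoding details here are illustrative, not production code:

Code:
# BIP65-style channel script: spend cooperatively via 2-of-2 multisig,
# or let the payer reclaim the funds alone once `expiry` has passed.
OP_IF, OP_ELSE, OP_ENDIF = 0x63, 0x67, 0x68
OP_2, OP_DROP = 0x52, 0x75
OP_CHECKSIG, OP_CHECKMULTISIG = 0xac, 0xae
OP_CHECKLOCKTIMEVERIFY = 0xb1          # BIP65, formerly OP_NOP2

def push(data: bytes) -> bytes:
    assert len(data) < 76              # simple direct pushes only, for this sketch
    return bytes([len(data)]) + data

def channel_script(payer_pub: bytes, payee_pub: bytes, expiry: int) -> bytes:
    lock = expiry.to_bytes(4, "little")    # naive script-number encoding;
                                           # fine for typical unix timestamps
    return (bytes([OP_IF, OP_2]) + push(payer_pub) + push(payee_pub)
            + bytes([OP_2, OP_CHECKMULTISIG])
            + bytes([OP_ELSE]) + push(lock)
            + bytes([OP_CHECKLOCKTIMEVERIFY, OP_DROP])
            + push(payer_pub) + bytes([OP_CHECKSIG])
            + bytes([OP_ENDIF]))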
VeritasSapere
December 15, 2015, 10:33:23 PM
Quote: This is an example of the nirvana fallacy at work. An 8000x increase in throughput twenty years from now is something I would certainly consider a significant increase in the scale of Bitcoin, compared to leaving the blocksize at one megabyte. It is worth thinking about the practical difference even a small increase would make for the economy as a whole. Some of these engineers seem too academic on these questions and not actually grounded in the practical reality of the world.

Quote: With sidechains the possibilities for scaling Bitcoin are endless. Why risk a dangerous fork if it doesn't solve the scaling issue? What you describe is an increase of the blocksize, not a scaling solution.

Increasing the blocksize does increase the throughput of the Bitcoin blockchain; I would certainly consider that a scaling solution. I think it is more dangerous not to do any hard forks, considering how important this mechanism is for the governance of Bitcoin.