fellowtraveler
|
|
February 11, 2015, 11:56:55 AM |
|
Permanently keeping the 1MB (anti-spam) restriction is a great idea ... if you are a bank. Those favoring a permanent 1MB cap who say that Bitcoin can still be a financial backbone of sorts don't know how right they are. The problem isn't a limit in general, but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties. 1MB is useless for end-user direct access but is sufficient for an inter-"bank" settlement network.
...
Conclusion: The blockchain permanently restricted to 1MB is great if you are a major bank looking to co-opt the network for a next-generation limited-trust settlement network between major banks, financial service providers, and payment processors. It is a horrible idea if you want to keep open even the possibility that individuals will be able to participate in that network without using a trusted third party as an intermediary.
I just wanted to point out that Monetas / Open-Transactions is working on a solution that eliminates the need to trust third parties. The bulk of transactions could move off-chain to Monetas notary servers, and a notary would be unable to steal the coins, falsify receipts, or change balances without a user's permission. With the vast majority of transactions occurring safely off-chain on Monetas notaries, any user would still be able to perform on-chain transactions when necessary -- it would just be a lot rarer, and would usually consist of a transfer from one voting pool to another. So keep your 1MB limit if you want; it will definitely benefit Monetas.
|
|
|
|
ShadowOfHarbringer
Legendary
Offline
Activity: 1470
Merit: 1006
Bringing Legendary Har® to you since 1952
|
|
February 11, 2015, 01:59:18 PM |
|
Maximum supported users based on transaction frequency. Assumptions: 1MB blocks, 821 bytes per txn. Throughput: 2.03 tps, 64,000,000 transactions annually.

Total # direct users   Txns per user annually   Transaction frequency
<8,000                 8,760                    Once an hour
178,000                365                      Once a day
500,000                128                      A few (2.4) times a week
1,200,000              52                       Once a week
2,600,000              24                       Twice a month
5,300,000              12                       Once a month
16,000,000             4                        Once a quarter
64,000,000             1                        Once a year
200,000,000            0.3                      Less than once every few years
1,000,000,000          0.06                     Less than once a decade
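The table's arithmetic follows directly from the stated assumptions. A short script (a sketch using only the post's numbers: 1MB blocks, 821 bytes per transaction, one block every ten minutes) reproduces the user counts:

```python
# Sketch reproducing the table above. All constants are the post's
# assumptions, not measured network values.
BLOCK_BYTES = 1_000_000
TX_BYTES = 821
BLOCKS_PER_YEAR = 6 * 24 * 365              # one block per 10 minutes

tx_per_block = BLOCK_BYTES // TX_BYTES      # ~1218 transactions per block
tps = tx_per_block / 600                    # ~2.03 transactions per second
tx_per_year = tx_per_block * BLOCKS_PER_YEAR  # ~64 million annually

frequencies = {
    "Once an hour": 8760,
    "Once a day": 365,
    "Once a week": 52,
    "Once a month": 12,
    "Once a year": 1,
}
for label, tx_per_user in frequencies.items():
    # Maximum direct users the chain can support at this frequency.
    max_users = tx_per_year // tx_per_user
    print(f"{label:>13}: {max_users:>12,} users")
```

Running it gives roughly 7,300 users at once-an-hour and about 175,000 at once-a-day, matching the table's <8,000 and 178,000 rows to rounding.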
As you can see, even with an average transaction frequency of just once a week or once a month, the network can't support more than a token number of users. When someone advocates a permanent cap of 1MB, what they are saying is: "I think Bitcoin will be great if it is never used by more than a couple million users making less than one transaction per month." Such a system will never flourish as a store of value, as it will be eclipsed by alternatives which are more inclusive. To support even 100 million direct users making an average of one transaction every two weeks would require a throughput of 82 tps and an average block size of 20 to 40 megabytes.

This. This is excellent. Good work on explaining it this way. With this post, you have done a great deed to support Bitcoin long-term.
|
|
|
|
R2D221
|
|
February 11, 2015, 02:25:33 PM |
|
I just wanted to point out that Monetas / Open-Transactions is working on a solution that eliminates the need to trust third parties.
What? With Bitcoin, you already don't need to trust a third party. Why would this be different?
|
An economy based on endless growth is unsustainable.
|
|
|
fellowtraveler
|
|
February 11, 2015, 05:15:34 PM |
|
I just wanted to point out that Monetas / Open-Transactions is working on a solution that eliminates the need to trust third parties.
What? With Bitcoin, you already don't need to trust a third party. Why would this be different? Sure you do. You need to trust MtGox and Bitstamp when you want to do market trading. And as this thread shows, you will soon need to do off-chain just for normal transactions (as a result of the 1mb limit.)
|
|
|
|
R2D221
|
|
February 11, 2015, 05:29:18 PM |
|
I just wanted to point out that Monetas / Open-Transactions is working on a solution that eliminates the need to trust third parties.
What? With Bitcoin, you already don't need to trust a third party. Why would this be different? Sure you do. You need to trust MtGox and Bitstamp when you want to do market trading. And as this thread shows, you will soon need to transact off-chain just for normal transactions (as a result of the 1MB limit).

Well, you eventually need to trust someone. Also, off-chain? I will not stop using Bitcoin because of the block changes.
|
An economy based on endless growth is unsustainable.
|
|
|
justusranvier
Legendary
Offline
Activity: 1400
Merit: 1013
|
|
February 11, 2015, 05:43:36 PM |
|
Also, off chain? I will not stop using Bitcoin because of the block changes.

If the block size remains at 1 MB, most of humanity will be forced by necessity to transact off chain. The rationing is a form of artificially subsidizing off-chain solutions, causing them to see usage that they otherwise would not if people were free to choose between on-chain and off-chain transactions. OT has many other uses besides financial transactions, so it would be fine even if Bitcoin handled all the world's monetary transactions. If Bitcoin is not allowed to step up to the plate, however, OT (and other off-chain systems) would need to take up the slack. That's not a particularly great scenario; it'd hardly be better than the fragmented and high-friction financial system we have now.
|
|
|
|
R2D221
|
|
February 11, 2015, 06:05:21 PM |
|
Also, off chain? I will not stop using Bitcoin because of the block changes. If the block size remains at 1 MB, most of humanity will be forced by necessity to transact off chain. The rationing is a form of artificially subsidizing off-chain solutions, causing them to see usage that they otherwise would not if people were free to choose between on-chain and off-chain transactions. OT has many other uses besides financial transactions, so it would be fine even if Bitcoin handled all the world's monetary transactions. If Bitcoin is not allowed to step up to the plate, however, OT (and other off-chain systems) would need to take up the slack. That's not a particularly great scenario; it'd hardly be better than the fragmented and high-friction financial system we have now.

I thought the point was to increase the block size limit to avoid having to use off-chain transactions.
|
An economy based on endless growth is unsustainable.
|
|
|
justusranvier
Legendary
Offline
Activity: 1400
Merit: 1013
|
|
February 11, 2015, 06:26:04 PM |
|
I thought the point was to increase the block size limit to avoid having to use off chain transactions.

The block size limit should be raised because off-chain transaction systems should not be artificially subsidized. If the limit is not raised, more transactions that otherwise belong on the chain will be pushed off chain. That won't exactly kill the entire space right away, but as a plan B it's far inferior to allowing the distribution of on-chain vs. off-chain transactions to find a natural balance.
|
|
|
|
najzenmajsen
|
|
February 11, 2015, 06:33:31 PM |
|
Did you cry as much about the upgrade from 32bit to 64bit operating systems?
Or from 1.44 MB floppy disks to 200 MB CDs?
I'm still using floppies, I don't know why anyone would think they are obsolete. But I still need to upgrade my operating system, so I think this will be a long night.

Dayum, floppies, those were the times, man! Didn't know people actually still used them, and now I want to use them. :>
|
|
|
|
solex
Legendary
Offline
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
|
|
February 11, 2015, 07:49:17 PM |
|
I thought the point was to increase the block size limit to avoid having to use off chain transactions. The block size limit should be raised because off-chain transaction systems should not be artificially subsidized. If the limit is not raised, more transactions that otherwise belong on the chain will be pushed off chain. That won't exactly kill the entire space right away, but as a plan B it's far inferior to allowing the distribution of on-chain vs. off-chain transactions to find a natural balance.

I don't even think transactions will be pushed off-chain any more than what is happening organically today. Yes, it is good that off-chain solutions develop, but they should attract user business on their own merits, not be a refuge for users who are forced there. Forcing users to do something stretches the network effect to its limits. Users will instead transact with a different cryptocurrency, and Bitcoin will hemorrhage market share.
|
|
|
|
tvbcof
Legendary
Online
Activity: 4760
Merit: 1282
|
|
February 12, 2015, 01:27:27 AM |
|
Permanently keeping the 1MB (anti-spam) restriction is a great idea ... if you are a bank. Those favoring a permanent 1MB cap who say that Bitcoin can still be a financial backbone of sorts don't know how right they are. The problem isn't a limit in general, but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties. 1MB is useless for end-user direct access but is sufficient for an inter-"bank" settlement network.
...
Conclusion: The blockchain permanently restricted to 1MB is great if you are a major bank looking to co-opt the network for a next-generation limited-trust settlement network between major banks, financial service providers, and payment processors. It is a horrible idea if you want to keep open even the possibility that individuals will be able to participate in that network without using a trusted third party as an intermediary.
I just wanted to point out that Monetas / Open-Transactions is working on a solution that eliminates the need to trust third parties. The bulk of transactions could move off-chain to Monetas notary servers, and a notary would be unable to steal the coins, falsify receipts, or change balances without a user's permission. With the vast majority of transactions occurring safely off-chain on Monetas notaries, any user would still be able to perform on-chain transactions when necessary -- it would just be a lot rarer, and would usually consist of a transfer from one voting pool to another. So keep your 1MB limit if you want; it will definitely benefit Monetas.

Sounds like the OT technology might be a promising way to manage the internal accounting inside of a sidechain. I'll be interested to see how it performs in such a capacity, and might entrust a small fraction of my stash to a sidechain which employs it -- if they do other interesting things as well, of course.
|
sig spam anywhere and self-moderated threads on the pol&soc board are for losers.
|
|
|
amincd
|
|
February 12, 2015, 01:30:04 AM |
|
I thought the point was to increase the block size limit to avoid having to use off chain transactions. The block size limit should be raised because off-chain transaction systems should not be artificially subsidized. If the limit is not raised, more transactions that otherwise belong on the chain will be pushed off chain. That won't exactly kill the entire space right away, but as a plan B it's far inferior to allowing the distribution of on-chain vs. off-chain transactions to find a natural balance. I don't even think transactions will be pushed off-chain any more than what is happening organically today. Yes, it is good that off-chain solutions develop, but they should attract user business on their own merits, not be a refuge for users who are forced there. Forcing users to do something stretches the network effect to its limits. Users will instead transact with a different cryptocurrency, and Bitcoin will hemorrhage market share.

+1. And you will not find a single prominent proponent of a permanent 1 MB limit give a compelling counter-argument to this.
|
|
|
|
justusranvier
Legendary
Offline
Activity: 1400
Merit: 1013
|
|
February 12, 2015, 01:36:25 AM |
|
Users will instead transact with a different cryptocurrency, and Bitcoin will hemorrhage market share.
+1 And you will not find a single prominent proponent of a permanent 1 MB limit give a compelling counter-argument to this. Makes you wonder whether those proponents regard Bitcoin hemorrhaging market share as desirable or as undesirable.
|
|
|
|
tvbcof
Legendary
Online
Activity: 4760
Merit: 1282
|
|
February 12, 2015, 02:24:07 AM |
|
Users will instead transact with a different cryptocurrency, and Bitcoin will hemorrhage market share.
+1 And you will not find a single prominent proponent of a permanent 1 MB limit give a compelling counter-argument to this. Makes you wonder whether those proponents regard Bitcoin hemorrhaging market share as desirable or as undesirable. Is Bitcoin 'hemorrhaging market share'? I hadn't heard. Whatever the case, losing a lot of the users is a fairly desirable thing to me. Users who want to use a very powerful technology such as Bitcoin to buy their morning coffee are a distinct liability to the system and are in the process of killing it right now (hence this thread.) No alt I've seen targets the only use-case that I care deeply about nor seeks to address the main problem that Bitcoin has: too many idiot users. The reason most alts don't have capacity problems is because they don't have users rather than any particular architectural design that they did right. That is my sense since I've really not studied any of the alts very closely. I liked some of the ideas in Litecoin when it first came out, but it was basically a regression from Bitcoin in the ways which were important to me. In spite of it's rather severe design faults Bitcoin is still the leading contender as a reserve and balancing currency on the basis of it's first mover advantage alone, though it is also still better tuned for this duty than a lot of the alts. Eventually a solution will emerge that makes use of the very powerful method of one simple layer of abstraction (main chain supporting many tuned, targeted, and non-critical subordinate chains.) I hope it turns out to be Bitcoin which takes on this role and the core devs who organized under Blockstream seem to be thinking the same thing. It's probably Bitcoin's only real chance of survival as a robust monetary solution which is independent of government control, and falling into that trap will ultimately kill off Bitcoin's value more than making user's pay what it's worth to use the system.
|
sig spam anywhere and self-moderated threads on the pol&soc board are for losers.
|
|
|
lophie
|
|
February 12, 2015, 07:54:01 AM |
|
OP, weren't you vehemently against raising the limit a few years ago? I think I remember a lot of intellectual technical discussion around here involving you, gmaxwell, and others regarding this matter. Around v0.0.9?
What changed?
I am personally very much against hard forks such as this. However, I am all for a new cryptocurrency with new parameters that considers how a previous one has lagged behind. From my point of view, a hard fork pushed with a lot of publicity, which everyone must "update" to keep up with, is simply an instance of a few controlling the many. Whether it is for a good reason or a bad one, it really breaks the original principles of decentralization and fairness.
What do you think?
|
Will take me a while to climb up again, But where is a will, there is a way...
|
|
|
Cryddit
Legendary
Offline
Activity: 924
Merit: 1132
|
|
February 12, 2015, 10:13:49 AM |
|
Okay, a lot of people appear to be talking without knowledge of how hard forks actually happen.
The process starts with miners getting mining software that understands, and can produce, blocks under the new rules. They start putting the new protocol version number into their blocks, but continue to produce blocks that follow the old protocol until 95% of the last 1000 blocks carry the new protocol version. After that, the hard fork is enforced. Two things happen: miners start using the new protocol, and they start rejecting any new blocks that carry the earlier protocol version number.
Meanwhile, when the number of recently mined blocks showing a higher version number rises above 50%, any client with the old version will warn that it's on the losing side of an upcoming hard fork and a software update is required.
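The rollout described above can be sketched as a simple version-counting loop. This is an illustrative model, not Bitcoin Core's actual implementation; the 1000-block window and the 95%/50% thresholds are the figures given in the post:

```python
# Hedged sketch of supermajority fork activation: enforcement begins once
# 95% of the last 1000 blocks carry the new version, and old clients warn
# once more than half of recent blocks are newer than theirs.
from collections import deque

WINDOW = 1000        # blocks examined (per the post)
ENFORCE = 0.95       # share of new-version blocks that triggers enforcement
WARN = 0.50          # share that makes old clients warn of an upcoming fork

class VersionTracker:
    def __init__(self):
        # Automatically keeps only the most recent WINDOW block versions.
        self.recent = deque(maxlen=WINDOW)

    def add_block(self, version):
        self.recent.append(version)

    def share_new(self, new_version):
        if not self.recent:
            return 0.0
        return sum(v >= new_version for v in self.recent) / len(self.recent)

    def enforcing(self, new_version):
        # Past this point, blocks with the old version number are rejected.
        return len(self.recent) == WINDOW and self.share_new(new_version) >= ENFORCE

    def should_warn(self, new_version):
        # An old client sees it is on the losing side of the coming fork.
        return self.share_new(new_version) > WARN
```

For example, a window holding 950 new-version and 50 old-version blocks is past both thresholds, while a 40% share triggers neither.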
Here is an interesting thing about this split in particular: even when 95% of the miners start rejecting blocks for having too low a version number, there may not be an immediate split. If this happens while we aren't bumping up against the 1MB limit, the earlier clients will see nothing in the first few of the new blocks to object to, because all the new blocks (until a >1MB block actually happens) are still acceptable to the existing clients. So until that time, all that happens, from the point of view of the original clients, is that blocks bearing the original version number get orphaned by blocks bearing the new version number.
So any original-version blocks created during this time wouldn't wind up in their own block chain; they'd just get rejected, by the new clients for having too low a version number, and by the original clients because another perfectly acceptable chain keeps getting longer. Even though the hard fork is being "enforced" by the new clients, it hasn't actually happened yet as long as the new blocks are acceptable to the old clients. So any original-version blocks mined during this time are a dead loss for miners, regardless of which side of the hard fork they intend to be on. Rather than mining at a dead loss, I expect that miners who haven't upgraded yet mostly will.
But after a while, a >1MB block arrives; the new-version clients build on it and the old-version clients don't, so the hard fork actually happens. Now the miners have a choice: mine on the new version or the old version. Let's stack this up for a minute and see what the choice looks like.
The old version had already lost 95% of the hashing power before the new-version clients pulled the trigger by starting to reject old-version blocks. It then probably lost at least 90% of whatever was left, because miners don't like mining at a dead loss, so let's say the old chain has about one half of one percent of the hashing power. That means it gets a block about once per 33 hours: a one-megabyte block every 33 hours, meaning a limit of roughly 1 transaction per 86 seconds. Your transaction gets into a block in 33 hours, and it takes about ten days to reach the confirmation depth of 6 blocks. If you get a coinbase transaction, good luck waiting out the 100-block depth until it's spendable: that will take over four and a half months. Oh, but of course there are difficulty adjustments to make everything work right! There sure are -- every 2016 blocks. Assuming the chain still has miners for that long, the first adjustment takes over seven and a half years to arrive. Taking the maximum reduction the protocol allows, which quadruples the rate at which blocks come, you get up to 1 transaction per 21 seconds, with another difficulty adjustment in 2 years, another six months after that, and another just a month and a half later. So we're about ten years in before the original chain gets back up close to one transaction per second. This does not sound like a winner to me.
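The timeline above can be checked with a few lines of arithmetic. This is a rough sketch under the post's assumptions (0.5% residual hash power, 2016-block retarget interval, maximum 4x downward adjustment per retarget); real difficulty behavior would differ in the details:

```python
# Rough reproduction of the minority-chain arithmetic: how long until the
# abandoned chain's difficulty readjusts back toward normal block times.
TARGET_SECONDS = 600               # normal 10-minute block interval
share = 0.005                      # hash power left on the old chain

interval = TARGET_SECONDS / share  # ~120,000 s, i.e. ~33 hours per block
total_seconds = 0.0

for adjustment in range(1, 5):
    total_seconds += 2016 * interval   # time to mine the next 2016 blocks
    interval /= 4                      # maximum single-step reduction
    years = total_seconds / (365 * 24 * 3600)
    print(f"after retarget {adjustment}: {years:.1f} years elapsed, "
          f"blocks now every {interval / 60:.0f} minutes")
```

The first retarget alone takes about 7.7 years to reach, and after four maximum reductions (roughly 10.2 years in total) blocks are finally arriving every 8 minutes or so, consistent with the post's "over seven and a half years" and "about ten years" figures.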
Meanwhile the new version takes off with 99.5% of the hashing power, and flicks off whatever holdouts and weirdos stick to the old chain the way an elephant shakes off a fly. It can handle 40 or so transactions per second, while the old-version chain, with its smaller crop of miners, is limited to less than one transaction per minute.
Seriously? Seriously, you think staying on the original-version chain is a viable choice? I invite you to do so; the rest of us will ignore you, except for the ones that care enough about your welfare to stage an intervention and tell you you're crazy and you need to get help.
|
|
|
|
funkenstein
Legendary
Offline
Activity: 1066
Merit: 1050
Khazad ai-menu!
|
|
February 12, 2015, 11:03:41 AM |
|
DeathAndTaxes for president!
Well stated.
I see a number of questions developing. Here is one:
1) Imagine you are a miner (pool operator) in, let's say, 2015. You have a choice: include more transactions and go over 1MB, or not. You know going over 1MB is better for the network, it has to happen, etc. However, you also know there are some nodes on the network that might reject your block. Your fiduciary responsibility is to your hashers. Leaving out a few transactions with minimal fees lets you stay neutral on the issue, since all nodes will accept your block, and increases your chance of getting the 25 large coin. What do you do?
|
|
|
|
marcus_of_augustus
Legendary
Offline
Activity: 3920
Merit: 2349
Eadem mutata resurgo
|
|
February 12, 2015, 11:31:13 AM |
|
Ok, so say we go ahead with a hard fork to a > 1MB max block size ... are we also going to see a couple of other hard forking issues put through all at the same time or will this one be done alone?
Could it become like those congressional bills where all the crap legislation gets stuck inside the fine print of the guns 'n drugs 'n terrorists 'n kids cover page?
|
|
|
|
jmw74
|
|
February 12, 2015, 04:34:18 PM |
|
I like Justus' pay-to-relay idea, but I'm having second thoughts about whether it would actually work.
Consider this thought experiment: there are three otherwise identical cryptocurrency networks. The total cost to do a trustless transaction is the tx fee plus whatever it costs to rent full-node-capable hardware long enough to validate the transaction you receive.
A: 1 MB block limit, $50 tx fee, $0.05 validation cost
B: 100 MB block limit, $0.50 tx fee, $0.50 validation cost
C: 10,000 MB block limit, $0.005 tx fee, $50 validation cost
Which network do you want to receive payment on, assuming you don't trust the sender or any third party? B. The reason B is so cheap is that fees are inversely proportional to the block size limit, but bandwidth costs are not linear: they're pretty flat until you get up to exotic speeds.
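As a sanity check, the three hypothetical networks reduce to one line of arithmetic each (the figures are the thought experiment's hypotheticals, not measurements):

```python
# Total cost of a trustless payment on each hypothetical network:
# transaction fee + cost of validating the payment yourself.
networks = {
    "A (1 MB limit)":      {"fee": 50.0,  "validation": 0.05},
    "B (100 MB limit)":    {"fee": 0.50,  "validation": 0.50},
    "C (10,000 MB limit)": {"fee": 0.005, "validation": 50.0},
}

totals = {name: n["fee"] + n["validation"] for name, n in networks.items()}
cheapest = min(totals, key=totals.get)

for name, cost in totals.items():
    print(f"{name}: ${cost:.3f} per trustless transaction")
# B wins at $1.00 versus $50.05 for A and $50.005 for C: fees fall as the
# limit rises, but validation cost stays flat until bandwidth gets exotic.
```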
As Gavin pointed out, the optimal block size limit will scale with commodity bandwidth.
So, in the unlimited "pay to relay" model, what forces cause the price to converge at this optimal level? I'd suggest there aren't any. If nothing else, the block reward ruins it. The cost to relay will be insignificant until the block size is so huge, that it takes one of the world's fastest connections to transmit it. By then bitcoin is sunk, or at least depending solely on its head start rather than competing on features and price.
Miners have no reason not to accept all nonzero fees, no matter how large the resulting block is. They do not care if the large blocks knock commodity full nodes off the network. Users care! They'll just go to the hypothetical competing network that's the same, except much cheaper.
Now you could argue that miners are rational and wouldn't do anything to make bitcoin un-competitive. I think that ignores the prisoner's dilemma that would come up.
Let's say all miners know that accepting tiny fees creates huge blocks, causing users to flee to cheaper networks. They don't want to harm their own operation, right?
Some miner will defect, and sweep all the tiny fees from the mempool, even though it creates a huge block. One huge block won't kill bitcoin. Except the miners can't stop at just one. It will happen repeatedly, with no individual miner able to stop it from going down in flames. So they might as well defect themselves and collect every last satoshi while they can.
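The defection dynamic can be made concrete with a toy payoff model. The numbers here are purely illustrative, chosen only so that the private reward of sweeping tiny fees exceeds any one miner's share of the collective harm:

```python
# Toy model of the miner's dilemma described above. A miner compares the
# immediate reward of sweeping tiny fees against its own share of the
# long-run harm (users fleeing to cheaper networks). Payoffs are made up.
SWEEP_REWARD = 1.0        # extra fees a defector collects in one round
HARM_PER_DEFECTION = 0.1  # long-run value each miner loses per defection

def best_response(others_defecting):
    # Harm from others' defections is sunk either way; my own defection
    # adds one unit of shared harm but pays me the full sweep reward.
    payoff_defect = SWEEP_REWARD - HARM_PER_DEFECTION * (others_defecting + 1)
    payoff_cooperate = -HARM_PER_DEFECTION * others_defecting
    return "defect" if payoff_defect > payoff_cooperate else "cooperate"

# Because SWEEP_REWARD > HARM_PER_DEFECTION, defecting dominates no matter
# how many other miners already defect -- the classic prisoner's dilemma.
for others in (0, 5, 9):
    print(others, "others defecting ->", best_response(others))
```

Whenever the private sweep reward exceeds one miner's private slice of the harm, "defect" is the dominant strategy, which is exactly why no individual miner can stop the slide.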
I think Gavin's got a solid plan. It's got ugly arbitrary constants, but it's simple and there's little doubt it will work.
|
|
|
|
onemorebtc
|
|
February 12, 2015, 04:58:26 PM |
|
I like Justus' pay-to-relay idea, but I'm having second thoughts about whether it would actually work.
Consider this thought experiment: there are three otherwise identical cryptocurrency networks. The total cost to do a trustless transaction is the tx fee plus whatever it costs to rent full-node-capable hardware long enough to validate the transaction you receive.
A: 1 MB block limit, $50 tx fee, $0.05 validation cost
B: 100 MB block limit, $0.50 tx fee, $0.50 validation cost
C: 10,000 MB block limit, $0.005 tx fee, $50 validation cost
Which network do you want to receive payment on, assuming you don't trust the sender or any third party? B. The reason B is so cheap is that fees are inversely proportional to the block size limit, but bandwidth costs are not linear: they're pretty flat until you get up to exotic speeds.
As Gavin pointed out, the optimal block size limit will scale with commodity bandwidth.
So, in the unlimited "pay to relay" model, what forces cause the price to converge at this optimal level? I'd suggest there aren't any. If nothing else, the block reward ruins it. The cost to relay will be insignificant until the block size is so huge, that it takes one of the world's fastest connections to transmit it. By then bitcoin is sunk, or at least depending solely on its head start rather than competing on features and price.
Miners have no reason not to accept all nonzero fees, no matter how large the resulting block is. They do not care if the large blocks knock commodity full nodes off the network. Users care! They'll just go to the hypothetical competing network that's the same, except much cheaper.
Now you could argue that miners are rational and wouldn't do anything to make bitcoin un-competitive. I think that ignores the prisoner's dilemma that would come up.
Let's say all miners know that accepting tiny fees creates huge blocks, causing users to flee to cheaper networks. They don't want to harm their own operation, right?
Some miner will defect, and sweep all the tiny fees from the mempool, even though it creates a huge block. One huge block won't kill bitcoin. Except the miners can't stop at just one. It will happen repeatedly, with no individual miner able to stop it from going down in flames. So they might as well defect themselves and collect every last satoshi while they can.
I think Gavin's got a solid plan. It's got ugly arbitrary constants, but it's simple and there's little doubt it will work.
Miners also have an incentive for smaller blocks, as smaller blocks reduce the risk of getting orphaned. Miners also know that if they accept small-fee transactions, people will only pay small fees. It's still a prisoner's dilemma with downward pressure, though.
|
transfer 3 onemorebtc.k1024.de 1
|
|
|
|