Bitcoin Forum
Author Topic: Share your ideas on what to replace the 1 MB block size limit with  (Read 6958 times)
amincd (OP)
Hero Member
*****
Offline Offline

Activity: 772
Merit: 501


View Profile
July 25, 2014, 05:55:38 PM
Last edit: August 18, 2014, 06:38:27 PM by amincd
Merited by ABCbits (1)
 #1

I think the biggest obstacle to mass adoption of Bitcoin is the 1 MB block size limit.

It's a major source of uncertainty for the Bitcoin economy because, on the one hand:

Bitcoin can't achieve mass adoption and global currency status until the limit is lifted, because 7 transactions per second (the transaction rate at which blocks reach the 1 MB size limit) is minuscule for any global payment network or currency

and on the other hand, a change in the protocol to lift the 1 MB block size limit portends many risks, the biggest two of which are:

  • a split in the community leading to the Bitcoin blockchain being forked
  • poor bloat control leading to garbage being dumped into the blockchain by malicious actors, making it too costly to run a full node for all but the largest players
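As an aside, the ~7 transactions per second figure above follows from simple arithmetic; the 250-byte average transaction size used here is an illustrative assumption, not a protocol constant:

```python
# Rough arithmetic behind the ~7 tx/s figure: 1 MB blocks arriving
# every ~10 minutes, with an assumed average transaction size of
# ~250 bytes (illustrative; real transactions vary).
BLOCK_BYTES = 1_000_000
BLOCK_INTERVAL_S = 600          # target block interval: 10 minutes
AVG_TX_BYTES = 250              # assumption, not a protocol constant

tx_per_block = BLOCK_BYTES / AVG_TX_BYTES   # ~4000 transactions
tps = tx_per_block / BLOCK_INTERVAL_S       # ~6.7 tx/s
```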

I think we should have more discussion about potential replacements for the current block size limit, in order to get us closer to a solution.

Some might argue that we should wait until we are closer to the 1 MB block limit before discussing it, but consider that from May 2012 to May 2013, Bitcoin's transaction volume increased almost tenfold.

If we see similar growth in transaction volume, we would reach the block limit in a matter of four to five months (the average block size is currently around 240 KB, meaning volume can grow fourfold before hitting the limit). And then what happens? The uncertainty hangs over future Bitcoin development.

For my part, I think the best solution has two parts.

For the first part, we should eliminate the protocol-level block size limit altogether, as Gavin Andresen and Mike Hearn advocate. If a miner creates a block that other miners consider too big, they will simply reject it. This would not be a protocol-level rule, but it would be enforced as if it were: any miner whose chosen block size limit is not accepted by at least 50% of the network hash rate will eventually see all of their blocks and block rewards orphaned, so miners have an incentive to conform to the most common limit.

For the second part, miners should adopt a rule whereby their block size limit tracks the difficulty. This simple construction would allow Bitcoin to scale as the economic value of the network increases. It's not perfect, but no solution is, and among imperfect solutions, simpler ones are better.
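A minimal sketch of what "limit tracks difficulty" could mean; the linear scaling rule and the baseline constants below are my own illustrative assumptions, not part of the proposal:

```python
# Hypothetical sketch of a block size limit that tracks difficulty.
# BASE_LIMIT, BASE_DIFFICULTY, and the linear scaling rule are
# illustrative assumptions, not an actual specification.

BASE_LIMIT = 1_000_000      # 1 MB, the current limit, in bytes
BASE_DIFFICULTY = 1.0       # difficulty at the time the rule activates

def block_size_limit(current_difficulty: float) -> int:
    """Scale the limit linearly with difficulty, never below 1 MB."""
    scale = max(1.0, current_difficulty / BASE_DIFFICULTY)
    return int(BASE_LIMIT * scale)
```

Under this toy rule a fourfold rise in difficulty would permit 4 MB blocks, while a falling difficulty leaves the floor at 1 MB.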

If you have an idea on what to replace the 1 MB block size limit with, please post it here.

Edit:

Gmaxwell makes some great points, which I'll include in the OP for visibility:

Imagine— you want your message to be read by dozens or hundreds of people— consuming a few minutes of their valuable time each. It makes sense to spend quite a few minutes making sure you are well informed first, considering how much of other people's time your message will consume.

In particular, I think it's especially unhelpful when people make posts which make it clear that they don't understand that there is no free lunch here. Any productive post will have been made with an understanding of the following points:

  • Blocksize has a trade-off with decentralization. If verifying the blockchain is made expensive (relative to hardware and bandwidth costs), then past some limit Bitcoin becomes a centralized system where everyone is economically forced to trust some consortium of large miners— which are themselves more efficient if centralized since they can just verify once, instead of verifying for themselves. (If the economic majority is trusting and not verifying, you need to also do the same so you don't end up split from the other users of the system.)
  • Bitcoin isn't secure unless there is income to pay for computation applied to the honest chain (and thus far the alternatives appear not clearly workable); we argue that once the subsidy is gone, transaction fees will support the security. But the existence of a market for transaction fees requires a degree of scarcity to make the rational price non-zero and to encourage efficient use. Just like Bitcoin itself wouldn't be valuable if everyone had access to infinite bitcoins, our incentives require a degree of scarcity of blockspace.
  • Bitcoin currency throughput can be increased to arbitrary levels without increasing blocksizes, especially if you're willing to make decentralization tradeoffs. Importantly, handling high volume transactions in other ways than expressing each and every one in the global bitcoin ledger can help avoid pulling down the available security for all transactions just because a large volume of low value transactions need the throughput and can accept the lower security. Work in this space has been under-developed, but I'm not aware of anyone disagreeing with the broad possibilities here. Because of the lack of need until now it's only recently become possible to raise substantial funding for work in this space.

None of this is to say that it's not an interesting subject to discuss (though it has been discussed in depth before), but it's at least my view that posts which are unaware of these points are unlikely to be productive. If you don't understand what I'm saying in these points, you need to read up more (or feel free to contact me by PM to talk about them one on one before taking the stage yourself).

The Bitcoin system exists in a careful and somewhat subtle balance between two extremes: one where it is too costly to transact in, and thus not valuable, and one where it is too costly to verify, so that it offers little to no trustlessness advantage over traditional systems (which have a much more efficient and scalable design, made possible in part because they do not attempt to be trustless). Like most engineering trade-off discussions, every choice has ups and downs.

Also, you can review some previous discussions on the 1 MB block size limit in these links:

https://bitcointalk.org/index.php?topic=1865.0  Block size limit automatic adjustment (one of the earliest discussions on it, from 11/2010)

https://bitcointalk.org/index.php?topic=140233.0  The MAX_BLOCK_SIZE fork

https://bitcointalk.org/index.php?topic=144895.0  How a floating blocksize limit inevitably leads towards centralization

https://bitcointalk.org/index.php?topic=96097.0  Max block size and transaction fees
ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
July 25, 2014, 07:33:59 PM
Last edit: July 25, 2014, 07:57:57 PM by ArticMine
 #2

My suggestion, to get this discussion started, is a dynamic block size limit that is based on the following two parameters:

1) The median size of a set of previous blocks. A good example is the model of the CryptoNote coins, for example Monero (XMR): https://cryptonote.org/inside.php.
2) An additional requirement that the block size limit may only increase while the difficulty is rising, and may only decrease while the difficulty is falling.
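A sketch of how the two parameters might combine; the window size, growth factor, and 1 MB floor are illustrative assumptions, not CryptoNote's actual constants:

```python
import statistics

# Illustrative sketch of the two-parameter rule: a median-driven
# candidate limit, gated by the direction of difficulty. WINDOW,
# GROWTH, and FLOOR are assumptions chosen for illustration.

WINDOW = 100          # number of recent blocks in the median
FLOOR = 1_000_000     # never shrink below 1 MB
GROWTH = 2            # the new limit may reach 2x the median

def next_limit(recent_sizes, prev_difficulty, cur_difficulty, cur_limit):
    median = statistics.median(recent_sizes[-WINDOW:])
    candidate = max(FLOOR, int(GROWTH * median))
    if candidate > cur_limit and cur_difficulty > prev_difficulty:
        return candidate          # grow only while difficulty rises
    if candidate < cur_limit and cur_difficulty < prev_difficulty:
        return candidate          # shrink only while difficulty falls
    return cur_limit
```

The point of the difficulty gate is that full blocks alone cannot raise the limit; miner income (as reflected in rising difficulty) must also be growing.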

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
tsoPANos
Hero Member
*****
Offline Offline

Activity: 602
Merit: 500

In math we trust.


View Profile
July 25, 2014, 07:49:13 PM
 #3

Well, I'm not a Bitcoin expert, but here's an idea.

I think the limit should be removed to allow mass adoption of the network.

Bitcoin devs should:
Set a date, like January 1, 2015, and declare that a new client with no limit will be released on that date.
Also, make every client released from now on have a timestamp-based limit (one that changes the limit automatically depending on the date).
As the clocks hit 1/1/2015, hopefully more than 50% of users will have updated to timestamp-based clients.

Well, I hope you get the point.
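The scheduled switch-over could look something like this; the flag date comes from the post, while representing "no limit" as None is an assumption of the sketch:

```python
import datetime

# Sketch of a timestamp-scheduled rule: clients shipped today enforce
# 1 MB until a flag date, then switch. The post-fork value (None, i.e.
# no protocol-level limit) is an illustrative choice.

FLAG_DATE = datetime.date(2015, 1, 1)
OLD_LIMIT = 1_000_000

def max_block_size(block_date: datetime.date):
    """Return the limit in bytes, or None once the limit is lifted."""
    return OLD_LIMIT if block_date < FLAG_DATE else None
```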
gmaxwell
Moderator
Legendary
*
expert
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
July 25, 2014, 08:09:30 PM
Last edit: July 26, 2014, 07:25:10 AM by gmaxwell
Merited by ABCbits (4)
 #4

This thread is beginning to just rehash the discussion from several prior ones; please take the time to search the forum, read, and contemplate a bit before replying.

Imagine— you want your message to be read by dozens or hundreds of people— consuming a few minutes of their valuable time each. It makes sense to spend quite a few minutes making sure you are well informed first, considering how much of other people's time your message will consume.

In particular, I think it's especially unhelpful when people make posts which make it clear that they don't understand that there is no free lunch here. Any productive post will have been made with an understanding of the following points:

  • Blocksize has a trade-off with decentralization. If verifying the blockchain is made expensive (relative to hardware and bandwidth costs), then past some limit Bitcoin becomes a centralized system where everyone is economically forced to trust some consortium of large miners— which are themselves more efficient if centralized since they can just verify once, instead of verifying for themselves. (If the economic majority is trusting and not verifying, you need to also do the same so you don't end up split from the other users of the system.)
  • Bitcoin isn't secure unless there is income to pay for computation applied to the honest chain (and thus far the alternatives appear not clearly workable); we argue that once the subsidy is gone, transaction fees will support the security. But the existence of a market for transaction fees requires a degree of scarcity to make the rational price non-zero and to encourage efficient use. Just like Bitcoin itself wouldn't be valuable if everyone had access to infinite bitcoins, our incentives require a degree of scarcity of blockspace.
  • Bitcoin currency throughput can be increased to arbitrary levels without increasing blocksizes, especially if you're willing to make decentralization tradeoffs. Importantly, handling high volume transactions in other ways than expressing each and every one in the global bitcoin ledger can help avoid pulling down the available security for all transactions just because a large volume of low value transactions need the throughput and can accept the lower security. Work in this space has been under-developed, but I'm not aware of anyone disagreeing with the broad possibilities here. Because of the lack of need until now it's only recently become possible to raise substantial funding for work in this space.

None of this is to say that it's not an interesting subject to discuss (though it has been discussed in depth before), but it's at least my view that posts which are unaware of these points are unlikely to be productive. If you don't understand what I'm saying in these points, you need to read up more (or feel free to contact me by PM to talk about them one on one before taking the stage yourself).

The Bitcoin system exists in a careful and somewhat subtle balance between two extremes: one where it is too costly to transact in, and thus not valuable, and one where it is too costly to verify, so that it offers little to no trustlessness advantage over traditional systems (which have a much more efficient and scalable design, made possible in part because they do not attempt to be trustless). Like most engineering trade-off discussions, every choice has ups and downs.

With respect to the suggestion to use the scheme from Bytecoin (and its forks like Monero and Fantomcoin): since that didn't exist when the prior threads were active, I might as well give my thoughts on it.

Sadly, what Bytecoin does is objectively broken. (Had their paper had a modicum of peer review, this would have been noticed.</whine>) My understanding is that it has a limitless block size (well, at some point I presume nodes run out of RAM, so not really limitless, just "undefined") with a median operation such that a miner cannot produce a block larger than the median of recent blocks without throwing out a fraction of their fees which is quadratically related to the size: the bigger the block, the more coins you must throw out. Unfortunately, it's perfectly possible to pay miner fees 'out of band', e.g. author a transaction with zero fee that pays the miner directly (some Bitcoin mining pools, like Eligius, already do this), so this cannot work as a control on block size in the long run. It also fails to control for the incentive of larger centralized pools, who can justify beefier nodes thanks to their mining income, to push everyone else out. I understand that Monero will be hard-forking soon, in part to rein in blockchain growth (almost 2 GB in its 3 months of operation, despite far lower exposure than Bitcoin), though also because the median is too constraining when there haven't been many transactions recently.
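For illustration, the penalty scheme described above might be sketched roughly like this; it is simplified, treats the whole block reward uniformly, and the cut-offs are illustrative rather than the exact CryptoNote formula:

```python
# Simplified sketch of a CryptoNote-style quadratic penalty: a block
# larger than the recent median forfeits part of its reward, growing
# quadratically with the excess. Constants and cut-offs are
# illustrative, not taken from any real implementation.

def penalized_reward(base_reward, block_size, median_size):
    if block_size <= median_size:
        return base_reward
    if block_size > 2 * median_size:
        return 0.0                  # oversized block earns nothing
    excess = block_size / median_size - 1.0
    return base_reward * (1.0 - excess ** 2)
```

Note that only the in-block reward ever passes through `penalized_reward`; fees paid out of band to the miner never do, which is exactly the loophole described above.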

Mod note: I'm going to remove posts that look like they haven't even read _this_ thread, much less past ones; prior threads spiraled into uselessness with people jumping in to repeat rehashed, uninformed opinions that had been thoughtfully covered in earlier posts.
ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
July 25, 2014, 10:04:37 PM
Last edit: July 25, 2014, 10:42:58 PM by ArticMine
 #5

First, I should point out that I am familiar with many of the old threads and the arguments in them; ironically, I came across CryptoNote and Monero while researching those old threads. I will first address two points, miner incentives and decentralization, because they are both very valid and form part of most of the arguments against increasing the 1 MB block size limit.

Bitcoin isn't secure unless there is income to pay for computation applied to the honest chain (and thus far the alternatives appear not clearly workable); we argue that once the subsidy is gone, transaction fees will support the security. But the existence of a market for transaction fees requires a degree of scarcity to make the rational price non-zero and to encourage efficient use. Just like Bitcoin itself wouldn't be valuable if everyone had access to infinite bitcoins, our incentives require a degree of scarcity of blockspace.
This is all very true, but it can be addressed by making an increase in difficulty a necessary requirement for an increase in the block size limit. In short, if the miners are not receiving proper incentives, then the difficulty cannot be rising at the same time. This is why I included the second requirement, a rising difficulty, for a block size increase in my suggestion.

Blocksize has a trade-off with decentralization. If verifying the blockchain is made expensive (relative to hardware and bandwidth costs), then past some limit Bitcoin becomes a centralized system where everyone is economically forced to trust some consortium of large miners— which are themselves more efficient if centralized since they can just verify once, instead of verifying for themselves. (If the economic majority is trusting and not verifying, you need to also do the same so you don't end up split from the other users of the system.)
The problem here is that a fixed block size does not take into account that these hardware and bandwidth costs are dropping at an exponential rate. Furthermore, it does not take into account that what will likely keep the "economic majority", namely consumers, from verifying transactions is not cost but the fact that they choose to cede control over their devices to large centralized corporations such as Apple and Microsoft. An iPad or a Surface tablet may have the processing power and memory to handle a full Bitcoin node but is not able to do so because of the DRM in the device. Meanwhile, for under 20% of the cost of an iPad, one can purchase a used laptop running GNU/Linux that is perfectly capable of handling a full Bitcoin node. If the 1-2% of Bitcoin users who choose to run GNU/Linux are the only ones verifying transactions, that still provides enough decentralization to secure the network. Furthermore, it avoids an attack by a proprietary OS vendor, who could use the DRM in their OS to cripple Bitcoin. One thing we must keep in mind here is that those who run a proprietary OS do not control their devices; they only think they do. Crippling Bitcoin with a 1 MB block size limit is not going to solve the centralization issue.

This brings me to the third point
 
Bitcoin currency throughput can be increased to arbitrary levels without increasing blocksizes, especially if you're willing to make decentralization tradeoffs. Importantly, handling high volume transactions in other ways than expressing each and every one in the global bitcoin ledger can help avoid pulling down the available security for all transactions just because a large volume of low value transactions need the throughput and can accept the lower security. Work in this space has been under-developed, but I'm not aware of anyone disagreeing with the broad possibilities here. Because of the lack of need until now it's only recently become possible to raise substantial funding for work in this space.
True, but this is essentially self-defeating. If it turns out that a centralized or semi-centralized solution can be competitive with Bitcoin in certain situations, then so be it. That, however, should be the result of true market forces and not of an arbitrary limit placed on Bitcoin.

Edit: With respect to CryptoNote and Monero, I do see merit in the argument that the fee penalty alone may not be enough to constrain block size; however, when combined with the difficulty-increase requirement, the picture changes. As for the specifics of the Monero blockchain, there are other factors, including dust from mining pools, that led to the early bloating, and we must also keep in mind that CryptoNote and Monero have built-in privacy, which adds to the block size by its very nature.

gmaxwell
Moderator
Legendary
*
expert
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
July 26, 2014, 12:01:42 AM
 #6

This is all very true but it can be addressed by making an increase in difficulty a necessary requirement for an increase in the blocksize limit.
It seems like a reasonable thing, not sufficient on its own but reasonable... in fact, I've previously proposed it myself. (Not sufficient because, among other things, it doesn't do anything to keep incentives aligned or to keep centralization gains moderate.)

Quote
Now for under 20% of the cost of an iPad one can purchase a used laptop running GNU/Linux that is perfectly capable of handling a full Bitcoin node. If the 1-2% of Bitcoin users
Yes, but that's only true for a certain set of limits at a certain size. 100 MB blocks wouldn't be tolerable on thrifty resources like that today. You may be making an error in reading my message as some kind of opposition, instead of as trade-offs which must be understood and carefully handled.

Quote
True but this is essentially self defeating. If it turns out that a centralized or semi centralized solution can be competitive with Bitcoin in certain situations then so be it. This however should be a result of true market forces and not an arbitrary limit placed on Bitcoin.
Centralized services are inherently more efficient, enormously so. My desktop could handle 40,000 TPS with a simple ledger system, giving nearly instant confirmation... physics creates limits for decentralized systems in scale and latency, and that's okay: the decentralized systems are much better from a security perspective. My point about there being alternatives to increased scaling is not limited to (semi-)centralized systems; though those are useful tools which will exist in the ecosystem, there are decentralized approaches as well.

I also think it's wrong to think of semi-centralized systems as being in competition with Bitcoin; if they process transactions for Bitcoin value, they're part of the broader ecosystem, and Bitcoin's trustlessness should enable semi-centralized systems which are far more trustworthy than anything possible absent something like Bitcoin. We should be able to adopt them in the places where they make the most sense and carry the least risk, rather than try to force all of Bitcoin to the level of centralization needed to make processing $0.25 coffee-cup purchases economically efficient, while still doing a poor job of it.

Quote
Edit: With respect to CryptoNote and Monero, I do see merit in the argument that the fee penalty alone may not be enough to constrain blocksize; however when combined with the difficulty increase requirement the picture changes. As for the specifics of the Monero blockchain there are also other factors including dust from mining pools that led to the early bloating, and we must also keep in mind the CryptoNote and Monero also have built in privacy which adds to the blocksize by its very nature.
Yes, it's complicated, but it's also concerning. The only altcoins I'm aware of which have diverged from the current Bitcoin behavior on this front, and did so with the benefit of all our prior discussions available to them, have been suffering from crippling bloat. That they have stronger privacy features which make transactions somewhat larger is somewhat relevant, but the average mixing size on Monero is small... and we may need to adopt similar functionality in Bitcoin (especially as increased mining centralization, partially fueled by the cost of operating nodes, makes censorship more of a risk). This doesn't prove anything one way or another, just something to think about: the way it was originally presented sounded to me like you were saying it was solved over there, but instead I think that the experience in Bytecoin and Monero raises more questions than answers.
jubalix
Legendary
*
Offline Offline

Activity: 2618
Merit: 1022


View Profile WWW
July 26, 2014, 01:58:45 AM
 #7

Some reflection on backbone currencies may yield the view that the utility of high security far outweighs microtransactions for a store of wealth.

Backbones can run 80% of the economy; the other 20%, purchasing from Starbucks or retail, cannot.

Notice how when, say, Target or K-mart or some other retail store fails, nothing really changes in your quality of life, and another provider can pop up.

Notice how your quality of life changes when you make large payments, e.g. a house... these large payments are where the value is.

BTC seems to take a slant toward more security than TPS, which is a fine balance and probably the right one. The entire system crumbles if security is compromised, or even perceived to be lessened by centralization.

An example of a thin blockchain set up for high value and low TPS is Peercoin. Download that blockchain and see exactly how fast it is, in contrast to LTC/DOGE and basically everything else that is bloating itself out of existence by having too many "features" and not enough focus on a clear objective.


Admitted Practicing Lawyer::BTC/Crypto Specialist. B.Engineering/B.Laws

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
July 26, 2014, 02:34:58 AM
Last edit: July 26, 2014, 04:28:12 AM by ArticMine
 #8

This is all very true but it can be addressed by making an increase in difficulty a necessary requirement for an increase in the blocksize limit.
It seems like a reasonable thing, not sufficient on its own but reasonable... in fact, I've previously proposed it myself. (Not sufficient because, among other things, it doesn't do anything to keep incentives aligned or to keep centralization gains moderate.)

Quote
Now for under 20% of the cost of an iPad one can purchase a used laptop running GNU/Linux that is perfectly capable of handling a full Bitcoin node. If the 1-2% of Bitcoin users
Yes, but that's only true for a certain set of limits at a certain size. 100 MB blocks wouldn't be tolerable on thrifty resources like that today. You may be making an error in reading my message as some kind of opposition, instead of as trade-offs which must be understood and carefully handled.

Quote
True but this is essentially self defeating. If it turns out that a centralized or semi centralized solution can be competitive with Bitcoin in certain situations then so be it. This however should be a result of true market forces and not an arbitrary limit placed on Bitcoin.
Centralized services are inherently more efficient, enormously so. My desktop could handle 40,000 TPS with a simple ledger system, giving nearly instant confirmation... physics creates limits for decentralized systems in scale and latency, and that's okay: the decentralized systems are much better from a security perspective. My point about there being alternatives to increased scaling is not limited to (semi-)centralized systems; though those are useful tools which will exist in the ecosystem, there are decentralized approaches as well.

I also think it's wrong to think of semi-centralized systems as being in competition with Bitcoin; if they process transactions for Bitcoin value, they're part of the broader ecosystem, and Bitcoin's trustlessness should enable semi-centralized systems which are far more trustworthy than anything possible absent something like Bitcoin. We should be able to adopt them in the places where they make the most sense and carry the least risk, rather than try to force all of Bitcoin to the level of centralization needed to make processing $0.25 coffee-cup purchases economically efficient, while still doing a poor job of it.

Quote
Edit: With respect to CryptoNote and Monero, I do see merit in the argument that the fee penalty alone may not be enough to constrain blocksize; however when combined with the difficulty increase requirement the picture changes. As for the specifics of the Monero blockchain there are also other factors including dust from mining pools that led to the early bloating, and we must also keep in mind the CryptoNote and Monero also have built in privacy which adds to the blocksize by its very nature.
Yes, it's complicated, but it's also concerning. The only altcoins I'm aware of which have diverged from the current Bitcoin behavior on this front, and did so with the benefit of all our prior discussions available to them, have been suffering from crippling bloat. That they have stronger privacy features which make transactions somewhat larger is somewhat relevant, but the average mixing size on Monero is small... and we may need to adopt similar functionality in Bitcoin (especially as increased mining centralization, partially fueled by the cost of operating nodes, makes censorship more of a risk). This doesn't prove anything one way or another, just something to think about: the way it was originally presented sounded to me like you were saying it was solved over there, but instead I think that the experience in Bytecoin and Monero raises more questions than answers.


I consider the increase-in-difficulty requirement on its own a necessary but not sufficient condition, and the CryptoNote / Monero solution or something similar also necessary but not sufficient. My proposal is to require both, together forming a necessary and sufficient condition. Both suggestions have been made before individually, but I have not seen a proposal that requires both of them.

The problem of 100 MB blocks needs to be considered not just in terms of current technology, but in terms of technology costs down the road (2, 5, 10 years, etc.). I run a full Bitcoin node on an over-ten-year-old laptop (it still has its Windows 2000 logo and a floppy drive) that is far inferior in performance to what one can buy used for, say, 100-200 USD today.

The efficiency-of-centralization argument is on the surface very valid; furthermore, my desktop can also handle "40,000 TPS with a simple ledger system, giving nearly instant confirmation". The problem arises when an individual with value on your desktop wishes to do business with an individual with value on mine. It is then that the fees and costs start to get really expensive. There are two problems here:
First, businesses left to their own devices tend not to co-operate with their competitors to allow each other's customers to do business. They instead prefer to keep their customers for the most part locked up in their own "walled gardens". Just witness the behaviour of Apple with iOS or, while on the subject of coffee, Keurig adding DRM to its coffee makers to prevent customers from purchasing coffee from suppliers other than those approved by Keurig.
The second problem is regulatory: if we are in different jurisdictions, then we both now have to comply with two sets of regulators. As the number of jurisdictions increases, so does the number of regulators, and, even worse, so do the sometimes conflicting interactions between the various regulators each provider has to deal with.
These are the reasons why we see many innovative payment methods that work within a single jurisdiction but only a few, expensive options across international borders. Paying 2.50 USD for coffee at the local coffee shop is not where Bitcoin shines, but paying 2.50 USD for a good or service from a provider across the world is where Bitcoin can really shine. Centralized and semi-centralized solutions do have a role to play in reducing blockchain bloat, but they cannot by themselves solve the problem. For example, Coinbase provides both exchange and merchant services to persons in the US and requires a US bank account. If one of their exchange customers makes a purchase from one of their merchants with Bitcoin, the transaction does not need to go through the blockchain. On the other hand, if I in Canada make a purchase from a Coinbase merchant, my transaction does have to go through the blockchain, and I, by the way, obtained my BTC from Virtex, which requires Canadian citizenship as a condition of doing business with them!

One thing to note about Monero and Bytecoin is that Monero has roughly two orders of magnitude more transaction volume by value (BTC or USD) than Bytecoin, for very similar market capitalizations, so the stress test of bloat should happen in Monero long before it happens in Bytecoin, even though Bytecoin is the older coin.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
Challisto
Newbie
July 26, 2014, 04:41:59 AM
 #9

I think improving block propagation should take priority over block size. Currently block distribution is an O(n) operation: the bigger the block, the longer it takes to propagate, which means miners are forced to mine small blocks to reduce orphan rates. Even if the max block size is increased, there is no incentive for miners to use it.

This is a possible way to reduce block propagation time, if a solution to the drawback is found. Since most nodes will have a nearly identical mempool of transactions, a template could be sent from which the receiving node constructs the entire newly mined block. The benefits are that block propagation time is no longer directly tied to block size, and that bandwidth is reduced. The drawback is that if some transactions are missing, the receiving node has to send a request for them, and this extra round trip can end up being slower than just receiving the entire block from the start.
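The template idea above can be sketched as follows. This is a minimal Python illustration, not any deployed protocol; the names are hypothetical, and a production scheme would also need salted short IDs to resist deliberately engineered collisions.

```python
import hashlib

def short_id(txid: bytes) -> bytes:
    """Truncated hash used as a compact stand-in for a full transaction."""
    return hashlib.sha256(txid).digest()[:6]

def build_template(block_txids):
    """Sender: replace each transaction in the block with a short id."""
    return [short_id(t) for t in block_txids]

def reconstruct(template, mempool):
    """Receiver: rebuild the block from the local mempool.

    Returns (txids, missing), where `missing` lists the short ids the
    receiver must request in a follow-up round trip (the drawback noted
    above)."""
    index = {short_id(t): t for t in mempool}
    txids, missing = [], []
    for sid in template:
        if sid in index:
            txids.append(index[sid])
        else:
            txids.append(None)   # placeholder until the tx is fetched
            missing.append(sid)
    return txids, missing
```

When mempools match, the template alone suffices and propagation cost is proportional to the transaction count times six bytes, not the full block size; only the mismatched transactions cost an extra round trip.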

   
gmaxwell
Moderator, Legendary
July 26, 2014, 06:34:11 AM
Last edit: July 26, 2014, 07:47:37 AM by gmaxwell
 #10

Quote
while the CryptoNote / Monero solution or something similar is also a necessary but not a sufficient condition
The quadratic disincentive Bytecoin and its forks use is very bad; there is no upside to it. It doesn't achieve the advertised end, and it encourages people to pay miners directly instead of via fees, thus improving the position of large, well-known miners. (If you just mean using a median-controlled limit, sure, but that's an old proposal in these threads that has nothing to do with Bytecoin. I would point out that an explicit desired size is better than the actual size in that calculation, to avoid miners needing to pad their blocks or omit transactions they could have included just to express a desire for another limit.)

Sadly there is still no proposal I've seen which really closes the loop with user capabilities (especially bandwidth, as bandwidth appears to be the slowest of the relevant technologies to improve). At best I've seen proposals applying an additional local exponential cap based on historical bandwidth numbers, which seems very shaky, since small parameter changes can easily make the difference between too constrained and completely unconstrained. The best proxy I've seen for user choice is protocol rule limits, but those are overly brittle and hard to change.

Petertodd and jdillion previously had a POS-ish voting proposal that might be interesting, but anything in that family seemed complex to implement. The notion there was that signers of recent transactions, selected by non-interactive cut-and-choose using the block hashes and weighted by coin-days destroyed, could reveal signatures of block hashes after their transactions in order to register approval for increasing the block size. Miners can always decrease the block size unilaterally, would usually be in favor of increases, and can censor users; so if the users' consent is one-sided and needed only for increases, and the protocol is constructed so that users can't be forced to give consent in advance except by giving up their private keys, this actually has a fighting chance of keeping the users in the decision. But there are a lot of free parameters in how it could work, and giving a majority of coin holders a voice is actually a pretty poor proxy for doing something smarter: democracy is still an oppressive system which forces the will of some onto others, and those holding the most coins might be large corporations or governments who benefit from centralization. Still, it is better than just handing the decision unilaterally to miners.
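The sampling step of that proposal could look something like the following sketch. All names and parameters here are hypothetical; the proposal was never specified to this level of detail, and real coin-days-destroyed weighting would come from the UTXO set rather than a flat list.

```python
import hashlib
from bisect import bisect_right
from itertools import accumulate

def select_voters(recent_txs, block_hash: bytes, sample_size: int):
    """recent_txs: list of (txid, coin_days_destroyed).

    Deterministically sample txids with probability proportional to
    coin-days destroyed, seeded by a block hash so that neither miners
    nor users can choose the electorate. The owners of the selected
    transactions may then publish a signature over a later block hash
    to register approval of a size increase."""
    weights = [w for _, w in recent_txs]
    cumulative = list(accumulate(weights))
    total = cumulative[-1]
    chosen = []
    for i in range(sample_size):
        digest = hashlib.sha256(block_hash + i.to_bytes(4, "big")).digest()
        point = int.from_bytes(digest, "big") % total
        chosen.append(recent_txs[bisect_right(cumulative, point)][0])
    return chosen

def increase_approved(votes_seen: int, sample_size: int, threshold=0.5) -> bool:
    """Only active consent counts: silence is a 'no', which keeps the
    consent one-sided (needed for increases only) as described above."""
    return votes_seen / sample_size > threshold
```

The free parameters gmaxwell mentions are visible here: the sample size, the approval threshold, and how far back "recent" transactions reach.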
coinft
Full Member
July 26, 2014, 06:18:58 PM
 #11

Just an idea regarding incentives:

A miner who wants to mine a block larger than 1 MB needs to give up a part of the block reward. That way there is a strong incentive not to break the 1 MB limit unless there are enough transactions available whose fees make up for the lost reward (and the higher orphan risk).

Of course the actual numbers need to be determined with great care.

I would guess that even if this were introduced today with conservative numbers, it wouldn't be used much to mine larger blocks until several block reward halvings have passed, or transaction volume picks up a lot. Also, by reducing block rewards, this lowers the total coin supply below the theoretical maximum of 21M, unless further measures are taken. And there will have to be an upper limit for the time when block rewards reach zero.

Has anyone ever thought about such a system?

-coinft
gmaxwell
Moderator, Legendary
July 26, 2014, 06:54:21 PM
 #12

Quote
A miner who wants to mine a block larger than 1MB needs to give up a part of the block reward. That way there is a strong incentive not to break the 1MB limit unless there are enough TX available which make up for the lost part (and the higher orphan risk) in fees.
This is the approach we were talking about with Bytecoin above. The income is reduced by a multiple determined as a function of size². This, sadly, does not work in the long run once the subsidy is no longer an overwhelming portion of miner income, because transactions will just pay miners directly (e.g. just author N different transactions, one for each miner you know of, each including an output for that miner; Eligius started supporting being paid fees in this way in 2011).

A version that wasn't a multiple but instead an absolute number of coins to destroy might work, since you couldn't escape it by moving fees to outputs. But then you have a free parameter: you don't know the value of a bitcoin to its users, and as we've seen with fees, constants that depend on how much a bitcoin is worth easily get out of whack. Any magical thoughts there?
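The two mechanisms being contrasted here can be sketched side by side. This is a hedged Python illustration: the multiplicative rule only paraphrases the CryptoNote-style penalty (the exact formula and constants in the Bytecoin whitepaper differ in detail), and `burn_per_byte` is an invented parameter standing in for the "absolute number of coins" variant, not anything specified in the thread.

```python
def cryptonote_style_reward(base_reward: float, size: int, median: int) -> float:
    """Multiplicative penalty: the generated coin shrinks quadratically as
    the block grows past the recent median size, reaching zero at twice
    the median (blocks beyond that are invalid)."""
    if size <= median:
        return base_reward
    if size > 2 * median:
        raise ValueError("block invalid: more than twice the median size")
    excess = size / median - 1.0            # fraction over the median, in (0, 1]
    return base_reward * (1.0 - excess ** 2)

def absolute_burn_reward(base_reward: float, size: int, limit: int,
                         burn_per_byte: float) -> float:
    """Alternative: destroy a fixed number of coins per byte over the limit.
    This can't be dodged by moving fees into outputs paid to the miner,
    but `burn_per_byte` hard-codes an assumption about what a bitcoin is
    worth, the free parameter complained about above."""
    excess_bytes = max(0, size - limit)
    return base_reward - excess_bytes * burn_per_byte
```

The out-of-band-fee escape is easy to see in the first function: the penalty is a multiple of the generated coin, so a miner who is paid via transaction outputs instead of fees pays almost no penalty once the subsidy is small.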
coinft
Full Member
July 26, 2014, 08:15:15 PM
 #13

Quote
This is the approach we were talking about with Bytecoin above.  The income is reduced by a multiple determined as function of size^2. This, sadly, does not work in the long run once subsidy is not an overwhelming portion of miner income, because transactions will just pay miners directly (e.g. just author N different transactions, one for each miner you know of, each including an output for that miner. Eligius started supporting being paid fees in this way in 2011).

A version that wasn't a multiple but instead was an absolute number of coins to destroy might work, since you couldn't escape it by moving fees to outputs— but then you have a free parameter of not knowing the value of a bitcoin to its users— and as we've seen with fees, constants that depend on how much a bitcoin is worth easily get out of wack. Any magical thoughts there?

In what I proposed above you don't give up a percentage of the fees (which may be zero due to out-of-band fees) but a part of the block reward itself. And you need to do that for every single block that exceeds 1 MB; it does not change the limit for future blocks. For example, if giving up x% of the block reward lets you increase the block by N·x% of 1 MB, with N = 10, the block size limit ranges linearly from 1 MB (keeping the full 25 BTC subsidy) to 11 MB (forfeiting all of it). There is no way to "cheat" with out-of-band fees.

At the highest end there is no block reward at all. It is true this will become cheaper for miners as the block reward goes down, and eventually it will be free, but in my opinion that's a feature, compensating for cheaper future storage and processing resources. With N = 10 and today's subsidy, few miners would elect to create larger blocks unless the market changes a lot, but it becomes more reasonable over time.
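The N = 10 scheme above can be written down directly. A minimal sketch (the function names are mine, for illustration only, not from any implementation):

```python
BASE_LIMIT = 1_000_000   # bytes (the current 1 MB limit)
N = 10                   # leverage factor chosen in the post above

def max_block_size(forfeit_fraction: float) -> int:
    """Forfeiting a fraction x of the block subsidy raises the limit by
    N*x of the base limit: x = 1 (the whole subsidy) buys an 11 MB block."""
    assert 0.0 <= forfeit_fraction <= 1.0
    return int(BASE_LIMIT * (1 + N * forfeit_fraction))

def forfeit_required(size: int) -> float:
    """Inverse: the fraction of the subsidy a miner must destroy to make
    a block of `size` valid under this rule."""
    if size <= BASE_LIMIT:
        return 0.0
    x = (size / BASE_LIMIT - 1) / N
    if x > 1.0:
        raise ValueError("size beyond the 11 MB ceiling")
    return x
```

At today's 25 BTC subsidy, a 2 MB block costs 10% of the subsidy, 2.5 BTC, which is why few miners would use it now; after several halvings the same 10% forfeit becomes cheap, which is the "feature" being debated.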
gmaxwell
Moderator, Legendary
July 26, 2014, 09:35:39 PM
 #14

Quote
but in my opinion that's a feature compensating for cheaper future storage and processing resources
Storage in the not-so-distant future will not be free. Talk about "programmed destruction", yikes. What the Bytecoin scheme does reduces all the generated coin, including subsidy; what you're suggesting really is a duplicate of it, but less completely considered, so please check out the Bytecoin whitepaper. I suppose that leaving _out_ the fees at least avoids the bad incentive trap. It's still broken nonetheless, and you really can't wave your hands and ignore the fact that the subsidy will be pretty small in only a few years, especially with the same approach being apparently ineffective in Bytecoin and Monero while their subsidy is currently quite large.
coinft
Full Member
July 26, 2014, 09:53:59 PM
 #15

Quote
Storage in the (not so far distant future) will not be free. Talk about "programmed destruction" yikes. What the bytecoin stuff does reduces all the generated coin, including subsidy— what you're suggesting really is a duplicate of it, but less completely considered, please check out the bytecoin whitepaper. I suppose that leaving _out_ the fees at least avoids the bad incentive trap. It's still broken, none the less, and you really can't wave your hands and ignore the fact that subsidy will be pretty small in only a few years... esp with the same approach being apparently ineffective in bytecoin and monero when their subsidy is currently quite large.

Yes, it was an idea of the moment to fix the Bytecoin model, especially the adaptive limit. With N small enough it might not have bad effects, but it also limits blocks to only 1+N MB forever, which is conservative but maybe not worth the effort. This suits me fine; I would much prefer to compromise on trust and decentralization by handling small payments off the blockchain anyway, and keep the core system as small and trusted (distributed) as possible.

KriszDev
Sr. Member
July 27, 2014, 06:17:06 AM
 #16

Why do you want to change the 1 MB limit? Currently most blocks are <500 KB.
amincd (OP)
Hero Member
July 30, 2014, 11:17:43 PM
 #17

Quote
BTC seems to take a slant toward more security than TPS, which is a fine balance and probably the right one. The entire system crumbles if security is compromised, or even perceived to be lessened by centralization.

If Bitcoin is merely a high-powered money for international settlement between large commercial entities, it will have failed in its mission of providing the world with decentralized electronic cash. Bitcoin has to be accessible to the average Joe. We can draw the line at on-chain micro-transactions, but I don't see any reason why Bitcoin can't match Visa's transaction volumes. Satoshi's original calculations argued as much.

I do agree that security should take priority, but I believe a balance can be found that doesn't compromise BTC's decentralized structure while also not relegating it to only large-value international transactions.
gmaxwell
Moderator, Legendary
July 30, 2014, 11:35:19 PM
 #18

Quote
If Bitcoin is merely a high powered money for international settlement between large commercial entities, it will have failed in its mission of providing the world with a decentralized electronic cash. Bitcoin has to be accessible
Without commenting on the rest, the logic doesn't follow here. Not doing soda-pop purchases directly on the Bitcoin blockchain does not mean that Bitcoin hasn't provided people with decentralized electronic cash.

Quote
but I don't see any reason why Bitcoin can't match Visa's
The Bitcoin blockchain is a very different system from the Visa payment network, and it makes very different trade-offs. It will always be the case that in some respects the Bitcoin blockchain doesn't match Visa, just as much as Visa fails to match Bitcoin. If you insist your floor wax also be a tasty dessert topping, you may get something which is the worst of both worlds instead.
amincd (OP)
Hero Member
July 31, 2014, 02:57:03 AM
 #19

Quote
Without commenting on the rest, the logic doesn't follow here. It is not necessary that not doing soda pop buys directly in the Bitcoin blockchain means that Bitcoin hasn't provided people with decentralized electronic cash.

How so? How can Bitcoin be decentralized cash if it can't be used, over the decentralized network, for everyday purchases like buying soda pop?

Quote
The Bitcoin blockchain is a very different system from the visa payment work which makes very different trade-offs. It will always be the case that in some respects the bitcoin blockchain doesn't match visa, just as much as visa fails to match Bitcoin.

Visa's maximum possible transaction throughput will undoubtedly be higher than Bitcoin's, but Visa's current transaction volume, which is limited by consumer demand rather than by the technical capabilities of Visa's data centers, is something Satoshi envisioned Bitcoin matching:

http://www.mail-archive.com/cryptography@metzdowd.com/msg09964.html

Quote
The bandwidth might not be as prohibitive as you think.  A typical transaction
would be about 400 bytes (ECC is nicely compact).  Each transaction has to be
broadcast twice, so lets say 1KB per transaction.  Visa processed 37 billion
transactions in FY2008, or an average of 100 million transactions per day.  
That many transactions would take 100GB of bandwidth, or the size of 12 DVD or
2 HD quality movies, or about $18 worth of bandwidth at current prices.

If the network were to get that big, it would take several years, and by then,
sending 2 HD movies over the Internet would probably not seem like a big deal.
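The arithmetic in the quoted email checks out; a quick sanity check in Python using only the figures Satoshi cites:

```python
# Figures from the quoted email above.
tx_per_year = 37_000_000_000           # Visa transactions, FY2008
tx_per_day = tx_per_year / 365         # ~100 million transactions per day
bytes_per_tx = 400                     # typical transaction size (ECC is compact)
effective_bytes = 2 * bytes_per_tx     # broadcast twice; Satoshi rounds to 1 KB
daily_bytes = tx_per_day * 1_000       # using the 1 KB round figure

print(f"{tx_per_day / 1e6:.0f} million tx/day")
print(f"{daily_bytes / 1e9:.0f} GB/day")
```

This reproduces the roughly 100 million transactions per day and roughly 100 GB of daily bandwidth in the quote (the exact quotient is about 101 million per day, which Satoshi rounded down).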
gmaxwell
Moderator, Legendary
July 31, 2014, 03:44:23 AM
Last edit: July 31, 2014, 04:15:58 AM by gmaxwell
 #20

Quote
Without commenting on the rest, the logic doesn't follow here. It is not necessary that not doing soda pop buys directly in the Bitcoin blockchain means that Bitcoin hasn't provided people with decentralized electronic cash.
Quote
How can bitcoin be decentralized cash if it can't be used for everyday purchases like soda pop buys, while using the decentralized network?
It is perfectly possible to transact in Bitcoin without using the blockchain for every transaction (in a decentralized way too, though for buying soda pop, federated solutions may be much less costly).
Quote
That many transactions would take 100GB of bandwidth, or the size of 12 DVD or
Yep, though note that that post was from before the million-byte limit was added to Bitcoin, along with many other protections against loss of decentralization and against denial of service. It's an argument for the general feasibility of this class of approach, and indeed, it's fine. We're still not at a point where sending 100 GB/day is "not a big deal", nor is demand for Bitcoin transactions anything like that (and, arguably, if Bitcoin required 100 GB/day now, we never would reach that level of demand, because such a costly system at this point would be completely centralized and inferior to traditional banks and Visa). Visa's 2008 transaction volume is also a long way from the total volume of the world's small cash transactions, which you seemed to be envisioning. Bitcoin _can_ accommodate that, but not if you continue to believe you can shove all the world's transactions constantly into a single global broadcast network.