Bitcoin Forum
Author Topic: How a floating blocksize limit inevitably leads towards centralization  (Read 71509 times)
Zangelbert Bingledack (Legendary; Activity: 1036, Merit: 1000)
February 21, 2013, 11:13:53 AM  #281

Either way, the incentives are to create blocks so large that they only reliably propagate to a bit over 50% of the hashing power, *not* 100%

Anticipating what humans would be motivated to do in dynamic situations involving other humans is notoriously difficult. If there is already a soft limit in place below the hard one, yet we do not have miners padding their blocks to shut out the little guys, doesn't that immediately call the existence of this incentive into question?

And if the reason no one does this is because the soft limit is so effective, doesn't that then suggest that hard limits are unnecessary?

Either "the incentives are to create blocks so large that they only reliably propagate to a bit over 50% of the hashing power," or they are not. At least prima facie, it looks like you're trying to argue both halves of a contradiction.
markm (Legendary; Activity: 2940, Merit: 1090)
February 21, 2013, 11:17:05 AM  #282

Either way, the incentives are to create blocks so large that they only reliably propagate to a bit over 50% of the hashing power, *not* 100%

Anticipating what humans would be motivated to do in dynamic situations involving other humans is notoriously difficult. If there is already a soft limit in place below the hard one, yet we do not have miners padding their blocks to shut out the little guys, doesn't that immediately call the existence of this incentive into question?

No, because the hard limit is a hard limit on how large of a "little guy" they could push out, and that limit is, basically, guys so frikkin tiny that pushing them out gains so little the effect is lost in the noise of more not quite that extremely tiny guys coming online all the time.

Basically you can't kick out the little guy, given the current hard limit; you can only kick out the trivially tiny guy who isn't even worth the trouble of kicking out.

-MarkM-

Zangelbert Bingledack (Legendary; Activity: 1036, Merit: 1000)
February 21, 2013, 11:21:26 AM (last edit: 11:40:18 AM)  #283

No, because the hard limit is a hard limit on how large of a "little guy" they could push out, and that limit is, basically, guys so frikkin tiny that pushing them out gains so little the effect is lost in the noise of more not quite that extremely tiny guys coming online all the time.

Basically you can't kick out the little guy, given the current hard limit; you can only kick out the trivially tiny guy who isn't even worth the trouble of kicking out.

OK, that makes sense. I withdraw the argument.
Technomage (Legendary; Activity: 2184, Merit: 1056)
February 21, 2013, 11:38:39 AM  #284

Based on the calculations of how much bandwidth a certain block size would require, and how many transactions that size gives us, I would say that for now a 10MB limit would be quite enough. That would give us about as many transactions as PayPal handles, and it's arguable that we might not even need more than that.

There is no sign of Bitcoin being widely adopted in brick & mortar; if it only reaches PayPal-level adoption as a payment system, more might not actually be necessary. Bitcoin is too cumbersome a system to plan on it cheaply carrying any and all transactions; that should not be a valid goal at any point.

The 10MB limit would make it impossible for some people to mine, but that is life. Mining or running a full node is already out of reach for a very large portion of the world's people. The only things we really need to worry about when tweaking the block size are that it is unlikely to lead to a mining monopoly, and that some scarcity remains.

A 10MB limit would probably solve it for now. I think that if more were ever needed, it would actually have to be put to the test; by that I mean just keep it at 10MB and see what happens. Maybe we already need to do this: keep it at 1MB and see what happens.

I'm still all for some sort of floating max if a very well-thought-out model can be agreed upon, but otherwise I'd eventually just change it to 10MB and let it be.

Zangelbert Bingledack (Legendary; Activity: 1036, Merit: 1000)
February 21, 2013, 11:40:31 AM  #285

For clarification, what happens if a single high-bandwidth miner actually starts creating huge blocks that push out half the other miners? What would be the reaction? Is there really nothing the cut-off half could do? And even if they could do nothing, why would the surviving half go along with this, knowing that it spirals inevitably higher and will leave them out in turn?

This is another apparent contradiction: if the scenario in the OP is really a problem, why would even the upper-tier miners go along with unreasonable blocksize inflation, knowing half of them could be next to fall? It seems we must look not only at the incentives of the highest-bandwidth miners, but those of all miners - or at least the top 50% whose cooperation they apparently need.

The other miners aren't robots; they can anticipate such a problem just like retep did, and take pains to ensure it does not happen. They could ostracize pools that allow unreasonable blocksizes, etc. It feels like the dynamic human factor is being ignored.
hazek (Legendary; Activity: 1078, Merit: 1002)
February 21, 2013, 11:42:45 AM  #286

Technomage, I just wonder: if block space isn't scarce, causing fees to stay puny, just how are miners going to get funded once the block reward gets super small or goes to zero?

Technomage (Legendary; Activity: 2184, Merit: 1056)
February 21, 2013, 11:46:36 AM  #287

Technomage, I just wonder: if block space isn't scarce, causing fees to stay puny, just how are miners going to get funded once the block reward gets super small or goes to zero?

Indeed. That is why I've advocated retaining some form of scarcity for the block size. A model with no scarcity could be a disaster. Keeping it as is, though, and letting Bitcoin slide into a system where users' only function is to keep miners very rich and to pay $20 per transaction, is quite unacceptable. There has to be a middle ground. I feel a large majority of the userbase will support a middle ground because they want to continue to use Bitcoin more or less as they do now. Perhaps not exactly like now, but more or less.

Sukrim (Legendary; Activity: 2618, Merit: 1006)
February 21, 2013, 11:51:07 AM  #288

After all the only real discovery in a new block is the nonce value.
Well, that's what one is mining for, but also the timestamp is far from given (it has to be within a certain range) and some other fields can also be chosen by the miner. So while the nonce is the real "secret", there's still some local information that's not too easy to know.
Miners might be able to share these other fields between each other though (but is trying out a handful of timestamps really faster than just getting a header instead of a nonce...?).

With only a hash of the previous block header, one could already spend a few seconds mining an empty block. Then, as the merkle root and transaction hashes come in, you can start working out which transactions you can already forget about (i.e. move into that previous block), and you can already say which transactions you know of are definitely not yet in a block (you can't tell whether they are still valid, though, as one of the unknown transactions might have changed something). Then you get the missing transactions as well and start including the remaining valid transactions. You can do so earlier too, since you are mining on an empty block anyway - it might be better to "risk" building a block that might turn out valid than to do nothing.
Currently I guess this is done more or less at the same time, as there is no real issue with getting a block quickly.

As long as miners can have custom clients and relay blocks directly between them (which they should, to reduce stales), having rules that make blocks propagate slower through the network is a "fix" on the wrong end.
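
A minimal sketch of the two-phase idea described above: mine on an empty block as soon as the previous header hash is known, then rebuild the template once the full transaction list arrives. The hashing is toy double-SHA256 via hashlib, and the names (build_header, the placeholder txids) are illustrative only, not any real client's API.

```python
import hashlib
import struct
import time

def dsha256(b: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(txids: list[bytes]) -> bytes:
    """Toy Merkle root over a list of txids (coinbase first)."""
    layer = txids[:]
    if not layer:
        return b"\x00" * 32
    while len(layer) > 1:
        if len(layer) % 2:                      # duplicate last hash on odd layers
            layer.append(layer[-1])
        layer = [dsha256(layer[i] + layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]

def build_header(prev_hash: bytes, root: bytes, nonce: int) -> bytes:
    """80-byte header: version, prev hash, merkle root, time, bits, nonce."""
    return struct.pack("<I", 2) + prev_hash + root + \
           struct.pack("<III", int(time.time()), 0x1d00ffff, nonce)

# Phase 1: only the previous block's header hash is known, so mine an empty
# block (coinbase only) rather than sit idle.
coinbase_txid = dsha256(b"coinbase")            # placeholder coinbase
prev_hash = dsha256(b"previous header")         # placeholder prev block hash
template = [coinbase_txid]
header = build_header(prev_hash, merkle_root(template), nonce=0)

# Phase 2: the txid list of the new block arrives, so drop whatever it
# confirmed from our mempool and rebuild the template with what remains.
mempool = {dsha256(b"tx-a"), dsha256(b"tx-b"), dsha256(b"tx-c")}
confirmed = {dsha256(b"tx-b")}                  # txids seen in the new block
template = [coinbase_txid] + sorted(mempool - confirmed)
header = build_header(prev_hash, merkle_root(template), nonce=0)
```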

markm (Legendary; Activity: 2940, Merit: 1090)
February 21, 2013, 12:09:13 PM  #289

Ten times the block size seems like scarcity is banished far into the future in one huge jump.

Even just doubling it is a massive increase, especially while blocks are typically still far from full.

Thus to me it seems better never to more than double it in any one jump.

If tying those doublings to the halvings of the block subsidy is too slow a rate of increase, then maybe use Moore's Law or thereabouts, increasing by 50% yearly or by 100% every eighteen months.

It is hard to feel like there is anywhere close to being a "need" for more space when I have never yet had to pay a fee to transfer bitcoins.

-MarkM-
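
A quick back-of-the-envelope comparison of the schedules markm mentions (doubling only at each subsidy halving, assumed here to come roughly every four years, versus 50% yearly and 100% every eighteen months); purely illustrative numbers.

```python
# Rough comparison of growth schedules, starting from the current 1 MB limit.
# Assumes subsidy halvings roughly every four years.
for years in (4, 8, 12, 16):
    per_halving = 2 ** (years / 4)          # double once per ~4-year halving
    moore_50    = 1.5 ** years              # +50% every year
    moore_18mo  = 2 ** (years / 1.5)        # +100% every 18 months
    print(f"{years:2d}y  halving-doubling {per_halving:6.1f} MB   "
          f"50%/yr {moore_50:7.1f} MB   2x/18mo {moore_18mo:8.1f} MB")
```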

hazek (Legendary; Activity: 1078, Merit: 1002)
February 21, 2013, 12:19:07 PM  #290

Technomage, I just wonder: if block space isn't scarce, causing fees to stay puny, just how are miners going to get funded once the block reward gets super small or goes to zero?

Indeed. That is why I've advocated retaining some form of scarcity for the block size. A model with no scarcity could be a disaster. Keeping it as is, though, and letting Bitcoin slide into a system where users' only function is to keep miners very rich and to pay $20 per transaction, is quite unacceptable. There has to be a middle ground. I feel a large majority of the userbase will support a middle ground because they want to continue to use Bitcoin more or less as they do now. Perhaps not exactly like now, but more or less.

I already said in this thread, somewhere on page 5 or 6, that I'm not opposed to a compromise. But it has to be a compromise that isn't at the expense of what I call my Bitcoin sovereignty. I don't care if you increase the block size limit, as long as it doesn't mean, immediately or down the road, that I can no longer personally validate the rules miners enforce, and as long as I must still give my explicit consent to any rule change.

I think it would be a good exercise to further explore the likely game-play scenarios if the block size limit were doubled (or increased by some higher factor) every time the block reward is halved, and how long it would take for Bitcoin to be able to handle PayPal's volume of transactions while still keeping block space scarce and the blockchain at a reasonable size.
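
A rough sketch of the timeline hazek asks about, using only figures already floated in this thread (about 7 tps at the current 1 MB limit, roughly 10 MB pegged as "PayPal level" earlier, and halvings about every four years); all of these are loose assumptions, not measurements.

```python
import math

# Back-of-the-envelope: if the limit doubles only when the subsidy halves
# (~every 4 years), how long until capacity reaches "PayPal level"?
TPS_PER_MB = 7.0           # ~7 tps at the current 1 MB limit (per this thread)
START_MB = 1.0
TARGET_MB = 10.0           # Technomage's "PayPal level" estimate above
YEARS_PER_HALVING = 4.0

doublings = math.ceil(math.log2(TARGET_MB / START_MB))   # -> 4 doublings
years = doublings * YEARS_PER_HALVING                    # -> ~16 years
final_mb = START_MB * 2 ** doublings
print(f"{doublings} doublings, roughly {years:.0f} years, "
      f"ending at {final_mb:.0f} MB (~{TPS_PER_MB * final_mb:.0f} tps)")
```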

mp420 (Hero Member; Activity: 501, Merit: 500)
February 21, 2013, 01:58:01 PM  #291

Given this bandwidth limitation, here's a new proposal for the adjustment of maximum block size:

1) A boolean flag is added to each block. The flag represents the block solver's yes or no vote for increasing the block size. The independent miner or mining pool sets this flag according to their preference for an increase.

2) Every time the difficulty is adjusted, the number of yes votes is counted from the last adjustment. If the number of yes votes is greater than 90%, then the block size is increased by 1%. Both percentages are baked-in constants, requiring a hard fork to change.


I like this proposal. If we want to waste another bit per block for the vote, we could also have the options "decrease max block size" and "ignore my vote" (the other options being "increase" and "keep"). I think having a decrease option would be an elegant addition, in case of unexpected dynamic effects in the relatively distant future.

The thing I like most about this proposal is that it would only need one hardfork, and it could actually be implemented in a way that is relatively soft as hard forks go. Just count blocks without the vote field as "keep" votes. Fork can only happen after there are over 90% "increase" votes in the last adjustment period. Hm, maybe it should actually have a lag of one adjustment period (or a fixed number of blocks), in case of a chain reorg event.
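
A minimal simulation of the vote-counting rule quoted above (one flag per block, tallied at each difficulty retarget, at least 90% "increase" votes triggering a 1% bump), extended with mp420's suggested extra options. The structure and names are illustrative only, and folding "abstain" into "keep" is just one possible reading of the "ignore my vote" option.

```python
from collections import Counter

RETARGET = 2016                 # blocks per difficulty adjustment period
THRESHOLD = 0.90                # baked-in supermajority from the proposal
STEP = 0.01                     # baked-in 1% adjustment from the proposal

def adjust_limit(limit_mb: float, votes: list[str]) -> float:
    """Tally one retarget period's votes and adjust the max block size.

    Votes are 'increase', 'decrease', 'keep', or 'abstain'. Blocks mined by
    old clients with no vote field, and abstentions, are counted as 'keep',
    which is what makes the change relatively soft as hard forks go.
    """
    assert len(votes) == RETARGET
    tally = Counter(v if v in ("increase", "decrease", "keep") else "keep"
                    for v in votes)
    if tally["increase"] / RETARGET >= THRESHOLD:
        return limit_mb * (1 + STEP)
    if tally["decrease"] / RETARGET >= THRESHOLD:
        return limit_mb * (1 - STEP)
    return limit_mb

# Example period: ~92% of blocks vote to increase, the rest keep or abstain.
votes = ["increase"] * 1855 + ["keep"] * 100 + ["abstain"] * 61
print(adjust_limit(1.0, votes))   # -> 1.01
```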
Sukrim (Legendary; Activity: 2618, Merit: 1006)
February 21, 2013, 02:15:48 PM  #292

I don't see why 90% would be required rather than just a plurality of the 3 options (1% larger, same, 1% smaller). To block a change in the 90% model you need "only" 10% of the net hash rate. To get the outcome you want in a ">=33.4%" version, you need more than 3 times that hash power, and others can still force the opposite change by not voting uniformly.

To abstain, you'd just create blocks with alternating options so it evens out, or "abstention" could be added as a 4th option.
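
For reference, the shares the two rules turn on, as a tiny worked example (the ">=33.4%" figure is the plurality case Sukrim refers to, assuming the other two options split roughly evenly).

```python
# Shares of hash power that matter under the two voting rules above.
# 90%-supermajority rule: a coordinated ~10% of blocks can veto any change.
blocking_share_supermajority = 1 - 0.90        # 0.10

# Plurality-of-three rule (increase / keep / decrease): if the other two
# options split roughly evenly, just over a third (>=33.4%) of blocks wins.
winning_share_plurality = 1 / 3
print(blocking_share_supermajority, round(winning_share_plurality, 3))
```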

MoonShadow (Legendary; Activity: 1708, Merit: 1007)
February 21, 2013, 02:33:49 PM  #293


Was this earlier post insufficient?

If we want to cap the time of downloading overhead the latest block to say 1%, we need to be able to download the MAX_BLOCKSIZE within 6 seconds on average so that we can spend 99% time hashing.

At 1MB, you would need a ~1.7Mbps  connection to keep downloading time to 6s.
At 10MB, 17Mbps
At 100MB, 170Mbps

and you start to see why even 100MB block size would render 90% of the world population unable to participate in mining.
Even at 10MB, it requires investing in a relatively high speed connection.


No, because it only addresses the effects of one variable resource, thus assuming that all other variables would remain either static or significantly independent from the effects of this variable so as to be ignored.  This might be a valid assumption, but I cannot accept that as a given.  The core purpose of economic analysis is to be able to predict the effects of changes to all the variables, not just those you assume are dominant.  The unseen is usually of greater net effect on the outcome than the seen.

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
paraipan (Legendary, in memoriam; Activity: 924, Merit: 1004)
February 21, 2013, 02:36:38 PM  #294

So let me see if I have the math right according to the scalability article on Bitcoin.

Visa does 2000tps we can do 7....

Why can't people stop comparing Bitcoin with whatever payment system they can come up with?

It will never be used as the only payment system in existence, so the number of transactions per block will always balance itself out depending on the fees charged at any given time. By the way, increasing the block size over the 1MB hard limit will have a negative impact on fees: miners will no longer be able to raise them in a natural way by defending their self-interest and block-space scarcity. I really think Satoshi saw this coming when thinking about how to implement the halving block reward; the 1MB block limit allows almost 90% of the network to stay in sync while opening a new market for transaction inclusion in the blockchain. Genius!

d'aniel (Sr. Member; Activity: 461, Merit: 251)
February 21, 2013, 02:39:36 PM  #295

All the important protocol rules can be enforced by SPV clients if support for "error messages" is added to the network.  This is described here: https://bitcointalk.org/index.php?topic=131493.0

The trust model relies on information being hard to suppress, which is the same as the trust model nearly everyone running a full node is subscribing to in practice anyway by not personally vetting the source code.

Of course with little expenditure most people will still be able to run massively scaled full nodes, anyway, once all the proposed optimizations are implemented.  But it's at least nice to know even the smart phone clients can "have a vote".

If the transaction rate does reach such huge levels, then it strikes me that the hashing power funding problem has been solved automatically - all those default but technically optional half-cent transaction fees sure would add up.

It also strikes me as unlikely that a block size limit would actually achieve an optimal amount of hashing power.  Even in the case where most users have been driven off the blockchain - and some off of Bitcoin entirely - why should it?  Why shouldn't we just expect Ripple-like trust networks to form between the Chaum banks, and blockchain clearing to happen infrequently enough so as to provide an inconsequential amount of fees to miners?  What if no matter what kind of fake scarcity is built into the blockchain, transaction fees are driven somewhere around the marginal transaction fee of all possible alternatives?
MoonShadow (Legendary; Activity: 1708, Merit: 1007)
February 21, 2013, 02:40:45 PM  #296

One of the reasons why high bandwidth is required is because you need it in bursts every 10mins(on average).

Sending blocks with only hashes of TX is a start.

Any other optimisations which reduce the download size within that 6s period either by pre-downloading known information or by downloading unimportant data(data not needed to begin mining next block) later once mining has commenced will help to drastically reduce bandwidth requirements.

After all the only real discovery in a new block is the nonce value.

Indeed. If block propagation can omit the transaction data, and simply expect all mining clients to already have that data in their queue, then the actual block transmitted can be reduced to the 80-byte header and the Merkle hash tree. That should make the 6-second baseline trivial for any common broadband connection well into the future.
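
Rough arithmetic for what that hash-only relay would save: transmit the 80-byte header plus one 32-byte txid per transaction instead of the transactions themselves. The ~400-byte average transaction size is an assumed round figure for illustration.

```python
# How small does a block get if only the header and txids are relayed?
HEADER_BYTES = 80
TXID_BYTES = 32
AVG_TX_BYTES = 400          # assumed average transaction size
TARGET_SECONDS = 6          # the "1% of a 10-minute interval" budget

for block_mb in (1, 10, 100):
    full_bytes = block_mb * 1_000_000
    n_tx = full_bytes // AVG_TX_BYTES
    relay_bytes = HEADER_BYTES + n_tx * TXID_BYTES
    mbps_needed = relay_bytes * 8 / TARGET_SECONDS / 1_000_000
    print(f"{block_mb:>3} MB block: ~{n_tx} txs, "
          f"{relay_bytes / 1_000_000:.2f} MB of header+txids, "
          f"~{mbps_needed:.2f} Mbps to fetch in {TARGET_SECONDS} s")
```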

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
MoonShadow (Legendary; Activity: 1708, Merit: 1007)
February 21, 2013, 02:45:36 PM  #297


The other miners aren't robots; they can anticipate such a problem just like retep did, and take pains to ensure it does not happen. They could ostracize pools that allow unreasonable blocksizes, etc. It feels like the dynamic human factor is being ignored.

Thank you!  Finally, someone who understands!

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
paraipan (Legendary, in memoriam; Activity: 924, Merit: 1004)
February 21, 2013, 02:47:56 PM  #298

One of the reasons why high bandwidth is required is because you need it in bursts every 10mins(on average).

Sending blocks with only hashes of TX is a start.

Any other optimisations which reduce the download size within that 6s period either by pre-downloading known information or by downloading unimportant data(data not needed to begin mining next block) later once mining has commenced will help to drastically reduce bandwidth requirements.

After all the only real discovery in a new block is the nonce value.

Indeed. If block propagation can omit the transaction data, and simply expect all mining clients to already have that data in their queue, then the actual block transmitted can be reduced to the 80-byte header and the Merkle hash tree. That should make the 6-second baseline trivial for any common broadband connection well into the future.

If you think about this solution for enough time you realize it can get messy. Why? Because nodes are independent and hear about transactions on their own, so you can only mine a block and be sure everybody already has its transactions in their memory pool once those transactions have reached a certain age, say more than 10 minutes. Otherwise the orphan rate would be very high, with clients all around trying to untangle new blocks and match them up with the corresponding transactions from memory. Not a nice picture...
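
A sketch of the fallback this concern implies: when rebuilding a block from a relayed txid list, anything missing from the local mempool, or too young to assume everyone has it, needs an extra round trip. Function and field names here are hypothetical.

```python
import time

def reconstruct_block(txids, mempool, min_age_s=600, now=None):
    """Split a relayed txid list into transactions we already hold and ones
    we must fetch. `mempool` maps txid -> (tx_bytes, first_seen_timestamp).
    Only transactions older than `min_age_s` are assumed to be held by
    (nearly) everyone, per the ">10 minutes" idea above."""
    now = now or time.time()
    have, need = [], []
    for txid in txids:
        entry = mempool.get(txid)
        if entry and now - entry[1] >= min_age_s:
            have.append(entry[0])
        else:
            need.append(txid)          # too young or unknown: extra round trip
    return have, need

# Example: one old tx we hold, one that arrived 30 s ago, one we never saw.
now = time.time()
mempool = {"aa": (b"tx-aa", now - 900), "bb": (b"tx-bb", now - 30)}
have, need = reconstruct_block(["aa", "bb", "cc"], mempool, now=now)
print(len(have), need)   # -> 1 ['bb', 'cc']
```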

ShadowOfHarbringer (Legendary; Activity: 1470, Merit: 1005)
February 21, 2013, 02:52:20 PM  #299

I made a poll specially for this occasion, please come and vote:

https://bitcointalk.org/index.php?topic=145636.0

wtfvanity (Hero Member; Activity: 504, Merit: 500)
February 21, 2013, 03:33:57 PM  #300

If we want to cap the time of downloading overhead the latest block to say 1%, we need to be able to download the MAX_BLOCKSIZE within 6 seconds on average so that we can spend 99% time hashing.

At 1MB, you would need a ~1.7Mbps  connection to keep downloading time to 6s.
At 10MB, 17Mbps
At 100MB, 170Mbps

and you start to see why even 100MB block size would render 90% of the world population unable to participate in mining.
Even at 10MB, it requires investing in a relatively high speed connection.

Thank you for that so much. That is most definitely the clearest explanation I've seen yet. Even if miners sacrificed a few more seconds so the bandwidth requirements weren't as high, you would still only halve the required connection speed, making a 10 meg block potentially realistic today.

Something to remember, too, is that we are not packing every block to a full meg currently; most are a fraction of that. The maximum block size would only be needed for peak TPS, at least until a later date when transactions get slower.

Someone please check my math though; I'll work it out for just the 10 meg block.

At the 0.0005 minimum transaction fee, on blockchain.info I'm seeing about 0.5 BTC of fees per 250 KB of block size. That would be an additional 20 BTC per block with a fully loaded 10 meg block. Is that not a decent 600 USD reason to upgrade your internet connection?

And for those like hazek who simply want to verify the rules of Bitcoin, the bandwidth requirement for a 10 meg block is any first-world DSL connection, with plenty to spare.
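
Re-deriving the figures in this post so the math can be checked; the ~10 bits per byte on the wire is an assumed overhead allowance that roughly reproduces the quoted Mbps numbers, and the $30/BTC price is what the post's own 20 BTC to 600 USD conversion implies.

```python
# Bandwidth needed to pull a full block within the 6-second budget.
BITS_PER_BYTE_ON_WIRE = 10      # rough allowance for protocol overhead
DOWNLOAD_BUDGET_S = 6

def mbps_needed(block_mb: float) -> float:
    return block_mb * BITS_PER_BYTE_ON_WIRE / DOWNLOAD_BUDGET_S

for size_mb in (1, 10, 100):
    print(f"{size_mb:>3} MB block: ~{mbps_needed(size_mb):.1f} Mbps "
          f"to download in {DOWNLOAD_BUDGET_S} s")

# Fee check: ~0.5 BTC of fees per 250 KB (the blockchain.info figure above),
# so a full 10 MB block carries 40x that.
fees_btc = 0.5 * (10_000 / 250)       # 10 MB expressed in kilobytes -> 20 BTC
print(f"~{fees_btc:.0f} BTC in fees per full 10 MB block "
      f"(~${fees_btc * 30:.0f} at $30/BTC)")
```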
