How a floating blocksize limit inevitably leads towards centralization


cjp:
I think we need to have a block size limit. My original objection against removing the block size limit was that, as the number of new coins per block drops to zero, mining incentive will also drop to zero if there is nothing to keep transaction fees above zero (transaction capacity has to be "scarce"). The OP showed an entirely new way things can go wrong if there is no block size limit.

I don't see how making the block size limit "auto-adjustable" is different in this respect from having no block size limit at all.

In my opinion, the future block size limit can be very high, to allow for very high (but not unlimited) transaction volume. But it has to be low enough to prevent all the problems related to unlimited block sizes.

See the paper I presented in this thread: https://bitcointalk.org/index.php?topic=94674.0. Chapter 3 contains some estimates about the scalability of different concepts. I mention it here because it contains estimates of the number of transactions needed for different technologies when used worldwide for all transactions. Assuming 2 transactions per person per day (pppd) for 10^10 people, these are some of the conclusions:
- normal Bitcoin system: 1e8 transactions/block
- when my proposed system is widely used: 1e5 transactions/block

That should give you an idea of how high the block size limit should be. Maybe it should even be a bit lower, to increase scarcity a bit and, at the current level of technology, to allow normal-PC users to verify the entire block chain. For comparison: the current limit is around 1e3 transactions/block.
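For a quick back-of-the-envelope check of those figures (a sketch only; the 144 blocks per day and the ~250-byte average transaction size are my assumptions, not numbers from the paper):

```python
# Rough check of the transactions-per-block estimates above.
# Assumptions (not from the paper): one block per ~10 minutes
# (144 blocks/day) and an average transaction size of ~250 bytes.

PEOPLE         = 10**10      # world population used in the estimate
TX_PER_DAY     = 2           # transactions per person per day (pppd)
BLOCKS_PER_DAY = 24 * 6      # one block every ~10 minutes

tx_per_block = PEOPLE * TX_PER_DAY / BLOCKS_PER_DAY
print(f"everyone on-chain: ~{tx_per_block:.1e} tx/block")   # ~1.4e8

# The ~1e3 tx/block "current limit" follows from the 250 KB soft
# limit at the assumed ~250 bytes per transaction:
print(f"250 KB soft limit: ~{250_000 / 250:.0e} tx/block")   # ~1e3
```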

Quote from: Gavin Andresen on February 18, 2013, 05:14:32 PM

So, as I've said before:  we're running up against the artificial 250K block size limit now, I would like to see what happens. There are lots of moving pieces here, so I don't think ANYBODY really knows what will happen (maybe miners will collectively decide to keep the block size low, so they get more fees.  Maybe they will max it out to force out miners on slow networks.  Maybe they will keep it small so their blocks relay through slow connections faster (maybe there will be a significant fraction of mining power listening for new blocks behind tor, but blasting out new blocks not via tor)).

I'd like to see that too, since it's IMHO such an important piece of Bitcoin, and I'd rather have it tested now than when the whole world starts using Bitcoin; after the successful halving of the block reward, this is the next big step.

Quote from: Gavin Andresen on February 18, 2013, 05:14:32 PM

I think we should put users first. What do users want? They want low transaction fees and fast confirmations. Lets design for that case, because THE USERS are who ultimately give Bitcoin value.


I think the users want more than that, at least in the current Bitcoin community. Bitcoin's most unique characteristics come from its decentralized nature; if you lose that, everything else is in danger. If you just want low fees and fast confirmation, Bitcoin is not the right technology: it would be far more efficient to have a couple of centralized debit card issuers who issue properly secured cards without chargeback. Every transaction would only need to be verified and stored once or twice, so there would be almost no costs (and hence almost no transaction fees), and confirmation would be near-instantaneous.

Gavin Andresen:
RE: lots of code to write if you can't keep up with transaction volume:  sure.  So?

Quote from: retep on February 18, 2013, 07:09:02 PM

Transaction volume itself leads to centralization too, simply by ensuring that only a miner able to keep up with the large volume of low-fee transactions can make a profit.


I really don't understand this logic.

Yes, it is a fact of life that if you have a system where people are competing, the people who are less efficient will be driven out of business. So there will be fewer people in that business.

You seem to be saying that we should subsidize inefficient miners by limiting the block size, therefore driving up fees and making users pay for their inefficiency.

All in the name of vague worries about "too much centralization."

Peter Todd:
Quote from: Gavin Andresen on February 18, 2013, 07:18:36 PM

RE: lots of code to write if you can't keep up with transaction volume:  sure.  So?


Well, one big objection is that the code required is very similar to that required by fidelity-bonded bank/ledger implementations, but unlike the fidelity-bonded stuff, because it's consensus code, screwing it up creates problems that are far more difficult to fix and far more widespread in scale.


Quote from: Gavin Andresen on February 18, 2013, 07:18:36 PM

Quote from: retep on February 18, 2013, 07:09:02 PM

Transaction volume itself leads to centralization too, simply by ensuring that only a miner able to keep up with the large volume of low-fee transactions can make a profit.


I really don't understand this logic.

Yes, it is a fact of life that if you have a system where people are competing, the people who are less efficient will be driven out of business. So there will be fewer people in that business.

You seem to be saying that we should subsidize inefficient miners by limiting the block size, therefore driving up fees and making users pay for their inefficiency.


"This mining this is crazy, like all that work when you could just verify a transaction's signatures, and I dunno, ask a bunch of trusted people if the transaction existed?"

So, why do we give miners transaction fees anyway? Well, they are nominally providing the service of "mining a block", but the real service they provide is being independent from other miners, and we value that because we don't want >50% of the hashing power to be controlled by any one entity.

When you say these small miners are inefficient, you're completely ignoring what we actually want miners to do, and that is to provide independent hashing power. The small miners are the most efficient at providing this service, not the least.

The big issue is that the cost of being a miner comes in two forms: hashing power and overhead. The former is what makes the network secure. The latter is a necessary evil, and it costs the same for every independent miner. Fortunately, with 1MiB blocks the overhead is low enough that individual miners can profitably mine on P2Pool, but with 1GiB blocks P2Pool mining just won't be profitable. We already have 50% of the hashing power controlled by about three or four pools - if running a pool requires thousands of dollars' worth of equipment, the situation will get even worse.
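To put a rough number on that fixed overhead (my own sketch; the 144 blocks per day figure is an assumption, and real overhead also includes verification CPU, UTXO storage and so on):

```python
# Rough illustration of how the fixed per-miner overhead scales with
# block size, independently of how much hashing power a miner has.
# Assumes one block per ~10 minutes (144 blocks/day).

MIB, GIB = 1024**2, 1024**3
BLOCKS_PER_DAY = 24 * 6

for label, block_size in [("1 MiB blocks", MIB), ("1 GiB blocks", GIB)]:
    per_day = block_size * BLOCKS_PER_DAY
    print(f"{label}: ~{per_day / GIB:,.1f} GiB of block data per day "
          f"to download, verify and store")
# 1 MiB -> ~0.1 GiB/day; 1 GiB -> ~144 GiB/day.
# This cost is the same whether a miner has 1% or 0.001% of the hash
# power, which is why large blocks hit small and P2Pool miners hardest.
```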

Of course, we've also been focusing a lot on miners, when the same issue applies to relay nodes too. Preventing DoS attacks on the flood-fill network is going to be a lot harder when most nodes can't verify blocks fast enough to know whether a transaction is valid or not, and hence whether the limited resource of priority or fees is actually being expended by broadcasting it. Yet if the "solution" is fewer relay nodes, you've broken the key security assumption that information is easy to spread and difficult to stifle.

Quote from: Gavin Andresen on February 18, 2013, 07:18:36 PM

All in the name of vague worries about "too much centralization."


Until Bitcoin has undergone a serious attack we just aren't going to have a firm idea of what's "too much centralization".

cjp:
Quote from: Gavin Andresen on February 18, 2013, 07:18:36 PM

I really don't understand this logic.

Yes, it is a fact of life that if you have a system where people are competing, the people who are less efficient will be driven out of business. So there will be fewer people in that business.

You seem to be saying that we should subsidize inefficient miners by limiting the block size, therefore driving up fees and making users pay for their inefficiency.

All in the name of vague worries about "too much centralization."


It's interesting, and a bit worrying too, to see the same ideological differences of the "real" world come back in the world of Bitcoin.

In my view, the free market is a good but inherently unstable system. Economies of scale and network effects favor large parties, so large parties can get larger and small parties will disappear, until only one or just a few parties are left. You see this in nearly all markets nowadays. Power also speeds up this process: more powerful parties can eliminate less powerful parties, and less powerful parties can only survive if they subject themselves to more powerful parties, so the effect is that power tends to centralize.

For the anarchists among us: this is why we have governments. It's not because people once thought it was a good idea, it's because that happens to be the natural outcome of the mechanisms at work in society.

In light of this, and because the need for bitcoins primarily comes from the need for a decentralized, no-point-of-control system, I think it's not sufficient to call worries about centralization "vague": you have to clearly defend why this particular form of centralization cannot be dangerous. The default is "centralization is bad".

Mike Hearn:
In the absence of a block size cap, miners can be supported using network assurance contracts. It's a standard way to fund public goods, which network security is, so I am not convinced by that argument.
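For readers unfamiliar with the term, an assurance contract is a pledge pool that only pays out if a funding target is reached. The toy model below is my own illustration of the mechanism, not an actual Bitcoin implementation; on Bitcoin one way to build it is with SIGHASH_ANYONECANPAY inputs added to a shared payout transaction, which this sketch ignores entirely:

```python
# Toy model (illustration only) of an assurance contract: pledges only
# become payable if the combined total reaches the target; otherwise
# every pledger keeps their money.

def assurance_contract(pledges: dict[str, float], target: float):
    total = sum(pledges.values())
    if total >= target:
        return {"funded": True, "paid_to_miners": total}
    return {"funded": False, "refunded": pledges}

# Example: users collectively fund the next block's security
# (names and amounts are made up).
pledges = {"alice": 0.4, "bob": 0.3, "carol": 0.5}   # BTC
print(assurance_contract(pledges, target=1.0))
# -> {'funded': True, 'paid_to_miners': 1.2}
```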

I feel these debates have been going on for years. We just have wildly different ideas of what is affordable or not.

Perhaps I've been warped by working at Google so long, but 100,000 transactions per second just feels totally inconsequential. At 100x the volume of PayPal, each node would only need to be a single machine, and not even a very powerful one. So there's absolutely no chance of Bitcoin turning into a PayPal equivalent even if we stop optimizing the software tomorrow.
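To make that concrete with some rough arithmetic (a sketch only; the ~250-byte average transaction size is my assumption, not a figure from this thread):

```python
# Rough arithmetic for what 100,000 tx/s would mean for a single node.
# Assumption: ~250 bytes per transaction on average.

TX_PER_SEC = 100_000
TX_BYTES   = 250

bandwidth_mb_s = TX_PER_SEC * TX_BYTES / 1e6
storage_tb_day = TX_PER_SEC * TX_BYTES * 86_400 / 1e12

print(f"~{bandwidth_mb_s:.0f} MB/s of raw transaction data")   # ~25 MB/s
print(f"~{storage_tb_day:.1f} TB of new chain data per day")   # ~2.2 TB/day
# Sequential disk and LAN speeds handle 25 MB/s easily; the harder part
# is signature verification, which is what the ed25519 point below is about.
```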

But we're not going to stop optimizing the software. Removing the block cap means a hard fork, and once we've decided to do that we may as well throw in some "no brainer" upgrades as well, like supporting ed25519, which is orders of magnitude faster than ECDSA+secp256k1. Then a single strong machine can go up to hundreds of thousands of transactions per second.
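As a sanity check on the signature-verification point, here is a minimal micro-benchmark sketch using the third-party Python `cryptography` package (my choice of library; the absolute and relative numbers depend entirely on the underlying implementation, so this only shows how to measure, rather than reproducing the "orders of magnitude" figure, which refers to optimized native implementations):

```python
# Micro-benchmark sketch: ed25519 vs ECDSA/secp256k1 verification speed.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, ed25519

MSG = b"\x00" * 32   # stand-in for a transaction hash
N = 2_000

def bench(verify):
    start = time.perf_counter()
    for _ in range(N):
        verify()
    return N / (time.perf_counter() - start)

ed_sk  = ed25519.Ed25519PrivateKey.generate()
ed_sig = ed_sk.sign(MSG)
ed_pk  = ed_sk.public_key()

ec_sk  = ec.generate_private_key(ec.SECP256K1())
ec_sig = ec_sk.sign(MSG, ec.ECDSA(hashes.SHA256()))
ec_pk  = ec_sk.public_key()

print(f"ed25519:         {bench(lambda: ed_pk.verify(ed_sig, MSG)):,.0f} verifies/s")
print(f"ECDSA/secp256k1: "
      f"{bench(lambda: ec_pk.verify(ec_sig, MSG, ec.ECDSA(hashes.SHA256()))):,.0f} verifies/s")
```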

The cost of a Bitcoin transaction is just absurdly low and will continue to fall in future. It's like nothing at all. Saying Bitcoin is going to get centralized because of high transaction rates is kinda like saying in 1993 that the web can't possibly scale because, if everyone used it, web servers would fall over and die. Well yes, they would have done, in 1993. But not everyone started using the web overnight, and by the time they did, important web sites were all using hardware load balancers and multiple data centers, and it was STILL cheap enough that Wikipedia - one of the world's top websites - could run entirely off donations.
