Bitcoin Forum
Author Topic: Reduce Block Size Limit  (Read 1831 times)
mrvision (OP)
Sr. Member
Activity: 527
Merit: 250
May 19, 2015, 03:43:29 PM
 #1

It is so cool to have a size limit that I would propose reducing it further. Miners would be as happy as a rainbow.

It seems to me that they don't believe in market forces, and that they think a reduced size limit (scarcity) means higher profit for them. They are failing to see that transactions are already being made outside the blockchain because of that limit, so they are losing fees. Anyway, reduce it and they will see.

But of course they won't agree to that either, so ask them: is 1 MB a magical number? Why is it the correct size? Of course, in a free block economy market forces would determine the correct amount, but that is soooo risky...
DannyHamilton
Legendary
Activity: 3374
Merit: 4606
May 19, 2015, 03:50:20 PM
 #2

I'm not impressed with your straw man.

The issue isn't that 1MB is a "magical number".

The intelligent discussion revolves around determining how to avoid having this conversation again, and how to maintain the decentralized nature of the system.

If you are ignoring those two concerns, then you are ignoring the entire issue.
mrvision (OP)
Sr. Member
Activity: 527
Merit: 250
May 19, 2015, 07:42:33 PM
 #3

We're having this conversation over and over again because the block size limit has not been removed yet. The block size limit is a socialist production quota that removes the incentive to improve the p2p network in order to avoid centralization.

Suddenly this graph will become an S-curve because of that limit: https://blockchain.info/charts/n-transactions-excluding-popular?timespan=all&showDataPoints=false&daysAverageString=7&show_header=true&scale=0&address=

And then market forces will make miners beg for the limit to be increased as their memory pools begin to overflow.

And remember:
Quote from: Satoshi Nakamoto
The bandwidth might not be as prohibitive as you think.  A typical transaction
would be about 400 bytes (ECC is nicely compact).  Each transaction has to be
broadcast twice, so lets say 1KB per transaction.  Visa processed 37 billion
transactions in FY2008, or an average of 100 million transactions per day. 
That many transactions would take 100GB of bandwidth, or the size of 12 DVD or
2 HD quality movies, or about $18 worth of bandwidth at current prices.
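
As a rough sketch of the arithmetic in that quote (using the same assumed figures: roughly 1 KB per broadcast transaction and Visa's FY2008 volume):

Code:
# Back-of-the-envelope check of the figures in the quote above.
# Assumptions come straight from the quote: ~1 KB per broadcast transaction
# and Visa's FY2008 volume of 37 billion transactions.

visa_tx_per_year = 37_000_000_000        # 37 billion transactions in FY2008
tx_per_day = visa_tx_per_year / 365      # roughly 100 million per day
bytes_per_tx = 1_000                     # ~1 KB per transaction, per the quote

daily_gb = tx_per_day * bytes_per_tx / 1e9
print(f"transactions per day: {tx_per_day:,.0f}")
print(f"daily bandwidth: {daily_gb:,.0f} GB")   # roughly 100 GB, as the quote says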
gmaxwell
Moderator
Legendary
Activity: 4158
Merit: 8382
May 20, 2015, 08:11:22 AM
 #4

a socialist production quota that removes the incentive to improve the p2p network in order to avoid centralization.
Yes? And so? The limited total number of coins is a socialist production quota used to create the scarcity needed for Bitcoin to function as a money like good. The enforcement of digital signatures is a socialist constraint on the spend-ability of coins that makes possible something akin to ownership. Decentralization is no less a fundamental defining characteristic of Bitcoin than limited supply or ownership of coins -- that it must be protected shouldn't be up for debate anywhere; but reasonable people can easily disagree about the contours of trade-offs or the ramifications of decisions.

Following the general logic you suggest, nothing at all would be enforced, miners could publish whatever they wanted, and the system would be worthless. Bitcoin is a system that has any value at all because it enforces rules against behavior that would otherwise be permitted by the laws of nature.

This isn't to say that all limits are proper or good or well calibrated; but you cannot take a principled stance against any and all limits in general and then speak reasonably about Bitcoin at all.
JeromeL
Member
Activity: 554
Merit: 11
May 20, 2015, 10:02:37 AM
 #5

Quote from: mrvision
And remember:
Quote from: Satoshi Nakamoto
...

And trying to put words in Satoshi's mouth with a quote from five years ago is pretty silly.

Nobody knows what Satoshi would say about this maxblocksize issue in today's light, because things have changed a whole lot since then (mining centralization, the number of nodes dropping, etc.).

DumbFruit
Sr. Member
Activity: 433
Merit: 254
May 20, 2015, 01:17:53 PM
Last edit: May 20, 2015, 01:54:45 PM by DumbFruit
 #6

But of course they won't agree to that either, so ask them: is 1 MB a magical number? Why is it the correct size? Of course, in a free block economy market forces would determine the correct amount, but that is soooo risky...
They sure would. Markets tend to be highly efficient: at large scales, even in the most extreme cases they run at something like 10% profit margins (in finance, which is hardly a free market anyway), and in more typical situations below 2%.

In Bitcoin's case, this would mean that competitors in this space would drive transaction prices down to whatever it costs them to receive those transactions, plus about 10%. However, the cost of each transaction to the network is actually several hundred thousand times the cost to one node, because there are lots of nodes that all need to duplicate the same transaction.

So in order for your idea to work (Presumably you want the limit removed entirely) we would need to see profit margins in a free market of something like 10,000,000%.
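
To make the shape of that argument concrete, here is a toy calculation; the node count and per-node cost below are purely illustrative assumptions, not measurements:

Code:
# Toy illustration of the redundancy argument above: the fee a single
# competitive miner would accept versus what the whole network spends to
# process the same transaction. All numbers are illustrative assumptions.

full_nodes = 6_000                  # assumed number of full nodes validating/storing
cost_per_node_per_tx = 0.000001     # assumed marginal cost to one node, in USD

cost_to_network = full_nodes * cost_per_node_per_tx
competitive_fee = cost_per_node_per_tx * 1.10    # one node's cost plus ~10% margin

print(f"cost to the whole network per tx: ${cost_to_network:.6f}")
print(f"fee a competitive miner would accept: ${competitive_fee:.8f}")
print(f"shortfall factor: {cost_to_network / competitive_fee:,.0f}x")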

That's only the tip of the iceberg, though, because obviously we wouldn't want to see all that profit go to a single entity; we'd want it spread across all miners in proportion to their work. And if you can figure out a way to do all that, then maybe we don't even need free markets.

TLDR; Something about cakes and eating them.

By their (dumb) fruits shall ye know them indeed...
mrvision (OP)
Sr. Member
Activity: 527
Merit: 250
May 20, 2015, 07:41:24 PM
 #7

I think you people believe that if you remove the block size limit, then block space won't be a scarce resource. But it still would be, since very big blocks won't be able to propagate quickly enough and won't end up in the longest chain. That means that at each moment in history, the world's bandwidth will act as a dynamic block size limit. Miners will need to deploy tactics to discover which size is the optimum, and that will make fees drop to the most efficient level.
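
To illustrate the kind of effect I mean, here is a rough sketch of the usual orphan-risk approximation (the bandwidth and latency figures are assumptions, not measurements of the real network):

Code:
import math

# Sketch of "propagation acts as a soft limit": a bigger block takes longer to
# reach the rest of the hash power, and during that delay a competing block may
# be found, orphaning it. Bandwidth and latency figures are assumed.

BLOCK_INTERVAL = 600.0   # average seconds between blocks

def propagation_seconds(block_bytes, bandwidth_bps=1_000_000, base_latency=2.0):
    """Assumed simple model: fixed latency plus size divided by bandwidth."""
    return base_latency + block_bytes * 8 / bandwidth_bps

def orphan_probability(block_bytes):
    """P(a competing block appears while ours propagates), Poisson approximation."""
    return 1.0 - math.exp(-propagation_seconds(block_bytes) / BLOCK_INTERVAL)

for mb in (1, 8, 32, 128):
    print(f"{mb:>4} MB block -> ~{orphan_probability(mb * 1_000_000):.1%} orphan risk")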

I don't see why centralization is more likely to occur in this kind of ecosystem (which would be prepared to scale) than in a capped one, but I see lots of benefits in a free Bitcoin ecosystem.

Moreover, think about this: have you noticed that when more roads are built, more cars drive on them? That's because there is latent demand from drivers waiting for the roads to be built. The same happened when the block size was increased from 256 KB to 1 MB: we kicked the can, no centralization happened, and the number of transactions grew.

Check this again and remember -> Block size was increased on 12 March 2013:
https://blockchain.info/charts/n-transactions-excluding-popular?timespan=all&showDataPoints=false&daysAverageString=7&show_header=true&scale=0&address=

DumbFruit
Sr. Member
Activity: 433
Merit: 254
May 20, 2015, 08:44:57 PM
Last edit: May 20, 2015, 08:57:43 PM by DumbFruit
 #8

I think you people believe...
If you want to have a discussion you should quote somebody and make your arguments based on their position.

I don't see why centralization is more likely to occur in this kind of ecosystem (which would be prepared to scale) than in a capped one, but I see lots of benefits in a free Bitcoin ecosystem.
You were told by at least two people exactly why lifting the limit would necessarily lead to centralization.

We know there is more transactional information in the world than we could ever hope to fit in a sufficiently decentralized blockchain using today's technology. There is zero incentive for miners to not fill the blocks entirely; almost any non-zero fee would be sufficient.

The hope is that the block space will be scarce enough to bid transaction fees high enough to cover some semblance of decentralization when inflation stops. Whatever that limit should be we can only say for certain that it is not "infinite".

If the block space is infinite then transaction costs, through competition, would fall to roughly the rate needed to operate a single node (when inflation stops), because this is the most competitive configuration in a free market.

By their (dumb) fruits shall ye know them indeed...
DannyHamilton
Legendary
Activity: 3374
Merit: 4606
May 20, 2015, 09:08:23 PM
 #9

I think you people believe that if you remove the block size limit, then block space won't be a scarce resource.

You think incorrectly.  You continue to prop up and knock down the same straw man.  I'm sure there are some uninformed individuals that are concerned that removing (or increasing) the block size limit will result in the block space no longer being a scarce resource, but those that are aware of the issues know better. Perhaps take the time to get a better understanding of the issues being discussed?

But it still would be, since very big blocks won't be able to propagate quickly enough and won't end up in the longest chain.

That depends on the bandwidth of the majority of the miners, doesn't it?  Those that have the resources to access the highest bandwidth will be able to shut out entirely those that have lesser bandwidth by intentionally creating blocks that are too big for the lower bandwidth miners (and pools) to propagate fast enough.

That means that at each moment in history, the world's bandwidth will act as a dynamic block size limit. Miners will need to deploy tactics to discover which size is the optimum, and that will make fees drop to the most efficient level.

Optimum for those with access to faster bandwidth isn't the same as optimum for everyone.

I don't see why centralization is more likely to occur in this kind of ecosystem (which would be prepared to scale) than in a capped one, but I see lots of benefits in a free Bitcoin ecosystem.

There are a variety of attack vectors that open up when block size is unlimited.  It is important to discuss them all, consider the ramifications, and make an intelligent decision about what will best protect the functionality of bitcoin, rather than making a knee-jerk reactionary "unlimited is always better!" decision.

Moreover, think about this: have you noticed that when more roads are built, more cars drive on them?

Nope.  Haven't noticed that.  As a matter of fact there are several roads in my town that were built within the past 8 years and have seen very little traffic on them at all.  Anyhow, we aren't talking about roads here, we're talking about something that has a very different set of issues and requirements.

That's because there is latent demand from drivers waiting for the roads to be built. The same happened when the block size was increased from 256 KB to 1 MB: we kicked the can, no centralization happened, and the number of transactions grew.

Check this again and remember -> Block size was increased on 12 March 2013:

There was a hard 256kb limit on maximum acceptable blocksize?  Are you sure about that?  I don't remember that.  Regardless, there's a significant difference in risk between increasing the block size limit and removing it.
DannyHamilton
Legendary
Activity: 3374
Merit: 4606
May 20, 2015, 09:27:14 PM
 #10

There is zero incentive for miners to not fill the blocks entirely; almost any non-zero fee would be sufficient.

You are mistaken.  There are physical limits and costs that would prevent this.  Each additional transaction increases the size of the block.  There are costs associated with increasing the size of a block.  At a minimum, there is a (very small) increase in the chance that the block will be orphaned.  Miners (and pools) would need to consider the cost of that risk and weigh it against the revenue gained from adding transactions.  Additionally, there are storage costs associated with holding the list of unconfirmed transactions in memory.  Since some miners may choose to set a higher fee requirement, users would find that their transaction may not confirm quite as quickly with a lower fee.
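
The trade-off can be written as a simple inequality: a miner adds a transaction only if its fee exceeds the expected loss from the extra orphan risk its bytes create. A sketch, with the propagation model and every parameter assumed purely for illustration:

Code:
import math

# Sketch of the marginal decision described above: include a transaction only
# if fee > (value at stake in the block) * (extra orphan probability its bytes add).
# The propagation model and all parameters are assumed for illustration.

BLOCK_INTERVAL = 600.0     # seconds
BLOCK_VALUE = 25.0         # BTC lost if the block is orphaned (2015-era subsidy)
BANDWIDTH_BPS = 10_000_000 # assumed effective propagation bandwidth

def extra_orphan_prob(extra_bytes):
    """Marginal orphan probability added by extra_bytes, Poisson approximation."""
    extra_seconds = extra_bytes * 8 / BANDWIDTH_BPS
    return 1.0 - math.exp(-extra_seconds / BLOCK_INTERVAL)

def worth_including(fee_btc, tx_bytes=400):
    expected_loss = BLOCK_VALUE * extra_orphan_prob(tx_bytes)
    return fee_btc > expected_loss

print(worth_including(0.0001))    # True: the fee comfortably beats the tiny marginal risk
print(worth_including(0.00001))   # False: below break-even under these assumptions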

The hope is that the block space will be scarce enough to bid transaction fees high enough to cover some semblance of decentralization when inflation stops.

Yes, but that scarcity could potentially be controlled by the costs associated with adding a transaction to a block.  However, at the moment, that size may be large enough to have other repercussions that need to be considered and dealt with. If I remember correctly the 1 MB limit wasn't added to create scarcity, it was added to protect the blockchain from certain attack vectors with an assumption that it would be increased significantly in the future.

Whatever that limit should be we can only say for certain that it is not "infinite".

Obviously, it would be impossible for anyone to create (or broadcast) an "infinite" block.  The question that mrvision appears to be attempting to discuss is whether that limit should be arbitrarily chosen, should be based on some reasoning by the consensus of the community, or should be controlled by market forces based on the physical limitations and costs associated with increasing the block size.

justusranvier
Legendary
Activity: 1400
Merit: 1009
May 20, 2015, 09:33:20 PM
 #11

The limited total number of coins is a socialist production quota used to create the scarcity needed for Bitcoin to function as a money like good.
Almost all the terms of art used in that sentence are used incorrectly.

Money is not a good and the production of currency is not a service. Scarcity as a concept does not apply to currency production.

Money is a ledger whose value is derived (among other things) from its accuracy.

Creating new entries in the ledger by fiat (issuing currency) reduces the accuracy of the ledger and thus the value of money.

Unfortunately since it's not possible for currency to exist before it exists, it must be issued and thus no monetary ledger can ever be perfect.

What makes Bitcoin's reward schedule tolerable is the fact that it's not subject to change, and that the distortion decays exponentially over a reasonable timescale so that we expect it to behave close enough to an ideal ledger in the medium term.

Also that there is no clear candidate for a less-harmful method of bringing a currency into existence.
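
For reference, the decay comes from the fixed subsidy schedule: the block reward halves every 210,000 blocks, so new issuance shrinks exponentially relative to the coins already outstanding. A small sketch:

Code:
# Sketch of the fixed reward schedule: 50 BTC initial subsidy, halving every
# 210,000 blocks (roughly four years). Shows how quickly the dilution of the
# existing supply decays across halving eras.

INITIAL_SUBSIDY = 50.0
HALVING_INTERVAL = 210_000

outstanding = 0.0
for era in range(8):
    subsidy = INITIAL_SUBSIDY / (2 ** era)
    new_coins = subsidy * HALVING_INTERVAL
    if outstanding > 0:
        ratio = new_coins / outstanding
        print(f"era {era}: subsidy {subsidy:6.3f} BTC, dilutes prior supply by {ratio:.1%}")
    else:
        print(f"era {era}: subsidy {subsidy:6.3f} BTC, initial issuance {new_coins:,.0f} BTC")
    outstanding += new_coins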
gmaxwell
Moderator
Legendary
Activity: 4158
Merit: 8382
May 20, 2015, 09:52:32 PM
 #12

There was a hard 256kb limit on maximum acceptable blocksize?  Are you sure about that?  I don't remember that.  Regardless, there's a significant difference in risk between increasing the block size limit and removing it.

There was a target on the size-- not a blockchain validation rule (this has created some confusion for people because they go look back at old discussions about the temporary target and how easy it would be to increase, and think it was about the blocksize limit);  but that was just local policy, by default miners running stock software wouldn't create blocks over 250k, but all nodes would happily accept larger blocks up to the validation rule limit. When that policy-target was upped we saw a massive influx of things like unsolicited advertisement transactions, which also increased when it was increased further. The only actual limit on block sizes (beyond the message encoding behavior) has only ever been the million byte limit.

There is zero incentive for miners to not fill the blocks entirely; almost any non-zero fee would be sufficient.
There are physical limits and costs that would prevent this.  Each additional transaction increases the size of the block.  There are costs associated with increasing the size of a block.  At a minimum, there is a (very small) increase in the chance that the block will be orphaned.
The only _fundamental_ cost is communicating the discrepancy between the transactions included and the assumed included transactions.  This can be arbitrarily low, e.g. if miners delay a little to include only somewhat older, well propagated transactions-- the cost then is not a question of "size" but of breaking rank with what other miners are doing (and, in fact, producing a smaller block would be more costly).

Even without optimal differential transmission, and looking only at techniques which are nearly _universally_ deployed by large miners today: with the relay network protocol the marginal cost of including an already relayed transaction is two bytes per transaction. I can no longer measure a correlation between block size and orphaning rate, though there was a substantial one a few years ago, before newer technology mostly eliminated the size-related impact on orphaning.
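
To illustrate the magnitude of that difference, here is a sketch comparing naive full-block relay with short-ID relay; the two-byte figure is from the paragraph above, the other byte counts are assumptions:

Code:
# Illustration of the relay-network point above: once peers already have the
# transactions, a block can be announced with short IDs instead of full bytes.
# The 2-bytes-per-transaction figure is from the text; the rest are assumptions.

AVG_TX_BYTES = 500           # assumed average transaction size
SHORT_ID_BYTES = 2           # per already-relayed transaction, as described above
FIXED_OVERHEAD = 1_000       # assumed header plus per-block overhead

def naive_relay_bytes(n_tx):
    return FIXED_OVERHEAD + n_tx * AVG_TX_BYTES

def short_id_relay_bytes(n_tx, already_seen_fraction=1.0):
    seen = int(n_tx * already_seen_fraction)
    return FIXED_OVERHEAD + seen * SHORT_ID_BYTES + (n_tx - seen) * AVG_TX_BYTES

n = 2_000   # roughly a full 1 MB block of 500-byte transactions
print(naive_relay_bytes(n))      # ~1,001,000 bytes to send the whole block again
print(short_id_relay_bytes(n))   # ~5,000 bytes when every transaction was pre-relayed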

Importantly, to whatever extent residual marginal cost exists, these costs can be completely eliminated by consolidating the control of mining into larger pools. We saw people intentionally centralizing pooling as a response to orphaning already (two years ago), which prompted the creation of the block-relay-network/protocol to try to remove some of that centralization pressure by reducing the cost of block relay, so there was less gain to lowering the cost by centralizing. Moreover, any funds being spent coping with these costs (e.g. paying for faster connectivity to the majority of the hash-power) cannot be funds spent on POW security.  So I would refine DumbFruit's argument to point out that it isn't that "fees would naturally be priced at zero" but that the equilibrium is one where there is only a single full node in the network (whose bandwidth costs the fees pay for) and no POW security, because that is the most efficient configuration and there is no in-system control or pressure against it, and no ability to empower the users to choose another outcome except via the definition of the system.  I believe this is essentially the point that he's making with "the most competitive configuration in a free market"-- even to the extent those costs exist at all, they are minimized through maximal centralization.  This is why it is my belief that it's essential that the cost of running a node be absolutely low and relatively insignificant compared to POW security, or otherwise centralizing is a dominant strategy for miners.

Quote
storage costs associated with holding the list of unconfirmed transactions in memory.
One does not need to store transactions in memory ever-- that Bitcoin Core currently does so is just an engineering artifact, and because there is currently no reason not to.  Technically a miner does not need to store a transaction they've verified in any way at all, beyond remembering that it successfully verified (and even that memory doesn't need to be reliable). Depending on people not getting around to writing more efficient software, or not forming more efficient (e.g. more centralized) institutions, would be a weak protection indeed!

Quote
or should be controlled by market forces based on the physical limitations and costs associated with increasing the block size.
That's a problem when the physical limitations largely do not exist, and to the extent that they do exist they can be eliminated almost completely by configuring the ecosystem in a more centralized manner (and incrementally so: given an existing ecosystem with block-relay related costs, you can always mitigate those costs by centralizing a little bit more).

DannyHamilton
Legendary
Activity: 3374
Merit: 4606
May 20, 2015, 10:30:08 PM
 #13

There was a hard 256kb limit on maximum acceptable blocksize?  Are you sure about that?  I don't remember that.  Regardless, there's a significant difference in risk between increasing the block size limit and removing it.
The only actual limit on block sizes (beyond the message encoding behavior) has only ever been the million byte limit.

That's what I thought.  Thanks for the confirmation.

I'll give the rest of your comments some thought.  There are some facts and some concepts in there that I hadn't yet considered.
DumbFruit
Sr. Member
Activity: 433
Merit: 254
May 21, 2015, 12:43:30 PM
Last edit: May 21, 2015, 01:48:21 PM by DumbFruit
 #14

You are mistaken.  There are physical limits and costs that would prevent this.  Each additional transaction increases the size of the block.  There are costs associated with increasing the size of a block.  At a minimum, there is a (very small) increase in the chance that the block will be orphaned.  Miners (and pools) would need to consider the cost of that risk and weigh it against the revenue gained from adding transactions.  Additionally, there are storage costs associated with holding the list of unconfirmed transactions in memory.  Since some miners may choose to set a higher fee requirement, users would find that their transaction may not confirm quite as quickly with a lower fee.
Yes, mrvision was saying the same thing. "Epsilon" might have been a better way to put it. The fact is that the cost to a node is many orders of magnitude less than the cost to the network as a whole. Just saying "it's not zero" whitewashes the fact that it is much, much less than what's needed to compensate a highly redundant network. You're right, I'm guilty of hyperbole.

The hope is that the block space will be scarce enough to bid transaction fees high enough to cover some semblance of decentralization when inflation stops.

Yes, but that scarcity could potentially be controlled by the costs associated with adding a transaction to a block.  However, at the moment, that size may be large enough to have other repercussions that need to be considered and dealt with. If I remember correctly the 1 MB limit wasn't added to create scarcity, it was added to protect the blockchain from certain attack vectors with an assumption that it would be increased significantly in the future.
It wasn't the intention, but ultimately that is the only thing that's going to prevent transaction fees from plummeting.

Whatever that limit should be we can only say for certain that it is not "infinite".

Obviously, it would be impossible for anyone to create (or broadcast) an "infinite" block.  The question that mrvision appears to be attempting to discuss is whether that limit should be arbitrarily chosen, should be based on some reasoning by the consensus of the community, or should be controlled by market forces based on the physical limitations and costs associated with increasing the block size.
Same as above, I assumed from the way he was writing that his preference would be to let the market handle it entirely, without any imposed block size limit.
Some sort of algorithmically increasing blocksize limit is reasonable, but at the same time it is very weird to consider when we don't even know what the Bitcoin network is going to look like once inflation stops at the current block size limit. What if it turns out, after the block size limit is increased (or increased algorithmically), that transaction fees don't support the network? Are we going to see backpedaling at the glacial pace of a Bitcoin fork?

Edit: Er... Just noticed GMaxwell already said what I meant;
Quote from: GMaxwell
So I would refine DumbFruit's argument to point out that it isn't that "fees would naturally be priced at zero" but that the equilibrium is one where there is only a single full node in the network (whose bandwidth costs the fees pay for) and no POW security, because that is the most efficient configuration and there is no in-system control or pressure against it, and no ability to empower the users to choose another outcome except via the definition of the system.  I believe this is essentially the point that he's making with "the most competitive configuration in a free market"-- even to the extent those costs exist at all, they are minimized through maximal centralization.

By their (dumb) fruits shall ye know them indeed...
mrvision (OP)
Sr. Member
Activity: 527
Merit: 250
May 21, 2015, 03:59:53 PM
 #15

What is the problem with transaction fees plummeting? As long as fees per block remain constant (or higher) it shouldn't matter if fees per transaction drop. Probably fees per transaction will drop at the same time that the number of transactions increases, so miners (and other market actors) will tweak their systems to get the maximum income possible. In fact, we can see that bigger blocks get bigger fee rewards right now, when there isn't much demand for more than 1 MB of block space.

You can check this yourself on blockchain.info.

Quote from: DumbFruit
The hope is that the block space will be scarce enough to bid transaction fees high enough to cover some semblance of decentralization when inflation stops.

If the block size is not removed or increased, then people will transact bitcoins outside the blockchain (just as already happens inside Bitstamp). People will implement centralized wallets which hold bitcoins at one address as collateral while moving balances between users in a centralized database. And even though this looks like a smart solution, if bitcoins are not moving around, miners won't be getting those fees, which puts the whole ecosystem in danger.

If we want miners to earn transaction fees, then transactions must be made on the blockchain and not outside it.

About centralization... I still believe that centralization comes from the increase in difficulty (which, by the way, is something we have known about from the beginning) more than from transaction costs, and on that front we could talk at length about how to create incentives to use P2Pool or any other such system.

Again: reduce the block size limit and see how Bitcoin is not going to become any more decentralized.
DumbFruit
Sr. Member
Activity: 433
Merit: 254
May 21, 2015, 07:03:03 PM
Last edit: May 21, 2015, 08:25:54 PM by DumbFruit
 #16

What is the problem with transaction fees plummeting? As long as fees per block remain constant (or higher) it shouldn't matter if fees per transaction drop.
The point is that the fees an individual node will accept do not reflect the cost of a transaction to the system as a whole. So whatever the amount of fees per block, we can be sure that it would be too low to support all nodes once inflation stops... and blocks are too big... or demand is too low.

If the block size is not removed or increased, then people will transact bitcoins outside the blockchain (just as already happens inside Bitstamp). People will implement centralized wallets which hold bitcoins at one address as collateral while moving balances between users in a centralized database. And even though this looks like a smart solution, if bitcoins are not moving around, miners won't be getting those fees, which puts the whole ecosystem in danger.

If we want miners to earn transaction fees, then transactions must be made on the blockchain and not outside it.
The centralization of Bitcoin through small blocks is real, but it happens through a different mechanism. If the maximum number of transactions per second is saturated at a certain quantity of users, then the ratio of indirect to direct users increases. That doesn't represent atrophy of hash power, or of the number of mining entities, or of the number of direct users. On one extreme we have centralization toward handling only the expensive transactions of the wealthy few who can bid their way in, and on the other extreme we have atrophy of nodes.

I don't want to see a small fraction of the world having total control over Bitcoin in either direction, and what I think a lot of people would like to see is an automatic way of finding that balance. Just making the argument to increase the block size misses the larger point, as DannyHamilton pointed out earlier. Developers don't want that responsibility, and users don't want them to have it either.

By their (dumb) fruits shall ye know them indeed...
mrvision (OP)
Sr. Member
Activity: 527
Merit: 250
May 21, 2015, 08:39:20 PM
 #17

I don't want to see a small fraction of the world having total control over Bitcoin in either direction, and what I think a lot of people would like to see is an automatic way of finding that balance.
Yeah, but if in 3 years we're handling 2000 transactions per second, you must assume those will be done on Coinbase's ledger (or similar) and not on the blockchain. That means we will have done a lot of work only to circle back to the same point as the gold standard, with the same danger of ending up in a fractional reserve system. Only a few actors (other Coinbase-like services) will use the blockchain to move coins between systems.


The point is that the fees an individual node will accept do not reflect the cost of a transaction to the system as a whole. So whatever the amount of fees per block, we can be sure that it would be too low to support all nodes once inflation stops... and blocks are too big... or demand is too low.

Do you realize that the same statement with "blocks are too big" changed to "blocks are too small" doesn't make the situation any better? Big blocks allow more demand to produce transactions. So let's imagine that the optimal block size is 20 MB at one point in history in order to optimize propagation and so on; then more people would be able to pay fees than with 1 MB.

Of course, because the optimal size is 20 MB and not 200 MB, people would still have to bid with transaction fees, but the main difference is that all the demand would be satisfied, because an individual could increase the optimal block size by broadcasting a transaction with a 1 BTC fee, making the optimum a 30 MB block at that particular time (as 1 BTC would pay for his transaction's costs plus many others). This means that, in a collaborative way, with each fee we would be contributing to increasing the optimum size of the block where our transaction gets recorded. Again, transactions with bigger fees would have preference, as they contribute more than the others.

The thing is that the optimum block size would be dynamically discovered in such a way that the total amount of fees pays for the block's production. With a 1 MB block size you cannot assure that, because you are supposing people will pay something like 4 BTC to be one of the VIPs who can make a transaction, and worse, that the rest of the transactions will simply be lost or returned to the payer. This is dysfunctional.
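
As a rough sketch of the kind of dynamic optimum I mean (the orphan-risk model is the simple Poisson approximation, and the bandwidth, subsidy and fee figures are assumptions for illustration only): a miner choosing the block size that maximizes expected revenue finds that the optimum moves with the fees on offer.

Code:
import math

# Toy model of a "dynamically discovered" optimum block size: pick the size s
# that maximizes (subsidy + fee_rate * s) * exp(-propagation(s) / 600), i.e.
# the total reward discounted by the orphan risk of a slower-propagating block.
# Bandwidth, subsidy and fee rates are illustrative assumptions.

SUBSIDY = 25.0                  # BTC
BANDWIDTH_BPS = 1_000_000       # assumed effective propagation bandwidth
K = 8 / (BANDWIDTH_BPS * 600)   # orphan-risk rate per byte

def expected_revenue(size_bytes, fee_per_byte):
    return (SUBSIDY + fee_per_byte * size_bytes) * math.exp(-K * size_bytes)

def best_size(fee_per_byte, step=100_000, max_bytes=200_000_000):
    return max(range(0, max_bytes, step),
               key=lambda s: expected_revenue(s, fee_per_byte))

for fee_per_byte in (2.5e-7, 5e-7, 1e-6):   # roughly 0.0001 to 0.0004 BTC per 400-byte tx
    print(f"fee rate {fee_per_byte:.1e} BTC/byte -> optimum ~{best_size(fee_per_byte) / 1e6:.0f} MB")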

Also, I have just checked that when the SOFT limit of 250 KB was lifted to 1 MB, we didn't experience more centralization but less.
2 years ago: https://i.imgur.com/83W3Q2U.png
Now: https://blockchain.info/es/pools
BIT-Sharon
Sr. Member
Activity: 266
Merit: 250
May 26, 2015, 01:34:35 AM
 #18

I'm not impressed with your straw man.

The issue isn't that 1MB is a "magical number".

The intelligent discussion revolves around determining how to avoid having this conversation again, and how to maintain the decentralized nature of the system.

If you are ignoring those two concerns, then you are ignoring the entire issue.

I agree that maintaining the decentralized nature of the system is the key.
DumbFruit
Sr. Member
Activity: 433
Merit: 254
May 26, 2015, 04:44:42 PM
Last edit: July 10, 2015, 09:13:58 PM by DumbFruit
 #19

[Redacted] I cut out my original response to a document to revise some bad math, and then my computer immediately crashed. So never mind, dead-horse stuff anyway. I'm surprised Word didn't save it.

By their (dumb) fruits shall ye know them indeed...