Author Topic: A Scalability Roadmap  (Read 14933 times)
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1014

Let's talk governance, lipstick, and pigs.


View Profile
October 11, 2014, 10:26:36 AM
 #61

Has the dust attack threat been abated? Block size was an issue at one time.

Any significantly advanced cryptocurrency is indistinguishable from Ponzi Tulips.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 129
Merit: 14



View Profile
October 11, 2014, 06:25:11 PM
Last edit: October 11, 2014, 06:45:35 PM by jonny1000
 #62

Limiting block size creates an inefficiency in the bitcoin system.  Inefficiency = profit.  This is a basic law of economics, though it is usually phrased in such a way as to justify profits by pointing out that they eliminate inefficiencies.  I am taking the other position, that if we want mining to be profitable then there needs to be some artificial inefficiency in the system, to support marginal producers.  Of course that profit will attract more hashing power thus reducing/eliminating the profit, but at a higher equilibrium.  However, I am not too worried about this aspect of large block sizes.  It is a fairly minor problem and one that is a century away.

+1

Very good point hello_good_sir.  I was trying to say this but you put it in a far more articulate way. I think we may need some artificial inefficiency at some point.



If supply is not constrained, transaction fees fall to the marginal cost, mining profit falls and then miners exit and the difficulty falls.  The remaining miners can then find blocks more easily, but they don’t necessarily get compensated more for this, because the fees would still be low.
If there are fewer miners competing for the same amount of transaction fees, then each miner's revenue has increased. The process you describe will continue until the oversupply of miners is corrected and equilibrium is restored.

Yes, but a key factor to consider is: what equilibrium?  Will this be at a high enough difficulty, and if not, do we need to manipulate the market?  Users pay transaction fees for their transactions to be included in a block; users are not directly paying for network security or network consensus.  After the block reward falls, the incentive for network consensus can be considered an indirect consequence of users paying for their transactions to be included in blocks, and therefore a pure, unrestricted competitive market may not be an effective mechanism for determining transaction fees.  Getting a transaction included in a block and the network reaching consensus about the longest chain may be two slightly different things.  There is a mismatch here which I think some people miss.  This could be somewhat analogous to the classic tragedy of the commons problem.
LiteCoinGuy
Legendary
*
Offline Offline

Activity: 1148
Merit: 1014


In Satoshi I Trust


View Profile WWW
October 11, 2014, 07:28:58 PM
 #63

...Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all...

I think what he meant was that it would be like SWIFT in that it would mostly be for large international transfers. A fork like this will have to happen sooner or later.


Sooner, please.  Let's do the major changes now and build the rest on top of bitcoin in other layers.  There will (hopefully) be no more changes once we reach a market cap of 100 or 500 billion.

IIOII
Legendary
*
Offline Offline

Activity: 1153
Merit: 1012



View Profile
October 11, 2014, 08:51:19 PM
 #64

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I imagine that a dynamic mechanism would be much better at encouraging responsible (resource preserving) network use.
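To make the idea concrete, here is a rough sketch of what such a difficulty-style retarget could look like (Python, purely my own illustration; every constant is a placeholder, not a worked-out proposal):

Code:
ADJUSTMENT_INTERVAL = 2016        # blocks, same cadence as the difficulty retarget (assumed)
SAFETY_MARGIN       = 2.0         # aim for a limit of 2x observed usage (assumed)
MAX_STEP            = 1.2         # limit may move at most 20% per interval (assumed)
FLOOR               = 1_000_000   # never drop below the current 1 MB

def retarget_block_size_limit(current_limit, recent_block_sizes):
    """Compute the next limit from the sizes of the last interval's blocks."""
    window = recent_block_sizes[-ADJUSTMENT_INTERVAL:]
    observed = sum(window) / len(window)
    target = observed * SAFETY_MARGIN
    # Clamp so one interval of stuffed (or empty) blocks cannot move the
    # limit by more than MAX_STEP up or down.
    clamped = max(current_limit / MAX_STEP, min(current_limit * MAX_STEP, target))
    return max(FLOOR, clamped)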

I'm very sceptical regarding a fixed-percentage increase, because there is zero assurance that Moore's "law" will remain true in the future; as you know, past performance is no indicator of future results, and we're quickly approaching the atomic level in storage solutions, for example. Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1011



View Profile
October 11, 2014, 10:24:21 PM
 #65

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.

There's risk in everything and nothing is absolute.  This attitude would yield the obvious answer: "Don't ever raise the block limit at all".
hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 12, 2014, 03:47:25 AM
 #66

Just exposing some ideas:

Gavin's plan appears to me to be VERY conservative (maybe too much so).

To be able to process the same number of transactions as VISA, Bitcoin would have to grow ~2,000x.
The size of blocks would have to go up at least ~1,000x to accommodate so many transactions.
And we do not just want to take on VISA's burden; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take 20 years to overtake VISA alone, never mind the unbanked and DACs.

You're not thinking about safety.  Yes, it would be nice for bitcoin to be able to handle 2,000 times as many transactions as it can now; however, that is not as important as keeping bitcoin decentralized.  Let's keep in mind why bitcoin was created: to create a digital gold standard so that people could protect their assets from central banks.  If bitcoin also becomes a ubiquitous payment system that would be great, but not if it comes at the expense of decentralization.

hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
October 12, 2014, 03:50:13 AM
 #67

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.

IIOII
Legendary
*
Offline Offline

Activity: 1153
Merit: 1012



View Profile
October 12, 2014, 04:03:48 PM
 #68

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

I've read the initial post of that thread several times and I think that its headline is a bit misleading. Essentially what Peter Todd is saying is that a large blocksize limit in general encourages the miners to drive out low-bandwidth competition. He is actually opposing Gavin's plan as well:

I primarily want to keep the limit fixed so we don't have a perverse incentive. Ensuring that everyone can audit the network properly is secondary.

If there was consensus to, say, raise the limit to 100MiB that's something I could be convinced of. But only if raising the limit is not something that happens automatically under miner control, nor if the limit is going to just be raised year after year.

According to Peter Todd it is essential that miners do not control the blocksize limit. He argues based on the assumption of a rolling-average mechanism that takes its data from previously observed block sizes. But that's not an argument against a dynamic block size limit (increase/decrease) in general. The point is that the dynamic block size limit should not be (substantially) influenceable by miners, but only by the transacting parties. So if it were possible to determine the dynamic block size limit from the number of transactions multiplied by a fixed "reasonably large" size constant, plus a safety margin, you would get rid of the problem.
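As a sketch of what I mean (illustrative Python only; the per-transaction constant, window and margin are assumptions, not concrete values I am proposing):

Code:
TYPICAL_TX_SIZE = 500      # bytes; the fixed "reasonably large" per-transaction constant (assumed)
SAFETY_MARGIN   = 1.5      # 50% headroom over observed demand (assumed)
WINDOW          = 2016     # blocks to look back over (assumed)

def demand_driven_limit(tx_counts_per_block):
    """Derive the limit from transaction counts, i.e. from the transacting
    parties' demand, rather than from the raw block sizes miners choose."""
    window = tx_counts_per_block[-WINDOW:]
    avg_tx_per_block = sum(window) / len(window)
    return int(avg_tx_per_block * TYPICAL_TX_SIZE * SAFETY_MARGIN)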


Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.

There's risk in everything and nothing is absolute.  This attitude would yield the obvious answer: "Don't ever raise the block limit at all".

I'd rather say: "Only raise the block size limit when required, and by the minimum amount necessary."
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1013



View Profile
October 12, 2014, 04:27:55 PM
 #69

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
They can only do this as long as network bandwidth is donated and the consumers of it do not pay the suppliers.

Fix that problem and we'll never need to have this debate again.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1011



View Profile
October 12, 2014, 06:01:13 PM
 #70

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

I've read the initial post of that thread several times and I think that its headline is a bit misleading. Essentially what Peter Todd is saying is that a large blocksize limit in general encourages the miners to drive out low-bandwidth competition. He is actually opposing Gavin's plan as well:

It was simply because many heavy-hitters were expressing opposing views that I found the thread informative.

According to Peter Todd it is essential that miners do not control the blocksize limit. He argues based on the assumption of a rolling-average mechanism that takes its data from previously observed block sizes. But that's not an argument against a dynamic block size limit (increase/decrease) in general. The point is that the dynamic block size limit should not be (substantially) influenceable by miners, but only by the transacting parties. So if it were possible to determine the dynamic block size limit from the number of transactions multiplied by a fixed "reasonably large" size constant, plus a safety margin, you would get rid of the problem.

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Linked by Peter was an idea from Gavin of determining an appropriate block size by the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

I'd rather say: "Only raise the block size limit when required, and by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.
cbeast
Donator
Legendary
*
Offline Offline

Activity: 1736
Merit: 1014

Let's talk governance, lipstick, and pigs.


View Profile
October 12, 2014, 07:04:53 PM
 #71

I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
They can only do this as long as network bandwidth is donated and the consumers of it do not pay the suppliers.

Fix that problem and we'll never need to have this debate again.
It doesn't need to be fixed; it needs to be offered by suppliers. If there is a demand, they will supply it. They are not supplying it because vendors and consumers are not aware of the issue. Education is what's needed. If they can be shown the profitability, then they will fill the niche.

minime
Hero Member
*****
Offline Offline

Activity: 588
Merit: 500



View Profile
October 13, 2014, 11:12:53 AM
 #72

just do it...
IIOII
Legendary
*
Offline Offline

Activity: 1153
Merit: 1012



View Profile
October 13, 2014, 12:37:50 PM
 #73

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Linked by Peter was an idea from Gavin of determining an appropriate block size by the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

My aim is a broad call for a (re)consideration of a dynamic "demand-driven" block size limit mechanism. The best adjustment estimators have yet to be determined. I think the concept should not be prematurely dismissed, because it could be highly beneficial in terms of resource preservation and hence decentralization.

The problem of selfish large miners creating millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also do a statistical correction based on IP addresses (those that frequently submit only blocks with an excessively huge number of transactions get less weight).
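A quick illustration (made-up numbers) of why the median is so much more robust here than the mean:

Code:
from statistics import mean, median

# 2016-block window: mostly ~400-tx blocks, plus 100 blocks stuffed with
# 50,000 self-paying transactions by a large miner (hypothetical attack).
normal  = [400] * 1916
stuffed = [50_000] * 100
window  = normal + stuffed

print(mean(window))    # ~2860 tx/block -- a mean-based limit inflates roughly 7x
print(median(window))  # 400 tx/block   -- the median barely notices the attack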


I'd rather say: "Only raise the block size limit when required, and by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off the chain. However, what "small" means is open to debate.
teukon
Legendary
*
Offline Offline

Activity: 1246
Merit: 1011



View Profile
October 13, 2014, 05:27:01 PM
 #74

The problem of selfish large miners creating millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also do a statistical correction based on IP addresses (those that frequently submit only blocks with an excessively huge number of transactions get less weight).

This is starting to sound hairy to me.  I can easily imagine that 60% of the largest miners would benefit sufficiently from the loss of the weakest 20% of miners that it's profitable for them to all include some number of plausible-looking transactions between their addresses (thereby causing an inflated median).  I feel that anything involving IP addresses is prone to abuse and much worse than the admittedly ugly fixed-growth proposal.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off the chain. However, what "small" means is open to debate.

My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage".  However, if you can find an appealing notion of necessity, smallness, or some alternative method of attempting to balance centralisation risk against utility which involves fewer magic numbers and uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 129
Merit: 14



View Profile
October 13, 2014, 08:56:30 PM
Last edit: October 13, 2014, 09:28:36 PM by jonny1000
 #75

I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if the total transaction fees in the last N blocks are Y bitcoin or more (or less).

For example:
If the average aggregate transaction fee over the last 100,000 blocks is 1 bitcoin per block or more, then there could be a 20% increase in the block size limit.
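A rough sketch of that rule in Python, using the example numbers above purely as placeholders:

Code:
N = 100_000        # look-back window in blocks (example value above)
Y = 1.0            # BTC of fees per block needed to justify growth (example value above)
X = 0.20           # 20% step up or down (example value above)

def adjust_limit(current_limit, fees_last_n_blocks):
    """fees_last_n_blocks: total fees (in BTC) of each of the last N blocks."""
    avg_fee = sum(fees_last_n_blocks) / len(fees_last_n_blocks)
    if avg_fee >= Y:
        return current_limit * (1 + X)   # fee revenue supports a larger limit
    else:
        return current_limit * (1 - X)   # fees too low, shrink block space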

Advantages of this methodology include:
  • This algorithm would be relatively simple
  • The limit is determined algorithmically from historic blockchain data and therefore there will be a high level of agreement over the block size limit
  • The system ensures sufficient fees are paid to secure the network in a direct way
  • It would be difficult and expensive to manipulate this data, especially if mining is competitive and decentralized
  • The limit would relate well to demand for Bitcoin usage and real demand based on transaction fees, not just volume

I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

It could be in miners' interests to keep the block size limit small, to make the resource they are “selling” more scarce and improve profitability.  The assumption that miners would try to manipulate the block size limit upwards is not necessarily true; it depends on the bandwidth issue versus the need for artificial scarcity at the time.  If Moore’s law holds, then eventually the artificial scarcity argument will become overwhelmingly more relevant than the bandwidth issues and miners may want smaller blocks.  Miners could manipulate it both ways depending on the dynamics at the time.

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However, why would miners in this scenario want to manipulate the limit upwards?
Cubic Earth
Legendary
*
Offline Offline

Activity: 1176
Merit: 1020



View Profile
October 13, 2014, 10:27:44 PM
 #76

It could be in miners' interests to keep the block size limit small, to make the resource they are “selling” more scarce and improve profitability.  The assumption that miners would try to manipulate the block size limit upwards is not necessarily true; it depends on the bandwidth issue versus the need for artificial scarcity at the time.  If Moore’s law holds, then eventually the artificial scarcity argument will become overwhelmingly more relevant than the bandwidth issues and miners may want smaller blocks.  Miners could manipulate it both ways depending on the dynamics at the time.

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However, why would miners in this scenario want to manipulate the limit upwards?

It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down'.  They could all issue empty blocks if they wanted.  And yes, miners can also 'manipulate' the block size up.  So the lower bound for the 'manipulation' is zero.  The upper bound is the block size limit, currently at 1MB.  We all agree miners can do whatever they want within those limits.  Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes from which they may choose when making a block.  An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization.  Miners can create their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it.  But cartel or not, a max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.

My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage". 

Totally agree. 

However, if you can find an appealing notion of necessity, smallness, or some alternative method of attempting to balance centralisation risk against utility which involves fewer magic numbers and uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.

I think MaxBlockSize will remain a magic number, and I think that is okay.  It is a critical variable that needs to be adjusted for environmental conditions, balancing, exactly as you put it, teukon, [de]centralization against utility.  As computing power grows, it becomes easier to conceal computational activities and keep them "decentralized".

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

So I'm saying: the future is uncertain, but there is a clear trend. Let's follow that trend, because it is the best predictor of what will happen that we have.

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).

The more accurate the projection of computing / bandwidth growth is, the less often the magic number would need to be changed.  If we project very accurately, the magic number may never need to be adjusted again.  That being said, it is safer to err on the side of caution, as Gavin has done, to make sure any MaxBlockSize formula does not allow blocks to grow beyond the hobbyist / interested individual's ability to keep up.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 129
Merit: 14



View Profile
October 13, 2014, 10:54:15 PM
 #77

It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down'.  They could all issue empty blocks if they wanted.  And yes, miners can also 'manipulate' the block size up.  So the lower bound for the 'manipulation' is zero.  The upper bound is the block size limit, currently at 1MB.  We all agree miners can do whatever they want within those limits.  Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes from which they may choose when making a block.  An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization.

I apologise that I was not being very clear; I was talking about miners manipulating the block size limit upwards or downwards in the hypothetical scenario where the block size limit is determined dynamically by an algorithm, for example the one I mention above linking the block size limit to aggregate transaction fees.  What do you think of this proposal?

Miners can create their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it.  But cartel or not, a max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.

Why do you say that miners can create their own cartel to create artificial scarcity?  Perhaps they can do this, but a healthy Bitcoin network has a competitive and diverse mining industry where this may not be possible.  If miners collude together in this way then Bitcoin has more serious problems than this scalability issue.

I agree that a max block size is also helpful to keep the network decentralised and “available to the interested individual, both financially and practically speaking” as you say; however, I postulate that the max size is also necessary for another reason:

Artificial scarcity in block space -> higher aggregate transaction fees -> higher equilibrium mining difficulty -> more secure network

No scarcity in block space -> lower aggregate transaction fees (yes a higher volume, but no "artificial" profit) -> lower equilibrium mining difficulty -> less secure network
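To put toy numbers on that chain (purely illustrative; the cost figure is an assumption): miner revenue sets the hashrate that can be sustained at break-even, so once the block reward is gone, lower aggregate fees mean a lower equilibrium difficulty.

Code:
BLOCKS_PER_DAY  = 144
COST_PER_TH_DAY = 0.0001   # BTC to run 1 TH/s for a day (assumed, for illustration only)

def equilibrium_hashrate(avg_fees_per_block):
    """TH/s the network can sustain when fee revenue just covers mining costs."""
    daily_revenue = avg_fees_per_block * BLOCKS_PER_DAY
    return daily_revenue / COST_PER_TH_DAY

print(equilibrium_hashrate(1.0))   # scarce block space, 1 BTC/block in fees -> 1,440,000 TH/s
print(equilibrium_hashrate(0.1))   # abundant space, fees fall 10x           ->   144,000 TH/s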
Cubic Earth
Legendary
*
Offline Offline

Activity: 1176
Merit: 1020



View Profile
October 14, 2014, 03:23:33 AM
 #78

I apologise that I was not being very clear; I was talking about miners manipulating the block size limit upwards or downwards in the hypothetical scenario where the block size limit is determined dynamically by an algorithm, for example the one I mention above linking the block size limit to aggregate transaction fees.  What do you think of this proposal?

I think I did understand what you were saying.  I was trying to point out that miners already have control of the size of the blocks they publish.  And therefore - collectively - miners have control over how fast the blockchain grows.  But that freedom is not absolute.  There are upper and lower limits.  Since a block with a size less than zero is an absurd concept, we can safely put just a few bytes as the smallest possible block.  The biggest possible block size is what we are discussing here.  It basically serves as a check that full nodes can use against the miners, meaning nodes can audit the service the miners are providing and otherwise connect and communicate about the state of the network.  Any proposal that gives the miners some automated way to influence the MaxBlockSize could be used to make the blocks so big as to promote centralization of the nodes.  Individuals would lose their ability to audit the network.

Miners currently do influence the MaxBlockSize variable, but that influence is based on human communication, persuasion, and lobbying within the ranks of the Bitcoin Community.  If MaxBlockSize were algorithmically controlled, with the formula taking as input conditions the miners had some form of control over, then MaxBlockSize could be raised or lowered by the miners directly, without the consensus of full nodes.  It would no longer be a check.



Why do you say that miners can create their own cartel to create artificial scarcity?  Perhaps they can do this, but a healthy Bitcoin network has a competitive and diverse mining industry where this may not be possible.  If miners collude together in this way then Bitcoin has more serious problems than this scalability issue.

I agree that a max block size is also helpful to keep the network decentralised and “available to the interested individual, both financially and practically speaking” as you say; however, I postulate that the max size is also necessary for another reason:

Artificial scarcity in block space -> higher aggregate transaction fees -> higher equilibrium mining difficulty -> more secure network

No scarcity in block space -> lower aggregate transaction fees (yes a higher volume, but no "artificial" profit) -> lower equilibrium mining difficulty -> less secure network

That's why I said cartel, not collude.  Perhaps I should have used the word 'association' to describe miners working together in a constructive fashion.  Miners collaborating is itself not a problem.  In fact, they do work together all the time, and the shared computational output is the blockchain.  If at some future point a majority of the miners start behaving badly, the community will respond.  If the MaxBlockSize were very large, and the dynamics of the bitcoin system were causing the hashrate to fall, I would expect miners to get together and solve the problem.  That could include a miner-only agreement to only publish blocks of a certain size, to drive up fee requirements.  This is not a proposal to raise MinBlockSize.
jonny1000 (OP)
Member
**
Offline Offline

Activity: 129
Merit: 14



View Profile
October 14, 2014, 08:23:09 AM
Last edit: October 14, 2014, 10:30:04 AM by jonny1000
 #79

Game theory suggests that under certain conditions, these types of agreements or associations are inherently unstable, as the behaviour of the members is an example of a prisoner's dilemma. Each member would be able to make more profit by breaking the agreement (producing larger blocks or including transactions at lower prices) than it could make by abiding by it.
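A toy payoff matrix (made-up numbers) makes the dilemma explicit: whatever the other miners do, an individual miner earns more by defecting, so the agreement unravels.

Code:
# (my_profit, others_profit) per period, hypothetical values only
payoffs = {
    ("cooperate", "cooperate"): (10, 10),   # cartel holds, fees stay high
    ("cooperate", "defect"):    ( 2, 14),   # I keep to the quota, others grab my fees
    ("defect",    "cooperate"): (14,  2),   # I undercut while others hold the line
    ("defect",    "defect"):    ( 5,  5),   # agreement collapses
}

for others in ("cooperate", "defect"):
    best = max(("cooperate", "defect"), key=lambda me: payoffs[(me, others)][0])
    print(f"if others {others}: my best response is to {best}")
# Defection is the dominant strategy, so the association is unstable.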

There are several factors that will affect the miners' ability to monitor the association:

1.         Number of firms in the industry – High in a competitive mining market, with low barriers to entry and exit for potentially anonymous miners -> association difficult

2.         Characteristics of the products sold by the firms – Homogeneous -> association is possible

3.         Production costs of each member – Differing and low costs -> association difficult

4.         Behaviour of demand – Transaction volume demand is highly volatile in different periods -> association difficult
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
October 14, 2014, 04:38:08 PM
 #80

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Linked by Peter was an idea from Gavin of determining an appropriate block size by the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

My aim is a broad call for a (re)consideration of a dynamic "demand-driven" block size limit mechanism. The best adjustment estimators have yet to be determined. I think the concept should not be prematurely dismissed, because it could be highly beneficial in terms of resource preservation and hence decentralization.

The problem of selfish large miners creating millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also do a statistical correction based on IP addresses (those that frequently submit only blocks with an excessively huge number of transactions get less weight).


I'd rather say: "Only raise the block size limit when required, and by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off the chain. However, what "small" means is open to debate.
QFT
Let's use measurement and math over extrapolation where possible, and balance the risk to decentralization against making higher transaction volume easier, erring in favor of decentralization.  It is difficult to recover from centralizing effects.
If block bloat by conspiring miners is a concern, then there can be growth caps on top of a dynamic scalability protocol too.

We have no crystal ball to tell us the future.  All we know is that we don't know.

And I'll just leave this here:
http://xkcd.com/605/


