Bitcoin Forum
May 07, 2024, 03:43:15 PM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
Author Topic: How a floating blocksize limit inevitably leads towards centralization  (Read 71512 times)
wtfvanity
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500


WTF???


View Profile
February 20, 2013, 10:23:02 PM
 #201

Centralized services handling a larger share of payments have another drawback that hasn't been discussed much: running a fractional reserve would become much easier. As long as the blockchain itself is the dominant transaction platform, widespread use of fractional reserve is impossible. This is one more reason to do everything possible to keep as many transactions in the blockchain as reasonably possible.

With this I mean the exact same thing as Gavin meant. Sub-cent transactions should be disregarded as not viable. Transactions worth more than $0.01 should be something that Bitcoin can handle. There really is no reason why it can't, only a whole truckload of FUD.

I'd go as far as saying greater than a dollar, with 1/10th-cent precision.

It reminds me of that 10-10-220 commercial. Can't buy much for less than a buck these days. That doesn't mean you don't still receive change down to the penny.

misterbigg
Legendary
*
Offline Offline

Activity: 1064
Merit: 1001



View Profile
February 20, 2013, 10:23:58 PM
 #202

I don't trust off-blockchain transactions...Especially with...the "threat" that they could be executed by being put into the blockchain at any time

Keep in mind that when we talk about off-blockchain transactions, we are talking about alternate block chains. These would be separate crypto-currencies with different properties. Ripple is one example (it relies on trust, unlike Bitcoin). I'm sure there will be others when the opportunity for profit arises.

Quote
I'd rather see a dynamic solution...

There are a lot of things we'd rather see but the point being made is that there are limits to what can be done with Bitcoin, while keeping it Bitcoin (global consensus, proof of work, etc...) Raising the block limit by a non-trivial amount may not be practical.

It should be easy to see that if there were no limit on block sizes, fees would trend towards zero (ignoring the OP's original stated problem of miners attacking other miners by producing large padded blocks). Do you understand that with increasing block sizes come smaller fees?

For that matter I've seen a lot of talking about bandwidth, storage, and processing power but it seems everyone has overlooked that:

Fees will decrease as block size is increased (all else being equal)

Do people not get this, or am I wrong?
MoonShadow
Legendary
*
Offline Offline

Activity: 1708
Merit: 1007



View Profile
February 20, 2013, 10:25:10 PM
 #203

There is most likely no need to do incremental changes. There are already pretty good looking suggestions on how we can simply let miners decide the block size. To do that we would need some fairly strict rules for how long validating blocks by regular nodes can take until they are rejected. This would create a cap for the block size that is actually relative to the processing power of most full nodes. That is something we want, I think.

The other interesting suggestion was linking mining difficulty and the block size. Those are the two decent suggestions so far. Finding a long term way to fix this is much better than simply making a one time increase. If there is a one time increase, it should be a massive increase together with a smaller soft limit, so that the soft limit can be raised more easily in the future if necessary.

While I can see your point, I still think that there should be a hard limit, however high that might be.  That way there is always an absolute limit that large miners 'cheating' by trying to force out smaller players (via padding of the transaction queue or similar) will run into.  Hopefully this absolute limit will discourage those unscrupulous players from turning to the dark side, simply because some percentage of players could never be forced out of the mining business, no matter how much larger the big miners could grow on a relative basis.

However, upon further thought, I think that said hard limit should be pretty freaking high, in order to allow the soft limit to be our control method for many more years without further need of hard forks.  And if we are going to do this hard fork against the desires of some miners (which is probably a certainty), it's better that we get to it sooner rather than later.  If we start bumping up against the hard limit, some miners might just manage to profit from that artificial scarcity in ways that would turn a good portion of the miners against the idea of a hard fork, whereas they might otherwise not oppose one before we get there.

How high would such a hard limit be?  Can we estimate how many transactions per second that, say, a one GB hard limit could process?  As already pointed out by many, we don't need to accommodate all of the transactions that occur in the world; but we need to be able to do much better than 7 per second.  What is a good target?  VISA's transaction volume?  Paypal's?  Or VISA + Mastercard?  Once we decide, then we need to set it and let it go, and be willing to let the market in transaction fees and off-network transfers simply run.  We are going to get way too big to do this kind of thing twice.
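The arithmetic behind that question is simple enough to sketch. Assuming an average transaction size of roughly 250 bytes (a round-number assumption, not a measured figure) and the ten-minute block target:

```python
# Back-of-the-envelope throughput for a given hard block-size limit.
# The 250-byte average transaction size is an assumed round number;
# real averages vary with transaction complexity.
AVG_TX_BYTES = 250
BLOCK_INTERVAL_SECONDS = 600  # one block every ten minutes, on average

def max_tps(block_size_bytes):
    """Upper bound on transactions per second under a block-size limit."""
    return block_size_bytes / AVG_TX_BYTES / BLOCK_INTERVAL_SECONDS

print(max_tps(1_000_000))      # today's 1 MB limit: ~6.7 tx/s
print(max_tps(1_000_000_000))  # a hypothetical 1 GB limit: ~6,667 tx/s
```

By this rough estimate a 1 GB limit lands in the thousands of transactions per second, the same order of magnitude as a large card network's average volume, though peak loads would be another matter.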

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
misterbigg
Legendary
*
Offline Offline

Activity: 1064
Merit: 1001



View Profile
February 20, 2013, 10:29:08 PM
 #204

why don't Mastercard and Visa set a limit on the number of transactions they process in order to maximize their fee revenue?

It's because VISA and MasterCard are not Bitcoin. They are each a centralized, private payment system. There's no limit on the number of transactions they can process.

Do you understand that it is only scarcity of transaction space that guarantees a non-zero equilibrium level of fees (in the Bitcoin system)? When miners are forced to choose which transactions to exclude in a candidate block, due to limited space, they will obviously discard the transactions with the lowest fees (per kilobyte). This creates competition and drives the fees up to a new equilibrium level.
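The selection pressure described here can be sketched as a greedy knapsack fill. This is only an illustration of the strategy, not real mining code; the (fee, size) tuples are made up:

```python
# With limited block space, a revenue-maximizing miner sorts candidate
# transactions by fee per byte and fills the block greedily.
def select_transactions(mempool, max_block_bytes):
    """mempool: list of (fee_satoshis, size_bytes) tuples.
    Greedily pick transactions by fee rate until the block is full."""
    by_fee_rate = sorted(mempool, key=lambda tx: tx[0] / tx[1], reverse=True)
    chosen, used = [], 0
    for fee, size in by_fee_rate:
        if used + size <= max_block_bytes:
            chosen.append((fee, size))
            used += size
    return chosen

mempool = [(10_000, 500), (1_000, 250), (50_000, 10_000), (2_000, 300)]
# With only 1,000 bytes of space, the two best fee-per-byte transactions
# fit and the large 10 kB transaction is excluded despite its higher fee.
print(select_transactions(mempool, 1_000))  # -> [(10000, 500), (2000, 300)]
```

The point of the argument above is what this sketch makes visible: the lowest fee-per-byte transactions are the ones squeezed out, so their senders must bid higher to get in.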

See my earlier post which predicts the future of block size and transaction fees.
Technomage
Legendary
*
Offline Offline

Activity: 2184
Merit: 1056


Affordable Physical Bitcoins - Denarium.com


View Profile WWW
February 20, 2013, 10:32:01 PM
 #205

misterbigg, I don't think many here are rooting for a change where all scarcity is removed from the equation. I think that could be a disastrous move long term. There are however methods to retain some scarcity without having too much of it. My favorite so far is the solution where we give more freedom to the miners, they can basically decide the block size. At the same time we would put hard limits on block validation so that the full nodes will have to be able to validate them reasonably fast. This way the miners can't just create super large blocks since they would all be rejected.

I believe the hard limit for block size needs to be massive or lifted entirely, so that we don't have to ever do another hard fork, at least not thanks to this issue. The soft limit is another issue, it might not be needed either if the system in place is reliable.

Denarium closing sale discounts now up to 43%! Check out our products from here!
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1009



View Profile
February 20, 2013, 10:35:54 PM
 #206

When miners are forced to choose which transactions to exclude in a candidate block, due to limited space, they will obviously discard the transactions with the lowest fees (per kilobyte). This creates competition and drives the fees up to a new equilibrium level.
Or maybe when people can't get their transactions processed in a timely fashion, regardless of how much they pay in fees, they'll just abandon Bitcoin entirely and the miners will get nothing.

Seriously though, if reducing the number of transactions will increase miner revenue then why stop at seven transactions per second? Reduce the limit so that only one transaction is allowed per block and watch miner profitability go through the roof.
Technomage
Legendary
*
Offline Offline

Activity: 2184
Merit: 1056


Affordable Physical Bitcoins - Denarium.com


View Profile WWW
February 20, 2013, 10:41:07 PM
 #207

On the other hand I kind of like the solution where the block size is calculated based on mining difficulty. That makes sense from the perspective of mining incentive. If mining becomes less profitable, difficulty lowers, max block size becomes smaller. Thus blocks would have more scarcity and users would need to pay more fees to get transactions into the blockchain, thus giving the miners more incentive. Some kind of equilibrium would happen.

I mean, if the average speed of a full node is the main factor we use to determine this, it really doesn't mean anything other than the fact that we can be reasonably sure that running a full node will remain in reach of regular users. It doesn't actually do much else.

Linking it to mining difficulty would increase mining (our security) if there was more Bitcoin usage and more fees (a higher reward, more miners, more difficulty), thus allowing for a larger block size.

I just changed my mind; I do believe that is actually the best solution. It requires more study for sure, to make sure there isn't some huge loophole. There is obviously the question of how exactly the max block size is calculated from the difficulty.

misterbigg
Legendary
*
Offline Offline

Activity: 1064
Merit: 1001



View Profile
February 20, 2013, 10:45:23 PM
 #208

I don't think many here are rooting for a change where all scarcity is removed

I beg to differ. There are plenty who feel that there should be no limit, so that Bitcoin will "scale" to accommodate any amount of transaction volume. It should be clear that, storage and bandwidth issues aside, this will result in fees trending to zero.

Quote
There are however methods to retain some scarcity without having too much of it.

Agreed, and I posted a simple method of adjusting MAX_BLOCK_SIZE based on measured scarcity.

Quote
My favorite so far is the solution where we give more freedom to the miners, they can basically decide the block size.

It seems obvious that all miners will converge on the same algorithm for producing blocks: include all transactions with fees. This maximizes mining revenue, but only in the short term. Once ALL miners start doing this, we have the equivalent of no limit and my earlier comment applies: fees will trend towards zero. Giving "freedom" to miners isn't really a choice. We already know what strategy miners will use: the one that maximizes their fees. Anyone who doesn't follow this strategy will go bankrupt.

Quote
At the same time we would put hard limits on block validation

At which point, miners will again converge on the same algorithm: Attempt to fill each block up to the hard limit of size, choosing the transactions which offer the highest fee per kilobyte.

Quote
I believe the hard limit for block size needs to be...lifted entirely

Which goes against what you earlier agreed to, that scarcity is a requirement for fees to reach a non-zero equilibrium. See, there ARE some people who are rooting for limits to be removed!  Grin

Quote
The soft limit is another issue

Keep in mind that the soft limit is just an artificial barrier. The instant that it becomes profitable for miners to comment out that piece of code and substitute it with the "winning" strategy (fill the block with transactions having the highest fees per kilobyte), they will do so.

if the average speed of a full node is the main factor we use to determine this...

Any scheme for dynamically adjusting the block size should be based ONLY on the information contained in the block chain, and not any other information like the "speed" of nodes, or more importantly attributes of the memory pool of pending transactions.

MoonShadow
Legendary
*
Offline Offline

Activity: 1708
Merit: 1007



View Profile
February 20, 2013, 10:47:47 PM
 #209

I don't trust off-blockchain transactions...Especially with...the "threat" that they could be executed by being put into the blockchain at any time

Keep in mind that when we talk about off-blockchain transactions, we are talking about alternate block chains. These would be separate crypto-currencies with different properties. Ripple is one example (it relies on trust, unlike Bitcoin). I'm sure there will be others when the opportunity for profit arises.


No we are not.  At least I'm not.  I'm talking about off-network Bitcoin transactions, of any type, but that are still denominated in bitcoins.  This could be people who trade in sets of private keys (unsafe, yes; but possible), it could be people who trade in a paper currency denominated in and/or backed by a bitcoin reserve, or it could be people trading within a single online wallet service (think Ebay, using something like MyBitcoin.com rather than Paypal as the default transaction method) or a potential parallel network of wallet services that function like banks, taking the Bitcoin equivalent of bank checks and settling up with one another at much longer intervals.

All of these methods, at least all that I can think of, require more trust between parties and involve greater risk than using the main Bitcoin network; but that is also why they would be a cheaper alternative most of the time.  If you're going to be buying/selling a car with bitcoins, you're going to want to use the Bitcoin network; but if you're buying a snickers bar at the store, a standardized method of trading private keys might be a viable alternative to a blockchain transaction fee.  Microtransactions were never Bitcoin's strong point; distance & high security with low trust were.  Not all transactions require that kind of certainty.

Quote
Quote
I'd rather see a dynamic solution...

There are a lot of things we'd rather see but the point being made is that there are limits to what can be done with Bitcoin, while keeping it Bitcoin (global consensus, proof of work, etc...) Raising the block limit by a non-trivial amount may not be practical.

It should be easy to see that if there were no limit on block sizes, fees would trend towards zero (ignoring the OP's original stated problem of miners attacking other miners by producing large padded blocks). Do you understand that with increasing block sizes come smaller fees?

For that matter I've seen a lot of talking about bandwidth, storage, and processing power but it seems everyone has overlooked that:

Fees will decrease as block size is increased (all else being equal)

Do people not get this, or am I wrong?


I get this, and I agree from an economic-incentives perspective.  This is the very same problem of scarcity that prompted a minimum fee rule, in order for a transaction to be considered a fee-paying transaction according to the priority-score algo that Gavin put in a couple of years ago.  The limits prompt delays in free transactions, and that is a cost to consumers.  Wise consumers will pay a fee for speed, but not more than they must.  If transaction fees can ever be expected to replace the descending block reward, there must be some kind of scarcity there.  If we remove the limits altogether, the supply of transaction space will (functionally) be as close to infinite as the supply of bandwidth available to email spammers.  There will be no incentive for the majority of users to contribute a fee, even after the block rewards decrease to nominally zero, unless they are running up against some kind of limiting resource.  I believe that free transactions should always be possible, but if speed of confirmation is your concern you're going to have to contribute.  We can suffer a lot more 'freeloading' than VISA or Mastercard can, but the network is not costless, so the service to users cannot be costless either.

Technomage
Legendary
*
Offline Offline

Activity: 2184
Merit: 1056


Affordable Physical Bitcoins - Denarium.com


View Profile WWW
February 20, 2013, 10:48:41 PM
 #210

Which goes against what you earlier agreed to, that scarcity is a requirement for fees to reach a non-zero equilibrium. See, there ARE some people who are rooting for limits to be removed!  Grin

The block size limit doesn't have much meaning if there are other systems in place that make it impossible or impractical to create super-large blocks. However, my favorite solution right now is to simply link the max size with difficulty in some fashion; that makes the most sense to me.

misterbigg
Legendary
*
Offline Offline

Activity: 1064
Merit: 1001



View Profile
February 20, 2013, 10:59:14 PM
 #211

my favorite solution right now is to simply link the max size with difficulty in some fashion; that makes the most sense to me.

This doesn't take scarcity into account, and would require an oracle to provide the constants for the necessary formula linking size to difficulty. It's easy to see the case where difficulty outpaces transaction volume; we're about to see that now with ASICs coming online. Once the maximum block size is sufficiently large that all pending transactions can fit, we're back to the case where there's no limit and fees trend to zero. Hopefully this example kills off any ideas about tying block size to difficulty.

I'll repost the scheme I described elsewhere. It uses only information found in the block chain, and should be resistant to miners gaming the system. It increases the maximum block size based strictly on scarcity (preserving it). It doesn't depend on measurements of time or bandwidth. I'm not claiming this is the perfect system, but it provides some ideas to use as a starting point:

1) Block size adjustments happen at the same time that network difficulty adjusts
2) On a block size adjustment, the size either stays the same or is increased by a fixed percentage (say, 10%). This percentage is a baked-in constant requiring a hard fork to change.
3) The block size is increased if more than 50% of the blocks in the previous interval have a size greater than or equal to 90% of the max block size. Both of the percentage thresholds are baked in.
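As a rough illustration only, the three rules could be sketched like this. The constants (10% growth, more than 50% of blocks, at least 90% full) are the ones stated above; everything else is illustrative:

```python
# Sketch of the scarcity-based adjustment rule described above.
GROWTH_NUM, GROWTH_DEN = 110, 100   # +10% per adjustment, in integer math
FULL_THRESHOLD = 0.9                # a block "counts" at >=90% of the max
MAJORITY = 0.5                      # more than half the interval's blocks

def adjust_max_block_size(block_sizes, current_max):
    """Raise the limit 10% iff most blocks in the interval were nearly full."""
    nearly_full = sum(1 for s in block_sizes
                      if s >= FULL_THRESHOLD * current_max)
    if nearly_full > MAJORITY * len(block_sizes):
        return current_max * GROWTH_NUM // GROWTH_DEN
    return current_max

# A 2016-block interval where 1200 blocks were nearly full:
sizes = [950_000] * 1200 + [100_000] * 816
print(adjust_max_block_size(sizes, 1_000_000))            # -> 1100000
print(adjust_max_block_size([100_000] * 2016, 1_000_000)) # -> 1000000
```

Because the only inputs are block sizes already committed to the chain, every node computes the same answer with no external measurement, which is the property the scheme is designed around.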

How high would such a hard limit be? Can we estimate how many transactions per second that, say, a one GB hard limit could process?

Any adjustment to the maximum block size must preserve scarcity. The question is not how many transactions can be handled by a one gigabyte hard limit, but rather will a one gigabyte hard limit produce sufficient scarcity?
MoonShadow
Legendary
*
Offline Offline

Activity: 1708
Merit: 1007



View Profile
February 20, 2013, 11:07:59 PM
 #212

Another possible solution to the scarcity of transaction space versus an infinitely growing transaction queue (with infinitely growing free-transaction confirmation delays) is to permit a block of unlimited (or massively huge) size to legally occur under specific conditions that would not incentivise miners towards anti-competitive activities.

For example, there could be an exception to the blocksize limit for blocks that include zero fee-paying transactions.  This rule would permit a miner to dump his queue for his own reasons: his queue is too large, the operation is owned by a major bitcoin holder who simply wishes to help the network at his own expense, or it's a major retailer (think WalMart) that has a huge number of free transactions from customers to process and for whom mining isn't directly the business model.

Another method would be to permit an unlimited block at an interval that miners could not depend upon.  For example, the difficulty is adjusted every two weeks or so; the difficulty-adjustment block could also be an unlimited block, permitting (perhaps expecting) the miner that finds it to include not only every single fee-paying transaction but also every transaction still in his queue.  This would limit the delay for free transactions to a two-week maximum, and only burden the bandwidth-challenged clients and miners once every two weeks, on a predictable schedule.
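The two exception rules floated above amount to a simple validation predicate, sketched here under stated assumptions: the 2016-block retarget interval is Bitcoin's real one, but the rule itself is only this post's proposal, nothing implemented anywhere.

```python
# Hypothetical predicate: may this block exceed the normal size limit?
# Rule (a): the block contains no fee-paying transactions at all.
# Rule (b): the block is a difficulty-adjustment (retarget) block.
RETARGET_INTERVAL = 2016  # Bitcoin retargets difficulty every 2016 blocks

def may_exceed_limit(block_height, tx_fees):
    """tx_fees: list of fees (in satoshis), one per transaction in the block."""
    all_free = all(fee == 0 for fee in tx_fees)
    is_retarget_block = block_height % RETARGET_INTERVAL == 0
    return all_free or is_retarget_block

print(may_exceed_limit(100, [0, 0, 0]))     # all-free block: allowed
print(may_exceed_limit(100, [0, 500]))      # mixed fees, ordinary height: not allowed
print(may_exceed_limit(4032, [500, 1000]))  # retarget block: allowed
```

Note that rule (b) is exactly the predictable schedule the next paragraph worries about: anyone can compute which height the next oversized block lands on.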

Occasionally flushing the transaction queues would benefit all players, without significantly impacting the scarcity of transaction space for the remainder of the time period.  However, the second method is likely to encourage free transactions leading up to the unlimited block's expected arrival; the first method, permitting unlimited blocks made up of fee-less transactions, would not encourage that, but neither could users be certain that their free transactions would be processed in any reasonable time frame, as there is no way to be certain that any miner would be willing to do this.

johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
February 20, 2013, 11:12:56 PM
 #213

If the soft limit is reached, I would prefer putting some transaction limit into each client, e.g. you get one low-fee transaction every xxx hours, and further transactions will have to pay a higher fee

If a hard fork happened and generated an unexpected devastating effect, and investors lost lots of wealth, who is going to be responsible for that? I do not think any developer can stand for that, not even the btc foundation

Of course there is always the option to roll back to the original chain, but maybe by then the original chain has also been heavily hurt, since the promise of limited supply was broken (each fork will double the existing coin supply). Now there are two different bitcoins at various different locations; people will question how many forks there will be, and which branch of btc you are talking about. This all creates a whole lot of confusion and uncertainty, which will almost surely cause a mass sell-off. Even the Fed has never created that amount of inflation in a single day

All my concern is about the fork, not the block size limit; they are totally different levels of discussion

I think the community should have a constitution: Never ever change the original protocol

Perfectionism kills; it is much less risky to deal with the imperfections in the original protocol with client-side solutions. Consistency and integrity of the protocol will gain people's trust and increase the adoption rate

Technomage
Legendary
*
Offline Offline

Activity: 2184
Merit: 1056


Affordable Physical Bitcoins - Denarium.com


View Profile WWW
February 20, 2013, 11:13:11 PM
 #214

This doesn't take scarcity into account and would require an oracle to provide the constants for the necessary formula linking size to difficulty. It's easy to see the case where difficulty outpaces transaction volume; We're about to see that now with ASICs coming online. Once the maximum block size is sufficiently large so that all pending transactions can fit, now we're back to the case where there's no limit and fees trend to zero. Hopefully this example should kill off any ideas about tying block size to difficulty.

I don't think you thought it through. It does take scarcity into account. Whenever fees start trending towards zero, it will eventually lead to a decrease in mining power. Miners will stop mining. This will decrease difficulty, thus lowering the block size. Eventually there will be scarcity again, thus leading to increased fees, and thus more mining, and a larger block size. There would be an equilibrium and a market.

The problem is the baseline, and how much of a difficulty change equals how much of a block size limit change. The baseline could be the difficulty at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change: the limit would change by the same percentage as the difficulty?

MoonShadow
Legendary
*
Offline Offline

Activity: 1708
Merit: 1007



View Profile
February 20, 2013, 11:15:19 PM
 #215


Any adjustment to the maximum block size must preserve scarcity. The question is not how many transactions can be handled by a one gigabyte hard limit, but rather will a one gigabyte hard limit produce sufficient scarcity?


I don't agree that the hard limit is the only way to promote scarcity.  Bear in mind, no matter how we do this, the scarcity is still artificial.  If we don't do it right with a hard fork, we're stuck with it.  If we increase the hard limit to a high predicted future limit, and use soft limits and/or other block-verification rules to impose scarcity on transaction processing, most of the miners & pools will abide by the soft rules even when commenting out those rules is provably in their own economic interests.  Reputation matters here, even more so than it does in the "real" business world.

MoonShadow
Legendary
*
Offline Offline

Activity: 1708
Merit: 1007



View Profile
February 20, 2013, 11:22:18 PM
 #216

and how much of a difficulty change equals how much of a block size limit change. The baseline could be the difficulty at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change: the limit would change by the same percentage as the difficulty?

Or it could be a hybrid of your two ideas.  An increase in difficulty triggers an increase in the blocksize, but not linearly.  For example, no matter how much the difficulty increases (beyond a minimum), the blocksize increases by 10%.  No matter how much the difficulty decreases (beyond a minimum), the blocksize decreases by 5%.  Or vice versa, depending upon which is more likely to result in favorable scarcity.
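That hybrid could be sketched as follows. The 10%/5% steps are the ones suggested above; the 1% deadband standing in for "beyond a minimum" is an assumed value:

```python
# Illustrative sketch of the asymmetric hybrid rule: the *direction* of
# the difficulty change picks a fixed block-size step, so the limit
# cannot be driven up linearly by a difficulty spike.
MIN_CHANGE = 0.01  # assumed deadband; the post only says "beyond a minimum"

def adjust_block_size(max_size, old_difficulty, new_difficulty):
    change = (new_difficulty - old_difficulty) / old_difficulty
    if change > MIN_CHANGE:
        return max_size * 110 // 100   # difficulty rose: grow 10%
    if change < -MIN_CHANGE:
        return max_size * 95 // 100    # difficulty fell: shrink 5%
    return max_size                    # within the deadband: no change

print(adjust_block_size(1_000_000, 100, 120))  # -> 1100000
print(adjust_block_size(1_000_000, 100, 90))   # -> 950000
```

Growing faster than it shrinks biases the limit upward over time; swapping the percentages would bias it toward scarcity instead, which is the "or vice versa" trade-off above.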

Throw in my unlimited-if-all-free transactions rule.

misterbigg
February 20, 2013, 11:31:20 PM
 #217

The problem is the baseline and how much of a difficulty change equals how much of a block size limit change. The baseline could be how much difficulty is at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change, it would change the same percentage as difficulty?

This is what I mean when I say that an oracle would be needed to determine the formula relating difficulty to block size. There is absolutely no way to know what this formula would look like. The penalty for getting it wrong (and it would be wrong most of the time) is vanishing fees, resulting in hysteresis (wild oscillations in network hash rate). ASICMiner's 24% of the network hash rate could become 51% after a difficulty adjustment.

What's wrong with the scheme I described, which responds directly to changes in transaction space scarcity?

If we...use soft limits and/or other block verification rules to impose scarcity on transaction processing, most of the miners & pools will abide by the soft rules even when commenting out those rules is provably in their own economic interests.  Reputation matters here, even more so than it does in the "real" business world.

False. At least one of the pools will rewrite the soft limit to their economic advantage. Once this happens, members of other pools will quickly abandon ship and join the pool that modified the soft limit, since they have higher payouts. More realistically, all the competitive pools would make this change immediately after getting a source code patch with a soft limit. The only reason you don't see it happening now is because the block reward subsidy is orders of magnitude greater than the fees. As the block rewards get cut in half, there will be increasing pressure on miners to optimize their selection of transactions and that means abandoning the soft limit.

You're saying that miners will choose to make less money rather than more? Miners at the margin (those who would go bankrupt with the soft limit) obviously will choose to optimize the transaction selection code rather than going out of business. Your premise that reputation matters more than profit is wrong.

markm
February 20, 2013, 11:35:15 PM
Last edit: February 20, 2013, 11:53:05 PM by markm
 #218

Miners like lots of paying transactions, but payment processors like lots of transactions that pay the payment processor, and if they can get those transactions into the blockchain without paying miners heck that is a nice bonus for them, is it not?

So it seems to me that outfits which make money by spamming as many transactions as they can into the blockchain, making their money on what they can charge their customers for that spamming or its effects, are another group of actors who stand to gain from the largest blocks they can convince miners to create. I am not sure whether they would also ally with miners in forcing out competing miners, but if miners demand such co-operation in return for preference in getting into those miners' blocks, maybe they would happily go along with it?

Mike suggested somewhere that one (such a one as Google, to give you an idea of the kind of one he is accustomed to) can handle massive numbers of transactions, even using commodity hardware (which Google is wont to do) by processing them with many machines in parallel, so no matter how many transactions pour in one can simply add more machines. Obviously for one such as Google, with their databases designed and proven for handling huge chunks (aka blocks) of data, handling large blocks is also not a problem.

So if we scale up to a point where only the Googles of the world are capable of keeping up, what does that buy us? Nice high value for our hoards of bitcoins, hopefully, but what else? And at what cost in terms of freedom, accountability and the like?

Maybe acquire deepbit while one is at it, maybe a couple of other pools? How many would one need to acquire, if even any at all, to reach a point where one's specialised ability to verify transactions, possibly accompanied by enough miner co-operation or acquisition of enough mining pools and/or ASIC fleets, lets you squeeze/force everyone else other than maybe Microsoft (would they even care?), Yahoo (would they?), Amazon (hmm, they have a payment processor, would they maybe put up some resistance or be just another don't-care?), Paypal/Ebay (would even they care, really? Isn't an unverifiable competitor better for them than a verifiable one, one whose work/transactions they are not a party to verifying better than one whose they are?) and so on and so on out of the business?

Why the heck did we ever want more than just the single most-equipped-to-do-it player on the planet to verify transactions, again?

Can we leave watching over them to backwards-looking approaches, maybe? Never be able to catch up to where the blockchain is actually at but with a fleet of accounting/audit firms on the job be able to determine within a few weeks or months of some slipup that a slipup has happened, leaving the last several weeks of the chain invalid with respect to all transactions downstream of some erroneous one the accountants eventually discovered?

What if that "error" turned out to be a failure of a large number of silk road's coins to arrive at silk road's wallet?

Etc... All this push toward eliminating more and more of the population's chance of being able to verify makes me wonder why the heck we ever cared about verifying anything in the first place? Can't we just let Big Brother take care of everything as he always did/does, maybe even trusting that if blockchain technology has any application in such fields Big Brother will make use of it on our behalves?

A skeptical / cynical little voice in me scoffs ha ha you'd be lucky to only be halved, more likely you'll be quartered or decimated or worse...

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
MoonShadow
February 20, 2013, 11:39:16 PM
 #219

The problem is the baseline and how much of a difficulty change equals how much of a block size limit change. The baseline could be how much difficulty is at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change, it would change the same percentage as difficulty?

This is what I mean when I say that an oracle would be needed to determine the formula relating difficulty to block size. There is absolutely no way to know what this formula would look like. The penalty for getting it wrong (and it would be wrong most of the time) is vanishing fees, resulting in hysteresis (wild oscillations in network hash rate). ASICMiner's 24% of the network hash rate could become 51% after a difficulty adjustment.

What's wrong with the scheme I described, which responds directly to changes in transaction space scarcity?

If we...use soft limits and/or other block verification rules to impose scarcity on transaction processing, most of the miners & pools will abide by the soft rules even when commenting out those rules is provably in their own economic interests.  Reputation matters here, even more so than it does in the "real" business world.

False. At least one of the pools will rewrite the soft limit to their economic advantage. Once this happens, members of other pools will quickly abandon ship and join the pool that modified the soft limit, since they have higher payouts.

You're saying that miners will choose to make less money rather than more?

I can see it now: "p2pool: smaller payouts, better reputation!"

Yeah, I don't think so.


Not a certainty; so don't depend entirely upon limits that can be commented out by miners.  Use block verification rules as well, which could be commented out by the users, but why would they do this?  The propagation of the block is very much part of the system.  Include the soft limit in the verification rules of as many clients as possible, and miners who first comment out that rule for themselves will be punished by the network, at least until a majority of users upgrade their clients to match.  The rest of the miners that didn't comment out the rule would benefit from the harm the first mover takes upon himself.
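A hedged sketch of that idea (the names and the limit figure are illustrative, not actual Bitcoin code): with the soft limit living in the clients' relay/verification rules rather than only in miner policy, a miner who comments it out of his own node still sees his oversized blocks refused by every enforcing node, until enough users upgrade to clients that accept the larger size.

```python
SOFT_LIMIT_BYTES = 250_000  # assumed client-enforced soft limit

def relay_fraction(block_size_bytes, enforcing_fraction,
                   soft_limit=SOFT_LIMIT_BYTES):
    """Fraction of the network that will relay a block: all of it when the
    block is within the soft limit, only the non-enforcing nodes otherwise."""
    if block_size_bytes <= soft_limit:
        return 1.0
    return 1.0 - enforcing_fraction
```

The punishment is automatic: with, say, 80% of nodes enforcing the rule, an oversized block reaches only 20% of the network and is very likely orphaned, so defecting first costs the defector.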

Technomage
February 20, 2013, 11:44:34 PM
 #220

This is what I mean when I say that an oracle would be needed to determine the formula relating difficulty to block size. There is absolutely no way to know what this formula would look like. The penalty for getting it wrong (and it would be wrong most of the time) is vanishing fees, resulting in hysteresis (wild oscillations in network hash rate). ASICMiner's 24% of the network hash rate could become 51% after a difficulty adjustment.

The fundamental market logic behind that idea seems solid enough that it doesn't actually matter too much how the relation is calculated. There could be a limit on how much the block size can change per adjustment, which is what MoonShadow suggested, if the change is feared to be too drastic. Otherwise the market will balance the block size regardless: if the limit is too high, miners will stop due to lack of fees and incentive, leading to a smaller block size; if it's too low, it will lead to more fees, more miner incentive, more miners, and thus a higher difficulty and a higher block size.
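As a toy illustration of that feedback loop (the step cap and the numbers are assumptions, not a concrete proposal), the limit could simply track difficulty's percentage change, with each adjustment clamped to a maximum step so no single move is too drastic:

```python
MAX_STEP = 0.10  # assumed cap on how much the limit may move per adjustment

def adjust_limit(current_limit, old_difficulty, new_difficulty):
    """Limit moves by the same percentage as difficulty, clamped to +/- MAX_STEP."""
    change = new_difficulty / old_difficulty - 1.0
    change = max(-MAX_STEP, min(MAX_STEP, change))
    return int(current_limit * (1.0 + change))
```

However wrong the coupling constant is, the clamp bounds the damage of any one adjustment, and the fee market pushes difficulty (and hence the limit) back the other way on subsequent adjustments.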

Quote
What's wrong with the scheme I described, which responds directly to changes in transaction space scarcity?

What you described isn't a bad idea either. I don't have more advanced opinions on it yet, but it looked decent.

Denarium closing sale discounts now up to 43%! Check out our products from here!