Bitcoin Forum
Author Topic: Funding of network security with infinite block sizes  (Read 23676 times)
Mike Hearn
March 23, 2013, 10:57:27 PM
#1

Note: I have moved this post back from its wiki page because Peter Todd repeatedly replaced it with a completely different document. Please update any links to point to this forum thread.

One open question is how will funding of network security (mining) work if there's no competition for block space. If funding for proof of work comes from fees attached to transactions and the fees are motivated by scarcity of block space, then the funding mechanism is clear, though whether it will achieve "enough" funding is not.

In a world where block sizes are always large enough to meet demand for space, we can fund mining using per-block assurance contracts. From Wikipedia:

Quote
An assurance contract, also known as a provision point mechanism, is a game theoretic mechanism and a financial technology that facilitates the voluntary creation of public goods and club goods in the face of the free rider problem.

The free rider problem is that there may be actions that would benefit a large group of people, but once the action is taken, there is no way to exclude those who did not pay for the action from the benefits. This leads to a game theoretic problem: all members of a group might be better off if an action were taken, and the members of the group contributed to the cost of the action, but many members of the group may make the perfectly rational decision to let others pay for it, then reap the benefits for free, possibly with the result that no action is taken. The result of this rational game play is lower utility for everyone.

This describes network security with large block sizes. Everyone needs mining to be done, but nobody wants to be the one who pays for it given that everyone else will get the benefits for free.

As described on the contracts wiki page, an assurance contract is one in which an entrepreneur says "I will pay X towards the cost of a public good, but only if other people chip in enough to reach Y (the cost of providing the good)". If not enough people pledge money, the contract does not complete and nobody pays anything. This mechanism has a proven track record of funding public goods, most obviously via Kickstarter.
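The provision point mechanism is easy to sketch in a few lines; `settle_contract` and the pledge amounts below are purely illustrative, not part of any proposal:

```python
# Toy provision point mechanism: pledges are only collected if the
# total reaches the target Y; otherwise nobody pays anything.

def settle_contract(pledges, target):
    """Return the amount collected from each pledger, or None if the
    contract fails and no money changes hands."""
    total = sum(pledges.values())
    if total >= target:
        return dict(pledges)   # contract completes: everyone pays their pledge
    return None                # contract fails: nobody pays

# Three pledgers versus a 10-coin target:
print(settle_contract({"a": 4, "b": 3, "c": 2}, target=10))  # None (9 < 10)
print(settle_contract({"a": 4, "b": 4, "c": 3}, target=10))  # all three pay
```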

What is the target?

We need to figure out what "enough" mining means. This is related to the maximum time-value of transactions on the network. For micropayments, you don't need a whole lot of mining to protect them, because the value of reversing them is very low, and any given attacker isn't trying to reverse all transactions but only the one for their own payment (they can't reverse arbitrary transactions). If you have very high value transactions where the receivers are willing to wait six months, you also don't need a whole lot. If you have very high value transactions occurring only between people/entities who trust each other, again, you don't need much mining. That might sound unrealistic, but above a certain level transactions almost always take place between people who know and can find each other - think of billion-dollar business deals. You don't need miners to ensure those won't be reversed; you just need a functioning legal system.

The type of transaction that's most tricky to secure is the middle-of-the-road kind: not billion-dollar business deals, not micropayments, but the ones where real value is moving yet not enough to make suing the other side worthwhile if something goes wrong. If there is enough mining to secure this kind of transaction, then there's enough for the other kinds too. There should be a lot of these transactions, and also a lot of participants.

For an assurance contract to work, do you need everyone who benefits to be a participant? No, some freeloading is OK, as long as those freeloaders weren't going to contribute anyway. Let's imagine that 200 major Bitcoin participants get together and form an assurance contract that funds 10 THash of mining (the numbers are arbitrary). Does their agreement break if 200,000 users then make 500,000 micropayments? Not really - those micropayments hardly need any mining to be secure, and those users weren't going to pay for 10 THash no matter what, not even collectively. There's no downside to them benefiting from the extra mining the contract pays for, and indeed there may be indirect upsides (the participants' bitcoins become more valuable because they have greater utility).

Implementation

Implementation can be done effectively via the introduction of a new network rule: money spent to an output that contains an unspendable script can be claimed as fees. We're already introducing the idea of OP_RETURN outputs that simply result in insta-pruning of the output, as it's provably unspendable. Allowing that value to be claimed as fees is a hard-forking rule change, but it's also a relatively straightforward one (the alternative is to have the money be destroyed ... which is bad), and we need to hard fork to increase the block size anyway.

Once this is done, we can have a separate P2P broadcast network in which participants broadcast pledges. A pledge is an invalid transaction that takes X coins with a SIGHASH_ANYONECANPAY signature and re-allocates them to an unspendable output of Y coins, where Y > X. X is the pledge and Y is the target, of course. Peers listen, and when they have seen enough pledges to sum to a value greater than Y, they combine them to make the transaction valid and broadcast it on the regular Bitcoin network. Peers can do this in parallel - there's a chance of naturally occurring double spends, but rules on how to construct the contract out of the pledges can probably reduce that to a trivial level.
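The pledge-combining step a peer would run can be sketched as follows; the types here are hypothetical stand-ins for illustration, not real Bitcoin data structures:

```python
# Sketch of pledge combination: each pledge contributes X coins via a
# SIGHASH_ANYONECANPAY input toward a shared unspendable output of Y
# coins, and a peer merges pledges into one valid transaction once the
# inputs it has seen cover Y.

from dataclasses import dataclass

@dataclass
class Pledge:
    input_value: float   # X: the pledged coins
    target: float        # Y: value of the shared unspendable output

def try_assemble(pledges):
    """Merge pledges into a broadcastable contract once sum(X) >= Y;
    otherwise keep waiting for more pledges."""
    if not pledges:
        return None
    target = pledges[0].target
    total_in = sum(p.input_value for p in pledges if p.target == target)
    if total_in < target:
        return None
    # Inputs now cover the unspendable output; under the proposed rule
    # its value (plus any excess input) is claimable as fees by miners.
    return {"output_to_unspendable": target, "extra_fee": total_in - target}

pledges = [Pledge(4.0, 10.0), Pledge(3.0, 10.0), Pledge(5.0, 10.0)]
print(try_assemble(pledges))  # inputs total 12, covering the 10-coin target
```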

Note that it is possible to do spend-to-fee assurance contracts without the new rule, but it'd be complicated and involve a multi-round protocol in which people first announce they want to pledge an output, then someone has to build a template transaction based on all the announcements, then the group signs it in parallel. It can be done but it's messier.

What are X and Y set to? That depends on what the participants want. It'd be nice to find economic research on the case where the public good in question is continuous, but I don't know of any at the moment. I think it'd likely work like this: merchants that have noticed they start seeing double spends when network speed drops below 5 THash/sec would clearly target that amount and no more. Other merchants might only care about 1 THash/sec. In that case, the first class of merchants can set their Y to 1 THash and just broadcast multiple independent contracts. This means there is a chance they'll get less than what they wanted (which weakens the definition of the target), but more of a chance of the contracts completing. At any rate, they would reduce their pledge size as well for each contract, so they aren't exposing themselves to freeloaders at any greater level.

For X, I imagine that if you start from a steady state your goal is to lower it as much as possible without stopping the contracts from completing entirely. One way is to lower your pledge and see how fast the contract completes - if the contract doesn't complete within your required time limit, you can just (anonymously) broadcast a second pledge to make it back to what you were previously pledging. But other players don't know you might do that.

In a healthy system I'd expect there to be many independent, overlapping contracts being formed. They're simple to construct so doing one per block is reasonable.

Conclusion

Obviously whilst there are many participants with different ideas about what network speed is "enough", with only one chain there can only be one speed. People with extreme needs would end up not being able to use the block chain for their security and would have to find alternative arrangements, or just accept that they'd be subsidising the activities of others who can tolerate lower security.

I think the next piece of work needed to explore this idea is searching the literature for studies of assurance contracts for continuous public goods, as the primary difference between what this scheme would do and proven models like Kickstarter is the need for group consensus on what "enough" mining means.

That said, I note that the alternative proposal (restrict block size and charge fees to get in) doesn't even try to answer the question of how much mining is enough. It's just assumed that some arbitrary byte limit would result in demand exceeding supply to the level that mining is well funded. The problem is that limiting blocks by physical size doesn't jibe with the fact that they move abstract value - if anything, for such a scheme to work, block sizes should be limited by satoshis transferred. Otherwise you could get a bunch of very high value transactions that end up with tiny fees, because that's all that's necessary to get into the block, as lower value transactions have long since given up on the system and gone elsewhere.
paraipan
March 23, 2013, 10:58:10 PM
#2

Interesting, now we're getting somewhere.

I disagree with one of your points, though: why enact a percentage fee, similar to what we have now with the banking system, on Bitcoin?

Mike Hearn
March 23, 2013, 11:24:14 PM
#3

You meant why not enact a percentage fee, right?

The argument is that unless there is a hard block size limit, miners are incentivised to include any transaction no matter how small its fee, because the cost of doing so is practically zero (less than a microdollar, according to Gavin's calculations). Therefore if a bunch of transactions stack up in the memory pool that pay a smaller percentage than "normal", some miner will include them anyway, because it costs nothing to do so and maximizes short-term profit. Hence you get a race to the bottom, and you need some kind of hard network rule saying you can't do that. We already have one in the form of the block byte size limit, so the debate becomes "let's keep the size limit" vs "let's remove it".
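The race to the bottom follows from a one-line decision rule; the marginal-cost figure below is the microdollar estimate mentioned above, and the function name is just illustrative:

```python
# With no size limit, a profit-maximizing miner includes every pending
# transaction whose fee exceeds its (near-zero) marginal inclusion cost,
# however small that fee is.

MARGINAL_COST = 1e-6   # dollars per transaction, per the estimate above

def select_transactions(mempool_fees):
    """Greedy selection with no block size constraint."""
    return [fee for fee in mempool_fees if fee > MARGINAL_COST]

# Even dust-level fees all get in:
print(select_transactions([0.50, 0.001, 0.00001]))  # all three included
```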
misterbigg
March 23, 2013, 11:49:26 PM
#4

why not enact a percentage fee, similar to what we have now with the banking system, on Bitcoin?

There's no certain way to determine which outputs of a transaction are the payment and which are the change being returned.

An assurance contract, also known as a provision point mechanism, is a game theoretic mechanism and a financial technology that facilitates the voluntary creation of public goods and club goods in the face of the free rider problem.

The proposal is to replace variable transaction fees (there always needs to be a minimum fee, to prevent relay spam) by rewarding miners with assurance contract payouts? This sounds very interesting.

Should one of the goals of the reward system be to make sure that the hash rate is never decreasing? What if the businesses that provide the bulk of the financing of assurance contracts go bankrupt? The network could experience a significant reduction in hash rate; miners at the margin of profitability would take their hardware offline. All this offline hardware could flood the secondary market, driving the cost of the equipment down, and cause a "hardware deflationary spiral". Then the network could be vulnerable to attack. Once mining hardware is produced, it is always "out there" even if it isn't active, so idle equipment might be considered a threat; hence, no system should allow the hash rate to drop suddenly.

It seems that a system of assurance contracts would also be susceptible to hysteresis. You even gave an example of it: merchants noticing double spends and bumping up the total value of their contracts accordingly. Shouldn't any system of rewards err on the side of always having too much hashing power instead of sometimes having too little?

Liberating the block size limit (assuming no propagation issues) is appealing in the context of assurance contracts because the economic interests of miners are aligned with those who have the greatest need for security, but it seems to be a fragile, manual process that involves a lot of subjective measurement.


gmaxwell
March 24, 2013, 12:35:21 AM
#5



A transaction pays me 1000 BTC in block X.
I want to make sure that payment stays put and isn't reversed.
I contribute to an assurance contract for 10 BTC payable in block >=X+1.

The person who paid me announces a conflicting transaction which pays 50 BTC in fees, with child transactions locked at X+1, X+2, X+3, X+4 paying 25, 12.5, 6.25, 3.125, etc.

Miners fork at X; the forking chain collects the ~100 BTC treachery award, and additionally the 10 BTC assurance contract.

Sufficiently smart mining software that computes the fork to mine on based on expected returns would be automatically complicit in this attack.

I think this description of having the assurance contracts paid by unrelated transactions is vulnerable to something similar to the POS problem ("proof of stake doesn't work because there is nothing at stake"): miners can mine honest chains or dishonest ones, and get paid all the same.
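The payoff comparison can be made concrete with the numbers from the example above; the assumption that mining software simply picks the chain with the larger fee revenue is the "sufficiently smart" behaviour described:

```python
# Toy version of the incentive problem: the attacker offers 50 BTC in
# fees on a conflicting transaction plus a halving series of time-locked
# child fees (25, 12.5, 6.25, 3.125, ...), while the honest chain only
# carries the 10 BTC assurance contract. The contract pays out on either
# chain, so a fee-maximizing miner prefers the fork.

def fork_reward(base_fee=50.0, halvings=5):
    # 50 + 25 + 12.5 + 6.25 + 3.125 = 96.875 BTC from the terms listed;
    # the geometric series approaches 100 as it continues.
    return sum(base_fee / 2**i for i in range(halvings))

honest = 10.0                       # assurance contract on the honest chain
dishonest = fork_reward() + 10.0    # bribe plus the very same contract
print(dishonest > honest)           # True: fee-maximizing miners fork
```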


I'm also still not following the incentives here. Say I am one of the minority with special high-value transaction security requirements, and I want to spend X on security: it would be more economical for me to just perform my own mining - at least then I can be sure my expenditure wouldn't fund the mining of a fork which is not in my interest. Of course, I have no particular incentive to mine anyone else's transactions if doing so is costly... and if there are few such parties you wouldn't expect mining to be well distributed...

ShadowOfHarbringer
March 24, 2013, 12:53:08 AM
#6

Is this problem of block size really so complex? Maybe you are all overcomplicating simple things.

Can't it be solved in one of the following ways?

1. Miners VOTE for the next block size every X blocks, using special transactions, extra data put in previously mined blocks, or whatever; OR
2. Block size is AUTOMATICALLY calculated every Y blocks using some relatively simple algorithm (such as how full previous blocks were)?

Why is this debacle so long and painful? Maybe we should do one of these and, if it doesn't work, try the next in the row?
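Option 2 might look something like this sketch; the window, thresholds, and growth factors are invented for illustration, and (as the next reply points out) miners can game any fullness-based rule by stuffing blocks with their own transactions:

```python
# A minimal sketch of an automatic block size rule: recompute the
# maximum size from how full recent blocks were. All constants here
# are illustrative, not a concrete proposal.

def next_max_size(current_max, recent_block_sizes, grow=1.1, shrink=0.9):
    """Adjust the size limit from the median fullness of a recent window."""
    sizes = sorted(recent_block_sizes)
    median = sizes[len(sizes) // 2]
    fullness = median / current_max
    if fullness > 0.9:          # consistently near-full: raise the limit
        return current_max * grow
    if fullness < 0.5:          # mostly empty: lower it
        return max(current_max * shrink, 1_000_000)  # keep a 1 MB floor
    return current_max

print(next_max_size(1_000_000, [950_000] * 100))  # near-full window: ~1.1 MB
```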

gmaxwell
March 24, 2013, 01:52:31 AM
#7

1. Miners VOTE for the next block size every X blocks, using special transactions, extra data put in previously mined blocks, or whatever; OR
2. Block size is AUTOMATICALLY calculated every Y blocks using some relatively simple algorithm (such as how full previous blocks were)?
Both of these cases reduce to "miners choose" (because, e.g., under (2) miners can stuff their blocks with 'fake' transactions). "Miners choose" is problematic for many reasons (discussed in detail in many other threads), but this thread is not about the block size; it's about how you can fund miners in ways other than using transaction fees to compete for limited space in blocks, since that isn't available if the size is unlimited. It would be helpful not to knock this thread off-topic. The alternative funding question is interesting regardless of what happens with the block size.

Peter Todd
March 24, 2013, 02:09:31 AM
#8

That said, I note that the alternative proposal (restrict block size and charge fees to get in) doesn't even try to answer the question of how much mining is enough. It's just assumed that some arbitrary byte limit would result in demand exceeding supply to the level that mining is well funded. The problem is that limiting blocks by physical size doesn't jibe with the fact that they move abstract value - if anything, for such a scheme to work, block sizes should be limited by satoshis transferred. Otherwise you could get a bunch of very high value transactions that end up with tiny fees, because that's all that's necessary to get into the block, as lower value transactions have long since given up on the system and gone elsewhere.

With regard to how we will pay miners to keep the network secure, restricting the block size isn't an alternative to assurance contracts; it's something you can do in addition to them. Thus if one method fails we have the other as a backup. Unlimited block sizes, on the other hand, leave us with just the untested mechanism of assurance contracts, and no backup option.

I dunno about you, but when it comes to the technology keeping a good chunk of my wealth secure, I happen to like redundancy.


EDIT: Having said that, don't get the impression I think assurance contracts are a bad idea. I'd encourage you to start a discussion about making OP_RETURN, possibly followed by a small amount of data, IsStandard()

d'aniel
March 24, 2013, 04:02:13 AM
#9

With large blocks supporting say 2000 tps, if 20% of transactors "do their part" and pay an economically insignificant $0.01 fee - way more than enough to cover the microcent cost of the actual transaction - then this would add up to $2400 per block in mining fees.  Currently we've got about a $1500 block reward.

So if the optimal hash rate scales sub-linearly with the transaction rate, and demand permits scaling up of block sizes, then it strikes me that relying on social pressure and client software defaults to yield lots of individually insignificant fees is also a solution to the problem of funding network security.
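The arithmetic in the post above checks out:

```python
# 2000 tps, 20% of transactors paying a flat $0.01 fee, 10-minute blocks.
tps = 2000
block_interval_s = 600
share_paying = 0.20
fee_usd = 0.01

fees_per_block = tps * block_interval_s * share_paying * fee_usd
print(fees_per_block)  # 2400.0 dollars per block, matching the post
```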
r.willis
March 24, 2013, 07:12:21 AM
#10

I think the block size is limited not to drive miners' profits up, but to protect the network from malicious miners generating large blocks full of tx spam.
There need not be a network-wide block size limit in the future; each miner can decide for himself where his limit will be. So we will have an open market and it will regulate itself.
Again, without fees, how will tx flood protection work?

Mike Hearn
March 24, 2013, 09:24:15 AM
#11

Miners fork at X; the forking chain collects the ~100 BTC treachery award, and additionally the 10 BTC assurance contract.

Isn't this just a complicated way of saying, "if the majority hashpower is dishonest, then ..."? Bitcoin has always assumed that the majority hashpower follows the rules of the system, changing that so you can effectively buy it on demand with fees does indeed break things, but then (to quote Satoshi), that is a point I already concede.
gmaxwell
March 24, 2013, 03:21:01 PM
#12

Isn't this just a complicated way of saying, "if the majority hashpower is dishonest, then ..."? Bitcoin has always assumed that the majority hashpower follows the rules of the system, changing that so you can effectively buy it on demand with fees does indeed break things, but then (to quote Satoshi), that is a point I already concede.
No. This works with high probability even for minorities of hashpower when the transaction is only shallowly buried; I should have been more clear.

The concern I was trying to highlight is that the separate transaction is just as redeemable whether the txns you care about are in the chain or not, so using something like this to secure particular transactions doesn't seem to work until those transactions are already secured. So I'm having trouble understanding the motivation of the people paying the bond. "Say I'm an altruist who wants to convince his employer to fund Bitcoin network security; how do I convince my boss that this is a good thing to spend money on?"

TierNolan
March 24, 2013, 05:28:17 PM
#13

If the system requires a hard fork, you could just add this directly to the rules.

For example, a transaction could specify a fee and also how much the payer is willing to pay per THash.

So, if a transaction had 1 BTC in fees and set the constant to 0.1 BTC/THash, then collecting the fees requires adding proof of work on top of the transaction.

Block X (3 THash):
- transaction included
- fee = 0.3
- remaining = 0.7

Block X+1 (3 THash):
- fee = 0.3
- remaining = 0.4

Block X+2 (2.5 THash - difficulty change):
- fee = 0.25
- remaining = 0.15

Block X+3 (2.5 THash):
- fee = 0.15 (only 0.15 left)
- remaining = 0

This allows a user to spread their fee out over the next few blocks. It is also effectively an assurance contract: they pay 1 BTC and it is collected by miners securing the transaction with 1 BTC / 0.1 BTC/THash = 10 THash of work.

There could be a limit to how many blocks the fee can be paid forward.
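The pay-forward rule can be sketched as follows; the per-block work figures are illustrative, and integer satoshis are used to avoid rounding:

```python
# Sketch of the pay-forward fee rule: a transaction carries a total fee
# and a price per THash, and each subsequent block collects
# work * price until the fee is exhausted.

SAT = 100_000_000  # satoshis per BTC

def collect_forward_fees(total_fee_sat, sat_per_thash, block_work_thash):
    """Return the fee collected by each block until the fee runs out."""
    remaining = total_fee_sat
    fees = []
    for work in block_work_thash:
        fee = min(int(work * sat_per_thash), remaining)
        remaining -= fee
        fees.append(fee)
        if remaining == 0:
            break
    return fees

# 1 BTC total fee at 0.1 BTC/THash, with blocks doing 3, 3, 2.5, 2.5 THash:
fees = collect_forward_fees(1 * SAT, SAT // 10, [3, 3, 2.5, 2.5])
print([f / SAT for f in fees])  # [0.3, 0.3, 0.25, 0.15]
```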

r.willis
March 24, 2013, 05:34:28 PM
#14

because, e.g. under (2) miners can stuff their blocks with 'fake' txn
Then make the limit depend not on block size but on the total fees collected the previous week: low fees = blocks too large; high fees = blocks too small.

Mike Hearn
March 24, 2013, 06:19:06 PM
#15

The concern I was trying to highlight is that the separate transaction is just as redeemable whether the txns you care about are in the chain or not, so using something like this to secure particular transactions doesn't seem to work until those transactions are already secured. So I'm having trouble understanding the motivation of the people paying the bond. "Say I'm an altruist who wants to convince his employer to fund Bitcoin network security; how do I convince my boss that this is a good thing to spend money on?"

It's not really altruism; it's a cost of business. If you're a Bitcoin-using business, you clearly need mining to happen and it's OK for that to have some costs - you just don't want your costs to be higher than your competitors', as otherwise you'd be at a disadvantage.

I'd imagined that for most businesses they'd just constantly take part in the contracts rather than try to match up pledges with pending payments precisely. Even if you try to only take part when there's an inbound payment, why would you pledge for block X+1 instead of the same block in which your transaction occurs?
TierNolan
March 24, 2013, 08:12:44 PM
#16

I'd imagined that for most businesses they'd just constantly take part in the contracts rather than try to match up pledges with pending payments precisely. Even if you try to only take part when there's an inbound payment, why would you pledge for block X+1 instead of the same block in which your transaction occurs?

Block X+1 also secures your transaction.  You are paying to get your transaction secured with a certain number of hashes.  This may require multiple blocks.

yordan
March 25, 2013, 04:52:24 PM
#17

OK, clearly some creative solutions might be needed in the interim, that's granted. But when and if Bitcoin achieves a really wide level of adoption - let's say Facebook level - and given the effects of Moore's law, what if the client were rewritten to require everybody running it to do a little mining? I'm not a mathematician, but I bet there is one on the forum who can tell if this is viable... Let's say Bitcoin has a billion users and each of them dedicates 10-20% of their CPU power for a couple of hours per day - would that be enough to support the network?
solex
March 27, 2013, 12:55:07 AM
#18

Is this problem of block size really so complex? Maybe you are all overcomplicating simple things.

Can't it be solved in one of the following ways?

1. Miners VOTE for the next block size every X blocks, using special transactions, extra data put in previously mined blocks, or whatever; OR
2. Block size is AUTOMATICALLY calculated every Y blocks using some relatively simple algorithm (such as how full previous blocks were)?

Why is this debacle so long and painful? Maybe we should do one of these and, if it doesn't work, try the next in the row?

2. [max] Block size is AUTOMATICALLY calculated...

I agree.
I think option 2 is the simplest and most effective solution to the roadblock of the current fixed limit. I am all for scaling with demand, but scaling to infinity is just going to the opposite extreme.

There is a magic block size, around 10 MB, where the fee market will work properly because fees start to exceed the block reward. The existing fee structure, which also allows for some zero-fee transactions, seems to be proving itself in the field, as overall fees are slowly increasing anyway. Why don't we let this organic growth continue by adopting option 2 soon?

markm
March 27, 2013, 01:20:50 AM
#19

If it is changeable it can be manipulated and you are basically asking the wolves and sheep to vote, using the sharpness of their teeth as the value of their votes...

You might as well just decide outright that mining is something only first-world nations and Fortune 10 or Fortune 25 companies should be able to afford, maybe with a few telecom corps that own lots of cable without even being in the Fortune 25.

Heck having the governments do it makes a lot of sense to a lot of companies maybe, as it puts the cost onto the taxpayers instead of having to pay it directly themselves... Since that would increase government control over the money maybe politicians would agree with that...

Unlimited basically means whoever wins the next world war or world currency war controls a monopoly on it... and invites that war to please commence...

-MarkM-

TierNolan
March 27, 2013, 09:57:20 AM
#20

Why is this debacle so long and painful? Maybe we should do one of these and, if it doesn't work, try the next in the row?

Any change to the rules needs to have potential exploits considered.  If you change the rules of the game, then you end up with a different result.

bitlancr
March 27, 2013, 11:39:45 AM
#21

Wouldn't things be much simpler if Bitcoin inflation were allowed to continue past 21m BTC?
ShadowOfHarbringer
March 27, 2013, 12:02:34 PM
#22

Wouldn't things be much simpler if Bitcoin inflation were allowed to continue past 21m BTC?


caveden
March 27, 2013, 01:02:01 PM
#23

Wouldn't things be much simpler if Bitcoin inflation were allowed to continue past 21m BTC?

The whole point of all these discussions is to come up with spontaneous/natural/market-based ways to finance the work that is done for the blockchain, and not just arbitrarily pick some magic numbers or formulas and impose them via the protocol.

The inflation rate is necessarily an "arbitrary formula". Satoshi was wise to put an asymptotic cap on it.

If you use an arbitrary number/formula, there's just no way of knowing whether it really reflects the amount of security the network needs. It's likely to be higher (but it can also be lower), meaning that the resources being spent on such security would be better spent elsewhere (or are not enough). It's the knowledge problem Hayek talks about: you'd have to be omniscient to know the optimal formula, and it would need to change constantly.

Finally, inflation is even worse than an arbitrary transaction fee, as it doesn't put the load on those who create the load: you force every holder to pay, regardless of how many UTXOs they're creating.

caveden
March 27, 2013, 01:08:12 PM
#24

This assurance contract theory is interesting, but I don't see merchants themselves making the pledges.
Perhaps insurance+assurance would work better. The merchant only wants to be sure he won't be defrauded. He pays an insurance contract for his transactions. Insurers want to make sure the network has enough hash rate to make it unlikely that many of their insured transactions get double-spent, so they may start these pledges. Other insurers may contribute.

I don't think merchants themselves would start the pledges because that would involve complex calculations (how much is enough?) whose results would change frequently. Merchants wouldn't want to be bothered with such things.

bitlancr
March 27, 2013, 02:51:14 PM
#25

Maybe there is a market-based way to set the inflation rate? Say the BTC transacted (or fees) in a block is low; then newly mined coins could be used to top the reward up to an acceptable amount. The 'acceptable amount' needs to be high enough to cover miners' expenses - this could be decided based on the hash rate, perhaps.

The leap of logic here is that more transactions => more fees, though. Of course, with the free-rider problem, this isn't necessarily the case.
allten
March 27, 2013, 07:38:29 PM
#26

Learned something new today: Assurance Contract.

Though I'm still having a difficult time seeing how this could be implemented in a simple manner for Bitcoin.

There's one idea I've had (I post it here because it seems related... maybe):

I would like the ability to pay fees not just to the miner who solved the block, but to the miners of the following X blocks. It doesn't seem too controversial, and it would be easy to implement. Fees are still kind of voluntary, and for those that do pay a decent amount, it would be nice to have a choice to break those fees up over the next 10, 100, 1000, even 10,000+ blocks.
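A sketch of how such a split might be computed (hypothetical protocol change; `split_fee` is a made-up helper, and fees are in satoshis to avoid floating-point issues):

```python
def split_fee(total_fee_satoshi, n_blocks):
    """Divide one transaction's fee evenly across the miners of the
    next n_blocks blocks; the first miner collects any remainder."""
    share, remainder = divmod(total_fee_satoshi, n_blocks)
    return [share + remainder] + [share] * (n_blocks - 1)

print(split_fee(1005, 10))  # [105, 100, 100, 100, 100, 100, 100, 100, 100, 100]
```

Spreading the payout this way would also reward miners who extend the chain containing the transaction, not just the one who included it.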
ShadowOfHarbringer
Legendary
*
Offline Offline

Activity: 1470


Bringing Legendary Har® to you since 1952


View Profile
March 28, 2013, 06:53:33 AM
 #27

Maybe there is a market-based way to set the inflation rate? Say the BTC transacted (or the fees) in a block are low; then newly mined coins could be used to top the reward up to an acceptable amount. The 'acceptable amount' needs to be high enough to cover the miners' expenses - this could be decided based on the hashrate, perhaps.

The leap of logic here is that More Txs => More Fees, though. Of course, with the free rider problem, this isn't necessarily the case.
Please go find and read 100 threads that have talked about changing the total number of BTC and then come back.

The inflation rate will not be changed. Period.
Deal with it.

bitlancr
Hero Member
*****
Offline Offline

Activity: 616


View Profile
March 28, 2013, 10:09:42 AM
 #28

The inflation rate will not be changed. Period.
Deal with it.

From what I've read, the arguments against increasing inflation just seem to be a fundamentalist "NO - inflation is theft" stance. That may be, but inflation is also an elegant way to reward the miners - the people who make the whole bitcoin system work. Also, increasing the money supply doesn't always mean inflation... we have an increasing money supply now, and massive deflation!

As someone who's trying to start a business in the bitcoin economy, I just want to make bitcoin the best currency it can be. That means being a great medium of exchange, not just a store of value.

caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
March 28, 2013, 10:58:22 AM
 #29

That may be, but it's also an elegant way to reward the miners

Inflation is not elegant at all. It's the worst way to charge people. It's disguised, it distributes its load onto people who are not creating the load, etc. Even in the context of a voluntary currency, it's almost dishonest. In the context of legal tender, calling it theft is not a "fundamentalist stance"; it's a fact.

Also, increasing the money supply doesn't always mean inflation...

Sure it does. That's the very definition of monetary inflation.

we have an increasing money supply now, and massive deflation!

You're confusing monetary inflation with price increases.

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
bitlancr
Hero Member
*****
Offline Offline

Activity: 616


View Profile
March 28, 2013, 11:10:19 AM
 #30

Also, increasing the money supply doesn't always mean inflation...

Sure it does. That's the very definition of monetary inflation.

we have an increasing money supply now, and massive deflation!

You're confusing monetary inflation with price increases.

I mean inflation in this sense:
https://www.google.com/search?q=define%3A+inflation

If the purchasing power of your money is increasing, why do you care about the money supply?
ShadowOfHarbringer
Legendary
*
Offline Offline

Activity: 1470


Bringing Legendary Har® to you since 1952


View Profile
March 28, 2013, 11:50:39 AM
 #31

Also, increasing the money supply doesn't always mean inflation...

Sure it does. That's the very definition of monetary inflation.

we have an increasing money supply now, and massive deflation!

You're confusing monetary inflation with price increases.

I mean inflation in this sense:
https://www.google.com/search?q=define%3A+inflation

Bitcoin's inflation rate is practically a HOLY rule of Bitcoin.
If you ever try to change it, a hard fork and major havoc and panic on the market are inevitable.

So what you are asking for is impossible to do. I, for sure, would immediately leave (or rather: switch to the proper fork) if somebody tried to mess with the holy rule.

If the purchasing power of your money is increasing, why do you care about the money supply?

I don't think you understand what "money supply" actually means. Or perhaps you don't understand what money is and what it stands for at all.

bitlancr
Hero Member
*****
Offline Offline

Activity: 616


View Profile
March 28, 2013, 02:07:19 PM
 #32


Bitcoin's inflation rate is practically a HOLY rule of Bitcoin.
If you ever try to change it, a hard fork and major havoc and panic on the market are inevitable.

So what you are asking for is impossible to do. I, for sure, would immediately leave (or rather: switch to the proper fork) if somebody tried to mess with the holy rule.


I can accept that - and if it were ever to come to a hard fork the majority would win. That's the beauty of Bitcoin, right?

I don't think you understand what "money supply" actually means. Or perhaps you don't understand what money is and what it stands for at all.

Yes, I do. Right now, the money supply of Bitcoin is increasing, and yet its purchasing power is also increasing. We have monetary inflation, along with deflation in real terms.

The point is monetary inflation doesn't imply real inflation.

I'm going to stop there, because I'm feeling like I've hijacked this thread... I'm sure this debate could still be of interest to others in the community, though, particularly those who want to see a flourishing Bitcoin economy, not just a store of value.
caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
March 28, 2013, 02:23:53 PM
 #33


On the terminology: https://mises.org/daily/908

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
ShadowOfHarbringer
Legendary
*
Offline Offline

Activity: 1470


Bringing Legendary Har® to you since 1952


View Profile
March 28, 2013, 03:00:31 PM
 #34


Bitcoin's inflation rate is practically a HOLY rule of Bitcoin.
If you ever try to change it, a hard fork and major havoc and panic on the market are inevitable.

So what you are asking for is impossible to do. I, for sure, would immediately leave (or rather: switch to the proper fork) if somebody tried to mess with the holy rule.


I can accept that - and if it were ever to come to a hard fork the majority would win. That's the beauty of Bitcoin, right?

Yeah, you can do that even today - that's the beauty of it. Just fork the code. Or find somebody that can do it for you.
Of course the market would quickly verify your claims and pick a winner.

I don't think you understand what "money supply" actually means. Or perhaps you don't understand what money is and what it stands for at all.

The point is monetary inflation doesn't imply real inflation.

Doesn't matter. The more money gains in value, the better. Deflation is good for the economy, contrary to popular belief. And Bitcoin will be the perfect proof of that.

bitlancr
Hero Member
*****
Offline Offline

Activity: 616


View Profile
March 28, 2013, 03:22:29 PM
 #35

On the terminology: https://mises.org/daily/908

His arguments seem to be based on an oversimplification, so I'm not sure I buy them. Where do the velocity of money and the 'real' value of transactions come into play?


Yeah, you can do that even today - that's the beauty of it. Just fork the code. Or find somebody that can do it for you.
Of course the market would quickly verify your claims and pick a winner.

Would be a bit pointless to do that without any community support though, right?  Smiley

Doesn't matter. The more money gains in value, the better. Deflation is good for the economy, contrary to popular belief. And Bitcoin will be the perfect proof of that.

That may be true - but I still contend that you can have real deflation alongside monetary inflation (this is evidently true, as it's the situation we're in now). And even then, you still need to figure out a way to pay the miners fairly (the original point of this thread).

On a separate note, consider this:

If you're holding Bitcoins but not spending them, you're still using the services of the bitcoin network. The service being, in this case, storing your capital. You're relying on the activity in the network around you (the miners) to hold the value of that capital. Why shouldn't you pay for that service?
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
March 28, 2013, 04:21:38 PM
 #36

If you're holding Bitcoins but not spending them, you're still using the services of the bitcoin network. The service being, in this case, storing your capital. You're relying on the activity in the network around you (the miners) to hold the value of that capital. Why shouldn't you pay for that service?

It's unfortunate that the economics of the PoW system have a kind of "asymmetrical warfare" aspect to them, in that the amount of funds an attacker needs to spend to destroy the system with a 51% attack will always be far less than the value they destroy. It's a simple market cap thing: if we spend 1% of the market cap per year on hashing power, an attacker only needs to spend just over 1% to attack the other 99% of the value. Even if you assume the attacker can't rent the hashing power and that mining has a significant capital cost, you're probably looking at something like 5% of the market cap to attack the other 95%.
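That ratio is easy to put in code (illustrative arithmetic only; `attack_ratio` is a made-up helper):

```python
def attack_ratio(security_fraction):
    """Attacker's cost as a fraction of the value being attacked,
    assuming the attacker must roughly match the honest miners'
    spend to mount a 51% attack."""
    return security_fraction / (1 - security_fraction)

# Spend 1% of market cap on mining: attacking the other 99% costs ~1% of it.
print(round(attack_ratio(0.01), 4))  # 0.0101
# Even in the capital-cost case (5% of market cap): still cheap.
print(round(attack_ratio(0.05), 4))  # 0.0526
```

However the security budget is funded, the attack cost stays a small fraction of the value defended unless the budget itself is a large fraction of the market cap.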

An emergency proof-of-stake system is a doable option, but getting any of them to work - even the really simple pseudo-PoS idea of just prioritizing based on the coin-age spent in the block - without requiring every node to have a copy of the UTXO set is going to be tricky.

nevafuse
Sr. Member
****
Offline Offline

Activity: 248


View Profile
March 28, 2013, 05:14:22 PM
 #37

The funding of network security is self-regulated via transaction fees. If the network starts being attacked, people will either try to save it or cash out. Saving it will mean increased transaction fees to pay for a higher hash rate. No other mechanism is necessary. Artificially limiting the block size will only cause the network to overpay for security, which will cause people to leave Bitcoin for another currency.

The only reason to limit the block size is to subsidize non-Bitcoin currencies.
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 06:03:12 PM
 #38

...

The argument is that unless there is a hard block size limit, miners are incentivised to include any transaction no matter how small its fee, because the cost of doing so is practically zero (less than a microdollar, according to Gavin's calculations). Therefore, if a bunch of transactions stack up in the memory pool that pay a smaller percentage than "normal", some miner will include them anyway, because it costs nothing to do so and maximizes short-term profit. Hence, you get a race to the bottom, and you need some kind of hard network rule saying you can't do that. We already have one in the form of the block byte size limit, so the debate becomes "let's keep the size limit" vs "let's remove it".

Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and it results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

When the network receives more than one valid block within close time proximity, it holds both and waits to see which is extended longest, breaking the tie. Again, every second a miner waits to announce a block increases the chance their found block won't be permanently regarded as valid. Physical block size is not a factor in that, and miners will naturally stuff their blocks with the most profitable transactions first.

justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
March 28, 2013, 06:08:21 PM
 #39

Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and it results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

When the network receives more than one valid block within close time proximity, it holds both and waits to see which is extended longest, breaking the tie. Again, every second a miner waits to announce a block increases the chance their found block won't be permanently regarded as valid. Physical block size is not a factor in that, and miners will naturally stuff their blocks with the most profitable transactions first.
This is exactly true, and rather obvious.

The fact that solutions are being proposed to a problem that can be so trivially shown not to exist calls into question the real motives of the people pushing said solutions.
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 06:17:14 PM
 #40

Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and it results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

When the network receives more than one valid block within close time proximity, it holds both and waits to see which is extended longest, breaking the tie. Again, every second a miner waits to announce a block increases the chance their found block won't be permanently regarded as valid. Physical block size is not a factor in that, and miners will naturally stuff their blocks with the most profitable transactions first.
This is exactly true, and rather obvious.

The fact that solutions are being proposed to a problem that can be so trivially shown not to exist calls into question the real motives of the people pushing said solutions.

I think that's rather harsh. People are processing Bitcoin problems (scalability, block size, etc.) in different parts, from different aspects, and with differing information. I don't think calling anything obvious is fair.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
March 28, 2013, 06:24:32 PM
 #41

I think that's rather harsh. People are processing Bitcoin problems (scalability, block size, etc.) in different parts, from different aspects, and with differing information. I don't think calling anything obvious is fair.
I didn't necessarily mean it should have been obvious to you, but it should be for the person you were quoting, especially since it's been brought up many times in the past.
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 06:31:26 PM
 #42

I think that's rather harsh. People are processing Bitcoin problems (scalability, block size, etc.) in different parts, from different aspects, and with differing information. I don't think calling anything obvious is fair.
I didn't necessarily mean it should have been obvious to you, but it should be for the person you were quoting, especially since it's been brought up many times in the past.

Yes, that's what I meant. I've always believed the time between blocks was the factor giving miners an incentive to broadcast hurriedly, but I don't recall reading that in any block size debate thread. That's what I mean about people processing problems from different aspects and with different information sets. How do you know almost everyone has seen that point mentioned?
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
March 28, 2013, 06:31:31 PM
 #43

Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and it results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

Yeah you're missing something.

Miners are constantly hashing to try to find a new block that includes the most profitable set of transactions they can fit. If they find a hash that meets the target, they should immediately send the block they found to the network and start trying to find a hash that builds on it.

Under no circumstances does it ever make sense to withhold a solution. If they find two blocks in a row, splitting the transactions they could have included between the two blocks, that's actually better for the miner, because it makes it harder for other miners to orphan those blocks and thus collect the fees themselves.

You have to remember that mining is a random process. It's not like you work towards solving a block; it's more like you have a machine that spits out lottery tickets, and you are scratching them off as fast as possible hoping for a winner. You might get lucky and have two winners in a row, or unlucky and go for days before finding another one, but either way, every winner you do find you should cash in immediately, however much it's worth.
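The lottery-ticket analogy is easy to simulate (a toy model: `P_WIN` is an arbitrary per-attempt success probability, vastly higher than any real hash target, chosen so the demo runs quickly):

```python
import random

rng = random.Random(1)
P_WIN = 1e-3  # chance that a single "ticket" (hash attempt) wins

def attempts_until_win():
    """Scratch tickets until one wins. Each attempt is independent,
    so past failures never bring the next win any closer."""
    n = 1
    while rng.random() >= P_WIN:
        n += 1
    return n

samples = [attempts_until_win() for _ in range(2000)]
print(sum(samples) / len(samples))  # clusters around 1/P_WIN = 1000
```

The memoryless geometric distribution of `attempts_until_win` is why "withholding progress" makes no sense: there is no progress to withhold.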

acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 06:34:16 PM
 #44

Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and it results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

Yeah you're missing something.

Miners are constantly hashing to try to find a new block that includes the most profitable set of transactions they can fit. If they find a hash that meets the target, they should immediately send the block they found to the network and start trying to find a hash that builds on it.

Under no circumstances does it ever make sense to withhold a solution. If they find two blocks in a row, splitting the transactions they could have included between the two blocks, that's actually better for the miner, because it makes it harder for other miners to orphan those blocks and thus collect the fees themselves.

You have to remember that mining is a random process. It's not like you work towards solving a block; it's more like you have a machine that spits out lottery tickets, and you are scratching them off as fast as possible hoping for a winner. You might get lucky and have two winners in a row, or unlucky and go for days before finding another one, but either way, every winner you do find you should cash in immediately, however much it's worth.

I don't see that you've shown something I missed (not trying to be sarcastic). It sounds like you're describing my point.
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
March 28, 2013, 06:43:27 PM
 #45

I don't see that you've shown something I missed (not trying to be sarcastic). It sounds like you're describing my point.

Ah, you're saying that because miners have a time limit, they won't want to fill up their blocks.

What I'm saying, and now I think you do understand, is that mining is a random process so miners should send every block out with whatever transactions they included in it when they found the correct PoW; we're in agreement on that point.

However, without a limit, what reason do I have to send the miner a high fee in the first place? Provided the marginal cost of including my transaction, based on network costs and the increased chance the block will be orphaned, is less than the fee I attached, they'll include it. So naturally fees will settle down to that marginal cost. The problem is that the network cost is tiny, has nothing to do with the long-term cost of storing the UTXO set, and is also fixed, so profitability for larger, more centralized pools is always higher than for smaller ones. The other side of the cost, the orphaning chance, goes down as fees go down, essentially because if fees aren't significant, the loss due to orphaning isn't significant either, so you can take more risks and try to stuff more low-fee transactions into your blocks.

It's a nasty race to the bottom - a textbook example of how capital-intensive businesses, where efficiency goes up with capital investment, tend to result in oligopolies or monopolies in the long run.
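The fee floor described here can be put into a toy model (all names and figures are invented for illustration; real orphan dynamics are more complicated):

```python
BLOCK_INTERVAL = 600.0  # target seconds between blocks

def min_profitable_fee(tx_bytes, bytes_per_second, block_value_btc):
    """Lower bound on the fee a rational miner demands: the extra
    orphan risk the transaction adds, times what an orphan would
    cost. Note the bound shrinks as block_value_btc (total fees
    at stake) shrinks -- the race-to-the-bottom feedback loop."""
    extra_delay = tx_bytes / bytes_per_second   # added propagation time
    orphan_risk = extra_delay / BLOCK_INTERVAL  # small-delay hazard rate
    return orphan_risk * block_value_btc

# A 250-byte transaction on a 1 MB/s link, with 25 BTC at stake:
fee = min_profitable_fee(250, 1_000_000, 25.0)
print(fee)  # on the order of 1e-5 BTC
```

Both inputs to the floor fall over time: bandwidth grows, and if fees fall, the orphan penalty falls with them, so the floor has no stable resting point above the tiny network cost.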

acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 07:23:06 PM
 #46

I don't see that you've shown something I missed (not trying to be sarcastic). It sounds like you're describing my point.

Ah, you're saying that because miners have a time limit, they won't want to fill up their blocks.

What I'm saying, and now I think you do understand, is that mining is a random process so miners should send every block out with whatever transactions they included in it when they found the correct PoW; we're in agreement on that point.

However, without a limit, what reason do I have to send the miner a high fee in the first place? Provided the marginal cost of including my transaction, based on network costs and the increased chance the block will be orphaned, is less than the fee I attached, they'll include it. So naturally fees will settle down to that marginal cost. The problem is that the network cost is tiny, has nothing to do with the long-term cost of storing the UTXO set, and is also fixed, so profitability for larger, more centralized pools is always higher than for smaller ones. The other side of the cost, the orphaning chance, goes down as fees go down, essentially because if fees aren't significant, the loss due to orphaning isn't significant either, so you can take more risks and try to stuff more low-fee transactions into your blocks.

It's a nasty race to the bottom - a textbook example of how capital-intensive businesses, where efficiency goes up with capital investment, tend to result in oligopolies or monopolies in the long run.

OK, got it, thanks. Yes, I was missing something. That's what I get for reading too quickly. The following, which I quoted earlier, is actually right:

...

The argument is that unless there is a hard block size limit, miners are incentivised to include any transaction no matter how small its fee, because the cost of doing so is practically zero (less than a microdollar, according to Gavin's calculations). Therefore, if a bunch of transactions stack up in the memory pool that pay a smaller percentage than "normal", some miner will include them anyway, because it costs nothing to do so and maximizes short-term profit. Hence, you get a race to the bottom, and you need some kind of hard network rule saying you can't do that. We already have one in the form of the block byte size limit, so the debate becomes "let's keep the size limit" vs "let's remove it".

In my mind I was thinking of this text from the OP:

One question that comes up often in the block size debate is how will mining be funded if there's no competition for block space.

I took that as meaning no fees, but the other quote is about low/marginal fees, not zero fees.

My response about blocks being limited by time addresses zero fees, not low fees, as miners will prioritize transactions with any fee (even if very low) first.

I see what you mean about the race to the bottom now for marginal fees. I remember reading that point in another debate thread.
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
March 28, 2013, 07:39:14 PM
 #47

Block size is effectively infinite right now, because blocks are nowhere near full to the 500KB soft limit.

Empirical evidence is preferable to theoretical chains of cause and effect, and it already shows a steady long-term increase in fees. The chart below is in BTC, ignoring the USD-equivalent one...
https://blockchain.info/charts/transaction-fees?showDataPoints=false&timespan=&show_header=true&daysAverageString=7&scale=0&address=

What is being done in the field to increase fees, right now, is WORKING.

All that needs to happen is for the 1MB limit to be replaced by a capping algorithm which keeps pace just ahead of demand. Then see what happens to fees. If they plateau at too low a level, then try to fix it. Why fix something which is not broken (except for the need to avoid the sudden train wreck due to an arbitrary constant)?
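One way such a capping algorithm might look (entirely hypothetical; `next_block_cap`, the headroom multiple, and the floor are invented for illustration, and no such rule exists in Bitcoin):

```python
def next_block_cap(recent_sizes, headroom=2.0, floor=1_000_000):
    """Hypothetical dynamic cap: a multiple of the median of recent
    block sizes, never below the current 1MB limit, so the cap
    tracks demand while staying ahead of it."""
    ordered = sorted(recent_sizes)
    median = ordered[len(ordered) // 2]
    return max(floor, int(median * headroom))

print(next_block_cap([300_000] * 11))  # 1000000 -- the floor applies
print(next_block_cap([900_000] * 11))  # 1800000 -- tracks demand
```

Using a median rather than a mean keeps a single outlier "monster block" from dragging the cap upward, which speaks to the concern raised later in the thread.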

acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 08:04:43 PM
 #48

Block size is effectively infinite right now, because blocks are nowhere near full to the 500KB soft limit.

That's something that occurred to me too...

Empirical evidence is preferable to theoretical chains of cause and effect, and it already shows a steady long-term increase in fees. The chart below is in BTC, ignoring the USD-equivalent one...
https://blockchain.info/charts/transaction-fees?showDataPoints=false&timespan=&show_header=true&daysAverageString=7&scale=0&address=

What is being done in the field to increase fees, right now, is WORKING.

All that needs to happen is for the 1MB limit to be replaced by a capping algorithm which keeps pace just ahead of demand. Then see what happens to fees. If they plateau at too low a level, then try to fix it. Why fix something which is not broken (except for the need to avoid the sudden train wreck due to an arbitrary constant)?

Have there been good arguments against a dynamic cap?
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
March 28, 2013, 08:31:16 PM
 #49

The problem is that the network cost is tiny, has nothing to do with the long-term cost of storing the UTXO set, and is also fixed, so profitability for larger, more centralized pools is always higher than for smaller ones.
There is a marginal cost to the miner for increasing the UTXO set, in the form of capital investment in the memory and fast storage to hold it. When the UTXO set gets large enough to be a problem, miners will have an economic incentive to reduce their hardware costs by favouring transactions that shrink the set over those that grow it.

Even the miners with lower capital costs will have an incentive to limit the size of the set, because it affects the speed at which other nodes can validate their blocks, and thus their orphan rate.
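That incentive could surface as a mining policy along these lines (an illustrative sketch, not real node code; the field names and the tie-breaking rule are made up):

```python
def utxo_delta(n_inputs, n_outputs):
    """Net change a transaction makes to the UTXO set size:
    each input spends (removes) an entry, each output adds one."""
    return n_outputs - n_inputs

def priority_key(tx):
    """Order candidates by fee first; among equal fees, prefer
    transactions that shrink the UTXO set."""
    return (-tx["fee"], utxo_delta(tx["inputs"], tx["outputs"]))

mempool = [
    {"fee": 10, "inputs": 1, "outputs": 5},   # grows the set by 4
    {"fee": 10, "inputs": 5, "outputs": 1},   # shrinks it by 4
]
mempool.sort(key=priority_key)
print(mempool[0])  # the set-shrinking transaction wins the tie
```

No consensus change is needed for this: it is purely a local selection policy, which is the point of the post above.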
markm
Legendary
*
Offline Offline

Activity: 1988



View Profile WWW
March 28, 2013, 08:38:32 PM
 #50

What orphan rate? Miners who cannot service the large-population markets hardly even count, do they? If you are serving billions of people, who will even care that a bunch of third-world peasants' local miners fail to rubber-stamp megacorp's blocks?

Heck if merely having more bandwidth isn't enough to nuke the competition and gain a monopoly, why not buy some hashing power too fergoshsakes?

If your huge nuke-the-smaller-people blocks aren't making you enough money to buy up a majority of hashing power too, maybe you aren't doing it right, or are doing it too soon, or are merely too bandwidth-centric and not balancing your bandwidth advantage with a hashing advantage.

Maybe you can get together with number two and together try harder?

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
markm
Legendary
*
Offline Offline

Activity: 1988



View Profile WWW
March 28, 2013, 08:43:54 PM
 #51

Block size is effectively infinite right now, because blocks are nowhere near full to the 500KB soft limit.

Bandwidth is NOT effectively infinite right now, because there still exists at least one entity or nation on the planet that does not have enough bandwidth and processing power to process blocks.

Before it becomes effectively infinite, it will become effectively too much for anyone other than the one global mega-best-at-it cartel to handle.

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
March 28, 2013, 09:12:17 PM
 #52

The fact that solutions are being proposed to a problem that can be so trivially shown not to exist calls into question the real motives of the people pushing said solutions.

I believe their motive is to try to convince those who want to cripple Bitcoin with a 1MB block size limit that such crippling is not a good idea...

Have there been good arguments against a dynamic cap?

I used to support such an idea, until I realized that a dynamic cap implies an arbitrary formula, which is an attempt to guess subjective demands and unpredictable supplies. It's impossible and, fortunately, not necessary.

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 28, 2013, 09:24:47 PM
 #53

Have there been good arguments against a dynamic cap?

I used to support such an idea, until I realized that a dynamic cap implies an arbitrary formula, which is an attempt to guess subjective demands and unpredictable supplies. It's impossible and, fortunately, not necessary.

That doesn't make sense to me. The formula wouldn't need to be arbitrary; it could be based on actual data. If your sentiment were true, the difficulty target wouldn't work.
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
March 28, 2013, 09:30:25 PM
 #54

Block size is effectively infinite right now, because blocks are nowhere near full to the 500KB soft limit.
Bandwidth is NOT effectively infinite right now,...
-MarkM-

MarkM, I still read your posts because you embed enough useful feedback within your stream-of-consciousness padding to make it worthwhile, but it is a struggle at times.

Clearly bandwidth is finite, but the increase in propagation time between 50KB, 500KB or 5MB blocks is not very significant in a 10 minute time window. The increase in verification time seems to be the real limiting factor.

I used to support such an idea, until I realized that a dynamic cap implies an arbitrary formula, which is an attempt to guess subjective demands and unpredictable supplies. It's impossible and, fortunately, not necessary.

The advantages are nice-to-haves, not mission-critical:
Politically, a cap is a less radical departure from the soft and hard block limits which people know about. Psychologically, it maintains a perceived need to add fees, and it might price out SD-like flooding. It also prevents the chance that an unexpected monster block gets accepted and built on, causing problems for some miners.



caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
March 28, 2013, 09:40:08 PM
 #55

That doesn't make sense to me. The formula wouldn't need to be arbitrary; it could be based on actual data.

But the formula remains arbitrary. You can't come up with an algorithm capable of measuring actual demand and actual supply, since these quantities are impossible to measure. So you can't really know how much security is demanded (remember, demand is subjective!), nor how such demand would compete for Earth's scarce resources. You would need to be omniscient to know all that.
 
If your sentiment were true the difficulty target wouldn't work.

The difficulty target aims to produce one block every 10 minutes. But why 10 minutes? It is an arbitrary value. It may be too much sometimes, too little at other times. It's certainly not optimal. That said, it's not such a big deal, and trying to improve it would not be worth the risks.

Concerning mining remuneration, if we can go directly to spontaneous order - and that's the closest you'll ever get to "optimal" - then why not? Why try to come up with arbitrary formulas? That would be a "pretence of knowledge".
Quote from: Hayek
"The curious task of economics is to demonstrate to men how little they know about what they imagine they can design."

Politically, a cap is a less radical departure from the soft and hard block limits which people already know about. Psychologically, it maintains a perceived need to add fees, and might price out SD-like flooding.

SD is not flooding anything. They're not attacking the network; Bitcoin users want to use their services.
Of all businesses, they're likely the one that has contributed the most to miners via transaction fees.

It also prevents the chance that an unexpected monster block gets accepted and built on, causing problems for some miners.

Miners have no interest in keeping a "monster block", and they can easily choose not to build on top of such a block unless it is already N blocks deep, which would likely get the monster block rejected by the network.

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
March 28, 2013, 10:27:34 PM
 #56

Interesting discussion.

solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
March 28, 2013, 11:11:24 PM
 #57

SD is not flooding anything. They're not attacking the network; Bitcoin users want to use their services.
Of all businesses, they're likely the one that has contributed the most to miners via transaction fees.

This is the Circe-like character of SD. It looks attractive but carries great dangers. I am still concerned that this type of transaction source can scale far faster than the Bitcoin network.

Miners have no interest in keeping a "monster block", and they can easily choose not to build on top of such a block unless it is already N blocks deep, which would likely get the monster block rejected by the network.

Consider variance. One hallmark of any successful, complex system is low variance of important intrinsic parameters. The Earth's ecosystem depends upon low variance in climate: e.g. the difference in air pressure between a cyclone and an anticyclone is not a large percentage of 1 atmosphere.
In the case of Bitcoin, a very small block followed by a very large one is an unhealthy sign. A cap will help keep the variance (standard deviation) of block size lower. This must be helpful to all miners, as they know what to expect and can plan accordingly, making incremental changes, which are always safer. A cap helps ensure all miners are on the same page about what is considered an expected block versus an oversized one.


markm
Legendary
*
Offline Offline

Activity: 1988



View Profile WWW
March 29, 2013, 01:55:07 PM
 #58

If there is no cap, there is no amount of resources you can buy and set up and run that will be enough unless you are the top spender, or possibly one of the cartel of top spenders.

A cap means we can know how many millions of dollars each node is going to cost, so that we can start figuring out how many such nodes the world, or a given nation, or a given demographic, or a given multinational corporation, or a given corner store, or a given mid-sized business, or a given chain of grocery stores, etc etc can afford to set up and run.

No cap means those things basically cannot be known, so trying to build a node becomes a massive hole in the budget that you keep throwing money at, yet maybe never manage to throw as much money at as Sprint and Bell and Google and Yahoo and Virgin, so you end up having thrown away all your money for nothing.

So we have to know: are nodes something only the Fortune 25 should be able to afford? Or something even the entire Fortune 500 could afford? Could any of the Fortune 1000 that are not in the Fortune 500 afford it if they happen to be very highly aligned to and optimised for that particular kind of business? Or should they be able to afford it even if it is not really strongly aligned with their existing infrastructure and business?

Those are the kinds of things we need to know, that a lack of cap on block size makes unknowable.

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 29, 2013, 06:38:42 PM
 #59

That doesn't make sense to me. The formula wouldn't need to be arbitrary; it could be based on actual data.

But the formula remains arbitrary. You can't come up with an algorithm capable of measuring actual demand and actual supply, since these quantities are impossible to measure. ...

No, but you can have an algorithm that measures actual data, which is how the difficulty target works. You can measure what happened in the past.

If your sentiment were true the difficulty target wouldn't work.

The difficulty target aims to produce one block every 10 minutes. But why 10 minutes? It is an arbitrary value. It may be too much sometimes, too little at other times. It's certainly not optimal. That said, it's not such a big deal, and trying to improve it would not be worth the risks.

No, the difficulty target being set at 10 minutes is not arbitrary. It may or may not be optimal, but it's not arbitrary. If that value could be set arbitrarily then it could be two weeks, or two years, which of course is not workable for the application.
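To make the "measures actual data" point concrete, here is a minimal Python sketch of Bitcoin's retargeting rule: the 600-second spacing and 2016-block interval are Bitcoin's real constants, but the rest (working in abstract "difficulty" units, the exact clamping form) is a simplification for illustration.

```python
TARGET_SPACING = 600          # seconds per block (the 10-minute target)
RETARGET_INTERVAL = 2016      # blocks between adjustments (~2 weeks)

def retarget(old_difficulty, actual_timespan):
    """Return the new difficulty, given how long the last 2016 blocks took.

    No demand forecasting: the rule only looks at measured past data.
    Bitcoin clamps the adjustment to a factor of 4 in either direction.
    """
    expected = TARGET_SPACING * RETARGET_INTERVAL
    actual = max(expected // 4, min(actual_timespan, expected * 4))
    return old_difficulty * expected / actual

# Blocks came in twice as fast as intended -> difficulty doubles.
print(retarget(1000.0, 600 * 2016 // 2))   # 2000.0
```

The point of the sketch: nothing here tries to guess future demand; it is pure feedback from observed history, which is the analogy acoindr is drawing for a block-size rule.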

Speaking of all this it occurs to me we could have a dynamic cap provide both a limit and non-limit for block size. That may be a workable way to satisfy both camps.

I once had a friend download a movie using BitTorrent  Roll Eyes and noticed the download speed varied from an absolute trickle to a full flood of throughput. Like a race car on a freeway, speeds alternated between open and constricted. I'm pretty sure that's done so "leechers" don't drain "seeder" resources too much, as would naturally happen if transfer channels were left unchecked.

Bitcoin could work the same way. Form a mental picture of the block size beating slowly like a heart. At times the block size could be constricted, allowing small players an equal chance to participate meaningfully. However, that constriction could also be released to allow unlimited throughput.

In the real world that would translate to inconvenience only if and when you needed a transaction with Bitcoin's desirable features (anonymity, irreversibility, etc.) at a time of block size constriction and weren't willing to bid a high enough fee for priority inclusion. You might instead opt for an alternate cryptocurrency or suitable off-chain transaction option. This seems a small price to pay if it makes Bitcoin workable at a global scale.
doobadoo
Sr. Member
****
Offline Offline

Activity: 364


View Profile
March 30, 2013, 03:46:15 PM
 #60

You meant why not enact a percentage fee, right?

The argument is that unless there is a hard block size limit, miners are incentivised to include any transaction no matter how small its fee, because the cost of doing so is practically zero (less than a microdollar, according to Gavin's calculations). Therefore if a bunch of transactions stack up in the memory pool that pay a smaller percentage than "normal", some miner will include them anyway because it costs nothing to do so and maximizes short-term profit. Hence, you get a race to the bottom and you need some kind of hard network rule saying you can't do that. We already have one in the form of block byte size, so the debate becomes "let's keep the size limit" vs "let's remove it".

Not exactly true: very large blocks are slow to transmit and slow for others to process before relaying. This increases the chance that a miner scores a block but it becomes orphaned in favour of a miner with a smaller block. Where this limit kicks in is not yet known. Is it 1MB blocks, 2MB, 10MB, 100MB?
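The orphan-risk argument can be put in rough numbers. The sketch below assumes block arrivals are approximately Poisson with a 600-second mean, so the chance a rival block appears during a propagation delay of t seconds is about 1 - e^(-t/600); the 1 MB/s effective relay rate is an illustrative assumption, not a measured figure.

```python
import math

BLOCK_INTERVAL = 600.0  # mean seconds between blocks

def orphan_risk(block_bytes, net_rate_bytes_per_sec, verify_secs_per_mb=0.0):
    """Rough chance a competitor finds a block while ours still propagates.

    delay = transmit time + optional verification time; the probability of
    at least one rival block during that delay is 1 - exp(-delay/600).
    """
    delay = (block_bytes / net_rate_bytes_per_sec
             + verify_secs_per_mb * block_bytes / 1e6)
    return 1.0 - math.exp(-delay / BLOCK_INTERVAL)

# At an assumed 1 MB/s relay rate, going from 500 KB to 5 MB blocks raises
# the orphan risk roughly tenfold, though both remain small:
for size in (500_000, 5_000_000):
    print(f"{size:>9} bytes: {orphan_risk(size, 1e6):.4f}")
```

This is the self-limiting force doobadoo describes: the penalty grows with size, but where it starts to bite depends entirely on the real-world bandwidth and verification numbers plugged in.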

"It is, quite honestly, the biggest challenge to central banking since Andrew Jackson." -evoorhees
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 30, 2013, 06:45:39 PM
 #61

Actually, forget my earlier "heartbeat" block size. I have a better idea.

...
All that needs to happen is allow the 1MB to be replaced by a capping algorithm which just keeps pace ahead of demand. ...

I think this is right. It's effectively not a cap at all, just like the U.S. debt ceiling. The problem with the debt ceiling is that people, at least until recently, were not paying attention, but there is a check in place: raising the ceiling requires a vote.

Increasing block size could happen the same way, but instead of congressmen ignorant of economics and/or apathetic of votes, miners have financial incentive to vote responsibly.

I think a brilliant idea of Gavin's is this:

A hard fork won't happen unless the vast super-majority of miners support it.

E.g. from my "how to handle upgrades" gist https://gist.github.com/gavinandresen/2355445

Quote
Example: increasing MAX_BLOCK_SIZE (a 'hard' blockchain split change)

Increasing the maximum block size beyond the current 1MB per block (perhaps changing it to a floating limit based on a multiple of the median size of the last few hundred blocks) is a likely future change to accommodate more transactions per block. A new maximum block size rule might be rolled out by:

New software creates blocks with a new block.version
Allow greater-than-MAX_BLOCK_SIZE blocks if their version is the new block.version or greater and 100% of the last 1000 blocks are new blocks. (51% of the last 100 blocks if on testnet)
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different (maybe something like block.timestamp is after 1-Jan-2015 and 99% of the last 2000 blocks are new-version), since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.


Checking for version numbers IMO is how almost all network changes should be handled - if a certain percentage isn't compliant no change happens. Doing this would have prevented the recent accidental hard fork. It's what I call an anti-fork ideology. Either we all move forward the same way or we don't change at all. That's important given the economic aspects of Bitcoin.
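The version-number gate can be sketched in a few lines. The 100%-of-the-last-1000-blocks criterion is Gavin's straw-man from the quoted gist; the function shape and names here are otherwise assumptions for illustration.

```python
def hard_fork_active(recent_versions, new_version=2, threshold=1.0, window=1000):
    """Allow the new rule only once `threshold` of the last `window` blocks
    carry at least the new block.version. With threshold=1.0 this is the
    straw-man "100% of the last 1000 blocks" criterion."""
    tail = recent_versions[-window:]
    if len(tail) < window:
        return False  # not enough history yet
    upgraded = sum(1 for v in tail if v >= new_version)
    return upgraded / window >= threshold

# 999 of 1000 upgraded: not yet.  All 1000: bigger blocks become valid.
print(hard_fork_active([2] * 999 + [1]))   # False
print(hard_fork_active([2] * 1000))        # True
```

The anti-fork property falls out directly: as long as even a small fraction of miners keeps producing old-version blocks, the condition never fires and nothing changes.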

So we use this model also to meter block size. One of the points in the debate is future technological advances can be an accommodating factor for decentralization, but that's unfortunately unknown. No problem, let the block size increase by polling to see what miners can handle.

Think of a train many many boxcars long. Maybe the biggest most impressive boxcars are upfront near the engine powering along, but way back are small capacity cars barely staying connected. To ensure no cars are lost even the smallest car has powerful brakes that can limit the speed of the entire train.

Gavin's earlier thoughts are close:

Quote
.. (perhaps changing it to a floating limit based on a multiple of the median size of the last few hundred blocks) ...

The problem here is that within a network of increasingly centralized mining capacity, the median size of almost any number of blocks will always be too high to account for small-scale miners, allowing larger limits by default.

Instead we make it more like that train. The network checks for the lowest block limit (maybe in 100MB increments) announced by at least, say, 10% of all blocks every thousand blocks (or whatever). It can't be the absolute lowest value found at any given time, since some people will simply not change out of neglect. However, I think 10% or so sends a clear signal that people are not ready to go higher. At the same time, all miners have a financial incentive to allow higher capacity as soon as possible, due to the fees they can collect.

This method would keep the block size ideal for decentralization as long as there was good decentralization of miners. So it's like the 51% attack rationale - centralized miners could only become monopolies by controlling nearly 100% of all blocks found.
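One reading of the "lowest limit announced by at least 10%" rule can be sketched as follows; the quorum, increments, and sample values are all placeholders, and this assumes miners stamp their preferred limit into the blocks they mine.

```python
from collections import Counter

def next_size_limit(announced, quorum=0.10):
    """Lowest limit announced by at least `quorum` of recent blocks.

    Lone stragglers announcing a tiny limit are ignored, but any >=10%
    minority of lower-capacity miners holds the cap at the value they
    announced -- the "smallest boxcar's brakes" from the train metaphor.
    """
    counts = Counter(announced)
    eligible = [limit for limit, n in counts.items()
                if n / len(announced) >= quorum]
    return min(eligible)

# 850 blocks announce 10 MB, 120 announce 5 MB, 30 announce 1 MB:
votes = [10_000_000] * 850 + [5_000_000] * 120 + [1_000_000] * 30
print(next_size_limit(votes))   # 5000000 -- the 3% at 1 MB is ignored
```

Note the monopoly-resistance claim in code terms: to push the limit up against a dissenting minority, large miners would have to produce more than 90% of all blocks, squeezing that minority below the quorum.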
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
March 30, 2013, 08:39:03 PM
 #62

Good points acoindr, but I am not clear on the last paragraphs. (I think you mean 100KB too).

Demand is indeed predictable based upon the last few thousand blocks. Because Bitcoin is a global currency, transaction volumes, over a time period of a week or two, should ebb and flow steadily, like the sea level.

acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
March 30, 2013, 09:33:53 PM
 #63

Good points acoindr, but I am not clear on the last paragraphs. (I think you mean 100KB too).

I didn't think through the math, so the 100MB increment and one thousand block sampling size numbers are only placeholders.

I don't know how often would be best to check for a possible block size increase. I don't think it should be a constant thing. Instead it could be done once per year, or maybe once every 3 months. I think knowing the fixed limit for a significant time period is helpful for planning. So that would mean maybe checking the last one thousand blocks (about 1 week's worth) every 13,400 blocks, which is about every 3 months.

For size of increase, I don't know... I'm thinking 10-100MB (just a guesstimate) which may accommodate even explosive global adoption rates. Remember, this carries into a future of technological capacity we don't yet know.

Demand is indeed predictable based upon the last few thousand blocks. Because Bitcoin is a global currency, transaction volumes, over a time period of a week or two, should ebb and flow steadily, like the sea level.

This actually doesn't care about demand. It only cares about the network capacity at which even miners of lower resources are comfortable continuing to participate. It guards against mining operations evolving into monopolies and oligopolies as a result of an unlimited block size (he who has the highest bandwidth/resources wins), without the automatic crippling of widespread usage a hard limit would ensure.

There may be times when available network capacity doesn't keep pace with total demand, but that simply puts market pressure on increasing network capacity and/or viable alternate channels. At least the entire project isn't wrecked because neither the cap nor the no-cap implementation can gain consensus.

Timo Y
Legendary
*
Offline Offline

Activity: 938


bitcoin - the aerogel of money


View Profile
March 31, 2013, 08:53:09 AM
 #64

There would be tons of freeloaders.

What are freeloaders doing? They are betting that the hashrate will be above their desired value, even if they don't pledge. So why not bring them into the system and let them bet for profit?

If you pledge for a certain hashrate, and the hashrate doesn't materialize, you get back your pledge + x percent profit. The profit comes from the people who pledged for the hashrate that did materialize.  A fraction of their pledge goes to the miners and another fraction is used for betting.
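As a toy model of this pledge-or-bet settlement: winning pledges (whose hashrate target was met) fund the miners, and a slice of them pays the bonus to the pledges that did not materialize. The 80/20 split, the pro-rata bonus, and all figures are assumptions, not part of the proposal.

```python
def settle(pledges, realized_hashrate, miner_share=0.8):
    """Settle a round of hashrate pledges (illustrative parameters).

    pledges: list of (amount_btc, target_hashrate). A pledge 'materialized'
    if the realized hashrate met its target; those pledges pay the miners,
    and the remainder is split pro-rata as the bonus for pledges that lost.
    Returns (paid_to_miners, refunds) with refunds keyed by pledge index.
    """
    winners = [amt for amt, tgt in pledges if realized_hashrate >= tgt]
    losers = [(i, amt) for i, (amt, tgt) in enumerate(pledges)
              if realized_hashrate < tgt]
    pot = sum(winners)
    to_miners = pot * miner_share
    bonus_pool = pot - to_miners
    lost_total = sum(amt for _, amt in losers) or 1.0
    refunds = {i: amt + bonus_pool * amt / lost_total for i, amt in losers}
    return to_miners, refunds

# 10 BTC pledged at 5 TH/s (met), 5 BTC pledged at 20 TH/s (not met):
to_miners, refunds = settle([(10.0, 5e12), (5.0, 20e12)], realized_hashrate=8e12)
print(to_miners, refunds)   # 8.0 {1: 7.0}
```

This makes the incentive visible: the would-be freeloader who bets the hashrate will appear anyway either loses nothing (it did appear, they never pledged) under the original scheme, or here earns a bonus only by putting money into the round.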

GPG ID: FA868D77   bitcoin-otc:forever-d
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
March 31, 2013, 12:21:44 PM
 #65

That sounds a bit like a dominant assurance contract with a twist. My question is why it's better/worth the extra complexity.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
March 31, 2013, 08:48:26 PM
 #66

Quote
Increasing the maximum block size beyond the current 1MB per block (perhaps changing it to a floating limit based on a multiple of the median size of the last few hundred blocks) is a likely future change to accommodate more transactions per block. A new maximum block size rule might be rolled out by:

I did not know Gavin (or anyone) was considering a floating MAX_BLOCK_SIZE; when I suggested it a while back on IRC it went down like a lead balloon.

Anyway, IMO it needs to float and be based on some sensible calculation of previous block sizes over a "long enough" period. Also, I think there needs to be a way to float the min tx fee; this is the other piece that is hard-coded and adjusted by a human "seems about right" judgement to prevent spam transactions. Obviously, as the value of BTC goes higher, what is and isn't considered a spam tx changes.

The two variables max_block_size and min_tx_fee are coupled, though. Maybe a simple LQR controller for a two-variable system could be sufficient to close the loop for stability here?
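A real LQR controller would need a plant model and tuned gains; as a stand-in, here is a toy proportional sketch of the coupled adjustment. Every gain, target, and starting value below is a made-up placeholder, purely to show the two limits moving together off one feedback signal.

```python
def adjust(max_block_size, min_tx_fee, mean_fill_ratio,
           target_fill=0.75, k_size=0.5, k_fee=0.5):
    """Toy proportional controller coupling the two limits.

    If recent blocks run fuller than the target, grow the size cap and
    raise the relay-fee floor together; if they run emptier, shrink both.
    (Not a tuned LQR -- just the coupled-feedback idea in miniature.)
    """
    error = mean_fill_ratio - target_fill
    return (max_block_size * (1 + k_size * error),
            min_tx_fee * (1 + k_fee * error))

# Blocks 95% full: both the cap and the fee floor drift upward.
size, fee = adjust(1_000_000, 0.0001, mean_fill_ratio=0.95)
print(round(size), round(fee, 6))   # 1100000 0.00011
```

The design point this illustrates is the coupling itself: one measured quantity (block fill) drives both hard-coded knobs, replacing the human "seems about right" adjustments with feedback.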

bitlancr
Hero Member
*****
Offline Offline

Activity: 616


View Profile
March 31, 2013, 11:22:01 PM
 #67

Idea: why not increase the hash difficulty for larger blocks? So if a miner wants to pass the 1MB threshold, require an extra zero bit on the hash. There's a big disincentive to include lots of tx dust.
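The proposal's schedule, taken literally, doubles difficulty per extra zero bit. A minimal sketch (the 1MB step is from the post; the per-megabyte doubling schedule beyond the first threshold is an assumed generalization):

```python
import math

def required_difficulty(block_bytes, base=1.0, step=1_000_000):
    """One extra leading zero bit per additional `step` bytes: a block up
    to 1 MB needs the normal target, up to 2 MB double, up to 3 MB
    quadruple, and so on."""
    extra_bits = max(0, math.ceil(block_bytes / step) - 1)
    return base * 2 ** extra_bits

print([required_difficulty(s) for s in (500_000, 1_000_000, 2_000_000, 3_000_000)])
# [1.0, 1.0, 2.0, 4.0]
```

So stuffing a block with dust is only rational if the extra fees at least double for every extra megabyte, which is the disincentive being claimed.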
Frozenlock
Sr. Member
****
Offline Offline

Activity: 434



View Profile
April 01, 2013, 03:20:34 AM
 #68

I very much like the idea of adjusting the difficulty with the block size.

This gives an incentive to keep the blocks small, unless there are enough fees to counteract the additional difficulty.
I.e. for 2x the difficulty, I can make 5x as much profit in fees.
It can also deal more easily with fast surges, such as the week before Christmas.

This maintains pressure to keep the size small, contrary to simply adjusting the limit after X blocks.

From there, the minimum fee required for a transaction to be relayed by the network could be a fraction of the smallest fee in the newest block.
So if that fee was 0.00001 BTC and you try to send with a fee of 0.00000001, your transaction is likely neither included in the next block nor relayed.
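The relay rule described here is a one-liner. The one-half fraction below is an assumed placeholder; the post only says "a fraction".

```python
def min_relay_fee(newest_block_fees, fraction=0.5):
    """Relay-fee floor derived from the chain itself: some fraction
    (assumed 1/2 here) of the smallest fee in the newest block."""
    return fraction * min(newest_block_fees)

def should_relay(tx_fee, newest_block_fees):
    """Nodes drop transactions paying less than the current floor."""
    return tx_fee >= min_relay_fee(newest_block_fees)

fees = [0.0005, 0.0001, 0.001]           # fees in the latest block, in BTC
print(should_relay(0.00000001, fees))    # False -- dust-fee tx is dropped
print(should_relay(0.0001, fees))        # True
```

The appeal of this design is that the floor is market-derived and self-updating: no hard-coded constant needs a human to revise it as the BTC price moves.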
ShadowOfHarbringer
Legendary
*
Offline Offline

Activity: 1470


Bringing Legendary Har® to you since 1952


View Profile
April 01, 2013, 03:26:38 AM
 #69

I very much like the idea of adjusting the difficulty with the block size.

This gives an incentive to keep the blocks small, unless there are enough fees to counteract the additional difficulty.
I.e. for 2x the difficulty, I can make 5x as much profit in fees.
It can also deal more easily with fast surges, such as the week before Christmas.

This maintains pressure to keep the size small, contrary to simply adjusting the limit after X blocks.

From there, the minimum fee required for a transaction to be relayed by the network could be a fraction of the smallest fee in the newest block.
So if that fee was 0.00001 BTC and you try to send with a fee of 0.00000001, your transaction is likely neither included in the next block nor relayed.

This idea seems nice, however I am afraid there will be some hidden consequences.

Can we have a comment on that from a developer?

Sukrim
Legendary
*
Offline Offline

Activity: 2156


View Profile
April 01, 2013, 07:48:58 AM
 #70

That sounds a bit like a dominant assurance contract with a twist. My question is why it's better/worth the extra complexity.
It would encourage betting/bidding as opposed to not betting/bidding. Either you get some small profit or you really paid for the hash rate you were looking for.

However, I'm not too sure it would work out - who would even bet on any realistic difficulty? All you need to do is bet higher than 4 times the current difficulty (impossible to reach) to always get a profit. If everybody does that, though, nobody gets a profit at all.
Probably it would be smart to have an (exponentially? quadratically or cubically diminishing?) larger reward for people who bet close to the difficulty actually reached, or simply do parimutuel betting...

All in all, it seems to me as if you want to do some kind of "penny auction" of mining fees or network costs. As this is about money, the reward is good for anyone contributing, and donations out of pure charity are unlikely/unsustainable, so there need to be rewards or potential rewards for everyone taking part, not only miners, with as few freeloaders as possible. Also, the system needs to be as automatable as possible. Betting might be fun and nice a few times, but if I had to study hash rate diagrams and estimates every week to make sure I either get a small profit or lose the whole wager to secure the network, I guess I'd quickly choose to cash out and leave this behind.

On the other hand, this could be done by an external service to begin with: do parimutuel betting with, for example, a half-cut-off bell-shaped curve for rewards on the difficulty after the next difficulty switch, paying high bettors 10%(?) of the low bets and 90% as pure fee transactions towards each of the 2016 blocks created. Should some blocks be built too fast to include their transactions, just divide the remaining reward by the new count, so the remaining blocks get paid a little more.
It might not be as lucrative as SD, but (if run charitably - only covering hosting + maintenance costs) it could be used by various services as a kind of CSR measure: they either support the miners or make a small profit, which they can then automatically reinvest in the next round, or donate directly to a pool or service they like, or still donate to miners...

https://bitfinex.com <-- leveraged trading of BTCUSD, LTCUSD and LTCBTC (long and short) - 10% discount on fees for the first 30 days with this refcode: x5K9YtL3Zb
Mail me at Bitmessage: BM-BbiHiVv5qh858ULsyRDtpRrG9WjXN3xf
jgarzik
Legendary
*
qt
Offline Offline

Activity: 1470


View Profile
April 01, 2013, 03:36:50 PM
 #71

In general it sounds like an unworkable scheme.  There is definitely no consensus on the block size issue at all.

Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 01, 2013, 03:41:48 PM
 #72

What sounds unworkable? The last post? Or the whole thread?
jgarzik
Legendary
*
qt
Offline Offline

Activity: 1470


View Profile
April 01, 2013, 03:54:39 PM
 #73

Infinite block sizes.

Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 01, 2013, 03:59:39 PM
 #74

Yes, but what's your reasoning for that? What specific thing about using assurance contracts to fund mining with large (or floating capped) block sizes seems unworkable to you?
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 01, 2013, 04:38:09 PM
 #75

Idea: why not increase the hash difficulty for larger blocks? So if a miner wants to pass the 1MB threshold, require an extra zero bit on the hash. There's a big disincentive to include lots of tx dust.

Hmm, so a block with twice the difficulty can have twice the size?  In effect, this allows the block rate to increase faster than once every 10 minutes, by combining multiple headers into a single header.

If you have 1MB of transactions worth 10BTC in fees and another 1MB worth 5BTC (since they have lower tx fees), then it isn't worth combining them.  A double-difficulty block would give you 50% of the odds of winning, but would pay only 15BTC when you win.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
jgarzik
Legendary
*
qt
Offline Offline

Activity: 1470


View Profile
April 01, 2013, 05:50:47 PM
 #76

Idea: why not increase the hash difficulty for larger blocks? So if a miner wants to pass the 1MB threshold, require an extra zero bit on the hash. There's a big disincentive to include lots of tx dust.

Hmm, so a block with twice the difficulty can have twice the size?  In effect, this allows the block rate to increase faster than once every 10 minutes, by combining multiple headers into a single header.

If you have 1MB of transactions worth 10BTC in fees and another 1MB worth 5BTC (since they have lower tx fees), then it isn't worth combining them.  A double-difficulty block would give you 50% of the odds of winning, but would pay only 15BTC when you win.

That actually effectively illustrates part of the difficulty in creating a solution:  our economic reasoning will be clouded for many years by the block subsidy, which will probably dwarf the transaction fees for years to come.  Efficiencies which must exist in the self-supporting, fee-only future are unseen at this time.

Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 01, 2013, 05:56:17 PM
 #77

our economic reasoning will be clouded for many years by the block subsidy, which will probably dwarf the transaction fees for years to come.
Only if by "many" you mean less than two.

http://blockchain.info/charts/transaction-fees?timespan=all&showDataPoints=false&daysAverageString=1&show_header=true&scale=1&address=
bitlancr
Hero Member
*****
Offline Offline

Activity: 616


View Profile
April 01, 2013, 06:15:48 PM
 #78

If you have 1MB of transactions worth 10BTC in fees and another 1MB worth 5BTC (since they have lower tx fees), then it isn't worth combining them.  A double-difficulty block would give you 50% of the odds of winning, but would pay only 15BTC when you win.

OK, so it doesn't work if you pick those particular numbers. How about if doubling the difficulty gave you 4x the space then? Or more?
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 01, 2013, 07:47:50 PM
 #79

Mike, sorry to derail this thread some but creating multiple threads on the same general subject seems unhelpful  Smiley

I've finally had time to consider assurance contracts. The idea is good (e.g. Kickstarter) but it won't work IMO for Bitcoin.

The reason stems from something you pointed out and which I mentioned elsewhere: transactions come in different types. Often in block size discussions we refer to transactions quite generically - we view them only in the sense of data. However, transactions are not created equally. You correctly note some transactions can happily wait days for clearance and others, like micropayments, are not concerned much with double spends.

This line of thinking led me to realize the block size issue should be easily solvable. I'll get to that in a second. The reason assurance contracts won't work for Bitcoin is the participants you imagine might form such contracts won't opt for an inefficient payment channel.

... I think it'd likely work like this - merchants that have noticed that they start seeing double spends when network speed drops below 5 THash/sec would clearly be targeting that amount and no more. ..

If I'm a dentist or coffee shop proprietor, why would I be using the block-chain for transactions? As you note, it's subject to double spends, and at the very least an unpredictable payment confirmation delay which can stretch well over an hour.

As I've posted often before, I see Bitcoin transactions evolving to be handled largely off-chain. Native Bitcoin is not ideal for the majority of the world's transactions and never will be. If I'm a legitimate, lawful business, I care little about anonymous payments and irreversibility for clients. I'd just as soon use "BitcoinPal" to accept bitcoins instantly. If I were to pledge money for an assurance contract, I'd certainly offer it to an entrepreneur who could offer a more elegant and professional payment solution.

So the block size issue should be easy to solve. We'll never need an infinite block size to ensure Bitcoin can handle the world's transactions. The majority of these will prefer not to route through Bitcoin. So why are we insisting they be able to?

Instead, all we need to achieve is usability for those transactions which uniquely value native Bitcoin. Such transactions (like those of Silk Road) should have little problem paying a decent-sized fee for the value the block-chain provides. This solves DoS attacks relying on trivial fee amounts and other block-chain spam from low-value transactions.

As I describe above, we should allow the cap to be set dynamically by polling miners every so often to see when they - even those of lower resources - are comfortable with a higher block size limit. This prevents the formation of mining monopolies while allowing Bitcoin to scale with technological progress.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
April 01, 2013, 11:04:47 PM
 #80

Quote
That actually effectively illustrates part of the difficulty in creating a solution:  our economic reasoning will be clouded for many years by the block subsidy, which will probably dwarf the transaction fees for years to come.  Efficiencies which must exist in the self-supporting, fee-only future are unseen at this time.

Agreed. The block reward incentive drives hash-power alone, whereas the fees-only regime (after circa 2040) is expected to incentivise both hash-power and tx storage. The phase we are entering now will be the gradual transition between the two regimes over approximately the next 25 years.

Maybe an interim solution for an interim situation?

TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 01, 2013, 11:21:05 PM
 #81

OK, so it doesn't work if you pick those particular numbers. How about if doubling the difficulty gave you 4x the space then? Or more?

Maybe, though, if you make it so that you get super-linear reward for linear effort, then everyone will try to make their blocks as large as possible.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
April 01, 2013, 11:56:14 PM
 #82

OK, so it doesn't work if you pick those particular numbers. How about if doubling the difficulty gave you 4x the space then? Or more?

Maybe, though, if you make it so that you get super-linear reward for linear effort, then everyone will try to make their blocks as large as possible.

Lateral thinking is always welcome, but any change which makes you think "hmm, that could have unintended consequences..." requires a great deal of analysis and prototyping before getting near a live release.

Maybe an interim solution for an interim situation?

Plan A: a simple increase of the block size limit based upon the median size of recent blocks (or a similar approach).
Plan B: an optimal block-sizing algorithm OR infinite size allowed. Either case with whatever incentives are determined to work.

Have plan A ready for a new release, at least by the time 500KB blocks are common. Plan B can simmer in the background until a version is demonstrably better than the simple plan A implementation. Only then adopt it.

Frozenlock
Sr. Member
****
Offline Offline

Activity: 434



View Profile
April 01, 2013, 11:57:31 PM
 #83

OK, so it doesn't work if you pick those particular numbers. How about if doubling the difficulty gave you 4x the space then? Or more?

Maybe, though, if you make it so that you get super-linear reward for linear effort, then everyone will try to make their blocks as large as possible.

--> As long as there are users paying the fees for it.

Also, Mike Hearn posted many topics about bitcoin control via blacklisted addresses and tainted coins, all depending upon miner centralization.
This is another reason I would prefer pressure to keep everything as small as possible.

Suppose we have huge miners imposing themselves 5-10x more difficulties than the minimum required by the network. They will make more money, but at the same time the smaller miners can choose to let go the superfluous transactions (and their fees) in exchange of a higher probability to solve the next block. It's like the pools fee: some are OK with getting less money as long as it's more consistent.
bitlancr
April 02, 2013, 09:49:18 AM
 #84

OK, so it doesn't work if you pick those particular numbers. How about if doubling the difficulty gave you 4x the space then? Or more?

Maybe, though, if you make it so that you get super-linear reward for linear effort, then everyone will try to make their blocks as large as possible.

--> As long as there are users paying the fees for it.

Also, Mike Hearn posted many topics about bitcoin control with blacklisted addresses and tainted coins, all depending upon miner centralization.
This is another reason I would prefer a pressure to keep everything as small as possible.

Suppose we have huge miners imposing on themselves 5-10x more difficulty than the minimum required by the network. They will make more money, but at the same time the smaller miners can choose to let go of the superfluous transactions (and their fees) in exchange for a higher probability of solving the next block. It's like pool fees: some are OK with getting less money as long as it's more consistent.

Exactly. It's the law of diminishing returns - all the best paying transactions will be in the first 1MB.
acoindr
April 02, 2013, 07:37:53 PM
 #85

Guys we really need some semblance of a solution forming. I have a feeling it will get far more crowded around here soon.

Quote
Everyone is buzzing about Bitcoin as its total value tops one billion ...

http://www.cnbc.com/id/100609853

I'm thinking the middle ground between cap and unlimited size (i.e. dynamic cap) is our best hope.
Peter Todd
April 02, 2013, 08:01:09 PM
 #86

Guys we really need some semblance of a solution forming. I have a feeling it will get far more crowded around here soon.

Note how the recent run-up in price, and in particular the media coverage about it, has been focusing on Bitcoin as a store of value rather than Bitcoin as a payment system. The media has correctly identified what makes Bitcoin truly special, and it's not by being an irrevocable version of PayPal.

We're an order of magnitude away from filling up blocks, without yet crowding out the low-value stuff at all, and rudimentary off-chain transaction infrastructure already exists in multiple forms. At the same time it's clear that FinCEN may in the future decide to regulate mining directly, so the arguments to keep the blocksize small to allow for anonymous mining are clearly valid; even 1MB blocks are a bit marginal with Tor.

justusranvier
April 02, 2013, 08:10:39 PM
 #87

At the same time it's clear that FinCEN may in the future decide to regulate mining directly, so the arguments to keep the blocksize small to allow for anonymous mining are clearly valid; even 1MB blocks are a bit marginal with Tor.
At worst FinCEN would just drive the pool operators to operate outside the US and business as usual would continue, with US-based pool members routing their connections through VPNs.
acoindr
April 02, 2013, 08:13:09 PM
 #88

... so the arguments to keep the blocksize small to allow for anonymous mining are clearly valid; even 1MB blocks are a bit marginal with Tor.

Well, that works under my proposed solution, which is a dynamic cap that increases only if there isn't significant objection from the field of miners to an increase. That way, as more bandwidth becomes available in the future, even for Tor users, the cap can be raised as opposition to increased size dies down.
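A minimal sketch of this veto-style poll: the 'yes'/'no' coinbase tags and the 20% objection threshold are hypothetical choices, not anything proposed concretely in the thread:

```python
def should_raise_cap(coinbase_votes, veto_fraction=0.2):
    """Raise the block size cap only if fewer than veto_fraction of
    recent blocks carried an explicit 'no' objection. Assumes miners
    tag their coinbases 'yes'/'no'; both details are hypothetical."""
    no_votes = sum(1 for v in coinbase_votes if v == "no")
    return no_votes / len(coinbase_votes) < veto_fraction
```

Note the asymmetry: unlike a majority vote, a modest objecting minority is enough to block the increase.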
Peter Todd
April 02, 2013, 08:24:24 PM
 #89

... so the arguments to keep the blocksize small to allow for anonymous mining are clearly valid; even 1MB blocks are a bit marginal with Tor.

Well, that works under my proposed solution, which is a dynamic cap that increases only if there isn't significant objection from the field of miners to an increase. That way, as more bandwidth becomes available in the future, even for Tor users, the cap can be raised as opposition to increased size dies down.

...which means an attacker has strong incentives to buy, rent, or coerce hashing power to vote for a blocksize increase. For instance, if FinCEN were devious, part of regulating mining would be forcing miners to vote for a blocksize increase; on-chain transactions, as opposed to off-chain ones, inherently meet their goals of financial transparency and restricting privacy. Or this may be something as subtle as funneling money into tx fees for lots of low-value tx's, perhaps with unspendable but non-prunable outputs to further increase the cost of running a mining node well into the future.

Fundamentally miners are not representative of Bitcoin and the choice of what the blocksize needs to be must not be under their control.

acoindr
April 02, 2013, 08:44:43 PM
 #90

...which means an attacker has strong incentives to buy, rent, or coerce hashing power to vote for a blocksize increase.

I don't believe that would work, at least not any more than an attacker doing the same thing to engineer a 51% attack. It doesn't work by amassing enough 'yes' votes (it's assumed resourceful miners always want an increase). Rather it polls to see if there are significant 'no' votes.

... on-chain transactions, as opposed to off-chain ones, inherently meet their goals of financial transparency and restricting privacy.

Really? I wouldn't think so considering the success of Silk Road.
Peter Todd
April 02, 2013, 08:56:41 PM
 #91

...which means an attacker has strong incentives to buy, rent, or coerce hashing power to vote for a blocksize increase.

I don't believe that would work, at least not any more than an attacker doing the same thing to engineer a 51% attack. It doesn't work by amassing enough 'yes' votes (it's assumed resourceful miners always want an increase). Rather it polls to see if there are significant 'no' votes.

It's impossible to ask for a significant minority of 'no' votes to be your decision point, because a majority of 'yes' votes can decide to only mine on top of blocks that do not vote 'no'.

All you can measure is a majority either way.

... on-chain transactions, as opposed to off-chain ones, inherently meet their goals of financial transparency and restricting privacy.
Really? I wouldn't think so considering the success of Silk Road.

On-chain transactions have some privacy, and privacy that can be taken away by regulating miners. Off-chain transactions with Chaum tokens can have mathematically provable privacy, simply unattainable by on-chain transactions.

acoindr
April 02, 2013, 09:07:31 PM
 #92

It's impossible to ask for a significant minority of 'no' votes to be your decision point, because a majority of 'yes' votes can decide to only mine on top of blocks that do not vote 'no'.

That requires collusion, though, just like a 51% attack. Any attacker that can convince a significant percentage of miners to behave a certain way can disrupt Bitcoin. That happened recently when Gavin played the role of attacker and convinced enough miners to introduce forking software (aka version 0.8 ).

On-chain transactions have some privacy, and privacy that can be taken away by regulating miners.

I don't see how you get that. Not that I have anything to do with Silk Road but I, for example, have wallet addresses that can't be linked to me in any possible way, no matter how much you analyze block-chain info. Regulating miners can't change that.
Peter Todd
April 02, 2013, 09:22:33 PM
 #93

That requires collusion, though, just like a 51% attack. Any attacker that can convince a significant percentage of miners to behave a certain way can disrupt Bitcoin. That happened recently when Gavin played the role of attacker and convinced enough miners to introduce forking software (aka version 0.8 ).

"God damn it, why won't that 5% of miners stop putting '1MB' in their coinbases. They're holding up progress!

Fine, lets just block those stuck pigs, we're not colluding or anything."

I don't see how you get that. Not that I have anything to do with Silk Road but I, for example, have wallet addresses that can't be linked to me in any possible way, no matter how much you analyze block-chain info. Regulating miners can't change that.

"FinCEN: From now on miners are money transmitters and must report all unusual transactions.

FinCEN: From now on miners must not mine transactions over xBTC if they satisfy the criteria of unusual.

FinCEN: From now on miners must also not build upon blocks that have transactions that they themselves would not be able to mine due to our regulations."

VPNs and offshore jurisdictions are irrelevant in this scenario - Bitcoin hasn't been banned, so we can fully expect mining pools to stay in the US and other visible, regulated countries to take advantage of the cheap, high-bandwidth internet connections required to run a pool.

tl;dr: Don't be naive.

Mike Hearn
April 02, 2013, 10:05:44 PM
 #94

I think you are massively over-estimating the difficulty of mining anonymously (as is usual with these debates).

Firstly, there is no particular reason mining on a pool requires a lot of bandwidth. Stratum with high-difficulty shares already cuts bandwidth usage very low.

For running the full node/pool itself, resource requirements are low. You can connect to the P2P network to gather transactions just like any other client. Nobody knows you are mining, even without Tor, as your behaviour is indistinguishable from any other node. Nodes can synchronise their mempools at startup and then know what each peer's preferred block contents are likely to be: once solved, a block can be transmitted as a delta against the expected contents. This has the side effect of increasing bandwidth requirements for people who want to mine empty blocks, which is satisfying.
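A minimal sketch of that delta scheme, with txids standing in for full transactions and a plain dict as a stand-in mempool:

```python
def encode_block_delta(block_txids, peer_mempool):
    """Relay a solved block as its txid ordering plus only the
    transactions the peer's mempool is missing (usually very few,
    if mempools were synchronised at startup)."""
    missing = [t for t in block_txids if t not in peer_mempool]
    return {"order": block_txids, "missing": missing}

def decode_block_delta(delta, my_mempool):
    """Rebuild the full block from the local mempool plus the delta.
    Here a txid doubles as its own transaction body for brevity; a
    real node would fetch the missing transaction bodies."""
    pool = dict(my_mempool)
    pool.update({t: t for t in delta["missing"]})
    return [pool[t] for t in delta["order"]]
```

The bandwidth saving comes from `missing` being tiny relative to the block: a miner of empty or heavily non-standard blocks loses that advantage.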

However, even without that change, I can't see a time when mining anonymously is impossible. Your argument, as always, relies on the assumption that "things" cannot possibly scale up, where the thing here is Tor. Tor can scale, so even if mining ends up requiring a lot of bandwidth it doesn't change anything.

Honestly Peter, to be blunt I long ago concluded you're just working backwards from your preferred outcome. You don't want Bitcoin to scale up, and you will continue to invent ever more convoluted and baseless theories as to why it can't, until the end of time. Convincing you doesn't seem to be possible.
Peter Todd
April 02, 2013, 11:28:47 PM
 #95

Mike, let's suppose for a second that I am correct, and Tor bandwidth and other anonymity technologies do not scale with the bandwidth possible at VPN/co-location providers, and thus running mining (not hashing) operations is not possible anonymously.

What do you expect to happen?

Mike Hearn
April 02, 2013, 11:34:55 PM
 #96

Not much?

In the event that some jurisdictions with aggressive regulators try to impede mining, it will migrate elsewhere. Only if all mining is made illegal everywhere would it be a problem, and that's equivalent to a worldwide ban on Bitcoin, at which point it doesn't matter anymore.
jgarzik
April 02, 2013, 11:50:02 PM
 #97

Nobody knows you are mining, even without Tor, as your behaviour is indistinguishable from any other node.

Not true.  Sites already track the first node relaying a particular block.  Targeted observation (wiretap) makes the activity even more transparent, when you find a block.


Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
marcus_of_augustus
April 02, 2013, 11:54:59 PM
 #98

I'm missing the jump from block size limits to Bitcoin concerning itself with enabling anonymous mining?

Mike Hearn
April 03, 2013, 12:03:11 AM
 #99

Not true.  Sites already track the first node relaying a particular block.  Targeted observation (wiretap) makes the activity even more transparent, when you find a block.

I was referring to the transaction-gathering stage, which is arguably the expensive part, bandwidth-wise. If blocks are represented efficiently (e.g., as a list of hashes or a delta against the remote side's expected block) then almost all your bandwidth would go on receiving transaction broadcasts, and that doesn't require Tor.
Frozenlock
April 03, 2013, 12:31:20 AM
 #100

I'm missing the jump from block size limits to Bitcoin concerning itself with enabling anonymous mining?

It's an implied jump ;)

If mining requires a lot of resources (bandwidth and storage), it could be hard to maintain a decentralized network.
With only a handful of miners, it's easier for any would-be hero of this world to try to coerce them for 'the greater good'.
Peter Todd
April 03, 2013, 12:36:10 AM
 #101

Not much?

In the event that some jurisdictions with aggressive regulators try to impede mining, it will migrate elsewhere. Only if all mining is made illegal everywhere would it be a problem, and that's equivalent to a worldwide ban on Bitcoin, at which point it doesn't matter anymore.

So, in the countries where mining, and presumably Bitcoin in general, has been banned, do you think Bitcoins will have any value?


I was referring to the transaction-gathering stage, which is arguably the expensive part, bandwidth-wise. If blocks are represented efficiently (e.g., as a list of hashes or a delta against the remote side's expected block) then almost all your bandwidth would go on receiving transaction broadcasts, and that doesn't require Tor.

...which means if I want to prevent mining behind Tor, all I have to do is fill my blocks with transactions that I haven't relayed to them. On the other hand, if it's a rule that you only mine to extend blocks whose transactions were broadcast first, anything that merely slows transaction propagation, like a bunch of fake nodes, even far short of 100% coverage, causes orphan rates to jump. Not to mention I can make it very risky, due to orphaning, to mine blocks containing transactions that a large fraction of the nodes out there have decided to blacklist for whatever reason. You also create new forking risks if transaction propagation is ever prevented when block propagation isn't.

acoindr
April 03, 2013, 05:23:46 PM
 #102

Honestly Peter, to be blunt I long ago concluded you're just working backwards from your preferred outcome. You don't want Bitcoin to scale up, and you will continue to invent ever more convoluted and baseless theories as to why it can't, until the end of time. Convincing you doesn't seem to be possible.

You know, before I read this I had a question mark pop up when Peter brought up Chaum signatures in one of these threads. I didn't see an immediate connection to a solution. Considering he wrote a detailed post on Chaum banks, I wondered if personal preference was part of his outlook on the block size issue.

No one person owns Bitcoin, so people are free to push opinions for whatever reasons they wish, but I'm also free to suggest the community ignore development input from any such people (I don't know what retep's motivations ultimately are).

Guys, let me throw an idea out. What about a one-time increase to something like 50MB? Let the market work out the rest (off-chain options, alt-coins, etc.). I don't think we'll find an optimum solution - Bitcoin's design may not include one - but we do need something a majority of people can live with, and I think we need a general consensus on it soon.

Peter Todd
April 03, 2013, 07:07:31 PM
 #103

You know, it says a lot when your opponents resort to attacking you rather than your ideas:

Honestly Peter, to be blunt I long ago concluded you're just working backwards from your preferred outcome. You don't want Bitcoin to scale up, and you will continue to invent ever more convoluted and baseless theories as to why it can't, until the end of time. Convincing you doesn't seem to be possible.

You know, before I read this I had a question mark pop up when Peter brought up Chaum signatures in one of these threads. I didn't see an immediate connection to a solution. Considering he wrote a detailed post on Chaum banks, I wondered if personal preference was part of his outlook on the block size issue.

solex
April 03, 2013, 07:19:59 PM
 #104

Peter, what is your opinion here:

Should complementary off-chain solutions develop organically on their own merits, taking loading off the Bitcoin blockchain, or should Bitcoin be crippled in the hope and expectation that complementary services develop around it faster?

acoindr
April 03, 2013, 07:25:52 PM
 #105

You know, it says a lot when your opponents resort to attacking you rather than your ideas:

Honestly Peter, to be blunt I long ago concluded you're just working backwards from your preferred outcome. You don't want Bitcoin to scale up, and you will continue to invent ever more convoluted and baseless theories as to why it can't, until the end of time. Convincing you doesn't seem to be possible.

You know, before I read this I had a question mark pop up when Peter brought up Chaum signatures in one of these threads. I didn't see an immediate connection to a solution. Considering he wrote a detailed post on Chaum banks, I wondered if personal preference was part of his outlook on the block size issue.

In the post I quoted Mike Hearn from he directly addressed your arguments. I have done so as well.

I'm not attacking anyone and I'm not anyone's opponent. I'm looking at all possible avenues - cap, no cap, dynamic cap, etc. - objectively. However, I think it's also helpful to put all information, including suspicions on opinions, out in the open. Please feel free to do the same for me, for example.
Peter Todd
April 03, 2013, 07:39:35 PM
 #106

Peter, what is your opinion here:

Should complementary off-chain solutions develop organically on their own merits, taking loading off the Bitcoin blockchain, or should Bitcoin be crippled in the hope and expectation that complementary services develop around it faster?

Making the blocksize large as a solution will cripple Bitcoin with centralization and lack of anonymity.

The most revolutionary thing about Bitcoin is that it is a truly decentralized store of value; 1MB will be enough to act as a store of value for the foreseeable future.


I'm not attacking anyone and I'm not anyone's opponent. I'm looking at all possible avenues - cap, no cap, dynamic cap, etc. - objectively. However, I think it's also helpful to put all information, including suspicions on opinions, out in the open. Please feel free to do the same for me, for example.

If you want to sink to that level, fine.

What does Mike's employer, Google, stand to gain from large blocks that only large companies can afford to process and validate? What does Google stand to gain from a system where every last transaction is recorded on a public blockchain, ripe for datamining? Mike, after all, works for a company that has a "real names" policy and actively tries to ensure users cannot use its services anonymously. Keep in mind Mike is also being paid by Google to work on Bitcoin; 20% time projects, while often speculative, are approved by management and must relate to Google's business interests in some fashion.

justusranvier
April 03, 2013, 07:45:47 PM
 #107

Making the blocksize large as a solution will cripple Bitcoin with centralization and lack of anonymity.
Leaving the block size limited to 1 MB will create centralization and lack of anonymity.

With a fixed block size, most people in the world won't be able to conduct transactions directly on the blockchain - they will be forced to route their transactions through the few privileged entities which can. There's your centralization and lack of anonymity.

Large blocks allow more transactions, which means more transaction fees, which means more revenue for mining as an industry, which means more players can afford to enter the market. That causes decentralization.

The most revolutionary thing about Bitcoin is that it is a truly decentralized store of value;
"Store of value" is an economic myth. Value is not a thing which can be stored.
acoindr
April 03, 2013, 08:13:01 PM
 #108
 #108

If you want to sink to that level, fine.

What does Mike's employer, Google, stand to gain from large blocks that only large companies can afford to process and validate? What does Google stand to gain from a system where every last transaction is recorded on a public blockchain, ripe for datamining? Mike, after all, works for a company that has a "real names" policy and actively tries to ensure users cannot use its services anonymously. Keep in mind Mike is also being paid by Google to work on Bitcoin; 20% time projects, while often speculative, are approved by management and must relate to Google's business interests in some fashion.

I'm sorry, but the above seems incredibly thin as a rationale that Mike is arguing in a biased fashion, motivated by Google's interests. That's just my opinion.

Leaving the block size limited to 1 MB will create centralization and lack of anonymity.

With a fixed block size, most people in the world won't be able to conduct transactions directly on the blockchain ...

Let me ask you something. Do you think most people in the world would want to? If so, why?
justusranvier
April 03, 2013, 08:14:30 PM
 #109

Let me ask you something. Do you think most people in the world would want to?
I don't know and neither do you.

So let's not take the option away from them before we find out.
acoindr
April 03, 2013, 08:20:10 PM
 #110

Let me ask you something. Do you think most people in the world would want to?
I don't know and neither do you.

So let's not take the option away from them before we find out.

Well, let me ask it another way. If you could send bitcoins to any person or business now, like you currently can with the block-chain, but the option was available with a PayPal-like interface and immediacy, which would you choose for common legal transactions? You really think that's unpredictable?
justusranvier
April 03, 2013, 08:26:12 PM
 #111

If you could send bitcoins to any person or business now, like you currently can with the block-chain, but the option was available with a PayPal-like interface and immediacy, which would you choose for common legal transactions?
This is a false dichotomy. The same interface can be built for either type of system and in practice transaction confirmation time in Bitcoin isn't a problem.
acoindr
April 03, 2013, 08:32:13 PM
 #112

This is a false dichotomy. The same interface can be built for either type of system and in practice transaction confirmation time in Bitcoin isn't a problem.

I agree about the interface. I was mostly referring to confirmation time. When you say in practice Bitcoin confirmation time (which is unpredictable and can stretch over an hour) isn't a problem are you speaking for yourself or everyone?
justusranvier
April 03, 2013, 08:37:26 PM
 #113

I agree about the interface. I was mostly referring to confirmation time. When you say in practice Bitcoin confirmation time isn't a problem are you speaking for yourself or everyone?
People are conducting transactions using Bitcoin, and complaints about transaction times only arise in a tiny minority of instances, therefore transaction time is not a problem.

In addition to this, Bitcoin confirmation is faster than any other means of online transaction, when you compare like to like. An "instant" PayPal transfer is reversible for at least several months, and probably indefinitely. Bitcoin transactions also show up within a matter of seconds and are typically irreversible within an hour.

Assuming equivalent UIs, what theoretical advantages does your third-party off-chain service provide that zero-confirmation Bitcoin transactions don't already provide?
acoindr
April 03, 2013, 08:54:26 PM
 #114

People are conducting transactions using Bitcoin, and complaints about transaction times only arise in a tiny minority of instances, therefore transaction time is not a problem.

Actually, I've noticed more and more threads lately about confirmation times stretching over an hour, I think due to Bitcoin growing in general, hence more transactions. Aside from that, if you poll the community on whether they would rather have instantly trustworthy transfers or deal with confirmations, I'd be willing to put bitcoins on what the result would be.

In addition to this, Bitcoin confirmation is faster than any other means of online transaction, when you compare like to like.

Please be practical. We're not out to dissect bitcoin and financial services. We're simply asking what people would likely prefer to do in the future.

An "instant" PayPal transfer is reversible for at least several months, and probably indefinitely.

The thing about reversibility is not important for the majority of transactions, which are largely legal. If you pay a dentist, or for a cup of coffee are you really thinking reversibility will be an issue? Also, a company can offer a no-reverse transaction option.

Bitcoin transactions also show up within a matter of seconds and are typically irreversible within an hour.

A Bitcoin transaction merely showing up is not considered by the community (for good reason) to be a good indicator a transaction is valid. I've addressed reversibility above.

Assuming equivalent UIs, what theoretical advantages does your third-party off-chain service provide that zero-confirmation Bitcoin transactions don't already provide?

Assurance the transaction is valid.

Although I've got other things to do I'm happy to continue addressing your concerns to try and convince you, because I think I'm right. However, if you are unwilling to be convinced please let me know so we can save both of us and this thread some time.
Peter Todd
April 03, 2013, 11:21:23 PM
 #115

Assuming equivalent UIs, what theoretical advantages does your third-party off-chain service provide that zero-confirmation Bitcoin transactions don't already provide?

Privacy; and the best way to achieve that, Chaum signatures, is inherently irreversible and instant as well.

solex
April 03, 2013, 11:59:07 PM
 #116

Assuming equivalent UIs, what theoretical advantages does your third-party off-chain service provide that zero-confirmation Bitcoin transactions don't already provide?

Privacy; and the best way to achieve that, Chaum signatures, is inherently irreversible and instant as well.

Accepting this as true, for which you present a strong case in your thread on trusted banks, then this is indeed a complementary service which may well attract a significant user base in the future. It may even succeed in handling 90% of the transaction volume which would otherwise hit the main blockchain. Is that your optimistic scenario?

However, and correct me if I am wrong, such a trusted banking service does not exist yet. Not even in prototype form, let alone one that can rapidly substitute for blockchain transactions. Acceptance of new services like this will take some time - at least a few years, surely. The people on this forum are ahead of the masses in bitcoin usage, yet they universally appreciate that their holdings are stored on thousands of nodes worldwide. How many of us would quickly and permanently move our bitcoin holdings to one single service instead of having them stored directly on the main chain?

You use the word "trust", but it takes time to earn it. It has taken Bitcoin four years to earn the trust that is fueling its success today.
I argue that there is not enough time left for that level of trust to be earned by complementary services before the 1MB arbitrary constant becomes as effective as any DDoS attack in the history of Bitcoin.

Please consider this chart and let us know, in your considered opinion, whether trusted banks will be fully ready, with a proven track record, before its blue line reaches 345,600.

https://blockchain.info/charts/n-transactions?showDataPoints=false&show_header=true&daysAverageString=7&timespan=&scale=1&address=
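For reference, the 345,600 ceiling corresponds to 1MB blocks packed with transactions averaging roughly 417 bytes; that average size is an assumption for illustration, and real averages vary:

```python
BLOCK_LIMIT_BYTES = 1_000_000  # the current 1 MB hard cap
BLOCKS_PER_DAY = 144           # one block every 10 minutes
AVG_TX_BYTES = 417             # assumed average transaction size

# Daily throughput ceiling before every block is full.
max_txs_per_day = BLOCK_LIMIT_BYTES // AVG_TX_BYTES * BLOCKS_PER_DAY
print(max_txs_per_day)  # roughly 345,000 transactions per day
```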



marcus_of_augustus
April 04, 2013, 01:55:59 AM
 #117
 #117

The solution that allows for the most number of connected nodes must be the preferred one.

Whether those nodes are mining or not seems immaterial at this point as mining is already a highly specialised endeavour. There are already 'pooled' blockchain storage solutions like electrum available.

It seems to me that the blockchain storage that your average commodity desktop/laptop PC can tolerate for the foreseeable future must be the over-riding determiner of maximum block size so that the most number of transaction transmitting, relaying nodes can realistically remain connected to the network.

solex
April 04, 2013, 02:13:49 AM
 #118

The solution that allows for the most number of connected nodes must be the preferred one.

Whether those nodes are mining or not seems immaterial at this point as mining is already a highly specialised endeavour. There are already 'pooled' blockchain storage solutions like electrum available.

It seems to me that the blockchain storage that your average commodity desktop/laptop PC can tolerate for the foreseeable future must be the over-riding determiner of maximum block size so that the most number of transaction transmitting, relaying nodes can realistically remain connected to the network.


Miners obtain revenue from the block reward and fees. Non-mining nodes benefit indirectly: by owning a long-term bitcoin holding. No one can complain if they are left behind when they freeload. If a non-mining node wants to stay fully connected then they should be prepared to spend a tiny amount of their holding on computer hardware to achieve it.

https://www.bitcoinstore.com/fantom-greendrive-2-tb-external-hard-drive.html

1.2 BTC to store a Bitcoin blockchain which has 690 days of 20MB blocks in it!

Not much of an obligation.
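For what it's worth, the arithmetic behind that claim checks out in a couple of lines (a sketch assuming 144 blocks a day and decimal units, i.e. 1 TB = 10^6 MB):

```python
# Sketch: how many days of full 20 MB blocks fit on the 2 TB drive above?
BLOCKS_PER_DAY = 144              # one block every ~10 minutes
block_size_mb = 20                # hypothetical raised block size limit
drive_mb = 2_000_000              # 2 TB in decimal megabytes

daily_growth_mb = block_size_mb * BLOCKS_PER_DAY   # 2880 MB per day
days = drive_mb / daily_growth_mb
print(round(days))                # ~694 days, close to the 690 quoted
```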

justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 04, 2013, 02:21:37 AM
 #119

Although I've got other things to do I'm happy to continue addressing your concerns to try and convince you, because I think I'm right. However, if you are unwilling to be convinced please let me know so we can save both of us and this thread some time.
I was unable to find a single point in your post that wasn't either specious or a failure to address what you were replying to, so unless you've got something more substantial than that it's probably not worth anyone's time.
markm
Legendary
*
Offline Offline

Activity: 1988



View Profile WWW
April 04, 2013, 03:17:29 AM
 #120

Maybe we are seeing parts of why the world does not already use just one currency universally.

Right now I have my *coin holdings in full form, the actual blockchains, on my own machine.

I can spend them to anyone who has a machine no more powerful than my own, by having them run the daemon of the particular variety of coin they wish me to spend to them, and providing them myself, directly, with the entire blockchain that validates the entire thing all the way back to The Block of Genesis.

I might be very very wealthy when whichever coin or coins become vast mass-market enterprises mined by massive enterprises on the scale of Google, but will I really feel that my private keys to coins on big brother's chain are really as good and useful as the keys I have now to chains held in private hands in private homes throughout many nations? Even now, if I want to buy, say, two hundred and fifty bitcoins' worth of perfectly legal goods, such as maybe some computer gear to run some nodes on, can I actually do that using these small chains hidden in private homes everywhere? Or is big brother already reaching for it, maybe already holding over bitcoinstore's or bitcoinshop's heads a bunch of threats that they must track who I am, and maybe not even tell me that my information is being given to big brother, and so on?

How much worse is it going to be when I have to submit my private keys for approval to some megacorp that is the only entity with access to the blockchain? Not because only a big entity can afford the transaction fees to get onto the blockchain, but because only a big entity can actually mine the thing, thus only the big entity can choose which private keys are even permitted onto the blockchain or to be used to move values on the blockchain, and to where they are allowed to move how much value, and how many permits are needed, and how many copies of apostilled passports, and so on?

Maybe, if there is a dichotomy between small and secret storage and private transfer out of big brother's view on one hand, and a mass-market megablockchain fully subject to big brother control on the other, we should just go ahead and admit litecoin already has bitcoin beat for sheer number of transactions and sheer speed of confirmation, and focus the unlimited-blocks ideas on litecoin and the keeping of one's money out of big brother's hands on bitcoin?

Plus, if both are too big for some uses, what the heck, we still have namecoin and ixcoin and devcoin and i0coin and coiledcoin, all merge-mineable alongside bitcoin, so any one or more of those might happen to still be small enough for some time to come. Even as more and more of them bloat more and more, it still might be a year or two before we need to add a bunch more chains to the merge to ensure small people, private people, still have bitcoin-type technology scaled suitably to their needs?

The problem here, though, that I have already seen, is that despite all the verbiage about independence from big brother, bitcoiners actually have a massive tendency to deliberately not help keep the smaller chains - less of a target for big brother so far, due to their smaller market caps - strong in hashing power...

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 04, 2013, 10:37:31 AM
 #121

Oh goodie, more Google conspiracy theories. Actually I never had to ask for approval to use 20% time on Bitcoin. That's the whole point of the policy - as long as there's some justifiable connection to the business, you can do more or less whatever you want with it and managers can't tell you not to unless it's clearly abusive. That's how we ensure it's usable for radical (i.e. unexpected) innovation.

But even if I was being paid to work on Bitcoin full time by Google, the idea that I'd want Bitcoin to grow and scale up as part of some diabolical corporate master plan is stupid. Occam's Razor, people! The simplest explanation for why I have worked so hard on Bitcoin scalability is that I want it to succeed, according to the original vision laid down by Satoshi. Which did not include arbitrary and pointless limits on its traffic levels.

The idea that Bitcoin can be a store of value with a 1mb block size limit seems like nonsense to me. That's reversing cause and effect. Bitcoin gained value because it was useful, it didn't gain use because it had value - that can't be the case because it started out with a value of zero. So if Bitcoin is deliberately crippled so most people can't use it, it will also cease to have much (if any) value. You can't have one without the other. The best way to ensure Bitcoin is a solid store of value is to ensure it's widely accepted and used on an everyday basis.

If Bitcoin was banned in a country then I think it's obvious its value would be close to zero. This is one of the most widely held misconceptions about Bitcoin, that it's somehow immune to state action. A currency is a classic example of network effects: the more people use it, the more useful it becomes. But it goes without saying that you have to actually know other people are using it to be able to use it yourself. If there were immediate and swift punishment of anyone who advertised acceptance of coins or interacted with an exchange, you would find it very hard to trade, and coins would be useless/valueless in that jurisdiction.

The reason I'm getting tired of these debates is that I've come to agree with Gavin - there's an agenda at work and the arguments are a result of people working backwards from the conclusion they want to try and find rationales to support it.

Every single serious point made has been dealt with by now. Let's recap:

  • Scalability leads to "centralization". It's impossible to engage in meaningful debate with people like Peter on this because they refuse to get concrete and talk specific numbers for what they'd deem acceptable. But we now know that with simple optimisations that have been prototyped or implemented today, Bitcoin nodes can handle far more traffic than the world's largest card networks on one single computer; what's more, a computer so ordinary that our very own gmaxwell has several of them in his house. This is amazing - all kinds of individuals can, on their own, afford to run full nodes without any kind of business subsidisation at all, including bandwidth. And it'll be even cheaper tomorrow.
  • Mining can't be anonymous if blocks are large. Firstly, as I already pointed out, if mining is illegal in one place then it'll just migrate to other parts of the world, and if it's illegal everywhere then it's game over and Bitcoin is valueless anyway, so at that point nobody cares anymore. But secondly, this argument is again impossible to really grapple with because it's based on an unsupported axiom: that onion networks can't scale. Nobody has shown this. Nobody has even attempted to show this. Once again, it's an argument reached by working backwards from a desired conclusion.
  • Mining is a public good and without artificial scarcity it won't get funded. This is a good argument but I've shown how alternative funding can be arranged via assurance contracts, with a concrete proposal and examples in the real world of public goods that get funded this way. It'll be years before we get to try this out (unless the value of Bitcoin falls a lot), but so far I haven't seen any serious rebuttals to this argument. The only ones that exist are of the form, "we don't have absolute certainty this will work, so let's not try". But it's not a good point because we have no certainty the proposed alternatives will work either, so they aren't better than what I've proposed.

Are there any others? The amount of time spent addressing all these arguments has been astronomical and at some point, it's got to be enough. If you want to continue to argue for artificial scaling limits, you need to get concrete and provide real numbers and real calculations supporting that position. Otherwise you're just peddling vague fears, uncertainties and doubts.
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
April 04, 2013, 10:40:07 AM
 #122

Accepting this as true, for which you present a strong case in your thread on trusted banks, then this is indeed a complementary service which may well attract a significant user base in the future. It may even succeed in handling 90% of the transaction volume which would otherwise hit the main blockchain. Is that your optimistic scenario?

However, and correct me if I am wrong, but such a trusted banking service does not exist yet. Not even in a prototype form, let alone one that can rapidly substitute for blockchain transactions. Acceptance of new services like this will take some time, at least a few years, surely. The people on this forum are ahead of the masses on bitcoin usage, yet they universally appreciate that their holding is stored on thousands of nodes worldwide. How many of us would quickly and permanently move our bitcoin holding to one single service instead of having it stored directly on the main chain?

We already have off-chain transactions, for instance Mt. Gox, and AurumXchange's code systems. Similarly web wallets allow for transfers from one user to another directly; MPOE reports that quite large amounts of BTC are exchanged between users every day. The Silk Road is another example of off-chain transactions - deposits go into a big shared wallet and transfers within the system are totally off-chain. In that case there are significant privacy advantages, not to mention how it makes implementing escrow easy.

There isn't going to be a single service that does this; that's my whole point: if you achieve scalability by just raising the blocksize, you wind up with all your trust in a tiny number of validating nodes and mining pools. If you achieve scalability through off-chain transactions, you will have many choices and options. I'm sure the users of the Silk Road have very different ideas about who they can trust than users of BitPay...

You use the word "trust", but it takes time to earn it. It has taken Bitcoin four years to earn the trust that is fueling its success today.
I argue that there is not enough time left for that level of trust to be earned by complementary services before the 1MB arbitrary constant becomes as effective as any ddos attack in the history of bitcoin.

Please consider this chart and let us know, in your considered opinion, whether trusted banks will be fully ready, with a proven track record, before its blue line reaches 345,600.

https://blockchain.info/charts/n-transactions?showDataPoints=false&show_header=true&daysAverageString=7&timespan=&scale=1&address=

Off-chain transactions aren't something that will be implemented in one big go. Like I said above, the markets can adjust naturally, bit by bit moving uses of Bitcoin from on-chain to off-chain as fees gradually increase. At worst, growth in the least important, lowest-value parts of the Bitcoin economy is slowed while off-chain tx solutions catch up.

On the other hand, if the blocksize is raised and it leads to centralization, Bitcoin as a decentralized currency will be destroyed forever.

It might not be sexy and exciting, but like it or not, leaving the 1MB limit in place for the foreseeable future is the sane, sober and conservative approach. Exactly what I want from the developers of the brand-new and still poorly understood technology underpinning Bitcoin's $1.5 billion market cap. For that matter, I personally have a good chunk of my wealth tied up in Bitcoins and want them to be valuable in the long run.

You know, I work in engineering at a company filled with ex-aerospace guys, guys who are careful and try not to get people killed by things they don't understand. They all find my work on Bitcoin interesting, and a big part of that is because of how crazy high-stakes it is. If anything, Bitcoin is kinda like being given alien technology that happens to be able to make a plane that never has to land, and unfortunately doesn't even seem to be able to land. It's brilliant at moving passengers around, although the current plane can only fit 1000 of them. It's also the only one we have, so if you want to change anything you have to do a bit of wing walking and be careful not to get any body parts near the props. Oh, and not to mention, it's currently raining cats and dogs, well, actually mostly dead puppies...

Now, under those circumstances do you think I'd go up to my boss and say "Gee, I dunno, how about I climb out that access hatch at the back and add some sheet metal until the plane is long enough to shove 200,000 passengers in there?" Fuck no. For one thing this plane of ours damn near fell out of the sky a month ago, and it only took 700 passengers to make that happen. We also don't really understand why it flies at all; we've all got our own theories and some of the theories are probably even right, but we've never flown this plane in serious turbulence, let alone been attacked by those monocled guys in airships off in the distance. Heck, we got worried enough when one of them sent us a nasty telegram the other day about how shabby our records were or something.

No, instead I'd be told to work on some prototypes and write some papers and see if maybe I could make some small nimble plane that could fly along-side our hulking monolith. Preferably a plane that can land for maintenance.

caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
April 04, 2013, 12:17:50 PM
 #123

Great post Mike! You should create a topic with it in the OP, as it deserves more than to be buried here in the 7th page of this topic.


If Bitcoin was banned in a country then I think it's obvious its value would be close to zero.

Hum, are you sure about that? I'd like to know what happened to the market price of gold in the US during the ban of the 1930s...
If Bitcoin is aggressively banned everywhere, then I'd agree with you. But if it's still allowed in a significant number of places, I don't think so.

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
Nubarius
Sr. Member
****
Offline Offline

Activity: 309


View Profile
April 04, 2013, 02:35:42 PM
 #124

[...]
The idea that Bitcoin can be a store of value with a 1mb block size limit seems like nonsense to me. That's reversing cause and effect. Bitcoin gained value because it was useful, it didn't gain use because it had value - that can't be the case because it started out with a value of zero. So if Bitcoin is deliberately crippled so most people can't use it, it will also cease to have much (if any) value. You can't have one without the other. The best way to ensure Bitcoin is a solid store of value is to ensure it's widely accepted and used on an every day basis.
[...]

I fully agree with this.

[...]
On the other hand, if the blocksize is raised and it leads to centralization, Bitcoin as a decentralized currency will be destroyed forever.
[...]

Peter, I've struggled to understand your concerns, but I still find your arguments convoluted and hard to follow. You want it to be possible to run a Bitcoin node on average hardware, but you don't want the people with average hardware to be able to use the network. How can that work? It's obvious that a blockchain recording all transactions is bound to be resource-intensive, but as far as I can see that's the very nature of a proof-of-work blockchain. Satoshi's idea was that such a resource-intensive system could be viable in the 21st century, and he always presented Bitcoin as an accessible payment system, not as some sort of infrastructure for large financial services.

And that idea is also what practically all Bitcoin enthusiasts have been promoting during the last three years. I think I'm not the only one who's been telling his friends how easy it is to get a Bitcoin wallet and make payments to anyone around the world, and how this system could eventually be used to pay a restaurant bill or to buy a book online. This is probably the idea that first attracted most of us to Bitcoin. If you think that such a thing is not possible and that Bitcoin can only survive as a low-volume network where payments are outrageously expensive, well, then that's like saying that you don't really believe in the original Bitcoin idea. You could be right, who knows, but keeping the block size limit is what effectively kills the Bitcoin dream, either by turning it into a specialist service used by a minority or, more likely, condemning it to irrelevance and oblivion.

We need to find a compromise, and I think the sensible thing would be to agree on a higher block size limit. Those, like myself, who would remove the limit altogether will find a much higher limit more reasonable, whereas those who are adamant that the limit is necessary could accept a value like 50 MB, and avoid crippling the system during the next few years.
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 04, 2013, 03:13:56 PM
 #125

I don't agree with the idea that keeping the 1mb limit is conservative. It's actually a highly radical position. The original vision for Bitcoin was that it supports everyone who wants to use it; that's why the very first discussion of it ever was about scalability, and Satoshi answered back with calculations based on VISA traffic levels. The only reason the limit is there at all was to avoid anyone mining huge blocks "before the community was ready for it", in his words, so it was only meant to be temporary.

Peter, your usual response to this is that Satoshi made mistakes, etc, that he was wrong and that you know better than him. Which may or may not be true. However, regardless of whether he was wrong or right, that is how the project was envisioned and trying to fundamentally change it now is a controversial position - every bit as radical as wanting to change the inflation formula.

It's actually me that's the conservative one, because I advocate "staying the course".
fornit
Hero Member
*****
Offline Offline

Activity: 989


View Profile
April 04, 2013, 04:02:24 PM
 #126

i agree with mike that staying with the 1mb limit is far more radical - and unreasonable - than increasing block sizes. however, i think there is little need to find a final solution at this point. a conservative increase of the max block size - for example to 5mb - is a very simple and predictable short-term solution. 5mb would work on computers and bandwidth a lot of people already have, so you don't even have to argue with Moore's law or anything to show that you won't run into trouble.
and when the exponential growth hits that new limit hard and fast and there are still lots and lots of full nodes around, the 1mb-extremists will be very very quiet and it will be much easier to get a consensus on something more final  Wink

there might be more elegant solutions, but pushing for anything that might even remotely look like it won't run on an already existing, affordable consumer computer is a very hard sell. at least for now.

ps: mike, i think you have trouble grasping how alien exponential growth is for pretty much everyone. it eludes the natural human capability to make good estimates and seems overwhelming and scary. psychologically, predictability is extremely important right now, and that's one of the main reasons i advocate a fixed increase in block size for the time being. i think it's worth limiting the best-case scenario for short-term bitcoin growth for that.

justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 04, 2013, 04:18:18 PM
 #127

ps: mike, i think you have trouble grasping how alien exponential growth is for pretty much everyone. it eludes the natural human capability to make good estimates and seems overwhelming and scary. psychologically, predictability is extremely important right now, and that's one of the main reasons i advocate a fixed increase in block size for the time being. i think it's worth limiting the best-case scenario for short-term bitcoin growth for that.
I think you might even have succumbed to this with your 5 MB recommendation.

With no further increases in the adoption rate, average block size will reach 1 MB by the end of 2013 and 10 MB by the end of 2014. An increase to 5 MB only buys a few months, and if something happens between now and then to increase the rate of Bitcoin adoption all bets are off.
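For illustration, that trajectory can be sketched in a few lines; the 10x-per-year growth rate below is an assumption read off the two data points above (1 MB at end of 2013, 10 MB at end of 2014), not a measured figure:

```python
import math

# Assumed growth: average block size goes from 1 MB to 10 MB in one year,
# i.e. roughly 10x per year. This rate is an extrapolation, not a measurement.
GROWTH_PER_YEAR = 10.0

def months_to_reach(limit_mb, start_mb=1.0):
    """Months for average block size to grow from start_mb to limit_mb."""
    years = math.log(limit_mb / start_mb) / math.log(GROWTH_PER_YEAR)
    return 12.0 * years

print(round(months_to_reach(5.0), 1))   # ~8.4 months bought by a 5 MB limit
```

Under these assumptions, raising the limit from 1 MB to 5 MB buys roughly eight months before full blocks again.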
fornit
Hero Member
*****
Offline Offline

Activity: 989


View Profile
April 04, 2013, 04:40:52 PM
 #128

I think you might even have succumbed to this with your 5 MB recommendation.

With no further increases in the adoption rate, average block size will reach 1 MB by the end of 2013 and 10 MB by the end of 2014. An increase to 5 MB only buys a few months, and if something happens between now and then to increase the rate of Bitcoin adoption all bets are off.

i am well aware of that. as i said, it limits the best-case scenario. but if we really do hit 5mb in, say, summer or fall 2014, that also means bitcoin is twenty times as big as it is now. imho, at that point it's likely too big to fail - pun intended.
so i would like to intentionally sacrifice the best case to avoid the worst case. short term at least.

acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 04, 2013, 06:48:18 PM
 #129

Well, one thing seems clear to me by now. No amount of continued arguments will change entrenched views here. So, on to the next question. What do we actually do?

In negotiations where parties are far apart and demonstrably unwilling to move the only solution AFAIK is one where nobody gets what they want entirely, but instead uses something all can live with.

In my opinion that would be a change that appears most "safe". That seems like raising the limit by some safe appearing amount and seeing how things go. I estimate that would be raising the limit to something like 5-10MB.

marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
April 05, 2013, 12:35:50 AM
 #130

Okay, here are some numbers so we are all on the same page and not throwing around "vagaries"

The maximum blockchain growth at the present 1 MB block limit is 52.56 GByte per year.
At an increased block limit of 5 MByte => 262.8 GByte per year.
Similarly, for 10 MByte the max. growth => 525.6 GByte per year.
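These figures follow directly from one block every ten minutes; a quick sketch of the arithmetic (assuming 144 blocks a day and every block completely full):

```python
BLOCKS_PER_DAY = 144   # one block every ~10 minutes
DAYS_PER_YEAR = 365

def max_growth_gb_per_year(block_limit_mb):
    """Worst-case blockchain growth if every block hits the size limit."""
    return block_limit_mb * BLOCKS_PER_DAY * DAYS_PER_YEAR / 1000.0

for limit_mb in (1, 5, 10):
    print(f"{limit_mb} MB limit -> {max_growth_gb_per_year(limit_mb)} GB/year")
# 1 MB -> 52.56, 5 MB -> 262.8, 10 MB -> 525.6
```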

When someone says "they want to increase the block limit so that the network can grow", they can mean only one thing: they want the number of transactions on the network to grow, since clearly the number of new people willing to run a full node with increased storage decreases (above some limit) as the size of the blockchain increases. So just to be clear here: they do not mean that they want the network to grow, but that they want the usage of the network to grow, which is not the same thing.

So as long as everybody is clear that increasing the size of the blocks is limiting the number of full nodes then that is okay, but what you are using as the metric for the "size" of the network is important here.

Is the size of the network the number of nodes or the number of transactions? What is the stated goal here, maximizing transactions or maximizing network nodes?

acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 05, 2013, 12:48:13 AM
 #131

Okay, here are some numbers so we are all on the same page and not throwing around "vagaries"

The maximum blockchain growth at the present 1 MB block limit is 52.56 GByte per year.
At an increased block limit of 5 MByte => 262.8 GByte per year.
Similarly, for 10 MByte the max. growth => 525.6 GByte per year.

When someone says "they want to increase the block limit so that the network can grow", they can mean only one thing: they want the number of transactions on the network to grow, since clearly the number of new people willing to run a full node with increased storage decreases (above some limit) as the size of the blockchain increases. So just to be clear here: they do not mean that they want the network to grow, but that they want the usage of the network to grow, which is not the same thing.

So as long as everybody is clear that increasing the size of the blocks is limiting the number of full nodes then that is okay, but what you are using as the metric for the "size" of the network is important here.

Is the size of the network the number of nodes or the number of transactions? What is the stated goal here, maximizing transactions or maximizing network nodes?

I don't think storage is a big problem, even storing the full blockchain with the 10MB figures. Storage has kept a good pace at becoming cheaper, more so than higher-bandwidth options. Even at today's costs you could handle the 10MB figures for about $65 per year in storage costs, which will only get cheaper.

Quote
Dell Price      $129.99 - 1TB Hard Drive

http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=bsd&cs=04&sku=A4489204&dgc=ST&cid=248711&lid=4318543&acd=12309152537461010
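Putting the two numbers together - the worst-case growth rate for 10 MB blocks and the drive price above (treating that single listing as representative per-TB pricing, which is an assumption):

```python
# Rough yearly storage cost for 10 MB blocks, priced at the 1 TB drive above.
gb_per_year = 10 * 144 * 365 / 1000.0    # 525.6 GB/year if every block is full
price_per_tb = 129.99                    # the Dell 1 TB drive quoted above
cost_per_year = gb_per_year / 1000.0 * price_per_tb
print(round(cost_per_year, 2))           # ~68.32, in the ballpark of the $65 estimate
```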
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 05, 2013, 12:50:34 AM
 #132

When someone says "they want to increase the block limit so that the network can grow", they can mean only one thing: they want the number of transactions on the network to grow, since clearly the number of new people willing to run a full node with increased storage decreases (above some limit) as the size of the blockchain increases.
This is not at all clear.

Increased storage requirements caused by increased transaction demand may reduce the number of home users willing to run full nodes, but at the same time this growth can only happen if the number of people using bitcoins is increasing. Perhaps the fraction of home users willing to run a full node is decreasing by a certain percentage, but it's not at all obvious this percentage will be larger than the percentage increase of the user base as a whole.

In addition, if the transaction rate is growing because of increased adoption, bitcoin will be spreading into entirely different sectors than just home users; there will be more bitcoin-based businesses. It's not a given that the decrease in the number of home users running full nodes will exceed the growth in the number of businesses running full nodes.

It would be silly to assume that the composition of users during the extreme early adopter phase will be in any way representative of the composition of users in the future as Bitcoin moves higher up on the adoption curve.
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 05, 2013, 01:23:17 AM
 #133

...

In addition, if the  transaction rate is growing because of increased adoption bitcoin will be spreading into entirely different sectors than just home users; there will be more bitcoin-based businesses. It's not a given that the decrease in the number of home users running full nodes will exceed the growth in the number of businesses running full nodes.

It would be silly to assume that the composition of users during the extreme early adopter phase will be in any way representative of the composition of users in the future as Bitcoin moves higher up on the adoption curve.

I agree.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
April 05, 2013, 01:48:16 AM
 #134

When someone says "they want to increase the block limit so that the network can grow", they can mean only one thing: they want the number of transactions on the network to grow, since clearly the number of new people willing to run a full node with increased storage decreases (above some limit) as the size of the blockchain increases.
This is not at all clear.


Ok, maybe not crystal clear down to the exact numbers, but qualitatively it is correct. This can be proved easily by considering the upper bound: take the total installed 'potential nodes' on the planet with more than max_block_size*144*365 MB of available storage space, and there is your total available number of nodes. Make some assumptions about how many of those available nodes will actually run bitcoin and you get your maximum practical network size, measured in number of full nodes.

Increasing the block size limits the number of nodes unless sufficient incentive is added for new nodes to bring the new storage/relaying capacity online. Whether that is a good or bad thing is a separate issue.

jmw74
Full Member
***
Offline Offline

Activity: 236


View Profile
April 05, 2013, 02:03:27 AM
 #135

Well, one thing seems clear to me by now. No amount of continued arguments will change entrenched views here. So, on to the next question. What do we actually do?

In negotiations where parties are far apart and demonstrably unwilling to move the only solution AFAIK is one where nobody gets what they want entirely, but instead uses something all can live with.

In my opinion that would be a change that appears most "safe". That seems like raising the limit by some safe appearing amount and seeing how things go. I estimate that would be raising the limit to something like 5-10MB.

"So you say that Porsche you have for sale is worth $40k, I say it's worth two bucks.  Why don't we meet in the middle and call it $20k, that's fair right?"

People can't come along with a radically different vision for what bitcoin should be, and expect everyone else to meet them in the middle.

I think it's pretty clear who would win this battle if it came down to a fork.  The one with cheap payments.

caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
April 05, 2013, 05:44:00 AM
 #136

Is the size of the network the number of nodes or the number of transactions? What is the stated goal here, maximizing transactions or maximizing network nodes?

Repeating what was already said once again: you can safely transact without being a full node. On the other hand, what's the point in being a full node if you can't even transact since there's no more room in the blockchain for your transactions?

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
April 05, 2013, 06:00:19 AM
 #137

Is the size of the network the number of nodes or the number of transactions? What is the stated goal here, maximizing transactions or maximizing network nodes?

Repeating what was already said once again: you can safely transact without being a full node. On the other hand, what's the point in being a full node if you can't even transact since there's no more room in the blockchain for your transactions?

Well, that's disingenuous: there is always room in the blockchain (up to 1 MByte per block at present); it is just the price to get into the blockchain that is at issue here. As I've tried to make clear on numerous occasions, above in this thread also, you cannot divorce the discussion of block size limits from fees as simply as you are wont to do here.

The whole discussion is about who is going to pay for the N*max_block_size*365*144 MBytes of annual global storage the blockchain requires (N = number of full network nodes); trying to block one's ears to discussions of fees is ignoring half the argument. Shall we have some quantification of the optimal size of N from those who seem to be saying it is a number that can be discounted?

Frozenlock
Sr. Member
****
Offline Offline

Activity: 434



View Profile
April 05, 2013, 06:25:05 AM
 #138

I think it's pretty clear who would win this battle if it came down to a fork.  The one with cheap payments.

I'm not so convinced... What if someone introduced inflation instead of fees? Transactions would be "free", but I'm pretty darn sure the resulting cryptocurrency would be immediately discarded.

Rather, the one which would win the battle is the one that could preserve its value. So yeah, I hear you Mike when you say that Bitcoin has value because it's useful... but it also has value because it has the potential to maintain that value. Bitcoin became money because it had all the required characteristics to be so, not just some of them.

The biggest selling point of Bitcoin for everyone is the 21 million cap. If that's not telling us something, I don't know what is. The only ones for whom the BTC price is irrelevant are those who immediately transfer to another asset, like... oh I don't know... a payment processing service? *Wink to bit-pay*

Anyhow... I'm still trying to wrap my head around the assurance contracts, but I don't really see it. Need more IQ points perhaps?

Assuming there's no need to limit the block size or the bandwidth, I don't really see why or how one could pay for THash. You pay to be included in a block, miners compete to get the fees, a hashing war follows... and that's it.

Isn't it always more profitable for a potential attacker to just mine with the rest, instead of waiting in the shadows with an idle supercomputer to occasionally reverse a transaction from 1-2 blocks ago?
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
April 05, 2013, 06:57:03 AM
 #139

Well, one thing seems clear to me by now. No amount of continued arguments will change entrenched views here. So, on to the next question. What do we actually do?

In negotiations where parties are far apart and demonstrably unwilling to move the only solution AFAIK is one where nobody gets what they want entirely, but instead uses something all can live with.

In my opinion that would be a change that appears most "safe": raising the limit by some safe-appearing amount and seeing how things go. I estimate that would mean raising the limit to something like 5-10MB.

Raising the limit as the first response to scalability problems sets a precedent that the limit will be simply raised again and again as Bitcoin grows. Why spend the effort solving the problem, when you can simply accept less security and punt the issue another year into the future? Fast growing internet startups aren't exactly known for their long-term planning.

I think it's pretty clear who would win this battle if it came down to a fork.  The one with cheap payments.

Bitcoin as a payment system is interesting in that as it becomes easier and faster to complete the fiat->Bitcoin->fiat loop required to make a payment, the economic influence of that use becomes less and less important, all things being equal. The reason is simple: the faster you can complete that loop, the fewer Bitcoins are tied up making payments, and thus the demand for Bitcoins for that application goes down. Similarly those users care less about what the value of Bitcoin is at any given moment.

Conversely your investors, the people holding Bitcoins who believe they will maintain their value in the long run, perform far fewer transactions, yet constitute the economic majority and for now are the people paying for the security of Bitcoin. (via the still large inflation subsidy) This group has every reason to oppose changes that will sacrifice the security of their investment just so people can make cheap transactions.

TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 05, 2013, 08:29:22 AM
 #140

Increased storage requirements caused by increased transaction demand may reduce the number of home users willing to run full nodes but at the same time this growth can only happen if the number of people using bitcoins is increasing. Perhaps the fraction of home users willing to run a full node is decreasing by a certain percentage, but it's not at all obvious this percentage will be larger than the percentage increase of the user base as a whole.

Some kind of distributed verification of the block chain would be a potential way to get around the size problems.  When you connect a node, you could say how much you are willing to verify and how much hard disk space you will allocate to bitcoin.  You would then only check some of the information.

This requires that the protocol be modified slightly so that a node can broadcast proof that a block is invalid and then all nodes that receive that proof will discard the block.

There could also be distribution of the storage.  Making sure no info is lost is a potential weakness, but as long as there is enough overlap that should be unlikely.  Also, there would still likely be full nodes which would store everything.

Also, proving a block is invalid sometimes can't be done if info is withheld.  You can't prove a block is invalid if you don't have some of the transactions referenced by it.  There would also need to be some system for broadcasting something like "tx with hash = <some-hash> does not exist", which would be a claim that the block in question is not valid.  It isn't clear how to prevent spamming of such messages.
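The partial-verification idea above can be sketched roughly as follows. This is a hypothetical illustration, not an existing protocol: the FraudProof shape, the verify_tx predicate and the sampling fraction are all invented here.

```python
# Hypothetical sketch of the distributed-verification idea: each node
# checks only a random sample of a block's transactions and broadcasts a
# compact fraud proof if it finds an invalid one. All names are invented
# for illustration; none of this is real Bitcoin protocol code.
import random
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class FraudProof:
    block_id: str
    tx: str       # the transaction claimed to be invalid
    reason: str

def sample_verify(block_id: str, txs: list,
                  verify_tx: Callable[[str], bool],
                  fraction: float = 0.1) -> Optional[FraudProof]:
    """Verify a random subset of txs; return a broadcastable proof or None."""
    k = max(1, int(len(txs) * fraction))
    for tx in random.sample(txs, k):
        if not verify_tx(tx):
            return FraudProof(block_id, tx, "failed validity check")
    return None  # nothing invalid found in this node's sample

def check_proof(proof: FraudProof, verify_tx: Callable[[str], bool]) -> bool:
    """Receiving nodes re-check the proof themselves before discarding
    the block, so spam proofs cost the spammer and convince nobody."""
    return not verify_tx(proof.tx)
```

Note this only covers provable invalidity: as the post says, a claim that a referenced transaction does not exist cannot be packaged as a self-verifying proof this way.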

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 05, 2013, 10:05:55 AM
 #141

Most full nodes do not need to store the entire chain. Though it's not implemented yet, block pruning will mean that your disk usage will eventually stabilize at some multiple of transaction traffic. Only a small number of nodes really need to store the entire thing stretching all the way back to 2009.

Anyway, ultimately this will be decided by Gavin and so far he's been saying he wants to raise the block size limit.
caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
April 05, 2013, 10:15:50 AM
 #142

Anyway, ultimately this will be decided by Gavin and so far he's been saying he wants to raise the block size limit.

I'd say ultimately it's the main services (MtGox, Bitpay, BitInstant, BitcoinStore, WalletBit, Silk Road etc.) that will decide. If they all stay on the same side of the fork, that will likely be the side that "wins", regardless of Gavin's or even miners' will (most miners would just follow the money).
Of course, these services have a strong interest in staying on the branch that's more professionally supported by developers, so yeah, if most of the core team goes to one side, we could predict most of these services would too.

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
April 05, 2013, 10:57:21 AM
 #143

Thanks Peter for the detailed explanation of your position. I do understand the thrust of your arguments but disagree over a number of areas...

There isn't going to be a single service that does this, that's my whole point: if you achieve scalability by just raising the blocksize, you wind up with all your trust is in a tiny number of validating nodes and mining pools. If you achieve scalability through off-chain transactions, you will have many choices and options.
...
On the other hand, if the blocksize is raised and it leads to centralization, Bitcoin as a decentralized currency will be destroyed forever.

I am already concerned about the centralization seen in mining. Only a handful of pools are mining most of the blocks, so decentralization is already being lost there. Work is needed in two areas before the argument for off-chain solutions becomes strong: first blockchain pruning, secondly, initial propagation of headers (presumably with associated utxo) so that hashing can begin immediately while the last block is propagated and its verification done in parallel. These would help greatly to preserve decentralization.

MtGox and other sites are not a good place for people to leave their holdings permanently. As has been pointed out, most people will not run nodes to support the blockchain if their own transactions are forced or priced away from it. Bitcoin cannot be a store of value without being a payment system as well. The two are inseparable.

It might not be sexy and exciting, but like it or not leaving the 1MB limit in place for the foreseeable future is the sane, sober and conservative approach.

Unfortunately, this is the riskiest approach at the present time. The conservative approach is to steadily increase it ahead of demand, which maintains the status quo as much as market forces permit. The dead puppy transaction sources have forced this issue much earlier than would otherwise be the case.

You mention your background in passing, so I will just mention mine. I spent many years at the heart of one of the largest TBTF banks working on its equities proprietary trading system. For a while 1% of the shares traded globally (by value) was our execution flow. On average every three months we encountered limitations of one sort or another (software, hardware, network, satellite systems), yet every one of them was solved by scaling, rewriting or upgrading. We could not stand still as the never-ending arms race for market-share meant that to accept a limitation was to throw in the towel.

The block limit here is typical of default/preset software limits that have to be frequently reviewed, revised or even changed automatically.
The plane that temporarily choked on 700 passengers may now be able to carry 20,000. Bitcoin's capacity while maintaining a desired level of decentralization may be far higher than we think, especially if a lot of companies start to run nodes. It just needs the chance to evidence this.


Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
April 05, 2013, 11:10:39 AM
 #144

Of course, these services have a strong interest in staying on the branch that's more professionally supported by developers, so yeah, if most of the core team goes to one side, we could predict most of these services would too.

FWIW, currently the majority of the core team members (Gregory Maxwell, Jeff Garzik and Pieter Wuille) have stated they are against increasing the blocksize as the solution to the scalability problem. Each of course has different opinions on exactly what that position constitutes, but ultimately all of them believe off-chain transactions need to be the primary way to make Bitcoin scale.

EDIT: to be clear, no-one, including myself, thinks the blocksize must never change. Rather, achieve scalability first through off-chain transactions, and only then consider increasing the limit. I made a rough guess myself that it may make sense to raise the blocksize at a market cap of around $1 trillion, still far off in the future. Fees in this scenario would be something like $5 per transaction, or $1 billion/year of proof-of-work security (not including the inflation subsidy). That's low enough to be affordable for things like payroll, and is still a lot cheaper than international wire transfers. Hopefully at that point Bitcoin will need less security against takedowns by authority, and/or technological improvements will make it easier to run nodes.
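The $5/transaction and $1 billion/year figures roughly check out under simple assumptions; the ~3,800 transactions per block below is an assumed throughput chosen for the sketch, not a number from the post:

```python
# Back-of-envelope check of the figures above: the annual proof-of-work
# budget implied by a given average fee. The ~3,800 tx/block input is an
# assumption made for this sketch.
def annual_fee_revenue_usd(fee_per_tx: float, txs_per_block: int,
                           blocks_per_day: int = 144) -> float:
    return fee_per_tx * txs_per_block * blocks_per_day * 365

print(annual_fee_revenue_usd(5.0, 3_800))  # 998640000.0, just under $1B/year
```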


As far as I know Wladimir J. van der Laan and Nils Schneider haven't stated an opinion, leaving Gavin Andresen.

I think Jeff Garzik's post on the issue is apropos, particularly his last point:

Quote
That was more than I intended to type, about block size. It seems more like The Question Of The Moment on the web, than a real engineering need. Just The Thing people are talking about right now, and largely much ado about nothing.

The worst that can happen if the 1MB limit stays is growth gets slowed for awhile. In the grand scheme of things that's a manageable problem.

TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 05, 2013, 11:18:31 AM
 #145

Anyway, ultimately this will be decided by Gavin and so far he's been saying he wants to raise the block size limit.

I'd say ultimately it's the main services (MtGox, Bitpay, BitInstant, BitcoinStore, WalletBit, Silk Road etc.) that will decide. If they all stay on the same side of the fork, that will likely be the side that "wins", regardless of Gavin's or even miners' will (most miners would just follow the money).

Doubtful, in practice the miners will decide.  However, a "suggestion" by Gavin (and more to the point an update to the reference client) would be a very strong push, as would a suggestion by large users of bitcoin.

In fact, have any pool owners stated what their opinion is?

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
conv3rsion
Sr. Member
****
Offline Offline

Activity: 307


View Profile
April 05, 2013, 05:25:28 PM
 #146

The idea that Bitcoin must be crippled to 1 MB blocksizes forever is absurd and perverse and I'm extremely thankful that people smarter than I am agree with that stance. 

If people like you and Gavin weren't working on Bitcoin (for example, if it was run by Peter), I would be getting as far away from it as I could.
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 05, 2013, 06:59:24 PM
 #147

Anyway, ultimately this will be decided by Gavin and so far he's been saying he wants to raise the block size limit.

That gives us pretty much zero information. I'm sure 99% of us "want to raise the block size limit". The question is how. Do we raise it to 2MB or 10MB or infinite? Do we raise it now? If not now when? Do we raise it once? What about dynamically? Dynamically using data or preset parameters? Do we consider hard fork risks in the decision?

There are many ways to raise the limit and all have different ramifications. No matter the precise course of action someone will be dissatisfied.

Actually, what Gavin said, quoting directly, is this:

A hard fork won't happen unless the vast super-majority of miners support it.

E.g. from my "how to handle upgrades" gist https://gist.github.com/gavinandresen/2355445

Quote
Example: increasing MAX_BLOCK_SIZE (a 'hard' blockchain split change)

Increasing the maximum block size beyond the current 1MB per block (perhaps changing it to a floating limit based on a multiple of the median size of the last few hundred blocks) is a likely future change to accommodate more transactions per block. A new maximum block size rule might be rolled out by:

New software creates blocks with a new block.version
Allow greater-than-MAX_BLOCK_SIZE blocks if their version is the new block.version or greater and 100% of the last 1000 blocks are new blocks. (51% of the last 100 blocks if on testnet)
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different (maybe something like block.timestamp is after 1-Jan-2015 and 99% of the last 2000 blocks are new-version), since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.


I think this shows great consideration and judgement because I note and emphasize the following:

Quote
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different ...  since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.

What I think is of greatest value in Gavin's quote is that it's inclusive of data from the field. It's not him unilaterally saying the new size will be X, deal with it. Instead he essentially says the new size can be X if Y and Z are also true. It appears he has a regard for the ability of the market to decide. Indeed no change remains an option and is actually the default.
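The quoted rollout rule can be sketched as follows. The version number, 99% threshold and 2000-block window are the gist's own straw-man values, and the Block type is invented for this sketch:

```python
# Sketch of the quoted rollout rule: blocks larger than the legacy cap are
# valid only once a supermajority of recent blocks signal the new version.
# The threshold and window are straw-man values from the quoted gist;
# the Block type is invented for illustration.
from dataclasses import dataclass

LEGACY_MAX_BLOCK_SIZE = 1_000_000  # bytes (the current 1 MB rule)
NEW_VERSION = 2

@dataclass
class Block:
    version: int
    size: int  # bytes

def larger_blocks_allowed(last_blocks, threshold=0.99, window=2000):
    """True once `threshold` of the last `window` blocks signal NEW_VERSION."""
    recent = last_blocks[-window:]
    if len(recent) < window:
        return False
    signalling = sum(1 for b in recent if b.version >= NEW_VERSION)
    return signalling / window >= threshold

def block_size_valid(block, last_blocks):
    if block.size <= LEGACY_MAX_BLOCK_SIZE:
        return True  # old-size blocks stay valid either way
    return block.version >= NEW_VERSION and larger_blocks_allowed(last_blocks)
```

The default-to-no-change property is visible in the code: if the supermajority never materializes, block_size_valid keeps enforcing the legacy 1MB cap.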

I think Jeff Garzik's post on the issue is apropos, particularly his last point:

Thanks for that link. I hadn't seen that post and I think it's brilliant. It probably aligns with my views 99.999%. Ironically it's his last point I disagree with most:

Quote
Just The Thing people are talking about right now, and largely much ado about nothing.

I completely disagree. Think how easily this issue could have been solved if in 2009 Satoshi implemented a rule such as Jeff suggests here:

Quote
My off-the-cuff guess (may be wrong) for a solution was:  if (todays_date > SOME_FUTURE_DATE) { MAX_BLOCK_SIZE *= 2, every 1 years }  [Other devs comment: too fast!]  That might be too fast, but the point is, not feedback based nor directly miner controlled.

I think the above could be a great solution (though I tend to agree it might be too fast). However, implementing it now will meet resistance from someone feeling it misses their views. If Satoshi had implemented it then it wouldn't be an issue now. We would simply be dealing with it and the market working around it. Now however there is a lot of money tied up in protocol changes and many more views about what should or shouldn't be done. That will only increase, meaning the economic/financial damage possible from ungraceful changes increases as well.
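Jeff's off-the-cuff rule can be sketched as follows; the start date below is a placeholder, since SOME_FUTURE_DATE was never specified:

```python
# Sketch of the quoted "MAX_BLOCK_SIZE *= 2, every 1 years" rule: a fixed,
# feedback-free schedule. DOUBLING_START stands in for SOME_FUTURE_DATE,
# which the quote leaves unspecified.
from datetime import date

BASE_MAX_BLOCK_SIZE = 1_000_000     # bytes, the current 1 MB cap
DOUBLING_START = date(2015, 1, 1)   # placeholder for SOME_FUTURE_DATE

def max_block_size(today: date) -> int:
    """Cap doubles once per full year elapsed after DOUBLING_START."""
    if today <= DOUBLING_START:
        return BASE_MAX_BLOCK_SIZE
    doublings = (today - DOUBLING_START).days // 365
    return BASE_MAX_BLOCK_SIZE * 2 ** doublings
```

The sketch also shows why "too fast" matters: an annual doubling puts the cap at 1024 times the base after ten years, with no feedback to slow it down.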

I also note early in Jeff's post he says he reversed his earlier stance, my point here being people are not infallible. I actually agree with his updated views, but what if they too are wrong? Who is to say? So the same could apply to Gavin. That's why I think it's wise he appears to include a response from the market in any change, and no change is the default.
conv3rsion
Sr. Member
****
Offline Offline

Activity: 307


View Profile
April 06, 2013, 01:37:01 AM
 #148


EDIT: to be clear, no-one, including myself, thinks the blocksize must never change. Rather, achieve scalability first through off-chain transactions, and only then consider increasing the limit. I made a rough guess myself that it may make sense to raise the blocksize at a market cap of around $1 trillion, still far off in the future. Fees in this scenario would be something like $5 per transaction, or $1 billion/year of proof-of-work security (not including the inflation subsidy). That's low enough to be affordable for things like payroll, and is still a lot cheaper than international wire transfers. Hopefully at that point Bitcoin will need less security against takedowns by authority, and/or technological improvements will make it easier to run nodes.


We won't get to $10 billion, let alone $1 trillion, if it costs $5 to make a bitcoin transaction. I refuse to believe that you truly do not understand this. Bitcoin will not capture enough of the economy if it is expensive to use, because it won't be useful.

Nobody is going to convert their fiat currency twice in order to use bitcoin as a wire transfer service with a marginal (at best) savings. And they will have to convert back to their fiat currencies because bitcoin is too expensive to spend or to move to an off-chain transaction service (to spend there). Expensive is not decentralized.

Go fucking make PeterCoin, give it a 10KB block size, and see how far you get with that horseshit.

jgarzik
Legendary
*
qt
Offline Offline

Activity: 1470


View Profile
April 06, 2013, 03:08:30 AM
 #149

I don't agree with the idea that keeping the 1mb limit is conservative. It's actually a highly radical position. The original vision for Bitcoin was that it supports everyone who wants to use it, that's why the very first discussion of it ever was about scalability and Satoshi answered back with calculations based on VISA traffic levels. The only reason the limit is there at all was to avoid anyone mining huge blocks "before the community was ready for it", in his words, so it was only meant to be temporary.

You continue to repeat this -- but it is only half the story.

Satoshi also intended the subsidy-free, fee-only future to support bitcoin.  He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.

Unlimited block sizes are also a radical position quite outside whatever was envisioned by the system's creator -- who clearly did think that far ahead.


Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 06, 2013, 03:51:59 AM
 #150

Satoshi also intended the subsidy-free, fee-only future to support bitcoin.  He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.

Unlimited block sizes are also a radical position quite outside whatever was envisioned by the system's creator -- who clearly did think that far ahead.
Appeal to authority: Satoshi didn't mention assurance contracts, therefore they cannot be part of the economics of the network.
Strawman argument: The absence of a specific protocol-defined limit implies infinite block sizes.
False premise: A specific protocol-defined block size limit is required to generate fee revenue.
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
April 06, 2013, 03:56:20 AM
 #151

Satoshi also intended the subsidy-free, fee-only future to support bitcoin.  He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.

(side-stepping justusranvier's points, right now, but accepting the above statement)
So what block size, realistically, allows a fee market to function?

It can be approximated fairly easily if we assume that it happens once the fee revenue per block matches or exceeds the block reward. We have another 3.5 years at 25 BTC, so this needs to be the starting point.

How many transactions fit in a block? It varies because they have a variable number of inputs and outputs. Using blockchain.info terminology an average of 600 transactions populate an average 250KB block, so 2,400 will fit in a 1MB block. Perhaps most are "vanilla" Bitcoin transactions having a few inputs and one output.

What is a sensible fee for a vanilla transaction in the market-place? I had considered for a while that it is closer to the BTC equivalent of 5c than 0.5c or 50c. So 2,400 transactions will accrue $120 in fees.

With the Bitcoin fx rate at $150 then the block reward is $3,750, which is (a rounded) 30MB block size before the fees market functions properly. A few weeks ago this was 10MB. Perhaps by the end of the year it will be anywhere between 10 and 100MB.

These are quite large blocks so perhaps a realistic fee would be more like 20c per transaction, reducing the required block size to the range 2.5MB to 25MB. The market will find the optimum if it is given a chance.

Under no scenario will a 1MB limit give the fee market a chance to become established, unless transaction fees are forced up to average $1.50, or we wait 11 years until the block reward is 6.25 BTC, or the fx rate collapses back to something like $5 per BTC. None are desirable options.
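The arithmetic above can be reproduced directly; all inputs (600 transactions per 250KB block, the 5c or 20c fee, the $150 fx rate and 25 BTC reward) are the post's own assumptions:

```python
# Reproducing the back-of-envelope numbers above: the block size at which
# total fees match the 25 BTC block reward, given an average fee and fx rate.
# All inputs are assumptions carried over from the post.
TXS_PER_MB = 600 * 4  # 600 average transactions per 250 KB block

def breakeven_block_size_mb(fee_usd: float, btc_price_usd: float,
                            reward_btc: float = 25) -> float:
    """Block size (MB) where fee revenue equals the block reward."""
    reward_usd = reward_btc * btc_price_usd      # $3,750 at $150/BTC
    fee_revenue_per_mb = fee_usd * TXS_PER_MB    # $120/MB at a 5c fee
    return reward_usd / fee_revenue_per_mb

print(breakeven_block_size_mb(0.05, 150))  # 31.25, i.e. the rounded 30MB
print(breakeven_block_size_mb(0.20, 150))  # 7.8125 MB with a 20c fee
```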


Qoheleth
Legendary
*
Offline Offline

Activity: 918


Spurn wild goose chases. Seek that which endures.


View Profile WWW
April 06, 2013, 04:07:02 AM
 #152

I completely disagree. Think how easily this issue could have been solved if in 2009 Satoshi implemented a rule such as Jeff suggests here:

Quote
My off-the-cuff guess (may be wrong) for a solution was:  if (todays_date > SOME_FUTURE_DATE) { MAX_BLOCK_SIZE *= 2, every 1 years }  [Other devs comment: too fast!]  That might be too fast, but the point is, not feedback based nor directly miner controlled.
Satoshi also intended the subsidy-free, fee-only future to support bitcoin.  He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.

Unlimited block sizes are also a radical position quite outside whatever was envisioned by the system's creator -- who clearly did think that far ahead.
It's my impression that the 2009 Satoshi implementation didn't have a block size limit - that it was a later addition to the reference client as a temporary anti-spam measure, which was left in until it became the norm.

Is this impression incorrect?

If there is something that will make Bitcoin succeed, it is growth of utility - greater quantity and variety of goods and services offered for BTC. If there is something that will make Bitcoin fail, it is the prevalence of users convinced that BTC is a magic box that will turn them into millionaires, and of the con-artists who have followed them here to devour them.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 06, 2013, 04:11:09 AM
 #153

Not to imply that Satoshi's original intentions are binding on the network forever, but before anyone allows themselves to be misled about his intentions they should read what he actually said:

http://www.mail-archive.com/cryptography@metzdowd.com/msg09964.html
Zeilap
Full Member
***
Offline Offline

Activity: 154


View Profile
April 06, 2013, 04:27:40 AM
 #154

Satoshi also intended the subsidy-free, fee-only future to support bitcoin.  He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.

(side-stepping justusranvier's points, right now, but accepting the above statement)
So what block size, realistically, allows a fee market to function?

It can be approximated fairly easily if we assume that it happens once the fee revenue per block matches or exceeds the block reward. We have another 3.5 years at 25 BTC, so this needs to be the starting point.

How many transactions fit in a block? It varies because they have a variable number of inputs and outputs. Using blockchain.info terminology an average of 600 transactions populate an average 250KB block, so 2,400 will fit in a 1MB block. Perhaps most are "vanilla" Bitcoin transactions having a few inputs and one output.

What is a sensible fee for a vanilla transaction in the market-place? I had considered for a while that it is closer to the BTC equivalent of 5c than 0.5c or 50c. So 2,400 transactions will accrue $120 in fees.

With the Bitcoin fx rate at $150 then the block reward is $3,750, which is (a rounded) 30MB block size before the fees market functions properly. A few weeks ago this was 10MB. Perhaps by the end of the year it will be anywhere between 10 and 100MB.

These are quite large blocks so perhaps a realistic fee would be more like 20c per transaction, reducing the required block size to the range 2.5MB to 25MB. The market will find the optimum if it is given a chance.

Under no scenario will a 1MB limit give the fee market a chance to become established, unless transaction fees are forced up to average $1.50, or we wait 11 years until the block reward is 6.25 BTC, or the fx rate collapses back to something like $5 per BTC. None are desirable options.



This argument doesn't work; you're working backwards from the premise that miners need to maintain their current income to be profitable.

1GLeSqooAPe8PfWbJecnL3AteDac2B3cqj
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
April 06, 2013, 04:41:57 AM
 #155

This argument doesn't work; you're working backwards from the premise that miners need to maintain their current income to be profitable.

The argument is attempting to determine when the fee market becomes functional.
There is a case that this is worrying about nothing, because the block reward will maintain the network for many years without significant fees. Based upon the high fx rate this argument looks better by the day! Trying to force fees to match the block reward might be a task for the next decade, and counterproductive today. The risk remains from dead puppy transaction sources, which would need to be throttled directly somehow, as the fee market won't do it.

Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
April 06, 2013, 08:55:39 AM
 #156

I think this shows great consideration and judgement because I note and emphasize the following:

Quote
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different ...  since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.

What I think is of greatest value in Gavin's quote is that it's inclusive of data from the field. It's not him unilaterally saying the new size will be X, deal with it. Instead he essentially says the new size can be X if Y and Z are also true. It appears he has a regard for the ability of the market to decide. Indeed no change remains an option and is actually the default.

Think through this voting proposal carefully: can you think of a way to game the vote?

I completely disagree. Think how easily this issue could have been solved if in 2009 Satoshi implemented a rule such as Jeff suggests here:

Quote
My off-the-cuff guess (may be wrong) for a solution was:  if (todays_date > SOME_FUTURE_DATE) { MAX_BLOCK_SIZE *= 2, every 1 years }  [Other devs comment: too fast!]  That might be too fast, but the point is, not feedback based nor directly miner controlled.

I think the above could be a great solution (though I tend to agree it might be too fast). However, implementing it now will meet resistance from someone feeling it misses their views. If Satoshi had implemented it then it wouldn't be an issue now. We would simply be dealing with it and the market working around it. Now however there is a lot of money tied up in protocol changes and many more views about what should or shouldn't be done. That will only increase, meaning the economic/financial damage possible from ungraceful changes increases as well.

There was some heated IRC discussion on this point - gmaxwell made the excellent comment that any type of exponentially increasing blocksize function runs into the enormous risk that the constant is either too large, and the blocksize blows up, or too small, and you need the off-chain transactions it was supposed to avoid anyway. There is very little room for error, yet changing it later if you get it wrong is impossible due to centralization.

I also note early in Jeff's post he says he reversed his earlier stance, my point here being people are not infallible. I actually agree with his updated views, but what if they too are wrong? Who is to say? So the same could apply to Gavin. That's why I think it's wise he appears to include a response from the market in any change, and no change is the default.

Same here. I read Satoshi's predictions about achieving scalability through large blocks about two years ago, and simply accepted it as inevitable and OK. It was only much, much later that I thought about the issue carefully and realized how dangerous the implications would be for Bitcoin's decentralization.


I am already concerned about the centralization seen in mining. Only a handful of pools are mining most of the blocks, so decentralization is already being lost there. Work is needed in two areas before the argument for off-chain solutions becomes strong: first blockchain pruning, secondly, initial propagation of headers (presumably with associated utxo) so that hashing can begin immediately while the last block is propagated and its verification done in parallel. These would help greatly to preserve decentralization.

(snip)

You mention your background in passing, so I will just mention mine. I spent many years at the heart of one of the largest TBTF banks working on its equities proprietary trading system. For a while 1% of the shares traded globally (by value) was our execution flow. On average every three months we encountered limitations of one sort or another (software, hardware, network, satellite systems), yet every one of them was solved by scaling, rewriting or upgrading. We could not stand still as the never-ending arms race for market-share meant that to accept a limitation was to throw in the towel.

It's good that you have had experience with such environments, but remember that centrally run systems, even if the architecture itself is distributed, are far, far easier to manage than truly decentralized systems. When your decentralized system must be resistant to attack, the problem is even worse.

As an example, consider your (and Mike's) proposal to propagate headers and/or transaction hash information, rather than full transactions. Can you think of a way to attack such a system? Can you think of a way that such a system could make network forking events more likely? I'll give you a hint for the latter: reorganizations.


It's my impression that the 2009 Satoshi implementation didn't have a block size limit - that it was a later addition to the reference client as a temporary anti-spam measure, which was left in until it became the norm.

Is this impression incorrect?

Bitcoin began with a 32MiB block size limit, which Satoshi later reduced to 1MB.

Zeilap
Full Member
***
Offline Offline

Activity: 154


View Profile
April 06, 2013, 11:35:09 AM
 #157

As an example, consider your (and Mike's) proposal to propagate headers and/or transaction hash information, rather than full transactions. Can you think of a way to attack such a system? Can you think of a way that such a system could make network forking events more likely? I'll give you a hint for the latter: reorganizations.

Are you talking about the case where I mine a block containing one of my own transactions which I haven't announced, so you have to make another request for the missing transactions before you can verify it? If I refuse to provide the missing transactions, then no-one will accept my block and I lose. If I delay announcing the missing transactions, competing blocks may be found in the meantime, so this is no different from delaying the announcement of the block itself, which has no benefit for me. What am I missing?
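For concreteness, the hashes-only relay scheme being discussed might look roughly like this (all names here are hypothetical, not actual client code); the point is that validation blocks on the round trip for any withheld transactions:

```python
# Sketch of hashes-only block relay: a node rebuilds a block from its
# mempool and requests any transactions it hasn't seen yet.

def reconstruct_block(tx_hashes, mempool, request_tx):
    """Return the full transaction list for a block, fetching unknown txs.

    tx_hashes:  transaction hashes announced in the block message
    mempool:    dict of hash -> transaction already held locally
    request_tx: callback fetching a missing transaction from the peer
                (if the peer stalls here, validation, and hence acceptance
                of the block, stalls with it)
    """
    txs = []
    for h in tx_hashes:
        tx = mempool.get(h)
        if tx is None:
            tx = request_tx(h)   # extra round trip only for withheld txs
        txs.append(tx)
    return txs
```

As Zeilap notes, a miner who withholds the missing transactions only delays acceptance of his own block, so the withholding is self-defeating unless the attacker profits from the delay itself.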

1GLeSqooAPe8PfWbJecnL3AteDac2B3cqj
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 06, 2013, 05:40:29 PM
 #158

Okay, here is a proposal which IMO is "safe" enough to win support, but first a quick question. Infinite block sizes appear dangerous not only for centralization, but also for DoS attacks and chain bloat from transactions with no or trivial fees, which size limits and higher fees help deter. Has there been a good answer for the latter? (apologies for having missed it if so)

This new proposal builds on Gavin's model, but IMO makes it "safer":

Increases of 4MB every 26,400 blocks (every 6 months) if block.timestamp is after 1-Jan-2014 and 95% of the last 4,400 blocks (1 month) are new-version.

Pros:

- no action until future date gives market time to respond gracefully
- no change as default, means no fork risk
- allows possible increase of 16MB in 2 years and has no final limit
- puts increases in the hands of miners; even the small ones have a say
- should not be prone to gaming due to large sampling size, but worst case is 8MB per year increase

Cons

- advocates of infinite size will likely find it conservative
- ?

Please note that in my view Bitcoin could survive with a 1MB limit and not be "crippled". One thing to remember is that Bitcoin would not exist if government were completely out of money. Bitcoin only has value because it allows people to control their stored wealth and the free market to work. Regulation is the reason payments come with fees and inefficiency. A truly free market would have long since given users better and cheaper options. Indeed, payments existing outside the "system" was PayPal's original idea, believe it or not.

So Bitcoin allows services to be built on top of it, e.g. ones which rely less on the block-chain, to extend its user empowerment. What I'm saying is that Bitcoin is not only the block-chain; it's the idea of giving free-market efficiency the chance to work, so off-chain Bitcoin transactions can be considered Bitcoin too, as Bitcoin makes them efficiently possible.
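As a sketch only, the trigger rule in this proposal could look like the following (the function, constants, and the simplified one-window vote count are all invented for illustration, not real client code):

```python
# Sketch of the proposed schedule: +4MB every 26,400 blocks (~6 months),
# but only if 95% of the last 4,400 blocks (~1 month) signal the new version.

ACTIVATION_TIME = 1388534400        # 1-Jan-2014 00:00 UTC (Unix time)
INTERVAL = 26400                    # blocks between possible increases (~6 months)
WINDOW = 4400                       # voting window (~1 month)
THRESHOLD = 0.95                    # supermajority required
STEP = 4 * 1000 * 1000              # +4MB per increase

def max_block_size(height, timestamp, recent_versions, new_version=2):
    """Return the block size limit in effect at a given height.

    recent_versions: version numbers of the WINDOW blocks preceding each
    potential increase point (simplified here to a single reused window).
    """
    limit = 1 * 1000 * 1000         # 1MB starting limit; default is no change
    if timestamp < ACTIVATION_TIME:
        return limit
    increases = height // INTERVAL
    for _ in range(increases):
        votes = sum(1 for v in recent_versions[-WINDOW:] if v >= new_version)
        if votes / WINDOW >= THRESHOLD:
            limit += STEP
    return limit
```

The worst case the post cites follows directly: even with unanimous voting, the limit can rise by at most two STEPs (8MB) per year.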
Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652


Chief Scientist


View Profile WWW
April 06, 2013, 06:49:31 PM
 #159

So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.
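Gavin's "no network rule" idea can be sketched as a purely local policy check; everything below (the class, names, and byte thresholds) is invented for illustration, not actual client behaviour:

```python
# Sketch of "no consensus limit": each node keeps its own configurable
# notion of "obviously too big" instead of a protocol rule.

class NodePolicy:
    def __init__(self, soft_limit_bytes):
        # Operator-chosen threshold; expected to grow as technology improves.
        self.soft_limit_bytes = soft_limit_bytes

    def accept_block(self, block_size_bytes):
        """Local policy check, not a consensus rule: a node that rejects a
        block another node accepts risks following a different chain."""
        return block_size_bytes <= self.soft_limit_bytes

miner = NodePolicy(soft_limit_bytes=10_000_000)     # 10MB seems "sane" to this operator
merchant = NodePolicy(soft_limit_bytes=50_000_000)  # another operator is more permissive

# A 100MB block is "obviously too big" to both; a 20MB block splits them,
# which is exactly the coordination problem critics of this plan point to.
```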



How often do you get the chance to work on a potentially world-changing project?
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
April 06, 2013, 07:29:41 PM
 #160

So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.

Well, thanks for weighing in. Under the circumstances you're the closest to a leader this leaderless grand experiment has, so it means a lot. We simply must get some resolution to this so we can move forward.

As I alluded to earlier no action could ever satisfy everyone nor be provably correct.

I'm a bit torn on this avenue, because on one hand I believe strongly in the ability of the market to work effectively, so your rationale that it will self-regulate block size makes sense to me. On the other hand, I consider this the riskiest route in terms of what can go wrong and the consequences if it does.

However, this is one of the reasons I began pushing alt-coins. The more options a free market has the better it can work. In the end, because of that, I can live with any decision for Satoshi's version of Bitcoin that resolves block size without an economically and confidence damaging fork.

So I say let's start pushing forward with such plans. It will be better for all interests if we do.
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
April 06, 2013, 07:53:03 PM
 #161

What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.

Your "plan" reminds me of an old joke about the difference between programmers and engineers:

Quote
A mechanical engineer, an electrical engineer and a computer programmer were on their way to a meeting in Switzerland. They had just passed the crest of a high mountain pass when suddenly the brakes on their car failed. The car careened almost out of control down the road, bouncing off the crash barriers, until it miraculously ground to a halt scraping along the guardrail.

The car's occupants, shaken but unhurt, now had a problem: they were still stuck at the top of a mountain in a car with no brakes. What were they to do?

"I know," said the mechanical engineer, "Let's check under the car for leaks - chances are a hydraulic line burst."

"No, no," said the electrical engineer, "This is a new car with sensors everywhere  - let me hook up my multimeter to the hydraulic line pressure sensor output and see if it reads low first."

"Well," said the programmer, "Before we do anything, I think we should push the car back up the road and see if it happens again."

Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652


Chief Scientist


View Profile WWW
April 06, 2013, 08:41:42 PM
 #162

Your "plan" reminds me of an old joke...

Okey dokey.

If you want to be helpful, please write up a list of pros and cons for the various plans that have been proposed, including your own (last time I asked you, you waffled and didn't have any plan).

I've been pretty busy dealing with the avalanche of press and working on the payment protocol.

How often do you get the chance to work on a potentially world-changing project?
fornit
Hero Member
*****
Offline Offline

Activity: 989


View Profile
April 07, 2013, 02:47:23 AM
 #163

Your "plan" reminds me of an old joke...

Okey dokey.

If you want to be helpful, please write up a list of pros and cons for the various plans that have been proposed, including your own (last time I asked you, you waffled and didn't have any plan).

I've been pretty busy dealing with the avalanche of press and working on the payment protocol.

the short list:

1) no increase:

pro

- takes no time at all
- predictable consequences
- allows the biggest amount of people to run a full node

con

- pretty much limits bitcoin to a store of value. if it fails as a store of value (possibly for an off-chain transaction system) it will be dead or a total niche product very soon.

-----------------------------------------------

2) fixed increase:

pro

- very simple implementation
- easy to predict consequences
- no major worst case scenarios

con

- short term solution
- discrete jump in block size with no regard to current amount of transactions

-----------------------------------------------

3) floating block size:

pro

- most likely the best option to successfully balance low transaction fees and small block size
- no more pros, but thats a really big one

con

- most complex implementation (might still not be very complex though)
- hard to predict exact behavior
- has to undergo more scrutiny and fine tuning to avoid an exploitable algorithm

-----------------------------------------------

4) remove limit entirely:

pro

- takes no time at all
- might allow for a natural balance without artificial rules
- allows for a plan b

con

- needs a plan b  Wink
- relies on people to do the right thing in a situation when
 - a) its very hard to say what the right thing actually is
 - b) money is involved. people are evil, stupid and completely unable to be objective when money is involved.
- bonus con: with a constant need for consent, it actually allows for even more discussions...


honestly, i have no hope for 4). its the least likely option to get a majority behind it. plus its too optimistic. my money is on 2) with an eventual 3).

justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 07, 2013, 03:34:22 AM
 #164

honestly, i have no hope for 4). its the least likely option to get a majority behind it. plus its too optimistic. my money is on 2) with an eventual 3).
This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

Bitcoin with a fixed protocol limit of 1 MB blocks is like building a long distance telephone network that had a global limit of 10 calls in progress at once, and refusing to ever increase this limit no matter how much demand there was to make long distance phone calls, or how much the technology improved.
Realpra
Hero Member
*****
Offline Offline

Activity: 819


View Profile
April 07, 2013, 03:54:04 AM
 #165

Your "plan" reminds me of an old joke...

Okey dokey.

If you want to be helpful, please write up a list of pros and cons for the various plans that have been proposed, including your own (last time I asked you, you waffled and didn't have any plan).

I've been pretty busy dealing with the avalanche of press and working on the payment protocol.

the short list:

....

4) remove limit entirely:

pro

- takes no time at all
- might allow for a natural balance without artificial rules
- allows for a plan b

con

- needs a plan b  Wink
- relies on people to do the right thing in a situation when
 - a) its very hard to say what the right thing actually is
 - b) money is involved. people are evil, stupid and completely unable to be objective when money is involved.
- bonus con: with a constant need for consent, it actually allows for even more discussions...


honestly, i have no hope for 4). its the least likely option to get a majority behind it. plus its too optimistic. my money is on 2) with an eventual 3).
4 is the correct option because your con list is flawed:

1. In the case of a very large block being artificially created to destroy Bitcoin, most of the network would simply end up rejecting it automatically, because by the time they downloaded it there would be a longer chain/fork without that block.

2. In the case of increasing block sizes squeezing out smaller nodes the solution is to create "swarm clients" that do collaborative validation.
This would allow the network to scale to any size for "infinite" time when combined with the "ledger solution".
(This is not up to choice or opinion: Bitcoin presently does not scale, and if you don't fix this some other cryptocurrency will. Forum-searching both quotation-marked concepts should be easy enough)


The vision of Bitcoin believers is to take over the world; you won't do that with a 1mb block limit, it's pathetic.

Cheap and sexy Bitcoin card/hardware wallet, buy here:
http://BlochsTech.com
conv3rsion
Sr. Member
****
Offline Offline

Activity: 307


View Profile
April 07, 2013, 04:34:51 AM
 #166

honestly, i have no hope for 4). its the least likely option to get a majority behind it. plus its too optimistic. my money is on 2) with an eventual 3).
This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

Bitcoin with a fixed protocol limit of 1 MB blocks is like building a long distance telephone network that had a global limit of 10 calls in progress at once, and refusing to ever increase this limit no matter how much demand there was to make long distance phone calls, or how much the technology improved.

I am 100% sure that a bitcoin would not be worth $144.00 USD right now if the majority of people purchasing at that price (or even half that price) knew or believed that in the very near future it would cost more than a few cents to use Bitcoin, or that the network would permit a maximum of 7-20 transactions per second forever. That is not what people using, adopting, investing, or proselytizing are selling to others in their elevator speeches, because if they were aware of it they would be disgusted.

Any alt chain that chooses to scale will quickly replace Bitcoin in terms of both usage and fiat exchange rate, because, as should be quite obvious to anyone without ulterior motives, the two are correlated. Without usage that presents a superior alternative to existing systems (in this case, the ability to instantly and cheaply send money anywhere without third-party risk), Bitcoin is irrelevant, worthless, and soon to be obsolete.

Nubarius
Sr. Member
****
Offline Offline

Activity: 309


View Profile
April 07, 2013, 08:54:46 AM
 #167

I fully support Gavin Andresen on this.

It is the best solution possible: no risk of overengineering, no kicking the can down the road and no half-baked attempts to solve a would-be problem that hasn't been proved to be there.

I think it is a mistake to apply the scarcity-as-economic-incentive logic to physical resources like storage or bandwidth. How scarce those resources are can only be determined by technology and the free market, not by a whimsical hard-coded little number. It is utterly wrong to draw any analogies with the 21-million-bitcoin limit, which simply affects the unit of account and doesn't relate to any physical resource. Bitcoins are neither scarce nor abundant; there's simply a predictable number of them. Extending the logic of the unit of account to actual physical resources is simply muddled economic thinking, in my opinion.

[...]
honestly, i have no hope for 4). its the least likely option to get a majority behind it. plus its too optimistic. my money is on 2) with an eventual 3).

Your list is a very good summary, but I think you're exaggerating the cons of option 4. Besides what justusranvier and RealPra have said above, the need for a plan B applies to the other options too. Also, it is the other options that will keep a constant need for consent as the block size limit will be a recurring issue as long as there is an arbitrary cap that we're about to hit at any moment. With option 4 the debate will fizzle out and will only reappear if and when we come across an actual problem (and then discussion will be based on facts and not on speculation about human behaviour).

This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

An unlimited number of times this^.
fornit
Hero Member
*****
Offline Offline

Activity: 989


View Profile
April 07, 2013, 12:39:53 PM
 #168

honestly, i have no hope for 4). its the least likely option to get a majority behind it. plus its too optimistic. my money is on 2) with an eventual 3).
This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

maybe next time you make up an argument that doesnt entirely rely on speculation about me and every other bitcoin user and i might consider thinking about it  Wink

@realpra
i cant find anything in your post that relates to mine. i think you interpreted a lot more into my post than i actually said.

@nubarius
there is very little upside to having no scarcity at all and a very concrete worst case scenario: the blockchain growth might be too quick for the current state of the technology. people dont want one piece of software to occupy whole harddisks and lightweight clients are still at a rather early development stage. just imagine some sort of major microtransaction or gambling service started using bitcoin and it grew by a factor of 100. its legitimate use alright. still, bitcoin cant possibly handle it at this point. the technology needs to be further developed before it can handle any amount of transactions. i wish the free-market enthusiasts would at least see that right now there are limits, regardless of what we want or decide.

i think its very unfortunate that both the "full node for everyone!" and the free market advocates basically act like fanatics, with very little concern about real issues (including finding a solution a majority can agree on) and a strong tendency for black-or-white thinking.

justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 07, 2013, 12:56:12 PM
 #169

there is very little upside to having no scarcity at all
That word does not mean what you think it means.

Blocks do not need a protocol limit on their maximum size in order for the space in them to be scarce.

Until computers start shipping that contain an infinite amount of RAM, and are equipped with CPUs that can process an infinite amount of data in zero time, and have internet connections which can transfer an unlimited amount of data in zero time, then space in the blockchain is scarce.

If you think an artificial limit on the size of a block is necessary to make the space scarce you either don't understand physics, or economics, or both.
fornit
Hero Member
*****
Offline Offline

Activity: 989


View Profile
April 07, 2013, 01:12:00 PM
 #170

how are any of those limitations relevant to the current situation?


mp420
Hero Member
*****
Offline Offline

Activity: 501


View Profile
April 07, 2013, 01:26:34 PM
 #171

I like the floating (miner-voted) block size idea. This is because I think:

1) Transactions require real resources (bandwidth, cpu time, storage), hence transactions must be expensive. They require these things from every full node, not just miners, so there's necessarily a kind of tragedy-of-the-commons situation in play. I cannot think of a solution that wouldn't discourage non-miners from running a full node. After trust-free lite nodes are implemented, I can't think of a reason why a non-miner would run a full node.
2) Bitcoin ceases to be useful if transactions are TOO expensive. I think the upper limit for the minimum mandatory fee (currently zero) is around $5. Above that, people just migrate to cheaper ways to transfer value. This is why keeping the 1M limit won't work.

Of course, I have never considered Bitcoin to be particularly useful for e-commerce.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
April 07, 2013, 01:28:04 PM
 #172

how is any of those limitations relevant to the current situation?
I don't even know where to begin. Maybe you could find an entry-level textbook on economics and start with the definition of scarcity, then read up on price.

You've already identified the solution, except that you erroneously identified it as the problem.

people dont want one piece of software to occupy whole harddisks
Users have neither the ability nor desire for the blockchain to grow without bound. That's why no hard-coded limit on the block size is necessary.
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 07, 2013, 04:31:55 PM
 #173

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

That effectively caps the block size at 32MB, since that is the max message size.  Would your intention be to allow splitting blocks over multiple messages?

One option would be to change the "full" block message so that it only includes the hashes of the included transactions.

Transactions would then have to be sent separately. 

This allows the entire 32MB to be used for transaction hashes, but that still limits a block to about 1 million transactions, which (famous last words) should be enough.
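The capacity arithmetic above can be checked directly; taking the message cap as 32MiB and a transaction hash as a 32-byte SHA-256 digest:

```python
# Back-of-envelope check: a 32MiB message filled entirely with 32-byte
# transaction hashes carries about 1 million hashes.

MAX_MESSAGE_BYTES = 32 * 1024 * 1024   # 32MiB protocol message cap
TX_HASH_BYTES = 32                     # SHA-256 transaction id

max_txs_per_block = MAX_MESSAGE_BYTES // TX_HASH_BYTES
print(max_txs_per_block)               # prints 1048576, i.e. ~1 million transactions
```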

Quote
What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.

Ideally, there would be miner rules and user rules.  For example, normal users might accept any size blocks.  However, when working out which block to build on, miners might reject certain blocks.  Once those blocks are buried deeply enough, the miners might accept them after all.  This would help to heal forks.

This means that only one hard fork is required to remove the limit entirely.  However, the majority of the hashing power then gets to pick the max block size.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 07, 2013, 05:02:30 PM
 #174

One option would be to change the "full" block message so that it only includes the hashes of the included transactions.

Transactions would then have to be sent separately.

Bloom filtering support already provides most of what's needed here. Matt checked in an optimization for full-match filters a short while ago; it makes sense to continue in that vein.
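For readers unfamiliar with the mechanism Mike is referring to: a Bloom filter is a compact probabilistic set that lets a lightweight peer describe which transactions it wants without naming them exactly. The toy below is a from-scratch illustration of the data structure only, not the actual wire format or client code:

```python
# Toy Bloom filter: membership tests can return false positives (which is
# the privacy feature in filtered block relay) but never false negatives.

import hashlib

class BloomFilter:
    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive num_hashes bit positions from salted SHA-256 digests.
        for i in range(self.num_hashes):
            h = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(h[:4], 'big') % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))
```

A full node holding such a filter relays a transaction only when every bit position for its hash is set, so the lightweight peer downloads a small superset of the transactions it actually cares about.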
ShadowOfHarbringer
Legendary
*
Offline Offline

Activity: 1470


Bringing Legendary Har® to you since 1952


View Profile
April 07, 2013, 08:50:24 PM
 #175

So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

I support this.

Let's try the simple solution and if it doesn't work, we can try something else.
Bitcoin is all about free market, right ? So let's allow the market to decide for us.

Configuration setting can be made for both client and miner applications, so users will be able to choose.

conv3rsion
Sr. Member
****
Offline Offline

Activity: 307


View Profile
April 07, 2013, 10:10:14 PM
 #176

So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

I support this.

Let's try the simple solution and if it doesn't work, we can try something else.
Bitcoin is all about free market, right ? So let's allow the market to decide for us.

Configuration setting can be made for both client and miner applications, so users will be able to choose.

I support it too. Letting the market decide the fees and the maximum blocksize is the right answer.

I happen to believe that a limited blocksize will lead to centralization, but for those who think the opposite (that larger blocksizes lead to centralization), I truly wish they would put their time and effort into helping BITCOIN scale. If you are concerned about computing requirements going up for the average user, you can certainly work towards optimizations that will lighten that load, without trying to replace Bitcoin for the majority of what constitutes its present-day usage. I think that's probably why Gavin is working on the payment protocol, so that dead puppy "communication" isn't needed.

All the effort and time spent discussing this issue... imagine if it was actually used productively.
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
April 07, 2013, 11:26:13 PM
 #177

All the effort and time spent discussing this issue... imagine if it was actually used productively.

I know people are tired of the 20+ threads on block size and thousands of posts in them, but this exhaustive debate is very good.

The minutiae of every point of view are being thrashed out in the open; smart ideas and not-so-smart ideas are put forward and critiqued. What is the alternative? Important decisions taken behind closed doors. Isn't that the default model of existing CBs?

I think that this debate has been productive because a lot of information has been shared between people who care about Bitcoin and want to contribute to help it become successful.


zebedee
Donator
Hero Member
*
Offline Offline

Activity: 670



View Profile
April 08, 2013, 04:54:16 AM
 #178


I support it too. Letting the market decide the fees and the maximum blocksize is the right answer.

I happen to believe that a limited blocksize will lead to centralization, but for those that think the opposite (that larger blocksizes lead to centralization), I truly wish they would put their time and effort into helping BITCOIN scale. If you are concerned about computing requirements going up for the average user, you can certainly work towards optimizations that will lighten that load, without trying to replace Bitcoin for the majority of what constitutes its present day usage. I think thats probably why Gavin is working on the payment protocol, so that dead puppy "communication" isn't needed.  

All the effort and time spent discussing this issue... imagine if it was actually used productively.

Add my vote too.  Other stances start from the premise that block space is valuable and can't be allowed to get too big too fast, yet hold the contradictory fear that it would be given away for free. And that miners would create blocks so large that miners themselves couldn't handle them, which is also a self-correcting "problem".

Block space will always come with a price; it has for two years now, despite 1MB not being a limiting number. Miners worry about their block propagation and will always have the incentive to be conservative and get their blocks known. See Eligius, for example, which ran with a 32-transaction limit for a long time.

Let the miners determine what works in the free market through trial and error.  This will change with time elastically as computing power and resources change. And that's the way it should be.
jgarzik
Legendary
*
qt
Offline Offline

Activity: 1470


View Profile
April 09, 2013, 04:24:28 AM
 #179

Satoshi also intended the subsidy-free, fee-only future to support bitcoin.  He did not describe fancy assurance contracts and infinite block sizes; he clearly indicated that fees would be driven in part by competition for space in the next block.

Unlimited block sizes are also a radical position quite outside whatever was envisioned by the system's creator -- who clearly did think that far ahead.
Appeal to authority: Satoshi didn't mention assurance contracts therefore they can not be part of the economics of the network
Strawman argument: The absence of a specific protocol-defined limit implies infinite block sizes.
False premise: A specific protocol-defined block size limit is required to generate fee revenue.

Gathering data does not imply blindly following the data's source.

Fully understanding the intent of the system's designer WRT fees is a very valuable data point in making a decision on block size limits.


Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
Zeilap
Full Member
***
Offline Offline

Activity: 154


View Profile
April 09, 2013, 05:43:23 AM
 #180

people dont want one piece of software to occupy whole harddisks
Users have neither the ability nor desire for the blockchain to grow without bound. That's why no hard-coded limit on the block size is necessary.
People tend not to want to murder others or be murdered themselves, yet it still happens and we have laws to prevent / discourage it.

Add my vote too.  Other stances start from the premise that block space is valuable and can't be allowed to get too big too fast, yet with the contradictory fear that it would be given away for free, and that miners would allow blocks so large that miners themselves couldn't handle them, which is also a self-correcting "problem".

You're working under the assumptions that everyone mining coins wants what's best for Bitcoin / cares about Bitcoin's future, and that all miners are equal - they are not; some will realize that producing large blocks that others can't handle is very much in their interest. Others may have the aim of just causing as many problems as possible because Bitcoin is a threat to them in some way, who knows? All you're doing is creating a vulnerability and expecting no-one to take advantage.

We aren't all hippies living on a fucking rainbow - problems don't just fix themselves because in your one-dimensional view of the world you can't imagine anyone not acting rationally and for the greater good.

1GLeSqooAPe8PfWbJecnL3AteDac2B3cqj
zebedee
Donator
Hero Member
*
Offline Offline

Activity: 670



View Profile
April 09, 2013, 06:09:36 AM
 #181

You're working under the assumptions that everyone mining coins wants what's best for Bitcoin / cares about Bitcoin's future, and that all miners are equal - they are not; some will realize that producing large blocks that others can't handle is very much in their interest. Others may have the aim of just causing as many problems as possible because Bitcoin is a threat to them in some way, who knows? All you're doing is creating a vulnerability and expecting no-one to take advantage.

We aren't all hippies living on a fucking rainbow - problems don't just fix themselves because in your one-dimensional view of the world you can't imagine anyone not acting rationally and for the greater good.
You are assuming things are problems with no evidence.  It's quite easy to ignore troublemakers, e.g. via the anti-spam rules.  Think outside the box a bit more.  Central planners almost never get anything right; stop wanting others to solve your problems.

Despite all the claims of trouble-making large blocks, the only one we really had over 500k was recent, and the only reason it caused a problem was entirely unrelated, owing to limitations of untested parts of the software, untested so late only because of all these special-case rules.  Let miners be free to figure it out for themselves.  I suspect a common core of rules would be generally agreed (anti-spam etc.) and everyone would do their own thing on the fringes, which is actually the case now.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2408



View Profile
April 10, 2013, 07:56:02 AM
 #182

Okay, I think this will be my final word on this (if anyone cares or is listening).

Basically it comes down to a philosophical call, since the number of variables and unknowns going into the decision makes the analysis intractable; at least I don't see any good way to reduce the variable set to get a meaningful result without over-simplifying the problem with speculative, weak assumptions.

In that case, as Gavin is saying, we should just leave it up to the market, our intuitions, our future ingenuity, and trust in our ability to solve anything that comes up, and basically hope for the best.

So now I'm thinking, let the max_block_size float upward without any hard ceiling. But have a limit on how fast it can rise that makes practical sense: a rate limiter of some sort, so it can grow as fast as the network grows but not so fast that an attacker could game it over the medium/short term to force smaller users off the network with unrealistic hardware upgrade requirements. Ultimately, though, have no upper limit on max block size in the long term, as Mike Hearn says.

As a first blush, in pseudo-code:

Code:
## Gavin A.'s rule: let MAX_BLOCK_SIZE float proportionally to recent
## median block size history, re-checked every 1008 blocks (weekly)
NEW_MAX_BLOCK_SIZE = multiple * median(block sizes of last 1008 blocks)

if NEW_MAX_BLOCK_SIZE > MAX_BLOCK_SIZE:

    ## Jeff G.'s rule: weekly time-based rate limit on MAX_BLOCK_SIZE,
    ## capping growth at a yearly doubling (Moore's law): 2^(1/52) per week
    if NEW_MAX_BLOCK_SIZE < 2^(1/52) * MAX_BLOCK_SIZE:
        MAX_BLOCK_SIZE = NEW_MAX_BLOCK_SIZE
    else:
        MAX_BLOCK_SIZE = 2^(1/52) * MAX_BLOCK_SIZE
    endif
endif

Infinite max block size growth possible but rate limited to yearly doubling rate.
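[A runnable sketch of the proposal above, for concreteness. This is my own illustration, not consensus code; the `multiple` parameter is unspecified in the post, so 2.0 below is an arbitrary placeholder.]

```python
def next_max_block_size(current_max, recent_block_sizes, multiple=2.0):
    """Max block size for the next 1008-block (roughly weekly) period."""
    sizes = sorted(recent_block_sizes)
    median = sizes[len(sizes) // 2]
    target = multiple * median                # Gavin A.'s floating target
    weekly_cap = 2 ** (1 / 52) * current_max  # Jeff G.'s yearly-doubling rate limit
    if target <= current_max:
        return current_max                    # the rule only ever raises the limit
    return min(target, weekly_cap)
```

With a 1MB limit and a sustained 900kB median, the target (2 * 900kB) exceeds the cap, so the limit grows by only the weekly factor 2^(1/52); with a 400kB median, the limit stays put.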


notig
Sr. Member
****
Offline Offline

Activity: 280


View Profile
April 10, 2013, 08:37:45 AM
 #183

Oh goodie, more Google conspiracy theories. Actually I never had to ask for approval to use 20% time on Bitcoin. That's the whole point of the policy - as long as there's some justifiable connection to the business, you can do more or less whatever you want with it and managers can't tell you not to unless it's clearly abusive. That's how we ensure it's usable for radical (i.e. unexpected) innovation.

But even if I was being paid to work on Bitcoin full time by Google, the idea that I'd want Bitcoin to grow and scale up as part of some diabolical corporate master plan is stupid. Occam's Razor, people! The simplest explanation for why I have worked so hard on Bitcoin scalability is that I want it to succeed, according to the original vision laid down by Satoshi. Which did not include arbitrary and pointless limits on its traffic levels.

The idea that Bitcoin can be a store of value with a 1mb block size limit seems like nonsense to me. That's reversing cause and effect. Bitcoin gained value because it was useful, it didn't gain use because it had value - that can't be the case because it started out with a value of zero. So if Bitcoin is deliberately crippled so most people can't use it, it will also cease to have much (if any) value. You can't have one without the other. The best way to ensure Bitcoin is a solid store of value is to ensure it's widely accepted and used on an everyday basis.

If Bitcoin was banned in a country then I think it's obvious its value would be close to zero. This is one of the most widely held misconceptions about Bitcoin, that it's somehow immune to state action. A currency is a classic example of network effects, the more people that use it, the more useful it becomes but it goes without saying that you have to actually know other people are using it to be able to use it yourself. If there was immediate and swift punishment of anyone who advertised acceptance of coins or interacted with an exchange, you would find it very hard to trade and coins would be useless/valueless in that jurisdiction.

The reason I'm getting tired of these debates is that I've come to agree with Gavin - there's an agenda at work and the arguments are a result of people working backwards from the conclusion they want to try and find rationales to support it.

Every single serious point made has been dealt with by now. Let's recap:

  • Scalability leads to "centralization". It's impossible to engage in meaningful debate with people like Peter on this because they refuse to get concrete and talk specific numbers for what they'd deem acceptable. But we now know that with simple optimisations that have been prototyped or implemented today, Bitcoin nodes can handle far more traffic than the world's largest card networks on one single computer; what's more, a computer so ordinary that our very own gmaxwell has several of them in his house. This is amazing: all kinds of individuals can, on their own, afford to run full nodes without any kind of business subsidisation at all, including bandwidth. And it'll be even cheaper tomorrow.
  • Mining can't be anonymous if blocks are large. Firstly, as I already pointed out, if mining is illegal in one place then it'll just migrate to other parts of the world, and if it's illegal everywhere then it's game over and Bitcoin is valueless anyway, so at that point nobody cares anymore. But secondly, this argument is again impossible to really grapple with because it's based on an unsupported axiom: that onion networks can't scale. Nobody has shown this. Nobody has even attempted to show this. Once again, it's an argument reached by working backwards from a desired conclusion.
  • Mining is a public good and without artificial scarcity it won't get funded. This is a good argument but I've shown how alternative funding can be arranged via assurance contracts, with a concrete proposal and examples in the real world of public goods that get funded this way. It'll be years before we get to try this out (unless the value of Bitcoin falls a lot), but so far I haven't seen any serious rebuttals to this argument. The only ones that exist are of the form, "we don't have absolute certainty this will work, so let's not try". But it's not a good point because we have no certainty the proposed alternatives will work either, so they aren't better than what I've proposed.

Are there any others? The amount of time spent addressing all these arguments has been astronomical and at some point, it's got to be enough. If you want to continue to argue for artificial scaling limits, you need to get concrete and provide real numbers and real calculations supporting that position. Otherwise you're just peddling vague fears, uncertainties and doubts.

+1
gmaxwell
Moderator
Legendary
*
qt
Offline Offline

Activity: 2324



View Profile
April 10, 2013, 09:53:41 AM
 #184

Okay, I think this will be my final word on this (if anyone cares or is listening).

Meh.

Any such algorithm will just be guesswork that could be wildly wrong. A small error in the exponent makes a big difference: if computers' ability to handle blocks doubles in 18 months instead of 12, then over ten years the allowed size grows 1024x while capacity grows only about 101x, leaving blocks roughly 10x too big to handle.

And "median of blocks" basically amounts to "miners choose". If you're going to have miners choose, I'd rather have them insert a value in the coinbase and take the median of that; then at least they're not incentivized to bloat blocks just to up the cap, or forced to forgo fees to their competition just to speak their mind on making it lower. Smiley  But miners-choose is not great, because miners' alignment with everyone else is imperfect, and right now the majority of the mining decision is basically controlled by just two people.

There are two major areas of concern from my perspective:  The burden on unpaid full nodes making Bitcoin not practically decentralized if it grows faster than technology, and a race to the bottom on fees and thus PoW difficulty making Bitcoin insecure (or dependent on centralized measures like bank-signed-blocks).  (and then there are then some secondary ones, like big pools using large blocks to force small miners out of business)

If all of the cost of being a miner is in the handling of terabyte blocks and none in the POW, an attacker who can just ignore the block validation can easily reorg the chain. We need robust fees— not just to cover the non-adaptive "true" costs of blocks, but to also cover the POW security which can adapt as low as miners allow it.

The criteria you have only speak to the first of these indirectly— we HOPE computing will follow some exponential curve— but we don't know the exponent.  It doesn't speak to the second at all unless you assume some pretty ugly mining cartel behavior to fix the fee by a majority orphaning valid blocks by a minority (which, even if not too ugly for you on its face would make people have to wait many blocks to be sure the consensus was settled).

If you were to use machine criteria, I'd add a third: it shouldn't grow faster than (some factor of) the difficulty change.  This directly measures the combination of computing efficiency and miner profitability.  Right now difficulty change looks silly, but once we're comfortably settled with ASICs on the latest silicon process it should actually reflect changes in computing, and should at least prevent the worst of the difficulty-tends-to-1 problem (though it might not keep up with computing advances and we could still lose security).  Also, difficulty as a limit means that miners actually have to _spend_ something to increase it; the median-of-size approach means that miners have to spend something (lost fees) to _decrease_ it...

Since one of my concerns is that miners might not make enough fees income and the security will drop, making it so miners have to give up more income to 'vote' for the size to go down... is very not good. Making it so that they have to burn more computing cycles to make it go up would, on the other hand, be good.

Code:
every 2016 blocks:

new_max_size = min( old_size * 2^(time_difference/year),
                    max( max(100k, old_size/2),
                         median(miner_coinbase_preferences[last 2016 blocks]) ),
                    max( 1MB, 1MB * difficulty/100million ),
                    periodic_hard_limit (e.g. 10 Mbytes)
                  )
(the 100m number there is completely random; we'll have to see what things look like in 8 months, once the recent difficulty surge settles some, otherwise I could propose something more complicated than a constant there.)
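[A runnable rendering of the four-way min above, for readers following along. This is my transcription, not gmaxwell's code; names mirror the pseudocode, and the 100-million difficulty constant is the placeholder the post itself calls "completely random".]

```python
def next_max_size(old_size, years_elapsed, coinbase_prefs, difficulty,
                  periodic_hard_limit=10_000_000):
    prefs = sorted(coinbase_prefs)
    median_pref = prefs[len(prefs) // 2]                    # miners' coinbase vote
    return min(
        old_size * 2 ** years_elapsed,                      # time-based doubling limit
        max(max(100_000, old_size // 2), median_pref),      # miner vote, floored
        max(1_000_000, 1_000_000 * difficulty // 100_000_000),  # difficulty-coupled limit
        periodic_hard_limit,                                # consent-of-the-governed cap
    )
```

Whichever rule is tightest binds: e.g. with low difficulty the difficulty term pins the result at 1MB regardless of how high miners vote.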

Mind you, I don't think this is good by itself. But the difficulty-based check improves it a lot. If you were to _then_ augment these three rules with a ConsentOfTheGoverned hard upper limit that could be reset periodically if people thought the rules were doing the right thing for the system and decentralization was being upheld... well, I'd run out of things to complain about, I think. Having those other limits might make agreeing on a particular hard limit easier; e.g. I might be more comfortable with a higher one knowing that there are those other limits keeping it in check. And an upper boundary gives you something to test software against.

I'm generally more of a fan of rule by consent. Public decisions suck and they're painful, but they actually measure what we need to measure and not some dumb proxies that might create bad outcomes or weird motivations: there is some value which is uncontroversial, and it should go to that level. Above that might be technically okay but controversial, and it shouldn't go there.   If you say that a publicly set limit can't work, then you're basically saying you want the machine to set behavior against the will and without consent of the users of the system, and I don't think that's right.
 

Bitcoin will not be compromised
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 10, 2013, 11:53:13 AM
 #185

What about directly targeting fees, similar to Gavin's suggestion that blocks must have fee > 50 * (size in MB).

If the average (or median) block fee (including minting) for the last 2016 blocks is more than 65, then increase the block size by 5%; if it is less than 35, then drop it by 5%.  This allows the block size to change by a factor of roughly 3.5 per year.  It also guarantees that the miner fees stay between 35 and 65 per block, so keeps the network secure.

Ofc, if decreased block size causes decreased usage, which causes less tx fees, then you could get a downward spiral, but the limit is a factor of 3.5 per year in either direction.
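[A toy version of this fee-targeting rule, using the constants from the post (the 35..65 band and 5% step are TierNolan's; the function name is mine):]

```python
def adjust_max_size(max_size, avg_block_fee):
    """Nudge the size limit 5% per 2016-block retarget toward the 35..65 fee band."""
    if avg_block_fee > 65:
        return max_size * 1.05   # fees high: add space so fees fall
    if avg_block_fee < 35:
        return max_size * 0.95   # fees low: shrink space so fees rise
    return max_size
```

With ~26 retargets per year, the upward bound is 1.05^26 ≈ 3.56x per year (and 0.95^26 ≈ 0.26x downward), which is where the "factor of 3.5 in either direction" comes from.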

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
fornit
Hero Member
*****
Offline Offline

Activity: 989


View Profile
April 10, 2013, 01:52:40 PM
 #186

It also guarantees that the miner fees stay between 35 and 65 per block, so keeps the network secure.

it also guarantees that bitcoin always costs more than 8.7%-16.2% of its current market cap in fees. sounds like a solid plan. at 100 billion dollar market cap we need to start thinking about invading some small countries for mining rig storage space though.
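[Sanity-checking those percentages: since the 35..65 BTC band is denominated in BTC, I read fornit's figures as fees measured against the eventual 21M BTC supply per year — an assumption on my part:]

```python
blocks_per_year = 365.25 * 24 * 6        # one block per ~10 minutes
low = 35 * blocks_per_year / 21_000_000  # ~0.088, i.e. fornit's ~8.7% figure
high = 65 * blocks_per_year / 21_000_000 # ~0.163, i.e. fornit's ~16.2% figure
```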

Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652


Chief Scientist


View Profile WWW
April 10, 2013, 04:29:29 PM
 #187

What about directly targeting fees, similar to Gavin's suggestion that blocks must have fee > 50 * (size in MB).
Nah, I now think that's a dumb idea.

Responding to gmaxwell:

RE: burden of unpaid full nodes: for the immediately foreseeable future, that burden is on the order of several hundred dollars a year to buy a moderately beefy VPS somewhere.

I understand the "lets engineer for the far future" ... but, frankly, I think too much of that is dumb. Successful projects and products engineer for the next year or two, and re-engineer when they run into issues.

Maybe the answer will be "validation pools" like we have mining pools today, where people cooperate to validate part of the chain (bloom filters, DHTs, mumble mumble....). Maybe hardware will just keep up.

RE: race to the bottom on fees and PoW:

sigh.  Mike explained how that is likely to be avoided. I'm 100% convinced that if users of the network want secure transactions they will find a way to pay for them, whether that is assurance contracts or becoming miners themselves.

How often do you get the chance to work on a potentially world-changing project?
ripper234
Legendary
*
Offline Offline

Activity: 1260


Ron Gross


View Profile WWW
April 12, 2013, 01:05:59 PM
 #188

Posted the wiki article to reddit.

I haven't read the thread in a while. Can I assume the wiki is relatively up to date, or does it need updating?

Please do not pm me, use ron@bitcoin.org.il instead
Mastercoin Executive Director
Co-founder of the Israeli Bitcoin Association
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 12, 2013, 02:21:11 PM
 #189

The wiki page is basically a copy of what I wrote with a few edits to make it sound more wiki-ish.
ripper234
Legendary
*
Offline Offline

Activity: 1260


Ron Gross


View Profile WWW
April 12, 2013, 02:25:19 PM
 #190

The wiki page is basically a copy of what I wrote with a few edits to make it sound more wiki-ish.

Coolness - keep up the good work Mike!

Please do not pm me, use ron@bitcoin.org.il instead
Mastercoin Executive Director
Co-founder of the Israeli Bitcoin Association
Stampbit
Full Member
***
Offline Offline

Activity: 182



View Profile
April 12, 2013, 05:57:07 PM
 #191

Can someone explain why limited blocksizes limit the number of transactions that can be placed? Wouldn't 10 100kb blocks get solved in as much time as one 1mb block?
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 12, 2013, 07:06:23 PM
 #192

Can someone explain why limited blocksizes limit the number of transactions that can be placed? Wouldn't 10 100kb blocks get solved in as much time as one 1mb block?

No, a block takes the same time to solve, no matter how many transactions are in it.  You only have to solve the block header, which is 80 bytes.  You run the header through the SHA-256 hash function twice to see if you "win".  If not, you increment an integer in the header and try again.  Mining is repeating the hash function until you get an integer that "wins".

Basically, once you have gathered all the transactions you want to include, you calculate the merkle root.  This is a hash committing to all the transactions.  You include that number in the header and it is always the same size (32 bytes).

Larger blocks will potentially cause bandwidth issues when you try to propagate them though, since you have to send all the transactions.
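[TierNolan's explanation can be sketched as a toy mining loop. This is an illustration only: the byte order and target encoding of real Bitcoin headers are simplified here, and the target below is made trivially easy so it finishes instantly.]

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_prefix: bytes, target: int) -> int:
    """Increment the nonce until the header hash falls below the target."""
    nonce = 0
    while True:
        header = header_prefix + nonce.to_bytes(4, "little")
        if int.from_bytes(double_sha256(header), "little") < target:
            return nonce
        nonce += 1
```

The merkle root (32 bytes, committing to every transaction) lives inside `header_prefix`, which is why the hashing work is independent of how many transactions the block carries.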

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 12, 2013, 07:52:27 PM
 #193

The wiki page is basically a copy of what I wrote with a few edits to make it sound more wiki-ish.

So, the procedure is basically, create a transaction:

Inputs:
2: <valid auth> + SIGHASH_ANYONECANPAY

Outputs
10: OP_RETURN <so dead>

This transaction cannot be spent.  However, when performing the authentication, all other inputs are ignored.  This means that if someone modified it to

Inputs:
2: <valid auth> + SIGHASH_ANYONECANPAY
8: <valid auth> + SIGHASH_ANYONECANPAY

Outputs
10: OP_RETURN <so dead>

It becomes valid.  The tx is validated twice, once with each input, and all the keys have to be valid.

It seems that you can get the same effect without OP_RETURN by having a transaction that pays to OP_TRUE OP_VERIFY.

Also, what is to stop a miner adding the extra 8 BTC to get it over the limit?  In that case, the block is funded anyway.

The disadvantage is that the miner must add a 2nd transaction to direct the OP_TRUE output to his actual address.  However, no hard fork is required.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Stampbit
Full Member
***
Offline Offline

Activity: 182



View Profile
April 12, 2013, 08:24:01 PM
 #194


Larger blocks will potentially cause bandwidth issues when you try to propagate them though, since you have to send all the transactions.

When you propagate them, do you send them to all other miners or only a subset of miners?
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1106


View Profile
April 12, 2013, 08:27:06 PM
 #195

It seems that you can get the same effect without OP_RETURN by having a transaction that pays to OP_TRUE OP_VERIFY.

Actually you just need OP_TRUE as the scriptPubKey, or leave it empty and have the miner provide OP_TRUE in the scriptSig. I mentioned the trade-offs of the approach on IRC a few weeks ago.

Also, what is to stop a miner adding the extra 8 BTC to get it over the limit?  In that case, the block is funded anyway.

Ouch... not much of an assurance contract then is it? It's basically just a donation at random - that's very clever of you. There is of course the risk that your block gets orphaned and another miner takes the fees instead, but that happens something like 1% of the time now, and potentially a lot less in the future. Miners can deliberately orphan the block too, but we really want to implement mechanisms to discourage that behavior for a lot of reasons.

You could use nLockTime: basically you would all participate in authoring an nLockTime'd assurance contract transaction. Some time before the nLockTime deadline approaches, if the assurance contract is *not* fully funded, IE a miner might pull the "self-assure" trick, you double-spend the input that would have gone to the contract so your contribution to it is no longer valid. On the other hand, that means anyone can DoS attack the assurance contract process itself by doing exactly that, and if they time it right, they can still pull off the "self-assure" trick. Risky though - it's hard to reason about what is the correct strategy there, although it's obviously easier to pull off all those attacks if you control a large amount rather than a small amount of the overall hashing power.

With a hardfork we can fix the issue though, by making it possible to write a scriptPubKey that can only be spent by a transaction following a certain template. For instance it could say that prior to block n, the template can only be a mining fee donation of value > x, and after, the spending transaction can be anything at all. It'll be a long time before that feature is implemented though.

In any case all these tricks would benefit an attacker trying to depress the overall network hash rate, either to launch double-spend attacks, attack Bitcoin in general, or just keep funds from flowing to the competition if they are a miner.

TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 12, 2013, 08:32:20 PM
 #196

When you propagate them, do you send them to all other miners or only a subset of miners?

All full nodes you are connected to.  They will verify the block and, if it is OK, forward it to all full nodes they are connected to.  However, if they receive the block a 2nd (or later) time, they don't forward it.

This sends the block to all nodes on the network.  However, large blocks will propagate more slowly.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 12, 2013, 09:09:26 PM
 #197

Also, what is to stop a miner adding the extra 8 BTC to get it over the limit?  In that case, the block is funded anyway.
Ouch... not much of an assurance contract then is it? It's basically just a donation at random - that's very clever of you. There is of course the risk that your block gets orphaned and another miner takes the fees instead, but that happens something like 1% of the time now, and potentially a lot less in the future. Miners can deliberately orphan the block too, but we really want to implement mechanisms to discourage that behavior for a lot of reasons.

Yeah, there is a risk, especially if the amount you add is very high.

A similar situation was discussed in another thread (ish).

The current rule is mine the longest chain.  However, if a miner included a payment to true in the block, then it would encourage other miners to build on his block.

If the chain was

A -> B -> C ->

but B has a large payment to true and C doesn't, then miners would be encouraged to keep mining against B, rather than accept the new block. 

This means that you have a tradeoff.  If you create C' and it pays a lot of the reward from B onward to true, then you weaken the incentive.

An equilibrium would set it where there is an incentive to include some payment to true.  This means that tx fees are effectively smeared.

I assumed there were basically 2 strategies:

1) Mine against the longest chain
2) Mine against one of the top-2 blocks, whichever pays the highest to true

Depending on the payout for the top-2 blocks, neither strategy wins outright, a certain fraction will follow each of them.

Quote
You could use nLockTime: basically you would all participate in authoring an nLockTime'd assurance contract transaction. Some time before the nLockTime deadline approaches, if the assurance contract is *not* fully funded, IE a miner might pull the "self-assure" trick, you double-spend the input that would have gone to the contract so your contribution to it is no longer valid.

You can only spend your own input.  All other inputs would still be valid.  In effect, all participants would have to cancel.

It does offer protection for those who actually care.  The miner would have to publish the tx a few days (or hours) before the deadline, so he couldn't add it to his block.

Quote
On the other hand, that means anyone can DoS attack the assurance contract process itself by doing exactly that, and if they time it right, they can still pull off the "self-assure" trick.

The only thing that would accomplish is to reduce the total, since the inputs from anyone else who contributed would still be valid.

Quote
With a hardfork we can fix the issue though, by making it possible to write a scriptPubKey that can only be spent by a transaction following a certain template. For instance it could say that prior to block n, the template can only be a mining fee donation of value > x, and after, the spending transaction can be anything at all. It'll be a long time before that feature is implemented though.

What would be good would be the timestamping thing.  For example, requiring that a hash of the tx must be included at least 10 blocks previously.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
gglon
Member
**
Offline Offline

Activity: 64


View Profile
April 12, 2013, 10:35:33 PM
 #198


Hi guys, being not that deep into bitcoin, I was trying to make some sense of Mike's proposal. I wrote a comment on reddit describing it to normal people as best as I could, but I'm still not sure if it's more or less right. It would be nice if someone translated the proposal into more accessible language, as the way we will fund the network in the future is pretty important, and more people should know what the options are.
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 12, 2013, 11:00:59 PM
 #199


Hi guys, being not that deep into bitcoin I was trying to make some sense from Mike's proposal. I wrote a comment on reddit describing it to normal people as best as I could. But I'm still not sure if it's more or less right. It would be nice if someone translated the proposal to more accessible language, as the way we will fund the network in the future is pretty important, and more people should know what are the options.

It is an assurance contract.  You say "Pay the miner who includes this transaction 100BTC, signed by me for 10BTC, allow extra signatures".  This is an invalid transaction, since you haven't included enough money.

However, if others create more transactions, you end up with something like

Pay the miner who includes this transaction 100BTC, signed by me for 10BTC, allow extra signatures
Pay the miner who includes this transaction 100BTC, signed by person2 for 60BTC, allow extra signatures
Pay the miner who includes this transaction 100BTC, signed by person3 for 30BTC, allow extra signatures

These can be combined, since they are identical except for the signatures to give:

Pay the miner who includes this transaction 100BTC, signed by me for 10BTC, signed by person2 for 60BTC, signed by person3 for 30BTC, allow extra signatures

This is only possible if you explicitly allow extra signatures.  

The final transaction is valid, so can be submitted to the main network.  If a miner includes the transaction in the block, they get 100BTC.

The idea is that lots of people could add a few BTC.  However, it is only valid if the total is 100BTC (or more).
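[The combining step can be modeled in a few lines. This is a minimal sketch of the accounting only (the function name is mine); the real mechanism relies on SIGHASH_ANYONECANPAY signatures, not a sum over plain numbers:]

```python
def combine_pledges(pledges, target=100):
    """Merge individually-invalid pledge transactions and report whether
    the combined transaction is broadcastable, i.e. the inputs cover the
    target output paid to whichever miner includes it."""
    total = sum(pledges)
    return total >= target, total
```

For the three-signer example above: `combine_pledges([10, 60, 30])` reports the contract as fully funded, while any subset short of 100 BTC remains invalid.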

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
April 12, 2013, 11:18:05 PM
 #200

Arguably OT, but still important to the thrust of the OP and much of the debate where network security = decentralization.
Astounding developments like this will drive forward Bitcoin's success.

I feel that just THIS makes me strong bull again ..



friedcat presented USB powered mini ASIC running 300MH/sec. That's what I call DECENTRALIZATION.

This looks great. Thread for interested lurkers: https://bitcointalk.org/index.php?topic=99497.3180

Stampbit
Full Member
***
Offline Offline

Activity: 182



View Profile
April 12, 2013, 11:40:07 PM
 #201

Arguably OT, but still important to the thrust of the OP and much of the debate where network security = decentralization.
Astounding developments like this will drive forward Bitcoin's success.

I feel that just THIS makes me strong bull again ..



friedcat presented USB powered mini ASIC running 300MH/sec. That's what I call DECENTRALIZATION.

This looks great. Thread for interested lurkers: https://bitcointalk.org/index.php?topic=99497.3180

If that USB ASIC also includes FiOS then I'd call it decentralization too.
johnyj
Legendary
*
Offline Offline

Activity: 1834


Beyond Imagination


View Profile
April 13, 2013, 12:11:34 AM
 #202


I understand the "lets engineer for the far future" ... but, frankly, I think too much of that is dumb. Successful projects and products engineer for the next year or two, and re-engineer when they run into issues.


I guess when Satoshi laid out the framework for bitcoin, he was looking at least 100 years ahead

Of course the fee problem will only appear after 100 years, enough time to discuss Smiley





Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526


View Profile
April 13, 2013, 03:13:11 PM
 #203

If a miner completes a contract with their own funds it doesn't make any difference; they're just taking part as normal and mining for less money.
TierNolan
Legendary
*
Offline Offline

Activity: 1120


View Profile
April 13, 2013, 09:38:04 PM