Bitcoin Forum
December 12, 2017, 11:59:11 PM *
Author Topic: Permanently keeping the 1MB (anti-spam) restriction is a great idea ...  (Read 103883 times)
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218


Gerald Davis


View Profile
February 05, 2015, 01:31:00 AM
 #1

Permanently keeping the 1MB (anti-spam) restriction is a great idea, if you are a bank. Those favoring a permanent 1MB cap, whilst asserting that Bitcoin can still be a financial backbone of sorts, don't know how right they are. The problem isn't a limit in general but that 1MB provides so little transaction capacity that, under any meaningful adoption scenario, it will push all individual users off the blockchain to rely on trusted third parties. 1MB is insufficient for end to end direct user access, but it is sufficient for a robust inter-‘bank’ settlement network.

If the cap is not raised to some higher limit, allowing a larger number of users to maintain direct access, then individuals will be priced out of the blockchain. When that happens Bitcoin becomes yet another network with no direct (peer) access, like FedWire, SWIFT, and other private closed transfer networks. There is no realistic scenario where a network capped permanently at 1MB can have meaningful adoption whilst still maintaining direct access to the blockchain by individuals. To be clear, by 'direct access' I mean both parties transacting directly on-chain without relying on an intermediary or trusted third party.

Quote
Bitcoin ... A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution.
Satoshi Nakamoto - Bitcoin Whitepaper

Finding a balance
If the transaction capacity of a peer to peer network is very low then the resource requirements of running a full node will also be low, but the cost of transactions will be high.  A network that keeps computing requirements low but which has priced direct access out of the hands of individuals doesn't help decentralization.  Likewise if the transaction capacity of the network is very high then the cost of transactions will be low but the resource requirements for running a full node will be high.  A network that has sufficient transaction capacity for every possible transaction but which requires resources that put operation of a full node beyond the capabilities of individuals also doesn't help decentralization.

There is no "perfect" transaction rate.  It is a compromise between two different centralization risks.  Lowering the risk of transaction centralization raises the risk of node centralization, and the reverse is also true.  A robust network means minimizing overall risk, not minimizing one risk at the expense of the other.  This means that neither extreme is optimal.  What level of transaction capacity (in terms of block size) provides the optimum balance will be saved for future discussions.  Having a discussion on how to change the network first requires an acceptance that the network needs to change.  So let's start with why the network must change.


Can we stop talking about a cup of coffee please?
If you have been following the discussion you may have heard a claim such as "Every $5 coffee doesn't need to be on the blockchain so there is no need to raise the limit".   The implied meaning is that while the cost of permanently keeping the limit will exclude 'trivial' transactions, you will still have direct access to the blockchain for everything else.   This is misleading: the 1MB restriction is so tight that larger, more meaningful transactions will eventually become uneconomical as well.

I don't really care if my morning coffee is paid for using a centralized service.  Centralization and trusted third parties aren't always bad. <gasp>.  Context matters, so try to read the rest before freaking out.  Have you ever given or received a gift card?   Can't get more centralized than that.  If I put $100 in Bitcoins under the control of a third party, say in an ewallet or bitcoin debit card service, the risk and scope of that centralization are limited and manageable.  As long as direct access to the network remains economical I can securely store and transfer my wealth using on-chain transactions and use centralized solutions where the risk is low, such as day-to-day purchases.  On the other hand, if the imbalance between transaction demand and capacity makes individual transactions uneconomical, I will lose direct access altogether, and that risk is more severe and the consequences more significant.

Sidechains, payment channels, and cross-chain atomic transactions are decentralized systems that can move some of the transaction volume off the primary chain.  In essence, like centralized solutions, they can act as a multiplier allowing a higher number of total transactions than the number of direct on-chain transactions.   It is important to realize they still rely on the primary chain having sufficient transaction capacity or they aren't trustless.  As an example, a payment channel could allow hundreds or even thousands of off-chain transactions, but it requires periodic on-chain transactions to create, adjust, and take down.  If the individual user loses direct access to the primary chain they also lose trust-free solutions like payment channels.  If direct access becomes prohibitively expensive then the only alternative which provides sufficient scale is using trusted third parties.

When demand significantly exceeds capacity it increases the utility and value of centralized solutions
If transaction demand exceeds capacity by an order of magnitude or more it will lead to direct users being replaced by trusted third parties acting as aggregators.   Centralized services have a lot of disadvantages, but they are more efficient, and keeping the artificial limit plays right into their strengths.  A third party can facilitate instant off-chain transactions.  If demand outstrips the very limited capacity it will force transactions off-chain through trusted third parties, which I will call processors.

Today these processors would include online wallet providers, exchanges, merchant payment service providers, and services where the user maintains a balance under the control of the merchant (casino, poker room).  In time even traditional financial companies and banks could be processors.  The one thing they all have in common is that customers of these services do not have direct control over private keys and do not directly make transactions on the network.  The processor has control over the private keys, keeps the Bitcoins in reserve and maintains an internal ledger of user transactions and balances.   Processors can trivially facilitate transactions between two customers of their service.   Since they control the private keys they simply update the ledger balances of the two customers and many of them do this today.  

However, transactions can still occur off-chain even if they involve customers of different processors.  The process is not trustless but the risk can be managed.  Two processors would aggregate all the transactions which occur between their customers and then periodically settle the difference.  When a payment from a customer of one processor is made to a customer of another processor, the sending processor will notify the receiving processor.  Both processors will update their internal ledgers.  Over time this will result in an accumulated balance owed by one processor to the other, which can be settled with a single on-chain transaction.  The key thing is that there is a one-to-many relationship between the settlement transactions and the underlying customer transactions.   While the 1MB block limit does not provide sufficient capacity for a large number of direct user transactions, third party processors can facilitate a very large number of off-chain transactions using a smaller number of on-chain settlements.  Blocks of a finite size can support a nearly unlimited amount of off-chain transaction capacity, with the limitation that it involves the use of trusted third parties.

You can't compete with a 'bank'
You might be considering the point above but dismissing it because you still 'could' submit a transaction to the network.  The processors described above don't have the ability to close the network, but a network that you have technical access to, yet which is uneconomical to use, is effectively no access at all.  The current block size realistically limits capacity to no more than two to four transactions per second.  Two transactions per second is roughly 64 million transactions per year.  A finite number of transactions can't support an unlimited number of direct users.  Say at some point there are ten million users and they wish to make two transactions per month.  That is 240 million transactions but only 64 million will fit in the blocks.  What happens to the excess?  If third party processors are attractive, the difference will be handled by them.  When you consider that a settlement network would allow these third party processors to offer instant, 'no risk' transactions at significantly lower fees than on-chain transactions, the excess demand will be processed off-chain.   If the network continues to grow, the profitability of these companies will grow, and that will lead to more third party companies.   Those settlement transactions allow more off-chain transactions but at the same time compete with direct user transactions for the limited on-chain capacity.  In a settlement network the upper limit on the number of settlements required grows with the square of the number of trusted peers.  Just two hundred trusted peers (crypto 'banks') performing hourly settlements would fill the blocks, all the blocks, perpetually.  There may be billions of 'Bitcoin' transactions but they would be nothing more than updates on centralized ledgers.  The blockchain would just handle the settlement between massive financial service providers.
As these entities are collecting fees, and on chain transactions are a necessity of doing business, they can and will pay far more than you to ensure timely inclusion in a block.   When you as an individual have been reduced to a position where you must outbid a 'bank' for space in the blockchain then you have effectively lost access to the blockchain.
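The two-hundred-peer figure can be checked with back-of-the-envelope arithmetic.  This is only an illustration (not from the post): every pair of peers is assumed to settle once per hour, and the 373-byte balanced-transaction size estimated later in this post is used for block capacity.

```python
# Sketch: pairwise hourly settlements among trusted peers vs 1MB block capacity.
# Assumptions (mine, for illustration): each pair settles once per hour, and
# a settlement transaction is 373 bytes (the balanced-transaction size below).

def pairwise_settlements_per_hour(peers):
    # Upper bound: every pair of trusted peers settles once per hour.
    return peers * (peers - 1) // 2

def onchain_txns_per_hour(block_bytes=1_000_000, txn_bytes=373, blocks_per_hour=6):
    # ~6 blocks per hour at one block per 10 minutes.
    return blocks_per_hour * (block_bytes // txn_bytes)

settlements = pairwise_settlements_per_hour(200)  # 19,900 settlements per hour
capacity = onchain_txns_per_hour()                # 16,080 transactions per hour
print(settlements > capacity)                     # True: settlements alone fill every block
```

Even under these generous assumptions, the settlement traffic of 200 hourly-settling peers alone exceeds what six 1MB blocks per hour can carry.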

Wait, I don't get how off-blockchain transactions could occur across entities
Imagine two large financial service providers where thousands of customers make payments to customers of the other entity.  These may not be banks in the traditional sense; they can be any entity which acts as a third party to manage the bitcoins and transactions of customers.  Today that could be major exchanges, payment processors, and ewallet providers, but tomorrow it could include traditional financial service companies or even banks.   For this example let's call the two entities Chase and HSBC.   Chase and HSBC can notify each other when one of their customers makes a payment to a customer of the other entity.  Both would update their internal ledgers and the payments would appear to occur instantly.  Most importantly, none of these payments would require an on-chain transaction.  It is just updating numbers in a ledger.  If you are a Coinbase customer and pay another Coinbase customer this happens today.  We are only taking it a step further and handling cross-entity transactions.  Now the entities have no real cost to perform these payments.  They are just sharing a few bytes of information with their counterparty and updating numbers in a database.  However over time, the net amount of the thousands of transactions will result in one entity accumulating a balance owed to the other.  This is why settlement networks require some level of trust.   It requires trusted peers to extend mutual lines of credit to each other.  The more they trust each other, the larger the lines of credit and the less frequently they need to settle.  It is also why you will never be a peer on this network.  The entities would enter into a legally binding agreement which sets up the conditions of the settlement.  The amount of funds the entities are risking is limited.  The entities will limit the amount of credit they will extend and the terms are usually very short.  These aren't long term loans; in the traditional banking world settlement might occur the next business day.
The efficiency of the blockchain allows for lower capital requirements and lower risk by settling more frequently maybe even hourly.

Imagine that in a particular hour HSBC customers make thousands of payments to Chase customers totalling 10,000 BTC and Chase customers make thousands of payments to HSBC customers totalling 3,000 BTC.  In total transaction volume is worth 13,000 BTC.   As these payments occur Chase and HSBC notify the other party.  This allows both to update their ledgers.  A customer making a payment would see their balance be reduced 'instantly' and the customer receiving a payment would see their balance increase 'instantly'.  The net flows however are not balanced.  Chase has increased their customer balances 10,000 BTC and only reduced their customer balances 3,000 BTC.  On the books both entities have a liability called 'Customer Deposits'.  They keep reserves (hopefully >=100% to cover those liabilities).  However Chase has seen its liability increase by 7,000 BTC and HSBC has seen its liability decrease by the same amount.  To reconcile this HSBC will make a single on-chain transaction to Chase for 7,000 BTC.  This will increase Chase's reserves by 7,000 BTC and decrease HSBC's reserves by 7,000 BTC.  Everything is balanced again.   Yes it did require a limited amount of trust between settlement peers and a single on-chain transaction but it facilitated thousands of off chain transactions.  As soon as the next cross entity transaction happens a balance is owed by one entity and the net amount owed will increase and decrease until the next settlement which balances the books again and the cycle perpetually continues.
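The netting in this example is a single subtraction.  As a sketch (the names and BTC figures come from the paragraph above; the function itself is just an illustration):

```python
# Sketch of hourly netting between two settlement peers. Names and figures
# follow the HSBC/Chase example above; this is an illustration, not a protocol.

def net_settlement(hsbc_to_chase, chase_to_hsbc):
    """Return (payer, receiver, amount) for the single on-chain settlement."""
    net = hsbc_to_chase - chase_to_hsbc
    if net >= 0:
        return ("HSBC", "Chase", net)
    return ("Chase", "HSBC", -net)

# HSBC customers paid Chase customers 10,000 BTC; Chase customers paid 3,000 BTC back.
payer, receiver, amount = net_settlement(10_000, 3_000)
print(payer, receiver, amount)  # HSBC Chase 7000
```

One 7,000 BTC on-chain transaction reconciles thousands of off-chain customer payments; that is the one-to-many relationship described above.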

Now when demand for transactions exceeds what is possible who do you think can pay more in fees?  You or a bank?  If transaction demand exceeds capacity then some transactions will not make it into a block.  Those paying the highest fee will be the ones who retain access to the blockchain and those unable or unwilling to will be excluded.  It is delusional to think it will be the 'banks' that suffer in a situation like that.

The reported 7tps transaction capacity does not exist.
There is a myth that without raising the limit the network could handle 7tps.  It can't.  The limit is 1MB per block; the actual transaction capacity depends on the average transaction size, and realistically that provides no more than 2 to 4 tps.  To achieve 7 tps using one 1MB block every 600 seconds, the average transaction size would have to be about 240 bytes (1,000,000 bytes / 600 seconds / 7 tps ≈ 238 bytes).  If you have a Bitcoin wallet handy take a look at your last dozen transactions, and if you don't have a wallet handy use a website to look up the transactions in the most recent block.  How many of the transactions were under 240 bytes? Not very many.  I am going to say the majority of your transactions were probably between 300 and 700 bytes.

Can you form a 240 byte transaction?  Sure, as long as you have only a single input.  A transaction input requires at least 147 bytes, so an average of 240 bytes per transaction is not possible unless the average number of inputs is less than 2.  While some transactions may have one input, on average they are going to have more.   The total number of inputs in the blockchain will roughly equal the total number of outputs.  As the number of blocks approaches infinity, the ratio of inputs to outputs in a well functioning blockchain will approach 1:1.

Since most outputs will eventually become inputs it makes more sense to look at block capacity using a balanced transaction as a template for transaction size.  A balanced transaction is one where the number of inputs equals the number of outputs.  Single-input, single-output transactions are the exception, both rare and of limited use.   A 2 input, 2 output transaction using all compressed keys and P2PKH scripts is typical and weighs in at 373 bytes.  At 373 bytes per transaction and 1MB per block the network will not exceed 4.4 tps.  This is already 37% less than claimed, but it is still unrealistic as it represents the smallest possible balanced transaction.

Most transactions are going to be larger than 373 bytes due to the use of uncompressed keys, more complex scripts, and more inputs and outputs per transaction.  Looking at the last million transactions in the blockchain I found the average txn size was 560 bytes.  At 560 bytes per transaction and 1MB per block the network will not exceed 3.0 tps.  So we have already lost over half of the claimed capacity, and this is very likely to shrink further over time as transaction sizes creep higher.  Multisig and other more complex scripts are being used more frequently and that trend will continue.  A good estimate for the network throughput when limited to 1MB blocks would be 2 to 4 tps, depending on how optimistic you want to be.
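The throughput arithmetic in this section is one division, easy to verify yourself (a sketch; the byte figures are the ones used above, and note the 373-byte result rounds to 4.5 where the text truncates to 4.4):

```python
# Sketch: upper-bound throughput at a given average transaction size.
# 1MB block every 600 seconds, as assumed throughout this section.

def max_tps(avg_txn_bytes, block_bytes=1_000_000, block_interval_s=600):
    """Maximum transactions per second for a given average transaction size."""
    return block_bytes / block_interval_s / avg_txn_bytes

print(round(max_tps(240), 1))  # 6.9 -- the '7 tps' myth needs ~240-byte transactions
print(round(max_tps(373), 1))  # 4.5 -- smallest balanced (2-in, 2-out P2PKH) transaction
print(round(max_tps(560), 1))  # 3.0 -- observed average transaction size
```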

Here is a direct comparison of the combined script size for some different types of scripts.  The scriptPubKey is encoded in a transaction output and the scriptSig is encoded in the transaction that "spends" that output.  Since outputs eventually become inputs in new transactions the combined size of the scriptPubKey and scriptSig represents the "roundtrip" script cost of a transaction.

Code:
       P2PkH:   131 bytes per script round trip (25 byte scriptPubKey +   106 byte scriptSig)
  2-of-3 P2SH:   253 bytes per script round trip (22 byte scriptPubKey +   231 byte scriptSig)  
  3-of-5 P2SH:   383 bytes per script round trip (22 byte scriptPubKey +   361 byte scriptSig)
15-of-15 P2SH: 1,481 bytes per script round trip (22 byte scriptPubKey + 1,459 byte scriptSig)

How many transactions are possible per megabyte of block capacity?  Below is the maximum capacity of the network at various average transaction sizes.  Realistically 2 to 4 tps is all that is supported by a 1MB block, and the lower end of that range is far more likely.
Code:
Txn Size  Upper Bound   Example  
373       4.4 tps       (2in, 2out, P2PkH)
416       4.0 tps
520       3.3 tps       Average of the last 1,000,000 transactions
555       3.0 tps
616       2.7 tps       (2in, 2out, 2-of-3 P2SH)
833       2.0 tps

This same metric also applies to larger blocks.  Advocates of larger blocks will often overestimate the capacity of larger blocks.  It is realistic to estimate getting 2 to 4 tps per MB of block space regardless of the block size. If all blocks were 20MB that would provide a realistic throughput of 40 to 80 tps not 140 tps.  Still 40 to 80 tps would be sufficient for 100 million users to make one or two transactions per month.

1MB cannot support a sufficient number of direct users even if transaction frequency is very low
One argument made by those favoring the cap is that Bitcoin doesn't need to be used as a transactional currency to be successful.  Users could primarily acquire Bitcoins as a secure store of value (savings) and continue to use other currencies for routine purchases.  Bullion and other stores of value have a much lower velocity than transactional currencies.  This means a block of the same size could support more users.  While the user of a non-transactional currency may not make dozens of transactions a day, meaningful access would at least require access on the order of dozens of transactions per year.  If your savings or brokerage account restricted you to only one deposit per quarter and one withdrawal per year I don't think you would find that acceptable.  Future users of Bitcoin will not find it any more acceptable if they are forced to transact so infrequently.

Code:
Maximum supported users based on transaction frequency.
Assumptions: 1MB block, 821 bytes per txn
Throughput:  2.03 tps, 64,000,000 transactions annually

Total #        Transactions per  Transaction
direct users     user annually    Frequency
       <8,000       8760          Once an hour
      178,000        365          Once a day
      500,000        128          A few (2.4) times a week
    1,200,000         52          Once a week
    2,600,000         24          Twice a month
    5,300,000         12          Once a month
   16,000,000          4  Once a quarter
   64,000,000          1          Once a year
  200,000,000          0.3        Less than once every few years
1,000,000,000          0.06       Less than once a decade
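The table above follows from dividing the annual transaction budget by per-user frequency.  A sketch of the same calculation (2.03 tps as assumed in the table; small differences from the printed figures are rounding):

```python
# Sketch: maximum direct users at a given per-user transaction frequency,
# using the table's assumption of 2.03 tps (1MB blocks, 821 bytes per txn).

SECONDS_PER_YEAR = 365 * 24 * 3600

def max_direct_users(txns_per_user_per_year, tps=2.03):
    """How many users fit if each makes the given number of transactions per year."""
    return tps * SECONDS_PER_YEAR / txns_per_user_per_year

print(round(max_direct_users(52)))  # ~1.2 million users at one transaction a week
print(round(max_direct_users(12)))  # ~5.3 million users at one transaction a month
```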

As you can see, even with an average transaction frequency of just once a week or once a month the network can't support more than a token number of users.  When someone advocates a permanent cap of 1MB, what they are saying is: 'I think Bitcoin will be great if it is never used by more than a couple million users making less than one transaction per month.'  Such a system will never flourish as a store of value, as it will be eclipsed by alternatives which are more inclusive.  To support even 100 million direct users making an average of one transaction every two weeks would require a throughput of 82 tps and an average block size of 20 to 40 megabytes.

1MB can't even keep up with existing non-retail payment networks.
Going back to that coffee meme, the implied message is that 1MB is fine for everything else.  You know, substantial stuff like paying your mortgage, business deals, major capital expenditures, or paying a supplier for inventory.  This just isn't the case though.  Do you know anyone who pays for coffee with a bank wire? The FedWire service (run by the US Federal Reserve) processes ~150 million bank wires annually, and it only operates in the US.  Internationally the largest clearinghouse is SWIFT, which processes more than 5 billion transfers annually.  The US ACH network is even larger with 19 billion transactions annually (excluding converted checks).  There are also about 2 billion international remittances annually (Western Union, MoneyGram, and other networks).  A 1MB-restricted Bitcoin network couldn't keep up with these transfer networks even if you forget about retail sales completely.  The idea that keeping the 1MB restriction only limits the utility of small payments is simply incorrect.

Code:
Bitcoin block size to reach comparable network volume based on average txn size
Network    txn volume        Average transaction size
           annually (mil)    373 bytes   560 bytes   833 bytes
FedWire       150               1.1 MB      1.7 MB      2.3 MB
Remittance  2,000              14.2 MB     21.3 MB     31.7 MB
SWIFT       5,000              35.5 MB     53.3 MB     79.3 MB
ACH        19,000             134.8 MB    202.4 MB    301.0 MB
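The block sizes in this table can be reproduced directly (a sketch; one block per ten minutes, transaction volumes and byte sizes as above):

```python
# Sketch: block size needed to carry a payment network's annual volume on-chain.

BLOCKS_PER_YEAR = 365 * 24 * 6  # one block per 10 minutes = 52,560 blocks

def required_block_mb(annual_txns_millions, avg_txn_bytes):
    """Average block size (MB) needed for a given annual transaction volume."""
    total_bytes = annual_txns_millions * 1_000_000 * avg_txn_bytes
    return total_bytes / BLOCKS_PER_YEAR / 1_000_000

print(round(required_block_mb(150, 373), 1))     # 1.1 MB  -- FedWire volume
print(round(required_block_mb(5_000, 373), 1))   # 35.5 MB -- SWIFT volume
print(round(required_block_mb(19_000, 560), 1))  # 202.4 MB -- ACH volume
```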

On a transaction fee basis.
Currently the cost of the network is roughly $300 million annually. The users of the network are collectively purchasing $300 mil worth of security each year.  If users paid $400 million the network would be more secure, and if they paid $200 million it would be less secure. Today the majority of this cost is paid indirectly (or subsidized) through the creation of new coins, but it is important to keep in mind the total unsubsidized security cost.  At 2 tps the unsubsidized cost per transaction would be about $5. At 100 tps it would be about $0.10.  If Bitcoin were widely adopted, more users purchasing more coins should mean a higher exchange rate, and thus the value of potential attacks also rises.  The future cost of the network will need to rise to ensure that attacks are not economical and non-economic attacks are prohibitively expensive relative to the benefit for the attacker.   It may not rise linearly but it will need to rise.   If someday one Bitcoin is worth $10,000 and we are still only spending $300 million a year on security we probably are going to have a problem.  Now advocates of keeping the limit may argue that the majority of the network cost won't be paid by fees for many years, but the reality is that with a low maximum transaction rate you can choose either much higher fees or much lower security.
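The per-transaction figures are the $300 million spread over a year's transactions (a sketch; note that at 100 tps the division gives roughly nine and a half cents, i.e. about a dime per transaction):

```python
# Sketch: unsubsidized security cost per transaction, spreading the annual
# network cost ($300M, the figure above) over the year's transaction count.

SECONDS_PER_YEAR = 365 * 24 * 3600

def unsubsidized_cost_per_txn(annual_security_usd, tps):
    """Annual security spend divided by the year's transaction throughput."""
    return annual_security_usd / (tps * SECONDS_PER_YEAR)

print(round(unsubsidized_cost_per_txn(300e6, 2), 2))    # 4.76 -- roughly $5 at 2 tps
print(round(unsubsidized_cost_per_txn(300e6, 100), 2))  # 0.1  -- about a dime at 100 tps
```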

Conclusion
Restricting the size of blocks to 1MB permanently is great if you are a major financial services company.   You could co-opt a very robust network, act as a trusted intermediary and force direct users off the chain onto centralized services.  For the same reasons, it is a horrible idea if you even want to keep open the possibility that individuals will be able to participate in that network without using a trusted third party as an intermediary.

On edit: fixed some typos.  Cleaned up some poorly worded statements.  Yeah there are a lot more.  It is a work in progress.
MrGreenHat
Full Member
***
Offline Offline

Activity: 173


View Profile
February 05, 2015, 01:51:31 AM
 #2

I approve this message!
juju
Sr. Member
****
Offline Offline

Activity: 382



View Profile
February 05, 2015, 02:13:11 AM
 #3

-snip- 

This makes me a bit scared, I had always figured in the future we would be allowed to use the blockchain and create transactions as we take for granted today. If we don't raise the block size, the competitiveness for transactions would be insane, I had always figured this but never had I imagined a scenario that we would never be able to move any coins.
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 02:21:12 AM
 #4

Yes, and it doesn't stop at 20MB either, I just hope people will stop crying about progress.

Did you cry as much about the upgrade from 32bit to 64bit operating systems?

Or from 1.44 MB floppy disks to 200 MB CDs?

I mean you can fit the current blocks on a floppy disk, maybe even two blocks if you're lucky.

If Bitcoin really catches on we will need block sizes of a gigabyte or even more.

20 MB is nice for a start; it will last us a year or two, three maybe, but it will need to be upgraded eventually as well. We are dangerously close to our limit with 1 MB and we can't stay at 1 MB.
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 02:25:19 AM
 #5

I lol at your post, it shows how self-serving people are, ever tried to ask the miners?? You know, the ones that keep the network alive in this competitive environment (not so much now). A market for fees would strengthen the network even in the scenario of a total price collapse, because you are creating more incentives for mining, and not keeping the vulture industry that dominates it today.

Miners can choose to include or exclude transactions regardless of block size.

A miner can decide for himself if a no-fee transaction is worth including, or a low-fee transaction for that matter.

If a lot of miners agree a low fee or no fee transaction should be excluded, it will take a long time before a transaction will be picked up by a small mining pool.

Miners also need to consider that excluding low fees will mean that the miner pools that do include the low fees will process more transactions (because all the low fee transactions will stack up) and it could be thousands upon thousands of transactions and thousand times a small fee can become quite a large sum.

It's better to sell 1 billion screws and make 0.01 cent profit on each of them than to sell 1 Lamborghini and make $100,000 in profit.
MrTeal
Legendary
*
Offline Offline

Activity: 1274


View Profile
February 05, 2015, 02:30:16 AM
 #6

Very nice post, D&T. I think it's the most clear argument I've seen presented in favor of larger block size.
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 02:37:37 AM
 #7

I lol at your post, it shows how self-serving people are, ever tried to ask the miners?? You know, the ones that keep the network alive in this competitive environment (not so much now). A market for fees would strengthen the network even in the scenario of a total price collapse, because you are creating more incentives for mining, and not keeping the vulture industry that dominates it today.

Miners can choose to include or exclude transactions regardless of block size.

A miner can decide for himself if a no-fee transaction is worth including, or a low-fee transaction for that matter.

If a lot of miners agree a low fee or no fee transaction should be excluded, it will take a long time before a transaction will be picked up by a small mining pool.

Miners also need to consider that excluding low fees will mean that the miner pools that do include the low fees will process more transactions (because all the low fee transactions will stack up) and it could be thousands upon thousands of transactions and thousand times a small fee can become quite a large sum.

It's better to sell 1 billion screws and make 0.01 cent profit on each of them than to sell 1 Lamborghini and make $100,000 in profit.

You still don't get it, you think they mine because they like BTC. It's about the money; no miner will keep on at a loss, so besides the centralization in storage to run the nodes you are centralizing the mining process even further! The USG must be proud.

You don't get it, there won't be any Bitcoin to mine if we limit the system to 2tps.

Also, people don't pay for taxis because taxis are limited, people pay for taxis because they need a taxi. And the taxi price is decided by the taxi drivers themselves.

Based on gas prices, wages, wear of the car, etc. it's not so high that competition will eat away their profits by offering the same service cheaper, but not so low that they can't make a living.

Same with miners: their fees (which they can decide themselves) can be very high, but then very few transactions will go to that miner, because very few people will want to pay a lot to make sure their transaction is included in the next block with 100% certainty. But if you set it too low then you can't make a profit anymore. So each miner and mining pool should decide for itself how high a fee should be before they include a transaction in a block.

It's better to make 0.01 Bitcoin per transaction in fees on 5 million transactions per block than to make 1 Bitcoin in fees on 2,400 transactions in 10 minutes (because more transactions will not fit).

If you don't even understand this simple fact I'm done arguing with you.

Btw I used to be a miner and I would still be one, except Black Arrow never delivered. So I am done buying miners because they either arrive late or never arrive. And besides, the cost of electricity in Europe is ridiculously high so I wouldn't be able to compete anyway.
Foxpup
Legendary
*
Offline Offline

Activity: 2044



View Profile
February 05, 2015, 02:45:14 AM
 #8

Thank you for reminding me that there are still a few sane individuals on this forum.


Will pretend to do unverifiable things (while actually eating an enchilada-style burrito) for bitcoins: 1K6d1EviQKX3SVKjPYmJGyWBb1avbmCFM4
acoindr
Legendary
*
Offline Offline

Activity: 1036


View Profile
February 05, 2015, 03:16:32 AM
 #9

A great detailed post as usual D&T.

Yes, I think the 7tps was meant as a technical maximum assuming 1 input and 2 outputs (1 to destination, and 1 back for change) for 225 bytes.

I agree 1MB blocks unnecessarily hinder Bitcoin, and although I don't see much difference in the potential for global value between 10tps and 20tps, seeing as VISA alone does 2,000 tps on average, I've revised my signature advertisement down from 7tps Wink
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2464



View Profile
February 05, 2015, 03:21:11 AM
 #10

Even merely discussing changes to the rules can have a chilling effect (or otherwise) on ancillary technological development aimed at the current rules, and that deserves consideration.

R2D221
Hero Member
*****
Offline Offline

Activity: 658



View Profile
February 05, 2015, 03:36:50 AM
 #11

Did you cry as much about the upgrade from 32bit to 64bit operating systems?

Or from 1.44 MB floppy disks to 200 MB CDs?

I'm still using floppies, I don't know why anyone would think they are obsolete.

But I still need to upgrade my operating system, so I think this will be a long night.


An economy based on endless growth is unsustainable.
knight22
Legendary
*
Offline Offline

Activity: 1372


--------------->¿?


View Profile
February 05, 2015, 03:45:24 AM
 #12

Thanks for this. Bitcoin needs to scale, and increasing the block size limit should be part of that roadmap.

kingcolex
Legendary
*
Offline Offline

Activity: 1316



View Profile
February 05, 2015, 03:45:40 AM
 #13

Very good post, hopefully this will help some of the confused out there!

runpaint
Sr. Member
****
Offline Offline

Activity: 476



View Profile
February 05, 2015, 03:48:36 AM
 #14

Did you cry as much about the upgrade from 32bit to 64bit operating systems?

Or from 1.44 MB floppy disks to 200 MB CDs?

I'm still using floppies, I don't know why anyone would think they are obsolete.

But I still need to upgrade my operating system, so I think this will be a long night.






That is not true, Windows 8.1 is not officially offered on the medium of floppy discs.  Those discs are not genuine.

BARR - Burning Altcoins for Redemption and Reduction - First Coin Using Multi-Proof-of-Burn - http://barr.me - Raising the BARR by Reducing Supply - Absorbing and Destroying Entire Altcoin Networks
MrTeal
Legendary
*
Offline Offline

Activity: 1274


View Profile
February 05, 2015, 03:50:07 AM
 #15

That is not true, Windows 8.1 is not officially offered on the medium of floppy discs.  Those discs are not genuine.
You must be the toast of every party. Tongue
R2D221
Hero Member
*****
Offline Offline

Activity: 658



View Profile
February 05, 2015, 03:51:12 AM
 #16

That is not true, Windows 8.1 is not officially offered on the medium of floppy discs.  Those discs are not genuine.

Are you telling me I just got scammed? </sarcasm>

RoadStress
Legendary
*
Offline Offline

Activity: 1652


View Profile
February 05, 2015, 03:59:06 AM
 #17

Great post D&T. As a miner, raising the block limit is the next best thing that can happen after a moonish exchange rate.

iCEBREAKER is a troll! He and cypherdoc helped HashFast scam 50 Million $ from its customers !
H/w Hosting Directory & Reputation - https://bitcointalk.org/index.php?topic=622998.0
amspir
Member
**
Offline Offline

Activity: 112


View Profile
February 05, 2015, 04:16:57 AM
 #18

Great post.

Perhaps the best way to deal with the transaction limit, so it does not continue to be a problem, is to quadruple the block size limit at each block reward halving every 4 years.  This should put it in line with Moore's Law, such that running a full node won't be out of reach of the average user.

BusyBeaverHP
Full Member
***
Offline Offline

Activity: 194


View Profile
February 05, 2015, 04:21:39 AM
 #19

Thank you for expressing what many of us know intuitively but lack the technical articulation to argue with full fidelity. Wonderful stuff.
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 04:23:44 AM
 #20

Did you cry as much about the upgrade from 32bit to 64bit operating systems?

Or from 1.44 MB floppy disks to 200 MB CDs?

I'm still using floppies, I don't know why anyone would think they are obsolete.

But I still need to upgrade my operating system, so I think this will be a long night.






That is not true, Windows 8.1 is not officially offered on the medium of floppy discs.  Those discs are not genuine.

the sad thing is, you're probably not even the most stupid person i have met on these forums.

marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2464



View Profile
February 05, 2015, 04:53:03 AM
 #21

Great post.

Perhaps the best way to deal with the transaction limit, so it does not continue to be a problem, is to quadruple the block limit size at each block reward halving every 4 years.  This should put in in line with Moore's Law, such that running a full node won't be out of reach of the average user.


I like the simplicity of this strategy and it has grounding in practical limitations for physical hardware.

Although I would suggest to have it at the midway points between halvings so as to smooth out any lumpiness in the response to fees/reward when changing the halving and max_block_size increase together. Analogous to presidential and mid-term election cycles.

So quadruple max_block_size at 315k, 525k, 735k, 945k, thereafter every 210k blocks. But need to begin with a one-off quadruple increase to 4 MByte ASAP (to account for previous increase that would have ideally happened at 315k).

Edit: on further thought maybe doubling every 105k blocks is less disruptive again, instead of banging the limit every so often. So a one-off quadruple to 4 MB ASAP then double to 8 MB at next halving (420k blocks) and double every 105k blocks thereafter, i.e. double approx. every 2 years, more or less, depending on hashrate, which is a rough proxy for network demand via price.
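Expressed as code, the doubling variant in the edit might look like this (a sketch of one reading of it: 4 MB immediately, 8 MB at the 420k halving, doubling every 105k blocks thereafter; the function name and the absence of any cap are assumptions, not part of the proposal):

```python
MB = 1_000_000

def max_block_size(height: int) -> int:
    """4 MB now, 8 MB at block 420,000, then double every 105,000 blocks."""
    if height < 420_000:
        return 4 * MB
    doublings = 1 + (height - 420_000) // 105_000
    return (4 * MB) << doublings
```

So at block 525,000 the limit would be 16 MB, at 630,000 it would be 32 MB, and so on, roughly every two years.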

zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 06:12:10 AM
 #22

Great post.

Perhaps the best way to deal with the transaction limit, so it does not continue to be a problem, is to quadruple the block limit size at each block reward halving every 4 years.  This should put in in line with Moore's Law, such that running a full node won't be out of reach of the average user.


I like the simplicity of this strategy and it has grounding in practical limitations for physical hardware.

Although I would suggest to have it at the midway points between halvings so as to smooth out any lumpiness in the response to fees/reward when changing the halving and max_block_size increase together. Analogous to presidential and mid-term election cycles.

So quadruple max_block_size at 315k, 525k, 735k, 945k, thereafter every 210k blocks. But need to begin with a one-off quadruple increase to 4 MByte ASAP (to account for previous increase that would have ideally happened at 315k).

Edit: on further thought maybe doubling every 105k blocks is less disruptive again, instead of banging the limit every so often. So a one-off quadruple to 4 MB ASAP then double to 8 MB at next halving (420k blocks) and double every 105k blocks thereafter, i.e. double approx. every 2 years, more or less, depending on hashrate, which is a rough proxy for network demand via price.


yeah, sounds good.

the actual numbers and frequency could be tweaked, but the idea seems good.

This prevents us from having to hardfork it every time.
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 06:16:05 AM
 #23

Great post obviously.

As for what to replace the 1 MB anti-spam restriction with, I think Gavin's proposal is very good:

http://gavintech.blogspot.ca/2015/01/twenty-megabytes-testing-results.html

Quote
But then we need a concrete proposal for exactly how to increase the size. Here's what I will propose:

1. Current rules if no consensus as measured by block.nVersion supermajority.
Supermajority defined as: 800 of last 1000 blocks have block.nVersion == 4
Once supermajority attained, block.nVersion < 4 blocks rejected.
2. After consensus reached: replace MAX_BLOCK_SIZE with a size calculated based on starting at 2^24 bytes (~16.7MB) as of 1 Jan 2015 (block 336,861) and doubling every 6*24*365*2 blocks -- about 40% year-on-year growth. Stopping after 10 doublings.
3. The perfect exponential function:
size = 2^24 * 2^((blocknumber-336,861)/(6*24*365*2))
... is approximated using 64-bit-integer math as follows:

Code:
double_epoch = 6*24*365*2 = 105120
(doublings, remainder) = divmod(blocknumber-336861, double_epoch)
if doublings >= 10 : (doublings, remainder) = (10, 0)
interpolate = floor ((2^24 << doublings) * remainder / double_epoch)
max_block_size = (2^24 << doublings) + interpolate

This is a piecewise linear interpolation between doublings, with maximum allowed size increasing a little bit every block.

Instead of sudden and massive block size limit increases every two or four years, it would increase a little every block.
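Gavin's pseudocode above transcribes directly into a runnable sketch (same constants and formula as the quote; only the variable names are mine):

```python
DOUBLE_EPOCH = 6 * 24 * 365 * 2   # 105,120 blocks, ~2 years of 10-minute blocks
START_HEIGHT = 336_861            # block height as of 1 Jan 2015 in the proposal
BASE_SIZE = 2 ** 24               # ~16.7 MB starting size
MAX_DOUBLINGS = 10                # growth stops after 10 doublings

def max_block_size(height: int) -> int:
    """Piecewise-linear interpolation between doublings: the limit
    grows a little every block instead of jumping at each epoch."""
    doublings, remainder = divmod(height - START_HEIGHT, DOUBLE_EPOCH)
    if doublings >= MAX_DOUBLINGS:
        doublings, remainder = MAX_DOUBLINGS, 0
    base = BASE_SIZE << doublings
    interpolate = (base * remainder) // DOUBLE_EPOCH
    return base + interpolate
```

At block 336,861 this gives exactly 2^24 bytes; halfway through the first epoch it gives 1.5 × 2^24; after ten doublings it stays pinned at 2^34 bytes (~17 GB).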
Lauda
Legendary
*
Offline Offline

Activity: 1694


GUNBOT Licenses -20% with ref. code 'GrumpyKitty'


View Profile WWW
February 05, 2015, 06:17:10 AM
 #24

Excellent post. This was very clear and the best 'pro' argument for the fork. It pretty much squashes those ridiculous claims in the other thread (no, we have not had a real 'anti-fork' argument).
It's nice to see that we still have a few very intelligent individuals.


homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 06:49:27 AM
 #25

my point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.


On a sidenote:
I am not the only one totally annoyed by how bitcoin behaves in an effort to keep the dominant position and be the 'one coin for all' and with this hurts the alt-industry. I think at this point it becomes inevitable to start using more than one chain and stop looking at btc as the only coin worth bothering with.
All the problems dissolve at exactly the moment we accept a multi-coin/chain solution.

according to OP, if i understand right: if we don't raise limit little people loose access (i doubt it)
BUT raising the blocklimit will also ensure little people to loose access.
Conclusion: one blockchain for everyone is no viable idea

By raising the blocklimit and creating a chain as big as 200 GB or more as soon as 1 or 2 years down the road, bitcoin won't be able to reach the end user.
RoadStress
Legendary
*
Offline Offline

Activity: 1652


View Profile
February 05, 2015, 07:07:08 AM
 #26

my point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.

Why does everyone believe that raising the block limit will instantly raise the blockchain too? It will not. It will take time until that will happen!

solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
February 05, 2015, 07:21:35 AM
 #27

D&T, a very informative OP.

Empirically, ~1MB blocks support 2.7 tps.

https://bitcointalk.org/index.php?topic=941331.msg10360199#msg10360199

Melbustus
Legendary
*
Offline Offline

Activity: 1638



View Profile
February 05, 2015, 07:48:11 AM
 #28

<great post>


Agreed, and thanks for taking the time to thoroughly explain and hopefully set some people straight on this.

[...but dang, I was scared for a second after reading the post title but before the post content had loaded Smiley ]


Bitcoin is the first monetary system to credibly offer perfect information to all economic participants.
Cryptoasset rankings and metrics for investors: http://onchainfx.com
LiteCoinGuy
Legendary
*
Offline Offline

Activity: 1148


In Satoshi I Trust


View Profile WWW
February 05, 2015, 07:48:48 AM
 #29

"Conclusion

The blockchain permanently restricted to 1MB is great if you are a major bank looking to co-opt the network for a next generation limited trust settlement network between major banks, financial service providers, and payment processors.   It is a horrible idea if you even want to keep open the possibility that individuals will be able to participate in that network without using a trusted third party as an intermediary.
"


great post.

homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 07:51:12 AM
 #30

my point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.

Why does everyone believe that raising the block limit will instantly raise the blockchain too? It will not. It will take time until that will happen!

yes, my answer is on the other thread. No idea why OP needs to open third thread on the same issue and can't just post his view on the other thread. Probably he feels important enough to open a new thread (he must be extraordinary important) . Maybe i open another one for the same topic with my personal views ... maybe just everyone should open their personal thread for this.
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 07:54:42 AM
 #31

^ I would say this guy doesn't want Bitcoin to succeed

Bitcoin has no advantage over low inflation altcoins which can hold their value. Bitcoin ends right here. Crypto is just about to begin.
Sorry for all who fell for the hype around that particular coin called 'Bitcoin'.

Anyway, it deserves to be said again: great post by D&T. This is basically the definitive counter to arguments from supporters of a permanent 1 MB restriction.
homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 07:55:43 AM
 #32

^^^

i would say this guy is a sheep
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 08:09:27 AM
 #33

my point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.


It's spelled lose.

And would you rather wait 30 years before your transaction gets through? Because that's what will happen if you keep the block size at 1 MB.

On a sidenote:
I am not the only one totally annoyed by how bitcoin behaves in an effort to keep the dominant position and be the 'one coin for all' and with this hurts the alt-industry. I think at this point it becomes inevitable to start using more than one chain and stop looking at btc as the only coin worth bothering with.
All the problems dissolve at exactly the moment we accept a multi-coin/chain solution.

Allowing altcoins is opening a can of shit. If we just keep inventing and embracing new altcoins all the time, everyone would just print their own coin, and how exactly would that be different from every nation issuing its own fiat currency?

Bitcoin is valuable because it's limited, deal with it. No other coins need to replace bitcoin, and if bitcoin is not suitable, we update it. And that's exactly what will happen, bitcoin will get an update, and it will be updated again and again until it is perfected. Just like every other piece of technology, including the internet.

Technology adapts, technology evolves, technology changes, get used to it. The only constant in the universe is change, you either change with it and adapt, or you get left behind.

according to OP, if i understand right: if we don't raise limit little people loose access (i doubt it)
BUT raising the blocklimit will also ensure little people to loose access.
Conclusion: one blockchain for everyone is no viable idea

With raising the blocklimit and creating a chain as big as 200gb and more as soon as 1 or 2 years down the road bitcoin won't be able to reach the enduser.

little people will lose access to the blockchain if we KEEP the limit at 1 MB because the 4 tps or something will never be enough to cover even a fraction of the transactions. You'd need insane amounts of fees to get included in the blockchain, more than any individual could ever afford.

We are currently sitting at an average size of about 0.4 MB (so we are not at our limit yet) and it's not being spammed to be full, which is good. But there's not a lot of room left for growth.

40% may not seem like much, but under exponential growth, it's getting dangerously close. And with things like this you really don't want to wait til the last moment.

Also, it's just the LIMIT that increases, it doesn't suddenly increase the ACTUAL size 20 fold, it just increases the MAXIMUM POSSIBLE size 20 fold. So it gives us a little more breathing room, nothing to be scared of.

and lastly, harddisks are very cheap nowadays, and will only become cheaper.

what does it matter if the blockchain is 30 gb or 500 gb? It's not like 2TB harddisks are expensive or anything. Still cheaper than a safe.

Or do you really trust the banks with all your money? See if they don't run off in the event of the next economic collapse? Think the government will bail them out again? With what money? They can't really raise the taxes any higher you know? Not without civil unrest anyway.


I bet you're one of those guys that keep complaining about phone batteries getting smaller, while in reality they get bigger (capacity-wise anyway), but it's all those extra functions that drain the batteries faster than they can create more powerful batteries.

my point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.

Why does everyone believe that raising the block limit will instantly raise the blockchain too? It will not. It will take time until that will happen!

yes, my answer is on the other thread. No idea why OP needs to open third thread on the same issue and can't just post his view on the other thread. Probably he feels important enough to open a new thread (he must be extraordinary important) . Maybe i open another one for the same topic with my personal views ... maybe just everyone should open their personal thread for this.

You had time to post this, but you couldn't just quote yourself here?

I bet you're afraid that we will come up with counter-arguments to it, because you probably realize you are wrong but you're not man enough to admit it.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 08:26:03 AM
 #34

The OP gives a valid technical argument for raising block size limit, but is neglecting a financial argument against it.

The miners' income has to be greater than the cost of their work. Miners' income is inflation now, but is expected to be replaced by fees,
since inflation halves every four years. Purchasing power of new coins might be sustained for a while but must converge to zero in the limit.

Transaction fees exist only because there is a competition for block space. Eliminating that competition eliminates the fees and with that mining.

Therefore block space has to become and remain a scarce asset.
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 08:27:15 AM
 #35

the point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.

No need to post the same thing in 2 separate threads. Here is my answer:

my point is: the chain is already pretty big. If it becomes 20 fold as big i will be forced to stop using bitcoin because i don't want a lite-client or rely on 3rd parties with my coins but on the other hand can't afford upgrading harddrive all the time (especially not with these bad btc prices).

If you raise blocklimit 20-fold it will become unaffordable for normal people to store the blockchain on their computers and because of that people loose access.

Why does everyone believe that raising the block limit will instantly raise the blockchain too? It will not. It will take time until that will happen!

Even if it takes time: the blockchain is already very big - if you make it bigger normal people will need to upgrade their hardware to use it and people won't do that.


Right now there isn't even an immediate need to fork, so the proposal doesn't make sense at this point in time.

As noted before: reaching the blocklimit will at first result in microtransactions being pushed off the chain and that won't be an issue for most users.

Fork to a bigger chain isn't rational at this point in time. Period.

Do you know how many viable blockchains are out there with almost only empty blocks and very small chains (below 1 GB of storage)? Dozens!

Blockchains aren't scarce. So why would I use one blockchain that requires hundreds of GB of storage when I can use one almost as secure with much less HD use? I personally will leave btc behind for good if it moves to a larger chain (I just refuse to use Gavincoin; it isn't even 'bitcoin', it is really 'gavincoin') or stick to the old fork in case it can survive.

I guess you were referring to this one.

Even though we have never exceeded 50% of the maximum block size yet, block sizes vary wildly per block and grow pretty much exponentially.

Also, we are at about 30% consistently right now.

If it keeps growing at the same rate as it has now, it may very well take a little over a year before we really need the upgrade, but what if it doesn't? What if we get another rally next month, or in two months? Another rally will surely come with an increase in transactions, a massive increase, as can be seen from the charts. And since we are already using about 30% another rally will likely need more than 1MB.

It will not need 20MB, but while we are at it we may as well give ourselves some room.

If we keep the 1MB limit, the next rally will kill bitcoin, because the transactions will be too limited, the network will clog, and people will blame the blockchain technology for it.

And we may very well never get a second chance to do it right.

and all those other blockchains are so small because NO ONE USES THEM

growth is a GOOD thing, more size means more transactions means more users means more value.
Foxpup
Legendary
*
Offline Offline

Activity: 2044



View Profile
February 05, 2015, 08:30:10 AM
 #36

can't afford upgrading harddrive all the time ... raising the blocklimit and creating a chain as big as 200gb and more
200GB of hard disk space costs about $10, making it the cheapest upgrade you can possibly get. Though if your system is so cheap that 200GB of hard disk space actually counts as an upgrade, one has to wonder how you ever managed to run a full node on it in the first place.

johnyj
Legendary
*
Offline Offline

Activity: 1834


Beyond Imagination


View Profile
February 05, 2015, 08:32:25 AM
 #37

You must be able to broadcast that huge block to most of the nodes within 10 minutes. I haven't seen the latest research in this area, but there is a paper from 2013:


http://www.tik.ee.ethz.ch/file/49318d3f56c1d525aabf7fda78b23fc0/P2P2013_041.pdf

Based on this research, it took 0.25 seconds per KB of transaction data to reach 90% of the network. In other words, a 1MB block will take 256 seconds to broadcast to the majority of nodes, and that is more than 4 minutes.

When the block size reaches 10MB, you will have a broadcast time of over 40 minutes, which means that before your block reaches the far end of the network, those nodes will have already mined 3 extra blocks, so your block is always orphaned by them. The whole network will then disagree about which segment has the longest chain, and thus fork into different chains.

Gavin's proposal is to let mining pools and farms connect to high-speed nodes on the internet backbone. That is reasonable, since propagation time only really matters for miners: your transaction will be picked up by the mining nodes closest to you, and if those mining nodes have enough bandwidth, they can keep up. But how much bandwidth is really needed to broadcast a 10MB message within a couple of minutes between hundreds of high-speed nodes still needs to be tested. And this is the centralization risk some worry about: only those with ultra-high-speed internet connections could act as nodes (I'm afraid the Chinese farms will be dropped out, since their connection to the outside world is extremely slow; they would just fork to their own chain inside mainland China).
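Using the paper's per-KB figure, the arithmetic in the post can be restated like this (a sketch; the linear extrapolation to larger blocks is the post's assumption, not a claim of the paper):

```python
SECONDS_PER_KB = 0.25  # time for each KB to reach ~90% of nodes (2013 paper)

def broadcast_seconds(block_kb: int) -> float:
    """Naive linear extrapolation of the paper's propagation figure."""
    return block_kb * SECONDS_PER_KB

# 1 MB block:  1024 KB * 0.25 s/KB = 256 s (~4.3 minutes)
# 10 MB block: 2560 s (~43 minutes), vs. a 600 s average block interval
```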

zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 08:32:34 AM
 #38

The OP gives a valid technical argument for raising block size limit, but is neglecting a financial argument against it.

The miners' income has to be greater than the cost of their work. Miners' income is inflation now, but is expected to be replaced by fees,
since inflation halves every four years. Purchasing power of new coins might be sustained for a while but must converge to zero in the limit.

Transaction fees exist only because there is a competition for block space. Eliminating that competition eliminates the fees and with that mining.

Therefore block space has to become and remain a scarce asset.


Like I have said before in this very thread, it's better to sell 1 billion screws and make 0.01 cent profit on every screw than to sell 1 Lamborghini and make a profit of $100,000 from that single sale.

bigger blocks means more transactions which means MORE FEES.

not less fees, MORE fees.

it's a GOOD thing for miners, not a bad thing.

Imagine if you owned a taxi business, but no matter how many taxis and taxi drivers you have, you are only allowed to transport 500 passengers a day, period. You could own a million cars and a million drivers, but you would still only be allowed 500 passengers. There could be a major sporting event like the Super Bowl or the Champions League final, with 100,000 people waiting for a taxi, ready to pay, but no, you cannot take more than 500. Why? Because the protocol said so.

The blockchain works by supply and demand, and supply is made by the miners themselves, not by some dumb hardcoded limit.

If the miners think a 0.001 fee per transaction is too low, they are free to deny those transactions. But they shouldn't deny transactions just because 'lol, the block is full'.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 08:47:46 AM
 #39

The blockchain works by supply and demand, and supply is made by the miners themselves, not by some dumb hardcoded limit.

You assume that miners can effectively control supply. In the absence of a block size limit, that is only the case if they build cartels that artificially limit the supply.

Is that what you really prefer instead of an algorithmic decision?
Remember, Bitcoin's promise is to operate without the need for cartels and authorities.
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 08:55:05 AM
 #40

^ Did you read the entire post? The OP fully addressed the effect on fees:

Quote
On a transaction fee basis.
Currently the cost of the network is roughly $300 million annually. The users of the network are collectively purchasing $300 mil worth of security each year.  If users paid $400 million the network would be more secure and if they paid $200 million it would be less secure. Today the majority of this cost is paid indirectly (or subsidized) through the creation of new coins but it is important to keep in mind the total unsubsidized security cost.  At 2 tps the network's unsubsidized cost per transaction would be about $5. At 100 tps it would be $0.05.  If Bitcoin was widely adopted, more users purchasing more coins should mean a higher exchange rate and thus the value of potential attacks also rises.  The future cost of the network will need to rise to ensure that attacks are not economical and non-economic attacks are prohibitively expensive relative to the benefit for the attacker.   It may not rise linearly but it will need to rise.   If someday one Bitcoin is worth $10,000 and we are still only spending $300 million a year on security we probably are going to have a problem.  Now advocates of keeping the limit may argue that the majority of the network cost won't be paid by fees for many years but the reality is that with the limit on potential transactions there are only two other ways to balance the equation and that is much higher fees or much lower security.
ABitNut
Hero Member
*****
Offline Offline

Activity: 763


I'm a cynic, I'm a quaint


View Profile
February 05, 2015, 08:57:27 AM
 #41

The OP gives a valid technical argument for raising block size limit, but is neglecting a financial argument against it.

The miners' income has to be greater than the cost of their work. Miners' income is inflation now, but is expected to be replaced by fees,
since inflation halves every four years. Purchasing power of new coins might be sustained for a while but must converge to zero in the limit.

Transaction fees exist only because there is a competition for block space. Eliminating that competition eliminates the fees and with that mining.

Therefore block space has to become and remain a scarce asset.


Like I have said before in this very thread, it's better to sell 1 billion screws and make 0.01 cent profit on every screw than to sell 1 Lamborghini and make a profit of $100,000 from that single sale.

bigger blocks means more transactions which means MORE FEES.

not less fees, MORE fees.

it's a GOOD thing for miners, not a bad thing.

Imagine if you owned a taxi business but, no matter how many taxis and taxi drivers you have, you are only allowed to transport 500 passengers a day, period. You could own a million cars and a million drivers, but you would still only be allowed 500 passengers. There could be a major sporting event like the Super Bowl or the Champions League final; 100,000 people could be waiting for a taxi, ready to pay, but no, you cannot take more than 500. Why? Because the protocol said so.

The blockchain works by supply and demand, and supply is made by the miners themselves, not by some dumb hardcoded limit.

If the miners think a 0.001 fee per transaction is too low, they are free to deny those transactions. But they shouldn't have to deny transactions just because 'lol the block is full'.

To clarify... Suppose there are 5 transactions:

#    Fee     Size     Fee per 256 KB
1    0.1     512 KB   0.05
2    0.1     512 KB   0.05
3    0.05    256 KB   0.05
4    0.05    256 KB   0.05
5    0.01    256 KB   0.01
Now with the 1 MB limit the miner can, at most, earn BTC0.2.
With a 20MB limit the miner can earn BTC0.31. More than 1.5x as much as with the 1MB limit! Did their cost increase 1.5x? I don't see that. I wonder which fork miners would pick...
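The comparison can be reproduced with a minimal sketch of fee-greedy block building over the five example transactions above (a toy model, not real miner policy):

```python
# Fee-greedy block building over the five example transactions above
# (fee in BTC, size in KB). A minimal sketch, not real miner policy.
TXS = [(0.10, 512), (0.10, 512), (0.05, 256), (0.05, 256), (0.01, 256)]

def best_fees(cap_kb: int) -> float:
    """Total fee collectable under a block size cap of `cap_kb`."""
    total, used = 0.0, 0
    # Sort by fee density (BTC per KB), highest first.
    for fee, size in sorted(TXS, key=lambda t: t[0] / t[1], reverse=True):
        if used + size <= cap_kb:
            total += fee
            used += size
    return round(total, 8)

print(best_fees(1024))       # 1 MB cap:  0.2 BTC
print(best_fees(20 * 1024))  # 20 MB cap: 0.31 BTC
```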


zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 09:07:51 AM
 #42

The blockchain works by supply and demand, and supply is made by the miners themselves, not by some dumb hardcoded limit.

You assume that miners can effectively control supply. In the absence of a block size limit, this is only the case if they build cartels that artificially limit the supply.

Is that what you really prefer instead of an algorithmic decision?
Remember, Bitcoin's promise is to operate without the need for cartels and authorities.

oh but you can.

the miners decide which transactions are valid and which are not.

you can make your own rules about which fees to accept and which fees not to accept. You can even ban individual transactions or addresses if you want.

I'm pretty sure there's also a way to make your own limits to the size of the block you want to solve, as long as it's not bigger than the hard-coded limit.

As long as your block is considered "valid" by the rest of the network you decide the rules about which transactions you accept and which you dont accept.

If you have sufficient power as a miner, you can have a pretty big influence on the fees. And any miner who's not stupid will ask for at least a small fee. Or make something like a 250KB limit for 'free transactions' and reserve the rest of the block for transactions with fees: the more fee, the more priority. Something like that.

be creative...

just because there's the POSSIBILITY of 20MB doesn't mean you HAVE TO use it.

There's not even a need for cartels; each miner/pool will simply set their own limits based on some calculations.

Do I really need to do all your thinking for you?

Calculations would be roughly based on:

  • how much electricity does mining use for me?
  • how many $/kWh in my area?
  • how many transactions are in an average block lately?
  • how many blocks do I (or, more likely, my pool) solve per hour?
  • what percentage of the pool do I contribute?
  • how much is the block reward, if any? (and how much of the cost of mining does it cover?)
  • how long does this mining equipment stay competitive?
  • how much did my equipment cost, and how much profit do I need before investing in new gear?
  • overhead costs

Weigh all those factors against each other and you can make a rough estimate of how much in fees you should ask for an average transaction.

now let's suppose we're far into the future and we rely mostly on fees as opposed to block rewards.

lets also assume there's (for simplicity sake) 8 large pools controlling a majority of the mining. (not to say this will happen but just to keep the calculations easy).

Pool A asks for a minimum fee of 1 bitcoin and controls 25% of the network
Pool B asks for a minimum fee of 0.001 bitcoin and controls 5% of the network
Pool C asks for a fee of 0.005 bitcoin and controls 10% of the network
Pool D asks for a fee of 0.003 bitcoin and controls 10% of the network
Pool E asks for a fee of 0.5 bitcoin and controls 15% of the network
Pool F asks for a fee of 1 satoshi and controls 10% of the network
Pool G asks for a fee of 0.01 bitcoin and controls 15% of the network
Pool H asks for a fee of 0.025 bitcoin and controls the remaining 10%

Now if I wanted to send a transaction and be absolutely sure it gets confirmed 6 times within the next 6 blocks (which should take about an hour), I should add a fee of 1 bitcoin.

However, if I'm satisfied with a 75% chance of making it into each block, I just add a fee of 0.5 instead.

If I'm satisfied with a 60% chance per block, I'd add a fee of 0.025, and so on.

Of course as the user I don't know exactly what the miners ask in fees, so it's a bit of trial and error. But most users will add a fee that proved to work for them in the past.
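The chances in the example can be estimated by summing the hashrate share willing to accept a given fee; all the numbers below are the hypothetical pool figures from the example above:

```python
# (hashrate share, minimum fee in BTC) per the hypothetical pools above.
POOLS = [(0.25, 1.0), (0.05, 0.001), (0.10, 0.005), (0.10, 0.003),
         (0.15, 0.5), (0.10, 1e-8), (0.15, 0.01), (0.10, 0.025)]

def accepting_share(fee: float) -> float:
    """Fraction of hashrate willing to include a tx paying `fee`."""
    return sum(share for share, minimum in POOLS if fee >= minimum)

def p_confirmed(fee: float, blocks: int) -> float:
    """Chance at least one of the next `blocks` blocks includes the tx."""
    return 1 - (1 - accepting_share(fee)) ** blocks

print(round(accepting_share(1.0), 2))   # every pool accepts a 1 BTC fee
print(round(accepting_share(0.5), 2))   # 0.75 of hashrate per block
print(round(p_confirmed(0.025, 6), 2))  # 0.025 fee, chance within 6 blocks
```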

Many large transactions, or transactions that NEED to be processed fast, will likely carry a higher fee to ENSURE a fast transaction (some miners only accept large fees); smaller transactions (which will be more plentiful) will add lower fees and risk waiting a little bit longer.

The miners who accept smaller fees will process more transactions and the sheer volume of transactions may actually add up. But the few big transactions that the other pools sometimes pick up may also bring in some large chunks of fees from time to time.

There's no need for a hard-coded limit, fees will come in anyway, and you can make your own rules.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 09:16:39 AM
 #43

^ Did you read the entire post? The OP fully addressed the effect on fees:

He neglects that there is no reason to pay fees if there is no limit on supply.

just because there's the POSSIBILITY of 20MB doesn't mean you HAVE TO use it.

Since there is no marginal cost to including a transaction in the current block, a rational miner will always include any transaction with a non-zero fee before it is included by one of its competitors.

Therefore a lower bound on fees will not work without a cartel or without competition for space.

I prefer algorithms over cartels.
ABitNut
Hero Member
*****
Offline Offline

Activity: 763


I'm a cynic, I'm a quaint


View Profile
February 05, 2015, 09:20:05 AM
 #44

You must be able to broadcast that huge block to most of the nodes within 10 minutes. I haven't seen the latest research in this area, but there is a paper from 2013:


http://www.tik.ee.ethz.ch/file/49318d3f56c1d525aabf7fda78b23fc0/P2P2013_041.pdf

Based on this research, it took 0.25 seconds per KB for a transaction to reach 90% of the network. In other words, a 1MB block would take 256 seconds to broadcast to the majority of nodes, and that is over 4 minutes.

When the block size reaches 10MB, you get a broadcast time of 40 minutes, meaning that before your block reaches the far end of the network, those nodes will already have dug up 3 extra blocks, so your block is always orphaned by them. And the whole network will disagree about which segment has the longest chain, and thus fork into different chains.

Gavin's proposal is to let mining pools and farms connect to high-speed nodes on the internet backbone. That is reasonable, since propagation time only matters for miners: your transaction will be picked up by the mining nodes closest to you, and if those mining nodes have enough bandwidth they can keep up. But how much bandwidth is really needed to broadcast a 10MB message within a couple of minutes between hundreds of high-speed nodes still needs to be tested. And this is the risk behind worries about centralization of mining nodes: only those with ultra-high-speed internet connections could act as nodes (I'm afraid Chinese farms would drop out, since their connection to the outside world is extremely slow; they would just fork to their own chain inside mainland China).

I don't know how you come to those assumptions based on that research.

Quote
the block message may be very large — up to 500kB at the time of writing.

Quote
The median time until a node receives a block is 6.5 seconds whereas the mean is at 12.6 seconds.

Quote
For blocks, whose size is larger than 20kB, each kilobyte in size costs an additional 80ms delay until a majority knows about the block.

They do not mention the average size of the blocks they measured. Let's assume all their blocks were 0 KB: 12.6 seconds for that. Add 80 ms per additional KB... 80 ms * 1024 * 20 is about 27.3 minutes. Add the original 12.6 seconds... roughly 28 minutes for 20 MB.

Of course, 28 minutes is still long, but that is based on 2013 data. I assume nodes will have improved their verification speed and gained bandwidth since then. New measurements could / should be made to verify that propagation speed will not become an issue.


              ▄
            ▄███▄
          ▄███████▄
   ▄▄▄    █
█████████
   ███
    ███████████▄
██    ████    ████████▄
      ████    ██████████
  ████    ████▀██████████
  ████    ██▀   ▀█████████▄
      █████       █████████▄
      ███▀         ▀████████
  ██████▀           ▀███████
  █████▀             ▀█████
   ████ █▄▄▄     ▄▄▄█ ████
    ███ ▀███████████▀ ███
     ▀▀█▄ █████████ ▄█▀▀
        ▀▀▄▄ ▀▀▀ ▄▄▀▀
●●
●●
●●
●●
●●
●●
|●  facebook
●  reddit
●  ann thread
|
█ ██
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██

██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
█ ██ █
██ █
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 09:21:30 AM
 #45

Since there is no marginal cost to including a transaction in the current block,

let me be more precise:
There is a marginal cost implied by block propagation time being proportional to size, and orphan rate being proportional to propagation time. There is also a computation cost for updating the merkle tree and pushing the updated work to miners. These marginal costs are, however, orders of magnitude below the lowest non-zero fees paid today.
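That orphan-risk cost can be sketched with a toy model: treat the chance of being orphaned as the chance a competitor finds a block during the extra propagation delay. All parameters below, including the per-KB relay delay, are illustrative assumptions:

```python
import math

BLOCK_INTERVAL_S = 600.0   # expected time between blocks
BLOCK_REWARD_BTC = 25.0    # subsidy at stake if the block is orphaned

def marginal_cost_btc(tx_kb: float, delay_per_kb_s: float) -> float:
    """Expected BTC lost to orphan risk by including one more transaction."""
    extra_delay = tx_kb * delay_per_kb_s
    # Chance a competing block appears during the extra delay (Poisson model).
    orphan_risk = 1 - math.exp(-extra_delay / BLOCK_INTERVAL_S)
    return orphan_risk * BLOCK_REWARD_BTC

# A 0.5 KB transaction under a slow 80 ms/KB relay vs. a fast 1 ms/KB relay:
print(marginal_cost_btc(0.5, 0.080))  # cost shrinks with faster relaying
print(marginal_cost_btc(0.5, 0.001))
```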
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 09:26:48 AM
 #46

^ Did you read the entire post? The OP fully addressed the effect on fees:

He neglects that there is no reason to pay fees if there is no limit on supply.

just because there's the POSSIBILITY of 20MB doesn't mean you HAVE TO use it.

Since there is no marginal cost to including a transaction in the current block, a rational miner will always include any transaction with a non-zero fee before it is included by one of its competitors.

Therefore a lower bound on fees will not work without a cartel or without competition for space.

I prefer algorithms over cartels.

then why does a transaction with a 1 satoshi fee take longer to process even though we are not at the limit yet?

explain this to me.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 09:33:05 AM
 #47

then why does a transaction with a 1 satoshi fee take longer to process even though we are not at the limit yet?

explain this to me.

Because the majority of miners nowadays use Bitcoin Core as published by the devs, which compiles blocks following a mix of commercial and altruistic rules.

The majority of miners do not care about this, since inflation provides three to four orders of magnitude more income than fees at the moment. They will. And this discussion is about the future and the limits.
zebedee
Donator
Hero Member
*
Offline Offline

Activity: 670



View Profile
February 05, 2015, 09:34:58 AM
 #48

I am not the only one totally annoyed by how bitcoin behaves in an effort to keep its dominant position and be the 'one coin for all', hurting the alt-industry in the process. I think at this point it becomes inevitable to start using more than one chain and stop looking at btc as the only coin worth bothering with.
I see, so rather than doubling the size of the bitcoin chain you'd rather have two chains half the size?  How is that a win?
turvarya
Hero Member
*****
Offline Offline

Activity: 714


View Profile
February 05, 2015, 09:42:47 AM
 #49

Since I so often see the argument about, what is good for the miners:
Is there any statement from the large mining pools about this?

zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 10:14:36 AM
 #50

then why does a transaction with a 1 satoshi fee take longer to process even though we are not at the limit yet?

explain this to me.

Because the majority of miners nowadays use Bitcoin Core as published by the devs, which compiles blocks following a mix of commercial and altruistic rules.

The majority of miners do not care about this, since inflation provides three to four orders of magnitude more income than fees at the moment. They will. And this discussion is about the future and the limits.

well, in the future miners will either adapt and make their own rules, or leave the mining industry because they are too broke to continue. And smarter miners will take over.

judging by your post, you'll be one of the broke ones, because you're too lazy or stupid to make your own rules.

it's the miners that make the rules, not the protocol.

but the protocol should not impose a hard limit, because that would just destroy bitcoin.

I'm not going to waste my time thinking of more examples to point this out, i have better things to do.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 10:29:46 AM
 #51

judging by your post, you'll be one of the broke ones, because you're too lazy or stupid to make your own rules.

it's the miners that make the rules, not the protocol.

but the protocol should not impose a hard limit, because that would just destroy bitcoin.

I'm not going to waste my time thinking of more examples to point this out, i have better things to do.

You know much less about Bitcoin than you think. I hope you have better things to do.

Miners do not make the rules. They have to obey the rules if in the minority, or can narrow existing rules by organizing a majority cartel.
The latter would be needed to impose a non-zero fee in the absence of a size limit in the protocol.
homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 10:39:12 AM
 #52

I am not the only one totally annoyed by how bitcoin behaves in an effort to keep the dominant position and be the 'one coin for all' and with this hurts the alt-industry. I think at this point it becomes inevitable to start using more than one chain and stop looking at btc as the only coin worth bothering with.
I see, so rather than doubling the size of the bitcoin chain you'd rather have two chains half the size?  How is that a win?

It IS in fact a win: coins are easy to convert in many different ways within split seconds. How is it a loss? From an economic standpoint it's not a loss. Say the coffee shop accepts 'Bitcoin' but I use xyz-coin. The conversion is a mere formality: I can pay the coffee shop in xyz-coin and the shop owner gets the coin he wants (gavincoin or whatever). Crypto is easy enough to convert for this to have no economic impact. All it takes is an exchange app, and I think one could already be developed or may even exist already (would need to check).

With multiple smaller chains the end user doesn't lose sovereignty to excessive hardware requirements.

You could in fact even merge-mine them.

The individual user would not be required to store terabytes of data and have extreme bandwidth like he would with Gavincoin.

How is requiring everyone to potentially store terabytes a win?

---
This is most likely what will happen anyway once running Gavincoin becomes unaffordable for the end user.
Nancarrow
Hero Member
*****
Offline Offline

Activity: 494


View Profile
February 05, 2015, 11:08:20 AM
 #53

The OP gives a valid technical argument for raising block size limit, but is neglecting a financial argument against it.

The miners' income has to be greater than the cost of their work. Miners' income is inflation now, but is expected to be replaced by fees,
since inflation halves every four years. Purchasing power of new coins might be sustained for a while but must converge to zero in the limit.

Transaction fees exist only because there is a competition for block space. Eliminating that competition eliminates the fees and with that mining.

Therefore block space has to become and remain a scarce asset.


Your second and third paragraph contradict each other.

Transaction fees don't ONLY exist because there is a competition for block space. They ALSO exist to pay the miners to secure the network, as you clearly understood before you implicitly denied it. Fees are not an either/or thing. It absolutely ISN'T a case of lifting block limit->eliminates fees->eliminates mining.

We have instead a *feedback* process. LOWER fees (not ZERO fees) means LESS mining (not NO mining) which in turn means LONGER confirmation times (not COMPLETE COLLAPSE) which leads to MORE FEES which leads to mining power switching back on. It's what engineers call a negative feedback loop, designed to keep the hashing rate broadly stable, or at least oscillating within a fairly narrow range.

People really must stop thinking of all the causes and effects in the world as being ON/OFF switches. They aren't. They're analogue dials.

[If you're on board with the idea of bitcoin, you've probably had to deal with people saying a deflationary money supply can't work because NOBODY would ever spend ANY MONEY AT ALL. Same problem. "Less" is not the same as "none". Especially when "Less X" induces "Less Y" which induces "More X".]
 

zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 11:09:51 AM
 #54

judging by your post, you'll be one of the broke ones, because you're too lazy or stupid to make your own rules.

it's the miners that make the rules, not the protocol.

but the protocol should not impose a hard limit, because that would just destroy bitcoin.

I'm not going to waste my time thinking of more examples to point this out, i have better things to do.

You know much less about Bitcoin than you think. I hope you have better things to do.

Miners do not make the rules. They have to obey the rules if in the minority, or can narrow existing rules by organizing a majority cartel.
The latter would be needed to impose a non-zero fee in the absence of a size limit in the protocol.

you just assume all miners will allow a low fee, but that simply won't be the case.

because it's not sustainable.

for the same reason taxis do not offer free rides, and shops dont offer free products.

like i explained before, you need to calculate a lot of factors, and then decide a cut-off price. And all the miners who dont do that once the fees start to matter more than the block reward will go bankrupt and will be forced to stop.

just because the transaction itself doesnt cost anything (or barely anything) doesnt mean it's free.

Everyone understands that at some point transaction won't be free, because miners need compensation for their power and other costs.

The thing you refuse to understand is that it's better to spread that cost over 1 million transactions per 10 minutes than over 1620 transactions (which, based on empirical evidence, is about the limit of the current block size).

Because if you spread the fee over just 1620 transactions, the fee would be ridiculously high and would never be able to support the network, whereas spread over a million transactions it would be very cheap. But you won't be able to spread the cost over a million transactions, because they simply would not fit in 1 MB.

You were talking about the future, so think about the future. If bitcoin still averages only 2 transactions per second by the time block rewards are insignificant, bitcoin has failed and this whole discussion is pointless.
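The "spread the cost" point is just division. The per-block revenue target below is an arbitrary placeholder, and 1620 transactions is the post's own estimate of what fits in 1 MB:

```python
# Required fee per transaction to hit a fixed per-block revenue target.
# The 25 BTC target is an arbitrary placeholder; 1620 tx/block is the
# post's estimate of what fits in 1 MB.
TARGET_BTC_PER_BLOCK = 25.0

def fee_needed(tx_per_block: int) -> float:
    return TARGET_BTC_PER_BLOCK / tx_per_block

print(f"{fee_needed(1_620):.6f} BTC each at 1 MB")       # ~0.015 BTC per tx
print(f"{fee_needed(1_000_000):.6f} BTC each at scale")  # ~0.000025 BTC per tx
```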

Are you really this stupid or are you just trolling?

by the way, if miners don't make the rules, who does?
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 05, 2015, 11:17:52 AM
 #55

I am not the only one totally annoyed by how bitcoin behaves in an effort to keep the dominant position and be the 'one coin for all' and with this hurts the alt-industry. I think at this point it becomes inevitable to start using more than one chain and stop looking at btc as the only coin worth bothering with.
I see, so rather than doubling the size of the bitcoin chain you'd rather have two chains half the size?  How is that a win?

It IS in fact a win: coins are easy to convert in many different ways within split seconds. How is it a loss? From an economic standpoint it's not a loss. Say the coffee shop accepts 'Bitcoin' but I use xyz-coin. The conversion is a mere formality: I can pay the coffee shop in xyz-coin and the shop owner gets the coin he wants (gavincoin or whatever). Crypto is easy enough to convert for this to have no economic impact. All it takes is an exchange app, and I think one could already be developed or may even exist already (would need to check).

With multiple smaller chains the end user doesn't lose sovereignty to excessive hardware requirements.

You could in fact even merge-mine them.

The individual user would not be required to store terabytes of data and have extreme bandwidth like he would with Gavincoin.

How is requiring everyone to potentially store terabytes a win?

---
This is most likely what will happen anyway once running Gavincoin becomes unaffordable for the end user.

you really don't get it, do you?

even if we ignore the legion of premined scamcoins, we still have the following issue:

Suppose we use 4 different altcoins

Acoin has a 10 GB blockchain in this example
Bcoin has a 5 GB blockchain
Ccoin has a 20 GB blockchain
Dcoin has a 15 GB blockchain

now how exactly is that better than having 1 blockchain of 10+5+20+15=50 GB?

If you are going to store them all anyway, it doesn't matter whether you have 1 blockchain of 50 GB or many smaller ones adding up to 50.

In fact, having multiple blockchains means a lot of extra redundant information and applications running; it's less convenient, less secure, etc. It only brings more problems and doesn't solve anything.

Why am I even explaining this....
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 11:20:23 AM
 #56

^ Did you read the entire post? The OP fully addressed the effect on fees:

He neglects that there is no reason to pay fees, if there is no limit on supply.

So now that you've moved the goalposts (first it was: "he doesn't address fees"), I need to correct you again. The OP never advocated for there being no limit:

The problem isn't a limit in general but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties.

A limit that permits thousands of tps will undoubtedly produce more fees for miners than a limit capping the network at 3 tps.
Nancarrow
Hero Member
*****
Offline Offline

Activity: 494


View Profile
February 05, 2015, 11:32:35 AM
 #57

Oh another thing: re the concern about a huge blockchain size...

Increasing the block limit isn't the only thing Gavin and co. have been giving a lot of thought to. They've also understood the (complementary) problem of blockchain bloat. I'm given to understand that several blockchain pruning schemes are out there. Note these are NOT the same idea as 'lightweight' or SPV clients; they're genuine info-compression schemes. I haven't followed for a while, but one scheme I heard of basically suggested that *on disk* the blockchain information up to the current block #X should essentially be split into three parts:

1) A chain of block headers only up to block #X-N, where N might be a couple of thousand.
2) A database of UTXOs for all the blocks up to #X-N
3) A complete sequence of the last N blocks

The header-only chain provides the proof of the next block's valid membership of the chain. It grows linearly forever, at a rate of ~(80*6*24*365)/(1024^2) ≈ 4 MB a year (a block header is 80 bytes, and there are ~6 blocks an hour). Which is peanuts.
The UTXO database grows more unpredictably, but I guess broadly proportionally to global bitcoin usage. It's there so the node knows who's got what and can verify new transactions are valid.
The uncompressed sequence of the last N blocks, where N might vary by individual preference, remains fixed length for a fixed blocksize. The node keeps the last N rolling blocks uncompressed in case there's a fork.
Even if the blocksize grows, so that this tail of blocks grows, it's unlikely to gum up disk space. If it does, the node could choose to reduce N (I pick N=2000 as rather overly cautious).

Anyway, TLDR: disk space is NOT a problem. I can't comment much on network propagation issues, except that I'm led to believe these can be addressed by an approach such as 'send the header first, then the block info separately'.
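The header-chain growth figure is a one-liner, using the 80-byte header size and ~6 blocks per hour:

```python
# Annual growth of a headers-only chain: one 80-byte header per block,
# roughly six blocks per hour.
HEADER_BYTES = 80
BLOCKS_PER_YEAR = 6 * 24 * 365

mb_per_year = HEADER_BYTES * BLOCKS_PER_YEAR / 1024**2
print(f"{mb_per_year:.1f} MB of headers per year")  # ~4.0 MB/year
```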


homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 11:45:56 AM
 #58

I am not the only one totally annoyed by how bitcoin behaves in an effort to keep the dominant position and be the 'one coin for all' and with this hurts the alt-industry. I think at this point it becomes inevitable to start using more than one chain and stop looking at btc as the only coin worth bothering with.
I see, so rather than doubling the size of the bitcoin chain you'd rather have two chains half the size?  How is that a win?

It IS in fact a win: coins are easy to convert in many different ways within split seconds. How is it a loss? From an economic standpoint it's not a loss. Say the coffee shop accepts 'Bitcoin' but I use xyz-coin. The conversion is a mere formality: I can pay the coffee shop in xyz-coin and the shop owner gets the coin he wants (gavincoin or whatever). Crypto is easy enough to convert for this to have no economic impact. All it takes is an exchange app, and I think one could already be developed or may even exist already (would need to check).

With multiple smaller chains the end user doesn't lose sovereignty to excessive hardware requirements.

You could in fact even merge-mine them.

The individual user would not be required to store terabytes of data and have extreme bandwidth like he would with Gavincoin.

How is requiring everyone to potentially store terabytes a win?

---
This is most likely what will happen anyway once running Gavincoin becomes unaffordable for the end user.

you really don't get it, do you?

even if we ignore the legion of premined scamcoins, we still have the following issue:

Suppose we use 4 different altcoins

Acoin has a 10 GB blockchain in this example
Bcoin has a 5 GB blockchain
Ccoin has a 20 GB blockchain
Dcoin has a 15 GB blockchain

now how exactly is that better than having 1 blockchain of 10+5+20+15=50 GB?

If you are going to store them all anyway, it doesn't matter whether you have 1 blockchain of 50 GB or many smaller ones adding up to 50.

In fact, having multiple blockchains means a lot of extra redundant information and applications running; it's less convenient, less secure, etc. It only brings more problems and doesn't solve anything.

Why am I even explaining this....

How do premined scamcoins even matter? They don't.


Let's go out into the future, 5 years from now:

Gavincoin: 1.5 terabytes of storage required
Mpcoin: 70GB
Acoin: 10gb
Bcoin: 20gb
Ccoin: 5gb
Dcoin: 1gb


you don't store them all, especially not Gavincoin. You store two or three maximum, those that best match your use case for crypto.
So your assumption that everyone would store all chains is of course wrong. Different people use different coins for different purposes and will store different chains according to their preferences.

a, b, c, d, e coin can still be merge-mined with the strongest network (which one that will be remains to be seen), so a, b, c, d coin are almost as secure as the network they merge-mine against.

Almost nobody will use a coin that requires hardware as insane as Gavincoin will.
Once this is understood there's going to be a lot of development towards efficiency. Gavincoin is a proposal for a waste of system resources, and therefore it won't be able to compete with more efficient coins in the long run.

If in any case I were required to use Gavincoin, I could still convert my a, b, c, d, e coin to it in split seconds.
homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 11:55:50 AM
 #59

It's not the coin that's the biggest bully with the most vocal support that wins, but the coin that's most efficient. (Not even network effect and merchant adoption are as important as efficiency in the long run.)

So fork it all up, fools. Do it right now!
Stephen Gornick
Legendary
*
Offline Offline

Activity: 2338


✪ NEXCHANGE | BTC, LTC, ETH & DOGE ✪


View Profile
February 05, 2015, 11:56:29 AM
 #60

On edit: fixed some typos.

Possibly one more ...
"Have you ever thought about the fact that you send a bank wire yourself."
Did you mean to instead write "that you can't send a bank wire yourself."?

Stephen Gornick
Legendary
*
Offline Offline

Activity: 2338


✪ NEXCHANGE | BTC, LTC, ETH & DOGE ✪


View Profile
February 05, 2015, 11:57:00 AM
 #61

If I lose direct access to the blockchain then I am forced to hand over all my wealth (not just enough funds to cover trivial purchases) to a third party.

That's probably the best argument I've seen. I think of how today I have many paper wallets, mobile wallets, and hosted (shared) e-wallet services (i.e., custodial services) holding a balance of a few millibits or less, which I won't (or can't) withdraw because the Bitcoin network transaction fee is prohibitive for that small an amount of bitcoin. As fees rise due to the scarcity of empty space in blocks, I'll essentially have permanently abandoned those small-value wallets (i.e., those funds become economically unspendable... worthless!).

I suppose for the paper wallets and mobile wallets where I hold the private key I could combine them into a single transaction without incurring a much higher fee, but that's possible only today, because there's still some low-cost space left and the required fee doesn't rise linearly with the size of the transaction. Without a fork, eventually we are essentially paying a fee for each additional byte, and the result will likely be that millions of UTXOs with small amounts (e.g., sub-millibit: less than 0.001 bitcoins) will become worthless. Hey now, that's some real money we're talking about discarding! [Anyone care to crunch the numbers: what is the sum total of bitcoins sitting in UTXOs under one millibit?]
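The "economically unspendable" threshold is easy to illustrate: a UTXO is stranded once the fee to spend it exceeds its value. The ~148-byte figure is a typical pay-to-pubkey-hash input size, and the fee rates are illustrative assumptions:

```python
# A UTXO is economically unspendable once the fee to spend it exceeds
# its value. ~148 bytes is a typical P2PKH input; fee rates are assumed.
INPUT_BYTES = 148

def is_economically_unspendable(value_btc: float, fee_btc_per_kb: float) -> bool:
    cost_to_spend_btc = fee_btc_per_kb * INPUT_BYTES / 1000
    return value_btc <= cost_to_spend_btc

print(is_economically_unspendable(0.0001, 0.0005))  # low fee rate: False
print(is_economically_unspendable(0.0001, 0.005))   # 10x fee rate: True
```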

It makes me think back... First they came for the dust, and I didn't speak out because I didn't play SatoshiDICE. When this day comes (the 1MB cap is reached, and no hard fork), maybe a millibit becomes the new dust! And then let's say the required fee rises even more; maybe UTXOs under 0.01 become economically unspendable too. That becomes a real problem for most of us!

So what to do?
I'd like a Bitcoin that can scale and grow with transaction demand.  I'd also like my car to get 150 MPG.  The latter can't happen simply because a gallon of gasoline doesn't have the energy needed for an internal combustion engine to propel today's car for 150 miles.  Now I'm not entirely convinced the former can happen either.  That's because a hard fork that doesn't have the consent of the Economic Majority ( http://en.bitcoin.it/wiki/Economic_majority ) will fail.

Let's go through the first hours of the hard fork.  Let's say it happens at block 400,000 (a little over a year from now).  Everything was in place -- miners with the right nVersion indicating consent was well above the threshold (e.g., 80% of the last 1,000 blocks had nVersion=4).     But ... I'm not willing to believe that exchanges, merchants, merchant processors, etc. are going to themselves take on the full risk of double spending that would occur if the hard fork eventually fails (maybe even a day or three later).   The only way to prevent the double spending that would result would be to require that a transaction confirm on the block chains on both sides of the fork.  So these entities are going to watch both sides. Well, a transaction that has any taint from a coin generated on the side with the larger block size rule change (I hate calling them "gavincoins", but for the purpose of this argument that name is short and everyone here knows what it means) will not confirm on the other side of the fork where the 1MB limit is still followed.

So the market instantly realizes this difference between a Bitcoin and a GavinCoin.  So the value of a GavinCoin will drop relative to a Bitcoin.   Miners can't convert or spend these newly mined coins nearly anywhere, so all you have is buying from speculators.   Now those mining on the side which still recognizes the 1MB max limit are still mining blocks (albeit at a much slower rate because of the dramatic loss of hashing capacity) and some market for those newly mined coins exists -- again, thanks to speculators.    Things can flip quick.  Maybe a day goes by and all of a sudden a large amount of hashing capacity switches back to the 1MB max side due to the dropping exchange rate of GavinCoin, and it becomes quite possible (if not probable) that the 1MB limit will be with us for some time longer.

The exchanges and merchants that played both sides (i.e., required confirmations on both sides of the fork) lost nothing as either way they have confirmed transactions on what is eventually the sole winner.   If the hard fork fails then the losers are those who had E-Wallets (custodial accounts) and found their pre-fork bitcoins were spent and they only end up with tainted GavinCoins that can now never be spent (at least not anywhere near parity with a bitcoin).   Likely most every custodial service (e.g., exchanges, hosted/shared eWallets, etc.) that didn't require confirmations on both sides of the fork ends up bankrupt as a result of getting dumped on with GavinCoins while allowing withdrawals of untainted bitcoins.

I just don't see how a hard-fork succeeds.   There is risk of accepting GavinCoins.  There is no risk (excluding exchange rate risk) of putting your own pre-fork coins into storage for a (long) while and not letting them become tainted GavinCoins (as they can still be spent a year, two or ten later).

[Edited: A couple small readability changes.]

lunarboy
Hero Member
*****
Offline Offline

Activity: 544



View Profile
February 05, 2015, 01:25:28 PM
 #62

Really glad to see this stupid conversation laid to rest once and for all.

All trolls/ vested interests now please shut up..

Well said D&T .... Claps hands   Grin
Lauda
Legendary
*
Offline Offline

Activity: 1694


GUNBOT Licenses -20% with ref. code 'GrumpyKitty'


View Profile WWW
February 05, 2015, 01:28:30 PM
 #63

We still have people thinking that increasing the block size limit by a factor of 20 will increase the blockchain size by the same factor.
This is FALSE.
The blockchain will grow slowly over time. It could take us years before we reach this limit. Besides the cost per GB of storage is pretty low these days.
What would be very beneficial is including more options into the fork. If the fork happens, this could be our last one. Once we reach a few million users doing so will be almost impossible (it is hard already).


kingcolex
Legendary
*
Offline Offline

Activity: 1316



View Profile
February 05, 2015, 01:35:35 PM
 #64

We still have people thinking that increasing the block size limit by a factor of 20 will increase the blockchain size by the same factor.
This is FALSE.
The blockchain will grow slowly over time. It could take us years before we reach this limit. Besides the cost per GB of storage is pretty low these days.
What would be very beneficial is including more options into the fork. If the fork happens, this could be our last one. Once we reach a few million users doing so will be almost impossible (it is hard already).
Thank you for going out of your way not to be deterred when they keep falsely stating it will grow to 20MB overnight and centralize, and for keeping on posting the info. I know it has to have been posted at least 50 times now but I guess if each post teaches a person it's worth it.

altcoin hitler
Member
**
Offline Offline

Activity: 84


View Profile
February 05, 2015, 01:44:04 PM
 #65

We still have people thinking that increasing the block size limit by a factor of 20 will increase the blockchain size by the same factor.
This is FALSE.
The blockchain will grow slowly over time. It could take us years before we reach this limit. Besides the cost per GB of storage is pretty low these days.
What would be very beneficial is including more options into the fork. If the fork happens, this could be our last one. Once we reach a few million users doing so will be almost impossible (it is hard already).
Thank you for going out of your way to not be deterred when they keep falsely stating it will grow to 20mb over night and centralize and to keep on posting the info. I know it has to have been posted at least 50 times now but I guess if each post teaches a person it's worth it.

Free (literally) space will be filled, no worries. Very fast. 

King of the real Bitcoin Foundation https://bitcointalk.org/index.php?topic=934517.0
CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 01:50:07 PM
 #66

Whilst I am not against raising the 1 MB limit I do think that this idea that there should be "only 1 chain" is actually rather "stupid".

The very point of decentralisation is not to have a single point of failure - yet this is constantly what Bitcoin is trying to do (set itself up as the single point of failure).

I don't see the future as being just Bitcoin but in fact numerous blockchains that you'll use if you want (making this whole storage issue really a pointless argument).

Trying to have Bitcoin solve every single problem is just silly - it will never suit all purposes and this is why we will have many blockchains.

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
kingcolex
Legendary
*
Offline Offline

Activity: 1316



View Profile
February 05, 2015, 01:50:32 PM
 #67

We still have people thinking that increasing the block size limit by a factor of 20 will increase the blockchain size by the same factor.
This is FALSE.
The blockchain will grow slowly over time. It could take us years before we reach this limit. Besides the cost per GB of storage is pretty low these days.
What would be very beneficial is including more options into the fork. If the fork happens, this could be our last one. Once we reach a few million users doing so will be almost impossible (it is hard already).
Thank you for going out of your way to not be deterred when they keep falsely stating it will grow to 20mb over night and centralize and to keep on posting the info. I know it has to have been posted at least 50 times now but I guess if each post teaches a person it's worth it.

Free (literally) space will be filled, no worries. Very fast. 
What makes you think that? Miners won't include spam transactions without fees or dusting.

kingcolex
Legendary
*
Offline Offline

Activity: 1316



View Profile
February 05, 2015, 01:52:05 PM
 #68

Whilst I am not against raising the 1 MB limit I do think that this idea that their should be "only 1 chain" is actually rather "stupid".

The very point of decentralisation is not to have a single point of failure - yet this is constantly what Bitcoin is trying to do (set itself up as the single point of failure).

I don't see the future as being just Bitcoin but in fact numerous blockchains that you'll use if you want (making this whole storage issue really a pointless argument).

How should it work with 2 chains? A fork where everyone doubles their holdings for free? Which chain would merchants accept? Which chain do we use for exchanges? A dual chain seems like a terrible idea, an dual coin system with an altcoin is better than that.

CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 01:52:58 PM
 #69

I agree but maybe the most important part of that is the links between them.

This is why I designed AT: https://bitcointalk.org/index.php?topic=822100.0

(it allows "atomic" trustless transfers to occur across blockchains)

CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 01:53:40 PM
 #70

How should it work with 2 chains? A fork where everyone doubles their holdings for free? Which chain would merchants accept? Which chain do we use for exchanges? A dual chain seems like a terrible idea, an dual coin system with an altcoin is better than that.

I am not talking about forks but different blockchains.

sickpig
Legendary
*
Offline Offline

Activity: 1232


View Profile
February 05, 2015, 01:56:54 PM
 #71

Whilst I am not against raising the 1 MB limit I do think that this idea that their should be "only 1 chain" is actually rather "stupid".

The very point of decentralisation is not to have a single point of failure - yet this is constantly what Bitcoin is trying to do (set itself up as the single point of failure).

I don't see the future as being just Bitcoin but in fact numerous blockchains that you'll use if you want (making this whole storage issue really a pointless argument).

Trying to have Bitcoin solve every single problem is just silly - it will never suit all purposes and this is why we will have many blockchains.


From the bitcoin-dev mailing list, quoting Wladimir van der Laan¹:

Quote
The block chain is a single channel broadcasted over the entire
world, and I don't believe it will ever be possible nor desirable to broadcast all the
world's transactions over one channel.

The everyone-validates-everything approach doesn't scale. It is however
useful to settle larger transactions in an irreversible, zero-trust way.
That's what makes the bitcoin system, as it is now, valuable.

But it is absurd for the whole world to have to validate every purchase of
a cup of coffee or a bus ticket by six billion others.

Naively scaling up the block size will get some leeway in the short term,
but I believe a future scalable payment system based on bitcoin will be
mostly based on off-blockchain transactions (in some form) or that there
will be a hierarchical or subdivided system (e.g. temporary or per-locale
sidechains).

¹ Bitcoin Core maintainer

Bitcoin is a participatory system which ought to respect the right of self determinism of all of its users - Gregory Maxwell.
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 02:04:27 PM
 #72

It turns out that Bitcoin handling a significant percentage of global transactions in a couple of decades isn't that far-fetched:

Quote from: Gavin Andresen, https://blog.bitcoinfoundation.org/a-scalability-roadmap/
There is a clear path to scaling up the network to handle several thousand transactions per second (“Visa scale”). Getting there won’t be trivial, because writing solid, secure code takes time and because getting consensus is hard. Fortunately technological progress marches on, and Nielsen’s Law of Internet Bandwidth and Moore’s Law make scaling up easier as time passes.

The map gets fuzzy if we start thinking about how to scale faster than the 50%-per-increase-in-bandwidth-per-year of Nielsen’s Law. Some complicated scheme to avoid broadcasting every transaction to every node is probably possible to implement and make secure enough.

But 50% per year growth is really good. According to my rough back-of-the-envelope calculations, my above-average home Internet connection and above-average home computer could easily support 5,000 transactions per second today.

That works out to 400 million transactions per day. Pretty good; every person in the US could make one Bitcoin transaction per day and I’d still be able to keep up.

After 12 years of bandwidth growth that becomes 56 billion transactions per day on my home network connection — enough for every single person in the world to make five or six bitcoin transactions every single day. It is hard to imagine that not being enough; according to the Boston Federal Reserve, the average US consumer makes just over two payments per day.

So even if everybody in the world switched entirely from cash to Bitcoin in twenty years, broadcasting every transaction to every fully-validating node won’t be a problem.

Combined with the added scalability and functionality that sidechains would provide, we really can have a universal apolitical currency. The market will ultimately decide. I believe the market wants a common apolitical digital currency for international trade and commerce.
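The arithmetic in the quote checks out; a rough reproduction using the quote's own assumptions (5,000 tps today, 50%/year bandwidth growth):

```python
tps_today = 5_000
tx_per_day_today = tps_today * 86_400     # seconds per day
print(tx_per_day_today)                   # 432000000, i.e. ~400 million/day

growth_12yr = 1.5 ** 12                   # twelve years of 50%/year, ~130x
tx_per_day_later = tx_per_day_today * growth_12yr
print(tx_per_day_later / 1e9)             # ~56 billion/day
```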

CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 02:06:06 PM
 #73

Couple of quick questions, does it allow trades between one chain and another to be bundled together into single transactions? I'm thinking to allow large amounts of transactions on low cost chains to be carried out as a single transaction on a busy chain. And does it allow something like a meshnet of chains? Transaction fees for incentivisation is another but I'll have a read through first.

I think the ATs could fairly easily be modified to do the sorts of things you want (they are Turing complete after all and are in charge of their own funds).

MrTeal
Legendary
*
Offline Offline

Activity: 1274


View Profile
February 05, 2015, 02:14:51 PM
 #74

We still have people thinking that increasing the block size limit by a factor of 20 will increase the blockchain size by the same factor.
This is FALSE.
The blockchain will grow slowly over time. It could take us years before we reach this limit. Besides the cost per GB of storage is pretty low these days.
What would be very beneficial is including more options into the fork. If the fork happens, this could be our last one. Once we reach a few million users doing so will be almost impossible (it is hard already).
Thank you for going out of your way to not be deterred when they keep falsely stating it will grow to 20mb over night and centralize and to keep on posting the info. I know it has to have been posted at least 50 times now but I guess if each post teaches a person it's worth it.

Free (literally) space will be filled, no worries. Very fast. 
What makes you think that? Miners won't include spam transactions without fee's or dusting.
Miners include transactions without fees right now because there isn't a huge cost to including them, and some pools include a certain number of free ones just because they want to.
I would imagine that charity would not extend past the fork to 20MB blocks, and miners would stop including no-fee low priority transactions in their blocks as the increased risk of an orphan without a similar increase in reward wouldn't be worth it.

However, if the network did immediately balloon out to 20MB/block with 30,000 transactions paying the minimum 0.1mBTC fee the reference client uses, that would still be 3BTC in fees, which is still more than an order of magnitude more than we are currently seeing.
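That fee arithmetic spelled out (the 30,000-transaction figure and 0.1 mBTC minimum fee are the post's assumed values; worked in satoshis to avoid float noise):

```python
SATOSHIS_PER_BTC = 100_000_000
txs_per_block = 30_000          # a full 20MB block of average-size txs
min_fee_sat = 10_000            # 0.1 mBTC minimum fee, in satoshis

total_fees_btc = txs_per_block * min_fee_sat / SATOSHIS_PER_BTC
print(total_fees_btc)           # 3.0 BTC in fees per block
```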
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 02:23:07 PM
 #75

you just assume all miners will allow a low fee, but that simply won't be the case.

because it's not sustainable.

Unfortunately, behaviour that is wrong for the community as a whole can make sense individually and can wreck the entire ecosystem.
Economists call this the "tragedy of the commons": http://en.wikipedia.org/wiki/Tragedy_of_the_commons

In the absence of a block size limit, individual miners are rational to include every transaction with any fee greater than zero. This, however,
erodes miners' pricing power to the extent that they become unprofitable and go out of business, in effect reducing utility and security for all.

Are you really this stupid or are you just trolling?

by the way if miners dont make the rules, who do?

I am not trolling.

You are not necessarily stupid, but you are uninformed if you think miners set the rules.

A miner who violates a rule that is enforced by the majority of miners locks himself into an alternate reality (a fork) that no one else cares about.

The majority of miners can enforce new rules stricter than those currently in existence; this is called a soft fork. Enforcing a minimum fee via a cartel would in effect be a soft fork.

Not even a majority of miners can introduce a rule that is not a subset of those already in existence without getting most ordinary users to upgrade.
An increased block size is not a subset but an extension of the current rules, therefore it falls into this category, also called a hard fork.

Bitcoin's value rests on its users' consensus on the rules. The rules are practically constant, as it is very hard to convince all users to upgrade.

In the case of block size, the interests of ordinary users and miners are not aligned. Ordinary users just want lower (at best, zero) fees. Miners have to protect their pricing power.
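The subset/extension distinction can be sketched as validity predicates (my framing, with toy numbers):

```python
# Old rule: blocks up to 1 MB are valid.
def valid_old(size):
    return size <= 1_000_000

# Soft fork: a stricter rule -- everything it accepts, the old rule accepts too.
def valid_soft(size):
    return size <= 500_000

# Hard fork: an extension -- it accepts blocks the old rule rejects.
def valid_hard(size):
    return size <= 20_000_000

print(valid_old(400_000), valid_soft(400_000))      # True True: old nodes follow along
print(valid_old(5_000_000), valid_hard(5_000_000))  # False True: old nodes fork off
```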
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 02:34:50 PM
 #76

^ The OP never advocated for there being no limit so you're criticizing a straw man proposal:

The problem isn't a limit in general but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties.

A limit with thousands of tps will undoubtedly produce more fees for miners than a limit capping the network at 3 tps.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 02:38:51 PM
 #77


A limit with thousands of tps will undoubtedly produce more fees for miners than a limit capping the network at 3 tps.


This is only true if the per-transaction fee is not zero.
In the absence of a block size limit, there is no incentive to pay a fee. Any positive fee would have to be enforced by a cartel of miners.
CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 02:39:28 PM
 #78

The problem isn't a limit in general but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties.

This statement is just *wrong* (and is itself a straw-man argument).

With other blockchains that also work trustlessly why on earth is anyone being pushed to rely on trusted 3rd parties?

Of course they are not (OP should fix that IMO).

homo homini lupus
Member
**
Offline Offline

Activity: 70


View Profile
February 05, 2015, 02:41:27 PM
 #79



A limit with thousands of tps will undoubtedly produce more fees for miners than a limit capping the network at 3 tps.


that requires those txs to occur first
Nancarrow
Hero Member
*****
Offline Offline

Activity: 494


View Profile
February 05, 2015, 02:47:08 PM
 #80


In absence of a block size limit, there is no incentive to pay fee.

Yes THERE FUCKING IS.

READ, god damn you.

 Angry

If I've said anything amusing and/or informative and you're feeling generous:
1GNJq39NYtf7cn2QFZZuP5vmC1mTs63rEW
RoadStress
Legendary
*
Offline Offline

Activity: 1652


View Profile
February 05, 2015, 02:47:35 PM
 #81


A limit with thousands of tps will undoubtedly produce more fees for miners than a limit capping the network at 3 tps.


This is only true if the per transaction fee is not zero.
In absence of a block size limit, there is no incentive to pay fee. Any positive fee would have to be enforced by a cartel of miner.

There will be lots of services/people that will want to include fees for various reasons.

iCEBREAKER is a troll! He and cypherdoc helped HashFast scam 50 Million $ from its customers !
H/w Hosting Directory & Reputation - https://bitcointalk.org/index.php?topic=622998.0
Nancarrow
Hero Member
*****
Offline Offline

Activity: 494


View Profile
February 05, 2015, 02:49:45 PM
 #82

YOO HOO! GRAU!!! HELLO THERE, GRAU!


The OP gives a valid technical argument for raising block size limit, but is neglecting a financial argument against it.

The miners' income has to be greater than the cost of their work. Miners' income is inflation now, but is expected to be replaced by fees,
since inflation halves every four years. Purchasing power of new coins might be sustained for a while but must converge to zero in the limit.

Transaction fees exist only because there is a competition for block space. Eliminating that competition eliminates the fees and with that mining.

Therefore block space has to become and remain a scarce asset.


Your second and third paragraph contradict each other.

Transaction fees don't ONLY exist because there is a competition for block space. They ALSO exist to pay the miners to secure the network, as you clearly understood before you implicitly denied it. Fees are not an either/or thing. It absolutely ISN'T a case of lifting block limit->eliminates fees->eliminates mining.

We have instead a *feedback* process. LOWER fees (not ZERO fees) means LESS mining (not NO mining) which in turn means LONGER confirmation times (not COMPLETE COLLAPSE) which leads to MORE FEES which leads to mining power switching back on. It's what engineers call a negative feedback loop, designed to keep the hashing rate broadly stable, or at least oscillating within a fairly narrow range.

People really must stop thinking of all the causes and effects in the world as being ON/OFF switches. They aren't. They're analogue dials.

[If you're on board with the idea of bitcoin, you've probably had to deal with people saying a deflationary money supply can't work because NOBODY would ever spend ANY MONEY AT ALL. Same problem. "Less" is not the same as "none". Especially when "Less X" induces "Less Y" which induces "More X".]
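A toy iteration of that feedback loop (my own construction with made-up sensitivities, purely to illustrate the "analogue dial" point, not a model of real mining economics):

```python
def step(fee, hashrate, k_miner=0.5, k_user=0.5):
    # Miners scale hashing toward current fee income...
    hashrate += k_miner * (fee - hashrate)
    # ...and users bid fees up when hashing power (service quality) is low.
    fee += k_user * (1.0 - hashrate)
    return fee, hashrate

fee, hashrate = 0.2, 1.0        # start with depressed fees
for _ in range(50):
    fee, hashrate = step(fee, hashrate)
print(round(fee, 2), round(hashrate, 2))   # both settle near 1.0, not at zero
```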
 

uvt9
Sr. Member
****
Offline Offline

Activity: 301


View Profile
February 05, 2015, 02:59:55 PM
 #83

can someone post a TL;DR of the original post for a lazy and non-technical reader like me  Cry
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 05, 2015, 03:00:51 PM
 #84


A limit with thousands of tps will undoubtedly produce more fees for miners than a limit capping the network at 3 tps.


This is only true if the per transaction fee is not zero.
In absence of a block size limit, there is no incentive to pay fee. Any positive fee would have to be enforced by a cartel of miner.

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it means you're being disingenuous.
R2D221
Hero Member
*****
Offline Offline

Activity: 658



View Profile
February 05, 2015, 03:04:47 PM
 #85

In absence of a block size limit,

But there IS a block size limit. It's only going to be bigger, but not infinite.

An economy based on endless growth is unsustainable.
Lauda
Legendary
*
Offline Offline

Activity: 1694


GUNBOT Licenses -20% with ref. code 'GrumpyKitty'


View Profile WWW
February 05, 2015, 03:06:42 PM
 #86

We still have people thinking that increasing the block size limit by a factor of 20 will increase the blockchain size by the same factor.
This is FALSE.
The blockchain will grow slowly over time. It could take us years before we reach this limit. Besides the cost per GB of storage is pretty low these days.
What would be very beneficial is including more options into the fork. If the fork happens, this could be our last one. Once we reach a few million users doing so will be almost impossible (it is hard already).
Thank you for going out of your way to not be deterred when they keep falsely stating it will grow to 20mb over night and centralize and to keep on posting the info. I know it has to have been posted at least 50 times now but I guess if each post teaches a person it's worth it.

Free (literally) space will be filled, no worries. Very fast.  
What makes you think that? Miners won't include spam transactions without fee's or dusting.
I'm just doing my part. Also I'm realizing just how limited and stubborn people here can be.
Do not reply to that guy; he is a trolling shill.

The problem isn't a limit in general but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties.

This statement is just *wrong* (and is itself a straw-man argument).

With other blockchains that also work trustlessly why on earth is anyone being pushed to rely on trusted 3rd parties?

Of course they are not (OP should fix that IMO).

You're not just limiting the number of transactions per block, you're limiting the development of the potential services of tomorrow.


CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 03:08:35 PM
 #87


The problem isn't a limit in general but that 1MB is so low that under any meaningful adoption scenario it will push all individual users off the blockchain to rely on trusted third parties.

This statement is just *wrong* (and is itself a straw-man argument).

With other blockchains that also work trustlessly why on earth is anyone being pushed to rely on trusted 3rd parties?

Of course they are not (OP should fix that IMO).

You're not just limiting the amount of transactions per block, you're limiting the development of potential services of tomorrow.

Sorry - but that makes *zero* sense (especially as I stated that I am not against raising the 1 MB limit in the first place).

grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 03:09:16 PM
 #88

We have instead a *feedback* process. LOWER fees (not ZERO fees) means LESS mining (not NO mining) which in turn means LONGER confirmation times (not COMPLETE COLLAPSE) which leads to MORE FEES which leads to mining power switching back on.

This is not how Bitcoin works.
Instead: Less mining power leads to difficulty adjustment that restores confirmation time.

Lower mining income simply leads to less security, which will lead to less use and lower fees -> less mining power.
A positive feedback loop wrecking the ecosystem.
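For reference, the adjustment mentioned above works roughly like this (a simplified sketch of the 2016-block retarget rule with its 4x clamp, not the reference client's exact integer arithmetic):

```python
TARGET_TIMESPAN = 2016 * 600    # two weeks of 10-minute blocks, in seconds

def retarget(difficulty, actual_timespan):
    # Bitcoin clamps the measured timespan to limit per-period swings.
    actual_timespan = max(TARGET_TIMESPAN // 4,
                          min(actual_timespan, TARGET_TIMESPAN * 4))
    return difficulty * TARGET_TIMESPAN / actual_timespan

# If half the hashrate leaves, the 2016 blocks take twice as long, and the
# next adjustment halves difficulty, restoring ~10-minute confirmations:
print(retarget(1000.0, 2 * TARGET_TIMESPAN))   # 500.0
```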
Lauda
Legendary
*
Offline Offline

Activity: 1694


GUNBOT Licenses -20% with ref. code 'GrumpyKitty'


View Profile WWW
February 05, 2015, 03:12:52 PM
 #89

Sorry - but that makes *zero* sense (especially as I stated that I am not against raising the 1 MB limit in the first place).

I misunderstood your post, sorry.
My post does however make sense; also, I was referring to the likes of the Lighthouse project.


grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 03:13:16 PM
 #90

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it, means you're being disingenuous.

But there IS a block size limit. It's only going to be bigger, but not infinite.

Increasing the limit to avoid raising fees is in effect asking for no limit, as the reason for the limit is to ensure sufficient fees are paid.
CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 03:15:19 PM
 #91

Sorry - but that makes *zero* sense (especially as I stated that I am not against raising the 1 MB limit in the first place).

I misunderstood your post, sorry.
My post does however make sense, also I was referring to the like of the Lighthouse project.

No problem but perhaps it would make more sense if you could expand on what you posted a bit.

homo homini lupus
February 05, 2015, 03:18:46 PM  #92

can someone post a TL;DR of the original post for a lazy and non-technical reader like me  Cry

the tl;dr version is here:
https://bitcointalk.org/index.php?topic=941331.msg10364386#msg10364386
R2D221
February 05, 2015, 03:20:07 PM  #93

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it, means you're being disingenuous.

But there IS a block size limit. It's only going to be bigger, but not infinite.

Increasing the limit to avoid raising fees is in effect asking for no limit, as the reason for the limit is to ensure sufficient fees are paid.

What maths are required to prove that 20MB limit = no limit?

Nancarrow
February 05, 2015, 03:20:31 PM  #94

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it, means you're being disingenuous.

But there IS a block size limit. It's only going to be bigger, but not infinite.

Increasing the limit to avoid raising fees is in effect asking for no limit, as the reason for the limit is to ensure sufficient fees are paid.

This is a valid point. As far as incentivising fees goes, a block limit that is always just a bit bigger than it needs to be is functionally equivalent to no block limit at all.

And actually, I think I see your point that IF this were the only mechanism in play regarding how much people pay in fees, it could pose a problem. My counter is that it is NOT the only mechanism in play.

ETA: R2D221, hope this clarifies it for you too.

amincd
February 05, 2015, 03:21:50 PM  #95

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it, means you're being disingenuous.

But there IS a block size limit. It's only going to be bigger, but not infinite.

Increasing the limit to avoid raising fees is in effect asking for no limit, as the reason for the limit is to ensure sufficient fees are paid.

It's not a binary option. Raising the limit at a moderate pace, so that fees don't have to increase substantially with increasing adoption, is a middle-ground solution that will keep average fees affordable but non-zero.
Lauda
February 05, 2015, 03:23:40 PM  #96


No problem but perhaps it would make more sense if you could expand on what you posted a bit.

Well, one of the 'anti' arguments is that there is time and that this should be done in the future (if at all, according to them). What they do not realize is that it's not just the number of transactions available to users that is limited.

Quote
Currently you cannot accept more than 684 pledges for your project. This is due to limitations in the Bitcoin protocol - the final contract that claims the pledges is a single Bitcoin transaction, and thus is limited to 500 kilobytes by the Bitcoin block size and protocol relay rules. Lighthouse therefore enforces a minimum pledge size of whatever the goal amount is, divided by the max number of pledges. The more you try and raise, the larger the minimum buy-in becomes.

A future version of Lighthouse could optimise its usage of transaction space to bump up the maximum number of pledges by another few hundred. But ultimately, the fundamental limit is imposed by the Bitcoin block size.
That's taken right from the FAQ of the project itself. This is just one example of potential services that won't be able to be fully developed (if at all).

https://www.vinumeris.com/lighthouse
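The constraint quoted from the FAQ is simple arithmetic. A quick sketch of it (the 684 cap comes from the FAQ above; the `min_pledge` helper is hypothetical, written only to illustrate, and is not Lighthouse code):

```python
# Lighthouse's constraint, per the FAQ quoted above: the final claiming
# transaction must fit in ~500 kB under Bitcoin's relay rules, which
# caps the number of pledges at 684, so the minimum pledge is goal/684.
# `min_pledge` is a hypothetical helper, not actual Lighthouse code.
MAX_PLEDGES = 684

def min_pledge(goal_btc: float) -> float:
    """Smallest pledge the cap allows for a given fundraising goal."""
    return goal_btc / MAX_PLEDGES

# "The more you try and raise, the larger the minimum buy-in becomes":
for goal in (10, 100, 1000):
    print(goal, "BTC goal ->", round(min_pledge(goal), 4), "BTC minimum pledge")
```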


CIYAM (Ian Knowles - CIYAM Lead Developer)
February 05, 2015, 03:30:14 PM  #97

That's taken right from the FAQ of the project itself. This is just 1 example of potential services that won't be able to get developed completely (if at all).

They just haven't designed it right (so the Bitcoin limits are not really an issue).

Although, not to go off-topic, my AT project will be launching "crowdfund" very soon, and that could actually work on Bitcoin if Bitcoin were to adopt AT (with no limit on the number of pledges).

grau (bits of proof)
February 05, 2015, 03:30:48 PM  #98

And actually, I think I see your point that IF this were the only mechanism in play regarding how much people pay in fees, it could pose a problem. My counter is that it is NOT the only mechanism in play.

ETA: R2D221, hope this clarifies it for you too.

What else would make people pay a fee? I do not think altruism would pay for an infrastructure costing 300 million USD and rising.

It's not a binary option. Raising the limit at a moderate pace, so that fees don't have to increase a substantial amount with increasing adoption, is a middle ground solution, that will lead to average fees remaining affordable but not-zero.

There are quite a few constants in Bitcoin that one could argue about; that is what we do. It is, however, important to do it for the right reason.
Avoiding fees is not the right reason.
grau (bits of proof)
February 05, 2015, 03:42:10 PM  #99

It's not a binary option. Raising the limit at a moderate pace, so that fees don't have to increase a substantial amount with increasing adoption, is a middle ground solution, that will lead to average fees remaining affordable but not-zero.

There are quite a few constants in Bitcoin that one could argue about; that is what we do. It is, however, important to do it for the right reason.
Avoiding fees is not the right reason.

Extending on the above:

It is rather difficult to substantiate an algorithm that would set the course for the future. Given the huge number of unknown parameters, the constant we have is preferred by Occam's razor.
Nancarrow
February 05, 2015, 04:04:10 PM  #100

What else would make people pay a fee? I do not think altruism would pay for an infrastructure costing 300 million USD and rising.

What else? The miners don't mine if it's not profitable. And it's NOT altruism to pay a fee to the miners. You can't use bitcoin without miners, so if you want to use bitcoin, you pay a fee. If you DON'T want to use bitcoin, then surely the question of whether or not you WOULD choose to pay a fee is irrelevant?

There are TWO incentives to pay fees: getting your transaction into a block, and having a bitcoin currency at all. Increasing the blocksize may attenuate the first incentive, but it doesn't remove the second at all.

sickpig
February 05, 2015, 04:10:23 PM  #101

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it, means you're being disingenuous.

But there IS a block size limit. It's only going to be bigger, but not infinite.

Increasing the limit to avoid raising fees is in effect asking for no limit, as the reason for the limit is to ensure sufficient fees are paid.

Maybe this is an impertinent question, but isn't the main goal of lifting the max block size to increase the number of tx/s? (Avoiding raising fees is a side effect at best, no?)
And more to the point, it seems to me that the 1MB limit was introduced as a countermeasure against potential network flooding; am I missing something obvious?


homo homini lupus
February 05, 2015, 04:16:02 PM  #102

But the OP is not proposing an absence of a block size limit. This is a straw man argument, and the fact that it's the second time you've made it, means you're being disingenuous.

But there IS a block size limit. It's only going to be bigger, but not infinite.

Increasing the limit to avoid raising fees is in effect asking for no limit, as the reason for the limit is to ensure sufficient fees are paid.

What maths are required to prove that 20MB limit = no limit?

Before you reach that limit you'll upgrade to 50MB and quantitatively ease another 10 million BTC into existence to cover miners; that's how it'll end up. (Not really, because people won't even use the 20MB version.)
homo homini lupus
February 05, 2015, 04:18:46 PM  #103

Guys, these txs you dream up do not occur on the network right now. A large percentage of the txs that do currently occur are microtransactions.
The fork proposal can be seen as a proposal to secure microtransactions with BTC hashpower for free.  Huh
Nancarrow
February 05, 2015, 04:19:49 PM  #104

Before you reach that limit you'll upgrade to 50MB and quantitatively ease another 10 million BTC into existence to cover miners; that's how it'll end up. (Not really, because people won't even use the 20MB version.)

What plausible reasons can you give for either of those assertions?

Why *wouldn't* people use the 20MB version?

Why *would* the network majority agree to create another 10 million BTC?

homo homini lupus
February 05, 2015, 04:24:42 PM  #105

Before you reach that limit you'll upgrade to 50MB and quantitatively ease another 10 million BTC into existence to cover miners; that's how it'll end up. (Not really, because people won't even use the 20MB version.)

What plausible reasons can you give for either of those assertions?

Why *wouldn't* people use the 20MB version?

Why *would* the network majority agree to create another 10 million BTC?


- You propose to raise the limit now without an immediate need; you'll do it again (what else?).

- Why they wouldn't use it has already been explained earlier.

- The majority will have to agree on that later, because mining becomes unprofitable once the inflation fades out, especially if the number of txs you project does not actually occur (in case your fortune-telling skills fail, which they will).
grau (bits of proof)
February 05, 2015, 04:30:37 PM  #106

What else would make people pay a fee? I do not think altruism would pay for an infrastructure costing 300 million USD and rising.
What else? The miners don't mine if it's not profitable.

Miners not mining does not force people to pay fees. Difficulty would decrease and transactions would confirm again with fewer miners.
Most would not even notice.

Some would see that security decreases, but will that make them pay more? Not necessarily, since their security preferences differ.
Some might think it is still sufficient and continue to free-ride, and if it no longer is, then leave for a better chain.

Again, this is a typical "tragedy of the commons" scenario.
MrTeal
February 05, 2015, 04:36:32 PM  #107

What else would make people pay a fee? I do not think altruism would pay for an infrastructure costing 300 million USD and rising.
What else? The miners don't mine if it's not profitable.

Miners not mining does not force people to pay fees. Difficulty would decrease and transactions would confirm again with fewer miners.
Most would not even notice.

Some would see that security decreases, but will that make them pay more? Not necessarily, since their security preferences differ.
Some might think it is still sufficient and continue to free-ride, and if it no longer is, then leave for a better chain.

Again, this is a typical "tragedy of the commons" scenario.

I don't think he's saying that miners won't mine at all, just that they won't mine no-fee transactions. Already with 1MB blocks, most pools limit the number of no-fee transactions. There is a cost to including a transaction in a block for a miner, and there's no economic incentive to do so without fees. No pool operator in their right mind would allow 20MB worth of no-fee transactions into their blocks; that doesn't mean they won't include every TX with a fee large enough to justify its inclusion.
grau (bits of proof)
February 05, 2015, 05:06:55 PM  #108

Bitcoin's current inflation rate is 9.98%; it will drop to 4.26% at block 420000, expected on 29th July 2016.

Current inflation hardly covers the cost of the current mining infrastructure; as a consequence we have seen difficulty stalling for several adjustment cycles.

This means some combination of the following needs to happen to sustain the security level past block 420000 at its current level:

a) Bitcoin's purchasing power increases at least 2.32-fold, meaning to more than 506 USD
b) Transaction fees supply 17.14 BTC per block

We all hope that the actual outcome will be more of a) than of b).

a) is a bet.
b) implies transaction fees of ca. 0.005 BTC per transaction with fully packed 1MB blocks, or 0.0003 BTC with fully packed 20MB blocks.

Note that the above would only sustain the current security level.
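A quick sanity check of figure b), under the assumption of ~250-byte average transactions (an assumption for illustration, so ~4,000 fit in a full 1 MB block):

```python
# Back-of-the-envelope check of figure b) above, assuming an average
# transaction size of ~250 bytes (an assumption, not a protocol rule).
TARGET_FEES_PER_BLOCK = 17.14  # BTC per block needed, per the post
TX_SIZE = 250                  # bytes, assumed average

for block_mb in (1, 20):
    txs_per_block = block_mb * 1_000_000 // TX_SIZE
    fee_per_tx = TARGET_FEES_PER_BLOCK / txs_per_block
    print(f"{block_mb} MB blocks: {txs_per_block} txs, ~{fee_per_tx:.5f} BTC each")
# 1 MB gives ~0.0043 BTC per tx (the post rounds to 0.005);
# 20 MB gives ~0.0002 BTC per tx (the post rounds to 0.0003).
```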
grau (bits of proof)
February 05, 2015, 05:09:55 PM  #109

I don't think he's saying that miners won't mine at all, just that they won't mine no-fee transactions.

A rational miner mines all non-zero-fee transactions he sees until the block is full, since including one has virtually no cost. If he did not include them, he would leave them on the table for another miner.

Imposing a lower limit on fees could only be effective if miners built a cartel for it, and that is not the kind of regulation I favor.
MrTeal
February 05, 2015, 05:17:57 PM  #110

I don't think he's saying that miners won't mine at all, just that they won't mine no-fee transactions.

A rational miner mines all non-zero-fee transactions he sees until the block is full, since including one has virtually no cost. If he did not include them, he would leave them on the table for another miner.

Imposing a lower limit on fees could only be effective if miners built a cartel for it, and that is not the kind of regulation I favor.
Virtually no cost to include one, but not zero cost. Building a 20MB block full of 1-satoshi-fee transactions would not make economic sense, as the increased orphan risk would outweigh the benefit of the less than 1 mBTC in total fees.
DeathAndTaxes (Gerald Davis)
February 05, 2015, 05:21:06 PM  #111

I don't think he's saying that miners won't mine at all, just that they won't mine no-fee transactions.

A rational miner mines all non-zero-fee transactions he sees until the block is full, since including one has virtually no cost. If he did not include them, he would leave them on the table for another miner.

Imposing a lower limit on fees could only be effective if miners built a cartel for it, and that is not the kind of regulation I favor.

You keep saying this but the cost is not zero.  The cost of orphaned blocks is very real.  If you increase propagation time by six seconds, the probability that your block will be orphaned increases by about 1%.  Those transactions have to pay more than the estimated loss due to increased propagation delay, or the miner takes a net loss.  As margins squeeze and the subsidy declines, miners that are bad at math will quickly become bankrupt miners, replaced by miners less bad at math.

Another way to look at it: if you double the size of a block you double the chance of it being orphaned, but since miners include the highest-fee transactions first, doubling the size of the block does not double your gross revenue.  At some point there is a marginal transaction which, despite having a fee, results in a net loss if included.  A rational miner will draw the minimum fee policy just above that line.
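The six-seconds-to-1% figure is consistent with a simple Poisson model of block arrivals (a standard approximation, not necessarily D&T's exact method):

```python
import math

# Orphan-risk sketch: if blocks arrive as a Poisson process with a mean
# interval of 600 s, the chance a competitor finds a block during your
# extra propagation delay t is 1 - exp(-t/600). A standard approximation,
# not necessarily the exact model used in the post above.
def orphan_prob(delay_s: float, interval_s: float = 600.0) -> float:
    return 1.0 - math.exp(-delay_s / interval_s)

print(f"{orphan_prob(6):.3%}")  # ~1% for a six-second delay, as stated
```

A rational miner then includes a marginal transaction only if its fee exceeds the expected revenue lost to that extra orphan risk.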
grau (bits of proof)
February 05, 2015, 05:34:31 PM  #112

The cost of orphaned blocks is very real.

I stated the same. We might disagree on the magnitude.

Since there is no marginal cost in including a transaction to the current block,

let me be more precise:
There is a marginal cost implied by block propagation time being proportional to size, and orphan rate being proportional to propagation time. There is also a computational cost of updating the merkle tree and updating miners with it. These marginal costs are, however, magnitudes below the lowest non-zero fees paid today.

A rational miner will draw the minimum fee policy just above that line.

Allowing for that rational line being above zero, the question is whether that rational limit pays for the security we need to sustain. See my previous calculation of the transaction fees needed.
wilth1
February 05, 2015, 05:38:47 PM  #113


Why not implement maximum block size adjustment alongside mining difficulty adjustment, using the same mechanism?

Rather than an arbitrary 20MB limit or rolling quadruple/exponential maximum size increases, why not incorporate a self-adjusting maximum block size based on the number of the last n blocks solved that hit the existing hard limit?
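One possible shape for such a rule (purely illustrative: the function, the thresholds, and the 95%-full test are all invented for this sketch; no such rule exists in Bitcoin):

```python
# Illustrative self-adjusting cap in the spirit of the suggestion above:
# every retarget interval, grow the cap if most recent blocks hit it,
# shrink it if almost none did. All names and thresholds are invented.
ADJUST_INTERVAL = 2016  # blocks, same cadence as difficulty retargeting

def adjust_max_block_size(current_max, recent_sizes):
    full = sum(1 for s in recent_sizes if s >= current_max * 0.95)
    ratio = full / len(recent_sizes)
    if ratio > 0.5:                              # cap is binding: double it
        return current_max * 2
    if ratio < 0.05:                             # cap barely reached: halve it
        return max(1_000_000, current_max // 2)  # but never below 1 MB
    return current_max

# 90% of recent blocks full -> the cap doubles from 1 MB to 2 MB
sizes = [1_000_000] * 90 + [300_000] * 10
print(adjust_max_block_size(1_000_000, sizes))
```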
DeathAndTaxes (Gerald Davis)
February 05, 2015, 05:41:54 PM  #114

Quote
Allowing for that rational line being above zero, the question is if that rational limit pays for the security we need to sustain. See my previous calc. on transaction fees needed.

https://gist.github.com/gavinandresen/5044482

I am not sure the numbers are completely sound, so I am not saying to rely on them like gospel; treat them more as a thought exercise.  I think the underlying study that Gavin relied on has some pretty worst-case assumptions baked in, and the average miner is probably better connected than the average non-miner.  Still, even assuming the estimate is 10x the actual cost, the idea that there is no cost is simply not supported.  A 1-satoshi fee (or even a 100-satoshi fee) is for all intents and purposes a no-fee transaction.

As for fees making up the difference of the subsidy cut ... they won't.  However, at the current time the network is probably overprotected relative to the actual economic value of the transactions occurring on it.  Subsidies tend to do that in any market.  So over the next five years the difference caused by the two halvings will be compensated by a combination of a) some reduction in overall security, b) a rise in the exchange rate, as miners' costs are mostly in fiat terms, and c) a rise in overall block fees.
balu2
February 05, 2015, 05:49:48 PM  #115

Quote
Allowing for that rational line being above zero, the question is if that rational limit pays for the security we need to sustain. See my previous calc. on transaction fees needed.

https://gist.github.com/gavinandresen/5044482

I am not sure the numbers are completely sound, so I am not saying to rely on them like gospel; treat them more as a thought exercise.  I think the underlying study that Gavin relied on has some pretty worst-case assumptions baked in, and the average miner is probably better connected than the average non-miner.  Still, even assuming the estimate is 10x the actual cost, the idea that there is no cost is simply not supported.  A 1-satoshi fee (or even a 100-satoshi fee) is for all intents and purposes a no-fee transaction.

As for fees making up the difference of the subsidy cut ... they won't.  However, at the current time the network is probably overprotected relative to the actual economic value of the transactions occurring on it.  Subsidies tend to do that in any market.  So over the next five years the difference caused by the two halvings will be compensated by a combination of a) some reduction in overall security, b) a rise in the exchange rate, as miners' costs are mostly in fiat terms, and c) a rise in overall block fees.

The problem is: on gavincoin, low fees will very quickly lead to reaching the block limit again.
If the space is there and it's cheap, it will be filled with one kind of crap or another.
So reaching the 20MB limit will likely occur way sooner than most think if fees are too low.

Gavincoin is basically a proposal for socialist central planning (first on fees and later on money supply).
grau (bits of proof)
February 05, 2015, 05:55:05 PM  #116

https://gist.github.com/gavinandresen/5044482

I am not sure the numbers are sound (I think the underlying study that Gavin relied on has some pretty worst case assumptions) but even assuming they are 10x actual cost the idea that there is no cost is simply not supported.   A 1 satoshi fee (or even 100 satoshi fee) for all intents and purposes is a no fee transaction.

Let me go with his assumptions, implying a rational minimum fee of 0.0008 BTC for a 250-byte transaction.

That would imply a total of 3.2 BTC in fees for full 1 MB blocks. We have not even seen that magnitude yet (the exceptions were only botched transactions).
That means we are not even close to the block size limit squeezing out meaningful fees, so why increase it?
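Checking that arithmetic against Gavin's assumed rational minimum of 0.0032 BTC/KB (the figure from the linked gist; the 250-byte transaction size is an assumption for illustration):

```python
# Verifying the 0.0008 BTC and 3.2 BTC figures above, assuming
# 250-byte transactions and the 0.0032 BTC/KB rational minimum
# from the gist linked earlier in the thread.
FEE_PER_KB = 0.0032   # BTC/KB, Gavin's assumed rational minimum
TX_BYTES = 250        # assumed average transaction size

fee_per_tx = FEE_PER_KB * TX_BYTES / 1000   # -> 0.0008 BTC per transaction
txs_per_block = 1_000_000 // TX_BYTES       # 4000 txs in a full 1 MB block
total_fees = fee_per_tx * txs_per_block     # -> 3.2 BTC per full block
print(round(fee_per_tx, 6), round(total_fees, 3))
```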
solex (100 satoshis -> ISO code)
February 05, 2015, 06:04:46 PM  #117

https://gist.github.com/gavinandresen/5044482

I am not sure the numbers are sound (I think the underlying study that Gavin relied on has some pretty worst case assumptions) but even assuming they are 10x actual cost the idea that there is no cost is simply not supported.   A 1 satoshi fee (or even 100 satoshi fee) for all intents and purposes is a no fee transaction.

Let me go with his assumptions, implying a rational minimum fee of 0.0008 BTC for a 250-byte transaction.

That would imply a total of 3.2 BTC in fees for full 1 MB blocks. We have not even seen that magnitude yet (the exceptions were only botched transactions).
That means we are not even close to the block size limit squeezing out meaningful fees, so why increase it?

Because the block limit is not creating a fees market. The block reward is too high and masks the theoretical "bidding" process for block space, leaving aside the fact that wallet software does not offer users a way to update the fee on a transaction until it has been dropped from all the mempools.

However, 20MB is large enough for a proper fees market to develop. Currently, blocks >900KB carry about 0.25 BTC in fees. That scales to ~5 BTC in fees for a full 20MB block; with the block reward down to 6.25, the fees and reward are similar, and the fees market should be healthy.

phillipsjk (Let the chips fall where they may.)
February 05, 2015, 06:06:39 PM  #118

That's taken right from the FAQ of the project itself. This is just 1 example of potential services that won't be able to get developed completely (if at all).

They just haven't designed it right (so the Bitcoin limits are not really an issue).

Although let's not go off-topic my AT project will be launching "crowdfund" very soon and that could actually work on Bitcoin if Bitcoin were to adopt AT (with no limits for the number of pledges).

Turing-completeness is a non-starter.

The Bitcoin scripts were deliberately designed to not be Turing complete.

I have to admit, the 28-minute (thorough) block propagation time (for a 20MB block) mentioned earlier does seem realistic to me. That works out to about 14 hops, with each node spending 2 minutes on block verification. Edit: Forgot to add: Turing completeness, even with cycle limits, will obviously increase block verification time.

This lends credence to Lauda's contention that the blocks will not fill up immediately. Though I have seen it argued that the large miners will have an incentive to push smaller miners out with large blocks (filled with garbage if need be).

Let me go with his assumptions, implying a rational minimum fee of 0.0008 BTC for a 250-byte transaction.

That would imply a total of 3.2 BTC in fees for full 1 MB blocks. We have not even seen that magnitude yet (the exceptions were only botched transactions).
That means we are not even close to the block size limit squeezing out meaningful fees, so why increase it?

Because a change like this has to be planned months in advance. During the next bubble it will be too late to meaningfully accommodate the increased transaction volume. I suppose that may be the point: to temper speculation with high fees. The difficulty is that the institutions pushing the next bubble have the ability to print their own money.

CIYAM (Ian Knowles - CIYAM Lead Developer)
February 05, 2015, 06:09:06 PM  #119

Turing-completeness is a non-starter.

The Bitcoin scripts were deliberately designed to not be Turing complete.

A statement with nothing to back it up (the reason Bitcoin is designed that way is because it was designed that way).

Peter R
February 05, 2015, 06:20:18 PM  #120

In response to those claiming that a hard fork to increase the blocksize limit will hurt the miners' ability to collect fee revenue:

The empirical data we have so far does not support the notion that the miners will be starved of fees or that blocks will be full of low fee transactions if the blocksize limit is increased.  If we inspect the fees paid to miners per day in US dollars over the lifetime of the network (avg blocksize << max blocksize), we see that total fee revenue, on average, has grown with increases in the daily transaction volume.



The total daily fees, F, have actually grown as the number of transactions, N, raised to the power of 2.7.  Although I don't expect this F ~ N^2.7 relationship to hold forever, those suggesting that total fees would actually decrease with increasing N have little data to support the claim (although during our present bear market we've seen a reduction in the daily fees paid to miners despite an increase in N).

Past behaviour is no guarantee of future behaviour, but historically blocks don't get filled with low-fee transactions, and historically the total fees paid to miners increase with increased transaction volume.
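For the curious, an exponent like the 2.7 above is typically estimated by fitting a line to (log N, log F). A sketch of the method on synthetic data with a known exponent (the data here is made up purely to illustrate the fit; it is not the real fee dataset):

```python
import math
import random

# How an exponent like F ~ N^2.7 is estimated: ordinary least squares
# on (log N, log F). The data below is synthetic, generated with a
# known exponent of 2.7 plus multiplicative noise; NOT real fee data.
random.seed(1)
data = [(n, (n ** 2.7) * random.uniform(0.8, 1.25))
        for n in range(1000, 100000, 1000)]

xs = [math.log(n) for n, _ in data]
ys = [math.log(f) for _, f in data]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"estimated exponent: {slope:.2f}")  # recovers ~2.7 despite the noise
```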

jabo38 (mining is so 2012-2013)
February 05, 2015, 06:22:17 PM  #121

Sometimes I debate D&T on some issues, but not this time.  Great post. 

justusranvier
February 05, 2015, 06:31:50 PM  #122

Conclusion
The blockchain permanently restricted to 1MB is great if you are a major bank looking to co-opt the network for a next generation limited trust settlement network between major banks, financial service providers, and payment processors.   It is a horrible idea if you even want to keep open the possibility that individuals will be able to participate in that network without using a trusted third party as an intermediary.
I agree 100%.

There will probably be a role for settlement networks in the future, but even if they do exist those settlement networks should not be artificially subsidized by a block size limit.
Peter R
February 05, 2015, 06:36:37 PM  #123

Fantastic work, DeathAndTaxes!

Two things really stuck out for me after reading your post:

(1) How ridiculously low the 1 MB limit is if we envision any sort of "success case" for bitcoin. 

(2) How the "don't-increase-the-blocksize-limit-because-centralization" argument is misguided.  You clearly showed that not increasing the blocksize limit could lead to greater centralization by pricing out individuals from trustless access to the blockchain.   

CIYAM (Ian Knowles - CIYAM Lead Developer)
February 05, 2015, 06:44:55 PM  #124

(2) How the "don't-increase-the-blocksize-limit-because-centralization" argument is misguided.  You clearly showed that not increasing the blocksize limit could lead to greater centralization by pricing out individuals from trustless access to the blockchain.  

He didn't, as he used a "straw-man" argument that people would use centralised authorities when they could just as easily not (again, I am not against raising the 1MB limit, but using the wrong arguments for it is simply not persuasive).

grau (bits of proof)
February 05, 2015, 06:47:32 PM  #125

Because the block limit is not creating a fees market. The block reward is too high and masks the theoretical "bidding" process for block space

You are right that there is no fees market yet, but not for the reason you state.
The block limit is not currently creating a market because it is not yet tight, and bigger blocks would not be tight for even longer.

Miners' only freedom is the freedom to exclude transactions. Can that be used to press users into paying higher fees?
Only if either

1) the space is scarce,
2) it is supported by a rational lower fee limit that no sane miner crosses, or
3) it is played as a cartel.

You want to eliminate 1), hoping that 2) holds so we do not fall through to 3).

Gavin quantified a rational lower limit on transaction fees as 0.0032 BTC/KB. A miner that includes a transaction paying less is no longer compensated for his increased orphan cost. Miners currently accept even less, which tells us miners are either dumb or the rational limit is lower. I suspect people deploying hundreds of millions in equipment would not act dumb for a prolonged period.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 05, 2015, 06:51:12 PM
 #126

1) the space is scarce
Misinformation.

Space in a block is always scarce, regardless of whether or not there's a protocol limit.

The only way to make space in a block non-scarce is to invent a way of transmitting data that requires zero energy, zero time, and exceeds the Shannon limit.

Whether or not space in a block is scarce depends on physics, not on software design.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 07:19:10 PM
 #127

1) the space is scarce
Misinformation.

Space in a block is always scarce, regardless of whether or not there's a protocol limit.

The only way to make space in a block non-scarce is to invent a way of transmitting data that requires zero energy, zero time, and exceeds the Shannon limit.

Whether or not space in a block is scarce depends on physics, not on software design.

No.

What you describe is less than the rational limit in 2).
johnyj
Legendary
*
Offline Offline

Activity: 1834


Beyond Imagination


View Profile
February 05, 2015, 07:19:57 PM
 #128

You must be able to broadcast that huge block to most of the nodes within 10 minutes. I haven't seen more recent research in this area, but there is a paper from 2013:


http://www.tik.ee.ethz.ch/file/49318d3f56c1d525aabf7fda78b23fc0/P2P2013_041.pdf

Based on this research, it took 0.25 seconds per KB for a transaction to reach 90% of the network. In other words, a 1MB block will take 256 seconds to broadcast to the majority of nodes, and that is over 4 minutes.

When the block size reaches 10MB, you will have a broadcast time of 40 minutes, meaning that before your block reaches the far end of the network, those nodes will have already mined 3 extra blocks, so your block is always orphaned by them. And the whole network will disagree about which segment has the longest chain, and thus fork into different chains.

Gavin's proposal is to let mining pools and farms connect to high-speed nodes on the internet backbone. That is reasonable, since propagation time is only meaningful for miners: your transaction will be picked up by the mining nodes closest to you, and if those mining nodes have enough bandwidth they can keep up. But how much bandwidth is really needed to broadcast a 10MB message within a couple of minutes between hundreds of high-speed nodes still needs to be tested. And this is the centralization risk some worry about: only those with ultra-high-speed internet connections can act as nodes (I'm afraid Chinese farms would be dropped out, since their connection to the outside world is extremely slow; they would just fork to their own chain inside mainland China).

I don't know how you come to those assumptions based on that research.

Quote
the block message may be very large — up to 500kB at the time of writing.

Quote
The median time until a node receives a block is 6.5 seconds whereas the mean is at 12.6 seconds.

Quote
For blocks, whose size is larger than 20kB, each kilobyte in size costs an additional 80ms delay until a majority knows about the block.

They do not mention the average size of the blocks they measured. Let's assume all their blocks were 0KB: 12.6 seconds for that, then add 80 ms per additional KB. 80 ms * 1024 * 20 is about 27.3 minutes; add the original 12.6 seconds, and you get roughly 28 minutes for 20MB.

Of course, 28 minutes is still long, but that is based on 2013 data. I assume nodes now have improved their verification speed and have more bandwidth. New measurements could and should be made to verify that propagation speed will not become an issue.

I just took the numbers from that chart; their paper says 0.08 s/KB but the chart shows 0.25 s/KB, no big difference.

Ideally, you would like to keep the broadcast time below 1 minute, to make sure the network does not fork into different chains and to reduce orphaned blocks. Currently some connections to China are about 20KB/second, meaning 1MB of data takes about a minute just to reach their network. Of course China has far fewer nodes than the rest of the world, but they do have a large amount of hashing power.

For a 10MB block, the bandwidth requirement would be about 2 Mb/s, which is quite high if you consider intercontinental connections. Hopefully, before we reach that stage, network bandwidth will have been upgraded.
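The back-of-the-envelope estimates traded above (a fixed base delay plus a per-KB cost) can be reproduced with a short sketch; the 12.6 s base and 80 ms/KB slope are the 2013 paper's figures, and the function name is illustrative:

```python
def propagation_seconds(block_kb, base=12.6, per_kb=0.080):
    # Mean time for a majority of nodes to receive a block, per the 2013
    # measurements: a fixed ~12.6 s base plus ~80 ms per KB of block size.
    return base + per_kb * block_kb

for mb in (1, 10, 20):
    t = propagation_seconds(mb * 1024)
    print(f"{mb:2d} MB block: ~{t / 60:.1f} minutes")
# 1 MB -> ~1.6 min, 10 MB -> ~13.9 min, 20 MB -> ~27.5 min
```

This matches the "roughly 28 minutes for 20MB" figure in the thread; as noted, the 2013 constants are almost certainly stale, so the shape of the estimate matters more than the numbers.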


justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 05, 2015, 07:23:34 PM
 #129

What you describe is less than the rational limit in 2).

First you start out by grossly misusing economic terms.

Then, when this is pointed out, you reply by making assumptions about theological price levels which you can't possibly calculate because they are not calculable.
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 07:25:52 PM
 #130

What you describe is less than the rational limit in 2).

First you start out by grossly misusing economic terms.

Then, when this is pointed out, you reply by making assumptions about theological price levels which you can't possibly calculate because they are not calculable.

What is misused, and what cannot be calculated?
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 05, 2015, 07:36:47 PM
 #131

If we inspect the fees paid to miners per day in US dollars over the lifetime of the network (avg blocksize << max blocksize), we see that total fee revenue, on average, has grown with increases in the daily transaction volume.

The regression is dominated by the increase in BTC value, not that of fees.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 05, 2015, 07:44:10 PM
 #132

what is misused and what can not be calculated?
A non-scarce good is one that does not require allocation because the available supply at a price of zero exceeds the maximum achievable demand at that price.

No good that requires time or energy to deliver can be non-scarce.

Including transactions in a block will always require both time and energy, therefore the space in a block will be scarce.

Because space in a block is scarce, miners will need to allocate the inclusion of transactions into a block, and there exists a price below which they will not do so.

We can't calculate ahead of time what the equilibrium price of a transaction will be in the future, because that depends on the future actions and preferences of millions of other people.
gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 2366



View Profile
February 05, 2015, 07:53:29 PM
 #133

A non-scarce good is one that does not require allocation because the available supply at a price of zero exceeds the maximum achievable demand at that price.
No good that requires time or energy to deliver can be non-scarce.
Including transactions in a block will always require both time and energy, therefore the space in a block will be scarce.
Because space in a block is scarce, miners will need to allocate the inclusion of transactions into a block, and there exists a price below which they will not do so.
We can't calculate ahead of time what the equilibrium price of a transaction will be in the future, because that depends on the future actions and preferences of millions of other people.
This is a bit unhinged.  The _inherent_ cost of transactions is roughly size_of_data * decentralization_level (there is actually a quadratic component in a decentralized network too, but let's ignore that; good design can make it small).  In a free market for transaction capacity based purely on the inherent cost, optimal competition can drive decentralization down to lower costs. With a completely centralized system the cost at almost any imaginable scale is basically nothing (e.g. a single <$2000 host on a sub-gigabit network connection is able to process a hundred thousand transactions per second).

So effectively one can replace the fee market with a market that favors the most centralized operators, as they have the lowest costs (since that's all network income would pay for).  This may be true, but it's not interesting, since if a highly centralized system were desirable there are more efficient and secure ways to achieve one.

I believe you're making a false comparison. None of the market participants have a way to express their preference for a decentralized network except by defining Bitcoin to be one through the rules of the system. Absent that, someone who doesn't care and just wants to maximize their short-term income can turn the decentralization knob all the way down (as we've seen with the enormous amount of centralization in mining pools) and maximize their income, regardless of what the owners of bitcoins or the people making the transactions prefer. You could just as well argue that miners should be able to freely print more bitcoins without limit and, magically, if the invisible-pink-hand decides it doesn't want bitcoin to inflate, "the market" will somehow prevent it (in a way that doesn't involve just defining it out of the system).

Of course, 28 minutes is still long. That is based on 2013 data.
This data is massively outdated: it predates signature caching and ultraprune, each of which was easily an order of magnitude (or two) improvement in the transaction-dependent parts of propagation delay. It's also prior to the block relay network, not to mention further optimizations proposed but not yet written.

I don't actually think hosts are faster; I'd actually take a bet that they are slower on average, since performance improvements have made it possible to run nodes on smaller hosts than were viable before (e.g. crazy people running bitcoind on an RPi). But we've had software improvements that massively eclipsed anything you would have gotten from hardware improvements. Repeating that level of software improvement is likely impossible, though there is still some room to improve.

There are risks around massively increased orphan rates in the short term with larger blocks (though far, far lower than what those numbers suggest). Indeed, that's one of the unaddressed things in current larger-block advocacy, though the block relay network (and the possibility of efficient set reconciliation) more or less shows that the issues there are not very fundamental, though maybe practically important.
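The inherent-cost point above (cost roughly proportional to data volume times decentralization level) can be made concrete with a toy calculation; every number and name here is invented purely for illustration:

```python
def network_validation_cost(tx_per_sec, bytes_per_tx, full_nodes,
                            cost_per_byte=1e-9):
    """Toy version of the inherent-cost argument: in a decentralized
    network every full node downloads and validates every transaction,
    so total resource cost scales as data volume times the number of
    independent validators. cost_per_byte is an arbitrary unit."""
    return tx_per_sec * bytes_per_tx * bytes_per_tx * 0 + \
           tx_per_sec * bytes_per_tx * full_nodes * cost_per_byte

# The same throughput costs 6000x more across 6000 nodes than on one host.
ratio = (network_validation_cost(7, 250, 6000) /
         network_validation_cost(7, 250, 1))
print(ratio)  # 6000.0
```

This is why a fee market driven only by operating costs favors whoever validates in one place; the quadratic relay component gmaxwell brackets out would only sharpen the effect.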

Bitcoin will not be compromised
Peter R
Legendary
*
Offline Offline

Activity: 1064



View Profile
February 05, 2015, 08:00:10 PM
 #134

If we inspect the fees paid to miners per day in US dollars over the lifetime of the network (avg blocksize << max blocksize), we see that total fee revenue, on average, has grown with increases in the daily transaction volume.

The regression is dominated by increase of BTC value not that of fees.

The plot shows the total daily fees in USD versus the number of transactions per day.  Indeed the correlation between the BTC value and the number of transactions per day is stronger, but that doesn't make the chart I plotted invalid.  In other words, if the network growth in the future is anything like the network growth in the past, the total fees will continue to increase along with the number of transactions per day (and along with the price of a bitcoin). 

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 05, 2015, 08:04:17 PM
 #135

This is a bit unhinged.
Typical central planner hubris.
So effectively one can replace the fee market with a market that favors the most centralization, as they have the lowest costs (as that's all network income would pay for).  This may be true, but it's not interesting-- since if a highly centralized system were desirable there are more efficient and secure ways to achieve one.
You're making assumptions about the preferences of current and future Bitcoin users which are not warranted.

If the world wants a system which can conduct transactions at the lowest possible price, then the world will not choose to use Bitcoin, regardless of what you try to shape their behaviour with arbitrary protocol rules.

If, on the other hand, there is any demand for censorship resistance and money with a predictable supply, then that demand will express itself as a willingness to pay some price to obtain it.

If Bitcoin is going to fail because there is not enough demand for what it provides that people are willing to pay what it costs to obtain it, then Bitcoin is going to fail.

If that's what's going to happen, then no amount of tampering with the price discovery process can change that outcome, and indeed will only be counterproductive to any chances at success Bitcoin does have.
redsn0w
Legendary
*
Offline Offline

Activity: 1288


# Free market


View Profile
February 05, 2015, 08:08:48 PM
 #136

Here https://bitcointalk.org/index.php?topic=941331.0;topicseen 215 forum users don't think the same:



CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 05, 2015, 08:10:33 PM
 #137

Here https://bitcointalk.org/index.php?topic=941331.0;topicseen 215 forum users don't think the same:

So - they could all be sockpuppet accounts (as this forum allows that).

Any poll on this forum is worth *zero*, so posting the result of any such poll has *zero credibility*.

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
redsn0w
Legendary
*
Offline Offline

Activity: 1288


# Free market


View Profile
February 05, 2015, 08:13:08 PM
 #138

Here https://bitcointalk.org/index.php?topic=941331.0;topicseen 215 forum users don't think the same:

So - they could all be sockpuppet accounts (as this forum allows that).

Any poll on this forum is worth *zero*.


I know, we can never do a valid poll on this forum.
However, I agree with the 20 MB limit; I don't think it will be a problem for bitcoin users.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 05, 2015, 08:19:40 PM
 #139

I believe you're making a false comparison. None of the market participants have a way to express their preference for a decentralized network except by defining Bitcoin to be one through the rules of the system. Absent that, someone who doesn't care and just wants to maximize their short-term income can turn the decentralization knob all the way down (as we've seen with the enormous amount of centralization in mining pools) and maximize their income-- regardless of what the owners of bitcoins or the people making the transactions prefer. You could just as well argue that miners should be able to freely print more Bitcoins without limit and magically if the invisible pink hand decides it doesn't want inflation it will somehow market-prevent it (in a way that doesn't involve just defining it out of the system).
All you've done here is reinforce the fact that the design of the P2P network is broken and should be fixed, which is indeed an argument I am making, with a side order of red herring regarding the issuance schedule.

The difference between us is that I don't accept a permanently broken P2P network as a given and conclude that we should employ broken economics as a work around.

The broken economics of having a block size limit, and the broken P2P network should both be fixed.
Cryddit
Legendary
*
Offline Offline

Activity: 840


View Profile
February 05, 2015, 08:37:43 PM
 #140

The assertion that a fee market only works if miners act as a cartel is false.  The situation in which miners do not act as a cartel simply presents the consumer with a range of prices which they can choose among, paying a premium if they are willing to pay for high-priority or prompt service or taking a discount for making low-priority or slow transactions.

Imagine a market in which there is no cartel.  To make it simple, suppose that there are ten miners each with ten percent of the hashing power, and that the block size limit is not routinely reached.  Because they are economically rational and facing different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.

The consumer is faced with ten different price points for a "minimum acceptable" fee, which determines how many of these miners would accept his or her transaction.  

So... paying a minimum fee would get your tx accepted by one miner.  On average you're going to have to wait ten blocks before that miner gets a block, so your expected tx time is about 100 minutes.  Paying a median fee would get your tx accepted by any of five miners.  On average you're going to have to wait two blocks before one of those five gets a block, so 20 minutes.  Paying the highest fee would get your tx into any block regardless of who mines it, so you'll be in the very next block in around 10 minutes.  

The point is that consumers are not faced with a binary "pay enough" or "don't pay anything" choice; they are faced instead with the opportunity to select a level of responsiveness desired and pay for the priority they want or need on a by-transaction basis.  
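The waiting-time arithmetic in this post follows from the geometric distribution: if a fraction p of the hashpower accepts your fee, the expected number of blocks until one of those miners wins is 1/p. A minimal sketch (function name mine):

```python
def expected_wait_minutes(accepting_hashpower, block_minutes=10.0):
    # The number of blocks until a block is found by a miner that accepts
    # the transaction is geometric with success probability equal to the
    # accepting fraction of hashpower, so the mean is 1/p blocks.
    return block_minutes / accepting_hashpower

print(expected_wait_minutes(0.1))  # minimum fee, 1 of 10 miners: 100.0 min
print(expected_wait_minutes(0.5))  # median fee, 5 of 10 miners: 20.0 min
print(expected_wait_minutes(1.0))  # top fee, all miners: 10.0 min
```

These match the 100 / 20 / 10 minute tiers in the post, and show the continuum of price points in between.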

gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 2366



View Profile
February 05, 2015, 08:49:46 PM
 #141

All you've done here is reinforce the fact that the design of the P2P network is broken and should be fixed, which is indeed an argument I am making, with a side order of red herring regarding the issuance schedule.
The difference between us is that I don't accept a permanently broken P2P network as a given and conclude that we should employ broken economics as a work around.
The broken economics of having a block size limit, and the broken P2P network should both be fixed.
I was already assuming a perfectly idealized p2p network that had no overhead or sub-linear scaling. I've done as much to explore the space of efficiency gains in this kind of system as any two other people combined here, come on. Please don't try to play off that I don't know how the system works. Decentralization has inherent costs.  You're not saying anything to escape that. It's not good enough to just say "broken broken" when reality doesn't behave like you wish it did.  I also wish there wasn't a tradeoff here, but it doesn't make it so. Sad  (And to be clear, I think there is some amount where the costs are insignificant and not a concern and that cutoff changes over time; it's only the unlimited view which I think is clearly at odds with strong decentralization and risks disenfranchising the actual holders and users of bitcoin; people who weren't signing up for a system controlled by and operated at the complete whim of a few large banks ('miners'/pools)).

Bitcoin will not be compromised
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 05, 2015, 08:55:49 PM
 #142

I was already assuming a perfectly idealized p2p network that had no overhead or sub-linear scaling. I've done as much to explore the space of efficiency gains in this kind of system as any two other people combined here, come on. Please don't try to play off that I don't know how the system works.
What I mean is that your perfectly idealized p2p network is still wrong.

A more detailed explanation is forthcoming.

Decentralization has inherent costs.  You're not saying anything to escape that.

That's exactly true, and I'm not trying to escape it.

Everything that any network does has inherent costs, and every one in existence of which I am aware either fails to recognize this fact, or recognizes it and does the wrong thing in response.
Lauda
Legendary
*
Offline Offline

Activity: 1694


GUNBOT Licenses -20% with ref. code 'GrumpyKitty'


View Profile WWW
February 05, 2015, 08:57:28 PM
 #143

Looks like we're concluding that doing this also has a few (potential) drawbacks. Everything discussed is theoretical, as we do not really know.

I'm amazed how the thread took a turn to discussing the issues of p2p networks.


marcus_of_augustus
Legendary
*
Offline Offline

Activity: 2464



View Profile
February 05, 2015, 09:09:11 PM
 #144

I am almost certain that if Satoshi had originally coded a block limit doubling every 2 years along with the block reward halving every 4 years, then this would never have been discussed or pondered in so much questionable detail. People would be like, "Oh yeah, that's the way it works, let's just deal with it".

gmaxwell: thanks for your timely input on the necessary decentralisation quantification.

Do not forget that the hard-coded fee constants should be fixed simultaneously with this issue, since they are inter-linked .... or we'll be back arguing about that eventually as well.

solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
February 05, 2015, 09:58:08 PM
 #145

The assertion that a fee market only works if miners act as a cartel is false.  The situation in which miners do not act as a cartel simply presents the consumer with a range of prices which they can choose among, paying a premium if they are willing to pay for high-priority or prompt service or taking a discount for making low-priority or slow transactions.

Imagine a market in which there is no cartel.  To make it simple, suppose that there are ten miners each with ten percent of the hashing power, and that the block size limit is not routinely reached.  Because they are economically rational and facing different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.

The consumer is faced with ten different price points for a "minimum acceptable" fee, which determines how many of these miners would accept his or her transaction.  

So... paying a minimum fee would get your tx accepted by one miner.  On average you're going to have to wait ten blocks before that miner gets a block, so your expected tx time is about 100 minutes.  Paying a median fee would get your tx accepted by any of five miners.  On average you're going to have to wait two blocks before one of those five gets a block, so 20 minutes.  Paying the highest fee would get your tx into any block regardless of who mines it, so you'll be in the very next block in around 10 minutes.  

The point is that consumers are not faced with a binary "pay enough" or "don't pay anything" choice; they are faced instead with the opportunity to select a level of responsiveness desired and pay for the priority they want or need on a by-transaction basis.  


This is a great explanation!
It fits the "unconstrained" block size scenario, which is how Bitcoin has worked for most of its existence, except for a day or so around March 6th, 2013, when the 250KB soft-limit was binding. It does mean that when users create a transaction they have a single shot at getting the fee right. In the simplified example, if a user pitches their fee so that 2 miners (out of 10) will accept it, and then changes their mind, deciding that an expected 50-minute wait is too long, then they are SOL; the unconfirmed tx can't be changed. Fortunately this is rare.

The "constrained" block size scenario makes it necessary for ordinary users to be able to increase the fee. Users will want to update the fee on their unconfirmed tx to manage the instability in confirmation times; otherwise their tx can remain stuck in cyberspace, and they are helpless.

Certainly, protocol block limits should not be hit unless all wallets first support the updating of fees on unconfirmed tx.


gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 2366



View Profile
February 05, 2015, 10:17:26 PM
 #146

Do not forget that the hard-coded fee constants should be fixed simultaneously with this issue, since they are inter-linked .... or we'll be back arguing about that eventually as well.
We don't have hardcoded fees in Bitcoin Core... except very low ones for relay permission, which have in practice been below typical fees. They're kind of ugly, and I'm generally opposed to hardcoded fees, but if they're below actual behavior they don't cause much harm (and are very helpful at preventing resource-exhaustion attacks). Bitcoin Core 0.10 has an automatic fee system based on the transactions in the mempool and recent blocks, where you can set a target number of blocks to wait and it will pay based on recent history.

The "constrained" block size scenario makes necessary the ability for ordinary users to increase the fee. Users will want to update the fee on their unconfirmed tx to manage the instability in confirmation times, otherwise their tx can remain stuck in cyberspace, and they are helpless.
This is relatively straightforward to support. When a new transaction comes into the mempool, if it pays at least $increment more in fees per KB than the conflicting, already-mempooled transaction, replace it and forward it on.  Then you just need fairly simple wallet support to revise a transaction. Petertodd (IIRC) already wrote "replace by fee" code that does this.  The catch is that, implemented this way, it makes zero-confirmation transactions less safe, since you could have greater success in double-spending.   This can be addressed by narrowing the set of allowed replacements (e.g. all outputs must be equal or greater), but AFAIK no one has bothered implementing it.
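The replacement rule described here, with the stricter all-outputs-preserved variant, might be sketched like this (a simplified model, not Bitcoin Core's actual mempool code; `MIN_INCREMENT` and the `Tx` shape are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Tx:
    fee: int                      # total fee in satoshis
    size_kb: float
    outputs: dict = field(default_factory=dict)  # address -> satoshis

MIN_INCREMENT = 1000  # required extra satoshis per KB (illustrative)

def may_replace(old: Tx, new: Tx, strict: bool = True) -> bool:
    """Sketch of a replace-by-fee acceptance rule.

    Basic rule: the replacement must pay MIN_INCREMENT more fee per KB.
    Strict variant: every output of the old tx must survive with an equal
    or greater amount, which blunts replacement as a double-spend tool.
    """
    if new.fee / new.size_kb < old.fee / old.size_kb + MIN_INCREMENT:
        return False
    if strict:
        for addr, amount in old.outputs.items():
            if new.outputs.get(addr, 0) < amount:
                return False
    return True

old = Tx(fee=10_000, size_kb=1.0, outputs={"merchant": 50_000})
bump = Tx(fee=12_000, size_kb=1.0, outputs={"merchant": 50_000})
steal = Tx(fee=12_000, size_kb=1.0, outputs={"attacker": 50_000})
print(may_replace(old, bump), may_replace(old, steal))  # True False
```

Under the strict rule the fee bump is relayed but the redirected payment is not; dropping `strict` recovers the plain (less safe) replace-by-fee behavior.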

Quote
Certainly, protocol block limits should not be hit unless all wallets first support the updating of fees on unconfirmed tx.
Chicken and egg. Without fee pressure there is no incentive to work on software to do that. Most non-Bitcoin-Core wallets just set rather high hardcoded fees (even constant ones that don't relate to the tx-size metric miners use to prioritize transactions into blocks).

Unfortunately over-eager increases of the soft-limit have denied us the opportunity to learn from experience under congestion and the motivation to create tools and optimize software to deal with congestion (fee-replacement, micropayment hubs, etc).

Look at the huge abundance of space-wasting uncompressed keys (it requires ~one line of code to compress a bitcoin pubkey) on the network to get an idea of how little pressure there is to optimize use of the blockchain public good right now.
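For reference, the key compression mentioned here is just dropping the Y coordinate and recording its parity; a sketch (it assumes a well-formed 65-byte key and does no curve validation):

```python
def compress_pubkey(uncompressed: bytes) -> bytes:
    """Compress a 65-byte uncompressed secp256k1 public key
    (0x04 || X || Y) to the 33-byte form (0x02/0x03 || X).
    Y is recoverable from X plus its parity, carried in the prefix."""
    assert len(uncompressed) == 65 and uncompressed[0] == 0x04
    x, y = uncompressed[1:33], uncompressed[33:65]
    prefix = b"\x02" if y[-1] % 2 == 0 else b"\x03"
    return prefix + x

# A dummy 65-byte key (not a real curve point) shrinks from 65 to 33 bytes,
# roughly halving the pubkey's footprint in the blockchain.
dummy = bytes([0x04]) + bytes(range(32)) + bytes(range(32))
print(len(compress_pubkey(dummy)))  # 33
```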

Because they are economically rational and facing different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.
With correctly set-up software there is no relationship between your bandwidth or electricity costs as a miner and the transactions you accept into your blocks, and any slight residual relation can be divided down to nothing by pooling with N other miners (centralizing the consensus in the process) in order to have 1/Nth the bandwidth/CPU costs. As a miner you maximize your personal income by accepting all available fee-paying transactions that fit. It's best for you when other miners reject low-fee transactions to encourage people to pay high fees, but you don't; instead you hoover up all the fees they passed up. They bear the cost of encouraging users to pay higher fees; you defect and take the benefit.

A more detailed explanation is forthcoming.
Sounds good, but hopefully you can understand that some people are not very comfortable betting Bitcoin's future on not-yet-public theorems (which sound like they must be at odds with the best understanding available from the active technical community _and_ academia...).  There have been many "bitcoin scaling" ideas that accidentally turned out to have no security, or to imply extreme centralization, once considered more carefully. There are a few ideas which I think will someday help a lot, but they're not practical yet and it's not clear when they will be.

Bitcoin will not be compromised
oldbute
Member
**
Offline Offline

Activity: 61


View Profile
February 05, 2015, 10:26:03 PM
 #147

^ Did you read the entire post? The OP fully addressed the effect on fees:

He neglects that there is no reason to pay fees, if there is no limit on supply.

just because there's the POSSIBILITY of 20MB doesn't mean you HAVE TO use it.

Since there is no marginal cost to including a transaction in the current block, a rational miner will always include a transaction with a non-zero fee before it is included by any of its competitors.

Therefore a lower bound on fee will not work without a cartel or without a competition for space.

I prefer algorithms over cartels.

The chance of orphaned blocks should provide some competition for space.  Miners may find that, with the current network topology, a 4MB block is the right size.   As more nodes and faster connections appear, the size can be adjusted.  Is a hard limit an algorithm?
gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 2366



View Profile
February 05, 2015, 10:29:43 PM
 #148

The chance of orphan blocks should provide some competition for space.
Centralized miners suffer a much lower orphan rate if the orphan rate is macroscopic and driven by actual propagation time. If you're in a regime where one would want to do something to lower their orphan rate, the optimal income-maximizing strategy is to centralize, not to reduce block sizes.

Though at least we know that, fundamentally, there is no need for the orphan rate to increase in proportion to block size, if miners use more efficient relay mechanisms that take advantage of transactions having already been sent in advance.
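The two points above fit into one toy formula: with Poisson block arrival, a block delayed by t seconds is orphaned with probability about 1 - exp(-t/600), but only by the hashpower you don't control, which is why a large pool suffers less (a sketch; the function name is mine):

```python
import math

def orphan_probability(delay_s, hashpower_share, interval=600.0):
    """Probability a freshly found block is orphaned due to propagation
    delay. A miner never orphans its own block, so only the other
    (1 - share) of the hashpower can find a competitor during the delay."""
    return (1.0 - hashpower_share) * (1.0 - math.exp(-delay_s / interval))

# With a 30 s propagation delay, compare a 1% miner to a 40% pool:
print(f"{orphan_probability(30, 0.01):.4f}")  # 0.0483
print(f"{orphan_probability(30, 0.40):.4f}")  # 0.0293, markedly lower
```

The asymmetry grows with the delay, which is the centralization pressure described above; faster relay shrinks t and with it the advantage of pooling.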

Bitcoin will not be compromised
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
February 05, 2015, 11:17:05 PM
 #149

Unfortunately over-eager increases of the soft-limit have denied us the opportunity to learn from experience under congestion and the motivation to create tools and optimize software to deal with congestion (fee-replacement, micropayment hubs, etc).

Probably the best time to let soft-limits persist was in 2011/12 when the ecosystem was smaller, the funds at stake were a lot smaller, users considered the software more experimental than beta, and the world's press wasn't really watching.

Look at the huge abundance of space wasting uncompressed keys (it requires ~ one line of code to compress a bitcoin pubkey) on the network to get an idea of how little pressure there exists to optimize use of the blockchain public-good right now.

My experience of (centralized) financial systems over many years is that ignoring hardware and software constraints as they are approached invariably causes outages. Also, that trying to train a user-base or worse, a market, to behave differently to accommodate IT constraints is a Sisyphean task. There are probably hundreds of IT experts who are concerned about the block size limit, because they can see the risks in it, which they recognize from prior (usually bitter) experience.

And, this is where the role of Core Dev is crucial. If there are major efficiencies to be had, "low-hanging fruit", then it would be wonderful to see them go live and reflected in smaller blocks etc. But right now, we can only project forwards, from what is happening with the average block size.

bitcoinbeliever
Jr. Member
*
Offline Offline

Activity: 52

Unshackle yourself


View Profile
February 05, 2015, 11:20:13 PM
 #150

D&T, I agree with most everything you wrote, many thanks for the interesting research and composition effort.
zimmah
Legendary
*
Offline Offline

Activity: 896



View Profile
February 06, 2015, 01:06:02 AM
 #151

Here https://bitcointalk.org/index.php?topic=941331.0;topicseen 215 forum users don't think the same :

So - they could be all sockpuppet accounts (as this forum supports that).

Any poll on this forum is worth *zero* so posting the result of any such poll has *zero credibility*.


Polls on bitcointalk:

- if I agree with the outcome, it's undeniable evidence that I am right
- if I don't agree with the outcome, it's just sock puppets.

Why did this forum get so filled with trolls?
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 06, 2015, 02:39:59 AM
 #152

It's not a binary option. Raising the limit at a moderate pace, so that fees don't have to increase substantially with increasing adoption, is a middle-ground solution that will lead to average fees remaining affordable but non-zero.

There are quite a few constants in Bitcoin that one could argue about; that is what we do. It is, however, important to do it for the right reason.
Avoiding fees is not a right reason.

Again, a straw man argument. No one has argued that the limit should be raised to "avoid paying fees". I want the blocks to come up against the limit, but I want that limit to be much higher than it is. I want the limit to put some upward pressure on fees, but not too much, because I don't want mass adoption to be dependent on end users paying "excessive fees" to access the blockchain.

Extending on the above:

It is rather difficult to substantiate an algorithm that would set a course for the future; given the huge number of unknown parameters, the constant we have is preferred by Occam's razor.

But anyone wanting a permanent 1 MB restriction also needs to substantiate this course being set for the future. The argument for getting rid of the 1 MB restriction is no more speculative than the one for making it permanent.

Given what Peter R has shown:

In response to those claiming that a hard fork to increase the blocksize limit will hurt the miners' ability to collect fee revenue:

The empirical data we have so far does not support the notion that the miners will be starved of fees or that blocks will be full of low fee transactions if the blocksize limit is increased.  If we inspect the fees paid to miners per day in US dollars over the lifetime of the network (avg blocksize << max blocksize), we see that total fee revenue, on average, has grown with increases in the daily transaction volume.



The total daily fees, F, have actually grown as the number of transactions, N, raised to the power of 2.7.  Although I don't expect this F ~ N^2.7 relationship to hold forever, those suggesting that the total fees would actually decrease with increasing N have little data to support this claim (although, during our present bear market, we've seen a reduction in the daily fees paid to miners despite an increase in N).
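An exponent like 2.7 comes out of a log-log fit; the recipe can be sketched as follows (synthetic data for illustration only, not Peter R's dataset):

```python
import math

# Synthetic daily data generated to follow F = 5 * (N/1000)**2.7 exactly.
N = [1000, 2000, 4000, 8000]              # transactions per day
F = [5.0 * (n / 1000) ** 2.7 for n in N]  # total fees, USD

# Fit F = k * N**b by least squares on log F = log k + b * log N.
xs, ys = [math.log(n) for n in N], [math.log(f) for f in F]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(round(b, 2))  # recovers the exponent, ~2.7
```

On real fee data the fitted exponent would of course drift over time; the sketch only shows where such a number comes from.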

Past behaviour is no guarantee of future behaviour, but historically blocks don't get filled with low-fee transactions and historically the total fees paid to miners increases with increased transaction volume.

And given that there is no reason to assume that the demand for space relative to available space will be lower with a higher block size limit, there is no reason not to raise the limit above the current 1 MB in light of DeathAndTaxes' analysis on what this limit will mean for end-user access to the blockchain.
CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 06, 2015, 04:22:17 AM
 #153

Polls on bitcointalk:

- if I agree with the outcome, it's undeniable evidence that I am right
- if I don't agree with the outcome, it's just sock puppets.

Why did this forum get so filled with trolls?

I have never agreed with the use of polls in the forum full stop (and my own forum software does not even have them).

The very idea of putting polls into a forum that "encourages" sockpuppets is ridiculous.

(but let's not go off topic)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
ObamaMcKennit
Newbie
*
Offline Offline

Activity: 1


View Profile
February 06, 2015, 05:05:50 AM
 #154

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.
wilth1
Member
**
Offline Offline

Activity: 63


View Profile
February 06, 2015, 05:08:33 AM
 #155

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 2366



View Profile
February 06, 2015, 05:15:42 AM
 #156

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
That's very vague... based on what?   The hard rules of the protocol are what protect the users and owners of bitcoins from miners, whose interests are only partially aligned.  Sadly, miners have substantial censoring power over the data that goes into the blockchain.  I suppose it's useful to have an in-protocol way of coordinating rather than depending on potentially non-transparent back-room dealing; but almost anything in the network would be easily gamed by miners. There are some things that I think are preferable to just having no effective limit (e.g. having a rolling median, requiring mining at higher difficulty to move the needle for your own blocks, and requiring that difficulty not be falling overall for the size to go up), but these don't address half the concerns and potentially add a fair bit of complexity (which has its own risks).
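The rolling-median piece of that idea could look roughly like this (a toy sketch only; it leaves out the higher-difficulty and non-falling-difficulty conditions, and the headroom factor, step cap, and floor are our assumptions):

```python
import statistics

HARD_FLOOR = 1_000_000  # keep the historical 1 MB as a lower bound

def next_size_limit(recent_sizes, current_limit, max_step=1.1):
    """Let the limit track a multiple of the median recent block size,
    but never move up by more than max_step per adjustment and never
    fall below the hard floor."""
    target = 2 * statistics.median(recent_sizes)  # headroom over typical demand
    return max(int(min(target, current_limit * max_step)), HARD_FLOOR)

print(next_size_limit([900_000] * 11, 1_000_000))  # growth capped at 10%: 1100000
```

A median (rather than a mean) keeps a single miner from dragging the limit around with a few outlier blocks, which is why it appears in these proposals at all.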

sangaman
Sr. Member
****
Offline Offline

Activity: 342



View Profile WWW
February 06, 2015, 05:31:16 AM
 #157

Thank you DeathAndTaxes for this excellent post on why the block size limit must increase if bitcoin is to ever reach its potential as (or simply to remain) a decentralized, peer-to-peer means of exchange. A highly restricted blockchain that is impractical for regular people to use is not what bitcoin was ever intended to be, and you did a good job explaining why this is what bitcoin would become with a fixed 1 MB cap (and sooner than most people think). You also did a good job debunking several of the most common objections to lifting the 1 MB cap, and I would further emphasize that an artificial cap on the number of bitcoin transactions is not the best way to maximize mining fees and/or security. This follows from basic economic principles. Hypothetically speaking, if you assume purely self-interested miners and capping blocks at 1 MB happens to be the way to generate the greatest amount of transaction fees, then blocks will be no larger than 1 MB regardless of what the block limit is. Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte that is needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may very well vary from person to person).

D&T, you've been one of if not the most consistently reasonable, sincere, and intelligent posters on this forum since I first discovered this forum, and I appreciate you taking the time to write this persuasive argument on a topic that's critical for the long-term success and viability of bitcoin.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 06, 2015, 05:36:23 AM
 #158

What if he's wrong?

Quote
without MPEx no fork of this network can succeed
sangaman
Sr. Member
****
Offline Offline

Activity: 342



View Profile WWW
February 06, 2015, 06:19:45 AM
 #159

Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte that is needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may very well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
justusranvier
Legendary
*
Offline Offline

Activity: 1400



View Profile WWW
February 06, 2015, 06:26:54 AM
 #160

Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte that is needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may very well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
For the entirety of Bitcoin's history, it has produced blocks smaller than the protocol limit.

Why didn't the average size of blocks shoot up to 1 MB and stay there the instant Satoshi added a block size limit to the protocol?
sangaman
Sr. Member
****
Offline Offline

Activity: 342



View Profile WWW
February 06, 2015, 06:48:26 AM
 #161

Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte that is needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may very well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
For the entirety of Bitcoin's history, it has produced blocks smaller than the protocol limit.

Why didn't the average size of blocks shoot up to 1 MB and stay there the instant Satoshi added a block size limit to the protocol?

I'm not sure what you're getting at. Clearly there just hasn't been demand for 1 MB worth of transactions per block thus far, but that could change relatively soon, hence the debate over lifting the 1 MB cap before we get to that point. If the block limit were suddenly to drop to 50 kB, I think we'd start seeing a whole lot of 50 kB blocks, no?
turvarya
Hero Member
*****
Offline Offline

Activity: 714


View Profile
February 06, 2015, 07:43:05 AM
 #162

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.
lol,
so someone made a blog entry about a forum post to make it look more legitimate?

https://forum.bitcoin.com/
New censorship-free forum by Roger Ver. Try it out.
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 06, 2015, 08:47:16 AM
 #163

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.

What does he expect? That we're going to cancel the hard fork because he slanders some people in eloquent and expressive prose? Because he declares he can single-handedly sabotage the process due to his importance and influence?

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments for not doing the hard fork in a diplomatic manner, they will be taken into account, and debated, but there is no debate as long as one man looks down on others, and feels no need to restrain himself and speak to them respectfully.
CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 06, 2015, 08:53:39 AM
 #164

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments for not doing the hard fork in a diplomatic manner, they will be taken into account, and debated, but there is no debate as long as one man looks down on others, and feels no need to restrain himself and speak to them respectfully.

Absolutely agreed.

turvarya
Hero Member
*****
Offline Offline

Activity: 714


View Profile
February 06, 2015, 08:56:14 AM
 #165

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.

What does he expect, we're going to cancel the hard fork because he slanders some people with eloquent and expressive prose? Because he declares he can single handedly sabotage the process due to his importance and influence?

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments for not doing the hard fork in a diplomatic manner, they will be taken into account, and debated, but there is no debate as long as one man looks down on others, and feels no need to restrain himself and speak to them respectfully.
I recently talked to some Bitcoin users in real life. None of them doubted that the hard fork will come, and none of them had any fear that something would go wrong (and some of them understood the technology much better than me).
Being active on this forum often clouds your vision of what is going on in the real BTC world.
That's why I try to stay away from such threads. I think I've already read all the pros and cons on the matter. The rest is just accusations of who is an idiot or a shill. It reminds me of elementary school.

bambou
Sr. Member
****
Offline Offline

Activity: 346


View Profile
February 06, 2015, 08:56:26 AM
 #166

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments for not doing the hard fork in a diplomatic manner, they will be taken into account, and debated, but there is no debate as long as one man looks down on others, and feels no need to restrain himself and speak to them respectfully.

Absolutely agreed.


please stop acting like over-emotive princesses.

Non inultus premor
turvarya
Hero Member
*****
Offline Offline

Activity: 714


View Profile
February 06, 2015, 08:58:16 AM
 #167

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments for not doing the hard fork in a diplomatic manner, they will be taken into account, and debated, but there is no debate as long as one man looks down on others, and feels no need to restrain himself and speak to them respectfully.

Absolutely agreed.


please stop acting like over-emotive princesses.
lol
those posts are exactly what I meant. It's elementary school all over again Cheesy

turvarya
Hero Member
*****
Offline Offline

Activity: 714


View Profile
February 06, 2015, 09:05:38 AM
 #168

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments for not doing the hard fork in a diplomatic manner, they will be taken into account, and debated, but there is no debate as long as one man looks down on others, and feels no need to restrain himself and speak to them respectfully.

Absolutely agreed.


please stop acting like over-emotive princesses.
lol
those posts are exactly what I meant. It's elementary school all over again Cheesy

Bitch, fork that chain and I'll get my cousin to beat you up Tongue
I have two older brothers, who will beat the shit out your cousins. So, just try.

grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 06, 2015, 09:21:57 AM
 #169

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
That's very vague... based on what?

I am not generally against increasing the block size, but against doing it for the wrong reason or too eagerly.

The pace of increase has to be algorithmic, driven by market forces and advances of technology, not central planning or cartels.
The algorithm we have now that fits the above is that of the difficulty adjustment.

It is plausible to me that difficulty stalls or falls if mining is no longer highly profitable, in which case the block size limit should not be increased. This is actually the line of thought that makes me think an increase of the block limit at this point in time is not warranted.

Linking the block size increase to the difficulty adjustment would be technically straightforward: since the difficulty calculation already drives validation, the size limit could be just one more of its outputs and implied checks.
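Read that way, the coupling might be sketched as follows (our interpretation of the idea, in integer arithmetic; the proportional rule is an assumption, not a concrete proposal): raise the limit only in proportion to a difficulty rise, and freeze it when difficulty stalls or falls.

```python
def adjust_limit(current_limit, old_difficulty, new_difficulty):
    """Raise the size limit in proportion to a difficulty increase;
    leave it unchanged if difficulty stalls or falls."""
    if new_difficulty <= old_difficulty:
        return current_limit
    # Integer arithmetic keeps every node's result bit-for-bit identical.
    return current_limit * new_difficulty // old_difficulty

print(adjust_limit(1_000_000, 100, 120))  # difficulty up 20% -> 1200000
print(adjust_limit(1_000_000, 100, 90))   # difficulty falling -> 1000000
```

Since every validating node already tracks difficulty, no new consensus data would be needed as input.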

amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 06, 2015, 09:36:00 AM
 #170

The pace of increase has to be algorithmic, driven by market forces and advances of technology, not central planning or cartels.
The algorithm we have now that fits the above is that of the difficulty adjustment.

This was also my first preference. I'd add that if it was up to me, the hard limit would be removed altogether, and it would be up to miners to create a soft limit. As long as over 50 percent of the network hashrate enforces a particular rule on the block size, it will be as binding as a protocol rule.

Perhaps the 40% per year hard limit increase can co-exist with a dynamic soft limit that tracks difficulty. That way the hard limit acts as a failsafe, while the soft limit imposes the real size constraints.
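For reference, a 40%-per-year hard limit compounds as follows (a sketch; the 1 MB base and the function name are our assumptions):

```python
def hard_limit(years_elapsed, base=1_000_000, growth=1.40):
    """Hard cap growing 40% per year from a 1 MB base, rounded to bytes."""
    return round(base * growth ** years_elapsed)

print([hard_limit(y) for y in range(3)])  # [1000000, 1400000, 1960000]
```

At that rate the cap roughly quintuples every five years, so it stays a distant failsafe while a dynamic soft limit does the day-to-day constraining.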
grau
Hero Member
*****
Offline Offline

Activity: 836


bits of proof


View Profile WWW
February 06, 2015, 09:44:08 AM
 #171

The pace of increase has to be algorithmic, driven by market forces and advances of technology, not central planning or cartels.
The algorithm we have now that fits the above is that of the difficulty adjustment.

This was also my first preference. I'd add that if it was up to me, the hard limit would be removed altogether, and it would be up to miners to create a soft limit. As long as over 50 percent of the network hashrate enforces a particular rule on the block size, it will be as binding as a protocol rule.

Perhaps the 40% per year hard limit increase can co-exist with a dynamic soft limit that tracks difficulty. That way the hard limit acts as a failsafe, while the soft limit imposes the real size constraints.

What I meant is not a vote by hash rate; that would be a cartel. A difficulty increase is also not a vote, but a consequence of market forces and technology.

It is important to preserve the intended role of fees, which is twofold: to contain spam and to replace inflation in the long run.
Eagerly increasing the block size would be contrary to both.
amincd
Hero Member
*****
Offline Offline

Activity: 772


View Profile
February 06, 2015, 09:52:01 AM
 #172

Quote
What I meant is not a vote by hash rate, that would be a cartel. Difficulty increase is also not a vote but a consequence of market forces and technology.

Miners cartelizing to create a sensible de facto block size limit doesn't seem like a bad thing to me. Anyway, I don't want to take this discussion too far off-topic so I'll save it for a different thread.





wilth1
Member
**
Offline Offline

Activity: 63


View Profile
February 06, 2015, 05:37:18 PM
 #173

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
That's very vague... based on what?

Max block size could be retargeted periodically alongside difficulty adjustments, using average block size and the frequency of full blocks in a period, with the hard-coded value as a floor.

Imagining the necessity of 20 MB blocks relies on the assumption that a massive increase in transaction volume develops, but what if it slows significantly?  Bloated-block broadcast delay might temporarily even be useful as a competitive advantage.  When fees outweigh the reward, doesn't the mining market encourage bloat?

Also, with a hard limit on size, won't the conversation on manual adjustment resurface indefinitely with increasing political difficulty? Wink
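That retarget could be prototyped along these lines (a sketch only; the fullness thresholds and step sizes are our assumptions, not a proposal):

```python
def retarget(limit, recent_sizes, floor=1_000_000):
    """Grow the limit when most recent blocks were nearly full, shrink it
    when almost none were, and never drop below the hard-coded floor."""
    full_fraction = sum(1 for s in recent_sizes if s >= 0.9 * limit) / len(recent_sizes)
    if full_fraction > 0.5:
        limit = round(limit * 1.2)   # sustained congestion: grow
    elif full_fraction < 0.1:
        limit = round(limit * 0.9)   # sustained slack: shrink
    return max(limit, floor)

print(retarget(1_000_000, [950_000] * 8 + [100_000] * 2))  # mostly full -> 1200000
```

The floor answers the "what if volume slows" worry in the same post: the limit can fall back, but never below the hard-coded value.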
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218


Gerald Davis


View Profile
February 06, 2015, 06:15:22 PM
 #174

Why couldn't MAX_BLOCK_SIZE be self-adjusting?

It certainly could be.   The point of the post wasn't to arrogantly state what we must do, or even that we must do something now.   I would point out that planning a hard fork is no trivial matter, so the discussion needs to start now even if the final switch to the new block version won't actually occur for 9-12 months.   The point was just to show that a permanent 1 MB cap is simply a non-starter.  It allows roughly 1 million direct users to make less than 1 transaction per month.  That isn't a backbone; it is a technological dead end.

For the record, I disagree with Gavin on whether Bitcoin can (or even should) scale to VISA levels.   It is not optimal that someone in Africa needs transaction data on the daily coffee habit of a guy in San Francisco they will never meet.   I do believe that Bitcoin can be used as a core backbone linking a variety of other more specialized (maybe even localized) systems via the use of sidechains and other technologies.  The point of the post was that whatever the future of Bitcoin ends up being, it won't happen with a permanent 1 MB cap.
CIYAM
Legendary
*
Offline Offline

Activity: 1862


Ian Knowles - CIYAM Lead Developer


View Profile WWW