Topic: Block size limit questions
d'aniel (OP) | Sr. Member
May 15, 2012, 12:13:54 PM (last edit: May 15, 2012, 12:29:52 PM by d'aniel)
#1

I'm wondering:

  • Is there a plan yet for when we start bumping up against the block size limit?
  • Is it going to be held where it is, pushing fees up and pushing txs off the blockchain to occur on, e.g., Open Transactions servers, as gmaxwell suggested here: https://bitcointalk.org/index.php?topic=80435.msg898723#msg898723?
  • Or will it be raised somewhat after some scalability optimizations are implemented?
  • If so, how high can it be raised while still allowing the average PC to run a full node?
  • Do the developers all agree that, for the sake of decentralization, keeping full nodes runnable on average PCs should remain a priority, enforced with a block size limit?
  • If so, why are people spending so much time developing lightweight bitcoin clients instead of working on, e.g. OT, if average people are going to be priced out of blockchain txs anyway?

Thanks for any clarification!
finkleshnorts | Sr. Member
May 15, 2012, 01:36:34 PM
#2

following
deus-ex-machina | Full Member
May 15, 2012, 03:23:53 PM
#3

Rewrite the old parts of the blockchain so that all transactions that happened more than 24 months ago are condensed: they total the same amount but merge somewhat. Then slap a warning on them saying they may be inaccurate.

ex.
guy 1 gives guy 2 5 BTC, followed by guy 2 giving guy 3 5 BTC;
this becomes guy 1 giving guy 3 5 BTC, removing guy 2 from that section entirely.
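What that seems to describe, as a Python sketch (illustrative only; it ignores that real transactions are signed by their owners and can't simply be rewritten, and all names here are made up):

Code:
# Collapse pass-through payments into direct edges.
def condense(payments):  # payments: list of (frm, to, amount)
    out = []
    for frm, to, amount in payments:
        # If someone just forwarded the exact amount they received,
        # splice the two hops into one.
        if out and out[-1][1] == frm and out[-1][2] == amount:
            prev = out.pop()
            out.append((prev[0], to, amount))
        else:
            out.append((frm, to, amount))
    return out

print(condense([("guy1", "guy2", 5), ("guy2", "guy3", 5)]))
# [('guy1', 'guy3', 5)]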
theymos | Administrator, Legendary
May 15, 2012, 03:29:34 PM
#4

Quote from: d'aniel
Do the developers all agree that, for the sake of decentralization, keeping full nodes runnable on average PCs should remain a priority, enforced with a block size limit?

No. IIRC Mike Hearn supports moving most nodes to SPV. My impression was that Satoshi also expected most nodes to use SPV. Not sure about the opinions of other developers besides gmaxwell.

d'aniel (OP) | Sr. Member
May 15, 2012, 10:31:37 PM
#5

Just thinking aloud here...

I'm inclined to agree with gmaxwell that an off-blockchain transaction infrastructure is the answer.  It seems like it would be much cheaper, more convenient, and more private/anonymous anyway.  And with multisig/P2SH, it seems like it could be very secure against operators running off with the bitcoins people have entrusted to the tx servers.

OTOH, if this infrastructure isn't available when the block size limit is bumped up against and transactions start getting delayed and expensive, I doubt developers will be able to resist demands to increase the limit.

If it's not ready in time, could we ever revert once it is, or would there be kind of a ratchet effect to this?
DeathAndTaxes (Gerald Davis) | Donator, Legendary
May 15, 2012, 10:47:22 PM
#6

A change in the block size must be supported by a supermajority of miners to avoid a split in the network (yeah, technically 50% + 1 of the hashpower is sufficient, but it would be a disaster).

Fees are essentially 0.  The few satoshis paid in fees per block are a rounding error.  I doubt many miners will support raising the block size any time soon, especially with the subsidy about to be cut in half.

Still, it is a total non-issue.  The block size soft limit is 500 KB, and the average tx is ~500 bytes, so the current block size is good for ~144K daily tx (about 1,000 tx per block, 144 blocks per day).  We are at a small fraction of that.  If (due to economic pressure) some of the spam (SatoshiDICE, miners taking 2-bitcent payouts, etc.) were reduced, we likely wouldn't even see 2K tx a day.
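A quick back-of-envelope check of those numbers (assuming the 500 KB soft limit and the ~500-byte average tx above):

Code:
SOFT_LIMIT_BYTES = 500_000  # default soft block size target
AVG_TX_BYTES = 500          # rough average transaction size
BLOCKS_PER_DAY = 144        # one block every ~10 minutes

tx_per_block = SOFT_LIMIT_BYTES // AVG_TX_BYTES  # ~1,000
tx_per_day = tx_per_block * BLOCKS_PER_DAY       # ~144,000
print(tx_per_day, tx_per_day / 86_400)           # 144000, ~1.7 tx/s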
gmaxwell | Moderator, Legendary
May 15, 2012, 11:13:51 PM
#7

First— do you mean the 500k soft target or the million byte protocol rule?

The soft target is trivially lifted a node at a time. I expect the default soft limit will change once the network is consistently producing blocks up against that limit.

So I'll assume you mean the protocol rule—

Quote from: d'aniel
OTOH, if this infrastructure isn't available when the block size limit is bumped up against and transactions start getting delayed and expensive, I doubt developers will be able to resist demands to increase the limit.

I haven't done the benchmarking to figure out exactly where a standard PC peters out, but I'm pretty sure they can process somewhat more than the current limit, at least if they're SSD-equipped.  So even if you're a full card-carrying member of my Church of Forever Decentralization, whose doctrine requires that the maximum block size stay quite small, you could still support a bit of a bump.

Quote
pushing fees up and txs to occur off the blockchain on, e.g. Open Transactions servers

It's worth mentioning that beyond escaping the limits, external systems can have other advantages too.  For example, even getting a _single_ confirmation in Bitcoin (the minimum required to resist reversal attacks without using a trusted certification service) can take a long time— 10 minutes is an _average_, but 30 minutes or longer happens about 7 times per day, an hour or longer every 2.8 days, etc.  And even though Bitcoin with the block size limits removed could be coerced to insane scaling levels, it would be a fairly storage- and computation-inefficient way to process all the world's transactions.  D'aniel also points out the considerable privacy/anonymity advantages other systems can have over Bitcoin (and add to Bitcoin when used along with it).
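Those waiting-time figures follow from treating block arrivals as a Poisson process with a 10-minute mean, so inter-block gaps are exponentially distributed. A quick check (a sketch, nothing Bitcoin-specific assumed beyond the 10-minute target):

Code:
import math

MEAN_MIN = 10.0        # average inter-block interval
BLOCKS_PER_DAY = 144   # 24 * 60 / 10

def waits_longer_than(minutes):
    # P(a given inter-block gap exceeds `minutes`) for an exponential
    # distribution with mean MEAN_MIN.
    return math.exp(-minutes / MEAN_MIN)

print(BLOCKS_PER_DAY * waits_longer_than(30))        # ~7.2 gaps of 30+ min per day
print(1 / (BLOCKS_PER_DAY * waits_longer_than(60)))  # a 60+ min gap every ~2.8 days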

Quote
Or will it be raised somewhat after some scalability optimizations are implemented?

The limit can't be raised at all without a hardforking change (old nodes will not accept the new chain at all once the first oversized block is mined).  

It's not sufficient to change miners, as DeathAndTaxes suggests— lifting the 1M protocol rule is a change unlike the BIP16/P2SH change, which was fully compatible with old nodes. It's technically the same kind of change needed to adjust Bitcoin from 21m total BTC to 42m total BTC (though obviously not politically equal).  Every single piece of Bitcoin software produced would have to be updated to allow the oversized blocks.
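The rule in question is a hardcoded consensus check that every node applies to every block it receives. Roughly, in Python rather than the client's actual C++ (a paraphrase, not the real code):

Code:
MAX_BLOCK_SIZE = 1_000_000  # the 1 MB protocol rule

def check_block(serialized_block: bytes) -> bool:
    # An old node rejects any oversized block outright, no matter how
    # much hashpower produced it, so a larger-block chain simply forks
    # away from every node still running this check.
    if len(serialized_block) > MAX_BLOCK_SIZE:
        return False
    # ... remaining checks: proof of work, merkle root, signatures, etc.
    return True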

If the Bitcoin system were to take a hardforking change, switching to Ed25519 would remove ECC signature validation as a performance bottleneck, as a fast quad-core desktop from today can do about 50k Ed25519 validations per second, compared to perhaps a thousand for the curve we use... though the random IO is still an issue.
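To put those throughput figures in block terms (assuming, for illustration, one signature per ~500-byte transaction; both rates are the rough ones above):

Code:
# Rough signature-validation time for a full 1 MB block.
TXS_PER_BLOCK = 1_000_000 // 500  # ~2,000 txs, one signature each (assumed)
SECP256K1_VPS = 1_000             # verifies/sec, rough figure for current curve
ED25519_VPS = 50_000              # verifies/sec, fast quad-core desktop

print(TXS_PER_BLOCK / SECP256K1_VPS)  # ~2.0 seconds of ECDSA per block
print(TXS_PER_BLOCK / ED25519_VPS)    # ~0.04 seconds with Ed25519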

More recently a number of people have independently invented the idea of committing to a merkle tree of open transactions.  If we do adopt some form of this, it would allow the creation of nodes that are somewhere in between SPV and a pruned full node in terms of security and decentralization benefit— so lower operating costs for nodes that validate. (In particular, these nodes would have greatly reduced storage requirements.)
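For readers unfamiliar with the idea: blocks would commit (say, in the coinbase) to the root of a hash tree over the set of open outputs, so a node holding only the root can verify an O(log n) membership branch instead of storing the whole set. A minimal sketch of building such a root (illustrative only, not any proposed format):

Code:
import hashlib

def h(data: bytes) -> bytes:
    # Double-SHA256, as used elsewhere in Bitcoin.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # Root of a binary hash tree over serialized open (unspent) outputs.
    layer = [h(leaf) for leaf in leaves]
    while len(layer) > 1:
        if len(layer) % 2:  # duplicate the last node on odd layers
            layer.append(layer[-1])
        layer = [h(a + b) for a, b in zip(layer[::2], layer[1::2])]
    return layer[0]

print(merkle_root([b"output-1", b"output-2", b"output-3"]).hex())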

Quote from: Theymos
No. IIRC Mike Hearn supports moving most nodes to SPV. My impression was that Satoshi also expected most nodes to use SPV. Not sure about the opinions of other developers besides gmaxwell.

Indeed, and Mike's position has gotten us (rightfully) flamed as not-decentralized by e.g. Dan Kaminsky.

Gavin and Jeff have taken less strong positions than I have on the importance (and viability) of maintaining decentralization in Bitcoin.  Although I expect to convince them eventually, I think _everyone_ is in wait-and-see mode.  Who knows what will happen?  At the moment I would aggressively argue against raising the limit— without it I don't see any alternative to Bitcoin becoming a particularly inefficient distributed system of establishment central banks— but I fully admit my position may change as things develop.

I expect most Bitcoin users by count to be not even SPV— I expect most by count to be semi-SPV thin clients (which may connect to a couple of independent services). But expecting most users not to run nodes does not preclude there being hundreds of thousands of nodes that perform complete validation; gigabyte blocks, however, surely would.


Stephen Gornick | Legendary
May 15, 2012, 11:54:27 PM
#8

Quote from: theymos
No. IIRC Mike Hearn supports moving most nodes to SPV.

Vocabulary / acronym of the day:

SPV - Simplified Payment Verification
 - http://en.bitcoin.it/wiki/Scalability#Simplified_payment_verification

d'aniel (OP) | Sr. Member
May 15, 2012, 11:58:21 PM
#9

Thanks for the excellent responses!

Quote from: DeathAndTaxes
Still, it is a total non-issue.  The block size soft limit is 500 KB, and the average tx is ~500 bytes, so the current block size is good for ~144K daily tx (about 1,000 tx per block, 144 blocks per day).  We are at a small fraction of that.  If (due to economic pressure) some of the spam (SatoshiDICE, miners taking 2-bitcent payouts, etc.) were reduced, we likely wouldn't even see 2K tx a day.
Ah, didn't realize so much of it was spam that wouldn't occur if transactions weren't basically free.  Still, though, 144K transactions/day is only ~1.7 tps, or about 0.1% of Visa, so hopefully this issue won't arise too far into the future :)

Another reason I can think of to keep the limit: I believe the client software that talks to the tx servers would be engaging in real-time audits (for OT, anyway), and would thus require running a bitcoin client (something the average PC would be able to do because of the block size limit).  While smartphones would use SPV (or the merkle tree of open transactions gmaxwell just mentioned) to audit, there would still be a lot more fully verifying clients out there.  This is important, I think, because

Quote
Lightweight clients can't efficiently calculate the size of the coinbase for a block without downloading the whole block and then downloading the dependencies of every transaction in that block, along with the Merkle branches linking them to the relevant block headers (which may also need to be fetched because I think in future lightweight clients will throw away very old headers).
This means the inflation schedule can be much better enforced by staying decentralized, correct?
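To spell out why that check is hard for a lightweight client: the rule is that the coinbase may claim at most subsidy + fees, and a transaction's fee is only visible once you have every one of its inputs' previous outputs. A sketch of the full-node side (the block and fetch_prevout objects here are hypothetical, just to show the data dependency):

Code:
COIN = 100_000_000  # satoshis per BTC

def subsidy(height: int) -> int:
    # Block reward halves every 210,000 blocks.
    return (50 * COIN) >> (height // 210_000)

def max_coinbase_value(block, fetch_prevout) -> int:
    # Fees = sum(input values) - sum(output values). The input values
    # live in *other* transactions, which an SPV client never downloads;
    # a full node already has them all, so only it can enforce this cheaply.
    fees = 0
    for tx in block.txs[1:]:  # skip the coinbase itself
        in_value = sum(fetch_prevout(i).value for i in tx.inputs)
        out_value = sum(o.value for o in tx.outputs)
        fees += in_value - out_value
    return subsidy(block.height) + fees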
Mike Hearn | Legendary
May 18, 2012, 05:24:37 PM
#10

Quote from: d'aniel
This means the inflation schedule can be much better enforced by staying decentralized, correct?

Yes, but I am not expecting key players to leave Satoshi's code behind any time soon: miners, trading platforms and merchants should all be sticking with it. So even if mobile and desktop users followed some kind of inflationary fork without realizing it, you'd still have to convince a majority of the miners, AND the merchants, AND the exchange operators.

That said, I expect using libraries like bitcoinj in combination with a regular Satoshi node to be quite common in future just because the programming model is simpler.

I don't think we need to worry about the block size limit any time soon. There are quite a few ways of using Bitcoin that let you push non-time-sensitive transactions off to the night time, when blocks should be less full; even very simple tricks like that could buy plenty of time to introduce a hard-forking change.
nibor | Sr. Member
May 20, 2012, 06:51:11 PM
#11

Why do you think this issue is a long way off?

Just looking at today, there is a 450k block:
http://blockchain.info/block-index/229209/00000000000006bc9956fe5ffc47c310a4270f560866ee86d8b5b30f75ff1ee8
and many in the 200k+ range.

All it will take is a couple more "satoshidice.com"-type services taking advantage of the very cheap transaction fees, and we could start to hit it.

I don't think it is a bad thing having the limit though, as 52 gig a year of blockchain would soon cause all sorts of other issues!
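That 52 gig figure is just the 1 MB protocol limit run flat out (a quick check):

Code:
MAX_BLOCK_BYTES = 1_000_000
BLOCKS_PER_YEAR = 144 * 365
print(MAX_BLOCK_BYTES * BLOCKS_PER_YEAR / 1e9)  # ~52.6 GB/year of chain growth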

nibor | Sr. Member
May 20, 2012, 06:54:12 PM
#12

Quote from: DeathAndTaxes
some of the spam (SatoshiDICE, miners taking 2-bitcent payouts, etc.)

This is a feature! And I think there will be a lot more of it soon...