Author Topic: Funding of network security with infinite block sizes  (Read 24526 times)
justusranvier (Legendary | Activity: 1400 | Merit: 1009)
March 28, 2013, 06:24:32 PM  #41

Quote from: acoindr
I think that's rather harsh. People are processing Bitcoin problems (scalability, block size, etc.) in different parts, from different aspects, and with differing information. I don't think calling anything obvious is fair.

I didn't necessarily mean it should have been obvious to you, but it should have been for the person you were quoting, especially since it's been brought up many times in the past.
acoindr (Legendary | Activity: 1050 | Merit: 1002)
March 28, 2013, 06:31:26 PM  #42

Quote from: justusranvier
Quote from: acoindr
I think that's rather harsh. People are processing Bitcoin problems (scalability, block size, etc.) in different parts, from different aspects, and with differing information. I don't think calling anything obvious is fair.

I didn't necessarily mean it should have been obvious to you, but it should have been for the person you were quoting, especially since it's been brought up many times in the past.

Yes, that's what I meant. I've always believed the time-between-blocks factor was the thing giving miners an incentive to broadcast hurriedly, but I don't recall reading that in any block size debate thread. That's what I mean about people processing problems from different aspects and with different information sets. How do you know nearly everyone has seen that point mentioned?
Peter Todd (Legendary, expert | Activity: 1120 | Merit: 1149)
March 28, 2013, 06:31:31 PM  #43

Quote from: acoindr
Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have an incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

Yeah, you're missing something.

Miners are constantly hashing, trying to find a new block that includes the most profitable set of transactions they can fit in it. If they find a hash that meets the target, they should immediately send the block they found to the network and start trying to find a hash that builds on it.

Under no circumstance does it ever make sense to withhold a solution. If they find two blocks in a row, splitting the transactions they could have included between the two blocks, that's actually better for the miner, because it makes it harder for other miners to orphan those blocks and collect the fees themselves.

You have to remember that mining is a random process. It's not like you work toward solving a block; it's more like you have a machine that spits out lottery tickets, and you are scratching them off as fast as possible hoping for a winner. You might get lucky and hit two winners in a row, or unlucky and go for days before finding another one, but either way, every winner you do find you should cash in immediately, for however much it's worth.
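
[Editorial sketch in Python of the lottery-ticket picture; the hash rate is made up. Each hash is an independent draw, so waiting times are exponential and memoryless: a found block tells you nothing about when the next will arrive, which is why holding one back can never help.]
Code:
import random

HASHES_PER_SEC = 1_000_000                 # hypothetical miner hash rate
P_WIN = 1 / (600 * HASHES_PER_SEC)         # per-hash odds tuned for ~600 s blocks

def time_to_next_block():
    # Independent per-hash "lottery tickets" make the waiting time
    # exponentially distributed, hence memoryless.
    return random.expovariate(HASHES_PER_SEC * P_WIN)

random.seed(1)
samples = [time_to_next_block() for _ in range(100_000)]
print("mean inter-block time: %.0f s" % (sum(samples) / len(samples)))

# Memorylessness: having already waited 600 s, the expected further
# wait is still ~600 s -- past luck never changes future odds.
late = [t - 600 for t in samples if t > 600]
print("mean extra wait after 600 s: %.0f s" % (sum(late) / len(late)))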

acoindr (Legendary | Activity: 1050 | Merit: 1002)
March 28, 2013, 06:34:16 PM  #44

Quote from: Peter Todd
Quote from: acoindr
Hang on a second. Am I missing something? I don't think miners need a hard block size limit to have an incentive to stop accepting transactions. They will do so because there is always a time limit.

The difficulty target is adjusted to regulate the time between blocks, and results in a target such that a correct hash will probably be found within a certain time (regardless of the total hashing power of the network). Every second a miner waits to include more transactions in their block, the probability increases that a competing miner will find a correct hash for their own block.

Yeah, you're missing something.

Miners are constantly hashing, trying to find a new block that includes the most profitable set of transactions they can fit in it. If they find a hash that meets the target, they should immediately send the block they found to the network and start trying to find a hash that builds on it.

Under no circumstance does it ever make sense to withhold a solution. If they find two blocks in a row, splitting the transactions they could have included between the two blocks, that's actually better for the miner, because it makes it harder for other miners to orphan those blocks and collect the fees themselves.

You have to remember that mining is a random process. It's not like you work toward solving a block; it's more like you have a machine that spits out lottery tickets, and you are scratching them off as fast as possible hoping for a winner. You might get lucky and hit two winners in a row, or unlucky and go for days before finding another one, but either way, every winner you do find you should cash in immediately, for however much it's worth.

I don't see that you've shown something I missed (not trying to be sarcastic). It sounds like you're describing my point.
Peter Todd (Legendary, expert | Activity: 1120 | Merit: 1149)
March 28, 2013, 06:43:27 PM  #45

Quote from: acoindr
I don't see that you've shown something I missed (not trying to be sarcastic). It sounds like you're describing my point.

Ah, you're saying that because miners have a time limit, they won't want to fill up their blocks.

What I'm saying, and now I think you do understand, is that mining is a random process, so miners should send every block out with whatever transactions they included in it when they found the correct PoW; we're in agreement on that point.

However, without a limit, what reason do I have to send the miner a high fee in the first place? Provided the marginal cost of including my transaction, based on network costs and the increased chance the block will be orphaned, is less than the fee I attached, they'll include it. So naturally fees will settle down to that marginal cost. The problem is that the network cost is tiny, has nothing to do with the long-term cost of storing the UTXO set, and is fixed, so that profitability for larger, more centralized pools is always higher than for smaller pools. The other side of the cost, the orphaning chance, goes down as fees go down: if fees aren't significant, the loss due to orphaning isn't significant either, so you can take more risks and stuff more low-fee transactions into your blocks.

It's a nasty race to the bottom - a textbook example of how capital-intensive businesses, where efficiency rises with capital investment, tend toward oligopolies or monopolies in the long run.
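
[An editorial sketch of the inclusion rule described above, with hypothetical relay costs and link speed. Note how the orphan-risk term scales with the block's total value to the miner, so as fees fall the inclusion threshold falls with them -- the race to the bottom in miniature.]
Code:
import math

AVG_BLOCK_INTERVAL = 600.0   # seconds, the protocol's target spacing

def orphan_cost(extra_delay_s, block_value_btc):
    # Chance a competitor finds a block during the extra propagation
    # delay, times everything the block is worth to the miner.
    p_orphan = 1.0 - math.exp(-extra_delay_s / AVG_BLOCK_INTERVAL)
    return p_orphan * block_value_btc

def marginal_cost(tx_bytes, link_bps, block_value_btc,
                  relay_cost_btc_per_byte=1e-12):
    # A rational miner includes a transaction iff its fee exceeds this.
    extra_delay_s = tx_bytes * 8 / link_bps
    return (tx_bytes * relay_cost_btc_per_byte
            + orphan_cost(extra_delay_s, block_value_btc))

# As the block's total fee value falls, so does the fee needed to get in:
for value in (25.0, 2.5, 0.25):
    print("block worth %5.2f BTC -> include any fee above %.8f BTC"
          % (value, marginal_cost(250, 10e6, value)))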

acoindr (Legendary | Activity: 1050 | Merit: 1002)
March 28, 2013, 07:23:06 PM  #46

Quote from: Peter Todd
Quote from: acoindr
I don't see that you've shown something I missed (not trying to be sarcastic). It sounds like you're describing my point.

Ah, you're saying that because miners have a time limit, they won't want to fill up their blocks.

What I'm saying, and now I think you do understand, is that mining is a random process, so miners should send every block out with whatever transactions they included in it when they found the correct PoW; we're in agreement on that point.

However, without a limit, what reason do I have to send the miner a high fee in the first place? Provided the marginal cost of including my transaction, based on network costs and the increased chance the block will be orphaned, is less than the fee I attached, they'll include it. So naturally fees will settle down to that marginal cost. The problem is that the network cost is tiny, has nothing to do with the long-term cost of storing the UTXO set, and is fixed, so that profitability for larger, more centralized pools is always higher than for smaller pools. The other side of the cost, the orphaning chance, goes down as fees go down: if fees aren't significant, the loss due to orphaning isn't significant either, so you can take more risks and stuff more low-fee transactions into your blocks.

It's a nasty race to the bottom - a textbook example of how capital-intensive businesses, where efficiency rises with capital investment, tend toward oligopolies or monopolies in the long run.

OK, got it, thanks. Yes, I was missing something. That's what I get for reading too quickly. The following, which I quoted earlier, is actually right:

...

Quote
The argument is that unless there is a hard block size limit, miners are incentivised to include any transaction no matter how small its fee, because the cost of doing so is practically zero (less than a microdollar, according to Gavin's calculations). Therefore, if a bunch of transactions stack up in the memory pool paying a smaller percentage than "normal", some miner will include them anyway, because it costs nothing to do so and maximizes short-term profit. Hence you get a race to the bottom, and you need some kind of hard network rule saying you can't do that. We already have one in the form of the block byte-size limit, so the debate becomes "let's keep the size limit" vs "let's remove it".

In my mind I was thinking of this text from the OP:

Quote from: the OP
One question that comes up often in the block size debate is how mining will be funded if there's no competition for block space.

I took that as meaning no fees, but the other quote is about low/marginal fees, not zero fees.

My response about blocks being limited by time addresses zero fees, not low fees, since miners will prioritize transactions with any fee (even a very low one) first.

I see now what you mean about the race to the bottom on marginal fees. I remember reading that point in another debate thread.
solex (Legendary | Activity: 1078 | Merit: 1002)
March 28, 2013, 07:39:14 PM  #47

Block size is effectively infinite right now, because blocks are nowhere near full at the 500KB soft limit.

Empirical evidence is preferable to theoretical chains of cause and effect, and it shows a steady long-term increase in fees already. The chart below is in BTC; ignore the USD-equivalent one...
https://blockchain.info/charts/transaction-fees?showDataPoints=false&timespan=&show_header=true&daysAverageString=7&scale=0&address=

What is being done in the field to increase fees, right now, is WORKING.

All that needs to happen is for the 1MB limit to be replaced by a capping algorithm that keeps pace just ahead of demand. Then see what happens to fees. If they plateau at too low a level, then try to fix it. Why fix something that isn't broken (except for the need to avoid the sudden train wreck caused by an arbitrary constant)?
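
[One way such a capping algorithm could look -- an editorial sketch; the 2x headroom, the 2016-block window, and the 1MB floor are all arbitrary choices, which is exactly the objection raised later in the thread.]
Code:
from statistics import median

FLOOR = 1_000_000        # keep the familiar 1 MB as a lower bound
HEADROOM = 2.0           # stay "ahead of demand" by this factor
WINDOW = 2016            # reuse the difficulty-retarget cadence

def next_size_cap(recent_block_sizes):
    # Track the median size over the last window and allow HEADROOM
    # times that; the median resists a few deliberately stuffed blocks.
    return max(FLOOR, int(HEADROOM * median(recent_block_sizes[-WINDOW:])))

print(next_size_cap([200_000] * 2016))   # light usage -> cap rests at 1 MB
print(next_size_cap([700_000] * 2016))   # sustained demand -> cap rises to 1.4 MB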

acoindr (Legendary | Activity: 1050 | Merit: 1002)
March 28, 2013, 08:04:43 PM  #48

Quote from: solex
Block size is effectively infinite right now, because blocks are nowhere near full at the 500KB soft limit.

That's something that occurred to me too...

Quote from: solex
Empirical evidence is preferable to theoretical chains of cause and effect, and it shows a steady long-term increase in fees already. The chart below is in BTC; ignore the USD-equivalent one...
https://blockchain.info/charts/transaction-fees?showDataPoints=false&timespan=&show_header=true&daysAverageString=7&scale=0&address=

What is being done in the field to increase fees, right now, is WORKING.

All that needs to happen is for the 1MB limit to be replaced by a capping algorithm that keeps pace just ahead of demand. Then see what happens to fees. If they plateau at too low a level, then try to fix it. Why fix something that isn't broken (except for the need to avoid the sudden train wreck caused by an arbitrary constant)?

Have there been good arguments against a dynamic cap?
justusranvier (Legendary | Activity: 1400 | Merit: 1009)
March 28, 2013, 08:31:16 PM  #49

Quote from: Peter Todd
The problem is that the network cost is tiny, has nothing to do with the long-term cost of storing the UTXO set, and is fixed, so that profitability for larger, more centralized pools is always higher than for smaller pools.
There is a marginal cost to the miner for increasing the UTXO set, in the form of the capital investment in memory and fast storage needed to hold it. When the UTXO set gets large enough to be a problem, miners will have an economic incentive to reduce their hardware costs by favouring transactions that shrink the set over those that grow it.

Even the miners with lower capital costs will have an incentive to limit the size of the set, because it affects the speed at which other nodes can validate their blocks, and thus their orphan rate.
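
[A sketch of how a miner might act on that incentive, assuming a simplified mempool where each transaction is just a fee, a size, and its input/output lists -- a hypothetical structure, not any real client's.]
Code:
def utxo_delta(tx):
    # Net change the transaction makes to the UTXO set:
    # outputs created minus inputs consumed (negative = set shrinks).
    return len(tx["outputs"]) - len(tx["inputs"])

def selection_order(mempool):
    # Fee-per-byte stays the primary criterion; set-shrinking
    # transactions win ties.
    return sorted(mempool,
                  key=lambda tx: (-tx["fee"] / tx["size"], utxo_delta(tx)))

mempool = [
    {"fee": 50_000, "size": 250, "inputs": [1], "outputs": [1, 2]},     # grows set
    {"fee": 50_000, "size": 250, "inputs": [1, 2, 3], "outputs": [1]},  # shrinks set
]
for tx in selection_order(mempool):
    print(tx["fee"], utxo_delta(tx))   # the consolidating tx sorts first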
markm (Legendary | Activity: 2940 | Merit: 1090)
March 28, 2013, 08:38:32 PM  #50

What orphan rate? Miners who cannot service the large-population markets hardly even count, do they? If you are serving billions of people, who will even care that a bunch of third-world peasants' local miners fail to rubber-stamp megacorp's blocks?

Heck, if merely having more bandwidth isn't enough to nuke the competition and gain a monopoly, why not buy some hashing power too, fergoshsakes?

If your huge nuke-the-smaller-players blocks aren't making you enough money to buy up a majority of the hashing power too, maybe you aren't doing it right, or are doing it too soon, or are merely too bandwidth-centric and not balancing your bandwidth advantage with a hashing advantage.

Maybe you can get together with number two and try harder together?

-MarkM-

markm (Legendary | Activity: 2940 | Merit: 1090)
March 28, 2013, 08:43:54 PM  #51

Quote from: solex
Block size is effectively infinite right now, because blocks are nowhere near full at the 500KB soft limit.

Bandwidth is NOT effectively infinite right now, because there still exists at least one entity or nation on the planet that has enough bandwidth and processing power to process blocks.

Before it reaches effectively infinite, it will reach effectively too much for anyone other than the one global mega-cartel best at it to handle.

-MarkM-

caveden (Legendary | Activity: 1106 | Merit: 1004)
March 28, 2013, 09:12:17 PM  #52

Quote
The fact that solutions are being proposed to a problem that can be so trivially shown not to exist calls into question the real motives of the people pushing said solutions.

I believe their motive is to try to convince those who want to cripple Bitcoin with a 1MB block size limit that such crippling is not a good idea...

Quote from: acoindr
Have there been good arguments against a dynamic cap?

I used to support that idea, until I realized that a dynamic cap implies an arbitrary formula, and that such a formula is an attempt to guess subjective demand and unpredictable supply. It's impossible, and fortunately, not necessary.
acoindr (Legendary | Activity: 1050 | Merit: 1002)
March 28, 2013, 09:24:47 PM  #53

Quote from: caveden
Quote from: acoindr
Have there been good arguments against a dynamic cap?

I used to support that idea, until I realized that a dynamic cap implies an arbitrary formula, and that such a formula is an attempt to guess subjective demand and unpredictable supply. It's impossible, and fortunately, not necessary.

That doesn't make sense to me. The formula wouldn't need to be arbitrary; it could be based on actual data. If your sentiment were true, the difficulty target wouldn't work.
solex (Legendary | Activity: 1078 | Merit: 1002)
March 28, 2013, 09:30:25 PM  #54

Quote from: markm
Quote from: solex
Block size is effectively infinite right now, because blocks are nowhere near full at the 500KB soft limit.

Bandwidth is NOT effectively infinite right now,...
-MarkM-

MarkM, I still read your posts because you embed enough useful feedback within your stream-of-consciousness padding to make it worthwhile, but it is a struggle at times.

Clearly bandwidth is finite, but the increase in propagation time between 50KB, 500KB, or 5MB blocks is not very significant within a 10-minute window. The increase in verification time seems to be the real limiting factor.

Quote from: caveden
I used to support that idea, until I realized that a dynamic cap implies an arbitrary formula, and that such a formula is an attempt to guess subjective demand and unpredictable supply. It's impossible, and fortunately, not necessary.

The advantages are nice to have, not mission-critical:
Politically, a cap is a less radical departure from the soft and hard block limits people already know about. Psychologically, it maintains a perceived need to add fees, and might price out SD-like flooding. It also prevents the chance that an unexpected monster block gets accepted and built on, causing problems for some miners.



caveden (Legendary | Activity: 1106 | Merit: 1004)
March 28, 2013, 09:40:08 PM  #55

Quote from: acoindr
That doesn't make sense to me. The formula wouldn't need to be arbitrary; it could be based on actual data.

But the formula remains arbitrary. You can't come up with an algorithm capable of measuring actual demand and actual supply, since these are impossible to measure. So you can't really know how much security is demanded (remember, demand is subjective!), nor how such demand would compete for the Earth's scarce resources. You would need to be omniscient to know all that.

Quote from: acoindr
If your sentiment were true, the difficulty target wouldn't work.

The difficulty target aims to produce one block every 10 minutes. But why 10 minutes? That is an arbitrary value. It may be too long sometimes, too short at other times. It's certainly not optimal. That said, it's not a big deal, and trying to improve it would not be worth the risks.

Concerning mining remuneration: if we can go directly to spontaneous order - and that's the closest you'll ever get to "optimal" - then why not? Why try to come up with arbitrary formulas? That would be a "presumption of knowledge".
Quote from: Hayek
"The curious task of economics is to demonstrate to men how little they know about what they imagine they can design."

Quote from: solex
Politically, a cap is a less radical departure from the soft and hard block limits people already know about. Psychologically, it maintains a perceived need to add fees, and might price out SD-like flooding.

SD is not flooding anything. They're not attacking the network; Bitcoin users want to use their services.
Of all businesses, they're likely the one that has contributed the most to miners via transaction fees.

Quote from: solex
It also prevents the chance that an unexpected monster block gets accepted and built on, causing problems for some miners.

Miners have no interest in keeping a "monster block", and they can easily choose not to build on top of such a block unless it is already N blocks deep, which would likely get the monster block rejected by the network.
marcus_of_augustus (Legendary | Activity: 3920 | Merit: 2348)
March 28, 2013, 10:27:34 PM  #56

Interesting discussion.

solex (Legendary | Activity: 1078 | Merit: 1002)
March 28, 2013, 11:11:24 PM (last edit: March 29, 2013, 04:11:04 AM by solex)  #57

Quote from: caveden
SD is not flooding anything. They're not attacking the network; Bitcoin users want to use their services.
Of all businesses, they're likely the one that has contributed the most to miners via transaction fees.

This is the Circe-like character of SD: it looks attractive but carries great dangers. I am still concerned that this type of transaction source can scale far faster than the Bitcoin network.

Quote from: caveden
Miners have no interest in keeping a "monster block", and they can easily choose not to build on top of such a block unless it is already N blocks deep, which would likely get the monster block rejected by the network.

Consider variance. One hallmark of any successful complex system is low variance in its important intrinsic parameters. The Earth's ecosystem depends upon low variance in climate: e.g. the difference in air pressure between a cyclone and an anticyclone is not a large percentage of 1 atmosphere.
In the case of Bitcoin, a very small block followed by a very large one is an unhealthy sign. A cap will help keep the variance (standard deviation) of block size low. This must be helpful to all miners, as they know what to expect and can plan accordingly, making incremental changes, which are always safer. A cap helps ensure all miners are on the same page about what counts as an expected block and what counts as an oversized one.
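
[A short illustration of the variance claim, with hypothetical block sizes.]
Code:
from statistics import pstdev

# 95 typical blocks plus 5 hypothetical 8 MB monsters.
sizes = [150_000] * 95 + [8_000_000] * 5
capped = [min(s, 1_000_000) for s in sizes]   # same traffic under a 1 MB cap

print("uncapped std dev: %8.0f bytes" % pstdev(sizes))
print("capped   std dev: %8.0f bytes" % pstdev(capped))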


markm (Legendary | Activity: 2940 | Merit: 1090)
March 29, 2013, 01:55:07 PM (last edit: March 30, 2013, 12:56:36 PM by markm)  #58

If there is no cap, there is no amount of resources you can buy and set up and run that will be enough, unless you are the top spender, or possibly one of the cartel of top spenders.

A cap means we can know how many millions of dollars each node is going to cost, so that we can start figuring out how many such nodes the world, or a given nation, or a given demographic, or a given multinational corporation, or a given corner store, or a given mid-sized business, or a given chain of grocery stores, etc., can afford to set up and run.

No cap means those things basically cannot be known, so trying to build a node becomes a massive hole in the budget that you keep throwing money at but maybe never manage to throw as much money at as Sprint and Bell and Google and Yahoo and Virgin, so you end up having thrown away all your money for nothing.

So we have to know: are nodes something only the Fortune 25 should be able to afford? Or something even the entire Fortune 500 could afford? Could any of the Fortune 1000 that are not in the Fortune 500 afford one, if they happen to be very highly aligned with and optimised for that particular kind of business? Or should they be able to afford one even if it is not strongly aligned with their existing infrastructure and business?

Those are the kinds of things we need to know, and they are the kinds of things a lack of a cap on block size makes unknowable.

-MarkM-
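
[A back-of-envelope version of the budgeting markm is asking for, with entirely hypothetical unit prices; the point is only that a known cap makes the worst case computable at all.]
Code:
CAP_MB = 10                       # block size cap under consideration
BLOCKS_PER_YEAR = 6 * 24 * 365    # ~one block per 10 minutes
STORAGE_USD_PER_GB_YEAR = 0.10    # hypothetical price of redundant storage
TRANSIT_USD_PER_GB = 0.05         # hypothetical bandwidth price
RELAY_OVERHEAD = 10               # hypothetical factor for relaying to peers

chain_growth_gb = CAP_MB * BLOCKS_PER_YEAR / 1024
storage_usd = chain_growth_gb * STORAGE_USD_PER_GB_YEAR
transit_usd = chain_growth_gb * RELAY_OVERHEAD * TRANSIT_USD_PER_GB
print("worst-case chain growth: %.0f GB/year" % chain_growth_gb)
print("storage: $%.0f/yr, bandwidth: $%.0f/yr" % (storage_usd, transit_usd))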

acoindr (Legendary | Activity: 1050 | Merit: 1002)
March 29, 2013, 06:38:42 PM  #59

Quote from: caveden
Quote from: acoindr
That doesn't make sense to me. The formula wouldn't need to be arbitrary; it could be based on actual data.

But the formula remains arbitrary. You can't come up with an algorithm capable of measuring actual demand and actual supply, since these are impossible to measure. ...

No, but you can have an algorithm that measures actual data, which is how the difficulty target works. You can measure what happened in the past.

Quote from: caveden
Quote from: acoindr
If your sentiment were true, the difficulty target wouldn't work.

The difficulty target aims to produce one block every 10 minutes. But why 10 minutes? That is an arbitrary value. It may be too long sometimes, too short at other times. It's certainly not optimal. That said, it's not a big deal, and trying to improve it would not be worth the risks.

No, the difficulty target being set at 10 minutes is not arbitrary. It may or may not be optimal, but it's not arbitrary. If that value could be set arbitrarily, it could be two weeks, or two years, which of course would not work for the application.
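
[For reference, Bitcoin's actual retargeting rule measures what really happened over the previous 2016 blocks and corrects toward the chosen constant; the clamp to a factor of 4 per period is part of the real rule.]
Code:
TARGET_SPACING = 600                          # seconds, the chosen constant
RETARGET_BLOCKS = 2016
EXPECTED = TARGET_SPACING * RETARGET_BLOCKS   # two weeks

def retarget(old_target, actual_timespan_s):
    # Blocks came too fast -> shrink the target (raise difficulty);
    # too slow -> grow it.  Correction is clamped to 4x per period.
    span = min(max(actual_timespan_s, EXPECTED // 4), EXPECTED * 4)
    return old_target * span // EXPECTED

# Hash rate doubles, so 2016 blocks took one week instead of two:
old = 1 << 220
print(retarget(old, EXPECTED // 2) / old)   # -> 0.5: difficulty doubles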

Speaking of all this, it occurs to me that a dynamic cap could provide both a limit and a non-limit for block size. That may be a workable way to satisfy both camps.

I once had a friend download a movie using BitTorrent and noticed the download speed varied from an absolute trickle to a full flood of throughput. Like a race car on a freeway, speeds alternated between open and constricted. I'm pretty sure that's done so "leechers" don't drain "seeder" resources too much, as would naturally happen if transfer channels were left unchecked.

Bitcoin could work the same way. Form a mental picture of the block size beating slowly, like a heart. At times the block size could be constricted, allowing small players an equal chance to participate meaningfully. However, that constriction could also be released to allow unlimited throughput.

In the real world that would translate into inconvenience only if and when you needed a transaction with Bitcoin's desirable features (anonymity, irreversibility, etc.) at a time of block size constriction and weren't willing to bid a high enough fee for priority inclusion. You might instead opt for an alternative cryptocurrency or a suitable off-chain transaction option. That seems a small price to pay if it makes Bitcoin workable at a global scale.
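
[A minimal sketch of the "heartbeat" idea; all parameters are hypothetical, and the cycle length and phase split would be the hard consensus questions.]
Code:
TIGHT_CAP = 1_000_000     # constricted phase: fee pressure, small miners keep up
OPEN_CAP = 100_000_000    # open phase: effectively unconstrained throughput
PERIOD = 20               # hypothetical cycle length in blocks
OPEN_BLOCKS = 2           # open blocks per cycle, so the backlog can drain

def cap_at_height(height):
    # The limit constricts for most of each cycle, then opens wide.
    return OPEN_CAP if height % PERIOD >= PERIOD - OPEN_BLOCKS else TIGHT_CAP

for h in range(16, 24):
    print(h, cap_at_height(h))   # heights 18 and 19 fall in the open phase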
doobadoo (Sr. Member | Activity: 364 | Merit: 250)
March 30, 2013, 03:46:15 PM  #60

You meant why not enact a percentage fee, right?

Quote
The argument is that unless there is a hard block size limit, miners are incentivised to include any transaction no matter how small its fee, because the cost of doing so is practically zero (less than a microdollar, according to Gavin's calculations). Therefore, if a bunch of transactions stack up in the memory pool paying a smaller percentage than "normal", some miner will include them anyway, because it costs nothing to do so and maximizes short-term profit. Hence you get a race to the bottom, and you need some kind of hard network rule saying you can't do that. We already have one in the form of the block byte-size limit, so the debate becomes "let's keep the size limit" vs "let's remove it".

Not exactly true: very large blocks are slow to transmit and slow for others to process before relaying. This increases the chance that a miner scores a block but has it orphaned in favour of a competitor's smaller block. Where this limit kicks in is not yet known. Is it 1MB blocks? 2MB? 10MB? 100MB?
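
[Rough arithmetic for that trade-off, with a hypothetical link speed and verification cost, using the common approximation that orphan risk over a delay of t seconds is about 1 - e^(-t/600).]
Code:
import math

LINK_BPS = 10e6           # hypothetical 10 Mbit/s peer link
VERIFY_S_PER_MB = 1.0     # hypothetical verification cost per MB
AVG_INTERVAL = 600.0      # seconds between blocks

for mb in (1, 2, 10, 100):
    delay = mb * 1e6 * 8 / LINK_BPS + mb * VERIFY_S_PER_MB
    p_orphan = 1 - math.exp(-delay / AVG_INTERVAL)
    print("%3d MB: %6.1f s to relay+verify, ~%4.1f%% orphan risk"
          % (mb, delay, 100 * p_orphan))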

"It is, quite honestly, the biggest challenge to central banking since Andrew Jackson." -evoorhees