Bitcoin Forum
May 06, 2024, 10:59:17 PM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
   Home   Help Search Login Register More  
Pages: « 1 2 3 4 5 6 7 [8] 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 »  All
  Print  
Author Topic: Permanently keeping the 1MB (anti-spam) restriction is a great idea ...  (Read 104993 times)
solex
Legendary
*
Offline Offline

Activity: 1078
Merit: 1002


100 satoshis -> ISO code


View Profile
February 05, 2015, 09:58:08 PM
Last edit: February 06, 2015, 08:56:26 AM by solex
 #141

The assertion that a fee market only works if miners act as a cartel is false.  The situation in which miners do not act as a cartel simply presents the consumer with a range of prices which they can choose among, paying a premium if they are willing to pay for high-priority or prompt service or taking a discount for making low-priority or slow transactions.

Imagine a market in which there is no cartel.  To make it simple, suppose that there are ten miners each with ten percent of the hashing power, and that the block size limit is not routinely reached.  Because they are economically rational and facing different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.

The consumer is faced with ten different price points for a "minimum acceptable" fee, which determines how many of these miners would accept his or her transaction.  

So... paying a minimum fee would get your tx accepted by one miner.  On average you're going to have to wait ten blocks before that miner gets a block, so your expected tx time is about 100 minutes.  Paying a median fee would get your tx accepted by any of five miners.  On average you're going to have to wait two blocks before one of those five gets a block, so 20 minutes.  Paying the highest fee would get your tx into any block regardless of who mines it, so you'll be in the very next block in around 10 minutes.  

The point is that consumers are not faced with a binary "pay enough" or "don't pay anything" choice; they are faced instead with the opportunity to select a level of responsiveness desired and pay for the priority they want or need on a by-transaction basis.  
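The arithmetic in the example above falls out of a geometric distribution: if a fraction p of the hashpower will accept the tx, the mean number of blocks until one of those miners wins is 1/p. A sketch (the function name is invented for illustration):

```python
def expected_wait_minutes(accepting_hashrate_fraction, block_interval_min=10):
    """Mean time until a block is found by a miner that accepts the tx.

    The number of blocks until an accepting miner wins is geometric
    with success probability p, so its expectation is 1/p.
    """
    if not 0 < accepting_hashrate_fraction <= 1:
        raise ValueError("fraction must be in (0, 1]")
    return block_interval_min / accepting_hashrate_fraction

print(expected_wait_minutes(0.1))   # minimum fee, one miner of ten  -> 100.0
print(expected_wait_minutes(0.5))   # median fee, five miners of ten -> 20.0
print(expected_wait_minutes(1.0))   # top fee, accepted by everyone  -> 10.0
```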


This is a great explanation!
It fits the "unconstrained" block size scenario, which is how Bitcoin has worked for most of its existence, except for a day or so around March 6th, 2013, when the 250KB soft-limit was binding. It does mean that when users create a transaction they have a single shot at getting the fee right. In the simplified example, if a user pitches their fee so that 2 miners (out of 10) will accept it, and then decides that waiting an expected 50 minutes is too long, they are out of luck: the unconfirmed tx can't be changed. Fortunately this is rare.

The "constrained" block size scenario makes it necessary for ordinary users to be able to increase the fee. Users will want to update the fee on their unconfirmed tx to manage the instability in confirmation times; otherwise their tx can remain stuck in cyberspace, and they are helpless.

Certainly, protocol block limits should not be hit unless all wallets first support the updating of fees on unconfirmed tx.


"Your bitcoin is secured in a way that is physically impossible for others to access, no matter for what reason, no matter how good the excuse, no matter a majority of miners, no matter what." -- Greg Maxwell
gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
February 05, 2015, 10:17:26 PM
 #142

Do not forget that the hard-coded fee constants should be addressed simultaneously with this issue, since they are inter-linked ... or we'll be back arguing about that eventually as well.
We don't have hardcoded fees in Bitcoin Core... except very low ones for relay permission, which have, in practice, been below typical fees. They're kind of ugly, and I'm generally opposed to hardcoded fees, but if they're below behavior in practice they don't cause much harm (and are very, very helpful at preventing resource-exhaustion attacks). Bitcoin Core 0.10 has an automatic fee system based on the transactions in the mempool and recent blocks: you can set a target number of blocks to wait, and it will pay based on recent history.
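A rough sketch of the idea behind that kind of history-based estimation (illustrative only, not Bitcoin Core's actual algorithm; the function name and data shape are hypothetical):

```python
def estimate_fee_per_kb(history, target_blocks):
    """history: list of (fee_per_kb, blocks_waited) pairs for recently
    confirmed transactions. Returns the lowest observed fee rate that
    confirmed within the caller's target, or None if none did."""
    met = sorted(rate for rate, waited in history if waited <= target_blocks)
    return met[0] if met else None

# A patient caller can quote a lower rate than an impatient one:
history = [(10000, 1), (5000, 3), (1000, 10)]
print(estimate_fee_per_kb(history, 1))   # -> 10000
print(estimate_fee_per_kb(history, 10))  # -> 1000
```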

The "constrained" block size scenario makes it necessary for ordinary users to be able to increase the fee. Users will want to update the fee on their unconfirmed tx to manage the instability in confirmation times; otherwise their tx can remain stuck in cyberspace, and they are helpless.
This is relatively straightforward to support. When a new transaction comes into the mempool, if it pays at least $increment more in fees per KB than the conflicting, already-mempooled transaction, replace it and forward it on.  Then you just need fairly simple wallet support to revise a transaction. Petertodd (IIRC) already wrote "replace by fee" code that does this.  The catch is that, implemented this way, it makes zero-confirmation transactions less safe, since you could have greater success in double-spending.   This can be addressed by narrowing the set of allowed replacements (e.g. all outputs must be equal or greater), but AFAIK no one has bothered implementing it.
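A minimal sketch of that replacement rule (names, data shapes, and the increment constant are all hypothetical, not actual Bitcoin Core or replace-by-fee code):

```python
INCREMENT_PER_KB = 1000  # satoshis; hypothetical relay-policy constant

def fee_per_kb(tx):
    # tx is a plain dict with "fee" (satoshis) and "size" (bytes).
    return tx["fee"] * 1000 // tx["size"]

def should_replace(old_tx, new_tx, require_outputs_not_reduced=False):
    """May new_tx evict the conflicting old_tx from the mempool?"""
    # Must pay at least the increment more per KB than what it replaces.
    if fee_per_kb(new_tx) < fee_per_kb(old_tx) + INCREMENT_PER_KB:
        return False
    if require_outputs_not_reduced:
        # Narrowed rule: every original output must be equal or greater,
        # which keeps zero-conf recipients whole.
        for addr, amount in old_tx["outputs"].items():
            if new_tx["outputs"].get(addr, 0) < amount:
                return False
    return True
```

The narrowed variant is what makes replacement safer for zero-confirmation recipients: a replacement can add fee but cannot redirect the payment away from them.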

Quote
Certainly, protocol block limits should not be hit unless all wallets first support the updating of fees on unconfirmed tx.
Chicken and egg. Without fee pressure there is no incentive to work on software to do that. Most non-Bitcoin-Core wallets just set rather high hardcoded fees (even constant ones that don't relate to the tx-size metric miners use to prioritize transactions into blocks).

Unfortunately over-eager increases of the soft-limit have denied us the opportunity to learn from experience under congestion and the motivation to create tools and optimize software to deal with congestion (fee-replacement, micropayment hubs, etc).

Look at the huge abundance of space-wasting uncompressed keys (it requires about one line of code to compress a Bitcoin pubkey) on the network to get an idea of how little pressure there is right now to optimize use of the blockchain public good.
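For reference, the "one line" is essentially this: a compressed pubkey is the 32-byte x coordinate prefixed with 0x02 or 0x03 according to the parity of y, shrinking each key from 65 to 33 bytes (a sketch, not consensus code):

```python
def compress_pubkey(uncompressed: bytes) -> bytes:
    """Compress a 65-byte SEC1 public key (0x04 || x || y) to 33 bytes."""
    assert len(uncompressed) == 65 and uncompressed[0] == 0x04
    x, y = uncompressed[1:33], uncompressed[33:65]
    # y is big-endian, so its parity is the parity of its last byte.
    return (b"\x02" if y[-1] % 2 == 0 else b"\x03") + x
```

The full point is recoverable because, for a given x, the curve equation y^2 = x^3 + 7 leaves only two candidate y values, distinguished by parity.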

Because they are economically rational and facing different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.
With correctly set-up software there is no relationship between your bandwidth or electricity costs as a miner and the transactions you accept into your blocks, and any slight residual relation can be divided down to nothing by pooling with N other miners (centralizing the consensus in the process) in order to have 1/Nth the bandwidth/CPU costs. As a miner you maximize your personal income by accepting every fee-paying transaction that fits. It's best for you when other miners reject low-fee transactions to encourage people to pay high fees, while you hoover up all the fees they passed up: they bear the cost of encouraging users to pay higher fees, and you defect and take the benefit.

A more detailed explanation is forthcoming.
Sounds good, but hopefully you can understand that some people are not very comfortable betting Bitcoin's future on not-yet-public theorems (which, it sounds like, must be at odds with the best understanding available from the active technical community _and_ academia...).  There have been many "Bitcoin scaling" ideas that accidentally turned out to have no security, or to imply extreme centralization, once considered more carefully. There are a few ideas which I think will someday help a lot, but they're not practical yet and it's not clear when they will be.
oldbute
Jr. Member
*
Offline Offline

Activity: 59
Merit: 10


View Profile
February 05, 2015, 10:26:03 PM
 #143

^ Did you read the entire post? The OP fully addressed the effect on fees:

He neglects that there is no reason to pay fees, if there is no limit on supply.

just because there's the POSSIBILITY of 20MB doesn't mean you HAVE TO use it.

Since there is no marginal cost to including a transaction in the current block, a rational miner will always include a transaction with a non-zero fee before it is included by any of its competitors.

Therefore a lower bound on fees will not work without a cartel or without competition for space.

I prefer algorithms over cartels.

The chance of orphaned blocks should provide some competition for space.  Miners may find that, with the current network topology, a 4MB block is the right size.   As more nodes and faster connections appear, the size can be adjusted.  Is a hard limit an algorithm?
gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
February 05, 2015, 10:29:43 PM
 #144

The chance of orphan blocks should provide some competition for space.
Centralized miners suffer much lower orphan rates if the orphan rate is macroscopic and driven by actual propagation time. If you're in a regime where you would want to do something to lower your orphan rate, the optimal income-maximizing strategy is to centralize, not to reduce block sizes.

Though at least fundamentally we know that there is no need for the orphan rate to increase in proportion to block size, if miners use more efficient relaying mechanisms that take advantage of the transactions having already been sent in advance.
solex
Legendary
*
Offline Offline

Activity: 1078
Merit: 1002


100 satoshis -> ISO code


View Profile
February 05, 2015, 11:17:05 PM
Last edit: February 06, 2015, 12:14:26 AM by solex
 #145

Unfortunately over-eager increases of the soft-limit have denied us the opportunity to learn from experience under congestion and the motivation to create tools and optimize software to deal with congestion (fee-replacement, micropayment hubs, etc).

Probably the best time to let soft-limits persist was in 2011/12 when the ecosystem was smaller, the funds at stake were a lot smaller, users considered the software more experimental than beta, and the world's press wasn't really watching.

Look at the huge abundance of space wasting uncompressed keys (it requires ~ one line of code to compress a bitcoin pubkey) on the network to get an idea of how little pressure there exists to optimize use of the blockchain public-good right now.

My experience of (centralized) financial systems over many years is that ignoring hardware and software constraints as they are approached invariably causes outages. Also, trying to train a user base or, worse, a market to behave differently to accommodate IT constraints is a Sisyphean task. There are probably hundreds of IT experts who are concerned about the block size limit, because they can see the risks in it, which they recognize from prior (usually bitter) experience.

And, this is where the role of Core Dev is crucial. If there are major efficiencies to be had, "low-hanging fruit", then it would be wonderful to see them go live and reflected in smaller blocks etc. But right now, we can only project forwards, from what is happening with the average block size.

bitcoinbeliever
Newbie
*
Offline Offline

Activity: 54
Merit: 0


View Profile
February 05, 2015, 11:20:13 PM
 #146

D&T, I agree with most everything you wrote, many thanks for the interesting research and composition effort.
zimmah
Legendary
*
Offline Offline

Activity: 1106
Merit: 1005



View Profile
February 06, 2015, 01:06:02 AM
 #147

Here https://bitcointalk.org/index.php?topic=941331.0;topicseen 215 forum users don't think the same :

So they could all be sockpuppet accounts (as this forum supports that).

Any poll on this forum is worth *zero*, so posting the result of any such poll has *zero credibility*.


Polls on bitcointalk:

- if I agree with the outcome, it's undeniable evidence that I am right
- if I don't agree with the outcome, it's just sock puppets.

Why did this forum get so filled with trolls?
amincd
Hero Member
*****
Offline Offline

Activity: 772
Merit: 501


View Profile
February 06, 2015, 02:39:59 AM
 #148

It's not a binary option. Raising the limit at a moderate pace, so that fees don't have to increase substantially with increasing adoption, is a middle-ground solution that will keep average fees affordable but non-zero.

There are quite a few constants in Bitcoin that one could argue with; that is what we do. It is, however, important to do it for the right reason.
Avoiding fees is not a valid reason.

Again, a straw-man argument. No one has argued that the limit should be raised to "avoid paying fees". I want the blocks to come up against the limit, but I want that limit to be much higher than it is. I want the limit to put some upward pressure on fees, but not too much, because I don't want mass adoption to depend on end users paying "excessive fees" to access the blockchain.

Extending on the above:

It is rather difficult to substantiate an algorithm that would set a course for the future; given the huge number of unknown parameters, the constant we have is preferred by Occam's razor.

But anyone wanting a permanent 1 MB restriction also needs to substantiate that course being set for the future. The argument for getting rid of the 1 MB restriction is no more speculative than the one for making it permanent.

Given what Peter R has shown:

In response to those claiming that a hard fork to increase the blocksize limit will hurt the miners' ability to collect fee revenue:

The empirical data we have so far does not support the notion that the miners will be starved of fees or that blocks will be full of low fee transactions if the blocksize limit is increased.  If we inspect the fees paid to miners per day in US dollars over the lifetime of the network (avg blocksize << max blocksize), we see that total fee revenue, on average, has grown with increases in the daily transaction volume.



The total daily fees, F, have actually grown as the number of transactions, N, raised to the power of 2.7.  Although I don't expect this F ~ N^2.7 relationship to hold forever, those suggesting that the total fees would actually decrease with increasing N have little data to support this claim (although, during our present bear market, we've seen a reduction in the daily fees paid to miners despite an increase in N).

Past behaviour is no guarantee of future behaviour, but historically blocks don't get filled with low-fee transactions and historically the total fees paid to miners increases with increased transaction volume.
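For concreteness, an exponent like the 2.7 above comes from a least-squares fit on log-log axes; a sketch with synthetic data (not Peter R's actual dataset):

```python
import math

def power_law_exponent(n_values, f_values):
    """Slope of the least-squares line through (log N, log F),
    i.e. the exponent b in F ~ N^b."""
    xs = [math.log(n) for n in n_values]
    ys = [math.log(f) for f in f_values]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic data following F = 0.001 * N^2.7 exactly:
n = [1000, 2000, 5000, 10000, 50000]
f = [0.001 * x ** 2.7 for x in n]
print(round(power_law_exponent(n, f), 2))  # -> 2.7
```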

And given that there is no reason to assume that the demand for space relative to available space will be lower with a higher block size limit, there is no reason not to raise the limit above the current 1 MB in light of DeathAndTaxes' analysis on what this limit will mean for end-user access to the blockchain.
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 06, 2015, 04:22:17 AM
 #149

Polls on bitcointalk:

- if I agree with the outcome, it's undeniable evidence that I am right
- if I don't agree with the outcome, it's just sock puppets.

Why did this forum get so filled with trolls?

I have never agreed with the use of polls in the forum, full stop (and my own forum software does not even have them).

The very idea of putting polls into a forum that "encourages" sockpuppets is ridiculous.

(but let's not go off topic)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
ObamaMcKennit
Newbie
*
Offline Offline

Activity: 1
Merit: 0


View Profile
February 06, 2015, 05:05:50 AM
 #150

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.
wilth1
Member
**
Offline Offline

Activity: 63
Merit: 10


View Profile
February 06, 2015, 05:08:33 AM
 #151

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
gmaxwell
Staff
Legendary
*
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
February 06, 2015, 05:15:42 AM
 #152

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
That's very vague... based on what?   The hard rules of the protocol are what protect the users and owners of bitcoins from miners, whose interests are only partially aligned with theirs.  Sadly, miners have substantial censoring power over the data that goes into the blockchain.  I suppose it's useful to have an in-protocol way of coordinating rather than depending on potentially non-transparent back-room dealing; but almost anything in the network would be easily gamed by miners. There are some things that I think are preferable to having no effective limit (e.g. a rolling median, requiring mining at higher difficulty to move the needle for your own blocks, and requiring that difficulty not be falling overall for the size to go up), but these don't address half the concerns and potentially add a fair bit of complexity (which has its own risks).
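A toy sketch of the rolling-median idea mentioned above (parameter names and the 10% step are invented for illustration; the difficulty condition is abstracted to a flag):

```python
from statistics import median

def next_size_limit(recent_sizes, current_limit,
                    difficulty_rising=True, max_step=0.1):
    """Move the block size limit toward the rolling median of recent
    block sizes, capped at +/- max_step per period; growth is allowed
    only while difficulty is not falling overall."""
    target = median(recent_sizes)
    if target > current_limit and not difficulty_rising:
        return current_limit  # no growth while difficulty falls
    # Clamp the move to within max_step of the current limit.
    clamped = max(min(target, current_limit * (1 + max_step)),
                  current_limit * (1 - max_step))
    return int(clamped)

print(next_size_limit([1_200_000] * 11, 1_000_000))         # -> 1100000
print(next_size_limit([1_200_000] * 11, 1_000_000, False))  # -> 1000000
```

Tying growth to miners demonstrably paying for it (here, the difficulty flag) is what keeps the knob from being free to game: pushing the limit up has to cost something.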
sangaman
Sr. Member
****
Offline Offline

Activity: 342
Merit: 250



View Profile WWW
February 06, 2015, 05:31:16 AM
 #153

Thank you DeathAndTaxes for this excellent post on why the block size limit must increase if Bitcoin is ever to reach its potential as (or simply to remain) a decentralized, peer-to-peer means of exchange. A highly restricted blockchain that is impractical for regular people to use is not what Bitcoin was ever intended to be, and you did a good job explaining why this is what Bitcoin would become with a fixed 1 MB cap (and sooner than most people think). You also did a good job debunking several of the most common objections to lifting the 1 MB cap, and I would further emphasize that an artificial cap on the number of Bitcoin transactions is not the best way to maximize mining fees and/or security. This follows from basic economic principles. Hypothetically speaking, if you assume purely self-interested miners and capping blocks at 1 MB happens to be the way to generate the greatest amount of transaction fees, then blocks will be no larger than 1 MB regardless of what the block limit is. Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may well vary from person to person).

D&T, you've been one of if not the most consistently reasonable, sincere, and intelligent posters on this forum since I first discovered this forum, and I appreciate you taking the time to write this persuasive argument on a topic that's critical for the long-term success and viability of bitcoin.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1009



View Profile
February 06, 2015, 05:36:23 AM
 #154

What if he's wrong?

Quote
without MPEx no fork of this network can succeed
sangaman
Sr. Member
****
Offline Offline

Activity: 342
Merit: 250



View Profile WWW
February 06, 2015, 06:19:45 AM
 #155

Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
justusranvier
Legendary
*
Offline Offline

Activity: 1400
Merit: 1009



View Profile
February 06, 2015, 06:26:54 AM
 #156

Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
For the entirety of Bitcoin's history, it has produced blocks smaller than the protocol limit.

Why didn't the average size of blocks shoot up to 1 MB and stay there the instant Satoshi added a block size limit to the protocol?
sangaman
Sr. Member
****
Offline Offline

Activity: 342
Merit: 250



View Profile WWW
February 06, 2015, 06:48:26 AM
 #157

Of course, if there were no block limit, miners would be able to maximize their transaction fee intake whether that means 2 MB blocks, 5 MB, whatever... market forces will determine the fee per byte needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
For the entirety of Bitcoin's history, it has produced blocks smaller than the protocol limit.

Why didn't the average size of blocks shoot up to 1 MB and stay there the instant Satoshi added a block size limit to the protocol?

I'm not sure what you're getting at. Clearly there just hasn't been demand for 1 MB worth of transactions per block thus far, but that could change relatively soon, hence the debate over lifting the 1 MB cap before we get to that point. If the block limit suddenly dropped to 50KB, I think we'd start seeing a whole lot of 50KB blocks, no?
turvarya
Hero Member
*****
Offline Offline

Activity: 714
Merit: 500


View Profile
February 06, 2015, 07:43:05 AM
 #158

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.
lol,
so someone made a blog entry about a forum post to make it look more legitimate?

https://forum.bitcoin.com/
New censorship-free forum by Roger Ver. Try it out.
amincd
Hero Member
*****
Offline Offline

Activity: 772
Merit: 501


View Profile
February 06, 2015, 08:47:16 AM
Last edit: February 06, 2015, 08:57:40 AM by amincd
 #159

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.

What does he expect? That we're going to cancel the hard fork because he slanders some people in eloquent and expressive prose? Because he declares he can single-handedly sabotage the process due to his importance and influence?

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments against the hard fork in a diplomatic manner, they will be taken into account and debated; but there is no debate as long as one man looks down on others and feels no need to restrain himself and speak to them respectfully.
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
February 06, 2015, 08:53:39 AM
 #160

Whether the hard fork happens is not going to be determined by one man's ego. If he wants to present his arguments against the hard fork in a diplomatic manner, they will be taken into account and debated; but there is no debate as long as one man looks down on others and feels no need to restrain himself and speak to them respectfully.

Absolutely agreed.

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU