Author Topic: Permanently keeping the 1MB (anti-spam) restriction is a great idea ...  (Read 103902 times)
gmaxwell (Staff, Legendary, Activity: 2366)
February 05, 2015, 08:49:46 PM  #141

Quote
All you've done here is reinforce the fact that the design of the P2P network is broken and should be fixed, which is indeed an argument I am making, with a side order of red herring regarding the issuance schedule.

The difference between us is that I don't accept a permanently broken P2P network as a given and conclude that we should employ broken economics as a workaround. The broken economics of having a block size limit and the broken P2P network should both be fixed.

I was already assuming a perfectly idealized p2p network that had no overhead or sub-linear scaling. I've done as much to explore the space of efficiency gains in this kind of system as any two other people here combined, come on. Please don't try to play it off as though I don't know how the system works. Decentralization has inherent costs. You're not saying anything to escape that. It's not good enough to just say "broken, broken" when reality doesn't behave the way you wish it did. I also wish there weren't a tradeoff here, but wishing doesn't make it so. (And to be clear, I think there is some amount where the costs are insignificant and not a concern, and that cutoff changes over time; it's only the unlimited view which I think is clearly at odds with strong decentralization and risks disenfranchising the actual holders and users of Bitcoin; people who weren't signing up for a system controlled by and operated at the complete whim of a few large banks ('miners'/pools).)

justusranvier (Legendary, Activity: 1400)
February 05, 2015, 08:55:49 PM  #142

Quote
I was already assuming a perfectly idealized p2p network that had no overhead or sub-linear scaling. I've done as much to explore the space of efficiency gains in this kind of system as any two other people here combined, come on. Please don't try to play it off as though I don't know how the system works.

What I mean is that your perfectly idealized p2p network is still wrong.

A more detailed explanation is forthcoming.

Quote
Decentralization has inherent costs. You're not saying anything to escape that.

That's exactly true, and I'm not trying to escape it.

Everything that any network does has inherent costs, and every network in existence that I am aware of either fails to recognize this fact or recognizes it and does the wrong thing in response.
Lauda (Legendary, Activity: 1694)
February 05, 2015, 08:57:28 PM  #143

Looks like we're concluding that doing this also has a few (potential) drawbacks. Everything discussed is theoretical, as we do not really know.

I'm amazed how the thread took a turn toward discussing the issues of p2p networks.


marcus_of_augustus (Legendary, Activity: 2464)
February 05, 2015, 09:09:11 PM  #144

I am almost certain that if Satoshi had originally coded a block limit doubling every 2 years, along with the block reward halving every 4 years, then this would never have been discussed or even pondered in so much questionable detail. People would be like, "Oh yeah, that's the way it works, let's just deal with it."

gmaxwell: thanks for your timely input on the necessary decentralisation quantification.

Do not forget that the hard-coded fee constants should be fixed simultaneously with this issue, since they are inter-linked .... or we'll eventually be back arguing about that as well.

solex (Legendary, Activity: 1078)
February 05, 2015, 09:58:08 PM  #145

Quote
The assertion that a fee market only works if miners act as a cartel is false. The situation in which miners do not act as a cartel simply presents the consumer with a range of prices to choose among: paying a premium for high-priority or prompt service, or taking a discount for making low-priority or slow transactions.

Imagine a market in which there is no cartel. To make it simple, suppose that there are ten miners, each with ten percent of the hashing power, and that the block size limit is not routinely reached. Because they are economically rational and face different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.

The consumer is faced with ten different price points for a "minimum acceptable" fee, which determines how many of these miners would accept his or her transaction.

So... paying the minimum fee would get your tx accepted by one miner. On average you're going to have to wait ten blocks before that miner gets a block, so your expected tx time is about 100 minutes. Paying a median fee would get your tx accepted by any of five miners. On average you're going to have to wait two blocks before one of those five gets a block, so 20 minutes. Paying the highest fee would get your tx into any block regardless of who mines it, so you'll be in the very next block, in around 10 minutes.

The point is that consumers are not faced with a binary "pay enough" or "don't pay anything" choice; they are faced instead with the opportunity to select the level of responsiveness they want and pay for that priority on a per-transaction basis.


This is a great explanation!
It fits the "unconstrained" block size scenario, which is how Bitcoin has worked for most of its existence, except for a day or so around March 6th, 2013, when the 250KB soft-limit was binding. It does mean that when users create a transaction they have a single shot at getting the fee right. In the simplified example, if a user pitches their fee so that 2 miners (out of 10) will accept it, and then changes their mind, deciding that waiting an expected 50 minutes is too long, then they are SOL: the unconfirmed tx can't be changed. Fortunately this is rare.
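
A minimal sketch of the arithmetic behind these wait times, assuming the hypothetical ten-miner fee ladder above and 10-minute average blocks:

Code:
# Block finding is memoryless: if a fraction p of the hashpower will
# accept your fee, the expected number of blocks until one of those
# miners wins is 1/p.

BLOCK_INTERVAL_MIN = 10  # average minutes per block

def expected_wait_minutes(p):
    """p: fraction of hashpower whose fee policy accepts the tx."""
    return BLOCK_INTERVAL_MIN / p

for miners_accepting in (1, 2, 5, 10):  # out of ten equal miners
    print(miners_accepting, "/10 miners ->",
          expected_wait_minutes(miners_accepting / 10), "min")
# 1/10 -> 100 min, 2/10 -> 50 min, 5/10 -> 20 min, 10/10 -> 10 min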

The "constrained" block size scenario makes necessary the ability for ordinary users to increase the fee. Users will want to update the fee on their unconfirmed tx to manage the instability in confirmation times, otherwise their tx can remain stuck in cyberspace, and they are helpless.

Certainly, protocol block limits should not be hit unless all wallets first support the updating of fees on unconfirmed tx.


gmaxwell (Staff, Legendary, Activity: 2366)
February 05, 2015, 10:17:26 PM  #146

Quote
Do not forget that the hard-coded fee constants should be fixed simultaneously with this issue, since they are inter-linked .... or we'll eventually be back arguing about that as well.

We don't have hardcoded fees in Bitcoin Core... except very low ones for relay permission, which have, in practice, been below typical fees. They're kind of ugly, and I'm generally opposed to hardcoded fees, but if they're below behavior in practice they don't cause much harm (and are very helpful at preventing resource-exhaustion attacks). Bitcoin Core 0.10 has an automatic fee system based on the transactions in the mempool and recent blocks, where you can set a target number of blocks to wait and it will pay based on recent history.
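
A toy sketch of that kind of history-based estimation (an illustration of the idea only, not Bitcoin Core's actual estimator): record the feerate each confirmed transaction paid and how many blocks it waited, then pick the cheapest rate that usually met the target.

Code:
def estimate_feerate(history, target_blocks, success=0.95):
    """history: list of (feerate, blocks_waited) pairs for recently
    confirmed transactions. Returns the lowest observed feerate such
    that transactions paying at least that rate confirmed within
    target_blocks at least `success` of the time, or None."""
    for feerate in sorted({fr for fr, _ in history}):
        sample = [b for fr, b in history if fr >= feerate]
        if sum(b <= target_blocks for b in sample) / len(sample) >= success:
            return feerate
    return None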

The "constrained" block size scenario makes necessary the ability for ordinary users to increase the fee. Users will want to update the fee on their unconfirmed tx to manage the instability in confirmation times, otherwise their tx can remain stuck in cyberspace, and they are helpless.
This is relatively straight forward to support. When a new transaction comes into the mempool, if it pays at least $increment more fees per KB than the conflicting already mempooled transaction, replace it and forward on.  Then you just need fairly simple wallet support to revise a transaction. Petertodd (IIRC) already wrote "replace by fee" code that does this.  The catch is that implemented this way it makes zero-confirmed transactions less safe, since you could have a greater success in double spending.   This can be addressed by narrowing the set of allowed replacements (e.g. all outputs must be equal or greater), but AFAIK no one has bothered implementing it.
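
A sketch of that replacement rule (the types, names, and fee increment here are illustrative; this is the idea, not petertodd's actual code):

Code:
from dataclasses import dataclass

@dataclass
class Output:
    script: bytes
    value: int  # satoshis

@dataclass
class Tx:
    feerate_per_kb: int  # satoshis per KB
    outputs: list

INCREMENT = 1000  # extra satoshis per KB required, hypothetical

def should_replace(old_tx, new_tx, outputs_must_not_shrink=False):
    # Relay the conflicting tx only if it pays enough extra fee.
    if new_tx.feerate_per_kb < old_tx.feerate_per_kb + INCREMENT:
        return False
    if outputs_must_not_shrink:
        # The narrowed rule mentioned above: every original output must
        # still be paid at least as much, which blunts the zero-conf
        # double-spend risk.
        for old_out in old_tx.outputs:
            paid = sum(o.value for o in new_tx.outputs
                       if o.script == old_out.script)
            if paid < old_out.value:
                return False
    return True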

Quote
Certainly, protocol block limits should not be hit unless all wallets first support updating the fee on an unconfirmed tx.

Chicken and egg. Without fee pressure there is no incentive to work on software to do that. Most non-Bitcoin-Core wallets just set rather high hardcoded fees (even constant ones that don't relate to the txsize metric that miners use to prioritize transactions into blocks).

Unfortunately, over-eager increases of the soft-limit have denied us the opportunity to learn from experience under congestion, and the motivation to create tools and optimize software to deal with it (fee replacement, micropayment hubs, etc.).

Look at the huge abundance of space-wasting uncompressed keys on the network (it takes roughly one line of code to compress a Bitcoin pubkey) to get an idea of how little pressure there is right now to optimize use of the blockchain public good.
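
For reference, the "one line" in question: a secp256k1 public key is a point (x, y), and the compressed encoding keeps x plus a single byte for the parity of y, 33 bytes instead of 65.

Code:
def compress_pubkey(x: int, y: int) -> bytes:
    # 0x02 prefix for even y, 0x03 for odd, followed by 32-byte x.
    return bytes([0x02 | (y & 1)]) + x.to_bytes(32, "big")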

Quote
Because they are economically rational and face different prices for bandwidth and electricity in their respective neighborhoods, they all set different minimum-fee policies.

With correctly set up software there is no relationship between your bandwidth or electricity costs as a miner and the transactions you accept into your blocks, and any slight residual relation can be divided down to nothing by pooling with N other miners (centralizing the consensus in the process) in order to have 1/Nth the bandwidth/CPU costs. As a miner you maximize your personal income by accepting all available fee-paying transactions that fit. It's best for you when other miners reject low-fee transactions, encouraging people to pay high fees, but you don't reject them; instead you hoover up all the fees they passed up. They bear the cost of encouraging users to pay higher fees; you defect and take the benefit.
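
The defecting policy, as a sketch: take every fee-paying transaction that fits, best feerate first, regardless of what other miners accept (a deliberately simplified greedy knapsack):

Code:
def fill_block(mempool, max_block_bytes=1_000_000):
    """mempool: list of (fee_satoshis, size_bytes) pairs."""
    block, used = [], 0
    for fee, size in sorted(mempool, key=lambda t: t[0] / t[1],
                            reverse=True):  # best feerate first
        if fee > 0 and used + size <= max_block_bytes:
            block.append((fee, size))
            used += size
    return block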

Quote
A more detailed explanation is forthcoming.

Sounds good, but hopefully you can understand that some people are not very comfortable betting Bitcoin's future on not-yet-public theorems (which, it sounds like, must be at odds with the best understanding available from the active technical community _and_ academia...). There have been many "Bitcoin scaling" ideas that accidentally turned out to have no security, or to imply extreme centralization, once considered more carefully. There are a few ideas which I think will someday help a lot, but they're not practical yet and it's not clear when they will be.

oldbute (Member, Activity: 61)
February 05, 2015, 10:26:03 PM  #147

Quote
^ Did you read the entire post? The OP fully addressed the effect on fees:

He neglects that there is no reason to pay fees if there is no limit on supply.

Just because there's the POSSIBILITY of 20MB doesn't mean you HAVE TO use it.

Since there is no marginal cost to including a transaction in the current block, a rational miner will always include a transaction with a non-zero fee before it is included by any of its competitors.

Therefore a lower bound on fees will not work without a cartel or without competition for space.

I prefer algorithms over cartels.

The chance of orphan blocks should provide some competition for space. Miners may find that, with the current network topology, a 4MB block is the right size. As more nodes and faster connections appear, the size can be adjusted. Is a hard limit an algorithm?
gmaxwell (Staff, Legendary, Activity: 2366)
February 05, 2015, 10:29:43 PM  #148

Quote
The chance of orphan blocks should provide some competition for space.

Centralized miners suffer much lower orphan rates if the orphan rate is macroscopic and driven by actual propagation time. If you're in a regime where one would want to do something to lower their orphan rate, the optimal income-maximizing strategy is to centralize, not to reduce block sizes.

Though at least we know that, fundamentally, there is no need for the orphan rate to increase in proportion to block size, if miners use more efficient relaying mechanisms that take advantage of the transactions having already been sent in advance.
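
As a sketch of that idea (the message format here is invented for illustration; the point is that peers already hold nearly all of a block's transactions): announce a block with short transaction IDs, and send full transactions only for the ones a peer is actually missing.

Code:
def announce_block(txids):
    # Short IDs instead of full transactions: a few bytes per tx,
    # independent of transaction size.
    return [txid[:6] for txid in txids]

def reconstruct(announcement, mempool_by_short_id):
    txs, missing = [], []
    for short_id in announcement:
        tx = mempool_by_short_id.get(short_id)
        if tx is None:
            missing.append(short_id)  # request only these in full
        else:
            txs.append(tx)
    return txs, missing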

solex (Legendary, Activity: 1078)
February 05, 2015, 11:17:05 PM  #149

Quote
Unfortunately, over-eager increases of the soft-limit have denied us the opportunity to learn from experience under congestion, and the motivation to create tools and optimize software to deal with it (fee replacement, micropayment hubs, etc.).

Probably the best time to let soft-limits persist was in 2011/12, when the ecosystem was smaller, the funds at stake were a lot smaller, users considered the software more experimental than beta, and the world's press wasn't really watching.

Quote
Look at the huge abundance of space-wasting uncompressed keys on the network (it takes roughly one line of code to compress a Bitcoin pubkey) to get an idea of how little pressure there is right now to optimize use of the blockchain public good.

My experience of (centralized) financial systems over many years is that ignoring hardware and software constraints as they are approached invariably causes outages. Also, trying to train a user base, or worse, a market, to behave differently to accommodate IT constraints is a Sisyphean task. There are probably hundreds of IT experts who are concerned about the block size limit because they can see the risks in it, which they recognize from prior (usually bitter) experience.

And this is where the role of Core Dev is crucial. If there are major efficiencies to be had, "low-hanging fruit", then it would be wonderful to see them go live and be reflected in smaller blocks etc. But right now, we can only project forwards from what is happening with the average block size.

bitcoinbeliever (Jr. Member, Activity: 52)
February 05, 2015, 11:20:13 PM  #150

D&T, I agree with most everything you wrote, many thanks for the interesting research and composition effort.
zimmah (Legendary, Activity: 896)
February 06, 2015, 01:06:02 AM  #151

Quote
Here https://bitcointalk.org/index.php?topic=941331.0;topicseen 215 forum users don't think the same:

So - they could all be sockpuppet accounts (as this forum supports that).

Any poll on this forum is worth *zero*, so posting the result of any such poll has *zero credibility*.

Polls on bitcointalk:

- if I agree with the outcome, it's undeniable evidence that I am right
- if I don't agree with the outcome, it's just sock puppets.

Why did this forum get so filled with trolls?
amincd (Hero Member, Activity: 772)
February 06, 2015, 02:39:59 AM  #152

It's not a binary option. Raising the limit at a moderate pace, so that fees don't have to increase substantially with increasing adoption, is a middle-ground solution that will lead to average fees remaining affordable but non-zero.

Quote
There are quite a few constants in Bitcoin that one could argue with; that is what we do. It is, however, important to do it for the right reason. Avoiding paying fees is not a right reason.

Again, a straw-man argument. No one has argued that the limit should be raised to "avoid paying fees". I want blocks to come up against the limit, but I want that limit to be much higher than it is. I want the limit to put some upward pressure on fees, but not too much, because I don't want mass adoption to be dependent on end users paying "excessive fees" to access the blockchain.

Quote
Extending on the above:

It is rather difficult to substantiate an algorithm that would set a course for the future; given the huge number of unknown parameters, the constant we have is preferred by Occam's razor.

But anyone wanting a permanent 1 MB restriction also needs to substantiate that course being set for the future. The argument for getting rid of the 1 MB restriction is no more speculative than the one for making it permanent.

Given what Peter R has shown:

Quote
In response to those claiming that a hard fork to increase the blocksize limit will hurt the miners' ability to collect fee revenue:

The empirical data we have so far does not support the notion that the miners will be starved of fees, or that blocks will be full of low-fee transactions, if the blocksize limit is increased. If we inspect the fees paid to miners per day in US dollars over the lifetime of the network (avg blocksize << max blocksize), we see that total fee revenue, on average, has grown with increases in daily transaction volume.

The total daily fees, F, have actually grown as the number of transactions, N, raised to the power of 2.7. Although I don't expect this F ~ N^2.7 relationship to hold forever, those suggesting that total fees would actually decrease with increasing N have little data to support the claim (although, during our present bear market, we've seen a reduction in the daily fees paid to miners despite an increase in N).

Past behaviour is no guarantee of future behaviour, but historically blocks don't get filled with low-fee transactions, and historically the total fees paid to miners increase with increased transaction volume.

And given that there is no reason to assume that the demand for space relative to available space will be lower with a higher block size limit, there is no reason not to raise the limit above the current 1 MB, in light of DeathAndTaxes' analysis of what this limit will mean for end-user access to the blockchain.
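
For what it's worth, an exponent like the quoted F ~ N^2.7 is typically extracted by fitting a line to log F versus log N; the slope of the fit is the exponent. A minimal sketch with made-up numbers:

Code:
import numpy as np

N = np.array([1e3, 5e3, 2e4, 1e5])    # daily tx counts (hypothetical)
F = np.array([2.0, 150.0, 6e3, 5e5])  # daily fees, USD (hypothetical)

exponent, _ = np.polyfit(np.log(N), np.log(F), 1)
print(f"F ~ N^{exponent:.1f}")  # slope of the log-log fit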
CIYAM (Legendary, Activity: 1862, Ian Knowles - CIYAM Lead Developer)
February 06, 2015, 04:22:17 AM  #153

Quote
Polls on bitcointalk:

- if I agree with the outcome, it's undeniable evidence that I am right
- if I don't agree with the outcome, it's just sock puppets.

Why did this forum get so filled with trolls?

I have never agreed with the use of polls in the forum full stop (and my own forum software does not even have them).

The very idea of putting polls into a forum that "encourages" sockpuppets is ridiculous.

(but let's not go off topic)

ObamaMcKennit (Newbie, Activity: 1)
February 06, 2015, 05:05:50 AM  #154

http://trilema.com/2015/gerald-davis-is-wrong-heres-why/

Since you content-farm lot are all busy copy/pasting the same stuff over and over to each other instead of paying attention.
wilth1 (Member, Activity: 63)
February 06, 2015, 05:08:33 AM  #155

Why couldn't MAX_BLOCK_SIZE be self-adjusting?
gmaxwell (Staff, Legendary, Activity: 2366)
February 06, 2015, 05:15:42 AM  #156

Quote
Why couldn't MAX_BLOCK_SIZE be self-adjusting?

That's very vague... based on what? The hard rules of the protocol are what protect the users and owners of bitcoins from miners whose interests are only partially aligned with theirs. Sadly, miners have substantial censoring power over the data that goes into the blockchain. I suppose it's useful to have an in-protocol way of coordinating rather than depending on potentially non-transparent back-room dealing; but almost anything in the network would be easily gameable by miners. There are some things that I think are preferable to just having no effective limit (e.g. having a rolling median, requiring mining at a higher difficulty to move the needle with your own blocks, and requiring that difficulty not be falling overall for the size to go up), but these don't address half the concerns and potentially add a fair bit of complexity (which has its own risks).
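
A sketch of the rolling-median variant (the floor and headroom parameters are invented for illustration, and the difficulty-penalty half of the scheme is not modeled):

Code:
from statistics import median

def next_size_limit(recent_block_sizes, floor=1_000_000, headroom=2.0):
    """recent_block_sizes: sizes in bytes over the adjustment window.
    The limit tracks the median of recent blocks, so miners must
    actually produce larger blocks (at a difficulty premium, per the
    post above) to move it, and it never drops below the floor."""
    return max(floor, int(headroom * median(recent_block_sizes)))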

Bitcoin will not be compromised
sangaman (Sr. Member, Activity: 342)
February 06, 2015, 05:31:16 AM  #157

Thank you DeathAndTaxes for this excellent post on why the block size limit must increase if Bitcoin is ever to reach its potential as (or simply to remain) a decentralized, peer-to-peer means of exchange. A highly restricted blockchain that is impractical for regular people to use is not what Bitcoin was ever intended to be, and you did a good job explaining why this is what Bitcoin would become with a fixed 1 MB cap (and sooner than most people think). You also did a good job debunking several of the most common objections to lifting the 1 MB cap, and I would further emphasize that an artificial cap on the number of Bitcoin transactions is not the best way to maximize mining fees and/or security. This follows from basic economic principles: hypothetically speaking, if you assume purely self-interested miners, and capping blocks at 1 MB happens to be the way to generate the greatest amount of transaction fees, then blocks will be no larger than 1 MB regardless of what the block limit is. Of course, if there were no block limit, miners would be able to maximize their transaction fee intake, whether that means 2 MB blocks, 5 MB, whatever; market forces will determine the fee per byte needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may well vary from person to person).

D&T, you've been one of if not the most consistently reasonable, sincere, and intelligent posters on this forum since I first discovered this forum, and I appreciate you taking the time to write this persuasive argument on a topic that's critical for the long-term success and viability of bitcoin.
justusranvier (Legendary, Activity: 1400)
February 06, 2015, 05:36:23 AM  #158

What if he's wrong?

Quote
without MPEx no fork of this network can succeed
sangaman (Sr. Member, Activity: 342)
February 06, 2015, 06:19:45 AM  #159

Quote
Of course, if there were no block limit, miners would be able to maximize their transaction fee intake, whether that means 2 MB blocks, 5 MB, whatever; market forces will determine the fee per byte needed to get a transaction into the blockchain in a timely fashion. Whether that fee makes it economical to purchase a cup of coffee with bitcoin remains to be seen (and may well vary from person to person).

I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure, for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.
justusranvier (Legendary, Activity: 1400)
February 06, 2015, 06:26:54 AM  #160

Quote
I should actually correct myself here, since I overlooked the potential tragedy-of-the-commons situation when it comes to miners including transactions: a blockchain with no block size limit would not necessarily generate the most fees or be the most secure, for that reason. However, it's still entirely possible (and in my opinion quite likely) that increasing the block size limit from 1 MB would increase the total fees per block in the long run.

For the entirety of Bitcoin's history, it has produced blocks smaller than the protocol limit.

Why didn't the average size of blocks shoot up to 1 MB and stay there the instant Satoshi added a block size limit to the protocol?