Bitcoin Forum
Author Topic: Increasing blocksize dynamically w/economic safeguards - the ideal compromise?  (Read 1546 times)
pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 09, 2017, 11:38:28 AM
Merited by ABCbits (1)
#1

One side of the block size debate wants to hand over control of the block size to the miners.  Many fear such an implementation would cause catastrophic failures of consensus, and that miners could even be incentivised to bloat the block size at a rate that overly compromises Bitcoin's decentralisation.

Others are worried that scaling solutions such as Lightning Network and sidechains will take too long and not achieve sufficient gains, stifling Bitcoin’s network effect and preventing its continued exponential growth.

What if there were a way to simultaneously allow for exponential growth on chain if needed, buying time for layer-two solutions to take some heat off the chain, while also creating an economic disincentive for miners who try to inflate the block size arbitrarily?

Such a solution should allow for an exponential increase in block size if miners were in consensus, but require that they face an economic risk when signaling for a block size increase where there was no consensus. Cryptoeconomics is built on incentive game theory, so why not introduce it here?

Allowing the block size to change dynamically with demand would reduce the risk of further contentious block size hard forks and hostile debate. I fear a simple 2MB increase would reignite the debate almost as soon as it was activated; we need to buy as much time as possible.

Any solution is going to be a compromise, but by allowing a few years of exponential growth with strict safeguards and appropriate economic incentives we can hopefully achieve that.

So how do we do it?

My basic idea is for miners to vote in each block to increase the block size.

Allowing for exponential growth would mean that the block size could double every year.

This would be achieved by each of the 2016 blocks in a retarget period voting for the maximum increase of 2.7%. An increase of 2.7% every two weeks compounds to an annual block size increase of roughly 99.9%.
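
For reference, a minimal sanity check of that compounding arithmetic (the script and its rounding are mine, not part of the proposal):

Code:
# Sanity check: 26 retarget periods of ~2 weeks per year, each allowing at
# most a 2.7% increase, compounds to roughly a doubling of the limit.
per_period = 1.027
periods_per_year = 26
annual_growth = per_period ** periods_per_year - 1
print(f"annual growth: {annual_growth:.1%}")   # ~99.9%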

We only need to use 3 bits for miners to vote on block size:
000 = not voting
001 = vote no change
011 = vote decrease 2%
101 = vote increase 1.35%, pay 10% of transaction fees to next block
111 = vote increase 2.7%, pay 25% of transaction fees to next block

Not including any transactions in a block will waive a miner's right to vote.

Each block is a vote, and the block size change could be calculated by averaging out all the votes over 2016 blocks.
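
To make that concrete, here is a rough sketch of the vote table above and the 2016-block averaging. The names, and the choice to leave abstaining blocks out of the average, are my own reading of the idea rather than a worked-out consensus rule:

Code:
# Hypothetical sketch of the 3-bit vote table and the 2016-block averaging
# described above; abstaining blocks are excluded from the average here,
# which is one possible reading of the proposal, not a spec.
VOTE_TABLE = {
    0b000: None,    # not voting (e.g. a block with no transactions)
    0b001: 0.0,     # vote no change
    0b011: -2.0,    # vote decrease 2%
    0b101: 1.35,    # vote increase 1.35% (pays 10% of fees to the next block)
    0b111: 2.7,     # vote increase 2.7%  (pays 25% of fees to the next block)
}

def next_block_size(current_size_mb, votes):
    """Average the per-block votes (in percent) over a 2016-block window."""
    counted = [VOTE_TABLE[v] for v in votes if VOTE_TABLE.get(v) is not None]
    if not counted:
        return current_size_mb
    avg_pct = sum(counted) / len(counted)
    return current_size_mb * (1 + avg_pct / 100)

# Unanimous +2.7% over a full window lifts a 1 MB limit to 1.027 MB.
print(next_block_size(1.0, [0b111] * 2016))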

In order to achieve an increase in block size, the blocks must also have been sufficiently full to justify one. Transactions with no fee, and perhaps outliers far from the mean tx fee/kB, should not be counted towards fullness.

By asking miners to pay a percentage of their transaction fees to the miner of the next block, you discourage miners from stuffing the blocks with transactions to artificially inflate the block size.

If miners are in unanimous agreement that the block size needs to increase, the fees would average out and all miners should still be equally rewarded. Only miners trying to increase the block size when consensus is not there would incur a cost.
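
A toy expected-value calculation of that claim (the numbers are mine, chosen only for illustration): a miner that votes for the full increase forwards 25% of its fees, and gets the same share back whenever the previous block also voted.

Code:
# Toy calculation of why fees "average out" under consensus but cost a
# miner who signals without it. All figures are illustrative.
def expected_fee_income(i_signal, network_signal_rate,
                        fees=1.0, forward_share=0.25):
    paid_out = forward_share * fees if i_signal else 0.0
    received = forward_share * fees * network_signal_rate
    return fees - paid_out + received

print(expected_fee_income(True, 1.0))    # unanimous signalling -> 1.000 (no net cost)
print(expected_fee_income(True, 0.1))    # signalling without consensus -> 0.775
print(expected_fee_income(False, 0.1))   # abstaining in the same situation -> 1.025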

There should be a limit on the maximum increase, perhaps 8MB. This isn't a permanent solution; it is just to create time for Bitcoin to progress, and then re-evaluate things further down the line. Combined with SegWit, this should strike a reasonable balance for those who are worried about missing out on exponential growth for a few years if LN and other solutions are not as fast or effective as hoped.

This is my rough idea for trying to find a compromise we can all get behind. Any thoughts?

Link to blog post: https://seebitcoin.com/2017/01/dynamic-block-size-with-economic-safeguards-could-this-be-the-solution-that-we-can-all-get-behind/
gmaxwell
Moderator, Legendary
Activity: 4158
Merit: 8382
January 09, 2017, 12:14:38 PM
Merited by ABCbits (3)
#2

101 = vote increase 1.35%, pay 10% of transaction fees to next block
111 = vote increase 2.7%, pay 25% of transaction fees to next block
There are no mandated fees in the Bitcoin protocol so the natural response to schemes like this is for miners to simply accept fees via other means (such as the direct txout method supported by eligius since 2011, or via outputs with empty scriptpubkeys, or via out of band fees) and give users a discount for using them. The expected result would be fees migrating out of the fee area, and protocols that depend on them (like your suggestion) becoming dysfunctional.  Sad

I previously tried to rescue this class of proposal by having the  change not be to fees but by modifying the lowness of the required hash (effective difficulty), but it's difficult to do that in the presence of subsidy.

Unrelated, as you note your proposal is no constraint if miners agree-- this is also why it fails to address the conflict of interest between miners (really mining pools), who are paid to include transactions, and everyone else-- who experiences them as an externality except to the extent that they contribute to economic growth (not at all a necessity: e.g. many companies want to use the bitcoin blockchain without using the Bitcoin currency at all).  Still, better to solve one issue even if all can't be solved.
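
A toy model of the out-of-band-fee point, to make it concrete (the transaction layout and satoshi amounts are made up for illustration): a fee-keyed rule only "sees" inputs minus outputs, so the same payment routed through an extra output to the miner is invisible to it.

Code:
# Toy model, amounts in satoshis and entirely made up: the protocol only
# "sees" fee = sum(inputs) - sum(outputs), so a payment routed through an
# extra output to the miner's own address bypasses any rule keyed on fees.
def inband_fee(tx):
    return sum(tx["inputs"]) - sum(tx["outputs"].values())

tx_inband = {"inputs": [100_000_000],
             "outputs": {"merchant": 99_900_000}}          # 100,000 sat fee
tx_outofband = {"inputs": [100_000_000],
                "outputs": {"merchant": 99_900_000,
                            "miner_payout": 100_000}}      # fee as an output

print(inband_fee(tx_inband))      # 100000 -> visible to a fee-based penalty
print(inband_fee(tx_outofband))   # 0      -> the penalty sees nothing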

pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 09, 2017, 12:47:14 PM
#3

There are no mandated fees in the Bitcoin protocol so the natural response to schemes like this is for miners to simply accept fees via other means (such as the direct txout method supported by eligius since 2011, or via outputs with empty scriptpubkeys, or via out of band fees) and give users a discount for using them. The expected result would be fees migrating out of the fee area, and protocols that depend on them (like your suggestion) becoming dysfunctional.  Sad

Hi greg, thanks for taking a look at the idea.

The issue you highlight is why I had the idea to exclude zero-fee transactions, and those paying far below the mean fee, from the block-fullness calculation used to justify an increase.

There would therefore be no benefit for miners who accept fees out of band in an effort to avoid paying for a vote to increase the block size: an increase would not be possible unless blocks are sufficiently full of transactions that pay an in-band fee.

This doesn't change the fundamentals of Bitcoin - fees are still not mandated, and there is no penalty for including zero-fee transactions. It's just that if a large number of transactions in blocks are paying fees substantially below the mean, the block size cannot grow, just as it can't at the moment.

I hope what I'm trying to say makes sense. Do you think this could help mitigate the out of band fees problem?
pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 09, 2017, 03:02:20 PM
#4

Another way to do it: instead of averaging out the votes (say 5 votes: 0 + 0 + 1.35% + 2.7% + 2.7%, which averages to a +1.35% increase), you could have the option with the most votes win.

That way miners who are signaling +2.7% are not getting their higher investment diluted down as long as they have consensus. Once the threshold is reached, all miners would drop down to not voting until the next round.
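
A quick sketch of that "most votes wins" counting (the function name is mine; a real rule would also need a tie-breaker and the fullness check described earlier):

Code:
# Plurality counting: the single option with the most votes wins outright,
# instead of averaging all votes together.
from collections import Counter

def winning_adjustment(votes):
    """votes: iterable of percentage options, e.g. 0.0, -2.0, 1.35, 2.7.
    Returns the option with the most votes (plurality)."""
    option, _ = Counter(votes).most_common(1)[0]
    return option

print(winning_adjustment([0.0, 2.7, 2.7, 1.35, 2.7]))   # -> 2.7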
DooMAD
Legendary
Activity: 3766
Merit: 3100
Leave no FUD unchallenged
January 09, 2017, 04:41:10 PM
Last edit: January 09, 2017, 05:05:16 PM by DooMAD
#5

Dynamic is absolutely my personal preferred solution.  The tricky part is not just how to implement it, but also to get people enthusiastic about the idea.  "Trying to find a compromise we can all get behind" is great, but we can only do that if the idea catches on.  Other proposals seem to garner so much more attention and I'm always at a loss to know why that is.  Maybe it's simply down to the fact that it would be inherently more complex than static limits.  "Blocksize = X" is just easier for everyone to comprehend than when it starts being about percentage increases.  There's no technical reason why the blocksize has to be an integer, but maybe since that's how it's always been, people have grown accustomed to it.

For me, the ideal compromise should tick three boxes:

    1) An algorithmic element based on transaction volumes, so change only happens when required
    2) A way for both miners *and* nodes to signal the size of the adjustment they are willing to accept, to maintain equilibrium and to ensure miners can't dictate to the rest of the network
    3) Another algorithmic element taking into consideration the average total fees received per block over all the blocks in the previous difficulty period, to ensure economic viability and to avoid rigging by any single pool or entity.

No easy task, for sure.  But it feels like all the elements are there and just need putting together somehow.  BIP106 came very close, but left some important elements missing.  Mainly #2, a way for full nodes to set a "this far and no further" threshold.  It's almost ironic that for all the complaints on this forum about the BU client, #2 is almost exactly what it does (although again, tends to encourage whole numbers and not decimals, adjustments should be in fractions of a MB).  But on its own, BU isn't the whole solution either, because it doesn't tick boxes #1 and #3.  It absolutely must contain algorithmic elements, or it's no better than "democracy" and would be just as easily corrupted.  If someone can put it all together, there's no reason I can see why it won't work.
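
Purely as an illustration of how those three boxes might fit together - every name, weight and threshold below is invented, not part of any existing proposal:

Code:
# Illustrative only: combine 1) demand from block fullness, 2) a node-signalled
# ceiling, and 3) a fee-sustainability check over the previous difficulty period.
def proposed_adjustment(avg_fullness, node_ceiling_pct,
                        avg_fees_per_block, min_fees_per_block,
                        max_step_pct=2.7):
    # 1) demand: only grow when blocks in the last period were actually full
    if avg_fullness < 0.9:
        return 0.0
    # 3) economics: only grow when average fee income looks sustainable
    if avg_fees_per_block < min_fees_per_block:
        return 0.0
    # 2) node signal: miners' step is capped by what nodes will accept
    return min(max_step_pct, node_ceiling_pct)

print(proposed_adjustment(0.95, 1.0, 0.8, 0.5))   # 1.0% (node ceiling binds)
print(proposed_adjustment(0.60, 2.7, 0.8, 0.5))   # 0.0% (blocks not full enough)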

Your own addition to the mix of miners paying a percentage of transaction fees to the next block may also have merit, but I'm struggling a bit with this part:

If miners are in unanimous agreement that the block size needs to increase, the fees would average out and all miners should still be equally rewarded. Only miners trying to increase the block size when consensus is not there would incur a cost.

In practice, would it not be a case of whoever blinks first loses money?  How else would miners know what to agree on unless someone starts signalling first?

pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 09, 2017, 05:57:34 PM
#6

For me, the ideal compromise should tick three boxes:

    1) An algorithmic element based on transaction volumes, so change only happens when required
    2) A way for both miners *and* nodes to signal the size of the adjustment they are willing to accept, to maintain equilibrium and to ensure miners can't dictate to the rest of the network
    3) Another algorithmic element taking into consideration the average total fees received per block over all the blocks in the previous difficulty period, to ensure economic viability and to avoid rigging by any single pool or entity.

No easy task, for sure.  But it feels like all the elements are there and just need putting together somehow.  BIP106 came very close, but left some important elements missing.  Mainly #2, a way for full nodes to set a "this far and no further" threshold.  It's almost ironic that for all the complaints on this forum about the BU client, #2 is almost exactly what it does (although again, tends to encourage whole numbers and not decimals, adjustments should be in fractions of a MB).

The insurmountable problem with #2, beyond BU's implementation making something absolute like consensus into something fuzzy, is that it is impossible to avoid Sybil manipulation. Also, it isn't really easy to take a poll of all nodes on the network. The closest you could get is asking individual transactions to signal, but that adds extra bloat on chain, and gives the power to users instead of nodes, when really it is a decision for the latter.

Also, a node is just an IP in terms of measuring 'support'.

If you look at nodes you might end up with a super high bandwidth node that serves many concurrent connections, and somebody's tiny Raspberry Pi hobbyist setup over dodgy wifi. Giving each node equal say is ripe for gaming.

In practice, would it not be a case of whoever blinks first loses money?  How else would miners know what to agree on unless someone starts signalling first?

It would introduce some interesting game theory for sure. It is possible that, if there was clear consensus to increase the block size, miners would avoid 'paying' to signal for an increase at the beginning of a cycle rather than at the end; it would depend on the strength of consensus and how desperate they were to get it 'passed' - a little like lawmakers.

The stakes are low enough that the real cost is only incurred by signalling against consensus over a sustained period. Getting 'caught' occasionally, by having to pay to signal because somebody else who shares your goal found a block before you and didn't signal, is not a big deal and would average itself out over time.
DooMAD
Legendary
Activity: 3766
Merit: 3100
Leave no FUD unchallenged
January 09, 2017, 06:43:53 PM
#7

The insurmountable problem with #2, beyond BU's implementation making something absolute like consensus into something fuzzy, is that it is impossible to avoid Sybil manipulation. Also, it isn't really easy to take a poll of all nodes on the network. The closest you could get is asking individual transactions to signal, but that adds extra bloat on chain, and gives the power to users instead of nodes, when really it is a decision for the latter.

Also, a node is just an IP in terms of measuring 'support'.

If you look at nodes you might end up with a super high bandwidth node that serves many concurrent connections, and somebody's tiny Raspberry Pi hobbyist setup over dodgy wifi. Giving each node equal say is ripe for gaming.

Aside from the "fuzzy consensus" bit, would the same shortcomings not apply to nodes signalling for a softfork as is happening currently?  That's measurable, so this should be equally so.  Plus, we're apparently prepared to accept shenanigans like client spoofing, when that could have a significantly adverse effect on consensus, because there's no easy way to prevent it.  It's the nature of the beast.  So while I'm of the mindset "never say never", it's highly unlikely we're going to find a completely fool-proof solution.  Hence #1 and #3 being necessary as well.

As an additional safety precaution, is there a way to measure the maturity of a node?  Something like a node isn't permitted a vote until it has relayed X amount of blocks?  Or is that just my brain drifting into the realms of fantasy?  The more algorithmic checks and balances that can be added, the better.

I certainly wouldn't advocate transaction-based signalling while we're in the process of trying to optimise the space they use up, not add to it.  Plus, as you mention, SPV clients and other non-load-bearing entities would then get equal say without equal contribution, which hardly seems fair.

The natural order of things is that large blocks favour miners because there's greater potential for profit and small blocks favour nodes because there's less externalised cost.  The only solution people will be prepared to accept is to strike a fair balance between the two.  It has to be solved somehow. 

pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 09, 2017, 10:47:10 PM
#8

Aside from the "fuzzy consensus" bit, would the same shortcomings not apply to nodes signalling for a softfork as is happening currently?  That's measurable, so this should be equally so.

Nodes don't signal for SegWit activation, only miners do.

I assumed nodes in BU signalled their max block size and the depth at which they were willing to be overridden, but perhaps it is signalled in blocks too - I haven't thoroughly examined the implementation; I just know that there is no consensus on what consensus even is.

Plus, we're apparently prepared to accept shenanigans like client spoofing, when that could have a significantly adverse effect on consensus, because there's no easy way to prevent it.  It's the nature of the beast.  So while I'm of the mindset "never say never", it's highly unlikely we're going to find a completely fool-proof solution.  Hence #1 and #3 being necessary as well.

Accept in what way? Spoofing nodes is only a social manipulation strategy; it does not achieve anything in terms of votes or consensus. The whole point is that node counts of signalling can't be trusted. Period. There is no way to assess a node's contribution to the network. #2 cannot happen. Only miners or transactions can have any sway on the blockchain; as far as the blockchain is concerned, nodes are read-only.
DooMAD
Legendary
Activity: 3766
Merit: 3100
Leave no FUD unchallenged
January 10, 2017, 12:35:05 AM
#9

Accept in what way? Spoofing nodes is only a social manipulation strategy; it does not achieve anything in terms of votes or consensus. The whole point is that node counts of signalling can't be trusted. Period. There is no way to assess a node's contribution to the network. #2 cannot happen. Only miners or transactions can have any sway on the blockchain; as far as the blockchain is concerned, nodes are read-only.

Accept in the sense that we have no alternative.  No one can think of a way to prevent dishonest nodes.  Social manipulation or otherwise, it's still a danger to the overall health of the network and needs to be considered.  Whether node counts can be trusted or not, the fact remains it is being measured and will likely have an impact on whether or not SegWit activates.  Miners won't risk activation if they perceive a chance of the network not relaying their new blocks.  They'll want to see nodes running compatible software before it goes live.  Even though there's no way of knowing for sure if the figures are accurate, it's all we have to go by.  I'm suggesting that similar figures could be used for blocksize.

What we're discussing here isn't a problem with factoring node statistics into a dynamic blocksize proposal, but with an inherent flaw in the consensus mechanism itself.  One we can't easily get rid of, because there's currently nothing to stop dishonest node operators from spoofing their software version.  Similarly, there's nothing to stop them spoofing a preference for blocksize if that could be listed and quantified in the same way we list and quantify the software version.  If we can do one, we can obviously do the other.  It's not perfect, granted, but at least it would give a rough indication of what size adjustment the network would generally be prepared to allow before the miners started their signalling for larger or smaller blocks.  

If we can find a way to implement algorithmic safeguards to limit the damage that could potentially be caused by dishonest participants, perhaps those could also be extended to future fork proposals and kill two birds with one stone.  Effectively making the entire system more resilient.

pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 11, 2017, 01:20:00 AM
#10

There are no mandated fees in the Bitcoin protocol so the natural response to schemes like this is for miners to simply accept fees via other means (such as the direct txout method supported by eligius since 2011, or via outputs with empty scriptpubkeys, or via out of band fees) and give users a discount for using them. The expected result would be fees migrating out of the fee area, and protocols that depend on them (like your suggestion) becoming dysfunctional.  Sad

I previously tried to rescue this class of proposal by having the  change not be to fees but by modifying the lowness of the required hash (effective difficulty), but it's difficult to do that in the presence of subsidy.

Unrelated, as you note your proposal is no constraint if miners agree-- this is also why it fails to address the conflict of interest between miners (really mining pools), who are paid to include transactions, and everyone else-- who experiences them as an externality except to the extent that they contribute to economic growth (not at all a necessity: e.g. many companies want to use the bitcoin blockchain without using the Bitcoin currency at all).  Still, better to solve one issue even if all can't be solved.

I did some analysis of the transaction fees, and used the data to properly demonstrate how all the risks you identify can be mitigated!

Out of band fees can be completely disincentivised.

Please have a read and let me know if you have any thoughts.

https://seebitcoin.com/2017/01/i-analysed-24h-worth-of-transaction-fee-data-and-this-is-what-i-discovered/
gmaxwell
Moderator, Legendary
Activity: 4158
Merit: 8382
January 11, 2017, 01:40:00 AM
#11

out of band fees can work in both directions, e.g. including rebates.  (and, in fact rebates can be done inband with coinjoins with no trust)

Also consider what your scheme does when a majority hashpower censors any transaction paying a high inband fee level.
pondjohn (OP)
Newbie
Activity: 21
Merit: 1
January 11, 2017, 01:58:38 AM
#12

out of band fees can work in both directions, e.g. including rebates.  (and, in fact rebates can be done inband with coinjoins with no trust)

Also consider what your scheme does when a majority hashpower censors any transaction paying a high inband fee level.

I'm not sure I understand how you're suggesting a rebate would work?

What do you mean by a majority hashpower, a 51% attack? If a transaction has a high fee surely any miner is incentivised to include it in a block?
manselr
Legendary
Activity: 868
Merit: 1004
January 16, 2017, 04:56:43 PM
#13

gmaxwell, as an XMR supporter and holder (I think), what do you think of Monero's way of dealing with the blocksize problem? They are running a dynamic blocksize solution - what do you think about it? What are the differences between it and BU's proposal?

Do you think Monero could end up in a tricky situation if transaction volume starts growing to the point where the blocksize becomes too big and the network becomes too centralised (which would make it less secure than just using Bitcoin plus additional anonymity features)? I heard someone say they have a method of raising fees if that happens, but then it's the same problem again (fees too high), so what's the point?

I'm trying to decide whether I want to take a long-term position in Monero, but since I'm not a coder, only an investor, I can't understand all the details  Cry

Can anybody explain the pros and cons in an ELI5 way, or is that not possible?

I just see a lot of people here saying that the dynamic solution is the best and that "Blockstream don't want it so they can profit from sending most of the transaction volume through LN" and so on - especially that franky1 guy, who is saying that here all day - and as a non-coder I don't know who to believe anymore  Cry
Jet Cash
Legendary
Activity: 2702
Merit: 2449
January 16, 2017, 05:27:45 PM
Merited by ABCbits (1)
#14

Here is a sequence from blockchain.info - why would increasing the blocksize help with this situation?

Quote
Height   Time   Relayed by   Block hash   Size (kB)
448492 (Main Chain)   2017-01-16 14:11:08   BW.COM   000000000000000000da57d09bae60aad857065282df71d390e19fc37b888db3   399.24
448491 (Main Chain)   2017-01-16 14:06:48   BitFury   00000000000000000302e906a677bd35944940aa9612ac3191583eef95a7e969   478.14
448490 (Main Chain)   2017-01-16 14:04:00   BW.COM   0000000000000000012ee8ec9caea10946f5310f60a47191038e47e43bd5eeed   342.04
448489 (Main Chain)   2017-01-16 14:01:57   BW.COM   00000000000000000197f8a1e665a11461bee7a29a73011bcf5b0fbf3aa20bce   496.43
448488 (Main Chain)   2017-01-16 13:57:47   SlushPool   0000000000000000003fdcae2df726d493136831ad7fc7de1378a2d24e0ba1d3   195.67
448487 (Main Chain)   2017-01-16 13:56:29   BTCC Pool   000000000000000001edfa42ad5754deba9df7359e920a23c849b97df05aa56c   998.86
448486 (Main Chain)   2017-01-16 13:51:53   BW.COM   0000000000000000032ad47bb82c250652cefd086408c14fc4c02feab9e7ceab   998.03
448485 (Main Chain)   2017-01-16 13:38:09   AntPool   0000000000000000016095498c650309869c2e327ba5f4783ce6a3bfc24a8e7b   472.42
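
A quick check of the point being made, using only the sizes quoted above against the 1,000 kB limit of the time (the script itself is just an editorial illustration):

Code:
# Average of the block sizes quoted above, against the 1,000 kB limit.
sizes_kb = [399.24, 478.14, 342.04, 496.43, 195.67, 998.86, 998.03, 472.42]
avg = sum(sizes_kb) / len(sizes_kb)
print(f"average size: {avg:.0f} kB (~{avg / 1000:.0%} of the limit)")
# -> average size: 548 kB (~55% of the limit)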

Nicolas Tesla
Newbie
Activity: 57
Merit: 0
January 17, 2017, 03:22:14 PM
#15

Here is a sequence from blockchain.info - why would increasing the blocksize help with this situation?

Quote
Height   Time   Relayed by   Block hash   Size (kB)
448492 (Main Chain)   2017-01-16 14:11:08   BW.COM   000000000000000000da57d09bae60aad857065282df71d390e19fc37b888db3   399.24
448491 (Main Chain)   2017-01-16 14:06:48   BitFury   00000000000000000302e906a677bd35944940aa9612ac3191583eef95a7e969   478.14
448490 (Main Chain)   2017-01-16 14:04:00   BW.COM   0000000000000000012ee8ec9caea10946f5310f60a47191038e47e43bd5eeed   342.04
448489 (Main Chain)   2017-01-16 14:01:57   BW.COM   00000000000000000197f8a1e665a11461bee7a29a73011bcf5b0fbf3aa20bce   496.43
448488 (Main Chain)   2017-01-16 13:57:47   SlushPool   0000000000000000003fdcae2df726d493136831ad7fc7de1378a2d24e0ba1d3   195.67
448487 (Main Chain)   2017-01-16 13:56:29   BTCC Pool   000000000000000001edfa42ad5754deba9df7359e920a23c849b97df05aa56c   998.86
448486 (Main Chain)   2017-01-16 13:51:53   BW.COM   0000000000000000032ad47bb82c250652cefd086408c14fc4c02feab9e7ceab   998.03
448485 (Main Chain)   2017-01-16 13:38:09   AntPool   0000000000000000016095498c650309869c2e327ba5f4783ce6a3bfc24a8e7b   472.42

In short, blocks aren't full? Some are not even half of the limit! WTF?
Nicolas Tesla
Newbie
Activity: 57
Merit: 0
January 17, 2017, 03:36:14 PM
#16

gmaxwell, as an XMR supporter and holder (I think), what do you think of Monero's way of dealing with the blocksize problem? They are running a dynamic blocksize solution - what do you think about it? What are the differences between it and BU's proposal?

Do you think Monero could end up in a tricky situation if transaction volume starts growing to the point where the blocksize becomes too big and the network becomes too centralised (which would make it less secure than just using Bitcoin plus additional anonymity features)? I heard someone say they have a method of raising fees if that happens, but then it's the same problem again (fees too high), so what's the point?

I'm trying to decide whether I want to take a long-term position in Monero, but since I'm not a coder, only an investor, I can't understand all the details  Cry

Can anybody explain the pros and cons in an ELI5 way, or is that not possible?

I just see a lot of people here saying that the dynamic solution is the best and that "Blockstream don't want it so they can profit from sending most of the transaction volume through LN" and so on - especially that franky1 guy, who is saying that here all day - and as a non-coder I don't know who to believe anymore  Cry

Yes, any flex cap can be gamed, or it simply falls into the tragedy of the commons:
You have, let's say, 1000 nodes.
First you raise the limit to 2 MB.
10% give up because of the higher cost.
You still have 900 nodes.
Then you raise it to 4 MB.
20% drop out.
You still have 720 nodes...
Now you say, "see, we only lost 28%".
Let's raise it to 10 MB.
Half now drop out, as this is unbearable for the average Joe.
You have 360 nodes...
You have lost 64% of your nodes...
This is the tragedy of the commons.
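
The attrition arithmetic from that scenario, compounded step by step (the dropout percentages are the poster's hypothetical, not measurements):

Code:
# The hypothetical attrition scenario above, spelled out.
nodes = 1000
for step, dropout in [("2 MB", 0.10), ("4 MB", 0.20), ("10 MB", 0.50)]:
    nodes = round(nodes * (1 - dropout))
    print(f"after {step}: {nodes} nodes remain")
# after 2 MB: 900, after 4 MB: 720, after 10 MB: 360 -> a 64% cumulative loss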
HostFat
Staff, Legendary
Activity: 4214
Merit: 1203
I support freedom of choice
January 18, 2017, 02:15:49 AM
Last edit: January 18, 2017, 02:33:19 AM by HostFat
Merited by ABCbits (2)
#17

@Nicolas Tesla
You, like many others, are currently missing one important thing.

What happens if the network loses a large number of nodes? The decentralisation of the network starts to go missing.
What happens to the market if the decentralisation of the network starts to go missing? Confidence in Bitcoin gets lower, and so does the price.
What happens to the miners if the price gets lower? They earn less money.

So, even with an unlimited block size there could be problems from a possible attack by a malicious entity, and so some dynamic barriers are needed, but there is no way that miners will start making huge blocks, because that goes directly against their own financial interest.

Again, not because they have good hearts, but simply because of their greed, which is how Nakamoto's consensus works (section 6, Incentive).

And they have millions invested that they need to recover over the coming months and years.

A good past example of this economic behaviour:
http://www.coindesk.com/bitcoin-miners-ditch-ghash-io-pool-51-attack/

Nicolas Tesla
Newbie
Activity: 57
Merit: 0
January 18, 2017, 12:29:47 PM
#18

@Nicolas Tesla
You, like many others, are currently missing one important thing.

What happens if the network loses a large number of nodes? The decentralisation of the network starts to go missing.
What happens to the market if the decentralisation of the network starts to go missing? Confidence in Bitcoin gets lower, and so does the price.
What happens to the miners if the price gets lower? They earn less money.

So, even with an unlimited block size there could be problems from a possible attack by a malicious entity, and so some dynamic barriers are needed, but there is no way that miners will start making huge blocks, because that goes directly against their own financial interest.

Again, not because they have good hearts, but simply because of their greed, which is how Nakamoto's consensus works (section 6, Incentive).

And they have millions invested that they need to recover over the coming months and years.

A good past example of this economic behaviour:
http://www.coindesk.com/bitcoin-miners-ditch-ghash-io-pool-51-attack/

Maybe, but I don't see how we can avoid this situation even if miners aren't malicious; my scenario still applies. The price could still go up, because most people don't care about decentralisation and financial sovereignty. This PayPal-like coin could still survive with enough big businesses and governments supporting it!

BU or any big-block proposal will fall into that tragedy of the commons, price rising or not.

If BUcoin becomes successful we are back to square one: a centralised system in the hands of governments and big businesses, the exact reason I left fiat in the first place!

A settlement system with tens of thousands of nodes, a good portion of them on Tor and run by the average Joe, with a decentralised-trust Layer 2, is far better in terms of censorship resistance than another pale and ineffective copy of PayPal.

In my view BUcoin could be successful, if by successful you mean a price rise, but at the cost of selling your soul again to governments, data centres and big businesses; in terms of the revolution and financial sovereignty it would have failed miserably!
HostFat
Staff, Legendary
Activity: 4214
Merit: 1203
I support freedom of choice
January 18, 2017, 07:41:03 PM
#19

Quote
Maybe, but I don't see how we can avoid this situation even if miners aren't malicious; my scenario still applies.
There are already some dynamic proposals Smiley

https://zander.github.io/posts/Blocksize%20Consensus/

Beyond this one, many things are moving to make on-chain scaling possible.

Nicolas Tesla
Newbie
Activity: 57
Merit: 0
January 18, 2017, 09:05:29 PM
#20

Quote
Maybe, but I don't see how we can avoid this situation even if miners aren't malicious; my scenario still applies.
There are already some dynamic proposals Smiley

https://zander.github.io/posts/Blocksize%20Consensus/

Beyond this one, many things are moving to make on-chain scaling possible.

I know about dynamic proposals - even Core apparently has one in their cardboard box - they are commonly called "FlexCap", but they all still fall into the tragedy of the commons in the long term. It's the same as unlimited: the cap is dynamic but has no upper bound (unless it has a hard upper bound of, say, 8 MB).

We are back to square one: small nodes will disappear and Bitcoin will become PayPal in Bitmain data centres...  Cry

Let's face it, blockchains aren't meant to be VISA; they are highly inefficient, and their value obviously derives from being a censorship-resistant settlement network...

I don't see any viable solution for massive on-chain scaling without sacrificing the core values and ethos of BTC...
