Bitcoin Forum
July 21, 2017, 08:49:33 PM
News: The warning which may be displayed by Bitcoin Core about unknown versions is related to BIP91, and can be safely ignored.
 
Poll
Question: Will you support Gavin's new block size limit hard fork of 8MB by January 1, 2016 then doubling every 2 years?
1.  yes
2.  no

Author Topic: Gold collapsing. Bitcoin UP.  (Read 1933290 times)
Peter R (Legendary, Activity: 1036)
August 13, 2015, 04:00:27 PM  #30361

YES! See also here: https://www.reddit.com/r/Bitcoin/comments/3eaxyk/idea_on_bitcoin_mailing_list_blocksize_freely/

Instead of a pull-down menu, I would favor a free-form text field without any default (for policy neutrality).

Pushes the responsibility and the power to set this limit back to the user - where it belongs.

Thanks for the link!  Sounds like this is already a thing!  We should bring more attention to this idea and iron out the details.  

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
awemany (Newbie, Activity: 28)
August 13, 2015, 04:03:51 PM  #30362

[...]
So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.


I'd be willing to help! But I'd also suggest just making it about the configurable setting and leaving the rest to the user. I think signalling about block size has to happen out-of-band for the time being, because it is potentially a lot of code complexity, and simple IMO beats complex here.

Just make it mandatory to start bitcoind with -maxblocksizelimit (or similar) and have an edit box for bitcoin-qt that has to be filled with a value. The amount of code change should be about the same as BIP101.

Start requiring this value at some switchover date in the future - maybe at the beginning of Gavin's increase schedule. Reason for this: time for user education on building a functioning Bitcoin network.

What do you think?
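A minimal sketch of what that mandatory startup option could look like (the option name `-maxblocksizelimit` is the poster's suggestion; the parser below is illustrative Python, not Bitcoin Core's actual argument handling):

```python
import argparse

def parse_block_size_limit(argv):
    """Refuse to start unless the operator explicitly chooses a
    block size limit: no default, for policy neutrality."""
    parser = argparse.ArgumentParser(prog="bitcoind-sketch")
    parser.add_argument("-maxblocksizelimit", type=int, default=None,
                        help="largest block size this node accepts, in bytes")
    args = parser.parse_args(argv)
    if args.maxblocksizelimit is None:
        parser.error("-maxblocksizelimit is mandatory; pick a value")
    if args.maxblocksizelimit <= 0:
        parser.error("-maxblocksizelimit must be positive")
    return args.maxblocksizelimit
```

The bitcoin-qt edit box would feed the same code path, so both interfaces force an explicit choice from the user.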
Zangelbert Bingledack (Legendary, Activity: 1036)
August 13, 2015, 04:04:25 PM  #30363

It would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer, as sickpig suggested:

you know what, I can't stop thinking that the max block size is a transport-layer constraint that has crept into the consensus layer.

The network would dynamically determine the max block size as the network evolves by expressing the size of the blocks they will accept with the drop-down menu on their client.

This seems too easy; why wouldn't it have been thought of before? Is this one of those cases where muddled thinking (the consensus/transport-layer confusion) has prevented people from seeing the obvious? I ask because I'm not sure I understand the full implications of sickpig's comment.

EDIT: I think I may get it now:

https://www.reddit.com/r/Bitcoin/comments/3eaxyk/idea_on_bitcoin_mailing_list_blocksize_freely/ctddl6h

along with why it hasn't been tried:

https://www.reddit.com/r/Bitcoin/comments/3eaxyk/idea_on_bitcoin_mailing_list_blocksize_freely/ctd812o
kehtolo (Hero Member, Activity: 700)
August 13, 2015, 04:08:00 PM  #30364


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB in size.  We all witness that that block indeed got included into the longest proof of work chain, and suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

that's just a re-write of what i've been advocating; lift the limit entirely.

but yeah, your idea is great b/c it would give full node operators a sense of being in charge via a pull down menu.  i like it.

don't forget that mining pools are just huge hashing overlays of full nodes which they operate and could use to do the same type of voting.

Yes, you have been essentially advocating the same thing.  

We could take this idea further: in addition to the drop-down menu where node operators and miners select the max block size they'll accept, we could add two new features to improve communication of their decisions:

  1.  The max block size selected by a node would be written into the header of any blocks the node mines.

  2.  The P2P protocol would be extended so that nodes could poll other nodes to find out their block size limit.

This would be a highly decentralized way of coming to consensus in a very flexible and dynamic manner.  
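The two signalling features can be sketched as toy data structures (everything here is hypothetical: the `max_size_accepted` field and the `getsizelimit` message exist nowhere in the real protocol):

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class HeaderSketch:
    # Feature 1: the miner's advertised limit rides along in the
    # blocks it mines (illustrative field, not the real header layout).
    version: int
    prev_hash: str
    merkle_root: str
    max_size_accepted: int

def size_limit_reply(local_limit):
    """Feature 2: a node's answer to a hypothetical 'getsizelimit'
    P2P poll from a peer."""
    return {"command": "sizelimit", "limit": local_limit}

def network_view(replies):
    """One way a polling node might summarize its peers' answers:
    the median advertised limit."""
    return median(r["limit"] for r in replies)
```

A node could use such a summary purely as information; nothing about it would bind consensus.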

It would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer, as sickpig suggested:

you know what, I can't stop thinking that the max block size is a transport-layer constraint that has crept into the consensus layer.

The network would dynamically determine the max block size as the network evolves by expressing the size of the blocks they will accept with the drop-down menu on their client.

So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.


It's a wonderful idea! It scales dynamically by reaching a consensus in a decentralised way. The network decides and evolves organically almost. I love it.

The next 24 hours are critical!
justusranvier (Legendary, Activity: 1400)
August 13, 2015, 04:17:35 PM  #30365

I'd say the term "store of value" has meaning in the context of our current world of fiat money, where you need a hedge against inflation. In the case of Bitcoin, while it is still not yet mainstream, I think a special definition is useful: an asset that retains or grows its purchasing power over the years (particularly in contrast with fiat money), with growth of course being considered even better as a store of value than simply staying level. The difficulty of confiscating it should also be part of its store-of-value merits.
There is certainly a difference between forms of money that work well, and forms of money that don't, that many people have observed throughout history.

The problem is that the explanations for those observations are not correct, because they are tautological. They show up in conversations with goldbugs all the time.

Q: What is a store of value?
A: Anything that has the properties of gold.

Q: Why is gold a store of value?
A: Because it has intrinsic value.

Q: What is value?
A: It's what anything with the properties of gold has.

Q: Can you give reason why I should buy your gold other than that you want to sell it?
A: ...no.
Peter R (Legendary, Activity: 1036)
August 13, 2015, 04:20:31 PM  #30366

[...]
So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.


I'd be willing to help! But I'd also suggest just making it about the configurable setting and leaving the rest to the user. I think signalling about block size has to happen out-of-band for the time being, because it is potentially a lot of code complexity, and simple IMO beats complex here.

Just make it mandatory to start bitcoind with -maxblocksizelimit (or similar) and have an edit box for bitcoin-qt that has to be filled with a value. The amount of code change should be about the same as BIP101.

Start requiring this value at some switchover date in the future - maybe at the beginning of Gavin's increase schedule. Reason for this: time for user education on building a functioning Bitcoin network.

What do you think?

I think that all makes perfect sense, and I agree that simple is better!  Perhaps the BIP could only advocate for doing what you said to start, and then there could be a follow-up BIP to do the signalling in the block headers and to add the p2p "block size limit request" messages.  The nice thing is that the signalling stuff in the follow-up BIP would have nothing to do with the consensus layer, so it would be much easier to build support for it.  

I'd be willing to contribute to this too.  Realistically, I couldn't do any serious work on this until mid-September, however.  Timing wise, it would be great to have a polished proposal published for the second Scalability Workshop in Hong Kong probably in November or December: https://scalingbitcoin.org/

I'm actually quite excited about this idea.  It has a sort of inevitable feel to it.

Erdogan (Hero Member, Activity: 756)
August 13, 2015, 04:22:57 PM  #30367


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB in size.  We all witness that that block indeed got included into the longest proof of work chain, and suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

That is exactly what I think. The miners will have to try it out or get some feel of what they can do through other channels (social media, conferences, node versions), including associate with other miners. As long as the association is voluntary, it will not form a monopoly.
Erdogan (Hero Member, Activity: 756)
August 13, 2015, 04:43:39 PM  #30368

Note that this toe-dipping will be the reality even if we go to 2 MB. There could be some bug related to the network or whatever that could slip through testing in all environments except the production blockchain. Heck, the risk is there now, unless we have had a block of exactly 1,000,000 bytes (decimal). I guess some miners are not 100 percent certain that there is not an off-by-one bug there, so they just remove one transaction to be sure.
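The off-by-one worry boils down to whether the size check uses `<=` or `<` at the boundary. A sketch of the two readings (the constant and function names are illustrative, not Bitcoin Core's):

```python
MAX_BLOCK_SIZE = 1_000_000  # the 1 MB limit under discussion, in bytes

def accepts(block_size):
    # Intended rule: a block of exactly 1,000,000 bytes is valid.
    return block_size <= MAX_BLOCK_SIZE

def accepts_off_by_one(block_size):
    # The feared bug: a block of exactly 1,000,000 bytes is rejected.
    return block_size < MAX_BLOCK_SIZE
```

Until a block of exactly the limit appears on the production chain, the two behaviors are indistinguishable, which is exactly the point about miners shaving off one transaction to be safe.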
iCEBREAKER (Legendary, Activity: 1722) - "Support SEGWIT on 8/1/17 https://github.com/UASF"
August 13, 2015, 04:55:53 PM  #30369


i really see no technical reasons why we can't have bigger blocks.  now.


Then you're not looking hard enough.  Here, I'll help:

Quote
The vast majority of research demonstrates that blocksize does matter, blocksize caps are required to secure the network, and large blocks are a centralizing pressure.

Here’s a short list of what has been published so far:

1) No blocksize cap and no minimum fee leads to catastrophic breakage as miners chase marginal 0 fees:

    http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2400519

It’s important to note that mandatory minimum fees could simply be rebated out-of-band, which would lead to the same problems.

2) a) Large mining pools make strategies other than honest mining more profitable:

    http://www.cs.cornell.edu/~ie53/publications/btcProcArXiv.pdf

2) b) In the presence of latency, some alternative selfish strategy exists that is more profitable at any mining pool size. The larger the latency, the greater the selfish mining benefit:

    http://arxiv.org/pdf/1507.06183v1.pdf

3) Mining simulations run by Pieter Wuille show that well-connected peers making up a majority of the hashing power have an advantage over less-connected ones, earning more profits per hash. Larger blocks further favor these well-connected peers. This gets even worse as we shift from block subsidy to fee-based reward:

    http://www.mail-archive.com/bitcoin-development@lists.sourceforge.net/msg08161.html

4) Other point(s):

If there is no blocksize cap, a miner should simply snipe the fees from the latest block and try to orphan that block by mining their own replacement. You get all the fees plus any more from new transactions. Full blocks give less reward for doing so, since you have to choose which transactions to include. https://www.reddit.com/r/Bitcoin/comments/3fpuld/a_transaction_fee_market_exists_without_a_block/ctqxkq6

"I think this is a good idea" doesn't count.

Stamping your feet and demanding things be done "now" isn't going to help XT.

"Not tonight dear"   Cheesy

The difference between bad and well-developed digital cash will determine whether we have a dictatorship or a real democracy.  David Chaum 1996
"Monero" : { Private - Auditable - 100% Fungible - Flexible Blocksize - Wild & Free® - Intro - Core GUI - Podcats - Roadmap - Dice - Blackjack - Github - Android }
MoneroForCash.com  |  Buy and sell XMR near you  |  Easymonero.com  |  Bitsquare.io - Decentralized XMR Exchange  |  Buy XMR with fiat
Fungibility provides privacy as a side effect.  Adam Back 2014

Bitcoin is intentionally designed to be ungovernable and governance-free.  luke-jr 2016
Blocks must necessarily be full for the Bitcoin network to be able to pay for its own security.  davout 2015
Blocksize is an intentionally limited resource, like the 21e6 BTC limit.  Changing it degrades the surrounding economics, creating negative incentives.  Jeff Garzik 2013


The raison d'être of bitcoin is trustlessness. - Eric Lombrozo 2015
It is an Engineering Requirement that Bitcoin be “Above the Law”  Paul Sztorc 2015
Resiliency, not efficiency, is the paramount goal of decentralized, non-state sanctioned currency -Jon Matonis 2015

Technology tends to move in the direction of making surveillance easier, and the ability of computers to track us doubles every eighteen months. - Phil Zimmerman 2013

The only way to make software secure, reliable, and fast is to make it small. Fight Features. - Andy Tanenbaum 2004
tvbcof (Legendary, Activity: 2212)
August 13, 2015, 04:57:49 PM  #30370


The problem is that the explanations for those observations are not correct, because they are tautological. They show up in conversations with goldbugs all the time.

Where do you live?  Strawman City?


Q: What is a store of value?
A: Anything that has the properties of gold.
...

Actually, Bitcoin is something I consider a 'store of value' for a closely related reason.  Neither Bitcoin nor gold has counter-party risk, and that is actually a fairly unusual characteristic these days, and one I value highly because I consider one of the most acute risks associated with wealth preservation to be an economic crisis in which one can kiss almost anything with counter-party (or theft) risk bye-bye.

What gold has over Bitcoin is that it does not require a free and high-capacity internet to function.  The only 'bandwidth' that gold needs is a ticker signal, and that could be accomplished with fairly low latency even without a functional internet at all.  Even then, this need is a nicety more than a necessity.

Under conditions of economic crisis I believe it almost certain that there will be a significant clamp-down on the free flow of information.  This is unfortunately termed an 'internet kill switch', which is misleading.  I expect it to be implemented as a shift from simply monitoring internet traffic to actively blocking the portion of it which is potentially damaging to those seeking to maintain control and shape society.  IOW, we will still have access to our porn and mainstream movies from an authorized intellectual property owner through large corporate providers, but it will be too 'dangerous to society' for unauthorized people to communicate directly with one another or to allow subversive ideas and data to be disseminated.

Bitcoin, used in certain sophisticated ways which force-multiply the bandwidth, still has 'store of value' potential to me, which is why I am still a hodler.  The simple reason is that it can, at least in theory, function in a world which is much different from what we see today but is very likely to exist at the time when it actually matters.

That, in a nutshell, is why I am so opposed to growing Bitcoin in simplistic ways which box us in to a reliance on assumptions about the global internet based on the common experiences of today and most people's expectations for tomorrow.  'Dumb growth' hacks off what many people consider to be a vestigial appendage but to me it represents a very big part of the 'store of value' proposition for Bitcoin and one that makes it somewhat competitive with gold.


notme (Legendary, Activity: 1736)
August 13, 2015, 05:02:43 PM  #30371

[...]
So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.


I'd be willing to help! But I'd also suggest just making it about the configurable setting and leaving the rest to the user. I think signalling about block size has to happen out-of-band for the time being, because it is potentially a lot of code complexity, and simple IMO beats complex here.

Just make it mandatory to start bitcoind with -maxblocksizelimit (or similar) and have an edit box for bitcoin-qt that has to be filled with a value. The amount of code change should be about the same as BIP101.

Start requiring this value at some switchover date in the future - maybe at the beginning of Gavin's increase schedule. Reason for this: time for user education on building a functioning Bitcoin network.

What do you think?

I think that all makes perfect sense, and I agree that simple is better!  Perhaps the BIP could only advocate for doing what you said to start, and then there could be a follow-up BIP to do the signalling in the block headers and to add the p2p "block size limit request" messages.  The nice thing is that the signalling stuff in the follow-up BIP would have nothing to do with the consensus layer, so it would be much easier to build support for it.  

I'd be willing to contribute to this too.  Realistically, I couldn't do any serious work on this until mid-September, however.  Timing wise, it would be great to have a polished proposal published for the second Scalability Workshop in Hong Kong probably in November or December: https://scalingbitcoin.org/

I'm actually quite excited about this idea.  It has a sort of inevitable feel to it.


What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners' hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks, like what happened when the soft limit was raised to 1MB.

https://www.bitcoin.org/bitcoin.pdf
While no idea is perfect, some ideas are useful.
12jh3odyAAaR2XedPKZNCR4X4sebuotQzN
Chainsaw (Hero Member, Activity: 619)
August 13, 2015, 05:04:17 PM  #30372



I'm actually quite excited about this idea.  It has a sort of inevitable feel to it.


Yes.
I've been watching for months. This feels more right than any proposals I've seen to date.

Odalv (Legendary, Activity: 1162)
August 13, 2015, 05:54:34 PM  #30373

[...]
So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.


I'd be willing to help! But I'd also suggest just making it about the configurable setting and leaving the rest to the user. I think signalling about block size has to happen out-of-band for the time being, because it is potentially a lot of code complexity, and simple IMO beats complex here.

Just make it mandatory to start bitcoind with -maxblocksizelimit (or similar) and have an edit box for bitcoin-qt that has to be filled with a value. The amount of code change should be about the same as BIP101.

Start requiring this value at some switchover date in the future - maybe at the beginning of Gavin's increase schedule. Reason for this: time for user education on building a functioning Bitcoin network.

What do you think?

I think that all makes perfect sense, and I agree that simple is better!  Perhaps the BIP could only advocate for doing what you said to start, and then there could be a follow-up BIP to do the signalling in the block headers and to add the p2p "block size limit request" messages.  The nice thing is that the signalling stuff in the follow-up BIP would have nothing to do with the consensus layer, so it would be much easier to build support for it.  

I'd be willing to contribute to this too.  Realistically, I couldn't do any serious work on this until mid-September, however.  Timing wise, it would be great to have a polished proposal published for the second Scalability Workshop in Hong Kong probably in November or December: https://scalingbitcoin.org/

I'm actually quite excited about this idea.  It has a sort of inevitable feel to it.


What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners' hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks, like what happened when the soft limit was raised to 1MB.

+1
 - I can't wait to see how these bloat-chain supporters hit the wall. As if it were so easy: "just increase the numbers and we will have a network ten times faster ... increase to infinity" :-)

 - please start XT tomorrow. I like fun.
thezerg (Legendary, Activity: 1246)
August 13, 2015, 06:28:23 PM  #30374


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB in size.  We all witness that that block indeed got included into the longest proof of work chain, and suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

That is exactly what I think. The miners will have to try it out or get some feel of what they can do through other channels (social media, conferences, node versions), including associate with other miners. As long as the association is voluntary, it will not form a monopoly.


yes, this has been considered and discussed before.  The danger is that a large-block miner cartel might develop naturally, whose blocks put small-bandwidth players at a disadvantage.  But as others have mentioned, some people are at an electricity-cost disadvantage, some bandwidth, some something else... basically it would just be another metric to take into account as you site your miners.

So I would be 100% for this if miners could only work with real txns.  But a miner could fill up a huge block with a bunch of "fake" (unrelayed, fee pays to himself) txns to artificially drive up network costs.  It's too bad Bitcoin doesn't have the "pay a portion of fees to the miner pool, receive a portion for the next N blocks" feature... that idea closes a lot of miner loopholes.

But regardless, I'm not sure this "loophole" really is one; it does require 51% of the network to be as well-connected as you are and willing to process your monster garbage block.  I have a hard time believing that miners would do so, since over the long term they need Bitcoin to succeed.  More likely (as you guys suggest) they'll just configure their nodes to ignore monster blocks unless they are > N deep in the chain.

Peter R (Legendary, Activity: 1036)
August 13, 2015, 06:36:52 PM  #30375

What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners' hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks, like what happened when the soft limit was raised to 1MB.

I don't think it would be a problem.  Like Erdogan said, the miners will use the "tip-toe" method of increasing the block size.  Worst case, a large block gets orphaned and nobody tries again for a while.  But if the larger block doesn't get orphaned, then the network will assume that that size is now supported (thereby setting a new effective upper limit).

IMO, if we want to put the power directly in miners' hands it would be better to raise the limit entirely.

This doesn't put the power directly in the miners' hands.  It keeps the power where it already is: in everybody's hands!  It just makes it much easier for people to exercise the power they already possess.  

Quote
However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I disagree.  For example, I would not set my node's limit to anything greater than 32 MB until I understood the 33.5 MB message size limitation better.  I expect many people would do the same thing.  Rational miners won't dare to randomly publish a 100 MB block, because they'd be worried that it would be orphaned.

Furthermore, since miners would likely use the "tip-toe" method, the effective block size limit will grow only in very small increments, helping to reveal any potential limitations before they become problems.
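The tip-toe dynamic can be modeled as a simple ratchet (a toy simulation under stated assumptions: a fixed step size and a fixed orphan probability; none of this comes from any actual proposal):

```python
import random

def tip_toe(limit, steps, orphan_prob, step=100_000, seed=1):
    """Each round, some miner tries a block slightly over the current
    effective limit; if it is not orphaned, the network treats that
    size as accepted and the effective limit ratchets up."""
    rng = random.Random(seed)
    for _ in range(steps):
        trial = limit + step             # the slightly-larger block
        if rng.random() >= orphan_prob:  # block survives on the chain
            limit = trial                # new effective upper limit
    return limit
```

With `orphan_prob=0` every trial survives and the limit creeps up by one step per round; with `orphan_prob=1` no large block ever sticks and the limit never moves, matching the "worst case, a large block gets orphaned and nobody tries again for a while" description above.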


Adrian-x (Legendary, Activity: 1358)
August 13, 2015, 06:51:27 PM  #30376



what is the latest version of XT, is it still a test version?

Thank me in Bits 12MwnzxtprG2mHm3rKdgi7NmJKCypsMMQw
cypherdoc (Legendary, Activity: 1764)
August 13, 2015, 06:55:17 PM  #30377

What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners' hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks, like what happened when the soft limit was raised to 1MB.

I don't think it would be a problem.  Like Erdogan said, the miners will use the "tip-toe" method of increasing the block size.  Worst case, a large block gets orphaned and nobody tries again for a while.  But if the larger block doesn't get orphaned, then the network will assume that that size is now supported (thereby setting a new effective upper limit).

IMO, if we want to put the power directly in miners' hands it would be better to raise the limit entirely.

This doesn't put the power directly in the miners' hands.  It keeps the power where it already is: in everybody's hands!  It just makes it much easier for people to exercise the power they already possess.  

Quote
However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I disagree.  For example, I would not set my node's limit to anything greater than 32 MB until I understood the 33.5 MB message size limitation better.  I expect many people would do the same thing.  Rational miners won't dare to randomly publish a 100 MB block, because they'd be worried that it would be orphaned.

Furthermore, since miners would likely use the "tip-toe" method, the effective block size limit will grow only in very small increments, helping to reveal any potential limitations before they become problems.



yes, i've called this "advancing together" but "tip toeing" is even a better descriptor as it implies small baby steps upwards as opposed to random big steps.  miners will not only do what's best for themselves but what's best for the group.  they know that all hands on deck are needed as a team to replace the existing financial order.  where BitcoinXT is going there will be plenty of profits to be had for existing cooperative players as well as new entrants.  the stakes are enormous to the upside but individual miners cannot afford to be caught being dishonest or attacking or they will be left behind or severely deprecated ala ghash.  what a shame to miss out on being the next JPM as a result of being greedy.
Adrian-x
Legendary
*
Offline Offline

Activity: 1358



View Profile
August 13, 2015, 06:56:49 PM
 #30378


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB in size.  We all witness that indeed that block got included into the longest proof of work chain, and then suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.
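If node operators did each pick their own cap, the de facto network limit would be whatever size a weighted majority still accepts. A minimal illustration of that emergent limit (hypothetical weights, not data from the thread):

```python
def effective_limit(limits_with_weight, majority=0.5):
    """Largest block size that more than `majority` of the weight
    (hash power or economic weight) is configured to accept.

    limits_with_weight: list of (limit_mb, weight) pairs."""
    total = sum(w for _, w in limits_with_weight)
    acc = 0.0
    # Walk down from the most permissive limit, accumulating the weight
    # of nodes whose configured cap is at least this size.
    for limit, w in sorted(limits_with_weight, reverse=True):
        acc += w
        if acc / total > majority:
            return limit
    return min(l for l, _ in limits_with_weight)  # fallback: strictest cap

# Illustrative: 40% pick 32 MB, 30% pick 8 MB, 20% pick 4 MB, 10% pick 2 MB.
print(effective_limit([(32, 0.4), (8, 0.3), (4, 0.2), (2, 0.1)]))  # -> 8
```

Under these made-up weights, blocks up to 8 MB would land in the majority chain even though no one formally "voted" on a limit.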

Nodes have the power to do that, even a right, given that they host the data. However, they don't have the market knowledge to know what it should be; the power to set the block size must come from the incentive scheme as designed by Satoshi. I can't imagine it would work out well at all. Probably better than limiting the block size, but not by much.

Thank me in Bits 12MwnzxtprG2mHm3rKdgi7NmJKCypsMMQw
solex
Legendary
*
Offline Offline

Activity: 1078


100 satoshis -> ISO code


View Profile
August 13, 2015, 06:57:18 PM
 #30379

The network would dynamically determine the max block size as the network evolves by expressing the size of the blocks they will accept with the drop-down menu on their client.

So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.

I like the principle of the idea, however the idea of voting by non-miners is subject to Sybil attack - someone spinning up many fake nodes to skew the results. There have been posts on the mailing list about various voting schemes.

Going one better than simple user voting is proof-of-stake voting:

Proof-of-stake voting could be combined with miner voting (like BIP-100) to get a balance between mining power and investors/holders.
https://www.mail-archive.com/bitcoin-development@lists.sourceforge.net/msg02323.html
A drop-down box, which would need support from the many wallet providers, then allows people to vote according to their coin balance. A non-vote is a "vote" for no change.
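The "non-vote is a vote for no change" rule can be sketched in a few lines (hypothetical coin counts; this is an illustration of the tallying rule, not any proposed implementation). It also makes visible how abstention by large holders defaults the outcome to the status quo:

```python
def tally_stake_vote(votes, total_coins, current_limit):
    """Stake-weighted vote on a new block size limit (sketch).

    votes: dict mapping proposed_limit_mb -> coins signed in favor.
    Coins that do not vote count for the status quo, per the
    'a non-vote is a vote for no change' rule."""
    weights = dict(votes)
    abstaining = total_coins - sum(votes.values())
    weights[current_limit] = weights.get(current_limit, 0) + abstaining
    return max(weights, key=weights.get)  # option backed by most stake wins

# Hypothetical numbers: 14M coins exist but only 3M sign votes, so the
# abstainers' 11M default to the current 1 MB limit and it wins.
print(tally_stake_vote({8: 2_000_000, 2: 1_000_000}, 14_000_000, 1))  # -> 1
```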

Except, the problem with proof-of-stake voting is summarized by Alan Reiner (Armory developer) in the responses to this proposal:

Quote
One major problem I see with this, no matter how well-thought-out it is,
it's unlikely that those with money will participate.  Those with the
most stake, likely have their private keys behind super-secure
accessibility barriers, and are not likely to go through the effort just
to sign votes.  Not only that, but it would require them to reveal their
public key, which while isn't technically so terrible, large amounts of
money intended to be kept in storage for 10+ years will prefer to avoid
any exposure at all, in the oft-chance that QCs come around a lot
earlier than we expected.  Sure, the actual risk should be pretty much
non-existent, but some of the most paranoid folks are probably the same
ones who have a lot of funds and want 100.00% of the security that is
possible.   They will see this as wildly inconvenient.

Adrian-x
Legendary
*
Offline Offline

Activity: 1358



View Profile
August 13, 2015, 07:02:47 PM
 #30380


but back to this, you and i can't be outliers as much as LukeJr and gmax want everyone to think in terms of bandwidth speed.  we can easily handle a significant block size increase, no problem.



furthermore:

http://arstechnica.co.uk/gadgets/2015/08/samsung-unveils-2-5-inch-16tb-ssd-the-worlds-largest-hard-drive/

i really see no technical reasons why we can't have bigger blocks.  now.

Yeah I think we're past that point in the debate. It's now clear that the concern of those who make the technical claims regarding bandwidth is about ensuring that Bitcoin node-running is an all-inclusive activity. They insist that no one can be left out, or else it's not a "consensus." Well we're being left out right now, aren't we. By their logic we should be able to halt Bitcoin entirely during this debate because they don't have our consensus. There is no internal consistency in the whole "consensus" line of reasoning. It's just a feel-good buzzword except in the very narrow sense that of course the code will only run among those who are currently in consensus. The lack of any mention of market cap or other economic factor during such invocations of consensus should be a red flag.

There are aspects of the debate where intelligent people may disagree, but this part is pure reactionary stalwartism at this point. It doesn't even jive with the fundamental nature of open source software, which makes consensus a fluid concept. At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

i've flipped the question around to the Cripplecoiners a few times, as in, what happens if Gavin is the sole dissenter when the need to slip in the spvp for SC's comes around in a year or so?  will they gracefully and quietly back off since they won't have consensus?  the angry answer i get back always sounds like they'll ram it thru anyways.

Another reason Cripplecoiners want to see zero hard forks is that they have invested, or are investing, in systems that could be obsoleted very easily with small tweaks to the code. I suspect that if someone like Gavin, who isn't invested in their company, can come in and make a change without their permission, it's very threatening to their future, which may even depend on bugs in the code that can't be fixed for legacy reasons.

Thank me in Bits 12MwnzxtprG2mHm3rKdgi7NmJKCypsMMQw