Bitcoin Forum
March 19, 2024, 02:08:13 AM *
News: Latest Bitcoin Core release: 26.0 [Torrent]
 
   Home   Help Search Login Register More  
Poll
Question: Will you support Gavin's new block size limit hard fork of 8MB by January 1, 2016 then doubling every 2 years?
1.  yes
2.  no

Pages: « 1 ... 1469 1470 1471 1472 1473 1474 1475 1476 1477 1478 1479 1480 1481 1482 1483 1484 1485 1486 1487 1488 1489 1490 1491 1492 1493 1494 1495 1496 1497 1498 1499 1500 1501 1502 1503 1504 1505 1506 1507 1508 1509 1510 1511 1512 1513 1514 1515 1516 1517 1518 [1519] 1520 1521 1522 1523 1524 1525 1526 1527 1528 1529 1530 1531 1532 1533 1534 1535 1536 1537 1538 1539 1540 1541 1542 1543 1544 1545 1546 1547 1548 1549 1550 1551 1552 1553 1554 1555 1556 1557 »
  Print  
Author Topic: Gold collapsing. Bitcoin UP.  (Read 2032123 times)
Adrian-x
Legendary
*
Offline Offline

Activity: 1372
Merit: 1000



View Profile
August 13, 2015, 07:15:16 PM
 #30361



what is the latest version of XT, is it still a test version?

0.11A is the testing version of the actual bigger block software, i think.

i'm running 0.10.2 which is the stable version but doesn't have the bigger blocks enabled.

both versions mine Core as usual, as you'd expect.
thanks,

I'm ruining XT too but an older stable version that still identifies as Satoshi client.

I'm waiting for 11.0 or the bigger block stable version.

Thank me in Bits 12MwnzxtprG2mHm3rKdgi7NmJKCypsMMQw
You get merit points when someone likes your post enough to give you some. And for every 2 merit points you receive, you can send 1 merit point to someone else!
Advertised sites are not endorsed by the Bitcoin Forum. They may be unsafe, untrustworthy, or illegal in your jurisdiction.
1710814093
Hero Member
*
Offline Offline

Posts: 1710814093

View Profile Personal Message (Offline)

Ignore
1710814093
Reply with quote  #2

1710814093
Report to moderator
1710814093
Hero Member
*
Offline Offline

Posts: 1710814093

View Profile Personal Message (Offline)

Ignore
1710814093
Reply with quote  #2

1710814093
Report to moderator
Adrian-x
Legendary
*
Offline Offline

Activity: 1372
Merit: 1000



View Profile
August 13, 2015, 07:20:05 PM
 #30362


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB is size.  We all witness that indeed that block got included into the longest proof of work chain, and then suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

Nodes have the power to do that, even a right given they host the data, however they dont have the the market knowledge to know what it should be, the power to set the block size must come from the incentive schema designed as designed by Satosi to work.  I cant imagine it would work out well at all, probably better than limiting the block size but not my much.  

that's not the pt. 

the full nodes should and will care only about what tx volume they have the capability or desire to validate.  for me, despite not getting paid, i will want to max out those validation capabilities to whatever i'm willing to pay (donate) to the network, which is much higher than it is being used today.  i want Bitcoin to grow in size and price so i'm going to donate to my hearts content trying to make that happen.   the Cripplecoiners will say that is no way to run a full node but i'll bet there are thousands of guys like me who are willing to do this for the prospect of taking the price to the next 10x level.  there is nothing wrong with wanting a higher price b/c a much higher price is essential to Bitcoin's overall success and ability to move large scale tx's as we've argued about before.  it has to be able to allow large tx's in the $millions to fund purchases of real estate, yachts, planes or bigger items so as not to perturb the overall exchange price, unlike what we have today. 

as more users come onboard, merchants will have the incentive, not to mention fiduciary responsibility to run full nodes.  that's a good thing.

but if someone designs a full node fee mkt, i won't be complaining either.

the optimal solution is looking more and more like 101 to me. I think the sybil attack is a good reason keep nodes from forking to the wrong block size.   

Thank me in Bits 12MwnzxtprG2mHm3rKdgi7NmJKCypsMMQw
brg444
Hero Member
*****
Offline Offline

Activity: 644
Merit: 504

Bitcoin replaces central, not commercial, banks


View Profile
August 13, 2015, 07:22:44 PM
Last edit: August 13, 2015, 07:45:44 PM by brg444
 #30363

My advise to you is to start looking at growth in another light. While it seems reasonable to track "adoption & growth" by an increase in the userbase, I have recently come to the conclusion that what might be even more preferable is a growth in capital.

I guess this comes back to our different idea of Bitcoin's value proposition but to put it shortly, my opinion is that more expensive transaction fees on the blockchain will hardly hinder the adoption of capital looking to buy a spot and park their money in the unforgeable ledger. That is because bitcoins are a unique collectible unlike anything the world has seen since gold. Unfortunately much like gold some characteristics limit its direct use as a mean of exchange. Gold's shortcoming is in its physicality, Bitcoin's own is the decentralization tradeoff.

This is just for the record as Peter is doing an admirable job of explaining things, I have highlighted the fundamental failings in your logic which makes you come to the wrong conclusion. I fully expect you to ignore this and hand-wave it away, but here goes...

1. Growth in capital is a reactive process, it is a market response to the growth of  the whole ecosystem. There have been altcoins with enormous early capital such as Auroracoin where $100 million in market cap rapidly went to zero, like morning mist in the sun. This was because the capital temporarily existed but there was no ecosystem to maintain it.

2. Bitcoins are not a "unique collectible" because while bitcoins are truly finite, cryptocurrency is infinite. Litecoin is just Bitcoin with a different name and a few minor software changes. Many new coins exist such as Monero and NXT and Etherium. ALL of these could do the job of Bitcoin if Bitcoin vanished. The only thing keeping Bitcoin at No.1 is a positive feedback loop: ecosystem usage (transactions) > utility value -> market price -> mining power -> PoW security of blockchain -> more public interest -> more ecosystem usage

The problem with the 1MB is that it will eventually cripple this all-important feedback loop.

Your Auroracoin false equivalence does not stand. An irrational pump & dump does not compare in any way to a collector's item.

I think Bitcoins are absolutely a unique collectible. I hate to "call up" authority but its own creator was well aware of that:

Quote
Maybe it could get an initial value circularly as you’ve suggested, by people foreseeing its potential usefulness for exchange. (I would definitely want some) Maybe collectors, any random reason could spark it. - Satoshi Nakamoto

Quote
It might make sense just to get some in case it catches on. If enough people think the same way, that becomes a self fulfilling prophecy. -Satoshi Nakamoto

Quote
Aug. 27, 2010: Bitcoins have no dividend or potential future dividend, therefore not like a stock. (They’re) more like a collectible or commodity. - Satoshi Nakamoto

This typical "cryptocurrency is infinite" reply seems shortsighted and IMO shows a misunderstanding of Bitcoin's origin of value.

If we were to derive the latter from "ecosystem usage" as you represent it : transactions on the blockchain, then we should argue Bitcoin is a pretty low value network since the velocity of transactions on the network is, frankly, very low. Have a quick look at the top 500 (you can even go up to 20,000) on the website here http://bitcoinrichlist.com/top500?page=40. A very short glance should make it very clear that most bitcoins rarely move on the blockchain.

In other words, very little people actually use Bitcoin for exchanges of goods and services traded on the blockchain. Therefore, I believe, the "utility" value in your feedback loop is incorrect. The actual primary use case of Bitcoin is for people to store wealth by exchanging fiat currency for bitcoins or, in the case of miners, work/energy. This is exactly how a collectible comes into existence. An organic process where an "unforgeably costly commodity repeatedly adds value by enabling beneficial wealth transfers." (1) Beneficial wealth transfer, in our case, is not transfer of goods or services, but an exit from the fiat system and/or also a speculative move.

By creating value that way, its market prices increases which encourages miners to put more energy into creating them therefore increasing their rarity by making them harder to forge. This increasing price and growing rarity attracts more collectors and on it goes. This is what the feedback loop looks like to me. The current rarity of Bitcoin may very well be a product of its first-mover advantage but it is pointless to dismiss it in the present. That is why Bitcoin should not be thrown into the same bucket as other crypto.

Don't get me wrong, Bitcoin will eventually evolve into a much larger network where an increasing amount of goods and services will be traded for them but I expect this "utility" to remain marginal until Bitcoin grows its collector base by a couple order of magnitudes and ultimately peak when Bitcoin is used as a unit of account.

For all these reason I believe my conclusion still stand and no, 1MB will not cripple this feedback loop.

(1) http://szabo.best.vwh.net/shell.html#Attributes of Collectibles

"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
cypherdoc (OP)
Legendary
*
Offline Offline

Activity: 1764
Merit: 1002



View Profile
August 13, 2015, 07:22:57 PM
Last edit: August 13, 2015, 07:38:42 PM by cypherdoc
 #30364

The network would dynamically determine the max block size as the network evolves by expressing the size of the blocks they will accept with the drop-down menu on their client.

So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.

I like the principle of the idea, however the idea of voting by non-miners is subject to Sybil attack - someone spinning up many fake nodes to skew the results. There have been posts on the mailing list about various voting schemes.

Going one better than simple user voting is proof-of-stake voting:

Proof-of-stake voting could be combined with miner voting (like BIP-100) to get a balance between mining power and investors/holders.
https://www.mail-archive.com/bitcoin-development@lists.sourceforge.net/msg02323.html
A drop down box, which would need supporting on the many wallet providers, then allows people to vote depending upon their coin balance. A non-vote is a "vote" for no change.

Except, the problem with proof-of-stake voting is summarized by Alan Reiner (Armory developer) in the responses to this proposal:

Quote
One major problem I see with this, no matter how well-thought-out it is,
it's unlikely that those with money will participate.  Those with the
most stake, likely have their private keys behind super-secure
accessibility barriers, and are not likely to go through the effort just
to sign votes.  Not only that, but it would require them to reveal their
public key, which while isn't technically so terrible, large amounts of
money intended to be kept in storage for 10+ years will prefer to avoid
any exposure at all, in the oft-chance that QCs come around a lot
earlier than we expected.  Sure, the actual risk should be pretty much
non-existent, but some of the most paranoid folks are probably the same
ones who have a lot of funds and want 100.00% of the security that is
possible.   They will see this as wildly inconvenient.

Alan is quite right about that.  forget POS voting.

i also disagree with you about the Sybil attacks by spinning up full nodes.  first off, it's not a trivial expense even today to run a full node, so there is a cost to trying to manipulate the system.  and you'd have to run them for "longish" periods of time before you'd get miners to react with bigger blocks.  and, as larger blocks come into being, those full node costs will rise making the attack even more expensive.  and really, for what point?  it's always been possible to spin up nodes but no one's bothered to do it.  as more users enter the system, honest merchants will start spinning up their full nodes in numbers that should make Sybil attacks very difficult.  finally, while running this attack, they are actually helping the network validate and relay tx's and decentralize it.  i'd think there are better ways to attack the network.
cypherdoc (OP)
Legendary
*
Offline Offline

Activity: 1764
Merit: 1002



View Profile
August 13, 2015, 07:29:29 PM
 #30365



what is the latest version of XT, is it still a test version?

0.11A is the testing version of the actual bigger block software, i think.

i'm running 0.10.2 which is the stable version but doesn't have the bigger blocks enabled.

both versions mine Core as usual, as you'd expect.
thanks,

I'm ruining XT too but an older stable version that still identifies as Satoshi client.

I'm waiting for 11.0 or the bigger block stable version.

it identifies as a Satoshi client but it is an XT version of it.  Bitnodes allows you to look at the details of your node to see if it's XT.
Peter R
Legendary
*
Offline Offline

Activity: 1162
Merit: 1007



View Profile
August 13, 2015, 07:52:14 PM
 #30366

that's not the pt.  

the full nodes should and will care only about what tx volume they have the capability or desire to validate.  for me, despite not getting paid, i will want to max out those validation capabilities to whatever i'm willing to pay (donate) to the network, which is much higher than it is being used today.  i want Bitcoin to grow in size and price so i'm going to donate to my hearts content trying to make that happen.   the Cripplecoiners will say that is no way to run a full node but i'll bet there are thousands of guys like me who are willing to do this for the prospect of taking the price to the next 10x level.  there is nothing wrong with wanting a higher price b/c a much higher price is essential to Bitcoin's overall success and ability to move large scale tx's as we've argued about before.  it has to be able to allow large tx's in the $millions to fund purchases of real estate, yachts, planes or bigger items so as not to perturb the overall exchange price, unlike what we have today.  

as more users come onboard, merchants will have the incentive, not to mention fiduciary responsibility to run full nodes.  that's a good thing.

but if someone designs a full node fee mkt, i won't be complaining either.

the optimal solution is looking more and more like 101 to me. I think the sybil attack is a good reason keep nodes from forking to the wrong block size.  

I'm not sure a sybil attack is the way to look at it, with this new proposal.  What is there to sybil attack?  Every node operator independently selects the max block size that he is willing to validate.  The effective block size limit is equal to the largest block that has successfully been included in the longest proof-of-work chain.  Any miner can attempt to increase the limit, simply by publishing a slightly larger block than has ever been published before.  They can assess the probability that their block will be mined upon (rather than orphaned) by any number of methods.  If they try to publish too large a block, then it will likely be orphaned.

As sickpig suggested, it would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer.

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
cypherdoc (OP)
Legendary
*
Offline Offline

Activity: 1764
Merit: 1002



View Profile
August 13, 2015, 07:55:21 PM
 #30367

that's not the pt.  

the full nodes should and will care only about what tx volume they have the capability or desire to validate.  for me, despite not getting paid, i will want to max out those validation capabilities to whatever i'm willing to pay (donate) to the network, which is much higher than it is being used today.  i want Bitcoin to grow in size and price so i'm going to donate to my hearts content trying to make that happen.   the Cripplecoiners will say that is no way to run a full node but i'll bet there are thousands of guys like me who are willing to do this for the prospect of taking the price to the next 10x level.  there is nothing wrong with wanting a higher price b/c a much higher price is essential to Bitcoin's overall success and ability to move large scale tx's as we've argued about before.  it has to be able to allow large tx's in the $millions to fund purchases of real estate, yachts, planes or bigger items so as not to perturb the overall exchange price, unlike what we have today.  

as more users come onboard, merchants will have the incentive, not to mention fiduciary responsibility to run full nodes.  that's a good thing.

but if someone designs a full node fee mkt, i won't be complaining either.

the optimal solution is looking more and more like 101 to me. I think the sybil attack is a good reason keep nodes from forking to the wrong block size.  

I'm not sure a sybil attack is the way to look at it, with this new proposal.  What is there to sybil attack?  Every node operator independently selects the max block size that he is willing to validate.  The effective block size limit is equal to the largest block that has successfully been included in the longest proof-of-work chain.  Any miner can attempt to increase the limit, simply by publishing a slightly larger block than has ever been published before.  They can assess the probability that their block will be mined upon (rather than orphaned) by any number of methods.  If they try to publish too large a block, then it will likely be orphaned.

As sickpig suggested, it would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer.

i never did quite get this part.  can you explain?
Erdogan
Legendary
*
Offline Offline

Activity: 1512
Merit: 1005



View Profile
August 13, 2015, 08:00:11 PM
 #30368


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB is size.  We all witness that indeed that block got included into the longest proof of work chain, and then suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

That is exactly what I think. The miners will have to try it out or get some feel of what they can do through other channels (social media, conferences, node versions), including associate with other miners. As long as the association is voluntary, it will not form a monopoly.


yes, this has been considered and discussed before.  The danger is that a large block miner cartel might develop naturally whose blocks put small-bandwidth players at a disadvantage.  But as others have mentioned, some people are at an electricity cost disadvantage, some bandwidth, some something else... basically it would just be another metric to take into account as you site your miners.

So I would be 100% for this if miners could only work with real txns.  But a miner could fill up a huge block with a bunch of "fake" (unrelayed, fee pays to himself) txns to artificially drive up network costs.  Its too bad Bitcoin doesn't have the "pay portion of fees to miner pool, receive portion for the next N blocks" feature... that idea closes a lot of miner loopholes.  

But regardless I'm not sure if this "loophole" really is one; it does require 51% of the network to be as connected as you are and willing to process your monster garbage block.  I have a hard time believing that miners would do so since over the long term they need bitcoin to succeed.  More likely (as you guys suggest) they'll just configure their nodes to ignore monster blocks unless > N deep in the chain.


First I am not afraid of cartels in the free market (laws against market actor collusion is rather a statist notion of fear of markets, in reality it is the regulator who is the monopolist intruding with undue force and destroying the market), and second, I don't think a miner will act to waste bandwith, if that were the case, we would already have seen it, since unrestrained block size has been the situation for the first 6 years.

Erdogan
Legendary
*
Offline Offline

Activity: 1512
Merit: 1005



View Profile
August 13, 2015, 08:11:46 PM
 #30369

The network would dynamically determine the max block size as the network evolves by expressing the size of the blocks they will accept with the drop-down menu on their client.

So…is this a good idea?  If there are no obvious "gotchas" then perhaps we should write up a BIP.

I like the principle of the idea, however the idea of voting by non-miners is subject to Sybil attack - someone spinning up many fake nodes to skew the results. There have been posts on the mailing list about various voting schemes.

Going one better than simple user voting is proof-of-stake voting:

Proof-of-stake voting could be combined with miner voting (like BIP-100) to get a balance between mining power and investors/holders.
https://www.mail-archive.com/bitcoin-development@lists.sourceforge.net/msg02323.html
A drop down box, which would need supporting on the many wallet providers, then allows people to vote depending upon their coin balance. A non-vote is a "vote" for no change.

Except, the problem with proof-of-stake voting is summarized by Alan Reiner (Armory developer) in the responses to this proposal:

Quote
One major problem I see with this, no matter how well-thought-out it is,
it's unlikely that those with money will participate.  Those with the
most stake, likely have their private keys behind super-secure
accessibility barriers, and are not likely to go through the effort just
to sign votes.  Not only that, but it would require them to reveal their
public key, which while isn't technically so terrible, large amounts of
money intended to be kept in storage for 10+ years will prefer to avoid
any exposure at all, in the oft-chance that QCs come around a lot
earlier than we expected.  Sure, the actual risk should be pretty much
non-existent, but some of the most paranoid folks are probably the same
ones who have a lot of funds and want 100.00% of the security that is
possible.   They will see this as wildly inconvenient.

Alan is quite right about that.  forget POS voting.

i also disagree with you about the Sybil attacks by spinning up full nodes.  first off, it's not a trivial expense even today to run a full node, so there is a cost to trying to manipulate the system.  and you'd have to run them for "longish" periods of time before you'd get miners to react with bigger blocks.  and, as larger blocks come into being, those full node costs will rise making the attack even more expensive.  and really, for what point?  it's always been possible to spin up nodes but no one's bothered to do it.  as more users enter the system, honest merchants will start spinning up their full nodes in numbers that should make Sybil attacks very difficult.  finally, while running this attack, they are actually helping the network validate and relay tx's and decentralize it.  i'd think there are better ways to attack the network.

Voting with the node number is not a democracy game or even a stake game. Take the example of a change that I never would support, adjusting the total coins. Even if I was outnumbered 100 to 1, I would not accept that change in my node, and the chain would fork. So churning up lots of nodes does not give someone unrestrained power. I don't know how to name such a method of decision making, but it is perfect for the problem at hand. "Fork restrained majority decision" may be.

Peter R
Legendary
*
Offline Offline

Activity: 1162
Merit: 1007



View Profile
August 13, 2015, 08:13:46 PM
 #30370

As sickpig suggested, it would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer.

i never did quite get this part.  can you explain?

Sure.  

Why do we have a consensus layer in the first place?  It is a way for us to agree on what transactions are valid and what transactions are invalid.  For example, we all agree that Alice shouldn't be able to move Bob's coins without a valid signature, and that Bob shouldn't be able to create coins out of thin air.  The consensus layer is about obvious stuff like that.  In order for Bitcoin to function as sound money, we need to agree on "black-or-white" rules like this that define which transactions are valid and which are invalid.

Notice that the paragraph above discusses valid and invalid transactions.  No where did I say anything about blocks.  That's because we only really care about transactions in the first place!  In fact, how can a block be invalid just because it includes one too many valid transactions?  

Satoshi added the 1 MB limit as an anti-spam measure to deal with certain limitations of Bitcoin's transport layer--not as a new rule for what constitutes a valid transaction.  We should thus think of every block that is exclusively composed of valid transactions as itself valid.  The size of the block alone should not make it invalid.  Instead, if a block is too big, think of it as likely to be orphaned (a "gray" rule) rather than as invalid (a black-or-white rule).  Perhaps above a certain block size, we're even 100% sure that a block will be orphaned; still we should view it as a valid block!  It will be orphaned because the transport layer was insufficient to transport it across the network--not because there was anything invalid about it.

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
cypherdoc (OP)
Legendary
*
Offline Offline

Activity: 1764
Merit: 1002



View Profile
August 13, 2015, 08:27:47 PM
 #30371

As sickpig suggested, it would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer.

i never did quite get this part.  can you explain?

Sure.  

Why do we have a consensus layer in the first place?  It is a way for us to agree on what transactions are valid and what transactions are invalid.  For example, we all agree that Alice shouldn't be able to move Bob's coins without a valid signature, and that Bob shouldn't be able to create coins out of thin air.  The consensus layer is about obvious stuff like that.  In order for Bitcoin to function as sound money, we need to agree on "black-or-white" rules like this that define which transactions are valid and which are invalid.

Notice that the paragraph above discusses valid and invalid transactions.  No where did I say anything about blocks.  That's because we only really care about transactions in the first place!  In fact, how can a block be invalid just because it includes one too many valid transactions?  

Satoshi added the 1 MB limit as an anti-spam measure to deal with certain limitations of Bitcoin's transport layer--not as a new rule for what constitutes a valid transaction.  We should thus think of every block that is exclusively composed of valid transactions as itself valid.  The size of the block alone should not make it invalid.  Instead, if a block is too big, think of it as likely to be orphaned (a "gray" rule) rather than as invalid (a black-or-white rule).  Perhaps above a certain block size, we're even 100% sure that a block will be orphaned; still we should view it as a valid block!  It will be orphaned because the transport layer was insufficient to transport it across the network--not because there was anything invalid about it.

it was the term "layer" that made me question.  when i hear "transport layer" i start thinking technical like TCP/IP, SSL/TLS, http, https, etc.  

so nothing technical, just layered "concepts".
Peter R
Legendary
*
Offline Offline

Activity: 1162
Merit: 1007



View Profile
August 13, 2015, 08:30:03 PM
 #30372

As sickpig suggested, it would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer.

i never did quite get this part.  can you explain?

Sure.  

Why do we have a consensus layer in the first place?  It is a way for us to agree on what transactions are valid and what transactions are invalid.  For example, we all agree that Alice shouldn't be able to move Bob's coins without a valid signature, and that Bob shouldn't be able to create coins out of thin air.  The consensus layer is about obvious stuff like that.  In order for Bitcoin to function as sound money, we need to agree on "black-or-white" rules like this that define which transactions are valid and which are invalid.

Notice that the paragraph above discusses valid and invalid transactions.  No where did I say anything about blocks.  That's because we only really care about transactions in the first place!  In fact, how can a block be invalid just because it includes one too many valid transactions?  

Satoshi added the 1 MB limit as an anti-spam measure to deal with certain limitations of Bitcoin's transport layer--not as a new rule for what constitutes a valid transaction.  We should thus think of every block that is exclusively composed of valid transactions as itself valid.  The size of the block alone should not make it invalid.  Instead, if a block is too big, think of it as likely to be orphaned (a "gray" rule) rather than as invalid (a black-or-white rule).  Perhaps above a certain block size, we're even 100% sure that a block will be orphaned; still we should view it as a valid block!  It will be orphaned because the transport layer was insufficient to transport it across the network--not because there was anything invalid about it.

it was the term "layer" that made me question.  when i hear "transport layer" i start thinking technical like TCP/IP, SSL/TLS, http, https, etc.  

so nothing technical, just layered "concepts".

Yes, that was confusing in hindsight.  We're just proposing a new way to think about it. 

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
NewLiberty
Legendary
*
Offline Offline

Activity: 1204
Merit: 1002


Gresham's Lawyer


View Profile WWW
August 13, 2015, 08:36:02 PM
 #30373


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB is size.  We all witness that indeed that block got included into the longest proof of work chain, and then suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

You know that you can do this now, right?  And always could.

The code is open source, you can (of course) just change it and compile.

FREE MONEY1 Bitcoin for Silver and Gold NewLibertyDollar.com and now BITCOIN SPECIE (silver 1 ozt) shows value by QR
Bulk premiums as low as .0012 BTC "BETTER, MORE COLLECTIBLE, AND CHEAPER THAN SILVER EAGLES" 1Free of Government
cypherdoc (OP)
Legendary
*
Offline Offline

Activity: 1764
Merit: 1002



View Profile
August 13, 2015, 08:40:49 PM
 #30374

here's further evidence the Reddit mods are steering the blocksize debate. they're letting this guy spam attack me with false allegations despite me reporting him.  same post about a dozen times:

https://www.reddit.com/r/Bitcoin/comments/3gutfp/if_youre_not_running_a_node_youre_not_really/cu1x6fl
Peter R
Legendary
*
Offline Offline

Activity: 1162
Merit: 1007



View Profile
August 13, 2015, 08:41:57 PM
 #30375


At this point I'd say just find a way to put the forks on the market and let's arbitrage it out. I will submit if a fork cannot gain the market cap advantage, and I suspect the small-blockers will likewise if Core loses it. Money talks.

I had a strange idea recently: what if we don't even bother with BIP100, BIP101, etc., or trying to come to "consensus" in some formal way.  What if, instead, we just make it very easy for node operators to adjust their block size limit.  Imagine a drop down menu where you can select "1 MB, 2 MB, 4 MB, 8 MB, … ."  What would happen?  

Personally, I'd just select some big block size limit, like 32 MB.  This way, I'd be guaranteed to follow the longest proof of work chain, regardless of what the effective block size limit becomes.  I'd expect many people to do the same thing.  Eventually, it becomes obvious that the economic majority is supporting a larger limit, and a brave miner publishes a block that is 1.1 MB is size.  We all witness that indeed that block got included into the longest proof of work chain, and then suddenly all miners are confident producing 1.1 MB blocks.  Thus, the effective block size limit slowly creeps upwards, as this process is repeated over and over as demand for block space grows.

TL/DR: maybe we don't need a strict definition for the max block size limit.

You know that you can do this now, right?  And always could.

The code is open source, you can (of course) just change it and compile.

I know that I could, but I also know that I won't, which is sort of another way of saying that I can't!

However, if everyone knows that everyone else could change the limit with just a couple key strokes, then the dynamics of the situation will be very different!  I know that in that case I both could and would change my block size limit.  Better yet: if, as awemany suggested, the software comes with no default block size limit, and the node operator has to pick something, then things will get very interesting.  

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
notme
Legendary
*
Offline Offline

Activity: 1904
Merit: 1002


View Profile
August 13, 2015, 09:33:07 PM
 #30376

What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I don't think it would be a problem.  Like Erdogan said, the miners will use the "tip-toe" method of increasing the block size.  Worst case, a large block gets orphaned and nobody tries again for a while.  But if the larger block doesn't get orphaned, then the network will assume that that size is now supported (thereby setting a new effective upper limit).

IMO, if we want to put the power directly in miners hands it would be better to raise the limit entirely.

This doesn't put the power directly in the miners' hands.  It keeps the power where it already is: in everybody's hands!  It just makes it much easier for people to exercise the power they already possess.  

Quote
However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I disagree.  For example, I would not set my node's limit to anything greater than 32 MB until I understood the 33.5 MB message size limitation better.  I expect many people would do the same thing.  Rational miners won't dare to randomly publish a 100 MB block, because they'd be worried that it would be orphaned.

Furthermore, since miners would likely use the "tip-toe" method, the effective block size limit will grow only in very small increments, helping to reveal any potential limitations before they become problems.



Okay... I'm going to have to agree with that.

But what if, when everyone is voting, an attacker sees that 50% is advertising < X MB and 50% is advertising > X MB with X obviously been larger than any block seen before.  By publishing a block of size X + 1 byte, the attacker effectively splits the network in half.  If he was previously 25+% of the network, he is now 50% of the two new forks and can either cause further bifurcation or anything else you can do with half the network.  Even if the attacker only has small hashpower, it will still cause a hash war between the two forks.

I suppose this could be mitigated if most nodes refuse to build on a block that is larger than what a supermajority votes for.

https://www.bitcoin.org/bitcoin.pdf
While no idea is perfect, some ideas are useful.
cypherdoc (OP)
Legendary
*
Offline Offline

Activity: 1764
Merit: 1002



View Profile
August 13, 2015, 09:41:21 PM
 #30377

What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I don't think it would be a problem.  Like Erdogan said, the miners will use the "tip-toe" method of increasing the block size.  Worst case, a large block gets orphaned and nobody tries again for a while.  But if the larger block doesn't get orphaned, then the network will assume that that size is now supported (thereby setting a new effective upper limit).

IMO, if we want to put the power directly in miners hands it would be better to raise the limit entirely.

This doesn't put the power directly in the miners' hands.  It keeps the power where it already is: in everybody's hands!  It just makes it much easier for people to exercise the power they already possess.  

Quote
However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I disagree.  For example, I would not set my node's limit to anything greater than 32 MB until I understood the 33.5 MB message size limitation better.  I expect many people would do the same thing.  Rational miners won't dare to randomly publish a 100 MB block, because they'd be worried that it would be orphaned.

Furthermore, since miners would likely use the "tip-toe" method, the effective block size limit will grow only in very small increments, helping to reveal any potential limitations before they become problems.



Okay... I'm going to have to agree with that.

But what if, when everyone is voting, an attacker sees that 50% is advertising < X MB and 50% is advertising > X MB with X obviously been larger than any block seen before.  By publishing a block of size X + 1 byte, the attacker effectively splits the network in half.  If he was previously 25+% of the network, he is now 50% of the two new forks and can either cause further bifurcation or anything else you can do with half the network.  Even if the attacker only has small hashpower, it will still cause a hash war between the two forks.

I suppose this could be mitigated if most nodes refuse to build on a block that is larger than what a supermajority votes for.

awemany might have answered your question here:

https://www.reddit.com/r/Bitcoin/comments/3eaxyk/idea_on_bitcoin_mailing_list_blocksize_freely/cu1tzuh

IOW, as a full node operator wanting to stay on the longest chain at all times, set your maximum block size high enough so as  not to be exceeded.
notme
Legendary
*
Offline Offline

Activity: 1904
Merit: 1002


View Profile
August 13, 2015, 09:55:00 PM
 #30378

What about lying? If enough miners claim to support larger blocks but actually don't, then part of the network will waste time producing blocks that won't be built on.  IMO, if we want to put the power directly in miners hands it would be better to raise the limit entirely.  However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I don't think it would be a problem.  Like Erdogan said, the miners will use the "tip-toe" method of increasing the block size.  Worst case, a large block gets orphaned and nobody tries again for a while.  But if the larger block doesn't get orphaned, then the network will assume that that size is now supported (thereby setting a new effective upper limit).

IMO, if we want to put the power directly in miners hands it would be better to raise the limit entirely.

This doesn't put the power directly in the miners' hands.  It keeps the power where it already is: in everybody's hands!  It just makes it much easier for people to exercise the power they already possess.  

Quote
However, to do so we would need to test the crap out of everything to be reasonably sure that there aren't bugs that are only uncovered by larger blocks like what happened when the soft limit was raised to 1MB.

I disagree.  For example, I would not set my node's limit to anything greater than 32 MB until I understood the 33.5 MB message size limitation better.  I expect many people would do the same thing.  Rational miners won't dare to randomly publish a 100 MB block, because they'd be worried that it would be orphaned.

Furthermore, since miners would likely use the "tip-toe" method, the effective block size limit will grow only in very small increments, helping to reveal any potential limitations before they become problems.



Okay... I'm going to have to agree with that.

But what if, when everyone is voting, an attacker sees that 50% is advertising < X MB and 50% is advertising > X MB with X obviously been larger than any block seen before.  By publishing a block of size X + 1 byte, the attacker effectively splits the network in half.  If he was previously 25+% of the network, he is now 50% of the two new forks and can either cause further bifurcation or anything else you can do with half the network.  Even if the attacker only has small hashpower, it will still cause a hash war between the two forks.

I suppose this could be mitigated if most nodes refuse to build on a block that is larger than what a supermajority votes for.

awemany might have answered your question here:

https://www.reddit.com/r/Bitcoin/comments/3eaxyk/idea_on_bitcoin_mailing_list_blocksize_freely/cu1tzuh

IOW, as a full node operator wanting to stay on the longest chain at all times, set your maximum block size high enough so as  not to be exceeded.

Ensuring you are part of the supermajority by looking at votes should give you a safe least upper bound.  Ensuring you have the largest size possible is a bit excessive and will raise your costs too much compared to someone who only upgrades when they are coming close to falling out of the supermajority (however they define that).

If miners don't take the supermajority into account, it could be a risk to the network.  Including a risk to the nodes with the highest limits as they suddenly become part of a chain with half as much hash power.

https://www.bitcoin.org/bitcoin.pdf
While no idea is perfect, some ideas are useful.
iCEBREAKER
Legendary
*
Offline Offline

Activity: 2156
Merit: 1070


Crypto is the separation of Power and State.


View Profile WWW
August 13, 2015, 10:18:32 PM
 #30379

here's further evidence the Reddit mods are steering the blocksize debate. they're letting this guy spam attack me with false allegations despite me reporting him.  same post about a dozen times:

https://www.reddit.com/r/Bitcoin/comments/3gutfp/if_youre_not_running_a_node_youre_not_really/cu1x6fl


Yesterday you were complaining about mod "censorship" and today you are demanding the same mods censor posts you don't like.


██████████
█████████████████
██████████████████████
█████████████████████████
████████████████████████████
████
████████████████████████
█████
███████████████████████████
█████
███████████████████████████
██████
████████████████████████████
██████
████████████████████████████
██████
████████████████████████████
██████
███████████████████████████
██████
██████████████████████████
█████
███████████████████████████
█████████████
██████████████
████████████████████████████
█████████████████████████
██████████████████████
█████████████████
██████████

Monero
"The difference between bad and well-developed digital cash will determine
whether we have a dictatorship or a real democracy." 
David Chaum 1996
"Fungibility provides privacy as a side effect."  Adam Back 2014
Buy and sell XMR near you
P2P Exchange Network
Buy XMR with fiat
Is Dash a scam?
Erdogan
Legendary
*
Offline Offline

Activity: 1512
Merit: 1005



View Profile
August 14, 2015, 02:45:07 AM
 #30380

I don't know if this is a problem for anyone, but anyway...

A fork in the software, as in bitcoin XT, is much welcome, as is alternative implementations of the basic libraries that are used. The more diverse the software, the more antifragile is the system.

A fork of the blockchain is different. Obviously, we do not want a fork to parallell the original for ever, that would create confusion for users and stir up some turbulence. On the other hand, we want forking the chain to be possible, and in fact it is necessary for the basic function that there is no single point deciding what is the right branch, it is based partly on randomness. It is also necessary to create and maintain the consensus on what bitcoin is. To the dangers: If the rules of one branch is distinctly different, we will have two coins. That would be almost the same as creating an altcoin. Note that the definition of altcoin is based only on the fact that bitcoin was first, otherwise there is no fundamental difference and you could also call bitcoin itself an altcoin, as in bitcoin comprises all coins based on blockchains. So the branch can live its own life.

When the branch is almost, but not quite, the same as the original, then what? The answer is that the two branches can not go on for long, one is bound to win. It is like turning a pendulum upside down, the situation is not stable. The reason is that liquidity is essential to money, money that has lower liquidity compared to a money type with essentially the same properties but higher liquidity, will lose. And the level of liquidity is decided by the number of users. Therefore, a blocksize based fork, if there is a chain fork at all, will be short lived. Meaning hours.

Pages: « 1 ... 1469 1470 1471 1472 1473 1474 1475 1476 1477 1478 1479 1480 1481 1482 1483 1484 1485 1486 1487 1488 1489 1490 1491 1492 1493 1494 1495 1496 1497 1498 1499 1500 1501 1502 1503 1504 1505 1506 1507 1508 1509 1510 1511 1512 1513 1514 1515 1516 1517 1518 [1519] 1520 1521 1522 1523 1524 1525 1526 1527 1528 1529 1530 1531 1532 1533 1534 1535 1536 1537 1538 1539 1540 1541 1542 1543 1544 1545 1546 1547 1548 1549 1550 1551 1552 1553 1554 1555 1556 1557 »
  Print  
 
Jump to:  

Powered by MySQL Powered by PHP Powered by SMF 1.1.19 | SMF © 2006-2009, Simple Machines Valid XHTML 1.0! Valid CSS!