cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 03:02:33 PM |
|
we all want things to launch right? get your space suits on:
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 03:15:46 PM |
|
lol so now gavin is only considering mike's opinion. plus calling for lobbying ( ) from merchants and centralized businesses... my my... USG get out of this body!

He has to lobby merchants and exchanges to accept the changes because of the decentralized nature of Bitcoin.

this is what i want to see, the will of the majority crushing the minority. this IS what Bitcoin is all about. Do it, do it, come on, do it!

As always, the majority is often wrong and misguided, because masses are far more easily manipulated than individuals. So your majority can go screw themselves, as they are used to. I'm not following the sheeple. Seriously, bitcoin is NOT about lobbying, that filthy practice which is far from democratic and in direct opposition to a natural consensus. Bitcoin is about freedom. It certainly doesn't need lobbyists a la TBF to take over the world, nor the Wall Street scammers. Gavin seems pretty desperate here. Good. edit: funny, the irony of such a nice bitcoiner like you considering bypassing the decentralized consensus of Bitcoin with some nasty self-centered lobbyists.

It's not about lobbying or desperation. This is why bitcoin works: because if we can't come to consensus here, the ultimate authority for determining consensus is what code the majority of merchants and exchanges and miners are running.

this is pure fud. what then? bitcoin is doomed? bitcoin back to 0? please.

what he is saying is that the economic majority will determine Bitcoin's future. it always has. this isn't some pigeonholed geek experiment confined to technical considerations only. this has always been primarily an economic project enforced thru technical means. where would all these geeks be w/o all the fiat money poured into this project since the beginning? nowhere. the geeks, esp gmax and LukeJr, have tried to make this a tail-wagging-the-dog project. instead, Gavin understands that ultimately the dog needs to wag the tail.

ultimately, gavin is USG's dog here. seriously, I am baffled that smart people like you here fail at grasping the situation regarding the US indoctrination of the masses and manipulation of literally everything they could get leverage with. let it not be with Bitcoin. because decentralized consensus. that's what i signed up for. so ultimately, people will fight to get a (rare) piece of that secure and robust technology when the rest of the financial world is on the edge. so let it be decentralized, man. no need for friggin leaders. we already have enough of them.

we're on the same team. we all want more decentralization. we're simply arguing about the best way to achieve that. multiple ppl here have argued how increasing the block size will facilitate that. i happen to think those arguments are most sound. i'm not sure why you say Gavin is a USG lap dog. over Bitcoin's entire history since his involvement, he has been shown to provide sound leadership. my poll and the other github poll show this. the general tenor of comments shows this. yes, there are haters, but on balance he is the right one to have been chosen as lead.
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 03:16:57 PM |
|
Gold preparing to take the next dive.
|
|
|
|
Adrian-x
Legendary
Offline
Activity: 1372
Merit: 1000
|
|
May 29, 2015, 03:20:54 PM Last edit: May 29, 2015, 06:44:58 PM by Adrian-x |
|
The reason we have to worry about miners producing "too large" blocks is because they don't pay for all the P2P network resources they use (neither do end users).
All the arguments we have about resource consumption are derived from that primary design flaw.
If we fix it, then we won't have to argue any more.
Well put. As long as we had a 1:1 ratio between full nodes and miners, the block reward did pay for all the resources involved in the process. Once that ratio started to decrease, due to the introduction of mining pools, mining and the full-node role became more and more decoupled. The block reward remains on the miner side, though.

* edited to read more clearly.

I agree with the notion that miners are for the most part unaffected by block size and are empowered not to care. This is also why I dismiss those developer arguments that want to solve the block size problem by manipulating mining fees, or some variant of this idea. The incentive to reduce block size is not in the TX fee - that's paying miners, not nodes. If anything, you want the incentive to be supply and demand based on node size.

Ironically, it is only the competition for fees between miners that will force blocks to be written at marginal cost and force the block size to the smallest size capable of sustaining a profit; this could be neatly modeled as a Nash equilibrium (a toy sketch of this point follows below). As the block reward diminishes, that equilibrium takes hold and miners become marginalized, with little to no power in the system.

In 2012 I participated in discussing the idea of financially incentivizing nodes in a market-driven way to regulate miners and block size. But after pondering the idea over time, it seemed it was not necessary. People invested in the idea that money is memory store that memory on the blockchain. They have a lot of "memory", i.e. a vested interest in the blockchain, and will want to preserve it; some call it altruistic, but I prefer to think they will use greed for the greater good.

The conclusion I draw is that as long as wealth is distributed and people are in competition with one another, the blockchain will remain distributed. By the nature of Bitcoin's design, the wealth that is still to move into Bitcoin cannot be transferred in without redistributing it to the participants who grow the network.
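To make the fee-competition point concrete, here is a toy sketch (made-up numbers and hypothetical structures, not actual miner software): under competition, a miner only includes transactions whose fee covers the marginal cost of carrying them, which is what pushes blocks toward the smallest profitable size.

Code:
# Toy sketch of fee competition at the margin (hypothetical numbers).
# A miner keeps adding transactions only while the fee covers the marginal
# cost of including them; low-fee transactions get left out.

def select_transactions(mempool, marginal_cost_per_byte):
    """Greedily include transactions whose fee exceeds their inclusion cost."""
    # Sort by fee density (fee per byte), highest first.
    by_density = sorted(mempool, key=lambda tx: tx["fee"] / tx["size"], reverse=True)
    block, profit = [], 0.0
    for tx in by_density:
        cost = tx["size"] * marginal_cost_per_byte
        if tx["fee"] > cost:          # only profitable transactions make it in
            block.append(tx)
            profit += tx["fee"] - cost
    return block, profit

# Example: three hypothetical transactions; the cost per byte is made up.
mempool = [
    {"txid": "a", "size": 250, "fee": 5.0},
    {"txid": "b", "size": 500, "fee": 3.0},
    {"txid": "c", "size": 250, "fee": 0.5},
]
block, profit = select_transactions(mempool, marginal_cost_per_byte=0.005)
print([tx["txid"] for tx in block], profit)   # low-fee tx "c" is left out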
|
Thank me in Bits 12MwnzxtprG2mHm3rKdgi7NmJKCypsMMQw
|
|
|
|
justusranvier
Legendary
Offline
Activity: 1400
Merit: 1013
|
|
May 29, 2015, 03:27:12 PM |
|
There's a peculiar kind of incoherence about people who can argue both for decentralization and also argue that users of the system cannot be relied upon to decide their own best interests.
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 03:32:34 PM |
|
There's a peculiar kind of incoherence about people who can argue both for decentralization and also argue that users of the system cannot be relied upon to decide their own best interests.
especially profound given how many here argue that individuals are "stupid" and irrational. this is a particular form of pessimism and hubris. given the proper incentives, individuals can be counted on to act quite rationally, working not only in their individual best interest but in that of the group.
|
|
|
|
justusranvier
Legendary
Offline
Activity: 1400
Merit: 1013
|
|
May 29, 2015, 03:34:00 PM |
|
It's not possible to build a currency on misanthropy.

http://nakamotoinstitute.org/reciprocal-altruism-in-the-theory-of-money/

Reciprocal altruism is a great first start as a theory of money because it so neatly undercuts a lot of the most common fallacies. First, what gives money value? An adherent of commodity money might say that it is the industrial uses of the money good, whereas an adherent of fiat money might say that it is the force of the government issuing it, and the loyalty people have toward their government. Neither of these answers is true. It is true that some system is required to keep track of who has money and who does not, but that is not what makes money valuable. The value of money is the value of cooperation. It is that simple. The value of money is not somehow in the monetary unit; it is in the whole of society and in people's desire to cooperate. If you want your money to be valuable you need the people who produce the products and services you want to consume to use that money. There is no other way to imbue currency with value.

If bringing sound money into existence requires a mass education project to overcome many generations of propaganda-induced fallacies, then that's what it's going to take. There is no shortcut.
|
|
|
|
_mr_e
Legendary
Offline
Activity: 817
Merit: 1000
|
|
May 29, 2015, 03:35:47 PM |
|
Funny that the new bitcoin fork will be called bitcoi nxt
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 03:48:48 PM |
|
It's not possible to build a currency on misanthropy.

http://nakamotoinstitute.org/reciprocal-altruism-in-the-theory-of-money/

Reciprocal altruism is a great first start as a theory of money because it so neatly undercuts a lot of the most common fallacies. First, what gives money value? An adherent of commodity money might say that it is the industrial uses of the money good, whereas an adherent of fiat money might say that it is the force of the government issuing it, and the loyalty people have toward their government. Neither of these answers is true. It is true that some system is required to keep track of who has money and who does not, but that is not what makes money valuable. The value of money is the value of cooperation. It is that simple. The value of money is not somehow in the monetary unit; it is in the whole of society and in people's desire to cooperate. If you want your money to be valuable you need the people who produce the products and services you want to consume to use that money. There is no other way to imbue currency with value.

If bringing sound money into existence requires a mass education project to overcome many generations of propaganda-induced fallacies, then that's what it's going to take. There is no shortcut.

the Blockstream devs have said they would like to see & study what happens upon the repeated filling of blocks. they'd like to study what happens to frustrated users, hoodwinked merchants, exchange price volatility, confused full nodes, etc. to what end? to satisfy their own curiosity? and only then to raise the limit, as they've admitted they would? what a bunch of misplaced pseudo-academia. they're like a wife who begs her husband to beat her just so she can experience what it is like. so he beats her repeatedly. she finally decides she doesn't like it. but it's too late; he's already lost all respect for her and leaves for another woman. that's what will happen to Bitcoin if users and merchants have bad experiences at this early stage of the game. they'll just leave and may not come back for 100 yrs.
|
|
|
|
Zangelbert Bingledack
Legendary
Offline
Activity: 1036
Merit: 1000
|
|
May 29, 2015, 03:56:46 PM |
|
Perhaps some of the coders here can help me understand something.
Why not have a new "mempool" be created every 10 minutes, so that if it takes 30 minutes to find a block, the winning miner just takes all the valid transactions in the first mempool, no matter how huge the total "blocksize" would be, and puts only the hash of those transactions into the block? That way the block itself would be tiny, so propagation wouldn't be an issue. All miners and other full nodes would already have the first mempool's transactions(?), those being set in stone, so they would just have to check that the hash matches the set of all valid tx in the first mempool. Then the next winning miner would take all the valid tx in the second mempool, etc.
Of course, if a miner finds the next block in less than 10 minutes and there is no mempool queued up yet, this doesn't work. Perhaps difficulty would have to be adjusted to ensure miners were usually a little bit behind the curve.
This seems to shift the burden from bandwidth to CPU power for checking the hash, but as long as miners are behind the curve it seems to avoid the "race" where lower-bandwidth miners/nodes are at a disadvantage.
Does this, or anything like it, make any sense?
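Here's a rough sketch of what I'm picturing, just to make the question concrete (hypothetical structures and names, not actual Bitcoin code): the block commits only to the hash of an agreed mempool snapshot, and a node validates by recomputing that hash from its own copy of the snapshot.

Code:
import hashlib, json

# Rough sketch of the idea above (hypothetical structures, not Bitcoin code):
# every 10 minutes nodes freeze a "snapshot" of the mempool; the winning
# miner's block commits only to the hash of that snapshot instead of
# carrying the transactions themselves.

def snapshot_hash(transactions):
    """Hash an ordered list of transactions (canonical JSON, for illustration)."""
    payload = json.dumps(transactions, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_block(prev_block_hash, snapshot):
    # The block is tiny: it carries a commitment, not the transactions.
    return {"prev": prev_block_hash, "snapshot_hash": snapshot_hash(snapshot)}

def validate_block(block, local_snapshot):
    # A node can only accept the block if its own copy of the snapshot hashes
    # to the same value -- which is exactly the assumption I'm unsure about:
    # every node would need an identical copy of the snapshot.
    return block["snapshot_hash"] == snapshot_hash(local_snapshot)

snapshot = [{"txid": "aa", "fee": 1}, {"txid": "bb", "fee": 2}]
blk = build_block("00" * 32, snapshot)
print(validate_block(blk, snapshot))   # True only if snapshots match exactly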
|
|
|
|
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
|
|
May 29, 2015, 04:02:45 PM |
|
Perhaps some of the coders here can help me understand something.
Why not have a new "mempool" be created every 10 minutes, so that if it takes 30 minutes to find a block, the winning miner just takes all the valid transactions in the first mempool, no matter how huge the total "blocksize" would be, and puts only the hash of those transactions into the block? That way the block itself would be tiny, so propagation wouldn't be an issue. All miners and other full nodes would already have the mempool transactions(?), so they would just have to check that the hash matches the set of all valid tx in the first mempool. Then the next winning miner would take all the valid tx in the second mempool, etc.
Of course, if a miner finds the next block in less than 10 minutes and there is no mempool queued up yet, this doesn't work. Perhaps difficulty would have to be adjusted to ensure miners were usually a little bit behind the curve.
Does this, or anything like it, make any sense?
i think that's more or less how it currently works when there's a backlog of unconfirmed TX. this is fine for now, but at some point, if there are a lot of TX, the mempool will just grow and grow, and TX will confirm slower and slower.
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 04:04:16 PM |
|
such a simple but elegant point here from Reddit:
[–]painlord2k 5 points 48 minutes ago
Use your coins as usual. More uses, more transactions; the smaller blockchain will not be able to manage the transactions, and people will be forced to migrate to the larger one.
|
|
|
|
Zangelbert Bingledack
Legendary
Offline
Activity: 1036
Merit: 1000
|
|
May 29, 2015, 04:06:09 PM |
|
Perhaps some of the coders here can help me understand something.
Why not have a new "mempool" be created every 10 minutes, so that if it takes 30 minutes to find a block, the winning miner just takes all the valid transactions in the first mempool, no matter how huge the total "blocksize" would be, and puts only the hash of those transactions into the block? That way the block itself would be tiny, so propagation wouldn't be an issue. All miners and other full nodes would already have the mempool transactions(?), so they would just have to check that the hash matches the set of all valid tx in the first mempool. Then the next winning miner would take all the valid tx in the second mempool, etc.
Of course, if a miner finds the next block in less than 10 minutes and there is no mempool queued up yet, this doesn't work. Perhaps difficulty would have to be adjusted to ensure miners were usually a little bit behind the curve.
Does this, or anything like it, make any sense?
i think that's more or less how it currently works when there's a backlog of unconfirmed TX. this is fine for now, but at some point, if there are a lot of TX, the mempool will just grow and grow, and TX will confirm slower and slower.

You're saying miners currently sometimes only put the hash of all the tx in a block, instead of the tx themselves?
|
|
|
|
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
|
|
May 29, 2015, 04:08:44 PM |
|
Perhaps some of the coders here can help me understand something.
Why not have a new "mempool" be created every 10 minutes, so that if it takes 30 minutes to find a block, the winning miner just takes all the valid transactions in the first mempool, no matter how huge the total "blocksize" would be, and puts only the hash of those transactions into the block? That way the block itself would be tiny, so propagation wouldn't be an issue. All miners and other full nodes would already have the mempool transactions(?), so they would just have to check that the hash matches the set of all valid tx in the first mempool. Then the next winning miner would take all the valid tx in the second mempool, etc.
Of course, if a miner finds the next block in less than 10 minutes and there is no mempool queued up yet, this doesn't work. Perhaps difficulty would have to be adjusted to ensure miners were usually a little bit behind the curve.
Does this, or anything like it, make any sense?
i think that's more or less how it currently works when there's a backlog of unconfirmed TX. this is fine for now, but at some point, if there are a lot of TX, the mempool will just grow and grow, and TX will confirm slower and slower.

You're saying miners currently sometimes only put the hash of all the tx in a block, instead of the tx themselves?

i misread, ya, no, they put the full TX. i don't see how knowing which TX to include in the next block is going to help tho.
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 04:12:39 PM |
|
Perhaps some of the coders here can help me understand something.
Why not have a new "mempool" be created every 10 minutes, so that if it takes 30 minutes to find a block, the winning miner just takes all the valid transactions in the first mempool, no matter how huge the total "blocksize" would be, and puts only the hash of those transactions into the block? That way the block itself would be tiny, so propagation wouldn't be an issue. All miners and other full nodes would already have the first mempool's transactions(?), those being set in stone, so they would just have to check that the hash matches the set of all valid tx in the first mempool. Then the next winning miner would take all the valid tx in the second mempool, etc.
Of course, if a miner finds the next block in less than 10 minutes and there is no mempool queued up yet, this doesn't work. Perhaps difficulty would have to be adjusted to ensure miners were usually a little bit behind the curve.
This seems to shift the burden from bandwidth to CPU power for checking the hash, but as long as miners are behind the curve it seems to avoid the "race" where lower-bandwidth miners/nodes are at a disadvantage.
Does this, or anything like it, make any sense?
the mempool is rarely uniform across all nodes. it would be impossible to reconstruct which unconfirmed tx's a given node would be missing. your idea is a variation on IBLT, but in that case nodes can reconstruct their missing tx's thanks to the math of the IBLT. and your idea would totally render SPV clients unusable, as they rely on retrieving the Merkle tree path, along with its block header, to the specific tx history they are interested in.
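to illustrate that last point about SPV, here's a minimal sketch of how an SPV client checks a Merkle path against a block's Merkle root (a simplified sketch with hypothetical helper names; it ignores Bitcoin's byte-order details but keeps the double-SHA256 tree structure). drop the transactions from blocks and there is no Merkle path to hand to the client.

Code:
import hashlib

# Minimal sketch of SPV Merkle-path verification (simplified; hypothetical
# helper names, no endianness handling).

def dsha256(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_merkle_path(tx_hash: bytes, path, merkle_root: bytes) -> bool:
    """Walk up the tree: at each step hash the running value with its sibling.
    `path` is a list of (sibling_hash, sibling_is_on_right) pairs."""
    h = tx_hash
    for sibling, sibling_on_right in path:
        h = dsha256(h + sibling) if sibling_on_right else dsha256(sibling + h)
    return h == merkle_root

# Tiny example with two transactions: root = H(H(tx0) + H(tx1)).
tx0, tx1 = dsha256(b"tx0"), dsha256(b"tx1")
root = dsha256(tx0 + tx1)
print(verify_merkle_path(tx0, [(tx1, True)], root))   # True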
|
|
|
|
Zangelbert Bingledack
Legendary
Offline
Activity: 1036
Merit: 1000
|
|
May 29, 2015, 04:15:57 PM |
|
the mempool is rarely uniform across all nodes. it would be impossible to reconstruct which unconf tx's a node would be missing.
OK, good point. I thought maybe having a time cutoff where no new tx are added to the first mempool after 10 minutes would help, but I guess there's no way to know for sure. That's the whole point of a consensus network after all. Oh well, there goes that shower thought. Thanks for the quick reply.
|
|
|
|
Natalia_AnatolioPAMM
|
|
May 29, 2015, 04:36:34 PM |
|
Gold preparing to take the next dive.
maybe the last one
|
|
|
|
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
|
|
May 29, 2015, 05:03:14 PM |
|
Gold preparing to take the next dive.
maybe the last one

but probably not. i firmly believe we will see <$900 gold and eventually <$500 gold
|
|
|
|
cypherdoc (OP)
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
May 29, 2015, 05:07:55 PM |
|
Gold preparing to take the next dive.
maybe the last one

but probably not. i firmly believe we will see <$900 gold and eventually <$500 gold

yes, gold is useless in the new digital age.
|
|
|
|
|