LovelyDay
Newbie
Offline
Activity: 21
Merit: 0
|
|
January 03, 2016, 11:03:36 AM |
|
Aquent has posted his reply to Adam's thread-opening post: https://bitco.in/forum/threads/re-bitcoin-unlimited-seeks-review.718/ His closing statement is worth reiterating in this forum: "I invite you, and everyone else, to find holes and tear the above apart, always in the spirit of reaching a solution, and if no holes can be found, then let's end this thing and have a reunited celebration party."
|
|
|
|
LovelyDay
Newbie
Offline
Activity: 21
Merit: 0
|
|
January 03, 2016, 11:09:21 AM |
|
Ignore the obvious problem of Sybil attacks, especially those exploiting BTC's current crummy P2P code and superlinear vulnerability to maliciously constructed (i.e. very slow to verify) blocks.
I just wanted to come back to the superlinear problem that iCEBREAKER mentioned here, because it deserves a response: it is just as much a problem for other BIPs and for SegWit. So this is not a new problem introduced by BU, and some proposals have been made in the context of BU to mitigate it (by moving verification into separate threads, etc.). Most of the discussion around this in BU has happened in the thread below, as far as I have seen: https://bitco.in/forum/threads/i-really-want-to-like-bitcoin-unlimited.684/page-2#post-8087
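For readers new to the issue, the "superlinear" cost comes from the legacy signature-hashing scheme: each input of a transaction re-hashes nearly the entire transaction, so verification work grows roughly quadratically with transaction size. A back-of-the-envelope sketch, with purely illustrative numbers (not actual consensus code):

```python
def bytes_hashed(tx_size_bytes: int, n_inputs: int) -> int:
    """Rough model of legacy sighash cost: each input re-hashes almost
    the whole transaction, so total work ~ n_inputs * tx_size."""
    return n_inputs * tx_size_bytes

# Doubling both the transaction size and its input count roughly
# quadruples the bytes that must be hashed to verify it.
small = bytes_hashed(tx_size_bytes=500_000, n_inputs=3_000)
large = bytes_hashed(tx_size_bytes=1_000_000, n_inputs=6_000)
ratio = large / small  # 4.0
```

This is why the cost is a property of the sighash scheme rather than of any particular client: raising the block size in BU, Core, or anywhere else scales the worst case the same way unless the hashing scheme itself is changed.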
|
|
|
|
jonny1000
Member
Offline
Activity: 129
Merit: 14
|
|
January 03, 2016, 11:55:17 AM |
|
In the interest of this "review", I will point out something commonly misunderstood by those new to BU:
BU follows the longest chain.
BU does not always follow the longest chain. In general BU operates in the same way as Core: BU nodes follow the longest valid chain. There is a slight difference with respect to the blocksize issue. It is possible BU nodes will follow the longest chain where the blocksizes are arbitrarily large, but only under certain limited, unusual conditions. There is an "N depth" idea in BU, where nodes switch from regarding one chain as valid to another chain if the chain with larger blocks has a lead of N blocks. The default value of N is 4, and N can be manually set to any number, including infinity, in which case BU nodes will never switch to the longest chain. Only if N is manually changed to 1 does BU follow the longest chain with respect to any blocksize. Therefore the claim that BU always follows the longest chain is false.
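As a rough illustration of the "N depth" rule jonny1000 describes (a hedged sketch only; in the BU client these settings correspond to the excessive block size and acceptance depth, and the real C++ logic is more involved):

```python
def accept_chain(lead_of_big_block_chain: int,
                 block_size: int,
                 excessive_block_size: int,
                 acceptance_depth: int) -> bool:
    """Accept a block larger than our excessive-size setting only once
    the chain containing it has a lead of `acceptance_depth` blocks."""
    if block_size <= excessive_block_size:
        return True  # within our limit: normal validation applies
    # "Excessive" block: tolerated only after the other chain pulls ahead.
    return lead_of_big_block_chain >= acceptance_depth

# With the default depth of 4, a 3-block lead is not enough...
assert not accept_chain(3, 2_000_000, 1_000_000, 4)
# ...but a 4-block lead triggers the switch described in the post.
assert accept_chain(4, 2_000_000, 1_000_000, 4)
# A depth of 1 makes the node follow the longest chain at any blocksize.
assert accept_chain(1, 32_000_000, 1_000_000, 1)
```

An effectively infinite acceptance depth (never switch) falls out of the same function: no finite lead ever satisfies the test.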
|
|
|
|
jonny1000
Member
Offline
Activity: 129
Merit: 14
|
|
January 03, 2016, 11:57:39 AM |
|
The idea is that users would converge on a consensus Schelling point through various communication channels because of the overwhelming economic incentive to do so. The situation in a BU world would be no different than now except that there would be no reliance on Core (or XT) to determine from on high what the options are. BU rejects the idea that it is the job of Core (or XT, or BU) developers to govern policy on consensus or restrict the conveniently available policy options on blocksize.
There is no mechanism to reach consensus about the blocksize limit; with BU, all nodes just set their own limits. This does not reduce the potential for debate, arguments or central planning. The community would still need to reach consensus about the blocksize limit in the same way it is trying to now. The core development team could still try to keep the 1MB limit, and miners would still need to make a decision to accept larger blocks; the only difference is that the debate might be about which values to set in the client rather than which client to use, which is no significant change from the current status quo. If we want nodes to dynamically set a blocksize limit in a way determined by the market, we should use a proposal like BIP100. BIP100 actually allows miners to dynamically set a blocksize limit and agree with each other on a new limit; BU has no system to enable nodes to agree on the limit.
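To illustrate the contrast, a BIP100-style miner vote can be sketched roughly as follows. The window, percentile, and bounds here are simplified placeholders, not the exact BIP100 rules:

```python
def bip100_style_new_limit(votes: list[int], current_limit: int) -> int:
    """Take a low percentile of miner size votes (so a small minority
    cannot force an increase), then clamp the change to [x0.5, x2]
    per adjustment period. Illustrative simplification of BIP100."""
    ordered = sorted(votes)
    # 20th percentile: ~80% of miners must vote at or above a size.
    pick = ordered[len(ordered) // 5]
    return max(current_limit // 2, min(current_limit * 2, pick))

# 17 of 20 miners voting for 2 MB moves the limit to 2 MB...
votes = [1_000_000] * 3 + [2_000_000] * 17
new_limit = bip100_style_new_limit(votes, 1_000_000)  # 2_000_000
# ...while a 3-of-20 minority voting for 8 MB changes nothing.
minority = [1_000_000] * 17 + [8_000_000] * 3
unchanged = bip100_style_new_limit(minority, 1_000_000)  # 1_000_000
```

The point of the sketch is the single deterministic answer: every node evaluating the same chain computes the same limit, which is the coordination property jonny1000 says BU lacks.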
|
|
|
|
cr1776
Legendary
Offline
Activity: 4214
Merit: 1312
|
|
January 03, 2016, 12:49:11 PM |
|
... I can't wait to see BU (fail to) deal with 16/32MB blocks trollishly constructed so as to take hours or days to verify. ... This is an important point, and not one it is safe to ignore here: if people can pick a larger block size without BU building in safeguards, it will be dangerous to the network if they are not aware of the consequences. Even 2 MB blocks can contain transactions that take 5-10 minutes to verify.
|
|
|
|
Peter R
Legendary
Offline
Activity: 1162
Merit: 1007
|
|
January 03, 2016, 04:00:30 PM |
|
I just wanted to point out that the one comment I left in this thread has now been deleted. The idea that this place is "neutral" is silly.
All the good people are leaving.
|
|
|
|
adam3us (OP)
|
|
January 03, 2016, 04:39:47 PM Last edit: January 03, 2016, 06:40:52 PM by adam3us |
|
The idea of enabling nodes & miners to set a market block-size is quite reasonable, so there is no criticism of the idea. Don't take review of the mechanism as a critique of the idea: for ideas to be deployed we need game-theory-reviewed protocols and rigorously tested implementations. Dynamic block-size is actually on the Core roadmap, and the best proposal I've seen for it is flexcap by GMaxwell & Maaku, with some ideas from Jeff. You can watch a video about flexcap presented at Scaling Bitcoin HK. Maaku has code for the core parts of it; I believe he was going to publish it, probably online by now. If we want nodes to dynamically set a blocksize limit, in a way determined by the market, we should use a proposal like BIP100. BIP100 actually allows miners to dynamically set a blocksize limit and agree with each other on a new limit, BU has no system to enable nodes to agree on the limit.
The precursor idea was BIP "100", which Jeff retracted. The BIP "100" proposal is similar, but only miners vote; in flexcap both users and miners vote. I would suggest people interested in the idea of dynamic blocks learn about BIP "100" and flexcap and see if they can improve them. There are design considerations that have been refined between 100 and its improvement, flexcap. The Bitcoin Unlimited project has presented some ideas which do try to automate things. Unfortunately, all of the ones so far seem to be defective: they suffer from Sybil attacks and constant centralisation pressure, and don't take account of SPV mining, shared pools, existing relay-network practice, or selfish-mining attacks. I have not really analysed the idea of validating two chains, but it seems likely to have problems based on intuition, particularly in the areas of race conditions, chain sharding and divergence risk, and in an adversarial environment. Bear in mind that the consensus mechanism is extremely fragile; it only just works, in that many close design variants completely fail. Most variants I tried broke fairly immediately, but some of these things take a long time to realise, or require review from GMaxwell or others to disprove. For example, selfish mining was not noticed for years. I did spend about 3-4 months cooking up and analysing mining variants to improve bitcoin mining centralisation (e.g. I invented GHOST, but rejected it as over-complex for what it achieved, before the academic paper proposed it, along with a bunch of other variants), before getting into side-chains. The idea for users to vote by delaying block relay won't work because most miners are already using the relay network or SPV mining. Over 50% of the network was SPV mining during the 4th of July fork. A large portion of miners use the relay network. Users voting by advertisement won't work because of Sybil, as others have explained.
You can read flexcap to see how it combines miner and user voting in a secure, game-theory-stable way that defends against all these attacks. In summary: 1. The use case: dynamic, market-set block sizes are interesting. 2. The Bitcoin Unlimited proposals so far seem broken, as discussed by multiple people for a whole range of reasons. We didn't have a crisp definition, and it seems that some things may be undecided. That's ok - just keep working on it, make a concrete proposal later, and people can analyse it from that. 3. BIP "100" seemed plausible, but was only miner meta-incentive secure, meaning we would be trusting miners to do the right thing, limited only by their commitment to not do anything too selfish for fear of hurting bitcoin's long-term value. 4. Flexcap adds user voting (in transactions) and an economic quadratic feedback mechanism to create an incentive to right-size blocks (to deter miner zero-sum attacks against other miners and curtail the continuous centralisation pressure). Flexcap also ensures miner fee profit in conditions where mining fees can otherwise be driven to zero by excess capacity in non-dynamic block-size growth proposals like BIP 103. [EDIT: I suppose the other thing is it might be better to run experiments on testnet rather than bitcoin, or to put clear warnings for users if you have not. People could lose bitcoin running partial implementations of incomplete ideas. Encouraging users who don't understand that it is a research project to run experimental code with real Bitcoin under its control, or even on the same machine, would be inadvisable.] Adam
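A hedged sketch of the flexcap incentive Adam summarises in point 4 (the actual proposal by Maxwell and Friedenbach specifies the exact function; the quadratic curve and numbers here are only illustrative): a miner may exceed the soft size target, but must pay for the overage with extra difficulty, so right-sizing blocks is the profit-maximising strategy.

```python
def effective_difficulty(base_difficulty: float,
                         block_size: int,
                         soft_target: int) -> float:
    """Quadratic difficulty penalty for exceeding the soft target.
    Hypothetical cost curve, for illustration only."""
    if block_size <= soft_target:
        return base_difficulty
    overage = block_size / soft_target - 1.0
    return base_difficulty * (1.0 + overage ** 2)

# A block 50% over target costs 25% extra work; a double-size block
# costs double the work, so oversizing only pays if fees cover it.
d_mid = effective_difficulty(1.0, 1_500_000, 1_000_000)  # 1.25
d_big = effective_difficulty(1.0, 2_000_000, 1_000_000)  # 2.0
```

The quadratic shape is what deters the zero-sum attack Adam mentions: bloating blocks just to push out slower miners now costs the attacker real hashpower.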
|
hashcash, committed transactions, homomorphic values, blind kdf; researching decentralization, scalability and fungibility/anonymity
|
|
|
cypherdoc
Legendary
Offline
Activity: 1764
Merit: 1002
|
|
January 03, 2016, 05:32:24 PM |
|
The idea for users to vote by delaying block relay won't work because most miners are already using the relay network or SPV mining. Over 50% of the network was SPV mining during the 4th of July fork. A large portion of miners use the relay network.
if so, aren't fraud proofs for SW an illusion?
|
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
January 03, 2016, 05:41:55 PM |
|
[EDIT: I suppose the other thing is it might be better to run experiments on testnet rather than bitcoin, or to put clear warnings for users if you have not. People could lose bitcoin running partial implementations of incomplete ideas. Encouraging users who don't understand that it is a research project to run experimental code with real Bitcoin under its control, or even on the same machine, would be inadvisable.]
This ... 1000x. Case in point: https://www.darkwallet.is/ Even with many warnings such as: Remember: this is only an alpha preview. Do not expect stable software yet. Right now it is only available for Chrome/Chromium browsers.
IMPORTANT The wallet is Not Stable or Safe, and at this point you should use it with real money only at Your Own Risk.
You can use it with testnet coins so it's safe to test like that though. Choose testnet network when creating the identity.
People were still using real coins and losing them all the time on experimental software. Are any of the core devs open to us having a fundraiser to finish cleaning up the code and testing the Dark Wallet implementation, or is all the focus on Bitcoin Core at the moment?
|
|
|
|
alp
|
|
January 03, 2016, 06:36:14 PM |
|
A very simple attack can be done on the network.
There are really 3 parameters that can be set at each node:
1) Maximum generation size (only used by miners)
2) Maximum relay size (used to propagate to peers)
3) Maximum acceptance size and the number of blocks ahead that force an automatic change of parameters (I am less sure of how this mechanism works).
I'll ignore the first parameter since it's set by miners. Suppose I want big blocks in my unlimited node. I set a max relay size of 32MB and max acceptance size of 32MB (the maximum allowed?). I will accept anything I see that is the longest chain and pass everything along.
Now suppose I connect to a bunch of nodes, and they have much lower limits - the max relay they have is 2MB, and max acceptance is 2MB. I will never see any blocks bigger than 2MB! Even if the majority of nodes were accepting large blocks, and I just got connected to a few small block nodes, I'm screwed. I'm out of sync with everyone else, and there's nothing I can do to sync up.
The attack is simple - launch a bunch of nodes, try to connect to as many nodes as possible, and stop relaying anything big.
In a large enough network, this doesn't even need to be deliberate. Some poor sap gets unlucky, connects to all small-block peers, and can't ever get an update. The end result is many chains and chaos.
Of course this is solved by an interesting mechanism - a bunch of people get together and talk and coordinate limits and upgrade together... hmmm... sounds familiar.
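The peer-selection lottery alp describes can be put in rough numbers. Assuming, for illustration only, that a node picks its 8 outbound peers uniformly at random (real peer selection is bucketed by network group, so this overstates the attacker's odds somewhat), the chance that every peer refuses to relay large blocks is:

```python
def eclipse_probability(non_relaying_share: float, n_peers: int = 8) -> float:
    """Probability that every randomly chosen outbound peer is a node
    that will not relay blocks above the victim's expected size.
    Uniform peer selection is an idealised assumption."""
    return non_relaying_share ** n_peers

# Even a 70% share of low-limit (or Sybil) nodes eclipses a given
# victim's peer set only a few percent of the time, but across many
# nodes and many restarts the partitions alp describes accumulate.
p = eclipse_probability(0.70)
```

The takeaway matches the post: no single parameter choice is "wrong", yet heterogeneous relay limits plus unlucky peer draws are enough to shard the network without any node misbehaving.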
|
I am looking for a good signature. Here could be your advertisement
|
|
|
Zarathustra
Legendary
Offline
Activity: 1162
Merit: 1004
|
|
January 03, 2016, 07:31:06 PM |
|
I just wanted to point out that the one comment I left in this thread has now been deleted. The idea that this place is "neutral" is silly.
All the good people are leaving.
Zero opposition from Adam against the deletions.
|
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
January 03, 2016, 08:11:42 PM |
|
There seem to be multiple different explanations of BU, which is leading to some confusion. Bitcoin Unlimited instead proposes that node operators configure their own limit in a simple GUI menu. Currently, as there is unanimous agreement for a 2 MB limit, the miners just mark their blocks with 2 MB, communicating that they are willing to accept 2 MB blocks. Once 90 or whatever % of them agree, the miners can create the first 1.1 MB block and voilà, we have a new limit.
Is this a hypothetical "unanimous agreement for a 2 MB limit" of the miners, or just BU nodes? So BU takes its cue from a miner "BIP100-like" vote and combines that with a supermajority BU node vote to determine a new maxBlockSize limit? That would only work if 51% of them so decided and, as we know, if 51% of miners decide to fork off they can do so currently.
Thus a simple majority of miners is the "BIP100-like" vote that would permit an increase if the corresponding nodes agreed? The miners are self-interested economic actors, and we can assume that 51% of them will be honest (if we don't make such an assumption then bitcoin does not work).
My thought is that bitcoin's game-theory incentive merely discourages dishonesty, by giving miners a greater incentive to secure the blockchain than to attack it. Why are we assuming honesty? I hold the opinion that we should prepare for dishonest and malicious miners as a possible attack vector. This is why we need to focus on decentralizing mining much more. Right now, for example, we have Core stating 1 MB or bust - well, there is unanimous agreement that approach is wrong and that the limit has to increase to at least 2 MB, because businesses right now are hurting.
Is this true? As far as I know, only Luke-jr wants to keep the block limit or lower it. Peter Todd signed up to make an effective increase, and the rest of the scaling signers want to raise maxBlockSize after all the other improvements are completed to accommodate the LN. Is this simply a misunderstanding, because I repeatedly see this being repeated on that forum and on r/btc, or do you acknowledge Core's statements and are assuming dishonest intent? FYI, regarding post deletions: the only posts being deleted should be off-topic or non-technical ones, which we welcome because after all this is the "Development & Technical Discussion" section. Please take a screenshot of any "technical" post if you believe there is censorship here.
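The signalling scheme in the quoted explanation (miners marking blocks until a supermajority agrees) resembles BIP9-style version-bits counting. A minimal sketch, with the 90% threshold and 1000-block window as placeholder values rather than anything BU actually specifies:

```python
def size_increase_locked_in(signals: list[bool],
                            window: int = 1000,
                            threshold: float = 0.90) -> bool:
    """True once `threshold` of the last `window` blocks signal support
    for the larger size, in the style of BIP9 deployment counting."""
    recent = signals[-window:]
    return len(recent) == window and sum(recent) / window >= threshold

# 910 of the last 1000 blocks signalling clears a 90% threshold.
locked = size_increase_locked_in([True] * 910 + [False] * 90)
```

Note this only measures hashpower signalling; it says nothing about what limits non-mining BU nodes have configured, which is the gap BitUsher is probing above.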
|
|
|
|
sAt0sHiFanClub
|
|
January 03, 2016, 10:26:11 PM |
|
The idea of enabling nodes & miners to set a market block-size is quite reasonable so there is no criticism of the idea.
That seems a reasonable result. Just need to iron out the details.... In summary:
1. The use case: dynamic market set block-sizes are interesting.
2. The Bitcoin Unlimited proposals so far seem broken, as discussed by multiple people for a whole range of reasons. We didn't have a crisp definition, and it seems that some things may be undecided. That's ok - just keep working on it, make a concrete proposal later, and people can analyse it from that.
I wouldn't say they are 'broken' - there is a lot of misunderstanding about it, and due to the level of censorship taking place on bitcointalk it's difficult to get a clear picture on this forum of what it is about. So may I respectfully point interested parties to the BU 'white paper' (without having this post moderated/deleted as a result) here, where you can digest the finer details. Additionally, the FAQ provides a higher-level description of key topics. This should remove some of the guesswork which has characterised this thread. Then present your thoughts or arguments on the unmoderated /r/btc reddit, or directly on bitco.in, where they can be discussed.
|
We must make money worse as a commodity if we wish to make it better as a medium of exchange
|
|
|
smooth
Legendary
Offline
Activity: 2968
Merit: 1198
|
|
January 03, 2016, 10:43:32 PM Last edit: January 04, 2016, 12:01:06 AM by smooth |
|
In the interest of this "review", I will point out something commonly misunderstood by those new to BU:
BU follows the longest chain.
BU does not always follow the longest chain. In general BU operates in the same way as Core: BU nodes follow the longest valid chain. There is a slight difference with respect to the blocksize issue. It is possible BU nodes will follow the longest chain where the blocksizes are arbitrarily large, but only under certain limited, unusual conditions. There is an "N depth" idea in BU, where nodes switch from regarding one chain as valid to another chain if the chain with larger blocks has a lead of N blocks. The default value of N is 4, and N can be manually set to any number, including infinity, in which case BU nodes will never switch to the longest chain. Only if N is manually changed to 1 does BU follow the longest chain with respect to any blocksize. Therefore the claim that BU always follows the longest chain is false. It is not false with respect to any setting other than infinity (but your point is valid, since infinity is apparently a valid setting). There is never a guarantee that any Bitcoin node will converge to the longest chain in finite time, only eventually. The exact same guarantee applies to BU, though it may take longer to agree on a chain than other implementations do.
|
|
|
|
adam3us (OP)
|
|
January 04, 2016, 12:00:19 AM |
|
I wouldn't say they are 'broken' - there is a lot of misunderstanding about it, and due to the level of censure taking place on bitcointalk its difficult to get a clear picture of what it is about on this forum.
We might need to think about moving forum, some of the people proposing the ideas apparently being "moderator incompatible". Technically I don't think it is accurate to say the clarity has been hurt by deletions directly; as far as I saw and remember the contents of the now-deleted posts, no technical comments were deleted. What has hurt clarity, possibly, is the refusal of people to participate because of the potential of moderation, or a clash of egos and moderators. I do sympathize and dislike moderation myself, though there is a little irony in Peter R's deleted comment containing a proud link to his own heavy trolling of /r/bitcoin, which is not exactly the way to encourage people to divert their time to review your proposal (moderator or not!). I started a thread on /r/btc https://www.reddit.com/r/btc/comments/3zc6qg/review_of_shelling_point_protocol_selection_ideas/ and typed up a summary of ideas explained so far (including the below one that I already read before starting this thread we're in). So may I respectfully point interested parties to the BU 'white paper' (without having this post moderated/deleted as a result) here where you can digest the finer details. I had read that one before starting the thread. It seems to be a different idea again, not mentioned on this thread. Maybe we should be analysing a set of features that BU proposes to combine. An immediate observation with empty-block ratios is that this appears not to work in the face of 4 existing network behaviours: SPV mining, the relay network, big pools, and selfish mining. Additionally, the FAQ provides a higher level description of key topics. This should remove some of the guesswork which has characterised this thread. Don't recall if I read that one yet. Then present your thoughts or arguments on the unmoderated /r/btc reddit or directly to bitco.in where they can be discussed.
Ha, after some thought I figured /r/btc is the next best option given the egos and the moderators clashing - yes, see above. Adam
|
hashcash, committed transactions, homomorphic values, blind kdf; researching decentralization, scalability and fungibility/anonymity
|
|
|
Nubarius
|
|
January 04, 2016, 12:12:59 AM |
|
I am very enthusiastic about Bitcoin Unlimited and think it is a great idea. (Disclaimer: I am a big blocker, and see BU as the mechanism that will end up bringing higher maximum block sizes that can let Bitcoin scale with on-chain transactions. Anyway, my stance on big blocks is, I think, independent of my argument below.)
As Zangelbert Bingledack has argued before, BU lets users do something that they can already do, albeit currently in a more complicated way. Let's suppose a group of miners decide that a blockchain with 2 MB blocks is acceptable to them, and want to gamble that other miners and economic actors (like exchanges) will think likewise and accept such big blocks too. They can already upgrade to a blockchain that supports larger blocks by recompiling their own custom version of Bitcoin Core in which they substitute a higher value for the integer constant MAX_BLOCK_SIZE. If these miners command a minority of the total hashing power, they will find that whenever they try to relay a 1.9-MB block, it gets stalled among nodes that refuse to relay it, and the block gets orphaned and excluded from the longest chain. Now they could keep on trying to propagate their blocks and use human communication channels like this forum to try to attract other miners and users to the 2-MB camp, but there is something that will probably put them off and that makes it hard for this approach to succeed: the moral aspect. Right now, the more common view is that a block that exceeds the 1-MB cap is an "invalid" block that violates the consensus rules, just as would be a block that contains, say, invalid transactions or incorrect hash values. This leads to a situation where the miners sending 1.9-MB blocks out into the open will get rejected as "malicious" or "attacking" nodes. Any reputable mining company won't want to be associated with such antics, and will understandably keep their blocks under the limit to be seen as well-behaved, legit nodes.
Now the great idea behind BU is that it completely eliminates this moral aspect by taking the maximum block size out of the consensus rules. Under BU, the blockchain becomes a parameterised family of N-capped blockchains. If BU becomes the main Bitcoin implementation, miners may freely decide to support a 2-MB-capped blockchain or a 32-MB-capped blockchain without anyone shouting at them "malicious!", "attacker!", "you filthy rascal, impostor of a node, boo, boo, boo!!!". The moral aspect vanishes, as it is now just as legit to adhere to any cap. Those who only support the 1-MB-capped blockchain will reject any blocks bigger than 1 MB, while the 2-MB-capped blockchain nodes may keep on trying to build a longer chain. It's important to note that the blockchain only splits when a subset of the miners advocating for an N-capped blockchain command more hashing power than those miners that advocate for M-capped blockchains such that M < N. At that point, two things could happen: a node supporting the M-capped blockchain may accept that a longer chain now has larger blocks and raise its M parameter to at least N, matching the new proof-of-work consensus, or insist that it won't support such large blocks and keep on mining a shorter chain of blocks that respect its M limit. This might lead to several alternative chains coexisting for some time, but the ensuing chaos should be very brief, as the free market will rapidly settle on one of them. My feeling is that this would be the chain with the higher block sizes and the larger proof-of-work behind it.
How will the growth of the block size cap happen under this mechanism? I think the pressure on miners to increase the limit will grow as the cap starts to be hit. As more users try to send (on-chain) Bitcoin payments, the 1 MB limit will be hit more and more often and a backlog of transactions will start to grow. This will lead to users complaining about their transactions being stuck in limbo and some negative media reporting about it ("Bitcoin network collapsing" and the like). Under such strained conditions, which we may start to see pretty soon, the Bitcoin price will take a battering in the exchanges with miners starting to worry about their income and realising that they're jeopardising the Bitcoin network and their earnings by not raising the cap. I could be wrong about all this, of course, but then it would be the free market which would prove me wrong by settling on a tightly capped blockchain. In any case, thanks to the mechanism that BU provides, raising the block size limit can be a market-driven smooth transition rather than an incessant and unseemly fracas among the core developers in the mailing lists and IRC channels.
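Whether the bigger-block side ever opens the N-block lead that flips holdout nodes (under the N-depth rule discussed earlier in the thread) is a biased-random-walk question. A Monte Carlo sketch, assuming for simplicity that each new block goes to the big-block side with probability equal to its hashpower share:

```python
import random

def prob_of_lead(hash_share: float, lead_needed: int = 4,
                 horizon: int = 2000, trials: int = 2000,
                 seed: int = 42) -> float:
    """Estimate the probability that the big-block miners' chain ever
    gets `lead_needed` blocks ahead within `horizon` blocks. Toy model:
    independent block races, no orphan or latency effects."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        lead = 0
        for _ in range(horizon):
            lead += 1 if rng.random() < hash_share else -1
            if lead >= lead_needed:
                wins += 1
                break
    return wins / trials

# A clear hashpower majority reaches a 4-block lead almost surely,
# while a minority succeeds only roughly (0.4/0.6)**4 ~ 20% of the time.
p_major = prob_of_lead(0.60)
p_minor = prob_of_lead(0.40)
```

This supports Nubarius's intuition that the split resolves quickly when one cap has a real hashpower majority, while also showing a minority cap is not entirely safe from a lucky lead.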
|
|
|
|
alp
|
|
January 04, 2016, 03:21:57 AM |
|
I am very enthusiastic about Bitcoin Unlimited and think it is a great idea. (Disclaimer: I am a big blocker, and see BU as the mechanism that will end up bringing higher maximum block sizes that can let Bitcoin scale with on-chain transactions. Anyway, my stance on big blocks is, I think, independent of my argument below.)
As Zangelbert Bingledack has argued before, BU lets users do something that they can already do, albeit currently in a more complicated way. Let's suppose a group of miners decide that a blockchain with 2 MB blocks is acceptable to them, and want to gamble that other miners and economic actors (like exchanges) will think likewise and accept such big blocks too. They can already upgrade to a blockchain that supports larger blocks by recompiling their own custom version of Bitcoin Core in which they substitute a higher value for the integer constant MAX_BLOCK_SIZE. If these miners command a minority of the total hashing power, they will find that whenever they try to relay a 1.9-MB block, it gets stalled among nodes that refuse to relay it, and the block gets orphaned and excluded from the longest chain. Now they could keep on trying to propagate their blocks and use human communication channels like this forum to try to attract other miners and users to the 2-MB camp, but there is something that will probably put them off and that makes it hard for this approach to succeed: the moral aspect. Right now, the more common view is that a block that exceeds the 1-MB cap is an "invalid" block that violates the consensus rules, just as would be a block that contains, say, invalid transactions or incorrect hash values. This leads to a situation where the miners sending 1.9-MB blocks out into the open will get rejected as "malicious" or "attacking" nodes. Any reputable mining company won't want to be associated with such antics, and will understandably keep their blocks under the limit to be seen as well-behaved, legit nodes.
Now the great idea behind BU is that it completely eliminates this moral aspect by taking the maximum block size out of the consensus rules. Under BU, the blockchain becomes a parameterised family of N-capped blockchains. If BU becomes the main Bitcoin implementation, miners may freely decide to support a 2-MB-capped blockchain or a 32-MB-capped blockchain without anyone shouting at them "malicious!", "attacker!", "you filthy rascal, impostor of a node, boo, boo, boo!!!". The moral aspect vanishes, as it is now just as legit to adhere to any cap. Those who only support the 1-MB-capped blockchain will reject any blocks bigger than 1 MB, while the 2-MB-capped blockchain nodes may keep on trying to build a longer chain. It's important to note that the blockchain only splits when a subset of the miners advocating for an N-capped blockchain command more hashing power than those miners that advocate for M-capped blockchains such that M < N. At that point, two things could happen: a node supporting the M-capped blockchain may accept that a longer chain now has larger blocks and raise its M parameter to at least N, matching the new proof-of-work consensus, or insist that it won't support such large blocks and keep on mining a shorter chain of blocks that respect its M limit. This might lead to several alternative chains coexisting for some time, but the ensuing chaos should be very brief, as the free market will rapidly settle on one of them. My feeling is that this would be the chain with the higher block sizes and the larger proof-of-work behind it.
How will the growth of the block size cap happen under this mechanism? I think the pressure on miners to increase the limit will grow as the cap starts to be hit. As more users try to send (on-chain) Bitcoin payments, the 1 MB limit will be hit more and more often and a backlog of transactions will start to grow. This will lead to users complaining about their transactions being stuck in limbo and some negative media reporting about it ("Bitcoin network collapsing" and the like). Under such strained conditions, which we may start to see pretty soon, the Bitcoin price will take a battering in the exchanges with miners starting to worry about their income and realising that they're jeopardising the Bitcoin network and their earnings by not raising the cap. I could be wrong about all this, of course, but then it would be the free market which would prove me wrong by settling on a tightly capped blockchain. In any case, thanks to the mechanism that BU provides, raising the block size limit can be a market-driven smooth transition rather than an incessant and unseemly fracas among the core developers in the mailing lists and IRC channels.
via Imgflip Meme Maker
|
I am looking for a good signature. Here could be your advertisement
|
|
|
|
iCEBREAKER
Legendary
Offline
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
|
|
January 04, 2016, 05:08:57 AM Last edit: January 04, 2016, 05:43:39 AM by iCEBREAKER |
|
[BitcointalkObituaries.com] Hi PR. I too had a mostly on-topic, albeit slightly aggro, post deleted by a mod apparently set on keeping the thread's decorum conducive to productive discourse. So quit yer bitchin! EDIT: How close is BU to JR's block-market proposal? I've always thought https://bitcoinism.liberty.me/economic-fallacies-and-the-block-size-limit-part-1-scarcity/ is full of great ideas... for altcoins to test. It seems BU is just an abstraction onto which people like Frap.doc project their unrequited ideals, as evidenced by the inability to get the story straight w/r/t whether or not BU follows the longest ("valid") chain.
|
|
| "The difference between bad and well-developed digital cash will determine whether we have a dictatorship or a real democracy." David Chaum 1996 "Fungibility provides privacy as a side effect." Adam Back 2014
|
| | |
|
|
|
VeritasSapere
|
|
January 04, 2016, 02:37:57 PM Last edit: January 04, 2016, 04:36:46 PM by VeritasSapere |
|
Sorry for stepping in.
If someone tries to Sybil the network and sets up 2,000 nodes with a block limit of 200 MB, no responsible miner would take this as a reason to set his own limit to 200 MB.
When one of the miners was corrupted too, he could release a 200 MB block and 2,000 nodes would propagate it. All the other nodes with lower limits would reject the block until it reaches some depth. For that to happen, the majority of miners would have to be corrupted. The attack is a lot more complex than that. I think you're on the BU forum? Taek had a nice explanation of the centralization pressure enabled by BU. Someone could leverage a Sybil attack to do just what he proposed: slowly but surely prune nodes out of the network until it gets consolidated into a few more controllable hands. If you are a miner, and you know a block of size X can be processed by 85% of the network, but not 100%, do you mine it? If by 'network' we mean hashrate, then definitely! 85% is high enough that you'll be able to build the longest chain. The miners that can't keep up will be pruned, and then the target for '85% fastest' moves - now a smaller set of miners represents 85%, and you can move the block size up, pruning another set of miners.
If by 'network' you mean all nodes... today we already have nodes that can't keep up. So by necessity you are picking a subset of nodes that can keep up, and a subset that cannot. So now you are deciding who is safe to prune. Raspis? Probably safe. Single merchants that run their own nodes on desktop hardware? Probably safe. All desktop hardware, but none of the exchanges? Maybe not safe today. But if you've been near desktop levels for a while, and slowly driving off the slower desktops, at some point you might only be driving away 10 nodes to jump up to 'small datacenter' levels. And so it continues anyway. You get perpetual centralization pressure, because there will always be that temptation to drive off the slowest subset of the network, since by doing so you can claim more transaction fees. I countered Taek's criticism, and so did many other people on the bitco.in forum; here are links to my responses: https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-203#post-7395 and https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-208#post-7550
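Taek's "perpetual centralization pressure" argument can be made concrete with a toy model (assumed numbers, purely illustrative): if the block size is repeatedly raised to what only the fastest 85% of remaining miners can validate, the survivor set shrinks geometrically.

```python
def pruning_rounds(n_miners: int, keep_percent: int = 85,
                   rounds: int = 10) -> list[int]:
    """Toy model of iterative pruning: each round the block size rises
    to what the fastest keep_percent of remaining miners can validate,
    and the rest drop out. Returns surviving-miner counts per round."""
    counts = []
    n = n_miners
    for _ in range(rounds):
        n = max(1, (n * keep_percent) // 100)
        counts.append(n)
    return counts

# Starting from 5000 miners, ten rounds of 15% pruning leave under 1000.
history = pruning_rounds(5000)
```

The counterargument in the linked responses is, roughly, that the 85% target is not free for the miner either (orphan risk, fee loss), so the loop need not run forever; the model only shows what happens if it does.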
|
|
|
|
|