Bitcoin Forum
  Show Posts
Pages: « 1 2 [3] 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 ... 96 »
41  Bitcoin / Development & Technical Discussion / Re: bitcoin "unlimited" seeks review on: January 02, 2016, 08:33:32 PM
At first skim, the proposal seems to be a copy of a few existing technologies from Bitcoin's roadmap, first proposed by Greg Maxwell and others*: weak blocks and network compression/IBLT to reduce orphan risk, and flexcap (or perhaps a variant of it).

That is something else, perhaps from one of the research papers on future areas of interest.

Bitcoin Unlimited's main change at present is simply that, for better or worse, it makes it more convenient for miners and nodes to adjust the blocksize cap settings. This is done through a GUI menu, meaning users don't have to mod the Core code themselves like some do now. Planned improvements to BU include options that automatically mimic the blocksize settings of some Core BIPs, as well as blocksize proposals recommended by other luminaries.

The idea is that users would converge on a consensus Schelling point through various communication channels because of the overwhelming economic incentive to do so. The situation in a BU world would be no different than now except that there would be no reliance on Core (or XT) to determine from on high what the options are. BU rejects the idea that it is the job of Core (or XT, or BU) developers to govern policy on consensus or restrict the conveniently available policy options on blocksize.

BU supporters believe that to have it otherwise is the tail wagging the dog: the finding of market-favored consensus is not aided, but rather hindered, by attempts to spoonfeed consensus parameters to the users. (This is putting it gently. Having a controversial parameter set at a specific number by default would be spoonfeeding; not even having the option to change it is more like force-feeding.)

Widespread adoption of BU, or adoption of BU-like configurability of settings within Core/XT, would relegate developer-led BIPs on controversial changes to the status of mere recommendations. Proposals like 2-4-8 would be taken into consideration, but would have to compete in the market on their own without the artificial advantage of the current barrier of inconvenience and technical ability (users having to mod their code to deviate from Core settings).

BU does not support bigger blocks, nor smaller blocks; it is rather a tool for consensus on blocksize to emerge in a more natural, market-driven way - free of market intervention as it were.

Adam, if you are confident that, for instance, 2-4-8 scaling is the best option and would be supported by the market, I think you should either support BU or support a Core BIP to make the blocksize settings configurable within the Core client.

Right now the leaders of the dominant Bitcoin implementation are for a low blocksize cap, but imagine if the situation reverses and big blockists are in control, to the consternation of many in the community. I think you would not want them locking down the settings. You might say, "You folks are doing fine otherwise, but you are off on the blocksize cap. Why try to play central planner? Please leave it up to the market if you are so sure the market will like your huge blocks. People will follow your recommendations if they like them anyway, so what are you worried about?"

If I were Core maintainer, I would do the same. Perhaps I would set a higher default, but I would not take the option away from the user. To do so risks sudden consensus shocks due to friction effects, risks my position being undermined silently, and most of all assumes I know better than everyone else. I might set it at 10MB. But I may be wrong; I'd rather trust in the market, because none of us knows better than a million people all with skin in the game.

As for how communication to settle on a Schelling consensus happens, besides the usual out-of-band communication that happens now in the debate, there is also interest in adding a tool within BU to efficiently communicate information about blocksize settings across the network, thereby facilitating an emergent consensus.
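As a toy illustration of how that kind of emergent convergence could work (the numbers and the update rule here are entirely hypothetical, not BU's actual mechanism): nodes that periodically shift their cap toward the most common setting they observe will cluster on a single Schelling point without any central coordinator.

```python
from collections import Counter
import random

random.seed(1)

# Hypothetical: 100 nodes start with scattered blocksize caps (in MB)
# and repeatedly adopt the most common cap they observe on the network.
caps = [random.choice([1, 2, 4, 8]) for _ in range(100)]

for _round in range(10):
    mode = Counter(caps).most_common(1)[0][0]
    # Each node independently moves to the observed majority setting
    # with some probability (modeling gradual, voluntary convergence).
    caps = [mode if random.random() < 0.5 else c for c in caps]

print(Counter(caps))  # settings cluster around a single Schelling point
```

Nobody decrees the final value; it falls out of each node's incentive to stay in consensus with the majority it can see.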

dynamic block-size game-theory

The game theory is the same as that arising in the choice of Core vs. XT vs. whatever (or among the BIPs by the miners and other stakeholders; if we look at the game theoretic considerations applying to the Core dev consensus process I'm sure you realize that problem is intractable). Miners and nodes have all the same choices now, except there is some additional friction introduced by Core's locking down of the blocksize settings, forcing miners and nodes to mod the Core code if they want to change them.

The question ought to be turned around: what are the game-theoretic considerations involved in having a monolithic reference client causing complicated issues of inertia, authority, and potential power grabs on top of the cleaner game theory? If tractability of a game theory analysis is the goal, surely BU is at least no more complicated than the situation under Core in the event of a hard fork.

How will the network not split into a myriad little shards without manual human coordination?

Ah. This is good. What I believe you are not noticing is that "manual human coordination" need not be top-down. Coordination can emerge, and it can be just as solid as any. Are you familiar with situations where it does? That would save a lot of ink.
42  Bitcoin / Bitcoin Discussion / Re: Bitcoin XT - Officially #REKT (also goes for BIP101 fraud) on: December 31, 2015, 02:19:52 AM
I'd say what almost everyone actually trusts is that various experts and people with a lot of money on the line have vetted the code.
43  Economy / Speculation / Re: 2016 will be a boring year for bitcoin on: December 30, 2015, 04:04:44 AM
Not only is OP reverse-jinxing 2016 (which is good), there's actually a mechanism behind that magical hex/charm:

Investors put millions of dollars into Bitcoin every single day. Investors looking to maximize their returns in the Bitcoin space have several options:

  • buy BTC
  • buy mining equipment
  • buy stake in a startup (venture capital)

For two straight years, those daily millions have been flowing into mining and venture capital, not the price. If mining is topping out (likely) and venture capital is topping out (apparent), where do those millions flow? I'll give you one guess.
44  Bitcoin / Development & Technical Discussion / Re: "Bitcoin Unlimited" is not convergent due to the "excessive block depth" idea on: December 24, 2015, 08:11:01 PM
My response from there:

My first impression is that your points are based on a misapprehension of how consensus on blocksize is intended to be reached in BU. I think you have seen N (the oversized-block acceptance depth) as the mechanism for reaching consensus,* when it's really more of a failsafe for miners who don't keep abreast of the situation on the network as well as they should if they want maximum profits.

Miners who are paying attention should have their cap set substantially above the normal maximum blocksize as seen on the network; I see N as a bit of a "gimme" to small block advocates who want to make some sacrifices to discourage large blocks, as N allows them to be a little bit riskier about it (they can safely set their max acceptance size M a bit lower) while still tracking consensus in most situations. However, I think you may have misinterpreted the point of this: N wasn't really necessary for BU to work. The original idea was to have blocksize be unlimited, then user-selectable blocksize was added as a concession to small block advocates who worry about big block attacks, and N was largely a way to make life a little easier for those small block advocates (BU engineers please correct me if I'm wrong).
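A minimal sketch of the acceptance rule as I understand it (M and N follow the names in this post; the code is my own hypothetical reconstruction, not BU source): an oversize block is treated as "excessive" rather than invalid, and the node follows the chain anyway once that block is buried N deep.

```python
def should_follow_chain(block_sizes_from_tip, M=1_000_000, N=4):
    """Hypothetical sketch of BU-style acceptance.

    block_sizes_from_tip: sizes (bytes) of blocks on a candidate
    chain, newest first. A block larger than the local limit M is
    'excessive' but not invalid: the node holds off while it is
    shallow, then follows the chain once it is buried N deep.
    """
    for depth, size in enumerate(block_sizes_from_tip):
        if size > M and depth < N:
            return False  # excessive block not yet buried deep enough
    return True  # no excessive blocks, or all buried >= N deep

# An oversize (2 MB) block right at the tip is not followed...
print(should_follow_chain([2_000_000, 900_000, 900_000]))   # False
# ...but once it is buried under N newer blocks the failsafe kicks in.
print(should_follow_chain([500_000]*4 + [2_000_000]))       # True
```

This is why N reads as a failsafe rather than the consensus mechanism itself: a miner tracking the network closely would simply set M above the prevailing blocksize and never trip the rule.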

Quote
However, this upgrade system is not convergent at all, there is no reason why this battle between the 2MB chain and 1MB chain should be resolved quickly. This could occur over many blocks (greater than 4).

Such a long-running duel assumes something near a 50/50 split of hash power between the chains, or else they would resolve more quickly, correct? It seems the situation with Core is the same if Core were to try to upgrade to 2MB blocks when miner support was near a 50/50 split on the issue. I know, they won't do this, instead waiting for "overwhelming consensus" like 95% or something, thus avoiding the terrible 50/50 outcome.
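To put a rough number on that intuition (a toy random-walk model with made-up parameters, not a claim about real network dynamics): treat each new block as extending the big-block chain with probability q, and measure how long the "battle" lasts before one chain leads decisively.

```python
import random

random.seed(42)

def blocks_until_resolved(q, lead_needed=6, trials=2000):
    """Toy model: each block extends the big-block chain with
    probability q, else the small-block chain. The duel is
    'resolved' when either chain leads by lead_needed blocks.
    Returns the average number of blocks until resolution."""
    total = 0
    for _ in range(trials):
        lead, n = 0, 0
        while abs(lead) < lead_needed:
            lead += 1 if random.random() < q else -1
            n += 1
        total += n
    return total / trials

print(blocks_until_resolved(0.5))   # near-even split: resolution drags on
print(blocks_until_resolved(0.7))   # 70/30 split: resolves much faster
```

Only a near-50/50 hashpower split produces a long-running duel; any significant imbalance resolves it in short order.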

However, here I think you are seeing the illusion that the tail is wagging the dog. You apparently view Core's policy of waiting for 95% consensus as the means by which a 50/50 split is avoided, as if miners are all a bunch of robots. Since miners aren't robots but people dead-set on maximizing their profits, they certainly would not mine or build on an over-limit block (blocksize > L) unless they believed that would be a profitable choice. That would only be a profitable choice if it were unlikely to result in their block being orphaned.

If a miner is rational, which is the governing assumption of Bitcoin in the first place, this profit/loss calculation will of course take into account the very same 95% that Core is looking for. But the miner is free from that market intervention** and can choose his own threshold percentage, balancing it against whatever other factors he deems relevant in his own individual situation - his own unique power costs and connectivity, and the transaction fees in the mempool at the time - to dynamically determine the point at which he himself has an expected positive return on mining a bigger block.

The first miner to mine a bigger block can be assumed to have performed such a calculation as a profit-maximizing agent, and then others may follow suit by raising their limits. Why would miners be monitoring the network that closely? Perhaps they aren't now, but they will have to in the future if they want to stay profitable, because their competitors will. They will do so when blockspace becomes limited enough and fees become high enough that their expected return on mining a block with some extra juicy fees overshadows the orphan probability based on various factors, including the XX% miner signaling.
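The profit/loss calculation being described can be sketched in a few lines (all numbers are hypothetical, and the orphan probabilities stand in for whatever estimate a real miner would form from network conditions and miner signaling):

```python
def ev_of_block(reward_btc, extra_fees_btc, orphan_prob):
    """Expected value (BTC) of mining a block that carries
    extra_fees_btc beyond a 'safe' baseline block, given the
    miner's own estimate of its orphan probability."""
    return (1 - orphan_prob) * (reward_btc + extra_fees_btc)

# Hypothetical scenario: 25 BTC subsidy; an oversize block adds
# 0.8 BTC in fees but (per this miner's estimate, informed partly
# by the share of hashpower signaling acceptance) risks a 4%
# orphan rate versus 0.5% for a safe block.
safe = ev_of_block(25, 0.0, 0.005)   # normal block, tiny orphan risk
big  = ev_of_block(25, 0.8, 0.04)    # oversize block, higher risk

print(f"safe block EV: {safe:.3f} BTC")
print(f"oversize EV:   {big:.3f} BTC")
print("mine the bigger block" if big > safe else "stay under the limit")
```

With these made-up numbers the extra fees don't yet cover the orphan risk, so the rational miner stays under the limit; raise the fees or lower the orphan estimate (say, as more hashpower signals acceptance) and the decision flips, with no 95% decree needed.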

Centrally planning the switchover at 95% is economically suboptimal and unnecessary. The Core devs imagine that without their paternalistic setting of that 95% threshold, miners would be unable to make the calculation for themselves - even though they are privy to the very same information, and in real time at that! And again, Core devs cannot know the profit/loss calculations that apply to each miner's individual situation. This is akin to Soviet-style price fixing and falls afoul of the Economic Calculation problem as explained by Ludwig von Mises, or the problems with the use of knowledge in society as explained by F.A. Hayek: no central planner can know all the individual valuations and tradeoffs each person in the economy will want to make.

The one-size-fits-all approach destroys the whole idea of the division of labor, the mainspring of human progress for the past 6000 years. Core's one-size-fits-all decrees are NOT where consensus comes from; consensus comes from miners not being idiots.*** The miners are the dog and the devs are the tail. Like a government that mandates seatbelts right as the auto industry is installing them on its own, the devs imagine that they - the wise overseers - are the source of safety. The tail imagines it wags the dog.

Everyone can see the decree and the effect, but few can see that the same thing - or probably something far less clunky - would have happened without the decree. If a miner has a positive expected return on an oversize block, he will mine it even if it does end up orphaned. Other miners can see this and adjust accordingly, especially once they start getting outcompeted by accepting fewer fees than they could be. Miners stay in consensus because they are econo-rational, not because of developer nannyism.

I would therefore expect blocksize to creep up very conservatively, and barring miner agreement on a flag-day upgrade - which is always a possibility even with BU, except that miners can arrange it without going through the Core devs - this will only happen when there are enough fees to warrant the orphan risk, as well as a high percentage of miners signalling support for a bigger blocksize. It would probably move up in some kind of Schelling-point increments like 2MB.

*unless you think BU is good and your point is simply that BU shouldn't have the "excessive block depth." That may indeed be and I would have to think about your scenarios more.

**which is, after all, nothing more than the inconvenience of adjusting that 95% threshold in the Core code oneself or finding someone to do it for you.

***Some might say that miners really are idiots, but this violates the basic assumption of Bitcoin, that miners are economically rational. Sure, some miners now might not have to be very smart about their operations, but in the future this will have to change. Right now being dumb in some ways as a miner doesn't make you economically irrational, but in the future where forking and emergent coordination must occur, it will. Miners that refuse to monitor the network and make prudent profit/loss calculations will be outcompeted by those who do.
45  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 19, 2015, 10:07:30 PM
Of course. Still with there being multiple choices it sounds like by "dictator" here we really mean "suggester."
Not entirely right. With that he would be implementing code which he wanted to (who knows what could come out of this).

By "implementing" I assume you mean suggesting. Or do I have this wrong and he is able to force downloads?
46  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 19, 2015, 06:30:18 PM
Of course. Still with there being multiple choices it sounds like by "dictator" here we really mean "suggester."
47  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 19, 2015, 06:03:21 PM
I'm not sure what "taking power" has to do with this. Are you talking about someone taking commit control of one of the implementations? (Wlad/Mike)
Well not exactly. Wlad has already commit control of Core so that's not right. What I was talking about is something that Mike seems to like, a 'benevolent dictator' (look up the terms Hearn and benevolent dictator in case you haven't read about it before) of Bitcoin. For example, Wladimir will not implement things that do not have a broad consensus and the developers have a right to veto. Taking power would mean to disregard this, and implement what you "feel" is right. Obviously this is bad on many levels.

I've heard Mike mention "benevolent dictator" of his own implementation, but I thought that was in the context of having multiple implementation options for the Bitcoin community to choose from.
48  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 19, 2015, 05:57:02 PM
If the nodes didn't agree on who owns which Bitcoin, then it's not even proper money, because there is no ledger.

If there are two different consensuses among "the nodes," then there are effectively two separate sets of nodes now, so which set of nodes would you be referring to by saying "the nodes"?
49  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 17, 2015, 07:47:14 PM
I'm not sure what "taking power" has to do with this. Are you talking about someone taking commit control of one of the implementations? (Wlad/Mike)
50  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 17, 2015, 06:54:34 AM
If I had to guess, I would say Core will up the limit early next year, probably modestly, in response to demand. If they stall too long, they risk losing some control over development.
51  Bitcoin / Bitcoin Discussion / Re: POLL: What is the reason hard forks require broad consensus? on: November 16, 2015, 10:09:09 PM
Food for thought... Why Meni Rosenfeld doesn't fear contentious hard forks:

http://fieryspinningsword.com/2015/08/25/how-i-learned-to-stop-worrying-and-love-the-fork/
52  Bitcoin / Bitcoin Discussion / POLL: What is the reason hard forks require broad consensus? on: November 16, 2015, 05:15:08 PM
"Hard forks require broad consensus" is a common sentiment among the Core developers and the wider community. I want to get a sense of whether this is justified more by fairness/image considerations or more by perceptions that contentious hard forks are technically unworkable.

*For example, Adam Back said on reddit recently, "the network can not reach consensus with multiple competing incompatible consensus algorithms vying for control."
53  Bitcoin / Bitcoin Discussion / Re: Bitcoin XT - Officially #REKT (also goes for BIP101 fraud) on: November 12, 2015, 08:51:43 PM
brg444,

Why would the market care yet? No one is complaining about high fees yet. No adoption is being hindered. Yet. The market is patient as long as it can be, and prefers the conservatism of sticking with Core until Core shows itself to be definitely unwilling to meet market demands. This may or may not happen, depending on what Core decides to do with the blocksize in the coming months. I'm guessing Core will increase the blocksize fairly soon, but if they don't: POW.

With the quote, I have no idea what it's getting at. Who are the "needers" and who are the "creators and owners" - needers, creators, and owners of what?

It's really very simple: investors can support whatever fork they like, and there will go the hashing power and eventually the development. Investors hold the keys to the castle. Always have, always will. It doesn't matter if they don't create or own anything. All that matters is they can move the price of CoreBTC or XTBTC when the time comes.
54  Bitcoin / Bitcoin Discussion / Re: Bitcoin XT - Officially #REKT (also goes for BIP101 fraud) on: November 12, 2015, 08:28:01 PM
...
This sort of argument boils down to an assumption that miners are not net economically-rational. If you think that, then you must think that bitcoin can *never* work. Miners being net-econo-rational actually *IS* probably a core axiom that must be accepted for Bitcoin to be viable. So arguments that reject that idea are worthless.
...

Wow, this sort of blew my mind.  Can we prove this somehow?  And if we did prove this, would it not logically follow that top-down planning à la Core Dev (e.g., using the block size as a policy tool rather than allowing it to emerge naturally) is at best redundant and at worst damaging?

Yes, basically miners serve as a proxy for investors (in fact miners are a kind of investor as well, but investors pushing up the price incentivize miners to mine). Since investors are who control Bitcoin, miners are also part of "who control Bitcoin" both by proxy and directly. If we cannot trust the market of investors or we cannot trust the market of miners, we cannot trust Bitcoin.

This should be no surprise, really, as Satoshi originally spoke of users voting with their CPU power (now read: hashing power) to choose which fork they like. Bitcoin was always an emergent phenomenon of the market, not a planned phenomenon of certain developers.

It's easy to confuse this, because certain economic parameters in Bitcoin were planned by Satoshi...but it was not the dev Satoshi who made them part of the World Wide Ledger. It was the market. The market just happened to like his parameters. The market has not expressed an opinion on blocksize because it has never had a chance nor a reason to.

Forks give it the chance, and full blocks will soon give it the reason. Combine the two and...

or maybe nothing. We will see. I happen to think the market will choose a substantial increase.
55  Economy / Speculation / Re: What happened to all the old timers? on: September 28, 2015, 12:31:42 PM
The "Gold collapsing. Bitcoin UP." folks migrated to the bitco.in forum.
56  Bitcoin / Bitcoin Discussion / Re: Centralization in Bitcoin: Nodes, Mining and Development on: August 23, 2015, 06:28:57 PM
The dev team is necessarily centralised. When centralised is the only ideology that works, I'm in favour.

Pick your centralising influence: amorphous Core team or Mike Hearn's self described "benevolent dictatorship". Benevolent in respect of who, exactly?

Can you explain why you believe a model with several competing implementations of the protocol wouldn't work?  In fact, with the emergence of BitcoinXT it looks like it is beginning to work.  The consensus critical code would of course need to be compatible between implementations; however, if a change was desired by the community (e.g., increasing the block size limit), each implementation could attempt to solve it "their own way" and then the community would decide the winner by switching to the implementation they favoured.  Then, to retain some portion of their previous user base, the losing implementations would adopt the same change to the consensus code to prevent their clients from forking off from the longest proof-of-work chain when the change goes into effect (at some future time similar to BIP101).

Is this a way to give the community more power in exercising their free choice?

Of course it is. Competing implementations is the only way to make it a market choice. In some years we will wonder why this was ever not obvious.
57  Economy / Speculation / Re: Gold collapsing. Bitcoin UP. on: August 14, 2015, 10:19:36 PM
ZB had brought up this point before and while it does sort of makes sense from an "antifragile" standpoint, I have a problem accepting it as a viable or desirable outcome.

On that point I believe this post from Alex Morcos is relevant:

Quote
What gives Bitcoin value aren't its technical merits but the fact that people believe in it.   The biggest risk here isn't that 20MB blocks will
be bad or that 1MB blocks will be bad, but that by forcing a hard fork that isn't nearly universally agreed upon, we will be damaging that belief.
  If I strongly believed some hard fork would be better for Bitcoin, say permanent inflation of 1% a year to fund mining, and I managed to convince 80% of users, miners, businesses and developers to go along with me, I would still vote against doing it.  Because that's not nearly universal agreement, and it changes what people chose to believe in without their consent. Forks should be hard, very hard.  And both sides should recognize that belief in the value of Bitcoin might be a fragile thing.
http://sourceforge.net/p/bitcoin/mailman/message/34092527/

While I do understand your point about improving on the ability of the users to come to consensus, it seems a stretch to me to suggest that regular changes to the way Bitcoin operates can strengthen the trust people have in it. Rather we should strive to come to a point where the consensus critical code in Bitcoin is set in stone for eternity as it becomes harder and harder for an ever-growing ecosystem to come to consensus on a proposed change.

For that reason, I am wary and quite frankly curious of recent attempts by Hearn in particular to lessen the impact and the dangers of hard forks.

You can't force a fork.
58  Economy / Speculation / Re: Gold collapsing. Bitcoin UP. on: August 14, 2015, 08:53:40 PM
https://www.reddit.com/r/Bitcoin/comments/3h01p2/how_is_the_bitcoin_community_supposed_to_build/

*poof* Top thread on the front page one moment, a few minutes later gone.

The subreddit has set a policy where not only is talking about alternative implementations banned, but apparently so is talking about the ban itself.

This reeks of desperation.
59  Other / Off-topic / Re: If someone figures out Strong AI, how do we keep humans safe? on: August 14, 2015, 01:40:06 PM
Box it. I'm not worried about AI itself, since Eliezer's quote is an anthropomorphism: an AI won't "want" to do anything unless it is programmed in such a way. The concern is about people running around with boxed AIs that can answer all their questions very accurately and thereby make them uber-powerful.

An AI won't be a super Watson. The idea behind an AI with human like intelligence is that it will be able to change itself. It will learn and rewrite its own program. Maybe it will be possible to give it some starting values and desires. Maybe not. But it will have its own desires eventually.

No, it won't. Desires are something humans have for evolutionary reasons. An AI is just a bunch of code. It can rewrite itself, but it can no more develop desires when it is loaded into a computer than it could if it were written down in book form. It will never "want" anything unless someone programs it to simulate such behavior, though as Cryddit points out there is a literal genie effect to worry about, where in attempting to execute an instruction it could wreak havoc on the real world.

That's why I say box it. By "box it" I mean never give it the ability to affect the real world other than by providing information. It will affect its own world, where it can learn mathematical properties and reason out things based on given physical constraints (3D shapes cannot pass through each other; given these shapes, what can be built?). To the extent it can remake the outside world in its sandbox, it can potentially provide us with a lot of useful information without being able to directly affect anything. It can probably tell people how to make a dangerous device very easily, or how to override their own conscience and go postal. So limiting its communication ability might also be good. Maybe at first only allow it to answer yes/no questions, with a literal yes or no.

You will say, "No, it's supremely intelligent and will find a way out of the box. It will outsmart us and figure out how to affect the real world." (Eliezer Yudkowsky's argument.) But this is akin to a category error. It assumes that faster processing power equates with God-like abilities, or even the ability to do the logically impossible. Can an AI tell you what the sum of Firlup + Waglesnorth is? Not if it doesn't know what you mean by those terms. Similarly, it is functionally impossible for an AI to even conceptualize the fact that it is, from our perspective, in a box. An impossible thought is not thinkable even for a super-intelligence. We cannot conceptualize an infinite set, and neither can any AI. (In case this seems untrue, Eliezer also believes infinite sets are nonsense, to his credit. Whatever your position on infinite sets, if we for now accept that the notion is nonsense, an AI would not be any better at conceptualizing infinite sets, only better at noticing the meaninglessness of the term.)

You will say, "No, Nick Bostrom's simulation argument demonstrates how humans can reason about our own world being a simulation, where we are effectively in a box, as it were." But it doesn't. Bostrom's argument is merely a confusion based on equivocation. An AI would see this, as many people have. For example, scroll down to Thabiguy's comments here. It is clear that there is no way to even conceptualize the notion of "we are living in a simulation" coherently. If you think you can visualize this, try drawing it. It's just high-falutin wordplay. If the notion is not even coherent, there is no way the further notion that "an AI would figure out it is living in a simulation" is coherent either. Hence it is of no concern.

AI != God. And heck, even a god could not do the semantically impossible like prove that 2+2=5, though it may be able to convince humans that it has done so.

With the right controls, AI can be boxed. Still, as I mentioned, the danger of the operator equipped with AI-furnished knowledge is a serious one.
60  Economy / Speculation / Re: Gold collapsing. Bitcoin UP. on: August 14, 2015, 12:48:28 PM
As sickpig suggested, it would be a recognition that the block size limit is not part of the consensus layer, but rather part of the transport layer.

i never did quite get this part.  can you explain?

Sure.  

Why do we have a consensus layer in the first place?  It is a way for us to agree on what transactions are valid and what transactions are invalid.  For example, we all agree that Alice shouldn't be able to move Bob's coins without a valid signature, and that Bob shouldn't be able to create coins out of thin air.  The consensus layer is about obvious stuff like that.  In order for Bitcoin to function as sound money, we need to agree on "black-or-white" rules like this that define which transactions are valid and which are invalid.

Notice that the paragraph above discusses valid and invalid transactions.  Nowhere did I say anything about blocks.  That's because we only really care about transactions in the first place!  In fact, how can a block be invalid just because it includes one too many valid transactions?

Satoshi added the 1 MB limit as an anti-spam measure to deal with certain limitations of Bitcoin's transport layer--not as a new rule for what constitutes a valid transaction.  We should thus think of every block that is exclusively composed of valid transactions as itself valid.  The size of the block alone should not make it invalid.  Instead, if a block is too big, think of it as likely to be orphaned (a "gray" rule) rather than as invalid (a black-or-white rule).  Perhaps above a certain block size, we're even 100% sure that a block will be orphaned; still we should view it as a valid block!  It will be orphaned because the transport layer was insufficient to transport it across the network--not because there was anything invalid about it.
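One way to express that black-or-white vs. gray distinction in code (a hypothetical sketch; the function names and the orphan-risk curve are mine, not from any client):

```python
def block_is_valid(transactions):
    """Black-or-white consensus rule: a block is valid iff every
    transaction in it is valid. Size is deliberately not checked."""
    return all(tx_is_valid(tx) for tx in transactions)

def orphan_risk(block_size_bytes, network_capacity_bytes=1_000_000):
    """Gray transport-layer rule: bigger blocks propagate more
    slowly, so orphan risk rises with size. This curve is a
    made-up stand-in for real propagation measurements."""
    return min(1.0, (block_size_bytes / network_capacity_bytes) ** 2 / 10)

def tx_is_valid(tx):
    # Placeholder: a real check verifies signatures, no coin
    # inflation, no double spends, etc.
    return tx.get("signed", False) and tx["out"] <= tx["in"]

txs = [{"signed": True, "in": 5, "out": 5}]
print(block_is_valid(txs))      # validity depends only on the txs
print(orphan_risk(500_000))     # small block: low orphan risk
print(orphan_risk(8_000_000))   # huge block: near-certain orphan
```

A block past some size is still valid here; it just has an orphan probability approaching 100%, which is exactly the "gray rule" framing above.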

Nice. This could be modified into a good reddit self-post that should generate a lot of thought.