Bitcoin Forum
May 15, 2024, 01:42:19 AM *
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 [20] 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 ... 212 »
381  Economy / Speculation / Re: Gold collapsing. Bitcoin UP. on: February 19, 2015, 06:21:15 AM
http://www.bloomberg.com/news/articles/2015-02-09/greek-investors-buying-more-gold-coins-from-u-k-s-royal-mint
382  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 18, 2015, 09:12:02 PM
We look to change the max block size for expediency's sake only (bigger, faster), not because any of us knows yet what it should be.

If we get a (TBD) criterion for changing max block size, that makes sense.  An inaccurate but definite change would be just fine with me.  Whatever arbitrary max block size is picked is ultimately going to be "wrong" in the same way that the one satoshi picked is "wrong" based on the (TBD) criterion.

So...  a new static limit of X MB would be inaccurate but not indefinite.  Still wrong, just less wrong, and so acceptable for expediency's sake.
383  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 18, 2015, 09:02:33 PM
I still would like to hear when you think such a change is necessary.
You don't seem to be against any blocksize change as long as it is not needed: how do you define 'needed' in this case?
In his view, there is no definition of needed in this case. (I hope that I'm not mistaken.)

A max block size is needed because of the asymmetric incentive structure: there is no increase in revenue for running a node with larger blocks, only increased costs.  Under free market conditions, the larger the block chain data set gets, the worse our node/miner ratio should be expected to get, and we should expect a declining number of nodes overall, even with increased mining.
This is a bad result for network resilience and performance.

But... that is just the benign bad effect.  One of the pernicious risks that would accumulate (with an indefinite and inaccurate max block size) is an increasingly easy attack on this pain point.  Eventually, someone who wishes bad things for Bitcoin could mine fully padded blocks of transactions from and to themselves, increasing the costs for everyone else at no marginal cost to the miner.

If the economic incentive structure were fully resolved (and I don't think that we know how to do that yet), there might be no need for a max block size!

The root of your question is another problem I've discussed elsewhere.
For as long as we are stuck with some hard-fork-based solution, we ought also to set forward a criterion for changing it.  Currently the decision rests with a small, central-banking-style group of deciders, which is going to be a source of criticism.
What is also needed is some mathematical method of determining when the limit should be raised (or lowered).  Getting this right would give us the "accurate" part of the solution of not being both inaccurate and indefinite.

So... assuming there is a problem, the current proposal is not the solution.
384  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 18, 2015, 06:49:47 PM
If you want to continue the conversation, please be very explicit about what problem you think needs solving, and how whatever solution you're proposing solves that problem.

Actually, that's something the one proposing to fork should answer. And yes, that's you.
So far I'm really not convinced it is necessary.

I'll assume for the sake of discussion that it is necessary (or ultimately will be).

What we need is a proposal that is either definite or accurate. 
We can't safely move to a guess with an infinite progression.  It is a great idea, but it is too risky.
385  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 18, 2015, 06:24:05 PM
This is where my conversation with Gavin fell apart.  He was not able to acknowledge the concept of a too-high limit.  His reasoning was that since the limit was only one-sided (blocks with size above it are prevented) that it couldn't be too high.

Huh what?
...

If you want to continue the conversation, please be very explicit about what problem you think needs solving, and how whatever solution you're proposing solves that problem.

As simply as I can put it:
The proposal is both inaccurate and indefinite; a proposal should not be both.  


Now a few words on why...
We do not yet have the mechanism for an accurate limit (one that conforms to need, such as those I and many others have proposed), so an indefinitely increasing limit should not be suggested.
It should not be done because it might be good enough for a while, and then suddenly not be good enough, and we won't be prepared for that (whether it is too high or too low at that point, we can't say now).  It introduces pernicious catastrophic failure potentials.


It may help to think about the US debt limit debates...
The US Congress keeps raising the debt limit, and Congress will keep doing it.  They won't let it get hit, or if they do, not for long.  There is a history of exponential debt growth.  So why not propose a limit that increases exponentially forever?  Wouldn't that make sense?  It would be practical.  There are arguments in favor of it.  What it would also do, however, is remove the incentive to get it right until it becomes VERY wrong.  It's just a limit; Congress doesn't have to spend it all.

We have a similar problem with the block chain data size.  It is a problem of the commons.  The storage and maintenance are done by many, but each individual block doesn't bear the cost.  There is a small marginal cost in orphan risk, but this risk fades over time as block rewards increasingly become fees rather than coinbase.  There is no economic incentive to run a node, only to mine, but the nodes are also necessary for the security, resilience, and speed of the network.
386  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 18, 2015, 12:11:33 AM
The ttl of a future block height was the method satoshi proposed years ago as part of increasing the maxblocksize

The WWSD argument has been refuted already.

IV. Satoshi himself envisioned much larger blocks.

The discussion of what "Satoshi himself" did or didn't do, meant or didn't mean, and so on and so forth is about as interesting as discussing the Mormon "bible".

This is called "arguing to authority", and it tries to give pecuniary value to that only truly worthless article of all times and places : the esteem of the mob. This may work well in electing United States presidents, ensuring that "while the voting public knows best what it wants, it deserves to get it long and hard". Bitcoin specifically and deliberately does not work in this way.

Bitcoin is not a reflection of your hopes and aspirations, but a check on them. Bitcoin isn't here to make it easier for you to do what you want to do ; Bitcoin is here to make it trivial for others to prevent you from doing what you want to do every time that's stupid. The sooner you comprehend this fundamental difference between Bitcoin and "technology" especially in the "revolutionary & innovative" subsense of that nonsense, the better, for you.

Satoshi probably doesn't even know what's in his code, let alone how to improve it.

Sure... except this was not an appeal to an authority.

In case the point of this comment was lost to you, it was this:
The part of the conversation that was interesting to sed (what was referred to as the ttl) was not novel to the #bitcoin-assets discussion but was discussed in 2010 when the maxblocksize was initially created.

But since you brought it up, why are you attacking someone here who isn't going to be able to respond (satoshi)?  What satoshi would or wouldn't do, or what he does or does not understand, is not part of this discussion. 
387  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 11:01:46 PM
I thought that the most interesting idea in that log was the one about implementing the fork with a ttl of 2 years.  By the time those two years have passed, everyone will have forgotten that this was such a big issue.

The ttl of a future block height was the method satoshi proposed years ago as part of increasing the maxblocksize:

It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.

When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.
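Satoshi's fragment quoted above can be turned into a small runnable sketch.  The 20 MB figure, the constant names, and the function are illustrative assumptions for this discussion, not actual Bitcoin Core code (which is C++):

```python
# A runnable sketch of satoshi's phase-in fragment quoted above.
# The larger limit (20 MB here) is the figure under debate in this
# thread; the cutover height is the one from satoshi's example.
OLD_LIMIT = 1_000_000      # 1 MB, the historical limit
LARGER_LIMIT = 20_000_000  # e.g. the proposed 20 MB (an assumption)
CUTOVER_HEIGHT = 115_000   # the block number from satoshi's example

def max_block_size(block_number: int) -> int:
    """Old limit up to the cutover height, larger limit after it."""
    return LARGER_LIMIT if block_number > CUTOVER_HEIGHT else OLD_LIMIT

print(max_block_size(115_000))  # 1000000: still the old limit
print(max_block_size(115_001))  # 20000000: phased-in larger limit
```

Because the rule only consults the block height, every upgraded node flips to the new limit at the same block, which is the point of the phase-in.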

388  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 10:31:27 PM
Consensus is the result of quality, not the other way 'round.
389  Bitcoin / Development & Technical Discussion / Re: How to Create a Bitcoin Address from a Coin Flip on: February 17, 2015, 10:22:03 PM
The only thing I claim is safe is: 1) it's done offline, 2) it's done randomly, and 3) no one can know the method of creation.  I will stick by that maxim. < this is the essence of the thread

I don't see the need for (3).  Indeed, if (3) is at all useful to your security then I'd claim that you're not introducing enough entropy at step (2) and are being forced to rely on the extra entropy of your method being one among many plausible alternatives.

Certainly, 256 coin flips provides sufficient entropy.  I believe 128 coin-flips is enough for critical cold storage even with the method known but I'm not a cryptographer.

While this is true, (3) may be important in case (2) is not perfectly knowable.

If I know all of the circumstances surrounding your coin flips (from even a little bit, up to even the extreme of covert surveillance of your flipping), then (3) would have been helpful to you.  The less others know of your method, the more of your secrets are secret.

Maybe you have your phone with you, and I can turn your phone's mic or camera on remotely.  Maybe I can hear whether you are writing an H or a T, or a 1 or a 0, by the noise you make while doing it?  The more I know of your process, the worse it is for you.
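For concreteness, here is one way the 256-flip method could be sketched.  The helper name and the H/T encoding are hypothetical; a real key would also need to be range-checked against the secp256k1 group order before use:

```python
# Illustrative sketch: map 256 recorded coin flips ('H'/'T') to a
# 256-bit private-key candidate as a 64-character hex string.
# Assumption: heads = 1, tails = 0, most significant bit first.
import secrets

def flips_to_key(flips: str) -> str:
    """Convert a string of exactly 256 'H'/'T' flips to hex."""
    if len(flips) != 256 or set(flips) - {"H", "T"}:
        raise ValueError("need exactly 256 flips, each 'H' or 'T'")
    bits = "".join("1" if f == "H" else "0" for f in flips)
    return format(int(bits, 2), "064x")

# Demo with simulated flips (real use: physical coins, offline):
demo = "".join(secrets.choice("HT") for _ in range(256))
key = flips_to_key(demo)
assert len(key) == 64
```

Note that the security rests entirely on the physical flips; the conversion step adds no entropy, which is why point (2) above does the heavy lifting.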
390  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 10:05:09 PM

It took place in #bitcoin-assets, and he relented here.

Thanks, I should have known there would be IRC logs with the actual discussions.  Cheers!

Reading IRC logs makes my brain bleed, but I went and read it because of the extraordinary claim.  As usual for IRC, the  "discussion" amounted to abuse and posturing.  And as usual for extraordinary claims, the evidence does not support it. If you think that Gavin committed to making or not making a hard fork in that log, you have wilfully misinterpreted it.



He relented by just joining. The fact that the great asciilifeform was able to !up him is proof of our victory. The concession is this: you need MP's permission in order to do anything in bitcoin. Gavin admitted this by showing up. Do you see MP on the forum? Nope, didn't think so. The funniest part is that Gavin waited until MP was out to lunch before showing his sorry face.

Isn't this a bit silly, unless your goal is being ignored?   (If communicating with you is equivalent to agreeing with you and also submitting to you, the only other choice is to ignore you.)   I like #bitcoin-assets also.  Some very interesting things happen there precisely because not everyone is thinking the same thing.  Why encourage groupthink?

It is to Gavin's credit that he is seeking diverse inputs and points of view, and to yours for engaging in that discussion.
391  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 08:58:43 PM
This is where my conversation with Gavin fell apart.  He was not able to acknowledge the concept of a too-high limit.  His reasoning was that since the limit was only one-sided (blocks with size above it are prevented) that it couldn't be too high.

Huh what?

I am not proposing infinitely sized blocks, so I obviously acknowledge the concept of a too-high limit as being plausible.

If you want to continue the conversation, please be very explicit about what problem you think needs solving, and how whatever solution you're proposing solves that problem.

We might agree or disagree on both of those points, but we won't have a productive conversation if you can't say what problem you are trying to solve.

To summarize my position: I see one big problem that needs solving:

Supporting lots (millions, eventually billions) of people transacting in Bitcoin.
  Ideally at as low a cost as possible, as secure as possible, and in the most decentralized and censorship-resistant way possible.

It is hard to get consensus on HOW to solve that problem, because no solution is obviously lowest cost, most secure, and most decentralized all at the same time, and different people assign different weights to the importance of those three things.

My bias is to "get big fast" -- I think the only way Bitcoin thrives is for lots of people to use it and to be happy using it. If it is a tiny little niche thing then it is much easier for politicians or banks to smother it, paint it as "criminal money", etc. They probably can't kill it, but they sure could make life miserable enough to slow down adoption by a decade or three.

"Get big fast" has been the strategy for a few years now, ever since the project became too famous to fly under the radar of regulators or the mainstream press.

The simplest path to "get big fast" is allowing the chain to grow. All the other solutions take longer or compromise decentralization (e.g. off-chain transactions require one or more semi-trusted entities to validate those off-chain transactions). I'm listening very carefully to anybody who argues that a bigger chain will compromise security, and those concerns are why I am NOT proposing an infinite maximum block size.

There is rough consensus that the max block size must increase. I don't think there is consensus yet on exactly HOW or WHEN.


Thank you for this additional insight into your thinking on the matter.
My impression was that you were not proposing "no block size limit" only because that would not be likely to attract the preponderance of consensus support, and that you were instead proposing a limit as high as you thought you could get agreement upon.  I like the goal, I hate the method.

We (you and I and whomever else agrees) share the bolded goal you articulated above.  Where we diverge may be on the value of the "get big fast" bias and maybe the ordering of the criteria in the bolded goals.  Though I would be delighted to see "big fast" happen, if it is done at the cost of security, decentralization, censorship resistance, or transaction/network cost (which is also a type of censorship), then the cost of growth is too high.

My view is that "mass adoption" isn't a short-term goal but an eventual fait accompli.  Getting the max block size out of the way as an impediment would be greatly beneficial.  

I like JustusRanvier's thought work on the matter, but as a proposal it is pretty far from where we are today and likely unreachable with Bitcoin internally.  If miners are induced to 'tip out' to nodes somehow, it may happen external to the protocol through groups like chain.com or apicoin.

Where the current proposals fail in my humble opinion is the risk/reward.

To go from a static block size limit to one over an order of magnitude larger, while adding a dynamic limit (with exponential growth), has risks to node centralization.  It invites spam transactions, which create costs for node maintenance in perpetuity.

Bitcoin has fewer than 10K nodes operating; if node running were made 20x more expensive (in network and verification/storage/search), there are going to be fewer nodes, not more.  This is a loss in security, decentralization, and cost (and where it loses decentralization and cost, it also creates censorship risks).  So the cost-of-risk for the growth is too high.  There is also the risk that it may be too low a limit.  Maybe we get some event in the world that brings people to Bitcoin in droves.  Either too high or too low, and we get a crisis-induced change and are back to where we are today.

Other advancements may mitigate this somewhat.  Fractional/shard nodes may help long term by making the cost more granular (what have we decided to call these?), but in the short term the total node count may fall as some currently full nodes become shards.  

Where does that leave me?
I'd support lower risk proposals:
1) A static but higher max block size.  This just kicks the can down the road until we can achieve...
2) A dynamic max block size that increases or decreases based on the needs of people to use Bitcoin, not on a current best guess of what the distant future may look like (as good as that guess may be).  And ultimately...
3) No block size limit, because the economics are in place to make it unnecessary

Lots of us have had ideas on how to get to (2), but the pieces are not all in place yet.  What we really need is a strategy to get from where we are now to (2).  This is the discussion I would like to see happening, and to assist with to the extent possible: a proposal to get the Bitcoin protocol through the next handful of decades, rather than the Bitcoin software through the next handful of years.  (3) may be a problem for the next generation to handle; I'd like to give them the chance to get there.
392  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 02:05:21 PM
I'm still holding hope for a better proposal that isn't too complex.
A dynamic proposal that allows the "anti-spam" limit to increase and decrease would give us a backstop against hitting a crisis while we continue to search for other solutions.

This is where my conversation with Gavin fell apart.  He was not able to acknowledge the concept of a too-high limit.  His reasoning was that since the limit was only one-sided (blocks with size above it are prevented) that it couldn't be too high.

You, I, and others can see the fallacy of this in a moment, but he seems willing to ignore it.  I have good confidence in his programming.  I like the way he codes; it is tight and clean.  He is a great software engineer, and he has tested the software so that it works with large blocks.
As a protocol engineer he may be able to improve.  He has set himself up as the curator of ideas, deciding which have merit and which do not.  He was wise enough to revise this aspect of his proposal for the protocol once already, so he can hear some criticisms and respond.
This is why I hold some hope for him yet and have not given up on him.
393  Bitcoin / Development & Technical Discussion / Re: How to Create a Bitcoin Address from a Coin Flip on: February 17, 2015, 08:53:18 AM
1. I agree that my measly .05 BTC of bait is not a real enticement.  But it would reveal whether someone had maliciously swiped the code and was using the site to swipe private keys, no matter the balance.
2. I appreciate gmaxwell's point and I added warnings based on his point.  This thread is, however, about creating the private key with a coin flip, not a web site.  Using a site like random.org is ancillary to the thread and I marked it "educational only."
3. The only thing I claim is safe is: 1) it's done offline, 2) it's done randomly, and 3) no one can know the method of creation.  I will stick by that maxim. < this is the essence of the thread
Agree with all of this :) and, with the recent revelations:
http://www.cbc.ca/news/technology/nsa-hid-spying-software-in-hard-drive-firmware-report-says-1.2959252
the method is very attractive; still, I'd stick with the 64 hex dice rolls over the 256 coin flips.
394  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 08:12:24 AM
The difference is most everything.  Consensus is agreement.  Usually there isn't any debating at that point.
If it comes to a vote, or a battle, then it isn't a consensus.

Consensus is preferred, but it is already clear that certain individuals are unwilling to compromise no matter what, so ultimately it will
never be reached across the whole bitcoin community. Ultimately, 95% consensus will be reached to make this happen. How quickly the remaining 5% upgrade when the hardfork initiates will also be interesting to watch. My guess is at least 1% will fail to upgrade due to being stubborn or oblivious.

It would be wise for those being left behind to start getting ready to release their own hard fork, to address the serious concerns that arise when they control less than 5% of hashing power overnight on a forked coin. They should create their own GitHub and start marketing it to users and miners to solve this serious problem, which will make their currency unusable overnight unless they adjust the difficulty retarget manually and possibly implement more checkpoints.

It will be interesting to see if their coin survives, and for how long. Most miners and pools would be unlikely to attack their coin, but with such a small hash rate one group may attempt a 51% attack out of spite, curiosity, or for profit.

Bitcoin has always remained interesting, to say the least. :)

It is entirely possible that there is more than one consensus, and that they are not in agreement with each other.  There is one consensus per chain.  All on that chain are in agreement that it is the longest for that chain.

So far the 'best' solution is still a pretty bad one.  It's more of a software fix than a protocol fix.  If enough adoption occurs, and transaction volumes continue to grow, such that lots of people think we HAVE to DO SOMETHING or else really bad things happen, and the crisis we manufacture for ourselves convinces us that a bad solution is better than those really bad things...  then we get forked.

I'm still holding hope for a better proposal that isn't too complex.
395  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 12:03:22 AM
What is the difference between debate and consensus?

Quote from: Ideal Money
Illustrating the principle of these optional choices, the people of Sweden recently had the opportunity of voting in a referendum on whether or not Sweden should join the Eurocurrency bloc and replace the kronor by the euro and thus use the same currency as Finland. The people voted against that, for various reasons. But it cannot be irrelevant whether or not the future quality of a money is really assured or whether instead that it depends on the shifting sands of political decisions or the possibly arbitrary actions of a bureaucracy of officials.

The difference is most everything.  Consensus is agreement.  Usually there isn't any debating at that point.
If it comes to a vote, or a battle, then it isn't a consensus.
396  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 16, 2015, 07:02:22 PM
We're seeing spikes to 1MB now, even without driving events we can point at. So, at the moment when 1MB definitively isn't enough, I fear that 5MB won't be enough either.  Transaction volume in the real world is incredibly spiky in response to such events, and Bitcoin is starting to have closer and closer interactions with the 'real world.'

So in your formulation, a single block is a meaningful spike?

If so, there isn't a max block size that can accommodate it.
397  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 16, 2015, 06:59:34 PM
This security risk is balanced by the correspondingly increased number of nodes, given an equivalent Bitcoin Network Cost, where:
Bitcoin Network Cost = Data Size * Decentralization.

You can't arrive at meaningful conclusions if you base your reasoning on undemonstrated equalities.

gmaxwell put out a recent and good discussion on this, in regards to this topic here:
https://bitcointalk.org/index.php?topic=946236.msg10369307#msg10369307

Though he calls it the "inherent cost of transactions", this is the Bitcoin network's fundamental cost and value added, for if there are no transactions ever again, there is no more meaning to Bitcoin.

What sort of demonstration are you seeking?  Why is this in doubt?
398  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 16, 2015, 05:56:17 PM
There is no reduction in security.

It becomes much easier to pretend some blocks don't exist to selected targets, and get them to accept selectively rewritten history for just long enough to rip them off.

This security risk is balanced by the correspondingly increased number of nodes, given an equivalent Bitcoin Network Cost, where:
Bitcoin Network Cost = Data Size * Decentralization.

So the net result is no decrease in security.

We can expect a marginal increase in security as the node maintenance cost becomes less 'chunky'.
399  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 16, 2015, 05:29:53 PM
Pruning should arrive in Bitcoin Core 0.11, allowing people to run a full node with only 1 GB of storage space.

Please answer this simple question: "how does one bootstrap a full node once everybody's pruning?".
Suppose every node prunes 99% of the blockchain, and keeps a random 1% of the blocks.

A new node bootstraps by contacting 100 random nodes.
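Under those stated assumptions (each peer keeps a random 1% of blocks; a new node contacts 100 peers), the per-block coverage can be estimated directly.  This is purely illustrative arithmetic, not a claim about any actual implementation:

```python
# If each peer keeps a random fraction of blocks and a new node
# contacts n peers, the chance that a specific block is held by at
# least one contacted peer is 1 - (1 - keep_fraction)^n.
keep_fraction = 0.01  # each peer keeps a random 1% of blocks

def coverage(n_peers: int) -> float:
    """Probability one given block is available from n random peers."""
    return 1.0 - (1.0 - keep_fraction) ** n_peers

print(round(coverage(100), 3))   # ~0.634 per block: 100 peers leaves
                                 # many blocks unreachable
print(round(coverage(1000), 3))  # ~1.0: far more peers (or a larger
                                 # kept fraction) would be needed
```

Since ~63% coverage is per block, the chance of retrieving the *entire* chain from only 100 such peers is essentially zero, which is why more structured schemes like the hybrid DHT below get proposed.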

More formally constructed, the optimal balance is a hybrid DHT.  All nodes contain full blockheaders, which makes attempting to rewrite history impossible.  The headers are too important to store in a DHT, but the transactions can be.  All nodes don't need copies of all SPENT transactions, just full headers and the UTXO set.  Of course, users also have a vested interest in ensuring a copy of their txns remains, so they would probably store that locally.

A future bitcoin node could hold only the following:
Full blockheaders (~4MB per year)
Full blocks for the most recent x blocks (can improve efficiency in the event of reorg)
Current UTXO set
Users own transaction history
Some user defined portion of the full transaction set (could be 0% to 100%)

This presents a drastic improvement in the bitcoin network cost and decentralization if:
Bitcoin Network Cost = Data Size * Decentralization
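The "~4MB per year" figure for full blockheaders checks out as a back-of-the-envelope calculation, assuming the standard 80-byte header and one block every 10 minutes on average:

```python
# Sanity check on "full blockheaders (~4MB per year)".
# Assumptions: 80-byte block headers, 10-minute average block spacing.
HEADER_BYTES = 80
BLOCKS_PER_YEAR = 365 * 24 * 6  # 52,560 blocks at 10-minute spacing

annual_header_bytes = HEADER_BYTES * BLOCKS_PER_YEAR
print(annual_header_bytes / 1e6)  # ~4.2 MB/year, matching the estimate
```

Header storage therefore grows linearly and slowly regardless of the max block size, which is what makes "all nodes keep full headers" a cheap anchor for the hybrid scheme.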

400  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 16, 2015, 04:30:28 PM
...

We agree on the problem; we disagree on the solution.  Treating it as a 'production server' replacement would be the wrong approach.
How many times do you want to replace this 'production server' for the same reason?  
If we are going to a dynamic limit, it should be one that isn't going to need to change later, that can be assured to be fit for purpose, and that doesn't open up new vulnerabilities.

The problem with that... it isn't simple.

If the limit were, say, 10x the average size of the last 1000 blocks, it would still provide the anti-spam protection and keep the node distribution from getting too centralized.

Not necessarily. :) My only point is that a fixed 1MB limit is a bad idea long term. So increasing it makes a lot of sense.

What is the formula for the sweet spot? Beats me. :) We have a year or two to figure that out.

The thing is, I don't think there will ever be mass adoption of the system as it exists today. The 'average Joe' will be using some payment processor and not the blockchain directly. So the blockchain will have transactions from payment processors and early adopters/enthusiasts. The rest will be in closed systems and/or sidechains that can solve a lot of the volume issues.

But still, 1 MB is not enough.

We're probably in agreement on this, then.  I don't like the 'exponential best guess' approach.  I'd favor either a new static limit, maybe 8MB, to give a bit more time for a real solution, or... a real solution.

A real solution would be a limit that right-sizes as blocks are added, one that both prevents abuse and allows for transaction growth, so that we don't get queued transactions that are chipping in reasonable fees, and that accounts for coin-days destroyed.
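A toy version of such a right-sizing limit, along the lines of the "10x the average of the last 1000 blocks" idea mentioned earlier in the thread, might look like this.  The window, multiplier, and function names are hypothetical parameters, not an actual consensus rule:

```python
# Sketch of a dynamic limit: cap each new block at 10x the average
# size of the previous 1000 blocks. Purely illustrative; a real rule
# would need careful handling of manipulation and edge cases.
from collections import deque

WINDOW = 1000     # number of trailing blocks to average over
MULTIPLIER = 10   # headroom above the recent average

def max_block_size(recent_sizes: deque) -> int:
    """Return the current limit given recent block sizes in bytes."""
    if not recent_sizes:
        return 1_000_000  # fall back to the historical 1 MB limit
    return MULTIPLIER * sum(recent_sizes) // len(recent_sizes)

# Example: if the last 1000 blocks averaged 300 KB, the limit is 3 MB.
sizes = deque([300_000] * WINDOW, maxlen=WINDOW)
print(max_block_size(sizes))  # 3000000
```

A scheme like this rises with genuine demand and falls back when demand recedes, which is the "increases or decreases based on the needs of people to use Bitcoin" property argued for above; the open problem is choosing the window and multiplier so that miners cannot cheaply game the average.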

We aren't anywhere close to mass adoption.