Bitcoin Forum
May 13, 2024, 08:53:48 PM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
361  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 24, 2015, 05:26:10 AM
I'm with you there.  It also makes the attack orders of magnitude less costly though (because the cost is primarily the orphan cost).

Not really.  IBLT only produces O(1) propagation times if all nodes already know about the transactions in question.  If an attacker colludes with a miner to fill giant blocks with spam, the miner loses that O(1) propagation time.  So the attacking miner's orphan rate skyrockets at a time when honest miners see their orphan rates fall.  Since the true cost of orphans is the relative difference, this would make those types of attacks far more expensive.

With IBLT the upper safe block size becomes limited by the internode bandwidth.  A miner can't necessarily know the txn set in the memory pool of all nodes, but he can guess.  If txn volume exceeds full node resources, nodes will need to drop some transactions.  They will find it most efficient to drop those transactions least likely to be included in a block.  That means sorting txns by fees and priority, just like miners do.  So miners will want to pick a subset of those sorted transactions which meets the bandwidth requirements of most nodes.  It doesn't need to be perfect, but there is an inflection point below which orphan rate is essentially flat relative to size and above which it increases linearly.  The miner may not know the exact inflection point, but being conservative (smaller block) effectively costs nothing, at least while the subsidy is high, while being too aggressive can be very expensive.  Since the fees paid are unlikely to compensate for a linear growth in orphan probability, there is no economic incentive to mine above that point.  Uncertainty combined with a non-linear risk/reward means miners will probably underestimate rather than overestimate to compensate for the uncertainty; otherwise a miner is simply working harder and taking more risk for less net income.
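The inflection-point argument above can be sketched numerically.  This is a toy model with made-up numbers — the subsidy, orphan slope, fee rate, and inflection size below are illustrative assumptions, not measured values — but it shows why expected revenue peaks at the inflection point rather than above it:

```python
# Toy model of the inflection-point argument: orphan probability is roughly
# flat below the bandwidth-limited block size and grows linearly above it,
# so marginal fees rarely justify larger blocks.  All constants here are
# illustrative assumptions, not measured network values.

SUBSIDY = 25.0            # block reward (BTC), era of this post
BASE_ORPHAN = 0.01        # assumed baseline orphan probability
SLOPE = 0.02              # assumed added orphan probability per MB above inflection
INFLECTION_MB = 5.0       # assumed safe size given typical node bandwidth
FEE_PER_MB = 0.05         # assumed fees collected per MB of transactions

def expected_revenue(size_mb: float) -> float:
    """Expected miner revenue for a block of the given size."""
    orphan_p = BASE_ORPHAN + SLOPE * max(0.0, size_mb - INFLECTION_MB)
    reward = SUBSIDY + FEE_PER_MB * size_mb
    return (1.0 - min(orphan_p, 1.0)) * reward

# Revenue peaks at the inflection point, not above it.
best = max((expected_revenue(s / 10), s / 10) for s in range(1, 201))
print(f"best size ~ {best[1]:.1f} MB, expected revenue {best[0]:.3f} BTC")
```

Under these assumed numbers the optimum sits exactly at the inflection point, and uncertainty about where that point is pushes a rational miner to undershoot it.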

In the mentioned attack, the preponderance of miners can be so conscripted.  The smaller pools will disappear.
362  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 24, 2015, 04:13:36 AM
Don't forget that IBLT (set reconciliation) will reduce the size of new block announcements by about two orders of magnitude. So 20MB blocks could take 200KB (although in practice the minimum IBLT might be a bit larger).  Only node bootstrapping and resync will need full size blocks sent on the network. A lot of dev work is going on so maybe this will happen soon after Gavin's v4 blocks. The relay service also reduces block size (by about 80-90%, I think I remember reading), and this is already live.

The efficiency of this will improve over time, and it will also hugely reduce one class of spamming, where a miner keeps a lot of spam transactions secret and eventually blasts out a large solved block full of them.

I'm with you there.  It also makes the attack orders of magnitude less costly though (because the cost is primarily the orphan cost).
The biggest unknown in all this guessing is probably the rate of growth of Bitcoin.
363  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 24, 2015, 12:49:44 AM
With all due respect, contrast this limit (or any limit) with unlimited.

Nobody (well nobody with influence to make it happen) has proposed unlimited blocks.  20MB is just as finite as 1MB and so is 1GB.

Quote
It does indeed prevent spam attacks.
No, it limits the severity of the damage that such an attack can cause.  It provides an upper bound, not a prevention.  Many corporate wire accounts have an upper limit on the value of wires that can be sent in one day.  That doesn't prevent a fraudulent wire transfer, but it does cap the most the company could lose to such fraud.  The bean counters at a company will impose a limit that balances the needs of the company against the loss the company could face.  If the upper bound of the loss is less than what would cripple the company, then no single compromise could bring down the company.  The block limit is the same thing.  It is asking "worst case scenario, how much damage are we talking about?"  How much is a good question to ask, but you have to be asking the right question.  The question of what limit prevents spam is the wrong question, because the limit doesn't prevent spam.
We disagree and here's where.
It is the difference between preventing spam and preventing a spam attack.

Take your example of wire transfers.  If initiating a wire transfer cost less to the initiator than it does in process-costs to the institution effecting it, and all of the counter-party institutions had to put forth some effort for no benefit and only cost, it would be similar.  A competing bank could then flood another bank's wire transfer process with a very high load of transfers, driving up costs for all of its competition at only a tiny cost to itself.  This is what miners can do.  The fees they pay to themselves cost them nothing, but EVERYONE has to store those transactions FOREVER.

The limit does nothing to prevent spam.  Having a limit prevents unbounded block space from being used for attacks on Bitcoin.  You may think of it as an economic game-theory problem.  If this were not a problem, then we could have unlimited blocks.  In my nomenclature, that is a phase 3 solution.

Quote
and the proposal is for 16x the current risk and x16000 over 20 years.
In nominal terms, yes, but the cost per unit of bandwidth (and CPU time, memory, and storage as well) falls over time.  Even 1MB blocks would have been massive 20 years ago, when the most common form of connectivity was a 56K modem.

So the problem could be expressed as both a short term and a longer term problem.  What is the maximum block size that could be created today without exceeding the bandwidth resources of a well connected node?  If it is x, and bandwidth availability per unit of cost increases by y per year, then in n years a block size of x*(1+y)^n presents no more of a burden than a block size of x today.

For the record I think Gavin's proposal is too aggressive.  It uses Moore's law, but bandwidth has trailed Moore's law.  A 20% YOY increase more closely resembles bandwidth availability over the last 20 years.  Also, 20MB as "x" is pretty aggressive.  So something like 11MB * 1.2^n gets us to the same place (~16GB) in roughly 40 years instead of 20, and with a higher confidence that bandwidth requirements will grow slower than bandwidth availability.  Still, I got a little off track: no matter what limit is adopted, it doesn't prevent spam.  Economics and relaying rules prevent spam.
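The x*(1+y)^n projection above can be computed directly.  The 11MB starting point and 20% growth rate are this post's own figures, not measured constants:

```python
# Sketch of the conservative growth schedule discussed above: a block size
# limit starting at x MB and growing y per year reaches x*(1+y)^n after
# n years.  The 11 MB and 20% YOY figures are the post's assumptions.

def projected_limit_mb(x_mb: float, y: float, n: int) -> float:
    """Block size limit after n years, starting at x_mb with growth rate y."""
    return x_mb * (1.0 + y) ** n

# 11 MB at 20% YOY lands in the ~16 GB ballpark after roughly 40 years.
for years in (20, 30, 40):
    gb = projected_limit_mb(11, 0.20, years) / 1024
    print(f"{years} years: ~{gb:.1f} GB")
```

Solving 11MB * 1.2^n = 16GB for n gives a horizon of about 40 years, versus the 20 years of the more aggressive doubling schedule.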

Yeah, I remember when 1200 baud was a good thing.  Gavin started with Moore's law but has now hewed closer to Nielsen's law (which exhibits the slower growth you describe).

From my reading of him, he expects to rely on miners continuing to use lower limits.  I do not trust this reliance so much.  There are very real threat models once you start to consider the self-interests of both Bitcoin economic interests and non-Bitcoin economic interests.  If you think "it's not possible because of logistics/economics/practicalities...", read just a few more paragraphs.

I don't generally like discussing threat vectors in public forums.  Bad people sometimes get ideas and do stupid things.  CitizenFour let us know that there aren't really any private forums anyhow so...
One of the other effects that doesn't get discussed is that this could very viably open up new service models for miners.  They may offer some sort of bulk package where a transaction initiator pays for unlimited transactions and fills every block a miner can win.  The marginal cost to the miner is the orphan cost... but if this were also done with all the other miners, that marginal cost is quite low.  In this way all the block space could be bought for a de minimis amount of fiat, and it would be in each miner's interest to do so.

So if you imagine that there may be some fiat-based entity that feels an existential threat looming from Bitcoin and would just love to strangle it in its cradle...  Bad people could do bad things with a too-large limit.  This is just a small sample of what an evil mind may contemplate.  This is something that should have careful handling.  Queueing up some transactions and filling some blocks with some transaction bidding is also quite bad, but not the end of the world.  

FWIW: I don't think developers are slacking, but I do consider there are vast competing initiatives, and furthermore laziness is a sort of programming virtue.  Getting computers to do things with the minimum code is a very good thing.  This is a realm where Gavin is especially good.  His code is thin and tight.
364  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 09:44:56 PM

Development says: "This problem...its a biggie, give us 20 years"


With my developer hat on, if I were Gavin I might ask for this also.  A good engineer can figure out what it would take to solve a problem.  A great engineer with some experience usually multiplies this by 3x before going to management and asking for what is needed.

Gavin may be thinking there's at least 6-7 years of development in this thing and we've gotten very little done.  Maybe if we start now we can lick it in 6 or 7 years, so let's ask for 20+.

Good management knows this, and knows that developers get sucked in to everything unless there is some pressure, so they ask the unreasonable.  Sometimes they get it, sometimes they just end up with frustrated developers.

But since we know that eventually we will be back to this same negotiation... It is better to have shorter intervals between checking in on this than 20 years (which is essentially passing it on to your successor).
365  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 09:30:41 PM
Think of it yet another way.

Development and Management go into a negotiation.  Management assumes the risk of development failing.  Management wants a Bitcoin that helps people, does what is needed, and continues to exist.  So: secure it and make it useful.  Do so in a way that doesn't have to be fixed over and over forever.

Development says: "This problem...its a biggie, give us 20 years"

What development is asking for is some breathing room so that they can kick the can down the road on the real work that needs to be done to get a self-managing model in place that can dispense with the need for futzing with the limit, ever.  It is a pretty big thing to ask.  Basically it is saying "trust us, we will work on this over the next 20 years even though we know we won't be held accountable until that time is up or unless something bad happens from us ignoring it for so long."

Management has the incentive to keep the developers' feet to the fire; development wants slack.
Development and management may be the same folks here, so it is ultimately an exercise in self discipline.

These arbitrary limits are only phase 1.  1MB, 16MB+, 20MB, exponential growth formulas... all of these are arbitrary.  We have arbitrary now at 1MB.  No progress has been made to get past arbitrary yet.  This is understandable, lots of bugs to fix.  But this will always be true.  It is a developer's maxim: "the last bug is fixed when the last user dies".  So removing all incentive to fix this issue for 20 years?  It is better to fix things that require hard forks earlier than later.

Getting to the ability to measure and adapt the max block size is only phase 2.  There is a phase 3 also, it would be nice to get there.
366  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 09:21:19 PM
In the short term... too high a limit and we get increased spam-attack risk, too low and we get transactions queued.  Both are very bad.  Whether you think one is worse than the other is a matter of perspective.

Well I think this is a common misunderstanding.  The block limit never has and never will prevent spam.  When the 1MB limit was put in place, the average block size was about 35KB, so even after it was in place a miner could have made a block 3000% of the average block size.  The rules for standard transactions, transaction priority, the dust threshold, and minimum fees are what prevent spam.  The block limit only provides an upper bound on how much damage a malicious user could do.  By your definitions the 1MB limit was "too high", as it never did anything to prevent spam.  In fact it failed utterly at that job (well, failed as much as something can fail to prevent an event it was not designed to prevent).

Prior to the dust threshold being created, users (one service in particular) horribly wasted the most critical resource of the network.  Contrary to popular opinion, that isn't block space; it is the UTXO set.  The UTXO set is critical because blocks are not used to validate new blocks or transactions.  Blocks are used to update the UTXO set, and the UTXO set is used to validate all new transactions.  Under normal use the UTXO set grows slower than the overall blockchain.  Satoshi Dice sent low (as low as 1 satoshi) outputs to notify users of losses.  Those outputs will probably never be spent, but the network can't "write them off"; they will remain, perpetually bloating the UTXO set.  Even if the block limit were ultra conservative (say 100KB instead of 1000KB) it wouldn't have prevented this poor use of critical resources.

So what did the block limit do?  It provided an upper limit on the damage that one or more malicious entities could do to the overall network.  If 100% of the network conspired, it still put an upper bound on blockchain growth of 30x the average block size.  If only 10% of the network conspired, then it limited the bloat to no more than about 3x the average block size.  A single malicious entity creating the occasional max block would have even less effect in the long run.  That was the purpose of the block limit.  The block limit is like fire insurance for your house.  It limits the damage in the event your house is destroyed, but it would not be accurate to say that fire insurance prevents fires, any more than the block limit prevents spam.
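The arithmetic behind that bound can be sketched directly.  The ~35KB average and 1MB limit are the figures from this post; the exact multiples follow from those assumptions:

```python
# Illustrative arithmetic for the bloat bound above: if a fraction f of
# blocks are mined at the maximum size while the rest match the honest
# average, long-run chain growth is bounded accordingly.  The 35 KB
# average and 1000 KB limit are the figures quoted in the post.

def bloat_factor(f: float, avg_kb: float = 35.0, limit_kb: float = 1000.0) -> float:
    """Average block size, as a multiple of the honest average,
    when a fraction f of blocks are mined at the limit."""
    return ((1.0 - f) * avg_kb + f * limit_kb) / avg_kb

print(bloat_factor(1.0))   # everyone conspires: ~28.6x (the post's ~30x)
print(bloat_factor(0.1))   # 10% conspire: ~3.8x (the post's "about 3x")
```

The exact multiples differ slightly from the round numbers in the post, but the shape of the bound is the same: a minority attacker can only inflate average growth in proportion to its share of blocks.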

With all due respect, contrast this limit (or any limit) with unlimited.
It does indeed prevent spam attacks.  Do not confuse the scope of the threat with the existence of it.
We are looking at arbitrary amounts of risk of threat to accept, and the proposal is for 16x the current risk and x16000 over 20 years.

That may be a reasonable number, or it may not be.  We can't know from where we are today.
367  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 09:16:19 PM

Finally, there is no such thing as a course of action to prepare for the future that is not based on some form of extrapolation or prediction.  But that doesn't mean failing to be prepared is a good idea.

And I still have no idea what miner would deliberately lose money by taking block size up to the very limit using bogus transactions, or why they would try to.  We're talking these days about people running actual businesses that they have invested major resources in and want to see some ROI.  This is not just a set of randoms that's going to contain trolls who came along just to wreck things for the lulz.  So I'm just not seeing the immediate threat model of a ridiculously enormous block chain that you claim to see.

There is a course of action to prepare for the future that is not based on some form of extrapolation.  Many proposals have taken this form.  All of them have the same failing in that they are not implementable without adding some code for metrics.
What will be there for us in 2, 5, 10, 20 years that will know how big bitcoin blocks are and need to be?  The block chain will.  

Simply stated, the threat model of a ridiculously enormous block chain is that the cost of running Bitcoin can be made unreasonably high, such that it becomes non-useful, and even uneconomic.
Perhaps you do not see it because you are considering only those within the Bitcoin economy as important.  But consider the larger scope of players in the game theory, and consider that there might be some who would like our experiment in cryptocurrency called Bitcoin to fail.

The Bitcoin Network Cost = Data Size * Decentralization.  

It doesn't take a 50+% attack to grow the data size if the protocol permits it.  The attacker does not lose money doing this, except at the margins with orphaned blocks.  That is a small fraction of mining revenue, though it matters to those working to be the most competitive.

Consider an 'attacker' who is willing to absorb whatever losses might accrue from the very occasional orphaned block in order to grow the data size by as much as the new maximum allows with every block they solve.  This will have some bad effects.

First bad effect is to increase the cost for all miners and node operators (these are not necessarily the same folks).  The node operators take the biggest hit because there is no revenue for them anyway.
The miners also take a hit in increased cost (bandwidth, maintenance, storage).  The smallest and those with the most expensive bandwidth may fall below profitability and may leave the market.  This benefits the 'attacker' in several ways.
1) they get a larger share of the hash rate by knocking out competition
2) they increase the centralization of node operations and mining making Bitcoin ever easier to attack.

Second bad effect of the exponential growth plan is its perniciousness.
The greatest effect of an attack is done when it is sudden, persistent, and overbearing.
We may have a great majority of Bitcoin-economy miners that manage their block sizes, and even though the max is 16MB-16GB or whatever the limit of the period is, average block size may continue to be less than 1MB for many years to come, or grow much slower than the extrapolation predicts.  We simply do not know what it will be.  An attacker can wait until a time when Bitcoin is particularly vulnerable, and then start mining huge bloated blocks to make it more expensive.  The risk slowly increases until such time that it can be exploited.

Third bad effect.  The limit could be too low.  Ridiculously high may not be high enough.
Bitcoin could become wildly successful much sooner than expected.

Do you seriously think that it might take 20 years to solve the block size measurement problem?
I like that the new proposal does have a sunset provision (only 10 doublings, so a 2^10 increase).  Each revision, Gavin has improved the proposal, though it still seems very pessimistic.  If we are postulating exponential Bitcoin transaction growth, why not also postulate exponential growth in Bitcoin developer expertise?
368  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 07:28:41 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed, it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?

If the population of Earth in 2035 is ~8 billion, that's 2 bytes per block, per person.  

Given 144 blocks a day, that's 288 bytes per person, per day.  If everybody uses the block chain twice a week or so, the block size proposed for 2035 should be adequate.  

Since most people don't do money orders, remittances, or savings-account transactions that often, honestly I think that ought to suffice for "ordinary" use by ordinary people, even with a very high rate of adoption.  I mean, some will be using it for every cuppa coffee, and some won't be using it at all, but it averages out at what I'd think of as a normal rate of usage for complete mainstream adoption.
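The quoted per-person arithmetic can be sanity-checked directly.  The 16 GB limit, ~8 billion people, and 144 blocks/day are the quoted post's figures; the ~250-byte typical transaction size is an added assumption for illustration:

```python
# Sanity check of the quoted per-person arithmetic.  The block limit,
# population, and blocks/day come from the post; the 250-byte typical
# transaction size is an assumption added here for illustration.

BLOCK_BYTES = 16 * 10**9           # 16 GB maximum block (2035 under the proposal)
POPULATION = 8_000_000_000
BLOCKS_PER_DAY = 144
TX_BYTES = 250                     # assumed size of a typical transaction

per_person_per_block = BLOCK_BYTES / POPULATION               # 2.0 bytes
per_person_per_day = per_person_per_block * BLOCKS_PER_DAY    # 288 bytes
tx_per_person_per_week = per_person_per_day * 7 / TX_BYTES    # ~8 transactions

print(per_person_per_block, per_person_per_day, round(tx_per_person_per_week, 1))
```

Under the assumed transaction size, 288 bytes per person per day works out to roughly eight on-chain transactions per person per week, consistent with the "twice a week or so" claim with room to spare.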

Rather than extrapolate on what might happen, a better question to ask yourself would be:  "Would there be any negative effect if things are different from this result?"

Bitcoin is not improved by making it work.  It already does that.  Bitcoin is improved by making it not fail.  This must be the goal of the fork.

Consider: what if there is growth or shrinkage in either Bitcoin use or population?  Even a miner with a smallish percentage of the hash rate could significantly raise the costs for everyone else.

Bitcoin is as excellent as it is not so much because of what it can do, but because of what you cannot do with it.

Adopting an inaccurate and indefinite exponential growth proposal based on extrapolations is folly and hubris.  It introduces new failure modes.  It just isn't worth it.
369  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 05:33:21 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed, it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?

You are guessing.  So is Gavin.

Wouldn't it be better to measure and not have to guess?
370  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 05:27:16 PM
People have discussed sidechains and pruning and other suggestions, but the fact is, they will take time to implement to make sure they actually work.  A 20MB blocksize is a far more simple solution that will work right now and until I hear something better, that's the one I'll be supporting.

Ok, but, see, we can go for 2-5-8 MB limit (without exponential growth) before we hit the limit, and use the time gained by shooting the can to invent something better

It's not so urgent that our only choice is to implement exponential size growth

What you are saying now is, let's go exponential, maybe we can sustain it.

This is the worst anti-fork argument, since it means you'd have to have another hard fork each time you need to increase it.  If exponential adoption happens, either we find a way to cope with it, or another coin will.  Again, there are only two outcomes.  Increase the limit, or jump ship when Bitcoin can't cope and grinds to a halt.  

Wow.  
Take some time and consider what you are suggesting.
If you only do what is easy, rather than what is right, you will always be wrong.

Secondly, all of your premises here are incorrect.  There is no grinding to a halt.  The worst case scenario of not resolving this in a strategic way is that Bitcoin will continue to do what it already does better than anything else.  It does not "grind to a halt".  If your transaction is urgent, you pay a fee.

Important things are worth doing right.  Even if they are hard.  

Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed, it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

You want us to be afraid of some other crypto currency taking over, and that is why we need to fork?  Sell your fear elsewhere.

This isn't "selling fear", this is common sense.  This isn't about sending transactions without a fee, this is your transaction my not make it in even with a fee.  Full means full.  As in wait for the next one.  And if that one is full, wait for the one after that.  How long do you think people will wait before complaining that Bitcoin is slow and useless?  Bearing in mind this is the internet and people complain about the slightest little inconvenience like it's the end of the world.  I don't want to see the outcome of that fallout.

My argument is keep the number of forks to an absolute minimum, which is hardly controversial.  You can clearly see in the quote above, Sardokan said "see, we can go for 2-5-8 MB limit", which sounded like he wanted to have this discussion a few more times in future.  But now that he's clarified his position, you'll find I've already agreed with him that this fork is needed because we may not have a permanent solution ready before we hit the 1MB limit and the next fork should be the permanent fix to solve the problem of scalability once and for all.  Again, if someone comes up with a permanent fix before we start coming close to that limit, I'll happily listen to that argument.  Until then, we need to raise the limit.  Stop over-reacting to things.

Yes, I caught your later post afterward.  Thank you for clarifying that.
We will very likely have this discussion again in the future with the proposed fork also. 
As you recognize here, there is not yet any proposal that will prevent that.  We do not have a mechanism yet for a right-sizing max block size.

The 2-5-8 is not any less reasonable.  A road map of how to get to that right-sizing mechanism should be the first goal.  We should have that before any significant fork anyhow.  Without it, a small increase would be just fine with me.  Once that road map is articulated, then comes picking an appropriate size to get Bitcoin to that mechanism.  Without it, it doesn't make a lot of sense to guess how many transactions per second there will be by the time we get there.

Beyond that, ending the max block size limit with free-market economic incentives is a far more distant goal.

In the short term... too high a limit and we get increased spam-attack risk, too low and we get transactions queued.  Both are very bad.  Whether you think one is worse than the other is a matter of perspective.
371  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 23, 2015, 04:45:47 PM
People have discussed sidechains and pruning and other suggestions, but the fact is, they will take time to implement to make sure they actually work.  A 20MB blocksize is a far more simple solution that will work right now and until I hear something better, that's the one I'll be supporting.

Ok, but, see, we can go for 2-5-8 MB limit (without exponential growth) before we hit the limit, and use the time gained by shooting the can to invent something better

It's not so urgent that our only choice is to implement exponential size growth

What you are saying now is, let's go exponential, maybe we can sustain it.

This is the worst anti-fork argument, since it means you'd have to have another hard fork each time you need to increase it.  If exponential adoption happens, either we find a way to cope with it, or another coin will.  Again, there are only two outcomes.  Increase the limit, or jump ship when Bitcoin can't cope and grinds to a halt.  

Take some time, consider what you are suggesting.
If you only do what is easy, rather than what is right, you will always be wrong.

Secondly, all of your premises here are incorrect.  There is no grinding to a halt.  The worst case scenario of not resolving this in a strategic way, or not forking for a larger max block size, is that Bitcoin will continue to do what it already does better than anything else.  It does not "grind to a halt".  If your transaction is urgent, you pay a fee.

Important things are worth doing right.  Even if they are hard.  

Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed, it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

You want us to be afraid of some other crypto currency taking over, and that is why we need to fork?  Sell your fear elsewhere.

The max block size is only the upper limit of what miners can set for their blocks.  The miners set the limits in practice.  Most miners have their limits set well below 1MB. 
372  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 22, 2015, 09:27:39 PM
Newliberty, WE are off topic until we relevate Ideal Money in relation to the OP.  It SEEMS otherwise, because we function from a false perspective that Satoshi is a person.  I'm likely away for 10 days regardless, I hope you all can finally become sincere.

In other words, the lecture series Ideal Money was specifically prepared and given, on the exact topic of this OP.  You cannot ignore the lecture and refute this simply obvious truth. READ IT!

"Ideal Money" is a very broad topic.  
It is not so relevant here.
This fork is a very narrow issue.  Attempts to broaden it too much harm the discussion.

There are a great many posts in this topic promoting Nash's lecture series.  We all understand it is interesting.  It should be a separate thread, or condense it to something in particular that makes it relevant.  The normal thing is to put it in your .sig where everyone can see it with each post, and stick to the topics of threads, please.

I could ask everyone to understand everything written at http://mises.org over the last 30 years, but it would be silly.  It is a good thing to understand, but the bits relevant to this topic can fit in a few sentences.

The notion that we need to agree because it is bad for Bitcoin to disagree is also not helpful.  We either agree or do not agree.  When we do not agree we might end up with multiple consensus chains.  This is not the end of the world, or even the end of Bitcoin.    

Unless and until the fork happens, it is our responsibility to improve the understanding of the people that may be subjected to it: to understand the problems with the proposals, and the difference between a good proposal and a bad one.

We also need to work toward understanding what brings us to this point:  What are the criteria for a hard fork of this nature?  Under what conditions may it occur?  When is it crazy and when is it sane, what justifies it if anything?
====

The problem with Gavin's proposal is that it is almost a good enough temporary solution to become a permanent one.  Its saving grace is that, if implemented, fixing it later (by cutting out the exponential growth or reducing the size to fix a spam problem) would be a soft fork instead of a hard one.

I hope to get closer to an accurate indefinite solution.  One based on measurement and responses.
373  Bitcoin / Bitcoin Discussion / Re: How did you got into bitcoin/crypto ? on: February 22, 2015, 07:30:08 PM
Always was interested in mathematics.  Always fascinated with secret codes, and computers, and security.
Got much more into cryptography from Zimmermann's PGP in the early 90's.  Interest in Bitcoin came from a history of disappointments: DigiCash, frustrations with having a merchant account in the 90's and trying to do online sales.  Work coding for POS systems and technologies in fin-tech, stock analysis (William O'Neil & Co), and for banks in the mid 90's helped refine an understanding of the problems.  That was followed by a career in multinational telecom and datacom security, which set the stage.

Not going to write more about my introduction to Bitcoin here, but suffice to say, I was a very ripe audience for my introduction to it, and immediately loved the idea.
374  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 22, 2015, 06:41:32 PM

Does your notion of "ideal money" include manipulation of supply in order to "manage" an economy by a group of managers that claim to know more about what is right than everyone else?  If so, then this fork discussion is not where your notions belong.  The fork has nothing to do with money supply; it has to do with whether and how to relax a constraint on transaction velocity before that constraint becomes a limiting factor of Bitcoin's utility.  This fork is not inflationary or deflationary.

Yes... let's discuss your concepts, but in this thread they are off topic.  They were only raised here to show how they have no importance to this particular discussion.
My friend, "Ideal Money" is a lecture, that pertains to a very specific "notion" that is not mine...that says if a new money technology could be released, with a finite "money" supply, it would stabilize every global currency in relation to it.

The rest that you comment on, is not related to what I am pointing out.  I am suggesting in the future, with a cooperative society, we do not need the same level of security. This might change our formulas...but we aren't aware or able to discuss it if we do not believe in an eventually cooperative society.

In other words, neither satoshi nor szabo talked about what bitcoin might do to our economy, but it turns out Dr. Nash has been touring country to country for the last 20 years (the same span szabo's blog has existed), explaining exactly what bitcoin will do to our global economy.  This is fact.
We agree then?  These are off topic.
Have you found a thread where they are on topic?  Or want to start one?  I'd be happy to participate there.  Here they are noise.


The slippery slope notion?

We are all likely in agreement that the fundamental purpose of Bitcoin is sound money.  If not, then another thread for dealing with "first principles" is needed.


no.  and you don't know what sound money is.  You cannot ignore 20 years of lectures about what sound money is and then discuss it like you understand it.  Gavin proposes a test of the robustness of the system.  If the community continues to disagree and debate block size, then bitcoin remains broken.  We need a solid consensus; that is what sound money is based on.  Nothing else matters in regards to "sound" money.

In fact I can ignore them.  There is an easy button to do that.
If you care to find some way to link any of this to this thread's topic, we can engage that.
375  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 22, 2015, 06:34:47 PM
The fork has nothing to do with money supply, it has to do with whether and how to relax a constraint on transaction velocity before that constraint becomes a limiting factor of Bitcoin's utility.  This fork is not inflationary or deflationary.

The fork is exactly about money supply. The primary argument used by those in favor of Gavin's proposal is that the fork is needed in order to make bitcoin more inclusive. If we make concessions to these demands now, what is stopping the same people from insisting we create 5 million bitcoin to airdrop into the wallets of some 3rd world country? When getting "mass adoption" is the primary focus, you end up with things like the dollar. I don't want mass adoption; I want sound money.

The slippery slope notion?

We are all likely in agreement that the fundamental purpose of Bitcoin is sound money.  If not, then another thread for dealing with "first principles" is needed.


One of the elements of sound money is its utility as a "medium of exchange".  This is what we are addressing with this hard fork.

The thing that stops the same people from destroying the sound money principle (and your example of 5 million more bitcoin airdrop) is the "store of value" element.

My last few posts here in this thread have been an attempt to remove the confusion between these things.  We can advance the "medium of exchange" utility, without harming the "store of value" utility at all.  This particular hard fork is not a negative impact to the store of value.  If anything, an increase in the velocity of money improves the store of value utility through increased liquidity.

I'm of the opinion that "mass adoption" is a pipe dream (the market agrees with this).  I think it is decades away, if ever.  However, I am working toward making it possible, and I am opposed to things that would make it increasingly impossible.
===

I also agree with your idea that doing something "because Gavin said so" is the height of laziness and foolishness.  I think even (perhaps especially) Gavin would agree with that.  His role is more of a curator of ideas and he should not be expected to come up with everything.
There is a whole branch of folks here that disagree with this.  They think that we should do just whatever Gavin proposes.  They think that "ossification" risk will kill Bitcoin.  They are as wrong as the people that think that we should not do something because Gavin proposes it.  It should not be about the personalities here.  Bitcoin is more important than that sort of political nonsense.

It is the responsibility of all of us out here in the cheap seats to add some peer-review rigor to the process.  It's a messy job.
376  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 22, 2015, 05:40:11 PM
There is one more missing variable, but alas it cannot very easily be relevated, because of our belief about the purpose and cause of "fighting" or "conflict", and the relation of our lack of understanding to "Ideal Money", or in other words the effects of an unstable global economic "situation", or the effects of floating currencies.

Humans (or life in general) are expected to evolve through perpetual conflict ONLY up until a point at which they develop and relevate a hyper currency (bitcoin).  And/or in other words this has a lot to do with the Wealth of Nations, in which these nations create a global economy and stop invading newly discovered lands.

If we understand the above post about the base needs for an equilibrium of wants that creates a stable "nation", then we also understand the cause of war (or peace).

And so what we can expect in the future is a society that cooperates more and teaches its peoples (and all life) about the benefits and importance of it.  We are in fact not destined to fight and we do not actually in the future rely on conflict to evolve (even in games says J.Smithy).

And so I worry about setting certain block size limitation based on the financial incentive required for securing "trust", because eventually this need will not exist, or at least not anywhere near to the extent we require it, or we think in the future we might require it.

I wonder if we can discuss then, whether or not this would be a kind of reverse inflation, a deflation, or something that might unravel the robustness of the entire system.

In other words, if all of this is about incentivizing a system against attackers, it will certainly, in the future, lose its coherence, if we do not plan for cooperation.  Of course if we do not believe in the possibility of a cooperative society then I cannot make this point.


Smiley

This hard fork gets a lot of attention, but it is not all of what Bitcoin is about.
It is instead, an example of one of its failings.  Yes, it is just about the balance between security and utility.  It is not about money supply at all.  
Your discussion of Keynesianism and ideal money, notwithstanding.

Does your notion of "ideal money" include manipulation of supply in order to "manage" an economy by a group of managers that claim to know more about what is right than everyone else?  If so, then this fork discussion is not where your notions belong.  The fork has nothing to do with money supply; it has to do with whether and how to relax a constraint on transaction velocity before that constraint becomes a limiting factor of Bitcoin's utility.  This fork is not inflationary or deflationary.

Yes... let's discuss your concepts, but in this thread they are off topic.  They were only raised here to show how they have no importance to this particular discussion.
377  Bitcoin / Development & Technical Discussion / Re: How to Create a Bitcoin Receive Address from a Coin Flip on: February 21, 2015, 02:45:58 PM

EXTRA SPEED:  .....


Punch your keyboard and take SHA256 of the results. It's way much better than using an online third party RNG.

I actually tried this... it worked great!   Thanks!

This is a decent RNG for small numbers, but it does not have a lot of entropy for a whole key.  It has only as much entropy as the variety of punches, so it is not so great for high-value long-term storage if the generation method is known.
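To make the entropy point concrete, here is a minimal Python sketch.  It follows the "punch your keyboard and take SHA256" recipe quoted above, and compares a 256-coin-flip key against a keyboard mash; the flip string and mash below are placeholders, and the entropy estimate is only a crude frequency-based upper bound:

```python
import hashlib
import math
from collections import Counter

def entropy_upper_bound_bits(s):
    """Naive Shannon estimate over the string's character frequencies.
    This is an UPPER bound on the real entropy of a keyboard mash."""
    n = len(s)
    counts = Counter(s)
    per_char = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return per_char * n

# 256 fair coin flips carry a full 256 bits of entropy for a private key.
# (Placeholder string below -- real flips must come from actual coins.)
flips = "01" * 128
privkey = hashlib.sha256(flips.encode()).hexdigest()  # SHA256 as a whitening step

# A keyboard mash's entropy is bounded by the variety of punches, as noted above.
mash = "asdf;lkjqwerpoiuasdf;lkj"
print("coin-flip key :", privkey)
print("mash entropy  : <= %.1f bits (vs 256)" % entropy_upper_bound_bits(mash))
```

Note the asymmetry: the coin-flip key's strength holds even if the attacker knows the method, whereas the mash's strength depends on the method staying secret.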
378  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 20, 2015, 11:22:00 PM
Quote from: Pete Dushenski on October 8, 2014 at 5:47 PM in #bitcoin-assets
This is what it boils down to: scarcity. There’s no room in Bitcoin for inflation of any kind. Other applications and whatnot can be built on top of it as is. It’s for the world to adapt and conform to Bitcoin, not the other way around.

I don't see a transaction volume increase as a change in inflation so much as a change in velocity.  Henry Hazlitt, who is credited with bringing Austrian economic theory to the English-speaking world, would, I think, agree that it is not something that ought to be artificially influenced either by constraint or encouragement.  Hitting the limit would be an artificial constraint, removing it ahead of (3) would be an encouragement, but for inflation it is a no-op.
With some people, watching them wield economics terms of art they do not understand is like watching a child run with scissors.
Many others may also conflate these, so a few words on the subject might be useful, lest some think there is an attempt to equate Bitcoin with anything else, or that the principles that govern human behavior are somehow all different with respect to Bitcoin and do not apply to it.
On Block Size and the Velocity of Money...

When the Keynesians attempt to increase velocity, they do it by increasing the money supply in order to get the money to 'burn a hole in your pocket'.  So the quote above may be characterizing the max block size increase as a velocity-increasing move (which for Keynesians usually means an increase in money supply).

However, with Bitcoin, an increase in the max block size does not increase the money supply at all.  There is not even any new spending incentive created.  Bitcoin doesn't start to burn a hole in your pocket.  All it does is potentially reduce the confirmation time, and remove an artificial floor in the transaction cost (which is a form of economic friction).  Removing the friction is a good thing.  It is one of the things that Bitcoin is very good at doing.

And in further contrast to the Keynesians, the contrary is closer to the empirical truth with Bitcoin.  Transaction volume (velocity) has never yet been constrained by the protocol's max block size, only by miners.  The queueing we have is almost entirely confined to the no-fee transactions.  So now, as we approach the point where the protocol constrains it, intelligent people may disagree over whether Bitcoin undergoes a change simply by keeping the limit the same.  Such queueing would serve to artificially subsidize mining at the expense of users, rather than letting a freer market dictate transaction pricing.

I'd aver that consistently hitting the transaction velocity limit is a change.  It is as much a change as is changing that limit.  Neither has occurred in recent times, and there are those that reasonably expect both.  Further, let me suggest that the change is one which is very different in nature and effect from hitting a limit on supply (the 21M limit).
The difference: supply is a matter of inflation, while velocity is not...not at all.  An increase in velocity (transaction volume) is a measure of the health of Bitcoin.  It is a virtue.

=======
Personally, I do agree with danielpbarron on the notion of running a full node.
To wit, I expect to maintain my nodes as full.  I do not expect everyone else to do so, and I do not hope to force them to do so either.  Daniel and Immanuel Kant have the ethics right on this.
379  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 20, 2015, 05:54:41 PM
Over its history, Bitcoin's nodes-per-user ratio has declined from >1:1 to now something like <1:700 (from the calcs upthread).

Looking at it another way... the number of nodes contributes an element of security (resilience): where N is the minimum to run, and N+1 adds one 'hot failover' system for backup, we have something like N+7000... which so far has been more than enough.
....

As (A) [meaningful transactions] increases, Bitcoin's value increases.  As (B) [node quantity] increases, Bitcoin's cost increases.

Thus if <1:~700 has been sufficient with a market cap of 3-4 billion, one could extrapolate the minimum number of full nodes desired for a more valuable economy, as there is a loose correlation between the incentive to attack a more valuable network and the costs inherent in doing so. So if Bitcoin grows to a market cap of 8 billion we would want around <1:~350 full nodes; at 16 billion we would want <1:~175 nodes. This shouldn't be a linear ratio continuously, as supporting that level of security for growth isn't realistic even if we keep the block limit at 1MB. Perhaps if we start considering pruned nodes as having a certain amount of value in the total node count it can be extended much further.

Extrapolation with only these variables is not such a great idea, but I take your meaning.  Yes, add in pruned nodes when they materialize at some value less than their percentage (they are worth nothing alone).
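For what the quoted extrapolation is worth, it amounts to an inverse-linear rule of thumb.  A sketch in Python (the ~4B baseline cap and 1:700 ratio are the thread's rough calcs, not measurements, and the function name is illustrative):

```python
def desired_users_per_node(market_cap_billions):
    """Quoted rule of thumb: halve the users-per-node ratio every time
    the market cap doubles from the baseline.  Baseline numbers come
    from the rough calcs upthread, not from any measurement."""
    BASELINE_CAP = 4.0      # billions USD ("3-4 billion" in the thread)
    BASELINE_RATIO = 700.0  # ~1 node per 700 users today
    return BASELINE_RATIO * BASELINE_CAP / market_cap_billions

print(desired_users_per_node(8))   # the thread's ~1:350 figure
print(desired_users_per_node(16))  # the thread's ~1:175 figure
```

As noted above, extrapolating from only these two variables is shaky; the sketch just makes the quoted arithmetic explicit.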

I still believe that what matters more is the type, quality, distribution, and control of the nodes, versus just the total node count, but I am fine if bitcoin is over-engineered such that it focuses on both considerations. Additionally, I believe we should consider the value of 1GB pruned full nodes in the context of the discussion on subsidizing security.

Thus we could either incorporate an incentive structure through price discovery methods to incentivize nodes and/or optimize core and add pruning, or we can use some external methods as discussed to accomplish these shared goals.
What we cannot do is simply sit idly by and assume nothing needs to be done. To be fair, many people who oppose this fork are actually working on projects like the 10k extra pogo nodes to address this issue. I think that if we have a comprehensive plan of action before the hard fork we can satisfy all our values and objectives.

There are more important things that can be done, like setting up libbitcoin Obelisk nodes/servers to support other implementations, than simply focusing on total node count for bitcoin core, but if the requirements are realistically achievable like you outlined with scaling up the total node count with market cap, then there is no reason we cannot accomplish both at the same time.

So moving forward: if a plan is in place, with an incentive structure built that will realistically sustain decentralization, and we are achieving said goals before the hard fork, then is anyone still opposed to increasing the blocksize limit, and does anyone have specific objections to doing so?

Sure...  but the perfect is the enemy of the good (or the better).  You listed some risk mitigation efforts, there are also other threat vectors.  If you might be willing to imagine that the USG is anti-Bitcoin, you might posit things like Project BULLRUN, or the Equation Group's capabilities to take over any computer with storage and an EEPROM based on it matching a pattern of data on its storage (say, the genesis block, for example).  So this doesn't move the needle much.  We still won't know how much security is needed until there isn't enough.  

We can also consider recovery modes, a bandwidth attack may not end Bitcoin entirely due to its anti-fragile nature.  The biggest full failure modes come with adding pernicious risks, ones that build over time to eventually kill it.  I believe this is part of why satoshi dropped it from 32MB to 1MB in the first instance (and maybe also TOR).

An external incentive structure is a good thing, but being external, it is also untrustable by the code (it could disappear and we'd be left with code that was written assuming it is there).

So with this impasse, I'll re-propose the progression mentioned above with just a little more detail.

1) A hard fork with a definite increase (8MB maybe, some moderate yet arbitrary number that doesn't fix the core of the problem).  Nothing exponentially increasing, but maybe a best guess at providing enough time for (1.5).
1.5) Adding code for all the stuff needed to measure block size within the block chain, or number of transactions per block in the chain (something clever with the Merkle tree maybe?)

2)  A dynamic limit hard fork (indefinite in term, but not inaccurate) which sets a max block size to accommodate transaction growth within a range, thereby preventing spam/attacks but leaving some room to grow (maybe 10x the average size).  The goal here is to right-size the block size within the block chain.

3) No block size limit (because the transaction cost supports all network costs appropriately).  JustusRanvier's discussion papers allude to methods for this.  There are practical problems, lots of them, too many to list.  This is however the end-game for perfect scalability whilst maintaining security using a decentralized free-market economic incentive structure.  No one knows how to do this yet.  The problem with an indefinite exponentially increasing limit is that it is too close to this and assumes too much.
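The phase (2) dynamic limit could be sketched as below.  The 10x multiple and the 1MB floor are illustrative numbers from this discussion, not a concrete proposal, and how the "recent sizes" would be measured in-chain is exactly the open work item of phase (1.5):

```python
def dynamic_max_block_size(recent_sizes, multiple=10, floor=1_000_000):
    """Phase-2 sketch: cap the next block at `multiple` times the average
    of recently observed block sizes (in bytes), never below `floor`.
    Both parameters are placeholders, not a specification."""
    avg = sum(recent_sizes) / len(recent_sizes)
    return max(int(avg * multiple), floor)

# A day of ~0.5MB blocks would yield a 5MB cap; tiny blocks hit the floor.
print(dynamic_max_block_size([500_000] * 144))
print(dynamic_max_block_size([50_000] * 144))
```

The point of the floor is that a quiet period cannot ratchet the limit down so far that it chokes a later recovery; the multiple bounds spam growth per adjustment.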

For what it's worth, there may well be a decade in between each of these phases; what we need from our scientists is the massive amount of work and planning to get from (1) to (3).  Bitcoin ought not skip steps due to fear of its opposition; to do so would play into the hands of the feared opposition.

Gavin's proposal is a great one, it is well reasoned, and I think it is well meaning.  I agree with him that it is likely "the simplest that could work".  However it unnecessarily creates more risk than benefits.  Please, let us take only necessary risks and not unnecessary ones?  Let us keep the highest standards for ourselves, and for posterity.  We owe this to all those that have contributed so very much to this project.

============

There are some "no fork for blocksize" folks that see increasing transaction capacity as a sort of inflation, and thus unacceptable.

www.contravex.com/2015/01/12/bitcoin-doesnt-need-a-hard-fork-it-needs-hard-people/
Quote from: Pete Dushenski on October 8, 2014 at 5:47 PM in #bitcoin-assets
This is what it boils down to: scarcity. There’s no room in Bitcoin for inflation of any kind. Other applications and whatnot can be built on top of it as is. It’s for the world to adapt and conform to Bitcoin, not the other way around.

I don't see a transaction volume increase as a change in inflation so much as a change in velocity.  Henry Hazlitt, who is credited with bringing Austrian economic theory to the English-speaking world, would, I think, agree that it is not something that ought to be artificially influenced either by constraint or encouragement.  Hitting the limit would be an artificial constraint, removing it ahead of (3) would be an encouragement, but for inflation it is a no-op.

Quote from: Henry Hazlitt, "Notes on 'Velocity Circulation,'" Henry Hazlitt Papers, Foundation for Economic Education.  Hazlitt wrote the paper in 1944 for the Mises N.Y.U. Seminar
Money is always in someone's hand. For consumers to spend and "circulate" money at a rapid rate, there needs to be a party willing to accept the currency. That is, the average per capita holding of currency will remain the same.
380  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 20, 2015, 03:35:16 PM
My questions are aimed at the people against this hardfork to understand what degree of decentralization and security which would make them comfortable as their primary concern deals with the risk of centralization and decrease of nodes.

Thank you for your valuable attention and contribution to this! 

Over its history, Bitcoin's nodes-per-user ratio has declined from >1:1 to now something like <1:700 (from the calcs upthread).

Looking at it another way... the number of nodes contributes an element of security (resilience): where N is the minimum to run, and N+1 adds one 'hot failover' system for backup, we have something like N+7000... which so far has been more than enough.

The question posed here boils down to: 'how much security is needed?'.  An honest answer to this is always going to be: 'I don't know until we don't have enough.'  So far, Bitcoin has had enough.  So far Bitcoin has also had enough space in the blocks for meaningful transactions.  These two virtues come into conflict with this proposed fork.

So: (A), the number of meaningful transactions, is empirically observable; (B), the amount of security needed, is only knowable after it has failed.

From this, a simplified risk analysis.

As (A) increases, Bitcoin's value increases.  As (B) increases, Bitcoin's cost increases. 

Increasing either (A) or (B) also increases risk to Bitcoin:  In brief,  The more value Bitcoin has, the greater the incentive to subvert it.  The greater cost to Bitcoin, the easier it is to subvert.

The optimal management of this A:B ratio for greatest Bitcoin value and security would be a block size that permits all meaningful transactions to be swiftly included in blocks, but not so large that it permits excessive spurious transactions for which the cost is borne by all...  So a hard fork to accomplish this is desirable, just not "at any cost".

This brought me to needing a proposal for such a fork that is "Not both inaccurate and indefinite". 