Bitcoin Forum

Economy => Economics => Topic started by: ByteCoin on July 17, 2010, 03:20:24 AM



Title: Get rid of "difficulty" and maintain a constant rate.
Post by: ByteCoin on July 17, 2010, 03:20:24 AM
The primary purpose of generating BitCoins is to provide an incentive for people to participate in the maintenance of the block chain. Generating BitCoins out of "thin air" has recently captured the imagination of a set of new users (me included), and the sudden increase in available computing power has meant a dramatic increase in the rate of block generation.

The increased rate doesn't have any substantial disadvantages or risks that I can see, but the variability of the rate is inelegant, and it seems to attract a lot of discussion on IRC that distracts from more important issues. I can make a stronger case for the undesirability of an increased rate if required.

The difficulty of block generation will increase to counteract the influx of processing power, and the generation rate will normalize after some delay. I predict that new users will become disillusioned with the apparently unproductive use of their computer time (especially compared with their experience of generating coins easily before the difficulty increase) and leave en masse. The difficulty will not ramp down fast enough to offset this, and we will be left with a period of very slow block generation. This will result in trades taking an irritatingly long time to confirm and arguably leaves the system more susceptible to certain types of fraud.

I predict that successful fraud schemes will be preceded by manipulation of the rate, by untraceably and deniably introducing and withdrawing substantial hash computation resources.

It would be much more elegant to be able to rely on blocks being generated regularly at 10 minute intervals (or whatever rate is agreed upon). I believe this can be achieved with only a modest increase in bandwidth.

Simply, as the 10 minutes (or whatever) is about to elapse, hash-generating computers broadcast the block they have found with the lowest hash. The other computers briefly stop to check the hash, and they only broadcast their own block if it has an even lower hash. At the 10 minute mark the lowest-hashed block is adopted to continue the chain.

There are some details to iron out to do with how low the hash has to be, versus the time elapsed, before you bother breaking the silence and broadcasting it, but I believe that this would be a more elegant solution to the rate problem. People could rely on a fixed number of blocks being generated a day, at fixed times or whatever timetable was mutually agreed.
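For what it's worth, a minimal Python sketch of the mining loop this implies; the function names, the header format, and the deadline callback are illustrative assumptions, not anything from the actual client:

Code:
import hashlib

def block_hash(header: bytes) -> int:
    """Bitcoin-style double SHA-256 of a block header, as a big integer."""
    return int.from_bytes(
        hashlib.sha256(hashlib.sha256(header).digest()).digest(), "big")

def mine_window(candidate_headers, window_elapsed):
    """Hash candidates until the fixed interval elapses; keep the lowest.

    There is no difficulty target: each node simply remembers its best
    (lowest-hash) candidate, broadcasts it near the deadline, and the
    network adopts the lowest hash seen overall.
    """
    best_header, best_hash = None, None
    for header in candidate_headers:
        if window_elapsed():
            break
        h = block_hash(header)
        if best_hash is None or h < best_hash:
            best_header, best_hash = header, h
    return best_header, best_hash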

ByteCoin 



Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: llama on July 17, 2010, 03:39:48 AM
This is a very very very interesting idea.  It does seem to "automatically" solve the difficulty problem.

To extend it just a bit, a node should broadcast its block as soon as it finds the new lowest hash, even if it's not close to the ten minute mark. Then, nodes would only broadcast if their new hash was lower than that one, and so on. This would help minimize the effects of latency and of the nodes' clocks being slightly off.

I'd have to think about this a lot more, but you might be on to something...


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Bitcoiner on July 17, 2010, 04:21:23 AM
This is indeed an interesting idea. I'm curious what the devs would think about this idea. It could always be implemented on the test network first.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: knightmb on July 17, 2010, 05:02:03 AM
I'm not part of the development team, but my take on it is that you'll just be replacing one randomness with another. Right now, even though the difficulty is very high, blocks are still being generated every 3 to 5 minutes. So if this new system were in place, you would still be waiting for a block just as long as you would now. I don't usually disclose how many PCs I have in the BTC network, for sanity reasons, but let me say that I have systems that can barely manage 90 khash/s, a few that are churning out 19,200 khash/s, and one beast doing 38,400 khash/s. They don't win any more blocks than the much slower PCs do. One of my 900MHz PCs solved a block in under 100 seconds by pure chance alone after the difficulty was increased. The other super clusters are still at 0 after the difficulty went up earlier today.

I'm afraid your solution would give my super clusters a big advantage, because they would always have the lowest-hash block if it's a CPU vs. CPU race.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: d1337r on July 17, 2010, 06:31:46 AM
This is a very very very interesting idea.  It does seem to "automatically" solve the difficulty problem.

To extend it just a bit, a node should broadcast its block as soon as it finds the new lowest hash, even if it's not close to the ten minute mark. Then, nodes would only broadcast if their new hash was lower than that one, and so on. This would help minimize the effects of latency and of the nodes' clocks being slightly off.

I'd have to think about this a lot more, but you might be on to something...

It's not every ten minutes; the difficulty adjusts every 2016 blocks.

And with your variant, imagine that by sheer luck some machine generates a block with a VERY VERY low hash. If other machines then pick this low hash as a target, most of the blocks that would otherwise suit the target will be dropped, and only after the 2016-block cycle ends will an easier target be set.

The target is not something that only decreases; it may also increase (for example, if some nodes leave the network or stop generating, the still-generating nodes should get a better chance, to keep emission at the required level).


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: NewLibertyStandard on July 17, 2010, 06:36:25 AM
I had this idea myself and it's pretty much the same solution in a different form. Yeah, the timing of blocks would be more consistent, but in the current implementation, the timing is consistent if you take the average time it takes to generate blocks over a long period of time. In the current implementation, it's easy to measure sudden increases and decreases in the swarm. In the suggested implementation, you could also calculate sudden increases and decreases in the swarm by the lowness of the hash, but it would be much less noticeable.

If confirmation times suddenly increase or decrease dramatically, it warns users that there is a rush of new users or the abandonment of a botnet, which may cause the exchange rate to fluctuate.

In the current implementation, it's a race toward the lowest time with a set low hash, while in the suggested implementation, it would be a race toward the lowest hash with a set low time. The slow CPU would be just as likely to generate a block. It's competing in the same way, just with goals and limits reversed.
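The point about the slow CPU can be checked with a quick simulation: if every hash attempt is an independent uniform draw, the node holding the overall minimum wins with probability equal to its share of total attempts, exactly as under the current target scheme. A rough Monte Carlo sketch in Python (the rates are invented numbers):

Code:
import random

def win_share(rates, trials=10_000):
    """Estimate each node's chance of producing the lowest hash in a window.

    Each node makes `rate` independent uniform draws per window and keeps
    its minimum; the node with the overall minimum wins the window.
    """
    wins = [0] * len(rates)
    for _ in range(trials):
        draws = [min(random.random() for _ in range(rate)) for rate in rates]
        wins[draws.index(min(draws))] += 1
    return [w / trials for w in wins]

# A node with 10% of the total hash rate wins roughly 10% of the windows.
print(win_share([10, 90]))   # approximately [0.10, 0.90]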

Edited a few times.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: RHorning on July 17, 2010, 06:48:35 AM
I'm not part of the development team, but my take on it is that you'll just be replacing one randomness with another. Right now, even though the difficulty is very high, blocks are still being generated every 3 to 5 minutes.

Block generation is at roughly every 10-15 minutes right now.  See:

http://nullvoid.org/bitcoin/statistix.php

for a current report on some statistical averages over the last several blocks that have been generated.  Still, the general point is valid.  Some blocks are being generated in under ten seconds from the previous one, but statistical averages still exist.

I do see the variable time between blocks, and in particular the predictability of when the difficulty is going to increase, as something which could be used as a manipulation target after a fashion, although I should point out that any such manipulation would by definition also require CPU processing power approaching at least a substantial minority of the overall CPU strength of the whole network engaged in creating bitcoins.

I mention that last caveat because I expect it will become apparent that, in time, people will start dropping out of the bitcoin creation process, thinking that the whole effort is futile, even if maintaining a connection to the network for the purposes of transaction processing could still be useful. I'm curious about where that will go over time.

The strength of the network is in the overwhelming number of participants where even somebody with a (temporarily) unused server room at their disposal doing nothing but making bitcoin blocks still is a minority of the overall network.  Furthermore, having a couple of "trusted" participants with server farms who are cooperatively making blocks only enhances this protection for everybody and keeps the would-be miscreants at bay.

The only manipulation that I can imagine where this proposal would help is in the case of an attacker who times the connection and release of significant computing resources on the network, where for some periods of time the CPU server farm is banging out the bitcoin blocks and then leaves the network when the difficulty increases substantially.... waiting for that difficulty to drop back to what it was before it started making the bitcoin blocks (doing other stuff in the meantime or even simply shutting down). Such efforts over a prolonged period of time, if successful, could also be detected and even plotted statistically to show an attack was under way. Randomizing the attacks to make them seem like "noise" would only serve to drop the value of such an attack. Trying to sneak in under the radar to appear as a "normal" user would end up simply adding strength to the network against other would-be attackers and in the long run be ineffective as an attack. Attackers would be fighting each other, and normal users could simply be oblivious that anything is happening at all in terms of an attack.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: wizeman on July 19, 2010, 05:48:33 PM
It would be much more elegant to be able to rely on blocks being generated regularly at 10 minute intervals (or whatever rate is agreed upon). I believe this can be achieved with only a modest increase in bandwidth.

Simply, as the 10 minutes (or whatever) is about to elapse, hash-generating computers broadcast the block they have found with the lowest hash. The other computers briefly stop to check the hash, and they only broadcast their own block if it has an even lower hash. At the 10 minute mark the lowest-hashed block is adopted to continue the chain.

How do you get thousands of computers to agree on when the 10 minute mark is?

Ideally you want the algorithm to rely on synchronized clocks as little as possible.

Another problem is that with your strategy, at every 10 minute mark the network would be swamped with a flood of candidate blocks.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: RHorning on July 19, 2010, 06:36:10 PM
How do you get thousands of computers to agree on when the 10 minute mark is?

Ideally you want the algorithm to rely on synchronized clocks as little as possible.

Another problem is that with your strategy, at every 10 minute mark the network would be swamped with a flood of candidate blocks.

Just making a presumption on this particular issue, and I don't have any sort of commentary on what would be a "triggering" event to create these kinds of blocks, but here is at least a strategy to keep the network from getting completely bogged down with candidate blocks:

Each potential candidate would obviously have some sort of "fitness" metric to suggest which one is more "fit" than another. When the "event" trigger occurs, a node that is generating a block would "broadcast" its candidate to all of its immediate neighbors, keeping track of which "neighbor" (direct connection to another node) has already had that candidate block transmitted. It only gets transmitted once to each neighbor (with acknowledgment).

When a node starts to receive candidate blocks, it would perform this "fitness" test and either dump the current block (it has failed a fitness test) or keep the block.... continuing to contact adjacent nodes that have not yet acknowledged receiving the candidate block.  If a new candidate is found that is more fit, it wouldn't re-transmit back to the original source of that block (whatever direct connection sent that block), but it would try to share that with other nodes.  If the node receives the same block from a neighbor, it would consider that node to have also acknowledged the block until all neighbors are essentially working with the same block.

While it would be chaotic at first, the network would calm down very quickly in this situation and settle upon a new block that would be ultimately accepted into the chain.
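A rough Python sketch of that relay rule, treating hash lowness as the "fitness" metric; the node class, block dicts, and peer bookkeeping are purely illustrative assumptions:

Code:
class GossipNode:
    """Relay only the fittest (lowest-hash) candidate block seen so far."""

    def __init__(self):
        self.peers = []      # neighbouring GossipNode objects
        self.best = None     # fittest candidate block seen so far
        self.acked = set()   # peers known to already hold self.best

    def receive(self, block, sender=None):
        if self.best is not None:
            if block["hash"] == self.best["hash"]:
                self.acked.add(sender)   # neighbour already holds this block
                return
            if block["hash"] > self.best["hash"]:
                return                   # less fit candidate: drop it
        # New fittest candidate: adopt it, then flood to everyone but the source.
        self.best = block
        self.acked = {sender}
        for peer in self.peers:
            if peer not in self.acked:
                self.acked.add(peer)
                peer.receive(block, self)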

I do agree that the main triggering event would be the big problem with this kind of scheme, and that would imply some sort of centralized timekeeper to create the events.  I also think that such an event-driven block creation system would ultimately give out about the same number of "new" coin blocks as the current system, and it would create much more network bandwidth trying to negotiate a "winner".  It would also introduce scaling problems that don't exist in the current network.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: wizeman on July 19, 2010, 07:01:45 PM
I do agree that the main triggering event would be the big problem with this kind of scheme, and that would imply some sort of centralized timekeeper to create the events.  I also think that such an event-driven block creation system would ultimately give out about the same number of "new" coin blocks as the current system, and it would create much more network bandwidth trying to negotiate a "winner".  It would also introduce scaling problems that don't exist in the current network.

Not to mention we'd also add two new points of failure - the centralized timekeeper (presumably a network of NTP servers) and the automatic rejection of all valid blocks from clients which don't have the clock set correctly, either because they don't have an NTP service configured or because a firewall is blocking the NTP packets.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: NewLibertyStandard on July 19, 2010, 08:01:52 PM
You wouldn't need to wait until right before the ten minute mark to compare hashes. Ideally, hashes would be compared continuously for the whole ten minutes, so when the ten minute mark approached, all nodes would already have a pretty good idea of who was going to get the block. I think I2P has a swarm-based clock, and from what I understand it's a huge complicated mess trying to achieve and maintain an accurate time, but if somebody did want to go that route, the code is available. I imagine if an attacker had enough nodes, he could manipulate the swarm time to his advantage: slowing down time when he doesn't have the lowest hash and speeding it up when he does. Of course, if you waited until right before the ten minutes are over, I suppose that probably wouldn't be possible.

I don't think a giant rush of hashes being compared would really bog down the swarm, since each node only sends out its hash if it hasn't received a lower one, and only spreads the lowest hash it has received. I think the total time would be no more than the time it takes for the lowest hash to be checked by each node, because it would always win when compared against other hashes, and so it would always be propagated and never held back. Of course, then the issue arises that competing nodes have an incentive to lie, but I imagine that's the case under the current system too. If user X is only connected to attacking, lying nodes, then if he gets a lower hash than the attacker, the attacker just refrains from forwarding his hash.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: wizeman on July 19, 2010, 08:32:06 PM
Personally, I think this is not worth it, because:

1) We'd be complicating the algorithm, making it much harder to verify that the code is correct and potentially introducing new ways of attacking the network.

2) We'd be introducing new points of failure, because clients with wrong clocks wouldn't generate new coins. Also, NTP packets can be easily forged, and you shouldn't trust the clocks of other clients, because they can also be forged.

3) We'd be introducing potential new scalability problems. With the current algorithm, it's easy to predict the total bandwidth needed by the network per unit of time: on average, sizeof(block)*number_of_clients*connections_per_client per 10 minutes. With the proposed algorithm, it's harder to calculate, but it'll definitely need more bandwidth (I think much more, but I have no proof; see the back-of-envelope sketch after this list).

4) You will never make all the clients agree on a common 10-minute window of time. There will be clients who will be a few seconds off, there will be some a few minutes off, some a few hours off. How do you decide when a time window starts and when it ends?
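To make point 3 concrete, a back-of-envelope calculation in Python; the block size, client count, and connection count are invented round numbers, not measurements:

Code:
# Bandwidth for the current algorithm, per the formula in point 3:
# sizeof(block) * number_of_clients * connections_per_client per 10 minutes.
block_size = 1_000    # bytes, assuming a small, nearly empty block
clients = 1_000       # assumed network size
connections = 8       # default connections per client
per_window = block_size * clients * connections   # bytes per 10 minutes
print(f"{per_window / 1e6:.0f} MB network-wide per 10 minutes "
      f"({per_window / 600:.0f} bytes/s aggregate)")   # 8 MB, ~13333 bytes/s

# The proposed scheme adds a flood of candidate blocks each window, one for
# every candidate that beats the running best, multiplying this figure.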

Personally I find the current algorithm much more elegant than the proposed one. A slight improvement we could make is a more dynamic difficulty adjustment, as proposed here - http://bitcointalk.org/index.php?topic=463.0 - which would more gradually mitigate the problem of blocks taking a much shorter or longer time when someone adds or removes a large amount of CPU power from the network.

Still, I think that this is only a problem while the network is small. When bitcoin becomes more popular, it will be much harder for any single entity to influence how long block generation takes on average. But in fact, I don't even consider this a problem, because the network should work just as robustly regardless of the rate of block generation. The actual rate of block generation should be an implementation detail, not something a user has to worry about. All he should know is that it may take a variable amount of time to confirm a transaction, though in the future this variation should become more and more predictable.



Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: bdonlan on July 21, 2010, 12:10:16 AM
The biggest problem with this approach is that you can't audit it after the fact. Consider the case of restarting your client after it's been off for a week. It immediately connects to Mallory's client and asks it for the last week's block chain. Mallory responds with a chain of super-easy blocks, each exactly 10 minutes apart. Now Mallory can control your view of the network and transaction history. Oops. And even if you connect to a 'good' node later, you have no way of sorting out which block chain is real, unless you take the higher difficulty - but this raises the problem where an attacker could spend a long time generating a single block, more difficult than some particular historical block, followed by a bunch of easy blocks. It'll take a while to generate, but then Mallory can rewrite history for the entire network.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: ByteCoin on July 21, 2010, 01:01:56 AM
The problem you outline exists in the current system. You restart your client after it's been off for a week. It immediately connects to Mallory's client and asks it for the last week's block chain. Mallory responds with one or two blocks, each with an appropriate hash. Now Mallory can control your view of the network and transaction history. Oops.

Same problem.

With my scheme, when you connect to a 'good' node later, you take the chain with the higher total difficulty instead of the longest block chain. A reasonable measure of the total difficulty under the current proof of work is the total number of leading zero bits in all the block hashes. In the case you mention, the attacker generates a better single block than some particular historical block, but because it's followed by a bunch of easy blocks, the total number of leading zero bits is much lower than in the real block chain, and hence the attack fails.
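A small Python sketch of that chain-selection rule, summing leading zero bits over 256-bit block hashes (the function names and example numbers are illustrative only):

Code:
def leading_zero_bits(block_hash: int, width: int = 256) -> int:
    """Number of leading zero bits when the hash is read as a width-bit integer."""
    return width - block_hash.bit_length()

def total_difficulty(chain) -> int:
    """The proposed measure: sum of leading zero bits over all block hashes."""
    return sum(leading_zero_bits(h) for h in chain)

# One rewritten super-hard block followed by easy blocks still loses to a
# chain of many moderately hard blocks:
honest = [1 << (256 - 40)] * 100                        # ~39 zero bits apiece
attack = [1 << (256 - 80)] + [1 << (256 - 8)] * 99      # 1 hard + 99 easy
assert total_difficulty(honest) > total_difficulty(attack)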

ByteCoin


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Unorthodox on July 21, 2010, 03:10:54 AM
The biggest issue with this idea, aside from the bandwidth, is that you won't have a good idea of how many computers are generating in the network, or how hard generating will be.

This kind of information is useful when buying/selling bitcoins, as it has an effect on the price.
Also, I wouldn't know myself how easy it would be to generate, possibly wasting CPU power and electricity on my computers.

I'd rather stick with the system in use today.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: NewLibertyStandard on July 21, 2010, 06:07:51 AM
The biggest issue with this idea, aside from the bandwidth, is that you won't have a good idea of how many computers are generating in the network, or how hard generating will be.

This kind of information is useful when buying/selling bitcoins, as it has an effect on the price.
Also, I wouldn't know myself how easy it would be to generate, possibly wasting CPU power and electricity on my computers.

I'd rather stick with the system in use today.
The lowness of accepted blocks' hashes would be a measure of difficulty and network computational power.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: knightmb on July 21, 2010, 06:39:38 AM
The problem you outline exists in the current system. You restart your client after it's been off for a week. It immediately connects to Mallory's client and asks it for the last week's block chain. Mallory responds with one or two blocks, each with an appropriate hash. Now Mallory can control your view of the network and transaction history. Oops.

Same problem.

With my scheme, when you connect to a 'good' node later, you take the chain with the higher total difficulty instead of the longest block chain. A reasonable measure of the total difficulty under the current proof of work is the total number of leading zero bits in all the block hashes. In the case you mention, the attacker generates a better single block than some particular historical block, but because it's followed by a bunch of easy blocks, the total number of leading zero bits is much lower than in the real block chain, and hence the attack fails.

ByteCoin
The client makes at least 8 connections, chosen somewhat at random. One would need to control all of those entry points. Not impossible of course, but fooling one rogue client is one thing; with a bunch of good clients, how do you formulate an attack on all of them?


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Traktion on July 21, 2010, 12:22:17 PM
There could be a good argument for increasing the rate, along with the number of participants, forever.

If every generating node maintains the same constant rate of minting per CPU cycle (i.e. more powerful CPU => more minting), then the coin base will grow along with the node base - which is the user base, which gives us a handle on coin demand. This has been touched on a few times on this forum, but I sense resistance to it.

The reason for doing the above is not to create an inflationary environment, but to keep the number of coins relative to the user base. Failing to do this will put deflationary pressure on the currency; remember, increasing the demand for the currency (the number of Bitcoin users) is the same as decreasing the quantity (of coins for a fixed Bitcoin user base). If you want to retain a steady, neutral value of Bitcoins, then this needs to be considered.

Therefore a constant rate relative to the user base would be ideal. The faster the rate of adoption, the more coins should be created. The reverse could be handled by natural wastage (lost coins), though this could be 'sticky' in the extremes (destroying a proportion of each transaction fee would speed it up).

I'm sure an algorithm could be formulated to achieve the above, with the constant rate not being so high as to be inflationary - the target would be to keep the number of coins proportionate to the user base, thus creating 0% inflation*.

* I know this seems counter intuitive, but in a currency with a non-fixed (hopefully growing) user base, it becomes very important.

[NOTE: There may be an argument for the minting rate to track 'GDP' or some such - perhaps based on the number and value of transactions taking place? If people are economically active enough to have their nodes minting coins, maybe the user base would be a sufficient, or better, metric for creating stability. This is probably another debate in itself, but the above point needs to be agreed on first.]


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: joechip on July 21, 2010, 12:40:57 PM
There could be a good argument for increasing the rate, along with the number of participants, forever.

If every generating node maintains the same constant rate of minting per CPU cycle (i.e. more powerful CPU => more minting), then the coin base will grow along with the node base - which is the user base, which gives us a handle on coin demand. This has been touched on a few times on this forum, but I sense resistance to it.

The reason for doing the above is not to create an inflationary environment, but to keep the number of coins relative to the user base. Failing to do this will put deflationary pressure on the currency; remember, increasing the demand for the currency (the number of Bitcoin users) is the same as decreasing the quantity (of coins for a fixed Bitcoin user base). If you want to retain a steady, neutral value of Bitcoins, then this needs to be considered.


This is simply the Monetarist desire for the rate of monetary increase to equal the output increase of the economy.  It is a false argument the Austrians have debunked for years, and it is one of the factors which has led us to the situation we currently have.  That's been the Fed's policy... to match money growth to economic growth... it's one of their primary mandates, price stability.  NO NO NO.  Price deflation is a GOOD thing.  Your money buys more per unit.  You become more wealthy not only by investing in interest-bearing projects but through the increase in the value of what your savings (which pay no interest) will buy you.  It leads to thrift and investment in projects likely to have a higher return than the rate of price deflation.



Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Traktion on July 21, 2010, 01:39:18 PM
There could be a good argument for increasing the rate, along with the number of participants, forever.

If every generating node maintains the same constant rate of minting per CPU cycle (i.e. more powerful CPU => more minting), then the coin base will grow along with the node base - which is the user base, which gives us a handle on coin demand. This has been touched on a few times on this forum, but I sense resistance to it.

The reason for doing the above is not to create an inflationary environment, but to keep the number of coins relative to the user base. Failing to do this will put deflationary pressure on the currency; remember, increasing the demand for the currency (the number of Bitcoin users) is the same as decreasing the quantity (of coins for a fixed Bitcoin user base). If you want to retain a steady, neutral value of Bitcoins, then this needs to be considered.


This is simply the Monetarist desire for the rate of monetary increase to equal the output increase of the economy.  It is a false argument the Austrians have debunked for years, and it is one of the factors which has led us to the situation we currently have.  That's been the Fed's policy... to match money growth to economic growth... it's one of their primary mandates, price stability.  NO NO NO.  Price deflation is a GOOD thing.  Your money buys more per unit.  You become more wealthy not only by investing in interest-bearing projects but through the increase in the value of what your savings (which pay no interest) will buy you.  It leads to thrift and investment in projects likely to have a higher return than the rate of price deflation.


You get more wealthy for doing nothing, just for being one of the first users of Bitcoin. The more people join, the less they gain from this, until there are few new entrants. It's like a pyramid scheme in that sense - you're being rewarded for doing nothing.

Sure, the currency can be easily divided. Sure, people can use alternatives (Hayek, Denationalisation of Money - good read), but I don't think that helps Bitcoin become the best money, and it will prompt its replacement.

BTW, as the Bitcoin supply will grow for decades according to the current plan, this isn't at odds with my POV. I just don't think that it should stop growing if the user base is still growing; that would be counterproductive. I also think the rate of this growth could be better optimised, rather than being arbitrary.

EDIT: P.S. It's nothing to do with keeping a price index like CPI steady (like the central banks try to do). That's something quite different and I would agree that it's flawed and probably an impossible task too (as most Austrians would agree).


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Bitcoiner on July 21, 2010, 03:48:43 PM
...

The reason for doing the above is not to create an inflationary environment, but to keep the number of coins relative to the user base. Failing to do this will put deflationary pressure on the currency; remember, increasing the demand for the currency (the number of Bitcoin users) is the same as decreasing the quantity (of coins for a fixed Bitcoin user base). If you want to retain a steady, neutral value of Bitcoins, then this needs to be considered.

Why would we want to retain a steady, neutral value of Bitcoins? This would mean we would artificially decrease the value of each Bitcoin when more of them were demanded, and artificially increase the value of each Bitcoin when fewer of them were demanded. This seems completely inverse to what supply & demand is telling you.

Therefore a constant rate relative to the user base would be ideal. ...

Ideal in which sense?

I'm sure an algorithm could be formulated to achieve the above, with the constant rate not being so high as to be inflationary - the target would be to keep the number of coins proportionate to the user base, thus creating 0% inflation*.

How is 0% inflation defined?

[NOTE: There may be an argument for the minting rate to track 'GDP' or some such - perhaps based on the number and value of transactions taking place?

GDP is a poor and unreliable indicator. How would you track the number and value of transactions taking place? How would you quantify the non-monetary aspects of a transaction, which can only be measured subjectively by the actual players involved?

If people are economically active enough to have their nodes minting coins, maybe the user base would be a sufficient, or better, metric for creating stability. This is probably another debate in itself, but the above point needs to be agreed on first.]

I'm sorry, but I don't agree. I've seen a lot of "we should do this and that" with no backing arguments. I'll need a little more than just assertive statements to be convinced. :)


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Traktion on July 21, 2010, 04:29:03 PM
...

The reason for doing the above is not to create an inflationary environment, but to keep the number of coins relative to the user base. Failing to do this will put deflationary pressure on the currency; remember, increasing the demand for the currency (the number of Bitcoin users) is the same as decreasing the quantity (of coins for a fixed Bitcoin user base). If you want to retain a steady, neutral value of Bitcoins, then this needs to be considered.

Why would we want to retain a steady, neutral value of Bitcoins? This would mean we would artificially decrease the value of each Bitcoin when more of them were demanded, and artificially increase the value of each Bitcoin when fewer of them were demanded. This seems completely inverse to what supply & demand is telling you.

The point is to retain the same value. With a limited supply but growing demand, each coin is going to be worth more. Companies and individuals want to know that 1 Bitcoin will be worth roughly* the same now as in many years to come. This makes business and personal planning easier.

If the global population was steady and everybody used Bitcoins, having a steady supply makes sense. This isn't where we are though - we have a growing Bitcoin user base, which needs more Bitcoins to match this.

* Don't mistake this for CPI-style price index tracking (like central banks have attempted over recent decades) - this is just about total coin and user quantities. The reason price index tracking is flawed is that it tries to measure the near impossible, because people's propensity to spend or save changes all the time. However, it is easy to see why price stability is important (future planning), and it also follows that more Bitcoin users fighting over a constant Bitcoin supply is going to affect the price (think supply/demand).

Therefore a constant rate relative to the user base would be ideal. ...

Ideal in which sense?

In keeping 1 Bitcoin worth roughly the same 1, 5 or 10 years later, to help future planning (and therefore, business/savings/security etc) as mentioned above. If you have a growing Bitcoin user base, but a fixed Bitcoin supply, 1 Bitcoin will be worth more YoY. The holder of the Bitcoin would be made more wealthy from doing nothing. This isn't a great basis for a vibrant economy.

The real kicker? The only way demand will grow in the above scenario, will be if the newcomer thinks that there will be someone coming in after him (or they have just assumed 'prices only go up', perhaps because 'there are only so many bitcoins out there'). Classic pyramid/bubble stuff.

I'm sure an algorithm could be formulated to achieve the above, with the constant rate not being so high as to be inflationary - the target would be to keep the number of coins proportionate to the user base, thus creating 0% inflation*.

How is 0% inflation defined?

B = Bitcoins
U = Bitcoin users

Inflation = B/U

In other words, the number of bitcoins should be proportionate to the number of bitcoin users.
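As a toy illustration of that definition in Python (the user counts, supply, and coins-per-user ratio are invented numbers):

Code:
def target_supply(users, coins_per_user=100):
    """Keep total coins proportional to the user base: B = k * U."""
    return users * coins_per_user

def minting_this_period(current_supply, users):
    """Coins to mint so that B/U stays constant (never un-mint on shrinkage)."""
    return max(0, target_supply(users) - current_supply)

# The user base grows from 1,000 to 1,500: mint 50,000 new coins to hold
# the ratio at 100 coins per user.
print(minting_this_period(current_supply=100_000, users=1_500))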

[NOTE: There may be an argument for the minting rate to track 'GDP' or some such - perhaps based on the number and value of transactions taking place?

GDP is a poor and unreliable indicator. How would you track the number and value of transactions taking place? How would you quantify the non-monetary aspects of a transaction, which can only be measured subjectively by the actual players involved?

It's why I suggested there may be an argument. I'm fairly certain that the number of Bitcoin users would be the better metric. However, the swarm would surely know how many transactions were taking place, should this be a useful metric.

If people are economically active enough to have their nodes minting coins, maybe the user base would be a sufficient, or better, metric for creating stability. This is probably another debate in itself, but the above point needs to be agreed on first.]

I'm sorry, but I don't agree. I've seen a lot of "we should do this and that" with no backing arguments. I'll need a little more than just assertive statements to be convinced. :)

Just think it through. I'm sure there are papers on this, but it's just a logical chain of thought - new users competing for existing coins will create additional demand, per coin. Either new users simply won't bother to use Bitcoins or they'll buy in, hoping the price will continue to rise... until one day it doesn't and the value crashes.

If you want Bitcoins to be the best money, then the supply needs to flex with the user base (demand). If you just want them to become a digital commodity, then expect there to be swings in demand and value, which would make it less preferable as money.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Bitcoiner on July 21, 2010, 06:34:50 PM
...

The reason for doing the above is not to create an inflationary environment, but to keep the number of coins relative to the user base. Failing to do this will put deflationary pressure on the currency; remember, increasing the demand for the currency (the number of Bitcoin users) is the same as decreasing the quantity (of coins for a fixed Bitcoin user base). If you want to retain a steady, neutral value of Bitcoins, then this needs to be considered.

Why would we want to retain a steady, neutral value of Bitcoins? This would mean we would artificially decrease the value of each Bitcoin when more of them were demanded, and artificially increase the value of each Bitcoin when fewer of them were demanded. This seems completely inverse to what supply & demand is telling you.

The point is to retain the same value. With a limited supply but growing demand, each coin is going to be worth more. Companies and individuals want to know that 1 Bitcoin will be worth roughly* the same now as in many years to come. This makes business and personal planning easier.

How does it make business and personal planning easier?

If the global population was steady and everybody used Bitcoins, having a steady supply makes sense. This isn't where we are though - we have a growing Bitcoin user base, which needs more Bitcoins to match this.

Again, why? It's not enough to say "we should do this because we should."

* Don't mistake this for CPI-style price index tracking (like central banks have attempted over recent decades) - this is just about total coin and user quantities. The reason price index tracking is flawed is that it tries to measure the near impossible, because people's propensity to spend or save changes all the time. However, it is easy to see why price stability is important (future planning), and it also follows that more Bitcoin users fighting over a constant Bitcoin supply is going to affect the price (think supply/demand).

Ok, I'm glad to see that we see eye to eye about CPI and related measures. However, first, how does maintaining a Bitcoin/user ratio maintain price stability, and second, why is it more desirable than a fixed supply?

Therefore a constant rate relative to the user base would be ideal. ...

Ideal in which sense?

In keeping 1 Bitcoin worth roughly the same 1, 5 or 10 years later, to help future planning (and therefore, business/savings/security etc) as mentioned above. If you have a growing Bitcoin user base, but a fixed Bitcoin supply, 1 Bitcoin will be worth more YoY. The holder of the Bitcoin would be made more wealthy from doing nothing. This isn't a great basis for a vibrant economy.

The real kicker? The only way demand will grow in the above scenario, will be if the newcomer thinks that there will be someone coming in after him (or they have just assumed 'prices only go up', perhaps because 'there are only so many bitcoins out there'). Classic pyramid/bubble stuff.

He's not made wealthier for doing nothing. He's made wealthier for making the decision to hold Bitcoins in expectation of increasing demand. This is not "doing nothing". At every moment he must make a decision, "Should I continue to hold on to my Bitcoins? Or are my needs and purposes better served by spending them in exchange for something else?"

There is also no guarantee that he will be made wealthier. If too many people hoard, then it will be hard to buy Bitcoins, which means that fewer people will use Bitcoins, which means that the value of Bitcoins will go down, which means that the hoarders ultimately lose out. The value of Bitcoin is not set in stone; it is determined by what people are willing to exchange for it. This is a self-correcting equilibrium.


I'm sure an algorithm could be formulated to achieve the above, with the constant rate not being so high as to be inflationary - the target would be to keep the number of coins proportionate to the user base, thus creating 0% inflation*.

How is 0% inflation defined?

B = Bitcoins
U = Bitcoin users

Inflation = B/U

In other words, the number of bitcoins should be proportionate to the number of bitcoin users.

Ok, now we have a metric. You are proposing that the ratio of Bitcoins to users should remain constant. Why would this be superior to the current system? How would you determine the # of users? How would you distribute the coins?

[NOTE: There may be an argument for the minting rate to track 'GDP' or some such - perhaps based on the number and value of transactions taking place?

GDP is a poor and unreliable indicator. How would you track the number and value of transactions taking place? How would you quantify the non-monetary aspects of a transaction, which can only be measured subjectively by the actual players involved?

It's why I suggested there may be an argument. I'm fairly certain that the number of Bitcoin users would be the better metric. However, the swarm would surely know how many transactions were taking place, should this be a useful metric.

If people are economically active enough to have their nodes minting coins, maybe the user base would be a sufficient, or better, metric for creating stability. This is probably another debate in itself, but the above point needs to be agreed on first.]

I'm sorry, but I don't agree. I've seen a lot of "we should do this and that" with no backing arguments. I'll need a little more than just assertive statements to be convinced. :)

Just think it through. I'm sure there are papers on this, but it's just a logical chain of thought - new users competing for existing coins will create additional demand, per coin. Either new users simply won't bother to use Bitcoins or they'll buy in, hoping the price will continue to rise... until one day it doesn't and the value crashes.

If you want Bitcoins to be the best money, then the supply needs to flex with the user base (demand). If you just want them to become a digital commodity, then expect there to be swings in demand and value, which would make it less preferable as money.

I can't read your mind; I'm sure you are convinced that what you are proposing is a great idea, but you need to formulate this in writing. You make statements such as "Bitcoins will be the best money if the supply flexes with the user base", but that doesn't actually explain anything to me. You are telling me what you think, but not why you think that way.

If the value crashes because people speculated and drove the value too high, then what's wrong with that? That's how things should work. Would you prefer that oil prices still be near $200 a barrel? Some speculators lost out, again, so what?

"If you just want them to become a digital commodity, then expect there to be swings in demand and value, which would make it less preferable as money."

How would fixing it to the # of users alleviate this, and how could we accurately measure this? Why are swings in demand and value a bad thing? If anything, they are how a market, through the collective actions of individual players, allocates capital most efficiently. This is based on the axiom that every trade is made in order to make each individual player better off. If Bitcoins rise and fall in value, there are good reasons for it, and we should not try to "hide" this or fight against it.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: NewLibertyStandard on July 21, 2010, 07:43:58 PM
An electronic currency such as Bitcoin is a poor candidate for stable value. Because bitcoins don't have inherent value, they work well in a system where they have a relatively stable price in the short term and can be transmitted easily and reliably, but in the long term can fluctuate in either direction to meet supply and demand. A vitally necessary commodity such as water works much better at holding a constant value. Its value may shift somewhat during conflicts or when new ways of producing it become prevalent, but in comparison to something like Bitcoin, its value will be retained very well over time. There you have it. You should invest in companies that provide clean water, buy a water tower, or buy and sell bottled water.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Traktion on July 22, 2010, 12:46:16 AM
Bitcoiner, I'll post up a reply tomorrow (it's late here), but I've covered some of the points in the other thread. To summarise in one sentence, fluctuations will always happen in a free market, but restricting supply when demand is growing will not help matters.

As I said before, if you want Bitcoins to behave like digital commodities, that's fine. It may not make it the most stable currency though, which is surely what most people want from their ideal money.

[A bit of a tangent - BTW, this is also why a basket of commodities is often said to be better than a single commodity, as backing for money - they all fluctuate, but the hope is, overall they will be relatively stable, with some increasing in supply as demand dictates (like grain, drinking water etc). I have sympathy for that argument too, for similar reasons to the above.]


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: QuantumMechanic on July 22, 2010, 07:46:49 AM

You get more wealthy for doing nothing, just for being one of the first users of Bitcoin. The more people join, the less they gain from this, until there are few new entrants. It's like a pyramid scheme in that sense - you're being rewarded for doing nothing.


Notice that you can say the same thing about all speculative buying.  Speculators perform the valuable function of finding market prices which, ideally, reflect economic realities.

Currencies need market prices which reflect economic realities, too - economic realities being the present and expected future degree of usage of the currency.


Title: Re: Get rid of "difficulty" and maintain a constant rate.
Post by: Noctis Connor on June 12, 2018, 09:27:04 PM
The biggest issue with this idea, aside from the bandwidth, is that you won't have a good idea of how many computers are generating in the network, or how hard generating will be.

This kind of information is useful when buying/selling bitcoins, as it has an effect on the price.
Also, I wouldn't know myself how easy it would be to generate, possibly wasting CPU power and electricity on my computers.

I'd rather stick with the system in use today.
The lowness of accepted blocks' hashes would be a measure of difficulty and network computational power.

I agree with you. Since the difficulty comes from the system itself, a constant rate would be determined by how many computers are in the system and by how the network behaves in that regard.