molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
November 27, 2012, 12:25:03 PM |
|
200 blocks left... It can happen tomorrow. Given that "tomorrow" can mean a lot of things depending on the timezone, this is very likely to be true.
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
runeks
Legendary
Offline
Activity: 980
Merit: 1008
|
|
November 27, 2012, 02:21:33 PM |
|
true, but nite's suggestion would amplify changes in difficulty. for example let's say the network hashrate doubles and blocks are solved every 5 minutes. we would have a huge difficulty adjustment sure, but under nite's system, it would assume the network hashrate was going to double *again*.
It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa. However, it would make things go better when the hash rate increases or decreases steadily, or stays constant, and that is the case 99% of the time. I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again in the same time. Technically the addition would be very simple: just add "+ delta hashrate" to the formula.

We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network will have spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. about 9:45 per block. Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything; the 2.5% difference will make no measurable difference to how the network operates at all.
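A quick back-of-the-envelope check of those numbers (just a sketch, assuming the genesis block date of 3 January 2009 and block 210,000 landing around 28 November 2012):

```python
from datetime import date

genesis = date(2009, 1, 3)           # genesis block date
block_210000 = date(2012, 11, 28)    # roughly when block 210,000 was expected

elapsed_days = (block_210000 - genesis).days   # ~1425 days
four_years_days = 4 * 365 + 1                  # 1461 days, including one leap day
days_early = four_years_days - elapsed_days    # ~36 days

avg_block_minutes = elapsed_days * 24 * 60 / 210_000   # ~9.77 min, i.e. about 9:46
speedup = days_early / four_years_days                 # ~2.5%

print(f"elapsed: {elapsed_days} days, {days_early} days ahead of schedule")
print(f"average block time: {avg_block_minutes:.2f} min "
      f"({speedup:.1%} faster than 10 min)")
```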
|
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
November 27, 2012, 02:24:25 PM |
|
true, but nite's suggestion would amplify changes in difficulty. for example let's say the network hashrate doubles and blocks are solved every 5 minutes. we would have a huge difficulty adjustment sure, but under nite's system, it would assume the network hashrate was going to double *again*.
It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa. However, it would make things go better when the hash rate increases or decreases steadily, or stays constant, and that is the case 99% of the time. I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again in the same time. Technically the addition would be very simple: just add "+ delta hashrate" to the formula.

We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network will have spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. about 9:45 per block. Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything; the 2.5% difference will make no measurable difference to how the network operates at all.

I agree. 10 minutes was arbitrarily chosen anyway. Accuracy is not a goal here.
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
Nite69
|
|
November 27, 2012, 02:42:27 PM |
|
true, but nite's suggestion would amplify changes in difficulty. for example let's say the network hashrate doubles and blocks are solved every 5 minutes. we would have a huge difficulty adjustment sure, but under nite's system, it would assume the network hashrate was going to double *again*.
It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa. However, it would make things go better when the hash rate increases or decreases steadily, or stays constant, and that is the case 99% of the time. I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again in the same time. Technically the addition would be very simple: just add "+ delta hashrate" to the formula.

We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network will have spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. about 9:45 per block. Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything; the 2.5% difference will make no measurable difference to how the network operates at all.

I agree. 10 minutes was arbitrarily chosen anyway. Accuracy is not a goal here.

That's pretty much true, i.e. this is more a question of opinion. In practice the functionality of Bitcoin would not change, and the current algorithm is very good. Just being as pragmatic as I am, I would have done it differently.
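For concreteness, a minimal sketch of one possible reading of the "+ delta hashrate" idea next to the current rule. The 2016-block window, the 4x clamp and the real retargeting code are left out; the function names, the usual ~2^32-hashes-per-difficulty-unit approximation and the example numbers are illustrative assumptions, not Bitcoin's actual code:

```python
TARGET_SPACING = 600  # target seconds per block

def difficulty_for(hashrate):
    """Difficulty that averages one block per 10 minutes at `hashrate` hashes/s,
    using the usual approximation of ~2^32 hashes per unit of difficulty."""
    return hashrate * TARGET_SPACING / 2**32

def current_rule(measured_hashrate):
    # Today's rule, in effect: retarget to the hash rate measured over the last period.
    return difficulty_for(measured_hashrate)

def predictive_rule(previous_hashrate, measured_hashrate):
    # One reading of the suggestion above: assume the hash rate keeps changing by
    # the same amount, i.e. add "+ delta hashrate" before retargeting.
    delta = measured_hashrate - previous_hashrate
    return difficulty_for(measured_hashrate + delta)

# Example: the measured hash rate doubled from 15 TH/s to 30 TH/s over one period.
prev, now = 15e12, 30e12
print(current_rule(now))           # difficulty matched to 30 TH/s
print(predictive_rule(prev, now))  # difficulty matched to an expected 60 TH/s
```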
|
Sync: ShiSKnx4W6zrp69YEFQyWk5TkpnfKLA8wx Bitcoin: 17gNvfoD2FDqTfESUxNEmTukGbGVAiJhXp Litecoin: LhbDew4s9wbV8xeNkrdFcLK5u78APSGLrR AuroraCoin: AXVoGgYtSVkPv96JLL7CiwcyVvPxXHXRK9
|
|
|
dooglus
Legendary
Offline
Activity: 2940
Merit: 1333
|
|
November 27, 2012, 06:57:19 PM |
|
That's pretty much true, i.e. this is more a question of opinion. In practice the functionality of Bitcoin would not change, and the current algorithm is very good. Just being as pragmatic as I am, I would have done it differently.
We can argue as to whether Nite69's way would have been better or not, but now that the network is already up and running it would be dangerous to make such a change. Old clients would reject blocks which meet the new difficulty requirements but not the old ones, resulting in a hard split of the blockchain. Any marginal improvement the change might bring would certainly be offset by the difficulty of getting all the old clients updated.
|
Just-Dice | Play or Invest | 1% House Edge
|
|
|
MatthewLM
Legendary
Offline
Activity: 1190
Merit: 1004
|
|
November 27, 2012, 08:00:22 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
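A minimal sketch of that idea, assuming a clamp-free form and illustrative names (this is not Bitcoin's actual retargeting code):

```python
TARGET_SPACING = 600  # target seconds per block

def whole_history_retarget(difficulty, height, genesis_time, current_time):
    """Scale difficulty by how far the chain is ahead of, or behind, the schedule
    of one block per 10 minutes counted from block 0."""
    expected_elapsed = height * TARGET_SPACING    # where the chain "should" be, in seconds
    actual_elapsed = current_time - genesis_time  # where it actually is
    # Ahead of schedule (actual < expected) -> raise difficulty; behind -> lower it.
    return difficulty * expected_elapsed / actual_elapsed
```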
|
|
|
|
Nite69
|
|
November 27, 2012, 08:14:58 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
Well, yes. That would be even better. Perfect. Actually, my suggestion would cause *some* problems if the hash rate dropped to half. ;-) But it is also true that this is just speculation. It cannot be changed any more.
|
Sync: ShiSKnx4W6zrp69YEFQyWk5TkpnfKLA8wx Bitcoin: 17gNvfoD2FDqTfESUxNEmTukGbGVAiJhXp Litecoin: LhbDew4s9wbV8xeNkrdFcLK5u78APSGLrR AuroraCoin: AXVoGgYtSVkPv96JLL7CiwcyVvPxXHXRK9
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
November 27, 2012, 09:36:40 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning.
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
phelix
Legendary
Offline
Activity: 1708
Merit: 1020
|
|
November 27, 2012, 10:36:47 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning. I see problems. after random long blocks you would have to set a minimum difficulty. what is the problem with the current design? just set your watch according to bitcoin blocks if you must, not the other way round
|
|
|
|
Nite69
|
|
November 27, 2012, 10:56:22 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning. I see problems. after random long blocks you would have to set a minimum difficulty. what is the problem with the current design? just set your watch according to bitcoin blocks if you must, not the other way round Only after a block lasting 2 weeks. Edit: no, not even then. Halving the difficulty would be enought.
|
Sync: ShiSKnx4W6zrp69YEFQyWk5TkpnfKLA8wx Bitcoin: 17gNvfoD2FDqTfESUxNEmTukGbGVAiJhXp Litecoin: LhbDew4s9wbV8xeNkrdFcLK5u78APSGLrR AuroraCoin: AXVoGgYtSVkPv96JLL7CiwcyVvPxXHXRK9
|
|
|
Fjordbit
|
|
November 27, 2012, 11:00:22 PM |
|
Just so people know, this block could be found within 2 seconds of the prior block, or 2 hours after it. That's unlikely, but I did once see a block take over 60 minutes. Because of this, bitcoinclock can give an approximation, but you really need to stay tuned to blockchain.info.
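For context, block arrivals are usually modelled as a Poisson process, so at constant hash rate and difficulty the gap between blocks is exponentially distributed with a 10-minute mean. A small sketch of how likely such gaps are:

```python
from math import exp

mean_minutes = 10
for gap in (2 / 60, 60, 90, 120):   # 2 seconds, then 60, 90 and 120 minutes
    p = exp(-gap / mean_minutes)    # P(next block takes longer than `gap`)
    print(f"P(gap > {gap:g} min) = {p:.4%}")
# gap > 60 min: ~0.25% of blocks (a few times a week at ~144 blocks/day);
# gap > 90 min: ~0.012%; gap > 120 min: ~0.0006%.
```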
|
|
|
|
MatthewLM
Legendary
Offline
Activity: 1190
Merit: 1004
|
|
November 27, 2012, 11:06:26 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning.

Yes. It's just another way it could have been done, so I was being academic.
|
|
|
|
payb.tc
|
|
November 28, 2012, 04:04:17 AM |
|
Just so people know, this block could be found within 2 seconds of the prior block, or 2 hours after it. That's unlikely, but I did once see a block take over 60 minutes. Because of this, bitcoinclock can give an approximation, but you really need to stay tuned to blockchain.info.
Yeah, I still remember this 90-minute one... https://bitcointalk.org/index.php?topic=118930.0

What do you mean by "stay tuned to blockchain.info"? For what, the block count? That part isn't an approximation.
|
|
|
|
jl2035
|
|
November 28, 2012, 02:40:57 PM |
|
3 blocks left
|
|
|
|
bitcon
Legendary
Offline
Activity: 2212
Merit: 1008
|
|
November 28, 2012, 03:00:13 PM |
|
|
|
|
|
|
maaku
Legendary
Offline
Activity: 905
Merit: 1012
|
|
November 28, 2012, 06:23:04 PM |
|
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.
|
I'm an independent developer working on bitcoin-core, making my living off community donations. If you like my work, please consider donating yourself: 13snZ4ZyCzaL7358SmgvHGC9AxskqumNxP
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
November 28, 2012, 06:34:06 PM |
|
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.
Let's wait until December has passed before judging this. I doubt it's been priced in correctly.
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
dooglus
Legendary
Offline
Activity: 2940
Merit: 1333
|
|
November 28, 2012, 06:43:32 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
What happens when ASICs become cheap and available, and the network hash rate goes up by a factor of 10, say? Suppose the difficulty only just changed, so we were expecting it to take 2 weeks until the next difficulty change. We're running 10 times normal speed, so the difficulty changes in just 1.4 days instead of in 2 weeks.

Under the current scheme, the difficulty is adjusted by the maximum factor of 4, bringing block generation time up to a more reasonable 4 minutes, and on the next adjustment it's adjusted by another 2.5x taking us back to 1 block per 10 minutes.

With your proposed scheme, after the 1.4 days there would be very little adjustment. The average time-per-block over the 4 year history of bitcoin won't have been affected much by the fact that the last 2 weeks' worth of blocks was found in just 1.4 days, because 12.6 days is pretty insignificant compared to the 4 year history.

So the adjustment will be minor, and we'll continue seeing blocks every minute. It will take quite a while until the 10 minutes per block status quo is achieved again.
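A rough sketch of that thought experiment, under the stated assumptions (a chain exactly on schedule for 210,000 blocks, a sudden 10x hash-rate jump, a 2016-block window with a 4x clamp for the current rule, and a clamp-free whole-history rule):

```python
TARGET = 600                                  # target seconds per block
history_blocks = 210_000
history_seconds = history_blocks * TARGET     # assume the chain was exactly on schedule

# 10x hash rate: the next 2016 blocks arrive in a tenth of the usual two weeks.
window_blocks = 2016
window_seconds = window_blocks * TARGET / 10  # ~1.4 days

# Current rule (clamped to a factor of 4 per adjustment):
current_factor = min(4, window_blocks * TARGET / window_seconds)   # -> 4.0

# Whole-history rule: total expected time vs. total actual time since block 0.
expected = (history_blocks + window_blocks) * TARGET
actual = history_seconds + window_seconds
whole_history_factor = expected / actual                            # -> ~1.009

print(f"current rule raises difficulty by  x{current_factor:.2f}")
print(f"whole-history rule raises it by    x{whole_history_factor:.3f}")
# With 10x the hash rate but only a ~1% difficulty bump, blocks keep arriving
# roughly once a minute, and it takes many more windows to creep back to 10 minutes.
```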
|
Just-Dice | Play or Invest | 1% House Edge
|
|
|
molecular
Donator
Legendary
Offline
Activity: 2772
Merit: 1019
|
|
November 28, 2012, 06:49:22 PM |
|
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
What happens when ASICs become cheap and available, and the network hash rate goes up by a factor of 10, say? Suppose the difficulty only just changed, so we were expecting it to take 2 weeks until the next difficulty change. We're running 10 times normal speed, so the difficulty changes in just 1.4 days instead of in 2 weeks.

Under the current scheme, the difficulty is adjusted by the maximum factor of 4, bringing block generation time up to a more reasonable 4 minutes, and on the next adjustment it's adjusted by another 2.5x taking us back to 1 block per 10 minutes.

With your proposed scheme, after the 1.4 days there would be very little adjustment. The average time-per-block over the 4 year history of bitcoin won't have been affected much by the fact that the last 2 weeks' worth of blocks was found in just 1.4 days, because 12.6 days is pretty insignificant compared to the 4 year history.

So the adjustment will be minor, and we'll continue seeing blocks every minute. It will take quite a while until the 10 minutes per block status quo is achieved again.

I'm having problems thinking about this. At first glance I thought like dooglus (bolded part). Then I thought a little about it, had some sort of unclear eureka moment (if such a thing exists), and thought it would work really well. Now I'm back to dooglus' way of looking at it. This is really moot to discuss since everyone agrees, but how exactly would the new difficulty be calculated?
|
PGP key molecular F9B70769 fingerprint 9CDD C0D3 20F8 279F 6BE0 3F39 FC49 2362 F9B7 0769
|
|
|
|