Topic: Date for 25 BTC per Block
molecular
November 27, 2012, 12:25:03 PM  #161

Quote
200 blocks left... It can happen tomorrow... :)

given that "tomorrow" can mean a lot of things depending on timezone, this is very likely to be true.

runeks
November 27, 2012, 02:21:33 PM  #162

Quote
true, but nite's suggestion would amplify changes in difficulty. for example let's say the network hashrate doubles and blocks are solved every 5 minutes. we would have a huge difficulty adjustment sure, but under nite's system, it would assume the network hashrate was going to double *again*.

Quote from: Nite69
It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa. But it would do better when the hashrate increases or decreases steadily, or stays constant, and that is the case 99% of the time.

I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again in the same time.

Technically the addition would be very simple: just add "+ delta hashrate" to the formula.
We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network will have spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. roughly 9 minutes 45 seconds per block.

Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything; the 2.5% difference will make no measurable difference to how the network operates.
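
(Editor's note: the 2.5% figure roughly checks out. A quick sketch below, assuming only the genesis-block date of 2009-01-03 and the halving date of 2012-11-28; the small gap to runeks' numbers comes down to how you round "four years".)

Code:
from datetime import datetime

# Back-of-the-envelope check of the average block time over the first
# 210,000 blocks. Dates: genesis block 2009-01-03, halving 2012-11-28.
genesis = datetime(2009, 1, 3)
halving = datetime(2012, 11, 28)

elapsed_min = (halving - genesis).total_seconds() / 60
avg_block_min = elapsed_min / 210_000

print(f"average block time: {avg_block_min:.2f} minutes")     # ~9.77
print(f"faster than target: {1 - avg_block_min / 10:.1%}")    # ~2.3%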
molecular
November 27, 2012, 02:24:25 PM  #163

Quote from: runeks
Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything; the 2.5% difference will make no measurable difference to how the network operates.

I agree. 10 minutes is arbitrarily chosen anyway. Accuracy is not a goal here.

Nite69
November 27, 2012, 02:42:27 PM  #164

Quote from: molecular
I agree. 10 minutes is arbitrarily chosen anyway. Accuracy is not a goal here.

That's pretty much true; this is more a question of opinion. The functionality of Bitcoin would practically not change, and the current algorithm is very good. Being as pragmatic as I am, I would just have done it differently.
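
(Editor's note: to make the proposal concrete, here is a minimal sketch of one reading of "just add '+ delta hashrate'", next to the current rule. The extrapolation step is this editor's interpretation, not code from any client, and the real rule's 4x clamp is omitted.)

Code:
TARGET_TIMESPAN = 14 * 24 * 60 * 60  # two weeks per 2016-block window

def current_retarget(old_difficulty, actual_timespan):
    # Bitcoin's rule (clamp omitted): scale difficulty so the last
    # window would have taken exactly two weeks.
    return old_difficulty * TARGET_TIMESPAN / actual_timespan

def predictive_retarget(old_difficulty, actual_timespan, prev_timespan):
    # Nite69's idea as read here: extrapolate the hashrate trend one
    # window forward. If hashrate doubled over the last window, assume
    # it doubles again over the next one.
    rate_now = TARGET_TIMESPAN / actual_timespan   # relative hashrate, this window
    rate_prev = TARGET_TIMESPAN / prev_timespan    # relative hashrate, previous window
    return old_difficulty * rate_now * (rate_now / rate_prev)

On a steady exponential trend this centres the block rate on 10 minutes instead of lagging one window behind; on a trend reversal it overshoots by the same factor, which is exactly the trade-off discussed above.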

dooglus
November 27, 2012, 06:57:19 PM  #165

Quote from: Nite69
That's pretty much true; this is more a question of opinion. The functionality of Bitcoin would practically not change, and the current algorithm is very good. Being as pragmatic as I am, I would just have done it differently.

We can argue as to whether Nite69's way would have been better or not, but now that the network is already up and running it would be dangerous to make such a change. Old clients would reject blocks which meet the new difficulty requirements but not the old ones, resulting in a hard split of the blockchain.

Any marginal improvement the change might bring is certainly outweighed by the difficulty of getting all the old clients updated.

MatthewLM
November 27, 2012, 08:00:22 PM  #166

If I designed the difficulty algorithm, I'd look at the time taken from block 0 to the current block, compare it to the number of blocks times 10 minutes, and adjust difficulty that way. So instead of the difficulty adjusting to the recent block rate, the difficulty adjusts toward the expected number of blocks for all time.
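
(Editor's note: a minimal sketch of this whole-history rule as stated, assuming difficulty is still recalculated periodically; the genesis timestamp is real, the function is illustrative.)

Code:
GENESIS_TIME = 1231006505   # block 0 timestamp, 2009-01-03 UTC
TARGET_SPACING = 600        # ten minutes, in seconds

def whole_history_retarget(old_difficulty, height, tip_time):
    # Compare total elapsed time since block 0 with height * 10 minutes.
    # Ahead of the all-time schedule -> raise difficulty; behind -> lower it.
    expected = height * TARGET_SPACING
    actual = tip_time - GENESIS_TIME
    return old_difficulty * expected / actual

Since the correction is driven by the cumulative error since block 0, the older the chain gets, the less any recent hashrate change can move the ratio; dooglus works through the consequence of that in #179 below.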
Nite69
November 27, 2012, 08:14:58 PM  #167

Quote from: MatthewLM
If I designed the difficulty algorithm, I'd look at the time taken from block 0 to the current block, compare it to the number of blocks times 10 minutes, and adjust difficulty that way.

Well, yes. That would be even better. Perfect.

Actually, in my suggestion it would cause *some* problems if the hash rate dropped to half... ;-)

But it's true, this is just speculating; it can't be changed any more.

molecular
November 27, 2012, 09:36:40 PM  #168

Quote from: MatthewLM
If I designed the difficulty algorithm, I'd look at the time taken from block 0 to the current block, compare it to the number of blocks times 10 minutes, and adjust difficulty that way.

Intriguing. Seems to me it would work. Of course, it would have had to be introduced from the beginning.

phelix
November 27, 2012, 10:36:47 PM  #169

Quote from: molecular
Intriguing. Seems to me it would work. Of course, it would have had to be introduced from the beginning.

I see problems: after randomly long blocks you would have to set a minimum difficulty. And what is the problem with the current design? Just set your watch according to Bitcoin blocks if you must, not the other way round.
Nite69
November 27, 2012, 10:56:22 PM  #170

Quote from: phelix
I see problems: after randomly long blocks you would have to set a minimum difficulty.

Only after a block lasting two weeks.

Edit: no, not even then. Halving the difficulty would be enough.

Fjordbit
November 27, 2012, 11:00:22 PM  #171

Just so people know, this block could be found within 2 seconds of the prior block, or 2 hours after it. That's unlikely, but I did see a block once take over 60 minutes. Because of this, bitcoinclock can give an approximation, but you really need to stay tuned to blockchain.info.
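
(Editor's note: to put a number on "unlikely": at constant hashrate, block intervals are approximately exponentially distributed, so the tail probabilities are easy to compute. The model is standard; applying it here is this editor's addition.)

Code:
import math

# With a 10-minute mean, P(interval > t) = exp(-t / 10) for t in minutes.
mean_min = 10.0
for t in (2 / 60, 60, 120):  # 2 seconds, 60 minutes, 120 minutes
    print(f"P(interval > {t:g} min) = {math.exp(-t / mean_min):.4%}")

P(> 60 min) comes out around 0.25%; with roughly 144 blocks a day, an over-an-hour block shows up every two or three days: rare per block, routine per week.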
MatthewLM
November 27, 2012, 11:06:26 PM  #172

Quote from: molecular
Intriguing. Seems to me it would work. Of course, it would have had to be introduced from the beginning.

Yes. It's just another way it could have been done, so I was being academic.
payb.tc
November 28, 2012, 04:04:17 AM  #173

Quote from: Fjordbit
Just so people know, this block could be found within 2 seconds of the prior block, or 2 hours after it. [...] Because of this, bitcoinclock can give an approximation, but you really need to stay tuned to blockchain.info.

yeah, I still remember this 90-minute one... https://bitcointalk.org/index.php?topic=118930.0

what do you mean by "stay tuned to blockchain.info"? for what, the block count? that part isn't an approximation :)

jl2035
November 28, 2012, 02:40:57 PM  #174

3 blocks left :)

bitcon
November 28, 2012, 03:00:13 PM  #175

woohoo! today is the big day! :) :) :)
Stephen Gornick
November 28, 2012, 06:19:57 PM  #176

So, it ended up being November 28th in almost every populated timezone.

Specifically, 2012-11-28 15:24:38 UTC
 - http://blockchain.info/block-index/322335/000000000000048b95347e83192f69cf0366076336c639f9b7228e9ba171342e

I was really expecting some larger exchange-rate volatility over the last few days, and expecting GPU miners to already be selling in quantity on eBay.

But it has been pretty quiet.

maaku
November 28, 2012, 06:23:04 PM  #177

I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.

molecular
November 28, 2012, 06:34:06 PM  #178

Quote from: maaku
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.

Let's wait until December has passed before judging this. I doubt it's been priced in correctly.

dooglus
November 28, 2012, 06:43:32 PM  #179

Quote from: MatthewLM
If I designed the difficulty algorithm, I'd look at the time taken from block 0 to the current block, compare it to the number of blocks times 10 minutes, and adjust difficulty that way.

What happens when ASICs become cheap and available, and the network hash rate goes up by a factor of, say, 10?

Suppose the difficulty only just changed, so we were expecting it to take 2 weeks until the next difficulty change. We're running at 10 times normal speed, so the difficulty changes in just 1.4 days instead of in 2 weeks.

Under the current scheme, the difficulty is adjusted by the maximum factor of 4, bringing block generation time up to a more reasonable 4 minutes, and on the next adjustment it's adjusted by another 2.5x, taking us back to 1 block per 10 minutes.

With your proposed scheme, after the 1.4 days there would be very little adjustment. The average time-per-block over the 4-year history of Bitcoin won't have been affected much by the fact that the last 2 weeks' worth of blocks was found in just 1.4 days, because 12.6 days is pretty insignificant compared to the 4-year history. So the adjustment would be minor, and we'd continue seeing blocks every minute. It would take quite a while until the 10-minutes-per-block status quo was achieved again.
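
(Editor's note: plugging dooglus's scenario into the sketch from #166 confirms the point; the 207,000-block baseline standing in for "about 4 years on schedule" is this editor's round number.)

Code:
TARGET = 600  # seconds per block

# Four years exactly on schedule, then hashrate jumps 10x and the next
# 2016 blocks arrive at 60-second spacing.
height = 207_000
elapsed = height * TARGET

height += 2016
elapsed += 2016 * 60

print(f"whole-history adjustment: x{height * TARGET / elapsed:.4f}")    # ~x1.0088
print(f"current rule, pre-clamp:  x{2016 * TARGET / (2016 * 60):.1f}")  # x10.0, clamped to x4

So a 10x hashrate jump moves the whole-history difficulty by less than 1%, and blocks keep arriving roughly every minute, exactly as described above.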

molecular
November 28, 2012, 06:49:22 PM  #180

Quote from: dooglus
With your proposed scheme, after the 1.4 days there would be very little adjustment. [...] So the adjustment would be minor, and we'd continue seeing blocks every minute. It would take quite a while until the 10-minutes-per-block status quo was achieved again.

I'm having trouble thinking about this. At first glance I thought like dooglus (the quoted part). Then I thought about it a little, had some sort of unclear eureka moment (if such a thing exists), and decided it would work really well. Now I'm back to dooglus's way of looking at it.

This is really moot to discuss, as everyone agrees, but how exactly would the new difficulty be calculated?
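
(Editor's note: to close the loop on the question, the obvious concrete reading reuses the sketch from #166 and adds the clamp that phelix's objection in #169 points toward. This is an extrapolation by the editor; no poster specified it.)

Code:
GENESIS_TIME = 1231006505
TARGET_SPACING = 600

def next_difficulty(old_difficulty, height, tip_time):
    expected = height * TARGET_SPACING   # "number of blocks times 10 minutes"
    actual = tip_time - GENESIS_TIME     # "time taken from block 0"
    factor = expected / actual
    # On a 4-year baseline a freak multi-hour block barely moves `actual`,
    # but early in the chain's life it could, so a bound like the current
    # rule's 4x clamp would still be prudent.
    return old_difficulty * max(0.25, min(4.0, factor))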
