Bitcoin Forum
Author Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool  (Read 2034858 times)
norgan
Sr. Member
****
Offline Offline

Activity: 308

Decentralize your hashing - p2pool - Norgz Pool


View Profile WWW
June 10, 2014, 02:41:45 AM
 #8901

So, right before the latest block was found and the usual virgin-generated BTC payout went out, there was another, much smaller payout that wasn't generated coin, but it did go to p2pool-associated wallet addresses.

Was that a general donation to P2Pool miners, or was that mdude77's merged mining donations being sent out to everyone?

Wasn't me.

M

Oh well... maybe a donation then?  Good news regardless...

Wait, none of you guys got this little tip in your P2Pool associated addresses?

I'm not in p2pool.  I'm not wasting 10% of my hashpower pointing my S2s at it. 

M

P2pool not worth it with S2?

Kano is working on a driver that may help this. It's always worth it, but you need to be OK with initially losing 10% due to efficiency issues with p2pool. Hopefully Kano can get the new cgminer driver out for the S2 soon Smiley

Miner, tech geek, operator of NorgzPool - Sydney Australia P2Pool Node creator of p2pool fancy front end

Tips: 1NorganBbymShTN2MMpfGzRYJF8mcPeXjv Exchange BTC locally in Australia or Donate to p2pool miners
bitdigger2013
Sr. Member
****
Offline Offline

Activity: 354


View Profile WWW
June 10, 2014, 02:48:19 AM
 #8902

So, right before the latest block was found and the usual virgin-generated BTC payout went out, there was another, much smaller payout that wasn't generated coin, but it did go to p2pool-associated wallet addresses.

Was that a general donation to P2Pool miners, or was that mdude77's merged mining donations being sent out to everyone?

Wasn't me.

M

Oh well... maybe a donation then?  Good news regardless...

Wait, none of you guys got this little tip in your P2Pool associated addresses?

I'm not in p2pool.  I'm not wasting 10% of my hashpower pointing my S2s at it.  

M

P2pool not worth it with S2?

Kano is working on a driver that may help this. It's always worth it, but you need to be OK with initially losing 10% due to efficiency issues with p2pool. Hopefully Kano can get the new cgminer driver out for the S2 soon Smiley

If I'm losing 10% of hash power, it would be more profitable to stay on a regular pool until this is solved. I don't mind the variance, but if it's less profitable for an S2 on p2pool than on a regular pool, it kind of defeats the purpose. Same issue for the S1?

This new driver would be offered through a Bitmain update?
jedimstr
Hero Member
*****
Offline Offline

Activity: 784



View Profile
June 10, 2014, 02:52:33 AM
 #8903

So, right before the latest block was found and the usual virgin-generated BTC payout went out, there was another, much smaller payout that wasn't generated coin, but it did go to p2pool-associated wallet addresses.

Was that a general donation to P2Pool miners, or was that mdude77's merged mining donations being sent out to everyone?

Wasn't me.

M

Oh well... maybe a donation then?  Good news regardless...

Wait, none of you guys got this little tip in your P2Pool associated addresses?

I'm not in p2pool.  I'm not wasting 10% of my hashpower pointing my S2s at it. 

M

P2pool not worth it with S2?

Kano is working on a driver that may help this. It's always worth it, but you need to be OK with initially losing 10% due to efficiency issues with p2pool. Hopefully Kano can get the new cgminer driver out for the S2 soon Smiley

If I'm losing 10% of hash power, it would be more profitable to stay on a regular pool until this is solved. I don't mind the variance, but if it's less profitable for an S2 on p2pool than on a regular pool, it kind of defeats the purpose. Same issue for the S1?

This new driver would be offered through a Bitmain update?

Kano already has an updated CGMiner for the S1 that works great with p2pool (and other pools).
You can get it here: https://github.com/kanoi/cgminer-binaries/tree/master/AntS1

He's still working on the S2 version.

bitdigger2013
Sr. Member
****
Offline Offline

Activity: 354


View Profile WWW
June 10, 2014, 03:03:58 AM
 #8904

So, right before the latest block was found and the usual virgin-generated BTC payout went out, there was another, much smaller payout that wasn't generated coin, but it did go to p2pool-associated wallet addresses.

Was that a general donation to P2Pool miners, or was that mdude77's merged mining donations being sent out to everyone?

Wasn't me.

M

Oh well... maybe a donation then?  Good news regardless...

Wait, none of you guys got this little tip in your P2Pool associated addresses?

I'm not in p2pool.  I'm not wasting 10% of my hashpower pointing my S2s at it. 

M

P2pool not worth it with S2?

Kano is working on a driver that may help this. It's always worth it, but you need to be OK with initially losing 10% due to efficiency issues with p2pool. Hopefully Kano can get the new cgminer driver out for the S2 soon Smiley

If I'm losing 10% of hash power, it would be more profitable to stay on a regular pool until this is solved. I don't mind the variance, but if it's less profitable for an S2 on p2pool than on a regular pool, it kind of defeats the purpose. Same issue for the S1?

This new driver would be offered through a Bitmain update?

Kano already has an updated CGMiner for the S1 that works great with p2pool (and other pools).
You can get it here: https://github.com/kanoi/cgminer-binaries/tree/master/AntS1

He's still working on the S2 version.

Thanks for the link. I have not put my S1s on my p2pool yet because, with my S2 on it, the S1s get a high diff. Now that I hear the S2 is not currently efficient on p2pool, I may swap: put my S1s on it with the new cgminer and put my S2 on a regular pool.

I assume you void the warranty when you upgrade cgminer this way Smiley

Is it possible to restore the default after this update by using the reset button on the S1, or would I have to remove the updated cgminer manually?
norgan
Sr. Member
****
Offline Offline

Activity: 308

Decentralize your hashing - p2pool - Norgz Pool


View Profile WWW
June 10, 2014, 03:29:07 AM
 #8905



Thanks for the link. I have not put my S1s on my p2pool yet because, with my S2 on it, the S1s get a high diff. Now that I hear the S2 is not currently efficient on p2pool, I may swap: put my S1s on it with the new cgminer and put my S2 on a regular pool.

I assume you void the warranty when you upgrade cgminer this way Smiley

Is it possible to restore the default after this update by using the reset button on the S1, or would I have to remove the updated cgminer manually?

You can revert easily, as the update only renames the old cgminer; you simply rename it back to return to the default. I don't think changing cgminer could void the hardware warranty, but you'll need to ask Bitmain about that.
If you have your S1s and the S2 on your node, then use the diff setting specified a couple of pages back.

Essentially you add /0+diff to the end of your miner address; for an S1 at around 200GH/s it's /0+230

Quote
If you're running S1s, set <share> to 0 ("0" defaults to the lowest p2pool diff, currently 1677854.73)

And <pseudo_share> to 220.4 (optimized for 190GH/s)

<pseudo_share> is calculated as your hash rate in KH/s times 0.00000116

i.e. 190,000,000 * 0.00000116 = 220.4
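The rule of thumb in that quote can be sketched as a quick calculation (the 0.00000116 factor and the /0+diff username suffix come from the quoted post; the function names and example address are just for illustration):

```python
def pseudo_share_diff(hashrate_ghs):
    # Rule of thumb from the post: hash rate in KH/s times 0.00000116
    khs = hashrate_ghs * 1_000_000  # 1 GH/s = 1,000,000 KH/s
    return khs * 0.00000116

def p2pool_username(address, hashrate_ghs):
    # "/0" keeps the lowest p2pool share diff; "+N" sets the pseudo-share diff
    return "%s/0+%.1f" % (address, pseudo_share_diff(hashrate_ghs))

print(pseudo_share_diff(190))                # ~220.4 for a 190 GH/s S1
print(p2pool_username("1ExampleAddr", 190))
```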

Miner, tech geek, operator of NorgzPool - Sydney Australia P2Pool Node creator of p2pool fancy front end

Tips: 1NorganBbymShTN2MMpfGzRYJF8mcPeXjv Exchange BTC locally in Australia or Donate to p2pool miners
bitdigger2013
Sr. Member
****
Offline Offline

Activity: 354


View Profile WWW
June 10, 2014, 03:41:16 AM
 #8906



Thanks for the link. I have not put my S1s on my p2pool yet because, with my S2 on it, the S1s get a high diff. Now that I hear the S2 is not currently efficient on p2pool, I may swap: put my S1s on it with the new cgminer and put my S2 on a regular pool.

I assume you void the warranty when you upgrade cgminer this way Smiley

Is it possible to restore the default after this update by using the reset button on the S1, or would I have to remove the updated cgminer manually?

You can revert easily, as the update only renames the old cgminer; you simply rename it back to return to the default. I don't think changing cgminer could void the hardware warranty, but you'll need to ask Bitmain about that.
If you have your S1s and the S2 on your node, then use the diff setting specified a couple of pages back.

Essentially you add /0+diff to the end of your miner address; for an S1 at around 200GH/s it's /0+230

Quote
If you're running S1s, set <share> to 0 ("0" defaults to the lowest p2pool diff, currently 1677854.73)

And <pseudo_share> to 220.4 (optimized for 190GH/s)

<pseudo_share> is calculated as your hash rate in KH/s times 0.00000116

i.e. 190,000,000 * 0.00000116 = 220.4

So the username would be address/0+230?
Cool, I will try that.
I am actually averaging 963GH/s with my S2 on p2pool @ 110% efficiency over 3 days, and I was averaging 1003GH/s on ghash.io
norgan
Sr. Member
****
Offline Offline

Activity: 308

Decentralize your hashing - p2pool - Norgz Pool


View Profile WWW
June 10, 2014, 04:40:17 AM
 #8907



So the username would be address/0+230?
Cool, I will try that.
I am actually averaging 963GH/s with my S2 on p2pool @ 110% efficiency over 3 days, and I was averaging 1003GH/s on ghash.io

Yup, that's for an S1; your S2 would have a different figure. 963 isn't too bad. When you have 1TH/s, 40GH/s isn't much, but it's 40GH/s you would otherwise have generating coin.

Miner, tech geek, operator of NorgzPool - Sydney Australia P2Pool Node creator of p2pool fancy front end

Tips: 1NorganBbymShTN2MMpfGzRYJF8mcPeXjv Exchange BTC locally in Australia or Donate to p2pool miners
jedimstr
Hero Member
*****
Offline Offline

Activity: 784



View Profile
June 10, 2014, 11:19:29 AM
 #8908

So, right before the latest block was found and the usual virgin-generated BTC payout went out, there was another, much smaller payout that wasn't generated coin, but it did go to p2pool-associated wallet addresses.

Was that a general donation to P2Pool miners, or was that mdude77's merged mining donations being sent out to everyone?

I only see the two block payments; nothing in between from p2pool.


It doesn't show up on the UI or P2pool stats...but it was definitely a payout to p2pool associated addresses.  At least I got it in the address I only use for P2Pool.

Here's the Transaction: https://blockchain.info/address/1Q3PppNci5gTKmxpCa7VytXxc3PMv7XDkm


I wonder if this new donation site mentioned on reddit was the source of the mystery tip:
http://www.reddit.com/r/Bitcoin/comments/27rgvk/p2pool_donation_tool_now_available_to_help/

It makes use of the donation capabilities in p2pool already, but puts a web UI in front of it.

UPDATE: mystery solved... It was from this donation site: http://blisterpool.com/p2pdonationstatus/f052c50b722773c288a081abf9a5ea68

Hmmm. The odd thing was that 3 donations went out and I was only part of one of them.  
How does p2pool distribute donations?  Does the code randomly choose a set number based on amount of donation to minimize dust?  I would think it would be better to hold in the chain until a certain threshold and then pay out to everyone who has active shares. Anyone know?

UPDATE 2:  just got my second donation through this site.  Another node just donated their 1% and it was relatively sizeable so more p2pool mining addresses got a cut.   I really like where this is going.

windpath
Legendary
*
Online Online

Activity: 938


View Profile WWW
June 10, 2014, 01:27:44 PM
 #8909

I received 2 donations as well that were not from coin generation:

Amount: 0.00015444
Time: 2014-06-10 08:28:17

Amount: 0.00010090
Time: 2014-06-10 08:28:17

Thanks to whoever you are Smiley

Also, a note on Kano's S1 update:

I have 6 S1's running on p2pool, they were all running fine, but I installed the update on one of them to test.

It had no effect on p2pool earnings, but it did change the time format of LSTime in the UI: originally it showed elapsed time (i.e. 5 seconds ago), after the update it showed a unix timestamp, and after the suggested fix it showed the full date/time. Not a big deal, but I prefer the elapsed time...

Before updating any miner I would strongly suggest testing it on a p2pool node first. All my S1's ran just fine from the factory on p2pool, and even better when overclocked Wink

I would suggest a minimum test window of 6 hours, with a full day being preferred. If after a day your hashrate seems low, then I would consider an update.

windpath
Legendary
*
Online Online

Activity: 938


View Profile WWW
June 10, 2014, 02:41:52 PM
 #8910

Alternative p2pool.info Suggestions:

I'm working on getting this live this week. I plan to try and reproduce all the data from the original, and include some other stuff we are already storing. I've looked over the p2pool.info Azure code (not something I am familiar with) and have decided that it will be faster to write my own version on a LAMP stack as that is already the foundation of the Coin Cadence site.

As you might expect there are already some substantial differences between Coin Cadence and p2pool.info as to what and how p2pool data is stored and collected.

The biggest change/advantage of my version will be that all data is gathered locally from p2pool and bitcoind, compared with the current p2pool.info implementation, which pulls block data from blockchain.info

I have a few questions for the community as a whole as to how to calculate and present some of the stats:

Estimated time to block:
This is currently calculated on the fly by p2pool using the pool hash rate and "attempts_to_block" from:

http://mining.coincadence.com:9332/local_stats

Code:
attempts_to_block = parseInt(local_stats.attempts_to_block || 0);
time_to_block = attempts_to_block / pool_hash_rate;
$('#expected_time_to_block').text(('' + time_to_block).formatSeconds());

The problem I see is when to store this value in the DB. The pool hash rate fluctuates pretty wildly even when miners are relatively consistent; that is a fact of life when trying to calculate the global rate of a distributed pool. Add the fact that miners are joining and leaving p2pool on a regular basis, and it becomes even more complicated.

Should the expected time to block be stored immediately after or before a block is found? Should it be stored every x minutes and an average calculated on a per block basis? Very open to suggestions!

Pool Luck:
Assuming we have decided on a satisfactory answer to storing the expected time to block above, this is pretty straight forward...

The question is how the luck stats are presented. In the current p2pool.info implementation luck is presented as a % over the last 7, 30 and 90 days.

I'm considering 2 alternatives:

1. Borrowing Slush's time frame and using 1, 7 and 30 days.

2. Basing it on blocks rather than time, i.e. current block luck, last 5 blocks, last 15 blocks, etc...

What do you think?

Estimated Hashrate:
in current p2pool.info implementation:
Quote
The following users / addresses have submitted at least 1 valid share in the past 24 hours. Note: Hashrates are very rough estimates based on the number of shares submitted in the past day. They may be off by 10-20% or more due to variance.

This uses some fuzzy math that I don't fully understand. If anyone has a method of calculating this and can explain it to me I'd love to hear it....

Here is my proposed solution; to be honest, I'm not sure if it is better or worse than the current implementation, and am very open to suggestions:

Using data provided by p2pool here: http://mining.coincadence.com:9332/current_payouts

Retrieve the following: current total block payout (including tx fees), payout per miner, global pool hashrate

Calculate miner % of total expected block payout.

Miner estimated hash rate = % of global pool speed based on % of expected payout??

So for example if a miner has 10% of the expected total payout, we can assume they have 10% of the global hash rate...

Again, fuzzy math at best and am open to suggestions....
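As a sketch of that proposal (a hypothetical aggregation: current_payouts maps addresses to expected payouts, and the function name and example values here are assumed):

```python
def estimate_miner_hashrates(payouts, pool_hash_rate):
    # Assume each miner's share of the expected block payout equals
    # their share of the global pool hash rate.
    total = sum(payouts.values())
    return {addr: pool_hash_rate * amount / total
            for addr, amount in payouts.items()}

# A miner owed 10% of the expected payout on a 1 PH/s pool:
payouts = {"1MinerA": 2.5, "1MinerB": 22.5}  # BTC, as from current_payouts
est = estimate_miner_hashrates(payouts, pool_hash_rate=1e15)
print(est["1MinerA"] / 1e12)  # ~100 TH/s
```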

Summary
I'd like feedback on my 3 suggested methods:

1. When to store estimated time to block (every x minutes and use average, just before or just after a block is found)
2. Calculating/Display format for pool luck
3. Estimating miner hash rates

Thanks!



CartmanSPC
Legendary
*
Offline Offline

Activity: 1148



View Profile WWW
June 10, 2014, 06:50:27 PM
 #8911



Thanks for the link. I have not put my S1s on my p2pool yet because, with my S2 on it, the S1s get a high diff. Now that I hear the S2 is not currently efficient on p2pool, I may swap: put my S1s on it with the new cgminer and put my S2 on a regular pool.

I assume you void the warranty when you upgrade cgminer this way Smiley

Is it possible to restore the default after this update by using the reset button on the S1, or would I have to remove the updated cgminer manually?

You can revert easily, as the update only renames the old cgminer; you simply rename it back to return to the default. I don't think changing cgminer could void the hardware warranty, but you'll need to ask Bitmain about that.
If you have your S1s and the S2 on your node, then use the diff setting specified a couple of pages back.

Essentially you add /0+diff to the end of your miner address; for an S1 at around 200GH/s it's /0+230

Quote
If you're running S1s, set <share> to 0 ("0" defaults to the lowest p2pool diff, currently 1677854.73)

And <pseudo_share> to 220.4 (optimized for 190GH/s)

<pseudo_share> is calculated as your hash rate in KH/s times 0.00000116

i.e. 190,000,000 * 0.00000116 = 220.4

Is the pseudo-share diff setting (address+number) ever really necessary, other than for graphs?

KyrosKrane
Sr. Member
****
Offline Offline

Activity: 292


View Profile WWW
June 10, 2014, 07:50:16 PM
 #8912

I received 2 donations as well that were not from coin generation:

Amount: 0.00015444
Time: 2014-06-10 08:28:17

Amount: 0.00010090
Time: 2014-06-10 08:28:17

Thanks to whoever you are Smiley
Sadly, it seems my piddly 14 shares didn't rate. Sad Ah, well. Maybe I'll qualify next time. Smiley

Tips and donations: 1KyrosREGDkNLp1rMd9wfVwfkXYHTd6j5U  |  BTC P2Pool node: p2pool.kyros.info:9332
jonnybravo0311
Legendary
*
Offline Offline

Activity: 1008


Mine at Jonny's Pool


View Profile WWW
June 10, 2014, 09:17:12 PM
 #8913

Alternative p2pool.info Suggestions:

I'm working on getting this live this week. I plan to try and reproduce all the data from the original, and include some other stuff we are already storing. I've looked over the p2pool.info Azure code (not something I am familiar with) and have decided that it will be faster to write my own version on a LAMP stack as that is already the foundation of the Coin Cadence site.

As you might expect there are already some substantial differences between Coin Cadence and p2pool.info as to what and how p2pool data is stored and collected.

The biggest change/advantage of my version will be that all data is gathered locally from p2pool and bitcoind, compared with the current p2pool.info implementation, which pulls block data from blockchain.info

I have a few questions for the community as a whole as to how to calculate and present some of the stats:

Estimated time to block:
This is currently calculated on the fly by p2pool using the pool hash rate and "attempts_to_block" from:

http://mining.coincadence.com:9332/local_stats

Code:
attempts_to_block = parseInt(local_stats.attempts_to_block || 0);
time_to_block = attempts_to_block / pool_hash_rate;
$('#expected_time_to_block').text(('' + time_to_block).formatSeconds());

The problem I see is when to store this value in the DB. The pool hash rate fluctuates pretty wildly even when miners are relatively consistent; that is a fact of life when trying to calculate the global rate of a distributed pool. Add the fact that miners are joining and leaving p2pool on a regular basis, and it becomes even more complicated.

Should the expected time to block be stored immediately after or before a block is found? Should it be stored every x minutes and an average calculated on a per block basis? Very open to suggestions!

Pool Luck:
Assuming we have decided on a satisfactory answer to storing the expected time to block above, this is pretty straight forward...

The question is how the luck stats are presented. In the current p2pool.info implementation luck is presented as a % over the last 7, 30 and 90 days.

I'm considering 2 alternatives:

1. Borrowing Slush's time frame and using 1, 7 and 30 days.

2. Basing it on blocks rather than time, i.e. current block luck, last 5 blocks, last 15 blocks, etc...

What do you think?

Estimated Hashrate:
in current p2pool.info implementation:
Quote
The following users / addresses have submitted at least 1 valid share in the past 24 hours. Note: Hashrates are very rough estimates based on the number of shares submitted in the past day. They may be off by 10-20% or more due to variance.

This uses some fuzzy math that I don't fully understand. If anyone has a method of calculating this and can explain it to me I'd love to hear it....

Here is my proposed solution; to be honest, I'm not sure if it is better or worse than the current implementation, and am very open to suggestions:

Using data provided by p2pool here: http://mining.coincadence.com:9332/current_payouts

Retrieve the following: current total block payout (including tx fees), payout per miner, global pool hashrate

Calculate miner % of total expected block payout.

Miner estimated hash rate = % of global pool speed based on % of expected payout??

So for example if a miner has 10% of the expected total payout, we can assume they have 10% of the global hash rate...

Again, fuzzy math at best and am open to suggestions....

Summary
I'd like feedback on my 3 suggested methods:

1. When to store estimated time to block (every x minutes and use average, just before or just after a block is found)
2. Calculating/Display format for pool luck
3. Estimating miner hash rates

Thanks!



Hey windpath,

You can always calculate the expected time to block.  I'm not sure why they use the formula "attempts / rate".  Expected time to block is based on the following:
Code:
Difficulty * 2**32 / hashrate / 86400 = number of days to find a block
As you can see, that value is going to fluctuate considerably based on hash rate.  One minute it could be 1.2 days, and the next 0.5 days.  This actually happened just a few days ago when one minute the pool reported about 500TH/s and the next it reported just over 1PH/s.
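That formula can be sketched directly (the difficulty value below is only a placeholder in the right ballpark for mid-2014, not an exact figure):

```python
def expected_seconds_to_block(difficulty, hashrate):
    # A difficulty-1 share takes ~2**32 hashes on average
    return difficulty * 2**32 / hashrate

def expected_days_to_block(difficulty, hashrate):
    return expected_seconds_to_block(difficulty, hashrate) / 86400

# The swing described above: same difficulty, 500 TH/s vs just over 1 PH/s
diff = 1.05e10  # placeholder network difficulty
print(expected_days_to_block(diff, 500e12))  # ~1.0 days
print(expected_days_to_block(diff, 1e15))    # ~0.5 days
```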

I guess what I'm saying is that the best you can hope to provide is an "average" value.  Collect expected time-to-block values every X units of time.  You can't just use the value from immediately before or after the block is found.

This will impact your luck calculations as well.  The constant in that equation is the time between blocks.  Since they are timestamped, you know exactly how long it took.  Then, depending on how many "expected time to block" values you've recorded, you can make an educated guess on the luck.  By the way, you'd have to start your stats collection immediately after p2pool finds a block for the most "accurate" calculations.

The reason the other calculations are so far off is that they are all based on submitted shares.  That's your "fuzzy math".  Your miners know their own hashing rate, because every single work unit is considered.  The pool does not, because not every work unit is considered; it's an estimate based upon how much work of a given value is submitted.  Therefore, while extremely unlikely, it is entirely possible that a miner with 1GH/s submits 100 shares in a minute.  The pool would report that miner as having a substantially higher hash rate than it actually has because of it.

Unfortunately, that's what we're stuck with.  Using a variation of the formula I gave above, you can estimate the miner's hash rate, since you know all of the other variables.  Let's use my example of the miner submitting 100 shares in a minute.  For simplicity's sake, we're going to make the assumption that those 100 shares were all that were submitted in a 24 hour period.
Code:
100 shares in 24 hours = 864 seconds to find a share.
Current share difficulty = 1508968.56
1508968.56 * 2**32 / hashrate = 864
hashrate = 7501123398023.39555555555556 hashes per second =  7.5TH/s
So, the miner is actually a 1GH/s unit, but p2pool thinks it's 7.5TH/s.  Obviously, this is a contrived example to display the effect of a miner not falling within expected parameters.  However, looking at the expected payouts, you would think this miner is in reality 7.5TH/s.
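The same back-calculation, as a sketch (the share difficulty is copied from the example above; the function name is just for illustration):

```python
def hashrate_from_shares(share_diff, shares, period_seconds):
    # Invert "share_diff * 2**32 / hashrate = seconds per share"
    seconds_per_share = period_seconds / shares
    return share_diff * 2**32 / seconds_per_share

# 100 shares in 24 hours at the quoted share difficulty:
hr = hashrate_from_shares(1508968.56, shares=100, period_seconds=86400)
print(hr / 1e12)  # ~7.5 TH/s, for what is really a 1 GH/s miner
```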

Alright, I've rambled on long enough and should probably get back to work Smiley

Jonny's Pool - Mine with us and help us grow!  Support a pool that supports Bitcoin, not a hardware manufacturer's pockets!  No SPV cheats.  No empty blocks.
windpath
Legendary
*
Online Online

Activity: 938


View Profile WWW
June 11, 2014, 02:14:13 AM
 #8914

Hey windpath,

...

Thanks Jonny,

I appreciate you taking the time!

I did not understand the "attempts / rate" either; my assumption is that it may be based on p2pool's high share diff, but I'm not sure.

Using the standard formula certainly seems to make sense, with the exception that I'll probably store the value in seconds so that when p2pool is finding 2-3 blocks an hour in the near future it will still work Wink

Code:
Difficulty * 2**32 / hashrate  = number of seconds to find a block

I'm finding even getting the "best" value for global hash rate is going to be a challenge as: http://mining.coincadence.com:9332/global_stats and http://mining.coincadence.com:9332/web/graph_data/pool_rates/last_hour report very different values....

After closer examination it looks like "global_stats" is the total hash rate, including orphans and DOA shares. "graph_data" separates them out by good, orphan and doa. I'm going to stick with the "global_stats" number for the overall pool speed, as that does represent the "overall" speed.

For the luck calculations I'm considering using "pool_nonstale_hash_rate" from "global_stats". At first look it seems like I'm cherry-picking the best data to shine a good light on p2pool, but with pool orphan+DOA rates often touching 20%, I feel including them in the luck calculation could significantly throw things off, so I want to base luck on actual valid work... Open to feedback on this one.
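One way to assemble the luck numbers once the expected-time series exists, as a sketch (per-block luck as expected time over actual time, using whichever hash-rate series is chosen; all names here are illustrative):

```python
def block_luck(expected_seconds, actual_seconds):
    # >100% means the block arrived faster than expected
    return 100.0 * expected_seconds / actual_seconds

def window_luck(blocks):
    # blocks: (expected_seconds, actual_seconds) per found block,
    # aggregated over the window (last N blocks, or 1/7/30 days)
    expected = sum(e for e, _ in blocks)
    actual = sum(a for _, a in blocks)
    return 100.0 * expected / actual

print(block_luck(45000, 30000))                       # 150.0: a lucky block
print(window_luck([(45000, 30000), (45000, 90000)]))  # 75.0 over two blocks
```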

This would be the ideal solution for miner hash rate:

Code:
100 shares in 24 hours = 864 seconds to find a share.
Current share difficulty = 1508968.56
1508968.56 * 2**32 / hashrate = 864
hashrate = 7501123398023.39555555555556 hashes per second =  7.5TH/s

However, unless I'm missing something obvious, p2pool does not publicly expose its version of the share chain, which makes it difficult to determine submitted shares per miner for miners who are not on my node. Also, storing historical pool-wide share data at p2pool's higher share rate will become unsustainable 60x faster than storing the Bitcoin blockchain (a share every ~10 seconds vs. a block every ~10 minutes)...

I'm going to test my original proposal, basing the estimated hash rate on the per-miner payout from the last found block, and see how accurate it looks based on miners on my node with known hash rates (I'm already collecting/storing this data, so it won't bloat the DB, and it requires less new code).

For tonight I'll be setting up the data collector to grab the following and place it in its own table:

ID: unix time stamp
Global Hash Rate: http://mining.coincadence.com:9332/global_stats (pool_hash_rate)
Accepted Hash Rate: http://mining.coincadence.com:9332/global_stats (pool_nonstale_hash_rate)
Difficulty: bitcoind_rpc->getdifficulty
Number of Miners: http://mining.coincadence.com:9332/current_payouts

I already have found blocks and payouts per miner per found block in a separate table....

Looking at the "last_hour" graph data, it reports 150 data points per 1-hour period, which seems like overkill for our purposes. Most of my other data collectors run every minute on a standard unix cron job, so I'm going to collect every minute to keep things simple; it should give plenty of resolution for any statistically valid reports and will limit DB growth to ~526k records per year...

Your point about starting data collection immediately after a found block is an excellent one. I'll start collecting ASAP and remove any data stored before the next found block.

Again thanks.

Still very open to any other feedback, rushing this a bit based on the non-availability of p2pool.info, but would still like to get it right the first time Wink




jonnybravo0311
Legendary
*
Offline Offline

Activity: 1008


Mine at Jonny's Pool


View Profile WWW
June 11, 2014, 03:01:17 AM
 #8915

Quote
For the luck calculations I'm considering using "pool_nonstale_hash_rate" from "global_stats". At first look it seems like I'm cherry-picking the best data to shine a good light on p2pool, but with pool orphan+DOA rates often touching 20%, I feel including them in the luck calculation could significantly throw things off, so I want to base luck on actual valid work... Open to feedback on this one.
You're cherry picking, but remember, those DOA shares could potentially still be solutions for the BTC block chain.  I think you'd be doing the calculations a disservice by not including the DOA, since that hashing does indeed add value.

Quote
However, unless I'm missing something obvious, p2pool does not publicly expose its version of the share chain, which makes it difficult to determine submitted shares per miner for miners who are not on my node.
No, there is no "sharechain.info" site as far as I know.  However, by running the node, you have a complete copy of the share chain.  You can use the information in there to calculate how many shares each address has submitted across the network.  You're still bound by the limitations I pointed out (excessive luck - both good and bad) which would affect the calculations.

Honestly, though, your method of using percentage of expected block payout to determine the approximate hash rate of that address will probably get you a relatively close approximation.  Close enough for government work, anyway... which is about the best you could hope for here, since there's no way to actually gather this information accurately.

Quote
I'm going to test my original proposal, basing the estimated hash rate on the per-miner payout from the last found block, and see how accurate it looks based on miners on my node with known hash rates (I'm already collecting/storing this data, so it won't bloat the DB, and it requires less new code).
I emphasized that because it's a pretty important point.  Your node only approximates the miners' true hash rates based upon the shares it receives from those miners.  The value can swing pretty wildly, especially if you've got a wide range of hashing power.  This is where setting the pseudo-diff comes into play.  If all of your miners actually set an appropriate value, you'd get a better approximation.  Of course, you could always throw in the auto worker diff patch, which dynamically assigns a difficulty to each worker instead of having difficulty assigned based on the node's hash rate.  Again, though, these are approximations, but we're back to the government work again Smiley
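For what it's worth, the idea behind per-worker auto diff fits in a few lines of Python. This is a toy sketch, not the actual patch; the target rate and scaling rule are made up:

```python
# Toy vardiff sketch: nudge each worker's pseudo-share difficulty toward a
# fixed share-submission rate, instead of deriving it from the node's total.

TARGET_SHARES_PER_MIN = 20.0

def retarget(current_diff, shares_last_min, min_diff=1.0):
    """Return the worker's next pseudo-share difficulty."""
    if shares_last_min == 0:
        # No shares at all: difficulty is probably too high, so back off.
        return max(current_diff / 2.0, min_diff)
    # Submitting twice the target rate doubles the difficulty, and so on.
    factor = shares_last_min / TARGET_SHARES_PER_MIN
    return max(current_diff * factor, min_diff)

print(retarget(100.0, 40))  # fast worker: difficulty rises
print(retarget(100.0, 0))   # silent worker: difficulty backs off
```

Run per worker every minute or so, each worker converges on a steady share rate regardless of its hash power.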

I think you're doing a fantastic job with your site.  Keep up the great work.

Jonny's Pool - Mine with us and help us grow!  Support a pool that supports Bitcoin, not a hardware manufacturer's pockets!  No SPV cheats.  No empty blocks.
windpath
Legendary
*
Online Online

Activity: 938


View Profile WWW
June 11, 2014, 03:16:48 AM
 #8916

Quote
You're cherry picking, but remember, those DOA shares could potentially still be solutions for the BTC block chain.  I think you'd be doing the calculations a disservice by not including the DOA, since that hashing does indeed add value.

Hmmm, I'll have to give this some thought.  Just so I'm clear, your suggestion is to include good+DOA and exclude orphans from http://mining.coincadence.com:9332/web/graph_data/pool_rates/last_hour
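In other words, something like this (a sketch of my own; the rate figures are illustrative, not pulled from the endpoint):

```python
# Sketch of the proposed luck basis: count good + DOA hashing as work that
# could still have solved a block; orphans are the debatable part.

def luck_basis_rate(good_hs, doa_hs, orphan_hs, include_orphans=False):
    """Hash rate (H/s) to use as the basis of the luck calculation."""
    rate = good_hs + doa_hs
    if include_orphans:
        rate += orphan_hs
    return rate

print(luck_basis_rate(1.0e15, 1.5e14, 5.0e13))                        # good + DOA
print(luck_basis_rate(1.0e15, 1.5e14, 5.0e13, include_orphans=True))  # everything
```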

As an aside, can anyone clarify exactly what http://mining.coincadence.com:9332/rate is reporting? From web.py:
Code:
web_root.putChild('rate', WebInterface(lambda:
    p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, decent_height()) /
    (1 - p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, decent_height()))))

Quote
miners on my node with known hash rates.
Sorry, I wasn't specific - I meant my own miners... I keep that data, and can make some luck calculations as well to validate/invalidate the results...

Quote
I think you're doing a fantastic job with your site.  Keep up the great work.

Thanks!

jonnybravo0311
Legendary
*
Offline Offline

Activity: 1008


Mine at Jonny's Pool


View Profile WWW
June 11, 2014, 05:05:05 AM
 #8917

Quote
Hmmm, I'll have to give this some thought.  Just so I'm clear, your suggestion is to include good+DOA and exclude orphans from http://mining.coincadence.com:9332/web/graph_data/pool_rates/last_hour
Actually, I hadn't thought about discounting orphans, since they too could potentially meet the BTC difficulty requirement.

Quote
As an aside, can anyone clarify exactly what http://mining.coincadence.com:9332/rate is reporting? From web.py:
Code:
web_root.putChild('rate', WebInterface(lambda:
    p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, decent_height()) /
    (1 - p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, decent_height()))))
I was about to reply that it's just calculating the overall hash rate of the p2pool network, but that little bit with "get_average_stale_prop" threw me off.  I'd have to dig further into the code to answer you more accurately.  If I get the time tomorrow, I'll try to get back to you on it.  Hopefully someone more familiar with that code can answer sooner.
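That said, if I'm reading the lambda right, it takes the share chain's measured attempts per second and divides by (1 - average stale proportion), which scales the non-stale rate back up to an estimate of the pool's total raw hash rate, stale work included. A minimal Python sketch of that arithmetic (my reading, not gospel; figures are illustrative):

```python
# Same arithmetic as the quoted web.py line, stripped of the p2pool plumbing:
# dividing the counted (non-stale) rate by (1 - stale_prop) estimates the
# total raw hash rate, including stale hashing that never made the chain.

def raw_pool_rate(nonstale_rate_hs, stale_prop):
    """Estimate total pool hash rate from the non-stale rate and stale fraction."""
    return nonstale_rate_hs / (1.0 - stale_prop)

# e.g. 1.2 PH/s of counted shares at a 20% stale proportion ~= 1.5 PH/s raw
print(raw_pool_rate(1.2e15, 0.20))
```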

Jonny's Pool - Mine with us and help us grow!  Support a pool that supports Bitcoin, not a hardware manufacturer's pockets!  No SPV cheats.  No empty blocks.
dexu
Full Member
***
Offline Offline

Activity: 149


View Profile
June 11, 2014, 05:46:58 PM
 #8918

Hi, new node @ Poland - http://bitcoin.fastlink.pl:9332
Please add to http://p2pool-nodes.info/


Best Regards, Dexu.
murdof
Full Member
***
Offline Offline

Activity: 182


View Profile
June 11, 2014, 06:33:30 PM
 #8919

Can someone use p2pool to solo mine and merge mine at the same time?

So mainly, is there an option to turn BTC solo mining on?

If not, would mainly mining another coin (e.g. IXC) and merge mining BTC offer the same result?

jonnybravo0311
Legendary
*
Offline Offline

Activity: 1008


Mine at Jonny's Pool


View Profile WWW
June 11, 2014, 06:47:06 PM
 #8920

Can someone use p2pool to solo mine and merge mine at the same time?

So mainly, is there an option to turn BTC solo mining on?

If not, would mainly mining another coin (e.g. IXC) and merge mining BTC offer the same result?
I imagine you could... you'd have to hack the code a bit (there's a guide on how to do it here: https://bitcointalk.org/index.php?topic=512042.0).  Then you'd just start up p2pool as you normally would for merged mining.  Remember, though, since merged mining is solo mining, you're now solo mining everything.  Maybe you get lucky and hit the lottery by generating the BTC block.

Jonny's Pool - Mine with us and help us grow!  Support a pool that supports Bitcoin, not a hardware manufacturer's pockets!  No SPV cheats.  No empty blocks.