p3yot33at3r
June 07, 2015, 11:45:37 AM
> qt works but not daemon
Exactly. At least on my distro & local node.
chalkboard17
June 07, 2015, 12:38:37 PM
I have searched everywhere but haven't found an answer to this, so I'd appreciate it if someone could help me. I have been trying to mine on p2pool, but I get a much lower hashrate than usual. I normally get 1380 GH/s on Eligius and GHash; on p2pool I always get ~1230 GH/s. I have already tried mining on my own node and on other people's nodes, and the result is still the same. I am using the Antminer S5 interface and pointing the miner to the IP.
I cannot use my IP to mine - I don't know why. I use its network IP and that seems to work fine (at the lower hashrate). Is that OK?
Can someone take any fees or possibly steal hashrate from me if I use p2pool?
Can I merge mine all the coins that support merged mining? At the same time? On Windows? Thanks.
p3yot33at3r
June 07, 2015, 01:08:42 PM
> I am using antminer s5 interface and pointing miner to the ip.
Use kano's cgminer replacement - bitmain's cgminer is borked for p2pool. SSH into your S5 as root, then copy/paste:

  cd /tmp
  wget http://ck.kolivas.org/apps/cgminer/antminer/s5/4.9.0-150105/cgminer
  chmod +x cgminer
  mv /usr/bin/cgminer /usr/bin/cgminer.bak
  cp cgminer /usr/bin
  /etc/init.d/cgminer.sh restart

Then press enter.
> I cannot use my IP to mine, don't know why. I use its network IP and seems to work fine (at lower hashrate). Is that ok?
You should be able to use your LAN IP - is your node on your network or on your PC?
> Can someone take any fees or possibly steal hashrate from me if I use p2pool?
You can see if a node is charging you a fee by adding "/fee" to the end of the node address. It's impossible for anyone to steal your hash if you use your own node.
> Can I merge mine on all merged mining possible coins? At the same time? On windows?
Yes.
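The "/fee" check mentioned above can be scripted. A minimal sketch, assuming a node whose web interface listens on the default p2pool port 9332 (the address and function name here are illustrative, not pool code):

```python
from urllib.request import urlopen  # only needed if you actually fetch the page

def node_fee_url(node_address):
    # Append "/fee" to the node's web address, as described above.
    return node_address.rstrip("/") + "/fee"

# Example (not executed here): fetch the fee page of a node on your LAN.
# print(urlopen(node_fee_url("http://192.168.1.50:9332")).read())
```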
windpath
Legendary
Offline
Activity: 1258
Merit: 1027
June 07, 2015, 06:01:00 PM
> The correct and direct definition of luck (where >100% is good luck and less than 100% is bad luck) is simply DifficultyExpected/DifficultySubmitted
I'm not sure I understand how you would calculate this - wouldn't the submitted diff for a valid block always be greater than or equal to the expected diff? We use the average of the stored hashrate since the last block was found by the pool (and we weight and average the difficulty if it changed since the last block was found) to determine an average expected time to block, then compare that to the actual time for the block in question. If the times are equal, it's 100%; if we found it faster, > 100%; if we found it slower, < 100%.
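The time-based estimate described here can be sketched as follows. A minimal illustration - the function names are made up, and the 2**32 factor is just the standard average number of hashes per difficulty-1 share in Bitcoin:

```python
# Sketch of the time-based luck estimate: expected time to block from
# pool hashrate and network difficulty, compared to the actual time.

def expected_seconds_to_block(pool_hashrate, network_difficulty):
    # A difficulty-D block takes on average D * 2**32 hashes to find.
    return network_difficulty * 2**32 / pool_hashrate

def luck_percent(expected_seconds, actual_seconds):
    # Found faster than expected -> >100% (good luck); slower -> <100%.
    return 100.0 * expected_seconds / actual_seconds
```

For example, a block expected in 1200s but found in 1000s gives luck_percent(1200, 1000) = 120%.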
yslyung
Legendary
Offline
Activity: 1500
Merit: 1002
Mine Mine Mine
June 07, 2015, 08:15:46 PM
> The correct and direct definition of luck (where >100% is good luck and less than 100% is bad luck) is simply DifficultyExpected/DifficultySubmitted
> I'm not sure I understand how you would calculate this, wouldn't the submitted diff for a valid block always be greater than or equal to the expected diff ? We use the average of the stored hashrate since the last block was found by the pool (and we weight and average the difficulty if it changed since last block found) to determine an average expected time to block, then compare that to the actual time for the block in question. If the times are equal its 100% If we found it faster > than 100% If we found it slower < than 100%
Math & coding is hard ... but AFAIK, with my poor math & coding skills ... p2pool luck is better than kano.is?
windpath
June 07, 2015, 08:27:30 PM
> The correct and direct definition of luck (where >100% is good luck and less than 100% is bad luck) is simply DifficultyExpected/DifficultySubmitted
> I'm not sure I understand how you would calculate this, wouldn't the submitted diff for a valid block always be greater than or equal to the expected diff ? We use the average of the stored hashrate since the last block was found by the pool (and we weight and average the difficulty if it changed since last block found) to determine an average expected time to block, then compare that to the actual time for the block in question. If the times are equal its 100% If we found it faster > than 100% If we found it slower < than 100%
> math & coding is hard ... but afaik with my poor math & coding skills ... p2pool luck is better than kano.is ?
Sometimes it is, sometimes it is not - the nature of luck.
chalkboard17
June 07, 2015, 08:39:54 PM
> I am using antminer s5 interface and pointing miner to the ip.
> Use kano's cgminer replacement - bitmain's cgminer is borked for p2pool: SSH into your S5 as root, then copy/paste: cd /tmp; wget http://ck.kolivas.org/apps/cgminer/antminer/s5/4.9.0-150105/cgminer; chmod +x cgminer; mv /usr/bin/cgminer /usr/bin/cgminer.bak; cp cgminer /usr/bin; /etc/init.d/cgminer.sh restart. Then press enter.
Thank you, worked.
p3yot33at3r
June 07, 2015, 09:07:59 PM
> Thank you, worked.
Thank kano  It's not persistent - you'll have to do the same after every reboot. Also, change your queue setting to 1 or 0 - whatever works best for you. There's a custom firmware mentioned a few pages back that does all this for you and is persistent - that might be better for you.
jonnybravo0311
Legendary
Offline
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
June 08, 2015, 02:54:22 PM
> The correct and direct definition of luck (where >100% is good luck and less than 100% is bad luck) is simply DifficultyExpected/DifficultySubmitted
> I'm not sure I understand how you would calculate this, wouldn't the submitted diff for a valid block always be greater than or equal to the expected diff ? We use the average of the stored hashrate since the last block was found by the pool (and we weight and average the difficulty if it changed since last block found) to determine an average expected time to block, then compare that to the actual time for the block in question. If the times are equal its 100% If we found it faster > than 100% If we found it slower < than 100%
He's referring to the actual number of shares submitted vs the expected number of shares needed to find a block. Since p2pool has no real knowledge of any miner's actual hash rate and submitted shares the way ckpool does, the best we could do with p2pool is to evaluate how many share-chain shares we'd expect it to take to find a block vs how many share-chain shares were actually submitted to find it. The problem is that the number of expected shares is constantly changing on p2pool, because the share difficulty constantly changes - unlike the BTC network, where it's static for 2016 blocks. The best you're ever going to get is an approximation of luck using the expected vs actual figures, so I see no real reason to change the calculations you're currently using, since they provide an approximation as well.
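The expected-vs-actual share counting described here can be sketched numerically. A minimal illustration with made-up numbers, showing why the expected count moves whenever share difficulty moves:

```python
# Sketch: expected number of share-chain shares needed to find a block.
# Each share of difficulty d is also a valid block with probability
# d / network_difficulty, so on average you need the ratio below.

def expected_shares(network_difficulty, share_difficulty):
    return network_difficulty / share_difficulty

# If the share difficulty doubles mid-round, the expected count halves:
# expected_shares(100_000, 50)  -> 2000.0
# expected_shares(100_000, 100) -> 1000.0
```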
Jonny's Pool - Mine with us and help us grow! Support a pool that supports Bitcoin, not a hardware manufacturer's pockets! No SPV cheats. No empty blocks.
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
June 08, 2015, 11:20:40 PM
> The correct and direct definition of luck (where >100% is good luck and less than 100% is bad luck) is simply DifficultyExpected/DifficultySubmitted
> I'm not sure I understand how you would calculate this, wouldn't the submitted diff for a valid block always be greater than or equal to the expected diff ? We use the average of the stored hashrate since the last block was found by the pool (and we weight and average the difficulty if it changed since last block found) to determine an average expected time to block, then compare that to the actual time for the block in question. If the times are equal its 100% If we found it faster > than 100% If we found it slower < than 100%
> He's referring to the actual number of shares submitted vs the expected shares submitted to find a block. Since p2pool has no real knowledge of any miner's actual hash rate and submitted shares like ckpool does, the best we could do with p2pool is to evaluate how many share-chain shares we'd expect it to take to find a block vs how many share-chain shares were actually submitted to find it. The problem is the number of expected shares is constantly changing on p2pool because the share difficulty constantly changes, unlike the BTC network where it's static for 2016 blocks. The best you're ever going to get is just an approximation of luck using the expected vs actual figures, so I see no real reason to change the calculations you're currently using, since they're providing an approximation as well.
DifficultyExpected = 47589591153.62500763 - it only changes once every 2016 blocks, so yes, you know what it is. The shares accepted onto the sharechain each have a difficulty PoW requirement ... that's why they're accepted. Sum up the sharechain difficulties (the PoW requirement, not the actual share difficulty, of course) to get DifficultySubmitted. Yeah, as I said, you have those numbers. Those numbers are how p2pool determines the pool hash rate - except it's not very accurate, and you are using that hash rate number to show the luck ... Look at the pool hash rate and watch it change ... often up to 20% ... all over the place ...
It's not very accurate. The problem on top of all this is that if you include the (rare) non-share-chain blocks in your calculation but don't include the hashes that were used to find those blocks ... your stated luck would be higher than it really is ... hmm, that doesn't sound good - stating it higher than it really is.
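kano's definition can be sketched directly. A minimal illustration, where the per-share PoW requirements are made-up numbers rather than real sharechain data:

```python
# Sketch of luck = DifficultyExpected / DifficultySubmitted, where
# DifficultySubmitted sums the PoW requirement of every accepted
# share-chain share since the last block was found.

def sharechain_luck(difficulty_expected, share_pow_requirements):
    difficulty_submitted = sum(share_pow_requirements)
    return 100.0 * difficulty_expected / difficulty_submitted

# Four shares whose PoW requirements sum to exactly the expected
# difficulty give 100% luck; less total work than expected gives >100%.
```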
jonnybravo0311
June 09, 2015, 12:11:02 AM
> The problem on top of all this is that is if you include the (rare) non-share-chain blocks in your calculation - but don't include the hashes that were used to find those blocks ... so ... your stated luck would be higher than it really is ... hmm that doesn't sound good ... stating it higher than it really is.
You can't include the share difficulty of the orphaned/dead shares that solve blocks, because those shares are never transmitted to the p2pool network. Furthermore, even standard orphaned/dead shares can't ever be counted, for the same reason - they aren't transmitted. The only thing you've got is what can be gleaned from the share chain itself, which would only ever be accurate if there were absolutely no orphans or deads... which you're pretty much guaranteed never to have happen. In effect, any calculation of luck is ALWAYS going to be higher than actuality, because of orphaned/dead shares that never make it onto the share chain.
EDIT: I forgot to mention that the share chain doesn't keep a record of all shares, so there is also the possibility that some shares drop off the chain between block finds. So unless you're recording every share that is submitted (which you certainly should be if you're trying to capture luck), your calculations will be off from there as well.
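The upward bias described here can be illustrated numerically - a sketch with made-up figures, not pool code:

```python
# Orphaned/dead shares represent real work that never reaches the share
# chain, so the denominator of any chain-based luck figure is too small
# and the reported luck reads higher than the true value.

def luck_bias(difficulty_expected, chain_work, unrecorded_work):
    measured = 100.0 * difficulty_expected / chain_work
    actual = 100.0 * difficulty_expected / (chain_work + unrecorded_work)
    return measured, actual

# E.g. if 20% of the work was orphaned/dead, a round that actually took
# exactly the expected work still reports as lucky.
```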
yslyung
June 09, 2015, 12:15:29 AM
> The problem on top of all this is that is if you include the (rare) non-share-chain blocks in your calculation - but don't include the hashes that were used to find those blocks ... so ... your stated luck would be higher than it really is ... hmm that doesn't sound good ... stating it higher than it really is.
> You can't include the share difficulty of the shares that are orphaned/dead that solve blocks because those shares are never transmitted to the p2pool network. Furthermore, even standard orphaned/dead shares can't ever be counted for the same reason - they aren't transmitted. The only thing you've got is what can be gleaned from the share chain itself, which would only ever be accurate if absolutely there were no orphans or dead... In effect, any calculation of luck is ALWAYS going to be higher than actuality because of orphaned/dead shares that never make it onto the share chain.
Minus the DOA it "should" be closer to actual luck? Well, even taking away the 20% it's still good luck & I hope it continues. Mine on!
kano
June 09, 2015, 01:36:40 AM
> The problem on top of all this is that is if you include the (rare) non-share-chain blocks in your calculation - but don't include the hashes that were used to find those blocks ... so ... your stated luck would be higher than it really is ... hmm that doesn't sound good ... stating it higher than it really is.
> You can't include the share difficulty of the shares that are orphaned/dead that solve blocks because those shares are never transmitted to the p2pool network. Furthermore, even standard orphaned/dead shares can't ever be counted for the same reason - they aren't transmitted. The only thing you've got is what can be gleaned from the share chain itself, which would only ever be accurate if absolutely there were no orphans or dead... In effect, any calculation of luck is ALWAYS going to be higher than actuality because of orphaned/dead shares that never make it onto the share chain.
> minus the DOA "should" be closer to actual luck ? well even taking away the 20% is still good luck & i hope it continues. mine on !
Ignoring the non-share-chain blocks would give you a valid luck value for the share-chain p2pool blocks, using the simple DifficultyExpected/DifficultySubmitted. The only catch, of course, would be knowing whether the non-share-chain blocks have roughly the same expected luck as the normal blocks - i.e. is there some code- or network-related factor that affects their luck differently from the others? The assumption would probably be no difference. A very rough estimate of the non-share-chain blocks' work would be 95% of all the pool's stale work, since it is that work (and only that work) that produces those blocks. "95%" because, on average, 19 out of 20 share-chain shares are submitted when there isn't a network block change: at ~30s per share there are on average 20 share changes per block change (at a 0% diff change), but only one of the 20 is a block change.
windpath
June 09, 2015, 01:48:57 PM
We did not implement a way to track non-share-chain blocks; perhaps I will in the future, but the historical data is gone. I believe it is a much higher % than you think - I'd speculate (and yes, it's just speculation) that it is somewhere around 5-7% of found blocks.
I understand how your calculation works now, thank you. However, I don't see how it could be applied to p2pool practically. The share difficulty changes with every share, and nodes often disagree on difficulty (due to propagation times). While the reported hash rate is only an estimate, I believe it's the best number we have to work with, and by using the 1-minute average since the last block was found I think we get about as accurate a picture as possible without storing every single share for the long term. Storing all the shares would be cool, but I just don't see how to do it in a way that is directly queryable without creating an additional enterprise-scale DB on top of what we already have. How do you store submitted work for your pool? Do you plan to keep all the data in a queryable state for all time?
After thinking about this on and off for over a year now, I'm OK with having an ~estimated~ luck that is consistently calculated from the data we have available. It serves its intended purpose, which is to reflect how the pool is performing overall, over time. I view having to use imperfect data as a trade-off for getting completely trust-less decentralization - something I value much more than a 100%-accurate-all-the-time luck stat.
While I do understand your points, I find it ironic for you to criticize p2pool's infrastructure while running your own closed-source centralized pool. Serious question: what would it take to get you and CK to merge your pool with p2pool and focus on its scalability problems? We could sure use your expertise
yslyung
June 09, 2015, 01:57:11 PM
> While I do understand your points, I find it ironic for you to criticize P2Pool's infrastructure while running your own closed source centralized pool.
+9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
Well said - VERY well said. Sending you a tip shortly for your effort for p2pool. Post up an addy.
p3yot33at3r
June 09, 2015, 02:07:55 PM
> Serious question: What would it take to get you and CK to merge your pool with P2Pool, and focus on it's scalability problems? We could sure use your expertise
I've seen posts by others on this thread regarding the scalability thing - to me this is the biggest issue with p2pool. Has anyone actually conversed with the dev (if he's still around) to see if he has any ideas or solutions? Was a bounty ever organised with regard to finding a dev who could find a solution?
yslyung
June 09, 2015, 02:36:06 PM
> Serious question: What would it take to get you and CK to merge your pool with P2Pool, and focus on it's scalability problems? We could sure use your expertise
> I've seen posts by others on this thread regarding the scalability thing - to me this is the biggest issue with p2pool.
Yes it is, and sadly it still is at the moment. I think it can be scaled, but some of those who have the knowledge might not be willing to help or share.
> Has anyone actually conversed with the dev (if he's still around) to see if he has any ideas/solutions?
You mean forrestv? He hasn't been here for a long time; now it's the community that's supporting p2pool.
> Was a bounty ever organised with regard to finding a dev who could find a solution?
Yes, but it was splashed with cold water ... I'm still a supporter of p2pool & hoping something will happen someday.
jonnybravo0311
June 09, 2015, 05:17:47 PM
> Serious question: What would it take to get you and CK to merge your pool with P2Pool, and focus on it's scalability problems? We could sure use your expertise
> I've seen posts by others on this thread regarding the scalability thing - to me this is the biggest issue with p2pool. Has anyone actually conversed with the dev (if he's still around) to see if he has any ideas/solutions? Was a bounty ever organised with regard to finding a dev who could find a solution?
A ton of ideas have been thrown around, but none of them have proved to be implementable. The problem is in the concept of the share chain. In effect it really is nothing more than a relatively low-difficulty coin that you are solo mining. The solution, which contains things like payout information, gets added to the share chain. If that share also happens to solve a block of BTC, the block reward gets distributed according to the payout information in the share that solves the block. Because of this construct there's no easily implemented solution to the problem of variance.
OgNasty and Nonnakip have come up with a very interesting approach which puts ckpool on top of the p2pool backbone. The tradeoff is that by choosing to mine there you are bound by the same constraints as a typical centralized pool. For example, mining on a typical p2pool node, you can fail over to any other p2pool node and none of your work is lost. Not so with their implementation - no other standard p2pool node has any concept of the work you've done, because you don't have any individual shares on the chain. You sacrifice the completely trust-less decentralized nature of p2pool for variance reduction. It's a nice step forward, and I've had a couple of S3s pointed at OgNasty's pool (both standard p2pool and NastyPoP) since November of last year. You can see my long-running thread about it here: https://bitcointalk.org/index.php?topic=891298.0
If there were a viable solution, it could be implemented.
minerpool-de
June 09, 2015, 06:49:05 PM
Somehow I think p2pool is dying out. The network performance is ridiculous, and the last visitors to my node came by weeks ago. I think I'll shut my node down.
p3yot33at3r
June 09, 2015, 07:00:02 PM
> Somehow I think P2Pool dying out. The network performance is ridiculous, the last visitors on my Node passed weeks ago. I think I switch my node down.
Don't you use your own node, then? The last couple of weeks have been excellent for p2pool - way above average.
|
|
|
|
|