supermine
|
|
December 19, 2013, 11:07:27 PM |
|
How many hours does it usually take the pool to pay out?
About 1 millisecond.
Why did I not receive anything then? (I have been mining doge for 12 hours.)
|
|
|
|
Schleicher
|
|
December 19, 2013, 11:39:41 PM |
|
You are missing the issue of variance. If the difficulty drops it will also go up. It averages out to the same share difficulty. So statistically you still get the same number of shares over time.
If the big miners are sending shares with higher difficulty, they would be sending fewer shares per minute, so the variance for the big miners would be higher. The minimum p2pool difficulty would drop, so all miners using the minimum difficulty would be sending more shares and their variance would decrease. The total p2pool hash rate would stay the same, and the average income of the miners would stay the same.
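A quick simulation (my own sketch, not p2pool code) of the point above: mining at double difficulty halves the share count but doubles each share's credit, so the expected credited work is unchanged and only the variance grows.

```python
import random

def credited_work(hashes, difficulty, rng):
    # Each hash is a share with probability 1/difficulty;
    # a share found at difficulty D is credited as D units of work.
    shares = sum(1 for _ in range(hashes) if rng.random() < 1.0 / difficulty)
    return shares * difficulty

rng = random.Random(42)
HASHES = 10_000   # hashes per simulated session (arbitrary)
results = {}
for diff in (10, 20):  # 'double' difficulty halves the share count
    runs = [credited_work(HASHES, diff, rng) for _ in range(100)]
    mean = sum(runs) / len(runs)
    var = sum((r - mean) ** 2 for r in runs) / len(runs)
    results[diff] = (mean, var)
    print(f"diff={diff}: mean credited work ~ {mean:.0f}, variance ~ {var:.0f}")
```

The means come out nearly identical for both difficulties, while the variance roughly doubles at the higher difficulty.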
|
|
|
|
supermine
|
|
December 19, 2013, 11:51:49 PM |
|
Why not state clearly in this thread that p2pool.org and p2pool.com are not affiliated with p2pool?
I added some clarification about p2pool.org to the first post. I think p2pool.com is pretty obviously not affiliated. In other related news, using http://p2pool.info/ as the new homepage for P2Pool is on the horizon, since I recently took up maintenance of it and it's already well known.
Good, about time to deal with the guy @ p2pool.org.
Can you help me? I've been mining on p2pool.org and did not get any coinage.
|
|
|
|
organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
December 20, 2013, 01:34:55 AM |
|
....... The well known bitcoin term here is called ... variance. Read up about it
Yeah, I know.
No. You are missing the issue of variance. If the difficulty drops it will also go up. It averages out to the same share difficulty. So statistically you still get the same number of shares over time.
kano, I think TierNolan is referring to the fact that some large miners can set their own difficulty much higher than the automatic amount set by the system. If I have that fact right, then it means that the average share difficulty is reduced for the remaining miners.
|
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
December 20, 2013, 02:48:57 AM |
|
....... The well known bitcoin term here is called ... variance. Read up about it
Yeah, I know.
No. You are missing the issue of variance. If the difficulty drops it will also go up. It averages out to the same share difficulty. So statistically you still get the same number of shares over time.
kano, I think TierNolan is referring to the fact that some large miners can set their own difficulty much higher than the automatic amount set by the system. If I have that fact right, then it means that the average share difficulty is reduced for the remaining miners.
Unless there's some weird calculation in the p2pool code to push the base difficulty down, it doesn't matter if someone is submitting 'double' diff shares - they are just the same as submitting two 'single' diff shares, other than an increase in variance.
|
|
|
|
kjj
Legendary
Offline
Activity: 1302
Merit: 1026
|
|
December 20, 2013, 02:58:42 AM |
|
Unless there's some weird calculation in the p2pool code to push the base difficulty down
There is.
|
17Np17BSrpnHCZ2pgtiMNnhjnsWJ2TMqq8 I routinely ignore posters with paid advertising in their sigs. You should too.
|
|
|
|
TierNolan
Legendary
Offline
Activity: 1232
Merit: 1104
|
|
December 20, 2013, 12:05:32 PM |
|
Each worker sets its own difficulty, right? So:
- adjust the minimum difficulty per miner so that shares are received at most once every 60 minutes
- all workers paying to the same address count as a single miner for these purposes
I think there is at least one p2pool-backed pool out there that supports smaller miners. However, it works by setting the fee to 100% and has a separate process to handle payouts to miners.
|
1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
December 20, 2013, 12:59:29 PM |
|
Unless there's some weird calculation in the p2pool code to push the base difficulty down
There is.
Well if there is, then the statement on the main p2pool page is wrong also: "P2Pool creates a new block chain in which the difficulty is adjusted so a new block is found every 30 seconds." (I of course updated the 10 to 30.) That statement would be false if what people are implying here is true. To make a share average every 30s, it must use the submitted 1-diff share rate to attempt to make that happen. There is no other way to do it. So again ... If the difficulty drops it will also go up. It averages out to the same share difficulty.
So statistically you still get the same number of shares over time.
|
|
|
|
twmz
|
|
December 20, 2013, 01:41:37 PM Last edit: December 20, 2013, 04:28:52 PM by twmz |
|
Unless there's some weird calculation in the p2pool code to push the base difficulty down
There is.
Well if there is, then the statement on the main p2pool page is wrong also: "P2Pool creates a new block chain in which the difficulty is adjusted so a new block is found every 30 seconds." (I of course updated the 10 to 30.) That statement would be false if what people are implying here is true. To make a share average every 30s, it must use the submitted 1-diff share rate to attempt to make that happen. There is no other way to do it. So again ... If the difficulty drops it will also go up. It averages out to the same share difficulty.
So statistically you still get the same number of shares over time.
It's really not that complicated. p2pool adjusts the minimum required share difficulty so that a share will be found once every 30 seconds. Now, if a bunch of people opt in to a share difficulty much higher than the minimum requirement (doing so is a feature baked into p2pool), then obviously shares are going to be found less often. In order for shares to continue to be found about once per 30 seconds, everyone else has to start finding them more often. The only way they can find shares more often (with the same hashrate) is if p2pool lowers the minimum required share difficulty, and that's exactly what it does.
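A back-of-the-envelope model of that adjustment (my own simplification, not p2pool's actual retargeting code): treat difficulty as "expected hashes per share" and solve for the minimum difficulty that keeps the pool at one share per 30 seconds when part of the hashrate opts in to a fixed higher difficulty.

```python
def new_min_difficulty(hashrate, high_frac, high_diff, target_interval=30.0):
    """Minimum share difficulty that keeps the pool-wide share rate at one
    share per target_interval seconds when a fraction high_frac of the
    hashrate opts in to a fixed high_diff (difficulty = hashes per share)."""
    target_rate = 1.0 / target_interval              # shares per second
    high_rate = high_frac * hashrate / high_diff     # shares from opted-in miners
    low_rate = target_rate - high_rate               # rate the rest must supply
    return (1.0 - high_frac) * hashrate / low_rate

H = 1_000_000                                # hashes/second (arbitrary example)
base = new_min_difficulty(H, 0.0, 1.0)       # everyone on minimum: 30,000,000
# Half the pool opts in to 5x the baseline difficulty:
lowered = new_min_difficulty(H, 0.5, 5 * base)
print(base, lowered)                         # lowered comes out below base
```

With half the hashrate at 5x difficulty, the minimum drops to 5/9 of its old value, exactly the "everyone else finds shares more often" effect described above.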
|
Was I helpful? 1 TwmzX1wBxNF2qtAJRhdKmi2WyLZ5VHRs WoT, GPGBitrated user: ewal.
|
|
|
TierNolan
Legendary
Offline
Activity: 1232
Merit: 1104
|
|
December 20, 2013, 02:49:04 PM |
|
The only way they can find shares more often (with the same hashrate) is if p2pool lowers the minimum required share difficulty, and that's exactly what it does.
Right, the key is that the pool can set the minimum difficulty. Rather than adjusting difficulty for all miners, it adjusts difficulty only for miners who have theirs set to the lowest possible. Another way would be to adjust all miners' difficulty: a miner might set their difficulty to 100x the default value, and the pool would then adjust the default value.
|
1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
|
|
|
crunchy
Newbie
Offline
Activity: 19
Merit: 0
|
|
December 20, 2013, 05:28:56 PM |
|
So what's with these "liteco.in not found" messages at startup?
I've seen what happened with liteco.in. It all looks very questionable.
So why has that address been hardcoded into p2pool?
|
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
December 20, 2013, 10:21:40 PM |
|
... It's really not that complicated. p2pool adjusts the minimum required share difficulty for shares so that a share will be found once every 30 seconds. Now, if a bunch of people start to opt-in to a share difficulty much higher minimum requirement (doing so is a feature baked into p2pool), then obviously shares are going to start to be found less often. In order for shares to continue to be found about once per 30 seconds, everyone else has to start finding them more often. The only way they can find shares more often (with the same hashrate) is if p2pool lowers the minimum required share difficulty, and that's exactly what it does.
That misses Stats 101. Since the shares are of higher value, when one is submitted the hash rate will appear higher. So back again to what I said ...
... No.
You are missing the issue of variance.
If the difficulty drops it will also go up. It averages out to the same share difficulty.
So statistically you still get the same number of shares over time.
... Unless there's some weird calculation in the p2pool code to push the base difficulty down, it doesn't matter if someone is submitting 'double' diff shares - they are just the same as submitting 2 'single' diff shares other than an increase in variance.
i.e. no 'weird' calculation. If every share you submit is (on average) a double-diff share, but you submit half as often, then the average share difficulty you submit will still be the same. I'll try a little simple maths: 2 + 0 = 2 ... average ... 1. Since the highest user is only around 11% of the pool, the variance it causes is also pretty small.
|
|
|
|
kjj
Legendary
Offline
Activity: 1302
Merit: 1026
|
|
December 20, 2013, 11:09:12 PM |
|
... It's really not that complicated. p2pool adjusts the minimum required share difficulty for shares so that a share will be found once every 30 seconds. Now, if a bunch of people start to opt-in to a share difficulty much higher minimum requirement (doing so is a feature baked into p2pool), then obviously shares are going to start to be found less often. In order for shares to continue to be found about once per 30 seconds, everyone else has to start finding them more often. The only way they can find shares more often (with the same hashrate) is if p2pool lowers the minimum required share difficulty, and that's exactly what it does.
That misses Stats 101. Since the shares are of higher value, when one is submitted the hash rate will appear higher. So back again to what I said ...
... No.
You are missing the issue of variance.
If the difficulty drops it will also go up. It averages out to the same share difficulty.
So statistically you still get the same number of shares over time.
... Unless there's some weird calculation in the p2pool code to push the base difficulty down, it doesn't matter if someone is submitting 'double' diff shares - they are just the same as submitting 2 'single' diff shares other than an increase in variance.
i.e. no 'weird' calculation. If every share you submit is (on average) a double-diff share, but you submit half as often, then the average share difficulty you submit will still be the same. I'll try a little simple maths: 2 + 0 = 2 ... average ... 1. Since the highest user is only around 11% of the pool, the variance it causes is also pretty small.
Are you being dense on purpose, or do you really not get it? The network targets a share rate, not a hash rate, not a difficulty rate, and not a (share times value) rate. It adjusts so that the correct number of shares is found over time; those shares can be of any difficulty equal to or greater than the minimum. Shares of higher than minimum difficulty are worth more, but still only take up one "slot" in the calculation.
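A toy retarget loop (my own sketch, not the actual p2pool algorithm) illustrating kjj's point: the controller counts shares - "slots" - regardless of each share's difficulty, so miners who opt up push the minimum difficulty down for everyone else.

```python
def observed_shares(min_diff, hashes_per_window, high_frac=0.1, high_mult=20):
    # 10% of the hashrate mines at 20x the minimum difficulty; each of its
    # (rarer) shares still counts as exactly one share here.
    low = (1 - high_frac) * hashes_per_window / min_diff
    high = high_frac * hashes_per_window / (high_mult * min_diff)
    return low + high

TARGET = 10.0            # shares per retarget window
HASHES = 1_000_000       # pool-wide hashes per window (arbitrary)
diff = HASHES / TARGET   # start from the all-default equilibrium: 100,000
for _ in range(50):      # multiplicative retarget, like difficulty adjustment
    diff *= observed_shares(diff, HASHES) / TARGET
print(diff)              # settles below the starting 100,000
```

The loop settles at 90,500: too few shares arrive per window, so the minimum difficulty is pushed down until the share count (not the difficulty-weighted work) hits the target again.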
|
17Np17BSrpnHCZ2pgtiMNnhjnsWJ2TMqq8 I routinely ignore posters with paid advertising in their sigs. You should too.
|
|
|
twmz
|
|
December 21, 2013, 01:25:56 AM |
|
That misses Stats 101
You're welcome to believe what you want, but I'm done interacting with you. It's an observable fact that if high-hashrate miners opt in to a higher share difficulty/target, the minimum share difficulty/target will come down for everyone else. Think of it as "magic" if that makes you more comfortable.
|
Was I helpful? 1 TwmzX1wBxNF2qtAJRhdKmi2WyLZ5VHRs WoT, GPGBitrated user: ewal.
|
|
|
powersync
Member
Offline
Activity: 77
Merit: 10
|
|
December 21, 2013, 04:34:49 AM |
|
hey guys,
I just started my LTC p2pool server again after taking a break for 6 months. I upgraded to 13.4-4 and am seeing my memory usage creep up to 1 GB. I restart p2pool and all is fine. I'm running Python 2.7.2+ on Ubuntu Server 11.10. I can't find anyone lately having the same problem. Where should I start to solve this?
PS
|
|
|
|
Raulnsh
Newbie
Offline
Activity: 22
Merit: 0
|
|
December 21, 2013, 10:54:35 AM |
|
hey guys,
I just started my ltc p2pool server again after taking a break for 6 months. I upgraded to 13.4-4 and seeing my memory usage creep up to 1gb. I restart P2pool and all is fine. I'm running Python 2.7.2+ on Ubuntu Linux server 11.10. I cant find anyone lately having the same problem. Where should I start to solve this?
PS
You should use p2pool 13.3. I tried upgrading from 13.3 to 13.4, but the latter did not get any shares.
|
|
|
|
powersync
Member
Offline
Activity: 77
Merit: 10
|
|
December 21, 2013, 04:12:33 PM |
|
Well, went back to 13.3 and all is fine... What's up with that? Anyone having luck with 13.4?
PS
|
|
|
|
maqifrnswa
|
|
December 21, 2013, 11:24:28 PM |
|
Are you being dense on purpose, or do you really not get it?
The network targets a share rate, not a hash rate, not a difficulty rate, and not a (share times value) rate. It adjusts so that the correct number of shares is found over time, which shares can be of any difficulty equal to or greater than the minimum. Shares of higher than the minimum difficulty are worth more, but still only take up one "slot" in the calculation.
I'm surprised too that kano doesn't know that. p2pool works this way.
Imagine there are two miners on p2pool; miner A has 10x the hashrate of miner B. Miners A and B both use the default difficulty, which is adjusted so that miner A gets 10x the shares of miner B and a share is found every 30 seconds. In one hour (on average), miner A finds ~109 shares and miner B finds ~11 shares.
Now miner A manually sets his difficulty 5x higher than the default. He now finds shares 5x slower (on average). So in an hour, miner A only finds ~22 shares, and miner B still finds ~11 shares. The p2pool network now must decrease difficulty so that 120 shares are found per hour. Difficulty is thus decreased to the point where miner B (mining at the default minimum difficulty) finds 120 - 22 = 98 shares in an hour.
That means:
- miner A increased his difficulty 5x
- the p2pool minimum difficulty decreases ~9x
- both miners A and B get the same payout on average; miner A has higher variance than default, and miner B has lower variance than default.
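The arithmetic in that example checks out directly (rounding as in the post):

```python
TARGET_SHARES_PER_HOUR = 3600 / 30              # 120 shares/hour pool-wide

# Miner A has 10x miner B's hashrate, both on default difficulty:
a_default = TARGET_SHARES_PER_HOUR * 10 / 11    # ~109 shares/hour
b_default = TARGET_SHARES_PER_HOUR * 1 / 11     # ~11 shares/hour

# A raises his difficulty 5x, so he finds shares 5x slower:
a_new = a_default / 5                           # ~22 shares/hour
# B must make up the difference to keep 120 shares/hour:
b_new = TARGET_SHARES_PER_HOUR - a_new          # ~98 shares/hour
# The minimum difficulty therefore drops by b_new / b_default:
drop = b_new / b_default
print(round(a_new), round(b_new), round(drop, 1))  # -> 22 98 9.0
```

So a 5x opt-up by the large miner really does translate into a 9x drop in the minimum difficulty for the small miner, exactly as described.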
|
|
|
|
kano
Legendary
Offline
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
|
|
December 22, 2013, 01:01:50 AM |
|
... Are you being dense on purpose, or do you really not get it?
The network targets a share rate, not a hash rate, not a difficulty rate, and not a (share times value) rate. It adjusts so that the correct number of shares is found over time, which shares can be of any difficulty equal to or greater than the minimum. Shares of higher than the minimum difficulty are worth more, but still only take up one "slot" in the calculation.
Sigh - OK, so what you are telling me is that the function in p2pool is flawed and not as specified on the p2pool.info web site.
The hash rate of the pool is simply the accepted difficulty of all the shares over a given period of time, divided by that time. The shorter the time frame, the more unreliable it is, i.e. the greater the variance.
The way bitcoin works is it does indeed do what you are suggesting - however, it does it over a set of data that is consistent: a set of blocks that have the same difficulty requirement. Doing it over a set of data that isn't consistent is, well, doing it incorrectly. Maybe that is part of the reason why it jumps all over the place like crap. What you are implying is that it indeed has nothing to do with the p2pool hash rate, but rather with a set of shares of somewhat random difficulty.
I will also point out that your wording is vague and missing a detail: the difficulty of the share submitted is only the difficulty it was requested at, not the actual difficulty of the share itself. All shares on all pools have varying difficulty, but it is only the difficulty of the work request that can affect anything in any pool payment scheme other than PoT.
Edit: this also means the payout scheme could be flawed.
|
|
|
|
|