organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
May 11, 2012, 03:39:07 PM |
|
At a guess (and seeing a number of conversations about it) he's probably concerned about variance vs maturity time. It would be nice if pools posted these values based on their chosen parameters. It would be a point of differentiation for them, anyway.
I thought the other pools were too. Since we moved to DGM we have shown the Double Geometric variables f=-0.25, c=0.2, o=0.8 in the first post of our forum thread and on Ozcoin's homepage - are there others we should be showing? I do think that's a great start, but it occurred to me that miners just want to know how the choice of variables will affect them. Many miners will confuse expectation with variance and maturity time. Having the variables listed is good, no doubt, but for a naive miner it will be difficult to understand their relevance. One problem is that I don't know of a simple calculation that will give the variance and maturity time as a function of the parameters. Also, there is more than one kind of variance. If there is demand for this I'll try to work on some approximations or charts or something. For now there's the variance for a single share in the OP, it's relevant for very small or intermittent miners, and as you can see it's a mouthful.
Thinking about it some more, even listing the single share variance wouldn't be a very good indicator for a general miner. Charted data from a simulation might be a better bet, but I'm not sure how you'd select a set of exemplars that would cover all (or most) situations of interest to a miner. For example, would you need to have separate large and small miner simulation results? Plus, each pool would have to run their own simulation to illustrate what effects their variable choices have on the performance metrics of the exemplars selected.
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
May 11, 2012, 03:47:55 PM |
|
One problem is that I don't know of a simple calculation that will give the variance and maturity time as a function of the parameters. Also, there is more than one kind of variance. If there is demand for this I'll try to work on some approximations or charts or something. For now there's the variance for a single share in the OP, it's relevant for very small or intermittent miners, and as you can see it's a mouthful.
Thinking about it some more, even listing the single share variance wouldn't be a very good indicator for a general miner. Charted data from a simulation might be a better bet, but I'm not sure how you'd select a set of exemplars that would cover all (or most) situations of interest to a miner. For example, would you need to have separate large and small miner simulation results? Plus, each pool would have to run their own simulation to illustrate what effects their variable choices have on the performance metrics of the exemplars selected. It should be possible to create a dataset by running for various parameters, and then interpolating for all other values. Then each pool can show values or graphs for their chosen parameters, and allow the user to set his hashrate with a slider to see his specific results. I have reason to believe that the variance is affine in the miner's hashrate (at least approximately), so that simplifies things. If the interpolation can be done easily enough, maybe the pool can also give sliders for seeing the results for various DGM parameters. By the way, in AoBPMRS I mentioned that it's not strictly necessary that all miners in a pool will have the same parameters. So it could be possible for miners to choose their own parameters based on their desired performance metrics.
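A minimal sketch of the dataset-plus-slider idea, assuming each pool supplies its own DGM simulation as a callable (simulate_variance here is a hypothetical placeholder, and the affine-in-hashrate observation above is what makes a simple linear fit sufficient):

import numpy as np

def build_variance_table(simulate_variance, f, c, o,
                         fractions=(0.001, 0.01, 0.05, 0.1)):
    # simulate_variance(f, c, o, q) is assumed to return the payout variance
    # for a miner contributing a fraction q of the pool's hashrate.
    return {q: simulate_variance(f, c, o, q) for q in fractions}

def variance_for(table, hashrate_fraction):
    # If variance is (approximately) affine in the hashrate fraction, a
    # least-squares line through a few simulated points is enough; the
    # "slider" on a pool's stats page would simply evaluate this line.
    qs = np.array(sorted(table))
    vs = np.array([table[q] for q in qs])
    slope, intercept = np.polyfit(qs, vs, 1)
    return slope * hashrate_fraction + intercept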
|
|
|
|
kano
Legendary
Offline
Activity: 4620
Merit: 1851
Linux since 1997 RedHat 4
|
|
May 11, 2012, 09:52:59 PM |
|
... As for bankruptcy payout - the main problem I'd see there from DGM vs PPS (or some of the different PPS) is that no one can probably work out the owed amount on DGM since there is a future component to that, that depends on the future also. On PPS it's pretty simple.
You can calculate the expected future payment for a given score. (Actually, you can parametrize it so that the expected payout is the score). The operator should pay that out so that miners won't suffer (I'm assuming a deterministic payout is considered better than a variable payout of the same expectation). The total score of all miners is bounded, and the operator should have this as an additional reserve - he should declare bankruptcy before he becomes unable to offer a cashout. Just thought I'd mention that you seem to be discussing exactly what I was referring to here ...
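A minimal sketch of that bookkeeping, assuming the parametrization mentioned above in which a miner's expected future payout equals his score (scores is a hypothetical {miner: score} mapping in BTC units; the function names are illustrative only):

def required_reserve(scores):
    # Under the parametrization where expected future payout equals the score,
    # the total owed at any moment is just the sum of scores.
    return sum(scores.values())

def solvent(operator_balance, scores):
    # The operator should keep at least the total score in reserve, and wind
    # down (cashing everyone out) before this stops being true.
    return operator_balance >= required_reserve(scores)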
|
|
|
|
organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
May 11, 2012, 11:11:34 PM |
|
... As for bankruptcy payout - the main problem I'd see there from DGM vs PPS (or some of the different PPS) is that no one can probably work out the owed amount on DGM since there is a future component to that, that depends on the future also. On PPS it's pretty simple.
That's not really what I meant. I don't want to see how parameters affect the 'owed amount', but rather the variance in payout and maturity time. If the score is calculated to represent the 'owed amount', then a graphical illustration of the effects of changing parameters wouldn't be necessary. Unless I'm not following you?
|
|
|
|
organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
May 12, 2012, 01:42:34 AM |
|
It should be possible to create a dataset by running for various parameters, and then interpolating for all other values. Then each pool can show values or graphs for their chosen parameters, and allow the user to set his hashrate with a slider to his how specific results. I have reason to believe that the variance is affine in the miner's hashrate (at least approximately), so that simplifies things. If the interpolation can be done easily enough, maybe the pool can also give sliders for seeing the results for various DGM parameters.
The important parameters being f, c, o, and miner hashrate as a proportion of pool hashrate? Let me know if I missed anything. By the way, in AoBPMRS I mentioned that it's not strictly necessary that all miners in a pool will have the same parameters. So it could be possible for miners to choose their own parameters based on their desired performance metrics.
I didn't see that mentioned explicitly - is it implied?
|
|
|
|
Graet
VIP
Legendary
Offline
Activity: 980
Merit: 1001
|
|
May 12, 2012, 03:37:03 AM |
|
The important parameters being f, c, o, and miner hashrate as a proportion of pool hashrate? Let me know if I missed anything.
Is this assuming a "stable" hashrate through the round? With proxy miners, gpumax miners and some leftover hoppers using us as backup etc. coming and going, our hashrate can vary significantly within one round, e.g. https://ozcoin.net/content/hashrate
|
|
|
|
organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
May 12, 2012, 03:46:15 AM |
|
I think you need to assume a miner's hashrate is a constant proportion of the pool's hashrate, excepting when variance in share submission is a variable of interest.
|
|
|
|
Graet
VIP
Legendary
Offline
Activity: 980
Merit: 1001
|
|
May 12, 2012, 03:57:47 AM |
|
So we'd use the miner's average hashrate and the pool's average hashrate over the round to be able to make that assumption - cheers
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
May 12, 2012, 05:34:12 PM |
|
... As for bankruptcy payout - the main problem I'd see there from DGM vs PPS (or some of the different PPS) is that no one can probably work out the owed amount on DGM since there is a future component to that, that depends on the future also. On PPS it's pretty simple.
You can calculate the expected future payment for a given score. (Actually, you can parametrize it so that the expected payout is the score). The operator should pay that out so that miners won't suffer (I'm assuming a deterministic payout is considered better than a variable payout of the same expectation). The total score of all miners is bounded, and the operator should have this as an additional reserve - he should declare bankruptcy before he becomes unable to offer a cashout. Just thought I'd mention that you seem to be discussing exactly what I was referring to here ... Expected payout is one thing, variance is another, maturity time is another. Expected payout is the most important and easiest to calculate, and it's safe to refer to it as the amount owed. It should be possible to create a dataset by running for various parameters, and then interpolating for all other values. Then each pool can show values or graphs for their chosen parameters, and allow the user to set his hashrate with a slider to see his specific results. I have reason to believe that the variance is affine in the miner's hashrate (at least approximately), so that simplifies things. If the interpolation can be done easily enough, maybe the pool can also give sliders for seeing the results for various DGM parameters.
The important parameters being f, c, o, and miner hashrate as a proportion of pool hashrate? Let me know if I missed anything. A measure of the miner's intermittency would also be good to have. By the way, in AoBPMRS I mentioned that it's not strictly necessary that all miners in a pool will have the same parameters. So it could be possible for miners to choose their own parameters based on their desired performance metrics.
I didn't see that mentioned explicitly - is it implied? It's fairly explicit in section 7.4, "Heterogeneous pools". So we'd use the miner's average hashrate and the pool's average hashrate over the round to be able to make that assumption - cheers Variability in pool hashrate is in a sense equivalent to variability in miner hashrate, which I think should be included. One issue is that the specific form of variability will be different, but perhaps it is possible to encode it approximately with a single value.
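For concreteness, one possible exemplar grid for such a dataset, with intermittency and hashrate variability encoded approximately as a single duty-cycle number (the specific values and the duty-cycle encoding are illustrative assumptions, not something proposed in the thread):

from itertools import product

# Illustrative grid: DGM parameters, the miner's share of pool hashrate,
# and a single-number stand-in for intermittency (fraction of time active).
f_values        = (-0.25, 0.0, 0.05)
c_values        = (0.1, 0.2, 0.3)
o_values        = (0.5, 0.8)
hashrate_shares = (0.001, 0.01, 0.1)
duty_cycles     = (0.25, 0.5, 1.0)    # 1.0 = mining continuously

exemplars = list(product(f_values, c_values, o_values, hashrate_shares, duty_cycles))
# Each pool would run its simulation once per exemplar and interpolate
# between the results for the slider idea discussed above.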
|
|
|
|
Graet
VIP
Legendary
Offline
Activity: 980
Merit: 1001
|
|
September 02, 2012, 03:51:54 AM |
|
Lots of miners moving around atm and questions on DGM being asked - time for a bump to make this thread easier to find
|
|
|
|
doublec
Legendary
Offline
Activity: 1078
Merit: 1005
|
|
September 09, 2012, 01:01:12 PM |
|
What changes should be made to the DGM calculation in the presence of miners that are mining at different difficulty levels? For example, a pool that dynamically adjusts difficulty based on the user's hashrate, or one that allows the user to select their own target value?
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
September 09, 2012, 02:22:12 PM |
|
What changes should be made to the DGM calculation in the presence of miners that are mining at different difficulty levels? For example, a pool that dynamically adjusts difficulty based on the users hashrate, or one that allows the user to select their own target value?
It's more or less the same, but instead of p=1/D you should use throughout p=d/D, where d is the difficulty of the current share submitted.
|
|
|
|
Syloq
Newbie
Offline
Activity: 8
Merit: 0
|
|
November 15, 2012, 04:01:22 PM |
|
What changes should be made to the DGM calculation in the presence of miners that are mining at different difficulty levels? For example, a pool that dynamically adjusts difficulty based on the users hashrate, or one that allows the user to select their own target value?
It's more or less the same, but instead of p=1/D you should use throughout p=d/D, where d is the difficulty of the current share submitted. Pardon my ignorance to this issue but I was looking over your initial post and I was trying to understand why with a variable share difficulty that p would turn into d/D? Wouldn't it be 1/d? I guess I don't understand where D comes into play with variable share difficulty.
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
November 15, 2012, 04:14:22 PM |
|
What changes should be made to the DGM calculation in the presence of miners that are mining at different difficulty levels? For example, a pool that dynamically adjusts difficulty based on the users hashrate, or one that allows the user to select their own target value?
It's more or less the same, but instead of p=1/D you should use throughout p=d/D, where d is the difficulty of the current share submitted. Pardon my ignorance to this issue but I was looking over your initial post and I was trying to understand why with a variable share difficulty that p would turn into d/D? Wouldn't it be 1/d? I guess I don't understand where D comes into play with variable share difficulty. p is the probability that a share will be a block. If the global difficulty is D and the share difficulty is the standard 1 then p=1/D. If the share difficulty is d then p=d/D.
|
|
|
|
Syloq
Newbie
Offline
Activity: 8
Merit: 0
|
|
November 15, 2012, 05:09:52 PM |
|
What changes should be made to the DGM calculation in the presence of miners that are mining at different difficulty levels? For example, a pool that dynamically adjusts difficulty based on the users hashrate, or one that allows the user to select their own target value?
It's more or less the same, but instead of p=1/D you should use throughout p=d/D, where d is the difficulty of the current share submitted. Pardon my ignorance to this issue but I was looking over your initial post and I was trying to understand why with a variable share difficulty that p would turn into d/D? Wouldn't it be 1/d? I guess I don't understand where D comes into play with variable share difficulty. p is the probability that a share will be a block. If the global difficulty is D and the share difficulty is the standard 1 then p=1/D. If the share difficulty is d then p=d/D. Ah, so it should have always been d/D? Where d was always 1 until now?
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
November 15, 2012, 05:30:59 PM |
|
What changes should be made to the DGM calculation in the presence of miners that are mining at different difficulty levels? For example, a pool that dynamically adjusts difficulty based on the users hashrate, or one that allows the user to select their own target value?
It's more or less the same, but instead of p=1/D you should use throughout p=d/D, where d is the difficulty of the current share submitted. Pardon my ignorance to this issue but I was looking over your initial post and I was trying to understand why with a variable share difficulty that p would turn into d/D? Wouldn't it be 1/d? I guess I don't understand where D comes into play with variable share difficulty. p is the probability that a share will be a block. If the global difficulty is D and the share difficulty is the standard 1 then p=1/D. If the share difficulty is d then p=d/D. Ah, so it should have always been d/D? Where d was always 1 until now? Exactly.
|
|
|
|
Bicknellski
|
|
April 02, 2013, 12:24:28 AM |
|
@kano: Your error is that you do not properly consider the effect of round length on how much shares are worth.
In a proportional pool, shares in a short round are worth more than in a long round. If you look at the intra-round level then yes, hoppers get per share the same as anyone else in the round. But the trick is that hoppers get more of their shares in short rounds than normal miners, thus profit more per share.
So to be fair in a proportional system you need not only for all shares in a round to be rewarded equally, you also need each share to have the same chance to go into a short round as other shares. And this simply doesn't hold true for hoppers, so it's unfair.
To give an analogy: At a soup kitchen people are randomly assigned to a line where potatoes are handed out or a line where meat (which is superior) is handed out. Suppose someone sneaks into the meat line although he was assigned to the potato line. He gets the same food as everyone else in the line - but it's unfair because he cheated his way into the better line. (Though this example will only make sense to people who understand that true randomness is fair).
In DGM the concept of rounds isn't rigidly defined. If a hopper tries to hop a DGM pool as if it was proportional then yes, in every "round" he'd get per share on average less than normal miners. But this is only because the artificial division to rounds is flawed. What matters is the reward per share on the global level. If you must, you can say that hoppers get more of their shares in the shorter, more lucrative rounds, but in those rounds get less per share than other shares in the round. The two effects cancel each other to result in average payout per share equal to (1-f)pB, no matter the mining pattern.
And I disagree vehemently with your sentiment that "maths wins logic loses". If a mathematical result seems illogical the art did not fail you, it is you who failed the art.
Sweet explanation... thanks Meni.
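Meni's short-round argument is easy to check numerically. Below is a rough sketch (not from the thread): it approximates round length as exponential with mean D, gives the hopper the oft-quoted rule of mining only while the round is younger than 43.5% of the difficulty, and compares average reward per share in a proportional pool for a hopper versus a steady miner. The hopper's per-share average comes out well above the steady miner's fair value of pB.

import random

def simulate_proportional(rounds=200_000, D=1_000_000, hop_threshold=0.435):
    # Round length (total pool shares until a block) is approximated as
    # exponential with mean D; block reward is taken to be 1.
    steady_pay = steady_shares = 0.0
    hopper_pay = hopper_shares = 0.0
    for _ in range(rounds):
        length = random.expovariate(1.0 / D) + 1.0   # +1 for the block-finding share
        per_share = 1.0 / length                     # proportional: reward split over all shares
        # A steady miner with a small hashrate fraction q submits about q*length
        # shares every round; q cancels out of the per-share average, so omit it.
        steady_shares += length
        steady_pay += length * per_share
        # A hopper only submits shares while the round has fewer than
        # hop_threshold * D shares in it.
        mined = min(length, hop_threshold * D)
        hopper_shares += mined
        hopper_pay += mined * per_share
    print("steady miner, avg reward per share (in units of pB):",
          steady_pay / steady_shares * D)
    print("hopper, avg reward per share (in units of pB):",
          hopper_pay / hopper_shares * D)

simulate_proportional()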
|
|
|
|
techwtf
|
|
April 14, 2013, 12:14:08 PM |
|
(some code snippet...)

from decimal import Decimal

# Presumably c_1 = 1 - c and o_1 = 1 - o, with the DGM state
# (worker, p, s, B, r, f, c) kept as module-level globals.
def share(user, diff=1):
    global worker, p, s, B
    my_r = Decimal(1.0) + p * diff * c_1 * o_1 / c
    score = p * diff * B * s
    s = s * my_r
    worker[user] = worker.get(user, Decimal(0.0)) + score

def estimate(user):
    global worker, r, f, p, s
    score = worker.get(user, Decimal(0.0))
    return score * (r - Decimal(1.0)) * (Decimal(1.0) - f) / (p * s)
Did I handle the var diff right? in linear model. p is always 1/D.
|
|
|
|
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
April 14, 2013, 01:16:57 PM Last edit: April 24, 2013, 08:25:52 AM by Meni Rosenfeld |
|
(some code snippet...)

def share(user, diff=1):
    global worker, p, s, B
    my_r = Decimal(1.0) + p * diff * c_1 * o_1 / c
    score = p * diff * B * s
    s = s * my_r
    worker[user] = worker.get(user, Decimal(0.0)) + score

def estimate(user):
    global worker, r, f, p, s
    score = worker.get(user, Decimal(0.0))
    return score * (r - Decimal(1.0)) * (Decimal(1.0) - f) / (p * s)
Did I handle the var diff right? in linear model. p is always 1/D. A small but important correction: The last line should be return score * (r - Decimal(1.0)) * (Decimal(1.0) - f) / (diff * p * s)
(Extra diff in the denominator.) In addition: 1. The function "estimate" seems to return the reward that should be given to the user for a block just found, but the name seems to suggest something else. 2. Make sure the r used in the reward calculation is based on the current p.
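Putting the two notes together, one possible corrected form (a sketch of one reading, not code from the thread: the block-finding share's difficulty is passed in explicitly, and the function is renamed per note 1):

def block_reward(user, diff):
    # diff: difficulty of the share that just found the block (assumed to be
    # what the "extra diff in the denominator" refers to).
    global worker, f, p, s, c, c_1, o_1
    score = worker.get(user, Decimal(0.0))
    r = Decimal(1.0) + p * diff * c_1 * o_1 / c   # r based on the current p (note 2)
    # With r defined this way, (r - 1) / (diff * p) reduces to c_1 * o_1 / c,
    # so the result does not depend on the block-finding share's difficulty.
    return score * (r - Decimal(1.0)) * (Decimal(1.0) - f) / (diff * p * s)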
|
|
|
|
doublec
Legendary
Offline
Activity: 1078
Merit: 1005
|
|
April 18, 2013, 02:19:30 PM |
|
I'm a bit confused about the terms "fixed fee" vs "average total fee" mentioned in the OP and what they mean. Is the "average total fee" the average fee the pool operator receives over the long term? Or is 'f' the fee the pool operator expects to receive over the long term?
I ask because I notice, for example, a pool that states it has a 1% DGM fee but parameters "f=-0.25, c=0.2, o=0.8". This makes the "average total fee" 0, which makes me think that isn't the fee the pool is expecting to get on average.
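For reference, the arithmetic behind that 0: if the average total fee combines the two fees as 1 - (1 - f)(1 - c), which is what the calculation in this post implies, then with f = -0.25 and c = 0.2 it is 1 - 1.25 * 0.8 = 0, i.e. the negative fixed fee exactly offsets the 20% variable fee in expectation.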
|
|
|
|