Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
June 08, 2011, 08:04:01 PM
Quote from: simplecoin
Using (1-rd.f)*(1-rd.c)*p*rd.b*sum(exp(lscore-lastlscore)) to calculate an estimated earning, I'm getting wildly incorrect results. For example, my account says 11.x, while the full-round calc gives closer to 6. Additionally, the sum of estimates is >88 when it should be <50. I'm looking at a round with ~1 million shares.

I should emphasize that mid-round estimates are fairly meaningless. It is always almost certain that the round's end will be far enough in the future that all current shares will be worthless. In particular, the expected reward for already-existing shares will be close to 0, while the reward if the round ended now will be much higher. However, the numbers you report might indicate a problem with the calculation. Please post all the values involved, as well as the lscore values of the last few shares, both in general and for the particular worker.

Also, in your thread you say:

Quote from: simplecoin
0.001% (or less) fee!

If you use f=0, c=0.001 then it's not a 0.001% fee, it's a 0.1% fee. And that's the average; on any given round it can be much higher. You can also consider using a negative f to make the expected fee 0.

Quote from: simplecoin
So, if you're not trying to pool-hop, you will be paid slightly more than those who are.

Those not trying to pool-hop will, in expectation, earn exactly as much as those who are, per share submitted.
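For concreteness, here is a minimal PHP sketch of that "if the round ended now" estimate (variable and function names are hypothetical; $f, $c, $p and $B stand for rd.f, rd.c, p and rd.b in the formula quoted above):

Code:
<?php
// Minimal sketch, assumed names: what one worker's shares would earn if
// the round ended right now, per (1-f)*(1-c)*p*B*sum(exp(lscore-lastlscore)).
// $lscores    - lscore values of this worker's shares
// $lastlscore - lscore of the round's most recent share
function estimate_if_round_ended_now($f, $c, $p, $B, array $lscores, $lastlscore)
{
    $sum = 0.0;
    foreach ($lscores as $ls) {
        // Older shares have smaller lscore and contribute exponentially
        // less; the round's latest share contributes exp(0) = 1.
        $sum += exp($ls - $lastlscore);
    }
    return (1 - $f) * (1 - $c) * $p * $B * $sum;
}

As noted above, this is the "round ends now" figure, not the expected reward of the existing shares, which is close to 0.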
martok
June 08, 2011, 10:21:19 PM
It's been some time since I've used MySQL, but I expect it might give you some trouble implementing this method. Not to say it can't be done, but you might have to be careful with it. Unless MySQL has a locking read mode (i.e., SELECT * FROM round FOR UPDATE blocks other threads trying to score a share), there is a possibility of bad data getting in:

thread 1: SELECT * FROM round; -- reads lastscore = 1
thread 2: SELECT * FROM round; -- also reads lastscore = 1
thread 1: scores the current share at 1 + r
thread 2: does the same thing
thread 1: UPDATE round SET lastscore = lastscore + r; COMMIT;
thread 2: UPDATE round SET lastscore = lastscore + r;

But thread 2's score is wrong, because thread 1's update wasn't accounted for.

MySQL has transactions these days, but I wonder how it handles a case like this, where each share depends on data from the previous one.
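A minimal sketch of the locking-read fix described above, assuming PDO, an InnoDB round table with id and lastscore columns, and an additive increment $r (all names hypothetical):

Code:
<?php
// Sketch: serialize score assignment so two threads can never read the
// same lastscore. Requires InnoDB; MyISAM ignores transactions and locks.
$db = new PDO('mysql:host=localhost;dbname=pool', 'user', 'pass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$r = 0.0001; // placeholder: whatever increment the scoring rule dictates

$db->beginTransaction();
// FOR UPDATE row-locks the round row: another thread running this same
// SELECT ... FOR UPDATE will block here until we commit.
$row = $db->query('SELECT lastscore FROM round WHERE id = 1 FOR UPDATE')
          ->fetch(PDO::FETCH_ASSOC);
$newscore = $row['lastscore'] + $r; // score for the incoming share
$stmt = $db->prepare('UPDATE round SET lastscore = ? WHERE id = 1');
$stmt->execute(array($newscore));
// ... INSERT the share row carrying $newscore here, in the same transaction ...
$db->commit();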
simplecoin
June 08, 2011, 10:48:35 PM
Quote from: Meni Rosenfeld
I should emphasize that mid-round estimates are fairly meaningless. It is always almost certain that the round's end will be far enough in the future that all current shares will be worthless. In particular, the expected reward for already-existing shares will be close to 0, while the reward if the round ended now will be much higher. However, the numbers you report might indicate a problem with the calculation. Please post all the values involved, as well as the lscore values of the last few shares, both in general and for the particular worker.

It seems to be behaving now; I think it was just the earlier implementation. It's still off, just not quite as badly.

Quote from: Meni Rosenfeld
If you use f=0, c=0.001 then it's not a 0.001% fee, it's a 0.1% fee. And that's the average; on any given round it can be much higher. You can also consider using a negative f to make the expected fee 0.

So using $c = 0.001; $f = (-$c)/(1-$c); should result in 0.1%? Or should that be closer to 0? Is there a way to get to 0% without possible losses?

Quote from: Meni Rosenfeld
Those not trying to pool-hop will, in expectation, earn exactly as much as those who are, per share submitted.

Got it.
Donations: 1VjGJHPtLodwCFBDWsHJMdEhqRcRKdBQk
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
June 09, 2011, 03:51:49 AM Last edit: July 07, 2011, 06:42:28 PM by Meni Rosenfeld
Quote from: simplecoin
So using $c = 0.001; $f = (-$c)/(1-$c); should result in 0.1%? Or should that be closer to 0? Is there a way to get to 0% without possible losses?

With these parameters the expected fee is 0, but on any given round it can be as low as -0.1001% (negative) or as high as 100%. There's no way to have an expected fee of 0% without a risk of losing out on some rounds. Note, though, that the losses will be very small.
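Assuming (consistently with the 0.1% figure for f=0, c=0.001 above) that the average total fee is 1 - (1-f)(1-c), the choice f = -c/(1-c) makes it vanish:

\[
(1-f)(1-c) = \left(1 + \frac{c}{1-c}\right)(1-c) = (1-c) + c = 1,
\qquad\text{so}\qquad 1 - (1-f)(1-c) = 0.
\]

With c = 0.001 this gives f = -0.001/0.999, about -0.1001%, matching the lower bound above.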
Inaba
Legendary
Offline
Activity: 1260
Merit: 1000
June 19, 2011, 09:07:53 PM
So I have been trying to figure out the last part of this, and from everything I've read in the thread and from the example code I've looked at for both PGSQL and MySQL, I can't quite grasp what's going on here:

let totscore := sum(exp(share.lscore - max)) + exp(round.los - max)

In the pseudocode, the PGSQL code and the MySQL code, the share.lscore - max term would always seem to evaluate to zero? Is this correct? If not, it seems that lscore will be an arbitrary number based on the last user to submit a share for that block (presumably the person who found the answer). So one person can have an lscore of 80, having been there the entire round, and another person can have an lscore of 1 but find the block, thus giving a completely random and arbitrary value of lscore in that scenario.

So my question is: which value does lscore take, and why? If it's the former, what's the point of an expression that always evaluates to zero? If it's the latter, what's going on in the formula that it can take an essentially random number and use it as a valid value for calculating the score?
If you're searching these lines for a point, you've probably missed it. There was never anything there in the first place.
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
June 20, 2011, 03:58:26 AM
The formula you quote is used only when the round ends, to calculate the payouts.
lscore is the logarithm of the score of a given share.
"share" is a table that contains information about all shares. "max" is the lscore of the last share (note the "order by id desc limit 1;" part). So lscore - max will be 0 only for the last share; for earlier shares it will be negative. You sum over all shares.
The later a share was submitted, the higher its lscore, as specified in the algorithm.
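As a concrete sketch (hypothetical names; this is just the quoted formula, with the subtraction of max providing log-sum-exp style numerical stability):

Code:
<?php
// Sketch, assumed names: the round-end total from the quoted formula,
//   totscore = sum(exp(share.lscore - max)) + exp(round.los - max).
// Subtracting $max (the last share's lscore) keeps every share term's
// exp() argument <= 0, so large lscores can't overflow exp().
function totscore(array $lscores, $los)
{
    $max = max($lscores);    // lscore of the last (most recent) share
    $sum = exp($los - $max); // the round.los term of the formula
    foreach ($lscores as $ls) {
        $sum += exp($ls - $max); // 1 for the last share, < 1 for earlier ones
    }
    return $sum;
}

The same $max must then be reused when valuing the individual shares; that consistency requirement comes up again below.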
Inaba
Legendary
Offline
Activity: 1260
Merit: 1000
June 20, 2011, 04:47:59 AM
Hi Meni,
Thanks for responding. So just to confirm: MAX is being set to the score of the last share, which is arbitrary insofar as it could be anything depending on when the round ends?
I think I understand what's going on now as far as lscore - max goes. Is the share table then intended to contain one row per share, each with a score, as opposed to one row per user with an aggregate score that is increased with each share submitted? I think that may be where I went off track and why it wasn't making sense.
Is each share worth a certain amount regardless of who submitted it, or does the value of a share differ depending on who submitted it? By this I mean: if person A has submitted 500 shares in the past hour and person B has submitted 200, is person A's 500th share worth more than person B's 200th, or are the two worth the same if submitted at the same time (well, one worth slightly less than the other, depending on the order they were submitted in)?
If you're searching these lines for a point, you've probably missed it. There was never anything there in the first place.
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
June 20, 2011, 05:51:39 AM
Quote from: Inaba
So just to confirm: MAX is being set to the score of the last share, which is arbitrary insofar as it could be anything depending on when the round ends?

Yes. And it's also arbitrary in the sense that it's just used for numerical stability; you can use another value for it, as long as it's used consistently in all parts of the calculation.

Quote from: Inaba
Is the share table then intended to contain one row per share, each with a score, as opposed to one row per user with an aggregate score?

Correct, one row per share.

Quote from: Inaba
Is each share worth a certain amount regardless of who submitted it, or does the value of a share differ depending on who submitted it?

Yes: the value of a share depends only on when it was submitted, not on who submitted it. The total payout of a worker is just the sum of the payouts for all of his shares.
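A sketch of that aggregation (hypothetical names: $shares holds one row per share with its worker and lscore, $max and $totscore are as in the earlier sketch, and $pot is a placeholder for whatever total the method actually distributes to miners, which depends on the fee handling):

Code:
<?php
// Sketch, assumed names: a share's value depends only on its lscore
// (i.e., on when it was submitted), never on who submitted it; a worker's
// payout is the plain sum of the values of his own shares.
function payouts_by_worker(array $shares, $max, $totscore, $pot)
{
    $payouts = array();
    foreach ($shares as $s) { // $s = array('worker' => ..., 'lscore' => ...)
        $value = $pot * exp($s['lscore'] - $max) / $totscore;
        $w = $s['worker'];
        $payouts[$w] = isset($payouts[$w]) ? $payouts[$w] + $value : $value;
    }
    return $payouts;
}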
Inaba
Legendary
Offline
Activity: 1260
Merit: 1000
June 20, 2011, 06:11:03 AM
Thank you Meni, I think I understand now. I will probably have some more questions tomorrow.
If you're searching these lines for a point, you've probably missed it. There was never anything there in the first place.
btcmonkey
Newbie
Offline
Activity: 20
Merit: 0
July 29, 2011, 01:41:56 PM
This is a really interesting score implementation. I have a few questions.
I have not had much success getting your proof to render. Can you describe how you came up with the decay factor, r? Isn't it really a growth factor for any reasonable c?
Is it true that for a very low number of shares (< 1000) at the current difficulty, the total fee gets really large (> 50%) when c = 0.001? My implementation seems to show this. Does this mean that a really lucky block find would be bad news for pool members, or is my implementation flawed?
Expanding on this, what impact would having the score start at some high arbitrary number (e.g. r^10000) instead of 1 have? It seems it could enable setting a maximum on the fee that would be taken, but I'm not sure how doing this would affect the cheat-proofness of the system and the expected fee calculations.
For difficulty-2 and difficulty-3 shares, is p simply 2/difficulty and 3/difficulty respectively?
Meni Rosenfeld (OP)
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
July 29, 2011, 04:08:35 PM Last edit: July 30, 2011, 07:52:07 PM by Meni Rosenfeld
Quote from: btcmonkey
Can you describe how you came up with the decay factor, r? Isn't it really a growth factor for any reasonable c?

It's growth in the score of new shares, but decay in the value of old shares. It's the unique value that makes the sums come out right. In fact, you could choose r first and then find c, the average score fee, in terms of r.

Quote from: btcmonkey
Is it true that for a very low number of shares (< 1000) at the current difficulty, the total fee gets really large (> 50%) when c = 0.001? Does this mean that a really lucky block find would be bad news for pool members, or is my implementation flawed?

Yes, the fee is large for short rounds. This is because there aren't many participants to receive a reward; otherwise, early miners would get a disproportionate reward.

Quote from: btcmonkey
What impact would having the score start at some high arbitrary number (e.g. r^10000) instead of 1 have?

If you do this and keep the score fee as stated, it will be like decreasing the score fee, which means the method is no longer hopping-proof.

Quote from: btcmonkey
For difficulty-2 and difficulty-3 shares, is p simply 2/difficulty and 3/difficulty respectively?

Yes. All in all, the method was designed so that everything is 100% accurate in expectation, though this means relatively high variance and some counterintuitive situations.
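For completeness, that last point as a snippet (hypothetical variable names and a placeholder difficulty):

Code:
<?php
// A share of difficulty $shareDiff counts as a block solution with
// probability $shareDiff / $netDiff, so difficulty-2 and difficulty-3
// shares give p = 2/difficulty and 3/difficulty respectively.
$netDiff   = 1000000.0; // placeholder network difficulty
$shareDiff = 2.0;       // e.g. a difficulty-2 share
$p = $shareDiff / $netDiff;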