flound1129 (OP)
May 04, 2013, 03:53:46 AM
Just found our 5th block!
That's 1000 FTC generated so far, good work pool!
Multipool - Always mine the most profitable coin - Scrypt, X11 or SHA-256!
flound1129 (OP)
May 04, 2013, 04:45:27 AM
You may have just noticed that your FTC unconfirmed balance changed for the last block.
This is due to a flaw in the payout calculation that was identified by user UNOE.
The calculation was using too few shares to determine the block payout, because of a change I made to the algorithm during the wild difficulty swings of TRC. It is working properly now.
My apologies; the estimates and unconfirmed rewards should be accurate going forward.
SuperTramp
Legendary
Offline
Activity: 1073
Merit: 1000
May 04, 2013, 09:31:54 AM
Congrats flound1129! Your MNC pool has qualified you for a 1k MNC bounty! Please take a look at the first page of the MNC release thread: https://bitcointalk.org/index.php?topic=165397.0
When you get a chance, please post in that thread with your MNC address and I will get your bounty out to you ASAP! Thanks for your support!
chr0me
Member
Offline
Activity: 108
Merit: 10
May 04, 2013, 02:21:01 PM
When will FRC/MNC stats show?
Also, a manual diff setting would be cool.
- I do translations! PM me if interested - 19sfw2W3dAFrzerkxLS26aL4HuYGpN1LNP
flound1129 (OP)
May 05, 2013, 05:19:46 AM
When will FRC/MNC stats show?
Also, a manual diff setting would be cool.
Still on my to-do list. MNC stats are working fine. For FRC we haven't found any blocks yet, and I need to update the block reward calculation before estimates will show.
kslavik
Sr. Member
Offline
Activity: 441
Merit: 250
GET IN - Smart Ticket Protocol - Live in market!
May 05, 2013, 04:47:17 PM
What kind of share counts do you display in the "Your Stats" table? Those seem to be the round shares submitted by the user. Do they take stratum difficulty into account?
I have my cgminer operating at difficulty 5 and it reports the following stats:
Queued work requests: 150, Share submissions: 762, Accepted shares: 735, Rejected shares: 27, Accepted difficulty shares: 3164, Rejected difficulty shares: 123, Reject ratio: 3.5%
Meanwhile I see ~770 shares reported by your site (I'm mining FRC now).
I'm assuming that the 3164 shares reported by cgminer are my shares normalized to difficulty 1. If so, it doesn't make sense to display "Share submissions" in "Your Stats" or to use them in reward calculations, because those shares could be submitted at different difficulty levels, so they cannot be added together without accounting for the difficulty at which they were submitted.
What about Pool Stats? What kind of shares do you display there? Are those difficulty-1 shares?
Am I missing something?
Thank you
limingz
May 05, 2013, 04:55:16 PM
Why can I post this topic but others
flound1129 (OP)
May 05, 2013, 06:26:46 PM
What kind of share counts do you display in the "Your Stats" table? Those seem to be the round shares submitted by the user. Do they take stratum difficulty into account?
I have my cgminer operating at difficulty 5 and it reports the following stats:
Queued work requests: 150, Share submissions: 762, Accepted shares: 735, Rejected shares: 27, Accepted difficulty shares: 3164, Rejected difficulty shares: 123, Reject ratio: 3.5%
Meanwhile I see ~770 shares reported by your site (I'm mining FRC now).
I'm assuming that the 3164 shares reported by cgminer are my shares normalized to difficulty 1. If so, it doesn't make sense to display "Share submissions" in "Your Stats" or to use them in reward calculations, because those shares could be submitted at different difficulty levels, so they cannot be added together without accounting for the difficulty at which they were submitted.
What about Pool Stats? What kind of shares do you display there? Are those difficulty-1 shares?
Am I missing something?
Thank you
Currently, share difficulty is not taken into account for display purposes. It's one of the reasons I dropped share counts from the pool stats pages: they didn't make any sense. It is, however, taken into account for payments. Your shares are paid based on their average difficulty vs. the pool's average difficulty (a weighted average). It's just a huge pain in the ass (not to mention very expensive, from a disk and CPU perspective) to do these calculations on every page refresh. The site is already slow enough.
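Roughly, the difficulty-weighted payment described above amounts to splitting each block reward in proportion to difficulty-weighted shares. A minimal sketch (hypothetical names, not the pool's actual code):

```python
# Hypothetical sketch: pay a block reward proportionally to
# difficulty-weighted shares rather than raw share counts.

def payout(shares, block_reward):
    """shares: list of (user, difficulty) tuples for the round."""
    pool_weight = sum(diff for _, diff in shares)
    rewards = {}
    for user, diff in shares:
        rewards[user] = rewards.get(user, 0.0) + diff / pool_weight * block_reward
    return rewards

# A miner submitting fewer but harder shares earns the same as one
# submitting more, easier shares of equal total difficulty.
rewards = payout([("alice", 5), ("alice", 5), ("bob", 1)] + [("carol", 1)] * 9, 200.0)
```

Here alice's two difficulty-5 shares carry the same weight as ten difficulty-1 shares, which is the point of the weighting.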
flound1129 (OP)
May 05, 2013, 06:27:09 PM
Why can I post this topic but others
No idea.
flound1129 (OP)
May 05, 2013, 06:28:09 PM
I will be upgrading the underlying disk system on the DB server later tonight. This should result in a decent speed increase for the website. There may be some short downtimes on the pools (MySQL restarts, etc.), but nothing severe.
kslavik
Sr. Member
Offline
Activity: 441
Merit: 250
GET IN - Smart Ticket Protocol - Live in market!
May 05, 2013, 07:58:15 PM
Calculating averages across difficulty levels doesn't seem like an accurate, fast, or efficient way of running pool mining.
I think the simpler and more accurate approach is the following:
Upon every share submission, normalize it to difficulty 1 and increment the pool count and the user's round count by that normalized difficulty-1 equivalent.
So if my miner is operating at difficulty 2, record two shares for every difficulty-2 submission and ignore all shares below difficulty 2; if my miner is operating at difficulty 3, record three shares per submission and ignore all shares below difficulty 3, and so on.
I'm pretty sure this is how other pools do it.
To display results, you would then only need a simple read from the database for the pool's round shares and the user's round shares.
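As a toy illustration of this scheme (in-memory counters standing in for the pool database; all names are hypothetical):

```python
# Illustrative sketch of the proposed scheme: on each accepted share,
# credit the submitter with difficulty-1 share equivalents so that
# round totals can be read back as plain sums, with no averaging.

from collections import defaultdict

pool_round_shares = 0
user_round_shares = defaultdict(int)

def record_share(user, share_difficulty, miner_difficulty):
    global pool_round_shares
    # Ignore shares below the difficulty assigned to the miner.
    if share_difficulty < miner_difficulty:
        return False
    credit = miner_difficulty  # normalized difficulty-1 equivalents
    pool_round_shares += credit
    user_round_shares[user] += credit
    return True

record_share("alice", 2, 2)   # credited as 2 difficulty-1 shares
record_share("alice", 1, 2)   # below assigned difficulty: ignored
record_share("bob", 3, 3)     # credited as 3 difficulty-1 shares
```

Displaying stats then reduces to reading back `pool_round_shares` and `user_round_shares`, which is the simplification being argued for.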
flound1129 (OP)
May 05, 2013, 11:42:50 PM
So if my miner is operating at difficulty 2, record two shares for every difficulty-2 submission and ignore all shares below difficulty 2; if my miner is operating at difficulty 3, record three shares per submission and ignore all shares below difficulty 3, and so on.
There is no mathematical difference between what you are describing and calculating a weighted average. What you're proposing would also increase the number of rows recorded in the DB by a factor of 10-20 or more in some cases, which would cause a fairly large performance hit. Where it would help is in calculating the last N shares a bit more accurately, since currently we're using more than we should. But for people who mine the whole round, I doubt the difference is more than 1%.
I'm pretty sure this is how other pools do it.
That's not the way it's done by any pool running stock stratum, because that's not how stratum does it. Currently, stratum simply inserts the share difficulty into a column called "difficulty" with every share submitted. Stratum is open source, so if you know Python and are inclined to implement a better method/algorithm, I'm sure a lot of people would be happy to use it. Given the number of clueless pool operators out there, I bet some are even running vardiff without an extended DB (and hence not even recording the difficulty of submitted shares).
kslavik
Sr. Member
Offline
Activity: 441
Merit: 250
GET IN - Smart Ticket Protocol - Live in market!
May 06, 2013, 12:39:42 AM
I'm not suggesting you would make 10 inserts for every share of difficulty 10, you would still insert 1 record for every submitted share as you are probably doing right now. I'm only suggesting you would also insert a weight with each submitted share proportional to the difficulty, so all the calculations are done based on this weight and not on # of submitted shares. No need to calculate averages if you can calculate sums.
I can also think of ways of only keeping 1 record per user per round/shift without calculating averages.
flound1129 (OP)
May 06, 2013, 02:19:00 AM
I'm not suggesting you would make 10 inserts for every share of difficulty 10, you would still insert 1 record for every submitted share as you are probably doing right now. I'm only suggesting you would also insert a weight with each submitted share proportional to the difficulty, so all the calculations are done based on this weight and not on # of submitted shares. No need to calculate averages if you can calculate sums.
I can also think of ways of only keeping 1 record per user per round/shift without calculating averages.
The share difficulty is already a weight. I still don't understand why you would add another useless column in the DB when you can just calculate it during scoring. As I said earlier, the challenge is determining last # of shares for PPLNS scoring when you have a bunch of shares with different weights. If anyone has either a SQL query or an algorithm written in PHP for this I'd be glad to get a copy.
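For illustration, the variable-weight PPLNS window described above can be sketched in plain Python (the request was for SQL or PHP; this is just the algorithm, with hypothetical names): walk the share log newest-first, summing difficulty until N difficulty-1 equivalents are covered.

```python
# Sketch of PPLNS scoring over variable-difficulty shares: walk shares
# newest-first, summing difficulty until N difficulty-1 equivalents are
# covered, then score each user by their weight inside that window.
# (Field names and structure are hypothetical.)

def pplns_weights(shares, n):
    """shares: list of (user, difficulty), oldest first."""
    weights, remaining = {}, n
    for user, diff in reversed(shares):
        if remaining <= 0:
            break
        counted = min(diff, remaining)  # partial credit at the window boundary
        weights[user] = weights.get(user, 0) + counted
        remaining -= counted
    return weights

w = pplns_weights([("alice", 4), ("bob", 2), ("alice", 1), ("bob", 3)], 5)
```

The boundary share is only partially counted, which is one reasonable way to handle a window edge that falls mid-share; dropping or fully counting it are the other options.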
mrvegad
May 06, 2013, 02:29:27 AM
Hi flound, have you had time to look into the issues I emailed you about? (mrvegas)
Thank you
kslavik
Sr. Member
Offline
Activity: 441
Merit: 250
GET IN - Smart Ticket Protocol - Live in market!
May 06, 2013, 03:07:01 AM
The share difficulty is already a weight. I still don't understand why you would add another useless column in the DB when you can just calculate it during scoring.
As I said earlier, the challenge is determining last # of shares for PPLNS scoring when you have a bunch of shares with different weights. If anyone has either a SQL query or an algorithm written in PHP for this I'd be glad to get a copy.
I'm assuming you have a table with a structure like this:

tbl_SubmittedShares(int Id, int AccountId, int Difficulty, date SubmittedOn)

To get the Id of the cut-off record for the last 500000 difficulty-1 shares, run this query:

select top 1 p1.Id, sum(p2.Difficulty)
from tbl_SubmittedShares p1, tbl_SubmittedShares p2
where p2.Id >= p1.Id
group by p1.Id
having sum(p2.Difficulty) >= 500000
order by p1.Id desc

Once you have the cut-off Id from that query, it is easy to calculate how many difficulty-1 shares a given user has within the last 500000 shares:

select sum(Difficulty)
from tbl_SubmittedShares
where AccountId = @userId and Id >= @cutOffId
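A note on portability: `top 1` is SQL Server syntax; on MySQL (which this pool runs) the equivalent is `ORDER BY ... LIMIT 1`. Below is a self-contained sanity check of the same cutoff-id idea, using SQLite purely so it runs anywhere (table and column names follow the hypothetical schema above):

```python
# Sanity check of the cutoff-id query on a tiny in-memory share log.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE tbl_SubmittedShares (Id INTEGER PRIMARY KEY, AccountId INT, Difficulty INT)")
conn.executemany(
    "INSERT INTO tbl_SubmittedShares VALUES (?, ?, ?)",
    [(1, 1, 4), (2, 2, 2), (3, 1, 1), (4, 2, 3)])

# Highest Id such that shares from it onward total at least N difficulty-1 shares.
cutoff = conn.execute("""
    SELECT p1.Id FROM tbl_SubmittedShares p1
    JOIN tbl_SubmittedShares p2 ON p2.Id >= p1.Id
    GROUP BY p1.Id HAVING SUM(p2.Difficulty) >= ?
    ORDER BY p1.Id DESC LIMIT 1
""", (5,)).fetchone()[0]

# A given user's difficulty-1 shares inside the window.
user_total = conn.execute(
    "SELECT SUM(Difficulty) FROM tbl_SubmittedShares WHERE AccountId = ? AND Id >= ?",
    (2, cutoff)).fetchone()[0]
```

The self-join makes this O(n²) in the window size, so on a real share table it would want an index on Id and probably a running-total column instead.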
flound1129 (OP)
May 06, 2013, 02:27:14 PM
The share difficulty is already a weight. I still don't understand why you would add another useless column in the DB when you can just calculate it during scoring.
As I said earlier, the challenge is determining last # of shares for PPLNS scoring when you have a bunch of shares with different weights. If anyone has either a SQL query or an algorithm written in PHP for this I'd be glad to get a copy.
I'm assuming you have a table with a structure like this:

tbl_SubmittedShares(int Id, int AccountId, int Difficulty, date SubmittedOn)

To get the Id of the cut-off record for the last 500000 difficulty-1 shares, run this query:

select top 1 p1.Id, sum(p2.Difficulty)
from tbl_SubmittedShares p1, tbl_SubmittedShares p2
where p2.Id >= p1.Id
group by p1.Id
having sum(p2.Difficulty) >= 500000
order by p1.Id desc

Once you have the cut-off Id from that query, it is easy to calculate how many difficulty-1 shares a given user has within the last 500000 shares:

select sum(Difficulty)
from tbl_SubmittedShares
where AccountId = @userId and Id >= @cutOffId

Thanks, I'll look into implementing this as soon as I get some time.
flound1129 (OP)
May 06, 2013, 05:30:02 PM
Looks like FTC is dead, folks! We need more LTC miners, tell your friends!
produ1
Newbie
Offline
Activity: 7
Merit: 0
May 06, 2013, 07:54:33 PM
And we need more FRC miners!
xeilpprod
Jama
May 06, 2013, 08:36:03 PM
I have jumped in on TRC with about 1850 MHash.