fcmatt
Legendary
Offline
Activity: 1106


August 22, 2011, 02:08:36 AM 

Oh god. Now solidcoins. What's next ??
I am not sure there will be any next ones. After three different scam artists have come out with new clones of Bitcoin, people are going to run out of steam for them. There are only so many greater fools in the world with BTC they want to trade for a forked clone's coins. I expect the idea will settle down for a while as these fade away into obscurity. But on the topic of this pool mining BTC: the pool is on fire. Great times.










eleuthria
Legendary
Offline
Activity: 1764
BTC Guild Owner


August 23, 2011, 04:43:45 AM 

Really nothing I can do at this point. You can't disprove a negative: you can prove someone stole, but you can't prove someone didn't steal. At this point I can either sit here arguing or ignore it, and I'm choosing the latter.

The last time I responded, I gave our stats over the last few difficulties that I had recorded and display. Using Vladimir's method to calculate the probability with a Poisson distribution calculator, it showed about a 20% chance of our luck over the recorded difficulties. I didn't go further back because I did not track difficulties/luck at previous levels, so I would have had to hunt down the proper block numbers and calculate our luck one at a time, at a moment when I was trying to respond to the witch hunt as quickly as possible.

All I know is there were a LOT of software problems between June and July, which was when we were massively expanding (peaking at 8 servers). There were times when servers ran with duplicate wallet files (meaning they were possibly hashing the same headers and duplicating work). There were times when the pools were going up and down regularly, which I believe caused them to resend the same getwork space a few times, thus awarding shares for duplicate work due to the crashes. And there were the JoelKatz patches for scaling the servers, which had a number of issues before becoming stable.

At this point I'll just let http://l0ss.net do the talking; it is monitoring the luck for 6 of the largest pools, now with nearly 1 month of data included. We swing up, we swing down, and I have no desire to make my life a living hell by refreshing our block page every 15 minutes to see which way we're swinging.

EDIT: I can definitely say his method of adding shares from invalids to other blocks skews results further in his favor, since our invalid rate in that time was much higher due to server attacks, botnets making our connections unreliable, and the problems implementing JoelKatz's patches.

Still looks bad, but as I said at the start: it's not possible to disprove the assertion.
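For reference, the Poisson-calculator check described above can be sketched in a few lines of Python. The numbers here are hypothetical placeholders rather than the pool's actual figures: with an expected 100 blocks and only 91 found, the left-tail probability lands near the 20% figure mentioned.

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), built up term by term."""
    term = math.exp(-lam)      # P(X = 0)
    total = term
    for i in range(1, k + 1):
        term *= lam / i        # P(X = i) from P(X = i - 1)
        total += term
    return total

# Hypothetical example: 100 blocks expected over some window, 91 found.
expected_blocks = 100
found_blocks = 91
p = poisson_cdf(found_blocks, expected_blocks)
print(round(p, 3))   # roughly 0.2: unlucky, but nothing remarkable
```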




kano
Legendary
Online
Activity: 2100
Linux since 1997 RedHat 4


August 23, 2011, 05:48:17 AM 

Well firstly, I went and visited that link and had a bit to say about the comment about hopping. But I will also add this: MINING IS STATISTICAL. Many people REALLY do not understand this, nor what it means. It means that percentages will go up and down.

I could point out that for most of the past week I have been mining here almost continuously (I did a day or so of I0 mining) and been paid MORE than the expected BTC (anyone can work out their own expected BTC using my calculator in my sig: http://tradebtc.net/bitcalc.php ). So should someone now be saying "OMG, I'm ripping off BTC Guild"? I don't hop because I know it makes no difference (as I said, I've been mining here for most of a week), and unless someone has come up with some bizarre mathematics for another way to beat a share-percentage-paying pool, the reason I'm getting more than my expected BTC is: probability, of course.

Yes, BTC Guild could be putting in fake members and stealing shares that way, but again it all boils down to percentages, and they already charge a percentage (and miners lose a bit on invalid shares), but otherwise there really is no reason to even consider the rest of the scams listed on that page for a big pool like BTC Guild.

OK, that's just my 2c worth, but, well, the mining community is seriously full of superstition and I think a level-minded comment is needed.
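The expected-BTC arithmetic kano refers to is straightforward. This is a generic sketch of the standard formula, not the code behind his calculator; the hash rate, difficulty, and 50 BTC reward are example inputs:

```python
def expected_btc(hashrate_hps, seconds, difficulty, block_reward=50.0):
    """Expected BTC for a miner's fair share of the network:
    expected blocks times the reward per block. A block takes
    difficulty * 2**32 hashes on average."""
    expected_blocks = hashrate_hps * seconds / (difficulty * 2**32)
    return expected_blocks * block_reward

# Example: 1 GH/s for one day at difficulty 1,805,700 (an August 2011 level)
print(round(expected_btc(1e9, 86400, 1805700), 3))   # ~0.557 BTC
```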

Pool: https://kano.is | Here on Bitcointalk: Forum | BTC: 1KanoPb8cKYqNrswjaA8cRDk4FAS9eDMLU | FreeNode IRC: irc.freenode.net channel #kano.is
Help keep Bitcoin secure by mining on pools with full block verification on all blocks and NO empty blocks!



kjj
Legendary
Offline
Activity: 1302


August 23, 2011, 06:28:17 AM 

I would like to add that statistics is the most difficult branch of mathematics that most people are even aware of. In general, if you can follow a statistical argument, it is almost certainly wrong.
His analysis probably isn't very wrong, other than the way he stacked all of the gray areas against the pool, but there are some questionable points.
For example, finding a block isn't really a function of the number of shares found. Shares and blocks are just two different thresholds on the same probability distribution. And hashes aren't counted, they are estimated. I don't know that it is necessarily wrong to use a probability distribution as the interval input to the Poisson function, but I can promise that if you get a number out of it, and not another probability distribution, you've missed some important steps.
At any rate, he is talking about a 5.6 to 6.5% shortfall. That's not nearly enough for me to worry about.
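kjj's point that shares and blocks are just two thresholds on the same distribution can be illustrated with a toy simulation. The 32-bit hash space and the difficulty figures here are illustrative only (Bitcoin uses 256-bit hashes):

```python
import random

random.seed(42)

MAX_HASH = 2**32                 # toy hash space
share_target = MAX_HASH // 16    # a hash below this counts as a share
block_difficulty = 16            # blocks are 16x harder than shares here
block_target = share_target // block_difficulty

shares = blocks = 0
for _ in range(100_000):
    h = random.randrange(MAX_HASH)   # each hash attempt is one draw
    if h < share_target:
        shares += 1                  # crossed the easy threshold
        if h < block_target:
            blocks += 1              # the SAME draw crossed the hard one

# Every block is also a share; roughly 1 in 16 shares is a block.
print(shares, blocks)
```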




hugolp


August 23, 2011, 08:11:47 AM 

OK, that's just my 2c worth, but, well, the mining community is seriously full of superstition and I think a level-minded comment is needed.
Judging by the reaction, I'd say the mining community is very mature.




topheroly
Newbie
Offline
Activity: 21


August 23, 2011, 08:39:27 AM 

I was asked to comment on this issue. I'm providing a claim that Vlad's analysis is incorrect. Since many of the people reading this thread are statistical laymen, I'm going to walk through this step by step.

First off, approximating the distribution with one huge encompassing Poisson distribution is not the best choice in this scenario. A much better choice would be to use the central limit theorem to approximate this binomial density, separated by difficulty. The Poisson is only valid under certain criteria, while the central limit theorem is pretty much good whenever n is large. I will show my work for one iteration of this procedure and produce the results for the rest. Taking difficulty 434877 as the example:

// Notes: the values below are estimates of the true values (taken from Vlad's sheet)
// n = sample size
// p = estimated probability of success
// p = x/n, where x is the number of successes (estimated blocks found)

Y ~ Binomial(n, p)
n = 2016
p = 135.705/2016 = 0.0673
Y ~ Binomial(2016, 0.0673)

From the central limit theorem, we know that a binomial of sufficient sample size approximately follows a normal with mean n*p and variance n*p*q (where q = 1 - p). Therefore our distribution becomes:

Y ≈ Normal(n*p, n*p*q)
Y ≈ Normal(135.677, 126.546)

To calculate the probability of getting at most the observed Y (actual blocks found), all that's left is to convert our normal distribution to a standard normal and look up the p-value:

P(Y ≤ 134) = P(Z ≤ (134 - 135.677)/sqrt(126.546)) = P(Z ≤ -0.149) = 0.440

What does this value mean? It means we are 44% likely to see a value this extreme or more at this difficulty (434877), which is completely acceptable. One thing to take with a grain of salt: we used a single estimate for p, when in fact p changes quite a lot within each difficulty period as hashing power changes.

I've also made a note on the spreadsheet for occurrences that might indicate something odd happening, explained by DDoS attacks or other systematic errors fixed by patches later on. If the original value Vlad stated were true, I would be concerned. Thankfully this is not the case, and I hope everyone can see the sense and reasoning posted here. The remaining p-values are below for convenience, and the sheet I used to calculate them is here: https://spreadsheets.google.com/spreadsheet/ccc?key=0AoAyWRmssbLKdHduLURqdENHckw0SzRNX3JhN3ZKV2c&hl=en_US

Difficulty    p-value  Verdict
434877.04     0.439    Nothing wrong at all
567269.53     0.000    Check for DDoS/other systematic errors
876954.49     0.449    Nothing wrong at all
1379192.28    0.341    Nothing wrong at all
1563027.99    0.040    Statistical anomaly (4% chance?)
1690895.8     0.331    Nothing wrong at all
1888786.7     0.001    Check for DDoS/other systematic errors
1805700.83    0.720    Nothing wrong at all
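The arithmetic in the worked example above can be reproduced with a short script. This is a sketch of the same normal approximation; `math.erf` supplies the standard normal CDF:

```python
import math

def normal_cdf(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n = 2016
x = 135.705          # estimated blocks expected (from Vlad's sheet)
p = x / n            # estimated probability of success
mean = n * p
var = n * p * (1 - p)

observed = 134       # actual blocks found at difficulty 434877
z = (observed - mean) / math.sqrt(var)
p_value = normal_cdf(z)
print(round(p_value, 3))   # ~0.44, matching the 0.440 above
```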




sirky


August 23, 2011, 12:58:27 PM 

[quoting topheroly's analysis above]
Before I say this, I just want to say that I completely trust eleuthria, and I do not think that he is gaming the pool. I have chatted with him many times on IRC, and though I don't mine here anymore, it is only because the pool is proportional.

Anyway, I am not sure I follow the logic of breaking it out by individual difficulty. If there was theft, it would show up as everything being slightly lower, and only when you add them up do you get something significant. I added everything back together and get a 0.02% chance it was caused by luck. This really is the best test to use.

I do believe that this was caused by technical issues coupled with legitimate bad luck, and not theft. As an aside, this only covers hiding blocks found by the pool's bitcoind. It won't detect fake workers.
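sirky's "add everything back together" approach amounts to one pooled z-test across all difficulty periods instead of one test per period. A sketch, where the (n, p, observed) triples are made-up illustrative data, not the pool's real numbers:

```python
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def pooled_pvalue(rounds):
    """rounds: list of (n, p, observed) per difficulty period.
    Sums the means and variances, then runs one combined z-test."""
    mean = sum(n * p for n, p, _ in rounds)
    var = sum(n * p * (1 - p) for n, p, _ in rounds)
    observed = sum(obs for _, _, obs in rounds)
    z = (observed - mean) / math.sqrt(var)
    return normal_cdf(z)

# Hypothetical: eight periods, each expecting ~135 blocks but finding 129.
# No single period looks damning, yet pooled together they look unlucky.
rounds = [(2016, 0.067, 129)] * 8
print(round(pooled_pvalue(rounds), 3))   # ~0.06 pooled
```

A consistent small shortfall that each per-period test shrugs off can still be significant in aggregate, which is exactly sirky's argument.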




kernelpanic
Jr. Member
Offline
Activity: 51


August 23, 2011, 02:07:00 PM 

My btcguild Account page shows two blocks found, but the API page shows 0 blocks found for all workers. I have not added/edited workers during the time when the most recent block was found (not 100% sure about the 1st block), so it shouldn't be because the worker that found it no longer exists. Any ideas?




eleuthria
Legendary
Offline
Activity: 1764
BTC Guild Owner


August 23, 2011, 09:26:26 PM 

My btcguild Account page shows two blocks found, but the API page shows 0 blocks found for all workers. I have not added/edited workers during the time when the most recent block was found (not 100% sure about the 1st block), so it shouldn't be because the worker that found it no longer exists. Any ideas?
This is something I will be fixing soon. The way block finders are recorded was drastically changed from the original code back when we were in our first 200 rounds, due to splitting up the servers. The API still tries to pull that information from old data.




Okama
Newbie
Offline
Activity: 23


August 24, 2011, 12:33:10 AM 

According to block explorer, block 142300 was found at 20:49:38 but btcguild claimed to find this one at 19:18:15. Is there any problem with your server?




eleuthria
Legendary
Offline
Activity: 1764
BTC Guild Owner


August 24, 2011, 01:50:36 PM 

According to block explorer, block 142300 was found at 20:49:38 but btcguild claimed to find this one at 19:18:15. Is there any problem with your server?
Sorry, I thought I posted this on the forums as well, but apparently I only talked about it on IRC (it occurred while I was at work). There was a small problem due to closing I0 Guild.

The code is literally a copy of BTC Guild, with an added flag at run time which tells it not to sync round data with the master server. When I shut down I0 Guild, I realized that the payouts would likely not make it to the exchange in time, due to what looks like a bug that will completely destroy the I0Coin fork until it's patched and everybody updates. So I turned the pool server back on to help push the coins to the exchange for the last payouts. The no-sync flag was not included, so when I0 Guild found an I0Coin block, it pushed a new round to the master server, which caused some of round 2547's shares to be logged under round 2548. I caught it at 2549.

The total time shift was about 1 hour 30 minutes, hence the time differences on 2547/2548 vs. Block Explorer. At that point 2547 had already been calculated. I made the decision to leave the shares alone, rather than risk duplicating or not counting some shares. The effect on payouts would have been negligible, and altering rewards which already showed up as confirmed is something I would only do if there was a major problem.

I've gone in and corrected the end/start times of those rounds to keep them in line with the actual times per Block Explorer. It makes the estimated hash rates for those blocks a bit off, but it keeps our database in line with actual block data, which is more important than things like estimated hash rates.

The problem that happened at I0 Guild can't be repeated on SC Guild, since it is on a different physical server that cannot access the MySQL servers on BTC Guild. Future forks, if they arise, will also be launched on non-BTC Guild servers, both as a precaution and to avoid putting unrelated load on the BTC Guild servers.




PcChip


August 24, 2011, 10:47:13 PM 

"When I shut down I0 Guild, I realized that the payouts would likely not make it to the exchange in time, due to what looks like a bug that will completely destroy the I0Coin fork until it's patched and everybody updates."
If you're not busy, I'm a bit curious to hear about that.
Also, I had 0.5 i0coins in my account and could never get them out (even after I saw the warning), since it was below the payout threshold of 1.0. (Not that half an i0coin is worth anything anyway.)
Keep up the excellent work good sir.

Legacy signature from 2011: All rates with Phoenix 1.50 / PhatK: 5850 - 400 MH/s | 5850 - 355 MH/s | 5830 - 310 MH/s | GTX570 - 115 MH/s | 5770 - 210 MH/s | 5770 - 200 MH/s



eleuthria
Legendary
Offline
Activity: 1764
BTC Guild Owner


August 25, 2011, 01:34:32 AM 

"When I shut down I0 Guild, I realized that the payouts would likely not make it to the exchange in time, due to what looks like a bug that will completely destroy the I0Coin fork until it's patched and everybody updates."
If you're not busy, I'm a bit curious to hear about that.
Also, I had 0.5 i0coins in my account and could never get them out (even after I saw the warning), since it was below the payout threshold of 1.0. (Not that half an i0coin is worth anything anyway.)
Keep up the excellent work good sir.
My understanding is paraphrased from what I was told by ArtForz, so if I butcher the explanation I hope somebody smarter than me will correct it.

I0Coin put in a time-based difficulty adjustment of 1 week, so that if the network died as much as it has, the difficulty would drop without waiting for a ton of blocks that might never get made. This check SHOULD be run off the timestamp of the most recent block, so that individual system clock variances don't affect how a node determines what the difficulty should drop to. This wasn't done.

If you turn on a fresh node today, it tries to adjust the difficulty before the chain is even accepted, refusing the valid block chain generated thus far. Once 7 days have elapsed since the previous difficulty increase (which was from blocks, not time), the nodes that were already up and running with the block chain will likely have variances in the difficulty calculation, because they may not be on and checking the NTP server at the exact time the change occurs. This means all the i0coind nodes out there will have variances in what they think the valid difficulty is, refusing blocks from any node that does not agree.
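The clock-vs-chain-timestamp distinction can be sketched as follows. This is hypothetical simplified logic illustrating the described bug, not actual i0coind code:

```python
RETARGET_WINDOW = 7 * 24 * 3600   # one week, per the rule described above

def emergency_drop_buggy(last_block_time, local_clock):
    # As described: each node compares against its OWN clock, so nodes
    # with different clocks disagree about whether the drop triggered.
    return local_clock - last_block_time > RETARGET_WINDOW

def emergency_drop_fixed(last_block_time, new_block_time):
    # Based only on chain timestamps: every node sees the same data,
    # so every node reaches the same verdict.
    return new_block_time - last_block_time > RETARGET_WINDOW

# Two nodes whose clocks differ by a minute disagree near the boundary:
t = RETARGET_WINDOW   # nominal seconds since the last block
print(emergency_drop_buggy(0, t - 30), emergency_drop_buggy(0, t + 30))
```

The buggy form makes consensus depend on each node's wall clock; the fixed form is deterministic given the chain alone, which is why every node agrees.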




eleuthria
Legendary
Offline
Activity: 1764
BTC Guild Owner


August 25, 2011, 01:38:54 AM 

BTC Guild will likely be offering a PPS option in the near future for those who do not like variance, similar to Deepbit. I have not decided if PPS will be included on the primary BTC page, or if I want to handle it on a forked page (like pps.btcguild.com). It will be TRUE PPS, not SMPPS, so the pool's luck will never affect the rates or cause backpay situations. All donator perks on BTC Guild will be automatically granted to PPS users.
The fee percentage hasn't been decided yet. I'm going to run the numbers on how low the pool's luck has been since inception, as a percentage of expectation (including invalid blocks), to determine how high my risk is given our relatively large historical sample size. Just like Deepbit, over the long run you'll make more with standard proportional, but PPS is a nice insurance level where you have guaranteed revenue over any period of time, rather than "close to average" revenue over a sufficiently long period.
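For anyone curious about the arithmetic behind a true PPS rate: each share's expected value is the block reward divided by difficulty, minus the operator's fee. A generic sketch, where the 5% fee and the difficulty are example numbers, not BTC Guild's announced terms:

```python
def pps_per_share(difficulty, block_reward=50.0, fee=0.05):
    """True PPS: a difficulty-1 share has a 1/difficulty chance of
    being a block, so its expected value is block_reward/difficulty.
    The operator keeps `fee` to absorb the pool's luck variance."""
    return block_reward / difficulty * (1 - fee)

print(pps_per_share(1805700))   # ~2.6e-05 BTC per share
```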




Kermee


August 25, 2011, 02:02:03 AM 

PPS is 10% on DeepBit... I have a feeling you're going to end up with the same percentage.
Cheers, Kermee




eleuthria
Legendary
Offline
Activity: 1764
BTC Guild Owner


August 25, 2011, 02:30:32 AM 

PPS is 10% on DeepBit... I have a feeling you're going to end up with the same percentage.
Cheers, Kermee
If I offer PPS, it will be at a lower rate than DeepBit, since my base proportional rate is 0% (2.5% for comparable DeepBit features/invalid insurance). Tycho's PPS rate is set 7% higher than his base rate, which seems to be about where our luck has been since inception (if I use Vladimir's numbers, I _THINK_ we're about 6.5% behind expected total BTC production measuring June 1st to August 21st, although that discounts a high-luck day and assumes invalid blocks should be 0% rather than ~1%). Ideally, in the long run luck zeroes in on 0% as the time frame grows, so I may offer it even lower and just eat the loss, assuming BTC Guild stays alive long enough to get back to the 0% average.




krzynek1
Jr. Member
Offline
Activity: 41


August 25, 2011, 05:49:34 AM 

Why does mining SolidCoins on BTC Guild produce so many stale shares? Up to 10%!!




Okama
Newbie
Offline
Activity: 23


August 25, 2011, 07:15:40 AM 

There is a problem with confirmed rewards in SolidCoin mining.




kjj
Legendary
Offline
Activity: 1302


August 25, 2011, 08:05:22 PM 

finding a block isn't really a function of the number of shares found
Oh really? What is it a function of then? Number of shares AND Bitcoin network difficulty, I hope? Here is some homework for you: figure out, using some incomprehensibly correct statistic with page-long formulas, the odds of tossing a fair coin 4865 times and getting at most 2275 tails. Now please tell me how accepting D shares and solving a block (or not solving it) is different from tossing a fair coin, where D is difficulty.
The network is not flipping coins over and over again. Let me say it again, this time more clearly: there is no fucking coin. The network is generating what we hope is a more or less random number, trillions of times per second. Can you really not tell the difference between quintillions of hashes vs. thousands of coin flips? Pardon me for not being startled by the difference between 2,298/10,000,000,000,000,000,000 vs. 2,432/10,000,000,000,000,000,000.
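For what it's worth, the "homework" number quoted above is easy to compute exactly in Python. Big-integer arithmetic makes the binomial sum exact (`math.comb` needs Python 3.8+):

```python
import math

n, k_max = 4865, 2275   # 4865 fair-coin flips, at most 2275 tails

# Exact tail: sum P(X = 0..2275) for X ~ Binomial(4865, 1/2)
tail = sum(math.comb(n, k) for k in range(k_max + 1))
p_exact = tail / 2**n

# Normal approximation with continuity correction, for comparison
z = (k_max + 0.5 - n / 2) / math.sqrt(n / 4)
p_approx = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(p_exact, p_approx)   # both on the order of a few in a million
```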




