I'm not getting my payment. I have my threshold @ 6.0 and I have 6.22 confirmed BTC.
I was probably unclear in my previous post, because a few people wrote me PMs about this, too. Because of troubles with bitcoind I had to temporarily stop payouts. I'm working on that right now; it will be fixed in an hour or two.
|
|
|
+1 Adafruit is great; I think ladyada is geeky enough to be interested in bitcoin.
|
|
|
Damn, bitcoind crashed with a deadlock while sending rewards, again. There are many pending payouts but nothing is lost; I'll solve the issue in a few hours.
|
|
|
In general, static, unchanging information should not be sent repeatedly to mining clients with every getwork.
Agreed. Unfortunately, not sending some information in every request leads to refreshing and client-side caching (because that information is not absolutely static; the pool can connect new servers and want to use them in an upcoming upgrade). That's why I don't like either solution (a failover list in every getwork, or a static JSON config). This is why the current HTTP-based "protocol" is screwed. We _could_ implement config updates using the LP channel, but all of this is pretty ugly and complicated. Rather than patching the current "protocol" in all pool servers and all miners, I vote for a discussion about a message-based TCP protocol which can replace all the current workarounds (I see LP as a workaround, too) with some better 'final' solution. We already had a discussion about a TCP protocol with jgarzik and we didn't find agreement on the details (mainly the message structure itself), but I hope that finding a wider compromise (pool operators & miner developers) is the only way to make things better.
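One possible shape for such a message-based protocol — purely a sketch, not the design jgarzik and I discussed — is newline-delimited JSON messages over a single TCP connection, so the pool can push things like a failover list once and update it later, instead of repeating it in every getwork. All message and field names below are assumptions for illustration:

```python
import json

def encode_message(msg_type, payload):
    """Frame a message as one newline-delimited JSON object."""
    return (json.dumps({"type": msg_type, "payload": payload}) + "\n").encode()

def decode_messages(buffer):
    """Split a receive buffer into complete messages plus the leftover tail."""
    parts = buffer.split(b"\n")
    messages = [json.loads(p) for p in parts[:-1] if p]
    return messages, parts[-1]

# The pool could push a config update (e.g. a new failover server list)
# over the open connection, decoupled from work requests entirely.
# Hostnames here are made up.
wire = encode_message("config_update",
                      {"failover": ["pool1.example:8332", "pool2.example:8332"]})
msgs, rest = decode_messages(wire)
print(msgs[0]["type"])   # config_update
```

The point of the framing is that either side can send any message at any time, which is exactly what the HTTP request/response model (and the LP workaround) cannot do cleanly.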
|
|
|
Those miners sure caught a lucky break and a huge payday! Sucks for those with a slower hash rate, though. Or maybe the hash rate didn't matter; just a question of who (by luck of the draw) managed to put in some shares before the round was over?
Pool contribution is still 'fair', even in fast rounds. I mean, everybody has the same chance to hit one of the 55 shares, proportional to their hashrate. With 100 Mhash/s, you have only a 100x higher chance to hit a share than a 1 Mhash/s user. So a fast round does not give any advantage to fast miners... The highest user reward for block 117657 was 2.67 BTC for 3 shares, so rewards were still distributed pretty nicely (nobody hit, for example, 20 BTC for himself).
|
|
|
Tycho, I like the switch feature; I plan something similar for my pool, so a unified API is a good thing (tm). But I don't feel that a list of failover servers is needed in every getwork request. And I think it isn't needed at all (at least for me).
Currently I have implemented a load balancer between getwork nodes and I'm planning to add IP failover of this balancer, so I don't need a server failover feature in mining clients at all...
|
|
|
3073 | 2011-04-10 13:06:20 | 0:00:06 | 55 | none | 117657 | 21 confirmations left
Any reason why the reward column says "none" even though I participated in that round? This is block 117657. What happened there?
"Total shares" is commonly misunderstood. That's the number of all shares of the whole pool, not the user's shares. So 49 BTC was divided between those miners who submitted at least one of those 55 total shares.
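The proportional split itself is simple arithmetic. A minimal sketch using the 55 total shares and 49 BTC from this round; the per-user share counts are hypothetical, chosen so one user holds 3 shares like the top earner mentioned earlier in the thread:

```python
TOTAL_SHARES = 55            # all shares of the whole pool in this round
REWARD = 49.0                # BTC distributed for the round, as in the post

# hypothetical per-user share counts summing to 55
shares = {"alice": 3, "bob": 20, "carol": 32}

rewards = {user: REWARD * n / TOTAL_SHARES for user, n in shares.items()}
print(round(rewards["alice"], 2))   # 2.67 — matches the highest reward cited above
```

A user who submitted zero of the 55 shares gets `49 * 0 / 55 = 0`, which is exactly the "none" in the reward column.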
|
|
|
Is there any way to see which blocks a particular worker found? I noticed that one of mine found one today, and purely out of curiosity I'm wondering which block it was. I'd take a certain amount of pride in being able to say, "Oh, yeah, block 117551, that was me!" Planning this for the next release.
|
|
|
What kind of bribery will it take to get the average daily reward graph in tabular form, or even JSON?
Temporarily you can use the JSON-encoded data on the graphs page. There will be a direct JSON interface for stats like this in the next release.
Additionally, any shot at getting an "average reward per worker" graph of some kind?
That's quite hard to do. Everything (mainly reward calculations) is bound to the user account, so per-worker stats are not possible at the moment.
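Until the direct JSON interface lands, scraping the graph data is straightforward once you have the raw JSON in hand. The payload shape and field names below are invented for illustration — the actual structure embedded in the graphs page may differ:

```python
import json

# hypothetical payload in the style of json-encoded graph data;
# real field names on the graphs page may differ
raw = '[{"time": 1302439200, "reward": 1.25}, {"time": 1302525600, "reward": 0.98}]'

points = json.loads(raw)
daily = {p["time"]: p["reward"] for p in points}
average = sum(daily.values()) / len(daily)
print(average)   # 1.115
```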
|
|
|
Must agree with Tycho and dbitcoin that it brings unneeded complexity. With extra headers in HTTP requests, the miner and pool code can be much more straightforward, and those few extra bytes are not a real problem.
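To show how little client-side machinery the header approach needs: a miner would just look for an agreed-upon header in the getwork response. The header name here is a made-up placeholder — pools and miner authors would have to agree on the actual spelling:

```python
# hypothetical header name; not part of any agreed getwork extension
FAILOVER_HEADER = "X-Backup-Server"

def backup_servers(headers):
    """Pull an optional comma-separated failover list out of response headers."""
    value = headers.get(FAILOVER_HEADER, "")
    return [h.strip() for h in value.split(",") if h.strip()]

# headers as a plain dict, the way an HTTP client library would expose them
resp_headers = {"Content-Type": "application/json",
                FAILOVER_HEADER: "backup1.example:8332, backup2.example:8332"}
print(backup_servers(resp_headers))
```

A miner that doesn't know the header simply ignores it, which is exactly why this stays backward compatible.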
|
|
|
The bitcoinpool miner is the only one that works with sdk 2.1 on win7 afaik, which is worth almost 4% more for me in increased hash rate (40mhash total more for me on dual 5970's). I'm solo mining though....
The bitcoinpool miner is the normal m0mchil miner with some cosmetic changes in logging and the network layer, but that does not affect hashrate. Standard poclbm works for me on SDK 2.1 and Win7 without any problem...
|
|
|
It means a lot of new people (or Gavin j/k) are signing up and finding out about bitcoins.
Or somebody has access to hundreds of Google accounts...
|
|
|
Amazing that it can still take 3 days in 2011.
I worked as an architect for the biggest subsidiary of Erste Bank Group (Europe), so I'm pretty familiar with banking systems. I can say banks _want_ instant payments, but those systems are usually so old and inflexible that it isn't easily possible. I especially mean the daily batch processing of payment queues between internal systems, etc. Thanks to their historical architecture, support for online transfers == a completely new implementation of core systems. So we're talking about millions or even billions of euros. Younger banks are using newer technologies (obviously), so it's pretty common that small players have much faster payment processing.
|
|
|
I voted remove it, because I think a higher volume would make people take bitcoins more seriously.
DP does not affect volume; the historical volume also shows DP trades. So that's not a good reason for disabling DP.
|
|
|
I voted for 'keep it as is'. I believe that most dark pool haters don't really understand what DP is and are simply scared by the unknown. Writing custom bots for trading large amounts is an ugly workaround; DP solves that nicely. I simply don't see a reason to remove it. Disclaimer: I haven't used DP yet, so I'm not following my own interests. Edit: OK, I see one reason why DP is bad — the market then looks thinner and people are afraid of trading because it looks like a trade will be filled at a worse price.
|
|
|
More *** from nster the troll. Ignore the troll and his turds.
Hey, bobR, please ignore nster. We all know that he is the biggest troll here, so it's not necessary to point that out again and again. Thanks.
|
|
|
It took 12 minutes to calculate this block: http://blockexplorer.com/b/117147 There were more than 90 unconfirmed transactions at the time the block before this finished, and at the time this one finished. Why did only one (non-generation) transaction appear in the block?
a) Somebody is accepting only transactions with fees (edit: most probable; there is currently only one more transaction with fees, which is the 'sendmany' payout from the pool. Sendmany transactions are accepted only by the newest clients, so older clients ignore it — their bad, they are missing my fees just because they use an older bitcoin version). Or:
b) Somebody started a bitcoin client and found a block pretty quickly. Older transactions are not actively relayed to a freshly connected node, so it's pretty normal that the bitcoin client does not know all pending transactions.
|
|
|
and by the way - freenode channel #bitcoin-market outputs the streaming quotes from the bitcoincharts feed, in a pretty format.
so if you have a habit of hanging out on irc - join in and watch the bitcoin market activity scroll by!
I'm following #bitcoin-market; it's good for an overall market view, but I'd prefer tcatm's solution for automatic processing like trading bots. Currently thinking about one; it's a pity to have such a nice market stream and not use it for anything.
|
|
|
tcatm, good job! Consider some 'welcome JSON message' after connection, to calm down impatient people (and as a possible check that the service is responding).
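The welcome message idea is tiny to implement: send one JSON line immediately on connect, before any market data arrives. This is a minimal sketch with invented field names, demonstrated over a local socket pair rather than tcatm's actual service:

```python
import json
import socket

def send_welcome(conn):
    # one JSON line on connect tells the client the service is alive,
    # even before the first trade ever streams in
    banner = {"type": "welcome", "service": "market-stream", "version": 1}
    conn.sendall((json.dumps(banner) + "\n").encode())

# a connected socket pair stands in for server and client
server, client = socket.socketpair()
send_welcome(server)
line = client.makefile().readline()
parsed = json.loads(line)
print(parsed["type"])   # welcome
server.close(); client.close()
```

A client can then treat "no welcome line within a few seconds" as "service is down", instead of sitting on a silent connection wondering.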
|
|
|
Why depend on google though?
Well... why not? If you're scared that Google will steal your pool credentials, simply block it in AdBlock or similar. AWStats is one of many very good tracking programs that you can run locally.
But I don't want to run it locally. I have tens of sites on Analytics already, so I just generated yet another JavaScript snippet and put it into the template; work done in under a minute.
|
|
|
|