Bitcoin Forum
Author Topic: can pools actually handle the increased shares from ASICs?  (Read 1263 times)
Desolator (OP)
Sr. Member
****
Offline

Activity: 392
Merit: 250



View Profile
October 11, 2012, 04:41:20 AM
 #1

It'd suck and only be slightly funny if half the pools out there went offline after ASICs launched because everyone collectively sent 10x the shares all of a sudden Tongue The way I understand it, a pool sends out a proof-of-work job that's the same as the real block but with a much lower difficulty.  The client solves it and submits the "answer", and the server re-runs that one hash to make sure the answer is correct, then accepts it.  If the hash also happened to be low enough for the block itself, the pool submits the block and collects the 50 BTC.  Yay!

Except what's the CPU and RAM usage going to look like at 10x the shares?  I assume if my i5 can do like 12 MH/s then doing one verification hash wouldn't take very long, but some pools get many thousands of shares in a minute and I'm more worried about the overhead and network traffic.  Are the pools in any trouble if they're running on crappy servers, or is the whole thing so lightweight that nobody needs to be concerned?

Or is there some manual or automatic adjustment to share difficulty that will make shares themselves harder to solve and keep the communication and verification from getting too intense?
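
For what it's worth, here's roughly how I picture that pool-side check. Just a sketch in Python with made-up names (double_sha256, check_share), assuming the pool already has the reassembled 80-byte header, not anybody's actual pool software:

Code:
import hashlib

def double_sha256(header: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice to the 80-byte header."""
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def check_share(header: bytes, share_target: int, block_target: int):
    """Re-run the single hash the miner claims is a winning share.

    Returns (valid_share, valid_block): the second flag means the same hash
    also beats the real network target, so the pool can submit the block.
    """
    # The 32-byte digest is compared to the target as a 256-bit little-endian number.
    h = int.from_bytes(double_sha256(header), "little")
    return h <= share_target, h <= block_target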
crazyates
Legendary
*
Offline

Activity: 952
Merit: 1000



View Profile
October 11, 2012, 04:57:30 AM
 #2

IIRC, both the stratum and GBT protocols were designed to cut down the bandwidth a pool burns having to deal with 100x the number of getworks and submitted shares.

As far as actually verifying a submitted share, IDK. Are you saying a pool rehashes every submitted share? Cuz that doesn't sound right... Huh

Tips? 1crazy8pMqgwJ7tX7ZPZmyPwFbc6xZKM9
Previous Trade History - Sale Thread
ralree
Hero Member
*****
Offline

Activity: 518
Merit: 500


Manateeeeeeees


View Profile
October 11, 2012, 05:15:10 AM
 #3

DrHaribo at Bitminter is implementing stratum, and I have a feeling it will be done before ASICs hit.  There will be some rather large players (fefox, avidreader, gigavps) using the pool (unless they move or go solo), so it will have to stand up to at least 8-10 TH/s almost immediately.

1MANaTeEZoH4YkgMYz61E5y4s9BYhAuUjG
-ck
Legendary
*
Offline

Activity: 4102
Merit: 1632


Ruu \o/


View Profile WWW
October 11, 2012, 05:24:05 AM
 #4

Going to stratum with variable difficulty on pools will use less bandwidth and CPU for the pool, no matter what the hashrate is, compared to current hardware and protocols. In fact, with stratum and a properly tuned variable-diff server, the amount of traffic is static regardless of hashrate. The only real issue is what happens at the start of mining, as connections currently start at diff 1 and then increase. I doubt that will be the case long term, as the software will deliver the expected hashrate and/or a desired starting difficulty.
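
To give a rough idea of what a variable-diff retarget could look like, here's a purely illustrative sketch (the numbers, names, and clamping are my own, not lifted from any actual stratum server):

Code:
TARGET_SHARES_PER_MIN = 20     # aim for roughly one share every 3 seconds per connection
RETARGET_WINDOW_SEC = 120      # how often each connection gets re-checked

def retarget(current_diff, shares_in_window, window_sec=RETARGET_WINDOW_SEC):
    """Scale a connection's share difficulty toward the target share rate.

    A miner hashing 10x faster submits ~10x the shares, so its difficulty
    gets multiplied by ~10 and traffic settles back to the same fixed rate.
    """
    observed_per_min = shares_in_window * 60.0 / window_sec
    ratio = observed_per_min / TARGET_SHARES_PER_MIN
    ratio = max(0.5, min(ratio, 4.0))   # clamp so one noisy window can't swing diff wildly
    return max(1.0, current_diff * ratio)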

Developer/maintainer for cgminer, ckpool/ckproxy, and the -ck kernel
2% Fee Solo mining at solo.ckpool.org
-ck
Desolator (OP)
Sr. Member
****
Offline

Activity: 392
Merit: 250



View Profile
October 11, 2012, 02:15:24 PM
 #5

Dunno what half the technology you're talking about is, probably cuz I don't run a pool Tongue but it sounds like you're saying the pools have their own independent difficulty adjuster to keep the share rate pretty static, and the only problem is starting off at diff 1 instead of like 100,000.  Sounds stable enough.  Do all pools use such load balancing measures?

Btw yeah, I would assume all pools have to verify a submitted share so they can accept or reject it as a correct answer.  So if they send out a block calculation with, say, 1000x easier difficulty than the real block but with the same base data, once someone's mining client does 10 billion calculations and finds a sufficiently low hash, the pool server won't just take their word for it.  It has to re-run that one single hash to verify that the result is low enough.  So my GPU does 10 billion hashes in like a minute or whatever, and then I just have to have the server verify the one hash that I said was correct.  A Pentium 2 could do that in 1 ms Tongue but the problem is, what if it's one server and there's a million shares coming in at a time?  Then you're back up to somewhat big numbers.  A million verified shares per second is only 1 MH/s and any server chip can do that, but this isn't a bitstream.  It's a million individual requests coming over the internet that the NIC has to translate, load into memory, etc., so you're not quite going to see the same performance as a desktop running the operation solo in a repetitive fashion.

But if the people above me are referring to an automatic difficulty adjustment for the shares themselves within a pool, I would assume you can just set a target of 1000 shares per minute or something, and it will 10x the difficulty if it has to in order to keep the share rate down.
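
If I've got the back-of-the-envelope math right, a difficulty-1 share takes about 2^32 hashes on average, so the relationship between hashrate, share difficulty, and share rate is just (my own rough figures):

Code:
def shares_per_minute(hashrate_hs, share_difficulty):
    # A difficulty-1 share takes ~2**32 hashes on average.
    return hashrate_hs * 60.0 / (share_difficulty * 2**32)

print(shares_per_minute(1e12, 1))    # a 1 TH/s ASIC at diff 1: ~14,000 shares/min
print(shares_per_minute(1e12, 14))   # same rig at diff 14: ~1,000 shares/min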
firefop
Sr. Member
****
Offline

Activity: 420
Merit: 250


View Profile
October 11, 2012, 02:31:25 PM
 #6

Quote from: Desolator on October 11, 2012, 02:15:24 PM
But if the people above me are referring to an automatic difficulty adjustment for the shares themselves within a pool, I would assume you can just set a target of 1000 shares per minute or something, and it will 10x the difficulty if it has to in order to keep the share rate down.
Right now there are some pools getting ready for stratum (which is basically a better protocol than getwork).

There are also some pools where you can adjust the difficulty manually.

