fireduck (OP)
|
|
July 25, 2012, 09:34:29 PM Last edit: April 06, 2013, 03:14:49 PM by fireduck |
|
I'd like to announce the Horrible Horrendous Terrible Tremendous Mining Pool: http://hhtt.1209k.com/
The special feature here is higher-difficulty work units to reduce network chatter. It uses difficulty 32 work units (for now), getting ready for the higher-hash-rate butterflies coming in October. If you have a 1 GH/s setup now and see the shares scrolling by at one every few seconds, imagine what it will be like with a 40 GH/s BFL SC Single. There is simply no need for all that traffic. The time when difficulty 1 pool proofs of work made sense is over.
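As a rough sanity check on the traffic claim: a difficulty-D share takes D x 2^32 hashes on average, so raising the share difficulty cuts the submission rate proportionally. A back-of-the-envelope sketch (the figures are expected values only):

```python
# Back-of-the-envelope share rates: a difficulty-D share takes
# D * 2^32 hashes on average, so raising D cuts submissions D-fold.
def shares_per_second(hashrate_hs: float, difficulty: int) -> float:
    """Expected share submissions per second at the given difficulty."""
    return hashrate_hs / (difficulty * 2**32)

for rig, rate in [("1 GH/s", 1e9), ("40 GH/s BFL SC Single", 40e9)]:
    for d in (1, 32):
        print(f"{rig} at D{d}: {shares_per_second(rate, d):.3f} shares/s")
```

At difficulty 1 a 40 GH/s rig would push roughly nine shares a second at the pool; at difficulty 32 it drops to well under one.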
|
Bitrated user: fireduck.
|
|
|
fireduck (OP)
|
|
July 26, 2012, 05:14:44 AM |
|
Payouts are working and detail pages are up for users.
|
Bitrated user: fireduck.
|
|
|
organofcorti
Donator
Legendary
Offline
Activity: 2058
Merit: 1007
Poor impulse control.
|
|
July 27, 2012, 05:49:02 AM |
|
This is a truly good idea - higher pool D is going to be necessary at some point soon. However, if you can make D user-defined, then low-hashrate miners will be less affected by variance - at 200 Mhps, the standard deviation in share submission would be about 9% of the mean when going from D1 to D32, but only 2.8% of the mean at 2 Ghps. Sections 7.5 and 7.6 of Meni Rosenfeld's AoBPMRS cover this idea in more detail. While 200 Mhps might not be profitable soon, you might also have to increase the pool D, so this could be a good idea regardless. Well done on being the first! (afaik)
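The 9% and 2.8% figures appear to follow from Poisson statistics of the share count; assuming a one-day counting window (my assumption, not stated above), they can be reproduced like this:

```python
# Shares arrive roughly as a Poisson process, so over a window with an
# expected count of N shares, the standard deviation is sqrt(N), i.e. a
# relative spread of 1/sqrt(N). A one-day window reproduces the quoted
# 9% / 2.8% figures.
import math

def relative_std(hashrate_hs: float, difficulty: int, seconds: float = 86400) -> float:
    expected_shares = hashrate_hs * seconds / (difficulty * 2**32)
    return 1 / math.sqrt(expected_shares)

print(f"200 Mh/s at D32: {relative_std(200e6, 32):.1%}")  # ~8.9%
print(f"2 Gh/s  at D32: {relative_std(2e9, 32):.1%}")     # ~2.8%
```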
|
|
|
|
coinotron
Legendary
Offline
Activity: 1182
Merit: 1000
|
|
July 27, 2012, 09:11:08 AM |
|
This is a truly good idea - higher pool D is going to be necessary at some point soon. However, if you can make D user-defined, then low-hashrate miners will be less affected by variance - at 200 Mhps, the standard deviation in share submission would be about 9% of the mean when going from D1 to D32, but only 2.8% of the mean at 2 Ghps. Sections 7.5 and 7.6 of Meni Rosenfeld's AoBPMRS cover this idea in more detail. While 200 Mhps might not be profitable soon, you might also have to increase the pool D, so this could be a good idea regardless. Well done on being the first! (afaik)
Good idea. Just FYI, most LTC pools have had this implemented for months, but the very first implementation was (AFAIK) in SolidCoin 2.0 in November 2011.
|
|
|
|
fireduck (OP)
|
|
July 28, 2012, 04:08:24 PM |
|
I've thought about user-selected difficulties, maybe with a lower pool fee for higher difficulties, which gives people an incentive to reduce database work for the pool. I think that makes a lot of sense, but as I don't know the pushpoold code base very well, it will probably take me some time to work out.
As an aside, apparently long polling works just fine. Just not via the proxy I was using with my miners so I thought it didn't work.
|
Bitrated user: fireduck.
|
|
|
Lethos
|
|
August 29, 2012, 01:21:44 PM |
|
Just testing out your pool, Fireduck. Surprised it's still this small; my first payments have already come through.
Is 32 difficulty really that scary to everyone?
|
|
|
|
fireduck (OP)
|
|
August 29, 2012, 01:49:06 PM |
|
Just testing out your pool, Fireduck. Surprised it's still this small; my first payments have already come through.
Is 32 difficulty really that scary to everyone?
The hash rate is probably low because of my terrible web page, the fact that I call all the users suckers on it, and because I don't offer merged mining. All these things add up. I imagine I'll get more traffic once the ASICs start hitting, but I'd be surprised if other pools didn't have a way to increase their difficulty by then as well.
|
Bitrated user: fireduck.
|
|
|
Lethos
|
|
August 29, 2012, 02:09:20 PM |
|
Just testing out your pool, Fireduck. Surprised it's still this small; my first payments have already come through.
Is 32 difficulty really that scary to everyone?
The hash rate is probably low because of my terrible web page, the fact that I call all the users suckers on it, and because I don't offer merged mining. All these things add up. I imagine I'll get more traffic once the ASICs start hitting, but I'd be surprised if other pools didn't have a way to increase their difficulty by then as well.
I've been mining long enough to realise merged mining makes very, very little difference to the overall bitcoins I end up with. Merged mining just makes all these other Bitcoin copies stay around longer than they should. What I did like was a PPS rate done exactly like everyone else's but with only a 2% fee, which is quite tempting, and to be honest, not having to register an account to mine there made the process a lot easier. Sure, you could give it a facelift, but if you don't, it's not the end of the world. I just want to see how my FPGAs handle a higher difficulty.
|
|
|
|
cmg5461
|
|
August 29, 2012, 02:23:59 PM |
|
Just a thought, but the shares people produce are random. Some shares would qualify at > 32 difficulty even while being mined at difficulty 1. What's to stop a user from submitting false D data based on the share they produce?
Say you produce one share every second at D1 with 1 GH/s,
while someone with xx GH/s produces one share every second at D32.
A few of those 1 GH/s shares will be well above difficulty 1, so they could be claimed as mined at a higher difficulty, falsely representing the miner's true hash rate.
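The effect described above is real: a valid hash is uniformly distributed below the target, so about 1 in 32 difficulty-1 shares would also qualify at difficulty 32. A quick simulation with simplified target arithmetic (no real header hashing) illustrates it:

```python
import random

# Approximate difficulty-1 target (Bitcoin's maximum target).
TARGET_1 = 0xFFFF << 208

def share_difficulty(h: int) -> float:
    """Difficulty actually achieved by a 256-bit hash value."""
    return TARGET_1 / h

random.seed(1)
# Draw hashes that already qualify as difficulty-1 shares, and count how
# many would also qualify at difficulty 32 -- about 1 in 32 do by luck.
hashes = [random.randrange(1, TARGET_1) for _ in range(100_000)]
hits = sum(share_difficulty(h) >= 32 for h in hashes)
print(f"fraction also meeting D32: {hits / len(hashes):.4f}")  # ~0.031
```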
|
If I've helped: 1CmguJhwW4sbtSMFsyaafikJ8jhYS61quz
Sold: 5850 to lepenguin. Quick, easy and trustworthy.
|
|
|
Meni Rosenfeld
Donator
Legendary
Offline
Activity: 2058
Merit: 1054
|
|
August 29, 2012, 02:29:39 PM |
|
Just a thought, but the shares people produce are random. Some shares would qualify at > 32 difficulty even while being mined at difficulty 1. What's to stop a user from submitting false D data based on the share they produce?
Say you produce one share every second at D1 with 1 GH/s,
while someone with xx GH/s produces one share every second at D32.
A few of those 1 GH/s shares will be well above difficulty 1, so they could be claimed as mined at a higher difficulty, falsely representing the miner's true hash rate.
The share difficulty needs to be decided in advance. If a miner configured as a 32 miner submits a 1 share, it will be rejected. If a miner configured as 1 submits a 32 share, it will count just like a normal 1 share.
|
|
|
|
fireduck (OP)
|
|
August 29, 2012, 02:31:00 PM |
|
Just a thought, but the shares people produce are random. Some shares would qualify at > 32 difficulty even while being mined at difficulty 1. What's to stop a user from submitting false D data based on the share they produce?
Say you produce one share every second at D1 with 1 GH/s,
while someone with xx GH/s produces one share every second at D32.
A few of those 1 GH/s shares will be well above difficulty 1, so they could be claimed as mined at a higher difficulty, falsely representing the miner's true hash rate.
In the case of my pool, the difficulty is configured and used for everyone, so if a share is good enough for difficulty 32 it gets accepted. If someone were to set up dynamic difficulty or user-selectable difficulty, they would have to track which work they gave out at what difficulty to avoid the problem you are describing. That shouldn't be hard, since the pool needs to track which work it has issued anyway.
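Both answers come down to the same rule: the pool records the difficulty each work unit was issued at and credits shares against that record, never against the difficulty the hash happens to achieve. A hypothetical sketch (the names and the simplified target arithmetic are mine, not the pool's actual code):

```python
# Hypothetical sketch of per-work difficulty tracking. The pool judges a
# share against the difficulty the work was *issued* at, so a lucky
# high-difficulty hash on D1 work still only earns D1 credit.
TARGET_1 = 0xFFFF << 208  # approximate difficulty-1 target

issued_work: dict[str, int] = {}  # work_id -> difficulty it was issued at

def issue_work(work_id: str, difficulty: int) -> None:
    issued_work[work_id] = difficulty

def credit_for_share(work_id: str, hash_value: int) -> int:
    """Credit a share at its assigned difficulty, or reject it."""
    d = issued_work.get(work_id)
    if d is None:
        raise ValueError("unknown or stale work")
    if hash_value > TARGET_1 // d:
        raise ValueError(f"share does not meet difficulty {d}")
    return d  # credit at the assigned difficulty, not the achieved one

issue_work("job-1", 32)
print(credit_for_share("job-1", TARGET_1 // 64))  # strong enough: credits 32
```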
|
Bitrated user: fireduck.
|
|
|
fireduck (OP)
|
|
August 29, 2012, 02:34:56 PM |
|
The share difficulty needs to be decided in advance. If a miner configured as a 32 miner submits a 1 share, it will be rejected. If a miner configured as 1 submits a 32 share, it will count just like a normal 1 share.
Yep. I was thinking about doing something where the user specifies the difficulty in the username, so it would be address_difficulty. That way miners can decide what difficulty makes sense for their rigs. Then I would also define a sliding scale where higher difficulty means lower fees. This would let me encourage selecting a higher difficulty, which reduces DB strain on my end.
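The address_difficulty scheme and the sliding fee scale could look something like this; the parsing rules and the fee numbers are invented for illustration:

```python
# Hypothetical parse of the "address_difficulty" username scheme, with a
# made-up sliding fee scale: higher difficulty, lower fee.
DEFAULT_DIFFICULTY = 32

def parse_username(username: str) -> tuple[str, int]:
    """Split 'address_difficulty' into (address, difficulty)."""
    address, _, suffix = username.rpartition("_")
    if address and suffix.isdigit() and int(suffix) >= 1:
        return address, int(suffix)
    return username, DEFAULT_DIFFICULTY  # no valid suffix: pool default

def fee_percent(difficulty: int) -> float:
    """Invented sliding scale: 2% at D32, shrinking as difficulty rises."""
    return max(0.5, 2.0 - 0.25 * (difficulty // 32 - 1))

print(parse_username("1BitcoinAddr_64"))  # ('1BitcoinAddr', 64)
```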
|
Bitrated user: fireduck.
|
|
|
Lethos
|
|
August 29, 2012, 03:59:41 PM |
|
The share difficulty needs to be decided in advance. If a miner configured as a 32 miner submits a 1 share, it will be rejected. If a miner configured as 1 submits a 32 share, it will count just like a normal 1 share.
Yep. I was thinking about doing something where the user specifies the difficulty in the username, so it would be address_difficulty. That way miners can decide what difficulty makes sense for their rigs. Then I would also define a sliding scale where higher difficulty means lower fees. This would let me encourage selecting a higher difficulty, which reduces DB strain on my end.
Well, since the password can be anything at the moment, if a user puts it down as d20 that could be the difficulty they're given; if it's something that doesn't look like a difficulty request, you'd use a default.
|
|
|
|
fireduck (OP)
|
|
August 29, 2012, 04:07:59 PM |
|
Well, since the password can be anything at the moment, if a user puts it down as d20 that could be the difficulty they're given; if it's something that doesn't look like a difficulty request, you'd use a default.
That is a good idea, and in some ways might be easier since the username doesn't need to be manipulated. I might make the password an HTTP-style parameter string in case more things need to be added. Something like: difficulty=20&magic_pony=true&whatever=something
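A parameter string like that parses cleanly with the Python standard library; the option names and defaults below are just the examples from the post:

```python
# Parsing an HTTP-style parameter string from the miner's password
# field, e.g. "difficulty=20&magic_pony=true". Unknown keys are ignored
# and plain passwords fall back to the defaults.
from urllib.parse import parse_qs

def miner_options(password: str) -> dict:
    opts = {"difficulty": 32, "magic_pony": False}  # pool defaults
    parsed = parse_qs(password)
    if parsed.get("difficulty", [""])[0].isdigit():
        opts["difficulty"] = int(parsed["difficulty"][0])
    opts["magic_pony"] = parsed.get("magic_pony", ["false"])[0] == "true"
    return opts

print(miner_options("difficulty=20&magic_pony=true&whatever=something"))
```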
|
Bitrated user: fireduck.
|
|
|
Lethos
|
|
August 29, 2012, 04:17:54 PM |
|
You're going to have to explain that magic pony...
|
|
|
|
crazyates
Legendary
Offline
Activity: 952
Merit: 1000
|
|
August 29, 2012, 05:00:40 PM |
|
Just a thought, but the shares people produce are random. Some shares would qualify at > 32 difficulty even while being mined at difficulty 1. What's to stop a user from submitting false D data based on the share they produce?
Say you produce one share every second at D1 with 1 GH/s,
while someone with xx GH/s produces one share every second at D32.
A few of those 1 GH/s shares will be well above difficulty 1, so they could be claimed as mined at a higher difficulty, falsely representing the miner's true hash rate.
In the case of my pool, the difficulty is configured and used for everyone, so if a share is good enough for difficulty 32 it gets accepted. If someone were to set up dynamic difficulty or user-selectable difficulty, they would have to track which work they gave out at what difficulty to avoid the problem you are describing. That shouldn't be hard, since the pool needs to track which work it has issued anyway.
Is it even possible to mine at 1 GH/s and have all diff >= 32 shares submitted to your pool, with any diff < 32 shares submitted to another pool?
|
|
|
|
fireduck (OP)
|
|
August 29, 2012, 05:14:37 PM |
|
Is it even possible to mine at 1 GH/s and have all diff >= 32 shares submitted to your pool, with any diff < 32 shares submitted to another pool?
Negative, for the same reason that it is impossible to mine on a pool and take any found blocks for yourself. When mining you are computing hash(DATA_FROM_POOL + NONCE + TIME). Your client changes the nonce and, to a certain extent, the time. The DATA_FROM_POOL includes a hash of a transaction group that includes a payment to the pool. You can't change that without invalidating the work. Pools also check that the work unit you are submitting is one they gave out.
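For the curious, the structure being described can be sketched as follows. This is a simplified 80-byte Bitcoin header (version, previous hash, and bits are fixed placeholders here): the merkle root commits the work to the pool's coinbase payout, so only the nonce and timestamp are free to vary.

```python
# Sketch of why pool work can't be redirected: the 80-byte block header
# commits to a merkle root that includes the pool's coinbase payout.
# Changing the payout changes the merkle root, invalidating prior work.
import hashlib
import struct

def header_hash(merkle_root: bytes, timestamp: int, nonce: int) -> bytes:
    """Double SHA-256 over a simplified header (other fields fixed)."""
    header = (
        struct.pack("<I", 2)   # version
        + b"\x00" * 32         # previous block hash (placeholder)
        + merkle_root          # commits to the pool's payout transaction
        + struct.pack("<III", timestamp, 0x1D00FFFF, nonce)  # time, bits, nonce
    )
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

pool_root = hashlib.sha256(b"coinbase pays the pool").digest()
# The miner only varies the nonce (and a little of the time)...
h1 = header_hash(pool_root, 1346252077, 0)
h2 = header_hash(pool_root, 1346252077, 1)
# ...and swapping in a different payout yields unrelated hashes.
greedy_root = hashlib.sha256(b"coinbase pays the miner").digest()
h3 = header_hash(greedy_root, 1346252077, 0)
assert h1 != h2 != h3
```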
|
Bitrated user: fireduck.
|
|
|
crazyates
Legendary
Offline
Activity: 952
Merit: 1000
|
|
August 29, 2012, 05:17:40 PM |
|
Is it even possible to mine at 1 GH/s and have all diff >= 32 shares submitted to your pool, with any diff < 32 shares submitted to another pool?
Negative, for the same reason that it is impossible to mine on a pool and take any found blocks for yourself. When mining you are computing hash(DATA_FROM_POOL + NONCE + TIME). Your client changes the nonce and, to a certain extent, the time. The DATA_FROM_POOL includes a hash of a transaction group that includes a payment to the pool. You can't change that without invalidating the work. Pools also check that the work unit you are submitting is one they gave out.
That's what I thought, but your other post kind of made it sound like it could be possible.
|
|
|
|
fireduck (OP)
|
|
August 29, 2012, 05:28:02 PM |
|
That's what I thought, but your other post kind of made it sound like it could be possible.
Ah, I was talking about a pool that allowed users to pick their own difficulty. In that case there would be a possibility of a user finding a good hash and then claiming it as a higher-difficulty share to get more credit for it.
|
Bitrated user: fireduck.
|
|
|
Lethos
|
|
August 30, 2012, 10:39:54 AM |
|
That's what I thought, but your other post kind of made it sound like it could be possible.
Ah, I was talking about a pool that allowed users to pick their own difficulty. In that case there would be a possibility of a user finding a good hash and then claiming it as a higher-difficulty share to get more credit for it.
How complicated is it to run multiple different share difficulties from the same pool - and in your case, I presume, the same server?
|
|
|
|
|