ThiagoCMC
Legendary
Offline
Activity: 1204
Merit: 1000
฿itcoin: Currency of Resistance!
|
|
February 14, 2012, 05:12:20 AM |
|
1 - WHEN will the P2Pool difficulty be bad for small miners? When we reach 1 THash, 2 THash, 4 THash?!
Much sooner than that. Some might say it is already bad for small miners. But I suppose it depends on what you consider "bad for small miners"? If it takes a typical miner several hours to find a share, that is going to lead to painful variance. I made a chart to illustrate what happens as the pool gets larger. The chart is based on an overall bitcoin difficulty of 1.4 million (approx what we are now and will be for the next round). Here is what the chart shows:
- The solid red line is the average time for the pool to find a block as the pool's hashrate grows.
- The dotted lines are the average time for miners of various sizes to find a SHARE as the pool's size increases and the share difficulty increases with it.
- The point where a dotted line crosses the solid red line is the point at which the average time for a miner to find a share becomes larger than the time for the pool to find a block.
...Image removed from quote... twmz, can you plot the same image but for Litecoin P2Pool?! So we can know when Litecoin P2Pool will be "bad" for small miners too... Thank you BTW! Best, Thiago
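A minimal sketch of the arithmetic behind the chart described above, assuming p2pool's ~10 second share target and 2^32 expected hashes per unit of difficulty; the pool and miner sizes below are illustrative, not taken from the removed image. The same formulas would apply to a Litecoin p2pool once its own difficulty and hashrates are plugged in.

Code:
# Rough reproduction of the two curves: average pool block time vs. average
# per-miner share time, as the p2pool hashrate grows. Constants are assumptions.

BITCOIN_DIFF = 1_400_000          # overall difficulty assumed for the chart
TARGET_SHARE_INTERVAL = 10.0      # p2pool aims for ~1 share per 10 seconds
HASHES_PER_DIFF1 = 2 ** 32        # expected hashes per difficulty-1 solution

def pool_block_time(pool_hashrate):
    """Average seconds for the whole pool to find a Bitcoin block."""
    return BITCOIN_DIFF * HASHES_PER_DIFF1 / pool_hashrate

def miner_share_time(pool_hashrate, miner_hashrate):
    """Average seconds for one miner to find a share.

    Share difficulty rises with the pool so the pool still averages one share
    per ~10 s, so an individual's share time scales with pool/miner hashrate.
    """
    return TARGET_SHARE_INTERVAL * pool_hashrate / miner_hashrate

GH = 1e9
for pool in (100 * GH, 250 * GH, 500 * GH, 1000 * GH, 2000 * GH):
    parts = [f"pool {pool / GH:5.0f} GH/s",
             f"block {pool_block_time(pool) / 3600:5.1f} h"]
    for miner in (0.1 * GH, 0.5 * GH, 2.0 * GH):
        parts.append(f"{miner / GH:.1f} GH/s share "
                     f"{miner_share_time(pool, miner) / 3600:6.1f} h")
    print(" | ".join(parts))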
|
|
|
|
twmz
|
|
February 14, 2012, 05:30:43 AM |
|
Can you plot the same image but for Litecoin P2Pool?!
I have successfully avoided knowing anything about Litecoin, so unfortunately, I can't.
|
Was I helpful? 1TwmzX1wBxNF2qtAJRhdKmi2WyLZ5VHRs. WoT, GPG, Bitrated user: ewal.
|
|
|
ThiagoCMC
Legendary
Offline
Activity: 1204
Merit: 1000
฿itcoin: Currency of Resistance!
|
|
February 14, 2012, 05:32:23 AM |
|
Can you plot the same image but for Litecoin P2Pool?!
I have successfully avoided knowing anything about Litecoin, so unfortunately, I can't. Okay... Thanks anyway... Nevertheless, this would be good for P2Pool development... We can use the Litecoin P2Pool to improve the Bitcoin P2Pool...
|
|
|
|
Raize
Donator
Legendary
Offline
Activity: 1419
Merit: 1015
|
|
February 14, 2012, 05:58:25 AM |
|
I have P2Pool running on one miner and it seems to be running fine. Then I tried to have another computer mine to it and here is what I get: Is that right?
|
|
|
|
twmz
|
|
February 14, 2012, 06:07:00 AM |
|
I have P2Pool running on one miner and it seems to be running fine. Then I tried to have another computer mine to it and here is what I get:
Is that right?
Have you successfully mined with that computer on other pools? Or is this behavior specific to p2pool? On the surface that hardware looks sick (lots of hardware errors (HW)). Note: the frequent long polls are normal.
|
Was I helpful? 1TwmzX1wBxNF2qtAJRhdKmi2WyLZ5VHRs. WoT, GPG, Bitrated user: ewal.
|
|
|
Regex
Newbie
Offline
Activity: 23
Merit: 0
|
|
February 14, 2012, 06:27:31 AM |
|
Public P2Pool, how?!
Guys, I would like to set up a Public P2Pool for anyone who doesn't have the knowledge / time to set up their own P2Pool node...
How can I do it?!
Thanks! Thiago
So in other words, you want to centralize P2Pool? Are you kidding me?
|
|
|
|
ThiagoCMC
Legendary
Offline
Activity: 1204
Merit: 1000
฿itcoin: Currency of Resistance!
|
|
February 14, 2012, 06:33:02 AM |
|
Public P2Pool, how?!
Guys, I would like to set up a Public P2Pool for anyone who doesn't have the knowledge / time to set up their own P2Pool node...
How can I do it?!
Thanks! Thiago
So in other words, you want to centralize P2Pool? Are you kidding me? No, you didn't get the idea. I want to help.
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
February 14, 2012, 06:34:21 AM |
|
Public P2Pool, how?!
Guys, I would like to set up a Public P2Pool for anyone who doesn't have the knowledge / time to set up their own P2Pool node...
How can I do it?!
Thanks! Thiago
So in other words, you want to centralize P2Pool? Are you kidding me? So think about this slowly and rationally. As p2pool grows, the share difficulty will grow. At 1 TH/s it will take a miner on average 1 day to earn a share, and with intra-share variance that could be as long as 5-7 days every few blocks and occasionally hit 7-10 days in unlucky streaks. As p2pool grows, its popularity with small and casual miners will wane as rising difficulty means higher and higher variance. So those miners WILL leave. Now they could go to a) a front end which uses p2pool, and thus can't be used for a 51% attack and preserves the core decentralized nature (albeit with some centralization), or b) deepbit or some other major pool. So which is a superior solution?
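For a feel of the dry spells being described, here is a back-of-envelope check under the assumption that share finding is a Poisson process (so waiting times are exponential); the one-day mean is the figure quoted above, the rest is illustrative.

Code:
import math

# If a miner's mean time to find a share is one day, how often do multi-day
# dry spells happen? Assumes share finding is a Poisson process, so waiting
# times are exponentially distributed (a modelling assumption).

MEAN_DAYS = 1.0

def prob_wait_exceeds(days, mean=MEAN_DAYS):
    # P(waiting time > days) for an exponential distribution with this mean.
    return math.exp(-days / mean)

for days in (2, 3, 5, 7, 10):
    print(f"P(next share takes more than {days:2d} days) = "
          f"{prob_wait_exceeds(days):.3%}")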
|
|
|
|
Vanderbleek
|
|
February 14, 2012, 06:42:43 AM |
|
*snip*
So which is a superior solution?
In my opinion, a solution that automatically spins off smaller p2pools and balances the load between them. That said, simple web interfaces (where user logins are the payout address) are not a terrible idea either, but I don't think we need to build pools that connect to p2pool -- why not just connect those pools straight to the main chain?
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
February 14, 2012, 06:48:08 AM Last edit: February 14, 2012, 06:58:30 AM by DeathAndTaxes |
|
but I don't think we need to build pools that connect to p2pool -- why not just connect those pools straight to the main chain?
A mutually beneficial relationship? They strengthen p2pool with hashing power. The hashing power of p2pool gives the pool a fighting chance to grow and reduce the hegemony of the big three. Variance can be brutal on small pools (including p2pool). It is a win-win. I agree a dynamic protocol which adaptively splits and reforms p2pools to create multiple medium-sized pools based on demand is ideal, but who knows if/when that will be written. It likely can be done, but it is a non-trivial problem and won't be written in a weekend. p2pool will likely continue to grow rapidly, and while block variance will continue to decline, share variance will become a burden to small miners. Some may have an interest in supporting p2pool indirectly. If they don't, any "front end" pools will die and it is a non-issue. There is no magic one-size-fits-all solution. It will take a lot of partial solutions to make a dent in deepbit's control over the network.
|
|
|
|
Regex
Newbie
Offline
Activity: 23
Merit: 0
|
|
February 14, 2012, 06:54:42 AM |
|
*snip*
So which is a superior solution?
In my opinion, a solution that automatically spins off smaller p2pools and balances the load between them. That said, simple web interfaces (where user logins are the payout address) are not a terrible idea either, but I don't think we need to build pools that connect to p2pool -- why not just connect those pools straight to the main chain? I like that. :-)
|
|
|
|
cabin
|
|
February 14, 2012, 02:28:15 PM Last edit: February 14, 2012, 02:52:20 PM by cabin |
|
This seems like a good idea; share difficulty is already variable and pays out accordingly, so this is probably easy to do. I would prefer this over a complicated SMPPS scheme. That said, I'm thinking about several methods that would decrease P2Pool's difficulty, such as letting the big miners on P2Pool voluntarily raise their difficulty so that the smaller miners can use a reduced difficulty while maintaining 10 seconds per share.
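A small sketch of the constraint behind that idea, with made-up miner counts and hashrates: per-miner difficulties can differ as long as the pool as a whole still averages roughly one share every 10 seconds.

Code:
HASHES_PER_DIFF1 = 2 ** 32
TARGET_INTERVAL = 10.0  # pool-wide seconds per share

GH = 1e9
big_miners = [2.0 * GH] * 100      # hypothetical 2 GH/s miners
small_miners = [0.1 * GH] * 1000   # hypothetical 100 MH/s miners
total_hashrate = sum(big_miners) + sum(small_miners)

# Baseline: one difficulty for everyone keeps the pool at one share per 10 s.
uniform_diff = total_hashrate * TARGET_INTERVAL / HASHES_PER_DIFF1

# The idea above: big miners volunteer for double difficulty; solve for the
# small-miner difficulty that keeps the pool-wide share rate unchanged.
big_diff = 2 * uniform_diff
big_rate = sum(big_miners) / (big_diff * HASHES_PER_DIFF1)   # shares per second
small_rate_needed = 1.0 / TARGET_INTERVAL - big_rate
small_diff = sum(small_miners) / (small_rate_needed * HASHES_PER_DIFF1)

print(f"uniform difficulty       : {uniform_diff:7.0f}")
print(f"big miners, volunteered  : {big_diff:7.0f}")
print(f"small miners, resulting  : {small_diff:7.0f}")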
|
|
|
|
Gabi
Legendary
Offline
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
|
|
February 14, 2012, 02:35:22 PM |
|
*snip*
So which is a superior solution?
In my opinion, a solution that automatically spins off smaller p2pools and balances the load between them. That said, simple web interfaces (where user logins are the payout address) are not a terrible idea either, but I don't think we need to build pools that connect to p2pool -- why not just connect those pools straight to the main chain? Nice idea. +1
|
|
|
|
Rassah
Legendary
Offline
Activity: 1680
Merit: 1035
|
|
February 14, 2012, 04:45:58 PM |
|
The limitation of Bitcoin is that the block chain is only aware of the total hashing power, not individual miners, and thus can only adjust accordingly. The P2Pool protocol chain is short, and is easy to change, and each instance of P2Pool is aware of both the pool's hashing power and its own local hashing power. Would it be possible to just change the algorithm from adjusting difficulty to make a pool block every ten seconds based on overall pool hashing power, to one that bases it on the fraction of your hashing power compared to the overall pool? Have the difficulty start out at average, and as you mine, every thirty minutes recalculate your local difficulty based on reported hashing power, so that strong miners get increased difficulty and fewer shares and weak miners get more? Or is this too difficult due to all blocks in the chain needing to be the same, or risky due to being easily hacked?
|
|
|
|
ThiagoCMC
Legendary
Offline
Activity: 1204
Merit: 1000
฿itcoin: Currency of Resistance!
|
|
February 14, 2012, 05:35:25 PM |
|
When I find a block, is my reward bigger?!
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
February 14, 2012, 05:36:51 PM Last edit: February 14, 2012, 06:02:17 PM by DeathAndTaxes |
|
The limitation of Bitcoin is that the block chain is only aware of the total hashing power, not individual miners, and thus can only adjust accordingly. The P2Pool protocol chain is short, and is easy to change, and each instance of P2Pool is aware of both the pool's hashing power and its own local hashing power. Would it be possible to just change the algorithm from adjusting difficulty to make a pool block every ten seconds based on overall pool hashing power, to one that bases it on the fraction of your hashing power compared to the overall pool? Have the difficulty start out at average, and as you mine, every thirty minutes recalculate your local difficulty based on reported hashing power, so that strong miners get increased difficulty and fewer shares and weak miners get more? Or is this too difficult due to all blocks in the chain needing to be the same, or risky due to being easily hacked?
Currently shares are all the same because (your payout) = (your shares) / (total last n shares). While you could make shares variable in difficulty and make it (your payout) = (sum of your shares' difficulty) / (sum of last n shares' difficulty), it doesn't get around the orphan problem. Bitcoin rarely has orphaned blocks because the round time is ~600 seconds. The shorter the round time, the more likely two entities on the network find a solution at roughly the same time and one of them gets orphaned. P2pool compromises between share difficulty & orphan rate by using a 10 second round time. It sets difficulty so someone will find a share roughly every 10 seconds (and hopefully, most of the time, that "solution" can be shared with everyone else in time to avoid duplicated work). So to avoid a higher orphan rate you still need the average share time to be ~10 seconds. You could, within reason, allow smaller miners to use lower difficulty and larger miners to use higher difficulty, but the average must still work out to ~1 share per 10 seconds. So that solution has two problems: a) the amount share difficulty can vary is not much, and if most miners are small it is very little at all; b) larger miners would be accepting higher variance in order to give smaller miners lower variance. Something for nothing. Unlikely they will do that.

The way I see it there are four decentralized solutions: multiple p2pools, a merged share chain, dynamic p2pools, and sub-p2pools.

multiple p2pools
The simplest solution is to simply start a second p2pool. There is no reason only one has to exist. Take the source code, modify it so the alternate p2pool can be identified, and start one node. Nodes can join using the modified client. Eventually the client could be modified to let the user indicate which instance of the network to join, or even scan all instances and give the user the option. If the two pools get too large they could also be split. The disadvantage is that each split requires migration, and that requires people to look out for the good of the network. For example, 3 p2pools with 10 GH/s, 20 GH/s, and 2.87 TH/s isn't exactly viable.
--------------------------------------
merged share chain
In Bitcoin there can only be "one block" which links to the prior block. The reason is that it is used to prevent double spends. Double spends aren't as much of a problem in p2pool. Sure, one needs to ensure that workers don't get duplicate credit, but that can be solved without a static "only one" block-chain. Modifying the protocol to allow multiple branches at one level would seem to be possible. Since this would allow orphans to be counted (within reason) it would be possible to reduce the share time. For example, a 1 TH/s p2pool with a 2 second share time would have no higher difficulty than a 200 GH/s p2pool with a 10 second share time. There likely are "gotchas" which need to be resolved, but I believe a share chain which allows "merging" is possible.
--------------------------------------
dynamic p2pools
Building upon the idea of multiple p2pools, the protocol could be expanded so that a new node queries a p2pool information net and gets the status of existing p2pools. The network assigns new nodes where they best optimize the balance of the network. If the protocol enforces this pool assignment then there is no human gaming involved and the pools will be relatively balanced. As pools grow or shrink they can be split or combined with other pools by the protocol. Some simulation would be needed to find the optimal balance between share variance and block variance. The network could even intentionally allow variance in pool size and share time: larger pools with high difficulty and long share time to accommodate very large miners, and smaller pools with lower difficulty to provide an optimal solution for smaller individual miners.
--------------------------------------
sub-p2pools
Imagine the p2pool forming a "backbone"; for max efficiency the share time would be longer. Say 1 share per 60 seconds instead of 10 (difficulty goes up by a factor of 6). At 1 TH/s that is ~12,000 difficulty (which is high, but not as high as the block difficulty of 1.3 million). Due to the 12K+ difficulty the only participants on this backbone are a) major hashing farms, b) conventional pools, and c) sub-p2pools. You notice I said conventional pools. Conventional pools which submit valid shares to p2pool are less of a security risk to Bitcoin than opaque proprietary pools. Smaller miners who want a fully decentralized solution could form/join "sub-p2pools". These pools could be tuned for different speed miners to provide an optimal balance between block difficulty and share difficulty. They would maintain a sub-p2pool level share chain and use it to set the reward breakout for the subpool. When one node in the subpool solves a "master p2pool" difficulty share (12K in the above example) it submits it to the main pool (which updates the ultimate reward split to include the subpool's current split for that share). Subpools could be created manually (a "Rassah small miner subpool"), or eventually dynamically by a protocol similar to the dynamic solution above. This requires a modest change to the existing p2pool (which would form the backbone): currently 1 share can only be assigned to 1 address; to make sub-p2pools possible it would need to be possible to include an address list and distribution % for 1 share.
--------------------------------------
Note: these ideas aren't fleshed out. Likely someone can point out issues and areas where the explanation is incomplete. They are designed more as a thought exercise to look at potential avenues for expanding p2pool to handle, potentially, someday, 51% of network hashing power (at which point an internal 51% attack becomes impossible). Obviously these are complex ideas which will take time to implement. I believe that "front ends" are preferable to small miners going back to deepbit and could act as a bridge to transition p2pool from 250 GH/s to 1 TH/s+ while more decentralized solutions are developed.
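As a small illustration of the variable-difficulty payout rule mentioned at the top of the post, here is a sketch with made-up addresses and share difficulties (not p2pool's actual code):

Code:
from collections import defaultdict

def split_reward(recent_shares, block_reward):
    """(your payout) = (sum of your shares' difficulty) / (sum of last n shares' difficulty).

    recent_shares: list of (payout_address, share_difficulty) for the last n shares.
    """
    total_diff = sum(diff for _, diff in recent_shares)
    weights = defaultdict(float)
    for addr, diff in recent_shares:
        weights[addr] += diff
    return {addr: block_reward * w / total_diff for addr, w in weights.items()}

# Made-up share chain: a big miner on 1120-difficulty shares, a small one on 140.
shares = [("addr_big", 1120), ("addr_small", 140),
          ("addr_big", 1120), ("addr_small", 140)]
print(split_reward(shares, block_reward=50.0))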
|
|
|
|
miscreanity
Legendary
Offline
Activity: 1316
Merit: 1005
|
|
February 14, 2012, 05:47:24 PM |
|
The way I see it there are three decentralized solutions: multiple p2pools, dynamic p2pools, sub-p2pools
Excellent explanation. The way it looks is that multiple & sub-pools would be a natural result of the dynamic approach, so long as it incorporates the ability to communicate laterally, vertically, and internally. Until then, you're right - manually bootstrapping sub-pools does offer the best path for small miners. This is very similar to multicellular development in biology.
|
|
|
|
Ente
Legendary
Offline
Activity: 2126
Merit: 1001
|
|
February 14, 2012, 05:48:48 PM |
|
I am all for sub-p2pools. Preferably, miners don't have to (and can't) choose exactly where they mine, but are automatically assigned to the right sub-pool according to their own hashing power and the overall sub-pool situation.
I just had another thought: to stay as a single p2pool, how about every payout address having its own, adjusting difficulty? You would just have to broadcast a lot of data/chains. But since you can compare/convert each miner's difficulty to the total p2pool hashing power, payout should be easy to calculate? Just a thought..
Ente
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
February 14, 2012, 06:06:09 PM |
|
I just had another thought: to stay as a single p2pool, how about every payout address having its own, adjusting difficulty? You would just have to broadcast a lot of data/chains. But since you can compare/convert each miner's difficulty to the total p2pool hashing power, payout should be easy to calculate? Just a thought.. That was the thought above which prompted my "essay". The largest constraint is that as the average share time decreases, the orphan rate increases. Eventually you reach a point where the share time is so small that the orphan rate becomes astronomical. For example, right now share difficulty is ~560. To have a share difficulty of ~280 would require a 5 second average share time, which means a significant increase in the pool's orphan rate. The "concepts" I outlined above (and no, I didn't come up with them) are methods to "compartmentalize" the network so that you can have more shares per second without higher orphan rates. All 4 "solutions" essentially do the same thing. They allow greater than 6 shares per minute without an increase in orphan rate. That is ultimately the problem to solve.
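A toy model of the share-time versus orphan-rate trade-off, assuming a new share takes on the order of one second to propagate to the rest of the pool (the propagation figure is an assumption, not a measurement):

Code:
import math

def approx_orphan_rate(share_interval, propagation=1.0):
    """Chance another share appears during the ~`propagation` seconds it takes
    a new share to reach the rest of the pool (Poisson arrival assumption)."""
    return 1.0 - math.exp(-propagation / share_interval)

for interval in (10, 5, 2, 1):
    print(f"{interval:2d} s per share -> roughly "
          f"{approx_orphan_rate(interval):.0%} orphaned")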
|
|
|
|
Rassah
Legendary
Offline
Activity: 1680
Merit: 1035
|
|
February 14, 2012, 06:28:20 PM |
|
I was actually wondering whether it is possible for each P2Pool instance to adjust its own difficulty, so that both weak miners and strong miners take 10 seconds to solve a share. Essentially, have each P2Pool server dynamically adjust the difficulty for its own miners only, as opposed to having one difficulty for the overall pool. The payout would then be (total number of shares submitted) * (hash rate). Since the total number of shares submitted will be, on average, the same for both large miners and small ones (the difference only depending on the amount of time they were running), the payout will essentially depend only on the hash speed. This will require the chain to carry more data, but my main concern is that hackers could spoof their mining power to be way lower than it actually is, and then steal from everyone else by submitting way more than one share every 10 seconds. I realize this sort of defeats the purpose of a "verifiable chain," too...
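A sketch of that payout rule as I read it, with hypothetical addresses and hashrates; because every node would average one share per 10 seconds, the weighting reduces to work done, and the open question is exactly that the reported hashrate is unverified.

Code:
GH = 1e9
HOURS = 3600.0

# address: (reported hashrate in H/s, seconds spent mining); made-up values
miners = {
    "addr_big":   (2.0 * GH, 24 * HOURS),
    "addr_small": (0.1 * GH, 24 * HOURS),
}

def payout_fractions(miners, share_interval=10.0):
    weights = {}
    for addr, (hashrate, seconds) in miners.items():
        shares = seconds / share_interval     # same share rate for everyone
        weights[addr] = shares * hashrate     # proposed weighting: shares * hashrate
    total = sum(weights.values())
    return {addr: w / total for addr, w in weights.items()}

print(payout_fractions(miners))
# The concern raised in the post: the scheme relies on self-reported hashrate.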
|
|
|
|
|