The limitation of Bitcoin is that the block chain is only aware of the total hashing power, not individual miners, and thus can only adjust difficulty accordingly. The P2Pool share chain is short and easy to change, and each instance of P2Pool is aware of both the pool's hashing power and its own local hashing power.
Would it be possible to change the algorithm from one that adjusts difficulty so the pool finds a share every ten seconds based on overall pool hashing power, to one that bases each miner's difficulty on the fraction of the overall pool their hashing power represents? Difficulty would start out at the average, and as you mine, your local difficulty would be recalculated every thirty minutes based on reported hashing power, so that strong miners get higher difficulty and fewer shares while weak miners get more.
Or is this too difficult because all shares in the chain need to be the same, or too risky because it could be easily gamed?
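The retargeting rule described in the question could be sketched roughly like this (a toy Python sketch; the linear scaling rule, the function name, and the example numbers are my assumptions, not anything from real P2Pool code):

```python
# Toy sketch of the proposed per-miner difficulty retargeting.
# Assumption (hypothetical, not real P2Pool behavior): difficulty scales
# linearly with a miner's share of the pool hashrate, recalculated
# periodically (every 30 minutes in the proposal) from reported hashrates.

def retarget_local_difficulty(pool_difficulty, pool_hashrate, local_hashrate, num_miners):
    """Scale share difficulty so each miner finds shares at roughly the
    same rate: strong miners get fewer, harder shares and vice versa."""
    average_hashrate = pool_hashrate / num_miners
    # A miner at exactly the pool average keeps the base difficulty;
    # a miner with 3x the average gets 3x the difficulty, etc.
    return pool_difficulty * (local_hashrate / average_hashrate)

# Example: a 100 GH/s pool of 10 miners with base share difficulty 1000.
# A 30 GH/s miner (3x the 10 GH/s average) gets difficulty 3000.
print(retarget_local_difficulty(1000, 100e9, 30e9, 10))  # 3000.0
```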
Currently shares are all the same difficulty because (your payout) = (your shares) / (total last n shares).
While you could make shares variable in difficulty and make it (your payout) = (sum of your shares' difficulty) / (sum of last n shares' difficulty), that doesn't get around the orphan problem.
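The two payout rules can be written out explicitly (a hypothetical sketch; `shares` here is just a list of (miner, difficulty) pairs, not a real P2Pool data structure):

```python
# Constant-difficulty payout: every share counts equally.
def payout_equal(shares, miner):
    return sum(1 for m, _ in shares if m == miner) / len(shares)

# Variable-difficulty payout: each share is weighted by its difficulty,
# so one difficulty-2 share pays the same as two difficulty-1 shares.
def payout_weighted(shares, miner):
    total = sum(d for _, d in shares)
    return sum(d for m, d in shares if m == miner) / total

shares = [("A", 1), ("A", 1), ("B", 2)]
print(payout_equal(shares, "A"))     # 2/3: two of three shares
print(payout_weighted(shares, "A"))  # 0.5: difficulty 2 out of 4 total
```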
Bitcoin rarely has orphaned blocks because the round time is ~600 seconds. The shorter the round time, the more likely two entities on the network find a solution at roughly the same time and one of them gets orphaned. P2Pool compromises between share difficulty and orphan rate by using a 10-second round time: it sets difficulty so someone will find a share roughly every 10 seconds, and hopefully most of the time that "solution" can be relayed to everyone else in time to avoid duplicated work.
So to avoid a higher orphan rate you still need the average share time to be ~10 seconds. You could, within reason, allow smaller miners to use lower difficulty and larger miners to have higher difficulty, but the average must still work out to ~1 share per 10 seconds.
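The constraint can be made concrete using the common convention that a difficulty-1 share corresponds to ~2^32 expected hashes (an assumption about the difficulty scale, not stated in the post): the pool-wide share interval is fixed by the ratio of average difficulty to pool hashrate, so any lowering of small miners' difficulty must be offset by raising large miners'.

```python
# Share difficulty vs. share interval, assuming the common convention
# that a difficulty-1 share takes ~2**32 hashes on average.

def required_avg_difficulty(pool_hashrate, target_interval):
    """Average share difficulty needed for one share per target_interval."""
    return pool_hashrate * target_interval / 2**32

def expected_share_interval(avg_difficulty, pool_hashrate):
    """Expected seconds between shares at a given average difficulty."""
    return avg_difficulty * 2**32 / pool_hashrate

# A 200 GH/s pool targeting 10-second shares needs average difficulty ~466.
d = required_avg_difficulty(200e9, 10)
print(round(d))                           # ~466
# Whatever per-miner spread is allowed, the average must preserve this:
print(expected_share_interval(d, 200e9))  # 10.0 seconds
```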
So that solution has two problems:
a) the amount share difficulty can vary is not much, and if most miners are small it is very little at all.
b) larger miners would be accepting higher variance in order to give smaller miners lower variance. Something for nothing. Unlikely they will do that.

The way I see it there are four decentralized solutions: multiple p2pools, a merged share chain, dynamic p2pools, and sub-p2pools.

--------------------------------------multiple p2pools
The simplest solution is to simply start a second p2pool. There is no reason only one has to exist. Take the source code, modify it so the alternate p2pool can be identified, and start one node. Nodes can join using the modified client. Eventually the client could be modified to let the user indicate which instance of the network to join, or even scan all instances and give the user the option. If the two pools get too large they could also be split. The disadvantage is that each split requires migration, and that requires people to look out for the good of the network. For example, 3 p2pools with 10 GH/s, 20 GH/s, and 2.87 TH/s isn't exactly viable.
--------------------------------------merged share chain
In Bitcoin there can only be "one block" which links to the prior block; this is used to prevent double spends. Double spends aren't as much of a problem in p2pool. Sure, one needs to ensure that workers don't get duplicate credit, but that can be solved without a strict "only one" block chain. Modifying the protocol to allow multiple branches at one level would seem to be possible. Since this would allow orphans to be counted (within reason), it would be possible to reduce the share time. For example, a 1 TH/s p2pool with a 2-second share time would have no higher difficulty than a 200 GH/s p2pool with a 10-second share time. There likely are "gotchas" which need to be resolved, but I believe a sharechain which allows "merging" is possible.
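A "merging" sharechain is essentially a DAG rather than a single chain: each share can reference several recent tips, and stale-but-recent branches still count toward payouts. A minimal toy structure (all names and the depth-limited walk are hypothetical, with no relation to the real P2Pool protocol):

```python
# Toy DAG sharechain: a share may reference several parent shares, so
# near-simultaneous shares are merged instead of one being orphaned.

class Share:
    def __init__(self, miner, parents):
        self.miner = miner
        self.parents = parents  # list of Share objects, not a single parent

def counted_shares(tip, max_depth):
    """Walk back from the tip, crediting every share reachable within
    max_depth levels -- including merged side branches."""
    seen, frontier = set(), [(tip, 0)]
    while frontier:
        share, depth = frontier.pop()
        if share in seen or depth > max_depth:
            continue
        seen.add(share)
        frontier.extend((p, depth + 1) for p in share.parents)
    return seen

# Two shares found at nearly the same time both become parents of the
# next share, so neither miner's work is discarded.
genesis = Share("G", [])
a = Share("A", [genesis])
b = Share("B", [genesis])    # would be an orphan in a linear chain
tip = Share("C", [a, b])     # merges both branches
print(len(counted_shares(tip, 10)))  # 4 shares credited
```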
--------------------------------------dynamic p2pools
Building upon the idea of multiple p2pools, the protocol could be expanded so that a new node queries a p2pool information net and gets the status of existing p2pools. The network assigns new nodes where they best optimize the balance of the network. If the protocol enforces this pool assignment then there is no human gaming involved and the pools will stay relatively balanced. As pools grow or shrink they can be split or combined with other pools by the protocol. Some simulation would be needed to find the optimal balance between share variance and block variance. The network could even intentionally allow variance in pool size and share time: larger pools with high difficulty and long share times to accommodate very large miners, and smaller pools with lower difficulty to provide the optimal solution for smaller individual miners.
--------------------------------------sub-p2pools
Imagine the p2pool forming a "backbone". For max efficiency the share time would be longer: say 1 share per 60 seconds instead of 10 (difficulty goes up by a factor of 6). At 1 TH/s that is ~12,000 difficulty (which is high, but not as high as the block difficulty of 1.3 million). Due to the 12K+ difficulty, the only participants on this backbone would be a) major hashing farms, b) conventional pools, and c) sub-p2pools.
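The backbone arithmetic can be checked directly, again assuming the common convention that difficulty 1 ≈ 2^32 hashes (the exact figure depends on which difficulty convention is used, which is why it lands near but not exactly on the ~12,000 estimate above):

```python
# Share difficulty implied by a hashrate and share interval, assuming
# the common convention that difficulty 1 ~= 2**32 hashes.
def share_difficulty(hashrate, interval_seconds):
    return hashrate * interval_seconds / 2**32

# 1 TH/s backbone with 60-second shares: on the order of the ~12K
# figure quoted above (the exact value here is ~13,970).
print(round(share_difficulty(1e12, 60)))  # 13970
# Same pool at the normal 10-second share time: 6x lower difficulty.
print(round(share_difficulty(1e12, 10)))  # 2328
```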
You'll notice I said conventional pools. Conventional pools which submit valid shares to p2pool are less of a security risk to Bitcoin than opaque proprietary pools.
Smaller miners who want a fully decentralized solution could form/join "sub-p2pools". These pools could be tuned for different speeds of miners to provide an optimal balance between block difficulty and share difficulty. They would maintain a sub-p2pool-level share chain and use that to set the reward breakout for the subpool. When one node in the subpool solves a "master p2pool" difficulty share (12K in the above example), it submits it to the main pool, which updates the ultimate reward split to include the subpool's current split for that share. Subpools could be created manually (a "Rassah small miner subpool"), or eventually dynamically by a protocol similar to the second solution. This requires a modest change to the existing p2pool (which would form the backbone): currently 1 share can only be assigned to 1 address; to make sub-p2pools possible, it would need to be possible to include an address list and distribution % for 1 share.
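That last change, letting one backbone share carry an address list with distribution percentages instead of a single address, might look like this in outline (hypothetical structures and names, not the real share format):

```python
# Toy model of a backbone share whose reward is split across the
# addresses of a sub-p2pool, per that subpool's internal sharechain.

def split_share_reward(reward, distribution):
    """distribution maps address -> fraction; fractions must sum to 1."""
    assert abs(sum(distribution.values()) - 1.0) < 1e-9
    return {addr: reward * frac for addr, frac in distribution.items()}

# A subpool solves a 12K-difficulty backbone share; its three members
# held 50%/30%/20% of the subpool's recent shares, so the share's
# reward credit is split accordingly instead of going to one address.
subpool_split = {"addr1": 0.5, "addr2": 0.3, "addr3": 0.2}
print(split_share_reward(50.0, subpool_split))
# {'addr1': 25.0, 'addr2': 15.0, 'addr3': 10.0}
```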
Note: these ideas aren't fleshed out. Likely someone can point out issues and areas where the explanation is incomplete. They are designed more as a thought exercise to look at potential avenues for expanding p2pool to someday handle potentially 51% of network hashing power (at which point an internal 51% attack becomes impossible). Obviously these are complex ideas which will take time to implement. I believe that "front ends" are preferable to small miners going back to deepbit, and could act as a bridge to transition p2pool from 250 GH/s to 1 TH/s+ while more decentralized solutions are developed.
forrestv, are you considering acting on any of these ideas? What are your current thoughts on this?