IYFTech
|
|
June 19, 2014, 11:06:09 PM |
|
The motivation is getting p2pool to be #1 & decentralising the network.
The reward is donations from users for development. No development=no donations & a tiny pool that struggles to find users.
Edit: I've been using a custom bitcoind data location for nearly 2 years. It's not new.
|
|
|
|
norgan
|
|
June 19, 2014, 11:20:19 PM |
|
The motivation is getting p2pool to be #1 & decentralising the network.
The reward is donations from users for development. No development=no donations & a tiny pool that struggles to find users.
Edit: I've been using a custom bitcoind data location for nearly 2 years. It's not new.
I wonder how many nodes (if any) actually donate. Everyone seems to assume that forrestv is making a motza on donations, but I fear it may be grossly overestimated. Let's say, hypothetically, that no nodes (or a couple of very small ones) are donating, and let's assume this has been the way for some time now; I mean, the setup guides all talk about being able to turn off donations. Now let's pretend we are forrestv: we are taking no donations, and every time we visit this thread someone is complaining about no dev going on. I wonder how many people, in that situation, would feel like doing any dev work for p2pool.

What I am interested in is identifying the issues. Let's list them out and try to point out what part of the code, or at least what part of the logic, each issue is in, like a list of requirements for p2pool v2. Let's then try to get those requirements out, say by logging issues on GitHub, and then share them around as an active call for devs.

Now, let's try to formulate some kind of incentive for devs to do this. How can we attract devs to p2pool to do some work for us? Maybe we run a donation drive with some kind of escrow that acts as a bounty for devs to complete the listed requests. Can we start doing that, and actually be constructive? Let's make a list: what is the most pressing bug/issue/missing feature in p2pool?
|
|
|
|
PatMan
|
|
June 19, 2014, 11:21:59 PM |
|
Question: What would you rather have?
a) The ability to mine BTC on p2pool with whatever hardware you have/choose without issue?
b) The ability to mine another coin that nobody really uses with a very limited selection of mining equipment with issues?
I'll take a.
|
|
|
|
norgan
|
|
June 19, 2014, 11:27:22 PM |
|
https://github.com/forrestv/p2pool - you can follow dev going on here; there is a mailing list and a place to submit issues. The code was updated 3 months ago. Again, I'm calling for p2pool node users, owners and others to think this through logically and identify what exactly needs doing. Then let's take control and reach out to find someone to help us with those requirements. Let's work out a way to make it attractive for a dev to do some work on p2pool.
|
|
|
|
IYFTech
|
|
June 19, 2014, 11:53:43 PM |
|
I wonder how many nodes (if any) actually donate.
If any..... Well, for a start, everyone who followed murdof's excellent guide (https://bitcointalk.org/index.php?topic=651819.0) is unknowingly/knowingly donating to invisible-dev, because there is unfortunately no mention of how to switch it off. That's going back a week. Now, work out how many weeks p2pool has been active and that will give you a fairly good idea of how many users are unknowingly/knowingly donating to invisible-dev.

As for the rest of your statement: the reason you have to use words like "hypothetically", "assume", "pretend" and "I wonder" is because you don't know. The reason you don't know is that invisible-dev does not communicate with anyone whatsoever; nobody knows. That's why I prefer to go with facts. I go by what I see, and I haven't seen anything major happen with p2pool for over a year. That makes it stone age in software terms, and it's why hardly any of the newer hardware coming out will work with it. That is what needs putting right, not adding another insignificant alt-coin and disguising it as "development" in an effort to bump up dwindling donations from unsuspecting users.
|
|
|
|
mdude77
Legendary
Offline
Activity: 1540
Merit: 1001
|
|
June 19, 2014, 11:56:07 PM |
|
The author donate address is in the code, isn't it? Last I looked it had received about 36 BTC since its inception.
I may have the wrong address of course.
M
|
I mine at Kano's Pool because it pays the best and is completely transparent! Come join me!
|
|
|
PatMan
|
|
June 19, 2014, 11:57:25 PM |
|
Thank you for pointing out the blindingly obvious to a noob like me
|
|
|
|
PatMan
|
|
June 19, 2014, 11:58:56 PM |
|
The author donate address is in the code, isn't it?
Yup
|
|
|
|
norgan
|
|
June 20, 2014, 12:03:04 AM |
|
Thank you for pointing out the blindingly obvious to a noob like me
Some people don't know this; it wasn't targeted at you.

IYFTech: yes, I don't know, and neither do you! So, great, we agree: let's stick to facts. Fact is, we need some dev to push p2pool forward. What do we need, and how can we best go about it? I think we can safely give up on forrestv.

mdude: thanks, that's a nice little purse pocket.

All: I think we should stop bickering among ourselves and complaining about forrestv, and make something happen. I'm trying to motivate you all to get a list of clear requirements together and find a way of getting it done. We need to go out and ask for help.

P.S. the guide I followed had a clear mention of the --give-author 0 option. I've had it configured since day one, and I was very new to p2pool when I found it.
|
|
|
|
jonnybravo0311
Legendary
Offline
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
|
|
June 20, 2014, 12:26:55 AM |
|
The two biggest problems I see with p2pool are as follows:
1) Vertical scalability
2) Hardware support
By definition, p2pool does a fantastic job at scaling horizontally. Just take a look at the latest guide and you can have p2pool, complete with merged-mining up and running with two commands. Even if you don't follow that guide, and build everything from scratch, the amount of work to get a node up and running is pretty minimal. Unfortunately, the failure is in the pool's ability to scale vertically. As most pools grow in size, the variance the miners experience levels out. Look at pools like GHash and you'll see that your payouts are pretty spot-on accurate with the "expected" payout calculators. With p2pool, exactly the opposite happens. The more the pool grows, the more variance the individual miner will experience. Let's face it, as it stands now, you need to have at least 100GH/s to have an expectation of finding 1 share a day. All of those people we are trying to convince to join the p2pool experience need to realize that the entry fee at this point is pretty much an Antminer S1.
The reason it fails so miserably at scaling vertically is the design of the pool's payout mechanism. Essentially, as a miner on p2pool you are solving a block (we call it finding a share). The structure of this block chain is very similar to the BTC block chain. The primary difference is generation time: the share chain is expected to find a "block" every 30 seconds, whereas Bitcoin is designed to find a block every 10 minutes. Just like what happens to BTC difficulty every 2016 blocks, the same happens to the share chain (except more rapidly) to keep the average "block" time at that 30-second mark. If we suddenly increase the hashing power to 1PH/s, 10PH/s, etc., the difficulty to find a share will increase correspondingly. Then the entry price of mining on p2pool will no longer be 100GH/s; it will be 1TH/s, 10TH/s, etc.
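Taking the numbers in this post at face value (a fixed 30-second share interval, roughly one share per day at 100 GH/s), the vertical-scaling problem can be sketched with simple arithmetic; the pool sizes below are illustrative, not measured figures:

```python
# Illustrative sketch: share difficulty grows with pool hash rate because
# the share chain targets a fixed 30-second share interval pool-wide.
SHARE_INTERVAL = 30          # seconds between shares across the whole pool
SECONDS_PER_DAY = 86_400

def hashrate_for_one_share_per_day(pool_hashrate_ghs: float) -> float:
    """Return the miner hash rate (GH/s) needed to expect ~1 share/day.

    The pool as a whole finds 86400 / 30 = 2880 shares per day, so a miner
    needs 1/2880 of the pool's total hash rate to expect one share per day.
    """
    shares_per_day = SECONDS_PER_DAY / SHARE_INTERVAL   # 2880
    return pool_hashrate_ghs / shares_per_day

# A ~288 TH/s pool implies ~100 GH/s for one share/day (the post's figure);
# at 1 PH/s the bar rises to ~347 GH/s, at 10 PH/s to ~3.5 TH/s.
for pool_ghs in (288_000, 1_000_000, 10_000_000):
    print(pool_ghs, round(hashrate_for_one_share_per_day(pool_ghs), 1))
```

This is the whole issue in one line of algebra: entry-level hash rate scales linearly with total pool size, unlike a centralized pool where difficulty per miner can stay low.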
One possible way to address this problem is to introduce a concept of tiered nodes. Just like the share chain is now kept to store miners/payouts/etc, we would introduce sub-chains. Each of these sub-chains would have lower difficulty. As miners submit shares, those shares would be evaluated against difficulty for that share chain, the one above it, and so on, right up to the BTC block chain. Implementation would be tricky because miners would have to be restricted and directed properly to an appropriate share chain. You wouldn't want a miner with 10TH/s hitting the share chain whose difficulty was set to handle miners with an average of 100GH/s. Either the pool would have to dynamically adjust and put the miner on the appropriate chain based upon the miner's calculated hash rate, or it would have to outright reject the miner because of the calculated hash rate. This is just one possible solution, and I need to think on it further to discover pros and cons of the approach, and if it would be feasible to introduce. Comments and suggestions are most certainly welcomed.
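A minimal sketch of the tier-assignment half of this idea, assuming hypothetical tier names and hash-rate boundaries (nothing here comes from the p2pool code; it only illustrates the "direct the miner to an appropriate sub-chain" step):

```python
# Hypothetical sub-chain tiers: each serves a band of miner hash rates
# (in GH/s). Boundaries are made up for illustration.
TIERS = [
    ("low",  0,     500),             # small miners, low-difficulty chain
    ("mid",  500,   5_000),
    ("high", 5_000, float("inf")),    # big miners, high-difficulty chain
]

def assign_tier(miner_ghs: float) -> str:
    """Pick the sub-chain whose difficulty band matches the miner."""
    for name, lo, hi in TIERS:
        if lo <= miner_ghs < hi:
            return name
    raise ValueError("hash rate must be non-negative")

print(assign_tier(100))     # 100 GH/s miner lands on the low tier
print(assign_tier(10_000))  # 10 TH/s miner lands on the high tier
```

The hard parts the post identifies (evaluating a share against its own chain and every chain above it, and handling miners whose measured rate drifts across a boundary) are deliberately left out of this sketch.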
As for hardware support, we are seeing a number of the newer manufacturers not building drivers into their hardware for p2pool mining. A great example is the Antminer S2. Point your S2 at a p2pool node and you'll be lucky to see 850-900GH/s. Point that same S2 to Eligius, BTCGuild, GHash.io and you'll get the advertised 1TH/s. Kano (one of the developers of cgminer) has been trying to build out a replacement cgminer for the S2... thus far unsuccessfully. The last post from him was that when he pointed the latest version of his firmware to p2pool it exploded with all kinds of errors, both on the miner and on the p2pool node. We need a developer who can work closely with ckolivas/kano/Luke-jr to ensure that p2pool is compatible with the new hardware releases. As a hardware manufacturer, what is going to be my best bet? Developing firmware for a pool that makes up a tiny portion of the network, or ensuring my hardware works well with pools that represent 80% of the network's total hash rate? Those aren't trick questions... it's easy. As a manufacturer, you work with the 80.
OK... that's my 2 satoshi.
|
Jonny's Pool - Mine with us and help us grow! Support a pool that supports Bitcoin, not a hardware manufacturer's pockets! No SPV cheats. No empty blocks.
|
|
|
PatMan
|
|
June 20, 2014, 12:32:53 AM |
|
The problem is finding a dev who is familiar with Python, Bitcoin and the p2pool idea who doesn't want a large payout for doing the work. Trust me, I tried it a year or so ago here: https://bitcointalk.org/index.php?topic=213051.0 - it ain't easy. I actually found a dev who was willing to do it, but at a price, which I didn't have. I also had doubts about his ability and reasons for doing it. The only difference between now and then is that now it seems people are in agreement. Back then, when I dared to disagree and voice my concerns about the lack of development (over a year ago!), forrestv was considered some kind of demi-god, an untouchable; he could do no wrong. He also used to post every now and again back then. I was scorned for daring to say such blasphemous things. Now look at us. Funny how things change, eh?
|
|
|
|
norgan
|
|
June 20, 2014, 12:44:00 AM |
|
The last post from him was that when he pointed the latest version of his firmware to p2pool it exploded with all kinds of errors, both on the miner and on the p2pool node. We need a developer who can work closely with ckolivas/kano/Luke-jr to ensure that p2pool is compatible with the new hardware releases.
Thanks, nice post, good constructive feedback. I read that Kano now has a working driver for the S2, but I think he has yet to solve the p2pool issue. I'm sure he will. He is one of our greatest allies with regard to hardware support via cgminer. I hope we can convince more people like him and/or hardware devs to build in p2pool support.
|
|
|
|
Hunterbunter
|
|
June 20, 2014, 12:51:53 AM |
|
One possible way to address this problem is to introduce a concept of tiered nodes. Just like the share chain is now kept to store miners/payouts/etc, we would introduce sub-chains. Each of these sub-chains would have lower difficulty. As miners submit shares, those shares would be evaluated against difficulty for that share chain, the one above it, and so on, right up to the BTC block chain. Implementation would be tricky because miners would have to be restricted and directed properly to an appropriate share chain. You wouldn't want a miner with 10TH/s hitting the share chain whose difficulty was set to handle miners with an average of 100GH/s. Either the pool would have to dynamically adjust and put the miner on the appropriate chain based upon the miner's calculated hash rate, or it would have to outright reject the miner because of the calculated hash rate. This is just one possible solution, and I need to think on it further to discover pros and cons of the approach, and if it would be feasible to introduce. Comments and suggestions are most certainly welcomed.
Just a thought, why aren't payouts split by straight valid hashpower between blocks? Is it too easily fudged? Too naive? P2Pool already logs hash area, and I've been using that to dish out merged coin rewards, and while there might be a problem with that, I don't know what it is.
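For what it's worth, the split-by-hashpower idea can be sketched in a few lines; the miner names and numbers are made up, and the comment flags the obvious trust question ("is it too easily fudged?") that any real implementation would have to answer:

```python
# Sketch of splitting a block reward by reported per-miner hash rate.
# The catch: on a decentralized pool the rate must be *proven* (e.g. by
# submitted shares), not self-reported, or miners could inflate it.
def split_reward(reward: float, hashrates: dict) -> dict:
    """Divide `reward` among miners in proportion to their hash rates."""
    total = sum(hashrates.values())
    return {addr: reward * rate / total for addr, rate in hashrates.items()}

payouts = split_reward(25.0, {"miner_a": 100.0, "miner_b": 300.0})
print(payouts)  # miner_a gets 6.25, miner_b gets 18.75
```

The arithmetic is trivial; the open question is where trustworthy hash-rate numbers come from, which is exactly what the share chain provides today.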
|
|
|
|
jdot007
Member
Offline
Activity: 73
Merit: 10
|
|
June 20, 2014, 01:03:46 AM |
|
@p2p improvement talk
So it sounds like, after a prioritized list is made, we need donations and a bounty system via a reputable escrow to get some of these changes implemented. Basically, a p2p consortium.
|
|
|
|
norgan
|
|
June 20, 2014, 01:04:36 AM |
|
The problem is finding a dev who is familiar with python, bitcoin & the p2pool idea who doesn't want a large payout for doing the work. Trust me, I tried it a year or so ago here: https://bitcointalk.org/index.php?topic=213051.0 - it ain't easy. I actually found a dev who was willing to do it - but at a price, which I didn't have. I also had doubts about his ability & reasons for doing it. The only difference between now & then is that now it seems that people are in agreement - back then when I dared to disagree & voice my concerns about the lack of development (over a year ago!) forrestv was considered some kind of demi-god, an untouchable, he could do no wrong. He also used to post every now and again back then......I was scorned - how dare I say such blasphemous things..... Now look at us. Funny how things change eh?

Nice work PatMan. I admit I'm relatively new to all this, but I like the concept and how it all works. Good to see someone trying to get some work going. You come across as quite jaded, and seeing this I can understand some of that. You probably see me as quite naïve, lol. I guess I'm just an optimist and hope that we can drum up some interest in this. My C# skills are pretty limited; if I knew more, I'd be happy to put in some work on it.
|
|
|
|
mdude77
Legendary
Offline
Activity: 1540
Merit: 1001
|
|
June 20, 2014, 01:11:00 AM |
|
One possible way to address this problem is to introduce a concept of tiered nodes. Just like the share chain is now kept to store miners/payouts/etc, we would introduce sub-chains. Each of these sub-chains would have lower difficulty. As miners submit shares, those shares would be evaluated against difficulty for that share chain, the one above it, and so on, right up to the BTC block chain. Implementation would be tricky because miners would have to be restricted and directed properly to an appropriate share chain. You wouldn't want a miner with 10TH/s hitting the share chain whose difficulty was set to handle miners with an average of 100GH/s. Either the pool would have to dynamically adjust and put the miner on the appropriate chain based upon the miner's calculated hash rate, or it would have to outright reject the miner because of the calculated hash rate. This is just one possible solution, and I need to think on it further to discover pros and cons of the approach, and if it would be feasible to introduce. Comments and suggestions are most certainly welcomed.
Just a thought, why aren't payouts split by straight valid hashpower between blocks? Is it too easily fudged? Too naive? P2Pool already logs hash area, and I've been using that to dish out merged coin rewards, and while there might be a problem with that, I don't know what it is. I think the fundamental reason p2pool is appealing is it just works. You don't have to trust a pool operator. It takes care of constructing the potential blocks so that when one is found, each miner receives his fair share of the block. You can't hack your p2pool node so that you get the block proceeds. I don't think your method would still work with the TNO decentralized process p2pool has now. Also, a third reason why p2pool isn't appealing, is, as someone said earlier, the whole merged mining issue. It just doesn't work. Each node operator can keep the proceeds for himself, or he can manually divvy it out. We need something more automated and TNO. M
|
I mine at Kano's Pool because it pays the best and is completely transparent! Come join me!
|
|
|
norgan
|
|
June 20, 2014, 01:41:38 AM |
|
Is something like a Kickstarter or Pozible project suitable for this? The 51% attack possibility has hit pretty major news outlets, so there would be some public interest. I'm happy to set something up, but again we need a way of distributing the funds to whoever contributes. Open to ideas; does anyone know of anything like that? It would be really nice if it accepted Bitcoin as well as fiat currency.
|
|
|
|
Hunterbunter
|
|
June 20, 2014, 01:43:25 AM |
|
One possible way to address this problem is to introduce a concept of tiered nodes. Just like the share chain is now kept to store miners/payouts/etc, we would introduce sub-chains. Each of these sub-chains would have lower difficulty. As miners submit shares, those shares would be evaluated against difficulty for that share chain, the one above it, and so on, right up to the BTC block chain. Implementation would be tricky because miners would have to be restricted and directed properly to an appropriate share chain. You wouldn't want a miner with 10TH/s hitting the share chain whose difficulty was set to handle miners with an average of 100GH/s. Either the pool would have to dynamically adjust and put the miner on the appropriate chain based upon the miner's calculated hash rate, or it would have to outright reject the miner because of the calculated hash rate. This is just one possible solution, and I need to think on it further to discover pros and cons of the approach, and if it would be feasible to introduce. Comments and suggestions are most certainly welcomed.
Just a thought, why aren't payouts split by straight valid hashpower between blocks? Is it too easily fudged? Too naive? P2Pool already logs hash area, and I've been using that to dish out merged coin rewards, and while there might be a problem with that, I don't know what it is. I think the fundamental reason p2pool is appealing is it just works. You don't have to trust a pool operator. It takes care of constructing the potential blocks so that when one is found, each miner receives his fair share of the block. You can't hack your p2pool node so that you get the block proceeds. I don't think your method would still work with the TNO decentralized process p2pool has now. TNO = trust not needed? The split by hashpower idea isn't very complicated...I'm sure someone would have thought of it before and decided against it. I just wonder if anyone knows what those reasons were. I don't see why P2Pool itself couldn't assign payments according to that, still no trust needed.
|
|
|
|
norgan
|
|
June 20, 2014, 01:58:47 AM Last edit: June 20, 2014, 02:16:34 AM by norgan |
|
OK, so we need two things:
1. A way of tracking issues and feature requests, with a voting system to determine the need.
2. A way to offer a reward to developers who get involved; some kind of kickstarter project.

For 1, I have this: http://p2pminers.ideascale.com/ If you post your ideas and issues there, then miners of p2pool can vote to help establish the urgency of each idea or problem. Anyone can submit an idea. (Moderators are welcome; happy to add you if you PM me your email addy.)

For 2, kickstarter.com could work? Open to ideas for this. Happy to set up a kickstarter, but we need a system that either requires no trust or allows a level of trust that satisfies any doubts.
|
|
|
|
norgan
|
|
June 20, 2014, 05:50:13 AM |
|
So I was digging around and found some settings that may help with p2pool.
In cgminer and bfgminer there are options for --scan-time and --expiry, and also queue.
I read that --scan-time 1, --expiry 1 and queue set to 0 are best for p2pool.
I managed to add these to my Antminer, so I'll monitor how it goes.
To do so you can edit /etc/init.d/cgminer and add them in the PARAMS= section,
e.g. PARAMS="$AOPTIONS $POOL1 $POOL2 $POOL3 $_pb --api-listen --scan-time 1 --expiry 1"
queue = 0 is already in the /etc/config/cgminer file from Bitmain.
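As a sanity check before touching the miner's real init script, the same edit can be tried on a throwaway sample line first; the PARAMS format below follows the post above, and paths may differ between Antminer firmware versions (back up the real file before editing it):

```shell
# Demo the PARAMS edit on a throwaway copy before touching the miner's
# real /etc/init.d/cgminer.
echo 'PARAMS="$AOPTIONS $POOL1 $POOL2 $POOL3 $_pb --api-listen"' > /tmp/cgminer.init.sample
sed -i 's/--api-listen/--api-listen --scan-time 1 --expiry 1/' /tmp/cgminer.init.sample
cat /tmp/cgminer.init.sample
# PARAMS="$AOPTIONS $POOL1 $POOL2 $POOL3 $_pb --api-listen --scan-time 1 --expiry 1"
```

Once the output looks right, apply the same sed command to the real script and restart cgminer (typically `/etc/init.d/cgminer restart` on that firmware).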
|
|
|
|
|