Author Topic: p2pool - Advancement of Decentralized Mining - Vital to Bitcoin Network Security  (Read 19407 times)
wtogami (OP)
Sr. Member
****
Offline Offline

Activity: 263
Merit: 250



View Profile
November 10, 2013, 08:32:36 AM
Last edit: January 09, 2014, 07:15:37 AM by wtogami
 #1

Centralization at large mining pools is a long-term risk to the safety and security of the Bitcoin network.  p2pool-like mining is vitally important as it does not create systemic risk in the form of centralized pools.  p2pool also benefits the network by substantially boosting the quantity of listening relay nodes across the world.  On November 8th, 2013, the Litecoin Dev Team awarded a ~$2,000 grant (and $1,000 more on December 29th) to Forrest Voight in support of development to improve the scalability of the p2pool decentralized mining network.


p2pool-13 enabled massive growth of decentralized mining since mid-July 2013

Our previous support of forrestv enabled the release of ASIC-capable p2pool-13.x for p2pool BTC along with minor improvements to scalability and dust reduction that were of benefit to both the Bitcoin and Litecoin networks.  This new grant is to encourage further research & development that would allow p2pool to comfortably scale to a much larger size.

p2pool Research & Development Ideas Document
https://docs.google.com/document/d/1fbc6yfMJMFAZzVG6zYOwZJvYU0AhM4cvd4bUShL-ScU/edit
Several different proposals are being brainstormed in this document.  Please add comments to the document or reply in this thread if you have ideas.

$2,000 is not enough for this important cause.  Join us in support of Decentralized Mining!
Decentralized mining is vital to the long-term security of the Bitcoin network, and $2,000 alone is too small for this purpose, so please join us in donating to support these development goals.

forrestv's GPG signed statement: http://im.forre.st/pb/50065825.txt
Code:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

I, Forrest Voight, am working to improve the scalability
of P2Pool and decrease the dust payouts created by it.

I control these two donation addresses:

BTC: 1KxvX5Hx8nh36ig2gT5bpeEcqLQcwJsZGB
LTC: LPfkfi2tMuGSc64PZTsP9Amt367hwAUQzY
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.12 (GNU/Linux)

iQIcBAEBAgAGBQJSfmA2AAoJEKgljXA1bAEgM7oQAKZwuhvmGqy/RMjiFGBrJFtK
PN3wwn0g62cMH/JLGqfRAAYBxsjOY1l53VwGFLU1cBTB5yztigIAbjunf9UmYsgL
r7vtOCYL6RWv5+oFx4yC1JmFJXs0LkDhrhOwtLNlCi58h8TI77aMay6XiQ9ynsh3
W+AS6J8cQwjEtogGG0thk3SWkI1E6eZHrC9T2UjnOMUPHMsBpFqw35RXpXvtw0Yr
jAdFPPo0qCZA4BiwuhAkwuF7nVWp56YzRAwrwgx1s5cBR2l8049kDsOum4/mnU3b
3tDxZ9cFvO+x5AIuf/QQbguBeQ2tGaLLsDNxiLjIW4OUMO3Lw6wQJhogEZIPW1Ao
CUmAMhdCSdqE6SmmhOMM9xyJL6XAVhYrCEEZOg5toU7+aBfzsTZPEUNJUX+fgy6v
QjUUM0subv6rM+Ft8HgwoDdslmYog0QPlCzA0FvLMpP9MnKKvuYh02HzlVS8PnOo
FI1rN2pHlvKht6NW4HidGyg5uTES1p8M2wt4Ls63E+ar7fXChzw6p9T9ESAY59wh
7VaH8W01EPWpnE1w6XtlKV/rtk3PaCYWLIb54WMwLP8DeH2wB4R7PRfhZgoFWFt2
XWT+Jt6Llywf/zMPw37aFgITreUYhamEQYWCVpc8VE6YsHfs7m0VCcBwT4fP041S
l9N6cL309hKjUltMDrOO
=5Vpm
-----END PGP SIGNATURE-----

To be clear, forrestv is not a Litecoin Dev.  We just consider his work to be vitally important so we encourage the entire community to support his efforts to protect the Bitcoin network.  If someone else has a feasible implementation even better than p2pool then we may be interested in supporting that too.

Why is Litecoin supporting this cause?
Litecoin Dev takes part in several substantive efforts in support of Bitcoin because we are in the same boat in terms of technology development and the need to protect the network.  Litecoin gives back to Bitcoin and will increasingly do more in the coming year.

  • Litecoin 0.8 contains some patches that differ from Bitcoin 0.8, backported fixes from Bitcoin 0.9, and other things not yet merged in Bitcoin 0.9.  Our testing on a smaller scale, with real users and real money on the line, has enabled us to find bugs and further verify patches before the code shipped in Bitcoin 0.8.x and 0.9.
  • Those improvements to 0.8 are also published in Bitcoin 0.8.5 OMG, a well-tested branch of Bitcoin 0.8.5 plus many bug fixes and mature features beyond the standard 0.8.5.  This branch has also helped to find bugs before they reach Bitcoin 0.9.  p2pool miners may like the disablewallet feature (a minimal bitcoin.conf sketch follows this list).
  • Litecoin Dev also supports various Bitcoin devs including forrestv in support of work important to both features and network security.
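For anyone who wants to try that on a p2pool node, here is a minimal bitcoin.conf sketch.  It assumes the disablewallet option in the 0.8.5 OMG branch uses the same syntax as the option later shipped in mainline Bitcoin Core; the RPC credentials are placeholders.
Code:
# bitcoin.conf sketch for a wallet-less bitcoind behind a p2pool node
# (assumes 0.8.5 OMG's disablewallet matches the later mainline syntax)
server=1
disablewallet=1
rpcuser=yourrpcuser
rpcpassword=choose-a-long-random-password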

If you appreciate my work please consider making a small donation.
BTC:  1LkYiL3RaouKXTUhGcE84XLece31JjnLc3      LTC:  LYtrtYZsVSn5ymhPepcJMo4HnBeeXXVKW9
GPG: AEC1884398647C47413C1C3FB1179EB7347DC10D
Skinnkavaj
Sr. Member
****
Offline Offline

Activity: 469
Merit: 250


English Motherfucker do you speak it ?


View Profile
November 10, 2013, 12:16:02 PM
 #2

Keep Bitcoin and Litecoin decentralized!

jaminunit
Member
**
Offline Offline

Activity: 132
Merit: 14

Co-Founder of TheStandard.io & Vaultoro.com


View Profile WWW
November 10, 2013, 12:28:33 PM
 #3

Great work Warren!

I've been a Bitcoiner since 2010, and currently working on TheStandard.io, a next-generation stablecoin, and lending protocol.
The Standard Protocol Announcement thread
K1773R
Legendary
*
Offline Offline

Activity: 1792
Merit: 1008


/dev/null


View Profile
November 10, 2013, 12:29:05 PM
 #4

thanks warren Smiley

1.c. This is a must, as not everyone is bothered by high variance, so there must be an option to not pay this fee.

2.a. Good idea; this will lower the number of new miners leaving p2pool.

5. We have needed this for quite some time.

6. Is 6 still needed if 5 is done properly?

7. This is probably one of the main goals, as it will greatly reduce lost work (in terms of p2pool blocks, not BTC blocks) and would finally allow the creation of "super-nodes".




[GPG Public Key]
BTC/DVC/TRC/FRC: 1K1773RbXRZVRQSSXe9N6N2MUFERvrdu6y ANC/XPM AK1773RTmRKtvbKBCrUu95UQg5iegrqyeA NMC: NK1773Rzv8b4ugmCgX789PbjewA9fL9Dy1 LTC: LKi773RBuPepQH8E6Zb1ponoCvgbU7hHmd EMC: EK1773RxUes1HX1YAGMZ1xVYBBRUCqfDoF BQC: bK1773R1APJz4yTgRkmdKQhjhiMyQpJgfN
zvs
Legendary
*
Offline Offline

Activity: 1680
Merit: 1000


https://web.archive.org/web/*/nogleg.com


View Profile WWW
November 10, 2013, 12:55:20 PM
 #5

#2, this is a great idea in tandem with #3 or #4  (didn't read the full specifics on those)

#7, yeah, not so much anymore (since I put a 1% fee on it), but before I would have up to 10 work requests per cycle, so I had to set maxblocksize to 1000... even with a very good computer, at a size of 100,000 it would take around 100-150 ms to get all the new work out.

#8, I already do this for my own nodes =p ... though I can't monitor the incoming connections constantly, so some automated disconnect for peers that aren't helpful at all would be nice.

Edit: oh, on #7 + #8, since the majority of people never bother to alter the defaults, some easy method of choosing the type of node you want to run might be beneficial.  I've always felt 6 outgoing connections is far too few, and a lot of people have incoming connections firewalled.

Maybe when you initially run p2pool, it could generate a config file and ask how many outgoing connections you want, with suggestions based on bandwidth (a rough sketch of that idea follows below).  It's a similar problem: most people never touch --p2pool-node.
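A rough sketch of that first-run idea in Python.  Everything here is hypothetical: the prompt, the p2pool.conf file name, and the outgoing_conns setting are illustrative, not actual p2pool options.
Code:
# Hypothetical first-run helper: suggest an outgoing-connection count
# from the user's upstream bandwidth and write it to a config file.
# File name and setting names are illustrative, not real p2pool options.

def suggest_outgoing_conns(bandwidth_mbit):
    """Very rough mapping from upstream bandwidth to outgoing connections."""
    if bandwidth_mbit < 1:
        return 6    # matches the 6-connection default described above
    if bandwidth_mbit < 10:
        return 12
    return 24

def write_config(path="p2pool.conf"):
    bandwidth = float(input("Approximate upstream bandwidth in Mbit/s: "))
    conns = suggest_outgoing_conns(bandwidth)
    with open(path, "w") as f:
        f.write("outgoing_conns=%d\n" % conns)
    print("Wrote %s with outgoing_conns=%d" % (path, conns))

if __name__ == "__main__":
    write_config()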
flound1129
Hero Member
*****
Offline Offline

Activity: 938
Merit: 1000


www.multipool.us


View Profile
November 10, 2013, 06:27:25 PM
 #6

Thanks for the donation warren and Litecoin devs!

I like the sub-pool idea.  I've already contributed a patch (mostly taken from generalfault's stratum-mining fork) to log shares to a MySQL database.  It would be great if this could be standardized and also offer clients the option of connecting using getblocktemplate as well as stratum or getwork.  I've also written a custom difficulty patch that I can share.  I like the idea of encouraging miners to use higher difficulty, but I'm not sure how well that would work with a sub-pool.  There would need to be a way to pass that cost on to miners, especially if allowing them to set their own difficulty.

The DB share logging patch I submitted could easily be extended using the same stratum-mining code (the two licenses are compatible) to support PGSQL and SQLite.
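For illustration, here is a minimal share-log table of the kind such a patch might write, sketched with Python's built-in sqlite3.  The column names are illustrative only; the actual patch targets MySQL and its schema may differ.
Code:
# Illustrative share-log table; the real patch's MySQL schema may differ.
import sqlite3, time

conn = sqlite3.connect("shares.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS shares (
        id INTEGER PRIMARY KEY,
        worker TEXT NOT NULL,
        difficulty REAL NOT NULL,
        share_hash TEXT NOT NULL,
        is_block INTEGER NOT NULL DEFAULT 0,
        submitted_at INTEGER NOT NULL
    )
""")
conn.execute(
    "INSERT INTO shares (worker, difficulty, share_hash, is_block, submitted_at) "
    "VALUES (?, ?, ?, ?, ?)",
    ("worker1", 512.0, "00000000abc...", 0, int(time.time())),
)
conn.commit()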

You might also want to take a look at how stratum-mining distributes new work to miners as it seems much more lightweight than the way P2Pool does it.

Multipool - Always mine the most profitable coin - Scrypt, X11 or SHA-256!
wtogami (OP)
Sr. Member
****
Offline Offline

Activity: 263
Merit: 250



View Profile
November 17, 2013, 08:34:27 AM
 #7

https://blockchain.info/address/1KxvX5Hx8nh36ig2gT5bpeEcqLQcwJsZGB
http://explorer.litecoin.net/address/LPfkfi2tMuGSc64PZTsP9Amt367hwAUQzY

Further donations raised in support of decentralized mining are now approaching $1,000.  Please give to further incentivize these important development goals.

If you appreciate my work please consider making a small donation.
BTC:  1LkYiL3RaouKXTUhGcE84XLece31JjnLc3      LTC:  LYtrtYZsVSn5ymhPepcJMo4HnBeeXXVKW9
GPG: AEC1884398647C47413C1C3FB1179EB7347DC10D
gmaxwell
Moderator
Legendary
*
expert
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
December 04, 2013, 03:52:49 PM
 #8

P2Pool's all-time "luck" is now at 117.6%.  It would be interesting if we had a way of telling how much of that is actual luck vs. p2pool simply having a block-relaying advantage due to its highly distributed nature.
oakpacific
Hero Member
*****
Offline Offline

Activity: 784
Merit: 1000


View Profile
December 04, 2013, 04:09:59 PM
 #9

I wonder if it makes sense to prioritize the transactions of volunteer miners as an incentive.

https://tlsnotary.org/ Fraud proofing decentralized fiat-Bitcoin trading.
gmaxwell
Moderator
Legendary
*
expert
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
December 04, 2013, 04:44:53 PM
 #10

I wonder if it makes sense to prioritize the transactions of volunteer miners as an incentive.
It wouldn't be hard for p2pool to at least prioritize its users' spending of their generated coins, but P2Pool has never had a large enough share of the global hashrate to make it seem worth doing.
Mike Hearn
Legendary
*
expert
Offline Offline

Activity: 1526
Merit: 1128


View Profile
December 04, 2013, 04:58:04 PM
 #11

Stopping existing miners from leaving sounds like the most important thing for sure, but after that is stabilised I think basic stuff like a proper website and more professional documentation/help/installers could go a long way.

I'm sure one of the reasons centralised pools are more popular is simply that the profit motive pushes them to a higher level of professionalism, and miners respond to that.
Carlton Banks
Legendary
*
Offline Offline

Activity: 3430
Merit: 3071



View Profile
December 04, 2013, 07:48:04 PM
Last edit: December 06, 2013, 08:26:09 PM by Carlton Banks
 #12

I'm sure one of the reasons centralised pools are more popular is simply that the profit motive pushes them to a higher level of professionalism, and miners respond to that.

Otherwise known as the "I like the interface" position.


The main issue to me is the hardware required to run a p2pool node.  I believe increased usage would be easier to encourage if something could be done to slim down p2pool's resource overheads, but it's difficult to see how the disk space and disk access requirements could be improved (at least with the present sharechain design).  The memory and CPU requirements could be improved within the current design, but the obvious solution would sacrifice the ease of platform portability currently enjoyed by using the Python runtime.  And obviously a C or C++ rewrite would be a massive job, particularly if it were to embrace a range of platforms, a problem the Python runtime already solves.

I guess there is some respite in the form of improvements to bitcoind's memory usage with the -disablewallet configuration option, coming soon or available now in a testing branch, but I can't help wondering whether the passage of time and the progress it brings might be most decisive.  Did Pieter Wuille's custom ECDSA re-implementation ever get merged?  That sort of thing will become more important when the upper limit on the block size inevitably changes, in whatever way is eventually decided upon.

I'm thinking along the lines of being able to easily adapt low-cost computing devices into p2pool nodes.  A change to the block size may force the disk space and performance requirements out of the feasibility zone, but every other requirement is unlikely to become so unwieldy.  In just a few years, the typical low-cost computing device in the Arduino/RasPi mold may be more than capable of all the performance characteristics that good operation of the p2pool/bitcoind setup requires (taking into account the balance of increasing transaction frequency, downward revision of processing requirements per transaction, and the upward direction of the processing capabilities of the latest ARM designs).  At the present time, though, some form of high-performance desktop machine just has to be used as a p2pool node; there is no real alternative if you want to make the most of your hashrate.

The reason for the low-cost device angle is obvious: all new mining devices either include or rely on a computing device with a networking controller, Ethernet or Wi-Fi, and at least enough performance to run unoptimised builds of the mining software (until the developers can get a device to test and code with).  The goal is to reach a stage where running unoptimised miner code/drivers with as much comfort margin as possible becomes the norm, so that manufacturers begin to choose over-specified devices as the more prudent option (which could in itself help drive down the unit cost of these sorts of low-cost computing devices).  Once the mining code for a given device is optimised, the now-vacated headroom could be leveraged to run p2pool.  Can it be done now, with the current version of Python and its memory management, on our current generation of mining ASICs?  No is the answer.  Can native p2pool (and bitcoind) builds be practicably produced for the processor architecture of every possible low-cost computer used as a mining controller?  I expect no is the answer to that question too.

But there must be some opportunity to leverage the processing controllers that inevitably form part of nearly every typical miner that rolls out of the manufacturers' doors.  Maybe then someone might be (more necessarily) motivated to work on a shiny-tastic web interface  Cheesy

Vires in numeris
Dende
Newbie
*
Offline Offline

Activity: 56
Merit: 0


View Profile
December 05, 2013, 09:27:11 PM
 #13

Just sent 20mBTC
With a decentralized system we are our own bank, and I believe anyone involved has a responsibility to help keep the network safe.  I'm doing my part with this donation; I hope it helps your effort.
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
December 05, 2013, 11:46:57 PM
 #14

A nice feature would be a distributed memory pool.

There could be a field in the links on the share chain for validated transactions.  The entry for each transaction would include:

- the transaction
- the path to the merkle root for that transaction
- the input transactions

Providing all the input transactions is expensive.  With 2 input transactions at 250 bytes each, you need to provide 750 bytes' worth of transactions (the transaction plus its inputs).  If the merkle tree is 12 levels deep, that is an additional 32 * 12 = 384 bytes.  The total comes to roughly 4.5 times the size of the transaction alone.
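A quick back-of-the-envelope check of that overhead, using the example figures above (250-byte transactions, a 12-level merkle tree):
Code:
# Size of a self-contained entry: the transaction, the full transactions
# providing its inputs, and a merkle branch of 32-byte hashes.
# Figures are the example numbers from the post, not measurements.

HASH_SIZE = 32  # bytes per merkle-branch node

def entry_size(tx_size, input_tx_sizes, merkle_depth):
    return tx_size + sum(input_tx_sizes) + HASH_SIZE * merkle_depth

tx = 250
total = entry_size(tx, [250, 250], merkle_depth=12)
print(total)        # 1134 bytes
print(total / tx)   # ~4.5x the bare transaction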

With large multi-input transactions, it would be bigger.  If the transactions per share chain link were limited in size, then naturally those transactions would be discouraged.

Double spending could be protected against by having a system where you can claim a share by showing a transaction included in the share was double spent. 

Nodes can't add the transaction to the share chain's memory pool without proving the transaction exists (and providing the inputs so that all verification information is provided).

The owner of the share which added a transaction might get the tx fees (or a percentage of them).  This would encourage nodes to add transactions.  If an illegal transaction is added, then that share goes to the address which submitted the notification of the error.

If it became popular, then even SPV wallets might store all that info for coins held in the wallet, and transmit it when trying to spend the coin.

Combined with something like the "Ultimate Blockchain Compression", maybe even double-spend checking could be done locally.

There could be a network rule for how to pick transactions from the pool.  This would mean that all nodes mine against the same block.

Transactions that have been added would not be included for at least a few shares.  For example, with 10-second shares, if transactions in the 12 most recent shares were not used, there would be 2 minutes for illegal transactions to be detected.
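A minimal sketch of that delay rule, assuming shares are identified by height (the numbers are the example figures above):
Code:
# A transaction added at share height added_at only becomes eligible for
# block templates once `delay` further shares exist, leaving time for a
# double spend to be reported. 12 shares * 10 s/share = 120 s window.

SHARE_INTERVAL_S = 10
DELAY_SHARES = 12

def is_eligible(added_at, current_height, delay=DELAY_SHARES):
    return current_height - added_at >= delay

print(DELAY_SHARES * SHARE_INTERVAL_S)                 # 120 seconds
print(is_eligible(added_at=100, current_height=111))   # False: only 11 shares later
print(is_eligible(added_at=100, current_height=112))   # True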

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
solex
Legendary
*
Offline Offline

Activity: 1078
Merit: 1002


100 satoshis -> ISO code


View Profile
December 06, 2013, 01:27:52 AM
 #15

Fantastic initiative Warren. Donation on the way...

Carlton Banks
Legendary
*
Offline Offline

Activity: 3430
Merit: 3071



View Profile
December 10, 2013, 01:51:26 AM
 #16

Just read the Brainstorming document. Some great plans.


A fee on each share is a smart idea; I didn't realise the payout per share is value/difficulty weighted as things are now.  I now see the wisdom in larger miners upping their share difficulty threshold.

A Trustless Accumulator (both variants) would be vital; infrequent, dusty payouts to those with relatively small hashing power are a real barrier.

Multi-threaded share propagation is potentially good, and could work very well with the per-peer Statistical Tracking (as mentioned).  Even though it does little in practical over-time terms to alter the payouts, it would increase user perception of "less wasted work" in the system as a whole.  Although I'm not sure whether this would just increase the effective granularity of stales, since there will still be the same number of shares in the chain.

Per-peer Statistical Tracking in its own right is great for encouraging the kind of miner who uses professional hosting, and so contributes to a perception of professionalism.  It gives those miners a perceived high-worth status, even if only in terms of the stratification of connections.  And of course orphans could be less prevalent.

Vires in numeris
wtogami (OP)
Sr. Member
****
Offline Offline

Activity: 263
Merit: 250



View Profile
December 10, 2013, 06:48:17 AM
 #17

Just read the Brainstorming document. Some great plans.


A fee on each share is a smart idea; I didn't realise the payout per share is value/difficulty weighted as things are now.  I now see the wisdom in larger miners upping their share difficulty threshold.

A Trustless Accumulator (both variants) would be vital; infrequent, dusty payouts to those with relatively small hashing power are a real barrier.

Multi-threaded share propagation is potentially good, and could work very well with the per-peer Statistical Tracking (as mentioned).  Even though it does little in practical over-time terms to alter the payouts, it would increase user perception of "less wasted work" in the system as a whole.  Although I'm not sure whether this would just increase the effective granularity of stales, since there will still be the same number of shares in the chain.

Per-peer Statistical Tracking in its own right is great for encouraging the kind of miner who uses professional hosting, and so contributes to a perception of professionalism.  It gives those miners a perceived high-worth status, even if only in terms of the stratification of connections.  And of course orphans could be less prevalent.

Glad to see someone clearly understands the issues involved. =)

If you appreciate my work please consider making a small donation.
BTC:  1LkYiL3RaouKXTUhGcE84XLece31JjnLc3      LTC:  LYtrtYZsVSn5ymhPepcJMo4HnBeeXXVKW9
GPG: AEC1884398647C47413C1C3FB1179EB7347DC10D
wtogami (OP)
Sr. Member
****
Offline Offline

Activity: 263
Merit: 250



View Profile
December 29, 2013, 09:30:00 AM
 #18

SUPPORT DECENTRALIZED MINING TECH

https://bitcointalk.org/index.php?topic=329860.0
If the community donates in excess of $1,000 to Forrest's donation addresses before noon UTC December 31st, the Litecoin Dev Team will contribute an additional $1,000 in support of  research and development of p2pool.

https://blockchain.info/address/1KxvX5Hx8nh36ig2gT5bpeEcqLQcwJsZGB
0.114995 BTC
http://ltc.block-explorer.com/address/LPfkfi2tMuGSc64PZTsP9Amt367hwAUQzY
204.00221964 LTC

The donation addresses must increase in value by > $1,000 above the current received totals.

Why are we doing this?
Litecoin Dev already donated $2,600 to Forrest earlier this year.  We strongly believe that p2pool improvement is critical to the future of Bitcoin so we want to encourage others to join us in supporting this cause.

If you appreciate my work please consider making a small donation.
BTC:  1LkYiL3RaouKXTUhGcE84XLece31JjnLc3      LTC:  LYtrtYZsVSn5ymhPepcJMo4HnBeeXXVKW9
GPG: AEC1884398647C47413C1C3FB1179EB7347DC10D
solex
Legendary
*
Offline Offline

Activity: 1078
Merit: 1002


100 satoshis -> ISO code


View Profile
December 29, 2013, 09:54:52 AM
 #19

Anyone concerned about the growth of silent pools (GHash.io, Discus Fish, etc.) should support this initiative.

TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
December 29, 2013, 01:06:49 PM
 #20

Adding a trustless accumulator helps with dust payouts.  It doesn't affect the minimum share difficulty.

Even without a bootstrap, it could be tweaked to reward larger miners.  This acts as a direct incentive for miners to push up their share difficulty during the transition.

The lower hashing power miners don't lose out directly, since their debt is held in the block chain.  They do have a lower expected payout though.

Lowering the minimum share difficulty requires increasing the share rate.  However, ASICs have difficulty updating their work quickly.

One way would be to update the share chain in groups of shares.

Share Group Header:

int version
hash prev: points to the previous share group header
hash[32]: points to 32 valid shares
long: timestamp
int: difficulty
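Sketched as a Python structure for concreteness (field names and types are illustrative, not a wire format):
Code:
# One possible shape for the proposed share-group header: it commits to
# exactly 32 share hashes, so ASIC work only changes once per group.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ShareGroupHeader:
    version: int
    prev_group_hash: bytes            # previous share group header
    share_hashes: Tuple[bytes, ...]   # 32 valid shares building on prev group
    timestamp: int
    difficulty: int

    def __post_init__(self):
        if len(self.share_hashes) != 32:
            raise ValueError("a share group commits to exactly 32 shares")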

Shares would be valid if they paid out based on the previous share group.

Nodes would broadcast their shares.  Once a node has 32 shares that build on the previous share group, then it would construct a new header.

The most likely outcome is that the miner which hits the 32nd share would broadcast a sharegroup containing his share and the 31 others.

A miner who finds a share has an incentive to broadcast it so that it is included in any sharegroup.

If the share rate was 1 second, the share group rate would be 32 seconds.  ASICs would only need to update once per share-group rather than once per share.  

So, you get a 32X drop in the minimum share difficulty, but ASICs still only have to update once every 32 seconds.
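Spelling out that arithmetic as a tiny sketch (using the 1-second share rate from the example above):
Code:
# Per-share difficulty scales with the share interval. With 1-second shares
# grouped 32 at a time, each share needs 1/32 the difficulty of a single
# 32-second share, while ASICs still refresh work only once per group.
share_interval_s = 1
group_size = 32

group_interval_s = share_interval_s * group_size         # 32 s between ASIC updates
difficulty_ratio = share_interval_s / group_interval_s   # 1/32 of a 32-second share
print(group_interval_s, difficulty_ratio)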

This would create more dust, due to the higher share rate.  So, it would need to be combined with an accumulator of some kind.

[Edit]
If shares that pointed to any of the previous 3-4 share groups were considered valid, then the orphan rate would be even lower.  ASICs could run for 2-3 minutes on the same base.
[/edit]

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF