Bitcoin Forum

Bitcoin => Development & Technical Discussion => Topic started by: aliashraf on June 08, 2018, 02:37:57 PM



Title: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 08, 2018, 02:37:57 PM
Proof of Collaborative Work

A proposal for eliminating the necessity of pool mining in bitcoin and other PoW blockchains

Motivation
For Bitcoin and the altcoins based on common PoW principles, centralization of mining through pools is both inevitable and unfortunate, and it puts all the reasoning that supports the security of PoW in a paradoxical, fragile situation.

The same problem exists in PoS networks. Things can get even worse there, because most PoS-based systems enforce long-run deposit strategies for miners, making it highly discouraging for them to migrate from one pool to another because of the costs involved.

Satoshi Nakamoto's implementation of PoW, the core of the Bitcoin client software, is based on a winner-takes-all strategy. This strategy is the fundamental factor behind two critical flaws, mining variance and proximity premium, which are the most important forces forming pooling pressure.

Until now, both mining variance and proximity premium have been considered unavoidable, and hence pooling pressure has been treated as an inherent flaw of Bitcoin and other PoW-based currencies.

In this proposal, we suggest an alternative variant of PoW in which the traditional winner-takes-all strategy is replaced with a collaborator-takes-share strategy.

The problem of solo mining becoming too risky and impractical for small mining facilities appeared in 2010, less than two years after Bitcoin had been launched. It was the worst timing ever: although Satoshi Nakamoto commented on bitcointalk about the first pool proposals (https://bitcointalk.org/index.php?topic=1976.20#msg25119), it was among the last posts he made, and he disappeared from this forum a few days later, forever, without making a serious contribution to the subject.

Left on its own, the confused community came up with an unbelievable solution to such a critical problem: a second-layer centralized protocol named pooling, boosted by greed and ignorance, supported by junior hackers who, as usual, missed the forest.

Bitcoin was just 2 years old when the pooling age began, and pools eventually came to dominate almost all of the network's hashpower.

A quick review of the Slush thread (https://bitcointalk.org/index.php?topic=1976), in which Satoshi made the reply referenced above, reveals how immature and naive this solution was, how shallowly it was discussed, and how it was adopted: in a rush, with obvious greed.
Nobody ever mentioned the possibility of an algorithm tweak to keep PoW decentralized. Instead, everybody was talking about how practical such a centralized service was, while the answer was more than obvious:
Yes! You can always do everything with a centralized service; don't bother investigating.

Anyway, in that thread one can't find any arguments about the centralization consequences or the possibility of alternative approaches, including core algorithm improvements :o

I think it is not fair. PoW is great and can easily be improved to eliminate such a paradoxically centralized second-layer solution. This proposal, Proof of Collaborative Work (PoCW), is an example of the inherent possibilities and capacities of PoW. I haven't found any similar proposal and it looks to be original, but if there is prior work, I'll be glad to be informed about it. :)

The idea is accepting and propagating works with hundreds of thousands of times lower difficulty and accumulating them as proof of work for a given transaction set, letting miners with very low shares of hash power (say on the order of 10^-6) participate directly in the network and yet experience and monitor their performance on an hourly basis.



Imo, now that almost a decade has passed, Moore's law has done enough to make utilizing more bandwidth and storage resources feasible. It seems somewhat hypocritical to me to make arguments about 'poor miners', pretend to be concerned about centralization threats, and then use those concerns as an excuse for rejecting this very specific proposal which, although it increases the demand for such resources, can radically disrupt the current situation with pools and centralized mining.

This proposal is mainly designed for Bitcoin. For the sake of convenience, and to let readers form a more specific perception of the idea, I have deliberately used constants instead of adjustable parameters.

Outlines
  • An immediate but not practically feasible approach would be reducing the block time (along with a proportional reduction in block reward). Although this approach, as mentioned, cannot be applied fully because of the network propagation problems involved, an excellent consequence would be its immediate impact on the scalability problem; we will use it partially (reducing the block time to 1 minute, compared to the current 10-minute period).
  • As mentioned earlier (and with all due respect to the Core team), I don't take objections about the storage and network requirements of reducing the block time as serious criticism. We should not leave mining in the hands of 5 mining pools to protect a hypothetical poor miner/full-node owner who cannot afford to install a 1-terabyte HD in the next 2 years!
  • Also note that block time reduction is not a necessary part of PoCW, the proposed algorithm; I'm just including it as one of my old ideas (adopted from another forum member, who suggested it as an alternative in the infamous block size debate, and developed a bit further by me) which I think deserves more investigation and discussion.
  • PoCW uses a series of mining-related data structures to be preserved on the blockchain or transmitted as network messages (see the sketch after this list):
    • Net Merkle Tree: an ordinary Merkle hash tree of transactions, with the exception that its coinbase transaction carries no block reward (newly issued coins); instead, the miner charges all transaction fees to his own account (supports SegWit)
    • Collaboration Share: a completely new data structure composed of the following fields:
      • 1- The root of a Net Merkle Tree
      • 2- Collaborating miner's wallet address
      • 3- A nonce
      • 4- A difficulty score, calculated using the previous block's hash padded with all previous fields; it is always required to be at least 0.0001 of the current block difficulty
    • Coinbase Share: also new, composed of:
      • 1- A Collaborating miner's wallet address
      • 2- A nonce
      • 3- A computed difficulty score using the hash of
        • previous block's hash padded with
        • current block's merkle root, padded with
        • Collaborating miner's address padded with the nonce field
      • 4-  A reward amount field
    • Shared Coinbase Transaction: a list of Coinbase Shares
      • The first share's difficulty score field is fixed at 2%
      • Each share's difficulty score is at least 0.0001
      • The sum of the reward amount fields equals the block reward, and each share's reward is calculated proportionally to its difficulty score
    • Prepared Block: an ordinary Bitcoin block, with some exceptions:
      • 1- Its merkle root points to a Net Merkle Tree
      • 2- It is required to yield a hash at least as difficult as target difficulty * 0.05
    • Finalization Block: an ordinary Bitcoin block, with some exceptions:
      • 1- Its merkle root points to a Net Merkle Tree
      • 2- It is required to yield a hash at least as difficult as target difficulty * 0.02
      • 3- It has a new field which is a pointer to (the hash of) a non-empty Shared Coinbase Transaction
      • 4- The Shared Coinbase Transaction's sum of difficulty scores is greater than or equal to 0.95
  • Mining process goes through 3 phases for each block:
    • Preparation Phase: It takes just a few seconds for miners to produce one or (rarely) 2 or 3 Prepared Blocks, typically. Note that the transaction fees are already transferred to the miner's wallet through the coinbase transaction committed to each block's Net Merkle Tree root.
    • Contribution Phase: Miners pick one valid Prepared Block's Merkle root, according to their speculation (which becomes more accurate as new shares are submitted to the network) about which root will eventually get enough shares, and produce/relay valid Contribution Shares for it.
      As the sum of the difficulty scores for a given Prepared Block's Merkle root grows, we expect an exponential convergence rate toward the most popular Merkle root being the one included in Contribution Shares.
    • Finalization Phase: Once the total score approaches the 0.93 limit, rational miners begin to produce a Finalization Block
  • Verification process involves:
    • Checking both the hash of the Finalization Block and all of its Shared Coinbase Transaction items to satisfy the network difficulty target cumulatively
    • Checking the reward distribution in the Shared Coinbase Transaction
    • Checking that the Merkle tree is a Net Merkle Tree
  • UTXO calculation is extended to also include Shared Coinbase Transactions committed to finalized blocks on the blockchain
  • Attacks/forks brief analysis:
    • Short-range attacks/unintentional forks that try to change the Merkle root are as hard as in traditional PoW networks
    • Short-range attacks/unintentional forks that preserve the Merkle root but try to change the Shared Coinbase Transaction have zero side effects on users (as opposed to miners); as for redistributing the shares in favor of the forking miner, they are poorly incentivized, since the gains never go beyond something like a 2%-10% redistribution
    • Long-range attacks with a total-rewrite agenda will fail just as in traditional PoW
    • Long-range attacks with partial coinbase rewrites are again poorly incentivized, and the costs won't be justified
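
To make the data structures and the finalization check above more concrete, here is a minimal Python sketch. It is an illustration only, with hypothetical field names and types; the constants are the ones fixed above (0.0001 minimum share score, 2% fixed first share, 0.95 cumulative target):

Code:
from dataclasses import dataclass
from typing import List

MIN_SHARE_SCORE = 0.0001    # minimum difficulty score of a share
FIRST_SHARE_SCORE = 0.02    # first share's difficulty score is fixed to 2%
FINALIZATION_TARGET = 0.95  # required cumulative score of the Shared Coinbase Tx

@dataclass
class CoinbaseShare:
    miner_address: str       # collaborating miner's wallet address
    nonce: int
    difficulty_score: float  # fraction of the current block difficulty achieved
    reward: float = 0.0      # assigned at finalization

@dataclass
class SharedCoinbaseTx:
    shares: List[CoinbaseShare]

    def total_score(self) -> float:
        return sum(s.difficulty_score for s in self.shares)

def valid_for_finalization(tx: SharedCoinbaseTx) -> bool:
    # First share fixed at 2%, every share at or above the 0.0001 floor,
    # and the cumulative score at or above 0.95 of the block difficulty.
    return (bool(tx.shares)
            and tx.shares[0].difficulty_score == FIRST_SHARE_SCORE
            and all(s.difficulty_score >= MIN_SHARE_SCORE for s in tx.shares)
            and tx.total_score() >= FINALIZATION_TARGET)

def distribute_reward(tx: SharedCoinbaseTx, block_reward: float) -> None:
    # Each share is rewarded proportionally to its difficulty score.
    total = tx.total_score()
    for s in tx.shares:
        s.reward = block_reward * s.difficulty_score / total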

Implementation

This is a radical improvement to classical PoW, I admit, but the costs involved are fair given the huge impacts and benefits. I have reviewed Bitcoin Core's code and found the proposal totally feasible and practical from a purely programming perspective. Wallets could easily be upgraded to support the new algorithm as well. A series of more complicated issues, mostly political, are extremely discouraging, but it is just too soon to give up and go for a fresh start with a new coin, or settle for an immature fork with little support, imo.

Before any further decisions, it would be of high value to have enough feedback from the community. Meanwhile, I'll be busy coding the canonical parts as a BIP for the Bitcoin blockchain. I think it will take 2-3 weeks or even a bit more, because I'm not part of the team and have to absorb a lot before producing anything useful; plus, I'm not full time, yet ;)

I have examined the proposed algorithm's feasibility as much as I could, yet I can imagine there may be some overlooked flaws, and readers are welcome to improve it. Philosophical comments questioning the whole idea of eliminating pools don't look constructive, tho. Thank you.


Major Edits and Protocol Improvements:
  • June 10, 2018 09:30 pm Inspired by a discussion with @ir.hn

    • A Prepared Block should be saved by full nodes for a period of time long enough to mitigate any cheating attempt that avoids the Preparation Phase by using non-prepared, trivially generated Net Merkle Roots.
      • Full nodes MAY respond to a query by peers asking for a block's respective Prepared Block if they have decided to save the required data long enough
      • For the latest 1,000 blocks, preserving such data is mandatory.
      • For blocks with an accumulated difficulty harder than or equal to the respective network difficulty, it is unnecessary to fulfil the above requirement.*
      • The Prepared Block and Preparation Phase terms replace the original Initiation Block and Initiation Phase terms, respectively, to avoid ambiguity
      Notes:
      * This is added to let miners with large enough hash power choose not to participate in the collaborative work.
  • reserved for future upgrades
  • July 3, 2018 02:20 pm inspired by discussions with @anunymint

    • A special transaction was added to the Shared Coinbase Transaction to guarantee/adjust a proper reward for the finder of the Prepared Block, and some enhancements were made to include both the block reward and (part of) the transaction fees in the final calculations.
      • In the Finalization Phase, miners should calculate a total reward for the miner of the respective Prepared Block as follows (see the worked example after this list):
        preparedBlockReward = blockReward * 0.04 + totalTransactionFees * 0.3
      • The Shared Coinbase Transaction should include one input/output address-amount pair that adjusts the amount of reward assigned to the Prepared Block's miner in the ordinary coinbase transaction
      • All the calculations for the rewards assigned to the miners of the finalized block and the shares are carried out on the sum of the network block reward and 70% of the transaction fees collected from transactions committed to the Net Merkle Tree
      Note:
       This change is made both to guarantee a minimum reward for the miners of Prepared Blocks and to incentivize them to include more transactions with higher fees
  • reserved for future upgrades
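
As a quick worked example of the formula above (a sketch with illustrative figures only: a 12.5 BTC block reward and 0.5 BTC of total fees):

Code:
block_reward = 12.5   # BTC, illustrative
total_fees = 0.5      # BTC, illustrative

# Reward guaranteed to the Prepared Block's miner:
prepared_block_reward = block_reward * 0.04 + total_fees * 0.3   # 0.5 + 0.15 = 0.65 BTC

# Pot distributed among the finalized block and the shares
# (block reward plus 70% of the collected fees):
distributable = block_reward + total_fees * 0.7                  # 12.85 BTC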







Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: goddog on June 08, 2018, 02:55:51 PM
I'm sorry, perhaps I can not fully understand your proposal, but if you are against centralized pools, why not simply try to improve the p2pool protocol? PoW has to be simple, so it shouldn't need tuned constants and finding some magic to get things working. Simple means fewer errors and problems.

I think pools are not a problem if miners can point their hashrate where they want, at any time, at no cost.
However, p2pool is a nice attempt at decentralized pools. You could try to develop a decentralized pool; if it is better than the centralized pool solutions, it will be adopted by miners.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 08, 2018, 03:46:01 PM
@goddog
P2Pool has been around for a long time, but as of this writing it has not done well enough (something like 0.00005 of the total network hash power; the last block found, according to their official stats page (http://p2pool.info/), dates back years!). I guess that is because it is a lot more resource-consuming than my proposal, and less efficient. I'm not happy to say this, but it looks like a failed project to me. :(

Nodes in the P2Pool protocol have to maintain a separate blockchain and perform a lot of validation checks on the submitted shares, and there is no convergence mechanism, unlike the PoCW proposed here.

In PoCW, miners just converge on a Prepared Block's merkle root and focus on hashing instead of verifying and accounting for submitted shares. It is only in the Finalization Phase that they need to collect a set of valid shares whose difficulties sum to enough to satisfy the network's target difficulty.

Another problem with P2Pool is that the level of convenience provided is not enough (only 120 times lower difficulty). PoCW, proposed here, can accept shares up to 100,000 times easier, like what centralized pools are capable of.



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: goddog on June 08, 2018, 04:05:20 PM
I mentioned p2pool only as a starting point; I know it has a lot of problems and flaws.
From the little I understand, your proposal could be applied to the current PoW system as a new kind of decentralized pool software (like an improved f2pool); you could call it C2pool :-D

However, I cannot really understand how your proposal is supposed to work. From what I can tell, your finalization block is the only thing that matters, so miners will simply fake all the other data, or selfishly mine all of it at no cost, to maximize their profit.

Also, I think it is very susceptible to network latency and/or splits, making it very vulnerable to a wide range of Sybil attacks.
But, as I was saying, I'm surely missing some fundamental point.
So can you explain to me why a miner should be cooperative? I cannot see where the real incentives are.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 08, 2018, 05:02:13 PM
@goddog

No, it is not a pool of any kind; it is a new type of PoW. I see you were too lazy to read my opening post :P

In classical PoW, although miners dedicate resources to generating blocks of average value (in terms of difficulty), those blocks are completely ignored (this is why people join pools). My simple question can be formulated this way: why not let these blocks be accumulated as proof of collaborative work?

Classical PoW, the way it is implemented in Bitcoin, Ethereum, etc., fails to do so because the block is hashed as a whole: the block header contains the merkle root, which in turn has committed every single piece of information affecting the blockchain state, both the conventional transactions and the coinbase.

This prevents miners from collaborating, because each miner generates a different coinbase (reflecting his reward address and amount), which in turn produces a different merkle root.

In PoCW, we separate the coinbase transaction from the Merkle Tree. This way, miners can share their 'good hashes' for a Merkle root that grows more popular over time. When enough 'good hashes' have accumulated, they try their chance at generating the final block, which can now use the found shares both to prove its quality and as evidence for how it distributes the block reward, since the used shares are unforgeable and cannot be separated from the wallet address of the miner who generated them.

In the Contribution Phase, it is in miners' best interest to converge asap on a popular Merkle root (transaction set) and submit the shares they find that satisfy the trivially set difficulty.
When producing Finalization Blocks, miners have to include the previously found shares (as they are), and this way they have to honor the finders' shares, fairly.

In other words, in PoCW miners mainly focus on the actual, ordinary transactions made by users and share their efforts to take care of them, instead of artificially polluting the transaction set by injecting their reward expectations and delaying the consensus mechanism until the fortunate accident of some miner 'finding' a block of high enough difficulty.

PoCW looks far more decent and fair to me, and at the same time very solid, compared to classical PoW.

I hope this brief explanation helps and encourages you to read the opening post in its entirety, and more carefully. :)


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: goddog on June 08, 2018, 05:51:08 PM
I'm only trying to understand, not to judge your work, but I need some clarification.
  • Initiation Phase: It takes just a few seconds for miners to produce one or (rarely) 2 or 3 Initiation Blocks, typically. Note that the transaction fees are already transferred to the miner's wallet through the coinbase transaction committed to each block's Net Merkle Tree root.
As I understand it, a net merkle tree is a merkle tree where the coinbase transaction has no block reward (only accumulated fees), so each miner will try to publish his own net merkle tree with a coinbase transaction pointing to his address.
Why are you saying there will be only one or (rarely) 2 or 3 initiation blocks? Why should a miner use a net merkle tree with a coinbase pointing to an address owned by another miner? Where is the incentive to start the contribution phase, losing the transaction fees? Or am I missing something?

  • Contribution Phase: Miners pick one valid Initiation Block's Merkle root, according to their speculation (which becomes more accurate as new shares are submitted to the network) about which root will eventually get enough shares, and produce/relay valid Contribution Shares for it.
    As the sum of the difficulty scores for a given Initiation Block's Merkle root grows, we expect an exponential convergence rate toward the most popular Merkle root being the one included in Contribution Shares.
They will share an initiation block, adding their address and their partial proof of work to earn a score.

  • Finalization Phase: Once the total score approaches the 0.93 limit, rational miners begin to produce a Finalization Block

As soon as a miner gets enough shares (with partial proofs of work), he can begin trying to solve a more difficult proof of work to get only a partial block reward. But producing a final block is very hard, since any contribution share can point to a different net merkle tree (because every miner will add a coinbase transaction using his own address to earn the fees), so he has to validate the same transactions over and over to be sure the final block will be considered valid. So I suspect it is better for a miner to simply self-produce his net merkle tree and his contribution shares, so he doesn't have to share the block reward with other contributors and gets the full fee booty.

So I cannot understand the point of using contribution shares from other miners?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 08, 2018, 06:34:06 PM
I'm only trying to understand, not to judge your work, but I need some clarification.
  • Initiation Phase: It takes just a few seconds for miners to produce one or (rarely) 2 or 3 Initiation Blocks, typically. Note that the transaction fees are already transferred to the miner's wallet through the coinbase transaction committed to each block's Net Merkle Tree root.
As I understand it, a net merkle tree is a merkle tree where the coinbase transaction has no block reward (only accumulated fees), so each miner will try to publish his own net merkle tree with a coinbase transaction pointing to his address.
Why are you saying there will be only one or (rarely) 2 or 3 initiation blocks? Why should a miner use a net merkle tree with a coinbase pointing to an address owned by another miner? Where is the incentive to start the contribution phase, losing the transaction fees? Or am I missing something?
The 5% difficulty requirement for a Prepared Block means one such block is generated roughly every 3 seconds during the Preparation Phase; network latency and the time miners need to decide to stop preparation attempts and start the Contribution Phase may take up to 10 seconds, I guess.

Once the first few Prepared Blocks have been propagated through the network, the other miners have tried their chance at winning the fees and failed. It is not a big deal, tho: in the worst case, it amounts to wasting about 5% of one's resources, along with the transaction fees, just as when a new block arrives and you have missed your chance to hit. Typically, the total transaction fees for a fully saturated block are about 0.5 BTC, i.e. 0.04 of the current 12.5 BTC block reward.
From this point on (with a few published Prepared Blocks in hand), the race is about winning the block reward (the main meal), and it is in all miners' best interest to forget about the previous race (winning the tx fees) and focus on the live one (winning the block reward).
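
As a back-of-the-envelope check of the 3-second figure above (a sketch; the 1-minute block time comes from the Outlines section):

Code:
block_time = 60.0          # seconds, with the proposed 1-minute block time
prepared_share = 0.05      # a Prepared Block is 5% of the block difficulty
expected_interval = block_time * prepared_share
print(expected_interval)   # ~3.0 seconds per Prepared Block, on average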
Quote
  • Contribution Phase: Miners pick one valid Initiation Block's Merkle root, according to their speculation (which becomes more accurate as new shares are submitted to the network) about which root will eventually get enough shares, and produce/relay valid Contribution Shares for it.
    As the sum of the difficulty scores for a given Initiation Block's Merkle root grows, we expect an exponential convergence rate toward the most popular Merkle root being the one included in Contribution Shares.
They will share an initiation block, adding their address and their partial proof of work to earn a score.

  • Finalization Phase: Once the total score approaches the 0.93 limit, rational miners begin to produce a Finalization Block

As soon as a miner gets enough shares (with partial proofs of work), he can begin trying to solve a more difficult proof of work to get only a partial block reward. But producing a final block is very hard, since any contribution share can point to a different net merkle tree (because every miner will add a coinbase transaction using his own address to earn the fees), so he has to validate the same transactions over and over to be sure the final block will be considered valid. So I suspect it is better for a miner to simply self-produce his net merkle tree and his contribution shares, so he doesn't have to share the block reward with other contributors and gets the full fee booty.

So I cannot understand the point of using contribution shares from other miners?

No, you are missing the most important point here. Contribution Shares don't point to different merkle trees; that is the magic of what I have defined as the Net Merkle Tree. It is shared among most of the miners because it contains no block-reward-related information, just the real-world transactions and their fees. Miners do not change this tree; they produce Contribution Shares that commit to it again and again. This is why these shares can be used as objective evidence of the resource consumption of their respective producers.


EDIT:
Please pay attention that in the Contribution Phase, miners don't mine blocks; they mine Contribution Shares instead, which can easily be included in the Shared Coinbase Transaction in the Finalization Phase, as I've already explained in the first post.



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: ir.hn on June 08, 2018, 09:24:00 PM
This is a very complex idea, and you have defined your method but not your reasons for arriving at it. You may want to start from scratch: explain the core idea, give the most basic theoretical implementation, and then describe the problems that led you to add the new factors that you did in the design.

A lot of thought went into this, and it is hard for us to get up to speed with your thought process.

What you are doing is giving someone a recipe and saying it makes a better cake, but without enough explanation of exactly why eggs whipped at 675 rpm are needed.

I'm sure it is genius, and maybe a couple of people can follow it, but I'm not among them. Also, realize that many of us probably don't even understand the underpinnings of how pools work exactly.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 08, 2018, 11:31:51 PM
@ir.hn

I do admit that it is not an easy piece of cake when you get into the implementation details, or even the design outline, directly. But things are changing so fast in blockchain, and developments are trending in so many false directions... We have to be simultaneously perfect and quick. Hell of a job...

Realizing this situation, I deliberately chose to present this idea from a technical point of view, to accelerate technical contributions. Still, I believe you will be able to find some useful general ideas by following this topic.

Here I'll try to give a more general and conceptual vision; I hope it helps: :)

In centralized pool mining, which has been the dominant practice for years, miners have no control over the blocks; they receive a block from the server, ready to 'mine', which now means finding a nonce that makes the hash of the block 'good enough' to be passed back as a share to the server.
The threshold for what counts as good enough is contracted between the miner and the pool server and has almost nothing to do with the blockchain difficulty; it is simply set thousands to millions of times lower.

The pool server, in turn, besides validating and keeping track of submitted shares, checks whether one of them happens to be so much better (thousands or millions of times better than the trivial contracted difficulty the miner had to satisfy) that it can be relayed as a freshly found block and added to the blockchain.
If so, the pool gets both the block reward and the transaction fees immediately. The shares of the miners who participated in this process, all of them and not only the one who found the golden nonce, are paid later using one of several common pool accounting methods.

Miners desperately need such a mechanism to avoid waiting years to find a single block (if ever), and pools with enough submitted shares can expect to get lucky very frequently.

This is a bad situation for the security of the network, because the number of parties who actually decide the transaction set included in each block, and (probably) added to the blockchain, is dangerously small. This is a centralization threat, and any true bitcoiner should be against it, unquestionably.

Actually, as of this writing, the 5 major Bitcoin mining pools could easily form a cartel holding the >50% majority needed to perform occasional, intentional double-spend attacks against people, for the least.

As I have briefly described in my article, I believe PoW can be improved to eliminate miners' above-mentioned need for pool mining. In this proposal, this is done by reducing the difficulty hundreds of thousands of times.

This difficulty reduction is achieved by tweaking classical PoW to let miners converge on a common set of transactions, submit their found results to the network first, and wait until their submitted shares have been summarized in a finalized block to be permanently added to the chain.

In classical PoW, it is done upside down: a finalized block is prepared from the very beginning, when the 'nonce hunt' is about to start. No cooperation, no contribution; this is the gap that pools maliciously fill, and I believe it can be totally eliminated by PoCW, this proposal.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: ir.hn on June 09, 2018, 03:10:55 AM
So what you're saying is... basically Proof of Participation (you can steal PoP, I won't be offended 8)). Everyone gets a participation trophy which is encoded into the "real" block before the block reward is distributed? So basically there is, in reality, one big mining pool run by the software? Sounds really good in theory.

Whose ledger would be the accepted ledger for the transactions in that case? I think that is why Bitcoin has one winner: so that one person's ledger becomes the accepted ledger.

Like you, I figured the only way to beat pools, other than coins like SpreadCoin (not sure if it succeeds at it), was lowering the block time (which is a double-edged sword in many ways). Another idea I had was to increase coinbase maturity to something like 9 months; I figured that would throw a wrench into pools. What do you think?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: GLP on June 09, 2018, 06:33:38 AM
I thought a lot about this proposition, and I have concluded that:
1- It is a decent one, and I have not found any relevant technical objection against it.
2- Its problems will be both social and political.

So, I suggest you make a proof of concept by implementing it on some testnet; people who embrace it can help by putting up a bounty to hack it. The benefits will be:
1- Objectivity on the proof of work.
2- If the political resistance is unbeatable and the social incentive is not there, you will already have taken a half-step toward its production implementation on a new blockchain.

As you may agree, political resistance is not something that can be overcome by rational/objective argumentation alone.

Thank you for sharing; I really value this work. Please don't give up because of a "BTC or nothing" reasoning; it deserves better horizons.

Wish you the best. I will definitely think more about it and will be glad to share any future thoughts.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 09, 2018, 07:53:44 AM
@GLP

Thanks for your encouraging reply. Glad to see you support the core idea, and I'm looking forward to your future contributions to this project.

As I mentioned earlier, we are in an intense situation and need to act like a samurai, both quick and perfect. Except that, unlike a samurai, we also need spiritual, social, economic and technical contributions.

I strongly support your suggested roadmap and will do my best to make it happen. Thank you.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 09, 2018, 08:39:59 AM
So what you're saying is... basically Proof of Participation (you can steal PoP, I won't be offended 8)). Everyone gets a participation trophy which is encoded into the "real" block before the block reward is distributed? So basically there is, in reality, one big mining pool run by the software? Sounds really good in theory.

Thanks for the explanation, and yes, people who are used to pool mining would experience the network as a huge pool. Good analogy. :)

By the way, I prefer keeping the title as is, Proof of Collaborative Work.

PoCW is just an improvement to PoW that inherits its core objective ideas, and it could be categorized as an implementation variant rather than a new algorithm.

Quote
Whose ledger would be the accepted ledger for the transactions in that case? I think that is why Bitcoin has one winner: so that one person's ledger becomes the accepted ledger.
No matter whose ledger it is, rational miners should contribute to the first ledger (at the start of the Contribution Phase) and converge to the most popular one (as this phase develops). Obviously, the ledger under consideration should always be checked for validity.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: buwaytress on June 09, 2018, 08:45:07 AM
Always reading with interest, and happy to see people are always trying to work on this problem of pool centralisation, although I'm not sure any of the direct threats (51% attacks, for example) would really happen, since pools themselves, no matter how large, have historically understood the fine balance they need to maintain to avoid miners simply disconnecting and joining other pools.

Nevertheless, it is exactly these finely-maintained balances that make centralised pool decisions a concern. I just wanted to share something else I read yesterday (a BIP draft (https://bitcointalk.org/index.php?topic=4436756.msg39624664#msg39624664)) that instead looked to change the protocol to give pools a bit less power when it comes to creating block templates.

Your suggestion of a way to make individual miners' contributed work count is something a lot of people would support. It makes me wonder if anyone has ever calculated the cost of wasted work (work that never counted); I'm sure it's a LOT.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 09, 2018, 10:29:07 AM
@buwaytress

Thanks for the support and for sharing the referenced BIP; I just posted a reply (https://bitcointalk.org/index.php?topic=4436756.msg39710846#msg39710846) in your topic.

As you may have noticed, and as I have tried to show there, that proposal is somewhat too conservative and can't change things by more than, as you say, 'a bit'.



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: ir.hn on June 09, 2018, 01:02:09 PM
Ok, thanks for your explanation; it is helping me go back to your original post and make more sense of it.

So the Net Merkle Tree is a merkle tree without the block reward. So if you win a nonce for it (a collaboration share), you don't automatically get the block reward à la Bitcoin.

I was wrong that it is a participation trophy. It is a weighted block reward for each participant (collaborator) that varies based on how "good" their nonce was. Is there a set time limit for collaboration shares, so a miner can basically "wait until I get a nonce -this- good" before sending it in? Or should they just find a bunch of low-difficulty nonces? Is there a limit on how many collaboration shares can be included in the Shared Coinbase Transaction?

How does the "convergence on an accepted transaction ledger" happen, exactly and automatically?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 09, 2018, 03:00:48 PM
So the Net Merkle Tree is a merkle tree without the block reward. So if you win a nonce for it (a collaboration share), you don't automatically get the block reward à la Bitcoin.
Exactly.
Quote
... Is there a set time limit for collaboration shares, so a miner can basically "wait until I get a nonce -this- good" before sending it in? Or should they just find a bunch of low-difficulty nonces? Is there a limit on how many collaboration shares can be included in the Shared Coinbase Transaction?
Yes. It is set to be at least 0.0001 as good as what the calculated network difficulty enforces, and in a worst-case scenario there would be something like 9,500 Coinbase Shares packed into the Shared Coinbase Transaction, in my terminology. Depending on the implementation details, this implies up to 64 KB for the transaction.

In practice, you can expect much lower numbers, down to just 2 shares. So, regarding the distribution of probabilities, I guess 4,250 is a good assumption for the average; hence 32 KB or so for the Shared Coinbase Transaction.
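
A quick sanity check on the worst-case figure (a sketch; it just assumes every share sits exactly at the 0.0001 floor):

Code:
finalization_target = 0.95   # cumulative score required
min_share_score = 0.0001     # minimum score per Coinbase Share
print(int(finalization_target / min_share_score))   # 9500 shares, worst case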

Quote

How does the "convergence on an accepted transaction ledger" happen, exactly and automatically?


Once the very first Prepared Block is mined and propagated through the network, average rational miners have no more reason to keep trying to mine another, competing instance of such a block. As long as they are sure the mined block under consideration is valid and won't be rejected by peers, they will rationally accept that they have lost the chance of gaining the transaction fees and switch to the Contribution Phase, producing shares for the Net Merkle Root (the transaction ledger) of that same block. It is the most innovative part of my proposal, I guess.

Obviously, the above-mentioned process converges exponentially, because with every new Collaboration Share issued and propagated, the most popular Net Merkle Root gets even more support from the network. I think before we get even close to 20% of the needed shares, practically the whole network will be mining the same work.
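
To illustrate this convergence argument, here is a toy simulation (my own sketch, not part of the protocol): after a short noisy start where miners back roots at random, every rational miner extends whichever root is currently most popular, so an early lead compounds and one root ends up with nearly all the score.

Code:
import random
from collections import Counter

roots = ["A", "B", "C"]                # competing Prepared Block Merkle roots
scores = Counter()

for step in range(5000):               # 5,000 share-mining events
    if step < 50:
        # early on, propagation delays mean miners back roots at random
        choice = random.choice(roots)
    else:
        # later, rational miners back whichever root is most popular
        choice = max(roots, key=lambda r: scores[r])
    scores[choice] += 0.0001           # one minimum-difficulty share

print(scores.most_common())            # the early leader takes almost everything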


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: ir.hn on June 09, 2018, 06:40:26 PM
Quote
Yes. It is set to be at least 0.0001 as good as what the calculated network difficulty enforces, and in a worst-case scenario there would be something like 9,500 Coinbase Shares packed into the Shared Coinbase Transaction, in my terminology. Depending on the implementation details, this implies up to 64 KB for the transaction.

What I was getting at was actually this: what if there is a really good miner? You don't want this miner submitting contribution shares every time he gets a .0001 nonce, because then he would fill up the block with all his contribution shares. You would rather have him wait until he gets, say, a .01 nonce; that way he gets a bigger piece of the pie without crowding the block with tons of small-value nonces. If there were a set time limit, he could say "ok, I can wait until I have at least a nonce of .01 difficulty before sharing it, because I know that in the given time frame I should be able to find one that high". One way to do this is to reward higher-difficulty shares slightly disproportionately more. But the problem remains that there would still be an incentive to post his lower-difficulty nonces too, so I'm not sure what the solution would be to get him not to share them.

It may not be vital, but it would be good if a flood attack like this could be avoided.


Quote
Once the very first Prepared Block is mined and propagated through the network, average rational miners have no more reason to keep trying to mine another, competing instance of such a block. As long as they are sure the mined block under consideration is valid and won't be rejected by peers, they will rationally accept that they have lost the chance of gaining the transaction fees and switch to the Contribution Phase, producing shares for the Net Merkle Root (the transaction ledger) of that same block. It is the most innovative part of my proposal, I guess.

Wow, very cool. So what you are saying is that the "Prepared Block" is always mined first and has its own proof-of-work requirement, separate from anything else? If so, we could think of this Prepared Block as the "real" block of the blockchain. So the winner of the Prepared Block gets the transaction fees, and the contribution block mined later is for the block-reward coins? Then the only problem is a slightly bulkier and slower blockchain: instead of 1 block mined per transaction group, there are two. Not a bad tradeoff, though, when you consider not only the risk pools pose to the network but also the risk, with pools, of losing your money to a hack, and of having to create accounts and passwords for different pools and whatever.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 09, 2018, 07:54:11 PM
Quote
Yes. It is set to be at least 0.0001 as good as what the calculated network difficulty enforces, and in a worst-case scenario there would be something like 9,500 Coinbase Shares packed into the Shared Coinbase Transaction, in my terminology. Depending on the implementation details, this implies up to 64 KB for the transaction.

What I was getting at was actually this: what if there is a really good miner? You don't want this miner submitting contribution shares every time he gets a .0001 nonce, because then he would fill up the block with all his contribution shares. You would rather have him wait until he gets, say, a .01 nonce; that way he gets a bigger piece of the pie without crowding the block with tons of small-value nonces. If there were a set time limit, he could say "ok, I can wait until I have at least a nonce of .01 difficulty before sharing it, because I know that in the given time frame I should be able to find one that high". One way to do this is to reward higher-difficulty shares slightly disproportionately more. But the problem remains that there would still be an incentive to post his lower-difficulty nonces too, so I'm not sure what the solution would be to get him not to share them.

It may not be vital, but it would be good if a flood attack like this could be avoided.
Two points I have to make clear here:
  • First: a miner who finds a Collaboration Share with a nonce better than 0.0001 will eventually be rewarded with a proportional amount of the block reward, provided he has chosen the right Net Merkle Tree to mine for, the one the network has converged on
  • Second: a miner can mine any number of shares and publish them
Plus, it is important to note that Collaboration Shares are very small payloads (around 50 bytes), so there is no reason to set the quality limit that high.

In the current Bitcoin blockchain, once a block is found by a miner (whether a solo miner or a pool operator), propagating it takes a lot of communication overhead, because the peer receiving the block must learn the exact Merkle tree behind the Merkle root present in the header: at least the transaction ids have to be transmitted, and in some cases the actual transaction bodies as well.

Also, I'm generally against hesitating to utilize networking bandwidth. I think that is a very weird design strategy for a decentralized, consensus-based system running on an infrastructure like the Internet.
Quote

Quote
Once the very first Prepared Block is mined and propagated through the network, average rational miners have no more reason to keep trying to mine another, competing instance of such a block. As long as they are sure the mined block under consideration is valid and won't be rejected by peers, they will rationally accept that they have lost the chance of gaining the transaction fees and switch to the Contribution Phase, producing shares for the Net Merkle Root (the transaction ledger) of that same block. It is the most innovative part of my proposal, I guess.

Wow, very cool. So what you are saying is that the "Prepared Block" is always mined first and has its own proof-of-work requirement, separate from anything else? If so, we could think of this Prepared Block as the "real" block of the blockchain. So the winner of the Prepared Block gets the transaction fees, and the contribution block mined later is for the block-reward coins? Then the only problem is a slightly bulkier and slower blockchain: instead of 1 block mined per transaction group, there are two. Not a bad tradeoff, though, when you consider not only the risk pools pose to the network but also the risk, with pools, of losing your money to a hack, and of having to create accounts and passwords for different pools and whatever.


The Prepared Block won't ever go into the blockchain, and the 5% difficulty level set here won't be checked later. It is just a convention for lubricating the convergence mechanism. Let's dive a bit deeper into it:

Once a Prepared Block is mined and propagated, the receiving miners check its integrity and difficulty before starting to contribute, so the sole fact that some miners contribute to it and produce shares for its Net Merkle Root is proof enough that the block has already convinced them.

One possible attack scenario, by a selfish owner of a very large mining facility, could go something like this:

A selfish miner may choose to bypass the Preparation Phase by simply producing a Net Merkle Tree, generating Collaboration Shares for it, and propagating them in the network.
Receiving miners may be fooled into thinking they have missed something, that there IS actually some Prepared Block out there, and decide to join the race before it is too late.
This way, the selfish miner has hijacked the transaction fees without consuming enough resources and, more importantly, has artificially accelerated the mining process, which has problematic consequences for the network's difficulty adjustment algorithm.

Mitigation:
Decent, loyal miners won't contribute to a work that cannot be validated by querying for the Prepared Block, so such an attack won't work unless a compromise has been made between miners, who are not expected to form a 50%+1 majority. And for this minority there will always be the problem of distributing the illegitimate gains properly; thus they have to set a lower limit, say 1%, and they still risk losing the main war again and again, because in most cases they will be converging on a work that the majority simply won't commit to.

When it comes to bootstrapping or partial synchronization, for example, full nodes won't ask "Where did you get this Net Merkle Tree?", because it just doesn't matter anymore.



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: ir.hn on June 09, 2018, 10:23:51 PM
Hmm, yes, we should keep in mind that finding something besides the Internet would be better. Ham radio is an option if the data size is very small.

I guess to answer my own question: yes, a flood attack is possible, but it is ok. Difficulty would be adjusted so that, for example, the block typically fills up with contribution shares every 10 minutes. Instead of taking on average 10,000 x the block time for a small miner to win his first block, he will win a little bit every block, on average. The big miners will fill up the block with small-difficulty nonces, but the small miner has a 10,000x better chance of getting one of his small-difficulty nonces into the block. Nice.

I think I am with you on the preparation block (the same thing as the "initiation block", I presume). The person who wins that somewhat-higher-difficulty nonce gains the transaction fees, which immediately show up in their wallet. So this block never needs to become part of any hash to prove that it happened? It seems a little risky, to my mind, that it just disappears after it is won. But I can see that it is just needed to "prime the pump" so the collaborators have something to check their ledger against. It still seems a tad risky, though, because it appears the prep block is just a suggestion. From your explanation of the "preparation block skipping" attack, I gather the network may lose preparation blocks entirely, since it appears the prep block hash doesn't need to be part of the collaboration share.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 09, 2018, 11:45:54 PM
Yes, we are getting synchronized, I suppose. :)

As for the risks you mentioned in forgetting the Prepared Block:

Firstly, keep in mind that this block's Net Merkle Root points to a coinbase transaction that charges the fees to the miner's wallet address. He should wait (praying for it to get enough contribution) until it is included as the Net Merkle Root of the finalized block.

Secondly, I have assessed the selfish-mining attack described in my last post a bit further, and I have come to a very interesting result: we can add a simple restriction for full nodes to reject the last block in the chain whenever there is no valid Prepared Block with 5% difficulty around.
This simple restriction leads to an excellent defence against the attack without imposing much overhead. As a result, we have a new definition for the latest block in the chain: it is the one that has a twin (with the same Net Merkle Root, 5% difficulty and the same parent) present in the network. When a new round of mining is going to take place, miners simply point to the latest block that has such a twin (see the sketch below).

It looks to me like a waste of space to keep these twins alive for more than a few rounds, because I don't yet see any reason to use Prepared Blocks for bootstrapping and/or partial resynchronization.
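
Here is a minimal sketch of that restriction, under hypothetical structures and field names: a node treats a block as an acceptable chain tip only while it knows a "twin" Prepared Block with the same parent, the same Net Merkle Root and the 5% preparation difficulty.

Code:
from dataclasses import dataclass
from typing import Iterable

@dataclass
class BlockSummary:
    parent_hash: bytes
    net_merkle_root: bytes
    difficulty_score: float   # fraction of the block difficulty achieved

def has_valid_twin(tip: BlockSummary,
                   prepared_blocks: Iterable[BlockSummary]) -> bool:
    # Accept the tip only if some known Prepared Block shares its parent and
    # Net Merkle Root and meets the 5% preparation difficulty.
    return any(p.parent_hash == tip.parent_hash
               and p.net_merkle_root == tip.net_merkle_root
               and p.difficulty_score >= 0.05
               for p in prepared_blocks)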



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: ir.hn on June 10, 2018, 03:19:03 AM
So you are saying that the preparation block's hash is included in the net merkle tree. Sounds good.

I'm not really sure about the twin idea: one block is part of the blockchain and the other block is floating around somewhere? If they are exactly the same, wouldn't they just be the same block? I'm not sure how you can have two exactly identical blocks, unless one is slightly different.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 10, 2018, 06:49:34 AM
So you are saying that the preparation block's hash is included in the net merkle tree. Sounds good.

I'm not really sure about the twin idea: one block is part of the blockchain and the other block is floating around somewhere? If they are exactly the same, wouldn't they just be the same block? I'm not sure how you can have two exactly identical blocks, unless one is slightly different.

I'm afraid I didn't explain my ideas regarding this issue thoroughly in my previous post. Actually, it is not possible to have the Prepared Block's hash in the respective Net Merkle Tree (it leads to a recursive definition with no practical exit condition).

Anyway, I decided to include it in the proposal formally, and I hope it helps.

Please check the starting post again in a few minutes.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 10, 2018, 07:11:40 PM
@anunymint
You are so sharp, I mean too sharp.  ;D

Really? Just two minutes after I referred you here, you digested the whole idea and closed the case?

First of all, switching between pools does not change anything, and I prefer not to waste the readers' time arguing about it. Just reconsider your claim and please, take your time.

Secondly, in this proposal miners do have to validate the transactions they are committing to; objecting to that is just saying they should leave validation to a limited number of trusted nodes, validators, authorities, whatever, as is the case with pool mining. :o

You are full of surprises, and this is another weird thing you are suggesting; again, I can't imagine even arguing about such claims.

I'm familiar with this literature, tho; it is the outcome of any crisis: journalistic revisionism full of deconstructional claims... just because something is going wrong.

I like it, seriously, it is inspiring, but ultimately we have to adapt and improve instead of giving up and trying to invent some strange thing that is at the same time objective, scalable, decentralized, sharded, ... and where the miners do not need to validate the transactions they are committing to while consuming their resources in the mining process!

Anyway, I referred you here to reject one specific claim you have made about PoW: that it is doomed to mining variance and, as a result, the inevitability of pool mining, from which you conclude that this is the most important factor making PoW networks infeasible for sharding schemes.

As I see you are not going to share and tell us how it, PoCW I mean, would help to implement sharding, I'll just try some "reverse engineering" here:

When you say that Bitcoin won't allow sharding because it is doomed to centralized pool mining, I logically come to this conclusion:
this variant of PoW (the current proposal, Proof of Collaborative Work) resolves that issue and makes it feasible to implement sharding on a PoCW network. Right?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 11, 2018, 11:31:35 AM
Quote from: anunymint  date=1528667760

In my design, there would be no such problem with a liveness threshold, as miners have a very good chance to contribute frequently even with small fractions of hashpower.

Oh I thought you might reply with that, and I forgot to preempt it by saying that if you count proof-of-work shares instead of winning block solutions for shard participation, then I was thinking that would trivially enable a Sybil attack but on further thought I think the vulnerability is different than my first thought.

Thinking that out a bit more, that would induce the mining farm to send all its shares as shares instead accumulating them in house and releasing only the block solutions. This would mean that small miners can’t realistically even validate that shard participation is honest. So the centralization risk (that eliminating pools was supposed to fix) is essentially not ameliorated.
I quoted the above reply partially from this topic (https://bitcointalk.org/index.php?topic=4438334.msg39835786#msg39835786)

I have merited @anunymint for this post, as it is a serious objection and I appreciate it. :)

Actually, I was waiting for this to come, and obviously I have the answer right in my pocket ;)

Talking about block validation costs, we usually forget what a validator specifically does in this process.
In Nakamoto's implementation of PoW, which is installed in Bitcoin and inherited by most altcoins either conceptually or directly through forks of his code, when a peer claims a block, the receivers have a lot of work to do to verify its validity and make the proper decision accordingly.
It involves checking:
  • 1- The block is well formed and queried properly: very trivial, cpu bound
  • 2- The block header's hashPrevBlock field points to the most recent block in the chain: very trivial, cpu bound
  • 3- The block hash is as good as expected (regarding the current calculated difficulty): very trivial, cpu bound
  • 4- The Merkle path (the actual block payload) encompasses valid transactions: very cumbersome, I/O bound

In my proposal, miners have to go through this process only for each Prepared Block (say 2-3 times per round), no matter how heavy the traffic is!

This is the magic of the protocol: there is no need to re-check every submitted share's Merkle path (transaction set), because the shares are not conventional blocks at all. They are Collaboration Shares, and they share their merkle root with an already-validated Prepared Block.

So the main traffic, the submitted shares, is validated in a few microseconds per share; no considerable overhead is involved here.

Even when it comes to Finalization Blocks, nodes do not, as a rule, have to go through Merkle-path validation. It is one of the most beautiful features of the protocol, and I'm not aware of any competing alternative that comes even close.
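
To make that point concrete, here is a minimal sketch (my own illustration; the exact hashing layout of a share is an assumption): validating a share boils down to one double-SHA256 over a ~50-byte payload plus a set lookup, with no Merkle-path or transaction checks.

Code:
import hashlib

MIN_SHARE_SCORE = 0.0001   # minimum fraction of the block difficulty

def share_score(prev_block_hash: bytes, net_merkle_root: bytes,
                miner_address: bytes, nonce: int, block_target: int) -> float:
    # Score = how much harder the share's hash is than the block target;
    # a score of 0.0001 corresponds to a target 10,000 times easier.
    payload = (prev_block_hash + net_merkle_root + miner_address
               + nonce.to_bytes(8, "big"))
    h = int.from_bytes(
        hashlib.sha256(hashlib.sha256(payload).digest()).digest(), "big")
    return block_target / (h + 1)

def valid_share(prev_block_hash: bytes, net_merkle_root: bytes,
                miner_address: bytes, nonce: int, block_target: int,
                validated_roots: set) -> bool:
    # No Merkle-path re-validation: the root only has to match a Prepared
    # Block this node already validated, and the score must clear the floor.
    return (net_merkle_root in validated_roots
            and share_score(prev_block_hash, net_merkle_root, miner_address,
                            nonce, block_target) >= MIN_SHARE_SCORE)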

So, the question of whether a 'small miner' is able to contribute in PoCW, given the validation overhead, becomes almost equivalent to asking whether he is able to solo mine in traditional PoW because of it, once the variance dilemma is no longer a discouraging factor.

That is what my proposal tries to fix: removing the variance problem without adding considerable validation overhead.

Quote
This creates another security problem in general which is that validation comes closer to the cost of mining, so it limits who can participate in validation. If you make each proof-of-work share 100X more difficult than the average network cost of computing one proof-of-work trial, then only miners with significantly more than 1% of the hashrate can possibly afford to do validation. If you instead make the share difficulty too high, then the variance of the small miners is too high to make it into the shards with a reliable liveness outcome.
Already resolved.
Quote
... Btw, AFAIR you were not the first person to propose that shares be credited instead of just block solutions at the protocol layer. Seems I vaguely recall the idea goes back to even before I arrived on BCT in 2013. Don’t recall how well developed the proposal was.
Highly appreciate it if you would kindly refer me to the source.
Quote
Also I think miners with more than 1% of the network hashrate would realize that it’s in their interest to agree to orphan any blocks that have share difficulties that is less than some reasonable level so that these miners will not be forced to do all this wasteful validation of low difficulty shares. Thus they would censor the small miners entirely. So there would seem to be a natural Schelling point around effectively (as I have described) automatically turning off your protocol change. The network would see your protocol change as malware from an economic perspective.
Although it is not a big deal to have big miners solo mine, I have looked somewhat closer to this issue:
No, there would be no incentive to solo mine and abstain to collaborate, even for big miners (while it is supported directly by the latest tweaks and upgrade I have committed before your post).

Analogically speaking:
In traditional PoW, a big miner, owning a large farm, may choose to solo mine because he doesn't suffer from mining variance that much to consider paying fees and taking risks by joining pools for a cure.
In PocW there is almost no costs involved, the whole network is a giant, decentralized pool and charges no fees and implies no risk while providing a stable payout and performance index.



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 11, 2018, 08:28:53 PM
Quote from: anunymint (quoting the full exchange above)

Quote
Appreciated the merit and I like spirited debate. The entire point is to make sure we have the correct analyses. So I must be frank about my available time and the quality of your explanations. Please put more effort into helping me understand your point.
You are welcome.  :)
Reading your reply to the end, I became more convinced about your time issues. It is really a problem right now; you have time to write but not to read, I suppose  :P

Quote

You have a lot of verbiage here. I don’t have the time to go learn your non-standard terminology, e.g. “transaction set”. Transactions? We’re talking about mining shares here. Is that a typo? Please reorganize your response into a succinct and very coherent one that doesn't require the reader to go wade through your thought typos and private language.

You should be able to explain your idea in a very coherent way that is very easy for an expert to grok without having to go try reverse engineer your specification which employs your private language.

What do you mean by standard? What standard? Has ISO issued something?
There is no way to explain a new idea in the old language. Technology is the language itself; improving technology implies extending the terminology, manipulating it, and redefining a lot of words and phrases.

Being a critic or an expert who does research on innovative ideas requires spending time on understanding the way terms and words have been redefined and used to create the new idea.
It is what we do: we create terms and concepts, nothing more, absolutely nothing more.

And yet the combination 'transaction set' is not such a complicated, innovative term of mine; it is used once or twice in my posts as a complementary description for non-expert readers who may be confused by 'Merkle path', which I use more often.

Quote
Bottom line is that every mining share that will be recorded in the blockchain (in a Merkle tree or whatever) and is rewarded by the blockchain, has to be validated by all nodes to be sure that someone isn't cheating and being rewarded for shares that were invalid.

Bottom line is you should read  ;D

I'll explain it again here, just for you, but I swear to god it is the last time I do this for a critic who doesn't read:

The shares under discussion are called Collaboration Shares in this proposal (I hope neither ISO nor you are offended); these are NOT conventional blocks like what solo or pool miners submit to the blockchain or to a pool service.

Most importantly, the Merkle tree these shares commit to is NOT the conventional bitcoin Merkle tree your poisoned terminology is used to; instead they commit to a variant which I (with all due respect to ISO) call a Net Merkle Tree, whose coinbase transaction carries no block reward (i.e. the sum of inputs equals the sum of outputs).

This way, miners repeatedly deal with different shares that have the same Merkle root: no need to fetch the transactions from the mempool or the peers, verify their signatures, check them against the UTXO, ... over and over. It has already been done, once they have decided to contribute to this specific Net Merkle Tree. Once!

I think we are done here. Just think about it; this is the most innovative part but not the most complex one. Just take a deep breath and, instead of rushing for the keyboard, use the mouse and check the starting post of this topic.

Quote
Thus I maintain that my objection was correct, until you coherently explain how I was incorrect.
done.

Quote

It’s not just the fees. It’s the I/O cost of moving all those shares over the network. ...
The shares, unlike blocks (remember? they are not blocks, they are just Collaboration Shares), won't cost much bandwidth. When an ordinary block is transmitted, there is a lot of overhead for its payload to be loaded instantly or incrementally by each peer. Collaboration Shares cause no such overhead: just something like 50 bytes (I haven't decided on an exact data structure yet, tho) transmitted once, no handshake, no query, no more data.
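
For a sense of scale, a field layout along the lines sketched in this thread would come to a few dozen bytes per share; the exact sizes below are hypothetical, since the post explicitly leaves the structure undecided:

Code:
# Hypothetical Collaboration Share layout (illustrative only):
NET_MERKLE_ROOT = 32  # SHA256 root of the Net Merkle Tree
WALLET_ADDRESS  = 20  # HASH160 of the miner's public key
NONCE           = 4   # as in a standard block header
print(NET_MERKLE_ROOT + WALLET_ADDRESS + NONCE)  # 56 bytes per share, roughly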

Quote
...a mining farm can easily remain anonymous, but your proposal would make that more difficult.
Remaining anonymous is as easy as ever: connect to a trusted full node and do whatever you want.

Quote
... Also the cost of change in data center infrastructure and ASIC hardware to accommodate this change. Also the smaller miners are disproportionately affected by the lower economies-of-scale to deal with all these costs as well as the cost of actual validation which I did not see a coherent rebuttal to above.

You may have a coherent rebuttal. I will await it.
Well, the validation story is over (I hope), and FYI, this proposal does not involve any upgrade/change in infrastructure or ASICs; it is just a software upgrade.

EDIT:
Quote
The following are essentially in the same direction as your idea (crediting all the blocks in a tree is the same as rewarding mining shares), but I also think I had seen specific mentions of the idea of putting the mining shares in the block chain. I know I had that idea already before you mentioned it, because I had read about it and even thought about it myself. I had dismissed it because the validation asymmetries are lost. I vaguely remember that in past discussions it was also shot down because of the bandwidth costs and the impact on orphan rate due to propagation delays (at least that was the thinking at the time, more than 5 years ago). Vitalik also blogged about GHOST and pointed out some of its problems.

https://bitcointalk.org/index.php?topic=396350.0

https://bitcointalk.org/index.php?topic=359582.0

https://bitcointalk.org/index.php?topic=569403.0
I forgot this part when posting my reply, sorry.

Of course, putting shares in the blockchain is not far from imagination; the problem is how to deal with it without messing with resources and capacities, and that is what this proposal is about.
I checked the links; unfortunately they are not even close. GHOST (the 1st and 2nd proposals being about it) is a story of its own that attempts to change the 'longest chain' fork rule and is out of context, and the third proposal, about signing every single hash by the miner, is just an anti-pool move with no solution for the core problem: mining variance.
So I have to maintain that my work is original.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 11, 2018, 09:30:40 PM
@anunymint
Please edit your latest reply, some quote tags are missing there. I'll skip quoting and simply reply to your most important point in that post:

You say that your objection is not about the signatures, UTXO, etc. of the Merkle path and the transactions included in the block, but about its hash being qualified enough!

Are you kidding? Computing a SHA256 hash takes a few microseconds even for an average cpu!

An ASIC miner does it in a few nanoseconds!

Am I missing something, or are you just somewhat confused?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 11, 2018, 09:38:34 PM
Are you kidding? Computing a SHA256 hash takes a few microseconds even for an average cpu!

An ASIC miner does it in a few nanoseconds!

Am I missing something, or are you just somewhat confused?

No you’re not thinking. Think about what you just wrote. Everything is relative. And go back to my original objection and the point about “100X”.

So you are serious!
Really? One nanosecond 100X is just 0.1 microsecond, and 1 microsecond 100X is 0.1 millisecond.

Come on, you have to take it back, your objection about a validation crisis; there is no crisis, just take it back.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 11, 2018, 09:52:15 PM
Actually, I guess handling more than 180,000 shares per minute (3,000 shares per second) would be totally feasible for a full node on a commodity pc.
With the parameters I have proposed in this version, there would be fewer than 20,000 shares per minute in the worst scenario, however.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 11, 2018, 10:32:28 PM
Are you kidding? Computing a SHA256 hash takes a few microseconds even for an average cpu!

An ASIC miner does it in a few nanoseconds!

Am I missing something, or are you just somewhat confused?
...
Also please remember my objection was in terms of unbounded validators for OmniLedger shards. I never came over to refute your proposal for uses outside OmniLedger.
Oh, you did; we are not discussing OmniLedger here. But thank you, you are taking back your objection, well, somehow. It is progress.
Quote
It may be plausible to use your proposal up to some cutoff on the smallest hashrate miner allowed. I have not actually computed it.
progress, progress.
Quote
So you are serious!
Really? One nanosecond 100X is just 0.1 microsecond, and 1 microsecond 100X is 0.1 millisecond.

The absolute time is irrelevant. It is the relativity that matters. Please take some time to understand what that means. I shouldn’t have to explain it.
I have been studying theoretical physics for a while and I'm somewhat of an expert in relativity theory, and yet I can't find any relativity related issue here.
Quote
The variance on the small miners is so incredibly high because their hashrate is so minuscule compared to the network hashrate.
Mining, in its contribution phase, is not halted for validation; validation is done in the full node, in parallel with mining (generating hashes). Validation helps in the transition between phases and is not strictly part of the mining process.
Quote
Therefore if you require the network to validate the small difficulty share that a small miner is capable of producing within 10 minutes, then that means all miners must validate all those small shares produced by all the hashrate!

The Bitcoin network hashrate (https://blockchain.info/charts/hash-rate) is currently 40 million THPS. A single Antminer S9 is 14 THPS. So we’d need more than 2.5 million shares per second to be validated.
Actually, an S9 will produce one share every five hours or so! Let's calculate:
1.4*10^10 / 4*10^16 = 3.5*10^-7 = 0.00000035

In PoCW, every share is required to be 0.0001 times (10,000 times easier than) the block difficulty. To generate one share per minute (the block time) on average, a miner would need the total network hashrate; our S9, with just 0.00000035 of it, would produce one full block every ~2,900,000 minutes (1 / 0.00000035), and with the difficulty eased 10,000 times, that turns into one share roughly every 290 minutes.
Again, no flood, no crisis.
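
Here is the same back-of-the-envelope calculation as a throwaway script, using the figures from this post:

Code:
s9_hashrate      = 1.4e10   # Antminer S9, in the units used above
network_hashrate = 4e16     # whole network, same units
share_easing     = 10_000   # shares are 0.0001x the block difficulty
block_minutes    = 1        # PoCW block time

fraction = s9_hashrate / network_hashrate           # 3.5e-07
minutes_per_block = block_minutes / fraction        # ~2,900,000 minutes per full block
minutes_per_share = minutes_per_block / share_easing
print(round(minutes_per_share))                     # ~286, i.e. one share every ~5 hours
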
Quote
But you also have this problem to consider:

Essentially all you’re doing is lowering the block period, which is known to be insecure as it approaches a smaller multiple of the propagation delay in the network. So I am also thinking your proposal is flawed for that reason. I think that was the reason it was shot down in the past. As I wrote in one of my edits, Vitalik had explained some of these flaws in GHOST.
Reducing the block time to 1 minute is not part of this proposal from the algorithmic point of view, but I vote in favor of it and can counter any argument against it. Ethereum uses a 15 second block time with an uncle rate lower than 10%; I believe even a 30 second block time is feasible.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 04:48:19 AM
@anunymint

I understand. You are a good writer and a respected crypto advocate, and I have shown my respect for you more than once. But it just happens that the level of noise and weird claims, and the number of white papers and proposals about Proof of Something-other-than-work, is annoyingly high, and it was my fault in the first place to start a thread and try to convince people, guys like you especially, that this one is not like them.

I have to apologize for my too-high expectations, and for getting too intimate and hurting you. I didn't mean it.

As I've just mentioned, it is too much of an expectation for advocates (who are already vaccinated against the said noise and hype) to take this proposal as a serious one and try to digest it thoroughly (why should they?).

You might be right; I'm not potent enough to present a proposal with such an ambitious agenda, shifting Nakamoto's winner-takes-all tradition to a collaborative proof of work alternative, as a serious paradigm shift, and to encourage people to spend some time digesting it.

But it is what I've got, and it makes me angry sometimes, with myself primarily and with the whole situation secondly, not with you. You are just one other advocate; you are not alone. People are busy investigating PoS or pumping bitcoin; nobody cares. I'm sick of it.

And when you came on board and I started getting more optimistic, my expectations got too high and I went off the rails. Sorry.

Imo, despite the bitterness, we have made some progress, and I sincerely ask you to schedule some time and take a closer look at the proposal. I assure you, every single objection you have made here is already addressed by the starting post or through the replies I have made. Thank you for participating, and sorry for the inconvenience.  :)


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 08:48:30 AM
miner charges all transaction fees to his account <--- why is a miner paying transaction fees?
First of all, glad to see you back and showing your commitment, I appreciate it:  :)
The miner is not paying, he is charging; he is not being charged. I mean he rewards his wallet with the transaction fees (only the transaction fees, not the block reward).
Quote
Quote
calculated difficulty using previous block hash padded with all previous fields <--- padded? how does a hash provide a difficulty adjustment?
Who said anything about difficulty adjustment here? It is about calculating the difficulty of the share by:
1- padding the fields to each other: previous block hash + the other fields of the structure (Net Merkle root + the miner's wallet address + nonce)
2- performing a sha2 hash
3- evaluating the difficulty of the hash
Quote
Quote

A computed difficulty score using the hash of ...
A calculated difficulty score is the ratio of the difficulty of the share to the target difficulty. It is typically less than 1, but greater scores (if any) are capped at 1.
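
As a minimal sketch (the function name and the clamping detail are my own reading of the above, not part of the spec), the score could be computed like this:

Code:
def difficulty_score(share_hash: int, block_target: int) -> float:
    """Ratio of the share's difficulty to the target difficulty, capped at 1.

    A lower hash value means a harder share, and difficulty scales as
    target / hash, so the score is that ratio clamped to at most 1.
    """
    return 1.0 if share_hash == 0 else min(1.0, block_target / share_hash)
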
Quote
Quote

For each share difficulty score is at least as good as 0.0001 <--- why is a difficulty good or bad? criteria?
Being good means being close to the target difficulty.
Quote
Quote

Sum of reward amount fields is equal to block reward and for each share is calculated proportional to its difficulty score <--- Do you mean weighted sum? Huh? Needs better explanation.
Yes, it deserves more explanation. It is about the structure of the Shared Coinbase Transaction. It is a magical structure that we use both for proving the work of the contributors (the sum of the scores/difficulties of all the items MUST satisfy the required difficulty target) and for distributing the reward (each share gets a fraction proportional to its score/difficulty).
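
A minimal sketch of that dual role, assuming the scores have already been computed as above (the 0.95 threshold is the figure used elsewhere in this proposal; the helper is illustrative):

Code:
def build_shared_coinbase(shares, block_reward, required_total=0.95):
    """shares: list of (wallet, score) pairs taken from Coinbase Shares.

    The summed scores prove the accumulated work; each share's cut of the
    block reward is proportional to its score.
    """
    total = sum(score for _, score in shares)
    if total < required_total:
        raise ValueError("accumulated scores do not satisfy the difficulty target")
    return [(wallet, block_reward * score / total) for wallet, score in shares]
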
Quote
Quote

It is fixed to yield a hash that is as difficult as target difficulty * 0.05  <--- how so? Where? What algorithm?
It is about the Prepared Block difficulty target, which should be set to 0.05 of the calculated network difficulty. Nothing new in terms of algorithm, just a matter of protocol, just like how traditional PoW enforces the difficulty for blocks.
Quote
Quote

It is fixed to yield a hash that is as difficult as target difficulty * 0.02

Mining process goes through 3 phases for each block: <--- these sections are not a sufficient explanation of the algorithm. You expect the reader to read your mind. Help us out here and explain how this thing you invented works

Ok, I'll do my best:

Unlike the situation in traditional PoW, in PoCW miners should go through three phases (they had better do so, unless they want to solo mine, which is not in their interest, or to commit an attack against the network, which is not feasible as long as they don't have the majority):

Phase 1: Miners SHOULD try to find a block with an at least 5%-good hash, rewarding the transaction fees to their wallets through a coinbase transaction (free of block reward, just transaction fees) committed to the Merkle tree whose root is committed to the block header. This is called the Preparation Phase.

Phase 2: After the network reaches a state where one, two, or three competing instances of such a block have been mined and propagated, miners MAY eventually realize that the window for mining such a block is closing, because of the risk of not getting to the final stage due to the competition.
Instead, they accept the fact that they won't be rewarded the transaction fees and choose to produce/mine Collaboration Shares for one of the above mined blocks, i.e. putting its Merkle root in the data structure named Collaboration Share, which can (later) trivially be translated to a Coinbase Share and used for difficulty evaluation and reward distribution at the same time (if the miner happened to choose the most popular Prepared Block).
I have discussed this phase extensively with @ir.hn and have shown that it is an exponentially convergent process: in the midst of the process we will be witnessing the whole network busy producing shares for the same Net Merkle Tree root.
This is called the Contribution Phase. Note: as you might have already realized, it is not mandatory. Also note that in this phase miners don't generate blocks; these are just shares, Collaboration Shares, that have to wait for the next phase, in which one miner (just one) may include them in a block, using their scores both to prove the work and to share the reward.

Phase 3: After enough shares have been accumulated for a Merkle root, miners SHOULD start to search for one final block (with a difficulty fixed at 2% of the calculated network difficulty) encompassing:
1- The Merkle root (remember, it has one coinbase transaction, rewarding only the original miner of the first phase) of one of the blocks mined in the first phase.
2- A new coinbase transaction, the Shared Coinbase Transaction, containing the required shares to prove the work and the weighted distribution of the block reward as an integrated whole.
3- The other usual fields.

This is the Finalization Phase.
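
Condensed into a sketch (the thresholds are the ones quoted in this thread; the function is illustrative, not part of the spec):

Code:
PREPARED_FACTOR = 0.05    # Prepared Block hash: 5% of network difficulty
SHARE_FACTOR    = 0.0001  # Collaboration Share: 0.01% of network difficulty
FINAL_FACTOR    = 0.02    # Finalized Block hash: 2% of network difficulty
SCORE_LIMIT     = 0.93    # switch to finalization as scores approach this

def next_action(prepared_blocks, accumulated_score):
    """Which phase a rational miner works on at any moment."""
    if not prepared_blocks:
        return "prepare"     # Phase 1: mine a Prepared Block, collect tx fees
    if accumulated_score < SCORE_LIMIT:
        return "contribute"  # Phase 2: mine Collaboration Shares for one root
    return "finalize"        # Phase 3: mine the Finalized Block carrying the
                             # Shared Coinbase Transaction
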
Quote
Quote

Phrases are devoid of meaning for me. With any key words that really confound me as making no sense are highlighted in bold.

Without being able to understand these, I can’t digest your specification unless I holistically reverse engineer what your intended meaning is. And I am unwilling to expend that effort.

Please translate.

I could figure it out if I really wanted to. But as I said, I have a lot of things to do and enough puzzles on my TODO list to solve already.
I did my best. Thanks for the patience/commitment  :)


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: tromp on June 12, 2018, 09:13:47 AM
Verification process involves:
  • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy network difficulty target cumulatively

This is a serious problem with your proposal. The proof of work is not self-contained within the header.
It requires the verifier to obtain up to 10000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 09:27:57 AM
  • Verification process involves:
    • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy network difficulty target cumulatively
This is a serious problem with your proposal. The proof of work is not self-contained within the header.
It requires the verifier to obtain up to 10000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
The Shared Coinbase Transaction is typically ~32 kB of data (an average of 4500 items) and doesn't need any further verification such as checking the UTXO, the mempool, whatever.
Although the shares have to be verified to meet the required difficulty (being hashed and examined), this is a cpu-bound task and way faster than verifying the block itself.

Note: verifying a block takes a lot of communication: accessing the mempool on disk, querying/fetching missing transactions from peers, verifying transaction signatures (a hell of a processing job, although not I/O bound), accessing the disk to check each transaction against the UTXO set, ...

According to my assessments, this verification adds zero or very little latency, because the verifier is multitasking and the job can be done in cpu idle time.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 09:33:19 AM
Additionally I think I found another game theory flaw in his design.

The design presumes that the leadership (for finding the 0.05 * Prepared blocks) can’t be attacked and subdivide the rest of the hashrate because you assume they would need 50+% to get a lead, but AFAICT that is not true because of selfish mining.

The 33% attacker can mine on his hidden Prepared block and then release it right before the rest of the network catches up.

Thanks for the comment. I have to analyse it more thoroughly; I am very glad to see you guys engaging this closely. I will be back in like half an hour with the analysis and possible mitigations.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: alfaenzo on June 12, 2018, 10:19:26 AM
i like it


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: tromp on June 12, 2018, 10:31:37 AM
The Shared Coinbase Transaction is typically ~32 kB of data (an average of 4500 items) and doesn't need any further verification such as checking the UTXO, the mempool, whatever.

Since the PoW should be considered an essential part of the header, what you are proposing is to increase the header size from 80 bytes up to 72 KB (worst case 10000 items), a nearly 1000-fold increase...


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 11:21:12 AM
@anunymint

As for the classical selfish mining attack itself, I personally disagree with calling it an attack at all. I rather see it as a fallacy, a straw man fallacy (https://en.wikipedia.org/wiki/Straw_man).
My reasoning:
PoW has nothing to do with announcement. Once a miner prefers to keep his block secret, it is his choice and his right as well; he is risking his block becoming an orphan in exchange for a possible advantage over the rest of the network in mining the next block.

Like PoW, this proposal is not about prohibiting people from selfish mining, but there is a point in rephrasing the above reasoning somewhat differently: this proposal is about reducing the pooling pressure and helping the network become more decentralized by increasing the number of miners. How? By reducing the variance of mining rewards, which is one of the 2 important factors behind this pressure (I will come back to the second factor soon).

So it might be a reasonable expectation for PoCW to have something to do with selfish mining.

It has, but first of all it is worth mentioning that, according to the protocol, miners are free to choose not to collaborate and to go solo if they wish, although by keeping the costs of participation very low and the benefits high enough, this practice is discouraged.

PoCW improves the situation by reducing the likelihood of pools taking place at all, eliminating one of the most important factors that makes their existence possible.

Your second objection happens to be about the second important factor behind pooling pressure: proximity.

It is about having access to information (a freshly mined block, for instance) and taking advantage of it, or not having access to such information and wasting resources (mining stale blocks) because of it. Even with completely loyal nodes, in bitcoin and other PoW based networks there is always a proximity premium for the nodes nearer to the source (the lucky finder of the fresh block) compared to other nodes.

I have to accept that, by pushing for more information circulating around, PoCW, this proposal, could be suspected of reinforcing this second pressure for pooling.

I have been investigating it for a while and my analysis suggests otherwise. It is a bit complicated and deserves to be considered more cautiously; I need to remind you that the proximity premium is a known flaw in PoW's decentralization agenda.

In a traditional winner-takes-all PoW network like bitcoin, there is just one piece of information (the fresh block) that causes the problem, true, but the weight of this information and the resulting premium is very high, and it is focused in one spot: the lucky miner in the focal point and its neighbors in the hot zone.

In this proposal, the premium is distributed tens of thousands of times more evenly.

Oops! There is almost no proximity premium flaw in Proof of Collaborative Work!

Without a proximity premium and a mining variance flaw, there will be no pooling pressure, no threat to centralization. That is how selfish mining concerns (again, not a flaw) are addressed too: it turns into simple, innocent solo mining.

As for @tromp's and your concerns about share validation overhead, I have already addressed them: there is no resource other than a few cpu cycles to be consumed for it, not a big deal according to my analysis, and by distributing the proximity premium almost evenly the protocol does more than enough to compensate  ;)



Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 11:26:55 AM
The Shared Coinbase Transaction is typically ~32 kB of data (an average of 4500 items) and doesn't need any further verification such as checking the UTXO, the mempool, whatever.

Since the PoW should be considered an essential part of the header, what you are proposing is to increase the header size from 80 bytes up to 72 KB (worst case 10000 items), a nearly 1000-fold increase...

This is more significant when considered in conjunction with the 0.02 * threshold on finishing a block. That threshold means it’s more likely that two blocks will be finished closer together than for 10 minute block periods and thus the increased propagation and verification (for the up to 10,000 block solutions) can be significant relative to the spacing between duplicate finished blocks. As I wrote in my prior post, all of this contributes to amplifying the selfish mining attack.
Well, @tromp is not on point, and neither are you, @anunymint:

The Shared Coinbase Transaction is not part of the header; its hash (id) is.
The transaction itself is part of the block, like the conventional coinbase transaction and the other transactions. The block size remains whatever the protocol dictates, plus the size of this transaction, which implies an almost 5% increase (worst case); not a big deal.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 12, 2018, 05:45:51 PM
The Shared Coinbase Transaction is not part of the header; its hash (id) is.

All the small proof-of-work solutions have to be communicated and calculated before the winning block can be communicated. So that is up to 10,000 (if the difficulty target is 0.0001) multiplied by the 64B size of a SHA256 hash, which is 640KB of data that must be communicated across the network. That’s not factoring in whether the network is subdivided and miners are mining on two or more leader Prepared blocks, in which case the network load can be double or more of that.
You are mixing up heterogeneous things, imo:
As I have said before, the Shared Coinbase Transaction is just a transaction, with a size from as small as 60 bytes (likely, implementation dependent) up to a maximum of 60,000 bytes, with a normal distribution of probabilities and an average of 30,000 bytes. This is it. There is just one (double) SHA256 hash of it committed to the block header.
This special transaction is verified by checking the asserted score and reward of each row (from 1 to 10,000 rows) by computing the hash of the row appended to the previous block hash. There is no need to attach this hash to each row, neither in storage nor in communication.

As for the need for peers to fetch this special transaction to be able to verify the finalized block, it is very common:
Since BIP 152, peers check whether each transaction committed to the Merkle root of the block under validation is present in their version of the mempool; if not, they fetch the transaction from the peer and validate it.

For ordinary transactions, as I have stated before, the validation process is by no means trivial: it involves ECDSA signature verification and a UTXO consistency check for each input of each transaction, which are both orders of magnitude more expensive than what should be done for the (output) rows of the special transaction under consideration, the Shared Coinbase Transaction.

For each row of this transaction, only a few processor cycles are needed to compute the hash, and it is not even the case for all of the rows, just for the rows missing from the node's memory.

Conclusion: I maintain my previous assertion of zero computational overhead and an average 32 KB block size increase.
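
A sketch of that verification loop, under the same assumptions as the earlier snippets (the row layout and helper names are illustrative, not a wire format):

Code:
import hashlib

def verify_shared_coinbase(rows, prev_block_hash, block_target,
                           share_easing=10_000, required_total=0.95):
    """rows: serialized ~60-byte Coinbase Share rows from a Finalized Block.

    Pure CPU work: re-hash each row against the previous block hash, reject
    rows below the minimum share difficulty, and require the recomputed
    scores to satisfy the target cumulatively. No mempool, UTXO, or
    signature checks are involved.
    """
    total = 0.0
    for row in rows:
        digest = hashlib.sha256(
            hashlib.sha256(row + prev_block_hash).digest()).digest()
        value = int.from_bytes(digest, "big")
        if value >= block_target * share_easing:  # below the 0.0001 minimum
            return False
        total += min(1.0, block_target / value)   # difficulty score, capped at 1
    return total >= required_total
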
Quote
Now I do understand that these proof-of-work share solutions are communicated continuously and not all at once at the Finalized block, but you’ve got at least three possible issues:

1. As I told you from the beginning of this time wasting discussion, the small miners have to verify all the small proof-of-work solutions; otherwise they’re trusting the security to the large miner which prepares the Finalized block. If they trust, then you do have a problem with non-uniform hashrate, which changes the security model of Bitcoin. And if they don’t trust and attempt the validation, then they’re incurring more costs than they would in pools.

Easy, dude, it is not time wasting, and if it is, why in the hell should we keep doing this? Nobody reads our posts, people are busy with more important issues, nobody is going to be the president of bitcoin or anything.

I'm somewhat shocked reading this post, tho.
We have discussed it exhaustively before. It is crystal clear, imo.

First of all (I have to repeat): mining has nothing to do with verifying shares, blocks, whatever... Miners just perform zillions of nonce incrementations and hash computations to find a good hash. It is a full node's job to verify whatever needs verifying. Agreed?

Now, full nodes, busy with I/O operations and stuff that needs extensive networking and disk access, have a lot of cpu power free, and a modern OS can utilize it to perform hundreds of thousands of SHA256 hashes without hesitation and without any bad performance consequence, just like nothing ever happened.

Is it that hard to keep this in mind and forget about what has been said in another context (the infamous block size debate)? Please concentrate.

In that debate, the Core team was against a block size increase because they were worried about transaction verification being an I/O bound task; with your share verification nightmare we are dealing with a cpu bound task. It is not the same issue, don't worry about it.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: MISERICORDAE PROJECT on June 12, 2018, 06:35:39 PM
Quote from: aliashraf (the proposal's introduction and Motivation, quoted verbatim from the opening post; the quote continues below)



Imo, now, after almost a decade has passed, Moore's law has done enough to make utilizing more bandwidth and storage resources feasible, and it seems kinda hypocritical to me to make arguments about 'poor miners', pretending to be concerned about centralization threats, and to make excuses for rejecting this very specific proposal which, although it increases the demand for such resources, can radically disrupt the current situation with pools and centralized mining.

This proposal is mainly designed for bitcoin. For the sake of convenience, and to let readers have a more specific perception of the idea, I have deliberately used constants instead of adjustable parameters.

Outlines
  • An immediate but not practically feasible approach would be reducing the block time (along with a proportional reduction in block reward). Although this approach, as mentioned, cannot be applied fully because of the network propagation problems involved, an excellent consequence would be its immediate impact on the scalability problem; we will use it partially (reducing the block time to 1 minute compared to the current 10 minute period).
  • As mentioned earlier (and with all due respect to the Core team), I don't take objections about the storage and network requirements implied by reducing the block time as serious criticism. We should not leave mining in the hands of 5 mining pools to support a hypothetical poor miner/full node owner who cannot afford installing a 1 terabyte HD in the next 2 years!
  • Also note, the block time reduction is not a necessary part of PoCW, the proposed algorithm; I'm just including it as one of my old ideas (adopted from another forum member who suggested it as an alternative in the infamous block size debate, and developed a bit further by me) which I think deserves more investigation and discussion.
  • PoCW uses a series of mining-relevant data structures to be preserved on the blockchain or transmitted as network messages (an illustrative sketch of these structures follows this outline)
    • Net Merkle Tree: It is an ordinary Merkle hash tree of transactions, with the exception that its coinbase transaction shows no block reward (newly issued coins); instead, the miner charges all transaction fees to his account (supports SegWit)
    • Collaboration Share: it is a completely new data structure composed of the following fields:
      • 1- The root of a Net Merkle Tree
      • 2- The collaborating miner's wallet address
      • 3- A nonce
      • 4- A calculated difficulty, using the previous block hash padded with all the previous fields; it is always required to be at least 0.0001 of the current block difficulty
    • Coinbase Share: it is new too and is composed of
      • 1- A Collaborating miner's wallet address
      • 2- A nonce
      • 3- A computed difficulty score using the hash of
        • previous block's hash padded with
        • current block's merkle root, padded with
        • Collaborating miner's address padded with the nonce field
      • 4-  A reward amount field
    • Shared Coinbase Transaction: It is a list of Coinbase Shares  
      • The first share's difficulty score field is fixed at 2%
      • Every share's difficulty score is at least as good as 0.0001
      • The sum of the reward amount fields equals the block reward, and each share's reward is calculated proportional to its difficulty score
    • Prepared Block: It is an ordinary bitcoin block with some exceptions
      • 1- Its merkle root points to a  Net Merkle Tree
      • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.05
    • Finalization Block: It is an ordinary bitcoin block with some exceptions
      • 1- Its merkle root points to a  Net Merkle Tree
      • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
      • 3- It has a new field which is a pointer to (the hash of) a non empty Shared Coinbase Transaction
      • 4- The Shared CoinBase Transaction's sum of difficulty scores is greater than or equal to 0.95
  • Mining process goes through 3 phases for each block:
    • Preparation Phase: It typically takes just a few seconds for the miners to produce one or (rarely) 2 or 3 Prepared Blocks. Note that the transaction fees are already transferred to the miner's wallet through the coinbase transaction committed to the Net Merkle Tree's root of each block.
    • Contribution Phase: Miners start picking one valid Prepared Block's Merkle root, according to their speculation (which becomes more accurate as new shares are submitted to the network) about whether it will eventually gather enough shares, and producing/relaying valid Collaboration Shares for it.
      As the sum of the difficulty scores for a given Prepared Block's Merkle root grows, we expect an exponential convergence toward the most popular Merkle root being the one included in new Collaboration Shares.
    • Finalization Phase: After the total score approaches the 0.93 limit, rational miners begin to produce a Finalized Block
  • Verification process involves:
    • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy network difficulty target cumulatively
    • Checking reward distribution in the shared coinbase transaction
    • Checking the Merkle tree to be a Net Merkle Tree
  • UTXO calculation is extended to include Shared Coinbase Transactions committed to finalized blocks on the blockchain as well
  • Attacks/forks brief analysis:
    • Short range attacks/unintentional forks that try to change the Merkle root are as hard as they are in traditional PoW networks
    • Short range attacks/unintentional forks that preserve the Merkle root but try to change the Shared Coinbase Transaction have zero side effects on the users (as opposed to the miners), and as for redistributing the shares in favor of the forking miner, they are poorly incentivized, as the gains never go further than something like a 2%-10% redistribution
    • Long range attacks with a total-rewrite agenda will fail just like in traditional PoW
    • Long range attacks with a partial coinbase rewrite are again poorly incentivized, and the costs won't be justified
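
As referenced in the outline above, here is one way the proposed structures might be written down; the field types and sizes are illustrative assumptions, not a wire format:

Code:
from dataclasses import dataclass
from typing import List

@dataclass
class CollaborationShare:       # relayed during the Contribution Phase
    net_merkle_root: bytes      # root of a Net Merkle Tree (reward-free coinbase)
    miner_wallet: bytes
    nonce: int                  # share difficulty is derived by hashing these
                                # fields padded to the previous block hash

@dataclass
class CoinbaseShare:            # one row of the Shared Coinbase Transaction
    miner_wallet: bytes
    nonce: int
    difficulty_score: float     # >= 0.0001; the first row is fixed at 0.02
    reward: int                 # proportional to difficulty_score

@dataclass
class SharedCoinbaseTransaction:
    shares: List[CoinbaseShare] # scores must cumulatively satisfy the target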

Implementation

This is a radical improvement to classical PoW, I admit, but the costs involved are fair for the huge impacts and benefits. I have reviewed Bitcoin Core's code and found it totally feasible and practical from a purely programming perspective. Wallets could easily be upgraded to support the new algorithm as well, but a series of more complicated, mostly political issues are extremely discouraging; still, it is just too soon to give up and go for a fresh start with a new coin, or to settle for an immature fork with little support, imo.

Before any further decisions, it would be of high value to have enough feedback from the community. Meanwhile, I'll be busy coding the canonical parts as a BIP for the bitcoin blockchain. I think it will take like 2-3 weeks or even a bit more, because I'm not part of the team and have to absorb a lot before producing anything useful; plus, I'm not full time, yet ;)

I have examined the proposed algorithm's feasibility as much as I could, yet I can imagine there might be some overlooked flaws, and readers are welcome to improve it. Philosophical comments questioning the whole idea of eliminating pools don't look constructive, tho. Thank you.


Major Edits and Protocol Improvements:
  • June 10, 2018 09:30 pm Inspired by a discussion with @ir.hn

    • A Prepared Block should be saved by the full nodes for a period of time long enough to mitigate any cheating attempt to skip the Preparation Phase and use non-prepared, trivially generated Net Merkle roots.
      • Full nodes MAY respond to a query by peers asking for a block's respective Prepared Block if they have decided to save the required data long enough
      • For the latest 1000 blocks, preserving such data is mandatory.
      • For blocks with an accumulated difficulty harder than or equal to the respective network difficulty, it is unnecessary to fulfil the above requirement.*
      • The Prepared Block and Preparation Phase terms replaced the original Initiation Block and Initiation Phase terms, respectively, to avoid ambiguity
      Notes:
      * This is added to let miners with large enough hash power choose not to participate in the collaborative work.
  • reserved for future upgrades

This is a good technical proposal. Kudos! All the issues raised by commentators can be taken into account and addressed, if not already resolved in the analytical model. More grease to your elbows!



The Shared Coinbase Transaction is not part of the header; its hash (id) is.

All the small proof-of-work solutions have to be communicated and calculated before the winning block can be communicated. So that is up to 10,000 (if the difficulty target is 0.0001) multiplied by the 64B size of a SHA256 hash, which is up to 625KB of data that must be communicated across the network for each 10 minute period. That’s not factoring in whether the network is subdivided and miners are mining on two or more leader Prepared blocks, in which case the network load can be double or more of that.

Now I do understand that these proof-of-work share solutions are communicated continuously and not all at once at the Finalized block, but you’ve got at least four potential issues:

1. As I told you from the beginning of this time wasting discussion, the small miners have to validate all the small proof-of-work solutions; otherwise they’re trusting the security to the large miner which prepares the Finalized block. If they trust, then you have a problem with non-uniform hashrate, which changes the security model of Bitcoin. And if they don’t trust and attempt the validation, then they’re incurring more costs than they would in pools and are further marginalized.

2. All of these solutions still have to be validated in terms of the Shared Coinbase Transaction when the Finalized block is. Although the previously validated small proof-of-work solutions themselves do not have to be revalidated, the hash of all the small proof-of-work solutions has to be checked, and the miner has to verify he already validated the solution for each one. This is also some overhead which can delay propagation, because it adds up. Each node has to add this validation step before propagating to the next node in the P2P network. You ostensibly do not seem to fully appreciate how small verification steps add up during propagation to form significant delays w.r.t. lowering the effective block period to 10 and 30 seconds (as it appears to me your design does) for the Final and Prepared block stages. Nodes do not propagate invalid block solutions (or any invalid data) because they would make the P2P network vulnerable to a DoS amplification attack.

3. Because the network can be subdivided over two or more leader blocks, the nodes no longer have an incentive to validate and propagate the solutions for the block they are not contributing small proof-of-work solutions to. Presumably they have a slightly better ROI if they only contribute to the Prepared block they received first, and not to every Prepared block they received.

4. This is for a 0.0001 difficulty target for the small proof-of-work solutions. As I already stated up-thread, this will get worse over time, as the target has to be decreased as the network hashrate grows faster than the hashrate and capital of the small miner.

As for the classical selfish mining attack itself, I personally disagree with calling it an attack at all. I rather see it as a fallacy, a straw man fallacy (https://en.wikipedia.org/wiki/Straw_man).
My reasoning:
PoW has nothing to do with announcement. Once a miner prefers to keep his block secret, it is his choice and his right as well; he is risking his block becoming an orphan in exchange for a possible advantage over the rest of the network in mining the next block.

When you apparently do not understand the math and the research paper on selfish mining (or are you just being disingenuous?), and you start arguing philosophically and handwaving, then this time wasting discussion is terminated.

Selfish mining is always profitable for a 33+% attacker. It probably isn’t employed as an attack on Bitcoin because it increases the orphan rate and would tarnish the image of Bitcoin. So presumably the powers-that-be are not using it; they do not need to, as they already have ASICBOOST and control over the 12/14/16nm ASIC fabs. So it’s not in their interest to deploy the attack on Bitcoin. But that doesn’t mean it is not already being deployed on proof-of-work altcoins.

Like PoW, this proposal is not about prohibiting people from selfish mining, but there is a point in rephrasing the above reasoning somewhat differently: this proposal is about reducing the pooling pressure and helping the network become more decentralized by increasing the number of miners. How? By reducing the variance of mining rewards, which is one of the 2 important factors behind this pressure (I will come back to the second factor soon).

My point, which you seem to be trying your best to obfuscate, is that, AFAICT, your design makes selfish mining much worse. I posit that it lowers the 33+% that the miner needs to attack the network with selfish mining, thus further lowering the security. And it will be harder to detect, because your design AFAICT drastically increases the orphan rate.

I am even wondering whether your design will reliably converge on a longest chain. And especially as the 10,000 factor is pushed to 1 million as the market capitalization of Bitcoin grows, then surely your design will fall flat on its face.

AFAICS, you’re fighting a losing battle against the invariants of physics. The game theory flaws multiply as you attempt to put decentralization into a paradigm that is inherently centralizing.

All time expended trying to decentralize proof-of-work is time wasted thrown down a rathole. Proof-of-work is a centralization paradigm. There will be no escape.

In a traditional winner-takes-all PoW network like bitcoin, there is just one piece of information (the fresh block) that causes the problem, true, but the weight of this information and the resulting premium is very high, and it is focused in one spot: the lucky miner in the focal point and its neighbors in the hot zone.

In this proposal, the premium is distributed tens of thousands of times more evenly.

Oops! There is almost no proximity premium flaw in Proof of Collaborative Work!

As I have posited with my incomplete list of concerns above, there will likely be game theory flaws lurking that you do not expect. I don’t want to expend the effort to do more than handwave about those I posited. Apologies, but I’m very pessimistic about what more can be accomplished with proof-of-work.

There’s no way around the invariants of physics and economics that make proof-of-work inherently centralizing.

And I don’t want to expend my time on this. Sorry. I already expended many months contemplating the variants of designs and realized there’s no escape from the invariants. You can continue to invent new obfuscations for yourself over and over again until you finally come to realize the same. Good luck.


@anunymint   Your technical evaluations and criticisms are highly valued. Two heads are better than one. Of course there are issues of centralization with PoW, already being exploited, but there shouldn't be loss of hope in addressing them technically, from scratch or even with an add-on. There is always a way around, an escape, and that is what has been driving new physics and technological innovations. It would be encouraging to let @aliashraf's code be completed and reach a testing stage where unaddressed flaws in his analytical models can be discovered and solved. Don't lose hope in continuing your technical analysis of the proposal, even when you feel there is obfuscation.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: MISERICORDAE PROJECT on June 12, 2018, 06:45:43 PM
There is always a way around, an escape, and that has been driving new Physics and technological innovations.

Sorry no. You are handwaving.

I do not buy into false hopes. There are invariants here which cannot be overcome.

Is it possible for you to supply the mathematical details of these insurmountable invariants so we can look into it from our end?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: MISERICORDAE PROJECT on June 12, 2018, 07:33:49 PM
There is always a way around, an escape, and that has been driving new Physics and technological innovations.

Sorry no. You are handwaving.

I do not buy into false hopes. There are invariants here which cannot be overcome.

Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end?  

...Much better for me if the competition wastes time on an insoluble direction, while I am working on a promising one...


Not really aware of a competition. Is the promising one you are working on a solution to the issues of Bitcoin with an entirely new algorithm?


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: manfredmann on June 13, 2018, 10:27:52 AM
It sounds good to get rid of pools for mining crypto, in order to promote individual or small-scale mining where mining earnings would reach an optimal profit. Today, shared mining profits favor pool operators, while the actual miners only get a percentage. We should promote and create a mining opportunity that lets miners take a good profit from doing it.


Title: Re: Getting rid of pools: Proof of Collaborative Work
Post by: aliashraf on June 13, 2018, 05:47:23 PM
    There is always a way around, an escape, and that has been driving new Physics and technological innovations.

    Sorry no. You are handwaving.

    I do not buy into false hopes. There are invariants here which cannot be overcome.

    Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end?  

    ...Much better for me if the competition wastes time on an insoluble direction, while I am working on a promising one...


    Not really aware of a competition. Is the promising one you are working on a solution to the issues of Bitcoin with an entirely new algorithm?
    I'm sorry to say this, but I think we have been trolled by @anunymint.  :(

    PoW is one of the most important innovations in modern history (kudos Satoshi  :)). It is a very irresponsible decision to abandon it because of some flaws and limitations, by claiming every flaw to be an inherent, essential one and jumping back to a pre-crypto, failed, subjective alternative (like reputation-based systems), often rebranded using the same terminology as Satoshi Nakamoto and bitcoin!

    I'm not against change; on the contrary, I strongly support any new idea, whenever, by whoever. But I personally feel good about a change when it is suggested (at least mainly) to help people do something better, not as an instrument in the hands of an opportunist who has found, or been informed about, a weakness in a newly born technology and, instead of trying or helping to fix it, initiates a hypocritical campaign just to sell us his crappy name or to convince his dad that he is a genius, ... whatever.

    I'm not that kind of person. It's tempting to take advantage of the weaknesses and flaws of a system, but I don't like such a miserable life. This proposal is a fix, not a hypocritical alternative to PoW.

    It is a fix for a series of important challenges of bitcoin and PoW networks; it deserves decent reasoning and discussion instead of trolling and vandalism.

    To understand how unacceptable that kind of behavior is, it is better to understand the importance and beauty of the subject, imo. Let's take a look:

    1- It fixes pooling pressure, the most serious centralization threat to bitcoin, by:
    • eliminating the (solo) mining variance flaw by dividing mining into 3 phases; in the most important one, the Collaboration phase (the second), where 98% of the block reward is distributed, miners can contribute partial proofs of work directly, tens of thousands of times more easily.
    • eliminating the proximity premium flaw by distributing the 'new block found' information across tens of thousands of points in the network and incentivizing its simultaneous announcement.

    2- Although this proposal is ready for an alpha version implementation and the consequent deployment phases, it is too young to be thoroughly understood in terms of its other impacts and applications, the ones it is not primarily designed for. As some premature intuitions, I can list:
    • It seems to be a great infrastructure for sharding, the most important on-chain scalability solution.
      The current situation with pools makes sharding almost impossible. When +50% of mining power is centralized in the palms of a few pools (5 for bitcoin and 3 for Ethereum), the problem is not just security and vulnerability to cartel attacks; unlike what is usually assumed, it is more importantly a prohibiting factor for implementing sharding (and many other crucial and urgent improvements).
      If my intuition proves correct, it would have a disruptive impact on the current trend that prioritizes off-chain over on-chain scalability solutions.
    • This protocol can probably offer a better chance for signaling and autonomous governance solutions
    • {TODO: suggest more}

    A thorough analysis of the details suggested in the design would convince a non-biased reader that this proposal is thought through enough, and is not so immature as to encourage anybody to attempt a slam dunk and reject it trivially. On the contrary, considering the above features and promises, and the importance of pooling pressure as one of the critical flaws of bitcoin, it deserves a fair, extensive discussion.

    Now, when someone comes and ruins such a decent topic, like what @anunymint did here, by repeating nonsense objections and never being convinced no matter what, it could be due to his naivety, or to an obsessive bias stemming from his history in the public sphere, which is full of proof-of-everything-other-than-Work obsessions and vague claims about PoW being a boring, old-fashioned, weak system, doomed to be centralized, vulnerable to every possible attack vector, blah, blah, blah ... a narration he has trapped himself in, or both.

    I vote for the second option about this guy, but if he is really smart, he should put the load (of his own history) off his shoulders and be ready to revise and improve.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: Carlton Banks on June 13, 2018, 07:17:51 PM
    Proof of everything other than Work

    Annoymint doesn't like the implications of proof of work; he's been claiming for 5-6 years that he's working on a "blockchain breakthrough", but never proves he's working on anything :)


    @Annoymint, you need to start a new Bitcointalk user called "Proof of everything other than work"


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 13, 2018, 07:25:15 PM
    Proof of everything other than Work

    Annoymint doesn't like the implications of proof of work; he's been claiming for 5-6 years that he's working on a "blockchain breakthrough", but never proves he's working on anything :)


    @Annoymint, you need to start a new Bitcointalk user called "Proof of everything other than work"

    I see; being trapped by one's own narration is a very common threat for all of us. I guess we have to do some kind of meditation or Zen to avoid or heal it.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 14, 2018, 12:56:00 AM
    Why is the thread being derailed by some comments about me? What does anything I did or did not do have to do with the discussion of the technological facts of PoCW?
    Really nothing, besides the need to stop you from trolling.

    Quote
    he's been claiming for 5-6 years that he's working on a "blockchain breakthrough"

    I challenge you to quote from the past where I extensively made such a claim 5 or 6 years ago.

    EDIT: {and a long story about what you have been about in the last 5-6 years}
    In any case, I welcome your ridicule. It motivates me. Please do not stop. And please do report me to @theymos so this account can be banned so I stop wasting time posting on BCT.

    Like this. Please ... just put an end to this if you may. You did something inappropriate, and some objections were made about it. Let it go.

    Quote
    My technological comments stand on their own merits regardless of whatever is done to cut my personal reputation down.


    Absolutely not. You questioned the overhead of the validation process for miners in my proposal and I answered it solidly: there is no overhead, because there is no I/O involved, because the submitted contribution shares have exactly the same Merkle root, which has already been evaluated (once, when the Prepared Block was evaluated by the miner who then decided to contribute to it).

    Only a troll continues repeating this question over and over, in an aggressive way full of insults and hype.

    A decent contributor with good faith may show up doubtful about predicates like 'there is no I/O', 'the Merkle tree does not have to be validated', 'the shares share a common Merkle tree', ... but with less confidence in his/her own position, understanding that there is a huge possibility for those doubts to be removed trivially by the designer of the protocol posting a few references to the original proposal. Actually, that is exactly the case here, because all three predicates under consideration are absolutely true by design.

    When the doubts turn out to be unnecessary, the discussion can go a step forward. It is not a war; there is nothing to conquer other than the truth.





    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 14, 2018, 09:56:54 AM
    Why is the thread being derailed by some comments about me? What does anything I did or did not do have to do with the discussion of the technological facts of PoCW?

    Really nothing, besides the need to stop you from trolling.

    Please define trolling and then show that I did it. Specific links or quotes please.
    No need to go so far; this post of yours is 90%+ nothing other than trolling.
    Quote
    he's been claiming for 5-6 years that he's working on a "blockchain breakthrough"

    I challenge you to quote from the past where I extensively made such a claim 5 or 6 years ago.

    EDIT: {and a long story about what you have been about in the last 5-6 years}
    In any case, I welcome your ridicule. It motivates me. Please do not stop. And please do report me to @theymos so this account can be banned so I stop wasting time posting on BCT.


    Like this. Please ... just put an end to this if you may. You did something inappropriate, and some objections were made about it. Let it go.

    What did I do that was inappropriate “5-6 years ago” that was related to “claiming […] he's working on a ‘blockchain breakthrough’”?  Specific links or quotes please.

    If you can’t specifically show that the SPECIFIC “5-6 years ago” allegation is true, then you are the one who is trolling by stating the lie, “You did something inappropriate”.

    I politely asked you to end this, but you love twisting it more ... it is trolling in the specific context we are in. I was not the one who said things about the last 5-6 years of your history, FYI.
    Quote
    My technological comments stand on their own merits regardless of whatever is done to cut my personal reputation down.


    Absolutely not. You questioned the overhead of the validation process for miners in my proposal and I answered it solidly:

    I said “My technological comments stand on their own merits regardless of whatever is done to cut my personal reputation down”.

    That does not mean I claim “My technological comments” are unarguable. Only that “my personal reputation” has nothing to do with the discussion of the technology.

    It is another, and the most important, form of trolling you commit, repeatedly. Your argument here is void and makes no sense:
    Once an objection is made and proves to be irrelevant or false, and the proposal addresses the asserted issues, it should be dropped. If it is maintained the way you are putting it, every issue remains open forever and can be used as a toy by trolls making false claims whenever they wish.
    Quote
    There is no overhead, because there is no I/O involved, because the submitted contribution shares have exactly the same Merkle root, which has already been evaluated (once, when the Prepared Block was evaluated by the miner who then decided to contribute to it).

    I already refuted that line of logic, in that the ratios over time have to either increase, or the capitalization of the miner within your current 10,000 factor will place them in the crosshairs of having to kowtow to the oligarchy.
    And I punted on the entire concept, because I stated mining is becoming ever more centralized, so it's pointless and futile to try to make a protocol for small miners.

    One more example of trolling: after you have been clearly informed about the negligible cost of validating shares, instead of closing the case and moving on, you just deny everything by bringing forward a very weak argument to keep the issue open no matter what. You can't help it; you need issues to remain open forever so you can use them to ruin the topic.

    In this case, you are saying that future increases in network hash power should be compensated by increasing the number of shares, and that this will eventually be problematic. IOW, you are saying that 2-3 years later the hashrate will probably double, small miners would again experience the variance phenomenon, the devs would improve the protocol and double the number of shares by a hard fork, and this increase would prove that verification of shares is a weakness!

    Firstly, doubling or tripling the number of shares doesn't create a significant problem in terms of share validation costs; it remains a CPU-bound process, and some very low profile nodes might require $200 or so to buy a better processor, in the worst case.

    Secondly, with increases in network hash power, although the relation is nonlinear, we will also see improvements in mining devices and their efficiency.
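
    To put a rough number on this claim, here is a minimal benchmark sketch in Python (the function and the 80-byte header size are illustrative assumptions; one double-SHA256 stands in for the per-share check):

    Code:
    import hashlib
    import time

    def time_share_validation(n_shares: int, header_size: int = 80) -> float:
        # Each share check is dominated by one double-SHA256 over a header-sized blob.
        header = b'\x00' * header_size
        start = time.perf_counter()
        for _ in range(n_shares):
            hashlib.sha256(hashlib.sha256(header).digest()).digest()
        return time.perf_counter() - start

    # 10,000 shares hash in a few milliseconds on a commodity CPU, so even
    # doubling or tripling the share count leaves validation CPU-bound and cheap.
    print(time_share_validation(10_000))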

    Quote
    Only a troll misrepresents to the thread what I wrote in the thread, as explained above.

    Now I am done with you.

    Bye.


    I should have stuck to my first intuition and never opened the thread. Or certainly never have posted after I read that horrendously bad OP description of the algorithm. That was indicative of the quality of the person I am interacting with, unfortunately. I have learned a very important lesson on BCT. Most people suck (https://youtu.be/4XpnKHJAok8?t=1110) (see also (https://youtu.be/4XpnKHJAok8?t=1664) and also (https://www.quora.com/What-are-some-must-read-Linus-Torvalds-rants)!). And they don’t reach their potential (https://youtu.be/jVLMUb1mxn0?t=497). The quality people are very few and far between, when it comes to getting serious work done.


    See? You are offending me, my work, bitcointalk, its members, ... very aggressively, at the end of the same post in which you are asking for evidence of you being a troll! I can imagine you may reply like this:
    "I never said I'm not a troll, I just wanted you to give evidence about it, so I 'maintain' my inquiry for evidence. This issue, me being a troll or not, is open just like all the other issues we have been arguing about."!

    Quote
    In a lighter social setting a wider array of people can be tolerated (especially when we do not need to rely on them in any way).
    Tolerance is good, but trolling is not among the things to be tolerated, imo.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 14, 2018, 07:58:07 PM

    I don’t think my argument is weak. I think my analysis of your design is 100% spot-on correct. And I encourage you to go implement your design and find out how correct I am! Please do!

    You continue to not mention the point I made about incremental validation overhead and accumulated propagation delay and their effect on the orphan rate, especially when you have effectively decreased the block period to 15 seconds for the Finality phase Schelling point and 30 seconds for the Prepared block Schelling point.

    And you continue to not relate that I also pointed out that as transaction fees go to $50,000, with Lightning Networks Mt. Gox hubs dominating settlements in the 1 MB blocks (or pick any size you want which does not spam the network with low transaction fees, because the miners will never agree, and unlimited block sizes drive the orphan rate up and break security), then the active UTXO will shrink because most people can’t afford to transact on-chain. Thus the MRU UTXO will be cached in L3 SRAM. And the block will have huge transactions and not many transactions. Thus your entire thesis about being I/O bound on transaction validation will also be incorrect.

    You can’t seem to pull all my points together holistically. Instead you want to try to cut a few of them down piece-meal out-of-context of all the points together.
    I remain silent about the trolling part; I'm realising you can't help it, and it is just the unintentional behavior of a polemicist when things get too intense.

    Let's take a look at  technical part of your reply:
    1- There is no incremental overhead. I've never mentioned any incremental increase/decrease (enforced by the protocol or by scheduled forks) in the proposed parameters, including the relative difficulty of contribution shares. I have to confess, tho, that I'm investigating this possibility.
    I will keep you informed about the outcome, which will not be a simple linear increase with network hashpower, anyway.

    2- Also, propagation delay won't accumulate even if we were to increase (incrementally or suddenly) the driving factors behind the number of contribution shares, because validation cost is, and remains, negligible for nodes. Remember? The client software is I/O bound, and contribution share validation is CPU bound (I'll come to your new objection about it later).

    3- I am not 100% against your analysis of Lightning, nor in favor of it; I'm not that interested in, or much of a believer in, LN as a scaling solution. But it won't help your position in this debate:

    Your argument:
    You are speculating that transactions will go off-chain in the future, that the main chain will be busy processing huge transactions produced by flush operations in LN nodes, and that at the same time network nodes will manage to keep the UTXO set (its most recently used part) in SDRAM, which will help them avoid accessing the HD frequently, so they will no longer be (relatively) I/O bound; this way the processing overhead of contribution shares begins to look more important and will eventually become a bottleneck. Right?

    Answer:
    • You are speculating TOO much here; my perception of LN and off-chain solutions differs moderately
    • Having the MRU UTXO in an SDRAM cache won't help that much; the task would still stall on RAM access, would still hit the disk on page faults and, most importantly, would still write the UTXO set back to disk after the block has been verified (see the sketch after this list)
    • Also, a relative improvement in a node's performance in validating full blocks is not a disaster; the number of blocks is the same as always

    4- As for your expectation that I not cut your objections down to pieces: that is like asking me to troll against a troll. I prefer to go more specific and resolve issues one by one, while you want to keep the discussion at the ideological level, being optimistic or pessimistic about this or that trend or technology, and so on. I think, in the context of assessing a proposed protocol, my approach is more practical and useful.
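
    To illustrate the cache point in the list above, here is a minimal Python sketch of an MRU UTXO cache in front of a disk-backed store (the backing get/put/delete API is a hypothetical stand-in, not any client's actual storage layer):

    Code:
    from collections import OrderedDict

    class UtxoCache:
        # Illustrative MRU cache over a hypothetical disk-backed UTXO store.
        def __init__(self, backing, capacity=1_000_000):
            self.backing = backing          # hypothetical store with get/put/delete
            self.cache = OrderedDict()      # txo_id -> entry, most recently used last
            self.capacity = capacity

        def get(self, txo_id):
            if txo_id in self.cache:
                self.cache.move_to_end(txo_id)   # cache hit: no disk access
                return self.cache[txo_id]
            entry = self.backing.get(txo_id)     # miss: disk read ("page fault")
            self.cache[txo_id] = entry
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)   # evict least recently used
            return entry

        def commit_block(self, spent_ids, created):
            # Even with a 100% read hit rate, committing a verified block
            # still writes the UTXO changes through to disk.
            for txo_id in spent_ids:
                self.cache.pop(txo_id, None)
                self.backing.delete(txo_id)      # disk write
            for txo_id, entry in created.items():
                self.cache[txo_id] = entry
                self.backing.put(txo_id, entry)  # disk write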

    Quote
    I specifically mentioned order-of-magnitude readjustments in the future. There you go again twisting my words and being disingenuous.

    Firstly, doubling or tripling the number of shares doesn't create a significant problem in terms of share validation costs; it remains a CPU-bound process, and some very low profile nodes might require $200 or so to buy a better processor, in the worst case.

    You’re ignoring that I argued that your thesis on transaction-validation-bounded delay will also change, and the validation of the shares will become incrementally more significant relative to it. And take that together with the 15-second effective Schelling points around which orphaning can form. You’re not grokking my holistic analyses.

    There will be no order-of-magnitude (= tens or hundreds of times) increase in the network hash power in the foreseeable future, and I did you a favor by not simply rejecting this assumption; instead I tried to address more probable scenarios, like a 2 or 3 times increase in the next 2-3 years or so.
    Although it is good to see the big picture and take cumulative effects into consideration, it won't help if you don't have a good understanding of each factor and its importance.

    You are saying something like:
               look! there are so many factors to be considered, isn't this terrifying?
    No! It is not terrifying as long as we are able to isolate each factor and understand it deeply, instead of being terrified by it, or terrifying people with it.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 10:43:15 AM
    @anunymint
    I appreciate the fact that you have spent considerable time on this subject; it is good evidence for me to become even more convinced that:
    1- You have good faith, and as for the trolling part of your writings, you just can't help it, and I should be an order of magnitude more tolerant with you  :)
    2- You are smart and have been around for a long time, a good choice for chewing on a complicated proposal like PoCW. Again, more tolerant, as tolerant as possible, and more ... I should repeat that and keep it in mind  ;)

    I was nearly certain I had already mentioned it up-thread, but couldn’t quickly find it to quote it. So let me recapitulate that your PoCW design proposes to put 10,000 times (and I claim eventually 100,000 and 1 million times) more proof-of-work hashes in the blockchain that have to be validated.
    Nope. It is about 10,000 and will remain in that neighborhood for a long, long time; reaching 100,000 would take a century or so! I have already described it: I have no plan, and there won't be such a plan, to increase this number linearly with the network hashrate.
    This proposal, with current parameters, is about making solo mining 100,000 times more convenient right now; it is a good improvement regardless of what happens in the next few years and how we deal with it.
    Quote
    This is going to make objectively syncing a full node (especially for a new user who just downloaded the entire blockchain since the genesis block) incredibly slow unless they have very high capitalization ASICs. And the asymmetry between commodity single CPUs and mining farms of ASICs will widen, so perhaps eventually it becomes impractical for a CPU to even sync from genesis. So while you claim to improve some facets, your design makes other facets worse. {*}
    Unlike what you suggest, ASICs won't be helpful in syncing the blockchain. Full nodes are not ASICs and never utilise ASICs to validate a hash; they just compute the hash with their own CPU!
    SHA256 and other hash functions, are NP-Complete problems: their solutions consume negligible time and resource to be verified, it is basic in "computer science"  ;)

    Quote
    {*} Including that you continue to deflect the correct criticism that your design very likely makes selfish mining worse, thus reducing an attack on economic security from 33% of the hashrate to an even lower threshold. The onus is on you to write a detailed whitepaper with formal security proofs, not on me to write proofs that your design has such failure modes.

    One polite request, if I may: please remain focused as much as possible. I quoted this from where I cut the previous one and inserted {*}. It is really hard to be productive this way, jumping from one objection to another with little or zero relation between them.
    Actually, I have answered this before.

    Retrying:
    • This proposal discourages selfish mining and generally any consequence of the proximity premium flaw, including but not limited to mining pressure, which is its design goal.
    • To understand how PoCW makes this huge improvement, one should note that the proximity premium flaw is about nodes having access to important, valuable information sooner than their competitors do, so that they may have a chance to take advantage of this premium, intentionally or not.
      In worst-case scenarios like selfish mining, the privileged node(s) may decide to escalate this situation by deliberately keeping that information private for a longer time, instead of relaying it according to the protocol.
    • One should also be aware of the nature of this information: it is always the fact that a block has been found, plus the details of this discovery.
    • Proof of Collaborative Work, this proposal, addresses this issue for the first time by distributing the critical information across tens of thousands of possible points in each round across the network. It is smart because, instead of eliminating the privilege of being close to the source of information, it distributes that privilege almost evenly among the participants.
    • It is also very important to note how this feature of the proposal relaxes any doubts about propagation delay; it becomes much less of a threat to the security of the network:
      The proposal has another secret weapon worth mentioning here: it incentivizes sharing information through the mechanisms provided for finalizing the information later, suspending its actual value until then.
      This way, not only are the focal points distributed across the network 100,000 times more widely, but the distributed work is not even finalized yet. This makes it far less worthwhile to keep it secret.
    • The above property provides an excellent self-regulatory mechanism: for every bit of hypothetical overhead it generates for the nodes, and every microsecond of propagation delay it causes, there is a considerable decrease in the network's vulnerability to propagation delay.

    I appreciate your concerns, and if you manage to post a reply focused on something like this or any other technical concern regarding the proposal, I'll send merits to you again. I just can't do it right now, as you don't stop being generally and totally negative and aggressive, acting like a warrior who is fighting against ... against what or whom, really?


    As a conclusion and brief summary:
    PoCW practically fixes one of the best-known flaws in traditional PoW, the proximity premium. This has various important consequences besides its direct effect on mining pressure, including but not limited to discouraging selfish mining.

    Quote
    I’m quite confident that many experts already considered your design. I know I did circa 2014 or 2015. And we dismissed it without discussing it in great detail in public, because it was just not a worthwhile design to pursue. However, I do think that if you read every post on bitcointalk.org (of which I can claim I have probably read more than most people reading this), you will find discussions that proposed designs analogous to yours. And they were shot down for the reasons I am providing to you now. If you want to look for fertile design ground, you need to look away from the obvious things that everybody was trying to solve years ago. I raised the pools bugaboo incessantly in 2013/14 (https://bitcointalk.org/index.php?topic=339902.msg3647072#msg3647072). You’re like 5 years too late to the party. That ship sailed already.

    That is not true as a general rule; actually, it is not true even occasionally, imo. Technology doesn't trend driven by just one factor, the smartness of inventors or advocates; it is far more importantly driven by interests and enthusiasm.

    I started this thread by giving a brief historical perspective on how pools were developed, conceptually and practically. It was driven by ignorance and greed, as I concluded there.

    If you are right, and there has been a mysterious proposal similar to mine somewhere in history, I'm sure it was abandoned, but not for being impractical. My analysis suggests that with Satoshi disappearing, people were left in the hands of junior programmers not committed enough to decentralization, on one hand, and greedy pool operators who did anything to take advantage of bitcoin on the other: centralized parasites grown on a decentralized infrastructure, like what Google, Facebook, ... are for TCP/IP and the Internet.

    Now, I managed to design this algorithm not because I'm very smart and can outperform all the advocates and developers; I'm just more concerned about centralization than many of these guys, who have gotten rich enough that many are already retired and, instead of doing their job, are just pretending, while the rest are now investigating how to take advantage of the centralized situation to get even richer. They just don't care.

    There should be a voice for people, fresh people who join after being promised a better world; a voice for average and small miners and hobbyists who want to be part of a fair business, free of corporations and pools. And guess what? They are a force, a driving force, after all that we have been through, after Bitmain.

    This community is becoming more aware, and a driving force is pushing for decentralization; that is the true reason someone like me is so confident about the future. It is no longer 2010; things have happened, and there is a shift that encourages and dictates decentralization.

    I'm not a genius. I'm just a dedicated programmer/software engineer who tries to think outside the box, asks crucial questions, cannot be satisfied by stupid arguments, and doesn't give a sh*t about the history that brought us this misery. A person with enough courage to ask how it could happen and how we should deal with this mess now.

    Hereby I ask for help from brave, dedicated developers and advocates: join this proposal, improve it, implement it, and kick these folks out of the decentralized ecosystems. I strongly believe they would be better off investing in or working for Google ...
    Vitalik is an especially good candidate; he is already invited  ;D

    Quote

    3. I expect your design makes the proximity issue worse. I had already explained to you a hypothesis that the Schelling points in your design will cause network fragmentation. Yet you ignore that point I made. As you continue to ignore every point I make and twist my statements piece-meal.

    Good objection.
    Schelling points (the transition points from Preparation to Contribution and from Finalization to the next round) have 7% value cumulatively (5% for the first and 2% for the second point). That is low enough, and it is not totally at stake either:

    For the first 5% part, the hot zone (the miner and its neighboring peers) is highly incentivized to share it asap, because it is not finalized and is practically worth nothing: it won't be rewarded unless it gets enough support to find its way to finalization. Note that the neighbors are incentivized too; if they want to join the dominating current, they need their own shares to be finalized as soon as possible, and that requires the Prepared Block to be populated even though it is not theirs.

    For the second Schelling point, the finalized-block-found event, with 2% of the block reward at stake, hesitating to relay the information carries a very high risk of being orphaned by competitors (for the lucky miner) and of mining orphaned shares/finalized blocks (for the peers).

    I understand you have a feeling that more complicated scenarios could be feasible, but I don't think so, and until somebody presents such a scenario, we had better not be afraid of it.

    I'm aware that you are obsessed with some kind of network division, because you think selfish mining is a serious vulnerability and/or propagation delay is overwhelming.
    The network won't be divided, neither intentionally nor as a result of propagation delay; and if you are not satisfied with my assessment of propagation delay, you should recall my secret weapon: incentivizing nodes to share their findings as fast as possible, to the extent that they will give the job high priority. They will dedicate more resources (both hardware and software) to it.

    Quote
    2- Although this proposal is ready for an alpha version implementation and the consequent deployment phases, it is too young to be thoroughly understood...

    Correct! Now if you would just internalize that thought, and understand that your point also applies to your reckless (presumptuous) overconfidence and enthusiasm.

    Here you have 'trimmed' my sentence to do exactly what you repeatedly accuse me of. I'm not talking about other people being not smart enough to understand me and/or my proposal.
    I'm talking about the limitations of pure imagination and discussion when it comes to the consequences of a proposal, any proposal, once it is implemented and adopted.
    Why would you tear my sentence apart? The very sentence you then continued quoting. Isn't that an act of ... let's get over such things, whatever.
    Quote
    ...in terms of its other impacts and applications, the ones it is not primarily designed for. As some premature intuitions, I can list:

    • It seems to be a great infrastructure for sharding, the most important on-chain scalability solution.
      The current situation with pools makes sharding almost impossible. When +50% of mining power is centralized in the palms of a few pools (5 for bitcoin and 3 for Ethereum), the problem is not just security and vulnerability to cartel attacks; unlike what is usually assumed, it is more importantly a prohibiting factor for implementing sharding (and many other crucial and urgent improvements).
      If my intuition proves correct, it would have a disruptive impact on the current trend that prioritizes off-chain over on-chain scalability solutions.
    • This protocol can probably offer a better chance for signaling and autonomous governance solutions

    In the context of the discussion of OmniLedger, I already explained that it can’t provide unbounded membership for sharding, because one invariant of proof-of-work is that membership in mining is bounded by invariants of physics. When you dig more into the formalization of your design and testing, you’re going to realize this invariant is inviolable. But for now you think you can violate the laws of physics and convert the Internet into a mesh network. Let me link you to something I wrote recently which explains why mesh networking will never work:

    https://www.corbettreport.com/interview-1356-ray-vahey-presents-bitchute/#comment-50338
    https://web.archive.org/web/20130401040049/http://forum.bittorrent.org/viewtopic.php?id=28
    https://www.corbettreport.com/interview-1356-ray-vahey-presents-bitchute/#comment-50556
    I'll check your writings about sharding later, thanks for sharing. But as I have mentioned here, these are my initial intuitions, provided to show the importance and beauty of the proposal and the opportunities involved. I just want to point out how pointless it would be to simply fight it, instead of helping to improve and implement it.
    Quote
    A thorough analysis of the details suggested in the design would convince a non-biased reader that this proposal is thought through enough, and is not so immature as to encourage anybody to attempt a slam dunk and reject it trivially. On the contrary, considering the above features and promises, and the importance of pooling pressure as one of the critical flaws of bitcoin, it deserves a fair, extensive discussion.

    https://www.google.com/search?q=site%3Atrilema.com+self-important
    https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Matthew-Laine-1
    https://medium.com/@shelby_78386/if-you-want-the-country-to-be-less-polarized-then-stop-writing-talking-and-thinking-about-b3dcd33c11f1


    Now you are just fighting (for what?) ...
    You are accusing me of having this or that personality, of being over-confident, ... whatever. Instead, I suggest you provide more illuminating points and objections and make me reconsider parts of the proposal, rather than repeating just one or two objections while playing your Game of Thrones scenes.

    Well, it was a hell of a post to reply to. I'll come back to it later.
    Cheers


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 15, 2018, 11:27:45 AM
    • Verification process involves:
      • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy the network difficulty target cumulatively
    This is a serious problem with your proposal. The proof of work is not self-contained within the header.
    It requires the verifier to obtain up to 10,000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
    Shared Coinbase transaction typically is 32 kB of data (an average of 4500 items)

    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    BUT, the expected highest weight among these shares is close to 0.5 !
    (if you throw 5000 darts at a unit interval, you expect the smallest hit near 1/5000)

    So rather than summarize a sum weight of 1 with an expected 5000 shares,
    it appears way more efficient to just summarize a sum weight of roughly 0.5 with the SINGLE best share.
    But now you're essentially back to the standard way of doing things. In the time it takes bitcoin to find a single share of weight >=1, the total accumulated weight of all shares is around 2.

    All the overhead of share communication and accumulation is essentially wasted.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 12:01:51 PM
    • Verification process involves:
      • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy the network difficulty target cumulatively
    This is a serious problem with your proposal. The proof of work is not self-contained within the header.
    It requires the verifier to obtain up to 10,000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
    Shared Coinbase transaction typically is 32 kB of data (an average of 4500 items)

    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.
    To be exact, it is 0.93 that must be exceeded.
    Quote

    BUT, the expected highest weight among these shares is close to 0.5 !
    (if you throw 5000 darts at a unit interval, you expect the smallest hit near 1/5000)

    Yes. To be more exact, since the shares are randomly distributed in the range from 0.0001 up to 0.93, the median would be 0.465.
    Yet that is not the highest difficulty, just the median.
    Quote

    So rather than summarize a sum weight of 1 with an expected 5000 shares,
    it appears way more efficient to just summarize a sum weight of roughly 0.5 with the SINGLE best share.
    But now you're essentially back to the standard way of doing things. In the time it takes bitcoin to find a single share of weight >=1, the total accumulated weight of all shares is around 2.

    All the overhead of share communication and accumulation is essentially wasted.

    As you mentioned, that is closer to what traditional bitcoin does, and it is what I'm trying to fix. It is not collaborative and, as has been shown both theoretically and experimentally, it is vulnerable to centralization. The same old winner-takes-all philosophy leaves no space for collaboration.

    As for the 'overhead' issues, these have been discussed before. Shares are not like conventional blocks; they take negligible CPU time to validate and negligible network bandwidth to propagate.

    EDIT:
    I have to take back my calculations above: some blocks may have as few as 2 shares and some as many as 9,301 shares to satisfy the difficulty, which yields an average of around 4,650 shares per block over a large number of rounds. The highest share weight is 0.93 and the lowest is 0.0001; I have not calculated, or tried to calculate, any further statistics so far.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 12:11:39 PM
    @anonymint

    keep cool and remain focused, ...

    Unfortunately, your last post was of no quality in terms of putting enough meals on the table; instead you are continuing your holy war (against what?) with inappropriate language, as usual.

    Please take a break, think a while, and either leave this discussion (as you repeatedly promise) or improve your attitude.

    will be back  ;)


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 15, 2018, 12:12:36 PM
    SHA256 and other hash functions, are NP-Complete problems: their solutions consume negligible time and resource to be verified, it is basic in "computer science"  ;)

    Hash functions are not decision problems, so they cannot be NP-complete.
    I could create a decision problem out of a hash function though.
    Something relevant for mining would look like:

    The set of pairs (p, y) where
      p is a bitstring of length between 0 and 256,
      y is a 256-bit number,
      and there exists a 256-bit x with prefix p such that SHA256(x) < y

    Such a problem is in NP.
    But it would still not be NP-complete, since there is no way to reduce other NP problems to this one.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 12:40:20 PM
    SHA256 and other hash functions, are NP-Complete problems: their solutions consume negligible time and resource to be verified, it is basic in "computer science"  ;)

    Hash functions are not decision problems, so they cannot be NP-complete.
    I could create a decision problem out of a hash function though.
    Something relevant for mining would look like:

    The set of pairs (p, y) where
      p is a bitstring of length between 0 and 256,
      y is a 256-bit number,
      and there exists a 256-bit x with prefix p such that SHA256(x) < y

    Such a problem is in NP.
    But it would still not be NP-complete, since there is no way to reduce other NP problems to this one.


    Yes, it was my mistake to call it NP-complete; it is in NP. In the context of this discussion, when we refer to hash functions, the PoW problem (like the one you have suggested, a conditional hash-finding problem) is what we usually mean; still, I should have been more precise.

    This was posted in a chaotic atmosphere, but the point stands: verifying shares (as opposed to the Prepared Block, or its counterpart in traditional PoW, the block) is a trivial job by definition, because it only requires verifying a candidate solution to an NP problem.
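
    To make that asymmetry concrete, a minimal sketch of share verification (the function name, header argument, and scale parameter are illustrative assumptions, not the proposal's normative encoding):

    Code:
    import hashlib

    def verify_share(header: bytes, network_target: int, scale: int = 10_000) -> bool:
        # Verifying a claimed share is one double-SHA256 plus one integer
        # comparison against the relaxed target, no matter how much work
        # was spent finding it.
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        return int.from_bytes(digest, 'big') < network_target * scale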


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 15, 2018, 01:12:26 PM
    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    I calculated wrong.

    n shares expect to accumulate about n*ln(n)*10^-4 in weight, so we expect
    a little under 1400 shares to accumulate unit weight...


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: fr4nkthetank on June 15, 2018, 01:15:40 PM
    Interesting, you put a lot of thought into this proposal.  I would support it and see how it goes.  The goal is really hard to reach.  The idea would be to increase difficulty to scale up operations.  Pool mining can be damaging, but one guy with a huge operation can be worse if no one can pool together.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 01:52:56 PM
    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    I calculated wrong.

    n shares expect to accumulate about n*ln(n)*10^-4 in weight, so we expect
    a little under 1400 shares to accumulate unit weight...
    Interesting; I'd appreciate it if you would share the logic behind the formula. It would be very helpful. To be honest, I have not done much work on it, and my initial assumption of about 4,650 shares is very naive. I was just sure that the average number of shares per block wouldn't be any higher.

    Thank you so much for your contribution. :)


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 02:08:05 PM
    Interesting, you put a lot of thought into this proposal.  I would support it and see how it goes.  The goal is really hard to reach.  The idea would be to increase difficulty to scale up operations.  Pool mining can be damaging, but one guy with a huge operation can be worse if no one can pool together.
    Thanks for the support.

    As for your argument about hardware centralization being more dangerous without pools:

    It is tricky. This proposal is not an anti-pool or pool-resistant protocol; instead, it is a fix for pooling pressure.

    IOW, it does not prevent people from coming together and starting a pool; it just removes the obligation to join pools (and the bigger-pool-is-better implications), which is the current situation for almost any PoW coin.

    EDIT:
    It is also interesting to consider the situation with Bitmain. No doubt this company has access to the biggest mining farms ever, and yet Bitmain runs Antpool and insists on having more and more people point their miners to its pool. Why? Because it is always better to have more power, be safe against variance, and have smooth luck statistics.

    So, I would say after this fix, there would be not only no pressure toward pooling but also no incentive.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 15, 2018, 04:34:29 PM
    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    I calculated wrong. Again. Edited for correctness:

    n shares expect to accumulate about n * ln(10^4) * 10^-4 in weight, so we expect
    a little under 1100 shares to accumulate unit weight...
    Interesting, appreciate it if you would share the logic behind the formula.

    Consider a uniformly random real x in the interval [10^-4,1]
    Its expected inverse is the integral of 1/x dx from 10^-4 to 1, which equals ln 1 - ln (10^-4) = ln(10^4).

    Now if we scale this up by 10^4*T, where T is the target threshold, and assume that shares are not lucky enough to go below T, then the n hashes will be uniformly distributed in the interval [T, 10^4*T], and we get the formula above.
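
    A quick Monte Carlo check of this estimate (a sketch; the target T is normalized to 1, so hashes are uniform on [1, scale] and a share's score is 1/hash):

    Code:
    import math
    import random

    def average_share_count(scale=10_000, trials=2_000):
        # Average number of shares accumulated until the sum of scores reaches 1.
        total = 0
        for _ in range(trials):
            score_sum, count = 0.0, 0
            while score_sum < 1.0:
                score_sum += 1.0 / random.uniform(1.0, scale)
                count += 1
            total += count
        return total / trials

    print(average_share_count())        # empirically close to ...
    print(10_000 / math.log(10_000))    # ... scale/ln(scale) = 1085.7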



    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 15, 2018, 05:10:17 PM
    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    I calculated wrong.

    n shares expect to accumulate about n*ln(n)*10^-4 in weight, so we expect
    a little under 1400 shares to accumulate unit weight...
    Interesting, appreciate it if you would share the logic behind the formula.

    Consider a uniformly random real x in the interval [10^-4,1]
    Its expected inverse is the integral of 1/x dx from 10^-4 to 1, which equals ln 1 - ln (10^-4) = 4 ln 10.

    Now if we scale this up by 10^4*T, where T is the target threshold, and assume that shares are not lucky enough to go below T, then the n hashes will be uniformly distributed in the interval [T, 10^4*T], and we get the formula above.


    Solid. Will include this formula and the proof in the white paper, if you don't mind.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 15, 2018, 05:34:36 PM
    Solid. Will include this formula and the proof in the white paper, if you don't mind.

    I don't mind, as long as you consider the edits I made to fix some errors.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 16, 2018, 11:25:39 AM
    I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:
    For any minimum relative difficulty* set for every share, mindiff, the average number of shares per (Finalized) block, n, would satisfy this formula: n * ln(n) * mindiff = 1

    * The minimum relative difficulty is the ratio by which PoCW reduces the calculated network difficulty, and is the lowest difficulty allowed for submitted shares.

    It yields n * ln(n) = 1/mindiff and suggests a better-than-linear relationship between the two, i.e. a decrease in mindiff (more utilization) causes a slower-than-linear growth in the average number of shares.

    Notes:
    1- @tromp's assumption that shares are not overly lucky reinforces this formula even more (i.e. the average weight can be a bit higher, hence n is always a bit smaller)

    2- The exact sum the n shares must reach according to this protocol is 0.93 instead of 1 (there is one fixed 2% share for the finalized block itself), so the formula should be modified for n to satisfy n * ln(n) * mindiff = 0.95

    I just did some trivial calculations using the latter formula:
    For mindiff set to 10^-4, we will have n < 1,320
    For 10^-5, we have n < 10,300
    For 10^-6, we have n < 83,800

    It is so encouraging: setting the minimum difficulty for shares to one million times easier than the network difficulty, we need only 83,800 shares per block on average, instead of the 1,320 for the current 0.0001. Note that the difficulty is already reduced by a factor of 10 as a result of the decreased block time of one minute, so we are talking about 10-million-times utilization compared to the currently proposed 100 thousand times.

    And yet we don't have to decrease mindiff (currently set to 10^-4) in such a strict way; we would prefer moderate adjustments instead, an even more promising situation.

    Based on these assessments, it is assertable that Proof of Collaborative Work is scalable and can achieve its design goal despite constant growth in network hashrate and difficulty, with a better-than-linear increase in demand for computing and networking resources (and no increase in other resources). The design goal is keeping the difficulty of shares low enough to help average and small miners participate directly in the network without being hurt by phenomena such as mining variance or their inherent proximity disadvantage: fixing one of the known flaws of PoW, mining pressure.
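
    For readers who want to reproduce the figures above, a minimal numeric solver for the formula as stated (a sketch; fixed-point iteration is just one convenient way to solve n * ln(n) = 0.95/mindiff):

    Code:
    import math

    def avg_shares(mindiff, rhs=0.95):
        # Solve n * ln(n) = rhs / mindiff for n by fixed-point iteration.
        target = rhs / mindiff
        n = target
        for _ in range(100):
            n = target / math.log(n)
        return n

    for mindiff in (1e-4, 1e-5, 1e-6):
        print(mindiff, round(avg_shares(mindiff)))
    # ~1322, ~10283, ~83804: in line with the figures quoted above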

    I guess we might start thinking about a self-adjustment algorithm for mindiff (the minimum difficulty figure for issued shares).
    No rush for this, tho; the core proposal is open to change, and this is just a long-term consideration.

    This hypothetical algorithm should have features such as:

    - Not being too dynamic. I think the adjustment shouldn't happen frequently; once a year, I suggest.

    - Not being linear. The increase in network hashrate comes both from new investment by miners and from improved efficiency of mining hardware. Both factors, especially the latter, suggest that we don't have to artificially keep very small facilities competitive by subsidizing them. We are not Robin Hood, and we shouldn't be.

    So, our algorithm should dampen the impact of difficulty increases rather than fully cover them.
    That would help the network upgrade smoothly.
    An adjustment factor of 30% to 50% in response to a 100% increase in target difficulty seems more reasonable to me than an exact proportional compensation for the new difficulty.






    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 16, 2018, 12:50:47 PM
    I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:

    NOTE that you overlooked my fix where ln(n) should instead be ln(1/mindiff).


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: goddog on June 16, 2018, 01:04:05 PM
    Hi, I have a stupid question; I'm surely missing something.


    • Finalization Block: It is an ordinary bitcoin block with some exceptions
      • 1- Its merkle root points to a  Net Merkle Tree
      • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
      • 3- It has a new field which is a pointer to (the hash of) a non empty Shared Coinbase Transaction
      • 4- The Shared CoinBase Transaction's sum of difficulty scores is greater than or equal to 0.95

    I cannot see any reward for the finalization block.
    Where is the incentive to mine a finalization block?


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 16, 2018, 03:12:25 PM
    I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:

    NOTE that you overlooked my fix where ln(n) should instead be ln(1/mindiff).
    Would you please do a complete rewrite of your proposed formula, for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 16, 2018, 03:18:53 PM
    Hi, I have a stupid question; I'm surely missing something.


    • Finalization Block: It is an ordinary bitcoin block with some exceptions
      • 1- Its merkle root points to a  Net Merkle Tree
      • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
      • 3- It has a new field which is a pointer to (the hash of) a non empty Shared Coinbase Transaction
      • 4- The Shared CoinBase Transaction's sum of difficulty scores is greater than or equal to 0.95

    I cannot see any reward for the finalization block.
    Where is the incentive to mine a finalization block?

    The block reward is distributed by means of the Shared Coinbase Transaction, in which the first item is a special share fixed to have a score of 0.02, and it obviously refers to the wallet address of the miner of the Finalized Block (a sketch of these records as data structures follows the quoted definition below).

    • Coinbase Share: it is new too and is composed of
      • 1- A Collaborating miner's wallet address
      • 2- A nonce
      • 3- A computed difficulty score using the hash of
        • previous block's hash padded with
        • current block's merkle root, padded with
        • Collaborating miner's address padded with the nonce field
      • 4-  A reward amount field
    • Shared Coinbase Transaction: It is a list of Coinbase Shares  
      • First share's difficulty score field is fixed to be  2%
      • For each share, the difficulty score is at least as good as 0.0001
      • The sum of the reward amount fields equals the block reward, and each share's reward is calculated in proportion to its difficulty score
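
    To make the layout concrete, a minimal sketch of these two records (field names, types, and the check() helper are illustrative assumptions, not a normative serialization):

    Code:
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CoinbaseShare:
        miner_address: str       # 1- collaborating miner's wallet address
        nonce: int               # 2- nonce
        difficulty_score: float  # 3- score of hash(prev_hash | merkle_root | address | nonce)
        reward: float            # 4- reward amount, proportional to difficulty_score

    @dataclass
    class SharedCoinbaseTransaction:
        shares: List[CoinbaseShare]

        def check(self, block_reward: float, mindiff: float = 1e-4) -> bool:
            first_fixed = abs(self.shares[0].difficulty_score - 0.02) < 1e-12  # fixed 2% first share
            min_ok = all(s.difficulty_score >= mindiff for s in self.shares[1:])
            sum_ok = abs(sum(s.reward for s in self.shares) - block_reward) < 1e-9
            return first_fixed and min_ok and sum_ok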


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 16, 2018, 06:31:44 PM
    Would you please do a complete rewrite of your proposed formula, for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.

    Let T be the target threshold determined by the difficulty adjustment,
    and scale be some suitably big number like 10^4.

    Let shares be hashes that fall into the interval [T, T*scale], and define their score as T / hash.
    When accumulating shares until their sum score exceeds 1, one is interested in the expected score of a share.

    This can be seen to equal 1/scale times the expected value of 1/x for a uniformly random real x in the interval [1/scale,1]. Considering the area under a share score, the latter satisfies (1-1/scale) E(1/x) = integral of 1/x dx from 1/scale to 1 = ln 1 - ln(1/scale) = ln(scale).

    So the expected score is approximately ln(scale)/scale.
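
    A quick Monte Carlo check of this estimate (an illustrative sketch; it models a share's normalized hash as uniform on [1/scale, 1], per the derivation above):

    Code:
    import math
    import random

    scale = 10**4
    trials = 10**6

    # score of a share = T / hash = (1/scale) * (1/x) for x uniform in [1/scale, 1]
    mean_score = sum(1.0 / (scale * random.uniform(1.0/scale, 1.0))
                     for _ in range(trials)) / trials
    print("empirical E(score):", mean_score)              # ~0.00092
    print("ln(scale)/scale   :", math.log(scale) / scale) # ~0.000921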


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 16, 2018, 06:55:14 PM
    Would you please do a complete rewrite of your proposed formula, ... for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me or I'm missing something here.

    Let T be the target threshold determined by the difficulty adjustment,
    and scale be some suitably big number like 10^4.

    Let shares be hashes that fall into the interval [T, T*scale], and define their score as T / hash.
    When accumulating shares until their sum score exceeds 1, one is interested in the expected score of a share.

    This can be seen to equal 1/scale times the expected value of 1/x for a uniformly random real x in the interval [1/scale,1]. Considering the area under a share score, the latter satisfies (1-1/scale) E(1/x) = integral of 1/x dx from 1/scale to 1 = ln 1 - ln(1/scale) = ln(scale).

    So the expected score is approximately ln(scale)/scale.

    If the expected score is ln(scale)/scale then the total number of shares is n = scale/ln(scale) by definition.
    Hence n ≈ 1086 for a 10,000 times scale-down of the difficulty. Right?

    If it were just a school exam I wouldn't hesitate that much, because as far as I know, and to the extent I checked it against available references, it seems to be basic:
    score = T/hash (checked)
    score of a normalized hash x is 1/x, with x uniform (checked)
    expected value of 1/x = integral of 1/x dx over the range [1/scale, 1] = ln(1) - ln(1/scale) = ln(scale) (checked)

    Yet I'm not entitled to weigh in on it, and the result (a 10,000 times scale-down achieved with just 1,086 shares) is almost too good. I just didn't expect that much efficiency.

    Any more comments?
    Is this correct reasoning? I mean, the expected value of a variable x is defined over its random distribution.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 16, 2018, 07:48:52 PM
    If the expected score is ln(scale)/scale then the total number of shares is n = scale/ln(scale) by definition.
    Hence n ≈ 1086 for a 10,000 times scale-down of the difficulty. Right?

    That's only approximately right. You do have that the expected sum of 1086 scores exceeds 1,
    since the expectation of a sum is the sum of expectations, but asking for the expected number of shares needed for the sum to exceed 1 is something else.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 16, 2018, 08:17:43 PM
    If the expected score is ln(scale)/scale then the total number of shares is n = scale/ln(scale) by definition.
    Hence n ≈ 1086 for a 10,000 times scale-down of the difficulty. Right?

    That's only approximately right. You do have that the expected sum of 1086 scores exceeds 1,
    since the expectation of a sum is the sum of expectations, but asking for the expected number of shares needed for the sum to exceed 1 is something else.
    As I understand, the expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment (the way Wikipedia defines it), and my primitive observation says that once you have the expected value (average) of a finite number of uniformly distributed random values and their total sum, you have the cardinality by dividing the sum by that average, if the variance is zero or very low, which is true for a pseudorandom function like SHA-2.

    What am I missing?


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: tromp on June 16, 2018, 09:44:32 PM
    As I understand, the expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment (the way Wikipedia defines it)


    Correct.

    Quote
    and my primitive observation says that once you have the expected value (average) of a finite number of uniformly distributed random values and their total sum, you have the cardinality by dividing the sum by that average, if the variance is zero or very low, which is true for a pseudorandom function like SHA-2.

    In your case you have a (potentially unbounded) sequence of i.i.d. random variables S_i
    (the score of the i'th share) and a separate random variable N depending on all S_i, which is the minimum n for which the sum of the first n S_i exceeds 1.
    Of course if the S_i have 0 variance then N = ceiling(1/S_i).

    A closely related case is where there is a single random variable S and N is just 1/S.
    In that case Jensen's inequality [1] applies and you have E(N) >= 1/E(S) , with equality only for Var(S)=0.

    I'm not sure to what extent Jensen's inequality carries over to your case.

    [1] https://en.wikipedia.org/wiki/Jensen%27s_inequality
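
    For what it's worth, the gap that Jensen's inequality allows here can be measured empirically (an illustrative sketch using the same uniform-hash model as the derivation above):

    Code:
    import math
    import random

    scale = 10**4

    def stopping_time():
        """Number of shares N until the accumulated score first exceeds 1."""
        acc, n = 0.0, 0
        while acc < 1.0:
            x = random.uniform(1.0/scale, 1.0)  # normalized hash
            acc += 1.0 / (scale * x)            # share score
            n += 1
        return n

    samples = [stopping_time() for _ in range(2000)]
    print("empirical mean N:", sum(samples) / len(samples))
    print("scale/ln(scale) :", scale / math.log(scale))  # ~1086, the zero-variance estimate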


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 17, 2018, 08:23:29 AM
    @tromp
    Thanks, for the explanation.
    I checked a few references; 1086 scores for a 10,000 scale-down is not exactly right, as we have variance > 0 in the distribution of scores, as you have correctly mentioned. But we should also notice that over a large number of blocks we have another convex function for n (the number of shares), distributed randomly with an order of magnitude less variance (my intuition), and its expected value is what we are looking for. It is beyond my expertise to go much further and discuss this problem thoroughly, tho.
    It would be of much help if you could spend some time on this and share the results. Both for a more active discussion and for keeping this topic reserved for more general discussions, I have started another topic regarding this problem (https://bitcointalk.org/index.php?topic=4484357).

    By the way, for practical purposes I suppose we can confidently use our previous estimate for n (1400), as the least optimistic one, for now.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 17, 2018, 03:39:27 PM
    I shouldn’t come back for sloppy seconds, but sometimes I like math.

    Last year I analyzed the bet which was made in the context of a probability error (https://www.reddit.com/r/btc/comments/6q2uak/peter_rizun_and_craig_wright_just_bet_1_btc_on_a/dlb2d0i/) that Craig Wright made in his paper on selfish mining (which he pulled in shame and I was never able to find an archive of the original paper).

    The error that Craig Wright made was treating the selfish miner's mining as if it were not jointly distributed with the honest miners' (which frankly is an impossible error for the real Satoshi to make):

    Quote from: iamnotback
    If we ask how long after selfish miner finds a block (i.e. which only happens 33.3% of the time overall!) will the honest miners find another block, then it is 15 minutes after because we have excluded 66.7% of the cases. But if we ask how long will it take for the honest miners to find a block from the starting of mining on the public block in 100% of the cases, then it is 15 minutes (which is thus 5 minutes after selfish miner finds a block if we are modeling both events independently). So it depends what we are modeling and how the bet is stated. Afaik from the screen captures I saw, Craig’s timeline chart apparently didn’t make it clear if we were modeling the honest miner and selfish miners as independent events. So the bet is ill-defined.

    Applying divergent thinking to your PoCW, I think I’ve found another flaw in your assumptions about the game theory, which applies to the calculation of how many lower difficulty shares will accumulate to reach the target block difficulty.

    As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block. Thus the hashrate applied will plummet if miners are acting rationally to maximize their profit. I'm not even confident that the block will be finalized in every instance. The math is getting complex and I would really need to think deeply about it to try to formalize it. Of course sometimes a block will finalize accidentally, because a share will infrequently have a very high difficulty result due to variance, and thus finalization of the block will be achieved. But I think the finalization step is actually not profitable for small miners. Only a very large mining farm which can mine that final 2% with ultra-low variance is likely to find it profitable to do so. Also, we have to factor in the investment the miners have in the shares they already mined on the block, so the larger miner will have more investment and more profit incentive to finalize. Thus this seems to eliminate some of the claimed decentralization.

    So as I told you from the very start, your proposed PoCW system is much more complex and non-linear than you may realize. And there’s likely many such land mines lurking. There’s a damn good reason Satoshi made the block period 10 minutes and didn’t go chopping it up in intervals. You change the economic theory of mining because the winner of a share doesn’t immediately win a block.
    This post is quoted completely from the cardinality problem topic (https://bitcointalk.org/index.php?topic=4484357.msg40302613#msg40302613) to keep that topic focused on its purpose.

    Quote
    As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block.


    I couldn't figure out how approaching the required share difficulty may result in the situation you suggest.

    When a miner is able to generate a Shared Coinbase Transaction (i.e. he is aware of enough valid shares with the same Net Merkle root and a cumulative score of at least 0.93), it is time for him to immediately switch to the Finalization phase, i.e. mining a block with at least 2% difficulty compared to the network's calculated target difficulty.

    The block our miner is trying to mine in this phase points to a Shared Coinbase Transaction that has the miner's address in its very first row and rewards it exactly 2%, so there is a fair reward proportional to the required difficulty, which is likewise 2%.
    The lucky miner who manages to find such a block collects that 2% reward as the first entry of the Shared Coinbase Transaction.

    Quote
    Only a very large mining farm which can mine that final 2% with an ultra low variance is likely to find it profitable to do so. Also we have to factor in the vestment the miners have in the shares they already mined on the block, so the larger miner will have more vestment and more profit incentive to finalize. Thus this seems to eliminate some of the claimed decentralization.

    Small miners will hit occasionally too, and they are losing just 2% of the reward if they don't manage to. Large miners either should include the shares (belonging to them or not) asap and switch to the finalization phase, or keep trying to mine more shares (e.g. privately), risking both their chance of mining the Finalization Block and their privately mined shares not being rewarded at all.

    I don't get why you suggest 2% is not enough incentive for small or medium miners to mine a block of 2% difficulty. As I see the protocol, it distributes work and rewards fairly and makes collaboration the better choice for participants; otherwise they will be isolated and lose profits, or at least experience mining variance headaches.

    Quote
    So as I told you from the very start, your proposed PoCW system is much more complex and non-linear than you may realize.

    It is not complex, imo. It just looks that way, because people are used to the winner-takes-all tradition that Satoshi Nakamoto established and that has never been questioned properly.

    I've said it before: Satoshi did an extraordinarily great job with PoW (the most important innovation after TCP/IP, imo), but when it came to details, he picked the most straightforward and naive approach for his rewarding system, winner-takes-all, putting bitcoin in great centralization danger because of both the proximity premium and mining variance flaws, which took just 2 years to show their impact on bitcoin.

    Thinking out of the box, I don't see anything complex in PoCW; on the contrary, I see it as so natural, the most natural way to implement PoW:
    You contributed to security? You should be rewarded properly
    instead of
    You got enormously lucky? So you probably have hashed exhaustively; take everything and enjoy it!

    As for non-linearity, I have to agree that at some points we may experience non-linear patterns and sensitivities (the transitions to and from the Contribution phase), but I guess miners can use good client software to make the critical decisions for questions like:
    To which Prepared Block should I contribute?
    Is it time to replace the block I'm contributing to with another, more popular one?
    Is it time to stop mining shares and try my chance at mining a finalization block?
    To which Finalized Block should I point the Prepared Block I'm going to mine?

    All of these questions can easily be addressed by means of simple calculations. Note that the critical information vital to making such decisions is tens of thousands of times more distributed in the network, and unlike Nakamoto's variant of PoW, we have practically no proximity premium flaw in PoCW.
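
    As an illustration only (the protocol does not prescribe any particular client policy, and the 1.5x margin below is a made-up hysteresis factor), such decision rules could look like:

    Code:
    def choose_prepared_block(candidates):
        """Contribute to the Prepared Block that has attracted the most
        accumulated score so far (a simple 'most popular wins' heuristic)."""
        return max(candidates, key=lambda b: b.accumulated_score)

    def should_switch(current, best):
        # Replace the block we are contributing to once another is clearly ahead.
        return best is not current and best.accumulated_score > 1.5 * current.accumulated_score

    def should_finalize(accumulated_score, threshold=0.93):
        # Leave the Contribution phase once the threshold is reached.
        return accumulated_score >= threshold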

    Quote
    ... There’s a damn good reason Satoshi made the block period 10 minutes and didn’t go chopping it up in intervals. You change the economic theory of mining because the winner of a share doesn’t immediately win a block.

    I understand and appreciate your respect for Satoshi; he deserves it and more. But I don't think bitcoin was a secret project of the Pentagon or Mossad led by Satoshi and backed by a think tank. Actually, I wouldn't care if it was the case. No matter how much thought you spend on a project, it is always flawed and needs improvement, I mean radical improvements and not minor (and always dangerous) tricks like SW.  :P

    As for 'changing the economic theory of mining', the way you put it, I'm sure I'm doing so and I feel good about it. Mining is suffering too much and miners are desperately in need of change.

    I have a lot to say about it; just one point for now:
    One of the most disastrous consequences of pool mining is the worst phenomenon that can happen to any human activity, the one at the core of Marx's criticism of Capitalism (which has never been truly understood/appreciated by critics): alienation!

    We have miners, like workers in a factory (the pool), who do not choose and do not decide, do not design and do not change, and are not paid for their human characteristics like being faithful and loyal; they are paid for their power and not for what they produce. Miners are alienated from their activity in the same way workers are from their work. Ironically, they are named 'workers' in pooling terminology.

    Proof of Collaborative Work will have the whole mining community at its back, both because of their interests and their need to matter again, their definite need for resurrection, once it is properly designed, implemented and propagated.

    If I'm not doing well enough, no worries; anybody with a deep understanding of the protocol is welcome to take the lead, and I'll support them. :)


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 18, 2018, 10:47:44 AM
    I received a cordial PM from @aliashraf.
    :) I feel good about you despite the bitterness, and this topic is about collaborative work after all. I PM'd you just to keep the topic focused.
    Quote

    ... he and I are incompatible ... originate from two distinct cultures ... I was born in New Orleans in the Deep Old South in 1965. ... those who live in kleptocracy countries have become very cynical and they have a more flippant mannerism of interacting. ..
    {A LOT of excuses for not doing what you are supposed to}
    I think it is so natural, expecting more commitment and responsibility when it is about 'the truth' and 'the right'.

    Definition of cynical
    1 : having or showing the attitude or temper of a cynic: such as
    a : contemptuously distrustful of human nature and motives
    … those cynical men who say that democracy cannot be honest and efficient. —Franklin D. Roosevelt
    b : based on or reflecting a belief that human conduct is motivated primarily by self-interest a cynical ploy to win votes

    See? By the above definition, I'm not cynical here, or maybe in the 'Old South' you mean something else by this word.
    I'm a believer in humanity; I do not judge people based on their race, gender, nationality, ... (not even based on their attitude or temper). You can ask @achow101, the mod, how I've responded about you and the chaos in this thread a few days ago.

    Quote
    Also it should be correctly noted that as of now, I have a vested interest for proof-of-work to not be salvageable for decentralization and scalability. That was a conclusion I made a couple of years ago, and so I invested a lot of R&D effort in non-proof-of-work Byzantine fault tolerant consensus system design.

    Reconsider! You decided wrongly. I think you have wasted your valuable time 'in the box' and it is your fault; come on, think 'out of the box' for a while.

    Quote
    I hope that helps you better understand about me and my attitude.

    So it’s difficult for me to rally around a claimed solution or improvement to proof-of-work, because:

    • I have a vested interest for it to not be true.{False! You are interested! }
    • I think Satoshi was a think-tank for 160+ IQ researchers who designed an exquisite game theory and had considered every possible angle (https://bitcointalk.org/index.php?topic=4433000.msg40262427#msg40262427).
      {Good story, but not real! F*k think tanks by the way, they are just a bunch of technocrats who suck and waste people's taxes}
    • I’m nearly certain there are invariants which will make it impossible to achieve decentralization along with scalability in proof-of-work. Even just scaling up economic demand without TPS causes proof-of-work to centralize due to economies-of-scale in profit. {Don't get neare any more because it is simply wrong!}

    So I am not against you personally, but please do not attack me just because I can't get excited about rallying around the open source R&D of your PoCW idea. {It is not an idea, it is a proposal, designed in detail}
    PoCW is not in an R&D phase; I've already begun the implementation phase. It will be open source, indisputably.

    Quote
    I am not trying to tell others to not participate. I am not trying to railroad your thread. I am not trying to belittle or insult you (although I got angry that you were violating my culture above and were like forcing me into an acrimonious discussion about something I am not that interested). None of that.
    Good start. I maintain my strategy of expecting more commitment from you, tho.

    Quote
    And you have another problem which is apparently most of the Core developers and their supporters have exited this subforum. I do not see the super smart guys commenting here anymore. So you probably better to go propose your PoCW in one of the development discussion communication channels where they are. You may ask the mod @achow101 as I saw he had posted recently in response to @traincarswreck where and how to communicate with Core.
    I'm aware of that.
    Bitcointalk was founded by Satoshi and is the only reference forum I care about; if it has lost momentum, it is our obligation to bring some back.

    Core devs are free to join this discussion; I won't go after them to do so. I don't think they are busy elsewhere doing something more important; they are doing nothing other than playing with the same toy they have always been playing with. If I'm wrong, refer me to a serious discussion they are involved in comparable to this thread.
    Quote
    I wish you the best luck with your proposal. But for me personally, I am not going to be one of its supporters unless someone formalizes it and convinces me that my stance is incorrect.
    Deal!

    Quote
    As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block.

    I couldn't figure out how approaching the required share difficulty may result in the situation you suggest.

    When a miner is able to generate a Shared Coinbase Transaction (i.e. he is aware of enough valid shares with the same Net Merkle root and a cumulative score of at least 0.93), it is time for him to immediately switch to the Finalization phase, i.e. mining a block with at least 2% difficulty compared to the network's calculated target difficulty.

    The block our miner is trying to mine in this phase points to a Shared Coinbase Transaction that has the miner's address in its very first row and rewards it exactly 2%, so there is a fair reward proportional to the required difficulty, which is likewise 2%.
    The lucky miner who manages to find such a block collects that 2% reward as the first entry of the Shared Coinbase Transaction.

    When mining shares early in the Shared Coinbase Transaction phase, the variance of the smaller miner is less of a cost because his share will still likely be produced before the window of time expires. But approaching the 0.93 level of accumulated shares, it becomes more costly from a variance-weighted analysis for the smaller miner to risk starting the search for a share solution. The math is that the miner will be less likely to earn a profit on average over the short term of a few blocks when mining closer to the expiration time. Over the long term, I think the variance skew may average out, but the problem is that time is not free. There's a cost to receiving profits delayed. This is why smaller miners have to join a pool in the first place.
    In the Contribution phase, miners won't interrupt mining and will continue brute-forcing the search space, unless they realise that one of the two following events has happened:
    1- Another Net Merkle root (Prepared Block) is getting hot and they are in danger of drawing dead. This is supposed to happen at the very beginning of the Contribution phase, and they will respond by switching to the new trend.

    2- The 93% limit is reached. This happens as a sharp, simultaneous event across the network, and miners react by switching to the Finalization phase simultaneously (in a very short interval, say 2-3 seconds, I guess).

    Wait! Do not rush to the keyboard to mock me or to teach me what I don't really need to be taught; just continue reading ...


    I understand: minds poisoned by traditional PoW, for which propagation delay is a BIG concern, cannot imagine how this is possible, but it is exactly the case with PoCW:

    Suppose we are approaching the 0.93 limit (say 0.736, 0.841, 0.913, ...) from the viewpoint of a miner A. Shares keep arriving and the accumulated score is getting closer and closer to the limit ...

    What would be the situation for miner B (say, at the far end of the longest shortest path from A)?

    B can feasibly be experiencing (0.664, 0.793, 0.879, ...) at the same time!

    Before proceeding further, let's understand how this situation is feasible at all ...

    Looking closer at the shares that A has validated and is accumulating to calculate the series 0.736, 0.841, 0.913, ... and the ones B is using to generate 0.664, 0.793, 0.879, ... may reveal an interesting fact: they are not exactly the same, and especially when it comes to the newer shares they diverge meaningfully!

    Suppose the collection of all the shares (regardless of their miners and how far they have been propagated) to be
    S = {s1, s2, s3, ... , sn-5, sn-4, sn-3, sn-2, sn-1, sn}

    Obviously SA and SB are subsets of S representing how far each of the miners, A and B, is aware (being informed about or the source of each share) of S.

    As there is propagation delay and there are other reasons for a miner to 'miss' a member of S, it is completely possible to have
    SA = {s1, s3, s4, ... , sn-5, sn-4, sn-2, sn} *
    * Missing s2, sn-3

    SB = {s1, s2, s4, ... , sn-5, sn-3, sn-1} *
    * Missing s3, sn-4, sn-2, sn

    They have most of the shares in common, but they do not have access to all the same shares; they don't need to. Miner B may suddenly receive more shares from adjacent peers and find himself closer to the 0.93 limit, and so on ...

    This is how PoCW mitigates the troubles we usually deal with because of the network propagation problem: we distribute information almost evenly across the network and reduce the proximity premium's weight and importance.

    This is why I'm convinced that the network phase transition occurs synchronously, in a very short window of time.

    This analysis confirms your concern about the risk of the latest shares never being finalized, but it shows that this risk is confined to a very short duration and the chances are distributed more evenly across the network.
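
    A toy simulation of this argument (my own sketch; the share rate, delays and drop rates are made-up illustrative numbers) shows two peers with different share subsets crossing the threshold within seconds of each other:

    Code:
    import random

    random.seed(1)
    # (arrival time at the source in seconds, score): ~20 shares/s, ~0.001 average score
    shares = [(i * 0.05, random.uniform(0.0001, 0.002)) for i in range(1000)]

    def crossing_time(delay_ms, drop_rate, threshold=0.93):
        """When does a node that hears each share after delay_ms (and misses a
        fraction drop_rate of them entirely) see the sum pass the threshold?"""
        acc = 0.0
        for t, score in shares:
            if random.random() < drop_rate:
                continue                    # this node never sees this share
            acc += score
            if acc >= threshold:
                return t + delay_ms / 1000.0
        return None

    print("miner A crosses at", crossing_time(delay_ms=50, drop_rate=0.02), "s")
    print("miner B crosses at", crossing_time(delay_ms=400, drop_rate=0.05), "s")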









    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 18, 2018, 12:27:12 PM
    You wrote nothing to refute my technical objection. Appears you don’t even comprehend the economics I presented.

    No, I addressed it by:
    Quote
    This is why I'm convinced that the network phase transition occurs synchronously, in a very short window of time.

    This analysis confirms your concern about the risk of the latest shares never being finalized, but it shows that this risk is confined to a very short duration and the chances are distributed more evenly across the network.

    To make it clear and to be more precise:

    In two windows of time, in the early seconds of contribution and in the last seconds (both being very short, as I have argued above for the latter and shown in my analysis of the convergence process for the former), there is a chance for shares not to be finalized.

    The proximity premium resistance of the algorithm will compensate for this risk by distributing it evenly across the network.

    Note: I'm not sure yet, but I suppose the last point, the risk being distributed, has an interesting implication: in the long term, it is no risk at all; it is part of the protocol and is automatically compensated by the target difficulty adjustment.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 18, 2018, 02:35:22 PM
    AFAICT, my objection is not ameliorated by any randomization or distribution.

    Of course it ameliorates your objection. How could it possibly do anything else after all?

    Distributing a risk tens of thousands of times across the network and neutralizing the proximity premium (which PoCW definitely achieves) is not a simple point to be overlooked easily. When you participate in a network with a relatively good distribution of information and luck, you just don't back off because you are afraid of not hitting every single round. It is just part of the game. The only concern is always about how distributed and fair this game is.

    Quote
    I’m sorry I am not going to repeat the economics again. AFAICT you are simply not understanding.


    On the contrary, I understand every bit of your objection and more ...
    It is by no means about "economics"; you are simply questioning whether the incentives are enough compared to the risks. It is not so complicated as to be called economics.
    Accusing me of being ignorant about such a simple trade-off between costs and benefits ... well, it is too much in this context.

    Quote
    The miner will simply observe that not mining within a window nearer to 0.93 will be more profitable. If they have ambiguity around the window, they'll simply have to back off to an even lower threshold or use a timeout.
    Now let's have a bit more "economical" assessment here:

    You suspect that when the total score of the shares a miner is aware of is close enough to the 0.93 threshold, a rational miner may stop mining, take a break, and wait for more shares to come before switching to the next phase, because there is a greater chance that his newly mined shares won't be included in the finalized block, and it would be a waste of ... wait ... a waste of what? Electricity? Because the rents are already paid, aren't they?

    So it is about risking electricity expenses over a 2-3 second duration against a proportional, fair chance to hit and be (almost) the first to transition to the Finalization phase.

    Quote
    But I have nothing to gain by continuing to explain it. Cheers.

    Note that doesn’t mean I am 100% certain. I would need to do some formalization. And I am unwilling to expend the effort.

    Neither do I. It is too luxurious for this protocol to be analyzed to that extent; I'll leave it as it is. For now, I'm convinced that no back-off threat practically exists. Miners will simply take their shots, because the critical threshold window is very narrow, as I have argued before.





    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 19, 2018, 06:33:49 AM
    Of course it ameliorates your objection. How could it possibly do anything else after all?

    Craig Wright is an obnoxious, stubborn, boastful person who fell flat on his face when he finally tried to formalize and published a whitepaper (https://bitcointalk.org/index.php?topic=4484357.msg40302613#msg40302613).

    I’ll await your formalization, as I awaited for a several years for Radix to finally specify their flawed design (https://steemit.com/cryptocurrency/@anonymint/scaling-decentralization-security-of-distributed-ledgers-part-3).

    Although formalism is not my first priority right now (implementation is), I have presented this proposal in a formal way. The algorithm and the terminology have been described in detail and free of ambiguity. As you might have noticed, until now I've made only one major edit, as a result of the discussion with @ir.hn. If by formalism you mean a lot of mathematical analysis addressing every single possible attack or vulnerability, I think it is too much at this stage.

    I'm not suggesting that postponing a more formal, finalized whitepaper is generally good practice, but for this particular project, implementation has the higher priority. Let me explain:

    I introduced PoCW not to give birth to a new coin; improving bitcoin and Ethereum (while saving the latter from Buterin's Casper coup d'etat) are my main concerns. As one of the first contributors to this topic correctly emphasised, the main challenge here is political. I started this topic to spread the word and the idea, not to convince people to join a scammy shitcoin project but to find out who is who and how the weather is. The next step is implementing the code and demonstrating how feasibly smart and clean I can do this.

    I have a long war to win and I don't play this game so cautiously, saying nothing unless you have an army of academics at your back. That is not the way I fight: when I find the answer, I show up with it and start fighting, and I will fight to the river, as I have told you elsewhere. I don't hesitate and don't postpone everything for paperwork.

    Quote

    Distributing a risk tens of thousands of times across the network and neutralizing the proximity premium (which PoCW definitely achieves) is not a simple point to be overlooked easily. When you participate in a network with a relatively good distribution of information and luck, you just don't back off because you are afraid of not hitting every single round. It is just part of the game. The only concern is always about how distributed and fair this game is.

    Correct. You’re also describing Nakamoto proof-of-work, in which small miners must join pools, because the variance risk is always too high for the entire block period.

    Analogously, I claim that PoCW creates increasing variance risk later in the block period. So again smaller miners need to turn off their mining the closer to the end of the block period, and wait to mine the next block again.
    Except that Nakamoto's PoW suffers from mining variance over the whole period, while mine is in danger just in a short transition period. Plus, in Nakamoto's case we have a single focal point of information and a proximity premium, while in my proposal, PoCW, we compensate for the said danger by distributing the information (new shares) tens of thousands of times.

    You totally ignore both differences and I don't know why.
    Quote
    Throughout this entire thread you seem utterly incapable of comprehending the concept of relativism. I grow very weary of repeating myself, even though you continue to pretend (misrepresent) to readers that I’m dim witted or disingenuous.
    Please don't go back there. It is not true at all. Nobody implies people are disingenuous by arguing with them. Why should you put it that way?
    Quote
    You suspect that when the total score of the shares a miner is aware of is close enough to the 0.93 threshold, a rational miner may stop mining, take a break, and wait for more shares to come before switching to the next phase, because there is a greater chance that his newly mined shares won't be included in the finalized block, and it would be a waste of ... wait ... a waste of what? Electricity? Because the rents are already paid, aren't they?

    Electricity is not already paid. If there is any form of flat-rate mining hardware hosting account which does not meter electricity on a usage basis, then the account is not profitable to mine with, because electricity is the major cost of mining.


    What I was trying to say is that mining involves several cost factors: rent, hardware depreciation, wages and electricity. The hypothetical back-off strategy can only help reduce electricity costs for a few seconds per minute by relaxing the miner from hashing. I suggest that even with high electricity fees it won't trade off against dropping the chances to hit and be rewarded.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 19, 2018, 09:36:36 AM
    P.S. If you understood anything about the Old South, you would understand that it is the way you talk to other people that makes you a bonafide citizen of that culture. I do not talk to people on this forum the way I talk to people in my culture, because very few here have the etiquette of the Old South. So on this forum I get to be as big of an asshole as others are to me. It is pure defect-defect because the forum is anonymous. Although I have made a decision to try to exhibit much greater patience and tolerance.
    No worries dude, I will do my best to keep this debate alive as long as some meat is served here  ;)
    About the Old South ... you already know, I'm a fan!

    Quote

    I introduced PoCW not to give birth to a new coin; improving bitcoin and Ethereum (while saving the latter from Buterin's Casper coup d'etat) are my main concerns. As one of the first contributors to this topic correctly emphasised, the main challenge here is political. I started this topic to spread the word and the idea, not to convince people to join a scammy shitcoin project but to find out who is who and how the weather is. The next step is implementing the code and demonstrating how feasibly smart and clean I can do this.

    I have a long war to win and I don't play this game so cautiously, saying nothing unless you have an army of academics at your back. That is not the way I fight: when I find the answer, I show up with it and start fighting, and I will fight to the river, as I have told you elsewhere. I don't hesitate and don't postpone everything for paperwork.


    Proof-of-work is "might is right." Why are you idealistic about fungible stored capital that enslaves mankind? You are just fiddling with the fringe of the issue. The central issue is that proof-of-work is all about the same ole paradigm throughout human history that says fungible stored claims on someone else's labor are powerful.

    I reject that and we are headed into a paradigm-shift which will render the NWO reserve currency Bitcoin irrelevant. Security of fungible stored capital is not the big deal it was before in the fixed investment capital Agricultural and Industrial ages. You need to understand that you are barking up an old, rotten tree with old, rotting Old World money in its midst.

    Read every linked document in this blog, and my comments below the blog and wake up:

    https://steemit.com/cryptocurrency/@anonymint/bitcoin-rises-because-land-is-becoming-worthless
    It is the true story behind this debate, isn't it?
    At a very unfortunate moment of your history with bitcoin and PoW, you made a horrible decision: giving up on both!
    Cultures won't help; people are just the same, no matter when or where they belong: they just give up when they become disappointed.

    For me, this is a different story. When I find something brilliant, I don't care about its current state of development; brilliance is enough for me to commit and not give up on it, no matter what.

    History teaches us another important lesson too: when a paradigm shows up, it stays for a while, and it is pointless and mostly impossible to have a paradigm shift every decade.
    I will check your link and I'll go through your replies as you wish, I promise, but I have to say, I'm strategically against any attempt to replace PoW; it seems to me just a fake, ridiculous attempt, a cartoon. Sorry, but it was you who chose the wrong side.

    Quote
    If by formalism you mean a lot of mathematical analysis to address every single possible attack or vulnerability, I think it is too much in this stage.

    I think it’s impossible to not have acrimony without it. You just assume epilson without actually proving how small are the effects you dismiss.

    {....}

    You need to show the math and prove it is epsilon.

    And that also includes your presumption of "a few seconds per minute." It depends on the variance of the miner.
    I don't agree. It is always possible to discuss issues without going through formal, mathematical analysis. I did some semi-formal analysis of this specific subject of your concern (variance in the transition period) and have shown how sharp this period is.
    But now you are unsatisfied and keep pushing for more details, which I just can't schedule more time for; and if I did it, nobody would read it, not now.

    Quote
    There is also a countervailing force which you could argue for, which (I mentioned up-thread) is the investment in shares the miner already has and the amount of luck he adds to the block being solved if he does not stop mining. But that is probably a negligible factor and Vitalik already explained that altruism-prime is an undersupplied public good (see the weak subjectivity Ethereum blog), so I doubt miners would be able to agree to not defect for the greater good. It's a Prisoner's dilemma.

    Again you need to show the math.
    Please! You probably know my opinion about this kid, Buterin, and his foolish "weak subjectivity" thing. It is a shame: a boy desperately obsessed with being a genius, trying to revolutionize cryptocurrency with 'weak' shit. Absolutely not interested.

    As for the proposed 'additive' for miners not to back off because of the hypothetical variance in the transition phase: thanks for reminding me, I'm fully aware of that; I just didn't bring it forward to avoid complicating the subject even more.

    Anyway, it improves the odds and can't be rejected by the boy's "discovery" that altruism is not the dominant factor in a monetary system  :D
    It is not about the well-being of others.
    Miners always have an incentive for their own previously mined shares (in the current round) to be part of the chosen 93%, and their late shares, besides earning direct rewards, will help this process.

    Quote
    Actually perhaps the most logical action is for smaller miners to switch to centralized pools for a portion of the block. Probably it will be difficult to argue against that mathematically. So that if true, probably more or less defeats the stated purpose of PoCW.
    I think there is a possibility (not a force) for some kind of pooling in PoCW. But it won't be at all the same as conventional centralized pools (it doesn't need to be) and won't defeat the purpose, which is eliminating the pooling pressure and its centralization consequences.

    I have to analyze it far more, but I guess a light gradient exists here in favor of forming some kind of 'agreements' between clusters of small miners to communicate in star topologies and help each other transition more smoothly. It is a light gradient, as there are very low stakes (2% or so) on the table.

    One should again take into consideration the way PoCW fixes the proximity premium and practically synchronizes miners to transition between phases at almost the same time, as I have already discussed extensively, implying short transition periods and little incentive to pay the setup/cleanup costs of joining a pool temporarily.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 19, 2018, 10:43:09 AM
    @anunymint

    Bitcoin is not just about security; it is far more about decentralized issuance of money and decentralized transactions. Money is not wealth; it is not capital; it is just a device, and its fungibility is essentially what makes it such a device.

    A new monetary system, of any kind, will have to face scalability and flexibility issues, and it is absolutely possible for it to suffer from shortages in this regard in its earliest stages of development. Improvement is the solution.

    Betraying PoW and Satoshi is your right (and Buterin's, and the PoS enthusiasts' too).

    But my choice is different: I remain loyal and try to fix issues. This proposal is about fixing two important flaws in PoW, and it will scale bitcoin at least 10 times while keeping it fully objective and decentralized.

    We will see how breaking out of this prison will escalate more improvements later. Especially, I'm very optimistic about on-chain scaling via the sharding infrastructure inherent in PoCW.

    PoCW is just evidence of the feasibility of improving instead of giving up and sticking with subjective, failed alternatives. If they were not just failed ideas, how would it have been possible at all for bitcoin to rise?

    Anyway, I'll go through your links and will discuss them with you, probably in separate threads, as I promised. For this topic, I think we can forget about strategic and visionary issues and take it just as it is: a case study of an improvement proposal for PoW.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 19, 2018, 03:17:50 PM
    {....}
    We will see how breaking out of this prison will escalate more improvements later. Especially, I'm very optimistic about on-chain scaling via the sharding infrastructure inherent in PoCW.

    You already lost that debate up-thread. You cannot increase your factor to 1 million or 10 million to accommodate unbounded scalability.

    Besides I still do not think PoCW even works at a 10,000 factor.

    And proof-of-work will centralize and there’s nothing you can do to ameliorate that outcome.
    Unbounded scalability is not needed at all, and the latest calculations by @tromp suggest practical scalability for PoCW.

    By reducing the block time to 1 minute and applying the 0.0001 minimum threshold rule for shares, we reach the 10^-5 scale-down needed to keep a single S9 able to participate in solo mining bitcoin, producing an average of 2 shares per hour.

    As of the latest calculations, we need an average number of shares on the order of scale/ln(scale), which is promising for scaling the difficulty down to 10^-7 with just 72,000 shares per round (1,200 shares/second) on average.

    A 10^-7 scale-down is more than sufficient for a 1000-times increase in network difficulty (how many decades later?), because we expect the smallest miner-to-protect to be at least 10 times better off, and 1,200 shares/second is not that high for 2050, I suppose.
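
    The arithmetic behind these numbers can be checked quickly (my own sketch, using @tromp's scale/ln(scale) estimate; the extra factor of 10 comes from the 1-minute rounds versus 10-minute bitcoin blocks):

    Code:
    import math

    block_time_s = 60  # 1-minute rounds
    for exp in (4, 6):
        scale = 10**exp
        shares = scale / math.log(scale)  # expected shares per round
        print(f"per-round scale 10^-{exp} (10^-{exp + 1} vs. 10-minute blocks): "
              f"{shares:,.0f} shares/round, {shares / block_time_s:,.0f} shares/s")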

    So, unlike what you claim, scalability debate is not lost for PoCW.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on June 20, 2018, 03:06:07 PM
    @anunimint

    I'm not going to patronize you, but you have been helpful and productive here. I'll mention your contribution in the white paper when discussing possible objections, including the ones you have made this far, and the way Proof of Collaborative Work is supposed to resist them.




    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: MISERICORDAE PROJECT on June 22, 2018, 09:29:21 AM
    @aliashraf @anunimint

    Congratulations, you progressed despite the high entropy of the thread. Very glad!

    Implementation and testing, even to an MVP, will shine light on any unresolved issues, and hopefully they will be addressed.


    @anunimint  How are you getting on with your health (I read your post about the complications)... Wishing you to get better and better soon.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 03, 2018, 04:04:36 PM
    I just edited the starting post to improve the protocol: it now guarantees a minimum reward for the miners of Prepared Blocks in case transaction fees are not enough, and at the same time encourages them to commit more transactions (probably with higher fees) to the Net Merkle Tree by dedicating 30% of the respective fees to the finder, traded against 1% of the block reward.

    For the time being this is not a complete rewrite of the article; a comment has just been added at the end.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 08, 2018, 03:56:12 PM
    @anunymint
    Miners who begin the finalization process need to accumulate enough shares. Once a share is found with a difficulty much higher than what is needed, they can choose to eliminate a few smaller shares (typically ones that don't belong to themselves) and include the newly minted one, keeping the sum at the 95% threshold needed. This will distribute the 'bad beat' consequences among more miners.
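
    A sketch of that replacement rule (illustrative only; ordering and tie-breaking are not specified in the proposal, and a real miner would prefer dropping other miners' shares first):

    Code:
    def prune_shares(shares, new_share, threshold=0.95):
        """Keep only the largest shares still needed to hold the accumulated
        score at the threshold after a high-difficulty share arrives."""
        candidates = sorted(shares + [new_share], key=lambda s: s.score, reverse=True)
        kept, acc = [], 0.0
        for s in candidates:
            if acc >= threshold:
                break              # already enough accumulated difficulty
            kept.append(s)
            acc += s.score
        return kept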

    Thanks a lot for your reviews; you are truly helpful and I appreciate it. I maintain that PoCW, this proposal, is rigid, tho.

    By the way, your characterization of my proposal as a design that
    Quote
    changes the winning block solution from a single Schelling point to multi-inflection point game theory
    is formulated in a somewhat sneaky way.

    Distributing what you call "a single Schelling point" is nothing less than eliminating it!

    Take the proximity premium flaw of traditional PoW, for example: the flaw is caused by the very fact that a single miner happens to find a block, one that other people are killing themselves over, while he has already started mining the next block (after relaying his discovery). It puts him (and his peers in the next rank) in a premium position which he can leverage to perform better in the next round, and so on.

    In PoCW, we have tens of thousands of hot zones (new-share-found events) distributed all over the network. One can hardly categorize being at the focal point of such a zone (as the lucky miner), or being closer to it, as a premium, simply because it is not big news at all; it happens frequently and is evenly distributed across the network.

    I think it is very important to remain relativistic (as you always like to mention): PoCW is an improvement; it improves PoW relatively, by tens of thousands of times.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: icoinist on July 09, 2018, 05:26:48 AM
    Hi @alishraf
    Went through the post and the conversations around it several times. I had a bunch of questions; posting my top 3.

    1. Maybe I didn't understand this. It appears that anyone who puts out a prepared block will be able to pocket the transaction fees, even if it's a competing block. If so, read on. If not, go to 1.1. What stops miners from putting out prepared blocks, which have a rather low barrier to entry, and pocketing the transaction fees? Also, would this not cause an increase in the average transaction fees any transaction experiences? Considering there will be at least some (>1) competing prepared blocks, each of which would have claimed the transaction fee.
       1.1 Btw, would each of these prepared blocks need to have the same transactions or can they be different?
    2. How does this dis-incentivize a centralized mining outfit from winning races… i.e., they could still go ahead and prepare a new, what you call a “prepared block”, and funnel more “contributions” to it than a collection of miners acting in their individual capacity… (because, if I understand correctly, your premise is that a few prepared blocks come into existence and then the early winner acquires pole position insofar as the rest of the race is concerned).
     - This would only need a new kind of mining pool to exist where, instead of hashes being funneled via a single key owner, they would come from the entire membership set of that pool. I.e., they would still be acting together to win the race, yet publishing their hashes individually.
    3. If I understand your proposal correctly, you’re taking what is essentially happening within a mining pool, but instead of the hub-and-spoke structure of the mining pool, you’re disaggregating it by pushing the computation to the individual nodes, while pushing the collective state into the blockchain, thus allowing for state-keeping of rewards on an individual node basis.
    If the above is accurate, wouldn’t this be a major computational challenge for the individual nodes? I.e., while the state has been pushed into the blockchain, there is the computation that needs to be done in order to keep track of the cumulative mining score/difficulty (the .95 number you alluded to). The reason I say this is: lowering the difficulty results in lengthening/increasing the state chain that has to be maintained in each node (and the computation thereof). Your Net Merkle Tree will become huge. In a nutshell, wouldn’t it be the case that each node would now have to do exactly the same thing, in terms of processing/computation, that a mining pool server is/was doing?



    I admit all my reasoning is more qualitative and intuition-driven rather than mathematical. Among the reasons for this is that your proposal is rather hard for me to wrap my head around… mainly because you’ve gone into articulating an implementation as opposed to a top-down explanation.
    Furthermore, it’s a bit confusing to keep track of and map the various terminologies you’ve used… since in some places they are used a bit inconsistently.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 09, 2018, 06:27:07 AM
    Hi @icoinist,

    1- I couldn't understand the impact you suggest this proposal would have on transaction fees.
    Plus, there is no reason for competing Prepared Blocks to have the same set of transactions committed to the Net Merkle Root. Actually, they always have at least one different transaction: their coinbase.

    2- PoCW disincentivizes pooling by making pools unnecessary, but not impossible. Miners can still participate in pools, but there would be much less advantage while the costs (maintaining the service) and the risks (centralization related) remain the same. This is achieved by smoothing mining variance and distributing the proximity premium.

    3- You are correct about the topological nature of the proposal; that is exactly what happens, and what the protocol is intentionally designed for.
    The computational overhead, your concern, is not critical though, because it amounts to verifying and keeping track of around 1,500 shares per round for the suggested parameters (on average, according to the latest calculations up-thread).
    It is very important to note that computation is not the bottleneck in a decentralized network; communication is. A full node is capable of verifying tens of millions of hashes in each round (one minute) without any degradation in performance.
    In PoCW, shares are 'contributions' to the same Net Merkle Root, which is already verified (once per round) in the preparation phase. It takes a few microseconds to verify each received share's integrity and difficulty: no I/O, no additional communication.
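
    A share check along these lines is indeed just one hash computation and a comparison (sketch; the preimage layout is the same hypothetical one used earlier in the thread):

    Code:
    import hashlib

    def verify_share(prev_hash: bytes, net_merkle_root: bytes,
                     address: bytes, nonce: int, target: int,
                     min_score: float = 0.0001) -> bool:
        """CPU-only check: recompute the share's hash and its difficulty score.
        No disk I/O and no extra network round-trips are needed."""
        preimage = prev_hash + net_merkle_root + address + nonce.to_bytes(8, 'big')
        h = int.from_bytes(hashlib.sha256(preimage).digest(), 'big')
        return target / h >= min_score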

    As for your qualitative review being due to my 'bottom-up' presentation of the proposal:
    I presented the proposal this way to get more contributions from community members, because I (still) think that a concrete example is easier for the average reader to understand and discuss.


    EDIT:
    I noticed that you made some edits:
    Quote
    1. Maybe I didn't understand this. It appears that anyone who puts out a prepared block will be able to pocket the transaction fees, even if it's a competing block. ... What stops miners from putting out prepared blocks, which have a rather low barrier to entry, and pocketing the transaction fees? Also, would this not cause an increase in the average transaction fees any transaction experiences? Considering there will be at least some (>1) competing prepared blocks, each of which would have claimed the transaction fee.

    Putting out a Prepared Block is not enough to get rewarded; it should attract enough contributions and be finalized by means of a Finalization Block that includes the shares and the Merkle root of the prepared block under consideration. Only Finalized Blocks, and hence a unique Merkle tree, will be committed to the blockchain.

    Rational miners should stop producing a Prepared Block as soon as they find out that another prepared block has been propagated and is getting contribution shares from peers. It is just like traditional PoW, in which miners give up their current work and start mining a new block referencing the newly found one, because they realize that otherwise their current work will go stale.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 10, 2018, 01:05:41 AM
    Maybe later. For now, I'm just sick of it.

    So many white papers out there; every scammer has got one. You want to find sponsors to run a scammy ICO? Easy! Find some crappy idea about a token or a new coin or something, no matter what, and write a white paper!

    In the rare non-scamming cases we have PoS-poisoned shit, proof-of-jumping ideas, topologically ill-designed heterogeneous networks of fucking specialized nodes ... I feel bad about white papers.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: Traxo on July 13, 2018, 01:44:51 PM
    Every post from @anunymint apparently was deleted. The thread is now very difficult to understand because a significant portion of the discussion is missing.

    Some of this thread was archived here (http://archive.is/https://bitcointalk.org/index.php?topic=4438334.0;all) and here (https://web.archive.org/web/*/https://bitcointalk.org/index.php?topic=4438334.0;all).


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 13, 2018, 02:11:53 PM
    Every post from @anunymint apparently was deleted. The thread is now very difficult to understand because a significant portion of the discussion is missing.
       >:(

    Although @anunymint was somewhat harsh and used bitter language, I have to acknowledge his contribution as helpful.

    Banning users is cruelty, but removing their posts? It is slaughter.

    I got this PM from @mpremp (the supreme leader) regarding my posts being deleted because of quoting @anunymint  :o

    Believe it? He has removed my posts because I've quoted @anunymint. I mean, what is it? A devious recursive slaughter algorithm, run by a bot?

    I'm shocked and disappointed. Bitcointalk is not the right place for such malicious behavior. I'll stop posting here for a while.





    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: Traxo on July 13, 2018, 03:37:11 PM

    I got this pm from @mpremp (the supreme leader) regarding my posts being deleted because of quoting @anunymint  :o



    You mean mtwerp (https://bitcointalk.org/index.php?topic=1887077.msg18824147#msg18824147)?


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 13, 2018, 03:40:06 PM

    I got this pm from @mpremp (the supreme leader) regarding my posts being deleted because of quoting @anunymint  :o



    You mean mtwerp (https://bitcointalk.org/index.php?topic=1887077.msg18824147#msg18824147)?

    No, it is @mprep (https://bitcointalk.org/index.php?action=profile;u=51173). Does it matter who it is?


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: Traxo on July 14, 2018, 09:28:43 AM
    No it is @mprep (https://bitcointalk.org/index.php?action=profile;u=51173). Does it matter who?

    Yes, it matters who. Try clicking the link. We were referring to the same vandal.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: kirkarsedat on July 17, 2018, 08:51:11 PM
    Well, it sounds good to get rid of pools for mining crypto, in order to promote individual or small-scale mining where miners' earnings could reach an optimal profit.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: danda on July 21, 2018, 06:30:24 AM
    @aliashraf  First, thank-you for this proposal and for moving the ball forward on POW improvements to increase decentralization.

    Please do not let trolling or forum moderators get you down.  Sometimes it is better to ignore the noise and focus on building.

    Anyway, having skimmed over the entire thread, here are my initial thoughts.

    1. This is important work.  One aspect of pooling that has only been lightly touched on in this thread is the power that pool operators have when it comes to consensus changes.  During the segwit debate, pool operators would signal this way or that and the individual miners operating on those pools had no say in the matter, except to leave, which some did but not many.  The exact same thing happened with the eth/etc fork over the DAO debacle.  It is quite possible that neither of those consensus changes would have happened without pools.  In my view, a truly decentralized POW would result in a coin that is MUCH MUCH more resistant to consensus algorithm changes because it is like herding wild cats (many asleep and hiding under brush) instead of influencing a few zookeepers.  This improves immutability.     If consensus changes are actually desired, a formal change mechanism such as Decred's can be built-in.

    2. It is admirable that you wish to help bitcoin and eth with this improvement.  However, this is a long and frustrating road you have set yourself, full of politics and headache.  B. Fuller said "You never change things by fighting the existing reality".  Sometimes it is necessary to create a viable working alternative just to prove something can be done.  Just look at monero and zcash.  People have been talking about improving privacy in bitcoin pretty much since day 1.  We still don't have it.  But at least now we have choice, and some working implementations that can serve as testbeds for things like bulletproofs that may yet find their way into bitcoin.  So I would encourage you to reconsider your stance on building an altcoin.  A fairly launched, decentralized coin would not be a "shitcoin" in my book, but a chance to start over and do some things right.  If the innovations are better, that coin will eventually win in the marketplace or have its innovations adopted by larger coins.   Truly decentralized mining, if achievable, is an idea worthy of its own coin, if ever there was one.

    3. Are you working on code already?   I'd be interested to check out github for this project....

    4. What are your goals for who would be able to profitably mine on such a network?  For example, would every person on the planet with a smartphone be able to profitably mine?   How about a raspberry pi, or even an old commodore-64?   I'm just trying to get an idea of what the lower limits are for contributions to the network's security, and also I would like to understand whether such "micro-mining" requires or supports micro-payments.   E.g. payments that might be worth $0.00000001 in today's value.  Do fees become the limiting factor?    Is 1 satoshi too large to express some of these mining rewards?

    5.  You may find the bitcoin-dev mailing list a better audience for deep technical discussion/feedback.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 21, 2018, 09:12:47 AM
    @aliashraf  First, thank-you for this proposal and for moving the ball forward on POW improvements to increase decentralization.

    Please do not let trolling or forum moderators get you down.  Sometimes it is better to ignore the noise and focus on building.

    Anyway, having skimmed over the entire thread, here are my initial thoughts.

    1. This is important work.  One aspect of pooling that has only been lightly touched on in this thread is the power that pool operators have when it comes to consensus changes.  During the segwit debate, pool operators would signal this way or that and the individual miners operating on those pools had no say in the matter, except to leave, which some did but not many.  The exact same thing happened with the eth/etc fork over the DAO debacle.  It is quite possible that neither of those consensus changes would have happened without pools.  In my view, a truly decentralized POW would result in a coin that is MUCH MUCH more resistant to consensus algorithm changes because it is like herding wild cats (many asleep and hiding under brush) instead of influencing a few zookeepers.  This improves immutability.     If consensus changes are actually desired, a formal change mechanism such as Decred's can be built-in.
    I highly appreciate your contribution.

    Actually, it was the smartest and most relevant contribution in this thread so far (no offense to the other guys who posted here); your deep understanding of the importance of the 'getting rid of pools' agenda surprised me.

    Let me tell you a secret: I forced myself to take care of pools just because of the very cause you identified and pointed out: consensus changes.

    I was working on ASIC-resistance proposals when I realized that for any consensus change, instead of convincing the community, I would have to convince a few pool operators, the very job I'm not good at. Actually, I suck at negotiating with authorities of any kind; I feel desperate and worthless, and it always ends with the same result: being humiliated by a bunch of idiots who have no brains and no hearts.

    Then I started asking myself: Shouldn't it be different with crypto? Wasn't it supposed to be different? Wasn't I promised to live on a more decentralized planet? Why should I have to negotiate when I have the logos on my side? Who gave them the ethos to sit with me and negotiate?

    That is how I left everything else suspended and started designing this proposal.

    I want to set crypto free for further evolution, to make it a fair environment for the most important resource on earth: human creativity and talent. That is why I'm so proud of this work. It is right to the point, the most important point ever after Satoshi: the elimination of pooling pressure in PoW consensus systems.

    No other development in cryptocurrency and blockchain technology deserves to be compared with this proposal, other than bitcoin itself.
    It leads us to the first truly decentralized crypto coin in history. Let trolls and biased reviewers do their best to undermine it; they won't succeed, and the pooling age, bitcoin's ice age, will be over. It is just a matter of time, and not a long, boring time  ;)

    Obviously, I'm motivated by your support, but I deliberately went this far to help you understand my approach to this project: no doubts, no hesitations, just feeling more responsible and trying to become more ready.

    Quote
    2. It is admirable that you wish to help bitcoin and eth with this improvement.  However, this is a long and frustrating road you have set yourself, full of politics and headache.  B. Fuller said "You never change things by fighting the existing reality".  Sometimes it is necessary to create a viable working alternative just to prove something can be done.  Just look at monero and zcash.  People have been talking about improving privacy in bitcoin pretty much since day 1.  We still don't have it.  But at least now we have choice, and some working implementations that can serve as testbeds for things like bulletproofs that may yet find their way into bitcoin.  So I would encourage you to reconsider your stance on building an altcoin.  A fairly launched, decentralized coin would not be a "shitcoin" in my book, but a chance to start over and do some things right.  If the innovations are better, that coin will eventually win in the marketplace or have its innovations adopted by larger coins.   Truly decentralized mining, if achievable, is an idea worthy of its own coin, if ever there was one.

    I understand your concerns, but some thoughts:
    1- We made bitcoin what it is.
    We propagated and defended it. We introduced it, along with Ethereum and Monero and others, to our friends, our family, our colleagues, enthusiastically and confidently. We are responsible to the community that we helped build; we just can't leave them alone with Jihan, it is not fair.

    2- PoCW, this proposal, is about eliminating the need for pools in a network saturated with hashpower and transaction load. Releasing a fresh coin based on this protocol is a feasible option, but it might take too long to put such a network under a real stress test, and it would put the project at risk of becoming obsolete, lost in the hypes and speculation.

    For now, my plan is neither a fresh coin nor a traditional hard fork. I'll discuss it later when I'm more ready. But if, hypothetically, somebody is interested in releasing a new coin using this protocol, I'll support them technically and morally.

    Quote
    3. Are you working on code already?   I'd be interested to check out github for this project....

    Yes, I am, and I will commit my work ASAP.

    Actually, after some coding I found myself with a lot of new ideas and improvements (it happens very often; the code thinks and designs autonomously), so I went back to my papers and decided to release a new version of the proposal with a LOT of interesting improvements.

    I suppose it will take quite some time, but it's worth it. I will keep you informed.

    Quote
    4. What are your goals for who would be able to profitably mine on such a network?  For example, would every person on the planet with a smartphone be able to profitably mine?   How about raspberry pi or even an old commodore-64?   I'm just trying to get an idea of what the lower limits are for contributions to the network's security, and also I would like to understand if such "micro-mining" requires or supports micro-payments.   Eg payments for value that might be worth $0.00000001 cents in today's value.  Do fees become the limiting factor?    Is 1 satoshi too large to express some of these mining rewards?

    One should be careful here.
    What we fix is mining variance and its centralization consequences; mining profitability and efficiency is another whole damn issue, damaged by ASICs.

    This proposal has no fix for the inefficiency of cpu/gpu mining in a coin attacked by ASICs, like bitcoin. So a commodity device will still fail to be profitable, even if you fix the variance disaster, because of its inefficiency compared to an S9.

    But there are hopes (more than one):

    -First of all, this proposal opens doors for further consensus changes and improvements (ASIC resistance on top of them).

    -Ethereum, Monero, ... are not ASICed yet or have survived it (and as for Ethereum, we will save it, whether Vitalik likes it or not  ;) )

    -One of the new design concepts I've already finalized and will publish very soon is an exciting possibility for wallets to participate in the collaborative work: they generate difficult transactions that point to, and put weight on, the most recent block they have verified as valid. Besides its PoW-related impacts, this helps support micropayments by letting wallets compensate for fees with work, something that was not feasible before the collaborative-work concept (a rough sketch follows this list).
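
    A rough sketch of that wallet-side idea, using sha256 as a stand-in for the (different, parallelism-safe) wallet algorithm and an illustrative target; none of these names come from the protocol itself:

    Code:
    import hashlib

    WALLET_TARGET = 2 ** 236   # far easier than a mining share, hard enough to prove work

    def grind_transaction(tx_bytes: bytes, tip_hash: bytes) -> int:
        # The wallet grinds a nonce over its transaction plus the tip block it
        # vouches for; miners can count the resulting work both as weight on
        # that block and as a partial or full substitute for the transaction fee.
        nonce = 0
        while True:
            h = hashlib.sha256(tx_bytes + tip_hash + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(h, "big") < WALLET_TARGET:
                return nonce
            nonce += 1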

    Quote
    5.  You may find the bitcoin-dev mailing list a better audience for deep technical discussion/feedback.

    I don't know why, but I feel comfortable with public discussion right now, and honestly, I feel somewhat offended by the dev guys who didn't contribute here.

    Anyway, maybe in the future I'd consider a more active strategy in this regard. Thanks for the comment.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: danda on July 21, 2018, 12:19:43 PM
    Let me tell you a secret: I forced myself to take care of pools just because of the very cause you identified and pointed out: consensus changes.

    I was working on ASIC-resistance proposals when I realized that for any consensus change, instead of convincing the community, I would have to convince a few pool operators, the very job I'm not good at. Actually, I suck at negotiating with authorities of any kind; I feel desperate and worthless, and it always ends with the same result: being humiliated by a bunch of idiots who have no brains and no hearts.

    Then I started asking myself: Shouldn't it be different with crypto? Wasn't it supposed to be different? Wasn't I promised to live on a more decentralized planet? Why should I have to negotiate when I have the logos on my side? Who gave them the ethos to sit with me and negotiate?

    That is how I left everything else suspended and started designing this proposal.

    I want to set crypto free for further evolution, to make it a fair environment for the most important resource on earth: human creativity and talent. That is why I'm so proud of this work. It is right to the point, the most important point ever after Satoshi: the elimination of pooling pressure in PoW consensus systems.

    I agree with and support your above statements.  I would simply maintain that your goals may be most readily achieved by letting bitcoin be bitcoin and starting something new: permission-less innovation, and no need to convince anyone.

    Quote
    I understand your concerns, but some thoughts:
    1- We made bitcoin what it is.
    We propagated and defended it. We introduced it, along with Ethereum and Monero and others, to our friends, our family, our colleagues, enthusiastically and confidently. We are responsible to the community that we helped build; we just can't leave them alone with Jihan, it is not fair.

    We each also agreed to bitcoin's consensus rules, as they are/were when we began using it.  I've argued elsewhere that the ideal cryptocurrency in terms of maintaining a stable and trusted value is one whose consensus rules cannot be changed, ever.  In practice, what we have now with BTC and ETH are coins that are quite difficult to change, by design, so anyone that tries will likely be frustrated.   What you are proposing is a major change to the rules, so you will likely be frustrated.  That I happen to agree with your reason/goals is irrelevant to that basic point.

    Quote
    2- PoCW, this proposal, is about eliminating the need for pools in a network saturated with hashpower and transaction load. Releasing a fresh coin based on this protocol is a feasible option, but it might take too long to put such a network under a real stress test, and it would put the project at risk of becoming obsolete, lost in the hypes and speculation.

    I believe that if this algo works as intended and a coin is fairly launched, a lot of people would mine it from day 1, and it would be under a real stress test long before your proposal would be adopted on bitcoin mainnet.

    Quote
    For now, my plan is neither a fresh coin nor a traditional hard fork. I'll discuss it later when I'm more ready. But if, hypothetically, somebody is interested in releasing a new coin using this protocol, I'll support them technically and morally.

    Ok, cool.  Sounds like you have a plan.  I won't belabor that point any longer.

    Quote
    Yes, I am, and I will commit my work ASAP.

    Actually, after some coding I found myself with a lot of new ideas and improvements (it happens very often; the code thinks and designs autonomously), so I went back to my papers and decided to release a new version of the proposal with a LOT of interesting improvements.

    I suppose it will take quite some time, but it's worth it. I will keep you informed.

    Excellent!  I will look forward to that.

    Quote
    One should be careful here.
    What we fix is mining variance and its centralization consequences; mining profitability and efficiency is another whole damn issue, damaged by ASICs.

    Hmm, here I get a little confused.

    Your proposal states:

    Quote
    The Idea is accepting and propagating works with hundreds of thousands times lower difficulties and accumulating them as a proof of work for a given transaction set, letting miners with a very low shares of hash power ( say of orders like 10-6) to participate directly in the network and yet experience and monitor their performance on an hourly basis.

    Also in your writeup on the mining variance you state:

    Quote
    while small miners are losing opportunity costs every single round, they will never have a practical chance to be compensated ever.

    From these statements, I inferred that your proposal intends to enable practical mining on "small" devices.  If that's not really the case, I would encourage you to define what a "small miner" is and what the goals are in terms of hardware participation, as this directly relates to how decentralized mining can practically be.

    a blue-sky aside:

    I have always thought that a theoretically "ideal" decentralized POW consensus algorithm would reward every participant that contributes compute cycles to securing the network, in proportion to the work performed, each and every block.

    Practically speaking, this would seem to require breaking the share puzzles into chunks small enough for each device to solve multiple shares per block, according to its abilities.  It also might require something like lightning to make regular micropayment payouts to thousands or millions (or billions) of mining participants worldwide:  think everybody with a smartphone wallet app.  The aggregate coinbase payout amount could be recorded onchain.
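
    To illustrate, a toy payout rule (the miner names and numbers are made up; a real scheme would pay per proven share):

    Code:
    # Every contributor is paid each block in proportion to the work it proved.
    def payouts(block_reward, work_by_miner):
        total = sum(work_by_miner.values())
        return {m: block_reward * w / total for m, w in work_by_miner.items()}

    print(payouts(12.5, {"phone": 1, "laptop": 40, "asic": 1_000_000}))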

    maybe such an ideal algo will always be pure fantasy.  What do you think?



    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 21, 2018, 01:09:15 PM

    Quote
    One should be careful here.
    What we fix is mining variance and its centralization consequences; mining profitability and efficiency is another whole damn issue, damaged by ASICs.

    Hmm, here I get a little confused.

    Your proposal states:

    Quote
    The Idea is accepting and propagating works with hundreds of thousands times lower difficulties and accumulating them as a proof of work for a given transaction set, letting miners with a very low shares of hash power ( say of orders like 10-6) to participate directly in the network and yet experience and monitor their performance on an hourly basis.

    Also in your writeup on the mining variance you state:

    Quote
    while small miners are losing opportunity costs every single round, they will never have a practical chance to be compensated ever.

    From these statements, I inferred that your proposal enables practical mining on "small" devices.  If that's not really the case, I would encourage you to define what a "small miner" is and what the goals are in terms of hardware participation, as this directly relates to how decentralized mining can practically be.
    To be precise, you should think abstractly.
    Abstractly speaking, the efficiency of a device is a matter of the algorithm and the machine that runs it. As long as you keep the hashing algorithm unchanged, you can't change the distribution of efficiency.

    This proposal does not cover the ASIC problem that makes commodity devices uncompetitive. What it tries to fix is the effect of mining scale on devices with the same efficiency (hundreds of thousands of ASICs distributed unevenly at different scales).

    Taking care of this problem is the most basic and preliminary step towards the efficiency problem you are interested in; otherwise, billions of devices with the same efficiency would still conglomerate around 10-20 pools, and you would have nothing more than what you've got right now.

    Quote
    a blue-sky aside:

    I have always thought that a theoretically "ideal" decentralized POW consensus algorithm would reward every participant that contributes compute cycles to securing the network, in proportion to the work performed, each and every block.

    Practically speaking, this would seem to require breaking the share puzzles into chunks small enough for each device to solve multiple shares per block, according to its abilities.  It also might require something like lightning to make regular micropayment payouts to thousands or millions (or billions) of mining participants worldwide:  think everybody with a smartphone wallet app.  The aggregate coinbase payout amount could be recorded onchain.

    maybe such an ideal algo will always be pure fantasy.  What do you think?

    I think it needs to be refined and adjusted:

    First of all, you should figure out a way to keep ASICs, FPGAs and even GPUs out of the race. They are way more efficient and leave no space for commodity devices to profit.

    Plus, you should adjust your expectations a bit. For instance, a device contributing to PoW as an auxiliary feature, like a mobile phone and its wallet app, doesn't need a steady micropayment every minute. It is not an investment; it can tolerate a reasonable variance, like a payout once or twice a day.

    So, for your dream to be realized:
    1- You definitely need an ASIC-resistant algorithm which is immune to parallelism as well (a toy sketch follows this list).
    Suppose something like a Dagger-Hashimoto algorithm (or any memory-hard algorithm) with both read and write navigation requirements in each loop (actually I have a proposal for this), which needs to hold locks on its memory footprint.
    Such an algorithm will keep almost every single commodity device competitive for mining.

    2- Now you will need PoCW, to ensure that any device will be paid in near real time for its contribution, i.e. with a tolerable variance.
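
    As a toy illustration of the read/write idea in point 1 (sha256 over a scrypt-like memory table; only a sketch, not my actual proposal):

    Code:
    import hashlib

    def H(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def memory_hard_hash(seed: bytes, n: int = 1 << 14, rounds: int = 1 << 14) -> bytes:
        # Write phase: fill a table with a sequential hash chain.
        table = []
        x = H(seed)
        for _ in range(n):
            table.append(x)
            x = H(x)
        # Read/write phase: every step reads and then overwrites a pseudo-randomly
        # addressed slot, so the whole table must stay live in fast memory and the
        # steps cannot be run in parallel.
        for _ in range(rounds):
            j = int.from_bytes(x[:8], "big") % n
            x = H(x + table[j])
            table[j] = x
        return x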


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: danda on July 21, 2018, 02:29:07 PM
    To be precise, you should think abstractly.
    Abstractly speaking, the efficiency of a device is a matter of the algorithm and the machine that runs it. As long as you keep the hashing algorithm unchanged, you can't change the distribution of efficiency.

    This proposal does not cover the ASIC problem that makes commodity devices uncompetitive. What it tries to fix is the effect of mining scale on devices with the same efficiency (hundreds of thousands of ASICs distributed unevenly at different scales).

    yes, I get that.  And I don't expect cell phone mining to ever be really profitable vs asics, gpus, botnets or whatever.  Even against only other cell phones, each one would only have like a 1/billionth share.  Fundamentally, I'm just thinking of an ideal algo that is provably fair, i.e. pays out according to work performed, where each device added actually provides more security to the network.   Given such a system, more people might mine altruistically, and perhaps people could use it to generate some micro-tokens for participation in the network without having to visit an exchange of some type.  Like 21, inc used to talk about: bootstrapping towards an internet of value/things.

    Stated another way: today I could turn on a dusty old laptop and start it mining to give me warm fuzzies that it is helping secure the blockchain.  But I know that it isn't really achieving anything, because it would not find a block in my lifetime, and those cycles and electricity have been wasted due to winner-takes-all.  But if I knew that my tiny contribution was actually additive with everyone else's tiny contribution each block, and also that together we are providing resistance to consensus rule changes, then I would probably do it, even if not profitable.

    From my initial read of your proposal, I understood the work to be additive, not winner-takes-all, and thus it would give me the warm fuzzies with my dusty old laptop or my cell phone mining.  Is that not so?   Could it be made so?


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on July 21, 2018, 04:11:40 PM
    To be precise, you should think abstractly.
    Abstractly speaking, the efficiency of a device is a matter of the algorithm and the machine that runs it. As long as you keep the hashing algorithm unchanged, you can't change the distribution of efficiency.

    This proposal does not cover the ASIC problem that makes commodity devices uncompetitive. What it tries to fix is the effect of mining scale on devices with the same efficiency (hundreds of thousands of ASICs distributed unevenly at different scales).

    yes, I get that.  And I don't expect cell phone mining to ever be really profitable vs asics, gpus, botnets or whatever.  
    Well, you should! Without incentive and profitability it doesn't make sense to participate in network security; altruism is not a good foundation for a system.

    My next version of this proposal, as I've mentioned previously, includes a solution for your problem: contribute to PoW whenever you have a transaction to relay. Incentives? Miners would take advantage of your difficult-to-generate transactions as an additive factor, to pass their shares/blocks more easily, and this way you can cover the transaction fee partially or completely.

    I didn't mean to go into details, but it may be very interesting for you that wallets don't contribute with the same algorithm as miners; theirs is supposed to be safe against parallelism.

    The only way to have your idea practically implemented would be to release a coin totally mined with such an algorithm.

    Quote
    ... today I could turn on a dusty old laptop and start it mining to give me warm fuzzies that it is helping secure the blockchain.  But I know that it isn't really achieving anything, because it would not find a block in my lifetime, and those cycles and electricity have been wasted due to winner-takes-all.  But if I knew that my tiny contribution was actually additive with everyone else's tiny contribution each block, and also that together we are providing resistance to consensus rule changes, then I would probably do it, even if not profitable.

    From my initial read of your proposal, I understood the work to be additive, not winner-takes-all, and thus it would give me the warm fuzzies with my dusty old laptop or my cell phone mining.  Is that not so?   Could it be made so?
    No, it is not so, as long as we are stuck with the current sha2 algorithm in bitcoin, or even with Ethash in Ethereum.

    It is because we have to set a minimum difficulty threshold on mining shares to keep the network uncongested. For bitcoin my suggestion is something like 0.00001 difficulty (relative to the block difficulty), which gives an S9 an opportunity to hit a share every 30 minutes or so with the current network hashrate. A commodity cpu-based device will hit shares almost a million times less often, obviously not a practical option.
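
    A back-of-the-envelope calculation (the hashrate figures are illustrative assumptions; the result scales with whatever network hashrate you plug in):

    Code:
    BLOCK_INTERVAL = 600        # seconds, bitcoin's target
    SHARE_DIFFICULTY = 1e-5     # share difficulty as a fraction of the block difficulty
    NETWORK_HASHRATE = 35e18    # H/s, a rough 2018 figure (assumption)

    def share_interval(miner_hashrate):
        # The whole network hits a share every BLOCK_INTERVAL * SHARE_DIFFICULTY
        # seconds; a miner owning fraction f of the hashrate waits 1/f times longer.
        f = miner_hashrate / NETWORK_HASHRATE
        return BLOCK_INTERVAL * SHARE_DIFFICULTY / f

    print(share_interval(14e12) / 3600)   # Antminer S9 (~14 TH/s): hours per share
    print(share_interval(20e6) / 86400)   # ~20 MH/s CPU (assumed): days per share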

    As I said, you would need a different hashing algorithm to support something like tens of millions of small devices mining directly and hitting shares every few hours, I suppose.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: Cobrak777 on July 22, 2018, 08:20:13 PM
    Well, it sounds good to get rid of pools for mining crypto, in order to promote individual or small-scale mining where miners' earnings could reach an optimal profit. Shared mining profit tends to favor the pool operators, while the actual miners only get a percentage of it. We should promote and create a mining opportunity that lets miners make a good profit from it.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: danda on August 28, 2018, 07:17:38 AM
    bump.  any updates on this?    code or design wise...


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on August 28, 2018, 03:08:12 PM
    @danda

    Yes, both are underway. Many design issues have been reconsidered/improved and new features have been included:
    • Wallets are now able to take part in consensus and proof of work, and to compensate for fees (partly or totally) by doing work
    • An infrastructure for a conservative version of sharding is included
    • The whole proposal has been revised and improved

    I'm all on my own in this project for now; I'd appreciate any contribution.
    Thanks for the interest, by the way. :)


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: empty[g] on September 04, 2018, 09:09:14 AM
    OP,
    I can understand why you say that in PoCW small miners can mine without pooling pressure, as explained at https://bitcointalk.org/index.php?topic=4687032.msg42296802#msg42296802 (https://bitcointalk.org/index.php?topic=4687032.msg42296802#msg42296802).
    Maybe that would have been right if this were the first cryptocurrency ever, as there would have been no need to make a pool in the first place.
    But we are already witnessing more than 50% of the network in the hands of 3 pools. If we change the network protocol now, pools (with so many people already in them who may not even care about, or find out about, changes in the network) would mine on the new network, and selfish mining would then be a problem, as big pools, and probably anyone working and mining for them (who may use the program the pools write for this purpose), would not relay other pools' or even solo miners' shares.
    Actually, selfish mining in PoCW, with pools already in existence, may be a bigger problem than it is now, as there are so many more shares to relay, and not relaying them really means something.


    Title: Re: Getting rid of pools: Proof of Collaborative Work
    Post by: aliashraf on September 04, 2018, 01:02:55 PM
    @empty[g]

    Pooling will be de-incentivized with PoCW, because miners with a very low hashrate can survive solo mining; there will be no reason for a small miner with a few S9s to give up his sovereignty by joining a pool and to pay fees for it at the same time.

    As for selfish mining by big farms, I've discussed it up-thread with @anunymint, but I haven't checked whether, after his ban and the removal of his posts and of the ones which quoted them, the selfish-mining discussion is still comprehensive enough. So let's briefly take a look at this problem once more:

    First of all, selfish mining, in its essence, should not be categorized as a flaw, even in legacy PoW.
    A miner with any amount of power can choose to keep his mined blocks private (putting them at risk of becoming orphans), hoping to find the next block(s). The traditional winner-takes-all approach in bitcoin correctly assumes that for miners such a trade-off won't be encouraging enough, but many authors have tried to find a justification for miners with a large hashrate share (> 30%) to take advantage of selfish mining.

    Although Sirer et al.'s classical analysis suggests the feasibility of such an attack* on bitcoin, and we now have at least one pool with such a large share (BTC.com), and historically Slush pool temporarily reached even higher thresholds, there has never been any sign of a selfish-mining attack on the bitcoin blockchain.

    Such an attack would cause an unexpected increase in the orphan rate, which is trivially detectable. If Sirer is correct (which I doubt), it is probably because a large pool/farm with up to 30% of the network's power has long-term interests in keeping the system secure rather than seeking a few percent more profit.


    Now that we have a more precise understanding of this threat in PoW, it is time to do a comparative analysis for PoCW:

    1- PoCW de-incentivizes joining pools, so selfish mining would have to be committed by a farm rather than a pool. But a mining farm with 30%+ of the network's hash power is many times harder to establish than a pool.

    2- A pool operator has fewer risk factors involved than a farm owner. A pool operator is gambling with his 1-2% share, but a farm owner is risking the block's total reward.

    3- In PoW, winner-takes-all implies both large risks and large rewards being gambled; in PoCW, miners decide about every single share they generate, gradually, during each round. It is easy to prove that there is no advantage in keeping shares private in the first stages of the contribution phase, and that it yields little advantage in the latest stages, because the selfish miner has almost nothing more than a 3% reward incentive to start the finalization phase earlier.

    The 3rd argument above is of much importance, and you should carefully consider the protocol details to grasp it:
    Suppose the network has already converged to a prepared block, and consequently to its Merkle tree. At first, when shares are starting to accumulate, it is in the best interest of both regular and selfish miners to publish a share, because no matter who finalizes the block, the share will be included with high probability.

    As the shares accumulate more and more, at a critical threshold (90%+), a selfish miner with a very large hashrate may start to consider the possibility that keeping his (frequently generated) shares private is more profitable.

    Just like the case with PoW, a trade-off should be made between the risk of losing both the privately kept shares and the 3% reward of generating a finalized block on one hand, and guaranteeing one's own shares to be included (as there is always a chance for a share not to be included, because a miner who manages to finalize a block gives his own shares priority) and gaining the 3% on the other hand.

    I think an exact mathematical analysis is needed here, but let's assume a miner with a very large share would find it more profitable to go selfish in the last stages; even then, the network would still be something like 95% contributive, because the selfish miner needs to include most of the other shares to prove his finalized block (a toy comparison follows).
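
    As a toy expected-value comparison (the probabilities and the way the 3% bonus is counted are simplifying assumptions, not protocol constants):

    Code:
    FINALIZER_BONUS = 0.03   # the finalization reward discussed above

    def ev_publish(share_value, p_included):
        # Publishing: the share is almost surely included, no bonus either way.
        return share_value * p_included

    def ev_withhold(share_value, p_self_finalize):
        # Withholding: the share pays only if this miner finalizes the block
        # himself; then it is surely included and he also collects the bonus.
        return p_self_finalize * share_value * (1 + FINALIZER_BONUS)

    # A miner with 30% of the hashrate, late in the contribution phase:
    print(ev_publish(1.0, 0.98))    # 0.98
    print(ev_withhold(1.0, 0.30))   # 0.309 -> withholding loses in this toy model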

    It is worth mentioning here that I have made an improvement to the protocol that allows Prepared blocks to freely use any of the n (n not being so large) older blocks as their parent, for other reasons, but one side effect of this improvement is to neutralize any effort to keep Finalized blocks private and deceive the network into generating stale shares. That is why I deliberately ignored this scenario; but even without such an improvement, the same analysis would show how much better the situation is with PoCW compared to traditional PoW.



    *I have doubts about the integrity of Sirer's work, because from a pure mathematical point of view it implies that a long-range attack on the bitcoin network is possible with less than 50% of the network's hash power, which is false: a far more rigorous mathematical analysis shows that you need 50%+1 of the power for such an attack. If Sirer's analysis were correct, a selfish miner with 30% of the power would be able to keep his chain private for more than just a few blocks and show up with a longer alternative chain, which is practically impossible with just a 30% share of the network's total power.