Bitcoin Forum
Author Topic: Getting rid of pools: Proof of Collaborative Work  (Read 1854 times)
MISERICORDAE PROJECT
Jr. Member
Activity: 56
Merit: 3
June 12, 2018, 06:35:39 PM
Last edit: June 13, 2018, 03:36:13 AM by MISERICORDAE PROJECT
#41

Proof of Collaborative Work

A proposal for eliminating the necessity of pool mining in bitcoin and other PoW blockchains

Motivation
For bitcoin and the altcoins based on common PoW principles, the centralization of mining through pools is both inevitable and unfortunate, and it puts all the reasoning that supports the security of PoW in a paradoxical, fragile situation.

The same problem exists in PoS networks. Things can get even worse there, because most PoS-based systems enforce long-term deposit strategies that strongly discourage miners from migrating from one pool to another because of the costs involved.

The problem of solo mining becoming too risky and impractical for small mining facilities appeared in 2010, less than two years after bitcoin had been launched. It was the worst possible timing: although Satoshi Nakamoto commented on bitcointalk about the first pool proposals, it was among the last posts Satoshi made, and he disappeared from this forum a few days later, forever, without making a serious contribution to the subject.

So the confused community came up with an unbelievable solution to such a critical problem: a second-layer centralized protocol, named pooling, boosted by greed and ignorance and supported by junior hackers who, as usual, missed the forest for the trees.

Bitcoin was just two years old when the pooling age began; pools eventually came to dominate almost all of the network's hashpower.

A quick review of the Slush thread, in which Satoshi made the reply referenced above, reveals how immature and naive this solution was, how little it was debated, and how it was adopted: in a rush, with obvious greed.
Nobody ever mentioned the possibility of an algorithm tweak to keep PoW decentralized. Instead everybody was talking about how practical such a centralized service was, while the answer was more than obvious:
Yes! You can always do everything with a centralized service; don't bother investigating.

Anyway, in that thread one cannot find any argument about the centralization consequences, or about the possibility of alternative approaches, including improvements to the core algorithm.

I don't think that is fair. PoW is great and can easily be improved to eliminate such a paradoxically centralized second-layer solution. This proposal, Proof of Collaborative Work (PoCW), is an example of the inherent possibilities and capacities of PoW. I haven't found any similar proposal and it looks original, but if there is prior history I'll be glad to be informed about it.

The idea is to accept and propagate works with difficulties hundreds of thousands of times lower than the block difficulty, and to accumulate them as a proof of work for a given transaction set, letting miners with a very low share of the hash power (say on the order of 10^-6) participate directly in the network and still monitor their performance on an hourly basis.
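To make the variance claim concrete, here is a rough Monte Carlo sketch comparing the hourly income of a miner holding 10^-6 of the hash power under winner-takes-all 1-minute blocks versus PoCW-style shares at 0.0001 relative difficulty. All parameters are illustrative assumptions, and the payout model is deliberately simplified (each share is paid a flat 0.0001 of the block reward, ignoring the prepared/finalization bonuses); both models have the same expected income by construction, so the interesting output is the spread.

Code:
import math, random

# Illustrative assumptions, not protocol constants:
HASH_SHARE = 1e-6     # miner's fraction of the total network hash power
BLOCK_REWARD = 12.5   # coins per block, for scale only
BLOCKS_HOUR = 60      # 1-minute blocks, as proposed in the outline below
SHARE_DIFF = 1e-4     # minimum share difficulty relative to a full block
HOURS = 100_000       # simulated hours

def poisson(lam):
    """Knuth's method; fine for the small rates used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def stats(samples):
    mean = sum(samples) / len(samples)
    std = (sum((x - mean) ** 2 for x in samples) / len(samples)) ** 0.5
    return mean, std

# Solo, winner-takes-all: the miner wins whole blocks, very rarely.
solo = [poisson(BLOCKS_HOUR * HASH_SHARE) * BLOCK_REWARD for _ in range(HOURS)]

# PoCW-style: ~1/SHARE_DIFF shares per block network-wide; the miner finds a
# Poisson number of them, each paid a flat SHARE_DIFF of the block reward.
pocw = [poisson(BLOCKS_HOUR / SHARE_DIFF * HASH_SHARE) * BLOCK_REWARD * SHARE_DIFF
        for _ in range(HOURS)]

for name, samples in (("solo, winner-takes-all", solo), ("PoCW shares", pocw)):
    mean, std = stats(samples)
    print(f"{name:22s} mean/hour = {mean:.6f}   std dev = {std:.6f}")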



Imo, now, after almost a decade, Moore's law has done enough to make it feasible to use more bandwidth and storage, and it seems somewhat hypocritical to me to make arguments about 'poor miners', pretend to be concerned about centralization threats, and use that as an excuse to reject this very specific proposal which, although it increases the demand for such resources, can radically disrupt the current situation with pools and centralized mining.

This proposal is mainly designed for bitcoin. For the sake of convenience, and to let readers form a more concrete picture of the idea, I have deliberately used constants instead of adjustable parameters.

Outlines
  • An immediate but not practically feasible approach would be reducing the block time (along with a proportional reduction in the block reward). Although this approach cannot be applied on its own because of the network-propagation problems involved, an excellent side effect would be its immediate impact on the scalability problem; we will use it partially (reducing the block time to 1 minute from the current 10 minutes).
  • As mentioned earlier (and with all due respect to the Core team), I don't take objections about the storage and network requirements of reducing the block time as serious criticism. We should not leave mining in the hands of 5 mining pools to protect a hypothetical poor miner/full-node owner who cannot afford to install a 1-terabyte HD in the next 2 years.
  • Also note that the block-time reduction is not a necessary part of PoCW, the proposed algorithm. I'm including it as one of my old ideas (adopted from another forum member who suggested it as an alternative to the infamous block-size debate, and later developed a bit further by me) which I think deserves more investigation and discussion.
  • PoCW uses a series of mining-related data structures that are either preserved on the blockchain or transmitted as network messages (a minimal code sketch follows this outline):
    • Net Merkle Tree: an ordinary Merkle hash tree of transactions, with the exception that its coinbase transaction carries no block reward (no newly issued coins); instead the miner pays all transaction fees to his own address (SegWit is supported)
    • Collaboration Share: a completely new data structure composed of the following fields:
      • 1- The root of a Net Merkle Tree
      • 2- Collaborating miner's wallet address
      • 3- A nonce
      • 4- A difficulty score, computed using the previous block's hash padded with all previous fields; it is required to be at least as hard as 0.0001 of the current block difficulty
    • Coinbase Share: it is new too and is composed of
      • 1- A Collaborating miner's wallet address
      • 2- A nonce
      • 3- A computed difficulty score using the hash of
        • previous block's hash padded with
        • current block's merkle root, padded with
        • Collaborating miner's address padded with the nonce field
      • 4-  A reward amount field
    • Shared Coinbase Transaction: It is a list of Coinbase Shares  
      • The first share's difficulty score field is fixed at 2%
      • Every share's difficulty score is at least 0.0001
      • The sum of the reward amount fields equals the block reward, and each share's reward is proportional to its difficulty score
    • Prepared Block: It is an ordinary bitcoin block with some exceptions
      • 1- Its merkle root points to a  Net Merkle Tree
      • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.05
    • Finalization Block: It is an ordinary bitcoin block with some exceptions
      • 1- Its merkle root points to a  Net Merkle Tree
      • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
      • 3- It has a new field which is a pointer to (the hash of) a non empty Shared Coinbase Transaction
      • 4- The Shared CoinBase Transaction's sum of difficulty scores is greater than or equal to 0.95
  • Mining process goes through 3 phases for each block:
    • Preparation Phase: it typically takes just a few seconds for the miners to produce one or (rarely) 2 or 3 Prepared Blocks. Note that the transaction fees are already transferred to the miner's wallet through the coinbase transaction committed to each block's Net Merkle Tree root.
    • Contribution Phase: miners pick one valid Prepared Block's Merkle root, according to their speculation (which becomes more accurate as new shares are submitted to the network) about which root will eventually gather enough shares, and produce/relay valid Contribution Shares for it.
      As the sum of the difficulty scores for a given Prepared Block's Merkle root grows, we expect an exponential rate of convergence toward the most popular Merkle root being included in Contribution Shares.
    • Finalization Phase: after the total score approaches the 0.93 limit, rational miners begin to produce a Finalization Block
  • Verification process involves:
    • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy network difficulty target cumulatively
    • Checking reward distribution in the shared coinbase transaction
    • Checking Merkle tree to be Net
  • UTXO calculation is extended to include Shared Coinbase Transactions committed to finalized blocks on the blockchain as well
  • Attacks/forks brief analysis:
    • Short-range attacks/unintentional forks that try to change the Merkle root are as hard as they are in traditional PoW networks
    • Short-range attacks/unintentional forks that preserve the Merkle root but try to change the Shared Coinbase Transaction have zero side effects on users (as opposed to miners); and as for redistributing the shares in favor of the forking miner, they are poorly incentivized, since the gains never amount to more than something like a 2%-10% redistribution
    • Long-range attacks with a total-rewrite agenda will fail just like in traditional PoW
    • Long-range attacks with a partial coinbase rewrite are again poorly incentivized and the costs won't be justified
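Here is the sketch referenced in the outline: a minimal, non-normative Python rendering of the Coinbase Share / Shared Coinbase Transaction / Finalization Block structures and the cumulative difficulty check. Only the relative-difficulty constants (0.0001, 2%, 0.05, 0.02, 0.95) come from the outline above; field encodings, the exact hash padding, and all helper names are placeholder assumptions, not a specification.

Code:
from dataclasses import dataclass
from hashlib import sha256
from typing import List

# Relative-difficulty constants taken from the outline above; everything else
# (field encodings, hash padding, helper names) is a placeholder assumption.
MIN_SHARE_SCORE = 0.0001   # minimum difficulty score of a share
FIRST_SHARE_SCORE = 0.02   # first share's difficulty score is fixed at 2%
FINAL_FACTOR = 0.02        # Finalization Block hash: 2% of the target difficulty
PREPARED_FACTOR = 0.05     # Prepared Block hash: 5% of the target (check omitted here)
SHARES_MIN_TOTAL = 0.95    # required sum of scores in the Shared Coinbase Transaction

def h(*parts: bytes) -> int:
    """Toy stand-in for the real double-SHA256 header hash."""
    return int.from_bytes(sha256(b"".join(parts)).digest(), "big")

def relative_score(hash_value: int, block_target: int) -> float:
    """How hard a hash is, expressed as a fraction of the block difficulty."""
    return block_target / hash_value if hash_value else float("inf")

@dataclass
class CoinbaseShare:
    miner_address: bytes
    nonce: bytes
    score: float    # computed from prev-hash | merkle-root | address | nonce
    reward: float   # must be proportional to score (checked below)

@dataclass
class FinalizationBlock:
    prev_hash: bytes
    net_merkle_root: bytes        # coinbase pays fees only, no new coins
    nonce: bytes
    shared_coinbase: List[CoinbaseShare]
    block_reward: float

def verify_finalization(block: FinalizationBlock, block_target: int) -> bool:
    # 1- the Finalization Block's own hash must meet 2% of the target difficulty
    own = relative_score(h(block.prev_hash, block.net_merkle_root, block.nonce),
                         block_target)
    if own < FINAL_FACTOR:
        return False
    shares = block.shared_coinbase
    if not shares or shares[0].score != FIRST_SHARE_SCORE:
        return False
    # 2- every share meets the minimum relative difficulty
    # (a real verifier would recompute each score from the share's own fields
    #  with relative_score(); the stored value is trusted here for brevity)
    if any(s.score < MIN_SHARE_SCORE for s in shares):
        return False
    # 3- the shares cumulatively satisfy the network difficulty target
    total = sum(s.score for s in shares)
    if total < SHARES_MIN_TOTAL:
        return False
    # 4- rewards sum to the block reward and are proportional to the scores
    if abs(sum(s.reward for s in shares) - block.block_reward) > 1e-9:
        return False
    return all(abs(s.reward - block.block_reward * s.score / total) < 1e-9
               for s in shares)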

Implementation

This is a radical improvement to classical PoW, I admit, but the costs involved are fair given the huge impact and benefits. I have reviewed Bitcoin Core's code and found it totally feasible and practical from a purely programming perspective. Wallets could easily be upgraded to support the new algorithm as well. A series of more complicated, mostly political issues is extremely discouraging, but it is just too soon to give up and go for a fresh start with a new coin, or to settle for an immature fork with little support, imo.

Before any further decisions, it would be of great value to get enough feedback from the community. Meanwhile I'll be busy coding the canonical parts as a BIP for the bitcoin blockchain; I think it will take 2-3 weeks or even a bit more, because I'm not part of the team and have to absorb a lot before producing anything useful; plus, I'm not full time, yet.

I have examined the proposed algorithm's feasibility as much as I could, yet I can imagine there might be some overlooked flaws, and readers are welcome to improve it. Philosophical comments questioning the whole idea of eliminating pools don't look constructive, though. Thank you.


Major Edits and Protocol Improvements:
  • June 10, 2018 09:30 pm Inspired by a discussion with @ir.hn

    • A Prepared Block should be saved by full nodes for a period long enough to mitigate any attempt to cheat by skipping the Preparation Phase and using non-prepared, trivially generated Net Merkle Roots.
      • Full nodes MAY respond to a peer's query asking for a block's respective Prepared Block if they have decided to keep the required data long enough
      • For the latest 1000 blocks, preserving this data is mandatory.
      • For blocks with an accumulated difficulty harder than or equal to the respective network difficulty, it is unnecessary to fulfil the above requirement.*
      • The terms Prepared Block and Preparation Phase replace the original Initiation Block and Initiation Phase terms, respectively, to avoid ambiguity
      Notes:
      * This is added to let miners with a large enough hash power choose not to participate in the collaborative work.
  • reserved for future upgrades









This is a good technical proposal. Kudos!! All the issues raised by commentators can be taken into account and addressed, if not already resolved in the analytical model. More grease to your elbow!



The Shared Coinbase Transaction is not a part of the header; its hash (id) is.

All the small proof-of-work solutions have to be communicated and validated before the winning block can be communicated. So that is up to 10,000 (if the difficulty target is 0.0001) multiplied by the 64 B size of a hex-encoded SHA-256 hash, which is up to 625 KB of data that must be communicated across the network for each 10-minute period. That's not factoring in the case where the network is subdivided and miners are mining on two or more leading Prepared Blocks, in which case the network load can be double that or more.
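For reference, the arithmetic behind the 625 KB figure, taking that post's own assumptions of up to 10,000 shares per block and 64 B per communicated solution:

Code:
# Back-of-envelope for the figure above (both inputs are that post's assumptions).
shares_per_block = int(1 / 0.0001)    # 0.0001 relative difficulty -> up to 10,000 shares
bytes_per_share = 64                  # assumed size of one communicated solution
total_bytes = shares_per_block * bytes_per_share
print(total_bytes / 1024, "KiB per 10-minute block period")   # -> 625.0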

Now I do understand that these proof-of-work share solutions are communicated continuously and not all at once at the Finalized block, but you’ve got at least four potential issues:

1. As I told you from the beginning of this time-wasting discussion, the small miners have to validate all the small proof-of-work solutions, otherwise they're trusting the security to the large miner which prepares the Finalized block. If they trust, then you have a problem with non-uniform hashrate, which changes the security model of Bitcoin. And if they don't trust and attempt the validation, then they're incurring more costs than they would in pools and are further marginalized.

2. All of these solutions still have to be validated as part of the Shared Coinbase Transaction when the Finalized block is. Although the previously validated small proof-of-work solutions themselves do not have to be revalidated, the hash of all the small proof-of-work solutions has to be checked, and the miner has to verify that he has already validated each one. This is also overhead which can delay propagation, because it adds up: each node has to perform this validation step before propagating to the next node in the P2P network. You do not seem to fully appreciate how small verification steps add up during propagation into significant delays, given that your design appears to lower the effective block period to 10 and 30 seconds for the Finalized and Prepared block stages. Nodes do not propagate invalid block solutions (or any invalid data), because doing so would make the P2P network vulnerable to a DoS amplification attack.

3. Because the network can be subdivided on two or more leader blocks, the nodes no longer have an incentive to validate and propagate the solutions for a block they are not contributing small proof-of-work solutions to. Presumably they have a slightly better ROI if they always contribute to the Prepared block they received first, and not to every Prepared block they received.

4. This is for a 0.0001 difficulty target for the small proof-of-work solutions. As I already stated up-thread, this will get worse over time, as this target has to be decreased when the network hashrate grows faster than the hashrate and capital of the small miner.

As for the classical selfish mining attack itself, I personally disagree with calling it an attack at all. I'd rather see it as a fallacy, a straw-man fallacy.
My reasoning:
PoW has nothing to do with announcement. Once a miner prefers to keep his block secret, it is his choice and his right as well; he is risking his block becoming an orphan in exchange for a possible advantage over the rest of the network in mining the next block.

When you apparently do not understand the math and research paper on selfish mining (or you’re just being disingenuous?), and you start arguing philosophically and handwaving, then the time wasting discussion is terminated.

Selfish mining is always profitable for the 33+% attacker. It isn’t probably employed as an attack on Bitcoin because it increases the orphan rate and would tarnish the image of Bitcoin. So presumably the powers-that-be are not using it and they do not need to as they already have ASICBOOST and control over the 12/14/16nm ASIC fabs. So it’s not in their interest to deploy the attack on Bitcoin. But that doesn’t mean it is not being deployed already on proof-of-work altcoins.

Like PoW itself, this proposal is not about prohibiting people from selfish mining; still, it is worth rephrasing the above reasoning somewhat differently: this proposal is about reducing the pooling pressure and helping the network become more decentralized by increasing the number of miners. How? By reducing the variance of mining rewards, which is one of the 2 important factors behind this pressure (I will come back to the second factor soon).

My point, which you seem to be trying your best to obfuscate, is that, AFAICT, your design makes selfish mining much worse. I posit that it lowers the 33+% threshold the miner needs to attack the network with selfish mining, thus further lowering the security. And it will be harder to detect, because your design AFAICT drastically increases the orphan rate.

I am even wondering whether your design will reliably converge on a longest chain. And especially as the 10,000 factor is pushed to 1 million as the market capitalization of Bitcoin grows, then surely your design will fall flat on its face.

AFAICS, you’re fighting a losing battle against the invariants of physics. The game theory flaws multiply as you attempt to put decentralization into a paradigm that is inherently centralizing.

All time expended trying to decentralize proof-of-work is time wasted thrown down a rathole. Proof-of-work is a centralization paradigm. There will be no escape.

For a traditional winner-takes-all PoW network like bitcoin, there is just one piece of information (the fresh block) that causes the problem, true, but the weight of this information and the resulting premium is very high, and it is focused in one spot: the lucky miner at the focal point and its neighbors in the hot zone.

For this proposal, this premium is distributed more evenly, tens of thousands of times over.

Oops! There is almost no proximity premium flaw in Proof of Collaborative Work!

As I have posited with my incomplete list of concerns above, there will likely be game theory flaws lurking that you do not expect. I don’t want to expend the effort do more than handwave about those I posited. Apology but I’m very pessimistic about what more can be accomplished with proof-of-work.

There’s no way around the invariants of physics and economics that make proof-of-work inherently centralizing.

And I don’t want to expend my time on this. Sorry. I already expended many months contemplating the variants of designs and realized there’s no escape from the invariants. You can continue to invent new obfuscations for yourself over and over again until you finally come to realize the same. Good luck.


@anunymint Your technical evaluations and criticisms are highly valued. Two heads are better than one. Of course there are issues of centralization with PoW that are already being exploited, but there shouldn't be any loss of hope in addressing them technically, from scratch or even with an add-on. There is always a way around, an escape, and that has been driving new Physics and technological innovations. It's encouraging to let @aliashraf's code be completed and reach a testing stage where unaddressed flaws in his analytical model will be discovered and solved. Don't lose hope in continuing your technical analysis of the proposal, even when you are feeling a sense of obfuscation.

MISERICORDAE PROJECT
Jr. Member
Activity: 56
Merit: 3
June 12, 2018, 06:45:43 PM
#42

There is always a way around, an escape, and that has been driving new Physics and technological innovations.

Sorry no. You are handwaving.

I do not buy into false hopes. There are invariants here which cannot be overcome.

Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end? 

MISERICORDAE PROJECT
Jr. Member
Activity: 56
Merit: 3
June 12, 2018, 07:33:49 PM
Merited by anunymint (1)
#43

There is always a way around, an escape, and that has been driving new Physics and technological innovations.

Sorry no. You are handwaving.

I do not buy into false hopes. There are invariants here which cannot be overcome.

Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end?  

...Much better for me if the competition wastes time on an insoluble direction, while I am working on a promising one...


Not really aware of a competition. Is the promising one you are working on a solution to the issues of Bitcoin with an entirely new algorithm?

manfredmann
Member
Activity: 518
Merit: 21
June 13, 2018, 10:27:52 AM
#44

Well, it sounds good to get rid of pools for mining crypto in order to promote individual or small-scale mining, where mining earnings could reach an optimal profit. Shared mining profit tends to favor the mining team's developers, while the actual miners only get a percentage of it. We should promote and create a mining opportunity that lets miners make a good profit from doing it.
aliashraf (OP)
Legendary
Activity: 1456
Merit: 1174
June 13, 2018, 05:47:23 PM
Last edit: June 13, 2018, 07:19:17 PM by aliashraf
#45

    There is always a way around, an escape, and that has been driving new Physics and technological innovations.

    Sorry no. You are handwaving.

    I do not buy into false hopes. There are invariants here which cannot be overcome.

    Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end?  

    ...Much better for me if the competition wastes time on an insoluble direction, while I am working on a promising one...


    Not really aware of a competition. Is the promising one you are working on a solution to the issues of Bitcoin with an entirely new algorithm?
    I'm sorry to say this, but I think we have been trolled by @anunymint.

    PoW is one of the most important innovations in modern history (kudos, Satoshi). It is a very irresponsible decision to abandon it because of some flaws and limitations, claiming every flaw to be an inherent, essential one and jumping back to a pre-crypto, failed, subjective alternative (like reputation-based systems), often rebranded using the very terminology of Satoshi Nakamoto and bitcoin!

    I'm not against change; on the contrary, I strongly support any new idea, whenever and by whomever it comes. But I personally feel good about a change when it is suggested (at least mainly) to help people do something better, not as an instrument in the hands of an opportunist who has found, or been informed about, a weakness in a newly born technology and, instead of trying or helping to fix it, launches a hypocritical campaign just to sell us his creepy name or to convince his dad that he is a genius, ... whatever.

    I'm not that kind of person. It's tempting to take advantage of the weaknesses and flaws of a system, but I don't like such a miserable life. This proposal is a fix, not a hypocritical alternative to PoW.

    It is a fix for a series of important challenges facing bitcoin and PoW networks, and it deserves decent reasoning and discussion instead of trolling and vandalism.

    To understand how unacceptable that kind of behavior is, it helps to understand the importance and beauty of the subject, imo. Let's take a look:

    1- It fixes pooling pressure, the most serious centralization threat to bitcoin, by:
    • eliminating the (solo) mining variance flaw by dividing mining into 3 phases; in the most important one, the Collaboration phase (the second one), where 98% of the block reward is distributed, miners can contribute partially and directly to the PoW process tens of thousands of times more easily.
    • eliminating the proximity premium flaw by distributing the 'new block found' information across tens of thousands of points in the network and incentivizing the simultaneous announcement of this information.

    2- Although this proposal is ready for an alpha implementation and the consequent deployment phases, it is too young to be thoroughly understood for its other impacts and applications, the ones it is not primarily designed for. As some premature intuitions I can list:
    • It seems to be a great infrastructure for sharding, the most important on-chain scalability solution.
      The current situation with pools makes sharding almost impossible: when +50% of the mining power is centralized in the palms of a few pools (5 for bitcoin and 3 for Ethereum), the problem isn't just security and vulnerability to cartel attacks, as is usually assumed; more importantly, it is a prohibiting factor for implementing sharding (and many other crucial and urgent improvements).
      If my intuition proves correct, it would have a disruptive impact on the current trend that prioritizes off-chain over on-chain scalability solutions.
    • This protocol can probably offer a better basis for signaling and autonomous governance solutions
    • {TODO: suggest more}

    A thorough analysis of the details suggested in the design would convince a non-biased reader that this proposal is well thought out and is not so immature as to invite a trivial, slam-dunk rejection; on the contrary, considering the above features and promises, and the importance of pooling pressure as one of the critical flaws of bitcoin, it deserves a fair, extensive discussion.

    Now, when someone comes and ruins such a decent topic, like what @anunymint did here, by repeating nonsense objections and never being convinced no matter what, it can be either due to his naivety, or a result of him being obsessively biased by his own history in the public sphere, which is full of Proof-of-everything-other-than-Work obsessions and vague claims about PoW being a boring, old-fashioned, weak system, doomed to be centralized, vulnerable to every possible attack vector, blah, blah, blah ... a history he has trapped himself in, or both.

    I vote for the second option for this guy, but if he is really smart, he should take the load (of his own history) off his shoulders and be ready to revise and improve.
    Carlton Banks
    Legendary
    Activity: 3430
    Merit: 3074
    June 13, 2018, 07:17:51 PM
    #46

    Proof of everything other than Work

    Annoymint doesn't like the implications of proof of work; he's been claiming for 5-6 years that he's working on a "blockchain breakthrough", but never proves he's working on anything Smiley


    @Annoymint, you need to start a new Bitcointalk user called "Proof of everything other than work"

    Vires in numeris
    aliashraf (OP)
    Legendary
    Activity: 1456
    Merit: 1174
    June 13, 2018, 07:25:15 PM
    #47

    Proof of everything other than Work

    Annoymint doesn't like the implications of proof of work; he's been claiming for 5-6 years that he's working on a "blockchain breakthrough", but never proves he's working on anything Smiley


    @Annoymint, you need to start a new Bitcointalk user called "Proof of everything other than work"

    I see: being trapped by our own narration is a very common threat for all of us. I guess we have to do some kind of meditation, or Zen, to avoid it or to heal.
    aliashraf (OP)
    Legendary
    Activity: 1456
    Merit: 1174
    June 14, 2018, 12:56:00 AM
    Last edit: June 14, 2018, 01:23:42 AM by aliashraf
    #48

    Why is the thread being derailed by some comments about me? What does anything I did or did not do have to do with the discussion of the technological facts of PoCW.
    Really nothing, besides the need to stop you from trolling.

    Quote
    he's been claiming for 5-6 years that he's working on a "blockchain breakthrough"

    I challenge you to quote from the past where I extensively made such a claim 5 or 6 years ago.

    EDIT: {and a long story about what you have been up to in the last 5-6 years}
    In any case, I welcome your ridicule. It motivates me. Please do not stop. And please do report me to @theymos so this account can be banned so I stop wasting time posting on BCT.

    Like this. Please ... just put an end to this, if you may. You did something inappropriate, and some objections were made about it. Let it go.

    Quote
    My technological comments stand on their own merits regardless what ever is done to cut my personal reputation down.


    Absolutely not. You questioned the overhead of the validation process for miners in my proposal and I answered it solidly: there is no overhead, because there is no I/O involved, because the submitted contribution shares have exactly the same Merkle root that has already been evaluated (once, when the Prepared Block was evaluated by the miner deciding to contribute to it).

    Only a troll keeps repeating this question over and over, in an aggressive way full of insults and hype.

    A decent contributor acting in good faith might show up doubtful about predicates like 'there is no I/O', 'the Merkle tree does not have to be validated', 'the shares share a common Merkle tree', ... but this time with less confidence in the validity of his position, because s/he understands that there is a huge possibility that her/his doubts could be removed trivially by the designer of the protocol, simply by posting a few references to the original proposal. That is actually exactly the case here, because all three of the predicates under consideration are true by design.

    Once the doubts turn out to be unnecessary, the discussion can go a step forward. It is not a war; there is nothing to conquer other than the truth.



    aliashraf (OP)
    Legendary
    Activity: 1456
    Merit: 1174
    June 14, 2018, 09:56:54 AM
    #49

    Why is the thread being derailed by some comments about me? What does anything I did or did not do have to do with the discussion of the technological facts of PoCW.

    Really nothing, besides the need to stop you from trolling.

    Please define trolling and then show that I did it. Specific links or quotes please.
    No need to go that far: this post of yours is 90%+ nothing other than trolling.
    Quote
    he's been claiming for 5-6 years that he's working on a "blockchain breakthrough"

    I challenge you to quote from the past where I extensively made such a claim 5 or 6 years ago.

    EDIT: {and a long story about what you have been bout in 5-6 years ago}
    In any case, I welcome your ridicule. It motivates me. Please do not stop. And please do report me to @theymos so this account can be banned so I stop wasting time posting on BCT.


    Like this. Please ... just put an end to this, if you may. You did something inappropriate, and some objections were made about it. Let it go.

    What did I do that was inappropriate “5-6 years ago” that was related to “claiming […] he's working on a ‘blockchain breakthrough’”?  Specific links or quotes please.

    If you can’t specifically show that the SPECIFIC “5-6 years ago” allegation is true, then you are the one who is trolling by stating the lie, “You did something inappropriate”.

    I politely asked you to end this, but you love twisting it further ... it is trolling in the specific context we are in. I was not the one who said things about the last 5-6 years of your history, FYI.
    Quote
    My technological comments stand on their own merits regardless what ever is done to cut my personal reputation down.


    Absolutely not. You questioned the overhead of the validation process for miners in my proposal and I answered it solidly:

    I said my “My technological comments stand on their own merits regardless what ever is done to cut my personal reputation down”.

    That does not mean I claim “My technological comments” are unarguable. Only that “my personal reputation” has nothing to do with the discussion of the technology.

    This is another, and the most important, form of trolling you commit, repeatedly. Your argument here is void and makes no sense:
    Once an objection is made and proves to be irrelevant or false, and the proposal addresses the asserted issues, it should be dropped; not kept alive the way you do it, where every issue remains open forever and can be used as a toy by trolls making false claims whenever they wish.
    Quote
    There is no overhead because there is no I/O involved, because the submitted contribution shares have exactly the same Merkle root that is already evaluated (once, when the Prepared Block has been evaluated by the miner when he decided to contribute to it afterwards).

    I already refuted that line of logic in that the ratios over time have to either increase, or the capitalization of the miner within your current 10,000 factor will place them in the crosshairs of having to kowtow to the oligarchy.
    And I punted on the entire concept, because I stated that mining is becoming ever more centralized, so it's pointless and futile to try to make a protocol for small miners.

    Another example of trolling: after you have been clearly informed about the negligible cost of validating shares, instead of closing the case and moving on, you simply deny everything by bringing forward a very weak argument to keep the issue open no matter what. You can't help it; you need issues to remain open forever so you can use them to ruin the topic.

    In this case, you are saying that future increases in network hash power will have to be compensated by increasing the number of shares, and that this will eventually become problematic. IOW, you are saying that 2-3 years from now the hashrate will probably double and small miners will again experience the variance phenomenon; then devs will improve the protocol and double the number of shares via a hard fork, and this increase would somehow prove that verification of shares is a weakness!

    Firstly, doubling or tripling the number of shares doesn't create a significant problem in terms of share-validation costs; it remains a CPU-bound process, and in the worst case some very low-profile nodes may need to spend $200 or so on a better processor.

    Secondly, along with increases in network hash power, although the relationship is nonlinear, we will also see improvements in mining devices and their efficiency.
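    As a rough way to put numbers on the claim that share validation stays CPU-bound and negligible, one can time the double-SHA256 checks a node would perform for a block's worth of shares. The share counts below simply mirror the 0.0001 target and the doubling/tripling discussed above, and the 100-byte serialized share size is an assumption of this sketch, not a protocol figure:

Code:
import hashlib, os, time

def sha256d(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

SHARE_SIZE = 100  # assumed serialized size of one share, in bytes

for n_shares in (10_000, 30_000):              # ~0.0001 target, then tripled
    shares = [os.urandom(SHARE_SIZE) for _ in range(n_shares)]
    start = time.perf_counter()
    for share in shares:
        sha256d(share)                          # recompute the hash; comparing it
                                                # against the target is negligible
    elapsed = time.perf_counter() - start
    print(f"{n_shares} share hashes checked in {elapsed * 1000:.1f} ms")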

    Quote
    Only a troller misrepresents to the thread what I wrote in the thread as explained above.

    Now I am done with you.

    Bye.


    I should have stuck to my first intuition and never opened the thread. Or certainly never had posted after I read that horrendously bad OP description of the algorithm. That was indicative of the quality of the person I am interacting with unfortunately. I have learned a very important lesson on BCT. Most people suck (see also and also!). And they don’t reach their potential. The quality people are very few and far between, when it comes to getting serious work done.


    See? You are insulting me, my work, bitcointalk, and its members ... very aggressively, at the end of the same post in which you ask for evidence that you are a troll! I can imagine you may reply like this:
    "I never said I'm not a troll, I just wanted you to give evidence of it, so I 'maintain' my inquiry for evidence. This issue, whether I am a troll or not, is open just like all the other issues we have been arguing about."!

    Quote
    In a lighter social setting a wider array of people can be tolerated (especially when we do not need to rely on them in any way).
    Tolerance is good but trolling is not among ones that are to be tolerated, imo.
    aliashraf (OP)
    Legendary
    Activity: 1456
    Merit: 1174
    June 14, 2018, 07:58:07 PM
    #50


    I don’t think my argument is weak. I think my analyses of your design is 100% spot on correct. And I encourage you to go implement your design and find out how correct I am! Please do!

    You continue to not mention the point I made about incremental validation overhead and accumulated propagation delay and its effect on orphan rate, especially when you have effectively decreased the block period to 15 seconds for the Finality phase Schelling point and 30 seconds for the Prepared block Schelling point.

    And you continue to not relate that I also pointed out that as transaction fees go to $50,000, with Lightning Network Mt. Gox-style hubs dominating settlements in the 1 MB blocks (or pick any size you want which does not spam the network with low transaction fees, because the miners will never agree, and unlimited block sizes drive the orphan rate up and break security), the active UTXO set will shrink because most people can't afford to transact on-chain. Thus the MRU UTXO will be cached in L3 SRAM. And the block will have huge transactions, and not many transactions. Thus your entire thesis about being I/O bound on transaction validation will also be incorrect.

    You can’t seem to pull all my points together holistically. Instead you want to try to cut a few of them down piece-meal out-of-context of all the points together.
    I remain silent about the trolling part, I'm realising you can't help it and it is just unintentional behavior of a  polemicist when things get too intense.

    Let's take a look at the technical part of your reply:
    1- There is no incremental overhead. I've never mentioned any incremental increase/decrease (enforced by the protocol or by scheduled forks) in the proposed parameters, including the relative difficulty of contribution shares. I have to confess, though, that I'm investigating this possibility.
    I will keep you informed about the outcome, which will not be a simple linear increase with network hashpower, anyway.

    2- Also, propagation delay won't accumulate even if we were to increase (incrementally or suddenly) the driving factors behind the number of contribution shares, because the validation cost is and remains negligible for nodes. Remember? The client software is I/O-bound and contribution share validation is CPU-bound (I'll come to your new objection about this later).

    3- I am not 100% against your analysis of Lightning, nor in favor of it; I'm not that interested in, or much of a believer in, LN as a scaling solution, but it won't help your position in this debate:

    Your argument:
    You are speculating that transactions will move off-chain in the future, that the main chain will be busy processing huge transactions produced by flush operations of LN nodes, and that at the same time network nodes will manage to keep the UTXO set (its most recently used part) in SDRAM, which spares them frequent HD access, so they will no longer be (relatively) I/O bound; this way the processing overhead of contribution shares begins to look more important and eventually becomes a bottleneck. Right?

    Answer:
    • You are speculating TOO much here; my perception of LN and off-chain solutions differs moderately
    • Having the MRU UTXO in an SDRAM cache won't help that much; the task would still stall on RAM access, and would still hit the HD for page faults and, most importantly, for writing to the UTXO set after the block has been verified
    • Also, a relative improvement in a node's performance when validating full blocks is not a disaster; the number of blocks is the same as always

    4- As for your expectation that I not cut your objections down into pieces: that is like asking me to troll against a troll. On the contrary, I prefer to get more specific and resolve issues one by one, while you want to keep the discussion at the ideological level, being optimistic or pessimistic about this or that trend or technology, and so on ... I think that, in the context of assessing a proposed protocol, my approach is more practical and useful.

    Quote
    I specifically mentioned order-of-magnitude readjustments in the future. There you go again twisting my words and being disingenuous.

    Firstly, doubling or tripling the number of shares doesn't create a significant problem in terms of share-validation costs; it remains a CPU-bound process, and in the worst case some very low-profile nodes may need to spend $200 or so on a better processor.

    You're ignoring that I argued that your thesis on transaction-validation-bounded delay will also change, and the validation of the shares will become relatively more significant. And that has to be taken together with the 15-second effective Schelling points around which orphaning can form. You're not grokking my holistic analyses.

    There will be no order-of-magnitude (i.e. tens or hundreds of times) increase in network hash power in the foreseeable future, and I did you a favor by not simply rejecting this assumption; instead I tried to address more probable scenarios, like a 2-3x increase in the next 2-3 years or so.
    Although it is good to see the big picture and take cumulative effects into consideration, it won't help if you don't have a good understanding of each factor and its importance.

    You are saying something like:
               look! there are so many factors to be considered, isn't this terrifying?
    No! It is not terrifying, as long as we are able to isolate each factor and understand it deeply, instead of being terrified, or terrifying people, by it.
    aliashraf (OP)
    Legendary
    Activity: 1456
    Merit: 1174
    June 15, 2018, 10:43:15 AM
    Last edit: June 15, 2018, 11:38:32 AM by aliashraf
    #51

    @anunymint
    I appreciate the fact that you spend considerable time on this subject; it is good evidence that makes me even more convinced that:
    1- You have good faith, and as for the trolling part of your writings, you just can't help it, so I should be an order of magnitude more tolerant with you
    2- You are smart and have been around for a long time, a good choice for chewing on a complicated proposal like PoCW. Again, more tolerant, as tolerant as possible and more ... I should repeat that and keep it in mind

    Quote
    I was nearly certain I had already mentioned it up-thread, but couldn't quickly find it to quote. So let me recapitulate that your PoCW design proposes to put 10,000 times (and I claim eventually 100,000 and 1 million times) more proof-of-work hashes in the blockchain that have to be validated.

    Nope. It is about 10,000, and it will remain in that neighborhood for a long, long time; reaching 100,000 would take a century or so! I have already described this: I have no plan, and there won't be such a plan, to increase this number linearly with the network hashrate.
    This proposal, with its current parameters, is about making solo mining 100,000 times more convenient right now; that is a good improvement regardless of what happens in the next few years and how we deal with it.
    Quote
    This is going to make objectively syncing a full node (especially for a new user who just downloaded the entire blockchain since the genesis block) incredibly slow unless they have very high capitalization ASICs. And the asymmetry between commodity single CPUs and mining farms of ASICs will widen, so perhaps eventually it becomes impractical for a CPU to even sync from genesis. So while you claim to improve some facets, your design makes other facets worse. {*}
    Unlike what you suggest, ASICs won't help with syncing the blockchain. Full nodes are not ASICs and never use ASICs to validate a hash; they just compute the hash with their own CPU!
    SHA256 and other hash puzzles have exactly the asymmetry that matters here: solutions are expensive to find but take negligible time and resources to verify; it is basic computer science.
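    This asymmetry can be shown directly: verifying a claimed solution is a single SHA-256d evaluation plus a comparison, while finding one takes on the order of 1/p attempts. A toy sketch, with an artificially easy target so it runs instantly (real targets are astronomically harder):

Code:
import hashlib, os

def sha256d(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

TARGET = 2 ** 240          # toy target: roughly 1 in 65,536 hashes qualifies

def verify(header: bytes) -> bool:
    """Verification is one hash evaluation, regardless of the difficulty."""
    return int.from_bytes(sha256d(header), "big") < TARGET

prefix = os.urandom(76)    # stand-in for the rest of a block header
attempts = 0
while True:                # 'mining': expected ~65,536 attempts at this target
    attempts += 1
    candidate = prefix + attempts.to_bytes(4, "little")
    if verify(candidate):
        break
print(f"found after {attempts} attempts; re-verifying it costs a single hash")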

    Quote
    {*} Including you continue to deflect the correct criticism that your design very likely makes selfish mining worse thus reducing an attack on economic security from 33% of the hashrate to an even lower threshold. The onus is on you to write a detailed whitepaper with formal security proofs. Not on me to write proofs that your design has such failure modes.

    One polite request, if I may: please stay focused as much as possible. I quoted this from where I cut the previous quote and inserted {*}. It is really hard to be productive this way, jumping from one objection to another with little or no relation between them.
    Actually, I have answered this before.

    Retrying:
    • This proposal discourages selfish mining and, more generally, any consequence of the proximity premium flaw, including but not limited to mining pressure, which is its design goal.
    • To understand how PoCW makes this huge improvement, one should note that the proximity premium flaw is about nodes having access to important, valuable information sooner than their competitors do, so that they may have a chance to take advantage of this premium, intentionally or not.
      In worst-case scenarios like selfish mining, the privileged node(s) may decide to escalate this situation by deliberately keeping that information private for longer, instead of relaying it according to the protocol.
    • One should also be aware of the nature of this information: it is always the fact that a block has been found, plus the details of this discovery.
    • Proof of Collaborative Work, this proposal, addresses this issue for the first time by distributing the critical information over tens of thousands of possible points in each round across the network. It is smart because, instead of eliminating the privilege of being close to the source of information, it distributes that privilege almost evenly among the participants.
    • It is also very important to note how this feature of the proposal relaxes any doubts about propagation delay; it becomes a much smaller threat to the security of the network:
      The proposal has another secret weapon worth mentioning here: it incentivizes sharing information, because the mechanisms provided to help finalize the information suspend its actual value.
      This way, not only have the focal points been distributed across the network 100,000 times more widely, the distributed information is not even finalized yet. This makes it far less attractive to keep it secret.
    • The above property provides an excellent self-regulating mechanism: for every bit of hypothetical overhead it generates for complying nodes, and every microsecond of propagation delay it causes, there is a considerable decrease in the network's security vulnerability to propagation delay.

    I appreciate your concerns, and if you manage to post a reply focused on something like this, or on any other technical concern regarding the proposal, I'll send merits to you again. I just can't do it right now, as you don't stop being generally and totally negative and aggressive, acting like a warrior who is fighting against ... against what or whom, really?


    As a conclusion and brief summary:
    PoCW practically fixes one of the best-known flaws of traditional PoW, the proximity premium. This has various important consequences besides its direct effect on mining pressure, including but not limited to discouraging selfish mining.

    Quote
    I'm quite confident that many experts already considered your design. I know I did circa 2014 or 2015. And we dismissed it without discussing it in great detail in public because it was just not a worthwhile design to pursue. However, I do think that if you read every post on bitcointalk.org (of which I can claim I have probably read more than most people reading this), then you will find discussions proposing designs analogous to yours. And they were shot down for the reasons I am providing to you now. If you want to look for fertile design ground, you need to look away from the obvious things that everybody was trying to solve years ago. I raised the pools bugaboo incessantly in 2013/14. You're like 5 years too late to the party. That ship sailed already.

    That is not true as a general rule; actually, imo, it is not even occasionally true. Technology doesn't trend on just one factor, the smartness of inventors or advocates; far more importantly, it is driven by interests and enthusiasm.

    I started this thread by giving a brief historical perspective of how pools have been developed conceptually and practically. It was driven by ignorance and greed as I've concluded there.

    If you are right, and there is a mysterious proposal somewhere in the history similar to mine, I'm sure it was abandoned not because it was impractical. My analysis suggests that with Satoshi gone, people were left in the hands of junior programmers who were not committed enough to decentralization, on one hand, and greedy pool operators who did anything to take advantage of bitcoin on the other: centralized parasites grown on a decentralized infrastructure, like what Google, Facebook, ... are to TCP/IP and the Internet.

    Now, I managed to design this algorithm not because I'm very smart and can outperform all of the advocates and developers; I'm just more concerned about centralization than many of these guys, who have gotten rich enough that many of them are already retired and, instead of doing their job, are just pretending, while the rest are now investigating how to take advantage of the centralized situation to get even richer. They just don't care.

    There should be a voice for people, for the fresh people who join after being promised a better world, a voice for average and small miners and hobbyists who want to be part of a fair business, free of corporations and pools. And guess what? They are a force, a driving force, after all that we have been through, after Bitmain.

    This community is becoming more aware, and a driving force is pushing for decentralization; that is the true reason someone like me is so confident about the future. It is no longer 2010; things have happened, and there is a shift that encourages and dictates decentralization.

    I'm not a genius. I'm just a dedicated programmer/software engineer who tries to think outside the box, asks crucial questions, cannot be satisfied easily by stupid arguments, and doesn't give a sh*t about how history happened to bring us this misery. A person with enough courage to ask how it could happen and how we now have to deal with this mess.

    Hereby I ask for help from brave, dedicated developers and advocates: join this proposal, improve it, implement it, and kick these folks out of the decentralized ecosystems. I strongly believe they would be better off investing in or working for Google ...
    Vitalik in particular is a good candidate; he is already invited

    Quote

    3. I expect your design makes the proximity issue worse. I had already explained to you a hypothesis that the Schelling points in your design will cause network fragmentation. Yet you ignore that point I made. As you continue to ignore every point I make and twist my statements piece-meal.

    Good objection.
    The Schelling points (the transition points from Preparation to Contribution, and from Finalization to the next round) have a 7% value cumulatively (5% for the first and 2% for the second point). That is low enough, and it is not entirely at stake either:

    For the first 5% part, the hot zone (the miner and its neighboring peers) is highly incentivized to share it asap, because it is not finalized and is practically worth nothing if it doesn't get enough support to find its way to finalization. Note that the neighbors are incentivized too: if they want to join the dominating current, they need their own shares to be finalized as soon as possible, and that requires the Prepared Block to be populated, even though it is not theirs.

    For the second Schelling point, the 'finalized block found' event, worth 2% of the block reward, hesitating to relay the information carries a very high risk of being orphaned by other competitors (for the lucky miner) and of mining orphan shares/final blocks (for the peers).

    I understand you have a feeling that more complicated scenarios could be feasible, but I don't think so, and until somebody actually presents such a scenario, we had better not be afraid of it.

    I'm aware that you are obsessed with some kind of network split, because you think selfish mining is a serious vulnerability and/or propagation delay is overwhelming.
    The network won't be divided, either intentionally or as a result of propagation delay; and if you are not satisfied with my assessment of propagation delay, you should recall my secret weapon: incentivizing nodes to share their findings as fast as possible, to the extent that they will give the task high priority. They will dedicate more resources (both hardware and software) to the job.

    Quote
    2- Although this proposal is ready for an alpha version implementation and consequent deployment phases, it is too young to be thoroughly understood...

    Correct! Now if you would just internalize that thought, and understand that your point also applies to your reckless (presumptuous) overconfidence and enthusiasm.

    Here you have 'trimmed' my sentence to do exactly what you repeatedly accuse me of. I'm not talking about other people not being smart enough to understand me and/or my proposal.
    I'm talking about the limits of pure imagination and discussion when it comes to the consequences of a proposal, any proposal, once it might be implemented and adopted.
    Why would you tear my sentence apart? The very sentence that you go on quoting. Isn't that an act of ... let's get over such things, whatever.
    Quote
    ...for its other impacts and applications, the ones that it is not primarily designed for.  As some premature intuitions I can list:

    • It seems to be a great infrastructure for sharding , the most important onchain scalability solution.
      The current situation with pools makes sharding almost impossible, when +50% mining power is centralized in palms of few (5 for bitcoin and 3 for Ethereum) pools, the problem wouldn't be just security and vulnerability to cartel attacks, unlike what is usually assumed, it is more importantly a prohibiting factor for implementing sharding (and many other crucial and urgent improvements).
      If my intuition might be proven correct, it would have a disruptive impact on the current trend that prioritizes off chain against on chain scalability solutions.
    • This protocol probably can offer a better chance for signaling and autonomous governance solutions

    In the context of the discussion of OmniLedger, I already explained that it can’t provide unbounded membership for sharding, because one invariant of proof-of-work is that membership in mining is bounded by invariants of physics. When you dig more into the formalization of your design and testing, then you’re going to realize this invariant is inviolable. But for you now you think you can violate the laws of physics and convert the Internet into a mesh network. Let me link you to something I wrote recently about that nonsense which explains why mesh networking will never work:

    https://www.corbettreport.com/interview-1356-ray-vahey-presents-bitchute/#comment-50338
    https://web.archive.org/web/20130401040049/http://forum.bittorrent.org/viewtopic.php?id=28
    https://www.corbettreport.com/interview-1356-ray-vahey-presents-bitchute/#comment-50556
    I'll check your writings about sharding later; thanks for sharing. But as I have mentioned here, these are my initial intuitions, provided to show the importance and beauty of the proposal and the opportunities involved. I just want to point out how pointless it is to simply fight it, instead of helping to improve and implement it.
    Quote
    A thorough analysis of the details suggested in the design, would convince non-biased reader that this proposal is thought enough and is not that immature to encourage anybody to attempt a slam dunk and reject it trivially, on the contrary considering the above features and promises, and the importance of pooling pressure as one of the critical flaws of bitcoin, it deserves a fair extensive discussion.

    https://www.google.com/search?q=site%3Atrilema.com+self-important
    https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Matthew-Laine-1
    https://medium.com/@shelby_78386/if-you-want-the-country-to-be-less-polarized-then-stop-writing-talking-and-thinking-about-b3dcd33c11f1


    Now you are just fighting (for what?) ...
    You accuse me of having this or that personality, of being over-confident, ... whatever. Instead, I suggest you provide more illuminating points and objections and make me reconsider parts of the proposal, rather than repeating just one or two objections while playing out your Game of Thrones scenes.

    Well, that was a hell of a post to reply to. I'll be back to it later.
    Cheers
    tromp
    Legendary
    *
    Offline Offline

    Activity: 980
    Merit: 1088


    View Profile
    June 15, 2018, 11:27:45 AM
     #52

    • Verification process involves:
      • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to confirm that they cumulatively satisfy the network difficulty target
    This is a serious problem with your proposal. The proof of work is not self-contained within the header.
    It requires the verifier to obtain up to 10000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
    The Shared Coinbase transaction is typically 32 kB of data (an average of 4500 items)

    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    BUT, the expected highest weight among these shares is close to 0.5 !
    (if you throw 5000 darts at a unit interval, you expect the smallest hit near 1/5000)

    So rather than summarize a sum weight of 1 with an expected 5000 shares,
    it appears way more efficient to just summarize a sum weight of roughly 0.5 with the SINGLE best share.
    But now you're essentially back to the standard way of doing things. In the time it takes bitcoin to find a single share of weight >=1, the total accumulated weight of all shares is around 2.

    All the overhead of share communication and accumulation is essentially wasted.
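    The dart-throwing intuition above is easy to check numerically. Below is a rough Monte Carlo sketch; the weight model (10^-4 divided by a uniform draw on (0,1], so the best share of a round corresponds to the smallest draw) and all names are my own assumptions for illustration, not part of the proposal.
    Code:
    import random, statistics

    MIN_W = 1e-4        # minimum accepted share weight (10^-4 of the full block difficulty)
    N = 5000            # number of shares considered in the argument above
    TRIALS = 2000       # independent simulated rounds

    # Each share's "luck" is modelled as a uniform draw on (0, 1]; the best share of a
    # round corresponds to the smallest draw, and E[min of N uniforms] = 1/(N+1).
    mins = [min(1.0 - random.random() for _ in range(N)) for _ in range(TRIALS)]
    mean_min = statistics.mean(mins)

    print(f"average smallest draw: {mean_min:.6f}   (theory: {1/(N+1):.6f})")
    print(f"weight of the corresponding best share: {MIN_W / mean_min:.2f}")  # ~0.5, as argued above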
    aliashraf (OP)
    Legendary
    *
    Offline Offline

    Activity: 1456
    Merit: 1174

    Always remember the cause!


    View Profile WWW
    June 15, 2018, 12:01:51 PM
    Last edit: June 15, 2018, 01:03:23 PM by aliashraf
     #53

    • Verification process involves:
      • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to confirm that they cumulatively satisfy the network difficulty target
    This is a serious problem with your proposal. The proof of work is not self-contained within the header.
    It requires the verifier to obtain up to 10000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
    The Shared Coinbase transaction is typically 32 kB of data (an average of 4500 items)

    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.
    To be exact, it is 0.93 that has to be exceeded.
    Quote

    BUT, the expected highest weight among these shares is close to 0.5 !
    (if you throw 5000 darts at a unit interval, you expect the smallest hit near 1/5000)

    Yes. To be more exact, as the shares are randomly distributed in the range from 0.0001 up to 0.93, the median would be 0.465.
    Yet that is not the highest difficulty, just the median.
    Quote

    So rather than summarize a sum weight of 1 with an expected 5000 shares,
    it appears way more efficient to just summarize a sum weight of roughly 0.5 with the SINGLE best share.
    But now you're essentially back to the standard way of doing things. In the time it takes bitcoin to find a single share of weight >=1, the total accumulated weight of all shares is around 2.

    All the overhead of share communication and accumulation is essentially wasted.

    As you mentioned, that is closer to what traditional bitcoin already does, and it is exactly what I'm trying to fix. It is not collaborative and, as has been shown both theoretically and experimentally, it is vulnerable to centralization. The same old winner-takes-all philosophy leaves no space for collaboration.

    As for the 'overhead' issues, this has been discussed before. Shares are not like conventional blocks; they take negligible CPU time to validate and negligible network bandwidth to propagate (a minimal verification sketch follows at the end of this post).

    EDIT:
    I have to take back my above calculations: some blocks may have as few as 2 shares and some may have as many as 9301 shares to satisfy the difficulty, which would put the average number of shares per block at around 4650 over a large number of rounds. The highest possible share weight is 0.93 and the lowest is 0.0001; I have not calculated or tried to calculate any further statistics so far.
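    To make the claim about validation cost concrete, here is a minimal sketch of what cumulative share verification could look like. The share format (just a header blob), the double-SHA256 hashing, the 10^-4 minimum weight and the names Share, MIN_SHARE_WEIGHT and verify_shares are illustrative assumptions for this sketch, not the proposal's normative specification; the required cumulative sum is shown as 0.93 per the contribution split discussed in this thread.
    Code:
    import hashlib
    from dataclasses import dataclass
    from typing import List

    MIN_SHARE_WEIGHT = 1e-4          # shares easier than 10^-4 of block difficulty are rejected
    REQUIRED_SUM = 0.93              # cumulative weight the shares must reach (per this thread)
    BLOCK_TARGET = 1 << 224          # hypothetical full-difficulty target, for illustration only

    @dataclass
    class Share:
        header: bytes                # the miner's header bytes for this share

    def share_weight(header: bytes, block_target: int = BLOCK_TARGET) -> float:
        """Weight = block target divided by the share's hash value: a hash exactly at the
        block target has weight 1, a hash 10^4 times easier has weight 10^-4."""
        h = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        return block_target / int.from_bytes(h, 'big')

    def verify_shares(shares: List[Share]) -> bool:
        """One double-SHA256 per share, then a sum: a few thousand hashes in total."""
        total = 0.0
        for s in shares:
            w = share_weight(s.header)
            if w < MIN_SHARE_WEIGHT:
                return False         # an under-difficulty share invalidates the block
            total += w
        return total >= REQUIRED_SUM
    The computation itself is indeed trivial; as tromp points out above, the real cost is fetching and relaying the extra kilobytes of share data, not hashing them.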
    aliashraf (OP)
    Legendary
    *
    Offline Offline

    Activity: 1456
    Merit: 1174

    Always remember the cause!


    View Profile WWW
    June 15, 2018, 12:11:39 PM
     #54

    @anonymint

    keep cool and remain focused, ...

    Unfortunately, your last post was of little substance in terms of putting enough meals on the table; instead you are continuing your holy war (against what?) with inappropriate language, as usual.

    Please take a break, think for a while, and either leave this discussion (as you repeatedly promise) or improve your attitude.

    will be back  Wink
    tromp
    Legendary
    *
    Offline Offline

    Activity: 980
    Merit: 1088


    View Profile
    June 15, 2018, 12:12:36 PM
    Merited by anunymint (1)
     #55

    SHA256 and other hash functions are NP-Complete problems: their solutions consume negligible time and resources to be verified; it is basic "computer science"  Wink

    Hash functions are not decision problems, so they cannot be NP-complete.
    I could create a decision problem out of a hash function though.
    Something relevant for mining would look like:

    The set of pairs (p,y) where
      p is a bitstring of length between 0 and 256,
      y is a 256-bit number,
      and there exists a 256-bit x with prefix p such that SHA256(x) < y

    Such a problem is in NP.
    But it would still not be NP-complete, since there is no known way to reduce other NP problems to this one.
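    As a footnote to the above, checking a certificate for membership in this set is cheap, which is exactly what puts the problem in NP. A minimal sketch (the function name and the big-endian, fixed-width encoding of x are my own choices for illustration):
    Code:
    import hashlib

    def verify_witness(p: str, y: int, x: int) -> bool:
        """Check a certificate x for the pair (p, y): x is a 256-bit number whose
        binary expansion starts with the bit string p, and SHA256(x) < y."""
        assert len(p) <= 256 and all(c in '01' for c in p)
        x_bits = format(x, '0256b')                      # fixed 256-bit encoding of x
        if not x_bits.startswith(p):
            return False                                 # x lacks the required prefix
        digest = hashlib.sha256(x.to_bytes(32, 'big')).digest()
        return int.from_bytes(digest, 'big') < y         # the "hash below target" test

    # Trivial demonstration: with y = 2^256 every hash qualifies, so any x is a witness.
    print(verify_witness('', 1 << 256, 12345))           # True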
    aliashraf (OP)
    Legendary
    *
    Offline Offline

    Activity: 1456
    Merit: 1174

    Always remember the cause!


    View Profile WWW
    June 15, 2018, 12:40:20 PM
    Last edit: June 15, 2018, 12:51:02 PM by aliashraf
     #56

    SHA256 and other hash functions are NP-Complete problems: their solutions consume negligible time and resources to be verified; it is basic "computer science"  Wink

    Hash functions are not decision problems, so they cannot be NP-complete.
    I could create a decision problem out of a hash function though.
    Something relevant for mining would look like:

    The set of pairs (p,y) where
      p is a bitstring of length between 0 and 256,
      y is a 256-bit number,
      and there exists a 256-bit x with prefix p such that SHA256(x) < y

    Such a problem is in NP.
    But it would still not be NP-complete, since there is no known way to reduce other NP problems to this one.


    Yes, my mistake to call it NP-complete; it is in NP. In the context of this discussion, when we refer to hash functions, what we usually mean is the PoW problem (like the one you have suggested, a conditional hash-generation problem), but I should have been more precise.

    That was posted in a chaotic atmosphere, but the point still stands: verifying shares (as opposed to the Prepared Block, or its counterpart in traditional PoW, the block itself) is a trivial job by definition, because it only requires checking a witness for a problem in NP.
    tromp
    Legendary
    *
    Offline Offline

    Activity: 980
    Merit: 1088


    View Profile
    June 15, 2018, 01:12:26 PM
     #57

    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    I calculated wrong.

    n shares are expected to accumulate about n*ln(n)*10^-4 in weight, so we expect
    a little under 1400 shares to accumulate unit weight...
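    For what it's worth, the arithmetic checks out: solving n*ln(n)*10^-4 = 1 numerically lands just under 1400. A tiny sketch (fixed-point iteration is an arbitrary choice of method, and this only checks the arithmetic, not the derivation of the n*ln(n) estimate itself):
    Code:
    import math

    # Solve n * ln(n) * 1e-4 = 1, i.e. n * ln(n) = 10^4, by iterating n = 10^4 / ln(n).
    n = 1000.0
    for _ in range(50):
        n = 1e4 / math.log(n)
    print(round(n))   # ~1383 shares, i.e. "a little under 1400"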
    fr4nkthetank
    Legendary
    *
    Offline Offline

    Activity: 2294
    Merit: 1182


    Now the money is free, and so the people will be


    View Profile
    June 15, 2018, 01:15:40 PM
     #58

    Interesting, you put a lot of thought into this proposal.  I would support it and see how it goes.  The goal is really hard to reach.  The idea would be to increase difficulty to scale up operations.  Pool mining can be damaging, but one guy with a huge operation can be worse if no one can pool together.
    aliashraf (OP)
    Legendary
    *
    Offline Offline

    Activity: 1456
    Merit: 1174

    Always remember the cause!


    View Profile WWW
    June 15, 2018, 01:52:56 PM
     #59

    On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

    I calculated wrong.

    n shares are expected to accumulate about n*ln(n)*10^-4 in weight, so we expect
    a little under 1400 shares to accumulate unit weight...
    Interesting; I would appreciate it if you would share the logic behind the formula. It would be very helpful. To be honest, I have not done much work on this yet, and my initial assumption of about 4650 shares is quite naive. I was just confident that the average number of shares per block would not be any higher than that.

    Thank you so much for your contribution. Smiley
    aliashraf (OP)
    Legendary
    *
    Offline Offline

    Activity: 1456
    Merit: 1174

    Always remember the cause!


    View Profile WWW
    June 15, 2018, 02:08:05 PM
    Last edit: June 15, 2018, 02:21:37 PM by aliashraf
     #60

    Interesting, you put a lot of thought into this proposal.  I would support it and see how it goes.  The goal is really hard to reach.  The idea would be to increase difficulty to scale up operations.  Pool mining can be damaging, but one guy with a huge operation can be worse if no one can pool together.
    Thanks for the support.

    As for your argument about hardware centralization being more dangerous without pools:

    It is a subtle point. This proposal is not an anti-pool or pool-resistant protocol; rather, it is a fix for pooling pressure.

    In other words, it does not prevent people from coming together and starting a pool; it just removes the obligation to join pools (and the bigger-pool-is-a-better-pool implication), which is the current situation for almost any PoW coin.

    EDIT:
    It is also interesting to consider the situation with Bitmain. No doubt this company has access to the biggest mining farms ever, and yet Bitmain runs Antpool and insists on having more and more people point their miners to its pool. Why? Because it is always better to have more power, to be safe against variance, and to enjoy smoother luck statistics.

    So, I would say after this fix, there would be not only no pressure toward pooling but also no incentive.