Author Topic: Getting rid of pools: Proof of Collaborative Work  (Read 1854 times)
tromp (Legendary; Activity: 978, Merit: 1087)
June 15, 2018, 04:34:29 PM
Last edit: June 16, 2018, 12:51:40 PM by tromp
Merited by aliashraf (2)
#61

Quote from: tromp
On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

I calculated wrong. Again. Edited for correctness:

n shares expect to accumulate about n * ln(10^4) * 10^-4 in weight, so we expect a little under 1100 shares to accumulate unit weight...
Quote from: aliashraf
Interesting, appreciate it if you would share the logic behind the formula.

Consider a uniformly random real x in the interval [10^-4, 1].
Its expected inverse is the integral of 1/x dx from 10^-4 to 1, which equals ln 1 - ln(10^-4) = ln(10^4).

Now if we scale this up by 10^4*T, where T is the target threshold, and assume that shares are not lucky enough to go below T, then the n hashes will be uniformly distributed in the interval [T, 10^4*T], and we get the formula above.
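This estimate is easy to sanity-check numerically. The sketch below (Python; the sample count and RNG seed are arbitrary choices of mine, not from the thread) draws hashes uniformly from [T, 10^4*T] and confirms that the mean share weight comes out near ln(10^4) * 10^-4, i.e. a little under 1100 shares per unit of accumulated weight:

Code:
import math
import random

SCALE = 10**4       # shares may be up to 10^4 times easier than the block target
SAMPLES = 10**6

rng = random.Random(1)
# weight = T/hash with hash uniform in [T, SCALE*T],
# i.e. weight = 1/(SCALE*x) for x uniform in [1/SCALE, 1]
mean_weight = sum(1.0 / (SCALE * rng.uniform(1.0 / SCALE, 1.0))
                  for _ in range(SAMPLES)) / SAMPLES

print(mean_weight)                  # ~9.2e-4
print(math.log(SCALE) / SCALE)      # ln(10^4)/10^4 ~ 9.21e-4
print(1.0 / mean_weight)            # ~1086 shares to accumulate unit weight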

aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 15, 2018, 05:10:17 PM
Last edit: June 15, 2018, 08:33:23 PM by aliashraf
#62

Quote from: tromp
Quote
On further reflection, if you randomly accumulate shares of weight (fraction of required difficulty) >= 10^-4 until their sum weight exceeds 1, then the expected number of shares is 5000.

I calculated wrong.

n shares expect to accumulate about n*ln(n)*10^-4 in weight, so we expect a little under 1400 shares to accumulate unit weight...
Quote
Interesting, appreciate it if you would share the logic behind the formula.

Consider a uniformly random real x in the interval [10^-4, 1].
Its expected inverse is the integral of 1/x dx from 10^-4 to 1, which equals ln 1 - ln(10^-4) = 4 ln 10.

Now if we scale this up by 10^4*T, where T is the target threshold, and assume that shares are not lucky enough to go below T, then the n hashes will be uniformly distributed in the interval [T, 10^4*T], and we get the formula above.

Solid. Will include this formula and the proof in the white paper, if you don't mind.
tromp (Legendary; Activity: 978, Merit: 1087)
June 15, 2018, 05:34:36 PM
#63

Quote from: aliashraf
Solid. Will include this formula and the proof in the white paper, if you don't mind.

I don't mind, as long as you consider the edits I made to fix some errors.
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 16, 2018, 11:25:39 AM
#64

I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:
For any minimum relative difficulty* set for every share, mindiff, the average number of shares per (Finalized) block, n, would satisfy this formula: n*ln(n)*mindiff = 1

* Minimum relative difficulty is the ratio by which PoCW reduces the calculated network difficulty; it is the least difficulty allowed for submitted shares.

It yields n*ln(n) = 1/mindiff and suggests a better-than-linear dependency between the two, i.e. a decrease in mindiff (more utilization) causes a slower-than-linear growth in the average number of shares.

Notes:
1- @tromp's assumption about shares not being overly lucky reinforces this estimate even more (i.e. the average weight can be a bit higher, hence n is always a bit smaller).

2- The exact sum of the n shares according to this protocol is 0.93 instead of 1 (there is one fixed 2% share for the finalized block itself), so the formula should be modified for n to satisfy n*ln(n)*mindiff = 0.95

I just did some trivial calculations using the latter formula (reproduced in the sketch below):
For mindiff set to 10^-4, we will have n < 1,320
For 10^-5 we have n < 10,300
For 10^-6 we have n < 83,800

It is so encouraging: setting the minimum difficulty for shares one million times easier than the network difficulty, we need only 83,800 shares per block on average, instead of 1,320 for the current 10^-4. Note that the difficulty is already reduced by a factor of 10 as a result of the decreased block time of one minute, so we are talking about 10 million times utilization compared to the currently proposed 100 thousand times.
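These figures can be reproduced by solving that formula numerically; here is a minimal sketch in Python (mine, for illustration only; note tromp's correction a couple of posts below, replacing ln(n) with ln(1/mindiff)):

Code:
import math

def shares_per_block(mindiff, total=0.95):
    """Solve n * ln(n) * mindiff = total for n by bisection."""
    lo, hi = 2.0, 1e9
    while hi - lo > 1:
        mid = (lo + hi) / 2
        if mid * math.log(mid) * mindiff < total:
            lo = mid
        else:
            hi = mid
    return round(hi)

for mindiff in (1e-4, 1e-5, 1e-6):
    print(mindiff, shares_per_block(mindiff))
# prints approximately 1320, 10280 and 83800, matching the figures above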

And yet we don't have to decrease the mindiff (currently set to 10^-4) in such a strict way; instead we would prefer moderate adjustments, an even more promising situation.

Based on these assessments, it is assertable that Proof of Collaborative Work is scalable and can achieve its design goal despite constant growth in network hashrate and difficulty, with a slower-than-linear increase in demand for computing and networking resources (and no increase in other resources). The design goal is keeping the difficulty of shares low enough to help average and small miners participate directly in the network without being hurt by phenomena such as mining variance or their inherent proximity disadvantage, fixing one of the known flaws of PoW: mining pressure.

I guess we might start thinking about a self-adjustment algorithm for mindiff (the minimum difficulty figure for issued shares).
No rush for this tho; the core proposal is open to change and it is just a long-term consideration.

This hypothetical algorithm should have features such as:

- Not being too dynamic. I think the adjustment shouldn't happen frequently; once every year, I suggest.

- Not being linear. The increase in network hashrate comes from both new investment by miners and improved efficiency of mining hardware. Both factors, especially the latter, suggest that we don't have to artificially keep too-small facilities competitive by subsidizing them. We are not Robin Hood and we shouldn't be.

So, our algorithm should "dampen" the impact of a difficulty increase instead of fully covering it.
It would help the network to upgrade smoothly.
A 30% to 50% adjustment, as a result of a 100% increase in target difficulty, seems more reasonable to me than an exact proportional compensation for the new difficulty.




tromp (Legendary; Activity: 978, Merit: 1087)
June 16, 2018, 12:50:47 PM
#65

Quote from: aliashraf
I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:

NOTE that you overlooked my fix where ln(n) should instead be ln(1/mindiff).
goddog (Member; Activity: 168, Merit: 47)
June 16, 2018, 01:04:05 PM
#66

Hi, I have a stupid question; for sure I'm missing something.

Quote
  • Finalization Block: It is an ordinary bitcoin block with some exceptions
    • 1- Its merkle root points to a Net Merkle Tree
    • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
    • 3- It has a new field which is a pointer to (the hash of) a non-empty Shared Coinbase Transaction
    • 4- The Shared Coinbase Transaction's sum of difficulty scores is greater than or equal to 0.95

I cannot see any reward for the finalization block.
Where is the incentive to mine a finalization block?
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 16, 2018, 03:12:25 PM
Last edit: June 16, 2018, 03:55:56 PM by aliashraf
#67

Quote from: tromp
Quote from: aliashraf
I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:

NOTE that you overlooked my fix where ln(n) should instead be ln(1/mindiff).

Would you please do a complete rewrite of your proposed formula, for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 16, 2018, 03:18:53 PM
Last edit: June 16, 2018, 03:30:02 PM by aliashraf
#68

Quote from: goddog
Hi, I have a stupid question; for sure I'm missing something.

  • Finalization Block: It is an ordinary bitcoin block with some exceptions
    • 1- Its merkle root points to a Net Merkle Tree
    • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
    • 3- It has a new field which is a pointer to (the hash of) a non-empty Shared Coinbase Transaction
    • 4- The Shared Coinbase Transaction's sum of difficulty scores is greater than or equal to 0.95

I cannot see any reward for the finalization block.
Where is the incentive to mine a finalization block?

Block reward is distributed by means of the Shared Coinbase Transaction, in which the first entry is a special share fixed to have a score of 0.02, and it obviously refers to the wallet address of the miner (of the Finalization Block).

  • Coinbase Share: it is new too and is composed of
    • 1- A collaborating miner's wallet address
    • 2- A nonce
    • 3- A computed difficulty score using the hash of
      • the previous block's hash, padded with
      • the current block's merkle root, padded with
      • the collaborating miner's address, padded with the nonce field
    • 4- A reward amount field
  • Shared Coinbase Transaction: It is a list of Coinbase Shares (see the sketch below)
    • The first share's difficulty score field is fixed to be 2%
    • Each share's difficulty score is at least as good as 0.0001
    • The sum of the reward amount fields is equal to the block reward, and each share's reward is calculated proportional to its difficulty score
tromp (Legendary; Activity: 978, Merit: 1087)
June 16, 2018, 06:31:44 PM
#69

Quote from: aliashraf
Would you please do a complete rewrite of your proposed formula, for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.

Let T be the target threshold determined by the difficulty adjustment,
and scale be some suitably big number like 10^4.

Let shares be hashes that fall into the interval [T, T*scale], and define their score as T / hash.
When accumulating shares until their sum score exceeds 1, one is interested in the expected score of a share.

This can be seen to equal 1/scale times the expected value of 1/x for a uniformly random real x in the interval [1/scale,1]. Considering the area under a share score, the latter satisfies (1-1/scale) E(1/x) = integral of 1/x dx from 1/scale to 1 = ln 1 - ln(1/scale) = ln(scale).

So the expected score is approximately ln(scale)/scale.
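Plugging in a few values of scale makes this concrete. A small sketch (Python, illustrative only) that evaluates the expected score and the implied average share count scale/ln(scale):

Code:
import math

for scale in (10**4, 10**5, 10**6):
    expected_score = math.log(scale) / scale
    print(scale, expected_score, round(scale / math.log(scale)))
# scale/ln(scale): ~1086 for 10^4, ~8686 for 10^5, ~72382 for 10^6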
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 16, 2018, 06:55:14 PM
Last edit: June 16, 2018, 07:46:08 PM by aliashraf
#70

Quote from: tromp
Quote from: aliashraf
Would you please do a complete rewrite of your proposed formula, for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.

Let T be the target threshold determined by the difficulty adjustment, and scale be some suitably big number like 10^4.

Let shares be hashes that fall into the interval [T, T*scale], and define their score as T/hash.
When accumulating shares until their sum score exceeds 1, one is interested in the expected score of a share.

This can be seen to equal 1/scale times the expected value of 1/x for a uniformly random real x in the interval [1/scale, 1]. Considering the area under a share score, the latter satisfies (1 - 1/scale) E(1/x) = integral of 1/x dx from 1/scale to 1 = ln 1 - ln(1/scale) = ln(scale).

So the expected score is approximately ln(scale)/scale.

If the expected score is ln(scale)/scale, then the total number of shares is n = scale/ln(scale) by definition.
Hence, n < 1086 for scaling the difficulty down 10,000 times. Right?

If it was just a school exam I wouldn't hesitate that much, because to the best of my knowledge, and to the extent I checked it against available references, it seems to be basic:
score = T/hash (checked)
x is uniformly distributed on [1/scale, 1] (checked)
expected value of 1/x = integral of 1/x dx over the range [1/scale, 1] = ln(1) - ln(1/scale) = ln(scale) (checked)

Yet I'm not entitled to weigh in on it, and the result (10,000 times scaling down achieved with 1,086 shares) is too good. I just didn't expect that much efficiency.

Any more comments?
tromp (Legendary; Activity: 978, Merit: 1087)
June 16, 2018, 07:48:52 PM
#71

Quote from: aliashraf
If the expected score is ln(scale)/scale, then the total number of shares is n = scale/ln(scale) by definition.
Hence, n < 1086 for scaling the difficulty down 10,000 times. Right?

That's only approximately right. You do have that the expected sum of 1086 scores exceeds 1, since the expectation of a sum is the sum of the expectations, but asking for the expected number of shares needed for the sum to exceed 1 is something else.
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 16, 2018, 08:17:43 PM
Last edit: June 16, 2018, 09:35:25 PM by aliashraf
#72

Quote from: tromp
Quote from: aliashraf
If the expected score is ln(scale)/scale, then the total number of shares is n = scale/ln(scale) by definition.
Hence, n < 1086 for scaling the difficulty down 10,000 times. Right?

That's only approximately right. You do have that the expected sum of 1086 scores exceeds 1, since the expectation of a sum is the sum of the expectations, but asking for the expected number of shares needed for the sum to exceed 1 is something else.

As I understand it, the expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment (the way Wikipedia defines it). My primitive observation says that once you have the expected value/average of a finite number of uniformly distributed random values, and their total sum, you have the cardinality by dividing the sum by the expected value/average, provided the variance is zero or very low, which is true for a pseudorandom function like sha2.

What am I missing?
tromp (Legendary; Activity: 978, Merit: 1087)
June 16, 2018, 09:44:32 PM
#73

Quote from: aliashraf
As I understand it, the expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment (the way Wikipedia defines it)

Correct.

Quote
and my primitive observation says that once you have the expected value/average of a finite number of uniformly distributed random values, and their total sum, you have the cardinality by dividing the sum by the expected value/average, provided the variance is zero or very low, which is true for a pseudorandom function like sha2.

In your case you have a (potentially unbounded) sequence of i.i.d. random variables S_i
(the score of the i'th share) and a separate random variable N, depending on all the S_i, which is the minimum n for which the sum of the first n S_i exceeds 1.
Of course if the S_i have 0 variance then N = ceiling(1/S_i).

A closely related case is where there is a single random variable S and N is just 1/S.
In that case Jensen's inequality [1] applies and you have E(N) >= 1/E(S), with equality only for Var(S) = 0.

I'm not sure to what extent Jensen's inequality carries over to your case.

[1] https://en.wikipedia.org/wiki/Jensen%27s_inequality
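The gap between E(N) and 1/E(S) is easy to measure empirically. A quick Monte Carlo sketch (Python; the trial count and seed are arbitrary assumptions of mine) for the stopping time N at scale = 10^4:

Code:
import math
import random

SCALE = 10**4
TRIALS = 10**4

def stopping_time(rng):
    """N = number of shares until their summed score exceeds 1."""
    total, n = 0.0, 0
    while total <= 1.0:
        x = rng.uniform(1.0 / SCALE, 1.0)   # hash/(SCALE*T), uniform
        total += 1.0 / (SCALE * x)          # score = T/hash
        n += 1
    return n

rng = random.Random(42)
mean_n = sum(stopping_time(rng) for _ in range(TRIALS)) / TRIALS
print(mean_n)                      # roughly 1140-1150 in this model
print(SCALE / math.log(SCALE))     # 1/E(S) ~ 1086, so E(N) > 1/E(S), as Jensen suggests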
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 17, 2018, 08:23:29 AM
Last edit: June 17, 2018, 11:08:36 AM by aliashraf
#74

@tromp
Thanks for the explanation.
I checked a few references; 1086 scores for a 10,000-times scale-down is not exactly true, as we have variance > 0 in the distribution of scores, as you correctly mentioned. But we should also notice that over a large number of blocks we have another convex function for n (the number of shares), distributed randomly with an order of magnitude less variance (my intuition), and its expected value is what we are looking for. It is beyond my expertise to go much further and discuss this problem thoroughly, tho.
It would be of much help if you could spend some time on this and share the results. Both for a more active discussion and for keeping this topic reserved for more general discussions, I have started another topic regarding this problem.

By the way, for practical purposes I suppose we can confidently use our previous estimate for n (1400), as the least optimistic one, for now.
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 17, 2018, 03:39:27 PM
Last edit: June 17, 2018, 11:18:51 PM by aliashraf
#75

Quote
I shouldn’t come back for sloppy seconds, but sometimes I like math.

Last year I analyzed the bet which was made in the context of a probability error that Craig Wright made in his paper on selfish mining (which he pulled in shame and I was never able to find an archive of the original paper).

The error that Craig Wright made was assuming the selfish miner is not joint probability with the honest miners (which frankly is an impossible error for the real Satoshi to make):

Quote from: iamnotback
If we ask how long after selfish miner finds a block (i.e. which only happens 33.3% of the time overall!) will the honest miners find another block, then it is 15 minutes after because we have excluded 66.7% of the cases. But if we ask how long will it take for the honest miners to find a block from the starting of mining on the public block in 100% of the cases, then it is 15 minutes (which is thus 5 minutes after selfish miner finds a block if we are modeling both events independently). So it depends what we are modeling and how the bet is stated. Afaik from the screen captures I saw, Craig’s timeline chart apparently didn’t make it clear if we were modeling the honest miner and selfish miners as independent events. So the bet is ill-defined.

Applying divergent thinking to your PoCW, I think I’ve found another flaw in your assumptions about the game theory, which applies to the calculation of how many lower difficulty shares will accumulate to reach the target block difficulty.

As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block. Thus the hashrate applied will plummet if miners are acting rationally to maximize their profit. I’m not even confident that the block will ever be finalized in every instance. The math is getting complex and I would really need to think deeply about it to try to formalize it. Of course sometimes accidentally a block will finalize because a share will infrequently have a very high difficulty result due to variance and thus finalization of the block will be achieved. But I think the finalization step is actually not profitable for small miners. Only a very large mining farm which can mine that final 2% with an ultra low variance is likely to find it profitable to do so. Also we have to factor in the vestment the miners have in the shares they already mined on the block, so the larger miner will have more vestment and more profit incentive to finalize. Thus this seems to eliminate some of the claimed decentralization.

So as I told you from the very start, your proposed PoCW system is much more complex and non-linear than you may realize. And there’s likely many such land mines lurking. There’s a damn good reason Satoshi made the block period 10 minutes and didn’t go chopping it up in intervals. You change the economic theory of mining because the winner of a share doesn’t immediately win a block.
This post is quoted completely from the cardinality problem topic to keep that topic focused on its purpose.

Quote
As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block.


I couldn't figure out how approaching the required share difficulty may result in the situation you suggest.

When a miner is able to generate a Shared Coinbase Transaction (i.e. he is aware of enough valid shares with the same Net Merkle root and a cumulative score of at least 0.93), it is time for him to immediately switch to the Finalization phase, i.e. mining a block with at least 2% difficulty compared to the network's calculated target difficulty.

The block our miner is trying to mine in this phase points to a Shared Coinbase Transaction that has the miner's address in its very first row and rewards it exactly 2%; so there is a fair reward, proportional to the required difficulty, which is likewise 2%.
The lucky miner who manages to find such a block collects that reward.

Quote
Only a very large mining farm which can mine that final 2% with an ultra low variance is likely to find it profitable to do so. Also we have to factor in the vestment the miners have in the shares they already mined on the block, so the larger miner will have more vestment and more profit incentive to finalize. Thus this seems to eliminate some of the claimed decentralization.

Small miners will hit occasionally too, and they are losing just 2% of the reward if they don't manage to. Large miners should either include the shares (belonging to them or not) asap and switch to the finalization phase, or keep trying to mine more shares (like, privately), risking both their chance of mining the finalized block and their privately mined shares not being rewarded at all.

I don't get why you suggest 2% is not enough incentive for small or medium miners to mine a block with 2% difficulty. As I see the protocol, it distributes work and rewards in a fair way and makes collaboration the better choice for participants; otherwise they will be isolated and will lose profits, or at least they will experience mining variance headaches.

Quote
So as I told you from the very start, your proposed PoCW system is much more complex and non-linear than you may realize.

It is not complex, imo. It just looks that way because people are used to the winner-takes-all tradition that Satoshi Nakamoto established and that has never been questioned properly.

I've said it before: Satoshi did an extraordinarily great job with PoW (the most important innovation ever after TCP/IP, imo), but when it came to details, he picked the most straightforward and naive approach for his rewarding system, winner-takes-all, putting bitcoin in great centralization danger because of both the proximity premium and mining variance flaws, which took just 2 years to show their impact on bitcoin.

Thinking out of the box, I don't see anything complex in PoCW; on the contrary, I see it as so natural, the most natural way to implement PoW:
You contributed to security? You should be rewarded properly.
instead of
You got enormously lucky? So you probably have hashed exhaustively; take everything and enjoy it!

As for non-linearity, I have to agree that at some points we may experience non-linear patterns and sensitivities (the transitions to and from the Contribution phase), but I guess miners can use good client software to make the critical decisions for questions like:
To which Prepared Block should I contribute?
Is it time to replace the block I'm contributing to with another, more popular one?
Is it time to stop mining shares and try my chance at mining a finalized block?
To which Finalized Block should the Prepared Block I'm going to mine point?

All of these questions can be easily addressed by means of simple calculations. Note that the critical information vital for making such decisions is tens of thousands of times more distributed in the network, and unlike Nakamoto's variant of PoW, we have practically no proximity premium flaw in PoCW.

Quote
... There’s a damn good reason Satoshi made the block period 10 minutes and didn’t go chopping it up in intervals. You change the economic theory of mining because the winner of a share doesn’t immediately win a block.

I understand and appreciate your respect for Satoshi; he deserves it and more. But I don't think bitcoin was a secret project of the Pentagon or Mossad led by Satoshi, backed by a think tank. Actually, I wouldn't care if it was the case. No matter how much thought you spend on a project, it is always flawed and needs improvement, I mean radical improvements and not minor (and always dangerous) tricks like SW.  Tongue

As for 'changing the economic theory of mining', the way you put it, I'm sure I'm doing so and I feel good about it. Mining is suffering too much, and miners are desperately in need of change.

I have a lot to say about it; just one point for now:
One of the most disastrous consequences of pool mining is the worst phenomenon that can happen to any human activity, the one that is the true criticism of Marx against Capitalism (which has never been truly understood/appreciated by critics): alienation!

We have miners, like workers in a factory (the pool), who do not choose and do not decide, do not design and do not change, and are not paid for their human characteristics like being faithful and loyal; they are paid for their power and not for what they produce. Miners are alienated from their activity in the same way that workers are from their work. Ironically, they are named 'workers' in pooling terminology.

Proof of Collaborative Work will have the whole mining community at its back, both because of their interests and their need to matter again, their definite need for resurrection, once it is properly designed and implemented and propagated.

If I'm not doing well enough, no worries; anybody with a deep understanding of the protocol is welcome to take the lead, and I'll support them. Smiley
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 18, 2018, 10:47:44 AM
#76

Quote
I received a cordial PM from @aliashraf.

Smiley I feel good about you despite the bitterness, and this topic is about collaborative work after all. PM'd just to keep the topic focused.

Quote
... he and I incompatible ... originate from two distinct cultures ... I was born in New Orleans in the Deep Old South in 1965. ... those who live in kleptocracy countries have become very cynical and they have more flippant mannerism of interacting. ...
{A LOT of excuses for not doing what you are supposed to}

I think it is so natural, expecting more commitment and responsibility when it is about 'the truth' and 'the right'.

Definition of cynical
1 : having or showing the attitude or temper of a cynic: such as
a : contemptuously distrustful of human nature and motives
… those cynical men who say that democracy cannot be honest and efficient. —Franklin D. Roosevelt
b : based on or reflecting a belief that human conduct is motivated primarily by self-interest: a cynical ploy to win votes

See? By the above definition, I'm not cynical here, or maybe in the 'Old South' you mean something else by this word.
I'm a believer in humanity; I do not judge people based on their race, gender, nationality, ... (not even based on their attitude or temper). You can ask @achow101, the mod, how I've responded about you and the chaos in this thread a few days ago.

Quote
Also it should be correctly noted that as of now, I have a vested interest for proof-of-work to not be salvageable for decentralization and scalability. That was a conclusion I made a couple of years ago, and so I invested a lot of R&D effort in non-proof-of-work Byzantine fault tolerant consensus system design.

Reconsider! You decided wrong. I think you have wasted your valuable time 'in the box' and it is your fault; come on, think 'out of the box' for a while.

Quote
I hope that helps you better understand me and my attitude.

So it’s difficult for me to rally around a claimed solution or improvement to proof-of-work, because:

  • I have a vested interest for it to not be true. {False! You are interested!}
  • I think Satoshi was a think-tank of 160+ IQ researchers who designed an exquisite game theory and had considered every possible angle.
    {Good story, but not real! F*k think tanks by the way, they are just a bunch of technocrats who suck and waste people's taxes}
  • I’m nearly certain there are invariants which will make it impossible to achieve decentralization along with scalability in proof-of-work. Even just scaling up economic demand without TPS causes proof-of-work to centralize due to economies-of-scale in profit. {Don't get any nearer, because it is simply wrong!}

So I am not against you personally, but please do not attack me just because I can’t get excited about rallying around the open source R&D of your PoCW idea. {It is not an idea, it is a proposal, designed in detail}
PoCW is not in an R&D phase; I've already begun the implementation phase. It will be open source, indisputably.

Quote
I am not trying to tell others to not participate. I am not trying to railroad your thread. I am not trying to belittle or insult you (although I got angry that you were violating my culture above and were, like, forcing me into an acrimonious discussion about something I am not that interested in). None of that.
Good start. I maintain my strategy of getting you to commit more, tho.

Quote
And you have another problem, which is that apparently most of the Core developers and their supporters have exited this subforum. I do not see the super smart guys commenting here anymore. So you had probably better go propose your PoCW in one of the development discussion communication channels where they are. You may ask the mod @achow101, as I saw he had posted recently in response to @traincarswreck about where and how to communicate with Core.
I'm aware of that.
Bitcointalk was founded by Satoshi and is the only reference forum I care about; if it has lost momentum, it is our obligation to bring some in.

Core devs are free to join this discussion; I won't go after them to do so. I don't think they are busy elsewhere with something more important; they are doing nothing other than playing with the same toy they have always been playing with. If I'm wrong, refer me to a serious discussion they are involved in comparable with this thread.
Quote
I wish you the best luck with your proposal. But for me personally, I am not going to be one of its supporters unless someone formalizes it and convinces me that my stance is incorrect.
Deal!

Quote
As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block.

Quote
I couldn't figure out how approaching the required share difficulty may result in the situation you suggest.

When a miner is able to generate a Shared Coinbase Transaction (i.e. he is aware of enough valid shares with the same Net Merkle root and a cumulative score of at least 0.93), it is time for him to immediately switch to the Finalization phase, i.e. mining a block with at least 2% difficulty compared to the network's calculated target difficulty.

The block our miner is trying to mine in this phase points to a Shared Coinbase Transaction that has the miner's address in its very first row and rewards it exactly 2%; so there is a fair reward, proportional to the required difficulty, which is likewise 2%.
The lucky miner who manages to find such a block collects that reward.

Quote
When mining shares early in the Shared Coinbase Transaction phase, the variance of the smaller miner is less of a cost because his share will still likely be produced before the window of time expires. But approaching the 0.93 level of accumulated shares, it becomes more costly from a variance-weighted analysis for the smaller miner to risk starting the search for a share solution. The math is that the miner will be less likely to earn a profit on average over the short term of a few blocks when mining closer to the expiration time. Over the long term, I think the variance skew may average out, but the problem is that time is not free. There’s a cost to receiving profits delayed. This is why smaller miners have to join a pool in the first place.
In the Contribution Phase, miners won't interrupt mining and will continue brute-forcing the search space, unless they realise that one of the two following events has happened:
1- Another Net Merkle root (Prepared Block) is getting hot and they are in danger of drawing dead. This is supposed to happen at the very beginning of the contribution phase, and they will respond by switching to the new trend.

2- The 93% limit is reached. It happens as a sharp, simultaneous event across the network, and miners react by switching to the Finalization Phase simultaneously (in a very short interval, say 2-3 seconds, I guess).

Wait! Do not rush to the keyboard to mock me or to teach me what I don't really need to be taught; just continue reading ...

I understand: minds poisoned by traditional PoW, for which propagation delay is a BIG concern, cannot imagine how this is possible, but it is exactly the case with PoCW:

Suppose we are approaching the 0.93 limit (say 0.736, 0.841, 0.913, ...) from the viewpoint of a miner A. Shares are arriving and the sum is getting closer and closer to the limit ...

What would be the situation for miner B (say, at the end of the longest shortest path from A in the network)?

B can feasibly be experiencing (0.664, 0.793, 0.879, ...) at the same time!

Before proceeding further, let's understand how this situation is feasible at all ...

Looking closer at the shares that A has validated and is accumulating to calculate the series 0.736, 0.841, 0.913, ... and the ones B is using to generate 0.664, 0.793, 0.879, ... may reveal an interesting fact: they are not exactly the same, and especially when it comes to newer shares they diverge meaningfully!

Suppose the collection of all the shares (regardless of their miners and how far they have been propagated) is
S = {s1, s2, s3, ... , sn-2, sn-1, sn}

Obviously SA and SB are subsets of S representing how much of S each of the miners, A and B, is aware of (being informed about, or the source of, each share).

As there is propagation delay and there are other reasons for any miner to 'miss' a member of S, it is completely possible to have
SA = {s1, s3, s4, ... , sn-5, sn-4, sn-2, sn} *
* missing s2 and sn-3

SB = {s1, s2, s4, ... , sn-5, sn-3, sn-1} *
* missing s3, sn-4, sn-2, and sn

They have most of the shares in common, but they do not have access to all the same shares; they don't need to. Miner B may suddenly receive more shares from adjacent peers and find himself closer to the 0.93 limit, and so on, ...

This is how PoCW mitigates the troubles we usually deal with because of the network propagation problem: we distribute information almost evenly across the network and reduce the weight and importance of the proximity premium.

This is why I'm convinced that the network phase transition occurs synchronously, in a very short window of time. A toy simulation of this convergence is sketched below.

This analysis confirms your concern about the risk of the latest shares never being finalized, but it shows that it is about a very short duration, and the chances are distributed more evenly across the network.
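A minimal toy model of this convergence (mine, not from the proposal; the share rate, miss fraction and delay bound are all assumptions): two observers each miss or receive late a small fraction of the network's shares, and the gap between the times at which their observed sums cross 0.93 stays small:

Code:
import random

rng = random.Random(7)
SHARE_SCORE = 0.00092      # assumed mean score per share, ~ln(10^4)/10^4
RATE = 1000 / 60.0         # assumed ~1000 shares per one-minute block
DELAY = 0.5                # assumed worst-case propagation delay in seconds
MISS = 0.01                # assumed fraction of shares an observer misses

# generate network-wide share arrival times until the 0.95 total is reached
arrivals, total, t = [], 0.0, 0.0
while total < 0.95:
    t += rng.expovariate(RATE)
    arrivals.append(t)
    total += SHARE_SCORE

def crossing_time():
    """Time at which one observer's known cumulative score reaches 0.93."""
    s = 0.0
    for at in arrivals:
        if rng.random() < MISS:
            continue                        # share never seen by this observer
        s += SHARE_SCORE
        if s >= 0.93:
            return at + rng.uniform(0, DELAY)
    return float('inf')                     # threshold not reached (rare)

gap = abs(crossing_time() - crossing_time())
print(gap)      # typically a fraction of a second in this toy model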







aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 18, 2018, 12:27:12 PM
#77

Quote
You wrote nothing to refute my technical objection. Appears you don’t even comprehend the economics I presented.

No, I addressed it by:
Quote
This is why I'm convinced that the network phase transition occurs synchronously, in a very short window of time.

This analysis confirms your concern about the risk of the latest shares never being finalized, but it shows that it is about a very short duration, and the chances are distributed more evenly across the network.

To make it clear and to be more precise:

In two windows of time, in the early seconds of contribution and in the last seconds (both being very short, as I have argued above for the latter and have shown in my analysis of the convergence process for the former), there is a chance for shares not to be finalized.

The proximity premium resistance of the algorithm will compensate for this risk by distributing it evenly across the network.

Note: I'm not sure yet, but I suppose the last point, the risk being distributed, has an interesting implication: in the long term it is no risk at all; it is part of the protocol and is automatically adjusted by the target difficulty.
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 18, 2018, 02:35:22 PM
Last edit: June 18, 2018, 02:57:20 PM by aliashraf
#78

Quote
AFAICT, my objection is not ameliorated by any randomization or distribution.

Of course it ameliorates your objection. How could it possibly do anything else, after all?

Distributing a risk tens of thousands of times across the network and neutralizing the proximity premium (which PoCW definitely achieves) is not a simple point to be overlooked easily. When you participate in a network with a relatively good distribution of information and luck, you just don't back off because you are afraid of not hitting every single round. It is just part of the game. The only concern is always about how distributed and fair this game is.

Quote
I’m sorry I am not going to repeat the economics again. AFAICT you are simply not understanding.

On the contrary, I do understand every bit of your objection and more ...
It is by no means about "economics"; you are simply questioning whether the incentives are enough compared to the risks. It is not complicated enough to be called economics.
Accusing me of being ignorant about such a simple cost/benefit trade-off ... well, it is too much in this context.

Quote
The miner will simply observe that not mining within a window nearer to 0.93 will be more profitable. If they have ambiguity around the window, they’ll simply have to back off to an even lower threshold or use a timeout.

Now let's have a bit more "economical" assessment here:

You are suspicious that when the total score of shares that the miner is aware of is close enough to the 0.93 threshold, a rational miner may stop mining, take a break, wait for the additional shares that will probably come, and switch to the next phase, because there is more chance of his newly mined shares not being included in the finalized block, and it would be a waste of ... wait ... waste of what? Electricity? Because the rents are already paid, aren't they?

So it will be about risking electricity expenses over a 2-3 second duration against a proportionally fair chance to hit and be (almost) the first to transition to the Finalization phase.

Quote
But I have nothing to gain by continuing to explain it. Cheers.

Note that doesn’t mean I am 100% certain. I would need to do some formalization. And I am unwilling to expend the effort.

Neither am I. It is too luxurious for this protocol to be analyzed to that extent; I'll leave it as it is. For now, I'm convinced that no back-off threat practically exists. Miners will simply take their shots, because the crisis window is very narrow, as I have argued before.



aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 19, 2018, 06:33:49 AM
Last edit: June 19, 2018, 06:48:51 AM by aliashraf
#79

Quote
Quote from: aliashraf
Of course it ameliorates your objection. How could it possibly do anything else, after all?

Craig Wright is an obnoxious, stubborn, boastful person who fell flat on his face when he finally tried to formalize and published a whitepaper.

I’ll await your formalization, as I waited several years for Radix to finally specify their flawed design.

Although formalism is not my first priority right now (implementation is), I have presented this proposal in a formal way. The algorithm and the terminology have been described in detail and free of ambiguity. You might have noticed that until now I've made only one major edit, as a result of the discussion with @ir.hn. If by formalism you mean a lot of mathematical analysis to address every single possible attack or vulnerability, I think it is too much at this stage.

I'm not suggesting in general that a more formal and finalized whitepaper be postponed, but for this special project, implementation is of higher priority. Let me explain:

I introduced PoCW not to give birth to a new coin; improving bitcoin and Ethereum (while saving the latter from Buterin's Casper coup d'état) are my main concerns. As one of the first contributors to this topic has correctly emphasised, the main challenge here is political. I started this topic to spread the word and the idea, not to convince people to join a scammy shitcoin project, but to find out who is who and how the weather is. The next step is implementing the code and demonstrating how feasibly smart and clean I can do this.

I'm having a long war to win and I don't play this game so cautiously, saying nothing unless you've got an army of academicians at your back. It is not the way I fight; when I find the answer I show up with it and start fighting, and will fight to the river, as I have told you elsewhere. I don't hesitate and don't postpone everything for paperwork.

Quote
Quote from: aliashraf
Distributing a risk tens of thousands of times across the network and neutralizing the proximity premium (which PoCW definitely achieves) is not a simple point to be overlooked easily. When you participate in a network with a relatively good distribution of information and luck, you just don't back off because you are afraid of not hitting every single round. It is just part of the game. The only concern is always about how distributed and fair this game is.

Correct. You’re also describing Nakamoto proof-of-work, in which small miners must join pools, because the variance risk is always too high for the entire block period.

Analogously, I claim that PoCW creates increasing variance risk later in the block period. So again smaller miners need to turn off their mining the closer to the end of the block period, and wait to mine the next block again.

Except that Nakamoto's PoW suffers from being vulnerable to mining variance over the whole period, while mine is in danger just in a short transition period. Plus, in Nakamoto's case we have a single focal point of information and a proximity premium, while in my proposal, PoCW, we compensate for the said danger by distributing the information (new shares) tens of thousands of times.

You totally ignore both differences and I don't know why.
Quote
Throughout this entire thread you seem utterly incapable of comprehending the concept of relativism. I grow very weary of repeating myself, even though you continue to pretend (misrepresent) to readers that I’m dim-witted or disingenuous.

Please don't go back there. It is not true at all. Nobody implies people are disingenuous by arguing with them. Why should you put it that way?
Quote
You are suspicious that when the total score of shares that the miner is aware of is close enough to the 0.93 threshold, a rational miner may stop mining, take a break, wait for the additional shares that will probably come, and switch to the next phase, because there is more chance of his newly mined shares not being included in the finalized block, and it would be a waste of ... wait ... waste of what? Electricity? Because the rents are already paid, aren't they?

Electricity is not already paid. If there is any form of flat-rate mining hardware hosting account which does not meter electricity on a usage basis, then the account is not profitable to mine with, because electricity is the major cost of mining.

What I was trying to say is that mining involves several cost factors: rent, hardware depreciation, wages and electricity. The hypothetical back-off strategy can only help reduce electricity costs for a few seconds per minute by relieving the miner from hashing. I suggest that even with high electricity fees, this won't trade off against dropping the chance to hit and be rewarded.
aliashraf (OP) (Legendary; Activity: 1456, Merit: 1174)
June 19, 2018, 09:36:36 AM
Last edit: June 19, 2018, 10:05:35 AM by aliashraf
#80

Quote
P.S. If you understood anything about the Old South, you would understand that it is the way you talk to other people that makes you a bona fide citizen of that culture. I do not talk to people on this forum the way I talk to people in my culture, because very few here have the etiquette of the Old South. So on this forum I get to be as big of an asshole as others are to me. It is pure defect-defect because the forum is anonymous. Although I have made a decision to try to exhibit much greater patience and tolerance.

No worries dude, I will do my best to keep this debate alive as long as some meat is served here  Wink
About the Old South, ... you already know, I'm a fan!

Quote

I introduced PoCW not to give birth to a new coin; improving bitcoin and Ethereum (while saving the latter from Buterin's Casper coup d'état) are my main concerns. As one of the first contributors to this topic has correctly emphasised, the main challenge here is political. I started this topic to spread the word and the idea, not to convince people to join a scammy shitcoin project, but to find out who is who and how the weather is. The next step is implementing the code and demonstrating how feasibly smart and clean I can do this.

I'm having a long war to win and I don't play this game so cautiously, saying nothing unless you've got an army of academicians at your back. It is not the way I fight; when I find the answer I show up with it and start fighting, and will fight to the river, as I have told you elsewhere. I don't hesitate and don't postpone everything for paperwork.

Proof-of-work is “might is right.” Why are you idealistic about fungible stored capital that enslaves mankind? You are just fiddling with the fringe of the issue. The central issue is that proof-of-work is all about the same ole paradigm throughout human history that says fungible stored claims on someone else’s labor are powerful.

I reject that, and we are headed into a paradigm-shift which will render the NWO reserve currency Bitcoin irrelevant. Security of fungible stored capital is not the big deal it was before in the fixed-investment-capital Agricultural and Industrial ages. You need to understand that you are barking up an old, rotten tree with old, rotting Old World money in its midst.

Read every linked document in this blog, and my comments below the blog, and wake up:

https://steemit.com/cryptocurrency/@anonymint/bitcoin-rises-because-land-is-becoming-worthless
It is the true story behind this debate, isn't it?
At a very unfortunate moment of your history with bitcoin and PoW, you made a horrible decision: giving up on both!
Cultures won't help; people are just the same, no matter when or where they belong, they just give up when they become disappointed.

For me, this is a different story. When I find something brilliant, I don't care about its current state of development; brilliance is enough for me to commit and not give up on it, no matter what.

History teaches us another important lesson too: when a paradigm shows up, it will stay for a while, and it is pointless and mostly impossible to have a paradigm shift every decade.
I will check your link and I'll go through your replies as you wish, I promise, but I have to say, I'm strategically against any attempt to replace PoW; it seems to me just a fake, ridiculous attempt, a cartoon. Sorry, but it was you who chose the wrong side.

Quote
Quote from: aliashraf
If by formalism you mean a lot of mathematical analysis to address every single possible attack or vulnerability, I think it is too much at this stage.

I think it’s impossible to not have acrimony without it. You just assume epsilon without actually proving how small the effects you dismiss are.

{....}

You need to show the math and prove it is epsilon.

And that also includes your presumption of “for a few seconds per minute.” It depends on the variance of the miner.

I don't agree. It is always possible to discuss issues without going through formal mathematical analysis. I did some formal analysis of the specific subject of your concern (variance in the transition period) and have shown how sharp this period is.
But now you are unsatisfied and keep pushing for more details, which I just can't schedule more time for, and if I did, nobody would read it, not now.

Quote
There is also a countervailing force which you could argue for, which (I mentioned up-thread) is the investment in shares the miner already has, and the amount of luck he adds to the block being solved if he does not stop mining. But that is probably a negligible factor, and Vitalik already explained that altruism-prime is an undersupplied public good (see the weak subjectivity Ethereum blog), so I doubt miners would be able to agree to not defect for the greater good. It’s a Prisoner’s dilemma.

Again you need to show the math.

Please! You probably know my opinion about this kid, Buterin, and his foolish "weak subjectivity" thing. It is a shame; a boy desperately obsessed with being a genius is trying to revolutionize cryptocurrency with 'weak' shits. Absolutely not interested.

As for the proposed 'additive' for miners not to back off because of the hypothetical variance in the transition phase: thanks for reminding me, and I'm fully aware of it; I just didn't bring it forward to avoid complicating the subject even more.

Anyway, it improves the odds and can't be rejected by the boy's "discovery" of altruism not being the dominant factor in a monetary system  Cheesy
It is not about the well-being of others.
Miners always have an incentive for their own previously mined shares (in the current round) to be part of the chosen 93%, and their late shares, besides the direct rewards, will help this process.

Quote
Actually perhaps the most logical action is for smaller miners to switch to centralized pools for a portion of the block. Probably it will be difficult to argue against that mathematically. So if true, it probably more or less defeats the stated purpose of PoCW.

I think there is a possibility (not a force) for some kind of pooling in PoCW. But it won't be the same as conventional centralized pools even a bit (it doesn't need to be), and it won't defeat the purpose, which is eliminating pooling pressure and its centralization consequences.

I have to analyze it far more, but I guess a light gradient exists here in favor of forming kinda 'agreements' between clusters of small miners to communicate in star topologies to help each other transition more smoothly. It is a light gradient, as there are very low stakes (2% or so) on the table.

One should again take into consideration the way PoCW fixes the proximity premium and practically synchronizes miners to transition between phases at almost the same time, as I have already discussed extensively, implying short transition periods and less incentive to pay the setup/cleanup costs of joining a pool temporarily.