I'm not sure where you see a stubborn refusal to fix the difficulty problem, when I've mentioned more than once recently in this thread that we have three people testing algorithms on the 'zmark' chain.
My apologies, I missed that.
Can you briefly explain what the 'zmark' algo is? Is this the same difficulty algo that has been discussed for some time which aims to change difficulty based on demand? What is the ETA on this new algo, days, weeks, months?
Thanks
'zmark' isn't a new algo, just the name they chose for their pfennig fork. I think right now they are testing DGW on it, with no major modifications yet as far as I know. But the plan is to try to come up with an algo that is able to reduce supply as well. Vertcoin recently announced that they've managed to develop such an algo, but as far as I know it's not released yet. So maybe we'll be able to try testing that one out on the zmark testchain at some point soon.
Here's some recent chat from #pfennig on slack. Anyone who wants an invite feel free to PM me.
[00:39] markpfennig: Is there a summary of any alternative algos discussed, found, or implemented which control the supply whilst varying diff?
[00:43] amarha: i don't think they're at that point yet
[00:43] amarha: although i could be wrong
[01:37] dbkeys: Basically, KGW was the first one, but it has a time warp exploit. Someone with more than 50% of the hashing power can play games with time and give themselves lots of easy blocks.
[01:38] dbkeys: I was strongly advised by "The Altcoin Guy" to use Dark Gravity Wave, as it is simpler and has proven more resistant to attack.
[01:38] dbkeys: DigiShield, used in a number of coins (originally DigiByte I believe) and notably DOGE, has also been advocated. I know little about how it works.
[01:39] dbkeys: Leathan's and my own idea is that we have to first understand what these algorithms are doing before we choose one, or adopt aspects of one into our own.
[01:40] dbkeys: Basically, we have a situation where the difficulty easily adjusts upward but once hash power leaves the network, is very slow (in real time terms) to come back to a reasonable value.
[01:41] leathan: amarha: correct, we sorta skipped over that for some reason. when we get a better understanding of them i doubt we will go with any existing one, like dbkeys said (or if we do it will most likely be a modified version)
[01:42] dbkeys: I'm inclined to think about how the honest peers in the network can keep closer track of real time, by requiring some type of time-consensus that guarantees a minimum block production rate. Emdje has pointed out that the probability of more than 4 or 5 two-minute intervals going by without the production of a block, attributable purely to a Poisson random process, quickly goes down, and rapidly, with great certainty, becomes indicative of reduced network hash power.
[01:43] dbkeys: Clearly, after more than some threshold number of block-less time intervals (perhaps as low as 4 or 5, i.e. 8 or 10 minutes in BTM block-time), the difficulty should begin to be ratcheted down.
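To illustrate the point dbkeys attributes to Emdje: if blocks arrive as a Poisson process averaging one block per 2-minute interval, the chance of several consecutive empty intervals shrinks exponentially, so a run of empty intervals is strong evidence of lost hashpower. A minimal sketch (the function name and parameters are my own, not from the chat):

```python
import math

def p_no_blocks(intervals, blocks_per_interval=1.0):
    """Probability that a Poisson process expected to yield
    `blocks_per_interval` blocks per interval produces zero blocks
    over `intervals` consecutive intervals: exp(-rate * n)."""
    return math.exp(-blocks_per_interval * intervals)

# One empty 2-minute interval is unremarkable (~37% chance), but
# 5 in a row (10 minutes with no block) happens by chance well
# under 1% of the time at the nominal hashrate:
print(round(p_no_blocks(1), 4))  # → 0.3679
print(round(p_no_blocks(5), 4))  # → 0.0067
```

This is why a threshold of 4 or 5 empty intervals is a reasonable trigger: the probability of reaching it by bad luck alone is already below one percent.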
[01:46] dbkeys: An analysis of peaks and valleys of hashrate looking back several weeks (perhaps even to the beginning of the coin's history), together with an analysis of the average hourly coin emission rate, could be used by an algorithm to not only set a reasonable difficulty factor, but also a block reward that does not create more coins than hashrate, as a measure of demand, would seem to warrant.
[01:47] leathan: We should talk to that math chairman guy dbkeys :S (edited)
[01:49] dbkeys: I also like the notion that if the hash rate goes down significantly from recent historical peaks, not only should the difficulty go down somewhat, and the block reward reduced, but (within a certain limit) the block-time-interval target itself could be made longer than the nominal 2 minutes, to allow for the fact that there is reduced demand for the coin, _AND_ still provide a meaningful amount of proof-of-work by the remaining honest hash power.
[01:50] dbkeys: This would be an "elastic time-interval-target" that could range from the nominal 2 minutes between blocks (when hashrate is at a peak) up to perhaps 10x or 100x longer (20 minutes to 3.33 hours). This would still be preferable to 1 block per day (maybe)
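One way to read the "elastic time-interval-target" idea is a target that stretches in proportion to the shortfall in hashrate relative to a recent peak, capped at some maximum factor. A hypothetical sketch, assuming a simple linear stretch rule and a 10x cap (the function name, clamp bounds, and linear rule are my assumptions, not anything the chat specifies):

```python
def elastic_target_secs(current_hashrate, peak_hashrate,
                        nominal_secs=120, max_stretch=10.0):
    """Stretch the block-time target in proportion to how far the
    current hashrate has fallen below the recent peak, clamped
    between the nominal target and max_stretch times it."""
    ratio = peak_hashrate / max(current_hashrate, 1e-9)  # guard div-by-zero
    stretch = min(max(ratio, 1.0), max_stretch)
    return nominal_secs * stretch

print(elastic_target_secs(100, 100))  # full hashrate → 120.0 (2 min)
print(elastic_target_secs(10, 100))   # 10x drop → 1200.0 (20 min)
print(elastic_target_secs(1, 100))    # 100x drop, capped → 1200.0
```

The cap embodies dbkeys's point below: the target never stretches so far that a block takes a day; the remaining hashpower just works proportionally longer per block.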
[01:51] emdje: When the difficulty based upon current blocktime is not decreased too fast, you already have slower blocks
[01:51] dbkeys: Yeah, @leathan , been trying to get a hold of that guy !
[01:51] emdje: thus accounting for decreased demand
[01:51] dbkeys: that is true
[01:51] leathan: that's what i meant by "double" elastic on the forum by the way. dynamic block time // reward
[01:52] leathan: with a clear target, like dbkeys said, of 2 mins
[01:52] dbkeys: the thing is that if a KGW/DGW algo merely adjusts the diff down, then very little (relatively speaking) hashpower is securing the transactions.
[01:53] dbkeys: it should indeed gradually adjust the diff down, but perhaps not so low that block production reaches the fastest nominal 2 min time interval between blocks
[01:53] dbkeys: maybe a 10x slowdown to a 20-minute block interval is acceptable to all when hashpower is in a valley
[02:00] dbkeys: thus at least the reduced hashpower has to work 10x more to produce a block ... but still, 20 minutes is ok to wait, when we have been waiting 24 hours on some occasions ..... :O
[02:04] dbkeys: in other words, when network hashpower is low, the block interval time should be higher in order to require that some _minimum_ amount of hash energy be securing every transaction block
[02:56] emdje: What if you would take the ratio between 'normal' blocktime and current blocktime; at this moment that ratio is 2/367.876=.005436558, and transform that using (Ratio-1)^3. (edited)
[02:58] emdje: When the hashrate goes down fast the transform becomes negative (the ratio approaches 0); when it goes up, it goes up increasingly fast
[03:02] emdje: the transformation of our current ratio is -.9838, where one could decide to intermittently multiply the difficulty by (4 + -.9838)/4 = .754
[03:02] emdje: meaning a reduction to 1472.9
[03:06] emdje: The subsequent ratio, when hashrate stays the same, is .00721
[03:12] emdje: hmm I see that the difficulty is adjusted too fast on the low side
[03:23] emdje: (Ratio-1)^x — increasing x means that the extremes are adjusted more
[03:24] emdje: Don't know if it is useful, but this is what I just cooked up using excel :stuck_out_tongue:
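emdje's Excel experiment can be reproduced in a few lines. The inputs come straight from the chat (nominal blocktime 2, observed blocktime 367.876, exponent 3, damping constant 4); the function name is my own, and this is only a sketch of the formula as stated, not a vetted retargeting rule:

```python
def diff_multiplier(nominal_bt, current_bt, exponent=3, damping=4):
    """emdje's proposed transform: take the ratio of nominal to
    current blocktime, raise (ratio - 1) to an odd power, then damp
    it into a multiplier applied to the current difficulty."""
    ratio = nominal_bt / current_bt
    transform = (ratio - 1) ** exponent
    return (damping + transform) / damping

# Reproducing the chat's numbers:
print(round((2 / 367.876 - 1) ** 3, 4))     # → -0.9838
print(round(diff_multiplier(2, 367.876), 3))  # → 0.754
```

Note the odd exponent preserves sign: slow blocks (ratio < 1) give a negative transform that cuts difficulty, while fast blocks (ratio > 1) give a positive transform that raises it, increasingly sharply at the extremes, which is emdje's point about raising x.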