solex
Legendary
Offline
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
|
|
April 01, 2014, 06:46:55 AM |
|
This puzzles me too.
Isn't cumulative difficulty normally the summed target difficulty? Does the time warp make the target difficulty of each block appear higher than the hashing power that was actually required to mine it?
|
|
|
|
markm
Legendary
Offline
Activity: 2996
Merit: 1121
|
|
April 01, 2014, 06:49:43 AM |
|
There is no cumulation of difficulty, only of "work".
But, only chunks (hashes) of "work" that meet or beat the difficulty threshold that is their target get into the chain.
At a low enough difficulty, every hash would get into the chain.
At a high difficulty, hardly any hashes luck into a "work" value good enough to get into the chain, so most hashes aren't counted in the chain and thus aren't part of the accumulated total "work" that determines which chain is "longest".
-MarkM-
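A minimal sketch of that accounting, assuming a Bitcoin-style convention in which each block is credited with the expected number of hashes its target implies; the names and the shrunken 64-bit hash space are illustrative, not Auroracoin's actual code:
Code:
#include <cstdint>
#include <vector>

// Sketch only: the "work" credited for one block is the expected number of
// hashes needed to meet its target, roughly 2^256 / (target + 1). To keep
// this toy example runnable the hash space is shrunk to 64 bits.
double BlockWork(uint64_t target) {
    return (double)UINT64_MAX / ((double)target + 1.0);
}

// Only hashes that actually made it into the chain are represented here at
// all; the chain whose blocks sum to the most work is treated as "longest".
double ChainWork(const std::vector<uint64_t>& blockTargets) {
    double total = 0.0;
    for (uint64_t target : blockTargets)
        total += BlockWork(target);
    return total;
}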
|
|
|
|
iopq
|
|
April 01, 2014, 07:20:06 AM |
|
Shouldn't the coin then say that the chain with the most work (blocks * difficulty) is better than a chain with more blocks at a lower difficulty? As far as I understand, every coin out there considers the longest blockchain to be the best. But wouldn't the highest sum of blocks * difficulty be more secure against time warp exploits?
|
|
|
|
markm
Legendary
Offline
Activity: 2996
Merit: 1121
|
|
April 01, 2014, 07:26:06 AM |
|
Shouldn't the coin then say that the chain with the most work (blocks * difficulty) is better than a chain with more blocks at a lower difficulty? As far as I understand, every coin out there considers the longest blockchain to be the best. But wouldn't the highest sum of blocks * difficulty be more secure against time warp exploits?
On the contrary, it seems. The timewarp attacker spawns many more blocks than the defender, so if more blocks meant more length/height in the measure of which chain is the winner, the attacker would have even more of an advantage.
Maybe divide by the number of blocks instead of multiplying? But likely that would instead lead to an attack by saving up a chain of a few blocks that each took massive amounts of time/work to create.
So either way, the current way of adding up the work is likely the best; it is just unfortunate that there is no record of how much work was actually done (how many hashes were actually computed) by hashers between the hashes that got lucky (got enough work to meet their target and thus get into the chain and thus count at all toward total work).
-MarkM-
|
|
|
|
iopq
|
|
April 01, 2014, 07:30:58 AM |
|
Shouldn't the coin then say that the chain with the most work (blocks * difficulty) is better than a chain with more blocks at a lower difficulty? As far as I understand, every coin out there considers the longest blockchain to be the best. But wouldn't the highest sum of blocks * difficulty be more secure against time warp exploits?
On the contrary, it seems. The timewarp attacker spawns many more blocks than the defender, so if more blocks meant more length/height in the measure of which chain is the winner, the attacker would have even more of an advantage. Maybe divide by the number of blocks instead of multiplying? But likely that would instead lead to an attack by saving up a chain of a few blocks that each took massive amounts of time/work to create. So either way, the current way of adding up the work is likely the best; it is just unfortunate that there is no record of how much work was actually done (how many hashes were actually computed) by hashers between the hashes that got lucky (got enough work to meet their target and thus get into the chain and thus count at all toward total work). -MarkM-
Well, he's mining at a difficulty several times lower, so that's why he's generating more blocks.
sum(block * the difficulty at the time of that block) = total chain length
The exploit gives a lot of "cheap" blocks, during which the difficulty is low, so the total sum is low. If the height were calculated using the difficulty at the time each block was found, the real chain would have fewer blocks but at a much higher difficulty.
I postulate that hashpower * time = the height of the blockchain under my calculation, which means the real chain wins every time.
|
|
|
|
Omnivion
|
|
April 01, 2014, 07:42:45 AM Last edit: April 01, 2014, 08:01:27 AM by Omnivion |
|
What iopq is saying is the sum of every block's difficulty, which should be very close to the total amount of work done in all but the short term. (Basically the discrete version of an integral; not sure if there's a mathematical name for it or if you'd just call it a sum.)
That seems like a reasonable idea, although it may make it easier to create very small forks. (You could mine one block more slowly than the true chain, which causes your difficulty to go up; then you only need to catch up to the true chain rather than surpass it to gain preference - something along those lines. You could probably fix this by requiring that the minimum difficulty of at least the last 4 blocks or so be greater in order to override an equal/longer chain.)
|
Blockchain for Apps | Blockchain for Business | Blockchain for Future
|
|
|
YarkoL
Legendary
Offline
Activity: 996
Merit: 1013
|
|
April 01, 2014, 07:44:01 AM |
|
What would be the potential drawbacks / side effects of this fix?
I'm wondering if there are any cases where a "legit" block may have an earlier timestamp than the latest block time - for instance, because of different clock settings or daylight saving time adjustments. Have there been a lot of rejected blocks since the new fix?
|
“God does not play dice"
|
|
|
thaaanos
|
|
April 01, 2014, 09:08:47 AM |
|
Actually, mikey, you might want to ask Nite69, the developer brought on by the Auroracoin team to help prepare the KGW TW fix. Nite69 realizes there is a KGW TW underway, and by extension so does the Auroracoin team, even though they won't and can't admit it.
About the vulnerability BCX found: I'm quite sure this will fix it, like any time warp vulnerability. Shame on us, we should have fixed this earlier.
Code:
+    int64 LatestBlockTime = BlockLastSolved->GetBlockTime();
     for (unsigned int i = 1; BlockReading && BlockReading->nHeight > 0; i++) {
         if (PastBlocksMax > 0 && i > PastBlocksMax) { break; }
         PastBlocksMass++;
@@ -894,8 +895,10 @@ unsigned int static GravityWell(const CBlockIndex* pindexLast, const CBlock *pbl
         if (i == 1) { PastDifficultyAverage.SetCompact(BlockReading->nBits); }
         else { PastDifficultyAverage = ((CBigNum().SetCompact(BlockReading->nBits) - PastDifficultyAveragePrev) / i) + PastDifficultyAveragePrev; }
         PastDifficultyAveragePrev = PastDifficultyAverage;
-
-        PastRateActualSeconds = BlockLastSolved->GetBlockTime() - BlockReading->GetBlockTime();
+        if (LatestBlockTime < BlockReading->GetBlockTime()) {
+            LatestBlockTime = BlockReading->GetBlockTime();
+        }
+        PastRateActualSeconds = LatestBlockTime - BlockReading->GetBlockTime();
         PastRateTargetSeconds = TargetBlocksSpacingSeconds * PastBlocksMass;
         PastRateAdjustmentRatio = double(1);
         if (PastRateActualSeconds < 0) { PastRateActualSeconds = 0; }
Just a quick look: int64 LatestBlockTime - shouldn't this be unsigned? Do you store negative values in BlockTime? I wonder if an attacker can start in the past and move his way backwards in time until he appears in the future.
|
|
|
|
iopq
|
|
April 01, 2014, 03:11:58 PM |
|
What iopq is saying is the sum of every block's difficulty, which should be very close to the total amount of work done in all but the short term. (Basically the discrete version of an integral; not sure if there's a mathematical name for it or if you'd just call it a sum.)
That seems like a reasonable idea, although it may make it easier to create very small forks. (You could mine one block more slowly than the true chain, which causes your difficulty to go up; then you only need to catch up to the true chain rather than surpass it to gain preference - something along those lines. You could probably fix this by requiring that the minimum difficulty of at least the last 4 blocks or so be greater in order to override an equal/longer chain.)
Well, then you'd need to mine two blocks to orphan one, and one of them at a higher difficulty. But any counter-measures would depend on the exact difficulty-adjustment algorithm and whether the difficulty adjusts every block or every x blocks. Someone more knowledgeable than me could probably tell me why my idea is bad.
|
|
|
|
markm
Legendary
Offline
Activity: 2996
Merit: 1121
|
|
April 01, 2014, 03:27:33 PM |
|
The work is at least equal to the difficulty (well, actually, reciprocally equal; congruent, so to speak).
Adding up difficulty instead of work would keep the lucky hashes that happen to be way the heck better than their target difficulty requires from counting as any better than the worst possible hash that suffices to meet the target. If the attacker is getting way the heck more hashes into the blockchain (way the heck more blocks into the blockchain), then not counting any work in excess of target on each block should statistically tend to lose the attacker more excess work than the defender. But if the attacker is using lower-difficulty blocks, then still more of the attacker's hashes will tend to meet the target, so the attacker will have more of their hashes count.
Maybe the attacker's advantage would be less, but I suspect the attacker would still have the advantage.
-MarkM-
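A toy illustration of the "lucky hash" point, using a 64-bit stand-in for the 256-bit hash space and made-up values: a hash far below its target is only ever credited with the work its target implies, because the chain records the target, not how lucky the hash actually was.
Code:
#include <cstdint>
#include <cstdio>

int main() {
    // Toy 64-bit analogue of the 256-bit hash space; the values are made up.
    uint64_t target    = 1ULL << 40;  // what the block was required to beat
    uint64_t luckyHash = 12345;       // far better than required

    double creditedWork = (double)UINT64_MAX / ((double)target + 1.0);
    double apparentWork = (double)UINT64_MAX / ((double)luckyHash + 1.0);

    // The excess (apparentWork - creditedWork) is never recorded anywhere.
    std::printf("credited: %.3g hashes, apparent luck: %.3g hashes\n",
                creditedWork, apparentWork);
    return 0;
}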
|
|
|
|
sudukkk
|
|
April 01, 2014, 03:31:27 PM |
|
Is this a good time to buy some AUR? The price volatility is very large; maybe it is a chance.
|
|
|
|
xxzbeh
Newbie
Offline
Activity: 2
Merit: 0
|
|
April 01, 2014, 03:36:57 PM |
|
It's bound to be a shit coin.
|
|
|
|
markm
Legendary
Offline
Activity: 2996
Merit: 1121
|
|
April 01, 2014, 03:37:37 PM |
|
If the attack works, the price might go down. How is their hashing power looking now? Does it seem like enough yet to counter such an attack? If not, likely they'll be available cheap once the attack takes effect - if exchanges re-list them after some kind of damage control or recovery plan is carried out after the attack.
Realistically they ought not be listed on exchanges at all until they fix their hashing power, as being listed increases the number of people who will be hurt and the incentive for attackers to bother attacking.
-MarkM-
|
|
|
|
kalus
Sr. Member
Offline
Activity: 420
Merit: 263
let's make a deal.
|
|
April 01, 2014, 03:52:46 PM |
|
Auroracoin is at ~5 GH/s - still very vulnerable for a coin of this size and value. http://bitinfocharts.com/comparison/hashrate-aur.html This is up from the ~1.2 GH/s when block 5400 rolled around.
|
DC2ngEGbd1ZUKyj8aSzrP1W5TXs5WmPuiR wow need noms
|
|
|
markm
Legendary
Offline
Activity: 2996
Merit: 1121
|
|
April 01, 2014, 03:58:22 PM |
|
It's a lot more than the ~1 GH/s the (known) attacker is supposedly using, though. Can ~1 GH/s beat ~5 GH/s using a timewarp? I guess we will find out.
Agreed though that it is pathetically low. We know that just a stupid meme can conjure up 100+ gigahashes almost overnight, so any less is just asking for a "PWN the blockchain(s)!" meme to pop up and PWN it...
-MarkM-
|
|
|
|
Nite69
|
|
April 01, 2014, 04:14:40 PM |
|
Just a quick look: int64 LatestBlockTime - shouldn't this be unsigned? Do you store negative values in BlockTime?
I wonder if an attacker can start in the past and move his way backwards in time until he appears in the future.
In principle, you are right. In practice, it does not matter. The origin of LatestBlockTime is the block timestamp, which is an unsigned 32-bit integer. Of course it would make sense to use unsigned integers, but BlockLastSolved->GetBlockTime() also returns a signed 64-bit integer. I don't know why; maybe using signed made some calculations easier, and since the origin is 32-bit there is no problem.
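Roughly how that relationship looks in the Bitcoin-derived code the diff above is based on - a sketch with simplified names, not the exact Auroracoin declarations:
Code:
#include <cstdint>

// Sketch of the point above: the header stores the timestamp as an unsigned
// 32-bit field, while the accessor widens it to a signed 64-bit value, so the
// result is always non-negative in practice.
struct BlockHeaderSketch {
    uint32_t nTime;                  // block timestamp (Unix time, 32-bit)
    int64_t GetBlockTime() const {
        return (int64_t)nTime;       // widened to int64 before any arithmetic
    }
};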
|
Sync: ShiSKnx4W6zrp69YEFQyWk5TkpnfKLA8wx Bitcoin: 17gNvfoD2FDqTfESUxNEmTukGbGVAiJhXp Litecoin: LhbDew4s9wbV8xeNkrdFcLK5u78APSGLrR AuroraCoin: AXVoGgYtSVkPv96JLL7CiwcyVvPxXHXRK9
|
|
|
jnada
|
|
April 01, 2014, 04:29:44 PM |
|
@Nite69
I sent you a PM about your fix.
~BCX~
Wow, seems like there is a hint of cooperation at the end. That's nice.
|
NEM NEM NEM launch on 22nd November 2014 ! TAQDPC-BT35MC-YAWTKQ-OFEZOS-CMBKHM-F5Q3L6-YEYL
|
|
|
marcotheminer
Legendary
Offline
Activity: 2072
Merit: 1049
┴puoʎǝq ʞool┴
|
|
April 01, 2014, 04:32:03 PM |
|
Seems like Aurora is still going strong?
|
|
|
|
thaaanos
|
|
April 01, 2014, 06:32:27 PM |
|
Just a quick look: int64 LatestBlockTime - shouldn't this be unsigned? Do you store negative values in BlockTime?
I wonder if an attacker can start in the past and move his way backwards in time until he appears in the future.
In principle, you are right. In practice, it does not matter. The origin of LatestBlockTime is the block timestamp, which is an unsigned 32-bit integer. Of course it would make sense to use unsigned integers, but BlockLastSolved->GetBlockTime() also returns a signed 64-bit integer. I don't know why; maybe using signed made some calculations easier, and since the origin is 32-bit there is no problem.
I assume the timestamp is Unix time? So what happens after 06:28:15 UTC on Sunday, 7 February 2106? You will probably have fixed it by then, but let's say an attacker fast-forwards to that day and then wraps around past 1970 back to the current day - are you safe?
|
|
|
|
|