monsterer
Legendary
Offline
Activity: 1008
Merit: 1007
January 10, 2016, 05:41:23 PM
Quote from: TPTB_need_war
> So let's say the union of both chains is the rule. The problem becomes how many chains we allow in the union, because it is unbounded. If we place a bound on it, the adversary can Sybil attack the bound.

It has to remain unbounded for it to have any chance of working. In this case, such an attack would bring consensus to a standstill - but would it be a sustainable attack? If you imagine the attacker generating new PoW transactions continually to create the bigger union, this is going to get quite expensive for him quite quickly... whereas the minority do not need to expend as much energy, since they are just bundling transactions, not fabricating them.
TPTB_need_war (OP)
January 10, 2016, 06:14:04 PM
Quote from: TPTB_need_war
> So let's say the union of both chains is the rule. The problem becomes how many chains we allow in the union, because it is unbounded. If we place a bound on it, the adversary can Sybil attack the bound.

Quote from: monsterer
> It has to remain unbounded for it to have any chance of working. In this case, such an attack would bring consensus to a standstill - but would it be a sustainable attack? If you imagine the attacker generating new PoW transactions continually to create the bigger union, this is going to get quite expensive for him quite quickly... whereas the minority do not need to expend as much energy, since they are just bundling transactions, not fabricating them.

Remember I had mentioned in my vaporcoin thread last month that this was my original idea to defeat a 51% attack by including all chains. I mentioned when I started this thread that I had found a fundamental flaw in that design. Now I remember what it is.

The attacker can put a double-spend on every block of the minority chain to invalidate it relative to the longest chain (which also contains the double-spend, but confirmed after the minority chain has confirmed it). This is an example of how elusive the flaws can be. You really have to dig.

I think what I did next was contemplate still taking the union (conjunction) of all non-double-spent transactions from all chains. I forget at the moment what the flaw with that was. I should have written it down.

Edit: now I remember. The attacker simply waits until there are derivative transactions, so no one can trust that any transaction is truly confirmed on the minority chain.
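The flaw described above can be illustrated with a toy model (all names here are hypothetical; this is a sketch of the "conjunction of non-double-spent transactions" rule, not any real client's logic). If the rule drops every transaction whose input is also spent by a conflicting transaction, an attacker who mirrors a double-spend onto the minority chain wipes the victim's transaction out of the union:

```python
from collections import defaultdict

def conflict_free_union(chains):
    """Toy 'conjunction' rule: keep a transaction only if no other
    transaction anywhere in the union spends the same input."""
    spenders = defaultdict(set)  # input id -> set of txids spending it
    for chain in chains:
        for txid, spent_input in chain:
            spenders[spent_input].add(txid)
    return {txid
            for chain in chains
            for txid, spent_input in chain
            if len(spenders[spent_input]) == 1}

# The majority (attacker) chain mirrors a spend of "utxo2" that the
# minority chain also confirms, so both copies conflict and are dropped.
majority = [("tx_a", "utxo1"), ("dbl_spend", "utxo2")]
minority = [("tx_b", "utxo3"), ("victim_tx", "utxo2")]

union = conflict_free_union([majority, minority])
# "victim_tx" is excluded, so anything derived from it on the minority
# chain can no longer be trusted as confirmed -- the flaw noted above.
```

This matches the "derivative transactions" problem: the attacker only has to create the conflict; the rule itself then invalidates the honest spend and everything chained off it.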
monsterer
Legendary
Offline
Activity: 1008
Merit: 1007
January 10, 2016, 06:26:55 PM
Quote from: TPTB_need_war
> Remember I had mentioned in my vaporcoin thread last month that this was my original idea to defeat a 51% attack by including all chains. I mentioned when I started this thread that I had found a fundamental flaw in that design. Now I remember what it is.
>
> The attacker can put a double-spend on every block of the minority chain to invalidate it. This is an example of how elusive the flaws can be. You really have to dig.

You need a way to make the double-spend only affect the recipient of the transaction, and not subsequent transactions which are chained off the double-spend. In block-based PoW, a single double-spend amongst many other transactions doesn't invalidate the entire block?
Come-from-Beyond
Legendary
Offline
Activity: 2142
Merit: 1010
Newbie
January 10, 2016, 06:27:29 PM
Quote:
> Byzantine tolerance cannot exceed 50% of failures.

Isn't it 33%?
TPTB_need_war (OP)
January 10, 2016, 06:38:43 PM
Need for censorship-resistant block chain technology applies not only to currency:

1. So the decentralized database is a block chain?
2. Does the browser need to run Java applets to do wallet-level operations?
3. What is the advantage for the user of decentralized block chain storage for their tweets? The data is open to display/access by anyone and thus isn't owned by Twitter, so tweets can be displayed by any client, not just through an API authorized by Twitter. Talk about how this creates advantages that users care about. It sounds like it will be used by terrorists, so the government then has an incentive to shut it down.
4. How can this remain decentralized if the mining becomes centralized? The same centralization problems that plague crypto currency thus hang over the head of any current block chain design.
5. Why should we think a product that is breaking SEC regulation by selling shares has any long-term future? The government can make an example out of you with SEC action.
TPTB_need_war (OP)
January 10, 2016, 06:45:20 PM
Quote from: TPTB_need_war
> Remember I had mentioned in my vaporcoin thread last month that this was my original idea to defeat a 51% attack by including all chains. I mentioned when I started this thread that I had found a fundamental flaw in that design. Now I remember what it is.
>
> The attacker can put a double-spend on every block of the minority chain to invalidate it. This is an example of how elusive the flaws can be. You really have to dig.

Quote from: monsterer
> You need a way to make the double-spend only affect the recipient of the transaction, and not subsequent transactions which are chained off the double-spend. In block-based PoW, a single double-spend amongst many other transactions doesn't invalidate the entire block?

The LCR ensures no double-spend exists. There is no fix for the idea of taking the conjunction of chains. The idea is flawed; that is why I abandoned it. Also note I added to my prior post.
Come-from-Beyond
Legendary
Offline
Activity: 2142
Merit: 1010
Newbie
January 10, 2016, 06:50:12 PM
Quote from: TPTB_need_war
> The LCR ensures no double-spend exists.

Once quantum computers hit the market, everyone will be able to rewrite the complete Bitcoin blockchain from the genesis. So I would add something to emphasize the temporal nature of your statement.
TPTB_need_war (OP)
January 10, 2016, 06:54:54 PM
Quote from: TPTB_need_war
> The LCR ensures no double-spend exists.

Quote from: Come-from-Beyond
> Once quantum computers hit the market, everyone will be able to rewrite the complete Bitcoin blockchain from the genesis. So I would add something to emphasize the temporal nature of your statement.

Quantum computing does not significantly speed up PoW because it is hash based (thus Shor's algorithm doesn't apply). So you must mean that ECDSA signatures could be cracked and double-spent? But as with a 51% attack, that is not a threat vector, because no one will honor it. Politically it is an impossible attack. And still there is no way to recompute all that PoW and arrive with a longer chain, so no attack exists. Besides, in my current design I use Winternitz signatures, which are not subject to Shor's algorithm.
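The reason hash-based signatures like Winternitz escape Shor's algorithm is that their security rests on preimage resistance rather than on discrete logarithms. A minimal sketch of the simpler Lamport one-time scheme (Winternitz is an optimization of this same idea; toy code, not the actual scheme of the design discussed here):

```python
import hashlib
import os

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    # One random secret pair per message-digest bit; the public key
    # is simply the hashes of those secrets.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(msg, sk):
    # Reveal one secret per digest bit -- hence strictly one-time use.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(msg, sig, pk):
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(msg)))
```

Forging a signature requires inverting SHA-256, against which a quantum attacker only gets Grover's square-root speedup, not Shor's exponential one.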
Come-from-Beyond
Legendary
Offline
Activity: 2142
Merit: 1010
Newbie
January 10, 2016, 07:11:52 PM
Quote from: TPTB_need_war
> Quantum computing does not significantly speed up PoW because it is hash based (thus Shor's algorithm doesn't apply). So you must mean that ECDSA signatures could be cracked and double-spent?

From https://en.wikipedia.org/wiki/Grover%27s_algorithm: "Grover's algorithm is a quantum algorithm that finds with high probability the unique input to a black box function that produces a particular output value, using just O(N^(1/2)) evaluations of the function, where N is the size of the function's domain."
TPTB_need_war (OP)
January 10, 2016, 07:12:35 PM
The intended pool-destroying feature of that coin does not work. As the white paper admits, pools can detect cheating statistically and ban miners. That is the Share Withholding Attack[1], which already exists in Bitcoin, yet pools are able to function. Besides, pools are absolutely necessary, else miners can't pay for their equipment because they may win a block only every so many years. By eliminating pools (if you could, but you can't), you force mining onto mining farms!

I don't have time to waste analyzing all these amateur copycoins. Thanks for trying, and I don't fault you as a layman for not being able to discern bullshit from substance.
monsterer
Legendary
Offline
Activity: 1008
Merit: 1007
January 10, 2016, 07:13:05 PM
Quote from: TPTB_need_war
> The LCR ensures no double-spend exists. There is no fix for the idea of taking the conjunction of chains. The idea is flawed; that is why I abandoned it. Also note I added to my prior post.

That's why you'd need a new chain selection rule... what happens if you were to use the greatest set of all new transactions as the rule? The greatest set of all new transactions would have the largest cumulative PoW weight (in a system with a constant PoW cost per transaction), such that any double-spend by the attacker within this set would become the canonical spend, not the double-spend.
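The proposed rule can be sketched in a few lines (hypothetical names; a toy under the stated assumption of constant PoW cost per transaction, so cumulative weight is proportional to set size):

```python
def canonical_set(candidate_sets, pow_per_tx=1):
    """Toy version of the proposed rule: select the transaction set
    with the greatest cumulative PoW weight.  With a constant PoW cost
    per transaction, that is simply the largest set."""
    return max(candidate_sets, key=lambda txs: len(txs) * pow_per_tx)

# The honest network bundles more transactions (more cumulative PoW)
# than the attacker can fabricate, so its copy of the spend wins.
honest = {"tx1", "tx2", "tx3", "spend_A"}
attacker = {"tx4", "double_spend_A"}

winner = canonical_set([honest, attacker])
# The honest set is selected, making "spend_A" canonical.
```

Whether an attacker can cheaply inflate his set's weight (the Sybil concern raised earlier in the thread) is exactly what this rule would have to answer.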
TPTB_need_war (OP)
January 10, 2016, 07:14:42 PM Last edit: January 10, 2016, 07:27:21 PM by TPTB_need_war
Quote from: TPTB_need_war
> Quantum computing does not significantly speed up PoW because it is hash based (thus Shor's algorithm doesn't apply). So you must mean that ECDSA signatures could be cracked and double-spent?

Quote from: Come-from-Beyond
> From https://en.wikipedia.org/wiki/Grover%27s_algorithm: "Grover's algorithm is a quantum algorithm that finds with high probability the unique input to a black box function that produces a particular output value, using just O(N^(1/2)) evaluations of the function, where N is the size of the function's domain."

That is still EXPTIME. And that doesn't speed up PoW; rather, it speeds up finding a preimage.
Fuserleer
Legendary
Offline
Activity: 1064
Merit: 1020
January 10, 2016, 07:17:13 PM
Quote:
> Byzantine tolerance cannot exceed 50% of failures.

Quote from: Come-from-Beyond
> Isn't it 33%?

You can achieve 50% in some circumstances by making sacrifices in the system.
TPTB_need_war (OP)
January 10, 2016, 07:21:41 PM
Quote:
> Byzantine tolerance cannot exceed 50% of failures.

Quote from: Come-from-Beyond
> Isn't it 33%?

Quote from: Fuserleer
> You can achieve 50% in some circumstances by making sacrifices in the system.

If you have the inclination, I suspect readers would appreciate a layman's example.
Come-from-Beyond
Legendary
Offline
Activity: 2142
Merit: 1010
Newbie
January 10, 2016, 07:26:13 PM
Quote from: TPTB_need_war
> That is still EXPTIME.

A single quantum computer working at the speed of an average notebook would do mining like 21 billion notebooks. How is it EXPTIME?
TPTB_need_war (OP)
January 10, 2016, 07:28:53 PM
Quote from: TPTB_need_war
> That is still EXPTIME.

Quote from: Come-from-Beyond
> A single quantum computer working at the speed of an average notebook would do mining like 21 billion notebooks. How is it EXPTIME?

Please show me your calculation. Also:

Quote from: TPTB_need_war
> And that doesn't speed up PoW; rather, it speeds up finding a preimage.
Come-from-Beyond
Legendary
Offline
Activity: 2142
Merit: 1010
Newbie
January 10, 2016, 07:28:57 PM
Quote from: Fuserleer
> You can achieve 50% in some circumstances by making sacrifices in the system.

The Selfish Mining paper showed that 33% is the best that can be achieved. The same number is given in http://research.microsoft.com/en-us/um/people/lamport/pubs/byz.pdf. What are these circumstances you are talking about? A sandbox with everyone following your rules?
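The classical bound from the Lamport/Shostak/Pease paper linked above is that Byzantine agreement requires n >= 3f + 1 nodes to tolerate f faults, i.e. strictly fewer than one third faulty. A quick numeric check (illustrative helper function, not from any library):

```python
def max_byzantine_faults(n):
    """Classical BFT bound: agreement requires n >= 3f + 1, so at most
    floor((n - 1) / 3) Byzantine nodes can be tolerated."""
    return (n - 1) // 3

# 4 nodes tolerate 1 fault, 100 nodes tolerate 33 -- always under a third.
for n in (4, 10, 100):
    f = max_byzantine_faults(n)
    assert 3 * f + 1 <= n and f < n / 3
```

The 50% figure debated in this thread applies to a different model (e.g. longest-chain PoW against hashrate majorities), which is why the two numbers keep talking past each other here.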
Come-from-Beyond
Legendary
Offline
Activity: 2142
Merit: 1010
Newbie
January 10, 2016, 07:38:10 PM
Quote from: TPTB_need_war
> Please show me your calculation.

Current Bitcoin difficulty = 103880340815. A difficulty of 1 requires 2^32 hashes, so the number of hashes for the current difficulty = 103880340815 * 2^32 = 446162666497758986240. A quantum computer needs only SQRT(446162666497758986240) = 21,122,562,971.8 hashes.
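The arithmetic above checks out (using the difficulty figure as quoted; "hashes" here means Grover oracle evaluations):

```python
import math

difficulty = 103880340815            # Bitcoin difficulty quoted above
classical_hashes = difficulty * 2**32       # expected hashes per block classically
grover_evals = math.isqrt(classical_hashes)  # Grover needs ~sqrt(N) evaluations

# classical_hashes == 446162666497758986240
# grover_evals is about 21.1 billion, matching the figure above,
# for a speedup factor of roughly 21 billion.
speedup = classical_hashes // grover_evals
```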
TPTB_need_war (OP)
January 10, 2016, 07:53:46 PM Last edit: January 10, 2016, 08:06:40 PM by TPTB_need_war
I was stating that there is an exponential speedup, but the solution of a preimage is still EXPTIME (complexity), which means that with, say, 256-bit hashes, finding a preimage of a Lamport/Winternitz spend signature with Grover's algorithm is still intractable.

Your SE link above makes me aware that Grover's can also be applied to finding PoW solutions with a relative exponential speedup, so in that context you are correct that PoW solutions are tractable with the speedup factor you provided. Thanks.

However, I want to make you aware that quantum computers are unlikely to speed up anything[1] on a cost/performance basis. Thus the threat appears to be very remote. The footnote below explains that when we can afford to build parallel memory tables to speed up operations as much as Grover's using classical computers, then we can afford to speed up using Grover's. Thus quantum computing may not really provide any breakthrough computational advantage.

Your section 4.3 makes the point that the lower the difficulty, the lower the threat. Interesting. Thanks.

Edit: I think I perhaps know how to incorporate this into my design...

[1] http://cr.yp.to/hash/collisioncost-20090823.pdf
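The distinction drawn above can be made concrete: Grover's square-root speedup is equivalent to halving the hash's security level in bits, which is why a 256-bit preimage stays intractable even though a low-difficulty PoW target does not. A back-of-envelope check:

```python
import math

hash_bits = 256
classical_preimage = 2 ** hash_bits               # brute-force preimage search
grover_preimage = math.isqrt(classical_preimage)  # Grover's sqrt speedup

# Grover halves the effective security level: 256-bit -> 128-bit.
# 2^128 evaluations remains far beyond any conceivable hardware budget,
# so the signature preimage stays intractable; a PoW puzzle, whose
# target is deliberately easy, does not enjoy that margin.
assert grover_preimage == 2 ** (hash_bits // 2)
```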