Tobo


November 16, 2015, 12:40:31 PM 

If quantum computers become widespread, what impact will they have on Bitcoin and other existing cryptocurrencies?






patmast3r


November 16, 2015, 12:48:08 PM 

If quantum computers become widespread, what impact will they have on Bitcoin and other existing cryptocurrencies?
Pretty much all current public-key cryptography is done for.





ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 02:06:06 PM 

Pretty much all current public-key cryptography is done for.
PoW blockchain mining too. PoS/PoI/DPoS blockchains are unaffected.




Tobo


November 16, 2015, 02:25:14 PM 

Pretty much all current public-key cryptography is done for.
PoW blockchain mining too. PoS/PoI/DPoS blockchains are unaffected.
How difficult would it be for Bitcoin and Ethereum to switch their current algorithms to quantum-resistant cryptographic algorithms?




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 02:26:22 PM 

How difficult would it be for Bitcoin and Ethereum to switch their current algorithms to quantum-resistant cryptographic algorithms?
Ethereum is migrating to PoS. Bitcoin will be dead.




patmast3r


November 16, 2015, 02:30:19 PM 

Pretty much all current public-key cryptography is done for.
PoW blockchain mining too. PoS/PoI/DPoS blockchains are unaffected.
I guess those would be unaffected, but those projects would still need to switch from ECDSA to something else that is "quantum secure" for all their signing, right?




Tobo


November 16, 2015, 02:30:39 PM 

How difficult would it be for Bitcoin and Ethereum to switch their current algorithms to quantum-resistant cryptographic algorithms?
Ethereum is migrating to PoS. Bitcoin will be dead.
But Ethereum still needs to address its public-key attack issues. Is it easy for Ethereum or PoS chains to switch to new algorithms to protect the public keys?




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 02:33:23 PM 

I guess those would be unaffected, but those projects would still need to switch from ECDSA to something else that is "quantum secure" for all their signing, right?
Don't know about the others, but Nxt's Account Control feature includes QC-resistance.




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 02:36:20 PM 

Is it easy for Ethereum or PoS chains to switch to new algorithms to protect the public keys?
Existing QC-proof signing schemes require pretty big signatures. In Iota, for example, ~433 bytes are occupied by the signature; that's exactly 50% of the transaction size.
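The size overhead of hash-based (quantum-resistant) signatures is easy to see with a toy example. Below is a minimal Lamport one-time signature sketch in Python; it is illustrative only, not Iota's actual Winternitz-based scheme, and all names are invented for the sketch. A 256-bit message digest forces the signature to reveal 256 secrets of 32 bytes each, i.e. 8192 bytes, versus roughly 72 bytes for a DER-encoded ECDSA signature.

```python
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random 32-byte secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(digest):
    # Digest as a list of 256 bits, most significant bit first.
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg, sk):
    # Reveal one secret per digest bit; each secret is used at most once.
    return [sk[i][b] for i, b in enumerate(bits(H(msg)))]

def verify(msg, sig, pk):
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(H(msg)))))

sk, pk = keygen()
sig = sign(b"tx", sk)
print(verify(b"tx", sig, pk))           # True
print(sum(len(s) for s in sig))         # 8192 bytes of signature
```

Real schemes (Winternitz, XMSS) trade extra hashing for shorter signatures, which is how Iota gets down to ~433 bytes, but the signatures remain far larger than ECDSA's.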




tromp


November 16, 2015, 02:39:29 PM 

Pretty much all current public-key cryptography is done for.
PoW blockchain mining too.
Hashcash PoW (like Bitcoin's and most altcoins') is amenable to Grover search, which can search a space of n nonces in time O(sqrt(n)). But Hashcash with large memory requirements will likely not be affected as long as scaling quantum computers up to millions of qubits remains elusive. Non-Hashcash PoWs like Cuckoo Cycle are even less affected, as they are immune to Grover search.




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 02:49:54 PM 

Hashcash PoW (like Bitcoin's and most altcoins') is amenable to Grover search, which can search a space of n nonces in time O(sqrt(n)).
But Hashcash with large memory requirements will likely not be affected as long as scaling quantum computers up to millions of qubits remains elusive.
Non-Hashcash PoWs like Cuckoo Cycle are even less affected, as they are immune to Grover search.
"Improvements" like Scrypt make nodes more vulnerable to spam. Also, a quantum computer doesn't need to evaluate the whole hash value; it can check the first bits and, with high probability, throw away nonces that don't fit. BTW, why is Cuckoo Cycle less affected? Birthday-paradox problems are solved with N^(1/3) effort vs. N^(1/2).




tromp


November 16, 2015, 03:39:36 PM Last edit: November 16, 2015, 03:50:56 PM by tromp 

Hashcash PoW (like Bitcoin's and most altcoins') is amenable to Grover search, which can search a space of n nonces in time O(sqrt(n)).
But Hashcash with large memory requirements will likely not be affected as long as scaling quantum computers up to millions of qubits remains elusive.
Non-Hashcash PoWs like Cuckoo Cycle are even less affected, as they are immune to Grover search.
Also, a quantum computer doesn't need to evaluate the whole hash value; it can check the first bits and, with high probability, throw away nonces that don't fit.
Computing a single bit of a hash is almost as much effort as computing the whole hash; you might be saving a percent or two at most.
BTW, why is Cuckoo Cycle less affected? Birthday-paradox problems are solved with N^(1/3) effort vs. N^(1/2).
Because, unlike birthday collision problems, Cuckoo Cycle is a more structured search problem: you must find a 42-cycle in an arbitrary graph. There are no known quantum speedups for such graph problems.




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 03:51:03 PM 

Computing a single bit of a hash is almost as much effort as computing the whole hash; you might be saving a percent or two at most.
Could you provide a proof of this statement? http://jheusser.github.io/2013/02/03/satcoin.html claims the opposite.




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 04:28:53 PM 

But Hashcash with large memory requirements will likely not be affected as long as scaling quantum computers up to millions of qubits remains elusive.
I didn't find information on the time-memory trade-off of quantum computers, but if we assume it is no worse than that of classical computers, then increasing the hash function's memory requirement can be counteracted by running the computation longer. So Hashcash with large memory won't save us.




tromp


November 16, 2015, 04:33:59 PM 

Computing a single bit of a hash is almost as much effort as computing the whole hash; you might be saving a percent or two at most.
Could you provide a proof of this statement?
This follows directly from how SHA-256 is defined. It is many rounds of confusion and diffusion, so that each single bit in one round depends on pretty much all bits of the previous round.
That is a long document to read. Where exactly does it claim that?
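The diffusion claim can be checked empirically: changing even a few input bits flips roughly half of SHA-256's output bits (the avalanche effect), so no single output bit depends on only a small part of the input. A quick sketch illustrating the effect (not a proof of the per-bit cost claim):

```python
import hashlib

def sha_bits(data):
    # SHA-256 digest expanded into a list of 256 individual bits.
    d = hashlib.sha256(data).digest()
    return [(byte >> i) & 1 for byte in d for i in range(8)]

a = sha_bits(b"block header")
b = sha_bits(b"clock header")   # one character (a few input bits) changed
flipped = sum(x != y for x, y in zip(a, b))
print(flipped, "of 256 output bits differ")  # close to half of them
```

This is why computing "just the first bit" of the digest still requires running essentially all of the compression function's rounds.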




tromp


November 16, 2015, 04:37:39 PM 

But Hashcash with large memory requirements will likely not be affected as long as scaling quantum computers up to millions of qubits remains elusive.
I didn't find information on the time-memory trade-off of quantum computers, but if we assume it is no worse than that of classical computers, then increasing the hash function's memory requirement can be counteracted by running the computation longer. So Hashcash with large memory won't save us.
Of course, I was talking about hash functions that don't allow for time-memory trade-offs.




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 05:21:04 PM 

Of course, I was talking about hash functions that don't allow for time-memory trade-offs.
Give me the name of one such function, please. The trade-off is a pretty universal thing; the best a function can do is to keep time*memory*advice constant, if I'm not mistaken.




ComefromBeyond
Legendary
Offline
Activity: 2128
Merit: 1009


November 16, 2015, 05:27:47 PM 

so that each single bit in one round depends on pretty much all bits of the previous round.
This means that after some number of rounds SHA-256 doesn't give better mixing, hence it's possible to take a shortcut by finding a polynomial with fewer operators.
That is a long document to read. Where exactly does it claim that?
I introduced a novel algorithm to solve the bitcoin mining problem without using (explicit) brute force. Instead, the nonce search is encoded as a decision problem and solved by a SAT solver in such a way that a satisfiable instance contains a valid nonce. The key ingredients in the algorithm are a non-deterministic nonce and the ability to take advantage of the known structure of a valid hash using assume statements.
A couple of benchmarks demonstrated that, already with simple parameter tuning, dramatic speed-ups can be achieved. Additionally, I explored the contentious claim that the algorithm might get more efficient with increasing bitcoin difficulty. Initial tests showed that block 218430, with considerably higher difficulty, is solved more efficiently than the genesis block 0 for a given nonce range.
This means that on average the computation of a single bit takes less time than the computation of the whole hash.




tromp


November 16, 2015, 06:25:11 PM 

Of course, I was talking about hash functions that don't allow for time-memory trade-offs.
Give me the name of one such function, please. The trade-off is a pretty universal thing; the best a function can do is to keep time*memory*advice constant, if I'm not mistaken.
You are quite mistaken. This is a recognized weakness in scrypt's design. Here's one: Argon2, winner of the Password Hashing Competition. Most of the PHC candidates qualify, since time-memory trade-off resistance was one of the design goals.
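For a concrete look at the memory parameter such functions expose, Python's standard library happens to wrap scrypt (Argon2 itself is not in the standard library, and, as noted above, scrypt's time-memory trade-off resistance is weaker than Argon2's; this only illustrates the cost-parameter idea):

```python
import hashlib

# scrypt's cost parameter n sets the size of its internal memory array:
# it needs roughly 128 * r * n bytes, i.e. 16 MiB with these settings,
# so an attacker must pay in memory as well as time.
digest = hashlib.scrypt(b"password", salt=b"salt", n=2**14, r=8, p=1, dklen=32)
print(digest.hex())
```

Raising `n` scales the memory footprint directly, which is what makes massively parallel (or quantum) evaluation of many candidates at once expensive.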




