Which is still essentially nothing. For classical computing you move the timescale from quadrillions of years down to only millions of years. Congratulations.
Wait, what? Did you even read beyond that point? You are quoting me partially to prove that you are right, is that what you are trying to do here? I repeat myself: today's computing power is 10^15+ FLOPS, so theoretically, if classical computing keeps advancing at the same pace it has since the 60s, we will be looking at 10^30 to 10^40 FLOPS in the next decade or two, which is enough to crack 128 bits in a few seconds, and we will move on to 10^70 FLOPS and beyond in another decade or two after that. And that is without taking anything else into consideration, which is not even remotely realistic!
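To put rough numbers on that, here is a minimal Python sketch of the arithmetic being argued over (the key-test rates are hypothetical inputs, assuming one key tried per operation, not measured figures):

SECONDS_PER_YEAR = 3.15e7

def exhaustive_search_seconds(bits, keys_per_second):
    """Worst-case seconds to try every key of the given bit length."""
    return (2 ** bits) / keys_per_second

for rate in (1e15, 1e30, 1e40):
    s = exhaustive_search_seconds(128, rate)
    print(f"{rate:.0e} keys/s -> {s:.1e} s (~{s / SECONDS_PER_YEAR:.1e} years)")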
No, we haven't; no key with 128 bit strength has been brute forced. You can't simply compare key sizes. A 256 bit ECC key has equivalent strength to a 3,072 bit RSA key and to a 128 bit symmetric key/hash. You may be talking about some individual algorithms being cryptographically broken; it is hard to tell because you are all over the place. I already pointed out that that is possible, but it has nothing to do with brute force.
You really do have some reading issues. Please reread my sentence and correct your statement; I think I was clear enough. I do understand that I make mistakes from time to time because I'm not a native English speaker, but please.
No, people like me would have been warning that 56 bits was insufficient because it was within 1000x of what computing power at the time was capable of. That is a far cry from saying that 128 bit key strength is secure because brute forcing it would require energy on a scale that makes it infeasible. If we pretend the entire Bitcoin network (30 PH/s) "could" brute force symmetric keys at the same speed, it would be able to brute force an 80 bit symmetric key in about one year. If it were 1000x more powerful, it could brute force a 96 bit symmetric key in about a century. If it were a million times more powerful, it would still take hundreds of millions of years on average to brute force a 128 bit symmetric key. To do it in a year would require a system hundreds of trillions of times more powerful.
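A quick Python sketch of those scenarios, using the same 30 PH/s figure and the (generous) assumption that one hash equals one key tried:

SECONDS_PER_YEAR = 3.15e7
BASE_RATE = 30e15  # 30 PH/s, the network figure quoted above

def years_to_exhaust(bits, keys_per_second):
    """Worst-case years to try every key; the average case is half of this."""
    return (2 ** bits) / keys_per_second / SECONDS_PER_YEAR

print(years_to_exhaust(80, BASE_RATE))         # ~1.3 years
print(years_to_exhaust(96, BASE_RATE * 1e3))   # ~84 years
print(years_to_exhaust(128, BASE_RATE * 1e6))  # ~3.6e8 years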
No you wouldn't, because people like you made the same claim in a similar scenario back in the 80s. And it wasn't a 1000x difference; I don't even know where you are getting that number from. The Cray supercomputer from the 80s had 80 MFLOPS, i.e. 80x10^6 (the Cray-2, which came at the end of the 80s, had 1.2 GFLOPS!), while today's supercomputers, for example the Tianhe-2, have a computing power of 34x10^15. Moving from 56 bit to 128 bit keys more than squares the difficulty, since (2^56)^2 = 2^112 < 2^128. Now compare this to computing power and how it has been increasing: (10^6)^2 = 10^12 <<< 10^15.
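For reference, here are the two growth factors being compared, side by side, using the FLOPS figures quoted in this exchange (a rough sketch, not a precise benchmark):

keyspace_growth = 2 ** 128 / 2 ** 56   # ~4.7e21: 56 bit DES keyspace vs a 128 bit keyspace
flops_growth = 34e15 / 80e6            # ~4.3e8: Cray (80 MFLOPS) vs Tianhe-2 (34 PFLOPS), as quoted
print(f"keyspace grew by {keyspace_growth:.1e}x, compute grew by {flops_growth:.1e}x")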
And again, you are only taking CURRENT technology into consideration. With CURRENT technology it would take almost forever to brute force even 128 bits, let alone 256, and I DO NOT DISAGREE WITH THIS. But that's not the point, as I've been explaining to you repeatedly; you are just being obstinate here.
None of those (except QC) would do anything more than switch you from a teaspoon to a bucket when trying to empty an ocean.
Wrong as proven above.
Proven doesn't mean what you think it means. Proven doesn't mean spouting out false statements, gibberish, and strawmen.
False statements? The ones you didn't prove wrong or false? The math? The FLOPS of supercomputers through the decades? Which part is false? I'm sorry, but you are losing even more credibility here.
D-Wave's system is not capable of implementing Shor's algorithm. It uses a process called quantum annealing. Quantum computing isn't some super duper magic bullet which solves all problems all the time. Quantum annealing is a pretty cool approach for solving certain types of problems like pathfinding, simulating organic processes, network optimization, etc. It is completely useless for the purpose of breaking cryptographic keys.
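To be clear about what that class of problems looks like, here is a toy classical simulated-annealing sketch in Python. It is not quantum annealing itself, just an illustration of the "find a low-energy configuration" style of optimization that annealing-style machines target, as opposed to the structured period-finding Shor's algorithm needs (the cost function is made up for illustration):

import math
import random

def energy(x):
    # A bumpy 1-D cost landscape with many local minima.
    return x * x + 10 * math.sin(3 * x)

def anneal(steps=20000, temp=10.0, cooling=0.9995):
    x = random.uniform(-10, 10)
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill moves with a temperature-dependent probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        temp *= cooling
    return x

best = anneal()
print(f"x ~ {best:.3f}, energy ~ {energy(best):.3f}")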
Progress toward building a true general purpose quantum computer capable of implementing Shor's algorithm has been very slow. 15 was factored in 2001 using Shor's algorithm on a 7 qubit QC. By 2012 that had progressed to factoring 21. One estimate of the total physical qubits (including the circuitry for error control and correction) needed to break 256 bit ECC is on the order of 40,000 qubits. We went from factoring 15 to factoring 21 in the space of a decade, and the "finish line" is tens of thousands of qubits. That requirement could be doubled by switching to a 512 bit curve. Quantum decoherence is a bitch.
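To make concrete what "factoring 15 with Shor's algorithm" means, here is a purely classical Python sketch of the same reduction. The quantum speedup lives entirely in the order-finding step, which this sketch does by brute force, so it shows the structure of the algorithm, not any quantum advantage:

import math
import random

def find_order(a, n):
    # Smallest r > 0 with a^r = 1 (mod n). This brute-force loop is the one
    # step a quantum computer would replace with period finding.
    r = 1
    while pow(a, r, n) != 1:
        r += 1
    return r

def shor_classical(n):
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g              # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            p = math.gcd(pow(a, r // 2) - 1, n)
            return p, n // p

print(shor_classical(15))  # prints (3, 5) or (5, 3)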
The problem becomes increasingly difficult as the size of the computer grows. It may not be possible to accomplish that in our lifetimes. Wake me up when someone factors a 32 bit number using quantum computing. If QC becomes a credible threat, Bitcoin can evolve to addresses which use post-quantum cryptography.
That's a whole other debate, and the only credible paper on whether, and to what extent, the D-Wave machine is a quantum computer is this one:
http://arxiv.org/abs/1401.7087
But again, D-Wave is far from the only one in the field.
As for quantum annealing, it is a LEGIT form of quantum computing, since we are talking about the quantum-mechanical superposition principle here! And when you use quantum mechanical principles and quanta to compute, isn't that what quantum computing is about?