tromp
Legendary
Offline
Activity: 1032
Merit: 1179
October 05, 2023, 02:24:43 PM Last edit: October 05, 2023, 02:53:33 PM by tromp
Proof-of-Work is completely dependent on a secure hash algorithm.
It's not. The Hashcash [1] Proof-of-Work system is; there are other PoW schemes not based on hashing [2].

Miners would be affected because the current Proof of Work (PoW) algorithm in Bitcoin relies heavily on SHA-256 for mining. Quantum computers could potentially break the cryptographic primitives underpinning SHA-256, which would render the current mining hardware and strategies obsolete.

While you ponder quantum attacks on SHA256, which are considered extremely unlikely, you overlook the fact that Bitcoin's PoW algorithm, namely Hashcash [1], is itself known to be vulnerable to quantum attack, independent of the choice of hash function in Hashcash (SHA256D in Bitcoin). Using Grover's algorithm [3] for a quadratic speedup, a quantum computer can find a hash pre-image with 2k leading 0s in (very) roughly the same amount of time that a classical computer needs to find one with only k leading 0s.

[1] https://en.wikipedia.org/wiki/Hashcash
[2] http://cryptorials.io/beyond-hashcash-proof-work-theres-mining-hashing/
[3] https://en.wikipedia.org/wiki/Grover%27s_algorithm
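To put rough numbers on that quadratic speedup, here is a small back-of-envelope sketch in Python (illustrative only: constants are approximate and the mining model is simplified to a pure pre-image search):

```python
import math

def classical_expected_hashes(k_zero_bits: int) -> float:
    # Classical mining is exhaustive search: on average ~2^k hash
    # evaluations to find a pre-image with k leading zero bits.
    return 2.0 ** k_zero_bits

def grover_expected_queries(k_zero_bits: int) -> float:
    # Grover's algorithm needs ~(pi/4) * sqrt(2^k) oracle queries when
    # each random guess succeeds with probability 2^-k.
    return (math.pi / 4) * math.sqrt(2.0 ** k_zero_bits)

# A quantum miner targeting 2k leading zeros does roughly the same
# number of queries as a classical miner targeting k leading zeros:
k = 32
print(classical_expected_hashes(k))    # ~4.3e9 hashes for k zeros
print(grover_expected_queries(2 * k))  # ~3.4e9 queries for 2k zeros
```

In other words, per unit of "work" a Grover-based miner meets twice as many leading zero bits, which is exactly the vulnerability described above, regardless of which hash function Hashcash is instantiated with.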
WatChe
October 07, 2023, 05:09:32 PM
This is always the case. Why? Because all algorithms are based on unsolved math problems, for example the elliptic curve discrete logarithm problem (ECDLP). As long as it remains unsolved, we can use elliptic curves the same way we do today. But once someone finds a mathematical solution, you need to find another hard problem and build a new system around it. For that reason, humans should never know the answer to every problem, because then we could no longer build any new crypto-based system.
Quantum computing is not a new thing; quantum algorithms like Shor's algorithm [1], which solves the discrete logarithm problem and integer factorization in polynomial time, date back to 1994. RSA is based on integer factorization, while Diffie-Hellman key exchange is based on the discrete logarithm problem. Quantum computing targets exactly the hard problems on which these security protocols stand. Once we have quantum computers of 4,000 qubits, things will get tough for current security protocols.

[1] https://www.geeksforgeeks.org/shors-factorization-algorithm/
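For readers unfamiliar with why factoring falls to Shor's algorithm: the only quantum part is finding the multiplicative order of a number; the rest is classical number theory. Here is a toy classical simulation of that reduction (toy-sized inputs only; a real quantum computer replaces `find_order` with the quantum Fourier transform step):

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    # Classically find the multiplicative order r of a mod n, i.e. the
    # smallest r with a^r = 1 (mod n). This is the step Shor's algorithm
    # speeds up exponentially on a quantum computer.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_demo(n: int, a: int):
    # Shor's reduction: from an even order r of a mod n,
    # gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n) yield factors of n.
    assert gcd(a, n) == 1
    r = find_order(a, n)
    if r % 2 != 0:
        raise ValueError("odd order, pick another base a")
    y = pow(a, r // 2, n)
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_demo(15, 7))  # (3, 5)
```

The classical order-finding loop above takes exponential time in the bit length of n, which is precisely why RSA is safe today and why a large fault-tolerant quantum computer would change that.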
j2002ba2
October 07, 2023, 07:43:04 PM Merited by JayJuanGee (1)
Once we have quantum computers of 4,000 qubits, things will get tough for current security protocols.
You are off by several orders of magnitude. If they somehow make quantum error correction work, then it's more like 15,000 × 4,000 = 60M physical qubits. For 256-bit ECDLP the lowest logical qubit count is around 2,330, giving about 35M physical qubits. There is a bigger problem: one also needs about 126G Toffoli gates, and the algorithm has to perform about 116G time steps. If a time step took 1 ps, there might even be a correct result! At 1 ns we are looking at 116 seconds of runtime, enough for decoherence. AFAIK right now the time step is several hundred nanoseconds, which means several hours of runtime. No result possible. Wait a moment! Error-correcting a Toffoli gate needs at least 15 additional logical qubits, i.e. 225K physical qubits per Toffoli gate. All together that is 28.35 × 10^15 qubits. Even if the above is off by orders of magnitude, for now, all quantum hope is lost.
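The arithmetic behind those figures is easy to reproduce. A quick back-of-envelope check in Python, assuming (as the post does) roughly 15,000 physical qubits per error-corrected logical qubit:

```python
# Back-of-envelope check of the resource estimates above, assuming
# ~15,000 physical qubits per logical qubit under error correction.
PHYS_PER_LOGICAL = 15_000

logical_qubits = 2_330                      # lowest estimate for 256-bit ECDLP
print(logical_qubits * PHYS_PER_LOGICAL)    # ~35M physical qubits

time_steps = 116e9                          # ~116G sequential time steps
print(time_steps * 1e-9)                    # 116 s at a 1 ns time step
print(time_steps * 300e-9 / 3600)           # ~9.7 h at a 300 ns time step

toffoli_gates = 126e9
qubits_per_toffoli = 15 * PHYS_PER_LOGICAL  # 15 extra logical qubits each
print(toffoli_gates * qubits_per_toffoli)   # ~2.8e16 qubits if done naively
```

The last figure assumes every Toffoli gate gets its own dedicated correction qubits at once; real proposals reuse distillation factories, but the point about the gap between today's devices and these requirements stands either way.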
WatChe
October 08, 2023, 04:15:59 PM
Even if the above is off by orders of magnitude, for now, all quantum hope is lost.
The same thing was said about every new technology, including Bitcoin. "There is no reason for any individual to have a computer in his home." (Ken Olsen, founder of Digital Equipment Corporation, 1977)
The first 2-qubit quantum computer was demonstrated in 1998, and last year IBM rolled out its 400-qubit-plus quantum processor and the next-generation IBM Quantum System Two (IBM). The pace may be slow, but quantum computing is a reality. Moreover, the US president signed the Quantum Computing Cybersecurity Preparedness Act in the final days of 2022.
bkelly13
Member

Offline
Activity: 76
Merit: 35
October 27, 2023, 10:35:51 PM
...
- First, they'll try to attack old P2PK transactions, as they provide the public key. Satoshi's coins are the prime example of that. We will thus slowly see Satoshi's money moving (be it because Satoshi himself moves them with P2[W]PKH/P2TR txes, or because the quantum hacker moves them). An attacker will need years for that step alone, so they'll be focusing on coins that are unlikely to be moved.
How do "we" know which coins are Satoshi's?
digaran
Copper Member
Hero Member
   
Offline
Activity: 1330
Merit: 905
🖤😏
October 27, 2023, 10:52:00 PM
How do "we" know which coins are Satoshi's?
"We" don't know exactly, but there is some speculation that he mined the first 20,000 blocks, which remain untouched to this day.
vjudeu
Copper Member
Legendary
Offline
Activity: 909
Merit: 2363
October 28, 2023, 01:15:08 AM
How do "we" know which coins are Satoshi's?

We don't. If you explore coinbase transactions from the past, you can notice a field called "extraNonce". Because it is not reset but incremented, you can look at those numbers and conclude that if one block has extraNonce equal to 1035 and some later block has extraNonce equal to 1039, then both blocks were probably mined by the same miner.

http://satoshiblocks.info/

See? Those blue lines collect all such cases. You can also see some similar green lines, which can show you which coins may be owned by another single miner. However, none of that is proof that Satoshi is the person behind it. The only strong implication is that if you can identify such a line, you can guess that all blocks on that line were mined by a single miner. This is similar to checking which mining pool mined which block: it is just something you can infer from coinbase transactions, a guess rather than 100% proof. Because, guess what: you can run a solo miner and put a "Mined by AntPool" string inside. If you then release such a solo-mined block with your own address in the coinbase output, people would see it and think "so it was mined by AntPool, right?". Maybe. Or maybe not. We don't know; we can only guess.

he mined the first 20,000 blocks

Not exactly. People think he mined the blocks on those blue lines. If you think he mined every single block, you are wrong: there are many green dots showing that many blocks were mined by other people. Also, because the slope of some green lines differs, people concluded that those miners had different hashrates. You can re-mine some old, CPU-mined blocks to confirm the exact algorithm used to mine them.
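The extraNonce heuristic described above can be sketched in a few lines. This is a deliberately simplified illustration with made-up data, not the actual analysis behind satoshiblocks.info: it greedily groups blocks whose extraNonce values form a slowly increasing sequence, as a single miner that never resets its counter would produce.

```python
def link_extranonce_chains(blocks, max_gap=10):
    # Greedy sketch of the heuristic: each chain is a candidate "single
    # miner" whose extraNonce only ever creeps upward by a small amount.
    # `blocks` is a list of (height, extra_nonce) sorted by height.
    chains = []
    for height, en in blocks:
        for chain in chains:
            last_en = chain[-1][1]
            if last_en < en <= last_en + max_gap:
                chain.append((height, en))
                break
        else:
            chains.append([(height, en)])  # start a new candidate miner
    return chains

# Hypothetical data: two interleaved miners, visible as two trends.
blocks = [(1, 1035), (2, 17), (3, 1039), (4, 19), (5, 1041)]
chains = link_extranonce_chains(blocks)
print(len(chains))  # 2 -- two candidate miners
```

As the post stresses, the output is only a clustering guess: nothing stops a miner from faking or resetting extraNonce values, just as nothing stops a solo miner from writing "Mined by AntPool" in a coinbase.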
I've moved on to other things.
smartcomet
December 11, 2024, 02:12:07 AM Merited by JayJuanGee (1)
Google announces Willow, a quantum chip. "The first is that Willow can reduce errors exponentially as we scale up using more qubits. This cracks a key challenge in quantum error correction that the field has pursued for almost 30 years. Second, Willow performed a standard benchmark computation in under five minutes that would take one of today's fastest supercomputers 10 septillion (that is, 10^25) years, a number that vastly exceeds the age of the Universe."
https://scottaaronson.blog/?p=8329&continueFlag=86a666619f5897003da1fae21f589db6
Quantum Computing: Between Hope and Hype

https://twitter.com/adam3us/status/1866480523800932364
"the primary use of implementing winternitz signatures (PQ signatures) in bitcoin for now would be to knock out the quantum FUD traders! i can't see PQ being of relevance this decade, or probably more decades. "this time it's different" cool, we await with interest your results!"

https://eprint.iacr.org/2011/191.pdf
Winternitz signatures
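Since Winternitz signatures come up as the post-quantum candidate here, a toy version shows why they are quantum-safe: security rests only on hash pre-image resistance, not on ECDLP. This is a drastically simplified one-time signature over a single message byte (w = 16); the linked 2011 paper signs full digests and adds many safeguards this sketch omits.

```python
import hashlib, os

W = 16  # Winternitz parameter: each chain encodes one base-16 digit

def H(x, n):
    # Iterate SHA-256 n times over x.
    for _ in range(n):
        x = hashlib.sha256(x).digest()
    return x

def digits(msg_byte):
    d = [msg_byte >> 4, msg_byte & 0xF]   # two base-16 message digits
    csum = sum(W - 1 - x for x in d)      # checksum stops forgery by
    return d + [csum >> 4, csum & 0xF]    # advancing a chain further

def keygen():
    sk = [os.urandom(32) for _ in range(4)]
    pk = [H(s, W - 1) for s in sk]        # walk each chain to its end
    return sk, pk

def sign(sk, msg_byte):
    return [H(s, d) for s, d in zip(sk, digits(msg_byte))]

def verify(pk, msg_byte, sig):
    # Finish each hash chain; it must land exactly on the public key.
    return all(H(s, W - 1 - d) == p
               for s, d, p in zip(sig, digits(msg_byte), pk))

sk, pk = keygen()
sig = sign(sk, 0xAB)
print(verify(pk, 0xAB, sig))   # True
print(verify(pk, 0xAC, sig))   # False
```

One-time means exactly that: revealing a second signature under the same key leaks intermediate chain values, which is why real deployments wrap WOTS in Merkle-tree constructions like XMSS.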
Saint-loup
Legendary
Offline
Activity: 3206
Merit: 2522
December 11, 2024, 10:28:08 PM Last edit: December 11, 2024, 11:18:33 PM by Saint-loup
In addition, we can see in this article from Google Quantum AI that this quantum computer, named Willow, uses only 105 qubits. According to this academic article published in 2022, that is roughly a million times below the number of qubits needed to break a Bitcoin public key, and we are not even talking about an address derived by hashing a public key with SHA-256 and RIPEMD-160 and then encoding it with Base58Check. Those figures only concern addresses already used to send funds, because their public key is visible on the blockchain.

"Finally, we calculate the number of physical qubits required to break the 256-bit elliptic curve encryption of keys in the Bitcoin network within the small available time frame in which it would actually pose a threat to do so. It would require 317 × 10^6 physical qubits to break the encryption within one hour using the surface code, a code cycle time of 1 μs, a reaction time of 10 μs, and a physical gate error of 10^-3. To instead break the encryption within one day, it would require 13 × 10^6 physical qubits. [...] This large physical qubit requirement implies that the Bitcoin network will be secure from quantum computing attacks for many years (potentially over a decade)."

https://doi.org/10.1116/5.0073075
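Dividing the quoted requirements by Willow's qubit count makes the gap concrete (simple arithmetic on the figures quoted above, nothing more):

```python
# Scale comparison using the figures quoted above.
willow_qubits = 105
one_hour_attack = 317e6  # physical qubits to break 256-bit ECC in 1 hour
one_day_attack = 13e6    # physical qubits to break it within one day

print(one_hour_attack / willow_qubits)  # ~3.0e6 -- millions of times short
print(one_day_attack / willow_qubits)   # ~1.2e5 -- still five orders short
```

And that ratio counts raw qubit numbers only; it says nothing about the gate counts, cycle times, and error rates the paper also requires.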
NotFuzzyWarm
Legendary
Offline
Activity: 4354
Merit: 3421
Evil beware: We have waffles!
December 12, 2024, 12:27:43 AM Last edit: December 13, 2024, 05:12:47 AM by NotFuzzyWarm
Merit given for that link: 'The impact of hardware specifications on reaching quantum advantage in the fault tolerant regime'. That is the first paper on QC I've seen that emphasized that all efforts to date have been research test beds built to test ideas on how quantum circuits will/do operate - they are NOT functional 'quantum computers' capable of doing anything other than that one specific series of tests. Now if only mainstream and social media would realize that and quit making it sound like QCs are just around the corner and coming soon to a BestBuy near you... Per the paper: "However, the targeted problems solved were theoretical in nature, and not relevant to industrial applications."

In short the progression has been:
a. The first 'quantum computers' (per how the media covered each announced breakthrough) were built to see if a single quantum gate (QG) could actually be made.
b. Once a single QG was made, the next ones were built to find out what it actually does and how it can be manipulated. The QGs officially became known as 'qubits'.
c. Next came seeing whether multiple qubits could be made on a single chip and connected to each other.
d. After several iterations of 'c' it was found that the data error rate was a huge stumbling block, and there things sat for over 10 years. The good part is that during that time, evolution of 'c' went from only 4 qubits on a chip to the current number available on a test system (IBM's Quantum has 127 qubits). Enough qubits are now available to start building and testing the logic circuits needed for operations - adders, multipliers, NAND and NOR operators, etc. - but the quantum data error rate remained a huge problem.
e. Current level of development: Google's Sycamore and Willow chips finally cracked the error rate issue.
f. Next comes addressing other problems such as quantum state stability and lifetime, and how to make bigger arrays of qubits. Both are still at a very early research stage.
g. Once all that is resolved, only then will the first real QC be able to be built.

That is where we now stand - at point 'e': testing the bits and pieces of what will one day become a true quantum computer capable of working on actual real-world computational problems.
mcdouglasx
December 12, 2024, 07:49:36 PM
Given the almost exponential rate of technological evolution, vulnerabilities might surface sooner than we anticipate. I don't think these algorithms will take hundreds of years to become weak, but we still have some time to prepare. However, we're in an era where research is stagnating—either experts have too much money and focus on other things, or they don't have enough and investigating these matters becomes unappreciated work. This should be taken very seriously.
The community needs to be proactive to avoid a "Titanic effect" and not underestimate the risks out of arrogance or lack of appreciation for experts.