161  Bitcoin / Development & Technical Discussion / Re: VanitySearch (Yet another address prefix finder) on: September 07, 2020, 10:48:18 PM

Do you have any references for the work required for the best multi-target discrete log attacks?


AFAIK there are no multi-target ECDLP algorithms.

Since the private key is not restricted, only the usual Pollard Rho is applicable: select a single public key from the desired range and do roughly 2^128 group operations.
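
For scale, a rough back-of-the-envelope of that figure (a sketch; the ~sqrt(n) cost of generic Rho is the standard estimate, not something quoted from this thread):

Code:
# Generic Pollard Rho needs on the order of sqrt(n) group operations,
# where n is the secp256k1 group order (~2^256), hence roughly 2^128.
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
print(n.bit_length() / 2)   # ~128, i.e. about 2^128 group operations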


I'm posting in this thread because it's obviously most related to vanity search; in particular, it's a vanity search for the address prefix "BC1QLLLLLLLLLLLLLLLLLLLLLLLLL".


Taproot addresses should start with "BC1P".

162  Bitcoin / Development & Technical Discussion / Re: Vanity addresses are unsafe on: August 20, 2020, 07:04:45 PM
As a seed can generate billions of addresses [1], it would be as efficient as a normal vanity generator.

That's not nearly the case. A normal vanity generation is something like:

Step 1: Pick a random private key `p`
Step 2: Compute the elliptic curve operation `P = p*G`
Step 3: Let V = to_bitcoin_address(P)
Step 4: If V matches our vanity filter, we are done. And the private key is p.
Step 5: Compute p = p+1
Step 6: Compute the elliptic curve operation P = P + G
Step 7: GO TO step 3.


This is insanely more efficient than using an HD seed, where every iteration requires a lot of pointless, slow and complex steps (hashes, EC multiplications). You can also resume the process easily by replacing Step 1 with the last `p` you want to continue from. In short: using an HD seed makes absolutely zero sense. I can't think of a single upside.

With non-hardened keys it's almost the same:

Step 1: generate a seed s, derive (kpar, Kpar, Cpar) using known derivation path m/a'/b'/c'/0, set i32=0 (the index)
Step 2: P = Kpar + HMAC-SHA512(Cpar, Kpar || i32).L * G
Step 3: Let V = to_bitcoin_address(P)
Step 4: If V matches our vanity filter, we are done. And the private key is (kpar + HMAC-SHA512(Cpar, Kpar || i32).L) mod n, with seed s and path m/a'/b'/c'/0/i32
Step 5: Compute i32 = i32+1
Step 6: If i32 >= 2^31 GO TO step 1
Step 7: GO TO step 2.

The main slowdown is elliptic point multiplication in Step 2. HMAC-SHA512 is relatively fast.

My guess is that it's about 100 times slower than normal vanity generation. With some memory tradeoffs, a 30x or even a 10x slowdown might be achievable.
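
For illustration, a minimal Python sketch of the non-hardened loop above. It assumes the third-party `ecdsa` package; `to_bitcoin_address` and `matches` are hypothetical stand-ins for the address encoder and the vanity filter, not functions from any particular tool:

Code:
import hmac, hashlib
from ecdsa import SECP256k1

G = SECP256k1.generator
n = SECP256k1.order

def ser256(x):                       # 32-byte big-endian integer
    return x.to_bytes(32, "big")

def serP(P):                         # 33-byte compressed public key
    return bytes([2 + (P.y() & 1)]) + ser256(P.x())

def vanity_scan(k_par, c_par, matches, max_index=2**31):
    """k_par, c_par: parent private key and chain code at m/a'/b'/c'/0.
    matches(addr) -> bool is the vanity filter."""
    K_par = G * k_par
    for i32 in range(max_index):
        I = hmac.new(c_par, serP(K_par) + i32.to_bytes(4, "big"),
                     hashlib.sha512).digest()
        IL = int.from_bytes(I[:32], "big")       # the ".L" term in Step 2
        if IL >= n:                              # invalid per BIP32, skip index
            continue
        P = K_par + G * IL                       # child public key (Step 2)
        addr = to_bitcoin_address(P)             # Step 3 (hypothetical encoder)
        if matches(addr):                        # Step 4
            return (k_par + IL) % n, i32         # child private key and index
    return None                                  # exhausted 2^31 indexes (Step 6)

As noted above, the per-index HMAC and point multiplication are what make this much slower than simply incrementing p and adding G.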
163  Bitcoin / Development & Technical Discussion / Re: basic probability on: August 04, 2020, 01:47:46 PM
No. The zeroes are independent. The probability of having 5 (hexadecimal) leading zeroes is 16^-5.
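
For completeness, the arithmetic:

Code:
# Each hex digit is an independent uniform nibble, so 5 leading zeroes
# occur with probability (1/16)^5 = 16^-5.
p = (1 / 16) ** 5     # ≈ 9.54e-07, about 1 in 1,048,576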
164  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: August 01, 2020, 05:00:59 AM
Hello,

Regarding https://bitcointalk.org/index.php?topic=5265788.0, would it be possible to modify kangaroos to work on sparse range?
I would like to launch a search on the range [a0, ..., ai, ..., aN], where ai = a0 + i*s, with stride 's'.
As I understand it, the program currently 'scales down' the range to 0->aN', with corresponding changes to the pubkey, right?

Regards,

To get a continuous range you could divide ai by 's' (i.e. multiply the public key by inv(s) mod n): ai/s = a0/s + i.
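
A minimal sketch of that reduction, assuming the `ecdsa` package and a public key Q = a*G with a = a0 + i*s for some unknown 0 <= i <= N:

Code:
from ecdsa import SECP256k1

G = SECP256k1.generator
order = SECP256k1.order

def rescale_for_kangaroo(Q, a0, s, N):
    """Map the strided problem onto a plain interval for a kangaroo solver."""
    inv_s = pow(s, -1, order)            # modular inverse of the stride
    Q_prime = Q * inv_s                  # Q' = (a/s)*G = (a0/s + i)*G
    start = (a0 * inv_s) % order         # interval start a0/s mod n
    return Q_prime, start, start + N     # solve Q' over [start, start+N];
                                         # then a = (found_key * s) % order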
165  Bitcoin / Development & Technical Discussion / Re: Do invalid keys in BIP44 path matter? on: July 21, 2020, 08:08:24 AM
A 256-bit number bigger than n modulo n is at most only 128 bits.
It actually is at most 136 bits (0x014551231950b75fc4402da1732fc9bebe).

This number is 129 bits.

I got the rest wrong though.

The change appeared in the 23 March 2013 revision https://en.bitcoin.it/w/index.php?title=BIP_0032&type=revision&diff=36331&oldid=36142
166  Bitcoin / Development & Technical Discussion / Re: Do invalid keys in BIP44 path matter? on: July 20, 2020, 08:37:06 PM
Follow up question: Why was the decision made that if the left 256 bits are greater than n, to declare the key invalid instead of just using modulo n and proceeding as normal?

When calculating the child key, you add the left 256 bits to the parent key, and use modulo n to ensure a valid key is obtained. Why not just use modulo n at the previous step to ensure the left 256 bits are also valid?

A 256-bit number bigger than n modulo n is at most only 128 bits. If one knows the xpub, then k-par is known, and it is possible to recover the private key. The Summit supercomputer can find it in a week.
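
For reference, a minimal sketch of the rule under discussion, following BIP32's non-hardened CKDpriv (names are informal; ser_K_par is the 33-byte compressed parent public key):

Code:
import hmac, hashlib

n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def ckd_priv(k_par, c_par, ser_K_par, i):
    """Non-hardened child key derivation; None means 'invalid, use the next i'."""
    I = hmac.new(c_par, ser_K_par + i.to_bytes(4, "big"), hashlib.sha512).digest()
    IL, c_i = int.from_bytes(I[:32], "big"), I[32:]
    if IL >= n:                 # the case being discussed: rejected, not reduced mod n
        return None
    k_i = (IL + k_par) % n      # here the sum IS reduced mod n
    if k_i == 0:
        return None
    return k_i, c_i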
167  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: July 20, 2020, 08:44:01 AM

So...how do use a kangaroo without pub key?  What all of you looking for without it?


The Pollard Kangaroo algorithm needs as input a public key and an interval for the private key.

So... you don't use kangaroo without a pub key.

One can get a public key from a spent transaction, or from an early P2PK output.
168  Bitcoin / Development & Technical Discussion / Re: I don't believe Quantum Computing will ever threaten Bitcoin on: July 19, 2020, 10:13:23 AM
QC scaling as 2^n is a common misconception. As n grows, the system scales worse and worse. At a certain point, for n < 50, the noise dominates

No, it's not a misconception: the number of potential classical outcomes that a QC can assess does scale as 2^n; it's inherently true based on how QCs work.
A classical bit can be 0 or 1, either/or. A qubit, because of quantum superposition, is in a sense partially both values, a probability smear across the two, until it is measured, when it resolves to a definite classical 0 or 1 outcome. In a system with multiple entangled qubits, the number of values covered increases as 2^n. Two entangled qubits cover 2^2=4 possibilities, 00, 01, 10, 11. Three entangled qubits cover 2^3=8, 000, 001, 010, 011, 100, 101, 110, 111. And so on.

Having said that, I absolutely understand and agree with your main point that the number of qubits isn't everything; it's merely a headline figure, which can be misleading. 2^n means nothing if there is a high rate of error in the final result. Decoherence - the loss/corruption of information - is the fundamental obstacle to achieving large-scale functioning quantum computers. Adding and entangling additional qubits is not what is stopping QCs today; it is, as you say, the increased noise as the number of qubits increases. But it doesn't change the 2^n scaling that makes QCs so efficient at, for example, integer factorisation and the discrete logarithm problem.

The scaling is an inherent truth due to immutable physical laws. The noise is an engineering problem.

 

Let me doubt.

A qubit is not simultaneously 0 and 1; it is probably 0 or 1, and eventually - when measured - certainly 0 or 1. That's why 2^n is wrong. A system of n qubits is not simultaneously in 2^n states; it is probably in one of them. Altering it via constraints reduces the probability that the system is in a certain state. But the system was already in a certain state, and could transition to the more favorable one forced by the constraints. So it has to travel to the correct state. But if there's a reasonable algorithm for this, no quantum stuff is needed.

Or let's say that somehow a system of n qubits is in most of the 2^n states simultaneously. Then constraints are placed, and some of the states become "forbidden". The favorable states would have lower energy. Either there's a transition to lower energy, which has to be released, or an influx of energy to make up for the unfavorable states. At the end we finally arrive at the correct single state. All this energy has to go somewhere. All 2^n bits of energy. Would the solar system survive such an energy blast? Would the Milky Way?

There's a lot of wishful thinking in quantum physics. It is belief-based, and attracts all kinds of believers. Something is fishy. It looks a lot like the geocentric solar system in the Middle Ages: circles upon circles, very complex stuff, but never correct (except for a few isolated cases). So I would say it is fundamentally wrong, and obviously so. After all, quantum physics is statistics, and statistics is never reality. Sometimes, many times, very useful, but still wrong.
169  Bitcoin / Development & Technical Discussion / Re: I don't believe Quantum Computing will ever threaten Bitcoin on: July 17, 2020, 03:08:23 PM
Why you would need a quantum computer to attack Bitcoin, I don't understand at all. Even the old asymmetric cryptography on elliptic curves, with a 4-fold increase in key length, will remain beyond the reach of the known algorithms for quantum computers.
Because the power of a QC scales exponentially due to superposition and entanglement. Superposition meaning that a qubit can be - to simplify somewhat - both 0 and 1 at the same time. Entanglement meaning that multiple qubits can be combined into a single state. So the number of classical outcomes that can be assessed scales as 2^n. The nature of QCs means that they are strong on integer factorisation and the discrete logarithm problem (both normal and ECC). Shor's algorithm can dismantle current asymmetric cryptography.

QC scaling as 2^n is a common misconception. As n grows, the system scales worse and worse. At a certain point, for n < 50, the noise dominates and no signal is left. For example, last year Google claimed "quantum supremacy". It was supremacy in generating noise.

170  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: July 17, 2020, 08:14:42 AM
Sorry, it is inconvenient to share information in public. But I can tell you that this paper wallet is missing the first 8 characters and 5 characters in the middle, which cannot be recognized. If you know of such software, can you recommend it?

If no more information is available, then you'd need 2^70 operations. While searching for it, there would be about 2^38 false positives. Currently it's unfeasible.

Every bit of additional information would halve the time.
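
The arithmetic behind those figures, as a rough sketch (the 2^70 estimate for the 13 unknown Base58 characters is the one from this post; the 4-byte Base58Check checksum is what gives the 2^38):

Code:
search_space    = 2 ** 70                         # candidates for 13 unknown characters
checksum_bits   = 32                              # Base58Check appends a 4-byte checksum
false_positives = search_space >> checksum_bits   # ≈ 2^38 candidates pass the checksum
with_one_more_known_bit = search_space >> 1       # each extra known bit halves the work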

Here are some questions that might help you:
How exactly are the characters missing?
Are they totally erased?
Are there some parts of them visible?
Are there parts which are known to be empty, and certain characters wouldn't fit there?
Was it handwriting or printed?
If it was printed - was it with monospace font?
Is there a QR code (or part of it) available as well?
Was it password-protected (BIP38, starting with "6P")?

171  Bitcoin / Development & Technical Discussion / Using secp256k1 endomorphism on: July 03, 2020, 01:53:22 PM
@Jean_Luc Your code for EEA seems incorrect. Here are the correct values:
Code:
s70=FCE9B1DD4EB7DD2718A2906787061B2
t70=-3086D221A7D46BCDE86C90E49284EB15
r70=114CA50F7A8E2F3F657C1108D9D44CFD8   (>=sqrt(n))

s71=-4A5AC2BF7B5F37F9F1DB10D7A2A9C981
t71=E4437ED6010E88286F547FA90ABFE4C3
r71=3086D221A7D46BCDE86C90E49284EB15  (<sqrt(n))

s72=1839468DB3DC795B42AD17D3CA5C15137
t72=-4A5D84C4FAD1D149815130F31C84462E4
r72=2228364F61BCD8F0CDA23C16C0AC386F

Code:
(a1, b1) = (r71, -t71) = (3086D221A7D46BCDE86C90E49284EB15, -E4437ED6010E88286F547FA90ABFE4C3)
(a2, b2) = (r70, -t70) = (114CA50F7A8E2F3F657C1108D9D44CFD8, 3086D221A7D46BCDE86C90E49284EB15)
r70^2 + t70^2 = 13477B4472B4233ECA232A74B45B8E7F37640C02AF64BF4EEAEE3ABED3D0695F9
r72^2 + t72^2 = 159EC1A05B158DF471EAE76C8FA0CDA199E5599D41EC1E5915FBD403A750EC1B31
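
A sketch of the extended Euclidean computation those rows come from, run on (n, lambda). The lambda constant below is the standard secp256k1 endomorphism scalar, included as an assumption of the sketch rather than quoted from this thread:

Code:
from math import isqrt

n   = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
lam = 0x5363AD4CC05C30E0A5261C028812645A122E22EA20816678DF02967C1B23BD72

def eea_rows():
    """Yield extended-Euclidean rows (r, s, t) with r = s*n + t*lam, r decreasing."""
    r0, s0, t0 = n, 1, 0
    r1, s1, t1 = lam, 0, 1
    while r1:
        yield r1, s1, t1
        q = r0 // r1
        r0, s0, t0, r1, s1, t1 = r1, s1, t1, r0 - q * r1, s0 - q * s1, t0 - q * t1

rows = list(eea_rows())
i = next(k for k, (r, _, _) in enumerate(rows) if r < isqrt(n))
for r, s, t in rows[i - 1 : i + 2]:          # the rows around sqrt(n), i.e. 70-72 above
    print(hex(r), hex(s), hex(t))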
172  Bitcoin / Development & Technical Discussion / Re: Luck when selecting Secp256k1 parameters on: July 02, 2020, 07:26:47 PM
When a = 0, for every prime p >= 5 with p = 2 mod 3, the number of points is n = p+1, which is never prime.

When p = 1 mod 3, there are 6 possible point counts.
There are (p-1)/6 values of b in the interval (0, p) which produce the same n.
Let p = 2^256 - s, s = 2^32 + t.
t = 977 is the first value that gives a prime p and a curve with a prime number of points n.
The smallest positive value of b giving a prime n is b = 7.
In the other five possibilities we have a non-prime number of points; 3 of the curves have torsion 2, 3, and 14.

Something interesting: the number of points of y^2 = x^3 + 7 mod n is equal to p, which might have been an additional requirement.
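
A small brute-force check of the first claim on toy primes (a sketch; naive counting, fine only for tiny p):

Code:
def count_points(p, b):
    """Count the points of y^2 = x^3 + b over F_p, including infinity."""
    y2_counts = {}
    for y in range(p):
        y2_counts[y * y % p] = y2_counts.get(y * y % p, 0) + 1
    return 1 + sum(y2_counts.get((x * x * x + b) % p, 0) for x in range(p))

for p in (5, 11, 17, 23):                    # all primes with p = 2 mod 3
    assert all(count_points(p, b) == p + 1 for b in range(1, p))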
173  Bitcoin / Bitcoin Discussion / Re: HONEYWELL Quantum Computer: what does it mean to BTC on: June 29, 2020, 11:04:28 AM

It isn't just physical vibrations, it's any interaction with the world outside the quantum system. The system needs to be perfectly isolated - or as near perfect as can be managed. Any interaction can lead to loss of information through the collapse of the wave function. Electromagnetic fields or any form of radiation can trigger loss of quantum coherence. This is why quantum computers have to be cooled close to absolute zero.


At the level of isolation needed to find private keys from public keys, even a neutrino would disturb the system. Gravitational waves from someone taking a step would wipe it out. Snapping fingers, moving eyes, breathing, etc. in a faraway place would be enough.

Something more: if the QC needs any kind of switching - following a program instead of being a hard-wired single-purpose system - any switching inside it would be enough to destroy the state.

174  Bitcoin / Development & Technical Discussion / Re: Bitcoin, cryptos and the imminent threat of a Quantum Computer on: June 28, 2020, 03:12:04 PM
The processing power of quantum computers may increase more rapidly relative to classical computers, but it does not in terms of qubits gained. Quantum computers still need to increase their processing power by a factor of 20-30 before things get interesting.
20 to 30x is only an extra 4 or 5 qubits, though (2^4=16, 2^5=32).

No, I mean literally in terms of qubits. Google's latest quantum chip is at 72 qubits (up from 54 qubits last year). The estimates I found on how many qubits are required to break ECDSA are conflicting, but most sources place it at 1500-3000 qubits.


I suppose what I'm trying to say is that it's very difficult to estimate when a QC that is capable of cracking bitcoin might become available, and that we can't use the development history of classical computers as a guideline.  The challenges to building a workable, reliable large-scale QC do remain immense, but we are all aware that work is continuing at pace, and a QC threat to bitcoin may be with us sooner than we might envisage. I do think it's important that making bitcoin quantum-safe be considered as a problem to resolve now, rather than at some indefinite point in the future.

Agreed.

The lowest number of physical qubits needed to break secp256k1 ECDSA is more than 10^6 - yes, one million - and this assumes an error rate at least 10 times better than the current record. The 1-3k number is for perfect logical qubits. Additionally about 10^11 Toffoli gates are needed. All this has to work in perfect sync without errors.

But I highly doubt it would ever work. Before spitting out a solution, the system would have to simultaneously represent 2^256 states. Obviously any and every physical disturbance would destroy the state and render the computation futile. A neutrino is enough to turn this billions-of-dollars equipment useless.
175  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: June 22, 2020, 07:32:10 PM
If I am not wrong,
you solved the 115-bit puzzle in 13 days, and you reckon puzzle 120 is 32 times harder.
Does that mean you need 13 x 32 = 416 days to solve it, or?

Pollard Kangaroo time grows with the square root of the problem size.
13 x sqrt(32) ≈ 74 days

The expected time is lower, ~45 days.
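
The arithmetic, for completeness:

Code:
from math import sqrt

# Runtime grows with the square root of the interval size, so 5 extra bits
# cost a factor of sqrt(2^5) = sqrt(32), not 32.
print(13 * sqrt(2 ** (120 - 115)))    # ≈ 73.5 days, not 13 * 32 = 416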
176  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: June 17, 2020, 05:00:00 PM
This would work properly only when the private key of P is zero mod 32. P=k*G, k=0 (mod 32).

Otherwise the point is guaranteed to not be in the interval.

The probability of being in the new interval is 1/32, which is about 3%. In all other 97% the algorithm would fail to find a point in any reasonable time.

If you somehow can deterministically move points from bigger interval to smaller, then no kangaroos are needed, ECDLP is solved. In the 1/32 case - in just 8 steps (!!!).

No, it is not like you say:

if you know that P lies in [1G,100G]

then you know that 2P lies in [2G, 4G, ..., 200G]

and that kP lies in [kG, k2G, k3G, ..., k100G]

that remains true even if k is inv(2), inv(3) and so on.

You are right, my mistake. The point would lie properly in the G' interval.
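
A tiny numeric check of that point, assuming the `ecdsa` package: with G' = inv(32)*G, a point P = k*G is the same point as (32*k mod n)*G', i.e. its key with respect to G' is scaled by 32 (and the interval with it):

Code:
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order
G_prime = G * pow(32, -1, n)           # G' = inv(32)*G

k = 12345                              # some key in the original interval
P = G * k
Q = G_prime * (32 * k % n)             # the same point, key scaled by 32 w.r.t. G'
assert (P.x(), P.y()) == (Q.x(), Q.y())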
177  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: June 17, 2020, 04:24:11 PM
If you change the jumps old DPs are useless.
And I moved the point P in this new interval.

How?

I moved P in the interval C in this way: P -> P' = inv(32)*P


This would work properly only when the private key of P is zero mod 32. P=k*G, k=0 (mod 32).

Otherwise the point is guaranteed to not be in the interval.

The probability of being in the new interval is 1/32, which is about 3%. In all other 97% the algorithm would fail to find a point in any reasonable time.

If you somehow can deterministically move points from bigger interval to smaller, then no kangaroos are needed, ECDLP is solved. In the 1/32 case - in just 8 steps (!!!).
178  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: June 17, 2020, 03:26:50 PM
If you change the jumps old DPs are useless.
And I moved the point P in this new interval.

How?
179  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: June 17, 2020, 03:20:20 PM

Jumps (the points) are the same, only their private keys are different;

instead of using, for example, 2543*G as jump, you use (2543*32)*G', where G' = inv(32)*G


It sounds quite useless to me.

To get the benefits of the kangaroo algorithm, you need to use a specific mean step size.

G' makes all the steps multiples of 32, visiting only 1/32 of the points. This is an instant sqrt(32)-times slowdown.


Why are you visiting only 1/32 of the points? You are working in a wider interval, so it is normal to use larger jumps.

The optimum would be to increase the length of the jumps by sqrt(32) instead of 32, but you do have to increase it.


If you change the jumps old DPs are useless.
180  Bitcoin / Development & Technical Discussion / Re: Pollard's kangaroo ECDLP solver on: June 17, 2020, 03:10:48 PM

Jumps (the points) are the same, only their private keys are different;

instead of using, for example, 2543*G as jump, you use (2543*32)*G', where G' = inv(32)*G


It sounds quite useless to me.

To get the benefits of the kangaroo algorithm, you need to use a specific mean step size.

G' makes all the steps multiples of 32, visiting only 1/32 of the points. This is an instant sqrt(32)-times slowdown.

The other thing is transforming P.

If it stays the same or is translated, you get a 32-times bigger interval, and a sqrt(32)-times slowdown.

If you do P' = inv(32)*P, then you have a 1/32 chance that it lies in the target interval, so a 1/32 chance to solve it and a 31/32 chance that the algorithm takes forever.

If you do 32 parallel P' kangaroo searches, so that at least one fits in the target interval, you lose the sqrt advantage. Instead of the sqrt(32) slowdown of #120 compared to #115, you get a 32-times slowdown.
