Bitcoin Forum
 Author Topic: Large Bitcoin Collider (Collision Finders Pool)  (Read 164972 times)
Its_a_Meeeee
Newbie

Offline

Activity: 6
Merit: 0

 April 12, 2017, 11:35:12 AM

Wow!

"The probability to find an address with funds on it within the next 24h is therefore 0.00000000112258370293%."

Hey Rico, can you share the formula that calculates this number?
Is it only speed-based? Does it take the progression of the project into account? And what about reset / reissued blocks?

Sorry if I'm bringing up already discussed stuff, I'll soon read up on everything ;-)
Until then I might need the occasional question answered plox ^.^

Happy hashing all
rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 12:21:36 PM

"The probability to find an address with funds on it within the next 24h is therefore 0.00000000112258370293%."

Hey Rico, can you share the formula that calculates this number?

And earlier today (when above number was smaller) I bragged in the German forum about this number:

0.00000000000000000045% (2016-09-17)
0.00000000000000200548% (2016-09-23)
0.00000000002801022860% (2017-03-28)
0.00000000108372998990% (2017-04-17)

Quote
Is it only speed-based? Does it take the progression of the project into account? And what about reset / reissued blocks?

The formula is:

Code:
my $space           = 159 - log_bin($num_adr_funds);
my $keys_day        = $srch_speed * 86400 * 1000000;              # how many keys per day are searched
my $prob24          = 1 / (2 ** ($space - $bits));                # 24h prob: 1/(effective search space - space searched)

$prob24 *= $keys_day;                                             # multiply with keys/24h rate (seconds * keys/s)
$prob24 *= 100;                                                   # * 100 (for %)

$bits is currently 51.32
The 159 means 2^159 keys to search, as these resolve to 2^160 addresses
$srch_speed is the 24h Mkeys/s average you see, therefore * 10^6 * seconds in a day
So this number gets bigger over time (because we have covered more search space) and/or if the pool gets faster.
If the pool gets slower, this number may also become smaller. (search speed = 0 -> probability = 0)
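For readers who want to poke at the numbers, here is a direct Python transcription of the Perl snippet above, a minimal sketch where all input values are assumptions taken from the figures quoted in this thread:

```python
from math import log2

# Hypothetical inputs mirroring the thread's figures (all values are assumptions):
num_adr_funds = 14_900_000       # ~14.9M addresses with funds
srch_speed    = 2_200            # pool 24h average in Mkeys/s
bits          = 51.32            # log2 of the keys already searched

space    = 159 - log2(num_adr_funds)        # effective space in bits (~135.17)
keys_day = srch_speed * 86_400 * 1_000_000  # keys searched per day
prob24   = keys_day / 2 ** (space - bits)   # the snippet's 24h probability
prob24  *= 100                              # expressed as a percentage, ~1e-9 %
```

With these inputs the result lands near the 0.00000000112258370293% figure quoted at the top of the page.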

Reissued blocks are not relevant, because the "space searched" takes only PoW blocks into account.
Promised and undelivered blocks do not count as searched space.

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

 April 12, 2017, 12:38:29 PM

Code:
my $prob24          = 1 / (2 ** ($space - $bits));                # 24h prob: 1/(effective search space - space searched)

effective search space - space searched = 2 ** ($space - $bits)  ??

2^135    -            2^51        =       2^84 ??
rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 12:52:13 PM

Code:
my $prob24          = 1 / (2 ** ($space - $bits));                # 24h prob: 1/(effective search space - space searched)

effective search space - space searched = 2 ** ($space - $bits)  ??

2^135    -            2^51        =       2^84 ??

Oh...

bug introduced by inertness (from the above division ... 159 - log_bin($num_adr_funds))

I guess we'll add some more zeros now.

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

 April 12, 2017, 01:00:47 PM

Oh...

bug introduced by inertness (from the above division ... 159 - log_bin($num_adr_funds))

I guess we'll add some more zeros now.

I'm afraid there are 26 zeros...

2^159 private keys
14.9M --> 23.8 bits
--> 2^135.2

24 hours --> 16.4 bits
2.2 Gkeys/s --> 31 bits

(31 + 16.4) - 135.2  -->  -87.8 --> 0.0000000000000000000000000037116

I didn't count the 51 bits of the space searched so far, but I think it is more or less the same ...
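The back-of-envelope exponent arithmetic above can be checked with a few lines of Python (inputs are the thread's numbers):

```python
from math import log2

keys_day_bits = log2(2.2e9) + log2(86400)  # ~31 + ~16.4 bits of keys per day
space_bits    = 159 - log2(14.9e6)         # ~135.2-bit effective search space
exponent      = keys_day_bits - space_bits # ~ -87.8
prob          = 2 ** exponent              # ~ 3.7e-27
```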

rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 01:05:50 PM

(31 + 16.4) - 135.2  -->  -87.8 --> 0.0000000000000000000000000037116

I didn't count the 51 bits of the space searched so far, but I think it is more or less the same ...

Off by 2 orders of magnitude...
because of the %:

0.000000000000000000000000003711
0.000000000000000000000000401596% <--- the correct number (online)

So to sum up - how is this number going to get bigger?

• With more addresses with funds on them
• In time, as more space is searched
• With higher search speed

From what I can see, the number of addresses with funds has the most effect at the moment.

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

 April 12, 2017, 01:33:34 PM

So to sum up - how is this number going to get bigger?

• With more addresses with funds on them
• In time, as more space is searched
• With higher search speed

From what I can see, the number of addresses with funds has the most effect at the moment.

In my opinion, the number of addresses is essential, much more so than your formula shows.

I'm almost sure that, with 2^159 keys, we can generate only about 2/3 of all 2^160 addresses.

Now 14.9M (~2^24) is very, very few compared to 2^160, so it is not unlikely that the "real" probability is very far from our estimate; in other words, the variability of the percentage of the 14.9M addresses that fall into our range is very high.

If our range 1-2^159 also contains 2/3 of the 2^24 addresses, we will have the same probability you computed:

(2/3 * 2^24) / (2/3 * 2^160) = 2^24 / 2^160 = 1/2^136.

But there could be only 1/5, or worse 1/20, of the 2^24 in there (it is not very unlikely IMHO), so more addresses is better without doubt.
rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 01:40:59 PM

Code:
my $prob24          = 1 / (2 ** ($space - $bits));                # 24h prob: 1/(effective search space - space searched)

effective search space - space searched = 2 ** ($space - $bits)  ??

2^135    -            2^51        =       2^84 ??

Well - I have now

Code:
1 / (2 ** $space - 2 ** $bits)

but are we sure this is correct? Because if (at some distant time) $bits should approach $space, we'd get 0 in the denominator, meaning infinite probability. And should we somehow jump over this 0, we'd get negative probability when - in fact - we should have a constant expectation. However, if I have "1 / (2 ** ($space - $bits));", we'd get probability 1 for the case where we have searched through all of $space.

So certainly that simple subtraction above cannot be it.

edit: I think my original formula was correct...
By the time we reach the 135.17-bit search space, we have 1/2^0 probability = 1.

Now, the logarithm of the searched keys => 51.32 bits of space searched is what actually allows us to perform a simple subtraction of the exponents.

Mr. Arulbero ...  maybe check the math again?
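The boundary behavior discussed in this post can be demonstrated numerically; here is a small Python sketch (the 135.17-bit figure is from the thread, the rest is illustrative, and it takes no side on which formula is right):

```python
space = 135.17   # effective search space in bits (from the thread)

def p_exponent(bits):
    """rico's original form: 1 / 2^(space - bits)."""
    return 1 / 2 ** (space - bits)

def p_remaining(bits):
    """The literal subtraction: 1 / (2^space - 2^bits)."""
    return 1 / (2 ** space - 2 ** bits)

# The exponent form reaches exactly 1 once the whole space is searched ...
full = p_exponent(space)          # 1.0
# ... while the subtraction form divides by zero at that same point:
try:
    p_remaining(space)
except ZeroDivisionError:
    print("denominator hits 0 when bits == space")
```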

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

 April 12, 2017, 01:53:22 PM

Well - I have now

Code:
1 / (2 ** $space - 2 ** $bits)

but are we sure this is correct? Because if (at some distant time) $bits should approach $space, we'd get 0 in the denominator, meaning infinite probability. And should we somehow jump over this 0, we'd get negative probability when - in fact - we should have a constant expectation. However, if I have "1 / (2 ** ($space - $bits));", we'd get probability 1 for the case where we have searched through all of $space.

So certainly that simple subtraction above cannot be it.

Rico

if you want to use a correct formula, you have to use a geometric distribution --> https://en.wikipedia.org/wiki/Geometric_distribution

Quote
Assumptions: When is the geometric distribution an appropriate model?

The geometric distribution is an appropriate model if the following assumptions are true:
• The phenomenon being modelled is a sequence of independent trials.
• There are only two possible outcomes for each trial, often designated success or failure.
• The probability of success, p, is the same for every trial.

If these conditions are true, then the geometric random variable is the count of the number of failures before the first success. The possible number of failures before the first success is 0, 1, 2, 3, and so on. The geometric random variable Y is the number of failures before the first success.
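The geometric model the quote describes gives P(at least one success in n trials) = 1 - (1 - p)^n. A naive evaluation underflows for the tiny p in question, so this Python sketch (with the thread's approximate numbers as assumptions) uses log1p/expm1:

```python
from math import expm1, log1p

def p_hit_within(n, p):
    """P(at least one success in n independent trials): 1 - (1 - p)^n,
    computed stably for very small p via expm1/log1p."""
    return -expm1(n * log1p(-p))

# Assumed figures from the thread: per-key probability ~1/2^135.17,
# and ~2.2 Gkeys/s sustained over one day.
p = 1 / 2 ** 135.17
n = 2.2e9 * 86400
daily = p_hit_within(n, p)   # for such tiny p this is ~ n * p
```

For p this small the result is indistinguishable from the simple product n * p, which is why the thread's linear scaling is a reasonable approximation.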
Its_a_Meeeee
Newbie

Offline

Activity: 6
Merit: 0

 April 12, 2017, 01:53:53 PM

Your last one is correct: 1 / (2 ** $space - 2 ** $bits)
As might be the one you had before.

But see, at some point it will be 1 / 1 = 1 ==> 100% probability.
That point is when you have searched all but the very last key.
Then you check the very last key (being certain it will hit), and after that your calculation expires; that's true, but not relevant then :-)

cheers

EDIT:
There is even another aspect to it *sigh

And that is: by just scaling the probability up with the keys per day, you neglect that the probability for the 2nd key of the day is higher than for the 1st, and so on ...
It's not going to make a big difference in the result; it's more like neglecting air resistance in physics, kinda doesn't matter ^.^

I enjoy pointing stuff out tho :-)

peace
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

 April 12, 2017, 02:01:11 PM

Your last one is correct: 1 / (2 ** $space - 2 ** $bits)
As might be the one you had before.

But see, at some point it will be 1 / 1 = 1 ==> 100% probability.
That point is when you have searched all but the very last key.
Then you check the very last key (being certain it will hit), and after that your calculation expires; that's true, but not relevant then :-)

So on the last day your probability would be:

1/1 * 2.2G = 2.2G

EDIT: every time you generate a key, you always have the same probability. You can only talk about the probability over n tries...
rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 02:03:46 PM

if you want to use a correct formula, you have to use a geometric distribution --> https://en.wikipedia.org/wiki/Geometric_distribution

That seems right. So:

We have 2^135.2 Bernoulli trials, where we - for simplicity now - claim that the probability to observe the event "collision" is 1 after all these trials are done. Ok?

So: 1 - (1 - p)^k for the number we want to show. Yes?

k is the number of trials left to do (that's our 2^space - 2^done)

and the probability is simply 1/2^space. Yes?

Can someone check/confirm so I can hack that in?

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
Its_a_Meeeee
Newbie

Offline

Activity: 6
Merit: 0

 April 12, 2017, 02:15:36 PM

Yes arulbero you are right thanks!

I misunderstood the expression of the number there.
I think what you want to show/express on the page is the chance of collision in the daily (approximated) workload, right?
I was more focused on the progressive chance which would answer the question: "what chance did we have so far".
The answer is also on the page: "search space covered:    51.32 of 160 bits"

Got what i wanted!

rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 02:26:18 PM

The answer is also on the page: "search space covered:    51.32 of 160 bits"

Got what i wanted!

And that may also not be right, because it could be "51.32 of 159" (keys vs. addresses),
or "51.32 of 135.2" (until the 1st collision),
or "51.32 of X" (if we had good estimates of whether and how the SHA256 and RIPEMD160 functions deviate from uniformity).

Pick one.

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

 April 12, 2017, 02:38:27 PM

We have 2^135.2 Bernoulli trials, where we - for simplicity now - claim that the probability to observe the event "collision" is 1 after all these trials are done. Ok?

Yes, it is right.

So: 1 - (1 - p)^k for the number we want to show. Yes?

k is the number of trials left to do (that's our 2^space - 2^done)

and the probability is simply 1/2^space. Yes?

Can someone check/confirm so I can hack that in?

You have to use Bayes' theorem and/or conditional probability. Give me some time, I'll think about it.
Araudan
Sr. Member

Offline

Activity: 333
Merit: 250

 April 12, 2017, 02:41:01 PM

Is it possible to technically focus on only 1 BTC address?

What is the probability, mathematically, of finding the private key?

monsanto
Legendary

Offline

Activity: 1235
Merit: 1004

..like bright metal on a sullen ground.

 April 12, 2017, 03:11:48 PM

Is it possible to technically focus on only 1 BTC address?

What is the probability, mathematically, of finding the private key?

That sounds like the equivalent of searching for an entire address in vanitygen. I don't know the exact probability, but you'll probably need multiple universes.
rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

 April 12, 2017, 03:38:47 PM

Is it possible to technically focus on only 1 BTC address?
What is the probability, mathematically, of finding the private key?

It would not be faster (well, maybe by 0.00000001%) than what we are doing now by looking at ALL
addresses. The probability to find a collision for ANY of these (currently ~15 million) addresses
is currently under discussion, as you can see above. Let's call it X.

The probability of finding one specific address among these 15 million is ... well ... X/15 million.

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
rico666
Legendary

Offline

Activity: 1064
Merit: 1022

฿ → ∞

April 12, 2017, 03:54:24 PM (Last edit: April 12, 2017, 04:25:44 PM by rico666)

You have to use Bayes' theorem and/or conditional probability. Give me some time, I'll think about it.

This is how I see it now:

* we assume our total_search_space is 2^(159 - lb(num_adr)) ~ 2^135.17 (total until 1st collision)
* search_space_to_do = total_search_space - search_space_done
* the probability for the next key to be a collision is 1 / search_space_to_do
* the sum of the next-key probabilities for the keys done (estimated) in the following 24h is the probability we want to display.

For simplicity, we can assume linear behavior of the probabilities within the next 24h, so the sum is

1/2 * (probability of the last key in the 24h + probability of the 1st key in the 24h) * keys done in the 24h
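The plan above can be sketched in a few lines of Python; a minimal illustration, with the thread's bit counts and speed as assumed inputs:

```python
def prob24_trapezoid(total_bits, done_bits, keys_day):
    """Trapezoid sum of next-key probabilities over one day's keys,
    per the linearity assumption described above."""
    to_do   = 2 ** total_bits - 2 ** done_bits   # keys left at start of the day
    p_first = 1 / to_do                          # next-key prob for the 1st key
    p_last  = 1 / (to_do - keys_day)             # next-key prob for the last key
    return 0.5 * (p_first + p_last) * keys_day

# Assumed inputs from the thread: 135.17-bit space, 51.32 bits done, 2.2 Gkeys/s.
daily = prob24_trapezoid(135.17, 51.32, 2.2e9 * 86400)   # ~4e-27
```

Note that at these magnitudes a day's worth of keys vanishes below double precision, so p_first and p_last come out numerically identical; the trapezoid correction only matters once the remaining space gets small.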

If anyone has a different opinion, I will implement a rand(very low number) and finito!

Going to implement the above now.

Rico

all non self-referential signatures except mine are lame ... oh wait ...   ·  LBC Thread (News)  ·  BURST Activities
arulbero
Legendary

Offline

Activity: 1165
Merit: 1253

April 12, 2017, 04:36:26 PM (Last edit: April 12, 2017, 05:25:43 PM by arulbero)

So: 1 - (1 - p)^k for the number we want to show. Yes?

k is the number of trials left to do (that's our 2^space - 2^done)

and the probability is simply 1/2^space. Yes?

You have a different idea (different from mine) of the situation:

you think that there are 2^135 addresses (the search space) and that the address you are looking for is surely in there. If this is the model, you cannot use the geometric distribution, because it admits that you could never find that address.

A probability of 1/2^135 doesn't mean that you will surely get your address after 2^135 tries.

But if you want to think in this way, I wrote this formula :

https://www.dropbox.com/s/bulinnhroay5cg2/Probability%20of%201%20collision%20for%20k.pdf?dl=0

EDIT: I made a mistake, I computed a different probability.

Use your formula. A shame we have lost 10-15 zeros in one day!!!