Bitcoin Forum
Author Topic: Pollard's kangaroo ECDLP solver  (Read 57553 times)
AndrewWeb (Jr. Member; Activity: 46, Merit: 1)
March 27, 2024, 04:32:59 PM  #2781

If you have the public key of Puzzle 66, you can find the private key in seconds. Why is that so? What happens? What does the program do?
Jean_Luc (OP) (Sr. Member; Activity: 462, Merit: 696)
March 27, 2024, 07:13:12 PM  #2782

If you have the public key of Puzzle 66, you can find the private key in seconds.

Yes, but we don't have it, so it comes down to brute force: 2^65 iterations for a 50% chance of success, 2^64 for 25%, etc.
Chail35 (Newbie; Activity: 5, Merit: 0)
March 27, 2024, 07:20:52 PM  #2783

I checked on #125 with an RTX A4500 and an A100 (the H100 is not yet available).
The time needed scales linearly with the number of boards, and you have to multiply by sqrt(32) for #130.
So ~1 year to solve #130 with ~1000 RTX A4500s using this program with the required mods.
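As a quick sanity check of that sqrt(32) factor (a sketch, assuming the usual ~sqrt(N) kangaroo cost): #130's range (2^129) is 2^5 = 32 times wider than #125's (2^124), so the expected runtime grows by sqrt(32) ≈ 5.66x.

Code:
# Sanity check of the sqrt(32) factor: kangaroo runtime grows roughly with
# the square root of the range width.
import math

width_125 = 2**124        # range width of puzzle #125
width_130 = 2**129        # range width of puzzle #130
print(math.sqrt(width_130 / width_125))   # ~5.66x the #125 solve time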

Kangaroo v2.2
Start:10000000000000000000000000000000
Stop :1FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF
Keys :1
Number of CPU thread: 0
Range width: 2^124
Jump Avg distance: 2^62.04
Number of kangaroos: 2^20.81
Suggested DP: 38
Expected operations: 2^63.10
Expected RAM: 1387.8MB
DP size: 38 [0xfffffffffc000000]
GPU: GPU #0 NVIDIA RTX A4500 (56x0 cores) Grid(112x128) (147.0 MB used)
SolveKeyGPU Thread GPU#0: creating kangaroos...
SolveKeyGPU Thread GPU#0: 2^20.81 kangaroos [9.9s]
[1514.20 MK/s][GPU 1514.20 MK/s][Count 2^34.77][Dead 0][22s (Avg 207.606y)][2.0/4.0MB]


Kangaroo v2.2
Start:10000000000000000000000000000000
Stop :1FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF
Keys :1
Number of CPU thread: 0
Range width: 2^124
Jump Avg distance: 2^62.04
Number of kangaroos: 2^21.75
Suggested DP: 37
Expected operations: 2^63.10
Expected RAM: 2760.3MB
DP size: 37 [0xfffffffff8000000]
GPU: GPU #0 NVIDIA A100-PCIE-40GB (108x0 cores) Grid(216x128) (277.0 MB used)
SolveKeyGPU Thread GPU#0: creating kangaroos...
SolveKeyGPU Thread GPU#0: 2^21.75 kangaroos [22.7s]
[3824.79 MK/s][GPU 3824.79 MK/s][Count 2^33.84][Dead 0][06s (Avg 82.0929y)][2.0/4.0MB]

---------

With an H100 (PCIe) (on #130)

Kangaroo v2.2
Start:200000000000000000000000000000000
Stop :3FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF
Keys :1
Number of CPU thread: 0
Range width: 2^129
Jump Avg distance: 2^64.01
Number of kangaroos: 2^21.83
Suggested DP: 40
Expected operations: 2^65.62
Expected RAM: 1985.4MB
DP size: 40 [0xffffffffff000000]
GPU: GPU #0 NVIDIA H100 PCIe (114x0 cores) Grid(228x128) (292.0 MB used)
SolveKeyGPU Thread GPU#0: creating kangaroos...
SolveKeyGPU Thread GPU#0: 2^21.83 kangaroos [23.3s]
[5113.99 MK/s][GPU 5113.99 MK/s][Count 2^34.26][Dead 0][06s (Avg 352.678y)][2.0/4.0MB]

You can find the pricing of the hypervisors we use for scientific computation there (page 12 of my presentation):
https://indico.esrf.fr/event/93/contributions/559/



Are you going to modify the code to be able to search the 130-bit range? A new NVIDIA GPU has been announced, the GB200, which is far more powerful than current GPUs.
Jean_Luc (OP) (Sr. Member; Activity: 462, Merit: 696)
March 27, 2024, 07:25:26 PM  #2784

Are you going to modify the code to be able to search the 130-bit range? A new NVIDIA GPU has been announced, the GB200, which is far more powerful than current GPUs.

I don't think so, unless someone proves to me that they have the needed power and are ready to beat the world record.
Chail35 (Newbie; Activity: 5, Merit: 0)
March 27, 2024, 07:31:11 PM  #2785

Are you going to modify the code to be able to search the 130-bit range? A new NVIDIA GPU has been announced, the GB200, which is far more powerful than current GPUs.

I don't think so, unless someone proves to me that they have the needed power and are ready to beat the world record.

Here's my current work file for 130; I got here in about 3 days.
I need 25.55 DP.
Is what I have wrong, since you said the program can't solve it?
Kangaroo v2.2
Loading: save.work
Version   : 0
DP bits   : 40
Start     : 200000000000000000000000000000000
Stop      : 3FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF
Key       : 03633CBE3EC02B9401C5EFFA144C5B4D22F87940259634858FC7E59B1C09937852
Count     : 0 2^-inf
Time      : 00s
DP Size   : 2.1/4.7MB
DP Count  : 4480 2^12.129
HT Max    : 2 [@ 001ECF]
HT Min    : 0 [@ 000000]
HT Avg    : 0.02
HT SDev   : 0.13
Kangaroos : 0 2^-inf
Baskentliia (Jr. Member; Activity: 60, Merit: 1)
March 27, 2024, 08:08:17 PM  #2786

Are you going to modify the code to be able to search the 130-bit range? A new NVIDIA GPU has been announced, the GB200, which is far more powerful than current GPUs.

I don't think so, unless someone proves to me that they have the needed power and are ready to beat the world record.


Please update your program and extend it to a 256-bit range as soon as possible.
Others have updated your program, but it wasn't successful enough. We expect it from you.

34Sf4DnMt3z6XKKoWmZRw2nGyfGkDgNJZZ
Gerbie (Newbie; Activity: 1, Merit: 0)
March 29, 2024, 10:56:53 AM  #2787

Dear @Baskentliia,

It may be because English is probably not your first language, but saying to Jean Luc "We expect from you" is pretty rude.
The "prize" for solving puzzle #130 is very high, so don't expect to be able to do it with someone else's tools/software.
If you want to solve one of the remaining puzzles, you have to team up with a lot of people (and share the prize money) or be smarter than everyone else, i.e. come up with a new method or code that does it much faster than is possible with current hardware and software.
In fact, showing how secure Bitcoin's cryptography is was the whole idea behind these puzzles.

And a 256-bit range is ridiculous at the moment; please do the maths and calculate how many resources you'd need to solve a 256-bit key! That's why the puzzles stop at 160 bits...
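To put rough numbers on that (a sketch only, assuming a kangaroo-style interval search costs about the square root of the range in group operations, and a generous 10^10 operations per second per GPU):

Code:
# Rough scale of a 160-bit vs 256-bit interval search at ~sqrt(range) group
# operations, assuming a generous 1e10 operations/second per GPU.
SECONDS_PER_YEAR = 3600 * 24 * 365

for bits in (160, 256):
    ops = 2 ** (bits / 2)                        # ~sqrt of the keyspace
    gpu_years = ops / 1e10 / SECONDS_PER_YEAR    # single-GPU wall-clock time
    print(f"{bits}-bit range: ~2^{bits // 2} ops, ~{gpu_years:.2g} GPU-years")

That prints on the order of millions of GPU-years for a 160-bit range and ~10^21 GPU-years for a 256-bit one.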

Sincerely and good luck puzzling!

Jean_Luc (OP) (Sr. Member; Activity: 462, Merit: 696)
March 29, 2024, 07:06:29 PM  #2788

It is possible to reduce the complexity of the classic kangaroo algorithm a little by spreading the starting kangaroos in a non-uniform manner.
Roughly speaking, if the private key lies in certain areas it will be found faster; as a counterpart, if it lies in certain other areas it will be found more slowly.
But on average, over the whole range, the complexity is indeed reduced.
If you read the reference on the GitHub page carefully, you will find the trick.

Baskentliia (Jr. Member; Activity: 60, Merit: 1)
March 29, 2024, 07:45:58 PM  #2789

Dear @Baskentliia,

It may be because English is probably not your first language, but saying to Jean Luc "We expect from you" is pretty rude.
The "prize" for solving puzzle #130 is very high, so don't expect to be able to do it with someone else's tools/software.
If you want to solve one of the remaining puzzles, you have to team up with a lot of people (and share the prize money) or be smarter than everyone else, i.e. come up with a new method or code that does it much faster than is possible with current hardware and software.
In fact, showing how secure Bitcoin's cryptography is was the whole idea behind these puzzles.

And a 256-bit range is ridiculous at the moment; please do the maths and calculate how many resources you'd need to solve a 256-bit key! That's why the puzzles stop at 160 bits...

Sincerely and good luck puzzling!



Thank you, buddy. But we are not software developers; we have difficulty understanding and running the programs we use, so it is not possible for us to make changes to these programs. We would like to thank the software developers who share their programs for free without expecting anything in return.
We run the program and wait for luck to strike us.

34Sf4DnMt3z6XKKoWmZRw2nGyfGkDgNJZZ
iceland2k14 (Jr. Member; Activity: 35, Merit: 68)
March 30, 2024, 05:28:20 AM  #2790

Dear @Jean_Luc, what is your opinion on using a stride in the kangaroo algorithm, if it can be shown probabilistically that a particular stride or stride range covers a lower total key space?

The jump tables are already defined and the paths of the kangaroos are deterministic based on X, so is a stride possible in this algorithm or not?
kTimesG (Member; Activity: 128, Merit: 25)
March 30, 2024, 02:10:21 PM (last edit: March 30, 2024, 02:38:08 PM)  #2791

I had zero overflow during tests.

And for 130, I am using the average case scenario and numbers. No exabytes needed. And it’s obvious you don’t understand the difference between a kangaroo and a stored DP.

You do the math yourself: take a DP, we will say DP 32, and you tell me, in your expert opinion, how much storage space is needed, roughly, for solving 130.

I would refer you to OP's GitHub to read about the time/memory tradeoff, but you've already stated you don't agree with much of what he has said or programmed.

Anyway, let me know the storage space required, average-run case, for 130, using DP 32.
I think you are forgetting something about real-life constraints. Let's go by your example.
I will use log2 (bits) as a base to simplify calculations.

Puzzle #130 - keyspace = N = 129 bits.

We have two sets: T (tame) and W (wild), each having sqrt(N) elements (64.5 bits).

At this point probability of a collision (T and W intersection is not empty) is:
P(success) = 1 - P(failure) = 1 - (1 - 2**(-64.5)) ** (2 ** 64.5) = 63%

Total operations so far: 2 * 2**64.5 = 2 ** 65.5

Adding DP = 32 into the mix: we store, on average, one T point out of every 2**32 and one W point out of every 2**32.

So size of T = 2 ** (64.5 - 32) = 2 ** 32.5
Size of W = 2 ** (64.5 - 32) = 2 ** 32.5

Remember, we didn't change anything about the probability, so these numbers are still for a 63% success probability.

Now, since DP only reduces storage by a factor of 2**DP, then the number of operations until T and W collide increases by:
2 * 2**DP / 2 on average (operations between real collision and reaching the DP point, on average for T and W) = 2 ** DP

So total ops for a 63% chance of success = 2 ** 65.5 + 2 ** DP

Now, you may say: oh, so I should apologize because I stated we need much more storage. Well, let's go back to real life:

- set of size T with DP 32 = 2**32.5 elements
- set of size W with DP 32 = 2**32.5 elements
- P(success) = 63%

Now, how many kangaroos do we use? The naive answer is, it doesn't matter, because we are counting OPERATIONS total.

But it does matter when you have to think about how to store the traveled distances.
Let's see what distance a single kangaroo would travel, on average.

Jump distances = [2**0, 2**1, 2**2, ... 2**128]
Avg(jump dist) = (2**129 - 1) / 129 which almost equals 2**(129 - 7) = 2 ** 122

Number of jumps performed by the kangaroo to fill the T or W set is NOT 2**32.5, but 2**64.5 (because we still jumped even if not reaching a DP)

So total traveled distance = 2**64.5 * avgJumpSize = 2**(122 + 64.5), a ~187-bit number, i.e. about 24 bytes to store.

So storing one DP entry requires:
- jump position / key, let's say 33 bytes (X + Y sign)
- distance traveled (24 bytes) + kang type

Just by doing some simple estimations this would require a lot of TB (terabytes) of RAM.
You will need to increase the number of kangaroos by a factor of 256 to get rid of one byte in the stored DP.
65536 kangaroos to get rid of two bytes. Etc...

So to conclude:

- you need 2**32.5 tames DP, each around 60+ bytes
- you need 2**32.5 wilds DP, each around 60+ bytes
- your chances after 2**65.5 operations are around 63%
- the more kangaroos you have, the more DP overhead increases: 2**32 * numKangaroos
- the kangaroo jumps and the lookups for stored DPs need to be O(1) - e.g. RAM, not some swap file

If you can prove to me that you can fit the two T and W sets in real life without having to rely on swapping to a storage device, then yes, you were right.

So, it goes without saying that maybe a real-life approach would not even get anywhere near storing the DP points, in our lifetime. Simply due to resource constraints.

Why did I say we need exabytes of data? Well, sir, I will leave this as an exercise for the reader.
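For reference, the first-order numbers above can be reproduced in a few lines (a sketch using the same assumptions as this post: #130 keyspace of 2^129, DP 32, roughly 60 bytes per stored entry; it deliberately ignores the per-kangaroo and hash-table overheads discussed above):

Code:
# Back-of-envelope check of the #130 / DP 32 estimates above.
import math

N_BITS = 129        # puzzle #130 keyspace width in bits
DP = 32             # distinguished-point mask width
ENTRY_BYTES = 60    # rough size of one stored DP record (X fragment, distance, type)

sqrt_n_bits = N_BITS / 2                 # 64.5: walk length per herd
total_ops_bits = sqrt_n_bits + 1         # 65.5: tame + wild jumps combined
stored_bits = sqrt_n_bits - DP           # 32.5: DPs actually kept per herd

# P(success) = 1 - (1 - 2^-64.5)^(2^64.5); computed via log1p to avoid underflow
p_success = 1 - math.exp(2**sqrt_n_bits * math.log1p(-(2**-sqrt_n_bits)))

raw_dp_bytes = 2 * 2**stored_bits * ENTRY_BYTES   # both herds, no table overhead

print(f"expected operations : 2^{total_ops_bits}")
print(f"P(collision)        : {p_success:.0%}")
print(f"stored DPs per herd : 2^{stored_bits}")
print(f"raw DP storage      : {raw_dp_bytes / 1e9:.0f} GB (before any overhead)")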
WanderingPhilospher (Full Member; Activity: 1148, Merit: 236)
March 30, 2024, 03:36:38 PM  #2792

I think you are forgetting something about real-life constraints. Let's go by your example.
[...]
Why did I say we need exabytes of data? Well, sir, I will leave this as an exercise for the reader.
A lot to unpack here.

First thing, I would not be using RAM. All DP info stored in files.

You still think it would require exabytes? To store all of those DPs?

Before diving deeper, riddle me this: based on your calculations for the 130-bit range and DP 32 (which you say would require exabytes), how much would the 115-bit range at DP 25 need, according to those same calculations?
kTimesG (Member; Activity: 128, Merit: 25)
March 30, 2024, 08:41:09 PM  #2793

First thing, I would not be using RAM. All DP info stored in files.

You still think it would require exabytes? To store all of those DPs?
Glad to see some progress. Did you factor in the entire overhead of using a non-O(1) lookup and item storage for DP? Or are you talking in the abstract?

You have only 2 options:
a. Insist on a constant time factor to solve the puzzle, with all the assumptions factored in. One of those requirements is an O(1) DP lookup / storage. You will need a few TB of random-access memory at a physical level, not in "files".
b. Actually understand how real-life limitations kick in.

Since the DP points are impossible to predict or to compute in "ranges", actual overheads like disk page reads/writes kick in.
Reading from an SSD is thousands of times slower than reading from physical RAM.
Writing to files involves reading entire pages of data, combining them, and writing them back.

Since DP values are uniformly spread out across the entire range, say goodbye to "fast" SSD speeds when reading or writing even a single DP entry: the accesses are not sequential, and you have almost 0% chance of two DP points landing close to each other.

But sure, if you wait something like 5 seconds to compute a DP, store it to a file, and use that as your baseline, then yes, it's a 2**65.5 + DP-overhead total time. But you'll end up with a broken SSD whose write endurance has been exceeded.
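To make the O(1) requirement concrete, here is a minimal sketch (not this program's actual hash table; the names DPStore, TAME and WILD are made up for illustration) of an in-memory DP store keyed by the distinguished X fragment, with constant-time average insert and collision check:

Code:
# Minimal in-memory DP store: O(1) average-case insert/lookup via a dict.
TAME, WILD = 0, 1

class DPStore:
    def __init__(self):
        self.table = {}                      # x_fragment -> (distance, herd)

    def add(self, x_fragment, distance, herd):
        """Insert a DP; on a tame/wild collision return (tame_dist, wild_dist),
        otherwise return None."""
        hit = self.table.get(x_fragment)
        if hit is None:
            self.table[x_fragment] = (distance, herd)
            return None
        other_dist, other_herd = hit
        if other_herd != herd:               # cross-herd collision: key can be derived
            return (distance, other_dist) if herd == TAME else (other_dist, distance)
        return None                          # same-herd collision: keep the first entry

Swapping that dict for an on-disk structure is exactly where the page read/write and seek overheads described above come into play.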
WanderingPhilospher (Full Member; Activity: 1148, Merit: 236)
March 31, 2024, 04:02:38 AM  #2794

First thing, I would not be using RAM. All DP info stored in files.

You still think it would require exabytes? To store all of those DPs?
Glad to see some progress. Did you factor in the entire overhead of using a non-O(1) lookup and item storage for DP? Or are you talking in the abstract?
[...]
But sure, if you wait something like 5 seconds to compute a DP, store it to a file, and use that as your baseline, then yes, it's a 2**65.5 + DP-overhead total time. But you'll end up with a broken SSD whose write endurance has been exceeded.

Just answer the question: how much storage or RAM do you think one would need to solve a 115-bit range key using a DP of 25, according to your previous calculations?
satashi_nokamato (Jr. Member; Activity: 50, Merit: 3)
March 31, 2024, 04:19:04 PM  #2795

Can someone show at least 2 DPs? I'd like to know more about them and to know what makes them useful in finding a collision.
WanderingPhilospher (Full Member; Activity: 1148, Merit: 236)
March 31, 2024, 05:56:55 PM  #2796

Can someone show at least 2 DPs? I'd like to know more about them and to know what makes them useful in finding a collision.

They can take many forms. Here is one with trailing 0s:
824EDCFE76F16D85BB77626BE00000000

Or it could have leading 0s:
00000000824EDCFE76F16D85BB77626BE

Those are both, at least, DP 32.

Along with each DP, you would also store its distance/private key.

So if a tame DP matches a wild DP, you have a collision and a possible solution for whatever public key you are looking for.

Tame:
824EDCFE76F16D85BB77626BE00000000  1000
Wild:
824EDCFE76F16D85BB77626BE00000000  0001

Then subtract the wild distance from the tame distance, and that is the private key for the public key being searched for: priv key = 999.

This is just an example; those values do not match up on the real curve, I only used them for an easy explanation.
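For anyone who wants to see that subtraction step end to end, here is a toy sketch (a single tame and a single wild walk over a small multiplicative group mod a prime, storing every visited point instead of only DPs; the names and parameters p, g, B, solve are made up for the toy, and this is not secp256k1 and not this program's code, just the same tame-distance-minus-wild-distance idea):

Code:
# Toy kangaroo: recover `secret` from h = g^secret (mod p), secret in [0, B),
# by colliding a tame walk (started at g^B) with a wild walk (started at h).
import random

p = 2**31 - 1                 # small prime; 7 is a primitive root mod p
g = 7
B = 2**20                     # the secret lies in [0, B)

secret = random.randrange(B)
h = pow(g, secret, p)         # the "public key" we pretend not to know

k = 10
jumps = [2**i for i in range(k)]    # jump sizes, selected from the point itself

def solve():
    tame_y, tame_d = pow(g, B, p), B     # tame knows its exponent exactly
    wild_y, wild_d = h, 0                # wild only knows its traveled distance
    trail = {}                           # y -> (herd tag, distance); stand-in for the DP table
    while True:
        for tag, y, d in (("T", tame_y, tame_d), ("W", wild_y, wild_d)):
            seen = trail.get(y)
            if seen is not None and seen[0] != tag:      # tame/wild collision
                t_d, w_d = (d, seen[1]) if tag == "T" else (seen[1], d)
                return (t_d - w_d) % (p - 1)             # tame dist - wild dist
            trail[y] = (tag, d)
        step = jumps[tame_y % k]
        tame_y, tame_d = tame_y * pow(g, step, p) % p, tame_d + step
        step = jumps[wild_y % k]
        wild_y, wild_d = wild_y * pow(g, step, p) % p, wild_d + step

print("recovered:", solve(), "expected:", secret)

On secp256k1 the group operation is point addition and only distinguished points are stored, but the recovery at the end is the same subtraction shown in the post above.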
greenAlien (Newbie; Activity: 12, Merit: 0)
April 01, 2024, 09:14:48 AM  #2797

I don't understand in depth how kangaroo works, but @Etar has created a distributed kangaroo.
Here are the links:
- https://github.com/Etayson/Server-Client-apps-for-Etarkangaroo
- https://github.com/Etayson/Etarkangaroo

If someone has the resources needed to run a #130 puzzle operation, we can try to gather some GPU power.

Regarding what @WanderingPhilospher is asking @kTimesG, we could try to run this tool on #115 and see how long it takes and how many resources we need/use.
If so, just PM me here.
WanderingPhilospher (Full Member; Activity: 1148, Merit: 236)
April 01, 2024, 10:32:15 AM  #2798

I don't understand in depth how kangaroo works, but @Etar has created a distributed kangaroo.
Here are the links:
- https://github.com/Etayson/Server-Client-apps-for-Etarkangaroo
- https://github.com/Etayson/Etarkangaroo

If someone has the resources needed to run a #130 puzzle operation, we can try to gather some GPU power.

Regarding what @WanderingPhilospher is asking @kTimesG, we could try to run this tool on #115 and see how long it takes and how many resources we need/use.
If so, just PM me here.


No need to test #115; I already know the answer. I just wanted kTimesG to keep putting his foot in his mouth, lol.

He really has no clue what he is saying, nor does he understand this program (the program this thread is about). I like a good laugh and was looking forward to his follow-up answer, but maybe he now realizes he is wrong and won't comment again, lol.

As for Etar's program, it is good and works up to 192 bits. The only drawback is that it is written in PureBasic, and many people who use Linux struggle to compile it.
kTimesG (Member; Activity: 128, Merit: 25)
April 01, 2024, 01:27:02 PM  #2799

Just answer the question: how much storage or RAM do you think one would need to solve a 115-bit range key using a DP of 25, according to your previous calculations?
What exactly is wrong with my calculations, to be precise?... You were specifically asking something about #130 with a specific DP. Now you want to know why #115 is a manageable effort? OK...

Keyspace: 114 bits.
T size: 57 bits
W size: 57 bits
DP = 25, so Tdp size = 32 bits
Wdp size = 32 bits

Required memory for the hash map: 2*2**32 entries ≈ 8.6 billion entries * sizeof(jump entry)
Required steps: 2*2**57 + numKang * 2**25
Probability(success) = 1 - (1 - 2**-57) ** (2**57) ≈ 63%

I do not believe you understand how a hash map needs to work in order to check for a collision.
In the simplest terms, you will need some sort of sorted list; optimally, a balanced binary search tree. That has O(log n) lookup and insert complexity. So, for a DP 25 table filled with 2**33 entries, it takes 33 steps for a single lookup, IN THE AVERAGE CASE, when your search tree is balanced.

The memory requirement for #130 with DP 32 is orders of magnitude higher than for #115 with DP 25!

It doesn't matter if you use a single CPU with a few TB of RAM or many CPUs with less RAM each. You will end up with the same problem of having to merge / query the DP "central table" / search tree / sorted list, call it whatever you want.

Good luck with your "files".
WanderingPhilospher (Full Member; Activity: 1148, Merit: 236)
April 01, 2024, 01:42:42 PM  #2800

Just answer the question: how much storage or RAM do you think one would need to solve a 115-bit range key using a DP of 25, according to your previous calculations?
What exactly is wrong with my calculations, to be precise?... You were specifically asking something about #130 with a specific DP. Now you want to know why #115 is a manageable effort? OK...

Keyspace: 114 bits.
T size: 52 bits
W size: 52 bits
DP = 25, so Tdp size = 27 bits
Wdp size = 27 bits

Required memory for the hash map: 2*2**27 entries ≈ 268 million entries * sizeof(jump entry)
Required steps: 2*2**52 + numKang * 2**25
Probability(success) = 1 - (1 - 2**-52) ** (2**52) ≈ 63%

I do not believe you understand how a hash map needs to work in order to check for a collision.
In the simplest terms, you will need some sort of sorted list; optimally, a balanced binary search tree. That has O(log n) lookup and insert complexity. So, for a DP 25 table filled with 2**28 entries, it takes 28 steps for a single lookup, IN THE AVERAGE CASE, when your search tree is balanced.

The memory requirement for #130 with DP 32 is orders of magnitude higher than for #115 with DP 25!

It doesn't matter if you use a single CPU with a few TB of RAM or many CPUs with less RAM each. You will end up with the same problem of having to merge / query the DP "central table" / search tree / sorted list, call it whatever you want.

Good luck with your "files".
You are in this thread spouting nonsense. That's the issue. You have no clue as to how this program works.

You say solving 130 will require exabytes of RAM/storage, which is false. So I asked how much it would take for 115 with DP 25 - how many exabytes or terabytes? You couldn't answer in terms of bytes.

You talk about crashing SSDs - more evidence that you don't know this program or how it works. If an SSD can't handle being written to 12 times a day, then what's the point of even owning one, lol. Yes, in this program you control what is saved and how often DPs are written to your hard drive.

Now you claim 130 with DP 32 needs orders of magnitude more than 115 with DP 25, which further shows you don't know how kangaroo or this program works. Orders of magnitude? lol.

Maybe look at the code of the program you are commenting on before commenting on it.
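For what it's worth, the storage side of that comparison is easy to sketch (same first-order assumptions as the earlier estimate: expected work ~2*sqrt(range), stored DPs ~sqrt(range)/2^DP per herd, all overheads ignored):

Code:
# First-order comparison of the two configurations being argued about:
# the operation counts differ by ~2^7.5, the stored-DP counts by only ~sqrt(2).
cases = {"#115, DP 25": (114, 25), "#130, DP 32": (129, 32)}

for name, (range_bits, dp) in cases.items():
    ops_bits = range_bits / 2 + 1          # ~2 * sqrt(N) group operations
    stored_bits = range_bits / 2 - dp + 1  # DPs kept, tame + wild herds combined
    print(f"{name}: ~2^{ops_bits:g} ops, ~2^{stored_bits:g} stored DPs")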