Bitcoin Forum
May 20, 2024, 11:37:46 PM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 [24] 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 ... 142 »
Author Topic: Pollard's kangaroo ECDLP solver  (Read 56248 times)
Jean_Luc (OP)
Sr. Member
****
Offline Offline

Activity: 462
Merit: 696


View Profile
May 27, 2020, 04:47:53 PM
 #461

I tested -wsplit in the 89-bit range. Works perfect! Thanks JeanLuc.
The grabber takes files from the server and sends them to the local PC, where the merger merges them with the masterfile.
The key was solved during a merge after 32 minutes on a 2080ti. I did not wait until the key was solved on the server; it was enough that it was found through the merge.

Many thanks for the test Smiley
I'm currently working on the merger, trying to speed it up and decrease memory consumption.
Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 27, 2020, 07:38:25 PM
Last edit: May 27, 2020, 08:38:44 PM by Etar
 #462

If someone wants to run the solver with small DPs, but the server's resources don't allow it, you can use the -wsplit option,
which appeared in version 1.7 of the solver.
But in any case, you must have a PC that can merge files. I just had such a problem.
Now I can safely merge files on my home PC. In order not to do it all manually, you need a grabber and a merger.
The file grabber is launched on the server; the merger is launched on the home PC.
The merger communicates with the grabber and requests files from it. The grabber sends them, if any, and then removes them from the server.
The merger, in turn, after receiving a file from the grabber, starts the merge process, during which it is possible to find the key; after the merge, the temp file is deleted.
The grabber gives files only to an authorized merger.
If it helps someone, here is an archive with source code, compiled programs and example .bat files: https://drive.google.com/file/d/1wQWLCRsYY2s4DH2OZHmyTMMxIPn8kdsz
Edit: fixed a little memory leak on the grabber side.

As before, the sources are in PureBasic.
mergeServer (grabber):
Code:
-pass >password for merger authorization
-port >listening port where the merger will connect
-ext  >extension by which the grabber searches for files in the folder,
       e.g. using -ext part, you should start the server with -w xxxx.part
mergeClient (merger):
Code:
-jobtime 60000 >request a file from the grabber every 60s
-name >name of your merger, only used for stats
-pass >password for authorization (should be the same as in the grabber)
-server >host:port of the grabber
-workfile >name of your masterfile
-merger >Kangaroo.exe by default
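For anyone reimplementing this in another language, the grabber/merger workflow described above can be sketched roughly as follows. This is a minimal Python sketch, not Etar's actual PureBasic code: the function names, the `fetch_file` callback, and the polling structure are assumptions; only the `-wm file1 file2 destfile` merge syntax comes from the solver itself.

```python
import subprocess
import time

def merge_command(kangaroo_exe, masterfile, tempfile):
    # Kangaroo's merge syntax (-wm file1 file2 destfile) merges two work
    # files into a destination file; here the destination is the masterfile.
    return [kangaroo_exe, "-wm", masterfile, tempfile, masterfile]

def merger_loop(fetch_file, kangaroo_exe="Kangaroo.exe",
                masterfile="master.work", jobtime_ms=60000):
    # fetch_file() asks the grabber for the next split file and returns a
    # local path, or None when the server has nothing to hand out.
    while True:
        tempfile = fetch_file()
        if tempfile is not None:
            # The merge itself can solve the key (a collision between files).
            subprocess.run(merge_command(kangaroo_exe, masterfile, tempfile))
            # After the merge the temp file would be deleted, as described
            # above (os.remove(tempfile)).
        time.sleep(jobtime_ms / 1000)
```

The key design point, as in Etar's tools, is that the masterfile only ever grows on the home PC, so the server never holds more than one save interval's worth of DPs.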
zielar
Full Member
***
Offline Offline

Activity: 277
Merit: 106


View Profile
May 27, 2020, 08:14:19 PM
 #463

-snip-
If it helps someone, archive with source codes, compiled programs and example .bat files: https://drive.google.com/file/d/1YB3tz_tkJWTilXMCm_3GIfKsODGzDTqi
-snip-

You fell from heaven with this tool, man! Just tell me how to set -ext, knowing that the latest release adds the ending without .part?

If you want - you can send me a donation to my BTC wallet address 31hgbukdkehcuxcedchkdbsrygegyefbvd
patatasfritas
Newbie
*
Offline Offline

Activity: 17
Merit: 0


View Profile
May 27, 2020, 08:16:25 PM
 #464

Jean_Luc, I want to look at the table which is saved to the workfile. But as I understood, only 128 bits are saved for the X-coordinate and 126 bits for the distance (together with 1 bit for the sign and 1 bit for the kangaroo type).

Anyway, what is the easiest way to export the whole table in txt format? I could easily read the header, dp, start/stop ranges, and x/y coordinates for the key from the binary file. After that the hash table is saved with a lot of 0 bytes...
Can you briefly describe the hash table structure which is saved to the binary file?

If you still haven't managed to export the workfile, here are the changes I made to the code to export the DPs to txt.

The X-coord is 128+18 bits and the distance 126 bits. The X-coordinate could be regenerated from the distance if necessary.

https://github.com/PatatasFritas/FriedKangaroo/commit/1669df5f91ef2cc8a7619b21f66883fa164ab602
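To make the packing concrete, here is a small Python illustration of how a 126-bit distance plus a sign bit and a kangaroo-type bit can share one 128-bit word. The exact bit positions used in the real workfile are an assumption here; the linked commit is the authoritative source for the actual layout.

```python
DIST_BITS = 126
DIST_MASK = (1 << DIST_BITS) - 1

def pack_distance(dist, sign, ktype):
    # dist: 126-bit unsigned distance; sign/ktype: 1-bit flags.
    # Using bit 127 for sign and bit 126 for kangaroo type is an
    # illustrative assumption, not the verified workfile layout.
    assert 0 <= dist <= DIST_MASK
    return dist | (ktype << 126) | (sign << 127)

def unpack_distance(word):
    # Returns (distance, sign, kangaroo_type) from a packed 128-bit word.
    return word & DIST_MASK, (word >> 127) & 1, (word >> 126) & 1
```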
Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 27, 2020, 08:28:06 PM
 #465

You fell from heaven with this tool, man! Just tell me how to set -ext, knowing that the latest release adds the ending without .part?
-ext can be set to whatever you want, as long as it is a word in the names of the small files.
Example: you have -w mysmallfile on the server; when it is saved with -wsplit it will look like mysmallfile_27May20_232415.
So you need to set -ext to something like smallfile or small... anything that lets the grabber distinguish these files from others.
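The matching rule Etar describes (the -ext word just has to appear somewhere in the split files' names) amounts to a plain substring filter. A hypothetical Python equivalent (the real grabber is PureBasic):

```python
def grab_candidates(filenames, ext):
    # Select files whose names contain the -ext word; the masterfile and
    # unrelated files in the folder must not match.
    return [f for f in filenames if ext in f]
```

With -w mysmallfile and -ext smallfile, a split save like `mysmallfile_27May20_232415` matches while `master.work` does not.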
WanderingPhilospher
Full Member
***
Offline Offline

Activity: 1064
Merit: 219

Shooters Shoot...


View Profile
May 28, 2020, 02:22:05 AM
 #466

With the new -wsplit option:

How are you using it?

The way I read it, it saves an updated file, let's say every -wi 60 seconds. Got it. I've tested it.

So the only way the key can be solved on the server side is if a collision happens in the current, being-worked-on, reset/smaller hash table (the one that will be saved next)?

The server doesn't compare the different saved files for a collision, so one must merge the different saved work files to check for a collision.
If you have 10, 20, 30, 100, 1000 saved files over a period of days, weeks, months, are you manually merging the saved work files every so often?
The current merge option only allows the merge of 2 files at a time... manually merging could become an hourly task (depending on how many files are being saved each minute/hour.)

So you have a benefit in RAM but increase the amount of work you have to do in merging files?
WanderingPhilospher
Full Member
***
Offline Offline

Activity: 1064
Merit: 219

Shooters Shoot...


View Profile
May 28, 2020, 02:30:25 AM
 #467

@JeanLuc, in the next release, if possible, please fix adding the date string to the workfile before the extension.
I mean, I use -w savework1.part but after saving I get savework1.part_27May20_160341; it should be savework1_27May20_160341.part, I think. Thanks.

Edit: With -d 28, -wi 600, memory usage by the server is 10-20MB (I think -wsplit is a great tool)

I recompiled a version that does something similar to what you asked.

Simply changed the code in Backup.cpp

Code:
 string fileName = workFile;
  if(splitWorkfile)
    //fileName = workFile + "_" + Timer::getTS();
    fileName = Timer::getTS() + "_" + workFile;

works as described.

Instead of saving as savework1.part_27May20_160341, it saves as 27May20_160341_savework1.part

This way your .part extension is at the end of the file.
Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 28, 2020, 06:52:28 AM
Last edit: May 28, 2020, 07:06:38 AM by Etar
 #468

-snip-
So you have a benefit in RAM but increase the amount of work you have to do in merging files?
The grabber and merger merge files in real time, so I do not have any split files on the server. As soon as a file exists, the merger merges it with the masterfile.
All you need is to set -jobtime to the value (wi/2*1000). If -wi is set to 60, then you need -jobtime 30000; -jobtime is in ms.
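Etar's rule of thumb (poll the grabber twice per save interval, with -jobtime in milliseconds) as a one-line helper; the function name is hypothetical, the arithmetic is his:

```python
def jobtime_ms(wi_seconds):
    # -jobtime = wi / 2 * 1000: request files twice per -wi save interval,
    # converted from seconds to milliseconds.
    return wi_seconds // 2 * 1000
```

So -wi 60 gives -jobtime 30000, and -wi 600 would give -jobtime 300000.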

Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 28, 2020, 11:53:13 AM
 #469

There seems to be some kind of mistake in puzzle #110.
A huge amount of GPU capacity has already been spent and no one could find the key.
I checked the range borders with BSGS; there is no key there.
That was the only theory for why we could not find the key for so long,
but it is not true: the key is not near the edge.
Does someone have an idea why we can't find the key, even with the capacity of 400 GPUs?
brainless
Member
**
Offline Offline

Activity: 316
Merit: 34


View Profile
May 28, 2020, 12:08:46 PM
 #470

-snip-
Does someone have an idea why we can't find the key, even with the capacity of 400 GPUs?
How many GPUs do you have?

13sXkWqtivcMtNGQpskD78iqsgVy9hcHLF
Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 28, 2020, 12:18:50 PM
Last edit: May 28, 2020, 12:46:20 PM by Etar
 #471

-snip-
How many GPUs do you have?
Now 24x2080ti are left; around half of the GPUs are out of the race.

As Turing said, the machine works; we need only give it what it understands...

BitCrack
Jr. Member
*
Offline Offline

Activity: 30
Merit: 122


View Profile
May 28, 2020, 01:01:27 PM
 #472

-snip-
Does someone have an idea why we can't find the key, even with the capacity of 400 GPUs?

I think folks are just being impeded by storage and network issues.
Jean_Luc (OP)
Sr. Member
****
Offline Offline

Activity: 462
Merit: 696


View Profile
May 28, 2020, 01:06:07 PM
 #473

Published a new release with a faster merger.
Please test it Wink
brainless
Member
**
Offline Offline

Activity: 316
Merit: 34


View Profile
May 28, 2020, 01:08:15 PM
 #474

-snip-
Now 24x2080ti are left; around half of the GPUs are out of the race.


Could you calculate the time for your 24 2080ti GPUs for this 100-bit key:
0314AA23C847D86CE1FCDC4B3E0524F741FDE2AC4B38B522ED2E3B77CF500D5718
You can use client/server or a single system with -gpuid 0,1,2,3 ... up to 24 GPUs.
Just run it and you will get an estimated time. Could you update me with your estimate?

13sXkWqtivcMtNGQpskD78iqsgVy9hcHLF
Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 28, 2020, 01:21:54 PM
 #475

-snip-
Just run it and you will get an estimated time. Could you update me with your estimate?
It can be calculated; no need to launch the GPUs...

Code:
Range 2^ 100.0
Expected OP 2^ 51.054194794629616
SuggestedDP = 24
Total GPU: 24
Total hashrate 2^ 34.860826977961146
Kangaroos per GPU 2^ 21.09
Totalkangaroos 2^ 25.674962500721158
Nkangaroo*2^DPbit = 2^ 49.67496250072116
Average time 2^ 16.19336781666847 s =  0.8673126907677194 days

And I use server/client, of course.
MrFreeDragon
Sr. Member
****
Offline Offline

Activity: 443
Merit: 350


View Profile
May 28, 2020, 01:27:22 PM
 #476

-snip-

The estimated time for a 100-bit key with 24x2080ti is approximately 24 hours.

24x2080ti gives 29500 MJump/sec, so 2^(100/2+1.15) / 29500 MJump/sec --> ~24 hours
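MrFreeDragon's arithmetic can be checked directly. A small Python sketch (the function name is mine; the 2^(n/2+1.15) expected-operations figure and the 29500 MJump/sec rate are the numbers quoted above):

```python
def expected_runtime_hours(range_bits, jumps_per_sec):
    # Expected group operations for an n-bit interval: ~2^(n/2 + 1.15),
    # the constant quoted in this thread for the solver.
    ops = 2 ** (range_bits / 2 + 1.15)
    return ops / jumps_per_sec / 3600

# 24x2080ti at 29500 MJump/sec over a 100-bit range:
hours = expected_runtime_hours(100, 29500e6)
```

This lands at roughly 23-24 hours, matching both MrFreeDragon's figure and Etar's 0.867-day calculation earlier in the thread.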

brainless
Member
**
Offline Offline

Activity: 316
Merit: 34


View Profile
May 28, 2020, 01:47:12 PM
 #477

-snip-

The estimated time for a 100-bit key with 24x2080ti is approximately 24 hours.

24x2080ti gives 29500 MJump/sec, so 2^(100/2+1.15) / 29500 MJump/sec --> ~24 hours
You mean if you run 400 GPUs, then how could you find the 110-bit key in 2 days? Smiley

13sXkWqtivcMtNGQpskD78iqsgVy9hcHLF
HardwareCollector
Member
**
Offline Offline

Activity: 144
Merit: 10


View Profile
May 28, 2020, 01:51:19 PM
 #478

-snip-
Does someone have an idea why we can't find the key, even with the capacity of 400 GPUs?
With my own implementation, which is memory optimized (16 bytes/point with a 22-bit mask), I will consume between ~181GB and ~1.1TB of RAM from best to worst case for a 109-bit interval. If the 110-bit private key challenge hasn't been solved by @zielar before Sunday night, then I would agree that something is definitely wrong; assuming that he's used in excess of 1TB of RAM, otherwise it's just bad luck.
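HardwareCollector's memory figures can be sanity-checked with a rough DP-count estimate. A Python sketch under stated assumptions (the 2^(n/2+1.15) expected-operations factor from earlier in the thread and a single shared table), which is why it lands between his best and worst cases rather than on either:

```python
def dp_memory_gb(range_bits, dp_bits, bytes_per_point=16):
    # Expected stored distinguished points = expected ops / 2^dp_bits,
    # then multiplied by the per-point storage cost.
    ops = 2 ** (range_bits / 2 + 1.15)
    points = ops / 2 ** dp_bits
    return points * bytes_per_point / 1e9

# 109-bit interval, 22-bit DP mask, 16 bytes/point:
gb = dp_memory_gb(109, 22)
```

This gives roughly 215 GB, inside the ~181GB to ~1.1TB spread he quotes for best to worst case.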
Jean_Luc (OP)
Sr. Member
****
Offline Offline

Activity: 462
Merit: 696


View Profile
May 28, 2020, 01:57:05 PM
 #479

I really like this race Cheesy
HardwareCollector or Zielar ?
Etar
Sr. Member
****
Offline Offline

Activity: 616
Merit: 312


View Profile
May 28, 2020, 02:31:43 PM
 #480

I really like this race Cheesy
HardwareCollector or Zielar ?

friendship)