WanderingPhilospher
Sr. Member
  

Activity: 1498
Merit: 286
Shooters Shoot...
|
 |
January 29, 2023, 07:05:02 PM |
|
Exactly, that's right! A nice result, but you have 6 nice GPUs. With one GPU it would have taken you, let's say, about 15 minutes; still a good result.
Well, I did say in my original post, "As always, it depends on the program you are running and how much hardware you have." That was a 48-bit range, which could take up to 4-5 hours to check every key, so I erred on the side of caution and ran with 6 GPUs lol... Thank you for the 14-missing-characters test, I appreciate it.
wif (the last 17 are missing) KwRPC6Be7ukp2fh4rVYU4GrmfSdCweo2RxL
Compressed address 1MpqX7tzAWAo1bZv6msHM3mkQ6fHFh2mjE
That's a 60-bit range, C018588C2B6CC47... I'm gonna have to pass on that one lol.
Can you post, for example, the command line you used to perform this search, and can you use it in random mode?
Sure, no problem:
VBCr.exe -stop -t 0 -drk 1 -dis 1 -gpu -g 480,512 -r 2900 -begr 6013a6d9493032a55781798cbba36961d73fb72d4a86ac940fcecb3e9857bf1 -endr 6013a6d9493032a55781798cbba36961d73fb72d4a86ac94cfe723cac3c4838 1MpqX7tzAWAo1bZv6msHM3mkQ6fHFh2mjE
pause
Note: This is searching with a GPU. You can change it to CPU by adding -t 4 (or however many cores you want to use) and deleting the -gpu and -g flags. You can also adjust the -r flag to set how often the CPU/GPU generates new random keys. The -r value is in millions, so if you use -r 10, it will generate new random keys after 10,000,000 keys have been checked.
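To make the -r arithmetic concrete, here is a small Python sketch (the 48-bit range width is just an assumed example for illustration; the tool itself is not involved):

```python
# The -r value is in millions of keys: -r 10 means re-randomize after
# 10,000,000 keys have been checked.
def keys_per_rekey(r_value: int) -> int:
    """Keys checked before new random starting points are generated."""
    return r_value * 1_000_000

range_size = 2**48                      # assumed 48-bit search range
per_batch = keys_per_rekey(2900)        # the -r 2900 from the example above
batches_to_cover = range_size / per_batch

print(per_batch)                        # 2,900,000,000 keys per random batch
print(f"{batches_to_cover:.0f} batches to equal the range size")
```

So with -r 2900 you would need on the order of 97,000 random batches before the total keys checked equals the size of a 48-bit range (random search gives no coverage guarantee, of course).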
|
|
|
|
|
proseller01
Copper Member
Newbie

Activity: 14
Merit: 1
Only 21 million.
|
 |
January 29, 2023, 10:37:21 PM |
|
How much estimated time will BitCrack take to solve puzzle 65 with 4 GPU 2080s?
|
|
|
|
|
citb0in
|
 |
January 29, 2023, 10:57:17 PM |
|
How much estimated time will BitCrack take to solve puzzle 65 with 4 GPU 2080s?
Not too long. Save yourself the power consumption and the noise from the GPU fans. Here's the private key for it. SCNR
|
Some signs are invisible, some paths are hidden - but those who see, know what to do. Follow the trail - Follow your intuition - [bc1qqnrjshpjpypepxvuagatsqqemnyetsmvzqnafh]
|
|
|
Lolo54
Member


Activity: 133
Merit: 32
|
 |
January 30, 2023, 12:29:08 AM |
|
Exactly, that's right! A nice result, but you have 6 nice GPUs. With one GPU it would have taken you, let's say, about 15 minutes; still a good result.
Well, I did say in my original post, "As always, it depends on the program you are running and how much hardware you have." That was a 48-bit range, which could take up to 4-5 hours to check every key, so I erred on the side of caution and ran with 6 GPUs lol... Thank you for the 14-missing-characters test, I appreciate it.
wif (the last 17 are missing) KwRPC6Be7ukp2fh4rVYU4GrmfSdCweo2RxL
Compressed address 1MpqX7tzAWAo1bZv6msHM3mkQ6fHFh2mjE
That's a 60-bit range, C018588C2B6CC47... I'm gonna have to pass on that one lol.
Can you post, for example, the command line you used to perform this search, and can you use it in random mode?
Sure, no problem:
VBCr.exe -stop -t 0 -drk 1 -dis 1 -gpu -g 480,512 -r 2900 -begr 6013a6d9493032a55781798cbba36961d73fb72d4a86ac940fcecb3e9857bf1 -endr 6013a6d9493032a55781798cbba36961d73fb72d4a86ac94cfe723cac3c4838 1MpqX7tzAWAo1bZv6msHM3mkQ6fHFh2mjE
pause
Note: This is searching with a GPU. You can change it to CPU by adding -t 4 (or however many cores you want to use) and deleting the -gpu and -g flags. You can also adjust the -r flag to set how often the CPU/GPU generates new random keys. The -r value is in millions, so if you use -r 10, it will generate new random keys after 10,000,000 keys have been checked.
Thanks. Can you post an example of using the -rp command?
|
|
|
|
|
WanderingPhilospher
Sr. Member
  

Activity: 1498
Merit: 286
Shooters Shoot...
|
 |
January 30, 2023, 01:27:26 AM |
|
Exactly, that's right! A nice result, but you have 6 nice GPUs. With one GPU it would have taken you, let's say, about 15 minutes; still a good result.
Well, I did say in my original post, "As always, it depends on the program you are running and how much hardware you have." That was a 48-bit range, which could take up to 4-5 hours to check every key, so I erred on the side of caution and ran with 6 GPUs lol... Thank you for the 14-missing-characters test, I appreciate it.
wif (the last 17 are missing) KwRPC6Be7ukp2fh4rVYU4GrmfSdCweo2RxL
Compressed address 1MpqX7tzAWAo1bZv6msHM3mkQ6fHFh2mjE
That's a 60-bit range, C018588C2B6CC47... I'm gonna have to pass on that one lol.
Can you post, for example, the command line you used to perform this search, and can you use it in random mode?
Sure, no problem:
VBCr.exe -stop -t 0 -drk 1 -dis 1 -gpu -g 480,512 -r 2900 -begr 6013a6d9493032a55781798cbba36961d73fb72d4a86ac940fcecb3e9857bf1 -endr 6013a6d9493032a55781798cbba36961d73fb72d4a86ac94cfe723cac3c4838 1MpqX7tzAWAo1bZv6msHM3mkQ6fHFh2mjE
pause
Note: This is searching with a GPU. You can change it to CPU by adding -t 4 (or however many cores you want to use) and deleting the -gpu and -g flags. You can also adjust the -r flag to set how often the CPU/GPU generates new random keys. The -r value is in millions, so if you use -r 10, it will generate new random keys after 10,000,000 keys have been checked.
Thanks. Can you post an example of using the -rp command?
https://github.com/JeanLucPons/VanitySearch#step-1
|
|
|
|
|
humerh3
Member


Activity: 684
Merit: 17
|
 |
January 30, 2023, 09:54:30 PM Last edit: January 30, 2023, 10:23:53 PM by humerh3 |
|
First time using Linux in my life; I'm stuck at "command not found" using root@ or even sudo, lol. I can't even get to GitHub from the Linux command shell. I wonder what this ~# error is, lol.
Linux: 1) sudo su - superuser/admin rights; 2) then run the commands. You must also enter your system password when prompted and press Enter.
|
------------------------------------------------------------------------ 1DonateWffyhwAjskoEwXt83pHZxhLTr8H ------------------------------------------------------------------------
|
|
|
NotATether
Legendary

Activity: 2324
Merit: 9660
┻┻ ︵㇏(°□°㇏)
|
 |
February 02, 2023, 11:52:06 AM |
|
Linux: 1) sudo su - superuser/admin rights; 2) then run the commands. You must also enter your system password when prompted and press Enter.
I don't know why cryptoxploit wrote that instruction there, but you don't need to be root to run GPU BitCrack or any of the commands listed. It just so happens that the GPU systems you can rent online are automatically provisioned with a root account, so that is what most people end up using to run it.
|
|
|
|
Robert_BIT
Newbie

Activity: 33
Merit: 0
|
 |
February 07, 2023, 07:22:12 PM |
|
Hello,
Does anyone know how to best search for multiple addresses at once? Do we just add them line by line in a .txt file? What if we want to search for a large number of addresses; can we use a database / Bloom filter of some sort?
I am not sure of the limitations of searching a large number of addresses at once... I guess even for a small range like 2^40, using more addresses will make the search harder. But I haven't found any data on this on GitHub...
|
|
|
|
|
npuath
Copper Member
Jr. Member

Activity: 42
Merit: 67
|
 |
February 07, 2023, 07:55:58 PM Last edit: February 07, 2023, 10:39:14 PM by npuath |
|
Does anyone know how to best search for multiple addresses at once?
As is often the case, the problem as stated is massively underspecified. For instance:
- Are your target addresses correlated, perhaps even strictly sequential (in which case the solution is trivial)?
- How large is the dataset; is it feasible to transfer it to several kernels for parallelisation?
- Is the dataset mutable, or is it feasible to perform an initial, heavy transformation (in which case perfect hashing might outperform a probabilistic/Bloom filter)?
Et cetera. A general-purpose database engine is almost certainly not optimal in either case.
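For a sense of what the Bloom-filter option costs, here is a sketch using the textbook sizing formulas; the item count and false-positive rate below are illustrative assumptions, not figures from this thread:

```python
import math

def bloom_bits(n: int, p: float) -> int:
    """Optimal Bloom filter size in bits for n items at false-positive rate p."""
    return math.ceil(-n * math.log(p) / math.log(2) ** 2)

def bloom_hashes(n: int, m: int) -> int:
    """Optimal number of hash functions for n items in an m-bit filter."""
    return max(1, round(m / n * math.log(2)))

n = 55_000_000     # assumed: tens of millions of hash160 targets
p = 1e-9           # assumed: false-positive rate low enough to trust hits
m = bloom_bits(n, p)
print(m / 8 / 1e6)        # filter size in MB: roughly 300 MB
print(bloom_hashes(n, m)) # number of hash functions needed
```

At ~43 bits per item this stays in RAM comfortably for tens of millions of targets, which is why the Bloom approach is popular; a perfect hash only wins once the set is static and the build cost can be amortized.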
|
|
|
|
|
Robert_BIT
Newbie

Activity: 33
Merit: 0
|
 |
February 07, 2023, 10:44:13 PM |
|
Does anyone know how to best search for multiple addresses at once?
As is often the case, the problem as stated is massively underspecified.
----- I know, sorry... -----
- Are your target addresses correlated, perhaps even strictly sequential (in which case the solution is trivial)?
----- No link between the addresses. -----
- How large is the dataset?
----- 2^69 - 2^70 addresses. -----
Is it feasible to transfer it to several kernels for parallelisation?
----- No idea; perhaps. -----
- Is the dataset mutable, or is it feasible to perform an initial, heavy transformation (in which case perfect hashing might outperform probabilistic/Bloom)?
----- It is a random dataset of around 2^70 addresses, specifically created so that only a few of them have small keys in the range 2^40-2^41. The files are not yet created, due to the lack of knowledge on how to create a dataset that would work with tools such as BitCrack, etc. But essentially the task at hand will be to find those small-range keys while searching the huge dataset mentioned... -----
Et cetera. A general database engine is almost certainly not optimal in either case.
----- Rather than optimal, my question hints at: is this even possible? -----
|
|
|
|
|
npuath
Copper Member
Jr. Member

Activity: 42
Merit: 67
|
 |
February 07, 2023, 11:18:31 PM Last edit: February 08, 2023, 12:42:38 AM by npuath |
|
It is a random dataset of around 2^70 addresses
Wait until April 1, then by all means just add them line by line in a .txt file.
Edit: Because of "The files are not yet created due to the lack of knowledge on how to create a dataset that would work with tools such as bitCrack, etc..." and "...is this even possible?"
I just realised that maybe you're actually serious, in which case I'm sorry to predict that the answer is no, most likely for several years. You won't even be able to "create the files":
- Even if you somehow could create and store addresses as fast as the combined mining community could try hashes in 2023, you'd have to wait 10,000+ years.
- Then there's the matter of storage. Luckily, Seagate expects to start selling 50 TB hard drives in 2026, but you'd still need around 500 million of them.
|
|
|
|
|
WanderingPhilospher
Sr. Member
  

Activity: 1498
Merit: 286
Shooters Shoot...
|
 |
February 08, 2023, 01:51:29 AM |
|
It is a random dataset of around 2^70 addresses
Wait until April 1, then by all means just add them line by line in a .txt file.
Edit: Because of "The files are not yet created due to the lack of knowledge on how to create a dataset that would work with tools such as bitCrack, etc..." and "...is this even possible?"
I just realised that maybe you're actually serious, in which case I'm sorry to predict that the answer is no, most likely for several years. You won't even be able to "create the files": - Even if you somehow could create and store addresses as fast as the combined mining community could try hashes in 2023, you'd have to wait 10,000+ years. - Then there's the matter of storage. Luckily, Seagate expects to start selling 50 TB hard drives in 2026, but you'd still need around 500 million of them.
Yeah, I think people underestimate how large 2^69 is... For Robert_BIT, to help further the example and give you concrete evidence:
Loading : 100 %
Loaded : 55,246,870 Bitcoin xpoints
Those are xpoints, which will be a little larger (file-wise) than addresses. But for 55,246,870 xpoints, that file size in binary is 1.72 GB; in regular text format, the file size is 3.5 GB.
The other thing to consider, if one is creating addresses, is that you have to save at minimum the private key and the address, or else you would not know how to map back to the address. You can create a starting point and do sequential keys, in which case you only need to keep your starting point (private key). But for the example above: 2^69 / 55,246,870 = 10,684,692,370,060; now take that and multiply by 1.72 GB = 18,377,670,876,503 GB; double all numbers for 2^70. That's a lot of GBs.
You could probably trim some data and add a hash table, maybe saving some of the GBs needed, but then you would need to create or modify an existing program to search via a hash table. That is doable, but then you would have to chunk the addresses in order to search for them. There is no way you could store 2^69 addresses in memory or a hash table and search for them; the 55,246,870 alone eat up around 1200 MB of RAM in my program.
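The arithmetic above checks out; as a quick Python sketch using the figures from the post (55,246,870 xpoints in a 1.72 GB binary file):

```python
# Scale the measured binary file (55,246,870 xpoints -> 1.72 GB)
# up to a 2^69-key dataset.
points_loaded = 55_246_870
file_gb = 1.72

scale = 2**69 // points_loaded    # how many files of that size you'd need
total_gb = scale * file_gb        # rough total storage, in GB

print(scale)      # ~10.68 trillion files' worth of points
print(total_gb)   # ~1.84e13 GB, i.e. ~18 zettabytes; double for 2^70
```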
|
|
|
|
|
npuath
Copper Member
Jr. Member

Activity: 42
Merit: 67
|
 |
February 08, 2023, 10:57:00 AM |
|
But for the example above: 2^69 / 55,246,870 = 10,684,692,370,060; now take that and multiply by 1.72 GB = 18,377,670,876,503 GB; double all numbers for 2^70. That's a lot of GBs.
Right, it's 18 ZB, roughly the same as the 24 ZB you get using 20 bytes per address (i.e. just the HASH160, with no prefix, checksum, or private key). In other words, about twice as much as all the world's storage capacity (HDD, flash, tape, optical) in 2023, and it would cost something like $1000 billion.1
1 Using data from IDC and their "Worldwide Global StorageSphere" metric (not to be confused with the "DataSphere", which is the amount of data created and some 10x bigger).
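The 20-bytes-per-address figure can be checked in a couple of lines of Python (the 50 TB drive size is the Seagate figure mentioned earlier in the thread):

```python
# 2^70 addresses stored as raw 20-byte HASH160 values.
total_bytes = 20 * 2**70
zettabytes = total_bytes / 1e21
print(round(zettabytes, 1))       # -> 23.6 ZB

# Number of 50 TB drives that would take:
drives = total_bytes / 50e12
print(f"{drives:.2e}")            # roughly half a billion drives
```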
|
|
|
|
|
Robert_BIT
Newbie

Activity: 33
Merit: 0
|
 |
February 08, 2023, 02:03:27 PM |
|
Understood... I posted a quite unrealistic scenario. Let's dial it back down to earth and try again.
1. How many xpoints could BitCrack / KeyHunt, etc. work with, saved in a dataset, to find small keys within a reasonable amount of time (less than 24 hours)? We can assume the application is running on a high-end computer with a large amount of RAM and disk space. Does anyone have any experience with this? Would you be able to provide the results and the specifications of the machine used?
2. Same question for using addresses instead of points.
Thank you!!
|
|
|
|
|
WanderingPhilospher
Sr. Member
  

Activity: 1498
Merit: 286
Shooters Shoot...
|
 |
February 08, 2023, 02:42:56 PM |
|
Understood... I posted a quite unrealistic scenario. Let's dial it back down to earth and try again.
1. How many xpoints could BitCrack / KeyHunt, etc. work with, saved in a dataset, to find small keys within a reasonable amount of time (less than 24 hours)? We can assume the application is running on a high-end computer with a large amount of RAM and disk space. Does anyone have any experience with this? Would you be able to provide the results and the specifications of the machine used?
2. Same question for using addresses instead of points.
Thank you!!
You would have to give me concrete examples.
Work with = I've tested with 60 million xpoints, so I know it can do that many; how many are you wanting to run/test?
Small key sizes = what do you mean? Search in a 40-bit range, 48-bit range, etc.? Are you talking about ranges, or small key sizes as in private key sizes?
Addresses/xpoints = addresses would be converted to hash160, which is smaller than an xpoint, so I would imagine that, run on the same system with the same amount of RAM, you could load more addresses/hash160s than xpoints.
|
|
|
|
|
Robert_BIT
Newbie

Activity: 33
Merit: 0
|
 |
February 08, 2023, 03:27:29 PM |
|
You would have to give me concrete examples.
Work with = I've tested with 60 million xpoints, so I know it can do that many; how many are you wanting to run/test?
Small key sizes = what do you mean? Search in a 40-bit range, 48-bit range, etc.? Are you talking about ranges, or small key sizes as in private key sizes?
Addresses/xpoints = addresses would be converted to hash160, which is smaller than an xpoint, so I would imagine that, run on the same system with the same amount of RAM, you could load more addresses/hash160s than xpoints.
I want to test maybe 10-15x more, close to 1B addresses. Machine specs: 50 TB SSDs, 128 GB RAM. How should I proceed; just generate a large .txt file and see if it works? I doubt .txt files can handle that much data... Yes, small key sizes as in a 2^35 - 2^40 keysize. How much of an impact does the size of the key have? As in, the smaller the key, the larger the number of xpoints or addresses the tool can work with? Or even if the key is really small, like 123*G, is it still going to search forever within a large dataset of addresses?
|
|
|
|
|
NotATether
Legendary

Activity: 2324
Merit: 9660
┻┻ ︵㇏(°□°㇏)
|
 |
February 08, 2023, 04:23:55 PM |
|
I want to test maybe 10-15x more, close to 1B addresses. Machine specs: 50 TB SSDs, 128 GB RAM. How should I proceed; just generate a large .txt file and see if it works? I doubt .txt files can handle that much data... Yes, small key sizes as in a 2^35 - 2^40 keysize. How much of an impact does the size of the key have? As in, the smaller the key, the larger the number of xpoints or addresses the tool can work with? Or even if the key is really small, like 123*G, is it still going to search forever within a large dataset of addresses?
Bitcoin addresses are about 52 characters long, so if you're dealing with 1 billion of them you are looking at a 53-54 GB text file (including the newline, and if you're using Windows there's also a carriage return before the newline). As long as you are not opening that thing in Notepad and you have 64 GB of RAM to spare, you should not have any problems with memory.
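The size estimate works out as follows (a sketch assuming the ~52 characters per line stated above):

```python
# One address per line: ~52 chars plus the line terminator.
addresses = 1_000_000_000
unix_bytes = addresses * (52 + 1)      # '\n' only
windows_bytes = addresses * (52 + 2)   # '\r\n'

print(unix_bytes / 1e9)     # -> 53.0 GB
print(windows_bytes / 1e9)  # -> 54.0 GB
```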
|
|
|
|
Robert_BIT
Newbie

Activity: 33
Merit: 0
|
 |
February 08, 2023, 05:27:22 PM |
|
Bitcoin addresses are about 52 characters long, so if you're dealing with 1 billion of them you are looking at a 53-54 GB text file (including the newline, and if you're using Windows there's also a carriage return before the newline). As long as you are not opening that thing in Notepad and you have 64 GB of RAM to spare, you should not have any problems with memory.
Thanks, will give it a try in a few days.
|
|
|
|
|
WanderingPhilospher
Sr. Member
  

Activity: 1498
Merit: 286
Shooters Shoot...
|
 |
February 10, 2023, 06:56:23 PM Last edit: February 12, 2023, 01:36:49 AM by WanderingPhilospher |
|
Bitcoin addresses are about 52 characters long, so if you're dealing with 1 billion of them you are looking at a 53-54 GB text file (including the newline, and if you're using Windows there's also a carriage return before the newline). As long as you are not opening that thing in Notepad and you have 64 GB of RAM to spare, you should not have any problems with memory.
Thanks, will give it a try in a few days.
My largest test to date:
Loading : 100 %
Loaded : 100,000,001 Bitcoin addresses
Edit: It would work and run with CPU only, but not with GPU...
Edit 2: It works with 100 million addresses on CPU and GPU...
Edit 3:
Loading : 100 %
Loaded : 200,000,001 Bitcoin addresses
200 million addresses loaded and working with CPU; tested on GPU, it loads and runs with 200 million addresses. It eats up 4300 MB (4.3 GB) of RAM, loading binary into a Bloom filter; I am sure if it were loaded via a text file, it would eat up a lot more. Also, I doubt this will work with BitCrack or any VanitySearch forks.
|
|
|
|
|
NotATether
Legendary

Activity: 2324
Merit: 9660
┻┻ ︵㇏(°□°㇏)
|
 |
February 11, 2023, 04:31:56 PM |
|
200 million addresses loaded and working with CPU; GPU untested. It eats up 4300 MB (4.3 GB) of RAM, loading binary into a Bloom filter; I am sure if it were loaded via a text file, it would eat up a lot more. Also, I doubt this will work with BitCrack or any VanitySearch forks.
Very impressive stats you found, but I highly doubt you will get much higher than this if you test on GPU. This is because of the GPU's architecture, where its memory is separate from the rest of the system RAM. So you could have a 96 GB system (e.g. a very recent MacBook Pro with an external Nvidia GPU; does that even exist??), but the GPU itself will only have 8 or 16 GB total, which puts a ceiling on the number of addresses. That's kind of sad, as these suckers can easily do 500x the search performance of a single CPU socket.
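A rough extrapolation from the figure quoted above (200 million addresses in ~4.3 GB of RAM as a Bloom structure); this assumes the per-entry cost stays constant, which real Bloom-filter sizing only approximates:

```python
addresses = 200_000_000
ram_bytes = 4_300_000_000              # ~4.3 GB reported above
bytes_per_entry = ram_bytes / addresses
print(bytes_per_entry)                 # 21.5 bytes per address

# What a fixed GPU memory budget could hold at that rate:
for gpu_gb in (8, 16):                 # typical consumer-card memory sizes
    capacity = gpu_gb * 1e9 / bytes_per_entry
    print(f"{gpu_gb} GB card: ~{capacity / 1e6:.0f} million addresses")
```

So even an idealized 16 GB card tops out somewhere in the high hundreds of millions of addresses at that cost per entry, and in practice the kernel's working memory lowers the ceiling further.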
|
|
|
|
|