Bitcoin Forum
Topic: Bitcoin block data (728 GB): inputs, outputs and transactions (Read 2898 times)
yogg | Legendary
March 15, 2022, 05:54:13 PM
#61

Quote
Let me know, I can provide an FTP access to you.

Quote
People still use FTP these days? I remember using FTP with FileZilla a long time ago.

Nothing better than the good old foundations of IT. Smiley
Like, emacs is more pure and "C-itonic", as it's been delivered to us mere mortals by the founder of GNU himself. (f*ck vi Tongue "<esc> :wq!" wtf)

However, we'll go with SCP for this one. LoyceV, I'm PMing you the credentials soon.




Which "project" do you mean exactly? List of all Bitcoin addresses with a balance? That one is covered, it only takes 20 GB to host. Although by now the 2 TB monthly bandwidth isn't enough anymore, at day 15 it already used 1.2 TB so it'll go offline at the end of the month.
If you happen to have 700 GB available, by all means, share "inputs, outputs and transactions" Cheesy

Yes, this is the one I meant to use for my project. (Bitcoin addresses with a balance)

I can find the 700 GB and host "inputs, outputs and transactions" Smiley
Luckily the bandwidth is unmetered so I can also provide a mirror for some of the other dumps.


Quote
It could work, but having to upload updates instead of just processing them on the server is far from ideal.

How is the data processed? Running on Bitcoin Core?
LoyceV (OP) | Legendary
March 15, 2022, 06:10:30 PM
#62

Quote
However, we'll go with SCP for this one. LoyceV, I'm PMing you the credentials soon.
I actually prefer rsync, because scp changes the original file date, which is undesirable.
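(A minimal illustration, with hypothetical file and host names: rsync's -t flag, implied by -a, carries the source's modification time across, while a plain scp stamps the copy with the transfer time unless you pass -p.)
Code:
# plain scp: the destination file gets a fresh mtime (use scp -p to keep it)
scp inputs.tsv.gz user@mirror:/data/
# rsync -a (archive mode) preserves the original modification time
rsync -a inputs.tsv.gz user@mirror:/data/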

Quote
Yes, this is the one I meant to use for my project. (Bitcoin addresses with a balance)
In that case, just get the latest version Smiley No need for me to upload that one.

Quote
I can find the 700 GB and host "inputs, outputs and transactions" Smiley
Luckily the bandwidth is unmetered so I can also provide a mirror for some of the other dumps.
It would be great to have a mirror with decent bandwidth. If I don't have to upload 665 GB again when my host fails, it's already worth it.

Quote
How is the data processed ? Running on Bitcoin Core ?
If only I had the skills for that! I use this:
Credits
Blockchair Database Dumps has a staggering amount of data, easily accessible (at 100 kB/s) with daily updates. All data in this topic comes from Blockchair.

Quote
(f*ck vi Tongue "<esc> :wq!" wtf)
What's wrong with vi? I literally use it every day Cheesy

yogg | Legendary
March 15, 2022, 10:08:08 PM (last edit: March 15, 2022, 10:20:34 PM)
#63

Quote
However, we'll go with SCP for this one. LoyceV, I'm PMing you the credentials soon.
I actually prefer rsync, because scp changes the original file date, which is undesirable.

Ugh ... that leaves me with no other choice than to use touch.



(Actually I'd prefer to do that ... I can't do rsync straight to that appliance, and I would rather avoid setting up a buffer VM just for rsync.)
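(A sketch of that touch workaround, filenames hypothetical; run on the receiving side after the scp copy:)
Code:
# re-stamp the copy with a known date...
touch -d "2022-03-15 00:00:00" inputs.tsv.gz
# ...or mirror the mtime of a local reference file:
touch -r reference.tsv.gz inputs.tsv.gz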



Quote
No need for me to upload that one.

Spot on. Cheesy

Quote
It would be great to have a mirror with decent bandwidth. If I don't have to upload 665 GB again when my host fails, it's already worth it.

How do you download this dump?
I'm asking since, yeah, the data needs to be updated every once in a while.

Are you downloading every update at 100 kB/s?
If yes, we'd "only" save some time if you upload it to my end.

Giving you the ability to blink my hardware's light emitting diodes might be pointless.


Quote
Quote
How is the data processed ? Running on Bitcoin Core ?
If only I had the skills for that! I use this:
Credits
Blockchair Database Dumps has a staggering amount of data, easily accessible (at 100 kB/s) with daily updates. All data in this topic comes from Blockchair.

Ha. Makes sense. Smiley
Some time ago I started coding a script to get something similar to the all-addresses-with-a-balance dump.
Unfortunately, the initial processing of the blockchain was too resource-intensive. The process never completed.

However... this got me thinking.
If the "inputs, outputs and transactions" dump is something like a CSV file, just compiling the "missing" data with Bitcoin Core and appending it to the dump could do the trick.

Quote
(f*ck vi Tongue "<esc> :wq!" wtf)
What's wrong with vi? I literally use it every day Cheesy



The people who showed me the path to emacs warned me about vi users.
Every time you use the dd p shortcut, the init entity turns off Num Lock on a boomer's keyboard, so they can no longer log in to their Windows session. And they can't figure out why.
You monster!  Angry
PawGo | Legendary
March 16, 2022, 07:22:39 AM
#64

Quote
(f*ck vi Tongue "<esc> :wq!" wtf)
What's wrong with vi? I literally use it every day Cheesy

Is it true that if you tell an average user to exit vi, you'll get a natural RNG, or at least a good example of a password (compliant with any strong password policy)?

LoyceV (OP) | Legendary
March 16, 2022, 09:13:01 AM
Merited by PowerGlove (1)
#65

Quote
Ugh ... that leaves me with no other choice than to use touch.
I used wget instead; this should preserve file dates.
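(That matches wget's documented behavior: for HTTP downloads it sets the saved file's timestamp from the server's Last-Modified header. The filename below is illustrative:)
Code:
# the saved file keeps the remote Last-Modified timestamp
wget http://blockdata.loyce.club/outputs/some_daily_dump.tsv.gz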

Quote
How do you download this dump?
Also using wget.

Quote
Are you downloading every update at 100 kB/s?
Yes.

Quote
If yes, we'd "only" save some time if you upload it to my end.
Using rsync would be so easy Wink

Quote
Giving you the ability to blink my hardware's light emitting diodes might be pointless.
It should be going wild now.

Quote
If the "input outputs transaction" dump is something like a csv file, just compiling the "missing" data with Bitcoin Core and append it to the dump could do the trick.
Until I can get the full data from my own Bitcoin Core installation, I'll keep using the data dumps. And even if I can get the data from Bitcoin Core, it would further increase the VPS requirements.

Quote
The people who showed me the path to emacs warned me about vi users.
What's emacs?
[screenshot]
I kid you not Cheesy

Quote
Is it true that if you tell an average user to exit vi
It can't be: the average user doesn't use vi.

LoyceV (OP) | Legendary
March 16, 2022, 10:26:04 AM
#66

I've updated all links to the new host. Torrents should work again too.



Quote
You could run Bitcoin Core and a processing script on a local device, then upload the result to your VPS/seedbox.
Apart from the fact that I wouldn't know how to do this, I don't really want to add more load to my local PC.

PawGo | Legendary
March 16, 2022, 10:57:28 AM
#67


Quote
You could run Bitcoin Core and a processing script on a local device, then upload the result to your VPS/seedbox.
Apart from the fact that I wouldn't know how to do this, I don't really want to add more load to my local PC.

Wait, I am lost. How do you currently extract the data you publish?
For operations on local blocks I have used this parser: https://github.com/gcarq/rusty-blockparser
Configuration was really easy; processing, of course, takes some time.
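(For reference, an invocation would look roughly like this; the callback names come from the project's README, and exact flags may differ between versions:)
Code:
# parse the local block files and dump per-address balances
rusty-blockparser --blockchain-dir ~/.bitcoin/blocks balances ./parser-output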
LoyceV (OP) | Legendary
March 16, 2022, 11:05:08 AM
#68

How do you currently extract the data you publish?
See:
Credits
Blockchair Database Dumps has a staggering amount of data, easily accessible (at 100 kB/s) with daily updates. All data in this topic comes from Blockchair.
(nobody ever reads the OP)

Quote
For operations on local blocks I have used that parser: https://github.com/gcarq/rusty-blockparser
Configuration was really easy, processing of course takes some time.
Memory usage to get balances: ~18 GB. That would take a strong VPS.

PawGo | Legendary
March 16, 2022, 11:21:04 AM
#69

See:
Credits
Blockchair Database Dumps has a staggering amount of data, easily accessible (at 100 kB/s) with daily updates. All data in this topic comes from Blockchair.
(nobody ever reads the OP)

;-) Your point.
The same applies to the privacy policy and the washing machine manual.

Quote
Quote
For operations on local blocks I have used that parser: https://github.com/gcarq/rusty-blockparser
Configuration was really easy, processing of course takes some time.
Memory usage to get balances: ~18 GB. That would take a strong VPS.

Oops. I did not use it, only the transactions dump. It is up to you if you want to use it, even partially.

Yesterday I was looking for some hosting services payable in BTC and, honestly speaking, I haven't found anything I would trust. Once I tried another host just to compare with my main one, but the service and reliability were terrible. I had the impression that all these companies offering servers for BTC run their "datacenters" out of a garage somewhere.
Have you seen https://www.sync.com/pricing-individual/ ?
LoyceV (OP) | Legendary
March 16, 2022, 12:33:58 PM
Merited by ABCbits (1)
#70

Yesterday I was looking for some hosting services for BTC and honestly speaking I haven't found anything I would trust.
For projects like this topic I don't really need AWS-level uptime, so I gladly go for the budget hosts. I run this project at Racknerd (good deals via Lowendtalk.com), and (so far) I'm quite happy with it. The prices on their own site are much higher.
You could also go for RamNode, which is pay-by-the-hour and less cheap, but from what I've seen RamNode has very solid performance too.
I don't think any of those are run from a garage Wink I run this project at Gullo's Hosting (again: see Lowendtalk.com for deals), which has the unique feature that it's run by one guy. The servers are international, so not from his garage, but you'll always deal with the same guy. He'll try his best, which for this project is all I need.

I also found out that several of the higher range VPS providers don't accept Bitcoin, or demand a copy of my passport (yeah, right!) first.

Quote
Have you seen https://www.sync.com/pricing-individual/ ?
That's not a VPS, so it won't help me much.

LoyceV (OP) | Legendary
March 26, 2022, 02:24:31 PM
#71

Big update

First: the Torrents won't last. I got an anonymous sponsor for a dedicated server. The Seedbox expires on April 6, so the Torrents won't work much longer:

Quote
New! Torrents!
inputs.torrent
outputs.torrent
transactions.torrent
For privacy, you may want to consider using a VPN so other users can't see your IP address.

The data is back at its original location.
This server has a 50 TB/month bandwidth limit. So far, at most a few people per month have downloaded this (crazy amount of) data, so that should be sufficient.

Note: some files are missing; those are still being updated. By tomorrow, automated daily updates should be back on track.

PrimeNumber7 | Copper Member | Legendary
March 26, 2022, 05:13:11 PM
#72

Quote
My VPS doesn't have enough storage for the "inputs" torrent, so I didn't let the download finish. But it starts almost immediately, at a speed similar to before.
It is ~never appropriate to store that much data on a VPS. You are much better off storing these files in a storage bucket. With any other solution, you will either quickly hit your transfer limits, or your files will eventually get taken down because you are taking up too many resources.

Hosting your files in a storage bucket means that others can ~instantly download them (limited only by their own bandwidth and equipment). Allowing people to download your files from the internet will be expensive; however, this can be addressed by configuring your bucket so that the requester (the person downloading your file) pays for the egress bandwidth.
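(A concrete sketch with a hypothetical bucket name; on AWS this is a bucket-level setting, and downloaders must explicitly acknowledge the charges:)
Code:
# enable requester-pays on an S3 bucket (the downloader pays egress)
aws s3api put-bucket-request-payment \
    --bucket example-block-data \
    --request-payment-configuration '{"Payer":"Requester"}'
# downloaders then opt in to the charges on each request
aws s3 cp s3://example-block-data/inputs.tsv.gz . --request-payer requester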

All major cloud providers offer storage buckets. Many smaller cloud providers do as well.
LoyceV (OP) | Legendary
March 26, 2022, 07:23:18 PM
#73

Quote
Hosting your files in a storage bucket
We discussed this already. But I just got a very nice Xeon-powered dedicated server (no more VPS!) from an anonymous donation, so I'm covered for now.

Quote
All major cloud providers offer storage buckets. Many smaller cloud providers do as well.
I'm curious: what would it cost to store a TB in a storage bucket?

PrimeNumber7 | Copper Member | Legendary
March 26, 2022, 08:26:49 PM
#74

Quote
Hosting your files in a storage bucket
We discussed this already. But I just got a very nice Xeon-powered dedicated server (no more VPS!) from an anonymous donation, so I'm covered for now.
Ahh, yes, there it is. I thought I remembered giving you this advice, but I couldn't find the discussion in this thread.

Quote
All major cloud providers offer storage buckets. Many smaller cloud providers do as well.
I'm curious: what would it cost to store a TB in a storage bucket?
AWS and GCS have the same pricing structure, generally speaking. If you hosted the files on either of those platforms, it would cost approximately US$20 per month. This is not the same as hosting your files on a VPS: accessing and transferring the files would be much quicker.

I believe you have previously stated that you don't want to use AWS because you don't want to have to pay with a credit card, nor associate your IRL identity with the service. My argument would be that this is really your only option, as using a VPS with a smaller provider (as you have been doing) will eventually result in your account being shut down or your bandwidth being exhausted.

There might be other cloud providers that offer storage buckets at a lower price, though they may not be as reliable or offer as high throughput. Your project is not one where you critically need access to all your data at a moment's notice, so this may be okay. You may also be able to find one willing to accept crypto for their services.

If you use a storage bucket on AWS or GCS, someone will need to pay $0.09/GB to transfer a file to the internet, or $0.01/GB to transfer it to another data center run by the same cloud provider on the same continent (transfers within the same data center location are free). You can configure the storage bucket so that the person downloading the file pays the transfer charges.
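(For scale: at the internet-egress rate, one full copy of the dumps discussed in this thread works out to roughly:)
Code:
665 GB x $0.09/GB ≈ $60 of egress per full download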
DaveF | Legendary
March 26, 2022, 10:46:48 PM
#75

Quote
...I believe you have previously stated that you don't want to use AWS because you don't want to have to pay with a credit card, nor associate your IRL identity with the service. My argument would be that this is really your only option, as using a VPS with a smaller provider (as you have been doing) will eventually result in your account being shut down or your bandwidth being exhausted...

Not unless he is working with a really small provider. More and more, 1 Gbps unmetered for co-location is the standard.* Or, if it is metered, it's in the multi-TB range.
Bandwidth has gotten so cheap at data centers that it's pointless to meter it anymore. Case in point: Cogent just made us an offer for a 1 Gbps fiber loop to our rack for under $400 a month, all in. Three years ago that same circuit was well over $1500. Hurricane is under $2500 for a 10 Gbps circuit. And we are a very small buyer of bandwidth. "Real" companies that are buying multi-100 Gbps circuits are paying very, very little.

-Dave

* Most places are going for a 10-to-1 oversubscription, so you may not get the full 1 Gbps all the time, but the point is the same.

PrimeNumber7 | Copper Member | Legendary
March 27, 2022, 06:00:14 AM
#76

Quote
...I believe you have previously stated that you don't want to use AWS because you don't want to have to pay with a credit card, nor associate your IRL identity with the service. My argument would be that this is really your only option, as using a VPS with a smaller provider (as you have been doing) will eventually result in your account being shut down or your bandwidth being exhausted...

Not unless he is working with a really small provider. More and more, 1 Gbps unmetered for co-location is the standard.* Or, if it is metered, it's in the multi-TB range.
Bandwidth has gotten so cheap at data centers that it's pointless to meter it anymore. Case in point: Cogent just made us an offer for a 1 Gbps fiber loop to our rack for under $400 a month, all in. Three years ago that same circuit was well over $1500. Hurricane is under $2500 for a 10 Gbps circuit. And we are a very small buyer of bandwidth. "Real" companies that are buying multi-100 Gbps circuits are paying very, very little.

-Dave

* Most places are going for a 10-to-1 oversubscription, so you may not get the full 1 Gbps all the time, but the point is the same.
It is unlikely that Loyce would be dealing with Cogent directly; rather, he would be dealing with one of Cogent's customers.

Even if someone's bandwidth is "unmetered", I can assure you that usage is still monitored. As you note, Cogent is going to oversell their capacity, and most likely Cogent's customer who sells VPS services will also oversell theirs. If Loyce is constantly sending hundreds of gigabytes to the internet, there will be less capacity for other customers to send their own data, which will degrade service for others.

The files that Loyce is hosting total over 660 GB. It would not take much to hit the multi-TB range with files of that size, especially considering that it is trivial for someone to request those files multiple times.
LoyceV (OP) | Legendary
March 27, 2022, 08:09:35 AM
#77

Quote
AWS and GCS have the same pricing structure, generally speaking. If you hosted the files on either of those platforms, it would cost approximately US$20 per month. This is not the same as hosting your files on a VPS: accessing and transferring the files would be much quicker.
There are several problems with that. First, I can find a much better deal for that price: the Seedbox, for instance, costs much less for 6 TB of bandwidth (and unlimited at 100 Mbit/s after that). And I don't really mind if someone has to wait a day for a 665 GB download. That's a small inconvenience, and I prefer it over people having to buy their own storage bucket and pay for bandwidth before they can download the files.

Quote
I believe you have previously stated that you don't want to use AWS because you don't want to have to use a credit card to pay, nor associate your IRL identity with the service. My argument would be that this is really your only option, as using a VPS with a smaller provider (as you have been doing) is eventually going to result in your account being shut down, or your bandwidth being exhausted.
For this project, I don't really mind if it runs out of bandwidth; that just means some people have to wait until the next month. Torrent was a nice workaround, but more work to update.
As I wrote earlier: at most a few people per month have downloaded this data, which means my current limit of 50 TB should be sufficient. To quote my sponsor about hitting the bandwidth limit:
Quote
Then we can upgrade the server or limit the download speed
Or sell API keys for downloading 😛

Quote
If you use a storage bucket on AWS or GCS, someone will need to pay $0.09/GB to transfer a file to the internet, or $0.01/GB to transfer it to another data center run by the same cloud provider on the same continent (transfers within the same data center location are free). You can configure the storage bucket so that the person downloading the file pays the transfer charges.
Up to $60 to download a few files. No wonder Bezos is rich Cheesy

Quote
The files that Loyce is hosting total over 660 GB. It would not take much to hit the multi-TB range with files of that size, especially considering that it is trivial for someone to request those files multiple times.
It still takes several people crazy enough to download this much data every day to reach 50 TB. My other project currently uses over 2 TB per month, so for now I'm good Smiley
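(A quick back-of-the-envelope check:)
Code:
50 TB / 665 GB ≈ 75 full downloads per month, i.e. about 2.5 per day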

DaveF | Legendary
March 27, 2022, 11:19:24 AM
#78

Quote
Even if someone's bandwidth is "unmetered", I can assure you that usage is still monitored. As you note, Cogent is going to oversell their capacity, and most likely Cogent's customer who sells VPS services will also oversell theirs. If Loyce is constantly sending hundreds of gigabytes to the internet, there will be less capacity for other customers to send their own data, which will degrade service for others.

The files that Loyce is hosting total over 660 GB. It would not take much to hit the multi-TB range with files of that size, especially considering that it is trivial for someone to request those files multiple times.

Cogent, and most providers like them, as a rule do not oversubscribe; if anything, they undersubscribe.

They used to do it, but stopped around the GFC. No, I don't think it's related, but around 2007-2009 we just stopped seeing it, at least in the major DCs. I think there are just too many competitors. I can call Monday AM, have an agreement with another provider in a couple of days, have the fiber connected a couple of days after that, and be running a couple of days after that. So if I don't get my 1 Gbps, you're gone. Heck, when Sandy took out people in NY, loops were being turned up in hours as we all ran around the DCs getting stuff wired.

The people they sell to oversubscribe what they bought.

Quote
If you use a storage bucket on AWS or GCS, someone will need to pay $0.09/GB to transfer a file to the internet, or $0.01/GB to transfer it to another data center run by the same cloud provider on the same continent (transfers within the same data center location are free). You can configure the storage bucket so that the person downloading the file pays the transfer charges.
Up to $60 to download a few files. No wonder Bezos is rich Cheesy

Till we got out of that business, we used to joke that Amazon was our best salesperson.
Their product is good, don't get me wrong, but it's super expensive, most people don't need it, and a 2nd- or 3rd-tier provider is usually fine.
Not to mention, if your provider is not peered fast enough with Amazon, they tend to be super slow, since Amazon prioritizes amazon.com traffic over AWS traffic. In most major markets it's not a big deal, but in the middle of nowhere, when your local ISP only has a 10 Gbps connection to the peering network, come Christmastime it can get bad.

Either way this is drifting from the main point of the thread, so I'm going to leave it alone for now.

Side note: do you know how many people are actually using this data? It's neat and all, but outside of the geek base on this forum I don't see a lot of people caring.

-Dave

LoyceV (OP) | Legendary
April 11, 2022, 10:49:53 AM
#79

I noticed "outputs" wasn't updating because the cronjob started when "transactions" was still running (and Blockchair.com only allows one connection at a time). It's updated now and daily updates should work again.



I used this data to check for myself if the Bitcoin blockchain has duplicate transaction hashes:
Code:
# For each daily dump: take the (block, transaction_hash) columns, drop the header,
# and de-duplicate hashes within each block (the awk part); any hash that still
# appears more than once must then occur in two different blocks.
for day in /var/www/blockdata.loyce.club/public_html/outputs/*gz; do
  echo $day
  gunzip -c $day | cut -f 1-2 | grep -v transaction_hash > outputs.tmp
  for block in $(cut -f1 outputs.tmp | sort -nu); do
    grep -P "^$block\t" outputs.tmp | cut -f2 | awk '!a[$0]++' >> all_transaction_hashes_outputs.txt
  done
  rm outputs.tmp
done
sort -S69% all_transaction_hashes_outputs.txt | uniq -d
d5d27987d2a3dfc724e359870c6644b40e497bdc0589a033220fe15429d88599
e3bf3d07d4b0375638d5f1db5255fe07ba2c4cb067cd81b84ee974b6585fb468
This confirmed that 2 transactions have the same hash: these are the two well-known duplicated coinbase transactions from 2010, the reason BIP 30 exists.

I tried to do the same for "inputs", but unfortunately I don't have the disk space to sort this large amount of data.

PawGo | Legendary
May 14, 2022, 07:44:27 PM
#80

Just a question: for the list of funded addresses, do you take into account unconfirmed balances at the moment of the snapshot, or not? And the opposite: is an address with an unconfirmed spend still included, or already excluded?