I have skill and server limitations: I'm not able to handle a huge amount of data. I believe one week of data is already a lot (a 700 kB file). One month of data would be 2.8 MB, which is huge for a webpage.
I will try to learn how to handle databases so I can process many gigabytes of data on the server and deliver only the results to the client. But I don't have the skills (yet), nor the server for it. Learning how to use a database has been on my if-only-I-had-the-time list for a while too, but processing data with a script is quite easy. If it helps: I can quite easily make daily/weekly/monthly versions of only certain columns. If you don't need all the data, that should make it a lot smaller. If you can use it, I can gzip the file too: blockdata.lastweek.txt.gz (this sample file is scheduled to be deleted in 7 days).
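Producing a column subset and gzipping it is a one-pipeline job with standard tools; the file name and column numbers below are made up for illustration:

```shell
# Keep only columns 1 and 3 of a hypothetical tab-separated dump, then gzip it.
printf '1\ta\tx\n2\tb\ty\n' > blockdata.sample.txt   # stand-in for the real file
cut -f1,3 blockdata.sample.txt | gzip > blockdata.small.txt.gz
gzip -dc blockdata.small.txt.gz                      # show the reduced content
```

Since gzip compresses repetitive text data very well, the column subset plus compression together can shrink a 700 kB file considerably.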
|
|
|
I just need someone to create a topic. That's all. There would be something like: all clients of company Brandname could leave their feedback here. Anyone could do that (including you). The question is: I need someone with a trustable account. Why do you need a "trustable" account for this? If I open a topic for you, would you want me to vouch for you?
|
|
|
If you use Electrum to create a multisig wallet, it gives you a long list of addresses (starting with 3).
|
|
|
Bad news from Ledger (again). Can't they just send all customers a big sign to put in front of their house? "Ledger owner here!"
|
|
|
I can see the BCH address I sent to has the funds, but I cannot see them? I can see it on Blockchair though. Can you see the same address in your Electrum wallet? If so, you should be able to recover the BCH forkcoins.
|
|
|
What has happened to my BCH funds? Search for your address on Blockchair: does it show any funds on BCH?
You already have a topic in Altcoin Discussion, which is where this belongs. There's no need to open another topic.
|
|
|
I don't have any issue with waiting days or weeks; I would just like to avoid the coins becoming unusable. You're going to want to read this: Consolidate your small inputs! Any tips on how to find a minimal fee and estimate how big this transaction would be in bytes or kB? Bitcoin Core tells you the transaction size when you manually select inputs. See https://jochen-hoenicke.de/queue/#0,30d for fees. In general, Sundays are best. With this many inputs, I would just be patient, set 1 sat/byte, and wait. If you enable Coin Control in Settings, you can manually select which inputs to use for your transaction. It may be worth picking only the largest ones, or creating several transactions with 50-ish inputs each.
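For a rough size estimate, a common back-of-the-envelope formula for legacy (P2PKH) transactions is size ≈ 10 + 148 bytes per input + 34 bytes per output. This is a general approximation, not something quoted from Bitcoin Core; a quick sketch:

```shell
# Rough size of a legacy (P2PKH) transaction: 10 + 148*inputs + 34*outputs bytes.
inputs=50
outputs=1
size=$((10 + 148 * inputs + 34 * outputs))
fee=$((size * 1))   # at 1 sat/byte, the fee in satoshi equals the size in bytes
echo "$size bytes, ~$fee sat at 1 sat/byte"
```

So a 50-input consolidation at 1 sat/byte costs only a few thousand satoshi; the 148-bytes-per-input term is why input count dominates the fee.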
|
|
|
If in the future I would like to send my entire balance, then doing the following: "send 0.05 BTC + 0.05 BTC to my other wallet with a 1-2 sat/byte fee (it might take days to receive them back)" will decrease the transaction fee and increase transaction speed? Yes. Or in pictures: you have this, which is slow and annoying to use: [image] You want this for quick easy payments: [image]
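To put rough numbers on the savings (using the common legacy P2PKH size approximation of 148 bytes per input, 34 per output, 10 overhead, and made-up fee rates), compare spending 100 small inputs directly at a busy-time 50 sat/byte against consolidating them first at 1 sat/byte:

```shell
# Made-up example: 100 small inputs, rough P2PKH sizes.
direct=$(( (10 + 148*100 + 34*2) * 50 ))       # spend all 100 inputs at 50 sat/byte
consolidate=$(( (10 + 148*100 + 34*1) * 1 ))   # sweep to yourself at 1 sat/byte
later=$(( (10 + 148*1 + 34*2) * 50 ))          # later spend the single input at 50 sat/byte
echo "direct: $direct sat"
echo "consolidate first: $((consolidate + later)) sat"
```

Under these assumptions, consolidating during quiet times cuts the eventual spending cost by more than an order of magnitude.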
|
|
|
The second link has some caching problems, and I don't know how to fix that. It doesn't always update if you've visited the same userID before, but if you load the page in a private window it updates just fine. You can try changing the fetch request like this: function get_info_from_file(url) { return fetch(url, { cache: "no-cache" })
Do I leave out the "re"? See: function get_info_from_file(url, re) { return fetch(url) Does that also mean I'll have to create a new function for this, so that only the Trust list doesn't get cached? The others (like usernames) shouldn't be reloaded each time.
|
|
|
I love the wild theories about this, but couldn't it just be someone who set up (or is testing) some micro payment system?
|
|
|
In the morning after the weekly update [loyce.club], the data does not have time to update, regardless of whether I use the site directly [loyce.club/trust/] or go to the site using an extension [loyce.club/profile.html?id=]. Even though this is in the wrong topic, I'd like to answer it: the first link is updated when I post it here. If your browser doesn't show it, it may need a hard refresh. The second link has some caching problems, and I don't know how to fix that. It doesn't always update if you've visited the same userID before, but if you load the page in a private window it updates just fine.
|
|
|
countries that are crypto friendly when it comes to paying income tax. ~ The Netherlands Lol. Maybe you should explain what you mean by "crypto friendly". If you mean it's legal to earn money in crypto and then sell it, then yes, it's a great country for crypto! But if you mean the income tax percentage, it's not that great: you pay either 37.1% or 49.5%. That's not just on crypto; it's on all income. Is there someone from these countries that has exchanged crypto for a profit and has paid little or no taxes on it? Yes: I've sold crypto for euros. No: the taxes aren't little. I have some bitcoins that I've kept for a couple of years and I'd like to sell them for a profit, without paying a big tax on it, and then transfer the money through bank transfer to my country. Here in my country I'd have to pay 20%, which is not acceptable. Is that a capital gains tax, or income tax? Wanting to go to another country, exchange your bitcoins for fiat and transfer the money by bank transfer to your own country doesn't seem very smart at first sight. That screams "money laundering" indeed!
|
|
|
I want to invoke the GDPR. Even if Admin cared about the GDPR (lol), you'd first need to prove you are "sedeki". If there's a privacy reason for it, Admin is probably willing to change your username.
|
|
|
We can read n from disk line by line and compare it to the current position in k. Yes. In fact, just 2 days ago (on another forum) I was pointed at the existence of "sort -mu": -m, --merge: merge already sorted files; do not sort. This does exactly what you described. I haven't tested it yet, but I assume it's much faster than a "regular" sort. Update: I'm testing this now. However, the bigger problem remains: updating 1.5 billion unique addresses in chronological order. Those lists are unsorted, so for example: existing long list with 12 years of data: [example omitted]. New daily list: [example omitted]. The end result should be: [example omitted]. It can be done with awk '!a[$0]++', but I don't have that kind of RAM. I'm not sure how efficient this is for large datasets; it might also run into the problem of having to read 30 quadrillion bytes. Either way, I can't test it due to lack of RAM. I ended up with sort -uk2 | sort -nk1 | cut -f2. I can think of another option that might work: if I use the sorted list to get the new addresses, I can take those out of the daily update while keeping the chronological order. That way I only have to deal with two 20 MB files, which is easy. After that, all I have to do is append them to the total file.
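A minimal demonstration of sort -mu with throwaway files (the file names and addresses are made up):

```shell
# Two already-sorted lists; sort -mu merges them and drops duplicates
# in a single linear pass, without doing a full re-sort.
printf 'addr_a\naddr_b\naddr_d\n' > old_sorted.txt
printf 'addr_b\naddr_c\n' > new_sorted.txt
sort -mu old_sorted.txt new_sorted.txt
```

Note that -m assumes its inputs are already sorted; feed it unsorted files and it silently produces wrong output, which is why the chronological lists still need separate handling.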
I don't remember if I offered you this before, but I can host this data for you if it's not too big. You did (more or less): If network bandwidth is a problem I'm able to host this on my hardware if you like. So I guess you missed my reply too: I'm more in need of disk space for sorting this data, but I haven't decided yet where to host it. (I can throw up to 300 GB for this project.) I can also set up an rsync cron job to pull updates from your temporary location too. It is a good offer. Currently, disk space isn't the problem. I am looking for a webhost that allows me to abuse the disk for a few hours continuously once in a while. Most VPS providers aren't happy when I do that, and my (sponsored) AWS server starts throttling I/O when I do this. I'm (again) short on time to test everything, but after some discussion on another forum I created a RamNode account. It looks promising so far. If I can automate everything, it's not that expensive to use it only a couple of hours per month.
|
|
|
For example, going over all 2,608 TXs I've historically received, I've merited the Meriter "back" on:
32 occasions within the hour (1.23% of received TXs), 408 occasions within 24 hours (15.64% of received TXs), 605 occasions within 48 hours (23.20% of received TXs). Thanks for making this list! Pmalek asked me before he created this topic, but I didn't have the time. Considering how many Merit transactions I've sent and received, I'm not disappointed at all with 1.06%, 7.04% and 11.18% within 1, 24 and 48 hours respectively!
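For what it's worth, those percentages check out against the 2,608 received transactions:

```shell
# Occasions merited back within 1h/24h/48h as a share of 2608 received merit TXs.
awk 'BEGIN {
  total = 2608
  printf "%.2f%%\n", 32  / total * 100
  printf "%.2f%%\n", 408 / total * 100
  printf "%.2f%%\n", 605 / total * 100
}'
```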
|
|
|
We then read the big list line by line while simultaneously running through the list of new addresses and comparing the values in O(n + k). In this case we can directly write the new file to disk line by line; only the list of new addresses is kept in memory. The problem with this is that running through a 20 MB list takes a lot of time if you need to do it 1.5 billion times. Keeping the 20 MB in memory isn't the problem, but reading 30 quadrillion bytes from RAM still takes much longer than my current system. I may be able to improve on the sorted list by merging lists, and I may be able to improve on everything by keeping big temp files instead of only compressed files (but as always I need some time to do this). Have you considered releasing the big files as torrents with a webseed? This would allow downloaders to still download from your server and then (hopefully) continue to seed for a while, taking some strain off your server. No, so far download bandwidth hasn't been a problem. Only a few people have been crazy enough to download these files. If this ever goes viral, it would be a great solution though.
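If both lists are sorted, the line-by-line merge-compare is exactly what comm does in a single linear pass, avoiding the repeated rescans described above; file names and addresses below are made up:

```shell
# comm -13 suppresses lines unique to the first file and lines common to both,
# printing only lines unique to the second file: the genuinely new addresses.
printf 'addr_a\naddr_b\naddr_d\n' > all_sorted.txt
printf 'addr_b\naddr_c\naddr_d\n' > today_sorted.txt
comm -13 all_sorted.txt today_sorted.txt
```

The extracted new addresses could then be appended to the chronological total file in their original daily order.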
|
|
|
LoyceV is verifying the amount and signatures. I will delete this double post after verification / failure.
I've verified the signatures. Update: Just to be clear, I didn't verify any amounts. I'll leave that to Vod.
|
|
|
I saw that it wasn't reversible, but now (since 2018) it is. Isn't it the other way around? When I started with Bitcoin, broadcasting a double spend was easier than it is now. A UTXO only gets removed from the UTXO set after being confirmed in a block, right? And was it always like that? Yes. Even though most nodes reject "classic" double-spend transactions nowadays, a miner can always choose which transactions to include when they find a block. That's probably why the thieves who monitor addresses with known/compromised private keys usually pay a very large transaction fee.
|
|
|
|