Bitcoin Forum
Author Topic: Hashing the entire blockchain every hash: permanent ASIC resistance / full node requirement  (Read 157 times)
ir.hn (OP)
Member

Activity: 322
Merit: 54



December 21, 2021, 07:16:53 AM
Last edit: December 21, 2021, 07:39:58 AM by ir.hn
 #1

In an altcoin project, Blockvault (https://web.archive.org/web/20211217023208/https://www.naturevault.org/wiki/pmwiki.php/CryptoProjects/Blockvault), I have proposed hashing the entire blockchain with every hash.  While this would slow hashing down dramatically, ASICs currently run at tens of terahashes (trillions of hashes) per second, so we have a little time to spare.

One benefit is that this slow hashing would also use much less power, since you can't just throw more power at the problem to significantly increase hashes per second. So it is greener.

Another benefit is that, if this idea were integrated into Bitcoin, it would still keep the SHA256 algorithm, so it would be essentially plug and play.

It would also ensure that every miner is a full node.  Currently there are relatively few full nodes in the Bitcoin network, which leaves it vulnerable to data loss, or to conflict over which blockchain is the real one in the case of widespread data loss from a solar flare or EMP.  If every miner needed to keep the full blockchain in memory (for every chip), this would vastly improve the robustness of the network.

ASICs and even GPUs rely on low memory requirements to maximize their speedup over CPUs.  Once gigabytes of onboard memory are required for each ASIC chip, that advantage disappears.  A CPU can handle it because it has L1-L3 cache plus SDRAM plus SSD, unlike a GPU or ASIC.

Computers can currently hash multiple gigabytes per second, and the BTC blockchain is a few hundred gigabytes, so I think this idea could be phased in slowly (e.g. only the last 1% of the blockchain needs to be hashed at first) so as not to drop hash rates from terahashes to millihashes per second, lol.
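To make the phase-in concrete, here is a minimal sketch of what such a proof of work could look like, in Python. Everything here (pow_hash, mine, the 4-byte nonce, treating the "tail" of the chain as a plain byte string) is a hypothetical illustration of the idea, not how Bitcoin's PoW is actually defined:

Code:
import hashlib

def pow_hash(chain_tail: bytes, block_header: bytes, nonce: int) -> bytes:
    """Hypothetical PoW: commit to the recent tail of the chain plus the header and nonce."""
    data = chain_tail + block_header + nonce.to_bytes(4, "little")
    # Double SHA-256, as Bitcoin already uses for block headers.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(chain_tail: bytes, block_header: bytes, target: int, max_tries: int = 1_000_000):
    """Search for a nonce whose PoW hash falls below the target."""
    for nonce in range(max_tries):
        h = pow_hash(chain_tail, block_header, nonce)
        if int.from_bytes(h, "big") < target:
            return nonce, h
    return None

The larger chain_tail is, the more data every single attempt has to re-read, which is exactly the memory/bandwidth burden the proposal relies on.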

The good thing is that the time to hash a large dataset scales linearly with its size, not exponentially, so there is no limit to how large the blockchain can get while still being hashable in a timely manner.

Another interesting thing this would do is open up the possibility of forgoing the block "chain" and instead keeping the entire dataset in one monolithic block that gets amended each time.  Such an arrangement could even be set up to allow modifying the blockchain in the past, say if there were community consensus that funds were stolen.  This doesn't have to be a feature, but this sort of design opens up that possibility if desired, and it would also act as compression, reducing the size of the dataset.

As an alternative, another idea would be to hash the UTXO database instead of the blockchain.  This would still ensure that each miner stores important information and would provide ASIC/GPU resistance, while not making hashes take too long, since the UTXO set is much smaller, seemingly about 1-2% the size of the blockchain: https://bitcoin.stackexchange.com/questions/35980/how-big-is-the-utxo-database

pooya87
Legendary

Activity: 3458
Merit: 10577



December 21, 2021, 08:32:37 AM
Last edit: December 23, 2021, 06:35:40 PM by pooya87
Merited by NeuroticFish (3), ABCbits (2)
 #2

It won't work, because miners will end up doing practically a single block compression regardless of the size, considering how hash algorithms are designed and the fact that the blockchain is immutable. Another problem is that you would create a hard cap on blockchain size, since hash algorithms cannot compute the hash of an unlimited-size message, even though that cap is big. For example, for SHA256 this is capped at ~2.1 TB (the cap is actually much bigger; see the correction below).

Another issue with this idea is that it defeats one of the main principles of PoW, which is a computation that is very hard to finish but super easy to validate. In other words, you have to spend a lot of time computing the nonce that gives the desired hash, but it only takes a nanosecond to verify it on any hardware. A regular node would not be able to validate a PoW if it had to compute the hash of the entire blockchain each time, for each block.

So let's take a look at some hash algorithms:
SHA2
In this algorithm the message is broken into smaller blocks and the hash state is updated for each block. For the last block, the size and padding are appended and a final compression is performed to update the hash state.
If you have a 3 GB message + a new block, you can compute the hash state for the 3 GB message up to (but not including) the final block, store it in memory, then add the new block, update the hash state, and process the final block.
Repeat the same thing for 3 GB message + new block + new block, starting from the same stored hash state, so that you essentially only compress one or two blocks each time.
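A minimal sketch of that shortcut in Python, using hashlib's copy() to reuse the hash state already computed over the immutable prefix (the prefix bytes here are just a stand-in for the existing multi-gigabyte chain):

Code:
import hashlib

# Hash state over the immutable prefix (the existing blockchain), computed once.
prefix_state = hashlib.sha256()
prefix_state.update(b"...stand-in for the existing 3 GB of blockchain data...")

def hash_with_new_block(new_block: bytes) -> str:
    """Reuse the precomputed prefix state; only the new data is actually compressed."""
    h = prefix_state.copy()   # copy the midstate instead of re-hashing the whole prefix
    h.update(new_block)
    return h.hexdigest()

# Each candidate costs only a couple of compressions, not a full re-hash of the chain.
print(hash_with_new_block(b"candidate block 1"))
print(hash_with_new_block(b"candidate block 2"))

So the "hash the whole chain" requirement collapses to roughly the cost of hashing just the new block.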

SHA3 (Keccak)
Is slightly more complicated, but the same optimization applies. In this algorithm we first split the data into smaller blocks and absorb each block into the hash state; after absorption is finished we switch to the squeezing stage, where the final hash is calculated.
To compute a 3 GB message you could absorb as much of the data as possible into the hash state up front and leave the final absorption and the squeeze for when the message is complete (i.e. the new block has been added). This intermediary value would be stored and skips a big chunk of the work, shrinking the time to that of hashing a very small message (between 1 and 2 blocks). The final absorption stage and the squeeze are essentially like two block compressions, and their cost is fixed regardless of data size.

Blake2b
Is very similar to SHA256: data is broken into small blocks and the hash state is updated for each block. Eventually there is a similar last block that includes the padding and the message length, and that step can be postponed until we have the final block.

RIPEMD160
Basically the same as SHA256.
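In practice the same midstate reuse is available for all of these; for example, Python's hashlib objects for sha3_256 and blake2b also support copy(), so the same precompute-the-prefix trick shown above applies unchanged (illustrative only):

Code:
import hashlib

for name in ("sha3_256", "blake2b"):
    prefix = hashlib.new(name)
    prefix.update(b"...immutable prefix absorbed once...")
    for block in (b"candidate 1", b"candidate 2"):
        h = prefix.copy()        # reuse the already-processed state
        h.update(block)
        print(name, h.hexdigest()[:16])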

NotATether
Legendary

Activity: 1610
Merit: 6753




December 21, 2021, 02:20:20 PM
 #3

Quote from: ir.hn
As an alternative, another idea would be to hash the UTXO database instead of the blockchain.  This would still ensure that each miner stores important information and would provide ASIC/GPU resistance, while not making hashes take too long, since the UTXO set is much smaller, seemingly about 1-2% the size of the blockchain: https://bitcoin.stackexchange.com/questions/35980/how-big-is-the-utxo-database

You can't hash the entire blockchain, the UTXO structure, or anything else like blocks and transactions without defining a data structure for it, even if it's just concatenating all the values together while assigning a word size to each field.

But I doubt that the blockchain will shrink as a result of hashing all of it together, because you still need to keep that information stored somewhere. Some of the fields cannot be generated programmatically because they describe things like dates. Most likely, all the blocks would just be kept inside a vector instead of a linked list.
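For illustration, here is a minimal sketch of the kind of data structure this implies, serializing a made-up UTXO set with a fixed word size per field and hashing the result; the field layout (32-byte txid, 4-byte index, 8-byte value) is an assumption for the example, not Bitcoin's actual serialization:

Code:
import hashlib

# Hypothetical UTXO entries: (txid, output index, value in satoshis).
utxos = [
    (bytes(32), 0, 5_000_000_000),
    (bytes.fromhex("ab" * 32), 1, 123_456),
]

def serialize_utxos(entries) -> bytes:
    """Concatenate fields with fixed word sizes: 32-byte txid, 4-byte index, 8-byte value."""
    out = bytearray()
    for txid, vout, value in sorted(entries):   # canonical ordering so every node agrees
        out += txid
        out += vout.to_bytes(4, "little")
        out += value.to_bytes(8, "little")
    return bytes(out)

utxo_commitment = hashlib.sha256(serialize_utxos(utxos)).hexdigest()
print(utxo_commitment)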

larry_vw_1955
Sr. Member

Activity: 1064
Merit: 371


December 22, 2021, 01:18:59 AM
 #4


Quote from: ir.hn
Another interesting thing this would do is open up the possibility of forgoing the block "chain" and instead keeping the entire dataset in one monolithic block that gets amended each time. Such an arrangement could even be set up to allow modifying the blockchain in the past, say if there were community consensus that funds were stolen.  This doesn't have to be a feature, but this sort of design opens up that possibility if desired, and it would also act as compression, reducing the size of the dataset.


There is no way that Bitcoin is ever going to allow modifying the blockchain in the past through community consensus. If that ever happened, Bitcoin would be something else, and it would take a big hit to its reputation; as a result, other projects that people tend to look down on might then look better than Bitcoin.

Quote from: pooya87
Another issue with this idea is that it defeats one of the main principles of PoW which is a computation that is very hard to finish but super easy to validate.

Excellent point, I never would have thought of that. But I guess that's a nail in the coffin for hashing the entire blockchain. Not that Bitcoin cares about keeping ASICs out; on the contrary, I think it welcomes them, at the expense of pushing out the "little guy" who only has a CPU or GPU.
pooya87
Legendary

Activity: 3458
Merit: 10577



December 22, 2021, 03:46:32 AM
 #5

Quote from: larry_vw_1955
at the expense of pushing out the "little guy" who only has a cpu or gpu.
To be fair, you can't just mine (alternative PoW algorithms) with any CPU or GPU either; you'd need a very strong one, and in most cases you can't even mine with a single GPU, you'd need a GPU rig consisting of multiple units.

larry_vw_1955
Sr. Member

Activity: 1064
Merit: 371


December 23, 2021, 02:18:50 AM
 #6


Quote from: pooya87
To be fair, you can't just mine (alternative PoW algorithms) with any CPU or GPU either; you'd need a very strong one, and in most cases you can't even mine with a single GPU, you'd need a GPU rig consisting of multiple units.

In some cases (not a lot) a CPU is the only possible option, since there are no ASICs and GPUs have too low a hash rate to be profitable. Granted, the CPU needs to be a strong one.  I am sure you know of some situations like that...
tromp
Legendary

Activity: 980
Merit: 1088


December 23, 2021, 07:51:24 AM
 #7

Quote from: pooya87
you can't even mine with a single GPU, you'd need a GPU rig consisting of multiple units.

That makes no sense to me. How can mining with one GPU not be profitable, when mining with multiple GPUs is profitable?
That would seem to require something really weird like a sublinear electricity cost.
pooya87
Legendary

Activity: 3458
Merit: 10577



December 23, 2021, 08:50:29 AM
 #8

Quote from: tromp
That makes no sense to me. How can mining with one GPU not be profitable, when mining with multiple GPUs is profitable?
That would seem to require something really weird like a sublinear electricity cost.
It's about the share of the total work that your computing power (GPU) can perform. Any decent cryptocurrency with a PoW algorithm surely has enough miners, and therefore a high enough difficulty, that you'd need at least a top-tier, expensive GPU. And in most cases, with only one GPU you would be performing such a small portion of the work that your reward wouldn't even reach the pool's withdrawal limit.

garlonicon
Hero Member

Activity: 807
Merit: 1940


December 23, 2021, 09:33:15 AM
Merited by pooya87 (2), ABCbits (2)
 #9

Quote
For example for SHA256 this is capped at ~2.1 TB.
How did you get that number? For SHA-256 the message size is stored as a 64-bit value, counted in bits. For every 2^10 you can move up one unit: 2^10 (kilo), 2^20 (mega), 2^30 (giga), 2^40 (tera), 2^50 (peta), and 2^60 (exa). One byte is eight bits, so 2^3. So 2^64 bits is something like 2 EiB. Maybe with 32-bit values you can have messages that are too big, but for 64-bit values I have never heard "this file is too big to be hashed".
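A quick sanity check of that unit conversion (nothing SHA-specific, just the arithmetic):

Code:
max_bits = 2 ** 64          # SHA-256 length field: 64 bits, counting the message size in bits
max_bytes = max_bits // 8   # = 2^61 bytes
print(max_bytes / 2 ** 60)  # 2.0  -> about 2 EiB
print(max_bytes / 10 ** 12) # about 2,305,843 TB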

Quote
Another issue with this idea is that it defeats one of the main principles of PoW which is a computation that is very hard to finish but super easy to validate.
I wonder how to create difficulty for such a chain, because in Bitcoin you sometimes see the difficulty drop. In the current design, where every block gets a new hash, you can just use an easier target. But with a single-hash chain you always have to push the chain forward, so you always add some work to that chain. That means that even with a two-times-easier difficulty you still have to find a better hash than anyone else in the world has found in all that time.

Basically, in this model the difficulty grows exponentially and there is no way to make it lower. You can easily check that just by computing hashes locally: start from regtest difficulty and then always compute a better hash. If you broadcast the best hash immediately, the difficulty grows exponentially, as the simulation below illustrates. The only way to raise it slowly is to precompute a lot of hashes, store them locally, sort them, and then share them in the right order. But then one lucky miner can instantly raise the difficulty to insane levels, and there is no way to lower that difficulty again.
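An illustrative simulation of that effect, with random numbers standing in for hashes: each "block" must beat the best value ever seen, and the number of attempts needed roughly doubles per block:

Code:
import random

best = 1.0   # stand-in for the best (lowest) hash found so far
for block in range(1, 16):
    attempts = 0
    while True:
        attempts += 1
        h = random.random()   # stand-in for one hash attempt
        if h < best:          # must beat every hash the whole network has ever found
            best = h
            break
    print(f"block {block:2d}: {attempts:8d} attempts, new best {best:.3e}")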

If you want to decentralize things, you should do the opposite: there should be N hashes per block. Of course, such things should be done off-chain, because otherwise the blockchain would quickly be filled with rewards for miners. Ideally, hashes could be joined, but so far I have no better idea than storing N hashes per block.

Quote
Data is broken into small blocks and the hash state is updated for each block.
Exactly. So even in the case of a single hash per chain, it could be turned into what we have today. You could take the SHA-256 IV as your starting point; the first 256 bits would then be a single SHA-256 of the block, and the second 256 bits would be (one_separator + zero_padding + message_size). So, to get any block hash, you could use the previous block hash as your IV, append the current block's 256-bit hash (after running a single SHA-256 on it), and then attach the one, the padding, and the message size (from that field you can even recover the block number).
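A rough sketch of one step of that chaining. Standard hashlib does not let you set a custom IV or call the raw compression function, so this only constructs the 512-bit chunk described above (block hash, the one bit as 0x80, zero padding, 64-bit message size in bits); the compression with the previous block hash as IV is left as a comment:

Code:
import hashlib
import struct

def final_chunk(block_bytes: bytes, block_index: int) -> bytes:
    """Build the 512-bit chunk: block hash, then 0x80, zero padding, and the message size in bits."""
    block_hash = hashlib.sha256(block_bytes).digest()   # first 256 bits of the chunk
    bit_length = (block_index + 1) * 256                 # the size field also encodes the block number
    return block_hash + b"\x80" + b"\x00" * 23 + struct.pack(">Q", bit_length)

chunk = final_chunk(b"some block data", block_index=41)
assert len(chunk) == 64   # exactly one SHA-256 compression-function input
# Compressing this chunk with the previous block hash as the IV would give the next
# "block hash"; hashlib does not expose that raw primitive, so this sketch stops at
# constructing the chunk.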

Quote
Most likely, all the blocks would just be kept inside a vector instead of a linked list.
Yes, because internally SHA-256 just processes its input as a vector of 512-bit chunks, so miners will take advantage of that.
pooya87
Legendary

Activity: 3458
Merit: 10577



December 23, 2021, 06:45:49 PM
 #10

Quote from: garlonicon
How did you get that number? For SHA-256 the message size is stored as a 64-bit value, counted in bits.
You are right, it was a confusion between (the irrelevant) Int32.Max and 64-bit integers! The cap is 2^64 - 1 = 18,446,744,073,709,551,615 bits ≈ 2,305,843 TB.

Quote
I wonder how to create difficulty for such a chain, because in Bitcoin you sometimes see the difficulty drop.
Pretty easily: reduce the number of blocks that have to be hashed. For example, if no block has been found after X minutes, you require the miner to hash only (total count - 10k) blocks.

But keep in mind that even though there are workarounds for all the things I pointed out above (including the precomputation of the hash itself), the biggest unsolvable problem is that nodes would have a very hard time verifying each block, since they too would have to take the long road, which gets increasingly hard as the chain grows.
