Bitcoin Forum
April 09, 2026, 07:21:18 PM *
News: Latest Bitcoin Core release: 30.2 [Torrent]
 
   Home   Help Search Login Register More  
Pages: [1]
  Print  
Author Topic: I want someone to tell me about the current state of UTXO set size optimisation.  (Read 46 times)
bigbroski (OP)
Newbie
*
Offline Offline

Activity: 5
Merit: 0


View Profile
April 08, 2026, 01:30:50 PM
 #1

By this I mean: what are the currently proposed approaches, techniques, or primitives being researched to combat the problem of the heavy UTXO set size? In my opinion, the main bottleneck is not the size itself but its consequences for nodes that verify transactions.
bigbroski (OP)
Newbie
*
Offline Offline

Activity: 5
Merit: 0


View Profile
April 08, 2026, 02:51:52 PM
 #2

None of them.

The increase of the UTXO set translates to a linear increase in the difficulty of verifying incoming transactions for nodes. Attempts like LevelDB optimization or caching hot UTXOs in memory still do not solve the fundamental bottleneck. The set is so big that you inevitably have to store it on a slow disk and do an expensive O(log N) check every time you verify a transaction...

New research paradigms like Utreexo... in my opinion, it shifts the liability too much towards the user.

I just wanted an authentic and comprehensive answer from someone who can answer my original question.
Satofan44
Sr. Member
****
Offline Offline

Activity: 350
Merit: 1042


Don't hold me responsible for your shortcomings.


View Profile
April 08, 2026, 03:16:24 PM
Last edit: April 08, 2026, 03:41:00 PM by Satofan44
Merited by ABCbits (3), vapourminer (1)
 #3

and in my opinion, the main bottleneck is not the size itself but its consequences for nodes that verify transactions.
This part doesn't make sense, as you're contradicting yourself in the same sentence: you say the size isn't the problem, but then you point to the consequences of the size on nodes as the problem. They are basically the same thing.

By this I mean: what are the currently proposed approaches, techniques, or primitives being researched to combat the problem of the heavy UTXO set size?
There are technological and incentive-based ideas floating around, but generally we also educate people not to create worthless UTXOs and to keep consolidating whenever fees are low. Obviously this latter approach has practical limitations at scale and can achieve only so much -- furthermore, it does nothing against attackers who want to spam the UTXO set. I was going to mention new "research paradigms" here, but you wrote a reply while I was composing this post, so you can find the continuation in the next section.
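To make the consolidation point concrete, here is a rough back-of-the-envelope sketch in Python. The virtual sizes are assumed, typical P2WPKH figures (roughly 68 vB per input, 31 vB per output, 11 vB of overhead); the exact numbers depend on script type, and the feerates are purely illustrative:

```python
# Rough fee comparison: consolidating small UTXOs while fees are low vs.
# dragging them all into a later spend at a high feerate.
# Sizes are APPROXIMATE P2WPKH vbytes (assumption, varies by script type).
IN_VB, OUT_VB, OVERHEAD_VB = 68, 31, 11

def tx_vbytes(n_in: int, n_out: int) -> int:
    """Approximate transaction virtual size in vbytes."""
    return OVERHEAD_VB + n_in * IN_VB + n_out * OUT_VB

# Consolidate 20 UTXOs into 1 while fees are low (2 sat/vB)...
consolidate = tx_vbytes(20, 1) * 2
# ...then later spend that single input (1 in, 2 out) at 60 sat/vB,
later_spend = tx_vbytes(1, 2) * 60
# versus spending all 20 inputs directly at 60 sat/vB.
direct_spend = tx_vbytes(20, 2) * 60

print(consolidate + later_spend, "sats vs", direct_spend, "sats")
```

Under these assumptions the consolidate-early path costs a small fraction of the high-feerate spend, which is the whole incentive argument.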

The increase of the UTXO set translates to a linear increase in the difficulty of verifying incoming transactions for nodes. Attempts like LevelDB optimization or caching hot UTXOs in memory still do not solve the fundamental bottleneck. The set is so big that you inevitably have to store it on a slow disk and do an expensive O(log N) check every time you verify a transaction...
What are you talking about? The linear increase relates only to the storage requirements. Furthermore, "inevitably store it on a slow disk"? Are you stuck in 2005, or 2015, or what? Entry-level systems have 8 GB of memory, average systems 16 GB or more. Discounting the current AI scam/boom, memory is becoming cheaper and larger faster than the UTXO set is growing. Is a PCIe SSD a "slow disk"? Cheesy I am not sure the disk is the bottleneck to validation anymore -- for HDDs, yes; for the average device that uses an SSD now, no.
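For what it's worth, Bitcoin Core already exposes this as a knob: the UTXO (chainstate) cache size is set with `-dbcache`, in MiB. With enough RAM, most UTXO lookups never touch the disk at all. The value below is illustrative, not a recommendation -- pick a size that fits your machine:

```shell
# Give the UTXO cache ~8 GiB so lookups stay in memory
# (illustrative value; size it to your available RAM).
bitcoind -dbcache=8000

# Or set it persistently in bitcoin.conf:
#   dbcache=8000
```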

New research paradigms like Utreexo... in my opinion, it shifts the liability too much towards the user.

I just wanted an authentic and comprehensive answer from someone who can answer my original question.
You claim here to want an authentic and comprehensive answer, but then you just wave away one of the main lines of research with some arbitrary reasoning. You are right that things like Utreexo shift some of the burden to the user, but that is the entire point of it -- you are making it out as if it were an arbitrary design choice or a downside of the proposal, and as if it were something bad, which it is not. You can't reduce the UTXO burden by magically making the cost disappear; it has to be moved somewhere.

The software will do this automatically for users/node operators, so it is not much different from many other mechanics we have now. Each approach is going to have different tradeoffs relating to storage and implementation complexity. The real answer is always "there's no free lunch" -- burn that statement into your thick skull.
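To illustrate the "move the cost, don't erase it" idea: in accumulator-style designs the node keeps only a small commitment, and the spender supplies an inclusion proof for the UTXO being spent. This is NOT Utreexo's actual forest structure -- just a toy single-tree Merkle sketch of the principle:

```python
# Toy Merkle accumulator: node stores one root; user carries an O(log N)
# inclusion proof. NOT the real Utreexo design, just the core idea.
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def _next_level(level):
    if len(level) % 2:              # duplicate last node on odd levels
        level = level + [level[-1]]
    return [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]

def merkle_root(leaves):
    level = [h(l) for l in leaves]
    while len(level) > 1:
        level = _next_level(level)
    return level[0]

def merkle_proof(leaves, index):
    proof, level = [], [h(l) for l in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append((level[index ^ 1], index % 2 == 0))
        index //= 2
        level = _next_level(level)
    return proof

def verify(root, leaf, proof):
    node = h(leaf)
    for sibling, leaf_is_left in proof:
        node = h(node + sibling) if leaf_is_left else h(sibling + node)
    return node == root

utxos = [b"utxo-%d" % i for i in range(8)]
root = merkle_root(utxos)              # all the verifier stores
proof = merkle_proof(utxos, 5)         # the user carries this
print(verify(root, b"utxo-5", proof))  # log2(8) = 3 hashes to check
```

The storage cost did not vanish: the node now holds 32 bytes instead of the whole set, and the user holds (and must keep updated) the proof. That is exactly the tradeoff being argued about.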

Utreexo is the dominant proposal as of today, and there are some ZK-based ideas floating around too: https://delvingbitcoin.org/t/proving-utxo-set-inclusion-in-zero-knowledge/1142. In general, all approaches can be grouped into these 3 categories:

  • Moving the state somewhere else.
  • Compressing or committing to it.
  • Splitting the state, e.g. sharding.


There is no magic solution that will make the cost go away completely unless we want to radically change the security assumptions and guarantees that Bitcoin has -- which we don't.


https://delvingbitcoin.org/t/new-utreexo-releases/2371

ABCbits
Legendary
*
Offline Offline

Activity: 3570
Merit: 9923



View Profile
Today at 08:12:39 AM
Merited by vapourminer (1), Satofan44 (1)
 #4

The set is so big that you inevitably have to store it on a slow disk and do an expensive O(log N) check every time you verify a transaction...

CMIIW, but O(log N) doesn't sound so bad. The current total UTXO count is about 114 million, according to https://statoshi.info/d/000000009/unspent-transaction-output-set?orgId=1&from=now-5y&to=now&timezone=browser&refresh=10m. That's roughly only 34% slower/more expensive compared with only 1 million UTXOs.

Code:
log(114000000) / log(1000000)
= 8.057 / 6
= 1.3428
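The ratio is the same in any logarithm base, since the base cancels out. A quick Python check of the arithmetic above:

```python
import math

# Relative cost of an O(log N) lookup at 114M vs. 1M UTXOs.
# The logarithm base cancels in the ratio, so any base works.
ratio = math.log(114_000_000) / math.log(1_000_000)
print(f"{ratio:.4f}")  # about 34% more work per lookup
```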

I just wanted an authentic and comprehensive answer from someone who can answer my original question.

In that case, consider asking on the Delving Bitcoin forum, Bitcoin Stack Exchange, or the bitcoindev group instead. Note that they have more rules compared with this forum.

The increase of the UTXO set translates to a linear increase in the difficulty of verifying incoming transactions for nodes. Attempts like LevelDB optimization or caching hot UTXOs in memory still do not solve the fundamental bottleneck. The set is so big that you inevitably have to store it on a slow disk and do an expensive O(log N) check every time you verify a transaction...
What are you talking about? The linear increase relates only to the storage requirements. Furthermore, "inevitably store it on a slow disk"? Are you stuck in 2005, or 2015, or what? Entry-level systems have 8 GB of memory, average systems 16 GB or more. Discounting the current AI scam/boom, memory is becoming cheaper and larger faster than the UTXO set is growing. Is a PCIe SSD a "slow disk"? Cheesy I am not sure the disk is the bottleneck to validation anymore -- for HDDs, yes; for the average device that uses an SSD now, no.

It's also worth mentioning that Bitcoin Core lets you store blockchain data in a separate folder. So if you have limited SSD capacity, you could keep the chainstate and all other files on the SSD and put the raw blockchain data elsewhere.
