Bitcoin Forum

Bitcoin => Bitcoin Discussion => Topic started by: Baleg00 on March 14, 2018, 02:48:00 PM



Title: My idea of a long term solution for the scalability problem
Post by: Baleg00 on March 14, 2018, 02:48:00 PM
Hi everyone!  :)

I have been using Bitcoin for some time now and I am aware of its current issues, of which I think the biggest is scalability.
There have been two main proposed solutions: SegWit2x, which (if I understand correctly) adds roughly 0.4 MB of extra capacity per block, and a block size increase to 8 MB (personally I think the second one is better).
In my opinion these are all just temporary, short-term (4-5 years, I think) solutions for scalability. If we want to make Bitcoin an everyday currency, we have to think many years ahead, to a time when every third shop will accept it. I think neither SegWit2x nor a block size increase could handle the tremendous transaction volume, and neither deals with the ever-increasing size of the blockchain. That's why I was thinking of an idea that could maybe endure for decades.
I know that my idea is not perfect and has some flaws, but I am curious about your thoughts on it. Maybe we can come up with something better.

The main problem is that if we have to keep the whole blockchain to be able to secure the network, then at some point in the future only large companies will be able to store the terabytes of data, and that makes everything centralized. My solution would be to cut the chain and start a new one. "What?! That's crazy! How are you going to do that?", you might ask. There would be a special block called a "recommencement block". We could define a rule that once the chain reaches a certain size it has to be cut, and calculate which block becomes the new recommencement block (e.g. the block at height 750000). This block contains a list of all used (public) addresses in the chain and the amount of Bitcoin associated with each of them. Every miner constructs this block for themselves and starts mining it like any other block. The miner who first mines it announces it, everyone checks whether it matches their own copy of the blockchain, and then saves it. (This prevents people from altering the numbers, because if it doesn't match the other nodes' chains, they simply won't accept it.) After that everything continues as before; the only difference is that you no longer need the chain before the recommencement block. Sounds good, right?
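Here is a very rough Python sketch of what I mean. The flat address -> balance map and the simple SHA-256 commitment are just placeholders for illustration, not how Bitcoin Core actually stores or hashes anything:
Code:
import hashlib

def build_snapshot_commitment(balances):
    """Serialize an address -> satoshi balance map deterministically
    and hash it, so every node derives the same commitment."""
    # Sort by address so all nodes produce byte-identical snapshots.
    lines = [f"{addr}:{sats}" for addr, sats in sorted(balances.items())]
    payload = "\n".join(lines).encode()
    return hashlib.sha256(payload).hexdigest(), len(payload)

def verify_announced_snapshot(my_balances, announced_commitment):
    """Accept the announced recommencement block only if its commitment
    matches what this node computes from its own copy of the chain."""
    my_commitment, _ = build_snapshot_commitment(my_balances)
    return my_commitment == announced_commitment

# Toy example with made-up addresses and balances (in satoshis).
balances = {"1ExampleAddrA": 150_000_000, "bc1exampleaddrb": 42_000}
commitment, size_bytes = build_snapshot_commitment(balances)
print(commitment, size_bytes)
print(verify_announced_snapshot(balances, commitment))  # True
The point is just that every node can rebuild the same snapshot from its own chain and reject an announced one that doesn't match.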
Well, I did some calculations and got the following results (data mostly taken from Blockchain.info):
- On average, around 549,626 new addresses are created every day.
- The number of addresses on 14 March 2018 was 290,556,386.
- My estimate for the number of addresses on 1 January 2020 is 361,654,270.
- There are 3 types of addresses (P2PKH, P2SH, Bech32) and their average size is ~38 bytes (rounded).
- The size of a floating-point variable (for the balance) is 4 bytes.
Using the values above, we can work out that if this block were mined on 1 January 2020, its size would be ~15GB...  :o That's a lot for one block, I know... I tried compressing a file full of randomly generated addresses and values, and it was still around 10GB. A block of this size would need at least 20 minutes just to build, plus mining it, plus announcing it. The network would be working on just this block for half an hour, which I think could kill the whole system (because the mempool would fill up with transactions in the meantime).
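For reference, here is the back-of-the-envelope arithmetic behind the ~15GB figure, just multiplying the numbers above (a rough estimate, not a measurement):
Code:
# Rough size estimate for the proposed "recommencement block",
# using the figures from the post above.
addresses_jan_2020 = 361_654_270   # estimated address count on 1 Jan 2020
avg_address_size = 38              # bytes, average across P2PKH/P2SH/Bech32
balance_size = 4                   # bytes per balance value, as assumed above

entry_size = avg_address_size + balance_size   # 42 bytes per entry
total_bytes = addresses_jan_2020 * entry_size  # ~15,189,479,340 bytes
print(f"{total_bytes / 1e9:.1f} GB")           # ~15.2 GB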

Let me know what you guys think! ;) Can we make this work somehow? Are computers going to be fast enough to deal with this? Is it even worth thinking about? ??? :'( What are your ideas for solving the issue?


Title: Re: My idea of a long term solution for the scalability problem
Post by: stompix on March 14, 2018, 03:15:21 PM
Not a new idea

https://bitcointalk.org/index.php?topic=913605.5
https://bitcointalk.org/index.php?topic=1385786.0
https://bitcointalk.org/index.php?topic=2585331.0

or:
https://github.com/BitcoinUnlimited/BitcoinUnlimited/issues/340

Besides, your idea is all about blockchain size, yet you opened by framing it as a transaction-capacity (scalability) problem.
Not really the same problem, although pretty close.

I'll start to worry when the chain hits 2 TB.


Title: Re: My idea of a long term solution for the scalability problem
Post by: BrewMaster on March 14, 2018, 03:21:32 PM
Quote
- On average, around 549,626 new addresses are created every day.
- The number of addresses on 14 March 2018 was 290,556,386.
- My estimate for the number of addresses on 1 January 2020 is 361,654,270.
- There are 3 types of addresses (P2PKH, P2SH, Bech32) and their average size is ~38 bytes (rounded).
- The size of a floating-point variable (for the balance) is 4 bytes.

None of these matter for the final block/blockchain size, and they are irrelevant to your proposal.
What matters is the set of Unspent Transaction Outputs (UTXOs). Even if you wanted to create such a block with all these "balances", you would want to keep the UTXOs instead. Here is how the UTXO count has been changing: https://blockchain.info/charts/utxo-count
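For a sense of scale, here is the same back-of-the-envelope estimate redone per UTXO instead of per address. The UTXO count and per-entry sizes below are assumed round numbers for illustration, not real chainstate figures:
Code:
# Back-of-the-envelope UTXO snapshot size, using assumed (illustrative)
# figures rather than real chainstate data.
utxo_count = 60_000_000    # assumed UTXO set size, for illustration
outpoint_size = 36         # txid (32 bytes) + output index (4 bytes)
amount_size = 8            # satoshis stored as a 64-bit integer
script_size = 30           # rough average scriptPubKey length in bytes

entry_size = outpoint_size + amount_size + script_size
total_bytes = utxo_count * entry_size
print(f"{total_bytes / 1e9:.1f} GB")   # ~4.4 GB with these assumptions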


Title: Re: My idea of a long term solution for the scalability problem
Post by: burdagol12345 on March 14, 2018, 03:34:57 PM
I think this situation existed before SegWit2x was implemented; the slow-transaction problem on the blockchain was addressed last year, and I think it is already resolved now, because as far as I can tell I can make trading transactions on exchanges without those long pending periods or similar congestion problems. Maybe you have some good points in your ideas for solving the issue, but for me this situation is not the main reason for the drop in Bitcoin's price in the market.