Bitcoin Forum

Bitcoin => Development & Technical Discussion => Topic started by: RP08 on December 16, 2018, 06:14:46 AM



Title: On chain scaling
Post by: RP08 on December 16, 2018, 06:14:46 AM
First off, I really want to apologize for wasting everyone's time.
I am no software developer and I have only the vaguest understanding of how blockchain works, technically speaking.
I am just an average Joe who happens to be a huge fan of Bitcoin and not so good at explaining things.
I don't understand how any of this works but I do understand scaling somewhat thanks to Litecoin.
More coins = Faster/cheaper transactions.

Since adding more coins to Bitcoin would be detrimental to its integrity and would result in a fork, as it has time after time,
could you instead scale within the parameters of Bitcoin's 21 million coins?

This is what I mean.
Since there are 21,000,000 bitcoins, which are made up of 2,100,000,000,000,000 (2.1 quadrillion) individual units,
instead of adding more bitcoins, could you just add smaller fractions below the current smallest individual unit?
Then roughly scale the chain up 1:1, increasing the overall number of individual units and making Bitcoin faster?
So if I owned 100,000,000 bitcoin units, I would now own 100,000,000,000 units, which would basically be the same thing.
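
To put rough numbers on this idea, here is a quick sketch (assuming the standard 100,000,000 satoshis per bitcoin and the 1,000x split used in the example above); it only shows that subdividing units multiplies balances, not how many transactions fit in a block:
Code:
# Back-of-the-envelope sketch of redenominating bitcoin into smaller units.
SATS_PER_BTC = 100_000_000      # 10^8 satoshis per bitcoin
MAX_SUPPLY_BTC = 21_000_000

total_units = MAX_SUPPLY_BTC * SATS_PER_BTC
print(total_units)              # 2,100,000,000,000,000 (2.1 quadrillion)

# Splitting each satoshi into 1,000 sub-units multiplies every balance...
balance_sats = 100_000_000      # 1 BTC expressed in satoshis
print(balance_sats * 1_000)     # 100,000,000,000 -- same value, finer units

# ...but block size and block interval are untouched, so the number of
# transactions the chain can confirm per second stays exactly the same.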

Does this make sense at all?

Will better hardware eventually make Bitcoin's scaling problems non-existent anyway?






Title: Re: On chain scaling
Post by: joniboini on December 16, 2018, 07:34:38 AM
I think you got the wrong idea here. Scaling != increasing the number of coins, nor does more coins == faster transaction speeds. There are several factors that affect how fast/slow a block is generated, but the number of coins is not one of them: for example, block size, block time, and difficulty.

Litecoin is relatively faster because it has a faster block generation time (2.5 minutes vs 10 minutes), not because it has more coins. Other altcoins try to improve scaling by increasing the block size so each block can handle more transactions.
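
For a rough sense of why the block interval (rather than the coin supply) is what matters here, a small sketch using only the block times mentioned above; block sizes are assumed comparable:
Code:
# Block frequency comparison; the number of coins plays no role.
BTC_BLOCK_TIME_MIN = 10
LTC_BLOCK_TIME_MIN = 2.5

print(60 / BTC_BLOCK_TIME_MIN)   # 6 Bitcoin blocks per hour
print(60 / LTC_BLOCK_TIME_MIN)   # 24 Litecoin blocks per hour

# With comparable block sizes, 4x as many blocks per hour means roughly 4x
# the transaction capacity and faster first confirmations.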

Your idea won't work, because the block size/block time of Bitcoin will still be the same. CMIIW.

Will better hardware eventually make Bitcoin's scaling problems non-existent anyway?

Not really. Better hardware (I assume mining hardware with very high calculation power) will only result in a higher difficulty to produce a block. It might make average block generation faster than 10 minutes for around two weeks, but it will go back after the difficulty gets readjusted.
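
A simplified model of that retarget behaviour (it ignores the exact target encoding Bitcoin Core uses; the 2016-block interval and the factor-of-4 clamp are the real parameters):
Code:
# Simplified difficulty retarget: every 2016 blocks, difficulty is scaled so
# that the average block time heads back toward 10 minutes.
RETARGET_INTERVAL = 2016        # blocks between adjustments (~2 weeks)
TARGET_BLOCK_TIME = 600         # seconds

def next_difficulty(old_difficulty: float, actual_timespan: float) -> float:
    expected = RETARGET_INTERVAL * TARGET_BLOCK_TIME
    # The adjustment is clamped to a factor of 4 in either direction.
    actual = min(max(actual_timespan, expected / 4), expected * 4)
    return old_difficulty * expected / actual

# Hashrate doubles -> blocks arrive every ~5 minutes -> after the retarget,
# difficulty roughly doubles and block times return to ~10 minutes.
print(next_difficulty(1.0, RETARGET_INTERVAL * 300))   # ~2.0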


Title: Re: On chain scaling
Post by: coinfirst on December 19, 2018, 09:18:37 AM
Not considering sharding, tps is determined by block size, while latency is determined by block time. So if we want to scale on-chain, it is necessary to increase the block size and/or decrease the block time.
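
A minimal sketch of that relationship, assuming an illustrative 500-byte average transaction and ignoring SegWit weight accounting:
Code:
# Rough throughput estimate from block size and block time.
def approx_tps(block_size_bytes, avg_tx_bytes, block_time_s):
    txs_per_block = block_size_bytes / avg_tx_bytes
    return txs_per_block / block_time_s

print(approx_tps(1_000_000, 500, 600))   # ~3.3 tps with 1 MB / 10 min blocks
print(approx_tps(2_000_000, 500, 600))   # ~6.7 tps: double the block size...
print(approx_tps(1_000_000, 500, 300))   # ~6.7 tps: ...or halve the block time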


Title: Re: On chain scaling
Post by: DooMAD on December 21, 2018, 10:47:38 PM
I don't see why we can't scale to 10-gigabyte blocks; even without zero-conf, the worst wait time is about 10 minutes.  Taking an average of 1,000 bytes per TX, we are looking at roughly 10 million transactions per block.

All you need is to find some people who want to run nodes that support these 10 GB blocks.  Otherwise it's not a peer-to-peer network.  Being peer-to-peer is kind of an important aspect of what we're doing here.  It's easy to achieve scaling if you completely compromise the important stuff and make something that works in a similar manner to Visa.  The hard part is scaling in a way that doesn't sacrifice the things that make Bitcoin better than Visa.
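
To make that trade-off concrete, a back-of-the-envelope sketch using the figures quoted above (1,000 bytes per transaction, 10-minute blocks) and assuming every block is actually full:
Code:
BLOCK_SIZE = 10 * 10**9          # 10 GB in bytes
AVG_TX = 1_000                   # bytes per transaction
BLOCK_TIME = 600                 # seconds
BLOCKS_PER_YEAR = 6 * 24 * 365

print(BLOCK_SIZE // AVG_TX)                      # 10,000,000 tx per block
print(BLOCK_SIZE / AVG_TX / BLOCK_TIME)          # ~16,667 tps
print(BLOCK_SIZE * BLOCKS_PER_YEAR / 10**12)     # ~525 TB of new chain data per year

# Every node would also need to download and verify ~10 GB every ten minutes
# (~133 Mbit/s sustained), which is exactly the peer-to-peer concern above.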



Title: Re: On chain scaling
Post by: CreateCryptoCo.in on January 01, 2019, 10:37:57 PM
The scaling issue is a disk issue. We used to host Bitcoin nodes and 40 gigabytes was fine, but when it goes over 100 GB and up to 150 GB, it becomes a hosting problem, meaning the network will be smaller because less stuff will run on Raspberry Pis and other cheap hardware. What you need is a good way to verify transactions without saving them.


Title: Re: On chain scaling
Post by: khaled0111 on January 02, 2019, 12:20:33 AM
The scaling issue is a disk issue. We used to host Bitcoin nodes and 40 gigabytes was fine, but when it goes over 100 GB and up to 150 GB, it becomes a hosting problem, meaning the network will be smaller because less stuff will run on Raspberry Pis and other cheap hardware. What you need is a good way to verify transactions without saving them.

The scalability issue has nothing to do with disk space or storage devices.
The problem is with the increasing number of transactions and the limited block size.


Title: Re: On chain scaling
Post by: joniboini on January 02, 2019, 03:33:48 AM
What you need is a good way to verify transactions without saving them.

How can we verify that you truly own the bitcoin if we don't have a history of past transactions?

The scalability issue has nothing to do with disk space or storage devices.
The problem is with the increasing number of transactions and the limited block size.

Those two are related imo. If we increase the block size and block time, the size of the blockchain will definitely get bigger and probably cause some kind of centralization too, in terms of who stores the blockchain. We need to find a way to verify more transactions without increasing the block size too much, which SegWit did. CMIIW.
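
As a rough illustration of how SegWit's weight accounting does that (the byte counts below are made-up round numbers, not real transaction sizes):
Code:
# Simplified SegWit weight rule (BIP 141): witness bytes are discounted, so
# more transactions fit under the 4,000,000 weight-unit limit without raising
# the 1,000,000-byte base block size.
MAX_BLOCK_WEIGHT = 4_000_000

def tx_weight(base_bytes, witness_bytes):
    # non-witness bytes count 4x, witness bytes count 1x
    return base_bytes * 4 + witness_bytes

legacy_tx = tx_weight(250, 0)      # all data in the base block
segwit_tx = tx_weight(150, 110)    # signatures moved into the witness

print(MAX_BLOCK_WEIGHT // legacy_tx)   # ~4,000 legacy-style tx per block
print(MAX_BLOCK_WEIGHT // segwit_tx)   # ~5,600 segwit-style tx per block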


Title: Re: On chain scaling
Post by: khaled0111 on January 02, 2019, 04:40:35 AM
Those two are related imo. If we increase the block size and block time, the size of the blockchain will definitely get bigger and probably cause some kind of centralization too, in terms of who stores the blockchain. We need to find a way to verify more transactions without increasing the block size too much, which SegWit did. CMIIW.
I have to agree that both are related. There are many solutions to the scalability issue, and one of them (but not the best) is increasing the block size. Now, the blockchain size issue is not that urgent, especially with the fast growth of the technology industry (high Internet bandwidth / storage devices), or we can simply refer to Satoshi's recommendation, since all of this was his idea in the first place:
Quote
as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.
https://satoshi.nakamotoinstitute.org/emails/cryptography/2/


Title: Re: On chain scaling
Post by: ABCbits on January 02, 2019, 05:46:42 AM
The scaling issue is a disk issue. We used to host Bitcoin nodes and 40 gigabytes was fine, but when it goes over 100 GB and up to 150 GB, it becomes a hosting problem, meaning the network will be smaller because less stuff will run on Raspberry Pis and other cheap hardware. What you need is a good way to verify transactions without saving them.

It's a problem for initial sync, not on-chain scaling, as old/cheap hardware can run full nodes smoothly (assuming they have a big enough HDD).

What you need is a good way to verify transactions without saving them.
How can we verify that you truly own the bitcoin if we don't have a history of past transactions?

I think he's talking about pruning, but this still requires you to download the whole blockchain, verify every block/transaction, record the UTXOs in the chainstate, and remove older blocks. But this isn't a real solution to the increasing blockchain size problem.
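
A toy sketch of that idea: the node still validates everything once and keeps the resulting UTXO set (the chainstate), but discards old raw block data afterwards. The transaction format here is stripped down to the bare minimum for illustration:
Code:
from collections import deque

KEEP_RECENT_BLOCKS = 288            # pruned Bitcoin Core keeps at least 288 blocks

utxo_set = {}                       # (txid, vout) -> amount; must always be kept
recent_blocks = deque(maxlen=KEEP_RECENT_BLOCKS)   # older raw blocks fall out

def connect_block(block):
    """Spend inputs, create outputs, then let old raw block data go."""
    for tx in block["txs"]:
        for spent in tx["inputs"]:
            del utxo_set[spent]     # each input must refer to an unspent output
        for i, amount in enumerate(tx["outputs"]):
            utxo_set[(tx["txid"], i)] = amount
    recent_blocks.append(block)

connect_block({"txs": [{"txid": "coinbase0", "inputs": [], "outputs": [50]}]})
connect_block({"txs": [{"txid": "spend1", "inputs": [("coinbase0", 0)],
                        "outputs": [20, 30]}]})
print(utxo_set)   # {('spend1', 0): 20, ('spend1', 1): 30}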

Those two are related imo. If we increase the block size and block time, the size of the blockchain will definitely get bigger and probably cause some kind of centralization too, in terms of who stores the blockchain. We need to find a way to verify more transactions without increasing the block size too much, which SegWit did. CMIIW.
I have to agree that both are related. There are many solutions to the scalability issue, and one of them (but not the best) is increasing the block size. Now, the blockchain size issue is not that urgent, especially with the fast growth of the technology industry (high Internet bandwidth / storage devices), or we can simply refer to Satoshi's recommendation, since all of this was his idea in the first place:
Quote
as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.
https://satoshi.nakamotoinstitute.org/emails/cryptography/2/

There are better solutions, such as:
1. Move transactions off-chain/to side-chains (such as LN and Plasma)
2. Reduce transaction size (such as MuSig/Schnorr and MAST); see the rough size sketch below
3. Reduce the bandwidth used when propagating transactions/blocks (such as Minisketch)
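
For a very rough sense of the savings in point 2, a sketch of the signature data needed for a 2-of-2 spend; all byte counts are approximations, and script/Taproot serialization overhead is ignored:
Code:
ECDSA_SIG = 72       # typical DER-encoded ECDSA signature, approx.
PUBKEY = 33          # compressed public key
SCHNORR_SIG = 64     # BIP 340 Schnorr signature

# Classic 2-of-2 multisig spend: two signatures plus two public keys.
classic_2of2 = 2 * ECDSA_SIG + 2 * PUBKEY

# MuSig-style key aggregation: the two signers jointly produce one Schnorr
# signature for one aggregated key, so the spend looks like a single-sig spend.
aggregated_2of2 = SCHNORR_SIG

print(classic_2of2, aggregated_2of2)        # 210 vs 64 bytes
print(1 - aggregated_2of2 / classic_2of2)   # ~70% less signature data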

Increasing the block size is needed, but IMO we shouldn't use it as the first option.