Bitcoin Forum

Bitcoin => Bitcoin Discussion => Topic started by: Nadziratel on May 07, 2019, 07:20:28 PM



Title: Blockchain Data Size
Post by: Nadziratel on May 07, 2019, 07:20:28 PM
As we know, the Blockchain network has already reached a considerable size. Some people are talking about increasing the block size. What are your thoughts?

In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day. In two years the data size has already doubled.

https://i.imgur.com/0SvTxua.jpg

[Chart Source: https://www.statista.com/statistics/647523/worldwide-bitcoin-blockchain-size/]


I think that if the block size is increased, we could see 1 TB within a few months.

Shoot your opinion. Let's talk about block size! Do we really need any change to the block size?


Title: Re: Blockchain Data Size
Post by: lukew on May 07, 2019, 07:23:21 PM
If blocks were twice the size, they could hold roughly twice as many transactions. It wouldn't double the existing blockchain size though; at most it would double the rate at which the chain grows from then on.


Title: Re: Blockchain Data Size
Post by: franky1 on May 07, 2019, 07:38:29 PM
I see that. But I think you don't want to understand that we are already talking about a big data size. As you can see in this chart (https://www.statista.com/statistics/647523/worldwide-bitcoin-blockchain-size/), the blockchain data size has reached over 200 GB, so every computer on the blockchain network already keeps 200 GB of data. If we increase the block size 2x-4x, then we will reach 1 TB in a few months. So I think we have no choice about the data size.

over 200gb is 10 years of accumulation.. not months of accumulation.
even if people made the most perfectly bloated segwit multisigs to utilise the full 4mb weight, that's still 200gb A YEAR
at the moment it's a 1.2mb average block, so let's call it 2.4mb at 2x (120gb a year)
or 4.8mb at 4x (240gb a year)
meaning at 2-4x it would be 1tb in 3.5-7 years.. not months
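
franky1's back-of-the-envelope figures can be reproduced in a few lines of Python. This is only an illustrative sketch, assuming one block every ten minutes and the average block sizes quoted above; the variable names and the "years to 1 TB" arithmetic are mine, not from the post.

Code:
# rough check of the "3.5-7 years, not months" claim; all figures approximate
blocks_per_year = 6 * 24 * 365                     # ~52,560 blocks at one every ~10 minutes
current_gb = 200                                   # chain size in 2019, per the chart above

for avg_block_mb in (2.4, 4.8):                    # 2x and 4x the ~1.2 MB average
    growth_gb_per_year = avg_block_mb * blocks_per_year / 1024
    years_to_1tb = (1000 - current_gb) / growth_gb_per_year
    print(f"{avg_block_mb} MB blocks: ~{growth_gb_per_year:.0f} GB/year, "
          f"~{years_to_1tb:.1f} years from 200 GB to 1 TB")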


Title: Re: Blockchain Data Size
Post by: rijaljun on May 07, 2019, 07:50:15 PM
Block size has no direct correlation with the total data size. The block size limit is there to cap the number of transactions that can fit in a single block. What makes the data size grow is that there are more and more Bitcoin transactions every day, and it's normal to see the data size double in two years because Bitcoin was very popular in 2017. Let's say we upgrade the block size to 4MB and nobody creates any transactions; then the total data size won't increase at all. And to be honest, I don't know what the proper solution to this problem is (gonna find an answer somewhere on Google)  ;D


Title: Re: Blockchain Data Size
Post by: Ranly123 on May 07, 2019, 08:58:51 PM
As we know, the Blockchain network has already reached a considerable size. Some people are talking about increasing the block size. What are your thoughts?

In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day. In two years the data size has already doubled.

https://i.imgur.com/0SvTxua.jpg

[Chart Source: https://www.statista.com/statistics/647523/worldwide-bitcoin-blockchain-size/]


I think that if the block size is increased, we could see 1 TB within a few months.

Shoot your opinion. Let's talk about block size! Do we really need any change to the block size?

I think the block size is important, and increasing it is the best option for holding as many transactions as possible in the blockchain. How can Bitcoin's blockchain accommodate the daily requirements of its growing user base if the block size isn't increased?


Title: Re: Blockchain Data Size
Post by: Artemis3 on May 07, 2019, 11:31:18 PM
Here is my current node:

Code:
231G  ./blocks
3.1G  ./chainstate
20G   ./indexes
254G  .

If nothing changed, in 10 years it would be twice this size. But things have already changed: SegWit effectively brought larger blocks, just not as big as some people wanted (and it added some overhead).

Off-chain transactions will of course not get written to the blockchain. LN is small at this point, but one can assume those transactions would have gone into the blockchain if LN hadn't been put in place. Will a time come when the majority of transactions occur off-chain and only a minority are recorded on-chain?

I think by 2029, instead of 500-ish GB we should expect 750-ish GB, unless LN gains enough traction that most people prefer to use it.
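
A quick way to sanity-check those 2029 figures is to project forward from the node size shown above under an assumed average block size. This is a toy projection only; the ~254 GB starting point comes from the du output, but the average block sizes are assumptions for illustration, not numbers worked out in the post.

Code:
# toy ten-year projection from the ~254 GB node above
blocks_per_year = 6 * 24 * 365            # one block every ~10 minutes
current_gb = 254                          # node size in 2019 (du output above)

def projected_gb(years, avg_block_mb):
    # current size plus linear growth at the assumed average block size
    return current_gb + years * avg_block_mb * blocks_per_year / 1024

for avg in (0.5, 1.0, 1.2):
    print(f"{avg} MB average blocks -> ~{projected_gb(10, avg):.0f} GB by 2029")

# ~0.5 MB average keeps the chain near 500 GB; ~1 MB average lands around 750 GB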


Title: Re: Blockchain Data Size
Post by: pooya87 on May 08, 2019, 04:10:37 AM
As we know, the Blockchain network has already reached a considerable size.
what is the "Blockchain network"?!! :D
you mean the "block-chain" as in the chain of blocks? that is not a "network"!

Quote
In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day.
are you running a full node right now, or have you ever run one in the past? if not, then what difference does it make to you, since you aren't even running a node?

Quote
Shoot your opinion. Let's talk about block size! Do we really need any change to the block size?
there are hundreds of topics about this, with dozens of them active right now. i don't see the point of a new one unless you have something new to add to the discussion yourself?


Title: Re: Blockchain Data Size
Post by: Ray55 on May 08, 2019, 04:11:46 AM
As far as I know, the block size is going to be increased to 4MB soon. It will be really great when it's done: in a short period of time, many more transactions can be processed, and transaction fees will be much lower as well.


Title: Re: Blockchain Data Size
Post by: feryjhie on May 08, 2019, 05:40:07 AM
As far as I know, the block size is going to be increased to 4MB soon. It will be really great when it's done: in a short period of time, many more transactions can be processed, and transaction fees will be much lower as well.

where did you read that the block size will be increased to 4MB? because there are no recent articles about this.
if the block size were going to be increased to 4MB, there would be many articles about it.


Title: Re: Blockchain Data Size
Post by: Nadziratel on May 08, 2019, 06:15:05 AM
As we know, the Blockchain network has already reached a considerable size.
what is the "Blockchain network"?!! :D
you mean the "block-chain" as in the chain of blocks? that is not a "network"!

I mean the Bitcoin network. As you know, there are a lot of different blockchains running. That's not the point I was making!


In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day.
are you running a full node right now, or have you ever run one in the past? if not, then what difference does it make to you, since you aren't even running a node?

No, I have no experience with mining yet. I just want to learn something here, if you'll let me ;)



Shoot your opinion. Let's talk about block size! Do we really need any change to the block size?
there are hundreds of topics about this, with dozens of them active right now. i don't see the point of a new one unless you have something new to add to the discussion yourself?

I know that there are a lot of topics, and none of them answered my question, just like your post!
There are thousands of threads about "which coin should I buy" too, but everyone says something in those threads, and sometimes we find useful posts in them. So I don't think it is forbidden to share! I found a block size chart and want to learn something. Please don't create an unnecessary dispute. If you have good info for this great community, share it and let us all benefit. We don't need any pointless argument here.


Title: Re: Blockchain Data Size
Post by: Haunebu on May 08, 2019, 07:52:44 AM
Off-chain transactions will of course not get written to the blockchain. LN is small at this point, but one can assume those transactions would have gone into the blockchain if LN hadn't been put in place. Will a time come when the majority of transactions occur off-chain and only a minority are recorded on-chain?

I think by 2029, instead of 500-ish GB we should expect 750-ish GB, unless LN gains enough traction that most people prefer to use it.
I think the Neutrino protocol could solve the issue of having to download the entire blockchain and make LN a better option for investors in the near future. A fully developed LN will help make BTC a much better coin overall.


Title: Re: Blockchain Data Size
Post by: Pursuer on May 08, 2019, 09:47:44 AM
you can't just consider one aspect and nothing else. it is never simply "increase the block size" or "don't increase the block size"; it is about how to increase it, by how much, and when. for example, until 2017 we had 1 MB blocks, and now blocks can be a little less than 4 MB. you see, increasing the capacity increases bitcoin's ability to handle more transactions and support more users, which means more adoption. not doing that means fewer transactions, less adoption and higher fees, which could eventually make it impossible to use bitcoin unless you only want to move very large amounts of money, which doesn't make any sense and goes against what bitcoin was created for.
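
The "a little less than 4 MB" ceiling comes from SegWit's block weight rule (BIP 141): a block's weight is four times its non-witness bytes plus its witness bytes, capped at 4,000,000 weight units. A minimal sketch of that check; the function name and the example byte counts are illustrative, not taken from the post.

Code:
MAX_BLOCK_WEIGHT = 4_000_000              # BIP 141 consensus limit, in weight units

def block_weight(base_bytes, witness_bytes):
    # weight = 4 * non-witness (base) bytes + 1 * witness bytes
    return 4 * base_bytes + witness_bytes

# a legacy-style block with no witness data hits the limit at 1,000,000 base bytes:
print(block_weight(1_000_000, 0) <= MAX_BLOCK_WEIGHT)        # True, exactly at the cap

# a witness-heavy block can be much larger in raw bytes and still be valid:
print(block_weight(100_000, 3_600_000) <= MAX_BLOCK_WEIGHT)  # True, ~3.7 MB on disk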


Title: Re: Blockchain Data Size
Post by: mocacinno on May 08, 2019, 09:56:20 AM
--snip--
In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day.
are you running a full node right now, or have you ever run one in the past? if not, then what difference does it make to you, since you aren't even running a node?

No, I have no experience with mining yet. I just want to learn something here, if you'll let me ;)
--snip--

Running a full node =/= mining
My point of view is simple: at the moment, i have no problem finding an HDD that's big enough to store all the blocks. Sure, in the future the size of the blockchain will increase... If it didn't increase, it would mean no more valid transactions were being broadcast and no new valid blocks were being broadcast either (in other words: if the blockchain size stopped increasing, bitcoin would be dead).
I hope the HDD price-to-size ratio will keep improving over time as well, so that even in 10 years i'll still be able to pick up a reasonably priced HDD that's big enough to store the complete blockchain, even if there's 750 GB of data to be stored at that point in time.

If the blockchain data ever gets so big that i cannot find a reasonably priced HDD, i'll switch to an SPV client in a heartbeat, so even this scenario doesn't worry me. There'll always be tons of companies that run full nodes, even if storing the data costs a lot... Think about exchanges, HW wallet vendors, block explorers, big companies that accept BTC, mining pools,...

Even IF you're planning on mining, you're best off joining a pool anyway (unless you're planning on investing millions of dollars to set up a huge mining farm... That's about the only use case where it still makes sense to solo mine IMHO), so even if you're mining there is no need to run a full node...


Title: Re: Blockchain Data Size
Post by: Beerwizzard on May 08, 2019, 12:47:30 PM

In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day. In two years the data size has already doubled.
The problem with an increased block size is a bit different. Remember 2017, when BTC rose from approximately $1,800 to $20k+. Over that whole year the transaction fees increased multiple times. So if we double the block size, then without other solutions like LN, in some time (it could even be a few months) we are going to face the same old problem, which would push us towards another block size change.


Title: Re: Blockchain Data Size
Post by: BrewMaster on May 08, 2019, 02:30:19 PM

In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day. In two years the data size has already doubled.
The problem with an increased block size is a bit different. Remember 2017, when BTC rose from approximately $1,800 to $20k+. Over that whole year the transaction fees increased multiple times. So if we double the block size, then without other solutions like LN, in some time (it could even be a few months) we are going to face the same old problem, which would push us towards another block size change.

you can't really use the data from 2017, because it was not just the price rise and adoption that led to the increased number of on-chain transactions and the resulting high fees. the main reason for that gigantic backlog was the prolonged spam attack that went on for most of 2017, where multiple entities were injecting the network with spam transactions and pushing the fees up.


Title: Re: Blockchain Data Size
Post by: pawanjain on May 08, 2019, 02:40:23 PM
I see that. But I think you don't want to understand that we are already talking about a big data size. As you can see in this chart (https://www.statista.com/statistics/647523/worldwide-bitcoin-blockchain-size/), the blockchain data size has reached over 200 GB, so every computer on the blockchain network already keeps 200 GB of data. If we increase the block size 2x-4x, then we will reach 1 TB in a few months. So I think we have no choice about the data size.

over 200gb is 10 years of accumulation.. not months of accumulation.
even if people made the most perfectly bloated segwit multisigs to utilise the full 4mb weight, that's still 200gb A YEAR
at the moment it's a 1.2mb average block, so let's call it 2.4mb at 2x (120gb a year)
or 4.8mb at 4x (240gb a year)
meaning at 2-4x it would be 1tb in 3.5-7 years.. not months
You must have quoted approximate values. Can you confirm my calculation below?

1 block every 10 minutes leads to
1*10*6*24*30*12 = 518400

This brings us to
(1.2MB * 518400) / 1024 = 60.75GB
For 2.4MB = 121.5 GB
For 4.8MB = 243GB

So it will take at least 5 years or more to reach 1 TB of data, which is not bad for now in my opinion.
But yeah, this might need to be revisited in the future if no other solution has been found by then to tackle the situation.



Title: Re: Blockchain Data Size
Post by: Beerwizzard on May 08, 2019, 03:24:38 PM

In my opinion, increasing the block size is not an option, because as you can see in the chart below, the data size is getting bigger day by day. In two years the data size has already doubled.
The problem with an increased block size is a bit different. Remember 2017, when BTC rose from approximately $1,800 to $20k+. Over that whole year the transaction fees increased multiple times. So if we double the block size, then without other solutions like LN, in some time (it could even be a few months) we are going to face the same old problem, which would push us towards another block size change.

you can't really use the data from 2017, because it was not just the price rise and adoption that led to the increased number of on-chain transactions and the resulting high fees. the main reason for that gigantic backlog was the prolonged spam attack that went on for most of 2017, where multiple entities were injecting the network with spam transactions and pushing the fees up.
First of all, with an increase in Bitcoin adoption we are going to get an increased number of transactions. That will inevitably happen with time.
Also, those spam attacks will always happen, whether it is a coordinated attack or just another service spamming the BTC blockchain. That will keep happening, and to become usable Bitcoin has to be resilient to that kind of thing.


Title: Re: Blockchain Data Size
Post by: pawanjain on May 09, 2019, 03:02:27 PM
I see that. But I think you don't want to understand that we are already talking about a big data size. As you can see in this chart (https://www.statista.com/statistics/647523/worldwide-bitcoin-blockchain-size/), the blockchain data size has reached over 200 GB, so every computer on the blockchain network already keeps 200 GB of data. If we increase the block size 2x-4x, then we will reach 1 TB in a few months. So I think we have no choice about the data size.

over 200gb is 10 years of accumulation.. not months of accumulation.
even if people made the most perfectly bloated segwit multisigs to utilise the full 4mb weight, that's still 200gb A YEAR
at the moment it's a 1.2mb average block, so let's call it 2.4mb at 2x (120gb a year)
or 4.8mb at 4x (240gb a year)
meaning at 2-4x it would be 1tb in 3.5-7 years.. not months
You must have quoted approximate values. Can you confirm my calculation below?

1 block every 10 minutes leads to
1*10*6*24*30*12 = 518400


I can confirm it is wrong.
6 blocks an hour, times 24 hours.
Not 10 times that.
Yup, I made a little mistake by adding the extra 10 in the calculation.
Anyway, the end results look the same ;D since I had also dropped an extra zero while converting to GB.
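
For reference, the corrected arithmetic fits in a few lines. This is a small sketch using the same month-based approximation as the calculation above (6 blocks an hour, 24 hours, 30 days, 12 months); the loop over the 2x and 4x cases is just for illustration.

Code:
# corrected version of the estimate: no extra factor of 10
blocks_per_year = 6 * 24 * 30 * 12        # 51,840 blocks per year

for avg_block_mb in (1.2, 2.4, 4.8):
    growth_gb = avg_block_mb * blocks_per_year / 1024
    print(f"{avg_block_mb} MB average blocks -> ~{growth_gb:.2f} GB per year")

# 1.2 MB -> ~60.75 GB, 2.4 MB -> ~121.5 GB, 4.8 MB -> ~243 GB per year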