Author Topic: 1mb is too big  (Read 3315 times)
thejaytiesto
Legendary
Activity: 1358
Merit: 1014
September 13, 2016, 03:07:14 PM
#21

You'd be surprised how many people would be misled by jokes like these and end up thinking that 'high' or 'unlimited' block sizes are a real possibility. That is primarily the result of a lack of knowledge, or of false knowledge. I do have to admit that the image has a bit of humor to it.

"640 kB ought to be enough for anybody" - Bill Gates ... and look at us now.  Roll Eyes
I'd call false analogy on that one though.

The block size is too big if it does not allow users to run their own nodes on a decent computer. It's as simple as that.
It's not as simple as that. What does "not being allowed" to run nodes on a decent computer mean? I've seen a lot of people throw words around without backing them up with specifics. Does this imply a lack of storage space, an inadequate internet connection/bandwidth, not being able to validate on time? I do recall a presentation where the possibility of 'never being able to catch up' as a new node was presented (I think this was Scaling Bitcoin Hong Kong, 2015).

Just common sense. When the computer struggles to run the node, it's not viable. If you can't work with your computer while your Core wallet is open, it's not viable either, since you would need a second computer.
notme
Legendary
Activity: 1904
Merit: 1002
September 15, 2016, 09:27:07 PM
#22

-snip-
Just common sense. When the computer struggles to run the node, it's not viable. If you can't work with your computer while your Core wallet is open, it's not viable either, since you would need a second computer.

Why does everyone need to run a full node? See section 8 of the bitcoin white paper. Even Satoshi viewed full node operation as a business activity. If you didn't sign up for Satoshi's vision, what exactly did you sign up for? It sounds like you'd rather have a toy than a tool for stopping banks from leeching off hardworking citizens via government bailouts. That is first and foremost the mission of bitcoin. Just look at the timing of its creation and the message encoded in the Genesis block.

https://www.bitcoin.org/bitcoin.pdf
chopstick (OP)
Legendary
Activity: 992
Merit: 1000
September 16, 2016, 01:35:21 AM
#23

Remember, BTC is now being run by these types:

Quote
Today I was reading the chat logs from when Peter R was booted from a Core Slack channel. I could hardly believe anyone involved with Bitcoin could actually think such a thing. This seemed almost as shocking as the Core supporters at a recent Silicon Valley Bitcoin meetup telling people that they should use credit cards instead of Bitcoin for payments. I'm more than a bit stunned that Bitcoin has been partially taken over by people who think:

    Bitcoin shouldn't change to accommodate more users.

    People should be using credit cards instead of Bitcoin.

This is madness!

https://www.reddit.com/r/btc/comments/52l541/we_should_not_change_btc_to_accommodate_more/

Lauda
Legendary
Activity: 2674
Merit: 2965
September 16, 2016, 05:49:31 AM
#24

Why does everyone need to run a full node?  
No, not "everyone" has to run one. We're already pass that point IMO. However, you should be able to run a node without it costing you a lot of money (this is subjective I know) if you want any kind of decentralization.

See section 8 of the bitcoin white paper.  Even Satoshi viewed full node operation as a business activity.  If you didn't sign up for Satoshi's vision, what exactly did you sign up for?  
Just because Satoshi had a view on something, that doesn't mean they were right. I disagree with 'running a full node being a business activity'. And the node count is already slowly going down due to the ever-increasing resource cost of running one.

Remember, BTC is now being run by these types:
-snip-
So we should believe Ver's words, the same person who vouched for Mt. Gox's solvency? Roll Eyes

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
davis196
Hero Member
Activity: 2996
Merit: 915
September 16, 2016, 06:28:20 AM
#25

[quoted image]
lulz

Did you draw this yourself? It's nice. Grin

Or I guess it was created 20 years ago, when 1 MB was too big.

For me, 1 TB isn't enough. Grin

notme
Legendary
Activity: 1904
Merit: 1002
September 16, 2016, 12:00:26 PM
#26

Why does everyone need to run a full node?
No, not "everyone" has to run one. We're already past that point IMO. However, you should be able to run a node without it costing you a lot of money (this is subjective, I know) if you want any kind of decentralization.
-snip-

Yeah, it is subjective. My desktop could easily handle 20 MB blocks, and it isn't anywhere near top of the line. Sure, in 5 years I'd need more disk space, but I could also buy enough disk for 20 years at that rate for less than $600. In 5 years, when I actually need it, that storage will likely be even cheaper. The CPU, RAM, and network connectivity I have today can already handle it.
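
For what it's worth, the back-of-envelope arithmetic behind that claim, as a quick sketch (assuming every block is a full 20 MB, one block every ~10 minutes):

Code:
# Rough storage growth at a constant 20 MB per block.
blocks_per_day = 24 * 6                                # 144
tb_per_year = 20 * blocks_per_day * 365 / 1_000_000    # MB -> TB
print(round(tb_per_year, 2), "TB per year")            # ~1.05
print(round(tb_per_year * 20, 1), "TB over 20 years")  # ~21.0

About 21 TB over 20 years, which at 2016 disk prices of very roughly $25-30 per TB does land near the $600 figure.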

Lauda
Legendary
Activity: 2674
Merit: 2965
September 16, 2016, 12:11:37 PM
#27

Yeah, it is subjective. My desktop could easily handle 20 MB blocks, and it isn't anywhere near top of the line.
No, I'm fairly certain that you've pulled that out of thin air. Exactly where, when, and how did you benchmark validation time for a 20 MB block? Quadratic scaling already induces problems with malicious 2 MB blocks; now imagine the implications at 10x that.
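
To put a rough number on the quadratic-scaling point, a minimal sketch; it assumes a simplified model in which each input's legacy (pre-SegWit) SIGHASH_ALL check hashes roughly the whole transaction:

Code:
# One tx filling the whole block, packed with minimum-size inputs:
# each input's signature check hashes ~the entire tx, so total
# hashing grows with inputs * tx_size -- quadratic in the size.
MIN_INPUT_BYTES = 41   # outpoint + empty scriptSig + sequence

def bytes_hashed(tx_size):
    return (tx_size // MIN_INPUT_BYTES) * tx_size

for mb in (1, 2, 10, 20):
    size = mb * 1_000_000
    print(mb, "MB tx:", round(bytes_hashed(size) / 1e12, 2), "TB hashed")

Under that model, going from 1 MB to 20 MB multiplies the worst case by 400 (about 0.02 TB to nearly 10 TB of hashing), not by 20.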

Sure, in 5 years I'd need more disk space, but I could also buy enough disk for 20 years at that rate for less than $600. In 5 years, when I actually need it, that storage will likely be even cheaper. The CPU, RAM, and network connectivity I have today can already handle it.
You are forgetting several things: 1) syncing from scratch; 2) bandwidth. Sure, you could argue that you'd only need to download 20 MB worth of data on average every 10 minutes, but that applies to full clients, not to full nodes, which also upload to their peers. A node averaging 1 Mbit/s at the current block size of 1 MB will go through around 300 GB of data in 30 days (source: my own node).
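
That figure is at least self-consistent; a quick check of the arithmetic:

Code:
# A node averaging 1 Mbit/s around the clock for 30 days.
megabytes_per_s = 1 / 8                             # 1 Mbit/s
gb_per_month = megabytes_per_s * 86_400 * 30 / 1000
print(round(gb_per_month), "GB in 30 days")         # ~324 GB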

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
awesome31312
Hero Member
Activity: 826
Merit: 504
September 16, 2016, 12:12:31 PM
#28

What is this about? Is this a reference to the blockchain?

LoyceV
Legendary
Activity: 3332
Merit: 16787
September 16, 2016, 12:22:43 PM
#29

My desktop could easily handle 20 MB blocks, and it isn't anywhere near top of the line.
Does anyone know why current blocks take so much longer to download in Bitcoin Core than old blocks? As far as I know, the increased difficulty is only related to mining and has nothing to do with checking downloaded blocks.

Example: When I start a fresh installation of Bitcoin Core, it starts downloading the blockchain. It quickly goes through the first years with small blocks, and after that it's downloading at more than 2 MB/s continuously. That means at least 2 blocks per second (say 90 seconds to download 1 day's worth of blocks). But when it gets closer to the current date, it slows down.
Now when I start Bitcoin Core, it downloads fast for a bit, then stops, continues a bit later, and then downloading stops entirely for a long time. CPU load is low, and I can't hear the hard drive (I have no indicator LED). My question is: why is it so slow? Why can't it keep downloading at 2 MB/s so I can update 3 days in 5 minutes?

This also happens on a trimmed version of the blockchain, so it can't be caused by Bitcoin Core checking all transactions on disk. The only thing that also gets bigger in the trimmed version is the chainstate directory (now 1.6 GB). Could that be what slows it down?

As another test, my old Atom laptop updates 3 weeks of blockchain in 2 calendar days. To come back to the statement I quoted: if blocks were 20 times larger, extrapolating from this, my old Atom couldn't keep up anymore. And if this gets worse in the future, it could become a limitation for faster hardware too.

While I've been typing all this, Bitcoin Core running on my i3 still says "3 days behind", the same as when I started typing. Network activity has slowed down to just a few kB/s of "background" traffic. What is it doing?

Lauda
Legendary
Activity: 2674
Merit: 2965
September 16, 2016, 12:25:09 PM
#30

Does anyone know why current blocks take so much longer to download in Bitcoin Core than old blocks? -snip-
I'm not sure how you don't know the obvious answer to that question: because the blocks are larger. Just because the block size limit has been at 1 MB for years, that doesn't mean the blocks were actually 1 MB in size. Blocks prior to 2015 and 2016 were usually much smaller. Note: downloading these blocks at 2 MB/s isn't the bottleneck; validating them is. You can speed this up with a high "dbcache" setting (e.g. 4-8 GB).
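
For anyone wanting to try this: dbcache is a real Bitcoin Core option (the chainstate/UTXO cache size, in MB); the value below is only an example for a machine with RAM to spare.

Code:
# bitcoin.conf -- example value only; size it to your available RAM
dbcache=4096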

Now when I start Bitcoin Core, it downloads fast for a bit, then stops, continues a bit later again, and then downloading stops entirely for a long time. CPU load is low, hard drive I can't hear (and I have no indicator LED). My question is: why is it so slow? Why can't it continue to download 2 MB/s so I can update 3 days in 5 minutes?
It wouldn't stop if it weren't doing anything. Relying on HDD noise is a bad way of testing whether it's actually busy. If the CPU load is low, then the bottleneck is the IOPS of the HDD.

As another test, my old Atom laptop updates 3 weeks of blockchain in 2 calendar days. -snip-
Again: validation time. You could copy over the whole blockchain and do a reindex to check; a reindex builds the block index and chain state from scratch, with no downloading. You're going to notice that it takes a lot of time, especially on weak hardware such as yours.
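
For reference, the reindex described above corresponds to a real bitcoind startup flag; a minimal invocation:

Code:
# Rebuilds the block index and chain state from the blk*.dat files
# already on disk -- no downloading, but full re-validation.
bitcoind -reindex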

Hint: Something like this should probably be discussed in a separate thread.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
LoyceV
Legendary
Activity: 3332
Merit: 16787
September 16, 2016, 12:42:48 PM
#31

-snip- Note: downloading these blocks at 2 MB/s isn't the bottleneck; validating them is. You can speed this up with a high "dbcache" setting (e.g. 4-8 GB).
Does that mean validating 10 blocks (each 0.2 MB) takes less time than validating 2 blocks (each 1 MB)? That would explain it, although I don't get why this would be different.
I only have 4 GB of RAM, so a much larger dbcache won't work.

Quote
Hint: Something like this should probably be discussed in a separate thread.
Probably Wink I've had this on my mind for a while, and reading about the 20 MB blocks triggered it.

notme
Legendary
Activity: 1904
Merit: 1002
September 16, 2016, 04:26:39 PM
#32

-snip-

If you restrict transactions to 1 MB, then you can scale linearly from the current worst case.
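
A sketch of why a per-transaction size cap linearizes the worst case, using the same simplified hashing model as the earlier sketch (an assumption, not a benchmark):

Code:
# With txs capped at t bytes, a B-byte block holds at most B/t
# worst-case txs, each costing ~t^2 of hashing:
# (B/t) * t^2 = B * t -- linear in B for a fixed cap t.
MIN_INPUT_BYTES = 41

def worst_case_hashing(block_size, tx_cap):
    per_tx = (tx_cap // MIN_INPUT_BYTES) * tx_cap
    return (block_size // tx_cap) * per_tx

MB = 1_000_000
for b in (1, 2, 10, 20):
    print(b, "MB block:", round(worst_case_hashing(b * MB, MB) / 1e12, 2), "TB")

A 20 MB block then costs 20x the 1 MB worst case instead of 400x.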

Bitcoin Unlimited is running a testnet with 10 MB blocks, and they aren't even close to straining decent desktop hardware.

Bandwidth usage can be high, but 20 Mbit/s is pretty easy to come by. If you have less, you can serve fewer nodes but still provide more to the network than you consume.

Lauda
Legendary
Activity: 2674
Merit: 2965
September 17, 2016, 06:30:36 AM
#33

If you restrict transactions to 1 MB, then you can scale linearly from the current worst case.
What do you mean by "restrict transactions to 1 MB"? Restrict the maximum size of an individual transaction to 1 MB?

Bitcoin Unlimited is running a testnet with 10 MB blocks, and they aren't even close to straining decent desktop hardware.
I strongly suspect that they are testing normal 10 MB blocks, which is obviously the wrong way to test things; otherwise, that statement wouldn't hold up. At 10 MB, with quadratic validation time, you could produce a block that requires a ridiculous amount of time to process.

Bandwidth usage can be high, but 20 Mbit/s is pretty easy to come by. If you have less, you can serve fewer nodes but still provide more to the network than you consume.
The problem isn't the speed; the problem is the amount of data you go through. Even at 1 Mbit/s you can have 40-80 connections (again, source: my node). However, do you really think your ISP wouldn't throttle you down very soon, considering they'd probably label this as outside "fair use"?

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
notme
Legendary
Activity: 1904
Merit: 1002
September 20, 2016, 12:43:55 PM
#34

-snip-

Yes, with a 1 MB transaction limit it would scale linearly from today's worst case and still allow any transaction that is currently valid.

If you pay for a data rate and they don't deliver that data rate, they have breached your contract.

Bitcoin has fallen to 80% of the cryptocurrency market. Transaction throughput has stalled. Other coins are working hard on massive throughput and are continuing to grow.

The promise of bitcoin was not that everyone can run a node. The promise was that no single entity could control the issuance of money. Bitcoin can only meet this promise if it can scale enough to be useful as one of the primary means of international exchange. The good news is that if it doesn't, some other coin will.

Lauda
Legendary
Activity: 2674
Merit: 2965
September 20, 2016, 12:49:14 PM
#35

Yes, with a 1 MB transaction limit it would scale linearly from today's worst case and still allow any transaction that is currently valid.
Who are you to prevent me from making a 1.01 MB transaction if the block size allows it? I see a bad precedent here.

If you pay for a data rate and they don't deliver that data rate, they have breached your contract.
Flat rates are usually subject to a fair-usage policy. You should read the small print.

Other coins are working hard on massive throughput and are continuing to grow.
None have succeeded at that, nor do they have many users. Not that this is a bad thing.

The promise of bitcoin was not that everyone can run a node. The promise was that no single entity could control the issuance of money.
The two are related: lose decentralization and you will likely lose values such as that one too. I also disagree with the last part; that was not *the only* promise.

The good news is that if it doesn't, some other coin will.
There isn't even a single viable candidate yet.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
Carlton Banks
Legendary
Activity: 3430
Merit: 3074
September 20, 2016, 01:05:34 PM
#36

-snip-
Does that mean validating 10 blocks (each 0.2 MB) takes less time than validating 2 blocks (each 1 MB)? That would explain it, although I don't get why this would be different.
I only have 4 GB of RAM, so a much larger dbcache won't work.

Yes, essentially.

Your question contains an assumption: that downloading is the only thing that's happening, or that it's the only thing that takes time. It's not.

The signatures of the transactions need checking. That involves finding and reading ALL the blocks where the txs' coins were minted (no small amount of disk I/O), and then checking that each tx signature satisfies a cryptographic proof that it really originated from those blocks. And Bitcoin has to do that for every transaction in every block since genesis.

The early blocks (1-170,000) barely contain any tx data at all, as no one was sending much BTC around. Once we get to around 2011 (>170,000), the tx rate ramps up exponentially, and so the complexity of the initial syncing job goes up at an even higher rate.
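
A minimal sketch of that per-input work (hypothetical helper names, not Bitcoin Core's actual code):

Code:
# Each input must locate the output it spends (disk/cache I/O),
# then pass a signature check against that output's script (CPU).
def connect_block(block, utxo_set):
    for tx in block.transactions:
        for txin in tx.inputs:
            prevout = utxo_set.lookup(txin.txid, txin.vout)   # I/O-bound
            if prevout is None:
                return False      # unknown or already-spent output
            if not verify_script(tx, txin, prevout.script_pubkey):
                return False      # signature validation, CPU-bound
            utxo_set.spend(txin.txid, txin.vout)
        for i, txout in enumerate(tx.outputs):
            utxo_set.add(tx.txid(), i, txout)  # add new spendable outputs
    return True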

countryfree
Legendary
Activity: 3052
Merit: 1047
September 20, 2016, 07:53:25 PM
#37

I don't like topics about block size. It's smarter to talk about free space within each block. On an average computer, most people would say you need to keep at least 15% of your hard disk free so that you can work on files without hiccups. It's the same thing with BTC: if we're close to max capacity, things will be less smooth. It may even stall at some point, and we don't want to see that.

chopstick (OP)
Legendary
Activity: 992
Merit: 1000
September 20, 2016, 08:47:05 PM
#38

I'm guessing certain Core members have large positions in alts.

No wonder they want to keep strangling BTC.

Greed.
Conus
Full Member
Activity: 236
Merit: 250
September 20, 2016, 08:50:15 PM
#39

"640 kB ought to be enough for anybody" - Bill Gates ... and look at us now.  Roll Eyes ... Bitcoin allow for scaling, but you should not run into it with

closed eyes, hoping you not going to kill yourself. This experiment needs cautious people with open minds... putting investors interest first. I

would like to see bigger blocks, but not at the expense of the whole experiment.  Roll Eyes

LOL, the future bitcoin block size can't be estimated. Maybe in 2050 a bitcoin block will be 100 MB, who knows? At that time there may be over 10 M or even 100 M people using bitcoin, out of a population of 10 B.
The question will be whether or not we will have the power to compute that many transactions on a regular basis! That is a lot, and with the difficulty levels now, normal Joes cannot just mine BTC, so what will things look like in the future?
zimmah
Legendary
Activity: 1106
Merit: 1005
September 20, 2016, 08:58:17 PM
#40

-snip-

It's a pretty serious problem, though, and it's almost not even discussed anymore.

Bitcoin needs room to grow, and the 1 MB limit is stopping it from growing.

This doesn't benefit anyone in bitcoin, not even the miners, because more transactions mean more fees: even if the fee per transaction might be a bit lower, the total amount of fees will be higher since there are more transactions to collect fees from. And that's not even taking into consideration that with more transactions come more users, and with more users comes a higher price, and therefore more profit.

And the people worrying about bitcoin getting more centralized if we get bigger blocks are not considering the fact that we have been sitting at 1 MB for years now, and since then internet access and disk space have both gotten cheaper. So we can easily increase the block size now and do so again when we need to (by which time internet and disk space will have become cheaper again). On top of that, we can find more efficient ways to manage the blockchain and transactions, as has been suggested in the whitepaper.

As long as the increase in block size is not too large at once, it is fine, and it won't cause centralization at all. Even if the number of nodes and miners goes down a little, it won't matter, because there will still be plenty of different miners.

It's still better than having everyone run to altcoins because it has become impossible for bitcoin to grow the network and transactions are getting stuck.