Bitcoin Forum
Pages: « 1 [2]  All
Author Topic: Bitcoin download speed singularity inevitable with Bitcoin Classic  (Read 1276 times)
danda
Full Member
***
Offline

Activity: 202
Merit: 157


View Profile WWW
March 13, 2016, 12:51:06 AM
 #21

I agree about the liability statement. Sure, it's great for some full nodes (oracles) to exist with history since the beginning of time, but that's too much of a burden for the average user as time goes on.

It seems to me that it should be possible to start/run a new node with only:
  1) The UTXO set as of some block X in the past, where X is at the user's discretion.
  2) The full transaction history since block X.

Normally X should be something like a month in the past, or at least a week, to avoid problems with reorgs. If a reorg occurred that is deeper than the node's history, then I suppose it should just stop and issue an error message.

Yes, it requires some amount of trust, since one is not downloading the entire history. But it seems like the hash of each block, and of each corresponding UTXO set, could be distributed in enough places (peers, oracles) that one could be relatively certain. It would be faster to connect to and check with, e.g., 500 peers/oracles than to download and verify the entire blockchain.

I believe this should be significantly faster for initial use than even a pruned node, because pruned nodes still require a sync from the genesis block for verification.

So what's wrong with this idea? I imagine that it, or something like it, has already been proposed. Does it have a name?
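The cross-checking step of this proposal can be sketched in a few lines of Python. This is purely illustrative: `utxo_commitment` and `snapshot_trusted` are hypothetical names, and the flat hash of a dict stands in for whatever structured commitment a real protocol would use.

```python
import hashlib
from collections import Counter

def utxo_commitment(utxo_set):
    # Hash a serialized UTXO snapshot so peers can compare notes.
    # Illustrative only: a real design would commit to the set with a
    # structured scheme, not a flat hash of a Python dict.
    h = hashlib.sha256()
    for outpoint, value in sorted(utxo_set.items()):
        h.update(f"{outpoint}:{value}".encode())
    return h.hexdigest()

def snapshot_trusted(peer_hashes, threshold=0.95):
    # Accept a snapshot only if an overwhelming majority of the
    # peers/oracles we queried report the same hash; otherwise reject.
    if not peer_hashes:
        return None
    best, count = Counter(peer_hashes).most_common(1)[0]
    return best if count / len(peer_hashes) >= threshold else None
```

Querying, say, 500 peers this way is a few kilobytes of traffic, versus tens of gigabytes to verify the chain from genesis; the trust assumption is that a supermajority of the queried peers is honest.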

mybitprices.info - wallet auditing   |  hd-wallet-derive - derive keys locally |  hd-wallet-addrs - find used addrs
lightning-nodes - list of LN nodes  |  coinparams - params for 300+ alts  |  jsonrpc-cli - cli jsonrpc client
subaddress-derive-xmr - monero offline wallet tool
RealBitcoin (OP)
Hero Member
*****
Offline

Activity: 854
Merit: 1009


JAYCE DESIGNS - http://bit.ly/1tmgIwK


View Profile
March 15, 2016, 02:36:42 AM
 #22

Quote from: danda on March 13, 2016, 12:51:06 AM
I agree about the liability statement. Sure, it's great for some full nodes (oracles) to exist with history since the beginning of time, but that's too much of a burden for the average user as time goes on.

It seems to me that it should be possible to start/run a new node with only:
  1) The UTXO set as of some block X in the past, where X is at the user's discretion.
  2) The full transaction history since block X.

Normally X should be something like a month in the past, or at least a week, to avoid problems with reorgs. If a reorg occurred that is deeper than the node's history, then I suppose it should just stop and issue an error message.

Yes, it requires some amount of trust, since one is not downloading the entire history. But it seems like the hash of each block, and of each corresponding UTXO set, could be distributed in enough places (peers, oracles) that one could be relatively certain. It would be faster to connect to and check with, e.g., 500 peers/oracles than to download and verify the entire blockchain.

I believe this should be significantly faster for initial use than even a pruned node, because pruned nodes still require a sync from the genesis block for verification.

So what's wrong with this idea? I imagine that it, or something like it, has already been proposed. Does it have a name?

I'm not a tech expert, but I'm not sure it's a good idea to mess around with the blockchain format.

The more you complicate stuff, the less secure it becomes, and then you have to complicate it even more to fix your previous crap.

It's called entropy, and I don't want Bitcoin to fall into the same trap; better to keep it simple and safe.

Holliday
Legendary
*
Offline

Activity: 1120
Merit: 1010



View Profile
March 16, 2016, 10:42:13 PM
Last edit: March 16, 2016, 10:52:24 PM by Holliday
 #23

Average internet speeds of 1999:
256 kbit down / 64 kbit up (32 kB/s down, 8 kB/s up) = roughly 19 MB down and 4.8 MB up per 10 minutes.
(Usually throttled down, but yes, it's possible to download a 5 MB MP3 in 5 minutes at those speeds.)

Average internet speeds of 2015:
5 Mbit down / 750 kbit up (625 kB/s down, 94 kB/s up) = roughly 375 MB down and 56 MB up per 10 minutes.
(Usually throttled down, but yes, it's possible to watch 3 GB of Netflix in an hour, i.e. 500 MB per 10 minutes, at those speeds.)

So we can see roughly 20x growth in under 20 years.
And as for the doomsday of large blocks, such as the 1 TB block the OP is prophesying: let's say instead that we grow to 4 MB in 4 years, 8 MB in another 4 years, and so on.

Then average speeds over those MANY decades WILL cope.
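The per-10-minutes figures above follow from a one-line unit conversion. A quick sketch (the function name is mine, not from any library):

```python
def mb_per_block_interval(link_kbit_s, interval_s=600):
    # Megabytes transferable over one 10-minute block interval,
    # for a link speed given in kilobits per second.
    # 8 bits per byte, 1000 kB per MB.
    return (link_kbit_s / 8) * interval_s / 1000

down_1999 = mb_per_block_interval(256)    # 19.2 MB per 10 min
up_1999   = mb_per_block_interval(64)     # 4.8 MB per 10 min
down_2015 = mb_per_block_interval(5000)   # 375.0 MB per 10 min
up_2015   = mb_per_block_interval(750)    # 56.25 MB per 10 min
```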

I really love the Blockstreamers and altcoiners doing all they can to shout that on-chain Bitcoin can't cope and that people should instead use their off-chain systems/altcoins.

Lesson one for the FUDsters:
Don't grab a doomsday prophecy 20+ years out to denounce the possibility of smaller growth now.
That's like saying children shouldn't be born because obesity and cancer will kill most people in 50 years, so there's no point in anyone growing up.

Lesson two:
Use logical, rational, and HONEST information about the CURRENT situation to address the CURRENT debate about CURRENT proposals.

Keep this in mind when you are talking about internet speeds and running a fully functional Bitcoin node. If you don't limit the number of connections to your node, it will often eat all of your available upload bandwidth. Once your upload bandwidth is saturated, your download bandwidth becomes practically useless.

I have been running a full node since 2011. I have top-tier internet speeds (far, far better than the averages posted above) in a well-developed location. Sometime in the past year or so, I've had to limit the number of connections in order to conserve bandwidth for other regular internet uses. This is, obviously, with the 1 MB anti-spam limit in place.

My concern is how much more bandwidth my node is going to require if we remove this anti-spam limit without addressing the efficiency of the network. If a Bitcoin fanatic like myself isn't able to run a fully functional node on his top-tier home internet, aren't we harming decentralization? I understand that there are many efficiency improvements in the pipeline, but as of right now I still have to gimp my node and limit the number of connections it will accept.

I've seen you talking about Netflix and modern video games, and the bandwidth required to use those services, several times in the past and I just want to remind you that the Bitcoin network needs peers who upload large amounts of data as well.
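For reference, Bitcoin Core 0.12 (the current release when this thread was written) already exposed the two knobs being described here: a peer-count limit and a daily upload target. A minimal `bitcoin.conf` along these lines, where the specific numbers are illustrative rather than recommendations:

```ini
# bitcoin.conf -- throttle a full node on a capped home connection
maxconnections=16      # default is 125; fewer peers means less upload
maxuploadtarget=5000   # soft daily upload target, in MiB (0 = no limit)
```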

If you aren't the sole controller of your private keys, you don't have any bitcoins.
RealBitcoin (OP)
Hero Member
*****
Offline

Activity: 854
Merit: 1009


JAYCE DESIGNS - http://bit.ly/1tmgIwK


View Profile
March 18, 2016, 03:54:06 AM
 #24


Quote from: Holliday on March 16, 2016, 10:42:13 PM
Keep this in mind when you are talking about internet speeds and running a fully functional Bitcoin node. If you don't limit the number of connections to your node, it will often eat all of your available upload bandwidth. Once your upload bandwidth is saturated, your download bandwidth becomes practically useless.

I have been running a full node since 2011. I have top-tier internet speeds (far, far better than the averages posted above) in a well-developed location. Sometime in the past year or so, I've had to limit the number of connections in order to conserve bandwidth for other regular internet uses. This is, obviously, with the 1 MB anti-spam limit in place.

My concern is how much more bandwidth my node is going to require if we remove this anti-spam limit without addressing the efficiency of the network. If a Bitcoin fanatic like myself isn't able to run a fully functional node on his top-tier home internet, aren't we harming decentralization? I understand that there are many efficiency improvements in the pipeline, but as of right now I still have to gimp my node and limit the number of connections it will accept.

I've seen you talking about Netflix and modern video games, and the bandwidth required to use those services, several times in the past, and I just want to remind you that the Bitcoin network needs peers who upload large amounts of data as well.

Yes, I forgot this: upload speed is always smaller than download speed, so for software that both has to push and pull data, like Bitcoin Core, it's really hard to ignore this.

It's one thing to download blocks at a 20 MB/s rate, but you also have to push them on to your peers at 6-7 MB/s in this best-case scenario.

And if you have a poor connection somewhere in Africa at 0.5 MB/s, then you are toast with big blocks.
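The asymmetry bites because a node uploads each block several times, once per peer. A toy model, assuming naive full-block relay with no compact-block style optimizations and no bandwidth overlap (the function name is mine):

```python
def relay_seconds(block_mb, peers, upload_mb_s):
    # Time to push one full block to `peers` peers over a shared uplink,
    # assuming each peer receives its own complete copy sequentially.
    return block_mb * peers / upload_mb_s

# 1 MB block, 8 peers, 0.5 MB/s uplink: 16 seconds, comfortable.
# 64 MB block, same link: 1024 seconds, longer than the 10-minute
# block interval itself, so the node falls permanently behind.
```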

18xk5oT2rLrAc24SL96XT14BX
Newbie
*
Offline

Activity: 48
Merit: 0


View Profile
May 06, 2016, 06:28:03 PM
 #25

Quote from: RealBitcoin (OP)
I've already had a thread talking about the download speed singularity, but now let's apply this problem to Bitcoin Classic.

Block size download speed singularity = the block size is so big that it takes more than 10 minutes to download one block, so a new block forms while you have barely downloaded the previous one. This means you can never catch up with the blockchain, and it becomes an isolated, centralized entity that nobody can download except owners of ultra-fast internet, or the blockchain breaks up.

This is similar to how our universe is expanding: the OBSERVABLE UNIVERSE is an inverse singularity, because nothing can escape it and everything is trapped in it. Therefore, if the block size were to become something like 1 TB, it would obviously become impossible to download a block in 10 minutes, and Bitcoin as we know it ends.
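The condition being described reduces to a single inequality: a node can never catch up once one block takes longer to download than the average interval between blocks. A sketch (function name and example figures are mine):

```python
def can_never_catch_up(block_mb, download_mb_s, interval_s=600):
    # The "singularity" condition: downloading one block takes longer
    # than the average time between blocks, so the backlog only grows.
    return block_mb / download_mb_s > interval_s

# A 1 TB (1,000,000 MB) block over a 5 Mbit (0.625 MB/s) line would
# take about 18.5 days to fetch, so sync could never complete.
```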



OK, I get that this is a remote situation; Classic wants a 2-4-8 MB block size, and a 1 TB block size would be far in the future.

However, there is another singularity that is much closer and much more worrisome: the Blockchain Size Download Speed Singularity.

The Blockchain Size Download Speed Singularity = the blockchain itself is so big that it is impossible to download it all from the start, because by the time you have downloaded it, it has grown even more. In the previous case you already had the blockchain somewhat up to date and only needed the new blocks, which were impossible to download. In this case you are a fresh user who wants to grab the entire blockchain, which becomes impossible, because your daily maximum bandwidth is a tiny fraction of the entire blockchain's size.


This means that if the block size becomes, say, 64 MB within the foreseeable 6-7 years, the chain would grow about 9 GB/day, and downloading the accumulated history from scratch (tens of terabytes by then) would be infeasible, or at best very impractical, for most users.
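The daily-growth figure checks out arithmetically; a quick sketch (the function name is mine):

```python
def chain_growth_gb_per_day(block_mb, blocks_per_day=144):
    # ~144 blocks per day at one block every 10 minutes.
    return block_mb * blocks_per_day / 1000

growth = chain_growth_gb_per_day(64)   # ~9.2 GB/day
# At that rate the chain adds roughly 3.4 TB per year, so several years
# of full 64 MB blocks means tens of terabytes of history to fetch on
# initial sync, far beyond a typical home connection's daily quota.
```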

This means that only existing nodes can hold the blockchain, there will be no new nodes, and it becomes effectively centralized.



So if the Classic people eventually want 64-128-256 MB blocks, then this singularity becomes inevitable, and it would destroy decentralized Bitcoin as we know it.

This is another huge flaw Classic has, but it has been swept under the rug, so I'm here to expose it.
I would like to point out that fiber-optic cables are going to increase the download speeds of normal people over the next decade, so maybe having the blockchain at >1 TB isn't something that would be completely detrimental? Sure, they wouldn't be able to implement it quickly, but it could be implemented.