Bitcoin Forum
Author Topic: FACT CHECK: Bitcoin Blockchain will be 700GB in 4 Years  (Read 9271 times)
pepethefrog
Member
**
Offline Offline

Activity: 120
Merit: 13


Pepe is NOT a hate symbol


View Profile
November 03, 2016, 05:23:31 AM
 #181

ill just leave this here


Bump, just because its better than my chart.

What will the next halving bring?

edits

(fucking hate mobile auto-correct sometimes)

Pepe thinks this graph is absolutely misleading.
Why make assumptions based on something back in 2012?
Also, in 2016 the blockchain is 100 GB big, and not 350 GB as graph wants to make Pepe believe.

Pepe is not stupid, you know?
So why fudge the numbers like that?
What is the point?

Bipcoin: bip1W2nq2vhM4f6kaHSsVD5J1LdRb1M3mCqftwq6erpEeKzsj8Kjrxy5xUs9VAtF233nNzcMQN2ZQfJ fvi2WensZ5tGJv2ysY8
Pepe is NOT a hate symbol.
franky1
Legendary
*
Online Online

Activity: 4172
Merit: 4370



View Profile
November 03, 2016, 06:10:26 AM
Last edit: November 03, 2016, 06:33:02 AM by franky1
 #182

ill just leave this here


Bump, just because its better than my chart.

What will the next halving bring?

edits

(fucking hate mobile auto-correct sometimes)

Pepe thinks this graph is absolutely misleading.
Why make assumptions based on something back in 2012?
Also, in 2016 the blockchain is 100 GB big, and not 350 GB as graph wants to make Pepe believe.

Pepe is not stupid, you know?
So why fudge the numbers like that?
What is the point?

the green glow line is not actual data.. its what the POTENTIAL could have been if every block was filled from 2009.
the orange glow line is not actual data.. its what the POTENTIAL could have been if the hard fork was implemented at block 210000

the funny part is that i actually moved some of the goalposts in favour of the anti-bloaters. after all, satoshi wanted the hardfork at block 115000,
but based on other historic things i just set the hypothetical line at 210,000

but those lines are meaningless in reality and are just showing hypothetical

anyway

actual data was the grey filled in shape.. more realistic POTENTIAL data growth is in the other grey shape

all the lines are hypothetical.. the greyed out shape is more of the actual and then expectant results based on the hypothetical lines and scenarios of delay in timeline to produce and activate features and then user adoption of those features.

here. ill throw another graph in
this time the orange glow hypothetical is based on:
Quote from: satoshi
It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.

and the important demonstration (grey shape data) has the assumption this time of segwit and other features never activating


not sure why anyone should actually care about the green /orange glow lines as they are just arbitrary lines. its the grey shape that should be emphasised.
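The height-gated rule in the satoshi quote above can be sketched in a few lines. The trigger height (115000) comes from the quote; the byte limits here are illustrative placeholders, not actual consensus values:

```python
# Sketch of a height-gated block-size raise, per the satoshi quote above.
# The limits are illustrative placeholders, not real consensus parameters.

ORIGINAL_LIMIT = 1_000_000   # 1 MB
LARGER_LIMIT = 2_000_000     # hypothetical raised limit
TRIGGER_HEIGHT = 115_000     # height from the quote

def max_block_size(block_height):
    # the old limit applies up to and including the trigger height
    if block_height > TRIGGER_HEIGHT:
        return LARGER_LIMIT
    return ORIGINAL_LIMIT

print(max_block_size(100_000))  # 1000000
print(max_block_size(120_000))  # 2000000
```

The point of the phase-in is that clients ship the new rule long before the trigger height, so nodes that never upgraded are obsolete by the time it activates.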

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
November 03, 2016, 06:17:02 AM
Last edit: November 03, 2016, 06:41:05 AM by Lauda
 #183

Pepe thinks this graph is absolutely misleading.
Why make assumptions based on something back in 2012?
Indeed. I would prefer if that was removed in the next update.

Also, in 2016 the blockchain is 100 GB big, and not 350 GB as graph wants to make Pepe believe.
I think you need a pair of glasses mister Pepe.

all the lines are hypothetical..
I think that is the point. The graph was supposed to represent the potential growth that the blockchain size might see in the future (as it clearly says "potential").

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
November 03, 2016, 02:09:38 PM
 #184

If we were to remedy this issue, here's how: re-validate every transaction in every block in the entire block chain, EVERY time someone loads up their Bitcoin network.

Well, no, Carlton. I would advocate that every transaction be validated at least once by each client. Upon initial download.

I would be more than happy to trust that data on local storage had not been meddled with. Heck, we've got the world's best cryptographers, right? We could secure data-at-rest with some sort of hash or something to ensure that nobody de-installed the drive, modified the data, and reinstalled it since the last time we ran the node. If that were a worry.

I am just pointing out that -- with the current Core implementation -- we do indeed need to trust in the goodness of others. For validation of older transactions. We are decidedly not operating in a trustless manner. I am heartened to learn that -- on this point at least -- you agree.

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
digicoinuser
Legendary
*
Offline Offline

Activity: 2688
Merit: 1072



View Profile
November 03, 2016, 02:15:31 PM
 #185

It will grow, but I'm not sure by that much. What size would it have to reach before users no longer want to download it at their current connection speeds?

pepethefrog
Member
**
Offline Offline

Activity: 120
Merit: 13


Pepe is NOT a hate symbol


View Profile
November 03, 2016, 03:44:20 PM
 #186


but those lines are meaningless in reality and are just showing hypothetical


Ok, Pepe now understands.
When you put hypotheticals on top of assumptions which are based on suppositions, then Pepe's head can't gain a foothold and gets confused.
Thanks for clarifying though, Pepe now understands that this is just one giant mental masturbation (*).

*Intellectual activity that serves no practical purpose

Bipcoin: bip1W2nq2vhM4f6kaHSsVD5J1LdRb1M3mCqftwq6erpEeKzsj8Kjrxy5xUs9VAtF233nNzcMQN2ZQfJ fvi2WensZ5tGJv2ysY8
Pepe is NOT a hate symbol.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
November 03, 2016, 03:45:50 PM
 #187

It will grow but I'm not sure by that much.  What size would it have to get to in order to be prohibitive for users to not want to download it at their current connection speeds?
Well, currently we are talking about a ridiculously low speed: 1 MB / (10 × 60 s) ≈ 1.67 kB/s. We can't really estimate where the problem will start popping up for most users. The blockchain is already heavy as is, and the bigger it gets, the fewer people will want to use a full wallet or run a full node. Current average (global) internet speeds are almost able to download 1 GB blocks in under 10 minutes (which requires less than 2 MB/s). However, this is way beyond the realm of possibility at this time, because even if such blocks were only half full on average we'd have over 2 TB worth of blocks each month (500 MB × 6 × 24 × 30 = 2,160,000 MB).
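That arithmetic can be written out as a small sketch. The 10-minute block interval and the fill ratio are the assumptions:

```python
# Sketch of the bandwidth/growth arithmetic above: the sustained download
# rate needed to keep up with blocks of a given size every 10 minutes,
# and the monthly chain growth at a given average fill ratio.

BLOCK_INTERVAL_S = 10 * 60  # one block every ten minutes

def required_rate_kb_s(block_mb):
    """kB/s needed to fetch one block per interval."""
    return block_mb * 1000 / BLOCK_INTERVAL_S

def monthly_growth_gb(block_mb, fill=1.0):
    """GB added in a 30-day month at the given average fill ratio."""
    blocks = 6 * 24 * 30  # blocks per 30-day month
    return block_mb * fill * blocks / 1000

print(round(required_rate_kb_s(1), 2))  # ~1.67 kB/s for 1 MB blocks
print(monthly_growth_gb(1000, 0.5))     # half-full 1 GB blocks: 2160.0 GB
```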

I think a good portion would probably give up if we were talking about 100 GB monthly (probably even less) in blockchain growth.

Thanks for clarifying though, Pepe now understands that this is just one giant mental masturbation (*).
*Intellectual activity that serves no practical purpose

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
Tyrantt
Hero Member
*****
Offline Offline

Activity: 1022
Merit: 564

Need some spare btc for a new PC


View Profile
November 03, 2016, 03:52:37 PM
 #188

This came to my mind a few days ago: how can the size be lowered, though? Can it be separated somehow, archived? Also, the usage of wallets will decrease (if people do not wish to download 100gb); will that affect btc somehow, or will people switch to a new way of storing and mining btc?

Need some spare btc for a new PC that can at least run Adobe Dreamweaver.

BTC - 19qm3kH4MZELkefEb55HCe4Y5jgRRLCQmn ♦♦♦ ETH - 0xd71ACd8781d66393eBfc3Acd65B224e97Ae1952D
franky1
Legendary
*
Online Online

Activity: 4172
Merit: 4370



View Profile
November 03, 2016, 04:33:10 PM
 #189

It will grow but I'm not sure by that much.  What size would it have to get to in order to be prohibitive for users to not want to download it at their current connection speeds?
Well, currently we are talking about a ridiculously low speed: 1 MB / (10 × 60 s) ≈ 1.67 kB/s. We can't really estimate where the problem will start popping up for most users. The blockchain is already heavy as is, and the bigger it gets, the fewer people will want to use a full wallet or run a full node. Current average (global) internet speeds are almost able to download 1 GB blocks in under 10 minutes (which requires less than 2 MB/s). However, this is way beyond the realm of possibility at this time, because even if such blocks were only half full on average we'd have over 2 TB worth of blocks each month (500 MB × 6 × 24 × 30 = 2,160,000 MB).

I think a good portion would probably give up if we were talking about 100 GB monthly (probably even less) in blockchain growth.

lauda why do you go from talking about a block of 1mb.. to jump immediately to blocks of [random number] or 22.8mb (100gb a month)..
why are you not thinking rationally about what were previously consensual safe numbers, something like
2mb in 2015, 4mb by 2018.. 8mb by 2021.. 16mb by 2024

why are you always trying to push a doomsday at least a decade away as a problem of today.. seriously, 16 years ago people were on dialup and the largest hard drive was 4gb.. so what may seem a problem today, when you scream blue murder about any randomly large number you pull out of a hat, wont be a problem in a decade. you dont seem to get the concept of natural progressive growth

if you want to talk about the possibility of blocks of 22.8mb, at least assume that we would get there in a few years and not today.
you know, like emphasise that you are speculating about the year 202X and not actually talking about 2016/2017
because all you're doing with your fake doomsdays of 1mb today, 22mb tomorrow, is creating ammo to avoid safe and agreeable amounts of 2mb/4mb, pushing fake rhetoric just so that one group of devs can do their own thing.

if you actually came to the realisation that there is no need for one entity to make 5000 input/outputs in one tx. you would realise that limiting sigops and using libsecp256k1 optimisations would have solved your other fake doomsday of 'quadratics'.
you would at this point of reading this post probably want to rebut that yes quadratics wont be an issue by limiting sigops, but malleability wont be solved and double spends are still an issue.
which i would rebut that double spends are still an issue even after a malleability fix due to RBF, CPFP and other new features added to mess with unconfirmed transactions.
you would rebut that malleability also has issues for LN..
i would rebut that LN has even bigger issues to overcome first. such as the address reuse dilemma

i actually laughed that you were on the '2mb is bad' bandwagon but are now on the '4mb is acceptable' bandwagon.
care to comment on your revelation concerning BLOAT (the hard drive and bandwidth doomsdays you were squawking about just months ago)?

also care to comment how half of that 4mb weight will be bloated with privacy features and other mundane data rather than expanding the capacity of actual transactions.
yep thats right. cores 1mb base 4mb weight will be at most 2mb (~1.8mb) of serialised (full tx+sig included) data and 2mb (~2.2mb) left vacant until filled with confidential payment codes and whatnot
EG
1mb base 0.8mb witness.... leaving 2.2mb spare for feature bloat rather than more transactions = total 4mb weight

ok lets word it a different way.
whats your future mindset proposition and propaganda preference:

1mb base 0.8mb witness 2.2mb bloat for future features, for 4500tx
or
2mb base 1.6mb witness for 9000tx

where in both cases, bloat is ~<4mb

how would you like to see 4mb of data used??

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
pereira4
Legendary
*
Offline Offline

Activity: 1610
Merit: 1183


View Profile
November 03, 2016, 04:38:13 PM
 #190

In this thread some guy talks about the 700GB-in-4-years prediction, which seems to be wrong according to DannyHamilton. It seems there is nothing to worry about. You would need to raise the blocksize + have blocks full all the time to reach numbers like those... so the blockchain will be lighter.

https://bitcointalk.org/index.php?topic=1667806.0
franky1
Legendary
*
Online Online

Activity: 4172
Merit: 4370



View Profile
November 03, 2016, 04:49:28 PM
Last edit: November 03, 2016, 05:01:47 PM by franky1
 #191

In this thread some guy talks about the 700GB in 4 years prediction, which seems to be wrong according to DannyHamilton. It seems there is nothing worry about. You would need to raise the blocksize + have blocks full all the time to reach numbers like those... so the blockchain will be lighter.

https://bitcointalk.org/index.php?topic=1667806.0

the exponential graph on earlier pages is wrong, not so much about the number, but in how he assumes that number based purely on a curve,
because he didnt take into account the consensus limits.. which hold back natural growth
EG
if segwit does not activate, we will see the bloat held under the consensus limit of 1mb a block. meaning maximum bloat in 4 years for the entire blockchain will be about 300gb.

but if segwit was activated within the year followed by a year of adoption.. we will see the bloat held under the limit of ~1.8mb a block. meaning maximum bloat in 4 years for the entire blockchain will be about 300gb.

but if other features were activated, followed by a year of adoption to fill the remaining 2.2mb of weight spare.. we will see the bloat held under the consensus limit of ~4mb a block. meaning maximum bloat in 4 years for the entire blockchain will be about 650gb.

thus its not a curve.. but a curve then line, curve then line, curve then line

ofcourse thats all theory.
because segwit may not activate or may activate sooner
also everyone may move their funds over to segwit compatible HD seed addresses sooner, to all use it sooner, causing bloat to hit the red line in 2017 instead of 2018. and then if other features fill the remaining weight sooner, the bloat hits the yellow line sooner, and thus we have more than 650gb of bloat in 4 years. which makes 700gb+ a possible and not irrational number, even if the OP of the topic didnt come to the number using a more logical/rational methodology

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
November 03, 2016, 05:43:45 PM
 #192

lauda why do you go from talking about a block of 1mb.. to jump immediately to blocks of [random number] or  22.8mb (100gb a month)..
why are you not thinking rational of what was previously consensual safe numbers of something like
2mb in 2015, 4mb by 2018.. 8mb by 2021.. 16mb by 2024
Calm down. This has nothing to do with what you're writing about, nor has it anything to do with what the user wanted. I gave them the current internet speed required to download the maximum size of a block (currently) and some arbitrary guess at the point at which people would likely not bother at all. He asked for the *size* at which users would not want to download at current speeds, not what was consensually safe or not (which I clearly stated we are disregarding). One more time: the 100GB per month is completely arbitrary.

-snip-
Everything else is doomsday, propaganda, mumbo jumbo not relevant to this thread.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
coins101 (OP)
Legendary
*
Offline Offline

Activity: 1456
Merit: 1000



View Profile
November 04, 2016, 01:56:46 AM
 #193

will probably reference this in update on chart.

Quote from: nullc
Rough numbers, as I might have omitted a byte here or there (this is a Reddit comment):
A typical 2-in 2-out transaction:
nversion + ninputs + 2(txid + vin + sequence + scriptlen + pubkey + signature) + noutputs + 2(scriptlen + dup + hash160 + hash + equalverify + checksig + value) + nlocktime
1+1+2(32+4+4+1+34+73) + 1 + 2(1+1+1+1+21+1+1+8) + 4 = 373
Maximum per block: 2680
With segwit (p2wpkh):
1+1+2(32+4+4+1+(34+73)*0.25) + 1 + 2(1+1+1+1+21+1+1+8) + 4 = 212
Maximum per block: 4716
If instead we do the same with 2-of-3 multisig, which are a significant percentage of the transactions on the network, we get:
nversion + ninputs + 2(txid + vin + sequence + scriptlen + 3*pubkey + 2*signature + redeempush + dummy + checksig) + noutputs + 2(scriptlen + hash160 + hash + equal + value) + nlocktime
1+1+2(32+4+4+1+3*34+2*73+1+1+1) + 1 + 2(1+1+21+1+8) + 4 = 655
Maximum per block: 1526
With segwit (p2wsh):
1+1+2(32+4+4+1+(3*34+2*73+1+1+1)*0.25) + 1 + 2(1+1+21+1+8) + 4 = 278.50
Maximum per block: 3590.
These examples use direct P2WSH instead of P2SH-P2WSH; the latter will be somewhat more common at first and is somewhat less efficient.

https://www.reddit.com/r/btc/comments/5azo8c/eli_40_how_do_we_get_17x_the_current_transaction/
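The byte counts in the quoted post can be re-derived mechanically. This sketch spells out the assumed constants (34-byte pubkey push, 73-byte signature push, 0.25 segwit witness discount) for the 2-in/2-out P2PKH case only; it is back-of-envelope arithmetic, not a real serializer:

```python
# Re-derivation of the 2-in 2-out byte counts from the quoted post.
# Assumed constants: 34-byte pubkey push, 73-byte signature push,
# 0.25 segwit witness discount.

def p2pkh_2in_2out(segwit=False):
    discount = 0.25 if segwit else 1
    inp = 32 + 4 + 4 + 1 + (34 + 73) * discount  # txid+vout+seq+scriptlen+pubkey+sig
    out = 1 + 1 + 1 + 1 + 21 + 1 + 1 + 8         # standard P2PKH output script+value
    return 1 + 1 + 2 * inp + 1 + 2 * out + 4     # nversion, counts, nlocktime

print(p2pkh_2in_2out())             # 373
print(p2pkh_2in_2out(segwit=True))  # 212.5 (quoted as 212)
print(1_000_000 // 373)             # 2680 such transactions per 1 MB block
```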
Killerpotleaf
Sr. Member
****
Offline Offline

Activity: 812
Merit: 250


A Blockchain Mobile Operator With Token Rewards


View Profile
November 04, 2016, 03:01:24 AM
 #194

devs should just build the clients with a built-in highly pruned blockchain, and let that be the starting point for syncing old blocks when running that client.

nodes dont need the history from day 1 to work, all they need is a starting point they know will agree with the rest of the network.

it means placing some trust in the client you run, but that's already the case today, and it kinda always will be... unless core not only insists that everyone must be able to run a node, but that everyone code one from scratch too.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
November 04, 2016, 05:49:25 AM
 #195

devs should just build the clients with a built-in highly pruned blockchain, and let that be the starting point for syncing old blocks when running that client.

nodes dont need the history from day 1 to work, all they need is a starting point they know will agree with the rest of the network.
So exactly what is this supposed to do? Centralize Bitcoin in order to avoid downloading a part of the blockchain? That's a terrible idea. If you don't have enough storage space right now, you can run a pruned node.
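For reference, pruning in Bitcoin Core (available since 0.11) is a one-line config setting; a pruned node still downloads and validates the full chain once, then discards old raw block files:

```
# bitcoin.conf
# Keep roughly the most recent 550 MB of raw block files; older blocks
# are discarded after validation. 550 is the minimum allowed value.
prune=550
```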

it means placing some trust in the client you run, but that's already the case today, and it kinda always will be...
False.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
Killerpotleaf
Sr. Member
****
Offline Offline

Activity: 812
Merit: 250


A Blockchain Mobile Operator With Token Rewards


View Profile
November 04, 2016, 01:14:07 PM
 #196

devs should just build the clients with a built-in highly pruned blockchain, and let that be the starting point for syncing old blocks when running that client.

nodes dont need the history from day 1 to work, all they need is a starting point they know will agree with the rest of the network.
So exactly what is this supposed to do? Centralize Bitcoin in order to avoid downloading a part of the blockchain? That's a terrible idea. If you don't have enough storage space right now, you can run a pruned node.
all it would centralize is the src from which you get this "built-in highly pruned blockchain start point". but i guess you could go a step further and get the network to validate that this "start point" is not a lie, and does in fact accurately represent the past history. hell, the network itself could make this start point available for download, instead of having the network send the blockchain in its entirety.

it means placing some trust in the client you run, but that's already the case today, and it kinda always will be...
False.
if you subscribe to the idea that you dont need to trust the code behind the client you choose to run because its been peer reviewed by many people and is known to do exactly what you'd expect it to do.
then it isn't much of a stretch to say that you dont need to trust the "built-in highly pruned blockchain start point" for the same reason.

requester
Sr. Member
****
Offline Offline

Activity: 280
Merit: 250


View Profile
November 04, 2016, 01:30:36 PM
 #197

does it mean that every miner would have 700 GB of data?
I don't have any idea about that. does the blockchain reside in every miner's system, or is it stored in one particular system? But well, 700 GB is quite a large number, and a database of 700 GB, even with only 3 columns, would be a huge one.
coins101 (OP)
Legendary
*
Offline Offline

Activity: 1456
Merit: 1000



View Profile
November 04, 2016, 01:41:38 PM
 #198

does it mean that every miner would have 700 GB of data?
I don't have any idea about that. does the blockchain reside in every miner's system, or is it stored in one particular system? But well, 700 GB is quite a large number, and a database of 700 GB, even with only 3 columns, would be a huge one.

Miners find ways to avoid having to run full nodes with a full data set. Don't worry about the miners' use of full nodes; worry about the miners being concentrated into small pockets of big industrial groups. All it takes is for two or three big miners to collude and they can fix the game to suit their ends. The only thing stopping them from colluding to rig the game is self-interest, so as not to collapse the market price. If they get a government visit, however, governments don't care about corporate self-interest. Maybe we should have a mining canary?

It is looking like users will just be users. So for the average person they can just use Bitcoin without worrying about such things as the backbone of the system.

We are likely to be in a transition state where home connections are going to reach max capacity. The transition then is going to be towards professionally hosted nodes.

This guy on r/btc posted some stats about his node consumption. He has 200mbit download and 20mbit upload, which is a very good connection in the western world:

Code:
month        rx      |     tx      |    total    |   avg. rate
------------------------+-------------+-------------+---------------
  Feb '16      8.87 GiB |   22.38 GiB |   31.25 GiB |  104.62 kbit/s
  Mar '16    109.58 GiB |  635.21 GiB |  744.79 GiB |    2.33 Mbit/s
  Apr '16    144.85 GiB |    1.05 TiB |    1.19 TiB |    3.95 Mbit/s
  May '16    112.24 GiB |    1.08 TiB |    1.19 TiB |    3.80 Mbit/s
  Jun '16     95.28 GiB |  880.11 GiB |  975.38 GiB |    3.16 Mbit/s
  Jul '16     90.72 GiB |  925.71 GiB |    0.99 TiB |    3.18 Mbit/s
  Aug '16    178.99 GiB |    1.02 TiB |    1.20 TiB |    3.84 Mbit/s
  Sep '16    133.12 GiB |    1.03 TiB |    1.16 TiB |    3.83 Mbit/s
  Oct '16    115.43 GiB |    1.18 TiB |    1.30 TiB |    4.16 Mbit/s
  Nov '16     15.69 GiB |  213.81 GiB |  229.50 GiB |    6.46 Mbit/s
------------------------+-------------+-------------+---------------
estimated    136.41 GiB |    1.82 TiB |    1.95 TiB |

He is reaching his connection limits, but that does depend on how he sets his in / out connections.

Code:
>$ bitcoin-cli getpeerinfo | grep subver | sort | uniq -c | sort -nr
   21     "subver": "/Satoshi:0.13.0/",
   19     "subver": "/Satoshi:0.12.1/",
    9     "subver": "/bitcoinj:0.13.4/Bitcoin Wallet:4.46/",
    9     "subver": "/bitcoinj:0.13.2/MultiBitHD:0.1.4/",
    9     "subver": "/bitcoinj:0.13.1/Bitsquare:0.3.3/",
    9     "subver": "/bitcoinj:0.12.2/Bitcoin Wallet:2.9.3/",
    9     "subver": "/BitCoinJ:0.11.2/MultiBit:0.5.18/",
    7     "subver": "/Satoshi:0.12.0/",
    7     "subver": "/Satoshi:0.11.2/",
    6     "subver": "/Satoshi:0.11.0/",
    6     "subver": "/BitcoinUnlimited:0.12.1(EB16; AD4)/",
    4     "subver": "/Satoshi:0.9.99/",
    4     "subver": "/Satoshi:0.8.1/",
    3     "subver": "/iguana 0.00/",
    3     "subver": "/bitcoinj:0.14-SNAPSHOT/",
    3     "subver": "/bitcoinj:0.14.3/Bitcoin Wallet:5.03/",
    3     "subver": "/bitcoinj:0.12.2/",
    2     "subver": "/Satoshi:0.13.1/",
    2     "subver": "/Satoshi:0.10.1/",
    2     "subver": "/Classic:0.12.0/",
    2     "subver": "/btcwire:0.4.1/btcd:0.12.0/",
    2     "subver": "/bitcore:1.1.0/",
    1     "subver": "/ViaBTC:bitpeer.0.2.0/",
    1     "subver": "/Satoshi:0.9.1/",
    1     "subver": "/Satoshi:0.12.1(bitcore)/",
    1     "subver": "/Satoshi:0.11.1/",
    1     "subver": "/Bitcoin XT:0.11.0/",
    1     "subver": "/bitcoinj:0.14.3/Bitcoin Wallet:5.02/",
    1     "subver": "/bitcoinj:0.14.3/",
    1     "subver": "/BitCoinJ:0.11.3/",
    1     "subver": "",

https://www.reddit.com/r/btc/comments/5b0g4x/remember_that_time/
HCLivess
Legendary
*
Offline Offline

Activity: 2114
Merit: 1090


=== NODE IS OK! ==


View Profile WWW
November 04, 2016, 02:18:50 PM
 #199

what do you mean? there is a blocksize limit and blocks come in predefined frequency defined by difficulty

jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
November 04, 2016, 06:24:05 PM
 #200

We are likely to be in a transition state where home connections are going to reach max capacity. The transition then is going to be towards professionally hosted nodes.

This guy on r/btc posted some stats about his node consumption. He has 200mbit download and 20mbit upload, which is a very good connection in the western world:

He is reaching his connection limits, but that does depend on how he sets his in / out connections.

Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.