adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 15, 2015, 09:20:50 PM
Quote:
Well, I have an average connection for Europe and I could easily support two-gigabyte blocks from home; eight-megabyte blocks would not be a problem at all. This person in Florida needs to either get a new internet provider or update his client, by the sounds of it. lol

Quote:
Did you read my posts in this thread? I've had to drastically reduce the connectivity of my full node (latest Core release) running on its own dedicated hardware (modern quad-core CPU, 16GB RAM, SSD) in order to keep the home network functional for other daily use. My node will happily eat as much upload speed as I give it, and I have top-10% home internet speeds (probably better). It can bring simple web surfing to a standstill if I let it. Nodes don't just accept blocks. Most of my bandwidth use is on the upload side (sharing data with other peers)! Larger blocks will obviously have a direct impact on the amount of data shared. Do you run a node, or are you just guessing?

Should Bitcoin's traffic limits be based on what 10% of typical home connections can handle? Maybe 20%? Maybe 50%? Surely anything demanding 10-20% isn't going to impact the number of full nodes. In any case, I feel there is much room for improvement in how data is shared across the network, which could dramatically decrease bandwidth use for full nodes and allow for proportionally bigger blocks.

Quote:
I know someone who turns on his full node about once a week, just to download the latest blocks, and then turns it off again, so that when he makes a TX it doesn't take him long to sync up first. Is this node useful? Should ALL users be asked to run a full node?

Quote:
Why would a user who only wants to send one transaction a week want to run a full node? That is useless. That user is better off using a different type of wallet.

LMAO, I know, right!
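The upload-side point can be put in rough numbers. This is a back-of-envelope sketch only: the peer count, relay overhead, and unit conventions below are illustrative assumptions, not measurements from any real node.

```python
# Illustrative only: a listening node re-uploads blocks and transactions to
# the peers it serves, so monthly upload scales roughly with block size
# times peers served. All constants here are assumed, not measured.

def monthly_upload_gb(block_mb, peers_served=8, blocks_per_day=144, relay_overhead=1.5):
    """Rough upper bound on monthly upload in GB (using 1000 MB = 1 GB)."""
    per_day_mb = block_mb * blocks_per_day * peers_served * relay_overhead
    return per_day_mb * 30 / 1000

print(monthly_upload_gb(1))   # ~52 GB/month at 1 MB blocks
print(monthly_upload_gb(8))   # ~415 GB/month at 8 MB blocks
```

Under these toy assumptions, upload grows linearly with block size, which is the poster's point: the 8x block-size increase is an 8x upload increase.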
Meuh6879
Legendary
Offline
Activity: 1512
Merit: 1012
September 15, 2015, 09:22:45 PM
Quote:
Nodes don't just accept blocks. Most of my bandwidth use is on the upload side (sharing data with other peers)! Larger blocks will obviously have a direct impact on the amount of data shared. Do you run a node, or are you just guessing?

I agree. During the stress test my upload usage exploded ... (mempool saturated). (With P2Pool stats, I can view the upload usage.)
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 15, 2015, 09:24:18 PM
Quote:
Nodes don't just accept blocks. Most of my bandwidth use is on the upload side (sharing data with other peers)! Larger blocks will obviously have a direct impact on the amount of data shared. Do you run a node, or are you just guessing?

Quote:
I agree. During the stress test my upload usage exploded ... (mempool saturated). (With P2Pool stats, I can view the upload usage.)

Your node is busy telling every other node about TXs they already know about.
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 15, 2015, 09:26:28 PM
I'm currently syncing a full node and it feels like my computer is about to explode! The CPU is running hot! Hot! Hot!
Meuh6879
Legendary
Offline
Activity: 1512
Merit: 1012
September 15, 2015, 09:28:37 PM
Quote:
Your node is busy telling every other node about TXs they already know about.

Aha ... so Bitcoin has a problem? Noooooooooooooooooooooooo?!?
mallard
September 15, 2015, 09:29:59 PM
Quote:
I'm currently syncing a full node and it feels like my computer is about to explode! The CPU is running hot! Hot! Hot!

AMD? You should look into upgrading the heatsink/cooler on your CPU.
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 15, 2015, 09:30:39 PM
brg444
September 15, 2015, 09:32:29 PM
Quote:
Why would a user who only wants to send one transaction a week want to run a full node? That is useless. That user is better off using a different type of wallet.

Yep: transact with bitcoin using other payment methods unless you need to harness the full security of Bitcoin.
"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
brg444
September 15, 2015, 09:37:16 PM
Quote:
Should Bitcoin's traffic limits be based on what 10% of typical home connections can handle? Maybe 20%? Maybe 50%?

Bitcoin traffic should be limited based on the capacity for anonymous bandwidth growth.
adamstgBit
Legendary
Offline
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 15, 2015, 10:09:55 PM
Quote:
Bitcoin traffic should be limited based on the capacity for anonymous bandwidth growth.

What if new nodes ran with pruned versions of the blockchain? Maybe we could have a sort of yearly checkpoint: we would no longer ask nodes to download data that is more than a year old, and would base verification off these yearly checkpoints.
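The checkpoint idea above can be sketched very loosely. This is a hypothetical illustration only, not any actual protocol or proposal: commit to the whole UTXO set at a given height with a single hash, which a new node could adopt instead of replaying older history (at the cost of trusting whoever produced the hash, which is exactly what the replies below object to).

```python
import hashlib

def utxo_checkpoint(utxos):
    """Hash a canonical (sorted) serialization of a UTXO set.

    utxos: iterable of (txid, vout, value_sats) tuples.
    Nodes that compute the same hash agree on every spendable coin.
    """
    h = hashlib.sha256()
    for txid, vout, value in sorted(utxos):
        h.update(f"{txid}:{vout}:{value}".encode())
    return h.hexdigest()
```

Any node holding the full set can reproduce the hash; a node that starts from the snapshot without the history behind it cannot verify the hash is honest, which is the objection raised in the next posts.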
Chris_Sabian
Legendary
Offline
Activity: 896
Merit: 1001
September 15, 2015, 10:28:09 PM
Quote:
What if new nodes ran with pruned versions of the blockchain? Maybe we could have a sort of yearly checkpoint: we no longer ask nodes to download data that is more than a year old, and base verification off these yearly checkpoints.

Doesn't that violate the whole point of being a node, having a complete copy of the blockchain? Not saying that is bad, just bringing up a point.
brg444
September 15, 2015, 10:45:03 PM
Quote:
What if new nodes ran with pruned versions of the blockchain? Maybe we could have a sort of yearly checkpoint: we no longer ask nodes to download data that is more than a year old, and base verification off these yearly checkpoints.

Which defeats the whole purpose of Bitcoin: you should trust no one to validate the entirety of your coins' history.
brg444
September 15, 2015, 10:46:14 PM
Quote:
Doesn't that violate the whole point of being a node, having a complete copy of the blockchain? Not saying that is bad, just bringing up a point.

It is absolutely bad. A checkpoint implies someone somewhere is holding the full history. Who would that be?
Meuh6879
Legendary
Offline
Activity: 1512
Merit: 1012
September 15, 2015, 10:52:23 PM
Bitcoin is not about datacenters and professional hosting. But you can use your limited, carbonized credit card if you don't want to run a full node to control YOUR MONEY. Bitcoins are pure money, so using them requires some ... hardware. And without nodes, you don't have SPV wallets (phone wallets).
uxgpf
Newbie
Offline
Activity: 42
Merit: 0
September 15, 2015, 10:58:42 PM (last edit: September 15, 2015, 11:15:20 PM by uxgpf)
Quote:
Which defeats the whole purpose of Bitcoin: you should trust no one to validate the entirety of your coins' history.

That is not true. A pruned node downloads and validates the whole history since the genesis block (the sync is identical to a normal node's). It simply doesn't save it all to disk.
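For reference, pruning shipped in Bitcoin Core 0.11 as a single bitcoin.conf setting; the node still downloads and validates every block from genesis, it just discards old raw block files afterwards:

```ini
# bitcoin.conf: validate everything, but keep only ~550 MB of raw block files
# (550 is the documented minimum for the prune target)
prune=550
```

A pruned node keeps the full UTXO set, so it can validate new blocks like any other full node; it just can't serve historical blocks to syncing peers.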
brg444
September 15, 2015, 11:09:00 PM
Quote:
That is not true. A pruned node downloads and validates the whole history since the genesis block (the sync is identical to a normal node's). It simply doesn't save it all to disk.

Pruning is not the same as a checkpoint, which is what Adam proposed. Either way it still has to sync entirely, as you've mentioned, so:
iCEBREAKER
Legendary
Offline
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
September 15, 2015, 11:25:45 PM
Quote:
Bitcoin traffic should be limited based on the capacity for anonymous bandwidth growth.

Growth, or decline. Lowering the max block size to survive anonymous bandwidth reduction (or even elimination) is more important than ratcheting it up.

Quote:
"Resiliency, not efficiency, is the paramount goal of decentralized, non-state-sanctioned currency." - Jon Matonis, 2015

Bitcoin nodes should routinely operate in safe mode, and ideally be able to scale down into survival mode (sporogenesis + dormancy + reactivation) should there be indications of sufficient damage/disruption.
"The difference between bad and well-developed digital cash will determine whether we have a dictatorship or a real democracy." David Chaum, 1996. "Fungibility provides privacy as a side effect." Adam Back, 2014.
uxgpf
Newbie
Offline
Activity: 42
Merit: 0
September 15, 2015, 11:38:13 PM
Quote:
Pruning is not the same as a checkpoint, which is what Adam proposed. Either way it still has to sync entirely, as you've mentioned, so: assuming a yearly 20% increase in blockchain size and a 10% reduction in bandwidth costs, after 15-20 years no new nodes can enter the system, except maybe huge datacenter operations. https://www.youtube.com/watch?v=TgjrS-BPWDQ&feature=youtu.be&t=7331

OK, I must have misunderstood. How would checkpoints work? (I have no clue.) Does the checkpoint basically hold a checksum of all previous transactions, confirming that everything up to that point is true? If the network has a checkpoint in its blockchain (the nodes agree upon it) and you start building on it without validating the whole chain before the checkpoint, then what's the problem?
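The compounding claim in the quote is easy to check with toy numbers. A minimal sketch, assuming exactly 20%/year chain growth and 10%/year bandwidth cheapening (the quote's figures, not real-world data): the net cost of a full initial sync then grows by a factor of 1.2 x 0.9 = 1.08 per year.

```python
# Toy model of the quoted argument: blockchain size compounds up while
# bandwidth cost per byte compounds down, and we track the product.

def sync_cost_ratio(years, chain_growth=0.20, bandwidth_cheapening=0.10):
    """Relative cost of a full initial sync after `years`, normalized to 1.0 today."""
    size_factor = (1 + chain_growth) ** years            # chain is this many times larger
    cost_per_byte = (1 - bandwidth_cheapening) ** years  # each byte is this much cheaper
    return size_factor * cost_per_byte

for y in (5, 10, 15, 20):
    print(y, round(sync_cost_ratio(y), 2))   # 1.08**y: roughly 1.47, 2.16, 3.17, 4.66
```

So under these assumptions a fresh sync is about 3-5x more expensive after 15-20 years; whether real bandwidth trends look anything like these constants is exactly what the thread is arguing about.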
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
September 15, 2015, 11:41:51 PM
Quote:
Each transaction has to be evaluated only once to see if all of its inputs are valid. The previous work of tracing back each input to its source in some coinbase transaction was done when previous blocks were verified by the node, and there is no need to redo it. If two transactions go into the same block where one spends the outputs of the other, this requires sorting the transactions (via a topological sort), so that one transaction can be processed at a time and the proper accounting of the UTXO set performed (remove and add). Finally, the processing of each transaction has to verify that each of its inputs is used only once in the transaction. This can be done together with verifying the presence of each input if an efficient data structure is used. With appropriate use of hashing there will probably not be any need for sorting operations, and performance can be O(N), at least with very high probability. (If random-access memory is limited, the worst case here requires sorting, which makes the overall operation O(n log n), not O(n^2), assuming competent design and programming.)

Easier said than done. Just import this private key and tell me how long it takes: L5WfGz1p6tstRpuxXtiy4VFotXkHa9CRytCqY2f5GeStarA5GgG5. Notice that a 1MB transaction containing 5000+ inputs is totally legal in today's system, so there is already some incentive for attackers. If nodes already have to set up defenses against data-flooding attacks even under a 1MB block size, then the current network is far from robust enough to go for higher block sizes.
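The in-block ordering step described in the quote can be sketched in a few lines. This is a minimal illustration with hypothetical structures (transactions as (txid, inputs) pairs, inputs as (txid, vout) outpoints), not Bitcoin Core's actual code; it uses Kahn's algorithm with hash sets, so it runs in O(N + edges) as the quote suggests.

```python
from collections import defaultdict, deque

def topo_sort_txs(txs):
    """Order a block's transactions so every parent is processed before
    any child that spends its outputs. txs: list of (txid, inputs) pairs,
    inputs: list of (txid, vout) outpoints."""
    ids = {txid for txid, _ in txs}
    deps = defaultdict(set)      # txid -> in-block parents it spends from
    children = defaultdict(set)  # txid -> in-block txs that spend it
    for txid, inputs in txs:
        for parent, _vout in inputs:
            if parent in ids:    # edges only for same-block spends
                deps[txid].add(parent)
                children[parent].add(txid)
    ready = deque(txid for txid, _ in txs if not deps[txid])
    order = []
    while ready:
        txid = ready.popleft()
        order.append(txid)
        for child in children[txid]:
            deps[child].discard(txid)
            if not deps[child]:
                ready.append(child)
    if len(order) != len(txs):
        raise ValueError("dependency cycle: invalid block")
    return order
```

With the ordering in hand, UTXO accounting is the hash-set remove/add pass the quote describes, and double-spends inside a transaction show up as a repeated outpoint during that pass.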
RealBitcoin
September 15, 2015, 11:46:41 PM
Quote:
Easier said than done. Just import this private key and tell me how long it takes: L5WfGz1p6tstRpuxXtiy4VFotXkHa9CRytCqY2f5GeStarA5GgG5. Notice that a 1MB transaction containing 5000+ inputs is totally legal in today's system, so there is already some incentive for attackers. If nodes already have to set up defenses against data-flooding attacks even under a 1MB block size, the current network is far from robust enough to go for higher block sizes.

But it's a free market, right? If somebody wants to flood the network, miners just increase the transaction fee. Let the spammer pay; eventually he will run out of money. I think it should be raised to 2 MB at least. C'mon, I've got a 300 Mb/s internet connection. Guys, get yourselves better internet: