Author Topic: Thanks to people who support 1-2 MB blocks - great idea u fools...  (Read 17055 times)
johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
September 15, 2015, 11:54:09 PM
 #121


wow so much fud coming out of your ass.....

You should write a book about this fud, ... genius AT work


You are exactly the type of human those economics books are trying to model, and it seems their modeling works very well on you  Wink

The question is: Are you willing to be modeled by a group of economists so that they can precisely calculate your next move and profit from it?

Holliday
Legendary
*
Offline Offline

Activity: 1120
Merit: 1009



View Profile
September 16, 2015, 12:27:34 AM
 #122

should bitcoin's traffic limits be based on what 10% of a typical home connection can handle?

maybe 20%? maybe 50%? surely anything demanding 10-20% isn't going to impact the number of full nodes.

in any case i feel there can be much improvement in how data is shared across the network to dramatically decrease bandwidth use for full nodes, which should allow for proportionally bigger blocks.

I don't have the answers to your questions, but I will ask you a few instead.

Should I be able to run a fully functional node while using the same internet connection for other daily household uses?

I pay near $100 per month for the highest tier internet speeds my ISP will offer.

I've built a dedicated machine to run Core (modern quad-core CPU, 16GB RAM, modern SSD).

To give you an example of my resources:

i'm currently syncing a full node and it feels like my computer is about to explode! cpu is running hot! hot! Hot!

You are still catching up from a partially synced state after at least 24 hours? I can sync the full chain from scratch in about 14 hours.

Yet, I can't run a full node without "putting the brakes on".

I'll ask a second question. Will a larger block size limit (which we agree will lead to larger blocks) make it easier or more difficult for me to run a full node?

I already have to gimp my node in order to have a functional connection to the internet. I already have dedicated hardware in order to keep my daily use computer "up to speed". What future steps will I (someone with fast computer / fast internet) need to take in order to keep running a full node?

I guess I could cross my fingers and hope my ISP (which has zero competition in my area) just pumps up my bandwidth for no additional cost.

I could continue to gimp my node when I know the network would happily accept more data from me.
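
For concreteness, "gimping" a node like this usually comes down to a couple of bitcoin.conf settings. A minimal sketch (option names are Bitcoin Core's; the values are illustrative, availability of maxuploadtarget depends on the release, and this is not the poster's actual configuration):
Code:
# illustrative throttling only, not an actual configuration
# fewer peer connections = less transaction and block relay traffic
maxconnections=12
# cap uploads at roughly 5000 MiB per 24 hours; once the cap is hit the node
# stops serving historical blocks to peers
maxuploadtarget=5000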

Or, should I just give up on running a full node?

I know someone that turns on his full node like once a week, just to download the latest blocks and then turns it off again, so that when he makes a TX it doesn't take him long to sync up first. is this node useful? should ALL users be asked to run a full node?

I haven't done a Bitcoin transaction in over 2 years. Yet, I have multiple full nodes with damn near 100% up time. Maybe I'm the sucker for continuing to bother. If it ends up taking even more resources, I may simply have to stop.

If you aren't the sole controller of your private keys, you don't have any bitcoins.
Q7
Sr. Member
****
Offline Offline

Activity: 448
Merit: 250


View Profile WWW
September 16, 2015, 01:17:24 AM
 #123

It's good that the OP points out that there is indeed a flaw in the system that really needs to be addressed. For those who oppose it, we should at least come up with a solution or at least admit there is something wrong. Alas, anyway, the issue will probably only crop up 40-50 from now, and there's a high chance we will not all be here to see or experience it.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
September 16, 2015, 01:39:55 AM
 #124

It's good that the OP points out that there is indeed a flaw in the system that really needs to be addressed. For those who oppose it, we should at least come up with a solution or at least admit there is something wrong. Alas, anyway, the issue will probably only crop up 40-50 from now, and there's a high chance we will not all be here to see or experience it.

if you mean 40-50 seconds you're wrong, this problem cropped up 3 months ago.

hashman
Legendary
*
Offline Offline

Activity: 1264
Merit: 1008


View Profile
September 16, 2015, 05:57:14 AM
 #125

Oh noes! 

So the original prototype of public coin, bitcoin proper, can't be all types of coin for all types of people at once?  Run for the hills!!  Oh wait, we have altcoins.  Never mind. 

Zarathustra
Legendary
*
Offline Offline

Activity: 1162
Merit: 1004



View Profile
September 16, 2015, 07:22:54 AM
 #126

it would not make sense to use the worst possible examples as our baseline.

That baseline doesn't make sense if your goal is to optimize tx per second.  But why bother, when Visa/Paypal/Square can do that already very well.

OTOH, if you GAF about survivability of the diverse/diffuse/defensible/resilient network, you plan for the worst (and hope for the best).

Gavin wants to plan according to some rosy vision of the future, where the economy is all fine and dandy thanks to Helicopter Ben printing endless FunBux.

I want Bitcoin to survive World War Three (working subtitle 'Requiem for the Petrodollar').

That's not going to happen if Bitcoin lives in first strike targets like Google.mil datacenters.

World War Three? Are you dreaming? Einstein knew a long time ago that a World War Three would be the last one.

World War Three is a perpetual state of war. Much of the world is already there and has been for some time.

Well put. A worldwide cold war fought with weapons of financial destruction.

Bitcoin needs to survive the current 'Cold War 2' perpetual state of war fought with weapons of financial destruction.

And then it needs to survive WW3, which is entirely possible.  Einstein's anti-nuke hippie-dippie scare tactic blather about WW4 being fought with sticks and stones is just asinine objectively pro-Marxist counterfactual surrender-monkey BS.  Nuclear Winter was the Global Warming (popular bogyman) of his day...

We're not going to blanket the earth with mushroom clouds (it's too big and that would cost a fortune  Grin).


Ridiculous. As soon as the power grids collapse together with the financial system, the 'nuclear winter' will be around the corner. 500 reactors and fuel pools will blow out their inventories.
figmentofmyass
Legendary
*
Offline Offline

Activity: 1652
Merit: 1483



View Profile
September 16, 2015, 07:27:37 AM
Last edit: September 23, 2015, 07:04:31 AM by figmentofmyass
 #127

https://www.youtube.com/watch?v=TgjrS-BPWDQ&feature=youtu.be&t=12667

jgarzik suggests in his talk that the Fidelity investment company has a beta bitcoin project which it cannot turn on, because it would "max out" bitcoin capacity and future capacity growth is unknown


thanks for the video. it makes clear that the capacity of the network right now is not enough

sure, i think that is widely accepted, by those who likely include Adam Back and Nick Szabo. the "small blockean" misnomer is just that. the question is how, and it was pretty clear that XT/BIP101 was a dangerous route.

johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
September 16, 2015, 12:06:07 PM
 #128


Easier said than done, just import this private key and tell me how long it takes
L5WfGz1p6tstRpuxXtiy4VFotXkHa9CRytCqY2f5GeStarA5GgG5

Notice that a 1MB transaction containing 5000+ inputs is totally legal in today's system, which already shows that attackers might have an incentive here. If nodes already have to set up defenses against data-flooding attacks even under the 1MB block size, it shows the current network is far from robust enough to go for higher block sizes

But it's a free market, right? If somebody wants to flood the network, miners just increase the transaction fee.

Let the spammer pay; eventually he will run out of money.


I think it should be raised to 2 MB at least, c'mon i got a 300 mb/s internet speed, guys get yourself better fucking internet:



A 2MB block is a totally different thing from a 2MB mp3 file: it might contain much more complex data that even a modern CPU cannot handle well, because of all the verification involved in thousands of transactions

If someone keeps throwing $20 notes off the top of the Empire State Building every second, it will jam the traffic of the whole of Manhattan. Can you just say "let the free market decide, he will eventually run out of money"?

Historically, attacks on the network usually come together with attacks on bitcoin exchanges. The attacker can short bitcoin on major exchanges and wait for the network chaos to profit
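
For a rough sense of why such a transaction is expensive to verify, here is a back-of-envelope sketch (a Python calculation, not a benchmark; the transaction size, input count and hashing throughput are all assumed figures):
Code:
# back-of-envelope only: every figure below is an assumption, not a measurement
tx_size_bytes = 1_000_000      # a ~1 MB transaction, the largest the 1 MB block limit allows
n_inputs      = 5_000          # inputs in that single transaction
hash_rate     = 200e6          # assumed double-SHA256 throughput of the verifier, in bytes/second

# under legacy (pre-segwit) SIGHASH rules each input's signature hash covers
# roughly the whole transaction, so total hashing work ~ n_inputs * tx_size
bytes_hashed = n_inputs * tx_size_bytes

print(f"bytes hashed: {bytes_hashed / 1e9:.1f} GB")
print(f"hashing time alone: {bytes_hashed / hash_rate:.0f} s (ECDSA verification comes on top)")

This quadratic growth in hashing work is why a single block-sized transaction can stall a node far longer than a block full of ordinary transactions.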

LiteCoinGuy (OP)
Legendary
*
Offline Offline

Activity: 1148
Merit: 1010


In Satoshi I Trust


View Profile WWW
September 16, 2015, 03:16:12 PM
 #129

Bitcoin will break down dams erected by special interest groups attempting to block the stream of transactions.

https://www.youtube.com/watch?v=iKDC2DpzNbw&t=26m30s

yayayo
Legendary
*
Offline Offline

Activity: 1806
Merit: 1024



View Profile
September 16, 2015, 03:30:06 PM
 #130

i'm currently syncing a full node and it feels like my computer is about to explode! cpu is running hot! hot! Hot!

Nice that you finally move from theory to practice... but let me guess, you still won't see a problem with downloading and verifying 8x the current amount of data...

It should be noted that I'm not categorically against bigger blocks forever. But these changes must be made in a much more careful and restrictive manner than is done by BIP101 and the corresponding altcoin. Proposing max-blocksize increases as the preferred solution for scaling Bitcoin shows a lack of engineering capability imho.

ya.ya.yo!

mallard
Full Member
***
Offline Offline

Activity: 196
Merit: 100


View Profile
September 16, 2015, 03:34:34 PM
 #131

Well I have an average connection for Europe and i could easily support two gigabyte blocks from home, eight megabyte blocks would not be a problem at all. This person in Florida needs to either get a new internet provider or update his client by the sounds of it. lol

Did you read my posts in this thread? I've had to drastically reduce the connectivity of my full node (latest Core release) running on its own dedicated hardware (modern quad-core CPU, 16GB RAM, SSD) in order to keep the home network functional for other daily use demands.

My node will happily eat as much upload speed as I give it and I have top 10% home internet speeds (probably better). It can bring simple web surfing to a standstill if I let it.

Nodes don't just accept blocks. Most of my bandwidth use is on the upload side (sharing data with other peers)! Larger blocks will obviously have a direct impact on the amount of data shared.

Do you run a node or are you just guessing?

should bitcoin's traffic limits be based on what 10% of a typical home connection can handle?

maybe 20%? maybe 50%? surely anything demanding 10-20% isn't going to impact the number of full nodes.

in any case i feel there can be much improvement in how data is shared across the network to dramatically decrease bandwidth use for full nodes, which should allow for proportionally bigger blocks.

I know someone that turns on his full node like once a week, just to download the latest blocks and then turns it off again, so that when he makes a TX it doesn't take him long to sync up first. is this node useful? should ALL users be asked to run a full node?

Bitcoin traffic should be limited based on the capacity of anonymous bandwidth growth.

Assuming a yearly 20% increase in blockchain size & a 10% reduction in bandwidth costs, after 15-20 years no new nodes can enter the system except maybe huge datacenter operations. https://www.youtube.com/watch?v=TgjrS-BPWDQ&feature=youtu.be&t=7331
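
To see how those two assumptions compound, a small sketch (the growth and cost figures are the ones quoted above; the starting cost is normalized to 1):
Code:
# compounding the quoted assumptions; the output is a multiple of today's cost
chain_growth = 1.20   # blockchain size grows 20% per year (assumption from the post)
cost_factor  = 0.90   # bandwidth cost falls 10% per year, so a fixed budget buys ~1/0.9x more

for years in (5, 10, 15, 20):
    relative_burden = (chain_growth * cost_factor) ** years
    print(f"after {years:2d} years: bootstrapping a new node costs ~{relative_burden:.1f}x today's")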



what if new nodes run with pruned versions of the blockchain?

maybe we could have a sort of yearly checkpoint: we no longer ask nodes to download data that is >1 year old and base verification off of these yearly checkpoints.

which defeats the whole purpose of Bitcoin, that you should trust no one to validate the entirety of your coins' history

Maybe everyone should use something like a Raspberry Pi and leave it running as a full node 24/7. Then use RPC or something to send and receive coins through a laptop or phone.
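
That "Raspberry Pi full node plus wallet over RPC" setup roughly corresponds to a bitcoin.conf along the following lines. The option names are Bitcoin Core's, but the values, credentials and LAN subnet are placeholders, and exposing RPC beyond localhost should be done with care:
Code:
# sketch only: values, credentials and the LAN subnet are placeholders
# accept JSON-RPC commands
server=1
# replace with strong, unique credentials
rpcuser=pi-node
rpcpassword=change-me
# allow RPC only from the home LAN; depending on the release you may also need rpcbind
rpcallowip=192.168.1.0/24
rpcbind=0.0.0.0
# optional pruning: keep ~5 GB of recent block files; the node still validates fully
# but can no longer serve old blocks to other peers
prune=5000
# keep resource use modest on a small machine
maxconnections=20
dbcache=100
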
coalitionfor8mb
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
September 16, 2015, 04:02:09 PM
 #132

...
Or, should I just give up on running a full node?
...
I may simply have to stop.
...

One more thing.

The evolutionary stage transition we are currently facing with Bitcoin is one where "sheer quantity" (of full nodes) becomes "higher quality" (of fewer ones). That's why evolution doesn't progress in a smooth linear way and has stages instead (nothing grows in a straight line). It has its own cycles, and the limit in question serves as a barrier of some sort.

If we look at how stars operate (in outer space), we will see that they need to accumulate enough mass and build up enough pressure in order to be able to contain higher-energy reactions and therefore produce heavier elements. At a certain point they throw off their outer shell and begin attracting and accumulating new mass on the next level of their evolution.

The home-based demographic might constitute that outer shell for the transition process, but that's not the end of it, read on.

Bitcoin will break down dams erected by special interest groups attempting to block the stream of transactions.

https://www.youtube.com/watch?v=iKDC2DpzNbw&t=26m30s

Without the limit the most productive players in the ecosystem will begin losing the background (against which they used to show how good they are) and instead form (and consolidate around) new clusters of gravity not accessible by the majority of Bitcoin's user-base. That's where the star of Bitcoin would begin falling apart.

If instead of removing the limit we simply move the existing one far (but still safe) enough (like 8MB), we will outline the new barrier to which all the players would rush racing against each other. Some will get there faster than others, but because the limit is static and firmly cemented into the brand (while the technology continues to improve), the rest of the user-base will eventually catch up and begin accumulating even bigger mass to facilitate the next stage transition.

And so it goes... Smiley
hodedowe
Sr. Member
****
Offline Offline

Activity: 359
Merit: 251


View Profile
September 16, 2015, 04:17:01 PM
 #133

Continuing off the post above...


...All the while keeping the cost of transactions low while the (current) 25 BTC subsidy plays out. Once the subsidy is gone and the 21m coins are "printed", fees will still rise with forced inflation; however, at that point the limit can be raised again, since a dangerous precedent is set, and miners will collect fees of 0.0001 BTC per transaction, meaning blocks will need to be an insanely large 30-40 MB to collect a similar fee.

You'll always have users clamoring for lower fees and miners demanding higher fees. Why mine (and again, this keeps bitcoin alive, guys) when you make no money or have no incentive? With the current 1 MB limit you have the market dictating the rate. Those that *need* fast transactions pay for the privilege. Those that don't pay no fee at all. Paying for privilege is nothing new. You can't stay in the best hotel for 40 cents. You can stay in the worst hotel for 40 million, but of course you don't have to. You can pay for privilege and not worry about body lice and rats.
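
The arithmetic behind that claim is straightforward; in the sketch below the subsidy and the per-transaction fee are the post's own figures, while the average transaction size is an assumption (the post's 30-40 MB corresponds to roughly 120-160 bytes per transaction):
Code:
# the subsidy and per-transaction fee are the post's figures; the average
# transaction size is an assumption and is what the resulting block size hinges on
block_subsidy_btc = 25
fee_per_tx_btc    = 0.0001
avg_tx_bytes      = 250        # assumed average transaction size

txs_needed  = block_subsidy_btc / fee_per_tx_btc    # transactions per block to match the subsidy
block_bytes = txs_needed * avg_tx_bytes

print(f"{txs_needed:,.0f} transactions per block")
print(f"~{block_bytes / 1e6:.1f} MB per block at {avg_tx_bytes} bytes per transaction")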



Solo mining is alive and profitable!
Helped? Thanks! 1CXRFh4bDVFBsUzoHMMDbTMPcBP14RUTus
brg444
Hero Member
*****
Offline Offline

Activity: 644
Merit: 504

Bitcoin replaces central, not commercial, banks


View Profile
September 16, 2015, 04:40:28 PM
 #134

One more thing.

The evolutionary stage transition we are currently facing with Bitcoin is one where "sheer quantity" (of full nodes) becomes "higher quality" (of fewer ones). That's why evolution doesn't progress in a smooth linear way and has stages instead (nothing grows in a straight line). It has its own cycles, and the limit in question serves as a barrier of some sort.

If we look at how stars operate (in outer space), we will see that they need to accumulate enough mass and build up enough pressure in order to be able to contain higher-energy reactions and therefore produce heavier elements. At a certain point they throw off their outer shell and begin attracting and accumulating new mass on the next level of their evolution.

The home-based demographic might constitute that outer shell for the transition process, but that's not the end of it, read on.

 Huh

Please define "higher quality" nodes?

Without the limit the most productive players in the ecosystem will begin losing the background (against which they used to show how good they are) and instead form (and consolidate around) new clusters of gravity not accessible by the majority of Bitcoin's user-base. That's where the star of Bitcoin would begin falling apart.

If instead of removing the limit we simply move the existing one far (but still safe) enough (like 8MB), we will outline the new barrier to which all the players would rush racing against each other. Some will get there faster than others, but because the limit is static and firmly cemented into the brand (while the technology continues to improve), the rest of the user-base will eventually catch up and begin accumulating even bigger mass to facilitate the next stage transition.

And so it goes... Smiley

I wouldn't say an 800% increase qualifies as "safe".

"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
September 16, 2015, 04:41:36 PM
 #135

should bitcoin's traffic limits be based on what 10% of a typical home connection can handle?

maybe 20%? maybe 50%? surely anything demanding 10-20% isn't going to impact the number of full nodes.

in any case i feel there can be much improvement in how data is shared across the network to dramatically decrease bandwidth use for full nodes, which should allow for proportionally bigger blocks.

I don't have the answers to your questions, but I will ask you a few instead.

Should I be able to run a fully functional node while using the same internet connection for other daily household uses?

I pay near $100 per month for the highest tier internet speeds my ISP will offer.

I've built a dedicated machine to run Core (modern quad-core CPU, 16GB RAM, modern SSD).

To give you an example of my resources:

i'm currently syncing a full node and it feels like my computer is about to explode! cpu is running hot! hot! Hot!

You are still catching up from a partially synced state after at least 24 hours? I can sync the full chain from scratch in about 14 hours.

Yet, I can't run a full node without "putting the brakes on".

I'll ask a second question. Will a larger block size limit (which we agree will lead to larger blocks) make it easier or more difficult for me to run a full node?

I already have to gimp my node in order to have a functional connection to the internet. I already have dedicated hardware in order to keep my daily use computer "up to speed". What future steps will I (someone with fast computer / fast internet) need to take in order to keep running a full node?

I guess I could cross my fingers and hope my ISP (which has zero competition in my area) just pumps up my bandwidth for no additional cost.

I could continue to gimp my node when I know the network would happily accept more data from me.

Or, should I just give up on running a full node?

I know someone that turns on his full node like once a week, just to download the latest blocks and then turns it off again, so that when he makes a TX it doesn't take him long to sync up first. is this node useful? should ALL users be asked to run a full node?

I haven't done a Bitcoin transaction in over 2 years. Yet, I have multiple full nodes with damn near 100% up time. Maybe I'm the sucker for continuing to bother. If it ends up taking even more resources, I may simply have to stop.

I'm currently catching up from about 1 year of downtime. anyway:

"I already have to gimp my node"
does gimping your node reduce its usefulness??
that's the thing! there is a real demand for some kind of semi-full node.
which is why i believe there should be different levels of nodes

SPV, semi-full node, full node, supernode, miner node.

currently the network is configured in a completely random manner, with everyone relaying TXs to everyone else, increasing bandwidth requirements needlessly. and on top of that, the way new blocks are propagated also increases bandwidth needlessly.

If improvements were made such that your full node requires 4x less bandwidth, would you not agree that a 4x block increase would be OK?

block propagation can be made up to 250x faster (there is a running network which currently utilizes this improvement; sadly your node does not currently benefit from it). there are probably all kinds of other improvements to be made.

you could say you are already running a sort of semi-full node because you limit its bandwidth usage. it would be much better if a real semi-full node was developed and optimized to serve a specific function on the network.

bottom line, we need improvements. But still, the answer is No, you shouldn't aim for running a full node while running a full household worth of devices. I believe we should keep the limit such that a typical home connection can handle it with about 80% of its bandwidth being utilized. this will be the upper limit which should never be crossed. and at the same time offer semi-full nodes that do something useful with only 10-20% of a typical home connection.

some improved incentives will also help. what's the point of running a "super node" that relays GBs of data per day? maybe miners would be willing to pay a small fee to connect to this node so that they get blocks faster? or something!
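
To illustrate why relay efficiency matters as much as the limit itself, here is a crude model (every figure is an assumption and real relay behaviour is more complicated); note that cutting redundant relay cuts the upload estimate proportionally:
Code:
# crude model only: every figure is an assumption
block_size_mb  = 8.0      # try 1, 2, 8 ...
blocks_per_day = 144
peers          = 30       # connections the node maintains
relay_share    = 0.25     # assumed fraction of peers that still need each item from us

# each transaction arrives roughly twice (once on its own, once inside a block),
# and we forward roughly that much data to the peers that haven't seen it yet
download_mb = 2 * block_size_mb * blocks_per_day
upload_mb   = download_mb * peers * relay_share

print(f"{block_size_mb:.0f} MB blocks, {peers} peers: "
      f"~{download_mb / 1024:.1f} GB/day down, ~{upload_mb / 1024:.1f} GB/day up")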

coalitionfor8mb
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
September 16, 2015, 05:07:02 PM
Last edit: September 16, 2015, 06:31:02 PM by coalitionfor8mb
 #136

One more thing.

The evolutionary stage transition we are currently facing with Bitcoin is one where "sheer quantity" (of full nodes) becomes "higher quality" (of fewer ones). That's why evolution doesn't progress in a smooth linear way and has stages instead (nothing grows in a straight line). It has its own cycles, and the limit in question serves as a barrier of some sort.

If we look at how stars operate (in outer space), we will see that they need to accumulate enough mass and build up enough pressure in order to be able to contain higher-energy reactions and therefore produce heavier elements. At a certain point they throw off their outer shell and begin attracting and accumulating new mass on the next level of their evolution.

The home-based demographic might constitute that outer shell for the transition process, but that's not the end of it, read on.

Please define "higher quality" nodes?

They will become those heavier elements (in the star analogy), which will be able to handle a larger amount of transactions and serve more end-users than the current ones. In the initial stages of the transition process, people might want to crowd-fund and crowd-source a few full nodes for themselves (in the cloud) instead of running autonomous ones at home (jamming most of their daily internet use). By the time technology advances far enough (while the limit stays the same), people will be able to run full nodes at home again and the whole process will have to repeat itself.

Without the limit the most productive players in the ecosystem will begin losing the background (against which they used to show how good they are) and instead form (and consolidate around) new clusters of gravity not accessible by the majority of Bitcoin's user-base. That's where the star of Bitcoin would begin falling apart.

If instead of removing the limit we simply move the existing one far (but still safe) enough (like 8MB), we will outline the new barrier to which all the players would rush racing against each other. Some will get there faster than others, but because the limit is static and firmly cemented into the brand (while the technology continues to improve), the rest of the user-base will eventually catch up and begin accumulating even bigger mass to facilitate the next stage transition.

And so it goes... Smiley

I wouldn't say an 800% increase qualifies as "safe".

I agree that stars aren't very safe during their transition stages, but that's the nature of it. It's really up to us (as a collective) to make it as smooth as we want it to be, but don't underestimate the gravity and the inertia of the process. Star transitions can be beautiful, but there is always an element of uncertainty in there. Oh, and you just want to have at least twice the theoretical capacity of your closest PoW competitor in order to stay in the game. Smiley
Holliday
Legendary
*
Offline Offline

Activity: 1120
Merit: 1009



View Profile
September 16, 2015, 06:19:33 PM
 #137

bottom line, we need improvements. But still, the answer is No, you shouldn't aim for running a full node while running a full household worth of devices. I believe we should keep the limit such that a typical home connection can handle it with about 80% of its bandwidth being utilized. this will be the upper limit which should never be crossed. and at the same time offer semi-full nodes that do something useful with only 10-20% of a typical home connection.

some improved incentives will also help. what's the point of running a "super node" that relays GBs of data per day? maybe miners would be willing to pay a small fee to connect to this node so that they get blocks faster? or something!

Again, my connection and my hardware are nowhere close to "typical". You are, two days later, still syncing from 1 year behind? I sync the entire chain in 14 hours. My other household uses (frequent web browsing, occasional gaming and streaming video) take nowhere near the resources of Core.

If a Bitcoin enthusiast like myself, who spends money on hardware specifically to run a node, who has top tier internet speeds, can't run a full node at full capacity (without a dedicated line), I would say that the decentralization of the network has been harmed. (Of course, I am currently running multiple full nodes, but not at full capacity. My concern is increasing the data a node should share before making the system as efficient as possible.)

If you want to take off the training wheels, let's talk about it after we've got the pedals, chain, handlebars, and shifters in fantastic working order.

You envision a network where there are multiple implementations of the software based on the resources at hand. This isn't necessarily a bad thing, unless it raises the bar so much that the end user can no longer access the full censorship-proof nature of Bitcoin.

I envision a network that is able to function in a possible future where governments force ISPs to audit or even censor Bitcoin traffic. I don't think we will have truly censorship-proof money until the Bitcoin network is not only able, but actually functioning, on a global wireless mesh network that is entirely out of the reach of any entity that would wish to control it.

I want Bitcoin to function in a worst case scenario. This isn't a war to see who can provide the cheapest, most convenient transaction for buying a glass of wine, this is a war for financial freedom.


If you aren't the sole controller of your private keys, you don't have any bitcoins.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
September 16, 2015, 06:27:38 PM
 #138

bottom line, we need improvements. But still, the answer is No, you shouldn't aim for running a full node while running a full household worth of devices. I believe we should keep the limit such that a typical home connection can handle it with about 80% of its bandwidth being utilized. this will be the upper limit which should never be crossed. and at the same time offer semi-full nodes that do something useful with only 10-20% of a typical home connection.

some improved incentives will also help. what's the point of running a "super node" that relays GBs of data per day? maybe miners would be willing to pay a small fee to connect to this node so that they get blocks faster? or something!

Again, my connection and my hardware are nowhere close to "typical". You are, two days later, still syncing from 1 year behind? I sync the entire chain in 14 hours. My other household uses (frequent web browsing, occasional gaming and streaming video) take nowhere near the resources of Core.

If a Bitcoin enthusiast like myself, who spends money on hardware specifically to run a node, who has top tier internet speeds, can't run a full node at full capacity (without a dedicated line), I would say that the decentralization of the network has been harmed. (Of course, I am currently running multiple full nodes, but not at full capacity. My concern is increasing the data a node should share before making the system as efficient as possible.)

If you want to take off the training wheels, let's talk about it after we've got the pedals, chain, handlebars, and shifters in fantastic working order.

You envision a network where there are multiple implementations of the software based on the resources at hand. This isn't necessarily a bad thing, unless it raises the bar so much that the end user can no longer access the full censorship-proof nature of Bitcoin.

I envision a network that is able to function in a possible future where governments force ISPs to audit or even censor Bitcoin traffic. I don't think we will have truly censorship-proof money until the Bitcoin network is not only able, but actually functioning, on a global wireless mesh network that is entirely out of the reach of any entity that would wish to control it.

I want Bitcoin to function in a worst case scenario. This isn't a war to see who can provide the cheapest, most convenient transaction for buying a glass of wine, this is a war for financial freedom.



it's only taking days to sync because i need to work all day on my computer and i can't have core eating 90% cpu while i work.
also i turn off my machine at night because it's hot in my room.


i think you're exaggerating the resources core takes, and that's why i'm syncing a full node; once the full node is synced i will see first hand if this hurts my streaming.

agreed, we need improvements done before we up the limit.

i think you're pushing the "war" thing a little too far, right now almost all governments are OK with bitcoin.

brg444
Hero Member
*****
Offline Offline

Activity: 644
Merit: 504

Bitcoin replaces central, not commercial, banks


View Profile
September 16, 2015, 06:42:09 PM
 #139

bottom line, we need improvements. But still, the answer is No, you shouldn't aim for running a full node while running a full household worth of devices. I believe we should keep the limit such that a typical home connection can handle it with about 80% of its bandwidth being utilized. this will be the upper limit which should never be crossed. and at the same time offer semi-full nodes that do something useful with only 10-20% of a typical home connection.

some improved incentives will also help. what's the point of running a "super node" that relays GBs of data per day? maybe miners would be willing to pay a small fee to connect to this node so that they get blocks faster? or something!

Again, my connection and my hardware are nowhere close to "typical". You are, two days later, still syncing from 1 year behind? I sync the entire chain in 14 hours. My other household uses (frequent web browsing, occasional gaming and streaming video) take nowhere near the resources of Core.

If a Bitcoin enthusiast like myself, who spends money on hardware specifically to run a node, who has top tier internet speeds, can't run a full node at full capacity (without a dedicated line), I would say that the decentralization of the network has been harmed. (Of course, I am currently running multiple full nodes, but not at full capacity. My concern is increasing the data a node should share before making the system as efficient as possible.)

If you want to take off the training wheels, let's talk about it after we've got the pedals, chain, handlebars, and shifters in fantastic working order.

You envision a network where there are multiple implementations of the software based on the resources at hand. This isn't necessarily a bad thing, unless it raises the bar so much that the end user can no longer access the full censorship-proof nature of Bitcoin.

I envision a network that is able to function in a possible future where governments force ISPs to audit or even censor Bitcoin traffic. I don't think we will have truly censorship-proof money until the Bitcoin network is not only able, but actually functioning, on a global wireless mesh network that is entirely out of the reach of any entity that would wish to control it.

I want Bitcoin to function in a worst case scenario. This isn't a war to see who can provide the cheapest, most convenient transaction for buying a glass of wine, this is a war for financial freedom.



it's only taking days to sync because i need to work all day on my computer and i can't have core eating 90% cpu while i work.
also i turn off my machine at night because it's hot in my room.


i think you're exaggerating the resources core takes, and that's why i'm syncing a full node; once the full node is synced i will see first hand if this hurts my streaming.

agreed, we need improvements done before we up the limit.

i think you're pushing the "war" thing a little too far, right now almost all governments are OK with bitcoin.

How can you be so naive?

By exaggerating, do you suggest Holliday is lying? What he's saying sounds plausible to me and is certainly not an isolated anecdote; there have been many reports of node owners experiencing similar problems.

Planning for "the war thing" is how you build security systems: assuming worst-case scenarios. Today's reality might be quite different from tomorrow's. Do you really expect governments of the world to let Bitcoin take over the world without putting up a fight?

"I believe this will be the ultimate fate of Bitcoin, to be the "high-powered money" that serves as a reserve currency for banks that issue their own digital cash." Hal Finney, Dec. 2010
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
September 16, 2015, 06:46:05 PM
 #140

bottom line, we need improvements. But still, the answer is No, you shouldn't aim for running a full node while running a full household worth of devices. I believe we should keep the limit such that a typical home connection can handle it with about 80% of its bandwidth being utilized. this will be the upper limit which should never be crossed. and at the same time offer semi-full nodes that do something useful with only 10-20% of a typical home connection.

some improved incentives will also help. what's the point of running a "super node" that relays GBs of data per day? maybe miners would be willing to pay a small fee to connect to this node so that they get blocks faster? or something!

Again, my connection and my hardware are nowhere close to "typical". You are, two days later, still syncing from 1 year behind? I sync the entire chain in 14 hours. My other household uses (frequent web browsing, occasional gaming and streaming video) take nowhere near the resources of Core.

If a Bitcoin enthusiast like myself, who spends money on hardware specifically to run a node, who has top tier internet speeds, can't run a full node at full capacity (without a dedicated line), I would say that the decentralization of the network has been harmed. (Of course, I am currently running multiple full nodes, but not at full capacity. My concern is increasing the data a node should share before making the system as efficient as possible.)

If you want to take off the training wheels, let's talk about it after we've got the pedals, chain, handlebars, and shifters in fantastic working order.

You envision a network where there are multiple implementations of the software based on the resources at hand. This isn't necessarily a bad thing, unless it raises the bar so much that the end user can no longer access the full censorship-proof nature of Bitcoin.

I envision a network that is able to function in a possible future where governments force ISPs to audit or even censor Bitcoin traffic. I don't think we will have truly censorship-proof money until the Bitcoin network is not only able, but actually functioning, on a global wireless mesh network that is entirely out of the reach of any entity that would wish to control it.

I want Bitcoin to function in a worst case scenario. This isn't a war to see who can provide the cheapest, most convenient transaction for buying a glass of wine, this is a war for financial freedom.



it's only taking days to sync because i need to work all day on my computer and i can't have core eating 90% cpu while i work.
also i turn off my machine at night because it's hot in my room.


i think you're exaggerating the resources core takes, and that's why i'm syncing a full node; once the full node is synced i will see first hand if this hurts my streaming.

agreed, we need improvements done before we up the limit.

i think you're pushing the "war" thing a little too far, right now almost all governments are OK with bitcoin.

How can you be so naive?

By exaggerating, do you suggest Holliday is lying? What he's saying sounds plausible to me and is certainly not an isolated anecdote; there have been many reports of node owners experiencing similar problems.

Planning for "the war thing" is how you build security systems: assuming worst-case scenarios. Today's reality might be quite different from tomorrow's. Do you really expect governments of the world to let Bitcoin take over the world without putting up a fight?

honestly I expect them to embrace bitcoin as the banksters' monopoly money fails them.

and i think Holliday is simply unlucky and is being bombarded with requests from a lot of nodes needlessly,
again proving that the network is set up really inefficiently

look at what this guy is saying on the matter.


I'm running one. Currently 31 connections, 5.5 kB/s down 9.2 kB/s up.

[edit] Killed the node while fiddling with it...so now only 15 connections.  Anyway, here's some more numbers.
Code:
Every 10.0s: bitcoin-cli getinfo            Wed Sep 16 21:21:33 2015

{
    "version" : 110000,
    "protocolversion" : 70010,
    "blocks" : 374827,
    "timeoffset" : -1,
    "connections" : 15,
    "proxy" : "",
    "difficulty" : 56957648455.01000977,
    "testnet" : false,
    "relayfee" : 0.00001000,
    "errors" : ""
}

5 snapshots of download and upload bandwidth:
Code:
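# sample wlan0 RX/TX byte counters from /proc/net/dev twice, one second apart, and print the deltas in kB/s (download, then upload)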
$ for i in {1..5}; do awk '{if(l1){print ($2-l1)/1024"kB/s",($10-l2)/1024"kB/s"} else{l1=$2; l2=$10;}}' <(grep wlan0 /proc/net/dev) <(sleep 1; grep wlan0 /proc/net/dev); sleep 1; done
2.47266kB/s 9.90723kB/s
5.45605kB/s 10.3809kB/s
0.604492kB/s 0.650391kB/s
7.7959kB/s 8.41016kB/s
3.51953kB/s 10.4092kB/s

how can they get such drastically different results?
