Bitcoin Forum
May 12, 2024, 11:05:12 AM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
  Show Posts
Pages: « 1 2 [3] 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 ... 65 »
41  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 22, 2014, 07:56:37 PM
Aside: did you preview your post at any point? If so, it's in your draft history...

Thanks! Never noticed that before. Here is the full reply:

Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).

This I know, but I tend to think it's not the case. Anyone in this part of the forum, on this thread, probably doesn't hold some obscure minority point of view. Whatever the basis for their reasoning, it's probably not isolated. I think if we can get some sort of consensus on a thread like this, we can in the wider community too. If we can't, it would be harder, maybe not impossible, but harder depending on how dug in people's positions are. The longer the wait, the harder. If this were mid-2010 there would likely be zero problem: high-profile devs (like Satoshi or Gavin) would simply explain the necessary way forward and the small community would move along. If we're talking 2019, I don't see that happening so easily, or at all actually.

This is exactly where a more formal governance model (as I mentioned) could help. It too would surely be imperfect, but just about anything would be better than determining consensus based on who writes the most posts, thoughtful though they may be.

I'd be in favor of more structure for progress, but you won't convince everybody. There will be purists who cry centralization.

If, for example, I knew that there was little support for gavin's version, I for one would be much more willing to compromise. But I simply don't know....

Yes, I think some sort of poll will be in order at some point. I haven't pushed that yet because I think people still need time to stew with their positions.

I'm having trouble imagining a use case where embedded hardware with difficult-to-update software would connect to the P2P network, much less having anything to do with handling the blockchain, but my imagination isn't all that great. I also have trouble in general with any device whose purpose is highly security related that isn't software upgradeable. (Such things do exist today, and they're equally ill-advised.)

There is always a long tail of technology out in the marketplace. Just because our community is at the cutting edge of technology doesn't mean everyone is. For example, I was surprised to learn of a story in the community I came from (Hacker News) about a very successful business that still ran BASIC. This was used for order fulfillment, accounting, you name it. The business was profitable in the millions if I recall correctly, but completely reliant on their workhorse infrastructure. It wasn't cutting edge, but it worked, and that's all that mattered. A similar story exists in the banking industry.
42  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 22, 2014, 07:39:25 PM
Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).

I had a complete reply typed out for all your points but my browser closed before I sent it. :(

Ah well, I'm not re-typing it. The gist is I'm aware of the above, but don't think that's the case. I tend to think those in this part of the forum on this thread have sentiments which are not isolated. If we can gain consensus here we have a good chance in the wider community; if not then who knows, but it would become ever harder with passing time.
43  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 22, 2014, 06:09:29 PM
At worst harder, but not impossible.

LOL are you not following this thread? What easy way forward do you see emerging for the block size issue?

If the ISO can finally manage to crank out C++11, despite the contentious issues and compromises that were ultimately required (and C++14 just two months ago too!), pretty much anything is possible IMO.

That's for a programming language, not a protocol. Also see Andreas Antonopoulos's comment on ossification with respect to hardware, which I also agree with.
44  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 22, 2014, 06:01:15 AM
I have an idea. Why not ask everyone in Bitcoin what they think we should do, then just do all of them! Or, we can just debate each idea until it no longer matters since the year will be 2150.
45  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 21, 2014, 10:46:18 PM
No one is even filling up the current 1MB blocks with self-dealing transactions as it is.  Remind me again why it was lowered?

This is a perfect example of why I'm skeptical about good consensus at any point, especially after large adoption numbers. Everyone has theoretically equal vote/opinion/input in Bitcoin, but not everyone is working with the same information (or expertise, abilities, integrity etc. etc.), no offense to David Rabahy.

We need something which can pass the community and in my opinion fairly soon.
46  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 21, 2014, 07:04:06 PM
Correct.  If you fix something and it never needs fixing again, and it just keeps working from then on, you have fixed it in the right way.

What about this from you?

I do agree that any feedback mechanism such as we are seeking with this line of discussion holds the potential for creating a perverse incentive.

How can you know if your solution is right and still admit this?

Automatic adjustment based on the environment of the future at least has a possibility of being right.  

So do increases in line with 30 years of historical data.

We can play "what if" about all sorts of things, but at least the ones you mention here are not going to take additional hard forks to accommodate if we use either of the methods described in this thread.  

We need to clear something up. As I've said, I don't expect future hard forks to be possible due to the protocol ossifying. You keep talking about changes which might occur later and how you're against that. I'm saying I don't believe there can be any changes after a certain point, one which we may be nearing, because it's harder to gain consensus the more people that need to be consulted.

Whatever change we make that will probably be it. How well it works in the future depends on how technology plays out and the community's ability to adapt around the protocol's shortcomings if there are any.

Please tell me if you agree that an ossifying of the protocol - the fact that it will become increasingly hard, probably impossible, to make changes as adoption grows - is what we'll likely see.

At best they are saying "not now".  

If they're saying not now it may become impossible to change from 1MB. I don't believe their position is realistic, but who is more right? Everything is subjectively based on the priorities of the advocate.

My goal, as you said earlier, isn't to be right; it's to arrive at some solution which can gain consensus while meeting Bitcoin's promises. If that's Gavin's proposal, fine. If that's your solution, fine. Let's just get something that meets those criteria so we're not stuck.

Why I think Gavin's proposal would play out better:

  • it has a good chance at gaining consensus given Gavin's position
  • it provides predictability, which shouldn't be underappreciated; most people/businesses are not nerds enjoying complex ideas, they want simple, understandable guidelines to base decisions on, and they can chart the numbers under Gavin's model
  • as now, the max size is a cap, not what is actually used; the cap stays commensurate with Moore's Law and Nielsen's Law
  • it accommodates the largest number of users (in line with exponential adoption) while still offering a protective cap
  • in the worst case if centralization occurs there is still the community to deal with (remember GHash.io) which has alternative coins

What I dislike about an input based solution:

  • possibly does not serve the greatest number of people
  • less certain about gaining consensus as people that need to be swayed may include business types like BitPay, Circle, Bitcoin Foundation, etc.
  • it doesn't provide clear predictability; whatever happens is dynamic based on the network which is influenced by various factors
  • possible perverse incentives created
  • a more complex solution means greater chance something doesn't work as expected

I see your version as fitting more like a glove. I'd agree it's probably more conservative and protective than Gavin's proposal, but at what expense? Nothing will be ideal because the technology and likely usage demand don't match, and won't for some time. Perhaps you can produce a bullet point list too and we can begin debating specifics.
47  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 21, 2014, 02:38:46 AM
Necessitating future adjustment.  A change that does not resolve the fundamental problem, and addresses only the immediate perceptions of today.  

Now define "right". Is it simply a block size which grows/shrinks dynamically with real world bandwidth over time? What if usage demand is far higher? What if the BTC exchange rate experiences unending volatility due to uncertainty about usage capacity (ie its user base)?

In other words does not needing future bandwidth adjustment automatically mean "right"?

In the same way that a fixed 1MB is "wrong".  

Not according to these people. They think 1MB is the right answer for "many more years" and until we know it's "safe" to change. Can't you see any answer given is subjective? With that being the case doesn't it make sense to adopt a solution which can fit the most common perception of Bitcoin's promise, which certainly includes global usage, and can win popular support?
48  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 20, 2014, 06:05:58 PM
Gavin's proposal is fit for immediate purposes, but it just kicks the can down the road for future readjustment.  This is problematic in that it will then re-require this leader/authority/deciding force.

This is one of those instances I'm talking about regarding people thinking differently.

You and I seem to fundamentally think differently here, and who is to say who is right? I believe whatever hard fork change we make, if we make one, will be locked in quite probably forevermore. It won't be subject to adjustment. Whatever it is, future users will have to work with it, sort of like we simply have to work with 1MB if we can't adequately change it. This is due to an ossifying of the protocol, again, as I mentioned above.

Rather than accepting an extrapolation which is guaranteed to be wrong, ...

Define "wrong".
49  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 20, 2014, 05:22:37 PM
I understand you believe that Bitcoin is doomed to fail ...

I never said I believe Bitcoin is doomed to fail, although my questioning of its viability may strengthen with developments. That should be everyone's position, because Bitcoin is an experiment. Those who think Bitcoin is guaranteed to succeed in serving the world are not understanding the situation. This doesn't mean it can't succeed at that, only that it's not guaranteed (how could it be?).

... because of insufficient central authority

My position isn't that Bitcoin needs a central authority. My position is that Bitcoin needs a viable solution. If you read the post I made above you'll see I asked whether the majority of the community could be convinced to accept Gavin's proposal or one more like what you're crafting. My position was one of adopting a viable solution.

In any case, even if you were right and such a thing were needed, that should not stop people from offering better ideas to those who are claiming to have authority.

Who has claimed any authority? Where? All I see is people putting forth their suggestions.

So you are about as wrong as anyone can possibly be, to suggest that just because someone claims authority, that they should make decisions and everyone blindly follow when they see clearly better solutions available.  
Why?
Just for the sake of establishing authorities?

Like I said above, it seems you're arguing from a position of ideology. You seem to see resolving the block size issue as divided between those who tend toward centralization and those who demand absolute decentralization, even to the point of ascribing positions to people they haven't taken. That is the reason I question Bitcoin's viability. People have their own thoughts about how things should work, or how things can work, and even if there is a solution which can work (I'm not saying which), it may not be possible to get everyone to agree, because it's not possible to do a Vulcan mind meld and have everyone understand everyone else's thoughts, conclusions, and informing information. People think differently (and with differing abilities). In the absence of some deciding force (usually a leader, or authority as you call it) the result may be no clear decision whatsoever.

I'm simply seeking something which can work, something a majority can agree upon, nothing more.
50  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 19, 2014, 09:07:53 PM
KISS:

1. Since technology allows increase to 20 MB per block today, make an increase to this size as soon as consensus and logistics allow.

And what if consensus never allows it? Do we never do anything? It seems a lot of people have an "oh, just do this" game plan, without really considering that things might not work the way they think they will.

It's entirely possible hard and even somewhat messy choices may have to be made with Bitcoin. This is because some people will never be on the page you're trying to get them on, no matter how much conversation occurs.

2. Continue to evaluate the situation ...

Did you not read what I wrote above? I fully expect (as do others) for changes to become harder if not impossible to make as adoption grows. If some tangible solution isn't enacted within a fairly short period of time (meaning before the next bubble of interest and increased adoption) I myself may seriously have to re-evaluate the viability of Bitcoin - not cryptocurrency mind you, just this particular version of Bitcoin.
51  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 19, 2014, 08:04:34 PM
Consider the existence of a central authority, employed by a member organization with the charter of interfacing with governments.  The Chiefs then take the role of arbitrarily deciding on the supply and adjusting as the organization's economic advisers suggest, we then have progressed towards replicating the Federal Reserve Bank.

I completely disagree with this. Believe it or not, it's actually not that easy for the Fed to adjust monetary policy. I mean, all things considered, it's exceptionally easy, but they still have to get their board to go along and sell the public on what they're doing. That's a task made harder as they try more extraordinary things (like now) and the public becomes more astute about the way money works and its importance (like now), and that's a center-driven design.

Bitcoin is designed from the ground up to be the opposite. It's extraordinarily hard to implement changes affecting the whole without consent from the whole. I sincerely believe after a certain point of adoption it will be impossible to make changes to Bitcoin, even ones not so controversial; if there isn't a do or die mandate behind the action (like a break in SHA256) I don't see consensus from millions and millions of independent thinkers coming easily. Somebody's going to think differently for some reason, even if it appears irrational. People call this ossifying of the protocol.

Think how hard this 1MB issue is. There was a time when Satoshi simply told everyone to start running a protocol change without question. He knew there was a severe bug allowing excess coins, but people simply upgraded and now the fork ignoring that block is locked in.

Bitcoin isn't the first to come up with decentralization. That was actually the idea behind America. Instead of power flowing down from monarchs, it would be vested in all the individuals. However, even then the founders recognized authority by committee wasn't always ideal. It would be a clear disadvantage if attacked, since the battle might be lost before it was decided what to do. That's why the president has full authority to respond militarily in case of attack.

It sounds like you're objecting for reasons more ideological than practical. While that's admirable and understandable I hope you also recognize that's not automatically best given the circumstances.
52  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 18, 2014, 06:27:33 PM
As I see it Bitcoin is like the U.S. government: it has made too many promises to keep. I agree with Gavin that Bitcoin has been sold as being able to serve the world's population. At the same time it has been sold as being effectively decentralized. These two things can't happen at the same time with today's technology, because bandwidth numbers (primarily) don't align with global transaction data numbers. They will work together eventually, but they don't today.

The question is how to get from today to the future day when Bitcoin can handle the world's transaction needs while remaining decentralized down to technology available to average people.

We have effectively three choices.

- Do nothing and remain at 1MB blocks
- Gavin's proposal to grow transaction capacity exponentially, possibly fitting in line with Bitcoin adoption numbers
- Some algorithmic formula to determine block size which is probably more conservative than exponential growth, but less predictable

I think doing nothing is unrealistic.

I like Gavin's proposal because it can solve the issue while also being predictable. Predictability has value when it comes to money. I agree that some other algorithm using real world inputs is safer, but I wonder at what expense. In the worst case, using Gavin's proposal, there may be some risk of heavy-hitting players hogging market share from lesser miners, maybe even to the extent of becoming centralized cartels. I don't think there is a good chance of that happening, but agree it's in the realm of possibility. In that case, though, nobody would be forced to continue using Bitcoin, since it's a voluntary currency. It's easy to move to an alternative coin. Free market forces, in my mind, would solve the problem.

If we try to be as cautious as possible, seeking inputs along the way, we can probably rest assured centralization won't happen with Bitcoin. At the same time, though, the market has to continually assess what Bitcoin's transaction capacity, and therefore value, is. I'm not sure how that would play out.

My question is can a majority of the community (say 70-80%) be convinced to choose one of the last two options?
53  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 17, 2014, 05:55:38 PM
I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).

It is offered as an example of the sort of thing that can work, rather than a finished product.

This is the problem.

People don't seem to realize Gavin's proposal may be the best we can do. I happen to think it is. If anyone had a better idea we'd have heard it by now. We, the entire community, have brooded on this issue for months if not years now. Here is a spoiler alert: nobody can predict the future.

Did anyone ever stop to think Bitcoin couldn't work? I mean, I have; not for reasons technological, but for reasons of solving issues via consensus. Have you ever watched a three-legged race, where one person's leg gets tied to the leg of another? The reason they're funny is because it's hard to coordinate two separate thinking entities with different ideas on how to move forward, the result being slow or no progress and falling over. That may be our fate, and progress gets harder the more legs get tied in. That's the reason for taking action sooner rather than later.

I've posted it before, but I'll say it again. I think a big reason Satoshi left is because he took Bitcoin as far as he could. With Gavin and other devs coming on board he saw there was enough technical expertise to keep Bitcoin moving forward. I don't think he thought he had any more ironclad valuable ideas to give Bitcoin. Its fate would be up to the community/world he released it into. Bitcoin is an experiment. People don't seem to want to accept that, but it is. What I'd love to see is somebody against Gavin's proposal offer an actual debatable alternative. Don't just say, "sorry, it has to be 1MB blocks, and as for what else, well, that's not our problem to think through"; and don't just say, "no, we don't want Gavin's proposal because it doesn't matter-of-factly predict the future, and as for what else, well, we don't know."

Come up with something else or realize we need to take a possibly imperfect route, but one which could certainly work, so that we take some route at all.
54  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 17, 2014, 01:14:14 AM
Sure, we were also able to get x.25 and x.75 telecom to run over barbed wire, in the lab.  (There are places in the world that still use these protocols, some of which would deeply benefit from bitcoin in their area.)
The logistical challenges of implementation is not what you find in the lab.  
This stuff has to go out in environments where someone backs up their truck into a cross country line so they can cut it and drive off with a few miles of copper to sell as scrap.  We live in the world, not in the lab.

We're in luck then, because one advantage of fiber lines over copper is that they're no good for anything other than telecom :)

I'm no telecommunications specialist, but do have an electronics engineering background. Raise some issue with fundamental wave transmission and maybe I can weigh in. My understanding is it's easier to install fiber lines, for example, because there is no concern over electromagnetic interference. Indeed, the fiber lines I witnessed being installed a week ago were being strung right from power poles.

However, is such theoretical discussion even necessary? We have people being offered 2Gbps bandwidth over fiber not in theory but in practice in Japan, today.

That's already orders of magnitude over our starting bandwidth numbers. I agree with Gavin that demand for more bandwidth is inevitable. It's obvious all networks are converging: telephone, television, radio, internet. We'll eventually send all our data over the internet, as we largely do now, with ever increasing bandwidth usage. To imagine progress in technology will somehow stop for no apparent reason, when history is chock full of people underestimating the technological capacity we actually end up with, is not only shortsighted, it borders on unbelievable.
55  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 16, 2014, 06:03:53 PM
While (I think we'd all agree that) predicting technology decades ahead is hard, it is not impossible that a group of specialists, after a thorough discussion, could get the prediction about right.

I linked you to the report of Bell Labs achieving 10Gbps over copper wire. Here is the link to their 2009 record of 100 petabit per second·kilometers over fiber (15.5 Tbit/s sustained over 7,000 km):

http://www.alcatel-lucent.com/press/2009/001797

Quote
This transmission experiment involved sending the equivalent of 400 DVDs per second over 7,000 kilometers, roughly the distance between Paris and Chicago.

These are demonstrated capacities for these two media. The only limiting factors in delivering such rates to individual consumers are the physical and economic considerations of building out the infrastructure. Nonetheless the technologies for achieving exponential increases in bandwidth over current offerings are proven. Achieving these rates in practice on a scale coinciding with historical exponential growth of 50% annually, which does take economic and physical realities into consideration, seems well within reason. I'm sure telecommunications specialists would agree.
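For what it's worth, here's a quick back-of-the-envelope check of the quoted figures in Python (the 4.7 GB single-layer DVD size is my assumption; the press release doesn't state it):

```python
# Sanity check: 400 DVDs per second over 7,000 km of fiber (Bell Labs, 2009).
dvds_per_sec = 400
dvd_bytes = 4.7e9        # assumed single-layer DVD capacity
distance_km = 7_000

bits_per_sec = dvds_per_sec * dvd_bytes * 8
tbit_per_sec = bits_per_sec / 1e12                   # line rate in Tbit/s
pbit_km_per_sec = bits_per_sec * distance_km / 1e15  # bandwidth-distance product

print(f"{tbit_per_sec:.1f} Tbit/s x {distance_km} km = {pbit_km_per_sec:.0f} Pbit*km/s")
```

The numbers line up: roughly 15 Tbit/s of raw throughput, and about 100 petabit·kilometers per second for the bandwidth-distance product the record refers to.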
56  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 15, 2014, 10:03:51 PM
so if the bandwidth growth happens to stop in 10 years

Why would it? Why on earth would it???

Look, Jakob Nielsen reports his bandwidth in 2014 is 120Mbps, which is around the 90Mbps figure Gavin mentions for his own calculations. Let's use 100Mbps as a "good" bandwidth starting point which yields:

1: 100
2: 150
3: 225
4: 338
5: 506
6: 759
7: 1139
8: 1709
9: 2563
10: 3844
11: 5767
12: 8650
13: 12975
14: 19462
15: 29193
16: 43789
17: 65684
18: 98526
19: 147789
20: 221684
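If anyone wants to check or extend the table, it's a couple of lines of Python (the 100 Mbps starting point and 50% annual growth are just the assumptions above):

```python
# Bandwidth projection: 100 Mbps baseline growing 50% per year for 20 years.
start_mbps = 100
for year in range(1, 21):
    print(f"{year}: {round(start_mbps * 1.5 ** (year - 1))}")
```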

Researchers at Bell Labs just set a record for data transmission over copper lines of 10Gbps. So we can use that as a bound for currently existing infrastructure in the U.S. We wouldn't exceed that until year 13 above, and that's copper.

Did you not read my earlier post on society's bandwidth bottleneck, the last mile? I talk about society moving to fiber to the premises (FTTP) to upgrade bandwidth. Countries like Japan and South Korea have already installed this at over 60% penetration. The U.S. is at 7.7%, and I personally saw fiber lines being installed to a city block a week ago. Researchers at Bell Labs have achieved a 100 petabit per second·kilometer record (15.5 Tbit/s over 7,000 km) over fiber-optic lines. Do you realize how much peta is? 1 petabit = 10^15 bits = 1 000 000 000 000 000 bits = 1000 terabits

That's a real world bound for fiber, and that's what we're working toward. Your fears appear completely unsubstantiated. On what possible basis, given what I've just illustrated, would you expect bandwidth to stop growing, even exponentially from now, after only 10 years?!?
57  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 15, 2014, 08:14:08 PM
I'm not making predictions on constants that we don't know; but when speaking about exponential growth it is not even necessary. Want to know how fast the exponent grows? Take your 50% growth, and just out of curiosity see for which n your (1.5)^n exceeds the number of atoms in the universe. Gives some idea.

But the proposal isn't to exceed the number of atoms in the universe. It's to increase the block size for 20 years and then stop. If we do that, starting with a 20MB block at 50% per year, we arrive at 44,337 MB after 20 years. That's substantially under the number of atoms in the universe.
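For perspective, here's how far away that actually is, taking the common ~10^80 estimate of atoms in the observable universe (my figure, not one from this thread):

```python
import math

# Smallest n with 1.5^n > 1e80, i.e. how many 50%-growth steps it takes
# to exceed roughly the number of atoms in the observable universe.
n = math.ceil(80 * math.log(10) / math.log(1.5))
print(n)  # 455 steps; the proposal stops after 20
```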

The point being, with an exponent it's too easy to overshoot.

How so? You can know exactly what value each year yields. It sounds like you're faulting exponents for exponents sake. Instead, give the reason you feel the resulting values are inappropriate. Here they are:

1: 20
2: 30
3: 45
4: 68
5: 101
6: 152
7: 228
8: 342
9: 513
10: 769
11: 1153
12: 1730
13: 2595
14: 3892
15: 5839
16: 8758
17: 13137
18: 19705
19: 29558
20: 44337
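The 44,337 endpoint checks out in one line (20 MB starting cap, compounded at 50% over 19 annual steps):

```python
# Year-20 block size cap: 20 MB grown 50% per year over 19 annual steps.
cap_year_20 = 20 * 1.5 ** 19
print(round(cap_year_20))  # 44337 (MB)
```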
58  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 15, 2014, 05:56:05 PM
Because exponential growth is unsustainable

Not inherently. It depends on the rate of growth and what is growing. For example, a 1% per year addition to bandwidth is exceedingly conservative, based on historical evidence.

it is bound to cap at some point in the near future.

Define 'near future'. Is that 5 years, 10 years, 40? And what makes you say that? It's easy to make a general unsupported statement. Don't be intellectually lazy. Show the basis for your reasoning, please.
59  Bitcoin / Development & Technical Discussion / Re: A Scalability Roadmap on: October 15, 2014, 04:09:23 AM
I sympathize with their plight, but Bitcoin is not made for these first.  Bitcoin is for everyone.  There are parts of the planet (some of which have the greatest need for Bitcoin) that have very limited bandwidth today and can be expected to not see rapid improvement.

You know I'm starting to think it doesn't matter. We win either way.

In the worst case, say we overshoot and Bitcoin becomes completely centralized by powerful miners which then emulate the current SWIFT system, blocking and regulating transactions. What would happen next? Would we curse and shout CRAP! We were this close. If only we'd ratcheted down our numbers a tiny bit. Well everyone go home. Nothing more to see here.

LOL of course not. We'd move to the next alt-coin not co-opted and continue on, having learned from our mistakes. In a post I wrote long ago, which seems to have come true, I talked about how alt-coins give our community the one thing Bitcoin alone never could: an alternative.

The people who still say there can be only one will always be wrong. Alt-coins are not going anywhere. Most will have low market caps or blow up and deservedly die horrible deaths, but Bitcoin won't ever be all by itself. Won't happen. And if the free market demands a coin with fixed or less-than-bitcoin block size limit then that's what it will get, and value and usage will flow there.

The converse is also true. Say we are unable to gain consensus for raising the size limit, causing a collapse in price as people perceive Bitcoin as unable to serve the base they thought it would; or we proceed with a messy hard fork, creating a rift in the community and a price crash as people become confused about the future of Bitcoin and what to do next. Cryptocurrency would still go on, eventually, because that cat is out of the bag and people will continue working on it. Of course, I'd rather see the first scenario (a need to adopt an alt-coin) than the second, as I'm less certain about recovering well from the second, since cryptocurrency ultimately has no backing other than overall confidence in its viability.

Either way I see Bitcoin as providing the world with education. It's teaching the world the possibilities of decentralization with currency and that's where the real value is, because Bitcoin isn't the only thing which can work in that model.
60  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 14, 2014, 08:24:35 PM
As I think it through, 50% per year may not be aggressive.

Drilling down into the problem we find the last mile is the bottleneck in bandwidth:

http://en.wikipedia.org/wiki/Last_mile

That page is a great read/refresher for this subject, but basically:

Quote
The last mile is typically the speed bottleneck in communication networks; its bandwidth limits the bandwidth of data that can be delivered to the customer. This is because retail telecommunication networks have the topology of "trees", with relatively few high capacity "trunk" communication channels branching out to feed many final mile "leaves". The final mile links, as the most numerous and thus most expensive part of the system, are the most difficult to upgrade to new technology. For example, telephone trunklines that carry phone calls between switching centers are made of modern optical fiber, but the last mile twisted pair telephone wiring that provides service to customer premises has not changed much in 100 years.

I expect Gavin's great link to Nielsen's Law of Internet Bandwidth is only referencing copper wire lines. Nielsen's experience, which has been updated through this year (and continues to be in line with his law), tops out at 120 Mbps in 2014. Innovation allowing increases over copper lines is likely near its end, although DSL was still the dominant broadband access technology globally according to a 2012 study.

The next step is fiber to the premises. A refresher on fiber-optics communication:

Quote
Fiber-optic communication is a method of transmitting information from one place to another by sending pulses of light through an optical fiber. The light forms an electromagnetic carrier wave that is modulated to carry information. First developed in the 1970s, fiber-optic communication systems have revolutionized the telecommunications industry and have played a major role in the advent of the Information Age. Because of its advantages over electrical transmission, optical fibers have largely replaced copper wire communications in core networks in the developed world. Optical fiber is used by many telecommunications companies to transmit telephone signals, Internet communication, and cable television signals. Researchers at Bell Labs have reached internet speeds of over 100 petabits per second using fiber-optic communication.

The U.S. has one of the highest ratios of Internet users to population, but is far from leading the world in bandwidth. Being first in technology isn't always advantageous (see iPhone X vs iPhone 1). Japan leads FTTP with 68.5 percent penetration of fiber-optic links, with South Korea next at 62.8 percent. The U.S. by comparison is 14th place with 7.7 percent. Similar to users leapfrogging to mobile phones for technology driven services in parts of Africa, I expect many places to go directly to fiber as Internet usage increases globally.

Interestingly, fiber is a future-proof technology in contrast to copper, because once laid, future bandwidth increases can come from upgrading the end-point optics and electronics without changing the fiber infrastructure.

So while it may be expensive to initially deploy fiber, once it's there I foresee deviation from Nielsen's Law to the upside. Indeed, in 2012 Wilson Utilities, located in Wilson, North Carolina, rolled out FTTH (fiber to the home) with speed offerings of 20/40/60/100 megabits per second. In late 2013 they achieved 1 gigabit fiber to the home.