jubalix
Legendary
Offline
Activity: 2660
Merit: 1023
|
|
September 26, 2017, 08:07:05 AM |
|
speed of technological development
We do in terms of _real_ technological development, but not fake marketing crap... and not at the expense of stability and security. Compare Bitcoin moving to innovative second-generation error-protected addresses with BIP173 vs. Ethereum, still behind Bitcoin 0.1 without a strong, safe-to-use address system. Or Ethereum validation being so slow that it's impractical to sync up a full node anymore, pushing almost all ETH users onto SPV-like security for their initial sync, while Bitcoin's sync tx/s speed is orders of magnitude faster and keeps getting faster.

Well, OK, fair enough, but your reply does not seem to answer why you would not 2x, except for a reference to "stability" without more. Even for the sake of showing commitment by precedent to some block size increase, which appears reasonable given the falling cost of hard-drive space and bandwidth. Not doing this offers less choice to the market, and also gives BCH arguments more credence. Also (while I don't really agree with the word "spam"), it would allow half the fees and twice the space, say at 8MB (or approximately 2MB plus SegWit, but I think you can see what I am getting at), and down that path it would become much more costly for spammers to spam the blockchain. Put another way, you allow a better economic signal by letting more actors signal at less cost, which in fact increases miner reward. To address the point of stability: does 2MB really threaten stability? If so, how?
|
|
|
|
TechPriest
Sr. Member
Offline
Activity: 377
Merit: 282
Finis coronat opus
|
|
September 26, 2017, 11:03:23 AM |
|
I was neutral on it until I talked to one of the Core devs about it. I don't understand all the technical explanation, but I'm now firmly against it. I hope Segwit2x fails.
It's brilliant: "I don't understand the principles of Bitcoin, but I believe someone" ... "and now I'm firmly against any other opinion." A true Bitcoin user.

With 2x, the worst-case size is ~8 MB. We don't know how such large blocks will affect the network. It could increase orphan rates; it could cause many nodes to drop off the network because they can't handle the additional load. We don't know if the network can handle such larger blocks. Just because your internet connection and your machine can handle the load does not mean that everyone around the world who is running a node can. They may be restricted by things like the Great Firewall of China, which may find larger blocks easier to spot and thus make Bitcoin easier to block entirely.

Very funny. I have an idea: let's decrease the block size to 300kB because someone can't handle a 1MB load.
|
In science we trust!
|
|
|
AtheistAKASaneBrain
|
|
September 26, 2017, 11:17:17 AM |
|
speed of technological development
We do in terms of _real_ technological development, but not fake marketing crap... and not at the expense of stability and security. Compare Bitcoin moving to innovative second-generation error-protected addresses with BIP173 vs. Ethereum, still behind Bitcoin 0.1 without a strong, safe-to-use address system. Or Ethereum validation being so slow that it's impractical to sync up a full node anymore, pushing almost all ETH users onto SPV-like security for their initial sync, while Bitcoin's sync tx/s speed is orders of magnitude faster and keeps getting faster.

Oh man, I remember when I tried to set up a full Ethereum node last year. It was an absolute mess. Apparently there is a part of the blockchain that got spammed into oblivion by some sort of attack, and now that part is pretty much impossible to download. My computer was going into overdrive trying to process it. It would have taken ages, so now they recommend that you skip that part. So I gave up and used Parity instead, only to find out later on that it had some massive security bugs. Trying to use Ethereum as a store of value is indeed an act of insanity.
|
|
|
|
ChromaticStar
|
|
September 26, 2017, 01:45:43 PM |
|
I was neutral on it until I talked to one of the Core devs about it. I don't understand all the technical explanation, but I'm now firmly against it. I hope Segwit2x fails.
It's brilliant: "I don't understand the principles of Bitcoin, but I believe someone" ... "and now I'm firmly against any other opinion." A true Bitcoin user.

With 2x, the worst-case size is ~8 MB. We don't know how such large blocks will affect the network. It could increase orphan rates; it could cause many nodes to drop off the network because they can't handle the additional load. We don't know if the network can handle such larger blocks. Just because your internet connection and your machine can handle the load does not mean that everyone around the world who is running a node can. They may be restricted by things like the Great Firewall of China, which may find larger blocks easier to spot and thus make Bitcoin easier to block entirely.

Very funny. I have an idea: let's decrease the block size to 300kB because someone can't handle a 1MB load.

I oppose it mostly on philosophical grounds that I do understand. I suppose you understand all the technical aspects. So tell us, please: why should we all believe 2x is going to be the best road forward? Make sure you address all the reasons why Core is opposed to it; they're the ones who really know what they're doing.
|
|
|
|
TechPriest
Sr. Member
Offline
Activity: 377
Merit: 282
Finis coronat opus
|
|
September 26, 2017, 02:47:41 PM |
|
I oppose it mostly on philisophical grounds that I do understand. I suppose you understand all the technical aspects. So tell us please, why should we all believe 2x is going to be the best road forword?
First point: I don't understand ALL the technical aspects. Second point: I don't think it is the best way, but it wouldn't do any real harm. Also, in the future (especially for Lightning) we will need bigger blocks. From another post of mine:

As I calculate it, the Lightning network needs big blocks. There are 7 billion souls on our planet. If everyone closes 2 Lightning channels per year (this number could be less or more; I took an average), we have 14 billion transactions. The average number of transactions per block now is about 2 thousand, and per year we have 51,840 blocks. 51,840 × 2,000 = 103,680,000 transactions. As you can see, it's not enough.

Make sure you address all the reasons why Core is opposed to it, they're the ones who really know what they're doing.
Sorry, but I can't agree with this. Who told you that they really know what they are doing? For example, Luke wrote on his Twitter that high fees for BTC are normal (https://np.reddit.com/r/Bitcoin/comments/6ereib/these_fees_are_unacceptable/dicq95v/). A Bitcoin user complains about an $85 transaction fee; Core developer Luke-jr responds: "Then use fiat." Do you think that is a good answer from a developer of Bitcoin, BITCOIN, which is trying to become a new financial system? I don't think so. Also, here is an interesting article whose author offers a very interesting opinion: http://hackingdistributed.com/2017/08/26/whos-your-crypto-buddy/ In a fiat system with a third party, you must believe and trust someone. Bitcoin is a network without trust (between peers), so you must verify and check any kind of information, and it doesn't matter who tells it to you: me, a Core dev, or someone else.
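The channel-close arithmetic in the post above can be checked in a few lines. Every input here is the poster's own assumption (world population, two closes per person per year, ~2,000 transactions per block, 51,840 blocks per year), not measured data:

```python
# Back-of-envelope check of the Lightning capacity argument above.
# All inputs are the poster's own assumptions, not measured data.

PEOPLE = 7_000_000_000        # world population (assumed)
CLOSES_PER_PERSON = 2         # channel closes per person per year (assumed average)
TX_PER_BLOCK = 2_000          # rough on-chain transactions per block in 2017
BLOCKS_PER_YEAR = 51_840      # the poster's figure (144 blocks/day * 360 days)

demand = PEOPLE * CLOSES_PER_PERSON        # on-chain closes needed per year
capacity = TX_PER_BLOCK * BLOCKS_PER_YEAR  # on-chain tx settled per year

print(f"needed:   {demand:,}")    # 14,000,000,000
print(f"capacity: {capacity:,}")  # 103,680,000
# capacity / 2 closes each = 51,840,000 -- the "~52 million users" in the thread
print(f"users served at 2 closes each: {capacity // CLOSES_PER_PERSON:,}")
```

The numbers reproduce the thread's figures exactly; whether two on-chain closes per person per year is a sensible average is precisely what is disputed below.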
|
In science we trust!
|
|
|
ChromaticStar
|
|
September 26, 2017, 03:35:37 PM |
|
I oppose it mostly on philisophical grounds that I do understand. I suppose you understand all the technical aspects. So tell us please, why should we all believe 2x is going to be the best road forword?
First point: I don't understand ALL the technical aspects. Second point: I don't think it is the best way, but it wouldn't do any real harm. Also, in the future (especially for Lightning) we will need bigger blocks. From another post of mine: As I calculate it, the Lightning network needs big blocks. There are 7 billion souls on our planet. If everyone closes 2 Lightning channels per year (this number could be less or more; I took an average), we have 14 billion transactions. The average number of transactions per block now is about 2 thousand, and per year we have 51,840 blocks. 51,840 × 2,000 = 103,680,000 transactions. As you can see, it's not enough.

Sorry, but I can't agree with this. Who told you that they really know what they are doing? For example, Luke wrote on his Twitter that high fees for BTC are normal (https://np.reddit.com/r/Bitcoin/comments/6ereib/these_fees_are_unacceptable/dicq95v/). A Bitcoin user complains about an $85 transaction fee; Core developer Luke-jr responds: "Then use fiat." Do you think that is a good answer from a developer of Bitcoin, BITCOIN, which is trying to become a new financial system? I don't think so. Also, here is an interesting article whose author offers a very interesting opinion: http://hackingdistributed.com/2017/08/26/whos-your-crypto-buddy/ In a fiat system with a third party, you must believe and trust someone. Bitcoin is a network without trust (between peers), so you must verify and check any kind of information, and it doesn't matter who tells it to you: me, a Core dev, or someone else.

I heard that remark a while ago. Luke may not be on the same level as many of the other Core devs. At the very least, that remark was made at a time when people were struggling to find cheap transactions due to scaling limitations. That being said, scaling solutions are being undertaken, and if the Lightning Network with SegWit isn't enough, then 2X can be considered. Several of the Core devs have made it very clear that there is much more to 2X than just a change in the block size. A change in the block size has much larger dynamic implications which cannot be predicted. Once 7 billion people are using BTC, trying to transact and close Lightning Network channels, 2X or 3X or nX can be considered, but why take chances on something that we're not ready for? Eric Lombrozo has made comments on 2X that you might want to consider here: https://bitcoinmagazine.com/articles/bitcoin-core-developer-eric-lombrozo-on-misunderstandings-in-block-size-debate-1455817458/
|
|
|
|
gmaxwell
Staff
Legendary
Offline
Activity: 4298
Merit: 8818
|
|
September 26, 2017, 05:18:17 PM Last edit: September 26, 2017, 05:29:07 PM by gmaxwell |
|
well ok, fair enough, but your reply does not seem to answer why you would not 2x, excepting as to a reference to "stability" without more.
Other people already had. I was clarifying the innovation point and nothing else.

"Even for the sake of showing commitment by precedent to some block size increase, which appears to be reasonable given the size of HD vs cost decrease and bandwidth cost decrease."

What "decrease"? On state-of-the-art hardware at both times, the initial sync of Bitcoin increased by 50% in the last 6 months. This has a material impact on people's willingness to run nodes, an uncompensated act which is essential for Bitcoin's survival as a secure decentralized system.

"Also (while I don't really agree with the word spam) it would allow 1/2 the fees"

Twice the space doesn't mean one half the fees; it means almost zero fees against the same demand. Half the fees would generally be achieved by a few percent more capacity.

"much more costly for spammers to spam the blockchain."

A spammer must pay the fee of each transaction they displace, for every block they displace it. Increases in block size in excess of demand just radically reduce their costs by resulting in very low fees. A larger block never raises the price a spammer must pay to displace a transaction.

"To address the point of stability, does 2MB really threaten stability? if so how."

The best prior research we had showed that 2 to 4MB was the largest arguably safe amount, even while ignoring safety margins, state attacks, initial synchronization, etc. 2X is _NOT_ 2MB; it is 4 to 8MB.
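The displacement argument above can be sketched as a toy fee auction: a block clears at the fee of the marginal included transaction, so a spammer must outbid every transaction it displaces, while capacity beyond demand drives the clearing fee (and hence the spammer's cost) toward zero. The demand curve below is entirely invented for illustration:

```python
# Toy fee market: real transactions bid fees from a fixed demand curve;
# a block of given capacity clears at the marginal (lowest included) fee.
# Illustrative only -- the bid distribution is made up for the sketch.

def clearing_fee(bids, capacity):
    """Fee of the marginal transaction that still fits in the block."""
    top = sorted(bids, reverse=True)[:capacity]
    return top[-1] if len(top) == capacity else 0  # spare room -> fee floor ~0

# 3,000 real transactions willing to pay 1..3000 satoshis/byte (invented)
bids = list(range(1, 3001))

for capacity in (2_000, 4_000):  # roughly "1x" vs "2x+" block space
    fee = clearing_fee(bids, capacity)
    # A spammer pays at least the clearing fee per slot, per block, for as
    # long as it wants real transactions displaced.
    print(f"capacity {capacity}: clearing fee {fee}, "
          f"cost to fill one block {fee * capacity}")
```

With capacity below demand the clearing fee is 1001 and filling a block costs over 2 million units; once capacity exceeds demand the clearing fee collapses to the floor and spam becomes nearly free, which is the point being made against "bigger blocks make spam costlier".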
|
|
|
|
TechPriest
Sr. Member
Offline
Activity: 377
Merit: 282
Finis coronat opus
|
|
September 26, 2017, 06:10:28 PM |
|
"Once 7 billion people are using BTC, trying to transact and close Lightning Network channels, 2X or 3X or nX can be considered, but why take chances on something that we're not ready for? Eric Lombrozo has made comments on 2X that you might want to consider here:"

Yeah, but 104 million transactions is enough for only 52 million users. Also, I didn't count other types of transactions, like transactions with large amounts of BTC (the Lightning Network is designed for small payments), multi-output transactions, and others. So the real number of users is much less than 52 million. Thanks, it was interesting to read. Yeah, I agree with the author: hard forks are a very dangerous path.
|
In science we trust!
|
|
|
aleksej996
Sr. Member
Offline
Activity: 490
Merit: 389
Do not trust the government
|
|
September 26, 2017, 07:38:42 PM |
|
Yeah, but 104 million transactions is enough for only 52 million users. Also, I didn't count other types of transactions, like transactions with large amounts of BTC (the Lightning Network is designed for small payments), multi-output transactions, and others. So the real number of users is much less than 52 million.

Are you so sure about that number that you will take on all the dangers of a hard fork with no replay protection, when it is years, maybe even decades, from potential necessity? I can very well imagine that everyone might use one big Lightning Network in the future and that there will be no need to ever close channels. There is also a cost to closing a channel, so there is an incentive, in not paying mining fees, for people never to close. And people might not value the benefits of closing a channel as much as you think, since the only benefit is that there is no risk of your funds being lost. It makes sense to assume that people will not open a channel with a friend, but with a big, well-known node that has open channels with many other users. That way, if that well-known node betrays the established trust and starts closing channels, everyone would know about it, since it would be a big deal and might even be in the news, and that could cost the node a great deal. Such a node won't be able to close all its channels at once either, since it will be literally impossible to put so many transactions into blocks; here a bigger block size is actually a problem. You see, it isn't all that simple. But I can tell you this: the benefits of S2X are none for many years to come, the costs and dangers are huge, there is no cost in postponing the fork, and there are certainly benefits to doing so. This is obvious to anyone who has thought about it for a reasonable amount of time, which tells you something about that NYC agreement, whatever you think that something is.
|
|
|
|
d5000
Legendary
Offline
Activity: 4130
Merit: 7749
Decentralization Maximalist
|
|
September 27, 2017, 04:43:29 AM |
|
I am a bit surprised to see such low activity in the BTC1 repository. We could now draw the conclusion "Segwit2x is dead," as WhalePanda already did in a blog post, and party because the (tremendously dangerous, at least compared to the kindergarten BCH fork) November fork won't happen. To da moon! But perhaps there is another reason for the low activity: maybe the client is "ready" and won't see major updates until the fork, so as not to over-complicate things? If someone has insights there, I'm interested. (I have advocated for Segwit+2MB proposals in the past, but now I'm pretty neutral regarding Segwit2x, sympathizing perhaps a bit more with the "no fork" scenario, because big blockers already have their play money, we have SegWit, and the hard fork date is, in my opinion, way too early. In short, I wouldn't cry if Segwit2x were dead.)
|
|
|
|
exstasie
Legendary
Offline
Activity: 1806
Merit: 1521
|
|
September 27, 2017, 08:12:45 AM |
|
"I am a bit surprised to see such low activity in the BTC1 repository."

Why? It's not as if anyone involved with BTC1 is a prolific Bitcoin developer. I have doubts that they are capable of even merging Core's updates. I think they expect(ed) that the NYA and the existence of BTC1 would be enough to force Core to merge the 8x consensus change. I'm glad to see they were wrong.

"But perhaps there is another reason for the low activity: maybe the client is 'ready' and won't see major updates until the fork, so as not to over-complicate things? If someone has insights there, I'm interested."

I think this will be the explanation provided. There will be no more major changes, just final testing of the consensus changes. And politicking. I won't lie: the No2x crowd and their obnoxious hats sort of piss me off, and the whole BIP148 thing really pissed me off, particularly Luke's role. But I've always been against hard forks and contentious forks in general.
|
|
|
|
Carlton Banks
Legendary
Offline
Activity: 3430
Merit: 3080
|
|
September 27, 2017, 11:11:47 AM |
|
"I am a bit surprised to see such low activity in the BTC1 repository. We could now draw the conclusion "Segwit2x is dead," as WhalePanda already did in a blog post, and party because the (tremendously dangerous, at least compared to the kindergarten BCH fork) November fork won't happen. To da moon! But perhaps there is another reason for the low activity: maybe the client is "ready" and won't see major updates until the fork, so as not to over-complicate things? If someone has insights there, I'm interested. (I have advocated for Segwit+2MB proposals in the past, but now I'm pretty neutral regarding Segwit2x, sympathizing perhaps a bit more with the "no fork" scenario, because big blockers already have their play money, we have SegWit, and the hard fork date is, in my opinion, way too early. In short, I wouldn't cry if Segwit2x were dead.)"

Well, gmaxwell noted that Bitcoin Cash has now implemented SegWit and SegWit native addresses (i.e., the BIP173 Bech32 format), hilariously calling them by alternative names, despite having heavily promoted supposedly superior tech to solve quadratic sighash DoS attacks and signature malleability. So what's the actual difference between S2x and Bitcoin Cash in technical specification? While not identical, increasingly less (possibly Bitcoin Cash has the technical edge right now with their BIP173 deployment). And in market terms, Bitcoin Cash has a network, software running that network, a price, and a track record. S2x apparently has a promise of hashrate. I wonder how S2x futures prices are doing?
|
Vires in numeris
|
|
|
TechPriest
Sr. Member
Offline
Activity: 377
Merit: 282
Finis coronat opus
|
|
September 27, 2017, 11:36:10 AM |
|
"Are you that sure about that number that you will have all the dangers of a hard fork with no replay protection when it is years, maybe even decades, from potential necessity?"

Yeah, this is a real problem. Here I agree with you.

"I can very well imagine that everyone might use one big Lightning Network in the future and that there will be no need to ever close channels. There is also a cost to closing a channel, so there is an incentive, in not paying mining fees, for people never to close. And people might not value the benefits of closing a channel as much as you think, since the only benefit is that there is no risk of your funds being lost."

The Lightning network was created mostly for small payments. For example, you open a channel with some shop, or with your friend. Of course, you are able to use other people's channels, but in that case you must pay a fee, as you know (and it is also a risky idea; hackers don't sleep). So one person will close 10 channels per year, and another none; I took an average number.

"It makes sense to assume that people will not open a channel with a friend, but with a big well-known node that has open channels with many other users."

Lol, this is some kind of existing bank system. Trusted nodes == banks.
|
In science we trust!
|
|
|
jubalix
Legendary
Offline
Activity: 2660
Merit: 1023
|
|
September 27, 2017, 11:44:41 AM Last edit: October 03, 2017, 02:21:36 AM by jubalix |
|
well ok, fair enough, but your reply does not seem to answer why you would not 2x, excepting as to a reference to "stability" without more.
Other people already had. I was clarifying the innovation point and nothing else.

Ok, thanks, I will have to go and research this some more.

"Even for the sake of showing commitment by precedent to some block size increase, which appears to be reasonable given the size of HD vs cost decrease and bandwidth cost decrease."

What "decrease"? On state-of-the-art hardware at both times, the initial sync of Bitcoin increased by 50% in the last 6 months. This has a material impact on people's willingness to run nodes, an uncompensated act which is essential for Bitcoin's survival as a secure decentralized system.

I address this more appropriately infra (my error for using 2X, by the way).

"Also (while I don't really agree with the word spam) it would allow 1/2 the fees"

Twice the space doesn't mean one half the fees; it means almost zero fees against the same demand. Half the fees would generally be achieved by a few percent more capacity.

If my simple maths holds: if block size 1 = x, fee = y, and the number of transactions to fill a block = T, then the cost to fill a block = yT. If your block = 8x and your fee = y/2, then it would cost you 8yT/2 = 4yT, and 4yT/yT = 4. So in this case, at half the fees, it would cost 4x as much to fill the block, and so on. It seems this model really costs the "spammers" more to get their market signal in, BUT I was wrong not to consider demand, as you rightly pointed out. I must accept your zero-fee argument insofar as demand does not drop off linearly with block size increase; it is probably more like some sort of auction curve for the last unit(s) of space, so perhaps a quasi multi-unit auction? Intuitively the curve seems perhaps sigmoidal in shape, S = 1/(1 + e^(-t))? Does this exclude some break point where you can get a better market signal at a lower final price point in the blocks? This may mean 1.1MB or some small percentage, using some sort of function, as you also seem to suggest (more on that infra).

"much more costly for spammers to spam the blockchain."

A spammer must pay the fee of each transaction they displace, for every block they displace it. Increases in block size in excess of demand just radically reduce their costs by resulting in very low fees. A larger block never raises the price a spammer must pay to displace a transaction.

Yes, I accept your point in relation to the demand side of things, but can we more closely match demand, as stated above?

"To address the point of stability, does 2MB really threaten stability? if so how."

The best prior research we had showed that 2 to 4MB was the largest arguably safe amount, even while ignoring safety margins, state attacks, initial synchronization, etc. 2X is _NOT_ 2MB; it is 4 to 8MB.

My error; I should have said 2MB, or as the case may be 1.1MB, or whatever. More on point: do you consider there would be an appropriate increase in the face of increasing demand? More specifically, is there perhaps a function, or family of functions, that could give an optimal or near-optimal block size given demand and fee-cost curves? You seem to suggest a few percent, which in light of the above seems right. It seems reasonable to say that demand has risen and there may be an appropriate block size increase at some point. Of course, the functions may show that the block size is too large already; I do not know the functions. However, it seems some size increase should be mooted in the face of a given level of demand. This may allow two advantages: [1] some certainty on scaling; [2] trying to get the best market signal. Thanks in advance.
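The fill-cost arithmetic and sigmoid shape in the post above can be written out directly. The fee and transaction-count values are illustrative placeholders, and the sigmoid is just the shape named in the post, with no fitted parameters:

```python
import math

# The post's fill-cost arithmetic: a 1x block at fee y costs y*T to fill;
# an 8x block at fee y/2 costs (y/2)*(8*T) = 4*y*T, i.e. 4x the spam cost.
# This holds only if the halved fee actually persists against the same
# demand, which is the point disputed in the reply above.
y, T = 100, 2_000              # illustrative fee per tx and tx per 1x block
base_cost = y * T              # cost to fill a 1x block
big_cost = (y / 2) * (8 * T)   # cost to fill an 8x block at half the fee
print(big_cost / base_cost)    # 4.0

# The sigmoid demand shape suggested in the post, S(t) = 1 / (1 + e^(-t)):
def S(t):
    return 1 / (1 + math.exp(-t))

print(S(0))  # 0.5, the inflection point of the curve
```

The 4x figure confirms the post's algebra is internally consistent; the disagreement in the thread is over the assumption that fees stay at y/2 rather than collapsing toward zero when capacity exceeds demand.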
|
|
|
|
aleksej996
Sr. Member
Offline
Activity: 490
Merit: 389
Do not trust the government
|
|
September 27, 2017, 07:25:08 PM |
|
The Lightning network was created mostly for small payments. For example, you open a channel with some shop, or with your friend. Of course, you are able to use other people's channels, but in that case you must pay a fee, as you know (and it is also a risky idea; hackers don't sleep). So one person will close 10 channels per year, and another none; I took an average number.

I don't know what it was created for, and I would say it doesn't really matter. All I am saying is that it is completely possible to use it this way, and there are costs if you don't. But we will see; no one can know an average number when the thing doesn't even exist yet.

"Lol, this is some kind of existing bank system. Trusted nodes == banks."

There is a reason I called them well-known nodes and not trusted nodes: trust isn't necessary in a Lightning Network; that is the whole point of it. If trust were needed, we would just use banks instead.
|
|
|
|
nicosey
|
|
September 28, 2017, 07:28:09 AM |
|
Who wants 2x? Seems like the compromise that no one wants.
|
|
|
|
NeuroticFish
Legendary
Offline
Activity: 3878
Merit: 6623
Looking for campaign manager? Contact icopress!
|
|
September 28, 2017, 07:39:11 AM |
|
Who wants 2x? Seems like the compromise that no one wants.
While most users talking here seem to be against SegWit2x, especially because for now it doesn't bring anything new and helpful, the miners are all for it. If you look at coin.dance or xbt.eu websites, SegWit2x (or NYA) has well over 90% support.
|
|
|
|
pebwindkraft
|
|
September 28, 2017, 08:38:13 AM |
|
While most users talking here seem to be against SegWit2x, especially because for now it doesn't bring anything new and helpful, the miners are all for it. If you look at coin.dance or xbt.eu websites, SegWit2x (or NYA) has well over 90% support.
This is correct, and I think the way to look at it is this: who can benefit from the Bitcoin system? The community, aka the vast majority (e.g., those running Core nodes), definitely did not express a wish to use SegWit2x. The miners would like to have SegWit2x. By this they are willing to take the vast majority of users "hostage" to follow "their" will. This already prevented the adoption of SegWit over the last 2 years, and with it the further development of Bitcoin. Whether this was intentional or accidental, I don't want to judge. The work of the miners is to stabilize the network, and to earn the incentive for doing so. As mining is more and more centralized, this is a super-vast minority, which from my point of view should not be in a position to decide what the super-super-vast majority wants. I see that many companies also signed the NYA. So what? This still doesn't make a majority. There are more developers listed on the bitcoin.org webpage (aka "Core developers") than people/groups/companies in the NYA. I understand the community does not accept a minority giving direction against the way the Core developers are taking. I can also see that the majority of SegWit2x people tend to think that the Core devs are two or three names (Luke-Jr, Greg Maxwell, and Adam Back). This is not the case, as I just showed with a look at bitcoin.org (there are hundreds of names). It looks like the SegWit2x group does not (want to) understand that the concept of Bitcoin is a non-centralized system, where a single group or person is prevented from changing its direction. There is no "king" or "government" in Bitcoin. In the current situation, the miners and the NYA teams play the role of a government, and this contradicts the main principles of Bitcoin.
|
|
|
|
d5000
Legendary
Offline
Activity: 4130
Merit: 7749
Decentralization Maximalist
|
|
September 28, 2017, 10:30:28 AM |
|
Who wants 2x? Seems like the compromise that no one wants.
Well, there is a possible advantage I can see: the blockchain capacity achieved with Segwit2x could be the ideal "final" size necessary for Bitcoin to become "massively used as a currency". Obviously, the whole concept would include second and third layers like the Lightning Network and sidechains, but a blockchain with 2-4MB SegWit blocks alone may be a little bit too small to serve as a settlement layer for all these second-layer solutions. If that assumption is true (it may not be!), then it would not be a bad idea to do the hard fork at the current stage of Bitcoin's evolution, when the network is still young, there is not too much at stake (at least, no critical services like healthcare depend on it), and it would not be the end of the world if something went wrong. At a later stage, such a hard fork may be more difficult and painful. However, if this argument is used, research would have to be done on whether 2MB+SegWit (realistically 4-7MB blocks, 8MB max.) is really the ideal size for a "final stage Bitcoin main chain". This research has not happened, as far as I know.
|
|
|
|
Carlton Banks
Legendary
Offline
Activity: 3430
Merit: 3080
|
|
September 28, 2017, 02:02:25 PM |
|
ideal "final" size So, Bitcoin reached it's "final" ideal userbase numbers, when was it, yesterday? Forget it d5000, you're still talking technical issues (which are not in S2x's favour) when the fork is still a political move dressed in technical clothing. It always was, and this is the 4th such attempt to make this false assertion about blocksize. And how many times exactly have you personally been involved in blocksize conversations while ignoring the political elephant in the room? It's very tedious, and does your credibility in these matters no good. There's just no credible empirical evidence that the blocksize needs increasing now, even to the current 4MB maximum, and you do nothing but drone on about future ideal maximums being desirable next month. You do nothing but talk constantly about positive aspects of blocksize increases from as many different perspectives as you are able to summon up. C'mon.
|
Vires in numeris
|
|
|
|