Bitcoin Forum

Bitcoin => Development & Technical Discussion => Topic started by: jubalix on July 17, 2018, 05:54:11 AM



Title: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: jubalix on July 17, 2018, 05:54:11 AM
Why can't Core scale the block size as some S-curve f(x), so that you get a percentage increase that tracks supply and demand?

Why are they committed to only 1 MB (or ~4 MB with SegWit)?
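(For illustration, a logistic "S-curve" limit schedule might look like the sketch below; the 8 MB ceiling, midpoint year, and steepness are purely hypothetical numbers chosen to show the shape, not an actual proposal.)

Code:
import math

def s_curve_limit_mb(year: float,
                     floor_mb: float = 1.0,
                     ceiling_mb: float = 8.0,
                     midpoint_year: float = 2022.0,
                     steepness: float = 0.8) -> float:
    """Hypothetical logistic (S-curve) block size schedule: starts near
    floor_mb and saturates at ceiling_mb instead of growing forever."""
    return floor_mb + (ceiling_mb - floor_mb) / (
        1.0 + math.exp(-steepness * (year - midpoint_year)))

for y in range(2018, 2031, 2):
    print(y, round(s_curve_limit_mb(y), 2))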



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: bob123 on July 17, 2018, 10:43:59 AM
Why can't Core scale the block size as some S-curve f(x), so that you get a percentage increase that tracks supply and demand?

Why are they committed to only 1 MB (or ~4 MB with SegWit)?

I am assuming you are talking about the block size?

Well, the most critical point is probably that increasing the block size is only a temporary form of scaling.
Doubling the block size brings a small benefit (lower fees) for a short period of time (until the transaction volume doubles as well).
But the price of this 'scaling' is that you sacrifice decentralisation. If the block size increased heavily, smaller nodes would no longer be able to verify a block before the next one is mined.
This would lead to a (heavy form of) centralisation where only big data centres could afford to run full nodes. Average users would then have to trust public nodes, without the possibility of fast and efficient validation themselves.
Additionally, it is hard to perform proper testing on bigger blocks (e.g. 8 MB or 16 MB).
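As a rough back-of-the-envelope illustration of the verification argument (the bandwidth and per-transaction figures below are assumptions for illustration, not measurements):

Code:
# Toy estimate of how long a modest node needs to receive and verify
# one block. All constants are assumed values for illustration only.
AVG_TX_SIZE_BYTES = 500
VERIFY_MS_PER_TX = 2.0        # assumed CPU cost per tx on weak hardware
DOWNLOAD_MBIT_PER_S = 5.0     # assumed bandwidth of a small home node

def block_handling_seconds(block_mb: float) -> float:
    download = (block_mb * 8) / DOWNLOAD_MBIT_PER_S
    n_tx = (block_mb * 1_000_000) / AVG_TX_SIZE_BYTES
    verify = n_tx * VERIFY_MS_PER_TX / 1000
    return download + verify

for size_mb in (1, 4, 16, 64):
    print(f"{size_mb:>2} MB block: ~{block_handling_seconds(size_mb):.0f} s")

With these (assumed) numbers, a 64 MB block already eats up a large fraction of the 600-second block interval on a weak node, which is the effect described above.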

I recall there was a shitcoin fork which increased the block size without any proper testing of how the hardware/software/network reacts, and with blocks only being filled to 0-10%.
No one knows whether that network will still be fully functional when blocks start getting full, or which new attack vectors are created by this.


SegWit effectively already doubled (it's at about 2.3x at the moment, I think) the number of transactions that fit into a block.

The next steps toward scaling in terms of transactions per block are
(1) the Lightning Network (which will allow an 'infinite' number of transactions without paying on-chain fees, as long as your channel is open and funded) and
(2) Schnorr signatures, which heavily reduce the size of transactions using multiple inputs (which most transactions are). They allow you to combine multiple signatures into one (thereby saving space in the blockchain).
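A rough size-accounting sketch of why aggregation saves space (the byte counts are approximations, and the real savings depend on the final design):

Code:
# Bytes spent on signatures in a transaction with n inputs.
# Today each input carries its own ~72-byte ECDSA signature; with
# cross-input signature aggregation a transaction could carry a
# single ~64-byte signature. Figures are rough illustrations.
ECDSA_SIG_BYTES = 72
AGGREGATED_SIG_BYTES = 64

def sig_bytes_saved(n_inputs: int) -> int:
    return n_inputs * ECDSA_SIG_BYTES - AGGREGATED_SIG_BYTES

for n in (1, 2, 5, 20):
    print(f"{n:>2} inputs: ~{sig_bytes_saved(n)} signature bytes saved")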



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: ABCbits on July 17, 2018, 03:46:20 PM
As usual, bob123 has already explained most things

Why are they committed to only 1 MB (or ~4 MB with SegWit)?

FYI, Bitcoin now uses a block weight limit of 4,000,000 weight units. So the maximum block size we can see ranges from 1 MB to 4 MB. But 4 MB only happens in very specific cases, and usually the biggest blocks are about 2 MB.
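For reference, a minimal sketch of the BIP 141 weight rule (weight = 3 × base size + total size, capped at 4,000,000), which is where the 1 MB to 4 MB range comes from:

Code:
MAX_BLOCK_WEIGHT = 4_000_000  # consensus limit since SegWit (BIP 141)

def block_weight(base_size: int, total_size: int) -> int:
    """base_size: block bytes without witness data;
    total_size: block bytes including witness data."""
    return 3 * base_size + total_size

# No witness data at all: weight = 4 * size, so at most 1 MB.
print(block_weight(1_000_000, 1_000_000))  # 4000000, exactly at the limit
# A contrived, almost-all-witness block can approach 4 MB on disk.
print(block_weight(100_000, 3_700_000))    # 4000000, ~3.8 MB total size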

Why can't Core scale the block size as some S-curve f(x), so that you get a percentage increase that tracks supply and demand?

Why are they committed to only 1 MB (or ~4 MB with SegWit)?

I am assuming you are talking about the block size?

Well, the most critical point is probably that increasing the block size is only a temporary form of scaling.
Doubling the block size brings a small benefit (lower fees) for a short period of time (until the transaction volume doubles as well).
But the price of this 'scaling' is that you sacrifice decentralisation. If the block size increased heavily, smaller nodes would no longer be able to verify a block before the next one is mined.
This would lead to a (heavy form of) centralisation where only big data centres could afford to run full nodes. Average users would then have to trust public nodes, without the possibility of fast and efficient validation themselves.
Additionally, it is hard to perform proper testing on bigger blocks (e.g. 8 MB or 16 MB).

I agree, but there is the possibility of increasing the maximum block weight while making sure the majority of people can still run full nodes without mid/high-end hardware. Most likely that won't happen, though, because of the difficulty of deciding where the line for mid/high-end hardware lies, and because of backward compatibility.

The next steps toward scaling in terms of transactions per block are
(1) the Lightning Network (which will allow an 'infinite' number of transactions without paying on-chain fees, as long as your channel is open and funded) and
(2) Schnorr signatures, which heavily reduce the size of transactions using multiple inputs (which most transactions are). They allow you to combine multiple signatures into one (thereby saving space in the blockchain).

Additionally, there are also:
1. MAST, which compresses scripts/simple smart contracts while improving user privacy in some cases
2. BLS signatures, an alternative to Schnorr signatures. No idea whether Bitcoin will use BLS, though

More info:
https://bitcointechtalk.com/what-is-a-bitcoin-merklized-abstract-syntax-tree-mast-33fdf2da5e2f (https://bitcointechtalk.com/what-is-a-bitcoin-merklized-abstract-syntax-tree-mast-33fdf2da5e2f)
https://medium.com/@snigirev.stepan/bls-signatures-better-than-schnorr-5a7fe30ea716 (https://medium.com/@snigirev.stepan/bls-signatures-better-than-schnorr-5a7fe30ea716)


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 17, 2018, 08:20:52 PM
The above post by @bob123 is almost Core's scalability manifesto. I don't agree with even one paragraph of it, not a single idea, but here I'll just answer his claims about the block size debate.

Being concerned about the centralization consequences of a block size increase is rather a pretext for Core.
They have not shown up with any solution for the situation with pools and ASICs, and now they want us to be convinced of their commitment to the cause; I do not feel good about it.  :-\  So, let's take a closer look at their excuse for prohibiting a block size increase, namely keeping the block verification process light:

We have two distinct classes of full nodes in the bitcoin ecosystem: mining nodes and non-mining ones. Right?

For non-mining full nodes, there is no rush to validate blocks. They have LOTS of time to spend and no opportunity cost at stake; double it, triple it, make it 10 times, 100 times bigger and nothing bad happens in terms of opportunity costs.
The problem reduces to just 'scale' times more hard disk space. Not a big deal. I mean, are you kidding? Storage costs? Ten years after Satoshi and we are worried about 1-2 fucking terabytes of storage?

For mining full nodes it is different: there are opportunity costs involved, plus the infamous proximity premium flaw.

But it doesn't justify this conservatism about block size, as we will see:

A small mining farm (not a home miner) with 1 petahash of installed power (a 0.000025 share of the network) has a chance to mine a block every 6 months, with a standard deviation of around 0.95 (look here (https://bitcointalk.org/index.php?topic=4687032.0#new) for the calculation technique). Hence it is practically a requirement for such a miner to join a pool to keep his site running as a normal business, paying bills, ... unless he is essentially a gambler instead of a businessman.

See? Although our miner already has almost 80 S9s installed, plus an infrastructure for power supply, air conditioning, monitoring, ... a minimum investment of $200K, he has to forget about solo mining and stick with AntPool, btc.com, ... Right? He doesn't run a full node; he typically has no idea what a full node is!

Actually, our miner would need to install at least 40 times more hash power (40 PH/s), giving him at least a 0.001 share and 1 block every 3-4 days (still with a rough 0.138 variance, and praying to god while sacrificing a goat or something daily, perhaps), before choosing solo mining, and consequently running a full node.

How much is the investment now? 7-8 million dollars?
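(For anyone who wants to check the variance argument: block finds over a fixed interval are approximately Poisson. The sketch below uses the assumed 0.000025 network share from above, so the exact figures are illustrative only.)

Code:
import math

BLOCKS_PER_DAY = 144

def solo_mining_stats(share: float, days: float):
    """Expected blocks, spread, and probability of earning nothing
    for a miner with `share` of the network hashrate, modelling
    block finds as a Poisson process."""
    expected = share * BLOCKS_PER_DAY * days
    std_dev = math.sqrt(expected)   # Poisson: variance equals the mean
    p_zero = math.exp(-expected)    # chance of mining no block at all
    return expected, std_dev, p_zero

for days in (30, 180):
    mean, sd, p0 = solo_mining_stats(0.000025, days)
    print(f"{days:>3} days: expect {mean:.2f} blocks "
          f"(sd {sd:.2f}), P(no block) = {p0:.0%}")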

And Core is protecting him from what? Spending a few hundred more on a full node capable of validating blocks in a reasonable time?

Isn't that kind of hypocritical? Remaining silent about pools, ASICs, ... while pretending to be the #1 savior of decentralization?

Which decentralization? Where is decentralization? Show me one decentralized 'thing' in this ecosystem!

Who was in charge of keeping bitcoin decentralized and what's the outcome?



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: bob123 on July 17, 2018, 10:51:58 PM
The above post by @bob123 is almost Core's scalability manifesto. I don't agree with even one paragraph of it, not a single idea, but here I'll just answer his claims about the block size debate.

A big wall of text from you, but almost zero serious content. I am not going to comment on your view of the situation.
But instead of just telling wild stories, what about coming up with an actual PRO argument (for a block size / block weight increase)?
Or an actual argument AGAINST second-layer scaling solutions?

All you said was 'no, that's not a con argument, because it doesn't matter for most people'.

If you want a serious discussion, bring up some serious arguments in favor of on-chain scaling (which, by the way, do not exist).



Who was in charge of keeping bitcoin decentralized and what's the outcome?

Actually, every single user.



Additionally, there are also:
1. MAST, which compresses scripts/simple smart contracts while improving user privacy in some cases
2. BLS signatures, an alternative to Schnorr signatures. No idea whether Bitcoin will use BLS, though

I have already read about BLS, but I am not sure whether it will really be adopted instead of Schnorr, especially given its (not yet well explored) security concerns.

I hadn't heard of MAST before. That's definitely an interesting feature.
Unbelievable how much potential this whole system has (and how much is still to be discovered).


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 17, 2018, 11:20:19 PM
The above post by @bob123 is almost Core's scalability manifesto. I don't agree with even one paragraph of it, not a single idea, but here I'll just answer his claims about the block size debate.

A big wall of text from you, but almost zero serious content. I am not going to comment on your view of the situation.
Seriously? I specifically refuted your (Core's) reasoning about the infamous insistence on the crazy 1 MB limit by showing its irrelevance to both non-mining and mining full nodes, in detail, using proven mathematical techniques. Is that 'zero', while your garbage about fake and hypocritical concerns regarding centralization counts as 'serious content'?

Oh, you guys are making me sick these days. What's happening here on bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

Are you guys blind or something? I have proved your pretext void; now what?

Do I have to prove that solo miners with tens of millions of dollars of (minimum) investment can afford to set up a state-of-the-art full node?

Or do I have to prove that non-mining nodes can afford 100 dollars for multi-terabyte storage, and that they can wait a few more seconds, even minutes, to verify a block without losing anything?







Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: bob123 on July 17, 2018, 11:45:24 PM
Seriously? I specifically refuted your (Core's) reasoning about the infamous insistence on the crazy 1 MB limit by showing its irrelevance to both non-mining and mining full nodes, in detail, using proven mathematical techniques.

1) You didn't refute anything.
2) It is not a 1 MB limit; it is a 4,000,000 weight limit.
3) The only 'mathematical techniques' you used were simple percentage calculations.



Are you guys blind or something? I have proved [..]

You didn't prove anything. All you did was express your opinion.



Do I have to prove that solo miners with tens of millions of dollars of (minimum) investment can afford to set up a state-of-the-art full node?

Or do I have to prove that non-mining nodes can afford 100 dollars for multi-terabyte storage, and that they can wait a few more seconds, even minutes, to verify a block without losing anything?

What about the most obvious one:

[..] what about coming up with an actual PRO argument (for a block size / block weight increase)?

Still not a single argument in favor of on-chain scaling.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 18, 2018, 02:12:05 AM
Seriously? I specifically refuted your (Core's) reasoning about the infamous insistence on the crazy 1 MB limit by showing its irrelevance to both non-mining and mining full nodes, in detail, using proven mathematical techniques.

1) You didn't refute anything.
I refuted your nonsense about:
Why can't Core scale the block size as some S-curve f(x), so that you get a percentage increase that tracks supply and demand?

Why are they committed to only 1 MB (or ~4 MB with SegWit)?

I am assuming you are talking about the block size?

Well, the most critical point is probably that increasing the block size is only a temporary form of scaling.
Doubling the block size brings a small benefit (lower fees) for a short period of time (until the transaction volume doubles as well).
But the price of this 'scaling' is that you sacrifice decentralisation. If the block size increased heavily, smaller nodes would no longer be able to verify a block before the next one is mined.
by asserting that
We have two distinct classes of full nodes in the bitcoin ecosystem: mining nodes and non-mining ones. Right?

For non-mining full nodes, there is no rush to validate blocks. They have LOTS of time to spend and no opportunity cost at stake; double it, triple it, make it 10 times, 100 times bigger and nothing bad happens in terms of opportunity costs.
...
See? Although our miner already has almost 80 S9s installed, plus an infrastructure for power supply, air conditioning, monitoring, ... a minimum investment of $200K, he has to forget about solo mining and stick with AntPool, btc.com, ... Right? He doesn't run a full node; he typically has no idea what a full node is!

Actually, our miner would need to install at least 40 times more hash power (40 PH/s), giving him at least a 0.001 share and 1 block every 3-4 days (still with a rough 0.138 variance, and praying to god while sacrificing a goat or something daily, perhaps), before choosing solo mining, and consequently running a full node.

How much is the investment now? 7-8 million dollars?

And Core is protecting him from what? Spending a few hundred more on a full node capable of validating blocks in a reasonable time?
Summary: non-mining nodes won't be hurt economically by a block size increase, and mining nodes are millionaires who wouldn't mind spending a few hundred dollars to upgrade.

In your strange language, what is that called other than refuting? heh ???

Quote
2) It is not a 1 MB limit; it is a 4,000,000 weight limit.

It has been a 1 MB limit and it IS still a 1 MB limit, go fool yourself. SegWit is another subject.

Quote
3) The only 'mathematical techniques' you used were simple percentage calculations.
Really? So who calculated the mining variance for the hypothetical miner, and how?
A small mining farm (not a home miner) with 1 petahash of installed power (a 0.000025 share of the network) has a chance to mine a block every 6 months, with a standard deviation of around 0.95 (look here (https://bitcointalk.org/index.php?topic=4687032.0#new) for the calculation technique). Hence it is practically a requirement for such a miner to join a pool to keep his site running as a normal business, paying bills, ... unless he is essentially a gambler instead of a businessman.
Surprise! It was me, and I used a more sophisticated technique than 'percentage calculations'.

[..] what about coming up with an actual PRO argument (for a block size / block weight increase)?

What about more transactions and lower fees? Does that need arguing? The onus is on you to prove that such an increase has bad consequences, not on me to prove it is good; of course it is good if there is no serious cost!

FYI, I'm not personally in favor of a block size increase; I'm thinking of reducing the block time, which has more pros, among them better variance and helping decentralization, which would be hard for someone like you to grasp.

Stay in your box and enjoy your popcorn watching the Core show.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on July 18, 2018, 06:44:52 AM
OP, because there is a "blockchain trilemma", a term from Vitalik Buterin, that needs to be solved or in some way "balanced". I believe his thoughts on that matter are sound.

In the blockchain trilemma, it is believed that only 2 of the 3 "traits" (decentralization, scalability, and security) can be achieved on a fundamental level. Adjusting one means affecting or giving up some of the traits of one of the other two.

I believe the Core developers stayed at a 1 MB block size to maintain "security" and node "decentralization", while the Lightning developers are solving "scalability" through an off-chain layer, thereby addressing the "blockchain trilemma".



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on July 19, 2018, 07:56:46 PM
Oh, you guys are making me sick these days. What's happening here on bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

It's not about loyalty to any developer, it's about recognising the fact that the users on this network have made their decision.  You and franky1 need to start some sort of special club for people who are wholly incapable of avoiding conflation between what developers do and what users and miners do.  Devs aren't in charge, they just make whatever they want to make.  It's the people who freely choose to run the code who you appear to have an issue with.  People had the choice.  This is what they chose.

If you want total emphasis towards on-chain scaling, there are blockchains out there where you will find like-minded users who share the same ideals.  But you need to understand this is not one of those blockchains.  Users and miners already made that call.  Whine about it all you like, this is how it is until consensus says otherwise.  Take it or leave it, those are the options.  We can always come back to the idea of blockweight adjustments in future if there's a necessity for it, but it's not happening now.


FYI, I'm not personally in favor of a block size increase; I'm thinking of reducing the block time, which has more pros, among them better variance and helping decentralization, which would be hard for someone like you to grasp.

I eagerly await your "Bitcoin" with a different block time, different algorithm, no pools and everyone magically agreeing with all your changes.   ::)

'Til then, cool story, bro.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 19, 2018, 08:52:36 PM
Oh, you guys are making me sick these days. What's happening here on bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

It's not about loyalty to any developer, it's about recognising the fact that the users on this network have made their decision.  You and franky1 need to start some sort of special club for people who are wholly incapable of avoiding conflation between what developers do and what users and miners do.  Devs aren't in charge, they just make whatever they want to make.  It's the people who freely choose to run the code who you appear to have an issue with.  People had the choice.  This is what they chose.
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it's the best we've got for now.

Bitcoin started as an experiment and transitioned to an operational state gradually, as people observed that their experimental coins came with costs and acquired a price.

We shouldn't fool ourselves yet; the centralization flaws and scaling problems are still there, waiting for us to fix them and improve.

A critical problem is the Core dilemma: they are good for the ecosystem because of their expertise, and bad because of their lack of vision and vulnerability to sectarianism.

Whenever I have to choose for someone because of his expertise or against him because of his dogmatism, my choice will eventually be against him.

I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing to offer but ultra-conservatism. Average programmers learn and grow; arrogant dogmatists just sink.

Bitcoin is not a joke. It can't stop evolving and cling to the past, and the community will pay the price whenever there is a vigorous, solid proposal (not just blindly increasing the block size, switching to Equihash, ...).

What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of an) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise there would be no idea at all.




Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on July 20, 2018, 05:43:32 AM
Oh, you guys are making me sick these days. What's happening here on bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

It's not about loyalty to any developer, it's about recognising the fact that the users on this network have made their decision.  You and franky1 need to start some sort of special club for people who are wholly incapable of avoiding conflation between what developers do and what users and miners do.  Devs aren't in charge, they just make whatever they want to make.  It's the people who freely choose to run the code who you appear to have an issue with.  People had the choice.  This is what they chose.
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it's the best we've got for now.

For now? What does that mean? I believe it means that you have no choice but to be part of the social consensus that "Bitcoin is BTC". Hahaha.

You cannot escape. 8)

Quote
I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing to offer but ultra-conservatism. Average programmers learn and grow; arrogant dogmatists just sink.

Like the developers of Ethereum ICOs and "blockchain companies"? Hahaha.

Quote
Bitcoin is not a joke.

And yet you support an "urgent" hard fork to bigger blocks because "Bitcoin needs to scale now". The irony.

Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of an) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise there would be no idea at all.


Do you truly believe that your ideas are very good?


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 20, 2018, 08:18:44 AM
...
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it's the best we've got for now.

For now? What does that mean? I believe it means that you have no choice but to be part of the social consensus that "Bitcoin is BTC". Hahaha.

You cannot escape. 8)

It simply means that I understand that improving bitcoin is not a trivial job and that shit-forking it is not a solution. But on the other hand, unfortunately, things have happened, Core is distracted from the cause, and there is little hope, if not zero, of them healing.

Quote
Quote
I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing to offer but ultra-conservatism. Average programmers learn and grow; arrogant dogmatists just sink.

Like the developers of Ethereum ICOs and "blockchain companies"? Hahaha.
Yes. And a lot more. Code is not the bottleneck; vision and dedication are.

Quote
Quote
Bitcoin is not a joke.
And yet you support an "urgent" hard fork to bigger blocks because "Bitcoin needs to scale now". The irony.
Who said anything about bigger blocks as an ultimate or urgent scaling solution? I support reducing the block time, and not as a scaling solution but as a complementary decentralization measure.

Like Core, god forgive me, I believe neither an increased block size nor a reduced block time deserves to be classified as a scaling solution, because they can't scale up safely. But unlike Core, thank god, I do believe that such improvements can be employed to some extent; there is no need to be superstitious about the magical 1 MB block size or 10-minute block time. Moore's law still applies.

Quote
Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of an) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise there would be no idea at all.


Do you truly believe that your ideas are very good?
100% sure! My PoCw proposal (https://bitcointalk.org/index.php?topic=4438334.0), in its current alpha phase, is unique in the whole crypto literature as the only serious work on fixing pooling-pressure-related flaws like mining variance and the proximity premium.

I did it when I started breaking with Core; any serious work on bitcoin begins with getting rid of Core's (lack of an) agenda, imo.



 


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: mfyilmaz on July 20, 2018, 08:50:27 AM
Yeah, I think Satoshi was smart enough to foresee this. He could have implemented Core with a 2 MB or 4 MB block size himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1 MB. And I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: AdolfinWolf on July 20, 2018, 02:25:14 PM
Yeah, I think Satoshi was smart enough to foresee this. He could have implemented Core with a 2 MB or 4 MB block size himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1 MB. And I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Foresee what exactly? I'm quite missing your point... It looks to me like you put as many buzzwords into a paragraph as possible and didn't really read the thread...

Also, about BLS signatures, this is what the lead author of the Schnorr signature BIP had to say about them:

I think they're awesome.

However, they're also slower to verify, and introduce additional security assumptions. That roughly means there are additional ways in which they could be broken, which don't apply to Schnorr or ECDSA. As a result, I expect it to be much easier to find widespread agreement to switch to Schnorr (which has no additional assumptions, and is slightly faster than ECDSA).

So I highly doubt that we will see these any time soon...

I also have yet to fully understand why these BLS signatures are so "awesome"; the only thing I've really heard about them is that they're generally slower to verify (as stated above). Does anyone know what their general advantages are in comparison to, for example, Schnorr?





Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on July 20, 2018, 02:44:01 PM
Yeah, I think Satoshi was smart enough to foresee this. He could have implemented Core with a 2 MB or 4 MB block size himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1 MB. And I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is clear now that it's almost impossible to reach said consensus, and most likely we are stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were. What we do know now is that no increase of the block size is going to happen anytime soon.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on July 20, 2018, 02:58:47 PM
Also, about BLS signatures, this is what the lead author of the Schnorr signature BIP had to say about them:

I think they're awesome.

However, they're also slower to verify, and introduce additional security assumptions. That roughly means there are additional ways in which they could be broken, which don't apply to Schnorr or ECDSA. As a result, I expect it to be much easier to find widespread agreement to switch to Schnorr (which has no additional assumptions, and is slightly faster than ECDSA).

So I highly doubt that we will see these any time soon...

I also have yet to fully understand why these BLS signatures are so "awesome"; the only thing I've really heard about them is that they're generally slower to verify (as stated above). Does anyone know what their general advantages are in comparison to, for example, Schnorr?

I'm still trying to understand what the practical implications of the differences are, since it's all quite technical.  Apparently, BLS could combine every signature in a block into a single signature, which would save space and allow for more transactions.  A 2-of-3 multisig is more efficient in BLS, as key aggregation works without needing to generate a merkle tree of public keys.  Stuff like that.  But the impression I get is that if BLS sigs take longer to verify, it's likely too high a price to pay for such bonuses, which may prove less important than just having Schnorr's raw efficiency and speed.
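For anyone curious what "combining every signature into one" looks like in practice, here is a toy aggregation sketch using the py_ecc library's IETF-draft BLS API (the same API the Ethereum 2.0 spec exercises; the library choice and exact call names are an assumption of this sketch, and of course nothing like this ships in Bitcoin):

Code:
# Toy BLS aggregation: three signatures over the same message collapse
# into one 96-byte signature, verified against all three pubkeys.
from py_ecc.bls import G2ProofOfPossession as bls

msg = b"example block digest"
secret_keys = [11, 22, 33]                      # toy secrets, never do this
pubkeys = [bls.SkToPk(sk) for sk in secret_keys]
sigs = [bls.Sign(sk, msg) for sk in secret_keys]

agg_sig = bls.Aggregate(sigs)                   # one signature for all three
print(len(agg_sig))                             # 96 bytes regardless of n
print(bls.FastAggregateVerify(pubkeys, msg, agg_sig))  # True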

It's one of those either/or deals, so we can only pick one.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 20, 2018, 03:31:35 PM
Yeah, I think Satoshi was smart enough to foresee this. He could have implemented Core with a 2 MB or 4 MB block size himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1 MB. And I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is clear now that it's almost impossible to reach said consensus, and most likely we are stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were. What we do know now is that no increase of the block size is going to happen anytime soon.
No matter what Satoshi did or said or intended to do, whatever. An increase in block size is not recommended because it won't help decentralization. I don't say it would escalate centralization (like Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease because of its proportional, desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on July 20, 2018, 04:37:46 PM
Yeah, I think Satoshi was smart enough to foresee this. He could have implemented Core with a 2 MB or 4 MB block size himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1 MB. And I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is clear now that it's almost impossible to reach said consensus, and most likely we are stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were. What we do know now is that no increase of the block size is going to happen anytime soon.
No matter what Satoshi did or said or intended to do, whatever. An increase in block size is not recommended because it won't help decentralization. I don't say it would escalate centralization (like Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease because of its proportional, desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.

Block time modifications would screw up other things too. I don't think it would make a difference. Do you have any models with numbers that show it would improve things without tradeoffs?

And nonetheless it would require a hard fork. At the end of the day it's not even about the block size change but the hard fork itself. There will always be some people disagreeing about it, which will lead to the creation of another altcoin.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 20, 2018, 04:57:07 PM
Yeah, I think Satoshi was smart enough to foresee this. He could have implemented Core with a 2 MB or 4 MB block size himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1 MB. And I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is clear now that it's almost impossible to reach said consensus, and most likely we are stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were. What we do know now is that no increase of the block size is going to happen anytime soon.
No matter what Satoshi did or said or intended to do, whatever. An increase in block size is not recommended because it won't help decentralization. I don't say it would escalate centralization (like Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease because of its proportional, desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.

Assuming the block weight limit stays at 4,000,000 weight units, a block time decrease could also lead to centralization in a different way. Without proper research, block propagation could turn out to be slower than the block time, increasing the block orphan rate.
It is the old Core fallacy I denounced up-thread. Neither a block size increase nor a block time decrease would have a negative impact on centralization, as long as we are not talking about a linear increase/decrease at large scales.
The Core camp's argument was that scaling bitcoin by increasing the block size would eventually reach insane orders of magnitude and gradually eliminate more and more full nodes.

They could simply have disagreed with such a linear, continuous increase; instead they preferred to reject any increase proposal. Why? Nobody knows!
They could have considered up to a 10x increase without affecting anything in the network in terms of centralization, but they made a taboo out of it: you are in favor of on-chain scaling? You must be a Bitcoin Cash puppet!

Mining centralization is a real fact. I have extensively discussed and analysed it and shown that there is no 'small miner' in the network right now (other than a few hobbyists, I guess); it is all about very large mining farms and pools with millions of dollars worth of investment. I have mathematical proof of this, indisputable.

So, handling say 10 times more transactions per period of time, whether by increasing the block size or by reducing the block time, is not a big deal for mining nodes; they are wealthy enough to run more powerful nodes. Actually, they already use fault-tolerant, server-grade systems for this.

And guess what? When it comes to my suggestion of a block time decrease, it has a direct, complementary, positive effect on decentralization: more distributed rewards, and more chance for smaller pools to compete and remain competitive.
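To put rough numbers on the trade-off both sides are arguing about, here is a toy model; the classic 1 - exp(-tau/T) orphan approximation is a standard simplification, and the 15-second propagation time is an assumed figure:

Code:
import math

PROPAGATION_S = 15.0  # assumed time for a block to reach most of the network

def orphan_rate(block_time_s: float) -> float:
    # A competing block found while ours propagates gets orphaned.
    return 1.0 - math.exp(-PROPAGATION_S / block_time_s)

def rewards_per_day(block_time_s: float) -> float:
    return 86_400 / block_time_s

for t in (600, 300, 150, 60):
    print(f"block time {t:>3} s: orphan rate ~{orphan_rate(t):.1%}, "
          f"{rewards_per_day(t):.0f} rewards/day to spread among miners")

Shorter block times hand out more, smaller rewards (lower miner variance) but push the orphan rate up; which effect dominates is exactly the disputed question.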

Quote
Besides, this could tamper with the total coin supply/production rate (even though this can be solved easily) and could mess with timelock scripts, which use the block number for their duration/time.
Both the coin supply and timelocks could be readjusted; in fact they should be. The coin supply is trivial: just reduce the block reward proportionally. Timelock scripts need a small tweak for verifying old scripts, plus retiring the current opcode, CLTV, and introducing a new opcode so that new scripts use the new scale for nLockTime.
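A minimal sketch of the kind of readjustment being suggested; the scale factor, fork height, and function names are all hypothetical, not any existing BIP:

Code:
# Hypothetical rescaling for a halved block time (SCALE = 2): the
# per-block subsidy halves so the emission schedule in calendar time
# is unchanged, and legacy height-based locktimes are mapped onto the
# new, denser height space from the fork point onward.
SCALE = 2               # hypothetical: 10-minute -> 5-minute blocks
FORK_HEIGHT = 800_000   # hypothetical activation height

def new_subsidy_sats(old_subsidy_sats: int) -> int:
    return old_subsidy_sats // SCALE

def translate_lock_height(old_lock_height: int) -> int:
    """Map a pre-fork height-based locktime to post-fork heights."""
    if old_lock_height <= FORK_HEIGHT:
        return old_lock_height                 # matured before the fork
    return FORK_HEIGHT + (old_lock_height - FORK_HEIGHT) * SCALE

print(new_subsidy_sats(1_250_000_000))  # 12.5 BTC -> 6.25 BTC per block
print(translate_lock_height(800_100))   # 800100 -> 800200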

Quote
But looking at the bright side, people will feel Bitcoin is faster and there will be less waiting time.
I'm just surprised!

Finally, in a Core-dominated atmosphere, somebody noticed that a non-Core proposal has a useful feature. See? We are progressing  8)


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on July 21, 2018, 12:14:08 AM
It's not even relevant. No matter if one thinks a bigger block size and shorter block time wouldn't centralize (I disagree; there's no free ride, you pay a price of some % of centralization due to node resource usage going up + more orphans), but anyway, the main point is: you just can't do it. How are you going to reach hard fork consensus? How are you going to get all the whales on board so that when the fork happens they don't dump on you, crashing the price and forcing miners to stop mining your fork? Betting against the original Bitcoin always carries an unreasonable amount of risk; you are always going to be on the losing side supporting a fork, unless you somehow manage to make everyone and their money agree with you.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 21, 2018, 01:29:52 PM
there's no free ride, you pay a price of some % of centralization due to node resource usage going up + more orphans
Nope, you are wrong. Centralization is not a ghost to be summoned by some spell; any claim against an improvement proposal should be analytically provable, with no room for intuition. I have refuted every such claim regarding the centralization consequences of a moderate block size increase up-thread, and as for a block time decrease (my alternative suggestion), it has a considerable positive decentralization impact (of orders of magnitude, actually).

Quote
It's not even relevant. No matter if one thinks a bigger block size and shorter block time wouldn't centralize (I disagree, ..) but anyway, the main point is: you just can't do it. How are you going to reach hard fork consensus? How are you going to get all the whales on board so that when the fork happens they don't dump on you, crashing the price and forcing miners to stop mining your fork? Betting against the original Bitcoin always carries an unreasonable amount of risk; you are always going to be on the losing side supporting a fork, unless you somehow manage to make everyone and their money agree with you.
Aren't you and @Doomad the same person? He tends to bring up this argument whenever a change proposal comes along.

I understand your concerns; bitcoin is no longer an experimental project and everything is political here. I truly understand, but I don't care. I'm sure that at the end of the day we are left with nothing less or more than the truth, not lies, and the 'right', not wrongs.

As for the problem of evolution and consensus around it, I have some ideas; for now it is better to remain focused on 'the right' and 'the true', imo.





Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on July 21, 2018, 02:03:27 PM
there's no free ride, you pay a price of some % of centralization due to node resource usage going up + more orphans
Nope, you are wrong. Centralization is not a ghost to be summoned by some spell; any claim against an improvement proposal should be analytically provable, with no room for intuition. I have refuted every such claim regarding the centralization consequences of a moderate block size increase up-thread, and as for a block time decrease (my alternative suggestion), it has a considerable positive decentralization impact (of orders of magnitude, actually).

If a faster block time had a positive impact on decentralisation, we would have done it years ago.  Kindly stop talking out of your posterior.  You have refuted nothing.  
 

Quote
It's not even relevant. No matter if one thinks a bigger block size and shorter block time wouldn't centralize (I disagree, ..) but anyway, the main point is: you just can't do it. How are you going to reach hard fork consensus? How are you going to get all the whales on board so that when the fork happens they don't dump on you, crashing the price and forcing miners to stop mining your fork? Betting against the original Bitcoin always carries an unreasonable amount of risk; you are always going to be on the losing side supporting a fork, unless you somehow manage to make everyone and their money agree with you.
Aren't you and @Doomad the same person? He tends to bring up this argument whenever a change proposal comes along.

That, or it's just a sensible argument, maybe?  

If lots of people think it's a good idea, then naturally a conversation will flow and things will fall into place without much effort involved.  Instead, what we have here is you talking at us and we frankly don't care what you're saying because it sounds terrible.  We had more years than I care to remember with the entire forum arguing about block size/block time/alternating blocks/paired blocks/whatever other ideas people had and we're pretty much tired of hearing it now.  People just aren't receptive to this kind of stuff anymore.  We've moved on.  You should too.


I understand your concerns; bitcoin is no longer an experimental project and everything is political here. I truly understand, but I don't care. I'm sure that at the end of the day we are left with nothing less or more than the truth, not lies, and the 'right', not wrongs.

As for the problem of evolution and consensus around it, I have some ideas; for now it is better to remain focused on 'the right' and 'the true', imo.

That's just it, though.  You think your ideas are "right and true", but we've heard people suggesting shortening the block time dozens of times before and we still think it falls under the category of "maybe later if we absolutely have to".  If it seems like we're being short with you, it's because we've had enough of those kind of ideas now.  They're not original, they're not innovative and, most of all, they won't achieve scaling on the kind of levels we need.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 21, 2018, 03:21:44 PM
there's no free ride, you pay a price of some % of centralization due to node resource usage going up + more orphans
Nope, you are wrong. Centralization is not a ghost to be summoned by some spell; any claim against an improvement proposal should be analytically provable, with no room for intuition. I have refuted every such claim regarding the centralization consequences of a moderate block size increase up-thread, and as for a block time decrease (my alternative suggestion), it has a considerable positive decentralization impact (of orders of magnitude, actually).

If a faster block time had a positive impact on decentralisation, we would have done it years ago.  Kindly stop talking out of your posterior.  You have refuted nothing.  
What do you mean, IF?
So, you think everything is what it should be and could be, and that bitcoin is the King James Bible, which God personally supervised and has promised to keep safe and unaltered? You know nothing, dude; you are just a believer, aren't you?

I have thoroughly proved it in the other thread (https://bitcointalk.org/index.php?topic=4687032.msg42595775#new); go read it if you are interested in the truth, otherwise just keep praying, God bless you  ;D

...  We had more years than I care to remember with the entire forum arguing about block size/block time/alternating blocks/paired blocks/whatever other ideas people had and we're pretty much tired of hearing it now.  People just aren't receptive to this kind of stuff anymore.  We've moved on.  You should too.


As for the problem of evolution and consensus around it, I have some ideas; for now it is better to remain focused on 'the right' and 'the true', imo.

That's just it, though.  You think your ideas are "right and true", but we've heard people suggesting shortening the block time dozens of times before and we still think it falls under the category of "maybe later if we absolutely have to".  If it seems like we're being short with you, it's because we've had enough of those kind of ideas now.  They're not original, they're not innovative and, most of all, they won't achieve scaling on the kind of levels we need.
A block time decrease is not all I've got; I'm just saying that it is one helpful step toward decentralization, and I strongly reject calling such an improvement 'nothing' or claiming it carries centralization threats! That is ridiculously wrong.

As for you and your mates having a long history of talking, it doesn't change the fact that you haven't accomplished a great job:
Right now we have bitcoin overwhelmingly centralized by pools and controlled by exchanges and whales, and all your endless discussions have brought us is pushing almost every problem off the chain. I don't consider that an impressive career.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on July 21, 2018, 04:51:46 PM
there's no free ride, you pay a price of some % of centralization due to node resource usage going up + more orphans
Nope, you are wrong. Centralization is not a ghost to be summoned by some spell; any claim against an improvement proposal should be analytically provable, with no room for intuition. I have refuted every such claim regarding the centralization consequences of a moderate block size increase up-thread, and as for a block time decrease (my alternative suggestion), it has a considerable positive decentralization impact (of orders of magnitude, actually).

Quote
It's not even relevant. No matter if one thinks a bigger block size and shorter block time wouldn't centralize (I disagree, ..) but anyway, the main point is: you just can't do it. How are you going to reach hard fork consensus? How are you going to get all the whales on board so that when the fork happens they don't dump on you, crashing the price and forcing miners to stop mining your fork? Betting against the original Bitcoin always carries an unreasonable amount of risk; you are always going to be on the losing side supporting a fork, unless you somehow manage to make everyone and their money agree with you.
Aren't you and @Doomad the same person? He tends to bring up this argument whenever a change proposal comes along.

I understand your concerns; bitcoin is no longer an experimental project and everything is political here. I truly understand, but I don't care. I'm sure that at the end of the day we are left with nothing less or more than the truth, not lies, and the 'right', not wrongs.

As for the problem of evolution and consensus around it, I have some ideas; for now it is better to remain focused on 'the right' and 'the true', imo.





Even if you were right, you would need to argue this with the people that have the most influence in the ecosystem: big merchants, big miners, big whales. Show them the proof and see what the feedback is. Perhaps start by going into MP's group chat on IRC (see Anonymint's posts). Go there, present your technical arguments for why it's a good idea, and post the log here.

Do the same in any other relevant place and post the results. You need to be debating this in those places, not just here; otherwise it just stays something you think is a good idea, but never gets implemented.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: andrew1carlssin on July 21, 2018, 05:34:34 PM
there's no free ride, you pay a price of some % of centralization due to node resource usage going up + more orphans
Nope, you are wrong. Centralization is not a ghost to be summoned by some spell; any claim against an improvement proposal should be analytically provable, with no room for intuition. I have refuted every such claim regarding the centralization consequences of a moderate block size increase up-thread, and as for a block time decrease (my alternative suggestion), it has a considerable positive decentralization impact (of orders of magnitude, actually).

Quote
It's not even relevant. No matter if one thinks a bigger block size and shorter block time wouldn't centralize (I disagree, ..) but anyway, the main point is: you just can't do it. How are you going to reach hard fork consensus? How are you going to get all the whales on board so that when the fork happens they don't dump on you, crashing the price and forcing miners to stop mining your fork? Betting against the original Bitcoin always carries an unreasonable amount of risk; you are always going to be on the losing side supporting a fork, unless you somehow manage to make everyone and their money agree with you.
Aren't you and @Doomad the same person? He tends to bring up this argument whenever a change proposal comes along.

I understand your concerns; bitcoin is no longer an experimental project and everything is political here. I truly understand, but I don't care. I'm sure that at the end of the day we are left with nothing less or more than the truth, not lies, and the 'right', not wrongs.

As for the problem of evolution and consensus around it, I have some ideas; for now it is better to remain focused on 'the right' and 'the true', imo.





Even if you were right, you would need to argue this with the people that have the most influence in the ecosystem: big merchants, big miners, big whales. Show them the proof and see what the feedback is. Perhaps start by going into MP's group chat on IRC (see Anonymint's posts). Go there, present your technical arguments for why it's a good idea, and post the log here.

Do the same in any other relevant place and post the results. You need to be debating this in those places, not just here; otherwise it just stays something you think is a good idea, but never gets implemented.

I agree with that... Instead of acting like politicians who are already committed to their party, views and beliefs, we should act more like scientists, ready to review any hypothesis when new data comes out...

Regarding scaling, in the real world it is very hard to predict how a certain type of network will scale, because there are so many variables (once we add economics, human decisions, etc.)...
So I decided to look at simulators...
There are so many types of network simulators... virtualization, emulators, hybrid systems, discrete event, etc. In my view, if we need to run a very large simulation in order to get data to support an argument or hypothesis, I would choose discrete event simulation...
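As a flavor of what a discrete-event approach can look like, here is a minimal event-loop sketch of block propagation over a random peer graph; the topology and per-hop delays are made-up parameters, so only the method is the point:

Code:
import heapq
import random

random.seed(1)
N_NODES, N_PEERS = 200, 8

def link_delay() -> float:
    return random.uniform(0.05, 0.5)   # assumed per-hop relay delay (s)

# Random peer graph: each node relays to N_PEERS random neighbours.
peers = {n: random.sample([m for m in range(N_NODES) if m != n], N_PEERS)
         for n in range(N_NODES)}

def propagate(origin: int) -> float:
    """Discrete-event simulation: process 'block arrives at node'
    events in time order; return when the last node has seen it."""
    seen = {origin: 0.0}
    events = [(0.0, origin)]
    while events:
        t, node = heapq.heappop(events)
        for p in peers[node]:
            arrival = t + link_delay()
            if p not in seen or arrival < seen[p]:
                seen[p] = arrival
                heapq.heappush(events, (arrival, p))
    return max(seen.values())

print(f"~{propagate(0):.2f} s for one block to reach all {N_NODES} nodes")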


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 21, 2018, 06:30:40 PM

Even if you were right, you would need to argue this with the people that have the most influence in the ecosystem: big merchants, big miners, big whales. Show them the proof and see what the feedback is. Perhaps start by going into MP's group chat on IRC (see Anonymint's posts). Go there, present your technical arguments for why it's a good idea, and post the log here.

Do the same in any other relevant place and post the results. You need to be debating this in those places, not just here; otherwise it just stays something you think is a good idea, but never gets implemented.

Thanks for the advice about IRC; I've been given the same advice once more today and I appreciate it. I will consider it once I'm done formalizing my proposal more thoroughly; meanwhile I'm just discussing some parts and debating hype and misunderstandings.

As for influential figures, I don't think they are the best audience. You know, the most important figures, the ones who practically own the network, are pool operators, and all I have to tell them is that it's time to say goodbye. ;D


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on July 24, 2018, 06:08:13 AM
...
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it's the best we've got for now.

For now? What does that mean? I believe it means that you have no choice but to be part of the social consensus that "Bitcoin is BTC". Hahaha.

You cannot escape. 8)

It simply means that I understand that improving bitcoin is not a trivial job and that shit-forking it is not a solution. But on the other hand, unfortunately, things have happened, Core is distracted from the cause, and there is little hope, if not zero, of them healing.

Heal from what sickness? The sickness of conservatism and making sure that the network is stable and efficient?

Quote
Quote
Quote
I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing to offer but ultra-conservatism. Average programmers learn and grow; arrogant dogmatists just sink.

Like the developers of Ethereum ICOs and "blockchain companies"? Hahaha.
Yes. And a lot more. Code is not the bottleneck; vision and dedication are.

This might prove that you are not as smart as you make yourself appear to be.

Quote
Quote
Quote
Bitcoin is not a joke.
But yet you support an "urgent" hard fork to bigger blocks because "Bitcoin needs to scale now". The irony.
Who said anything about bigger blocks as an ultimate or urgent scaling solution? I support reducing block time and yet not as a scaling solution, rather a decentralization complementary one.

Like Core, god forgive me, I believe neither an increased block size nor a reduced block time deserves to be classified as a scaling solution, because they can't scale up safely. But unlike Core, thank god, I do believe that such improvements can be employed to some extent; there is no need to be superstitious about the magical 1 MB block size or 10-minute block time. Moore's law still applies.

Quote
Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise there would be no idea at all.


Do you truly believe that your ideas are very good?
100% sure! My PoCw proposal (https://bitcointalk.org/index.php?topic=4438334.0), in its current alpha phase, is unique in the whole crypto literature for being the only serious work on fixing pooling-pressure-related flaws like mining variance and the proximity premium.

I did it when I started betraying Core; any serious work in bitcoin begins with getting rid of Core's (lack of) agenda, imo.

If you believe your ideas are good, then let them be recognized on their own merits, because your verbal attacks on the Core developers are not helping your cause.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on July 24, 2018, 07:28:04 PM
...
Who said anything about bigger blocks as an ultimate or urgent scaling solution? I support reducing the block time, yet not as a scaling solution, rather as a complementary decentralization measure.

Like Core, god forgive me, I believe neither an increased block size nor a reduced block time deserves to be classified as a scaling solution, because they can't scale up safely. But unlike Core, thank god, I do believe that such improvements can be employed to some extent; there is no need to be superstitious about the magical 1 MB block size or 10-minute block time. Moore's law still applies.

Quote
Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise there would be no idea at all.


Do you truly believe that your ideas are very good?
100% sure! My PoCw proposal (https://bitcointalk.org/index.php?topic=4438334.0), in its current alpha phase, is unique in the whole crypto literature for being the only serious work on fixing pooling-pressure-related flaws like mining variance and the proximity premium.

I did it when I started betraying Core; any serious work in bitcoin begins with getting rid of Core's (lack of) agenda, imo.

If you believe your ideas are good, then let them be recognized on their own merits, because your verbal attacks on the Core developers are not helping your cause.
Are you kidding? Any idea about improving bitcoin is taken by Core as an insult! They think they have been around for a long time and have already discussed and rejected every possible idea. I'm done with them; I have no hope in them. They are close-minded and responsible for leading bitcoin into its ice age. They have nothing to say about the pools, ASICs and exchanges that have centralized the ecosystem. What kind of crypto developer has no idea how to defend against centralization threats?

You believe in these guys? Enjoy your popcorn watching their show; I'm no longer interested.


 


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on July 25, 2018, 05:28:44 AM
...
Who said anything about bigger blocks as an ultimate or urgent scaling solution? I support reducing the block time, yet not as a scaling solution, rather as a complementary decentralization measure.

Like Core, god forgive me, I believe neither an increased block size nor a reduced block time deserves to be classified as a scaling solution, because they can't scale up safely. But unlike Core, thank god, I do believe that such improvements can be employed to some extent; there is no need to be superstitious about the magical 1 MB block size or 10-minute block time. Moore's law still applies.

Quote
Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise there would be no idea at all.


Do you truly believe that your ideas are very good?
100% sure! My PoCw proposal (https://bitcointalk.org/index.php?topic=4438334.0), in its current alpha phase, is unique in the whole crypto literature for being the only serious work on fixing pooling-pressure-related flaws like mining variance and the proximity premium.

I did it when I started betraying Core; any serious work in bitcoin begins with getting rid of Core's (lack of) agenda, imo.

If you believe your ideas are good, then let them be recognized on their own merits, because your verbal attacks on the Core developers are not helping your cause.
Are you kidding? Any idea about improving bitcoin is taken by Core as an insult! They think they have been around for a long time and have already discussed and rejected every possible idea. I'm done with them; I have no hope in them. They are close-minded and responsible for leading bitcoin into its ice age. They have nothing to say about the pools, ASICs and exchanges that have centralized the ecosystem. What kind of crypto developer has no idea how to defend against centralization threats?

Any idea? Not every idea will be good. Some of them might be so stupid that Core has to be polite enough not to ridicule the person who proposed them.

But maybe your idea is good. Fight for it on its own technical merit. Or do this: make an altcoin. 8)

Quote
You believe in these guys? Enjoy your popcorn watching their show; I'm no longer interested.

Yes. Do you know of a group of developers better than them?


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: hatshepsut93 on July 29, 2018, 12:13:12 AM
Why can't core scale as some f(x) = S curve so that you would get a % increase that increased supply and demand?

why are they committed to only 1mb (or ~ 4 mb with segwit)



Because it's risky: no one knows how full-node hardware (read: desktops and laptops) will evolve in the future - maybe in 10 years it will easily handle 30 MB blocks, maybe it will struggle even with 10 MB blocks. If your curve grows faster than node resources, you'll end up with the same result as if you increased the blocksize to some ridiculous number today: network centralization.

This risk also has very little reward: increasing tps from 7 to 30-50 won't solve the scalability problem; we need thousands of transactions per second, and this is why we have Lightning. Also, there are more elegant on-chain capacity boosts that need to be implemented before any blocksize increase, like Schnorr.
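To illustrate the risk using the thread's own "S-curve" idea, here is a quick sketch (Python; the schedule parameters and resource growth rate are invented, not measurements) comparing a logistic blocksize schedule against node resources that compound more slowly:

Code:
import math

def s_curve_blocksize(year, floor=1.0, ceiling=32.0, midpoint=5.0, steepness=0.8):
    """Hypothetical logistic ("S-curve") blocksize schedule, in MB."""
    return floor + (ceiling - floor) / (1.0 + math.exp(-steepness * (year - midpoint)))

def node_capacity(year, start=4.0, growth=0.20):
    """Hypothetical MB a home node comfortably handles, compounding 20%/yr."""
    return start * (1.0 + growth) ** year

for year in range(11):
    b, c = s_curve_blocksize(year), node_capacity(year)
    flag = "" if b <= c else "  <- home nodes priced out"
    print(f"year {year:2d}: schedule {b:5.1f} MB vs node capacity {c:5.1f} MB{flag}")

Under these made-up numbers the schedule overtakes home nodes in the curve's steep middle section, which is exactly the centralization failure mode described above.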


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Quickseller on July 29, 2018, 04:55:47 PM
Yeah, I think Satoshi was smart enough to foresee this, and he could've implemented Core with a 2MB or 4MB blocksize himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5....

Some say Satoshi expected consensus to eventually be reached in order to raise the blocksize, which seems pretty naive in retrospect, since it is clear now that it's almost impossible to reach said consensus, and most likely we are stuck with 1MB.

Others say that this was completely expected by Satoshi, and that he knew perfectly well the 1MB limit was set in stone; this is just part of the Bitcoin game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were. What we do know now is that no block size increase is going to happen anytime soon.
Satoshi was clear that the 1 MB max block size was meant to be a temporary anti-spam measure, and that the max block size should be increased in the future when the number of transactions warranted it.


~
What exactly does a non-mining node do for a "regular" user of Bitcoin who is not running a node and is in the process of spending some of his bitcoin in a store? Since every node is free to choose its own settings, what would happen if this particular node had different settings than the mining nodes as a whole?


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on July 29, 2018, 06:38:10 PM
Yeah, I think Satoshi was smart enough to foresee this, and he could've implemented Core with a 2MB or 4MB blocksize himself, but I think he knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5....

Some say Satoshi expected consensus to eventually be reached in order to raise the blocksize, which seems pretty naive in retrospect, since it is clear now that it's almost impossible to reach said consensus, and most likely we are stuck with 1MB.

Others say that this was completely expected by Satoshi, and that he knew perfectly well the 1MB limit was set in stone; this is just part of the Bitcoin game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were. What we do know now is that no block size increase is going to happen anytime soon.
Satoshi was clear that the 1 MB max block size was meant to be a temporary anti-spam measure, and that the max block size should be increased in the future when the number of transactions warranted it.


~
What exactly does a non-mining node do for a "regular" user of Bitcoin who is not running a node and is in the process of spending some of his bitcoin in a store? Since every node is free to choose its own settings, what would happen if this particular node had different settings than the mining nodes as a whole?

Yes, indeed we know that. But as I said, did he know the outcome too?

Was he ignorant/delusional enough to think there would be consensus about it? Or did he expect that such a change would be impossible?

He definitely entertained the idea of people being opposed:

Piling every proof-of-work quorum system in the world into one dataset doesn't scale.

Bitcoin and BitDNS can be used separately.  Users shouldn't have to download all of both to use one or the other.  BitDNS users may not want to download everything the next several unrelated networks decide to pile in either.

The networks need to have separate fates.  BitDNS users might be completely liberal about adding any large data features since relatively few domain registrars are needed, while Bitcoin users might get increasingly tyrannical about limiting the size of the chain so it's easy for lots of users and small devices.

Fears about securely buying domains with Bitcoins are a red herring.  It's easy to trade Bitcoins for other non-repudiable commodities.

If you're still worried about it, it's cryptographically possible to make a risk free trade.  The two parties would set up transactions on both sides such that when they both sign the transactions, the second signer's signature triggers the release of both.  The second signer can't release one without releasing the other.

So either he knew it was impossible and 1MB was set in stone once Bitcoin got enough traction to be "too big to change", or he was expecting that everyone involved would at some point reach a supermajority, or enough of a majority to make the change without causing a big disaster-split (which, as far as I can tell, is pretty delusional... I can only see supermajority consensus on hard forks when it comes to a fatal bug that would render everyone's coins useless... and even there we would have big debates and disagreements over how to proceed).


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Rath_ on July 29, 2018, 07:19:31 PM
Satoshi was clear that the 1 MB max block size was meant to be a temporary anti-spam measure, and that the max block size should be increased in the future when the number of transactions warranted it.

Andreas M. Antonopoulos published a video (https://www.youtube.com/watch?v=Ub2LoTcYV54) on Scaling and "Satoshi's vision" two days ago.
 
It's been ten years since Bitcoin launched, and a lot of things have happened to it since Satoshi left. Take ASICs as an example: some people think that they are slowly leading us to the centralization of the network, because most of the mining hardware is manufactured by one company. There were a few Bitcoin hard forks whose aim was to bring back GPU mining and ban specialized mining hardware. None of them succeeded.

Why did I mention mining? Well, that's what the community and mining pools have chosen. If users were really against ASICs, then most of them would try to temporarily block them by changing the mining algorithm. Wasn't Satoshi against them too?

Do you remember SegWit2X? Despite its huge support from miners, it failed - users played a big part there. Why do you despise other scaling solutions? It will take months before the Lightning Network becomes more reliable, but it shouldn't have any negative impact on the network, unlike increasing the blocksize, which might result in centralization (it was already explained above, so I don't think there is any point in repeating it). I doubt that many people would agree to decrease the blocksize after it turned out to be a bad idea.

Satoshi is not a God. Let the community choose.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Rath_ on July 29, 2018, 08:23:39 PM
While it's true that increasing the block size/weight limit could lead to centralization, it will be needed when more people adopt/use bitcoin regularly, even if the majority use LN or another 2nd-layer protocol.
People still need on-chain transactions to open/close channels and refill channel balances. Also, I'm sure people would prefer on-chain transactions for big payments.

While a block weight limit increase won't happen anytime soon, IMO the community must be aware that it's inevitable if they want to see Bitcoin used at a bigger scale.

I am aware that we will need to increase it at some point in the future. However, I don't support increasing the block weight without thinking about the consequences. There are still a few things that can be done to decrease the size of transactions, for example implementing Schnorr signatures. Also, channel factories (https://bitcoin.stackexchange.com/questions/67158/what-are-channel-factories-and-how-do-they-work/67187) might help once the Lightning Network becomes more widely used.

Assuming that all transactions were SegWit ones, the maximum block weight would be 4 MB, right?
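For reference, BIP 141 defines block weight as base size x 3 + total size, capped at 4,000,000 weight units. A minimal sketch (the transaction sizes below are hypothetical) shows why the serialized size only approaches 4 MB in the witness-heavy extreme:

Code:
def weight(base_size, total_size):
    """BIP 141: weight = non-witness bytes * 3 + total serialized bytes."""
    return 3 * base_size + total_size

MAX_BLOCK_WEIGHT = 4_000_000

legacy = weight(base_size=250, total_size=250)   # no witness data -> weight 1000
segwit = weight(base_size=250, total_size=400)   # 150 witness bytes -> weight 1150

n = MAX_BLOCK_WEIGHT // legacy
print(n, "legacy-size txs ->", n * 250 / 1e6, "MB serialized")
m = MAX_BLOCK_WEIGHT // segwit
print(m, "segwit-size txs ->", m * 400 / 1e6, "MB serialized")

So the cap is 4 million weight units in every case; a block only serializes to nearly 4 MB when it consists almost entirely of witness data, which realistic transaction mixes don't produce.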


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on July 30, 2018, 06:10:55 AM
Why can't core scale as some f(x) = S curve so that you would get a % increase that increased supply and demand?

why are they committed to only 1mb (or ~ 4 mb with segwit)



Because it's risky: no one knows how full-node hardware (read: desktops and laptops) will evolve in the future - maybe in 10 years it will easily handle 30 MB blocks, maybe it will struggle even with 10 MB blocks. If your curve grows faster than node resources, you'll end up with the same result as if you increased the blocksize to some ridiculous number today: network centralization.

This risk also has very little reward: increasing tps from 7 to 30-50 won't solve the scalability problem; we need thousands of transactions per second, and this is why we have Lightning. Also, there are more elegant on-chain capacity boosts that need to be implemented before any blocksize increase, like Schnorr.

I believe you are over-emphasizing hardware and under-emphasizing bandwidth and network latency. Bandwidth "growth" has been getting slower and slower over the years, and that slow growth compounds with network latency, because the effects of higher bandwidth do not translate immediately onto the network, according to Nielsen's Law.
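As a rough illustration of that compounding gap (Python; Nielsen's observed ~50%/yr applies to high-end users, and the other figures are assumptions):

Code:
def projected_mbps(start_mbps, annual_growth, years):
    """Simple compound-growth projection of a user's bandwidth."""
    return start_mbps * (1.0 + annual_growth) ** years

# Nielsen's Law describes high-end connections (~50%/yr); the long tail lags badly.
for label, start, growth in [("high-end user", 100.0, 0.50), ("average user", 10.0, 0.20)]:
    print(label, [round(projected_mbps(start, growth, y)) for y in (0, 5, 10)], "Mbps at years 0/5/10")

The widening gap between the two rows is the population that a blocksize schedule tuned to the high end would quietly price out.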

There is also the "blockchain trilemma" problem I posted on page 1.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on July 31, 2018, 05:48:54 AM
Then there is more weight to the argument that the block size should be regulated than to "let the miners decide", as in Ethereum, where miners set the gas limit.

Bitcoin Cash is the same, but with a 32 MB limit.



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 01, 2018, 04:49:57 PM
Why can't core scale as some f(x) = S curve so that you would get a % increase that increased supply and demand?

why are they committed to only 1mb (or ~ 4 mb with segwit)



Because it's risky: no one knows how full-node hardware (read: desktops and laptops) will evolve in the future - maybe in 10 years it will easily handle 30 MB blocks, maybe it will struggle even with 10 MB blocks. If your curve grows faster than node resources, you'll end up with the same result as if you increased the blocksize to some ridiculous number today: network centralization.

This risk also has very little reward: increasing tps from 7 to 30-50 won't solve the scalability problem; we need thousands of transactions per second, and this is why we have Lightning. Also, there are more elegant on-chain capacity boosts that need to be implemented before any blocksize increase, like Schnorr.

I believe you are over-emphasizing hardware and under-emphasizing bandwidth and network latency. Bandwidth "growth" has been getting slower and slower over the years, and that slow growth compounds with network latency, because the effects of higher bandwidth do not translate immediately onto the network, according to Nielsen's Law.

There is also the "blockchain trilemma" problem I posted on page 1.

The "blockchain trillema" fact is the reason why anyone with a brain should have their alarms go on full steam when someone is trying to sell you a coin that is "fast, cheap and as secure as bitcoin". You either get to have 2 of these features at once, unfortunately, and pretending that decreasing security for speed and cheaper transactions is a sane thing it's simply missing the point of what makes Bitcoin valuable on the first place. Any of these alts could be used to be fast and cheap, none of them can be as secure as Bitcoin, why the hell would you shoot yourself on the leg and destroy the niche you dominate? this is what big blockers don't get.

If LN works out well enough to handle some low-value, fast and cheap transactions, then that's fantastic, but fast and cheap transactions aren't the main goal; security and decentralization must always come first.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: gmaxwell on August 01, 2018, 08:16:14 PM
I believe you are over-emphasizing hardware and under-emphasizing bandwidth and network latency. Bandwidth "growth" has been getting slower and slower over the years, and that slow growth compounds with network latency, because the effects of higher bandwidth do not translate immediately onto the network, according to Nielsen's Law.

Another factor is that it's generally dangerous to set bars for participation based on any fraction of current participants.

Imagine, say we all decide 90% of nodes can handle capacity X. So then we run at X, and the weakest 10% drop out.  Then, we look again, and apply the same logic (... after all, it was a good enough reason before) and move to Y, knocking out 10%...  and so on. The end result of that particular process is loss of decentralization.
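A tiny sketch of that ratchet (Python; the capacity distribution is invented) shows how quickly repeated applications of "set the bar at what 90% can handle" shrink the node set:

Code:
import random

random.seed(1)
nodes = [random.lognormvariate(0.0, 1.0) for _ in range(10_000)]  # hypothetical node capacities

for round_no in range(1, 6):
    nodes.sort()
    bar = nodes[len(nodes) // 10]           # capacity the weakest 10% can't meet
    nodes = [c for c in nodes if c >= bar]  # they drop out; the "good enough" logic repeats
    print(f"round {round_no}: {len(nodes)} nodes remain")

Five applications of the same "reasonable" rule and roughly 40% of the original nodes are gone.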

Some months back someone was crowing about the mean bandwidth of listening nodes having gone up. But if you broke it down into nodes on big VPS providers (Amazon and Digital Ocean) and everyone else, what you found was that each group's bandwidth didn't change, but the share of nodes on centralized 'cloud' providers went way up. :(  (probably for a dozen different reasons -- loss of UPnP, increased resource usage, more spy nodes, which tend to be on VPSes...)

Then we have the fact that technology improvements are not necessarily being applied where we need them most -- e.g. a lot of effort is spent making things more portable, lower power-consuming and less costly rather than making things faster or higher-bandwidth. Similarly, lots of network capacity growth happens in dense, easily covered city areas rather than everywhere. In the US, in a major city you can often get bidirectional gigabit internet at personally affordable prices, but 30 miles out you can spend even more money and get ADSL that barely does 750 kbit/sec up. The most common broadband provider in the US usually has plenty of speed but has monthly usage caps that a listening node can use most of... Bitcoin's bandwidth usage doesn't sound like much, but when you add in overheads and new peers syncing, and multiply that usage out 24/7, it adds up to more bandwidth than people typically use... and once Bitcoin is using most of a user's resources, the costs of using it become a real consideration for some people. This isn't good for the goal of decentralization.
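Back-of-envelope arithmetic for that last point (Python; the relay factor, chain size and cap are illustrative assumptions, not measurements):

Code:
blocks_per_day = 144
block_mb = 1.3            # rough post-SegWit average block size, assumed
relay_factor = 10         # overhead of serving blocks/txs to many peers, assumed
steady_gb = blocks_per_day * block_mb * relay_factor * 30 / 1000   # ~56 GB/month

chain_gb = 180            # approximate full chain size, assumed
new_syncing_peers = 4     # fresh peers doing initial sync from this node per month, assumed
total_gb = steady_gb + new_syncing_peers * chain_gb

print(f"~{total_gb:.0f} GB/month, i.e. {100 * total_gb / 1000:.0f}% of a 1 TB monthly cap")

Under these assumptions a single listening node eats through most of a common 1 TB cap, which is the "real consideration" being described.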


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 01, 2018, 08:53:29 PM
Although @gmaxwell in the above post is basically correct, and there are always non-linearities involved, we should take care not to leave bitcoin frozen in its ice age in the name of conservatism or decentralization.

Extremism doesn't help either way. I do agree that blindly and radically playing with critical parameters like block size or block time is not rational and puts the network in danger. But by no means is it acceptable to condemn every re-adjustment proposal. It has been 10 years, and we can afford some adjustments given technological developments.

Vitalik Buterin's trilemma is pure garbage. I think the kid has recently read something about the ideal gas law (PV/T = const) and is childishly making an analogy. ;D


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: gmaxwell on August 02, 2018, 03:20:41 AM
we should take care not to leave bitcoin frozen in its ice age,
You state this as if the capacity wasn't recently roughly _doubled_, overshooting demand and knocking fees down to low levels...


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 02, 2018, 06:26:56 AM
Although @gmaxwell in the above post is basically correct, and there are always non-linearities involved, we should take care not to leave bitcoin frozen in its ice age in the name of conservatism or decentralization.

You know that your proposal will never be accepted in Bitcoin. I suggest you do the next best thing: fork Bitcoin and put every rule you want in there.

Quote
Extremism doesn't help either way. I do agree that blindly and radically playing with critical parameters like block size or block time is not rational and puts the network in danger. But by no means is it acceptable to condemn every re-adjustment proposal.

The Bitcoin Core developers are among the smartest people in the cryptocurrency world. If they made you look stupid, do not take it personally and move on.

Quote
It has been 10 years, and we can afford some adjustments given technological developments.

But for Bitcoin, I believe the technically safer way forward is better. Let others dominate in market cap. Bitcoin will outlive them all.

Quote
Vitalik Buterin's trilemma is pure garbage. I think the kid has recently read something about the ideal gas law (PV/T = const) and is childishly making an analogy. ;D

Please explain how decreasing or increasing one trait will not affect one of the other two: security, decentralization, and scalability.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 02, 2018, 11:16:18 AM
we should take care not to leave bitcoin frozen in its ice age,
You state this as if the capacity wasn't recently roughly _doubled_, overshooting demand and knocking fees down to low levels...
SegWit is not enough and took too long.

Satoshi lacked the statistics and data for proper tuning of the parameters, so he rationally chose the most conservative ones. After 10 years, with Moore's law in action and a huge amount of data available, it is time for block time and block size proposals to be discussed and evaluated with more courage and less skepticism, imo.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 02, 2018, 11:24:52 AM
Vitalik Buterin's trilemma is pure garbage. I think the kid has recently read something about the ideal gas law (PV/T = const) and is childishly making an analogy. ;D

Please explain how decreasing or increasing one trait will not affect one of the other two: security, decentralization, and scalability.
It is not on me to reject every single ridiculous claim Vitalik Buterin makes (believe me, there are a lot); you'd better ask him to prove it. I mean a mathematical proof.

Actually, the whole cryptocurrency movement (for which Buterin is just like a trojan horse) is based on axioms that guarantee the feasibility of a secure, decentralized and fast p2p electronic cash system; check with Satoshi Nakamoto. ;)


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 02, 2018, 01:13:20 PM
we should take care not to leave bitcoin frozen in its ice age,
You state this as if the capacity wasn't recently roughly _doubled_, overshooting demand and knocking fees down to low levels...
SegWit is not enough and took too long.

Satoshi lacked the statistics and data for proper tuning of the parameters, so he rationally chose the most conservative ones. After 10 years, with Moore's law in action and a huge amount of data available, it is time for block time and block size proposals to be discussed and evaluated with more courage and less skepticism, imo.

But the problem is, once again, doing the actual hard fork in a clean way. You need to convince many people, and some will oppose it. What are you going to do about it?

I'm always up for debating these things, but actually making it happen is a different story. Satoshi knew a supermajority in Bitcoin was insanely difficult to reach. He may also have considered a scenario where, once Bitcoin grew past being a small project, it would become increasingly difficult to hard fork, up to the point where it becomes practically impossible.

SegWit was added because it was a soft fork; otherwise it would never have happened. And it didn't come without massive controversy and struggle. To this day some say miners will eventually form a cartel against SegWit to steal funds.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on August 02, 2018, 01:22:34 PM
SegWit is not enough and took too long.

In your opinion, perhaps. But most people simply aren't voicing those concerns. If the majority want change, change will happen. What each day without change proves is that the majority want it to remain exactly as it is.


Please explain how decreasing or increasing one trait will not affect one of the other two: security, decentralization, and scalability.
It is not on me to reject every single ridiculous claim Vitalik Buterin makes (believe me, there are a lot); you'd better ask him to prove it. I mean a mathematical proof.

Actually, the whole cryptocurrency movement (for which Buterin is just like a trojan horse) is based on axioms that guarantee the feasibility of a secure, decentralized and fast p2p electronic cash system; check with Satoshi Nakamoto. ;)

It's on you to justify why you think change is needed. No one seems convinced thus far. Avoiding Wind_FURY's question probably isn't helping with that. It's clear to me why they're asking you to provide an answer to that question. If it isn't clear to you, maybe you need to come back once you've figured it out.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 02, 2018, 02:28:49 PM
@Doomad, @cellard

Understanding Bitcoin as an electronic, p2p cash system (sounds familiar?) is constitutional and axiomatic. It is not up to anybody to vote on whether crypto geeks have the right to establish, maintain or improve such a system to fulfil its constitutional design targets.

Vitalik, the trojan horse, suggests our axioms are not consistent and that there will never be a p2p, decentralized, secure cash system, because of a ridiculous hypothesis he has forged by copycatting elementary thermodynamics.

If he is right, then people like me and Satoshi Nakamoto and many other crypto enthusiasts are idiots; otherwise, he and the people who think he is really a genius are fools. No negotiations, no voting, no consensus ever.

A system, computerized or not, is not complete from its first day; bitcoin is no exception.

Decentralization + Scalability + Security is vital to bitcoin because these properties are constitutional. All three features have to be implemented and maintained simultaneously and to the maximum extent. Idiots think it is impossible? Let them ruin their ecosystem with Proof of Sh*t. Cowards think it is hard to achieve? They'd better retire and watch the show. Whales are worried about their $$ deposits being exposed to price fluctuation risk? They'd better cash out.

There is no option other than reviewing and summarizing the past decade of experimenting with the (beta version of) bitcoin we have, fixing its centralization, scalability and security shortcomings, and starting a refined, more finalized bitcoin, perhaps for the next decade.

I'm committed to such an evolutionary vision, with or without any support. Axioms are not up for a vote.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 02, 2018, 04:19:03 PM
SegWit is not enough and took too long.

In your opinion, perhaps. But most people simply aren't voicing those concerns. If the majority want change, change will happen. What each day without change proves is that the majority want it to remain exactly as it is.


Please explain how decreasing or increasing one trait will not affect one of the other two: security, decentralization, and scalability.
It is not on me to reject every single ridiculous claim Vitalik Buterin makes (believe me, there are a lot); you'd better ask him to prove it. I mean a mathematical proof.

Actually, the whole cryptocurrency movement (for which Buterin is just like a trojan horse) is based on axioms that guarantee the feasibility of a secure, decentralized and fast p2p electronic cash system; check with Satoshi Nakamoto. ;)

It's on you to justify why you think change is needed. No one seems convinced thus far. Avoiding Wind_FURY's question probably isn't helping with that. It's clear to me why they're asking you to provide an answer to that question. If it isn't clear to you, maybe you need to come back once you've figured it out.

Bitcoin is not a democracy where 1 person = 1 vote... as far as I'm concerned, in Bitcoin you matter depending on the amount of Bitcoin you hold, basically. So when there is a fork, you "vote with your coins", as you receive coins on both chains. If I support the original Bitcoin, I will dump on the other chain. If I don't, I will dump on the legacy chain. The conclusion, as far as I can tell, is that whales sit at the top of the hierarchy, since depending on what they do in this situation they will incentivize miners to mine whichever chain the whales are supporting. Of course whales aren't 100% in control of Bitcoin, but the point is, you'd better have as many whales on your side as possible when attempting a hardfork.

If a whale has 100000 BTC, then that's 100000 votes so to speak.

So, aliashraf, if you want your hard-fork improvement proposals to be considered, you must go convince as many whales as possible, and other people that have a big stake in Bitcoin (miners, etc.).

Of course, the more your code gets peer-reviewed by different experts in the field, the better, so you must discuss it with coders too.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 03, 2018, 06:33:43 AM
Vitalik Buterin's trilemma is pure garbage. I think the kid has recently read something about the ideal gas law (PV/T = const) and is childishly making an analogy. ;D

Please explain how decreasing or increasing one trait will not affect one of the other two: security, decentralization, and scalability.
It is not on me to reject every single ridiculous claim Vitalik Buterin makes (believe me, there are a lot); you'd better ask him to prove it. I mean a mathematical proof.

You are sidestepping the debate. I want to hear from you why you believe that increasing scalability will not affect security and decentralization if applied on the blockchain by increasing the block size, or via your idea of decreasing the time between blocks. I want to know why you believe that saying there will be tradeoffs is "garbage".


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 04, 2018, 03:49:42 PM
Vitalik Buterin's trilemma is pure garbage. I think the kid has recently read something about the ideal gas law (PV/T = const) and is childishly making an analogy. ;D

Please explain how decreasing or increasing one trait will not affect one of the other two: security, decentralization, and scalability.
It is not on me to reject every single ridiculous claim Vitalik Buterin makes (believe me, there are a lot); you'd better ask him to prove it. I mean a mathematical proof.

You are sidestepping the debate. I want to hear from you why you believe that increasing scalability will not affect security and decentralization if applied on the blockchain by increasing the block size, or via your idea of decreasing the time between blocks. I want to know why you believe that saying there will be tradeoffs is "garbage".
I see both you and @Doomad are very interested in a refutation of Buterin's claim. I understand, I feel offended too  ;)

First of all, I have to maintain that it is on him to prove such a hypothetical trilemma exists, either mathematically or experimentally.

Obviously, Buterin is not much of a proof guy, and we are dealing with an intuitive argument inspired by:
- a Japanese maxim which states: everything comes with a price!
- the famous ideal gas rule in thermodynamics, PV/T = constant
- the recent congestion and performance problems in the bitcoin and ethereum networks.

Yet, in spite of its journalistic nature, this garbage, with its magical keyword "trilemma", is occasionally used to shut down any serious discussion about improving cryptocurrencies or to make weird claims about the impossibility of such improvements.

Some people go so far as to use this fallacy (we will prove it to be a fallacy) as a supporting assertion for their strategic interpretation of what bitcoin is or is not.

So, let's do it ... proving wrong an unproven, journalistic, ridiculous claim, made by the Ethereum idol Vitalik Buterin, about blockchain design being an engineering trade-off between security, decentralization and performance.

What is this trilemma of Buterin?
The trilemma claims that blockchain systems can have at most two of the following three properties:
  • Decentralization (defined as the system being able to run in a scenario where each participant only has access to O(c) resources, i.e. a regular laptop or small VPS)
  • Scalability (defined as being able to process O(n) > O(c) transactions)
  • Security (defined as being secure against attackers with up to O(n) resources)
This is how Buterin and his proponents in the Ethereum community present the trilemma: re-defining decentralization, scalability and security such that one might be convinced they are mutually incompatible or something!

What does decentralization mean in cryptocurrency?
Buterin suggests it is totally impractical to have a decentralized p2p network and that we always have a degree of centralization. This is essential for his ridiculous trilemma: your distribution of votes is not homogeneous? You are somewhat centralized!

Decentralization, the way bitcoin has established it, is not such a naive concept. It is about Byzantine fault tolerance:
A cryptocurrency p2p network is supposed to be decentralized to resist collusion by a majority of participants weighted by their power. For this to be achieved, a network should be capable of maintaining a distribution of participants/voters that is both divergent and granular enough.

Although "enough" sounds relativistic, it is not. An adversary always has the option of consuming enough resources to compromise the security of a network (note for Buterin: regardless of how scalable it is). He just doesn't commit such an attack because game-theoretic analysis shows him how irrational such an option would be.

No matter how granular or distributed the participants are, there is always the option to locate them, bribe them, and commit such a conspiracy against the network; it is just about the costs and the incentives involved.

So, a network is safe in terms of decentralization as long as collusion is much more costly to achieve than the rewards it brings.
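In this game-theoretic framing the decentralization "index" reduces to an inequality, not to anything like O(c); a toy sketch (all figures hypothetical):

Code:
def collusion_pays(cost_per_actor, actors_needed, attack_reward):
    """Toy check: an attack is rational only if the reward exceeds
    the cost of assembling a majority coalition."""
    return attack_reward > cost_per_actor * actors_needed

# Granular network: thousands of independent actors must be bribed.
print(collusion_pays(1e6, 5_000, 1e9))   # False: 5e9 cost vs 1e9 reward
# Concentrated network: a handful of pool operators suffice.
print(collusion_pays(1e6, 3, 1e9))       # True: 3e6 cost vs 1e9 reward

The same total hashpower can be safe or unsafe depending on how granular the coalition needed for collusion is.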

Although it is possible to imagine a hypothetical function taking a range of parameters and giving us some quantitative index of decentralization, that function would by no means be such a stupid linear relation to total network resources: O(c). This kid is obsessed with computational theory's O notation. He thinks it is very prestigious to use O(something) whenever you have no clue about the actual behavior of a variable.

Decentralization is a sophisticated property of a p2p network, and the only legitimate framework for making an assessment of it is game theory. There exists no simple linear function that can give a practical index of how decentralized a given distribution of resources and votes in a network is.

Plus, and more importantly, there is a gap between a protocol (or an improvement proposal for a protocol) and the way resources are distributed in the real network. A protocol can encourage or discourage a distribution pattern, but it is not able to actually impose one!

Suppose we have a protocol freshly started with very limited hashpower. If for some reason its coin unpredictably gets more attention from the market, then no matter what the protocol is or how it encourages the distribution of resources, it will become radically centralized in a few days, and there is nothing the protocol can do about it, as long as it is supposed to be public and permissionless.

What does scalability mean in cryptocurrency?

The way Buterin puts it, scalability is another impractical agenda for cryptocurrency. Actually it is impractical for any computing system.

Any educated, experienced software engineer is aware of the limits of how scalable his design can be. Simply asking for O(N) > O(c) as a measure of scalability is nothing more than a joke (again using the infamous O notation :D). In software engineering we always need criteria within which a system is designed to be scalable. Once the requirements grow beyond those boundaries, you have to reconsider the design properly, and you may have to do it from scratch.

When we talk about scalability in the crypto ecosystem, we are not looking for a solution to compete with a stupid centralized data center like the one Visa runs. Who put this in front of us as a challenge, and why?

We don't need tens of thousands of tps now or in the near future. We are speaking of tens up to a few hundred tps for now and the next few years. Neither bitcoin nor any other crypto coin is supposed to replace fiat currencies in like 5 years. Nor is the fucking Ethereum universal machine going to be used as an alternative legal system for resolving day-to-day contracts between people.

Why are people screaming about scalability or making fantasies of it? What's the catch? Making an embarrassment out of it for the community? For what? Selling their own shitcoins?

Although we are always dreaming of such days, they are not gonna happen right now, and we have to worry about real problems with real performance bottlenecks at this very moment.

Scalability for cryptocurrencies should be considered as:
The capability of a network to process transactions without unexpected delays or costs if the projected numbers and indexes of popularity and usage growth become real or moderately exceed their initial values.

For bitcoin, I suggest 100 tps is more than enough for the next 4 years. So we just need something like 10-15 times better utilization of the resources and technologies available nowadays.
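The arithmetic behind that factor, taking the commonly cited ~7 tps on-chain baseline:

Code:
current_tps, target_tps = 7, 100
print(f"required utilization improvement: ~{target_tps / current_tps:.1f}x")   # ~14.3x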

What happens next, when the network grows even larger and demand gets meaningfully higher?
Nothing! We make more assessments based on the additional data we will already have, and try to utilize more advanced technologies and sophisticated protocol enhancements to make the network even more efficient.

Next?
We don't know! We should wait for more improvements and more technological opportunities, maybe?

Simply, we should improve and evolve gradually to keep the network efficient by the criteria dictated by the market.

What does security mean in cryptocurrency?

I don't understand WTF Vitalik means by an "attacker with O(n) resources", but in crypto we have two important security measures:

1- Being resilient against a double-spend attack by 50%-1 of the resources.
2- Utilizing network resources efficiently enough to keep the above index at the same level, by avoiding orphan blocks.
With orphan blocks and unintentional forks, part of the resources is wasted, making it possible for an adversary with fewer resources to commit a double-spend attack.

A protocol is considered safe if its consensus mechanism solidly supports the first property, and its other implementation and operational parameters do not expose the network to the second danger.

Now, suppose we have our coin designed (for some reason) in a way that stale blocks or unintentional forks happen relatively more often than in a coin like bitcoin. What happens?

Firstly, the adversary may find it practical to fork intentionally for a double-spend attack using his hashpower, which is considerably less than 50%, with a good chance of winning the competition, because the network's honest majority of hashpower is already split between other competing, unintentional forks.

Then what? A double spend has been made and somebody has lost his funds/assets.
But wait ... the loss would have to be considerably high, like a very large transaction of coins, wouldn't it? And why shouldn't a user dealing with our coin wait for more confirmations before releasing his $$ assets? Of course he should, and he would wait for that.

The point is, even without being fully obsessed with preventing unintentional forks, this network should be considered safe as long as it supports the more crucial 50%-1 attack resistance deep in its protocol and thus resists long-range attacks properly.
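Waiting for more confirmations is quantifiable. Here is a sketch of the attacker catch-up probability from section 11 of the bitcoin whitepaper (this models the classic private-fork case, not the orphan-split scenario above):

Code:
import math

def attacker_success(q, z):
    """Probability an attacker with hashrate share q ever catches up
    from z blocks behind (Satoshi's whitepaper, section 11)."""
    p = 1.0 - q
    lam = z * q / p
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

for z in (1, 2, 6, 10):
    print(f"q = 30%, z = {z}: {attacker_success(0.30, z):.4f}")

The probability falls off exponentially in z, which is why "wait for more confirmations" is a sound defense for large payments.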


Concluding remarks:
This is why I call this hypothetical trilemma pure garbage. It is nothing more than a naive fallacy, a strawman flavor of fallacy to be exact:
The kid is trying to convince us we are not capable of keeping our promise of a p2p, decentralized, secure and adoptable electronic cash system because of some magical contradictions between features nobody has promised.


Vitalik Buterin is now an adult and has the right to attack our axioms, but it should be clearly understood as an attack from outside the cryptocurrency discourse.
This whole thing is about weakening the bitcoin/cryptocurrency agenda: building a better world by inventing a better monetary system.
He deliberately suggests/forges impractical and void definitions of basic features and then says it is impossible to achieve the design goals of interest.

Why? Because he is a trojan horse. He says:
"Now that I have proved (;D)  decentralization is not achievable, let's have some centralization. Me and my org being the most important centers by the way."

And I'm like
"Ok Vitalik go ruin yourself and your org and any other toys you got son. Just watch your mouth and don't mess with bitcoin or other decent cryptocurrencies. Seriously!"


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 06, 2018, 05:32:29 AM
Then why is it hard for Bitcoin to scale while simultaneously maintaining security and decentralization as the means of censorship resistance?

Did Dogecoin already solve the scaling problem? Did Litecoin?


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: amishmanish on August 06, 2018, 06:27:03 AM
What does scalability mean in cryptocurrency?

When we talk about scalability in the crypto ecosystem, we are not looking for a solution to compete with a stupid centralized data center like the one Visa runs. Who put this in front of us as a challenge, and why?

We don't need tens of thousands of tps now or in the near future. We are speaking of tens up to a few hundred tps for now and the next few years. Neither bitcoin nor any other crypto coin is supposed to replace fiat currencies in like 5 years. Nor is the fucking Ethereum universal machine going to be used as an alternative legal system for resolving day-to-day contracts between people.

Why are people screaming about scalability or making fantasies of it? What's the catch? Making an embarrassment out of it for the community? For what? Selling their own shitcoins?

Although we are always dreaming of such days, they are not gonna happen right now, and we have to worry about real problems with real performance bottlenecks at this very moment.

Scalability for cryptocurrencies should be considered as:
The capability of a network to process transactions without unexpected delays or costs if the projected numbers and indexes of popularity and usage growth become real or moderately exceed their initial values.

For bitcoin, I suggest 100 tps is more than enough for the next 4 years. So we just need something like 10-15 times better utilization of the resources and technologies available nowadays.

What happens next, when the network grows even larger and demand gets meaningfully higher?
Nothing! We make more assessments based on the additional data we will already have, and try to utilize more advanced technologies and sophisticated protocol enhancements to make the network even more efficient.

Next?
We don't know!
We should wait for more improvements and more technological opportunities, maybe?

Simply, we should improve and evolve gradually to keep the network efficient by the criteria dictated by the market.

The above argument for scaling illustrates the basic difference in approach between big blocks and small blocks. Big blockers seem to view bitcoin's demand growth as the basic motive. They want to ensure that any foreseeable demand is immediately met by a step-wise increase of the blocksize, even though it'd need a hardfork and even though there'd be a lot of unknowns.

This seems to ignore the basic ideal of keeping bitcoin within reach. Nobody should be able to take over bitcoin sheerly by the amount of resources they can put into it. @ali mentioned in one of his opening arguments the small miner with a 200K investment versus the big data center with upwards of a million dollars in investment.

There is surely a point of exit for the enthusiast between this 200K and 1 million, where he'll no longer be able to afford to run a mining node. Apart from the hardware cost, the access to crazy-fast internet needed to sync, say, a 32 MB blocksize further shrinks the set of people who can contribute meaningful hashing power. For bitcoin to remain in the domain of the enthusiast and to remain usable for everyone, block size is probably the only thing that is under this community's control. Once it increases without an upper bound, bitcoin would still function, but the number of people who can still afford it would drastically shrink.

Maybe it is time we had a proper study on this. Is it possible to check the details of who is connected to a pool and the respective hashpower of such connections? We have some 9,500 nodes live as of now. How many of them are mining nodes, and what percentage of them are at the enthusiast level?


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 06, 2018, 02:25:05 PM
Then why is it hard for Bitcoin to scale while simultaneously maintaining security and decentralization as the means of censorship resistance?


One can always point out that the performance of a consensus-based decentralized system is limited compared to a foolish centralized transaction processor. Not a big discovery. Decentralization makes synchronizing the ledger a bottleneck.

Satoshi Nakamoto, by inventing and introducing bitcoin, implicitly claimed the feasibility of a system acceptable by both decentralization and performance measures, and it proved to be right until the congestion and transaction backlog of late 2017, which is imprecisely presumed to be a scaling crisis.

The main factor behind that crisis was the pointless block size debate that, in spite of obvious growth in market demand, delayed the activation of SegWit, which was capable of improving performance up to 2.5 times without putting anything in danger.

A moderate increase in block size or decrease in block time was capable of improving performance as well, again without any bad side effects on decentralization or security, contrary to what the Core guys insist on maintaining.

After BIP 152 and, more recently, FIBRE and other dedicated network relay protocols and services, a very wide range of opportunities for fine-tuning both the operational parameters and the protocol itself is on the horizon.

Combined with communication and processing technologies, the above opportunities would help improve performance by something like 10 times, again without any bad side effect on the decentralization or security of the network, if governance problems do not lead us to the same misery as the infamous block size debate.

Up to here, performance of something like 50 tps is guaranteed, which suffices for us to relax and decently investigate even more radical approaches like sharding proposals, and to define another 2-4x improvement over the next couple of years, passing 100 tps, which I believe is enough for the current state of both market demand and technology development.
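The compounding being described, using his own rough factors (these are his estimates, not measurements):

Code:
base_tps = 5                        # rough current on-chain throughput, assumed
relay_and_tuning = 10               # estimated gain from parameter/relay-protocol improvements
radical_low, radical_high = 2, 4    # the 2-4x range for sharding-style changes

print(base_tps * relay_and_tuning)                   # ~50 tps
print(base_tps * relay_and_tuning * radical_low)     # ~100 tps
print(base_tps * relay_and_tuning * radical_high)    # ~200 tps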

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

What does scalability mean in cryptocurrency?
....
Scalability for cryptocurrencies should be considered as:
The capability of a network to process transactions without unexpected delays or costs if the projected numbers and indexes of popularity and usage growth become real or moderately exceed their initial values.
.....

The above argument for scaling illustrates the basic difference in approach between big blocks and small blocks. Big blockers seem to view bitcoin's demand growth as the basic motive. They want to ensure that any foreseeable demand is immediately met by a step-wise increase of the blocksize, even though it'd need a hardfork and even though there'd be a lot of unknowns.

This seems to ignore the basic ideal of keeping bitcoin within reach. Nobody should be able to take over bitcoin sheerly by the amount of resources they can put into it. @ali mentioned in one of his opening arguments the small miner with a 200K investment versus the big data center with upwards of a million dollars in investment.

There is surely a point of exit for the enthusiast between this 200K and 1 million, where he'll no longer be able to afford to run a mining node. Apart from the hardware cost, the access to crazy-fast internet needed to sync, say, a 32 MB blocksize further shrinks the set of people who can contribute meaningful hashing power. For bitcoin to remain in the domain of the enthusiast and to remain usable for everyone, block size is probably the only thing that is under this community's control. Once it increases without an upper bound, bitcoin would still function, but the number of people who can still afford it would drastically shrink.

Maybe it is time we had a proper study on this. Is it possible to check the details of who is connected to a pool and the respective hashpower of such connections? We have some 9,500 nodes live as of now. How many of them are mining nodes, and what percentage of them are at the enthusiast level?

I admit that my personal judgment regarding the block size debate is somewhat biased in favor of big blockers rather than Core, though I'm not truly a big blocker  ;)

First of all, I recommend a block time decrease instead of a block size increase, and secondly, I don't believe either approach may be considered a scaling solution; just a moderate, temporary improvement which cannot be repeated recklessly, because of the centralization and security consequences (Vitalik made his trilemma joke about such naive proposals for scaling, I suppose).

Especially, the Core agenda of projecting the performance problem onto off-chain and 2nd-layer services like LN is what I'm strongly against. It is the most dangerous threat to bitcoin imo, because the team in charge of maintaining the network is giving up on improving the protocol and focusing on semi-centralized 2nd-layer solutions.


As for the centralization of mining, I have recently published a complete proposal and an independent analysis of pooling pressure in bitcoin. You are welcome to check both:

An analysis of Mining Variance and Proximity Premium flaws in Bitcoin (https://bitcointalk.org/index.php?topic=4687032.0)

Getting rid of pools: Proof of Collaborative Work (https://bitcointalk.org/index.php?topic=4438334.0)







Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 07, 2018, 06:37:45 AM
Quote
What does scalability mean in cryptocurrency?

When we talk about scalability in the crypto ecosystem, we are not looking for a solution to compete with a stupid centralized data center like the one Visa runs. Who put this in front of us as a challenge, and why?

We don't need tens of thousands of tps now or in the near future. We are speaking of tens up to a few hundred tps for now and the next few years. Neither bitcoin nor any other crypto coin is supposed to replace fiat currencies in like 5 years.

I agree with you, and I believe the Core developers also agree. There is no hurry. Why pressure the developers into a dangerous hard fork to bigger blocks, or into your proposal of reduced block times?

Quote
Nor is the fucking Ethereum universal machine going to be used as an alternative legal system for resolving day-to-day contracts between people.

Apples and oranges if used as an example for Bitcoin.

Quote
Why are people screaming about scalability or making fantasies of it? What's the catch? Making an embarrassment out of it for the community? For what? Selling their own shitcoins?

Yes, who's in a hurry? The Core developers are very conservative, and they are also taking their time so as not to break anything.

Quote
Although we are always dreaming of such days, they are not gonna happen right now, and we have to worry about real problems with real performance bottlenecks at this very moment.

Yes, why are we arguing? Hahaha.

Quote
Scalability for cryptocurrencies should be understood as:
the capability of a network to process transactions without unexpected delays or costs if the projected numbers and indexes of popularity and usage growth become real or moderately exceed their initial values.

For bitcoin, I suggest 100 tps is more than enough for the next 4 years. So, we just need something like 10-15 times better utilization of the resources and technologies available today.

Or a reliable layered protocol capable of off-chain transactions. Why not? It maintains security and decentralization at the base.

Quote
What happens next, when the network grows even larger and demand gets meaningfully higher?
Nothing! We make more assessments based on the additional data we will have by then, and try to utilize more advanced technologies and sophisticated protocol enhancements to make the network even more efficient.

What are they? Can your crystal ball see 20 years into the future?

Quote
Next?
We don't know!
We should wait for more improvements and more technological opportunities, maybe?

Simply, we should improve and evolve gradually to keep the network efficient according to the criteria dictated by the market.

But meanwhile the developers should be very careful not to break the network.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 07, 2018, 12:03:25 PM

... a reliable layered protocol capable of off-chain transactions. Why not? It maintains security and decentralization at the base.
Easy ... heh?
Don't touch 'the base'; just build services on top of it!

Everybody would be happy, bitcoin remains solid, and people have their services.
Sounds familiar ...

Quote
Quote
Simply, we should improve and evolve gradually to keep the network efficient according to the criteria dictated by the market.
But meanwhile the developers should be very careful not to break the network.

Spreading FUD about mysterious huge risks of 'breaking' the network through improvement and evolution is an important part of Core's strategy. In the block-size debate they played this card extensively and unjustifiably, and they keep using it over and over.

According to you and Core (and Vitalik Buterin), bitcoin has reached its limits, so any attempt to make it better, especially in terms of performance, will put it in danger. Proposals for such improvements face a dozen accusations of being reckless, insecure, ... and of weakening decentralization, of course.

This situation is unacceptable.
An idea in its first stages needs support and help to become stronger and more sophisticated, to mature. Instead, what we have is an anti-evolutionary atmosphere with a harsh and aggressive attitude, biased against any form of change in the name of conservatism.

It is what I've labeled the bitcoin ice age, and we are in its midst.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: amishmanish on August 07, 2018, 12:56:51 PM
Spreading FUD about mysterious huge risks of 'breaking' the network through improvement and evolution is an important part of Core's strategy. In the block-size debate they played this card extensively and unjustifiably, and they keep using it over and over.

According to you and Core (and Vitalik Buterin), bitcoin has reached its limits, so any attempt to make it better, especially in terms of performance, will put it in danger. Proposals for such improvements face a dozen accusations of being reckless, insecure, ... and of weakening decentralization, of course.

This situation is unacceptable.
An idea in its first stages needs support and help to become stronger and more sophisticated, to mature. Instead, what we have is an anti-evolutionary atmosphere with a harsh and aggressive attitude, biased against any form of change in the name of conservatism.

It is what I've labeled the bitcoin ice age, and we are in its midst.

You don't have to stoop to slander to make your point.

On one hand, your ideas on "winner takes all" vs "shared effort" sound reasonable; on the other, you seem to be a big fan of conspiracy theories. Why can't you understand that this is basically a difference between ideology-driven and business-driven thinking?
More than enough people above have suggested that you focus on the merits of your argument rather than make unnecessary accusations against the Core devs. There have been enough people and their alts doing that on the forum. Why would you want the live system to be a running experiment for your ideas? You could surely do this far less belligerently and far more collaboratively.

Why don't you implement your ideas on a testnet? People will notice. If you have posted code or results on IRC or somewhere else with suitable peers, a link would be welcome.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 07, 2018, 03:29:17 PM
@amishmanish

Unlike what you suggest, more people have correctly reminded me of political obstacles than technical ones. I've been around for a while; I understand who is who in bitcoin and which frontiers are worth fighting on.

Buterin's trilemma is one example, and Core's exaggerations about the fragility and sensitivity of the bitcoin protocol/software, which prohibit any 'reckless' manipulation, are another. We should get rid of such superstitious pretexts to create a fresh and dynamic development environment, because we need contributions.

I'm not slandering anybody. It would be a sign of arrogance if these guys took my criticism as an offense.

They are great developers, I admit. I always find @gmaxwell's technical assessments very useful and inspiring, but to get this guy focused on the technical problem under consideration, you have to convince him to put his pretexts and prejudgments away, at least for a moment, and remain open to new ideas and proposals.

Protesting against extreme conservatism and asking for a more open-minded team in charge of bitcoin development is not a crime. Nor should opposing the idea of leaving bitcoin unchanged and sticking with second-layer projects be considered an act of slander.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 07, 2018, 04:53:41 PM
You have talked about how we need 100 tps right now or in the near future, which would require a blocksize increase, and even mentioned blocktime modifications and other things that require a hard fork.

You still have not explained how the hard fork is done without causing a big mess, with different factions dumping tons of coins on each other after the split.

And you are assuming that if Core (by Core, I guess we mean the most well-known developers?) agreed with a blocksize increase, then we would have smooth sailing to a 2MB hard fork or something, which seems delusional to me.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 07, 2018, 09:45:13 PM
@cellard

No block size increase is required or recommended for 100 tps:

Step 1: Decrease block time immediately to 2.5 minutes.
It gives us roughly 60 tps of throughput, with considerable positive decentralization consequences and without any security drawbacks (unlike what Buterin claims with his stupid trilemma).

Step 2: Improve the relay network. Greater support should be available for advanced communication features like unsolicited push messages. Full nodes connected to mining facilities should have access to parallel, dedicated channels to push/pull critical messages with the fewest hops. The possibility of merging in protocols like FIBRE should be studied.

Step 3: Decrease block time even further, down to 90 seconds, and touch the 100 tps mark.
The main challenge would be the centralized situation with pools, which potentially lets them commit selfish-mining-like misbehavior. To mitigate this, non-mining honest full nodes should participate in maintaining a back-up channel on which they push critical unsolicited new_block_found messages as soon as they find the block header valid.
I think a clean implementation can keep propagation delay below 2 seconds and the longest distance below 4 hops. That yields a worst-case progress of around 6 seconds.
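To put rough numbers on these throughput claims, here is a back-of-the-envelope sketch (the effective block capacity and average transaction size are assumptions, not measurements):

Code:
# Approximate tps as a function of block time.
# Assumptions: ~2 MB of effective SegWit block space,
# ~250 bytes per average transaction.
EFFECTIVE_BLOCK_BYTES = 2_000_000
AVG_TX_BYTES = 250

def tps(block_time_s: float) -> float:
    return EFFECTIVE_BLOCK_BYTES / AVG_TX_BYTES / block_time_s

for label, t in [("10 min (today)", 600), ("2.5 min (step 1)", 150), ("90 s (step 3)", 90)]:
    print(f"{label}: ~{tps(t):.0f} tps")
# -> ~13 tps today, ~53 tps at 2.5 min, ~89 tps at 90 s;
#    the same ballpark as the 60 and 100 tps figures above.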



As for your eternal concerns about community consensus on change:

This discussion is about how feasible it is to improve bitcoin throughput without putting the network in danger of centralization or security risks, contrary to what PoW/cryptocurrency opponents, along with off-chain/second-layer proponents, claim.

For a real, practical plan, I'd start with PoCW and eliminating pools in the first place.
I'm not personally recommending any such optimizations in the current framework of bitcoin PoW, which is based on a winner-takes-all approach. So, my plan includes eliminating pools from the ecosystem as well as optimizing performance.

So, my agenda is more sophisticated than just a few performance optimizations, and harder to accomplish, but that is a completely off-topic subject.




Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 08, 2018, 05:18:02 AM
@cellard

No block size increase is required or recommended for 100 tps:

Step 1: Decrease block time immediately to 2.5 minutes.
It gives us roughly 60 tps of throughput, with considerable positive decentralization consequences and without any security drawbacks (unlike what Buterin claims with his stupid trilemma).

Then prove your theories by running a testnet, as amishmanish said.

How many minutes between blocks does Dogecoin have? I believe it is close to your 2.5 minute target. Do you believe it solved Bitcoin's scaling problem?

Quote
Step 2: Improve the relay network. Greater support should be available for advanced communication features like unsolicited push messages. Full nodes connected to mining facilities should have access to parallel, dedicated channels to push/pull critical messages with the fewest hops. The possibility of merging in protocols like FIBRE should be studied.

OK. I cannot comment, but I hope someone will, in connection with "Step 3".

Quote
Step 3: Decrease block time even further, down to 90 seconds, and touch the 100 tps mark.
The main challenge would be the centralized situation with pools, which potentially lets them commit selfish-mining-like misbehavior. To mitigate this, non-mining honest full nodes should participate in maintaining a back-up channel on which they push critical unsolicited new_block_found messages as soon as they find the block header valid.
I think a clean implementation can keep propagation delay below 2 seconds and the longest distance below 4 hops. That yields a worst-case progress of around 6 seconds.

You "think" it will happen that way in theory, but everyone does not have access to the same speed of bandwidth.

Quote
As for your eternal concerns about community consensus on change:

This discussion is about how feasible it is to improve bitcoin throughput without putting the network in danger of centralization or security risks, contrary to what PoW/cryptocurrency opponents, along with off-chain/second-layer proponents, claim.

For a real, practical plan, I'd start with PoCW and eliminating pools in the first place.
I'm not personally recommending any such optimizations in the current framework of bitcoin PoW, which is based on a winner-takes-all approach. So, my plan includes eliminating pools from the ecosystem as well as optimizing performance.

So, my agenda is more sophisticated than just a few performance optimizations, and harder to accomplish, but that is a completely off-topic subject.

Have you made a proposal, or could you tell everyone about it on the Bitcoin mailing list?


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Evil-Knievel on August 08, 2018, 06:38:38 AM
Quote
Step 3: Decrease block time even further, down to 90 seconds, and touch the 100 tps mark.
The main challenge would be the centralized situation with pools, which potentially lets them commit selfish-mining-like misbehavior. To mitigate this, non-mining honest full nodes should participate in maintaining a back-up channel on which they push critical unsolicited new_block_found messages as soon as they find the block header valid.
I think a clean implementation can keep propagation delay below 2 seconds and the longest distance below 4 hops. That yields a worst-case progress of around 6 seconds.

You "think" it will happen that way in theory, but everyone does not have access to the same speed of bandwidth.

That 4-hop idea won't work! Given a degree (maximum connections) d and a maximum diameter (hop count) k, constructing a graph of maximum size that matches these properties is called the "degree diameter problem", and it has been around for ages. Funnily enough, research has shown that such graphs have an upper bound on the number of vertices (nodes), called the "Moore bound".

Either you restrict the Bitcoin network to a small number of participants (fewer than the Moore bound), or you accept the fact that the diameter of a scalable peer-to-peer network (that is, one without infinitely long routing tables or connection lists) will always depend on the number of nodes itself, increasing as the network grows.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 08, 2018, 09:33:37 AM
Quote
Step 3: Decrease block time even further, down to 90 seconds, and touch the 100 tps mark.
The main challenge would be the centralized situation with pools, which potentially lets them commit selfish-mining-like misbehavior. To mitigate this, non-mining honest full nodes should participate in maintaining a back-up channel on which they push critical unsolicited new_block_found messages as soon as they find the block header valid.
I think a clean implementation can keep propagation delay below 2 seconds and the longest distance below 4 hops. That yields a worst-case progress of around 6 seconds.

You "think" it will happen that way in theory, but everyone does not have access to the same speed of bandwidth.

That 4-hop idea won't work! Given a degree (maximum connections) d and a maximum diameter (hop count) k, constructing a graph of maximum size that matches these properties is called the "degree diameter problem", and it has been around for ages. Funnily enough, research has shown that such graphs have an upper bound on the number of vertices (nodes), called the "Moore bound".

Either you restrict the Bitcoin network to a small number of participants (fewer than the Moore bound), or you accept the fact that the diameter of a scalable peer-to-peer network (that is, one without infinitely long routing tables or connection lists) will always depend on the number of nodes itself, increasing as the network grows.
Actually, the Moore bound is the maximum number of vertices for a given diameter (maximum distance) and degree, which is proved to be:

M(d,k) = 1 + d + d(d-1) + d(d-1)^2 + ... + d(d-1)^(k-1)

For a degree (d) of 100 (100 peers) and a diameter (k) of 4 hops, the upper bound is about 100,000,000 vertices!

Of course, to fit that many nodes into such a network we would need a very restricted topology, which is infeasible for a permissionless network. But the point is that the possibilities are wide enough. For instance, I could imagine high-speed relay-only loops of nodes, with a cumulative count in the tens of thousands of peers, that compensate for topological imperfection.
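A quick sanity check of that figure (a minimal sketch; d=100 and k=4 are just the example values from above):

Code:
# Moore bound: the maximum number of vertices in a graph with
# maximum degree d and diameter k.
def moore_bound(d: int, k: int) -> int:
    # 1 + d + d(d-1) + d(d-1)^2 + ... + d(d-1)^(k-1)
    return 1 + d * sum((d - 1) ** i for i in range(k))

print(moore_bound(100, 4))  # 98020001, i.e. roughly 100 million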



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Evil-Knievel on August 08, 2018, 12:43:34 PM
Sure, when you scale up the number of connections each node has to have, you can always ensure very low hop counts.
Clearly, this doesn't apply to random graphs (like the one we have in Bitcoin), so you would need some kind of fancy management overhead that constructs those special graphs in a decentralized way. Also, you would have to check what effects those 100+ connections per node actually have: are there any new attack surfaces regarding DoS or net splits? Are slow nodes doomed to fail because they suddenly have to process and rebroadcast all that crap from the other 100 nodes?

I am sure it is doable somehow, but whoever tried to implement a proper, resilient Chord P2P network probably figured out quickly that it's virtually impossible to properly handle all the different side cases. For this type of graph, I bet the task would be even harder. Probably the overhead of maintaining the graph structure would become the limiting factor at some point.

But I am very interested in hearing your ideas, if you have some cool solution to such issues. I could never figure it out.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 08, 2018, 02:14:27 PM
@cellard

No block size increase is required or recommended for 100 tps:

Step 1: Decrease block time immediately to 2.5 minutes.
It gives us roughly 60 tps of throughput, with considerable positive decentralization consequences and without any security drawbacks (unlike what Buterin claims with his stupid trilemma).

Step 2: Improve the relay network. Greater support should be available for advanced communication features like unsolicited push messages. Full nodes connected to mining facilities should have access to parallel, dedicated channels to push/pull critical messages with the fewest hops. The possibility of merging in protocols like FIBRE should be studied.

Step 3: Decrease block time even further, down to 90 seconds, and touch the 100 tps mark.
The main challenge would be the centralized situation with pools, which potentially lets them commit selfish-mining-like misbehavior. To mitigate this, non-mining honest full nodes should participate in maintaining a back-up channel on which they push critical unsolicited new_block_found messages as soon as they find the block header valid.
I think a clean implementation can keep propagation delay below 2 seconds and the longest distance below 4 hops. That yields a worst-case progress of around 6 seconds.



As for your eternal concerns about community consensus on change:

This discussion is about how feasible it is to improve bitcoin throughput without putting the network in danger of centralization or security risks, contrary to what PoW/cryptocurrency opponents, along with off-chain/second-layer proponents, claim.

For a real, practical plan, I'd start with PoCW and eliminating pools in the first place.
I'm not personally recommending any such optimizations in the current framework of bitcoin PoW, which is based on a winner-takes-all approach. So, my plan includes eliminating pools from the ecosystem as well as optimizing performance.

So, my agenda is more sophisticated than just a few performance optimizations, and harder to accomplish, but that is a completely off-topic subject.





You must first prove empirically that your ideas will not result in a clusterfuck when put into practice. You need some kind of test model to try out how it would work and to gather data. At least in BCash, Craig Wright, Rizun and co. seem to be spending resources running models in which they can gather data on what massive blocksizes would look like.

"t0 2.5 min then at t1 90s" sounds a bit reckless.

Richard Heart explains here why touching blocktime may not be a good idea:

https://www.youtube.com/watch?time_continue=2940&v=iFJ2MZ3KciQ

Again, you will need a lot more than that to convince people to hard fork.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: amishmanish on August 08, 2018, 04:53:11 PM
@amishmanish
Unlike what you suggest, more people have correctly reminded me of political obstacles than technical ones. I've been around for a while; I understand who is who in bitcoin and which frontiers are worth fighting on.

Buterin's trilemma is one example, and Core's exaggerations about the fragility and sensitivity of the bitcoin protocol/software, which prohibit any 'reckless' manipulation, are another. We should get rid of such superstitious pretexts to create a fresh and dynamic development environment, because we need contributions.
Apart from block-size and block-time changes, your other proposition (from another thread) is a radical change to the PoW algorithm. I am not someone who understands the game theory, graph theory, and associated higher mathematics needed to judge the decentralization aspects of a mining algorithm. After much reading and thought, I came to the understanding that PoW works. The history of its development shows that various "Proof-of-X" ideas were tried but failed to pass the trustlessness criterion. That made me decide that bitcoin is my best bet. I am sure the whales invested in this are also aiming for stability, not radical experimentation.

If the obstacle is not technical but political, then you need to generate enough information to help people decide why a change in algorithm or block times could be feasible. This needs code, testing, and evidence. Hence, a testnet. It will be far easier to convince people that way. Or maybe you could point us to a peer-reviewed discussion of these topics.

As I admitted above, there are technical aspects I don't understand yet. What I do understand is that Bitcoin's PoW works and has been working for the past decade. What I know is that a whole hardware manufacturing industry supports it with increasing hashpower, and that hashpower is what keeps it valuable. I understand that mining has a centralization aspect, but I also know that the entities involved will be wary of risking a self-goal by targeting the very thing that keeps them rich. As a believer in bitcoin's ideals, I hope and cheer for news that mining hardware may soon see the entry of other major players, leading to some resolution of the issue you wish to address.

Changing the mining algorithm so as to wreak uncertainty on miners worldwide would be disruptive in the bad sense, not the good one. How disruptive the network-specification changes you propose here would be should be demonstrable on a testnet. :-\

I'm not slandering anybody. It would be a sign of arrogance if these guys took my criticism as an offense.
A lot of people, including Greg Maxwell, have pointed out that any technical discussion here at BCT has lost significance because of the way it quickly gets personal/political. Here's what you said in your post:
Spreading FUD about mysterious huge risks of 'breaking' the network through improvement and evolution is an important part of Core's strategy.
In my opinion, accusing someone of spreading FUD as part of their 'strategy' is not technical criticism (which is what you should focus on, and which will no doubt be welcomed). You cannot call them FUDders and dogma-ridden and expect them not to think you have an ulterior agenda. There have been too many of those out here. ¯\_(ツ)_/¯

Protesting against extreme conservatism and asking for a more open-minded team in charge of bitcoin development is not a crime. Nor should opposing the idea of leaving bitcoin unchanged and sticking with second-layer projects be considered an act of slander.
Many people don't view it as extreme conservatism. It's accepting that you don't know what you don't know. There are enough examples of lower blocktimes and bigger blocksizes, and there is nothing to suggest they won't eventually face bottlenecks. There is the GitHub way to do it, and it says it well.
Quote
Testing and code review is the bottleneck for development; we get more pull requests than we can review and test on short notice. Please be patient and help out by testing other people's pull requests, and remember this is a security-critical project where any mistake might cost people lots of money.
With that, I'll bow out of this discussion and leave it to you to point us to information on why what you say should work. Maybe some of that material you have been promising for a while. It will be good learning for us.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 08, 2018, 05:13:52 PM
Sure, when you scale up the number of connections each node has to have, you can always ensure very low hop counts.
Clearly, this doesn't apply to random graphs (like the one we have in Bitcoin), so you would need some kind of fancy management overhead that constructs those special graphs in a decentralized way. Also, you would have to check what effects those 100+ connections per node actually have: are there any new attack surfaces regarding DoS or net splits? Are slow nodes doomed to fail because they suddenly have to process and rebroadcast all that crap from the other 100 nodes?
So, no Moore bound problem, OK?

The bitcoin relay network consists of two different classes of full nodes: non-mining nodes (wallets, browsers, etc.) and mining-related nodes.

By mining-related I mean full nodes that are directly connected to a mining facility (pools, farms, etc.). Currently, with pools dominating the mining scene, I believe the number of mining nodes is extremely low (probably below 2k). Most of these nodes belong to farms/pools: operations with hundreds of thousands to millions of dollars invested, which have the incentive and the resources to do whatever they find necessary to optimize their presence on the network.

IOW, having a very compact graph of mining full nodes with a 4-hop diameter seems to be a trivial job in the current bitcoin situation with pools. Actually, miners already do it, using simple strategies like deliberately peering with each other or, more effectively, joining FIBRE and the like.

But my target is more ambitious: I want to free miners from pool slavery by neutralizing pooling pressure. So I have to solve a much more difficult problem: a network of at least 100,000 mining nodes constrained by the same 4-hop diameter condition. 8)

For this, we would need some minor improvements to bitcoin's networking module on one hand, and the promotion of independent relay networks like FIBRE, which provide a backbone service for near-instantaneous message relay, on the other. I don't think we have to worry about centralization or trust problems with such networks, because nodes always have the legacy bitcoin relay network as a backup and as a reference witness to evaluate the service, both in terms of validity and speed, and can easily abandon unfaithful services.

Quote
I am sure it is doable somehow, but whoever tried to implement a proper, resilient Chord P2P network probably figured out quickly that it's virtually impossible to properly handle all the different side cases. For this type of graph, I bet the task would be even harder. Probably the overhead of maintaining the graph structure would become the limiting factor at some point.
I don't think the Chord protocol is necessary here. Just imagine we have a handful of commercial services, each consisting of a complete graph of 20-50 non-mining full nodes that trust each other and are optimized to handle hundreds of connected full nodes. Miners would pay for the service and, besides the legacy bitcoin p2p network, would join such services, probably through a load-balancing mechanism ...
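To get a feel for whether a 4-hop diameter is even plausible at that scale, here is a rough sketch (the Moore bound is a theoretical optimum; a real permissionless topology would need a considerably higher degree, so treat this as a loose lower bound on peer count):

Code:
# Smallest degree d whose Moore bound covers n nodes at diameter k.
def moore_bound(d: int, k: int) -> int:
    return 1 + d * sum((d - 1) ** i for i in range(k))

def min_degree(n: int, k: int) -> int:
    d = 2
    while moore_bound(d, k) < n:
        d += 1
    return d

print(min_degree(100_000, 4))  # -> 19: in an ideal topology, ~19 peers
                               # per node could cover 100k mining nodes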



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 08, 2018, 05:43:11 PM

"t0 2.5 min then at t1 90s" sounds a bit reckless.
I don't see anything reckless here:
In the current mining situation, dominated by pools, there are no worries about network diameter; they have already resolved it. And in my proposed pool-free protocol, it would be taken care of properly.

 
Quote
Richard Heart explains here why touching blocktime may not be a good idea:

https://www.youtube.com/watch?time_continue=2940&v=iFJ2MZ3KciQ

Heart has basically diverged from bitcoin's original idea: a p2p electronic cash system. He is a bitcoin-as-gold guy. Don't bother listening to him.

We need bitcoin as the monetary system of the future. Banks should back off; any other proposal is void, imo.

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------


Apart from block-size and block-time changes, your other proposition (from another thread) is a radical change to the PoW algorithm. ...

Changing the mining algorithm so as to wreak uncertainty on miners worldwide would be disruptive in the bad sense, not the good one. How disruptive the network-specification changes you propose here would be should be demonstrable on a testnet. :-\

My PoCW proposal keeps SHA256 unchanged; the only disruptive aspect is eliminating pooling pressure and the need for pools (not the possibility of them).

Quote
I'm not slandering anybody. It would be a sign of arrogance if these guys took my criticism as an offense.
A lot of people, including Greg Maxwell, have pointed out that any technical discussion here at BCT has lost significance because of the way it quickly gets personal/political. Here's what you said in your post:
Spreading FUD about mysterious huge risks of 'breaking' the network through improvement and evolution is an important part of Core's strategy.
In my opinion, accusing someone of spreading FUD as part of their 'strategy' is not technical criticism (which is what you should focus on, and which will no doubt be welcomed). You cannot call them FUDders and dogma-ridden and expect them not to think you have an ulterior agenda. There have been too many of those out here. ¯\_(ツ)_/¯
No, I don't agree. Spreading FUD about a mysterious fragility in bitcoin and framing every serious improvement proposal as a potentially dangerous move is a regular practice of these guys and should be denounced properly. Nobody has the privilege or diplomatic immunity to act like that and be exempt from criticism.

Plus, I think it is very normal in a technical discussion to accuse the other side of such misconduct when they are just handwaving about mysterious threats and consequences.

For instance, the whole Buterin trilemma thing is FUD that pushes average crypto enthusiasts and users to hopelessly give up on having a decentralized, secure, and well-performing system, and instead accept his weird proposals about PoS and weak subjectivity (can you believe it? 'weak' shittiness).

By analogy, when someone claims that reducing block time by a factor of 2-4 in a fully decentralized mining scene will lead to centralization, he is spreading FUD, simply because there is no evidence or technical proof supporting such a claim. It is just FUD.

Quote
Protesting against extreme conservatism and asking for a more open-minded team in charge of bitcoin development is not a crime. Nor should opposing the idea of leaving bitcoin unchanged and sticking with second-layer projects be considered an act of slander.
Many people don't view it as extreme conservatism. It's accepting that you don't know what you don't know. ...
When it comes to protocols we design and implement, there is almost nothing that we don't know.





Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 10, 2018, 05:15:29 AM
Good news for you, aliashraf. The "real Bitcoin Core", BTCC, https://thebitcoincore.org/, will go through a hard fork to one-minute block times.

I believe it will be on Thursday next week. You can contribute by reviewing their code for bugs. It should be a good experiment for their network. 8)



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 10, 2018, 10:54:42 AM
@Wind_FURY,
This is the first time I'm hearing about this fork :D

Just check their website ... :-\

Too many forks out there, many of them with obviously poisonous incentives, many others with little or zero history of discussion and theoretical effort, ...

My proposals, both the block time decrease and collaborative work, are meant to improve the current bitcoin network at the scale it is experiencing. I don't understand why a coin/fork with a few million dollars of market cap and a few petahashes of mining power would need any kind of improvement, or how useful it would be as a testbed for such improvements.

So, thanks but no, I'm not interested.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 10, 2018, 12:36:37 PM

Heart has basically diverged from bitcoin's original idea: a p2p electronic cash system. He is a bitcoin-as-gold guy. Don't bother listening to him.

We need bitcoin as the monetary system of the future. Banks should back off; any other proposal is void, imo.

Of course we ideally want both: a store of value and a way to make cheap, fast transactions. But can we have both with no tradeoffs? So far, that seems impossible to me.

And Bitcoin already works well as digital gold. What you are proposing is: "let's try my untested ideas by changing bitcoin, at the risk of ruining the digital-gold property that already works, in order to see if my ideas actually turn out great in real life".

Like I said, I'm not seeing any numbers, tests, or research. You need a testnet to gather some data with your modifications. I don't think you will get much support for your hard fork proposal otherwise.

Something like this for starters:

https://cdn-images-1.medium.com/max/778/1*hcneYIB5K0XUbI7F48hosw.png



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on August 10, 2018, 02:08:19 PM
This is the first time I'm hearing about this fork :D

Just check their website ... :-\

Too many forks out there, many of them with obviously poisonous incentives, many others with little or zero history of discussion and theoretical effort, ...

My proposals, both the block time decrease and collaborative work, are meant to improve the current bitcoin network at the scale it is experiencing. I don't understand why a coin/fork with a few million dollars of market cap and a few petahashes of mining power would need any kind of improvement, or how useful it would be as a testbed for such improvements.

So, thanks but no, I'm not interested.

At least they're willing to actually code it and put their idea into practice, which is seemingly more than you're willing or able to do. Get it coded, put it on a testnet, and show everyone that it works without sacrificing any of the qualities the current userbase sees value in. Until then, thanks but no, we're not interested.

I've pretty much given up my pursuit of an adaptive/dynamic blocksize. I used to think like you do. I couldn't see a good reason why we shouldn't be pursuing what seemed like a really good idea. But I can't code it, and no one else is volunteering to, which means it's clearly not happening. Consider that the same is happening with your ideas right now. You can't force these things. Either people get on board, or they don't.



Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 10, 2018, 04:15:04 PM
This is the first time I'm hearing about this fork :D

Just check their website ... :-\

Too many forks out there, many of them with obviously poisonous incentives, many others with little or zero history of discussion and theoretical effort, ...

My proposals, both the block time decrease and collaborative work, are meant to improve the current bitcoin network at the scale it is experiencing. I don't understand why a coin/fork with a few million dollars of market cap and a few petahashes of mining power would need any kind of improvement, or how useful it would be as a testbed for such improvements.

So, thanks but no, I'm not interested.

At least they're willing to actually code it and put their idea into practice, which is seemingly more than you're willing or able to do. Get it coded, put it on a testnet, and show everyone that it works without sacrificing any of the qualities the current userbase sees value in. Until then, thanks but no, we're not interested.

I've pretty much given up my pursuit of an adaptive/dynamic blocksize. I used to think like you do. I couldn't see a good reason why we shouldn't be pursuing what seemed like a really good idea. But I can't code it, and no one else is volunteering to, which means it's clearly not happening. Consider that the same is happening with your ideas right now. You can't force these things. Either people get on board, or they don't.


That is not how it works. Improvement proposals have to be discussed and refined before being coded. You are not interested in discussing ideas? OK then, don't participate; just wait for the alpha/beta/release versions, no problem.

As for my PoCW project, I need a LOT of contributions, for which, thanks to you and guys like you, I have gotten nothing but discouraging comments and FUD about how "dangerous" touching bitcoin is.

I'm not a shit-fork guy, nor a scammy ICO/start-up one, so I continue running my campaign to convince more people of the feasibility of a better bitcoin. By better, I mean better at accomplishing its mission as an alternative monetary system: decentralized, secure, and fast.

For this, I have already proposed an innovative, original idea for replacing bitcoin's winner-takes-all with a contributor-takes-share approach, which can neutralize pooling pressure and eliminate pools from the ecosystem forever, plus a series of complementary improvements like decreasing block time, improving the transaction format, and more.

Coding is in progress, and I will keep you informed ;)


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: cellard on August 10, 2018, 04:32:47 PM
I'm up for discussing the game theory involved in the different proposals, and I'm not discouraging you from continuing to work on it; I hope we can finally see the code eventually.

Nonetheless, we've discussed Richard Heart's claims on blocksize and blocktime. You said he was a "bitcoin as digital gold" charlatan and to be ignored.

I claim that this is a mistake. Most "bitcoin as digital gold" guys are the biggest whales in Bitcoin; in other words, these are the main guys you must convince, since in any hard fork people vote with their coins at the end of the day.

My point is that the "bitcoin as digital gold" guys are not going to give you an inch of space for the tradeoffs involved in your proposal. They are usually extremely conservative, consider Bitcoin good enough for their needs, and don't care if everyone else cannot afford transactions or if it's too slow or whatnot.

So yeah, don't ignore their opinion or your hard fork will get dumped hard.


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on August 10, 2018, 04:34:36 PM
That is not how it works. Improvement proposals have to be discussed and refined before being coded. You are not interested in discussing ideas? OK then, don't participate; just wait for the alpha/beta/release versions, no problem.

As for my PoCW project, I need a LOT of contributions, for which, thanks to you and guys like you, I have gotten nothing but discouraging comments and FUD about how "dangerous" touching bitcoin is.

You only seem interested in having a "discussion" up to the point where someone raises a concern; then you just start talking over them. Case in point:

Sure, when you scale up the number of connections each node has to have, you can always ensure very low hop counts.
Clearly, this doesn't apply to random graphs (like the one we have in Bitcoin), so you would need some kind of fancy management overhead that constructs those special graphs in a decentralized way. Also, you would have to check what effects those 100+ connections per node actually have: are there any new attack surfaces regarding DoS or net splits? Are slow nodes doomed to fail because they suddenly have to process and rebroadcast all that crap from the other 100 nodes?
So, no Moore bound problem, OK?

<snip>

"Yeah, that's just not a problem, so la-la-la-I'm-not-listening-la-la-la".  You don't want a discussion, you want people to blindly follow you without question as though what you're saying is the only way to move forwards.  You aren't looking for ways to overcome or work around any issues that are being raised, you're simply ignoring them and hoping they don't cause you grief later down the line.

You are not an easy person to have a discussion with.  


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: aliashraf on August 10, 2018, 05:01:00 PM
@Doomad
I don't understand how your example helps prove your point. @Evil-Knievel made an argument about the Moore bound, and I mathematically showed that it is not applicable, because the bound is very high for a graph with a degree of 100 (totally feasible) and a diameter of 4 (interestingly short) ...
So, who is not listening/reading here?  ???


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: DooMAD on August 10, 2018, 07:14:44 PM
--snip--
I've pretty much given up my pursuit of an adaptive/dynamic blocksize. I used to think like you do. I couldn't see a good reason why we shouldn't be pursuing what seemed like a really good idea. But I can't code it, and no one else is volunteering to, which means it's clearly not happening. Consider that the same is happening with your ideas right now. You can't force these things. Either people get on board, or they don't.

I'm pretty sure a few developers/contributors made BIPs about adaptive/dynamic blocksizes a few years ago. But we know they were rejected, like most proposals.

Edit: it's BIP 105, 106 and 107.

Indeed. And if an idea that can both raise and lower throughput limits depending on network conditions can be rejected, then an idea that can only raise them (such as a blocktime reduction) will be even more prone to rejection (unless there were a clear and urgent need for it, which there currently isn't).

A ~90 second blocktime like aliashraf proposes would be the approximate equivalent of a 6.66MB base blocksize or a potential 26.64MB SegWit blockweight with the current ~10 minute blocktime.  If the community deemed the SegWit2X potential 8MB blockweight overly excessive, how would anyone in their right mind think the community would openly support more than thrice that amount of potential throughput?
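The arithmetic behind that equivalence, for anyone who wants to check it (rounding aside):

Code:
# Throughput equivalence: shorter block time vs. bigger blocks.
BASE_SIZE_MB = 1.0        # base blocksize
MAX_WEIGHT_MB = 4.0       # potential SegWit block weight
CURRENT_BLOCKTIME_S = 600
PROPOSED_BLOCKTIME_S = 90

factor = CURRENT_BLOCKTIME_S / PROPOSED_BLOCKTIME_S   # ~6.67x more blocks
print(f"equivalent base blocksize: {BASE_SIZE_MB * factor:.2f} MB")   # ~6.67 MB
print(f"equivalent SegWit weight:  {MAX_WEIGHT_MB * factor:.2f} MB")  # ~26.67 MB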

And don't even get me started on the PoW change where he'd happily obliterate the security of the network and reset the difficulty to circa 2011 levels.  Total lunacy. 


Title: Re: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X)
Post by: Wind_FURY on August 11, 2018, 06:06:21 AM
@Wind_FURY,
This is the first time I'm hearing about this fork :D

Just check their website ... :-\

Too many forks out there, many of them with obviously poisonous incentives, many others with little or zero history of discussion and theoretical effort, ...

My proposals, both the block time decrease and collaborative work, are meant to improve the current bitcoin network at the scale it is experiencing. I don't understand why a coin/fork with a few million dollars of market cap and a few petahashes of mining power would need any kind of improvement, or how useful it would be as a testbed for such improvements.

So, thanks but no, I'm not interested.

But you want to use the Bitcoin network as your "testbed" for your ideas, and then criticize the Bitcoin Core developers if they reject them? Hahaha.

Plant your feet on the ground, my friend. The Core developers are doing a good job maintaining the network's decentralization and security. That must not change.