Bitcoin Forum
Author Topic: Superspace: Scaling Bitcoin Beyond SegWit  (Read 372 times)
Kallisteiros (Copper Member)
August 27, 2018, 03:51:58 PM
Merited by ETFbitcoin (11), dbshck (4), suchmoon (4), DarkStar_ (2)
 #1

Superspace: Scaling Bitcoin Beyond SegWit

Steve Kallisteiros
August 25, 2018

Abstract. SegWit, or Segregated Witness, a soft fork successfully activated on the Bitcoin blockchain in mid-2017, provided, among other benefits, a backward-compatible increase to the block size limit, all without introducing consensus-breaking changes to the Bitcoin protocol and without resulting in a hard-fork network partitioning. The potential space increase is estimated at close to 2 MB total for practical purposes. In this paper, the author argues that it is possible, using the same mechanism, to increase the block size further, up to any agreed-upon limit, while still providing the same security guarantees SegWit does.

Complete paper: https://files.zazzylabs.com/LNx2ud/superspace.pdf
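For context on the "close to 2 MB" figure: SegWit caps blocks by weight (BIP 141), where non-witness bytes count four times and witness bytes once, against a 4,000,000-weight-unit ceiling. A minimal sketch of the arithmetic (function names are mine, not from the paper):

```python
# SegWit block weight (BIP 141): non-witness ("base") bytes count 4x,
# witness bytes count 1x, against a 4,000,000 weight-unit ceiling.
MAX_BLOCK_WEIGHT = 4_000_000

def block_weight(base_size: int, witness_size: int) -> int:
    return 4 * base_size + witness_size

def max_total_size(base_size: int) -> int:
    # Largest total block size (base + witness) still within the weight limit.
    return base_size + (MAX_BLOCK_WEIGHT - 4 * base_size)

# A block with no witness data is still capped near 1 MB:
assert max_total_size(1_000_000) == 1_000_000
# With roughly two thirds of the bytes being witness data,
# total size approaches 2 MB, hence the ~2 MB practical estimate:
assert max_total_size(667_000) == 1_999_000
```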

===

Keep in mind, it's only the first draft. I would really appreciate your feedback.
kanzure (Newbie)
August 27, 2018, 05:53:48 PM
Merited by DarkStar_ (2), ETFbitcoin (1)
 #2

Some follow-up took place here: http://gnusha.org/bitcoin-wizards/2018-08-27.log

Take a look at the previous work on "extension blocks": https://old.reddit.com/r/Bitcoin/comments/63b8lb/purse_extension_blocks_ready_for_liftoff/dft4nj1/?context=1
Kallisteiros (Copper Member)
August 27, 2018, 06:23:36 PM
Merited by DarkStar_ (2), ETFbitcoin (1)
 #3

Indeed, several people have pointed out to me that this idea has been around for some time under the name "extension blocks", albeit not defined in as much detail. I've started reading the discussions on it, and it seems very similar. I'll continue reading to give them proper credit, but also to figure out why that proposal wasn't greenlighted and what arguments could showcase the benefit of its activation. I see it started in the pre-SegWit era; maybe this time will be different, because we have since seen a successful activation of SegWit, which is highly similar to the proposal in question in the way it was deployed.
domob (Legendary)
August 28, 2018, 08:13:02 AM
 #4

I don't think there is any consensus that we actually want to increase the block size any further for now - that is the main reason why extension blocks have not been deployed.  The hard problem is not how to do it technically, but whether we need to do it or whether we should stick to what we have now with Segwit and scale on layer 2 instead.

While the block-size increase was one of the effects of Segwit, it was not the only and perhaps not the most important one.  Segwit also solves transaction malleability in an elegant way, which makes the implementation of layer-2 techniques (including Lightning) easier.

Kallisteiros (Copper Member)
August 28, 2018, 10:07:36 AM
 #5

Congrats on reaching the "over 1000 posts" mark! :)

I don't think there is any consensus that we actually want to increase the block size any further for now - that is the main reason why extension blocks have not been deployed.
You can hardly find 100% consensus about anything these days, but I think there is a need. Every day I see topics here with people complaining about the block size and dreading the thought that the full-blocks incident could happen again on the next bull run.

I also remember conversations happening pre-SegWit: some people were saying there was no need to increase the block size beyond 1 MB, and therefore no need for SegWit, yet it got rolled out anyway as a compromise solution. Likewise, there is no harm in rolling out superspace/extension blocks, and it would actually be a better compromise solution. All I'm saying is: if we're doing this anyway, why not go all the way?

Quote
The hard problem is not how to do it technically, but whether we need to do it or whether we should stick to what we have now with Segwit and scale on layer 2 instead.
How optimistic are you about when layer 2 solutions will become usable? I am working on one implementation of LN, and I see that some questions, like routing and liquidity, are still up in the air. This does not diminish the existing efforts of developers, of course; we've come a long way, but we're still somewhere in the middle.

Quote
While the block-size increase was one of the effects of Segwit, it was not the only and perhaps not the most important one.  Segwit also solves transaction malleability in an elegant way, which makes the implementation of layer-2 techniques (including Lightning) easier.
Of course, and I did mention that. However, SegWit was sold to the community mostly as the backward-compatible block increase solution, and that aspect of it caught the most attention.
aliashraf (Hero Member)
August 28, 2018, 10:16:37 AM
 #6

I don't think there is any consensus that we actually want to increase the block size any further for now - that is the main reason why extension blocks have not been deployed.  The hard problem is not how to do it technically, but whether we need to do it or whether we should stick to what we have now with Segwit and scale on layer 2 instead.

While the block-size increase was one of the effects of Segwit, it was not the only and perhaps not the most important one.  Segwit also solves transaction malleability in an elegant way, which makes the implementation of layer-2 techniques (including Lightning) easier.
Ethereum's idol, Vitalik Buterin, says it is impossible, and that there is a law, like in thermodynamics, that forbids on-chain scaling. It is a ridiculous claim and I've refuted it on many occasions.

2nd-layer scaling solutions are inherently vulnerable to centralization, no matter how many bitcoin devs are working on them, because once you are working on such a protocol you are an outlander to the bitcoin community.

The basic axiom of bitcoin and cryptocurrency (unlike what Buterin is trying to sell us) is the possibility of achieving all three characteristics (which he claims form a trilemma) in a blockchain: security, decentralization, and performance.

Axioms are not subject to debate. Anybody who thinks we can't have better performance without jeopardizing security or decentralization is not a bitcoiner or a member of the cryptocurrency movement. Such a person is just a revisionist, probably hired by feds or corps, or, like Buterin, owns a corp, or, like some bitcoiner versions of Buterin, is planning for such a position. Fuck 2nd-layer solutions; improve the actual blockchain.

As for this proposal:
I think the idea of having double-referenced blocks, called superblocks here, won't help with the canonical drawback of bigger blocks. When the number of transactions grows, the propagation delay increases, because nodes have to query and validate the transactions, and this aggravates proximity-related problems.

For now, I suppose it won't have support, not because the devs are so fond of LN (bitcoin is not Ethereum: nobody has conquered it, nobody dares to resist a brilliant on-chain scaling idea because of his corporation's best interests in 2nd-layer solutions), but rather because having a parallel block just doesn't sound very different from suggesting an increased block size. I have to read it more deeply to be sure, though.






Kallisteiros (Copper Member)
August 28, 2018, 10:36:39 AM
Merited by Foxpup (2), DooMAD (2), DarkStar_ (2), ETFbitcoin (1), buwaytress (1)
 #7

2nd layer scaling solutions are inherently vulnerable to centralization
It is important to have the same definitions in mind when discussing things like centralization. In Bitcoin, decentralization means censorship resistance (since many parties take random turns producing blocks), the impossibility of any single action bringing down the whole network (in centralized systems, that would be chopping off the head), and custodial decentralization (no need to trust anyone not to steal your funds, because they can't). You would have the same in LN: censorship resistance (if 10 routes don't want to do business with you, you route through an 11th), no way to bring down the whole network (take down the big hubs, and the little ones take their place), and custodial decentralization (you still trust nothing but the math; hubs are not custodians of your funds). What's left is perceived centralization, because such a network would tend to center around hubs with the biggest liquidity and most connections, so it would look like a hub-and-spoke topology.

It's a bit offtopic here though.

Quote
because of having a parallel block just doesn't sound very different than suggesting an increased block size
Very different. One is a hard fork resulting in a network split; the other is a soft fork, not affecting legacy software in the slightest. That's the whole point.
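To illustrate the soft-fork claim, here is a toy model (names and limits are illustrative, not the paper's actual design): legacy nodes validate only the fields they know about, so a block carrying extra superspace data still passes their checks, while upgraded nodes enforce an additional rule. A soft fork only ever tightens the rule set.

```python
# Toy model: one block as seen by legacy vs. upgraded nodes.
LEGACY_LIMIT = 1_000_000        # what pre-fork nodes enforce (bytes)
EXTENSION_LIMIT = 10_000_000    # hypothetical superspace limit (bytes)

def legacy_valid(block: dict) -> bool:
    # Legacy nodes never look at the extension payload at all.
    return len(block["base"]) <= LEGACY_LIMIT

def upgraded_valid(block: dict) -> bool:
    # Upgraded nodes enforce the old rule PLUS the new one:
    # everything they accept, legacy nodes accept too.
    return legacy_valid(block) and len(block.get("extension", b"")) <= EXTENSION_LIMIT

block = {"base": b"\x00" * 900_000, "extension": b"\x00" * 5_000_000}
assert legacy_valid(block)      # old software accepts it unchanged
assert upgraded_valid(block)    # new software enforces the extra limit
```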
buwaytress (Hero Member)
August 28, 2018, 11:05:53 AM
 #8

Thank you for this, OP. I'm severely limited in terms of technical knowledge here, but always keen to follow Bitcoin upgrades, and I have been happily using SegWit for a while now, enjoying the very obvious benefits. Just curious: the paper assumes (as I do) that SegWit has already reached significant adoption (at least through P2SH) and can be considered mature, yet a percentage of legacy users and services is still stubbornly holding on. Will implementing this, and possible further expansions, risk alienating legacy users even further?

I know it doesn't affect them (no network partitioning, as your paper says), but the fact that they can't transact/spend to native SegWit addresses without using an upgraded client must mean some users are shorn off... Or does it actually make no difference, or would it further encourage upgrading to SegWit?

aliashraf (Hero Member)
August 28, 2018, 11:33:27 AM
 #9

2nd layer scaling solutions are inherently vulnerable to centralization
It is important to have the same definitions in mind when discussing things like centralization...
... What's left is perceived centralization, however, cause such a network would tend to center around hubs with the biggest liquidity and connections, so it would look like hub-and-spoke topology.

It's a bit offtopic here though.
Indeed it is. But generally, I think there is no decentralized, trustless, secure solution on the horizon other than the blockchain, and I don't believe in LN as an alternative technology. Why should we use a blockchain for LN, by the way? One could adapt LN to run on fiat and the traditional banking system.

Anyway, you are the one who has come up with an on-chain scaling proposal, and at the same time you are promoting LN?! Why should anybody even care about on-chain solutions if he is a believer in LN or any 2nd-layer alternative?

Quote
Quote
because of having a parallel block just doesn't sound very different than suggesting an increased block size
Very different. One is a hard fork resulting in a network split, the other one is a soft fork, not affecting legacy software in the slightest. That's the whole point.

So it is a block size increase solution without the need for a hard fork. But as far as I understand, the main controversial issue with block size increase proposals is not their need for a hard fork; rather, it is their centralization implications, because of propagation and proximity consequences.
Kallisteiros (Copper Member)
August 28, 2018, 12:35:35 PM (last edited 12:46:48 PM)
 #10

Yet the percentage of legacy users and services still stubbornly holding on.
I think they will quickly reconsider when the next bull run happens, the network gets congested again, and the fees soar. Many people will adopt new things only when they absolutely have to, at the last moment, when they feel the pain and see the solution that eases it.

Quote
but the fact that they can't transact/spend to native SW unless using the same client already must mean some users are shorn off
Alternatively, you can provide a SegWit P2SH address to those legacy clients; the net result will be just the same. Put a note underneath the bech32 address: "If your software doesn't understand this, use this address instead: 3........ Oh, and by the way, consider upgrading your software to the latest version."
Or you can provide only the P2SH address. Even if you use P2SH addresses alone, you are still using SegWit.
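A sketch of why a P2SH-wrapped SegWit output is spendable from any legacy wallet: the witness program is hidden inside an ordinary P2SH script. The 20-byte values below are dummies; in reality both would be HASH160 (RIPEMD160 of SHA256) digests:

```python
# Byte layout of a P2SH-wrapped P2WPKH output (BIP 16 outside, BIP 141 inside).
pubkey_hash = bytes(20)   # dummy stand-in for HASH160(public key)

# The redeem script IS the witness program: OP_0, then a 20-byte push.
redeem_script = b"\x00\x14" + pubkey_hash
assert len(redeem_script) == 22

script_hash = bytes(20)   # dummy stand-in for HASH160(redeem_script)

# The on-chain output is ordinary P2SH: OP_HASH160 <hash> OP_EQUAL,
# which any legacy wallet can pay via a "3..." address without
# knowing anything about SegWit.
script_pubkey = b"\xa9\x14" + script_hash + b"\x87"
assert len(script_pubkey) == 23
```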
Kallisteiros (Copper Member)
August 28, 2018, 12:46:25 PM (last edited 01:07:08 PM)
 #11

Anyway, you are the one who has come with an onchain scaling proposal and at the same time you are promoting LN?! Why should anybody even care about onchain solutions if he is a believer in LN or any 2nd layer alternative?

You kinda answered this question yourself:

Quote
But generally, I think there is no decentralized, trustless, secure solution in the horizon other than blockchain

My personal estimate is that it will take several years for LN to mature, but we need bigger blocks right now. This solution would buy time for LN developers (or sidechains, or whatever) to do their thing, and we would enjoy 5-10 MB blocks in the meantime.

Quote
and I don't believe in LN being an alternative technology.
So what's your alternative? 500 GB blocks every 10 minutes? If you read my paper, you'd see I'm for a reasonable increase, not an extreme one.

Quote
Why should we use blockchain for LN by the way. One could adopt LN to be run on fiat and traditional banking system.
Traditional banking systems are custodial systems, where the banks are kings and can do whatever they want: run off with your money, shut down your account at any time for any reason, or tell you to come get your money Monday through Friday from 2pm to 4pm. They don't have this power with LN; at no point are your funds at risk. You can close the channel back to the blockchain if you don't like how the counterparty behaves. I can't vouch for how effective it will ultimately be, we'll see, but it's a neat concept nevertheless.

Quote
But as far as I understand, the main controversial issue with block size increase proposals is not their need for a hard fork; rather, it is their centralization implications, because of propagation and proximity consequences
I think I've highlighted this enough times in the paper, but I'd like to make an important point again: you can limit superspace blocks by declaring in the protocol that a block is invalid if its size exceeds, let's say, 10 MB. The increase from 1 MB to 2 MB with SegWit has not really affected decentralization so far; neither would 10 MB. 128 MB blocks definitely would.

Also, it doesn't mean that every block will be 10 MB from now on, only in the "rush hour".

Where did I get the 10 MB number? Nowhere, it's just an example. We can set it to whatever limit the community deems reasonable. The point is that we can impose it, and that we ourselves get to decide what the block limit should be in today's circumstances, not Satoshi in 2010, who put in the 1 MB limit while looking at the 2-3 KB blocks of that time.
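The "rush hour" point can be stated in one line: block size is demand-driven and only the cap is fixed. A sketch using the (purely illustrative) 10 MB figure from the post:

```python
SUPERSPACE_LIMIT = 10_000_000  # bytes; example figure from the post, not a proposal

def built_block_size(pending_tx_bytes: int) -> int:
    # A miner includes as much pending transaction data as exists,
    # up to the cap: the limit only binds during congestion ("rush hour"),
    # and quiet periods produce small blocks just as they do today.
    return min(pending_tx_bytes, SUPERSPACE_LIMIT)

assert built_block_size(300_000) == 300_000              # quiet period: small block
assert built_block_size(50_000_000) == SUPERSPACE_LIMIT  # congested: cap binds
```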
Samarkand (Sr. Member)
August 28, 2018, 01:04:52 PM
 #12

...
or tell you to come get your money Monday through Friday from 2pm to 4pm. They don't have this power with LN; at no point are your funds at risk. You can close the channel back to the blockchain if you don't like how the counterparty behaves. I can't vouch for how effective it will ultimately be, we'll see, but it's a neat concept nevertheless.
...

Nonetheless, this system doesn't work in various situations. E.g. you can broadcast a closing transaction if you don't like how the counterparty behaves, but it is possible that the main layer is too congested for it to go through in time, or that you don't have the liquid funds for a transaction fee that ensures fast confirmation.

If the main layer is too congested and you are in a situation where you want to close a channel unilaterally, you are screwed.
Kallisteiros (Copper Member)
August 28, 2018, 01:16:32 PM
 #13

If the main layer is too congested and you are in the situation where you
want to close a channel unilaterally, you are screwed.
This conversation is more suitable for an LN-specific thread, and for people who know more about LN's low-level security constraints than me, but I'll just point out that the deadline for closing the channel is not one block; it is whatever number of blocks you put in the funding transaction's timelock. If you see the main layer becoming more congested, you close earlier, at a safe distance from the deadline. You could, for example, have a rule of opening channels for two weeks, but always closing and reopening the channel 24 hours before it expires. That said, I agree with you: in the process of maturing, LN has a lot of things to take care of, including this one.
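The close-early policy described above can be sketched as a simple rule (all numbers illustrative; actual LN channels negotiate timelock parameters per channel, so treat this as a back-of-the-envelope heuristic, not the protocol):

```python
BLOCKS_PER_DAY = 144  # ~10-minute average block interval

def should_close(current_height: int, expiry_height: int,
                 safety_margin: int = BLOCKS_PER_DAY) -> bool:
    # Close (and optionally reopen) once we are within the safety margin
    # of the channel's timelock expiry, so the closing transaction has
    # time to confirm even if the chain is congested.
    return expiry_height - current_height <= safety_margin

# "Open for two weeks, close 24 hours before expiry":
two_weeks = 14 * BLOCKS_PER_DAY
assert not should_close(current_height=0, expiry_height=two_weeks)
assert should_close(current_height=two_weeks - 100, expiry_height=two_weeks)
```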
Carlton Banks (Legendary)
August 28, 2018, 01:55:57 PM
 #14

If the main layer is too congested and you are in the situation where you
want to close a channel unilaterally, you are screwed.

eltoo solves this problem

aliashraf (Hero Member)
August 28, 2018, 03:41:17 PM
 #15

... and I don't believe in LN being an alternative technology.
So what's your alternative? 500 GB blocks every 10 minutes? If you read my paper, you'd see I'm for a reasonable increase, not extreme.
My alternative is something like a 10× decrease in block time in the short term, plus improving PoW to a version free of pooling pressure and providing a strong enough infrastructure for sharding in the long term. I would propose a lot more improvements along the way, including but not limited to moving all signature data to the witness space, using Schnorr signatures, ...
Quote
Quote
But as far as I understand, the main controversial issue with block size increase proposals is not their need for a hard fork; rather, it is their centralization implications.
I think I've highlighted this enough times in the paper, but I'd like to make an important point again: you can limit superspace blocks by declaring in the protocol that a block is invalid if its size exceeds, let's say, 10 MB. ... We can set it to whatever limit the community deems reasonable. The point is that we can impose it.
Setting a 10 MB limit on superspace blocks is just the same as putting an 11 MB limit on ordinary blocks. You are still offering nothing more than a complicated algorithmic tweak to reach a point that is simply achievable by a hard fork.

I don't want to undermine your work, but I think it is just about avoiding hard forks by mimicking the SegWit approach.

Honestly, I hate SW exactly because of its tricky approach; it looks to me like cobbling things together in a hacker way. I love hacks, but not when it comes to the core algorithm. As a rule of thumb, we should keep core components elegant and smart.
Kallisteiros (Copper Member)
August 28, 2018, 04:23:18 PM
 #16

Honestly, I hate SW exactly because of its tricky approach, it looks to me kinda cobbling things in a hacker way. I love hack but not when it comes to core algorithm, as a rule of thumb we should keep core components elegant and smart.
You do know that many innovations in Bitcoin were rolled out this way, right? P2SH is treated by non-P2SH-aware nodes as just a hash preimage lock. OP_CHECKLOCKTIMEVERIFY and OP_CHECKSEQUENCEVERIFY are just no-ops to non-aware nodes.
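As a concrete example of that upgrade path, BIP 65 redefined OP_NOP2 as OP_CHECKLOCKTIMEVERIFY: old nodes see a do-nothing opcode, new nodes enforce a timelock. A toy interpreter fragment (heavily simplified; real CLTV also checks the locktime type and the input's nSequence):

```python
OP_NOP2 = 0xb1  # redefined as OP_CHECKLOCKTIMEVERIFY by BIP 65

def legacy_eval(opcode: int, stack: list, tx_locktime: int) -> bool:
    # Pre-BIP65 nodes: OP_NOP2 does nothing; the script just continues.
    return True

def upgraded_eval(opcode: int, stack: list, tx_locktime: int) -> bool:
    # Post-BIP65 nodes: the same opcode now rejects spends whose
    # transaction locktime is below the value on the stack.
    if opcode == OP_NOP2:
        required = stack[-1]
        return tx_locktime >= required
    return True

# A script demanding locktime >= 500_000:
assert legacy_eval(OP_NOP2, [500_000], tx_locktime=100)        # old node: passes
assert not upgraded_eval(OP_NOP2, [500_000], tx_locktime=100)  # new node: rejects
assert upgraded_eval(OP_NOP2, [500_000], tx_locktime=600_000)  # new node: passes
```

Because the upgraded rule only rejects transactions that legacy nodes would have accepted (never the reverse), the change deploys as a soft fork.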
HeRetiK (Legendary)
August 28, 2018, 06:08:00 PM
 #17

Interesting whitepaper!

I wonder though: are you (a) pessimistic about how long LN will take to reach more widespread adoption, or (b) optimistic about how quickly one could implement and deploy Superspace?

Seeing how LN already hit mainnet earlier this year, and how long it took SegWit to reach maturity from its conceptual phase, I believe one of the two must be the case; at least considering that we are looking at "a short-term proposal, intended to provide a temporary ease from scalability issues". Or would you see superblocks as part of a long-term scaling approach, with temporary ease being a mere side effect?

buwaytress (Hero Member)
August 30, 2018, 04:38:21 PM
 #18

Yet the percentage of legacy users and services still stubbornly holding on.
I think they will quickly reconsider when the next bull run happens, the network gets congested again, and the fees soar. Many people will adopt new things only when they absolutely have to, at the last moment, when they feel the pain and see the solution that eases it.

Necessity being the mother of invention is certainly true of Bitcoin development (which is one reason I always say "scaling" is a desirable problem to have); one might say that had we never reached those critical stages of congestion, we wouldn't be at today's rate of SegWit adoption, or even today's progress with LN.

But I don't think we'll ever get that perfect storm again, where congestion coincided with heavy interest and high prices, and pressure from big blockers. If that wasn't enough to convince people to move over, I'm not sure what will. The pain wasn't, after all, unbearable.

cellard (Legendary)
August 30, 2018, 05:26:11 PM
 #19

I'm looking forward to another 12 to 24 months of drama coming from the so-called community, so-called devs, and miners alike, mixed with the big Twitter megaphone guys trying to step in with so-called "X place agreements" again.

I don't think we'll ever see another softfork similar to SegWit. We somehow got SegWit in, and it seems to be working, but some still question whether it is safe, and always will (and they have good arguments for thinking so).

The amount of controversy needed to get SegWit in was insane. You would need a package of updates so good that it could be done again, and perhaps not without another round of transaction backlog, either organic or spammed, again.

I think no matter how good the ideas are, unless bitcoin is pushed to its limits and an idea is presented as an acceptable solution by many relevant parties, we will not see further upgrades: definitely not by way of a hardfork, and very doubtfully by a controversial softfork.
buwaytress (Hero Member)
August 31, 2018, 01:44:37 PM
 #20

I'm looking forward to another 12 to 24 months of drama coming from the so-called community, so-called devs, and miners alike ... unless bitcoin is pushed to its limits and an idea is presented as an acceptable solution by many relevant parties, we will not see further upgrades.

Everything in hindsight seems big and mega, and I agree we may never get the perfect storm of 2017 again when it comes to something so politically charged. They've all seen how damaging it can be, both financially and to reputations, but you just never know. If the Bitcoin price again threatens to "explode", there may just be enough in it to motivate yet more agendas aligned with X, Y, or Z direction.

We just need to wait for another culmination of all these aspects: dev sentiment, market action, and people generally "tired of twiddling our thumbs".
