Bitcoin Forum
Author Topic: Bitcoin Scaling Solution Without Lightning Network...  (Read 1419 times)
cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 17, 2018, 03:25:57 AM
Merited by bones261 (1)
 #1

Today I read a tweet from Roger Ver saying that Bitcoin can't scale and that LN doesn't work; Craig Wright says the same...

What about this:

 - Adjust the current Bitcoin block size to more MB (like BCH did)
 - Dynamically split the block data into many parts across BTC node Types A, B, C, etc. (related to the total number of nodes running)

Example:

If we double the block size, we double the number of node types:

More nodes = more split types = bigger block size

This way we could save a lot of disk space across the node network. It is sufficient that, in a group of 1000 nodes, 500 keep a backup of 50% of the data (Type A nodes) and the other 500 keep a backup of the other 50% (Type B nodes); it is a waste of space for every node in the network to store the same information repeated thousands of times.

Then each Type A node can "ask" a Type B node for the information it is trying to find, and vice versa, and still get the information without wasting so much space. That would also solve the centralization problem of big-block full nodes that no common mortal can manage, like will happen in BCH with big block sizes such as BCHSV going to 128MB and planning to keep scaling.

What you guys think about this?
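To make the idea concrete, here is a minimal sketch in Python of one purely hypothetical way such a split could work: block heights assigned to node types by a simple modulo rule. The function names and the rule itself are illustrative assumptions, not part of any real Bitcoin client:

```python
# Hypothetical sketch of the proposed "node type" split: each node
# type stores only the blocks whose height maps to it. The modulo
# assignment below is an illustrative assumption, not a real
# Bitcoin protocol feature.

def node_type_for_block(height: int, num_types: int) -> int:
    """Assign a block height to one of num_types storage groups."""
    return height % num_types

def blocks_stored_by(node_type: int, num_types: int, chain_tip: int) -> list:
    """Block heights this node type would keep under the sketch."""
    return [h for h in range(chain_tip + 1) if h % num_types == node_type]

# With 2 node types (A = 0, B = 1) and a 10-block chain, each type
# stores half the blocks, and together they cover the whole chain.
type_a = blocks_stored_by(0, 2, 9)
type_b = blocks_stored_by(1, 2, 9)
assert len(type_a) == 5 and len(type_b) == 5
assert sorted(type_a + type_b) == list(range(10))
```

Doubling `num_types` (adding Type C, Type D, ...) halves each group's share again, which is the "more nodes = more split types" relationship described above.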
ETFbitcoin
Legendary
*
Offline Offline

Activity: 1764
Merit: 2023

Use SegWit and enjoy lower fees.


View Profile WWW
November 17, 2018, 03:35:00 AM
Merited by bones261 (2)
 #2

This has been discussed many times and, unfortunately, the majority of Bitcoiners would disagree, since increasing the block size increases the cost of running a full node.
Splitting block data across many different node types is called sharding, and it has already been proposed many times, for example in BlockReduce: Scaling Blockchain to human commerce.
Besides, IMO sharding opens lots of attack vectors, increases development complexity, and requires more trust.

Additionally, LN helps Bitcoin scale a lot, even though it's not a perfect solution. Those who say otherwise clearly don't understand how LN works and its potential.
Lots of cryptocurrencies, including Ethereum, are preparing 2nd-layer/off-chain scaling solutions because they know it's a good approach.

cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 17, 2018, 04:26:02 AM
 #3

This has been discussed many times and, unfortunately, the majority of Bitcoiners would disagree, since increasing the block size increases the cost of running a full node.
Splitting block data across many different node types is called sharding, and it has already been proposed many times, for example in BlockReduce: Scaling Blockchain to human commerce.
Besides, IMO sharding opens lots of attack vectors, increases development complexity, and requires more trust.

Additionally, LN helps Bitcoin scale a lot, even though it's not a perfect solution. Those who say otherwise clearly don't understand how LN works and its potential.
Lots of cryptocurrencies, including Ethereum, are preparing 2nd-layer/off-chain scaling solutions because they know it's a good approach.

This idea doesn't require upgrading the block size; you can split nodes even with 2MB blocks. You could save a lot of space across nodes and reduce the cost of running a full node. And if you upgrade the block size and at the same time split the nodes dynamically, P2P-style, the cost of running a node stays the same as before.

Even with LN, a huge amount of disk space will be wasted by all the nodes repeating the same information, and it is a pain to install and run a full node.

If today there are 9000 nodes and tomorrow another 9000 are added to the network, it doesn't make sense to repeat the information so many times; it's a waste of space.

I'm not 100% sure how LN works, but what happens if I switch off my LN node with all its channels forever? Is there a loss of bitcoins or not?
pooya87
Legendary
*
Offline Offline

Activity: 1764
Merit: 1875


Remember tonight for it's the beginning of forever


View Profile
November 17, 2018, 04:55:54 AM
 #4

The problem with increasing the block size as the only scaling solution is that no matter how much you increase it, it will never be enough for a global payment system that has to process a lot of transactions per second.

You will eventually need a solution that doesn't consume block space to increase capacity, and that is a second-layer solution like the Lightning Network.

cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 17, 2018, 05:21:07 AM
 #5

The problem with increasing the block size as the only scaling solution is that no matter how much you increase it, it will never be enough for a global payment system that has to process a lot of transactions per second.

You will eventually need a solution that doesn't consume block space to increase capacity, and that is a second-layer solution like the Lightning Network.

I understand your point of view, but I think that if the whole world used LN, a 2MB block size would not be sufficient for the on-chain transactions people would need to move between the blockchain and off-chain and vice versa. So the block size needs to be upgraded sooner or later. It would have been much better for everyone if it had been done without the BTC/BCH split; LN can run over BCH too, so...

But I have a new idea: my proposal can be implemented without changing the block size, only to save disk space on nodes, and I think it could be applied without a hard fork. For example, we could create a new node version that stays 100% compatible with the current Core, splits/communicates with the new node types to save disk space, and acts as a normal node when communicating with current nodes.

That would be something very nice!

Could some expert help to implement it?
pooya87
Legendary
*
Offline Offline

Activity: 1764
Merit: 1875


Remember tonight for it's the beginning of forever


View Profile
November 17, 2018, 05:37:31 AM
 #6

I understand your point of view, but I think that if the whole world used LN, a 2MB block size would not be sufficient for the on-chain transactions people would need to move between the blockchain and off-chain and vice versa. So the block size needs to be upgraded sooner or later.

I completely agree! And I think everyone already knows that LN relies on on-chain scaling. It is called a second layer for a reason: it is a layer that sits on top of another layer, which needs to be functioning well.

Quote
it would have been much better for everyone if it had been done without the BTC/BCH split; LN can run over BCH too, so...
That fork-off had nothing to do with scaling, in my opinion; it was more like an attempt to take over and make money. Not to mention that the whole motto of BCH is that the only way to scale is on-chain scaling. So no, LN, or any other second-layer solution by whatever name, will never "run over BCH".

Quote
500 keep a backup of 50% of the data (Type A nodes) and the other 500 keep a backup of the other 50% (Type B nodes)
So let's say there is a transaction Tx1 and it is in the other half of the "data", on a Type B node, and say I run a Type A node. If someone pays me by spending Tx1, I do not have it in my database (blockchain), so how can I know it is a valid transaction? Am I supposed to assume it is valid on faith?! Or am I supposed to connect to another node, ask if the transaction is valid, and then trust that other node to tell me the truth? How would you prevent fraud?
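For context, part of this objection is what SPV-style Merkle proofs address in Bitcoin today: a node that lacks a block's transactions can still check that a given transaction is committed to by that block's header, without trusting the peer that served it. A simplified sketch follows (single SHA-256 and no byte-order details, unlike real Bitcoin, which uses double SHA-256); note that an inclusion proof shows a transaction is in a block, not that its inputs were valid, so the trust question above still stands:

```python
# Simplified Merkle inclusion proof, SPV-style. A peer that lacks a
# block's data can verify a transaction against the block's Merkle
# root using only the sibling hashes supplied by another node.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes linking leaves[index] to the root."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1               # sibling is the paired neighbour
        proof.append((level[sib], sib < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

txs = [b"tx0", b"tx1", b"tx2", b"tx3"]
root = merkle_root(txs)
proof = merkle_proof(txs, 1)
assert verify(b"tx1", proof, root)        # included in the block
assert not verify(b"txX", proof, root)    # not included
```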

bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 17, 2018, 06:47:50 AM
Last edit: November 17, 2018, 07:14:17 AM by bones261
 #7

The problem with increasing the block size as the only scaling solution is that no matter how much you increase it, it will never be enough for a global payment system that has to process a lot of transactions per second.

You will eventually need a solution that doesn't consume block space to increase capacity, and that is a second-layer solution like the Lightning Network.

That's not entirely accurate. 10GB blocks would be sufficient to scale to the level of Visa. However, many nodes would find it hard or impossible to keep up with that kind of capacity. In a decade or so, though, handling that capacity will probably be trivial. The LN will definitely be less resource intensive. Unfortunately, I do not personally trust the LN in its current state. I find it particularly troubling that I could end up closing a channel in an earlier state due to a system issue on my end, and then get penalized harshly as if I were trying to scam someone.

Edit: OMG, I just did a little more research and found this gem: https://bitcoin.stackexchange.com/questions/58124/how-do-i-restore-a-lightning-node-with-active-channels-that-has-crashed-causing
Really? If my node crashes I can lose all of my funds???  Huh That's even worse than my first concern. How long has this been in development? It appears that the LN developers have a long way to go before they can make this pig fly.  Cheesy I'll keep the faith, I guess.  Huh

pooya87
Legendary
*
Offline Offline

Activity: 1764
Merit: 1875


Remember tonight for it's the beginning of forever


View Profile
November 17, 2018, 07:54:03 AM
Merited by ETFbitcoin (1), bones261 (1)
 #8

That's not entirely accurate. 10GB blocks would be sufficient to scale to the level of Visa.

No, it won't be enough, because you can't spam the VISA network but you can spam Bitcoin blocks and fill them up.
Besides, when it comes to block size it is not just about having more space. You have to consider that you have to download, verify, and store 10GB of transaction data every 10 minutes on average, and also upload even more, depending on how many nodes you connect to and how much you want to contribute.
Ask yourself this: would you be willing, or able, to run a full node that requires 1.44 TB of disk space per day, a sustained download speed of over 130 Mbps, and a bandwidth budget of several TB per day once uploads are included? Downloads alone come to over 43 TB per month. Can your hardware (CPU, RAM) handle verification of that much data?
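A quick arithmetic check on the 10GB-block figures (taking 1 GB = 10^9 bytes and one block every 10 minutes); note the sustained download rate works out to roughly 133 Mbps:

```python
# Back-of-the-envelope check of the 10GB-block resource figures.
block_gb = 10
blocks_per_day = 24 * 60 // 10                 # 144 blocks per day

daily_tb = block_gb * blocks_per_day / 1000    # new block data per day
monthly_tb = daily_tb * 30                     # download-only, per month
mbps = block_gb * 1e9 * 8 / (10 * 60) / 1e6    # sustained download rate

assert blocks_per_day == 144
assert abs(daily_tb - 1.44) < 1e-9             # ~1.44 TB per day
assert abs(monthly_tb - 43.2) < 1e-9           # ~43 TB per month
assert round(mbps, 1) == 133.3                 # Mbps just to keep up
```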

DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 17, 2018, 07:59:50 AM
Merited by franky1 (1)
 #9

it is sufficient that, in a group of 1000 nodes, 500 keep a backup of 50% of the data (Type A nodes) and the other 500 keep a backup of the other 50% (Type B nodes); it is a waste of space for every node in the network to store the same information repeated thousands of times.

Then each Type A node can "ask" a Type B node for the information it is trying to find, and vice versa, and still get the information

Bitcoin was designed to be trustless.  The idea of running a node is that you can validate and verify every single transaction yourself.  If you run a Type A node, you would have to trust the Type B nodes to do half of the validation for you.  If you're going to do that, why not just trust Visa and forget all about Bitcoin?

cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 17, 2018, 08:59:55 AM
 #10

So let's say there is a transaction Tx1 and it is in the other half of the "data", on a Type B node, and say I run a Type A node. If someone pays me by spending Tx1, I do not have it in my database (blockchain), so how can I know it is a valid transaction? Am I supposed to assume it is valid on faith?! Or am I supposed to connect to another node, ask if the transaction is valid, and then trust that other node to tell me the truth? How would you prevent fraud?

We ask the other Type B node for the other part of the data and we check it. There will be hundreds or thousands of Type A and Type B nodes, so we could check it against more than just one Type B node. Think of a pair of Type A and Type B nodes as one full node: together they hold 100% of the block data, which can be checksummed and verified; it is like reading from our own disk in another cluster.

The more nodes we have, the more splits we can do. If my Type A node stores 2MB of block data, your Type B node stores another 2MB, and a Type C node another 2MB, we have 6MB blocks represented by only 2MB on each node type.
odolvlobo
Legendary
*
Offline Offline

Activity: 2618
Merit: 1401



View Profile
November 17, 2018, 09:14:35 AM
 #11

We ask the other Type B node for the other part of data and we check it, ...

If A nodes must also download B blocks and B nodes must also download A blocks, then you have accomplished nothing by splitting them.

DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 17, 2018, 09:28:20 AM
 #12

We ask the other Type B node for the other part of data and we check it, ...

If A nodes must also download B blocks and B nodes must also download A blocks, then you have accomplished nothing by splitting them.

I assumed it was meant as a partial SPV setup.  Each type of node would be 50% SPV.  But yeah, it's not something that most users would be interested in pursuing as a concept.

cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 17, 2018, 09:53:52 AM
 #13

We ask the other Type B node for the other part of data and we check it, ...

If A nodes must also download B blocks and B nodes must also download A blocks, then you have accomplished nothing by splitting them.

I assumed it was meant as a partial SPV setup.  Each type of node would be 50% SPV.  But yeah, it's not something that most users would be interested in pursuing as a concept.

Yeah, something like that. We would ask only for the data of the blocks we need, and we can test its integrity by checksum, or simply by asking other Type B nodes for the same information at random; if the result is the same from 2, 3, or 4 other nodes, that supports its integrity.

I can't see another way of shrinking blockchain storage. This technique is like P2P file sharing: each seed doesn't need to have the whole file, as long as there are enough trusted seeds in the network, each with some % of the file, that together hold the whole information.

It's genius, I am Satoshi  Grin
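The random cross-checking described here can be sketched as below; `fetch()`, the peer names, and the quorum size are hypothetical. The caveat raised earlier in the thread still applies: colluding peers could all serve the same bad data, so matching checksums raise the bar for fraud rather than prove integrity:

```python
# Hypothetical sketch of "ask several random Type B nodes and compare".
# peers and fetch() are illustrative; agreement among a random sample
# only makes fraud harder, it does not prove the data is valid.
import hashlib
import random

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def cross_check(fetch, peers, block_id, quorum=3):
    """Fetch block_id from `quorum` random peers; accept only if all
    returned copies have an identical checksum."""
    sample = random.sample(peers, quorum)
    sums = {checksum(fetch(p, block_id)) for p in sample}
    return len(sums) == 1

# Toy usage: every peer honestly serves the same block bytes.
store = {"block42": b"serialized block data"}
honest_fetch = lambda peer, bid: store[bid]
assert cross_check(honest_fetch, ["p1", "p2", "p3", "p4"], "block42")
```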
buwaytress
Hero Member
*****
Offline Offline

Activity: 1106
Merit: 961


I bit, therefore I am


View Profile
November 17, 2018, 10:09:00 AM
 #14

We ask the other Type B node for the other part of data and we check it, ...

If A nodes must also download B blocks and B nodes must also download A blocks, then you have accomplished nothing by splitting them.

I assumed it was meant as a partial SPV setup.  Each type of node would be 50% SPV.  But yeah, it's not something that most users would be interested in pursuing as a concept.

Yeah, something like that. We would ask only for the data of the blocks we need, and we can test its integrity by checksum, or simply by asking other Type B nodes for the same information at random; if the result is the same from 2, 3, or 4 other nodes, that supports its integrity.

I can't see another way of shrinking blockchain storage. This technique is like P2P file sharing: each seed doesn't need to have the whole file, as long as there are enough trusted seeds in the network, each with some % of the file, that together hold the whole information.

It's genius, I am Satoshi  Grin

It doesn't make sense to me. If the result is the same from 4 nodes, but only for particular blocks, that proves only that these nodes agree on those particular blocks; unless your node has verified every block once, there's no way to prove integrity by this random checking.

And besides, in P2P file sharing a seed is technically a peer with 100% of the file. You're not seeding until you have the full file; anything less than 100% just makes you a peer Tongue

Doesn't pruning already address the size concern, too?

cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 17, 2018, 10:47:51 AM
Last edit: November 18, 2018, 12:30:25 AM by cfbtcman
 #15

We ask the other Type B node for the other part of data and we check it, ...

If A nodes must also download B blocks and B nodes must also download A blocks, then you have accomplished nothing by splitting them.

I assumed it was meant as a partial SPV setup.  Each type of node would be 50% SPV.  But yeah, it's not something that most users would be interested in pursuing as a concept.

Yeah, something like that. We would ask only for the data of the blocks we need, and we can test its integrity by checksum, or simply by asking other Type B nodes for the same information at random; if the result is the same from 2, 3, or 4 other nodes, that supports its integrity.

I can't see another way of shrinking blockchain storage. This technique is like P2P file sharing: each seed doesn't need to have the whole file, as long as there are enough trusted seeds in the network, each with some % of the file, that together hold the whole information.

It's genius, I am Satoshi  Grin

It doesn't make sense to me. If the result is the same from 4 nodes, but only for particular blocks, that proves only that these nodes agree on those particular blocks; unless your node has verified every block once, there's no way to prove integrity by this random checking.

And besides, in P2P file sharing a seed is technically a peer with 100% of the file. You're not seeding until you have the full file; anything less than 100% just makes you a peer Tongue

Doesn't pruning already address the size concern, too?

Those particular blocks are the blocks that contain the information you are trying to find, like the movements for specific address transactions.

Four types of nodes is not a static thing: as the block size increases, the number of split parts increases too. We can have Types A, B, C, D, E, F, and so on, to keep each node type's share of the block data at only 2MB.

Each type of node would have 100% of its 50%, so it could be a seed, but if you want you can call it a peer.

Look at this simple logic: if you have 1 million nodes right now, do you think it would be logical to have 1 million full nodes running?
For me, that would be a complete waste of resources; you might as well put some nodes on the moon for safety!

It's like the Storj service: for sure they calculate a safe number of backups around the world, to have redundancy without keeping more backups than is reasonable.

About validation/verification, I think a way could be found for each type of node to validate/verify its own part of the data.

bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 17, 2018, 04:03:08 PM
Merited by pooya87 (1)
 #16

That's not entirely accurate. 10GB blocks would be sufficient to scale to the level of Visa.

No, it won't be enough, because you can't spam the VISA network but you can spam Bitcoin blocks and fill them up.
Besides, when it comes to block size it is not just about having more space. You have to consider that you have to download, verify, and store 10GB of transaction data every 10 minutes on average, and also upload even more, depending on how many nodes you connect to and how much you want to contribute.
Ask yourself this: would you be willing, or able, to run a full node that requires 1.44 TB of disk space per day, a sustained download speed of over 130 Mbps, and a bandwidth budget of several TB per day once uploads are included? Downloads alone come to over 43 TB per month. Can your hardware (CPU, RAM) handle verification of that much data?

     First off, it would be nice if you quoted enough of my text to keep it in context...

That's not entirely accurate. 10GB blocks would be sufficient to scale to the level of Visa. However, many nodes would find it hard or impossible to keep up with that kind of capacity. In a decade or so, though, handling that capacity will probably be trivial.

     I've already acknowledged this isn't practical right now. However, unless technology has somehow reached its limits, it may be feasible in a decade or so. Also, to combat "spam," miners already have the option of not including transactions below a certain fee; they can make it very expensive for someone to try to spam the network. And with bigger blocks it probably won't be worth the waste of resources for a miner to try to drive up the fee market by including spam in their own blocks.
     Furthermore, in another thread Greg Maxwell himself acknowledged that a pruned node is a full node. There is really no need for every single full node to also store the entire blockchain permanently. There are enough entities out there, like exchanges, large pools, and hardware wallet providers, who would probably want to store the entire blockchain. I think the number of "archival" nodes would be sufficient for the network to remain decentralized.
     I also acknowledge in my original post that a second-layer solution will always end up using fewer resources. However, the current LN carries a large risk of someone losing their funds through either system error or human error. I'm certainly not going to hang my hat on the LN until the developers and the network itself can show acceptable results. Right now I wouldn't store 1 sat in a channel on the Lightning Network; the risk of losing funds is just too high.

cellard
Legendary
*
Offline Offline

Activity: 1372
Merit: 1211


View Profile
November 17, 2018, 04:35:32 PM
Merited by suchmoon (4), bones261 (1)
 #17

The main problem is that most of the scaling solutions being proposed would first require a hard fork. That means the drama of 2 competing Bitcoins each claiming to be the real one (see the BCash ABC vs BCash SV war going on right now). This will not end well. Without consensus we will just end up with 2 Bitcoins which, in sum, are worth less than before the hard fork happened.

Most Bitcoin whales don't support any of the scaling solutions proposed so far, so your scaling fork would end up dumped by tons of coins.
bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 17, 2018, 04:59:20 PM
 #18

The main problem is that most of the scaling solutions being proposed would first require a hard fork. That means the drama of 2 competing Bitcoins each claiming to be the real one (see the BCash ABC vs BCash SV war going on right now). This will not end well. Without consensus we will just end up with 2 Bitcoins which, in sum, are worth less than before the hard fork happened.

Most Bitcoin whales don't support any of the scaling solutions proposed so far, so your scaling fork would end up dumped by tons of coins.

     Unlike BCH, the BTC network already has consensus mechanisms in place that it is willing to use to ensure the vast majority of the network is on the same page before proceeding. As demonstrated by the UASF, we can also implement ways to ensure the miners can be persuaded to go along with the wishes of the non-mining users. If someone doesn't want to wait for high consensus to implement their "improvements," they are free to go fork off. That's why we already have hundreds of altcoins right now. As I have already acknowledged, the "bigger block" solution probably won't be practical for at least a decade or so. I have also acknowledged that a second-layer solution would probably end up being more efficient with resources. However, it is nice to know that there is a plan "B" for scaling, just in case the problems with the LN cannot be overcome.

ETFbitcoin
Legendary
*
Offline Offline

Activity: 1764
Merit: 2023

Use SegWit and enjoy lower fees.


View Profile WWW
November 17, 2018, 05:11:57 PM
Merited by bones261 (2)
 #19

     Unlike BCH, the BTC network already has consensus mechanisms in place that it is willing to use to ensure the vast majority of the network is on the same page before proceeding. As demonstrated by the UASF, we can also implement ways to ensure the miners can be persuaded to go along with the wishes of the non-mining users. If someone doesn't want to wait for high consensus to implement their "improvements," they are free to go fork off. That's why we already have hundreds of altcoins right now. As I have already acknowledged, the "bigger block" solution probably won't be practical for at least a decade or so. I have also acknowledged that a second-layer solution would probably end up being more efficient with resources. However, it is nice to know that there is a plan "B" for scaling, just in case the problems with the LN cannot be overcome.

It looks like no one remembers transaction compression (reducing transaction size). This is similar to internet scaling in the past, when people focused only on increasing bandwidth rather than compressing the content, and compression formats such as MP3 ended up solving many problems (including internet scaling, a bit).

IMO, Bitcoin needs all of it (n-layer networks, a higher block size limit, and smaller transactions) to be able to scale without lots of security/decentralization trade-offs.
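As a toy illustration of the compression point, here is zlib applied to a made-up, highly repetitive "transaction" payload. Real Bitcoin transactions are compact binary, so a general-purpose compressor gains far less on them than on this example; actual size-reduction proposals target field-level redundancy (script templates, repeated output patterns) instead:

```python
# Toy illustration of shrinking serialized transaction data before
# storing/relaying it. The payload below is made up and deliberately
# repetitive; it is NOT real transaction data.
import zlib

# Fake "transaction" with repeated structure, repeated 100 times.
tx = b"0100000001" + b"76a914" + b"00" * 20 + b"88ac"
payload = tx * 100

compressed = zlib.compress(payload, level=9)
assert zlib.decompress(compressed) == payload      # lossless round trip
assert len(compressed) < len(payload) // 10        # repetition compresses well
```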

cellard
Legendary
*
Offline Offline

Activity: 1372
Merit: 1211


View Profile
November 17, 2018, 06:29:16 PM
Merited by bones261 (2)
 #20

The main problem is that most of the scaling solutions being proposed would first require a hard fork. That means the drama of 2 competing Bitcoins each claiming to be the real one (see the BCash ABC vs BCash SV war going on right now). This will not end well. Without consensus we will just end up with 2 Bitcoins which, in sum, are worth less than before the hard fork happened.

Most Bitcoin whales don't support any of the scaling solutions proposed so far, so your scaling fork would end up dumped by tons of coins.

     Unlike BCH, the BTC network already has consensus mechanisms in place that it is willing to use to ensure the vast majority of the network is on the same page before proceeding. As demonstrated by the UASF, we can also implement ways to ensure the miners can be persuaded to go along with the wishes of the non-mining users. If someone doesn't want to wait for high consensus to implement their "improvements," they are free to go fork off. That's why we already have hundreds of altcoins right now. As I have already acknowledged, the "bigger block" solution probably won't be practical for at least a decade or so. I have also acknowledged that a second-layer solution would probably end up being more efficient with resources. However, it is nice to know that there is a plan "B" for scaling, just in case the problems with the LN cannot be overcome.


You can't really know if "the vast majority of the network" is on the same page until D-day actually comes. We have already seen miners signal their supposed "intention" to support something with their hashrate, and then, when the day came, some of them backpedaled. You would also need all exchanges on board. And ultimately you would need all the whales on board, and many of them may not voice their opinion at all; then the day of the fork comes and you see a huge dump of your forked coin.

You basically need 100% consensus for a hard fork to be a success and not end up with 2 coins, and I don't see how that is even possible once a project gets as big as Bitcoin (it's still small in the grand scheme of things, but in terms of open-source software development and network effect it's big enough never to be able to hard fork seamlessly again). Maybe I'm wrong and there will be consensus for a hard fork in the future, but again, I don't see how.
bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 17, 2018, 06:55:08 PM
Merited by aliashraf (2), BitHodler (1)
 #21

You can't really know if "the vast majority of the network" is on the same page until D-day actually comes. We have already seen miners signal their supposed "intention" to support something with their hashrate, and then, when the day came, some of them backpedaled. You would also need all exchanges on board. And ultimately you would need all the whales on board, and many of them may not voice their opinion at all; then the day of the fork comes and you see a huge dump of your forked coin.

You basically need 100% consensus for a hard fork to be a success and not end up with 2 coins, and I don't see how that is even possible once a project gets as big as Bitcoin (it's still small in the grand scheme of things, but in terms of open-source software development and network effect it's big enough never to be able to hard fork seamlessly again). Maybe I'm wrong and there will be consensus for a hard fork in the future, but again, I don't see how.

What's this fear of having 2 coins? We already have hundreds of coins, most of them being more or less hard forks of BTC. The free market will decide which coins persist and which go by the wayside. BTC has already demonstrated over and over again that it is the honey badger. If we honestly have faith that BTC is anti-fragile, pesky minority coins are nothing but a mere nuisance.

BitHodler
Legendary
*
Offline Offline

Activity: 1386
Merit: 1163


View Profile
November 17, 2018, 10:44:35 PM
Merited by bones261 (1)
 #22

What's this fear of having 2 coins? We already have hundreds of coins, most of them being more or less hard forks of BTC. The free market will decide which coins persist and which go by the wayside. BTC has already demonstrated over and over again that it is the honey badger. If we honestly have faith that BTC is anti-fragile, pesky minority coins are nothing but a mere nuisance.
Absolutely nothing is wrong with that. I would even say that it improves scalability potential, since you no longer have to care about another side insisting its own roadmap and implementation is the one to follow.

Remember the drama we went through before the Bcash split? They are gone and we no longer have to care about what their plans are, which is the best thing that could happen to Bitcoin.

I get that people want to protect Bitcoin, or feel that they have to, but if it is the unbreakable powerhouse people say it is, then why worry about what's going to happen? Let the economy do its work.

Most forks will fail anyway because they won't be backed by any noteworthy players. There won't be a second Bcash. We've seen the worst.
cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 18, 2018, 12:42:30 AM
 #23

The main problem is that most of the scaling solutions being proposed would first require a hardfork. This means we'd have the drama of 2 competing bitcoins each claiming to be the real one (see the BCash ABC vs BCash SV war going on right now). This will not end well. Without consensus we will just end up with 2 bitcoins whose combined value is less than before the hardfork happened.

Most bitcoin whales don't support any of the scaling solutions proposed so far, so your scaling fork will end up dumped by tons of coins.

Hi, I think we don't need to hard-fork to implement something like the node Types A, B, C, D... they could exist at the same time as full nodes. We would only need to hard fork later if we want to raise the block size, because I read something the other day about a solution to change the block without hard forking.

But Satoshi left the block size limit so it could be changed in the future, and we will really need to change it. Even with LN working 100%, we will need to raise the block size, or else 2MB will not support all the ONCHAIN->OFFCHAIN transactions and vice versa.

If we don't do that, it would be dangerous for layer 1: who would support layer-1 nodes and mining if almost 100% of transactions were done on layer 2?

Satoshi said that after the 21 million coins are mined, the miners would be supported only by transaction fees. What fees, if everyone only uses LN? When you want to move money ONCHAIN the miners might ask you maybe $5000, because there would be very few transactions and those transactions would need to cover the layer-1 expenses.
ETFbitcoin
Legendary
*
Offline Offline

Activity: 1764
Merit: 2023

Use SegWit and enjoy lower fees.


View Profile WWW
November 18, 2018, 04:49:16 AM
 #24

Hi, I think we don't need to hard-fork to implement something like the node Types A, B, C, D... they could exist at the same time as full nodes. We would only need to hard fork later if we want to raise the block size, because I read something the other day about a solution to change the block without hard forking.

Then how about backward compatibility with older nodes/clients?
1. They wouldn't know that they need to connect to different types of nodes to get and verify all transactions
2. There's no guarantee they would get every transaction/block, as there's no guarantee they'd connect to all the different types of nodes

But Satoshi left the block size limit so it could be changed in the future, and we will really need to change it. Even with LN working 100%, we will need to raise the block size, or else 2MB will not support all the ONCHAIN->OFFCHAIN transactions and vice versa.

If we don't do that, it would be dangerous for layer 1: who would support layer-1 nodes and mining if almost 100% of transactions were done on layer 2?

Satoshi said that after the 21 million coins are mined, the miners would be supported only by transaction fees. What fees, if everyone only uses LN? When you want to move money ONCHAIN the miners might ask you maybe $5000, because there would be very few transactions and those transactions would need to cover the layer-1 expenses.

I agree that we need to increase the block size, as LN alone isn't enough, but any kind of hard-fork is difficult to execute without a chain split or creating a new "Bitcoin".

That won't happen, because:
1. LN was never created for processing transactions with large amounts of Bitcoin, especially due to the inherent risk of needing to stay online or rely on a watchtower
2. All LN channels have an expiry time, so on-chain/1st-layer transactions are still required to open and close channels
3. People who want to send large amounts of Bitcoin, or can't stay online 24/7 (or don't want to rely on a watchtower), prefer on-chain transactions

cellard
Legendary
*
Offline Offline

Activity: 1372
Merit: 1211


View Profile
November 18, 2018, 03:52:12 PM
 #25

You can't really know if "the vast majority of the network" is on the same page or not until D-day actually comes. We have already seen miners signalling their supposed "intention" to support something with their hashrate, then when the day comes some of them backpedal. You would also need all exchanges on board. And ultimately you would need all whales on board, and many of them may not bother to state their opinion at all; then the day of the fork comes and you see a huge dump on your forked coin.

You basically need 100% consensus for a hardfork to be a success and not end up with 2 coins, and I don't see how this is even possible once a project gets as big as Bitcoin is (I mean it's still small in the grand scheme of things, but open-source software development and network-effect wise, it's big enough to never be able to hardfork seamlessly again). Maybe I'm wrong and there is consensus for a hardfork in the future, but again, I don't see how.

What's this fear of having 2 coins? We already have hundreds of coins, most of them being more or less hard forks of BTC. The free market will decide which coins persist and which go by the wayside. BTC has already demonstrated over and over again that it is the honey badger. If we honestly have faith that BTC is anti-fragile, pesky minority coins are nothing but a mere nuisance.

Well, isn't it obvious? Ask someone with a decent amount at stake in Bitcoin if they want to see, say, their 100 BTC crash in value because someone decided to hardfork with the same hashing algorithm, which means you will see miners speculating with the hashrate while they can. See the recent dip in hashrate, the biggest loss of hashrate at a difficulty adjustment this year:

I guarantee you that Bitcoin holders don't appreciate this bullshit. Not that these forks pose a systemic risk to Bitcoin, but they are annoying and slowing down the rocket.

Anyhow, the real question should be: what's the fear of starting your own altcoin if it's such a good idea, instead of constantly trying to milk 5 minutes of fame by forking it off Bitcoin? If your idea is so good, start it as an actual altcoin and compete.

DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 18, 2018, 04:10:28 PM
 #26

What's this fear of having 2 coins? We already have hundreds of coins, most of them being more or less hard forks off of BTC. The free market will decide which coins persists and which coins go by the wayside. BTC has already demonstrated over and over again that it is the honey badger. If we honestly have faith that BTC is anti-fragile, pesky minority coins are nothing but a mere nuisance.

Well, isn't it obvious? Ask someone with a decent amount at stake in Bitcoin if they want to see, say, their 100 BTC crash in value because someone decided to hardfork with the same hashing algorithm, which means you will see miners speculating with the hashrate while they can. See the recent dip in hashrate, the biggest loss of hashrate at a difficulty adjustment this year:

I guarantee you that Bitcoin holders don't appreciate this bullshit. Not that these forks pose a systemic risk to Bitcoin, but they are annoying and slowing down the rocket.

This sounds like an argument that the feelings of those who hold bitcoin because they're speculating on the price should somehow outweigh the feelings of those who hold Bitcoin because they appreciate the fundamental principles of freedom and permissionlessness.  It's an argument for the ages, certainly, but one that neither side will ever back down from.

Ideological differences are naturally going to occur and forks are going to happen.  Your desire to see higher prices isn't going to prevent that.  These are exactly the risks people should consider when they invest if all they care about is the potential profit.

Bitcoin is still very much the "Wild West of money".  Don't decide to play cowboy if you can't handle some bandits every now and then.   Wink

aliashraf
Hero Member
*****
Offline Offline

Activity: 896
Merit: 656


View Profile
November 18, 2018, 04:46:40 PM
Last edit: November 18, 2018, 06:09:37 PM by aliashraf
Merited by Wind_FURY (1)
 #27

I suppose this thread is going off the rails by falling into the old "To fork or not to fork? That is the question!" story, which cellard is an expert in  Cheesy
The fact is that no matter what bitcoin whales want, forks happen, and sooner or later the current bitcoin will need to be retired as the former legitimate chain while the community converges on an improved version. You just can't stop evolution from happening, ok?

For now, I'm trying to save this thread by rolling back to the original technical discussion and sharing an interesting idea with you guys meanwhile. Please focus:
so let's say there is a transaction Tx1 and it is in the other half of the "data" in a node Type B, and say I run a node Type A. If someone pays me by spending Tx1, I do not have it in my database (blockchain), so how can I know it is a valid transaction? Am I supposed to assume it is valid on faith?! Or am I supposed to connect to another node, ask if the transaction is valid, and then trust that other node to tell me the truth? How would you prevent fraud?
What the OP proposes, as other posters in the thread have correctly categorized, is a special version of sharding.
Although it is an open research field, it is a MUST for bitcoin. Pushing the scaling problem onto second-layer protocols (like LN) is the worst idea because you can't simulate bitcoin on top of bitcoin as a layer 2; it is absurd. Going to a second layer won't happen without giving up some essential feature of bitcoin, or at least tolerating centralization and censorship threats, compromising the cause.

So this is it, our destiny: we need a scalable blockchain solution, and as of now, all we have is sharding.

Back to Pooya's objection: it occurs when a transaction that is supposed to be processed in one partition/shard tries to spend an unspent output from another shard. I think there may be a workaround for this:

Suppose, in a sharding scheme based on transaction partitioning that uses a simple mod operation, where txid mod N determines the transaction's partition number in an N-shard network, we put a constraint on transactions such that wallets are strongly disincentivized from (or not allowed to) build a transaction from heterogeneous outputs, i.e. outputs of transactions in multiple shards.

Now we have this transaction tx1 whose inputs all spend outputs of transactions maintained on the same shard; the question is which shard the transaction itself belongs to. The trick is adding a nonce field to the transaction format and having the wallet software perform about N/2 hashes (a very small amount of work) to find a nonce that makes txid mod N land the transaction in the same shard as the outputs it spends. For coinbase transactions, the same measure should be taken by miners.

It looks somewhat scary, being so partitioned, but I'm working on it as it looks very promising to me.
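The nonce trick described above can be sketched in a few lines. This is purely illustrative: the shard count N_SHARDS, the toy serialization, and the 4-byte little-endian nonce are assumptions for the sketch, not any existing Bitcoin format; only the double-SHA256 txid mirrors Bitcoin's convention.

```python
import hashlib

N_SHARDS = 16  # hypothetical shard count, for illustration only

def txid_int(tx_bytes: bytes) -> int:
    """Double-SHA256 of the serialized tx, as an integer (Bitcoin-style txid)."""
    digest = hashlib.sha256(hashlib.sha256(tx_bytes).digest()).digest()
    return int.from_bytes(digest, "big")

def shard_of(tx_bytes: bytes) -> int:
    """Shard assignment: txid mod N."""
    return txid_int(tx_bytes) % N_SHARDS

def grind_nonce(tx_body: bytes, target_shard: int) -> int:
    """Grind a nonce so the tx lands in target_shard.

    On average this takes about N_SHARDS/2 hashes, which is the
    'very small amount of work' the post refers to."""
    nonce = 0
    while True:
        candidate = tx_body + nonce.to_bytes(4, "little")
        if shard_of(candidate) == target_shard:
            return nonce
        nonce += 1

# Usage: make a tx land on the shard that holds the outputs it spends.
body = b"in:aa|out:bb"            # placeholder serialization
target = 7                        # shard of the spent outputs (assumed)
n = grind_nonce(body, target)
assert shard_of(body + n.to_bytes(4, "little")) == target
```

The grinding cost grows only linearly with the shard count, which is why the scheme stays cheap for wallets even with many shards.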
cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 18, 2018, 11:09:21 PM
 #28


Then how about backward compatibility with older nodes/clients?
1. They wouldn't know that they need to connect to different types of nodes to get and verify all transactions
2. There's no guarantee they would get every transaction/block, as there's no guarantee they'd connect to all the different types of nodes

I think it's possible to create a new Bitcoin Core version that supports the node Types, emulating normal node communication with old nodes and the new Type-node communication with new ones, at least at the current block size. But that would only be an experiment, because without bigger blocks it is not useful.
cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 18, 2018, 11:21:22 PM
 #29

I suppose this thread is going off the rails by falling into the old "To fork or not to fork? That is the question!" story, which cellard is an expert in  Cheesy
The fact is that no matter what bitcoin whales want, forks happen, and sooner or later the current bitcoin will need to be retired as the former legitimate chain while the community converges on an improved version. You just can't stop evolution from happening, ok?

For now, I'm trying to save this thread by rolling back to the original technical discussion and sharing an interesting idea with you guys meanwhile. Please focus:
so let's say there is a transaction Tx1 and it is in the other half of the "data" in a node Type B, and say I run a node Type A. If someone pays me by spending Tx1, I do not have it in my database (blockchain), so how can I know it is a valid transaction? Am I supposed to assume it is valid on faith?! Or am I supposed to connect to another node, ask if the transaction is valid, and then trust that other node to tell me the truth? How would you prevent fraud?
What the OP proposes, as other posters in the thread have correctly categorized, is a special version of sharding.
Although it is an open research field, it is a MUST for bitcoin. Pushing the scaling problem onto second-layer protocols (like LN) is the worst idea because you can't simulate bitcoin on top of bitcoin as a layer 2; it is absurd. Going to a second layer won't happen without giving up some essential feature of bitcoin, or at least tolerating centralization and censorship threats, compromising the cause.

So this is it, our destiny: we need a scalable blockchain solution, and as of now, all we have is sharding.

Back to Pooya's objection: it occurs when a transaction that is supposed to be processed in one partition/shard tries to spend an unspent output from another shard. I think there may be a workaround for this:

Suppose, in a sharding scheme based on transaction partitioning that uses a simple mod operation, where txid mod N determines the transaction's partition number in an N-shard network, we put a constraint on transactions such that wallets are strongly disincentivized from (or not allowed to) build a transaction from heterogeneous outputs, i.e. outputs of transactions in multiple shards.

Now we have this transaction tx1 whose inputs all spend outputs of transactions maintained on the same shard; the question is which shard the transaction itself belongs to. The trick is adding a nonce field to the transaction format and having the wallet software perform about N/2 hashes (a very small amount of work) to find a nonce that makes txid mod N land the transaction in the same shard as the outputs it spends. For coinbase transactions, the same measure should be taken by miners.

It looks somewhat scary, being so partitioned, but I'm working on it as it looks very promising to me.

Woow, you are the man  Cheesy

mechanikalk
Member
**
Offline Offline

Activity: 84
Merit: 20


View Profile WWW
November 18, 2018, 11:22:27 PM
 #30

This has been discussed many times and unfortunately the majority of Bitcoiners would disagree, since increasing the block size would increase the cost of running full nodes.
Splitting block data across many different node types is called sharding, and it has already been proposed many times, such as BlockReduce: Scaling Blockchain to human commerce.
Besides, IMO sharding opens lots of attack vectors, increases development complexity and requires more trust.

Additionally, LN helps bitcoin scale a lot, even though it's not a perfect solution. Those who say it doesn't clearly don't understand how LN works and its potential.
Lots of cryptocurrencies, including Ethereum, are preparing 2nd-layer/off-chain scaling solutions because they know it's a good approach.

If anyone is interested in a simpler introduction to BlockReduce, they can check out this 30-minute intro that I presented at the University of Texas Blockchain Conference. I ultimately don't think it is all that complicated; it is really just multithreading Bitcoin and tying it back together with merge mining.

BlockReduce Presentation

In terms of issues with Lightning, I think the biggest problem is the cost of capital. Transaction fees will necessarily be non-zero in economic equilibrium because node operators need to recover the cost of money, i.e. opportunity cost. Therefore, you would expect to pay roughly 1-4% annually on the capital locked in a channel for however long it stays open, plus the costs of opening and closing the channel. The only reason LN kinda works right now is that people are not fully accounting for the cost of capital needed to run the nodes.
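The cost-of-capital argument above is easy to put into numbers. All the figures below (4% opportunity cost, 0.5 BTC capacity, 30 days, on-chain fee) are illustrative assumptions, not data from the post:

```python
# Back-of-the-envelope break-even fee for an LN routing channel.
annual_cost_of_capital = 0.04   # 4% per year, top of the ~1-4% range cited
channel_capacity_btc = 0.5      # BTC locked in the channel (assumed)
days_open = 30                  # how long the channel stays open (assumed)
open_close_fee_btc = 0.0002     # assumed total on-chain fee to open and close

# Opportunity cost of the locked capital, pro-rated over the channel lifetime.
capital_cost = channel_capacity_btc * annual_cost_of_capital * days_open / 365

# Routing fees the channel must earn just to break even.
break_even_fees = capital_cost + open_close_fee_btc
print(f"break-even fees over {days_open} days: {break_even_fees:.6f} BTC")
```

Under these assumptions the channel must earn on the order of a few thousand satoshis a month before the operator sees any profit, which is the sense in which equilibrium fees cannot be zero.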
cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 19, 2018, 03:23:03 PM
Last edit: November 19, 2018, 06:10:48 PM by cfbtcman
 #31

People,

What about something like this:

https://ibb.co/c249h0

Vertical blockchain block scaling.
Each block would have the same size limit as now.
Many "vertical" blocks could be created within each 10-minute interval when the mempool is overloaded.
Each new layer blockchain would be served by a new layer of nodes, so we could have many Layer Type Nodes.

Those Layer Type nodes would hold less duplicated data, so we save disk space. If we have 1 million nodes we don't need to store all the information on every node; that's a waste. So when someone wants to install a node, the installer would suggest the best Layer Type node to install, the one the system needs most.
The Layer Type nodes exchange information between them, something like pruned nodes.

This way we avoid downloading and uploading big quantities of data, as happens with BCH's huge block sizes.

All the Layer Type Nodes share the information they need, which can be checked against the information each Layer Type Node already has; everything is hashed with everything.

mechanikalk
Member
**
Offline Offline

Activity: 84
Merit: 20


View Profile WWW
November 19, 2018, 04:51:24 PM
 #32

People,

What about something like this:

https://ibb.co/c249h0

Vertical blockchain block scaling.
Each block would have the same size limit as now.
Many "vertical" blocks could be created within each 10-minute interval when the mempool is overloaded.
Each new layer blockchain would be served by a new layer of nodes, so we could have many Layer Type Nodes.

Those Layer Type nodes would hold less duplicated data, so we save disk space. If we have 1 million nodes we don't need to store all the information on every node; that's a waste. So when someone wants to install a node, the installer would suggest the best Layer Type node to install, the one the system needs most, and the Layer Type nodes exchange information, something like pruned nodes.

This way we avoid downloading and uploading big quantities of data per block, as happens with BCH's huge block sizes.

All the Layer Type Nodes share the information they need, which can be checked against the information each Layer Type Node already has; everything is hashed with everything.



The issue with your proposal above is that you do not know which transactions to include in which vertical block. You will not know which transactions take precedence, and you introduce a mechanism by which a double spend can occur, because conflicting transactions could be put into two blocks at once. However, I do think you are thinking in the right direction.

You should check out BlockReduce, which is similar to this idea but moves transactions through a PoW-managed hierarchy to ensure consistency of state. If the manuscript is a bit long, please also check out a presentation I did at the University of Texas Blockchain conference.
Anti-Cen
Member
**
Offline Offline

Activity: 210
Merit: 26

High fees = low BTC price


View Profile
November 21, 2018, 11:41:56 AM
 #33

"Lightning network" == Mini banks

I did warn you all and no the so called new "Off-Block" hubs did not save BTC as we can see from the price.

CPU-Wars, mere 9 transactions per second from 20,000 miners and fees hitting $55 per transaction is what
the BTC code will be remembered for as it enters our history books just like Tulip Mania did in the 1700's

Casino managers are not the best people in the world to take financial advise from and the same goes for the
dis-information moderator here that keeps pressing the delete button here because he hates the truth being exposed.

Mining is CPU-wars and Intel, AMD like it nearly as much as big oil likes miners wasting electricity. Is this what mankind has come too.
HeRetiK
Legendary
*
Online Online

Activity: 1232
Merit: 1118


the forkings will continue until morale improves


View Profile
November 21, 2018, 12:33:14 PM
 #34

"Lightning network" == Mini banks

They're not.


I did warn you all, and no, the so-called new "Off-Block" hubs did not save BTC, as we can see from the price

Caring about short-term fluctuations is usually a sign of a lack of long-term thinking.


CPU wars, a mere 9 transactions per second from 20,000 miners and fees hitting $55 per transaction is what the BTC code will be remembered for as it enters our history books, just like Tulip Mania did in the 1630s

Irrelevant to the discussion.


Casino managers are not the best people in the world to take financial advice from, and the same goes for the disinformation moderator here who keeps pressing the delete button because he hates the truth being exposed.

See above, which may also be why some of these posts got deleted, rather than a hidden conspiracy by big crypto.

DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 21, 2018, 01:57:44 PM
 #35

<snip>

Off topic, but I thought RNC and all related accounts were banned?  I did have a link, but copying it on a phone is a ballache.

//EDIT:   https://bitcointalk.org/index.php?topic=2617240.msg31377296#msg31377296
RNC admitted to ban evasion and being Anti-Cen there.

cfbtcman
Member
**
Offline Offline

Activity: 233
Merit: 12


View Profile
November 21, 2018, 09:46:36 PM
Last edit: November 22, 2018, 12:36:58 AM by cfbtcman
 #36

"Lightning network" == Mini banks

I did warn you all and no the so called new "Off-Block" hubs did not save BTC as we can see from the price.

CPU-Wars, mere 9 transactions per second from 20,000 miners and fees hitting $55 per transaction is what
the BTC code will be remembered for as it enters our history books just like Tulip Mania did in the 1700's

Casino managers are not the best people in the world to take financial advise from and the same goes for the
dis-information moderator here that keeps pressing the delete button here because he hates the truth being exposed.

The exchanges are already the mini banks, maybe worse than the Lightning Network. With LN it's like you deposit a month's salary, then pay your expenses, receive, spend and earn with the same money; if you lose money, you won't lose that much. The current exchanges are worse: MtGox, Btc-e, Wex, etc. stole millions and nobody cares.

Maybe BTC is a little complicated for the common citizen and we need those "mini or big banks" to work together.
Wind_FURY
Hero Member
*****
Offline Offline

Activity: 1218
Merit: 812


Crypto-Games.net: Multiple coins, multiple games


View Profile
November 22, 2018, 08:42:28 AM
 #37

I would be very curious what Bitcoin's network topology would look like if, by some miraculous event, Bitcoin hard-forked to bigger blocks and sharding with consensus. Haha.

Plus, big blocks are inherently centralizing, and the bigger they go the worse it gets. Wouldn't sharding only prolong the issue rather than solve it?


aliashraf
Hero Member
*****
Offline Offline

Activity: 896
Merit: 656


View Profile
November 22, 2018, 09:30:15 AM
 #38

I would be very curious what Bitcoin's network topology would look like if, by some miraculous event, Bitcoin hard-forked to bigger blocks and sharding with consensus. Haha.

Plus, big blocks are inherently centralizing, and the bigger they go the worse it gets. Wouldn't sharding only prolong the issue rather than solve it?
The OP's proposal is a multilevel hierarchical sharding schema in which bigger blocks are handled at the top level. As I have debated extensively above in this thread, there are a lot of unsolved issues, but I think we need to take every sharding idea seriously and eventually figure out a solution to the scaling problem; hierarchical schemas are promising enough to be discussed and improved, imo.
greg458
Jr. Member
*
Offline Offline

Activity: 336
Merit: 2


View Profile
November 22, 2018, 10:22:41 PM
 #39

The main problem is that most of the scaling solutions being proposed would first require a hardfork. This means we'd have the drama of 2 competing bitcoins each claiming to be the real one (see the BCash ABC vs BCash SV war going on right now). This will not end well. Without consensus we will just end up with 2 bitcoins whose combined value is less than before the hardfork happened.

Most bitcoin whales don't support any of the scaling solutions proposed so far, so your scaling fork will end up dumped by tons of coins.
Wind_FURY
Hero Member
*****
Offline Offline

Activity: 1218
Merit: 812


Crypto-Games.net: Multiple coins, multiple games


View Profile
November 23, 2018, 06:11:31 AM
 #40

I would be very curious what Bitcoin's network topology would look like if, by some miraculous event, Bitcoin hard-forked to bigger blocks and sharding with consensus. Haha.

Plus, big blocks are inherently centralizing, and the bigger they go the worse it gets. Wouldn't sharding only prolong the issue rather than solve it?

The OP's proposal is a multilevel hierarchical sharding schema in which bigger blocks are handled at the top level.


Ok, but big blocks are inherently centralizing the bigger they go, are they not? Sharding would only prolong the scaling issue on the network (any blockchain network), not solve it.

Quote

As I have debated extensively above in this thread, there are a lot of unsolved issues, but I think we need to take every sharding idea seriously and eventually figure out a solution to the scaling problem; hierarchical schemas are promising enough to be discussed and improved, imo.


For Bitcoin? I believe it would be better proposed on a network that has big blocks and no firm restrictions on hard forks: Bitcoin Cash ABC.


franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 01:18:15 PM
Last edit: November 24, 2018, 09:29:01 AM by franky1
Merited by bones261 (2), ETFbitcoin (1)
 #41

it's sufficient that in a group of 1000 nodes, 500 save a backup of 50% of the data (Type A nodes) and the other 500 save a backup of the other 50% of the data (Type B nodes); it's a waste of space for all the nodes in the network to save the same information repeated thousands of times.

Then each Type A node can "ask" a Type B node for the information it is trying to find, and vice versa, and still get the information anyway

Bitcoin was designed to be trustless.  The idea of running a node is that you can validate and verify every single transaction yourself.  If you run a Type A node, you would have to trust the Type B nodes to do half of the validation for you.  If you're going to do that, why not just trust Visa and forget all about Bitcoin?
finally doomad sees the light about "compatibility not = full node".. and how "compatible" is not good for the network..
one merit earned... may he accept the merit and drop that social drama debate now that he's seen the light.

onto the topic
the hard fork of removing full nodes that can only accept 1mb blocks was already done, in mid 2017. as a result, the "compatible" nodes still on the network are no longer full nodes.

all that is required is to remove the "witness scale factor", so the full 4mb can be utilised by legacy transactions AND segwit transactions.
this would also have the positive of removing a lot of wishy-washy lines of code, bringing things back in line with a code base resembling pre-segwit block structuring: a single block structure where everything is together and nothing needs to be stripped for "compatible" nodes.
yes, the "compatible" nodes would stall out and not add blocks to their hard drives. but those nodes are not full nodes anyway, so people using them might as well just use litewallets and bloom-filter the transaction information they NEED for personal use.

we would then have a network able to actually handle more tx/s at a 4x level, rather than the 1.3-2.5x level the current segwit block structure LIMITS us to (yes, even with the suggested 4mb weight, actual calculations limit it to about 2.5x compared to legacy 1mb).
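The weight arithmetic behind those limits can be checked with a quick sketch (simplified from the BIP141 weight formula; the specific transaction mix below is illustrative, not a claim about any client's code):

```python
# BIP141: block weight = 3 * base_size + total_size, capped at 4,000,000
# weight units. This sketch shows why pure legacy blocks stay at ~1MB
# while a segwit mix gets more raw bytes through.

WEIGHT_LIMIT = 4_000_000

def block_weight(base_size: int, total_size: int) -> int:
    """base_size excludes witness data; total_size includes it."""
    return 3 * base_size + total_size

# Pure legacy block: no witness data, so base == total.
# weight = 4 * size, so size is capped at 1,000,000 bytes (the old 1MB).
legacy_cap = WEIGHT_LIMIT // 4

# Illustrative segwit mix where witness bytes equal non-witness bytes:
# base = b, total = 2b -> weight = 5b <= 4M -> b = 800,000,
# total = 1,600,000 bytes, i.e. roughly a 1.6x byte throughput.
segwit_mix_total = 2 * (WEIGHT_LIMIT // 5)
```

Under this formula only witness-heavy blocks approach the full 4MB, which is the gap the "remove the scale factor" argument above is pointing at.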

...
as for how to scale onchain.
please do not throw out "LN is the solution" or "servers will be needed" or "you cant buy coffee"
1. instead of needing LN for coffee by channelling to starbucks, just use one onchain btc tx to buy a $40 starbucks giftcard once a fortnight.

after all, from the non-technical, real-life utility perception of average joe: if you're LN-locking funds with starbucks for a fortnight anyway, it's the same 'feel' as just pre-buying a fortnight's worth of coffee via a giftcard.
(it also sidesteps routing, the multiple-channel requirement for good connectivity, spendability, and the other problems LN has)

2. onchain scaling is not just about raising the blocksize. its also about reducing the sigops limits so that someone can't bloat/maliciously fill a block with just 5 transactions. EG with blocksigops=80k and txsigops=16k, just 5 txs can use up a block's entire sigops budget should they wish. letting that continue as a network rule is by far a bad thing.*

3. point 2 had been allowed so exchanges could batch transactions into single transactions with more in/outputs and get cheaper fees. yet if an exchange is being allowed to bloat a block alone, then that exchange should be paying more for that privilege, not less.
(this stubbornly opens up the debate of whether bitcoin's blockchain should only be used by reserve hoarders of multiple users in permissioned services (exchanges/LN factories), or whether the network should allow individuals wanting permissionless transacting. in my view permissioned services should be charged more than an individual)

4. and as we move away from centralised exchanges that hoard coins, there will be less need for exchanges to batch such huge transactions, and so less need for such bloated transactions

5. scaling onchain is not just about raising the blocksize. its about making it more expensive for users who transact more often than those who transact less frequently.
EG imagine a person spends funds to himself every block, doing it via 2000 separate transactions per block (a spam attack).
he is punishing EVERYONE else, as others that only spend once a month find the fee is higher even though they have done nothing wrong.
the blocks are still only collating the same ~2000tx average, so from a technical perspective this causes no extra 'processing cost' to a mining pool's block collation mechanism (it still only collates ~2000tx, so no cost difference).
so why is the whole network being punished due to one person's spam?

the person spending every block should pay more for spending funds that have fewer confirms than others. in short: the more confirms your UTXO has, the cheaper the transaction gets. that way spammers are punished more.
this can go a stage further: the child fee also increases based not just on how young the parent is, but the grandparent too

in short, bring back a fee priority mechanism, but one that concentrates on the age of the utxo rather than its value (which the old one was based on)
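One way to sketch this age-based priority (purely hypothetical — not a real Bitcoin consensus or policy rule; the 144-block day and the 4x ceiling are illustrative choices):

```python
# Hypothetical fee-priority sketch: the fewer confirmations the spent
# UTXO (and its parent, per the "grandparent" idea above) has, the
# higher the fee multiplier. Illustration only.

from typing import Optional

def age_multiplier(confirms: int) -> float:
    """More confirmations -> cheaper. 144 confirms ~ one day of blocks."""
    if confirms >= 144:          # spent at most once a day: base rate
        return 1.0
    # linear ramp from 4x (respending in the very next block) to 1x
    return 4.0 - 3.0 * (confirms / 144)

def priority_fee(base_fee_sats: int, utxo_confirms: int,
                 parent_confirms: Optional[int] = None) -> int:
    """Fee after the age penalty; a young parent adds a softer extra
    penalty, so chained respends compound."""
    mult = age_multiplier(utxo_confirms)
    if parent_confirms is not None:
        mult += 0.5 * (age_multiplier(parent_confirms) - 1.0)
    return int(base_fee_sats * mult)
```

Under this sketch a once-a-month spender pays the base rate, while a block-by-block respender pays four times as much for the same transaction size.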

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 23, 2018, 02:11:55 PM
 #42

5. scaling onchain is not just about raising the blocksize. its about making it more expensive for users who transact more often than those who transact less frequently.
EG imagine a person spend funds to himself every block. and was doing it via 2000 separate transactions a block (spam attack)
he is punishing EVERYONE else. as others that only spends once a month are finding that the fee is higher, even though they have not done nothing wrong.
the blocks are still only collating the same 2000tx average. so from a technical prospective are not causing any more 'processing cost' to mining pool nodes tx's into block collation mechanism. (they still only collate ~2000tx so no cost difference)
so why is the whole network being punished. due to one persons spam.

the person spending every block should pay more for spending funds that have less confirms than others. in short the more confirms your UTXO has the cheaper the transactions get. that way spammers are punished more.
this can go a stage further that the child fee also increases not just on how young the parent is but also the grandparent

in short bring back a fee priority mechanism. but one that concentrates on age of utxo rather than value of utxo(which old one was)

If you just stuck to raising points like this, rather than simply attacking everything that others are trying to build, I wouldn't have to spend so much time arguing with you.  This is one of those rare cases where we actually agree on something.  My only minor critique with this post is that you did a much better job of explaining this concept here:

imagine that we decided its acceptable that people should have a way to get priority if they have a lean tx and signal that they only want to spend funds once a day. where if they want to spend more often costs rise, if they want bloated tx, costs rise.. which then allows those that just pay their rent once a month or buys groceries every couple days to be ok using onchain bitcoin.. and where the costs of trying to spam the network (every block) becomes expensive where by they would be better off using LN. (for things like faucet raiding every 5-10 minutes)

so lets think about a priority fee thats not about rich vs poor but about respend spam and bloat.

lets imagine we actually use the tx age combined with CLTV to signal the network that a user is willing to add some maturity time if their tx age is under a day, to signal they want it confirmed but allowing themselves to be locked out of spending for an average of 24 hours.

and where the bloat of the tx vs the blocksize has some impact too... rather than the old formulae with was more about the value of the tx


as you can see its not about tx value. its about bloat and age.
this way
those not wanting to spend more than once a day and dont bloat the blocks get preferential treatment onchain.
if you are willing to wait a day but your taking up 1% of the blockspace. you pay more
if you want to be a spammer spending every block. you pay the price
and if you want to be a total ass-hat and be both bloated and respending often you pay the ultimate price

I've yet to hear any technical arguments from anyone as to why this isn't a good idea and something we should be seriously looking at.  In fact, I'd even suggest you start a new topic in Development & Technical Discussion just for this point alone.
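The quoted two-axis idea can be made concrete with a small sketch (hypothetical, not an actual Bitcoin policy; the tier multipliers and the 1%-of-blockspace threshold are invented for illustration):

```python
# Sketch of the quoted priority scheme: penalise blockspace bloat and
# frequent respending independently, and hardest in combination.

BLOCK_VSIZE = 1_000_000        # illustrative block capacity in vbytes
DAY_BLOCKS = 144               # roughly one day of blocks
BLOAT_FRACTION = 0.01          # "taking up 1% of the blockspace"

def fee_tier(tx_vsize: int, input_confirms: int) -> float:
    bloated = tx_vsize > BLOAT_FRACTION * BLOCK_VSIZE
    respend = input_confirms < DAY_BLOCKS    # spent again within a day
    if not bloated and not respend:
        return 1.0    # lean and patient: preferential onchain rate
    if bloated and not respend:
        return 2.0    # willing to wait a day but hogs space: pays more
    if respend and not bloated:
        return 3.0    # lean but spending every block: pays the price
    return 5.0        # bloated AND respending: the "ultimate price"
```

Note that tx value never enters the function — only size and age, matching the "not rich vs poor" framing above.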

franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 02:59:46 PM
Last edit: November 23, 2018, 03:11:26 PM by franky1
 #43

If you just stuck to raising points like this, rather than simply attacking everything that others are trying to build, I wouldn't have to spend so much time arguing with you.  This is one of those rare cases where we actually agree on something.

i raise many points many times. the thing is, you meander in whenever the 4-letter word for the center of an apple is mentioned. thus you only see that point.

its like if i said "you never see a blue lambo on the road". suddenly you start looking out for it and only noticing blue lambos, reacting/getting emotional every time you see one, not noticing your lack of observation of, and lack of emotion about, other vehicles

that said, the 4-letter word you defend is actually the one that does type the code and does release the code, because all others have been pushed off the network or pushed out of the relay layer into the downstream "compatibility" layer of the network topology.

so just running nodes wont change the rules. just mining hashpower wont change the rules. we need actual devs to code rules. which goes against your perception of what devs should be doing, and is what we disagree on.
devs should be listening to the community.

again, whats the point of me posting something about a rule change, like the second part of your reply, if your side feels that devs should not listen to the community and can just do whatever they please?
do you at least see my point that the network should not have a powerhouse that ignores the community simply because something doesnt fit "their" roadmap

but seeing as you're on their side, how about just forwarding the second part of your post to them. you can even say its your idea if it helps. i have thrown many ideas out and let anyone take them and use them as their own. like i said a few times, im not writing code as an authoritarian demanding people follow me. i never have. i just inform people what could/should be done, in the hope it wakes people up to see there are other options than just the 4-letter word's roadmap, and that we should not blindly sheep-follow a roadmap as if its the only way

DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 23, 2018, 03:50:52 PM
 #44

we need actual devs to code rules. which goes against your perception of what devs should be doing. which is what we disagree with.
devs should be listening to the community.

again whats the point of me posting something about a rule change like the second part of your reply. if your side feels that devs should not listen to the community and just do whatever they please.
do you atleast see my point that the network should not have a power house that ignores the community, simply because it doesnt fit "their" roadmap

It might be worth considering that if all you ever do is verbally abuse them, they might not be very receptive to what you're saying.  It's not just about having a good idea, it's about how you present it and (particularly in your case) how you conduct yourself while doing so.  

I could have the best idea in the world, but if I spent the entire time slagging off the people who I'm trying to convince to adopt it, it stands to reason that's not going to go the way I'd like it to.

It's overly simplistic to talk about whether developers "should" or "shouldn't" listen to the community.  It's not that black and white.  Each and every single idea has to be treated on a case-by-case basis.  What this is really about is that each developer and dev team is naturally going to produce the code which they believe is most likely to lead to Bitcoin's overall long-term success.  It's not practical for them to implement every random idea people throw out there.  Many of the ideas people suggest (or demand) are not viable.  So they have to be selective and focus on the few decent ideas.  But how can they know your idea is decent if they can't even hear it over all the conspiracy theory babble and outright FUD you constantly spout?  If you want the community to take your idea on board, the onus is on you to present a reasonable argument to support your case and convince them that your idea is actually worth implementing.  Then, with community support, developers are more likely to listen.  If they are then convinced your idea has merit, they are more likely to implement it.  Or, as always, feel free to skip that process and either pay someone to code it, or code it yourself and see how the community reacts then.

But don't just demand shit like an entitled child and then insult the developers when they inevitably ignore you.  When has that attitude ever worked for you in the real world?  That's not how you get what you want.  Try being an adult about this.  Yes, I think you have a good idea, but you really need to work on your people skills.  All you've earned for your efforts so far is negative feedback from a developer.  If you had been more reasonable from the offset, things could have been very different.  Seriously re-think your posting habits and mannerisms in general.  You only have yourself to blame if people aren't taking you seriously.

All that put aside, I will start a topic about fee priority for you if you don't think you can conduct yourself appropriately.  But I think it would be healthy for you to start one, while taking on board what I've said above and really watching your tone.  Maybe even open with an apology for your behaviour to date.  It's not too late to change.

franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 04:31:46 PM
 #45

It might be worth considering that if all you ever do is verbally abuse them, they might not be very receptive to what you're saying.  It's not just about having a good idea, it's about how you present it and (particularly in your case) how you conduct yourself while doing so.  

...wall of text of accusations and insults.....

  Maybe even open with an apology for your behaviour to date.  It's not too late to change.

maybe instead of defending them, realise they release code BEFORE community input/conversation.

EG
they had the segwit roadmap planned from 2014, before community input
they had code before the community got to download it
the community had a november 2016 - spring 2017 decision, and the devs lost
but that was not acceptable to the devs, hence the mandatory august 2017.. (no conspiracy. check it all out (i wont use the R word.))

it was not a concept of taking in random community ideas and going with the best one. it was them building their roadmap and swaying the community to adopt it.
you might want to check it all out (i wont use the R word.)

as for me insulting . thats the bear biting the poke. not me poking the bear.
same went for your social drama involvements
if you count the actual insults i give versus the insults you give, you will be surprised by the result

all i said to you before as REPLIES was to re***rch(i know you dont like the word) .. check it out.
you were the one that resorted to flame wars.

anyways. things wont change no matter how much i kiss the ass of a dev. the one main issue is that the devs should change to be a more OPEN community, and not REKT things that are not their plan..

sorry, but letting them lead and control, and requiring everyone to kiss the royal ring, is not what decentralisation is about

aliashraf
Hero Member
*****
Offline Offline

Activity: 896
Merit: 656


View Profile
November 23, 2018, 04:45:06 PM
 #46

5. scaling onchain is not just about raising the blocksize. its about making it more expensive for users who transact more often than those who transact less frequently.
EG imagine a person spend funds to himself every block. and was doing it via 2000 separate transactions a block (spam attack)
he is punishing EVERYONE else. as others that only spends once a month are finding that the fee is higher, even though they have not done nothing wrong.
the blocks are still only collating the same 2000tx average. so from a technical prospective are not causing any more 'processing cost' to mining pool nodes tx's into block collation mechanism. (they still only collate ~2000tx so no cost difference)
so why is the whole network being punished. due to one persons spam.

the person spending every block should pay more for spending funds that have less confirms than others. in short the more confirms your UTXO has the cheaper the transactions get. that way spammers are punished more.
this can go a stage further that the child fee also increases not just on how young the parent is but also the grandparent

in short bring back a fee priority mechanism. but one that concentrates on age of utxo rather than value of utxo(which old one was)
Multiple layers of confusion there:

1- Spamming is bad for bitcoin but not a serious threat right now. Besides transaction malleability, which was a flaw and has essentially been fixed since SegWit, other spamming practices come with a cost. Making bitcoin more resistant to spam is good, but it's neither an urgent agenda nor a relevant issue for on-chain scaling.

2- It isn't reasonable to call it "punishing others" when you make frequent txs. You pay fees, you utilize the service, that simple.

3- And it would be insane to encourage people to keep the UTXO set more occupied! Actually I'm working in the totally opposite direction. By any measure we need to keep the UTXO set as light as possible; the UTXO set is the problem and not the chain, as the chain can always be pruned while the UTXO set can't. I think people should pay even more when their balance is kept for a longer time than others. It occupies space!

Let's discuss an alternative approach:

Suppose, as a decentralization measure, we incentivize wallets to run a full node and participate actively in consensus via a replace-fee-by-work schema. In this schema wallets are free to do some work (solving an ASIC-resistant hash function) and include a nonce plus the hash of the most recent block they are ready to commit to. For confirming such transactions, miners enjoy a tiny discount on the difficulty they need to prove for their block, and are ready to be more generous about the fee they expect. Let's call it Direct Commitment By Transactions.

For such a schema to work we need to give more weight to commitments to newer blocks, and stop giving any credit to transactions committing to blocks older than 100 or so. More interestingly, we could discourage hoarding and occupying the UTXO set too long by means of another complementary technique which is even more disruptive and deserves to be called Indirect Commitment By Transactions.

Traditionally, in bitcoin and its clones, ordinary transactions use a txid:[#output] format for their inputs. Although this approach has benefits and provides a level of convenience for users, it has drawbacks too. For instance, replay attacks on forks wouldn't be possible if wallets were committing to the branch they are making the transaction for.
In the Direct method above we have established such a possibility; now suppose we go even further and let transactions refer to outputs by their relative position in the blockchain, and give another bonus (difficulty discount) to miners for including transactions that spend younger outputs from the correct chain, since by this reference they are indirectly committing to the right chain, helping security.
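The Direct Commitment idea can be sketched in a few lines (a hypothetical protocol as described above, not anything implemented; the zero-bit target, discount size, and function names are invented for illustration):

```python
# Sketch of "Direct Commitment By Transactions": a wallet grinds a small
# proof-of-work over its tx plus a recent block hash, and miners earn a
# difficulty discount that decays with the age of the committed block.

import hashlib

def commitment_pow(tx_bytes: bytes, block_hash: bytes,
                   target_zero_bits: int = 12) -> int:
    """Grind a nonce so sha256(tx || block_hash || nonce) has the
    required number of leading zero bits."""
    target = 1 << (256 - target_zero_bits)
    nonce = 0
    while True:
        h = hashlib.sha256(
            tx_bytes + block_hash + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce
        nonce += 1

def difficulty_discount(committed_height: int, tip_height: int,
                        max_age: int = 100,
                        full_discount: float = 0.01) -> float:
    """Commitments to newer blocks earn more; nothing past max_age."""
    age = tip_height - committed_height
    if age < 0 or age >= max_age:
        return 0.0
    return full_discount * (1 - age / max_age)
```

(A real schema would need an ASIC-resistant hash rather than sha256; sha256 here just keeps the sketch self-contained.)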
LeGaulois
Copper Member
Legendary
*
Offline Offline

Activity: 1190
Merit: 1179

Bitcoin Ninja Unregulated Banker Unbanking Folks


View Profile
November 23, 2018, 04:47:36 PM
 #47

@franky1
This is what we call being proactive and anticipating. The SegWit roadmap from 2014 you mention is one example. Are we forced to use SegWit? As DooMAD says, they cannot integrate everyone's wishes, but they anticipate in order to make Bitcoin usable with various convenient solutions. It's like complaining because someone is working to improve Bitcoin, and then talking about consensus. A consensus from the masses could turn into a 10-year-old kid's decision.

bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 23, 2018, 05:06:01 PM
 #48

@franky1
This is what we call being proactive and anticipating. The SegWit roadmap from 2014 you mention is one example. Are we forced to use SegWit? As DooMAD says, they cannot integrate everyone's wishes, but they anticipate in order to make Bitcoin usable with various convenient solutions. It's like complaining because someone is working to improve Bitcoin, and then talking about consensus. A consensus from the masses could turn into a 10-year-old kid's decision.

   No, we are not forced to use Segwit. However, those who choose not to use Segwit are penalized by paying higher fees. This may only amount to pennies at the moment, but it can add up. If BTC usage grows further, many casual users will then be compelled to use LN to avoid prohibitive fees.

LeGaulois
Copper Member
Legendary
*
Offline Offline

Activity: 1190
Merit: 1179

Bitcoin Ninja Unregulated Banker Unbanking Folks


View Profile
November 23, 2018, 05:40:44 PM
 #49

@bones26
Perhaps, but @franky1 regularly tells us how Bitcoin is dying since there are "fewer transactions". So the case you mention is not supposed to happen, or only randomly (during the Christmas month with everyone buying gifts, lol). If you follow him, there is no problem anymore with fees or congestion.

bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 23, 2018, 06:03:11 PM
 #50

Perhaps, but @franky1 regularly tells us how Bitcoin is dying since there are "fewer transactions". So the case you mention is not supposed to happen, or only randomly (during the Christmas month with everyone buying gifts, lol). If you follow him, there is no problem anymore with fees or congestion.
   We just want to give people a better alternative to the banking system. If we are just going to herd people into using the lightning network, is that system really any better? If BTC ever gets wide adoption, it won't even be affordable for many to open an LN channel for themselves. They will have to rely on some gateway service to open channels, and their "share" of the channel will just be tracked by that gateway service. The casual user will have to rely on some third party and hope it doesn't get "hacked."
    To get back on topic a little: I think it would be great if BTC could implement some kind of system where a node can validate each and every transaction without having to devote the full hard disk space up front. My understanding of pruning is that you initially have to devote the required hard disk space before you can prune. I'm not certain the OP's proposal, with its sharding, is the best way to go about this.

aliashraf
Hero Member
*****
Offline Offline

Activity: 896
Merit: 656


View Profile
November 23, 2018, 06:20:30 PM
 #51

Perhaps, but since @franky1 regularly tell us how Bitcoin is dying since there are "fewer transactions". The case you mention is not supposed to happen so, or just randomly (during the Christmas month with all people buying the gifts lol). There is no problem anymore with the fees or congestion by following him
 ... I think if BTC can implement some kind of system where a node can validate each and every transaction without having to initially devote enough hard disk space, that would be great. My understanding of pruning is that you initially have to devote the required hard disk space before you can prune. I'm not certain the OPs proposal is the best way to go about this with the sharding.
I have discussed it a couple of times before, and you are welcome to check this post, for instance. AFAICT what you are asking for is totally possible, and it leads to a new generation of full nodes light enough to retire current SPV nodes.

By the way, hierarchical or not, sharding as an on-chain scaling solution is not something we could ever ignore.
franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 06:26:58 PM
Last edit: November 23, 2018, 06:38:48 PM by franky1
 #52

anyway, back on topic.

the scaling onchain:
reducing how much sig-op control one person can have is a big deal.
i would say the sigops limits alone can be abused, more so than the fee war, to victimise other users, and need addressing

as for transactions per block: like i said (only reminding to get back ontopic), removing the witness scale factor and the wishy-washy code, to realign the block structure into a single block that doesnt need stripping, is easy, as the legacy nodes are not full nodes anyway

but this can only be done by devs actually writing code. other teams have tried but found themselves relegated downstream as "compatible" or rejected off the network. so the centralisation of devs needs to change
(distributed nodes do not mean decentralised rule control.. we need decentralised, not just distributed)

as for other suggestions of scaling.
others have said sidechains. the main issue is the on-off ramp between the two

an alternative concept could be a new transaction format (imagine bc1q.. but instead SC1) which has no lock:
the bitcoin network sees
bc1q->SC1 as a go-to-sidechain tx (mined by both chains)
and
SC1->bc1q as a return-to-mainnet tx (mined by both chains)

mainnet will not relay or collate (mine into blocks) any sc1->sc1 transactions (hence no need to lock)
the sidechain will not relay or collate (mine into blocks) any bc1q->bc1q transactions (hence no need to lock)

this way it avoids a situation of "pegging" such as
bc1q->bc1q(lock)                                sc1(create)->sc1

having bc1q->sc1 is not about pegging a new token into creation.
its about taking the transaction off the main chain, mining it also into a sidechain, and then only being able to move it sc1->sc1 on the sidechain until its put back into a bc1q address, after which its only able to move on mainnet

i say this because having a
bc1q->bc1q with a lock can open up abuse based on timing, and also loss of the key for the bc1q address.
whereas moving funds to an sc1 address absolves the mainnet of that loss/risk, as the value is no longer in a bc1q address (its spent), the value having moved with the transaction to the sidechain
(this also solves the UTXO issue on mainnet of not having to hold 'locked' value)

allowing value to flow without a time lock lets an audit of funds show its still one value moving, instead of a locked value on the main chain plus a new value on the sidechain

i do have issues and reservations about sidechains too, but the original "pegging" concept of sidechains really was bad and open to abuse (the BTC side not being seen as "spent" while spending was actually happening)
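The prefix-based relay rules described above can be sketched as a routing policy (hypothetical scheme — the "sc1" prefix and both function names are invented for illustration; "bc1q" is just standing in for a mainnet address):

```python
# Sketch of the proposed relay policy: each chain relays a tx only when
# it belongs there, so same-prefix txs stay on their own chain and
# cross-prefix moves are mined by both chains -- no peg lock needed.

def chain_of(addr: str) -> str:
    if addr.startswith("bc1q"):
        return "main"
    if addr.startswith("sc1"):
        return "side"
    raise ValueError("unknown address prefix: " + addr)

def relayed_by(input_addr: str, output_addr: str) -> set:
    """Which chains relay/mine this tx under the sketched rules."""
    src, dst = chain_of(input_addr), chain_of(output_addr)
    if src == dst:
        # bc1q->bc1q: mainnet only; sc1->sc1: sidechain only
        return {src}
    # bc1q->sc1 or sc1->bc1q: the cross-chain hop, mined by both
    return {"main", "side"}
```

Because the mainnet output is genuinely spent on the cross-chain hop, there is no locked-but-spendable UTXO sitting on mainnet, which is the abuse case the post above is arguing against.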

DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 23, 2018, 06:36:14 PM
Merited by bones261 (2)
 #53

they had segwit roadmap plan from 2014. before community input
they had code before community got to download.

Which means someone made a compelling argument about the idea and most of the developers in that team agreed with it.  Ideas can come from anywhere, including from developers themselves.  Saying that developers shouldn't work on an idea just because a developer proposed it isn't a mature or realistic stance.

I don't know where you get this perverse notion that developers need permission from the community before they are allowed to code something.  And crucially, if you start making the argument that it should work that way, then you will totally destroy any opportunity for alternative clients to exist.  Would the community have given the green light to the developers of that client you're running right now?  I find that pretty doubtful.  You think "REKT" is bad?  See how much you complain if no one was even allowed to code anything unless the community gave their blessing first.  That's how you ruin decentralisation.

Be careful what you wish for.  You really aren't thinking this through to conclusion.

franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 06:51:54 PM
Last edit: November 23, 2018, 07:36:42 PM by franky1
 #54

here we go again: you poke, i bite.
shame you're missing the point of decentralisation

they had segwit roadmap plan from 2014. before community input
they had code before community got to download.
Which means someone made a compelling argument about the idea and most of the developers in that team agreed with it.  Ideas can come from anywhere, including from developers themselves.  Saying that developers shouldn't work on an idea just because a developer proposed it isn't a mature or realistic stance.

^ their internal circle agreed, before letting the community have a say.
i guess you missed the 2014-5 drama.

v not letting the community be involved is a prime example of centralisation
 
I don't know where you get this perverse notion that developers need permission from the community before they are allowed to code something.

do you ever wonder why i just publicly give out ideas and let people decide yay or nay, rather than keeping ideas secret, making code, and then demanding adoption? again, before trying to say im demanding anything: show me a line of code i made that had a mandatory deadline that would take people off the network if not adopted.
.. you wont. there is no need for your finger-pointing that im an authoritarian demanding rule changes, because there are no rule changes demanded by me

i find it funny that you flip-flop about community involvement.
my issue is that they plan a roadmap, code a roadmap, release it, and even if it's rejected, they mandate it into force anyway

emphasis.. MANDATE without community ability to veto

again, the point you're missing:
having code that allows community vote/veto (2016, good)
having code that mandates activation without vote/veto (2017, bad)

you do realise that core could have had segwit activate by christmas 2016 if they had actually gone with the 2015 consensus (an early variant of segwit2x), which was a compromise the wider community had agreed on,
and which gave legacy benefits too.
but by avoiding it, and causing drama all the way through 2016 about wanting it only their way (segwit1x), pretending they couldn't code it any other way,
they still didn't get a fair, true consensus vote in their favour in spring 2017. so they had to resort to mandatory activation, and swayed the community with a (fake) offer of segwit2x (nya), only to backtrack to segwit1x once they got the segwit part activated

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
DooMAD
Legendary
*
Offline Offline

Activity: 2100
Merit: 1342


Leave no FUD unchallenged


View Profile WWW
November 23, 2018, 07:19:38 PM
 #55

here we go again  you poke, i bite.
shame you are missing the point of decentralisation

Shame you are missing the point of permissionless.

And again, you would only make Bitcoin more centralised if the community had to approve code before anyone could write it.  Don't dodge the argument by telling me I'm missing the point when you're deliberately evading the point.  You can't insist on a handicap for one dev team and then claim you want a level playing field.  It's already level, because anyone can code what they want.  Clearly what you want is an un-level playing field where the dev team you don't like have restrictions on what they can do, but everyone else is free to do whatever.  In the past, others have demanded the same un-level playing field, except stacked against alternative clients.  They argued (wrongly) that the developers of alternative clients needed permission from the community to publish the code they did.  I defended the alternative clients. 

How can I be the one missing the point of decentralisation when my argument defends the right of everyone to code what they want?  That means we get multiple clients.  You're the one arguing that developers need to have permission from the community to code stuff and alternative clients would simply not get that permission.  That means we would only get ONE client (and it wouldn't be the one you want).  You should be agreeing with me on this, not fighting me. 


do you ever wonder why i just publicly give out idea's and let people decide yay or nah. rather than keep idea's in secret and make code and then demand adoption. again before trying to say im demanding anything. show me a line of code i made that had a mandatory deadline that would take people off the network if not adopted.
.. you wont. there is no need for your finger pointing that im an authoritarian demanding rule changes. because there is no demanding rule changes made by me

You're demanding a change in the way developers act.  You don't have any code to show because it isn't possible for code to achieve what you're demanding. 


emphasis.. MANDATE without community ability to veto

You're using your veto right now by running a non-Core client.  If enough people did that, consensus would change.  The problem you appear to be having is that most people on the network have no desire to use their veto.  They don't want consensus to change. 

Cue Franky1 deflecting from all of these points instead of countering them in 3... 2...

ETFbitcoin
Legendary
*
Offline Offline

Activity: 1764
Merit: 2023

Use SegWit and enjoy lower fees.


View Profile WWW
November 23, 2018, 07:25:47 PM
 #56

franky1 & DooMAD, both of you are going off-topic again

an alternative concept could be a new transaction format (imagine bc1q.. but instead sc1) which has no lock.
the bitcoin network sees
bc1q->sc1 as a go-to-sidechain tx (mined by both chains)
and
sc1->bc1q as a return-to-mainnet tx (mined by both chains)

mainnet will not relay or collate (mine into blocks) any sc1 -> sc1 transactions (hence no need to lock)
the sidechain will not relay or collate (mine into blocks) any bc1q -> bc1q transactions (hence no need to lock)

this way it avoids a situation of "pegging" such as
bc1q->bc1q(lock)                                sc1(create)->sc1

having bc1q->sc1 is not about pegging a new token into creation.
it's about taking the transaction off the main chain, mining it also into a sidechain, and then only being able to move sc1 address->sc1 on the sidechain until it's put back into a bc1q address, which can then only move on mainnet.

i say this because a
bc1q->bc1q that has a lock can open avenues for abuse based on timing, and also loss of the key for the bc1q address.
whereas moving funds to a sc1 address absolves the mainnet of the loss/risk, as the value is no longer in a bc1q address (it's spent), and the value moves with the transaction to the sidechain.
(this solves the UTXO issue on mainnet of not having to hold 'locked' value)

allowing value to flow without a time lock allows auditing to show it's still one value moving, instead of a locked value on the main chain and a new value on the sidechain.

i do have issues and reservations about sidechains too, but the original "pegging" concept of sidechains really was bad and open to abuse (the BTC side not being seen as "spent" while spending was actually happening)

This is an interesting idea, and oddly it has similarities with the proposal Superspace: Scaling Bitcoin Beyond SegWit in the part about moving between main-chain and side-chain.
But thinking about UI/UX, introducing another address format is confusing for most users. Even 1..., 3... and bc1... are plenty confusing.
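The relay rule quoted above can be sketched in a few lines. This is a minimal illustration, not a real protocol: the "sc1" prefix and the classify/relay helpers are hypothetical names taken from the post, and real address parsing is far more involved.

```python
# Sketch of the no-lock cross-chain relay rule (hypothetical "sc1" prefix).

def chain_of(address: str) -> str:
    """Classify an address by its prefix."""
    if address.startswith("sc1"):
        return "side"
    return "main"  # legacy 1.., 3.. and bech32 bc1q.. all live on mainnet

def relay_targets(inputs, outputs):
    """Decide which chain(s) should mine a transaction.

    main -> main : mainnet only (sidechain ignores it)
    side -> side : sidechain only (mainnet ignores it)
    cross-chain  : mined by BOTH chains, so the value is seen as spent
                   on one chain and spendable on the other -- no peg/lock.
    """
    src = {chain_of(a) for a in inputs}
    dst = {chain_of(a) for a in outputs}
    if src == {"main"} and dst == {"main"}:
        return {"main"}
    if src == {"side"} and dst == {"side"}:
        return {"side"}
    return {"main", "side"}  # bc1q->sc1 or sc1->bc1q move
```

Under this sketch a wallet that never touches sc1 addresses only ever produces `{"main"}` transactions, which matches the "average joe never notices" argument.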

aliashraf
Hero Member
*****
Offline Offline

Activity: 896
Merit: 656


View Profile
November 23, 2018, 07:27:34 PM
 #57

@franky1, @doomad

Let it go, guys. You are arguing too much about devs. Who cares about devs? Devs come and go; bitcoin stays, and right now it needs your productive technical contribution.

Once there is a brilliant idea, it will find its way into history. I don't care about the politics involved; in a worst case scenario, if a group of devs could be proven to resist the truth too much, I'll personally help abandon them.
franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 07:55:54 PM
Last edit: November 23, 2018, 09:36:30 PM by franky1
 #58

again, another off-topic poke from that certain person.. one last bite


i make a point, and then you say i am missing and deflecting my own point.

that's like me speaking english while you speak german: i make a point about english, and you get upset and waffle on about how my point is about german and how i'm missing a german point.

i'll make it real clear for you, although there are dozens of topics that repeat the word often enough:

mandatory mandatory mandatory

you cannot rebut the mandatory. so you are deflecting it.

they had segwit planned back in 2014 and had to get it activated ($100m was at stake).
no matter what the community did, said, wanted or didn't want, they needed it activated THEIR WAY.
they didn't get their way from 2016 to spring 2017,
so they resorted to mandatory activation.

my point is about mandatory.
i should know my point. because im the one making it.

point is: mandatory

if you want to argue against my point then you need to address the point im making.

again
for the dozenth topic you have meandered off-topic with your pokes. my point has been about the MANDATORY.

if you cannot talk about the mandatory not being decentralised.. then at least hit the ignore button.

as for the whole no-community-permission thing.. re-read your own post that i gave you merit on, and see your flip-flop.

as for your deflection about writing code: it's not that they just write code they want. it's that they avoid community code/involvement, as it doesn't fit the PLAN their internal circle had as far back as 2014..

yea, anyone can write code.. but making it mandatory.. no. that's anti-consensus.

also, i said that if they had actually listened to the community and gone with the late-2015 consensus agreement on an early variant of segwit2x, they would have got segwit activated sooner, and the community would have had legacy benefits too.

but again they mandated only their pre-existing plan, which is what caused such delays/drama, and it is still causing drama today, as we are still discussing scaling even now, 3 years later.

franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 23, 2018, 08:07:21 PM
Last edit: November 23, 2018, 09:43:20 PM by franky1
 #59

hurray. back on topic.. hopefully we can stay on topic.

franky1 & DooMAD, both of you are going off-topic again

bc1q->bc1q that has a lock can open avenues for abuse based on timing, and also loss of the key for the bc1q address.
whereas moving funds to a sc1 address absolves the mainnet of the loss/risk, as the value is no longer in a bc1q address (it's spent), and the value moves with the transaction to the sidechain.
(this solves the UTXO issue on mainnet of not having to hold 'locked' value)


This is an interesting idea, and oddly it has similarities with the proposal Superspace: Scaling Bitcoin Beyond SegWit in the part about moving between main-chain and side-chain.
But thinking about UI/UX, introducing another address format is confusing for most users. Even 1..., 3... and bc1... are plenty confusing.

it's not that difficult. if you never intend to use a side chain, you won't have to worry, because you won't get funds from a sc1 address and will never need to send to one.

as for the UI: again, a UI can be designed to have an option, e.g.

File   Options
          display segwit features
          display sidechain features

if you don't want it, you don't select it / don't realise it exists, as the UI won't display the features.
again, you won't get funds from a sc1 address or send to one unless you want to. easy for the average joe.


but yea, it will help keep the UTXO set down,
unlike some sidechain concepts and definitely unlike LN (as locks mean keeping the funds in the UTXO set for the locked time (facepalm)).



.. anyway.. the superspace project... (hmm, it seems they missed a few things and misrepresented a few details) but anyway

the specific no_op code used to make segwit backward compatible can't be used again.
here is why:

imagine a transaction in bytes, where a certain byte was an option list ($)
***********$*******************
in legacy nodes,
if $ is: (a layman's eli-5 list, so don't nitpick)
     0= ignore anything after (treat as empty, meaning no recipient, meaning anyone can spend) (no_op)
     1= do A
     2= do B

in segwit nodes, the 0 option was changed to mean 'do segwit checks'.
they also added a few other opcodes, as a sublist.
so now, with segwit nodes being the active full nodes, there is no 0='ignore anything after' at that particular $ byte
as its now
EG
***********$%******************
if $ is: (a layman's eli-5 list, so don't nitpick)
     0= do segwit, if % is: (a layman's eli-5 list, so don't nitpick)
                            0= ignore anything after (meaning anyone can spend) (no_op)
                            1= ignore anything after (meaning anyone can spend) (no_op)
                            ....
                            11= do A
                            12= do B
     1= do A
     2= do B
there are actually more no_ops now for segwit (%)

so if someone wanted to do what segwit did, they would first need to find a new no_op that hasn't been used,
and then they would need to ensure pools didn't treat it as a no_op at activation. (yep, not really as 'soft' as made out)
which would be another 2016-2017 drama event.

what the link does not explain is that summer 2017 was actually a hard fork, as nodes that would reject segwit needed to be thrown off the network, and pools needed to treat the no_op as not an 'anyonecanspend'.

which means another hard fork would be needed. (hopefully not mandated this time)
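The eli-5 $ / % dispatch above can be written out as a toy evaluator. This is an illustrative model only (the function names and return strings are invented for the sketch, not real consensus code): legacy nodes treated the repurposed byte as a no_op ("anyone can spend"), segwit consumed that value, so a later upgrade would need a different, still-unused no_op.

```python
# Toy model of the $ byte (version_byte) and % byte (witness_version).

def legacy_eval(version_byte, witness_version=None):
    """Pre-segwit rule: unknown option 0 is a no_op -- ignore the rest."""
    if version_byte == 0:
        return "anyone_can_spend"  # no_op: treat as empty, no recipient
    return f"op_{version_byte}"

def segwit_eval(version_byte, witness_version=None):
    """Post-segwit rule: option 0 now triggers segwit checks instead."""
    if version_byte == 0:
        if witness_version == 0:
            return "segwit_check"      # 0 is no longer a free no_op here
        return "anyone_can_spend"      # future witness versions still no_op
    return f"op_{version_byte}"
```

The point of the sketch: once the active full nodes run `segwit_eval`, the original trick (piggybacking on `version_byte == 0`) is spent; a new soft-fork-style upgrade has to claim a value that both rule sets still treat as a no_op.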

Wind_FURY
Hero Member
*****
Offline Offline

Activity: 1218
Merit: 812


Crypto-Games.net: Multiple coins, multiple games


View Profile
November 24, 2018, 06:06:36 AM
 #60

@franky1
This is what we call being proactive and anticipating. The example you give about the SegWit roadmap from 2014 is one example. Are we forced to use SegWit? As DooMAD says, they cannot integrate everyone's wishes, but they anticipate in order to make Bitcoin usable with various convenient solutions. It's like complaining because someone is working to improve Bitcoin, then talking about consensus. A consensus from the masses could turn into a 10-year-old kid's decision.

  No, we are not forced to use Segwit. However, someone who chooses not to use Segwit is penalized by paying higher fees. This may only amount to pennies at the moment, but it can add up. If BTC starts to get used even more, many casual users will then be compelled to use LN to avoid prohibitive fees.

Or be forced to use Bitcoin Cash. I believe that was their idea of why they split from Bitcoin, right? But apparently, not that many people in the community believed that bigger blocks for scalability were a good trade-off on decentralization.

The social consensus remains "Bitcoin is Bitcoin Core".


bones261
Legendary
*
Offline Offline

Activity: 1680
Merit: 1702


KnowNoBorders.io


View Profile
November 24, 2018, 06:37:02 AM
Last edit: November 24, 2018, 06:49:03 AM by bones261
 #61

@franky1
This is what we call being proactive and anticipating. The example you give about the SegWit roadmap from 2014 is one example. Are we forced to use SegWit? As DooMAD says, they cannot integrate everyone's wishes, but they anticipate in order to make Bitcoin usable with various convenient solutions. It's like complaining because someone is working to improve Bitcoin, then talking about consensus. A consensus from the masses could turn into a 10-year-old kid's decision.

  No, we are not forced to use Segwit. However, someone who chooses not to use Segwit is penalized by paying higher fees. This may only amount to pennies at the moment, but it can add up. If BTC starts to get used even more, many casual users will then be compelled to use LN to avoid prohibitive fees.

Or be forced to use Bitcoin Cash. I believe that was their idea of why they split from Bitcoin, right? But apparently, not that many people in the community believed that bigger blocks for scalability were a good trade-off on decentralization.

The social consensus remains "Bitcoin is Bitcoin Core".

Why should anyone be forced to settle for something which is less secure?  So far LN is still in the alpha testing stage. The risk of losing funds is too high ATM. Maybe when they improve their network, I'll want to use it. BCH has always had less hash rate and is therefore less secure. I think people should be able to utilize the most secure network out there in an affordable manner and not be forced to settle for something less secure. Even if the lightning network gets its act together, a second-layer solution will be second best when it comes to security. So I guess the BTC blockchain will only be secure vip2vip cash. The riffraff can settle for less secure crap.  Cheesy

hulla
Hero Member
*****
Offline Offline

Activity: 1190
Merit: 529


First 100% Liquid Stablecoin Backed by Gold


View Profile
November 24, 2018, 07:25:07 AM
 #62

The OP was right that increasing the bitcoin block size is also one of the solutions to bitcoin scaling, because a bigger block size promotes more nodes. But we also have to take into consideration the side effects of increasing the block size, which I presume could lead to 51% attacks. And if Lightning does not work, which I believe it will, another solution will arise.


franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 24, 2018, 01:47:33 PM
 #63

The OP was right that increasing the bitcoin block size is also one of the solutions to bitcoin scaling, because a bigger block size promotes more nodes. But we also have to take into consideration the side effects of increasing the block size, which I presume could lead to 51% attacks. And if Lightning does not work, which I believe it will, another solution will arise.

a 51% attack will not be caused by larger blocks.

here is why:
1. ASICs do not touch the collated tx data. ASICs are handed a hash and told to make a second hash that meets a threshold.
it does not matter if the unmined blockhash is an identifier of 1kb of block tx data or exabytes of tx data; the hash remains the same length.
the work done by ASICs has no bearing on how much tx data is involved.

2. the verifying of transactions is so fast it's measured in nano/milliseconds, not seconds/minutes. devs know verification times are of no inconvenience, which is why they are happy to let people use smart contracts instead of straightforward transactions. if smart contracts/complex sigops hurt block verification efficiency, they would not add them (well, moral devs wouldn't (don't reply/poke to defend devs, as that's missing the point. relax, have a coffee))

they are happy to add new smart features because the combined sigops take a few seconds max, compared to the ~10min interval.

3. again, if bloated txs do become a problem: easy, reduce the tx sigops, or remove the opcode of the features that allow such massive delays.

4. the collating of tx data is handled before a confirmed/mined hash is solved. while ASICs are hashing a previous block, nodes are already verifying and storing transactions in the mempool for the next block. it takes seconds, while they are given up to 10 minutes. so no worries.
pools specifically are already collating transactions from the mempool into a new block, ready to add a mined hash to it when solved to form the chain link. thus when a block solution is found:
if it's their lucky day and they found the solution first: boom, within milliseconds they hand the ASICs the next block identifier.
if it's a competitor's block: within seconds they know whether it's valid or not.
it only takes a second to collate a list of unconfirmed txs to make the next block ID to give to the ASICs.
try it: find an MP3 (4mb) on your home computer and move it from one folder to another. you will notice it took less time than reading this sentence. remember, transactions in the mempool that get collated into a block had already been verified during the previous slot of time, so it's just a case of collating data that the competitor hasn't collated.
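Point 1 above is easy to demonstrate: the digest a miner grinds on is fixed-size regardless of how much transaction data the block commits to. A quick sketch (SHA-256 here stands in for the double-SHA-256 of a real block header):

```python
import hashlib

# The hash handed to mining hardware is always the same length,
# whether it commits to ~1 KB or ~10 MB of transaction data.
small = hashlib.sha256(b"x" * 1_000).digest()       # ~1 KB of data
large = hashlib.sha256(b"x" * 10_000_000).digest()  # ~10 MB of data

assert len(small) == len(large) == 32               # both 32 bytes
```

So the proof-of-work grind itself is indifferent to block size; the size-dependent costs live in relay and validation, not in the hashing loop.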

franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 24, 2018, 01:48:28 PM
Merited by bones261 (2)
 #64

5. we are not in the pre-millennium era of floppy disks. we are in the era where:
256gb is fingernail-size, not server-size.
4tb hard drives are the cost of a grocery shop, not a lifetime pension.
4tb hard drives, even for 20mb blocks, would last the average life cycle of a pc anyway if all blocks were filled.
internet is not dialup; it's fibre (landline), it's 5g (cellular).
if you're on capped internet then you're not a business, as you're on a home/residence internet plan.
if you're not a business then you do not NEED to validate and monitor millions of transactions.

if you think bandwidth usage is too high then simply don't connect to 120 nodes. just connect to 8 nodes.

..
now, the main gripe with block size:
it's not actually the block size. it's the time it takes to initially sync people's nodes.
now why are people angry about that?
simple: they cannot see the balance of their imported wallet until after it's synced.

solution:
spv/bloom-filter the utxo data of imported addresses first, and then sync second.
that way people see balances first and can transact, and the whole syncing time becomes a background thing no one realises is happening, because they are able to transact within seconds of downloading and running the app.
i find it funny how the most resource-heavy task of a certain brand of node is done first, when it just causes frustration.
after all, if people bloom-filter imported addresses and then make a tx.. if those funds actually are not spendable due to receiving bad data from nodes.. the tx won't get relayed by the relay network.
in short:
you cannot spend what you do not have.
all it requires is a bloom filter of imported addresses first, listing the balance as 'independently unverified', and then doing the sync in the background. once synced, the "independently unverified" tag vanishes.
simple. people are no longer waiting for hours just to spend their coin.
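The balance-first / sync-later flow can be sketched as a tiny wallet shell. Everything here is hypothetical (the `Wallet` class, the peer-UTXO tuples, the tag wording); it only illustrates the ordering being argued for, with the expensive full sync moved off the start-up path:

```python
import threading

class Wallet:
    """Hypothetical start-up flow: show a peer-derived balance at once,
    tag it 'independently unverified', clear the tag after full sync."""

    def __init__(self, addresses, peer_utxos):
        self.verified = False
        # step 1 (instant): filter peer-supplied UTXO data for our addresses
        self.balance = sum(v for a, v in peer_utxos if a in addresses)

    def start_background_sync(self, full_sync):
        # step 2 (hours): block download/validation runs in the background
        def run():
            full_sync()            # the real work happens here
            self.verified = True   # the 'unverified' tag then vanishes
        t = threading.Thread(target=run)
        t.start()
        return t

    def display(self):
        tag = "" if self.verified else " (independently unverified)"
        return f"{self.balance} sat{tag}"
```

Usage: construct the wallet, show `display()` immediately, kick off `start_background_sync(...)`, and re-render when it finishes; an unspendable phantom balance is caught later exactly as the post says, because the relay network won't accept a spend of funds that don't exist.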

aliashraf
Hero Member
*****
Offline Offline

Activity: 896
Merit: 656


View Profile
November 24, 2018, 05:57:15 PM
 #65

5. we are not in the pre-millennium era of floppy disks. we are in the era where:
256gb is fingernail-size, not server-size.
4tb hard drives are the cost of a grocery shop, not a lifetime pension.
4tb hard drives, even for 20mb blocks, would last the average life cycle of a pc anyway if all blocks were filled.
internet is not dialup; it's fibre (landline), it's 5g (cellular)
Although I like the tone, I have to remind you of a somewhat bitter fact: none of these would help with scaling bitcoin, definitively. It is good news that Moore's law is still working (somehow), but the problem is not about resources; it is the propagation delay of blocks, because of the time it takes to fully validate the transactions they commit to. Unfortunately, propagation delay does not improve with Moore's law.

That said, I'm ok with a moderate improvement in current numbers (by decreasing block time rather than increasing block size, which are just the same in this context), but it won't be a scaling solution, as it couldn't be used frequently because of the proximity-premium problem in mining. Larger pools/farms would have a premium once they hit a block, as they are able to start mining the next block while their poorer competitors are busy validating the newborn block and relaying it (they have to do both if they don't want to be on an orphan chain).

Many people are confused about this issue; even Gavin was confused about it. I read an article from him arguing about how cheap and affordable a multi-terabyte HD is. It is not about HDs, nor about internet connectivity or bandwidth; it is about the number of transactions that need validation, the delayed propagation of blocks, and the resulting centralization threats.

Quote
if you're on capped internet then you're not a business, as you're on a home/residence internet plan
if you're not a business then you do not NEED to validate and monitor millions of transactions
Home/non-business full nodes are critical parts of the bitcoin ecosystem, and our job is to strengthen them by making it more feasible for them to stay and to grow considerably in numbers.

Quote
now, the main gripe with block size:
it's not actually the block size. it's the time it takes to initially sync people's nodes.
now why are people angry about that?
simple: they cannot see the balance of their imported wallet until after it's synced.
Good point but not the most important issue with block size.

Quote
solution:
spv/bloom-filter the utxo data of imported addresses first, and then sync second.
that way people see balances first and can transact, and the whole syncing time becomes a background thing no one realises is happening, because they are able to transact within seconds of downloading and running the app.
i find it funny how the most resource-heavy task of a certain brand of node is done first, when it just causes frustration.
after all, if people bloom-filter imported addresses and then make a tx.. if those funds actually are not spendable due to receiving bad data from nodes.. the tx won't get relayed by the relay network.
Recently, I proposed a solution for fast sync and getting rid of the history, but surprisingly I did it to abandon SPVs (well, besides other objectives). I hate SPVs; they are vulnerable and they add zero value to the network. They just consume and give nothing, because they don't validate blocks.

The problem we are discussing here is scaling, and the framework the OP has proposed is a kind of hierarchical partitioning/sharding. I am afraid that instead of contributing to this framework, you sometimes write about side chains, and now you are denying that the problem is relevant at all. Considering what you are saying, there is no scaling problem at all!

franky1
Legendary
*
Offline Offline

Activity: 2520
Merit: 1462



View Profile
November 24, 2018, 09:45:23 PM
 #66

The problem we are discussing here is scaling, and the framework the OP has proposed is a kind of hierarchical partitioning/sharding. I am afraid that instead of contributing to this framework, you sometimes write about side chains, and now you are denying that the problem is relevant at all. Considering what you are saying, there is no scaling problem at all!
the topic creator is proposing having essentially 2 chains, then 4 chains, then 8 chains.

we already have that, ever since clams split, and then every other fork.

the only difference is that the OP is saying the forks still communicate and atomic-swap coins between each other..
the reason i digressed into sidechains is that, without going into buzzwords, having 2 chains that atomic-swap, when simplified down to the average joe's experience, is exactly the same on/off-ramp experience as sidechains.

i just made a simple solution to make it easily visible which "node-set" (chain) is holding which value (bc1q or sc1) without having to lock:peg value to one node-set (chain) in order to peg:create fresh coin in another node-set (chain).

because pegging (locking) is bad.. for these reasons:
it raises the UTXO set, because coins are not treated as spent
the locks mean coins in the UTXO set are out of circulation but still need to be kept in the UTXO set
the fresh coin of a sidechain has no traceability back to a coinbase (block reward)

...
the other thing is that bitcoin is one chain, and splitting the chain is not new (as my second sentence in this post highlighted).
...
the other thing, about reducing block time (facepalm): reducing block time has these issues:
1. it reduces the 10-minute interval for all the propagation things you highlight as an issue later in your post
2. it's not just mining blocks in 5 minutes; it means changing the reward, the difficulty, and also the timing of the reward halving
3. changing these affects the estimate of when all 21mill coins are mined (year ~2140)
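The arithmetic behind points 2-3 is quick to check. A rough sketch (the function name is invented; it assumes the 210,000-block halving interval stays fixed and that the subsidy rounds to zero after roughly 33 halvings):

```python
HALVING_INTERVAL = 210_000  # blocks per subsidy era (mainnet constant)

def issuance_years(block_minutes, eras=33):
    """Approximate years of block-subsidy issuance for a given block time,
    assuming the 210,000-block halving interval is left unchanged."""
    blocks = HALVING_INTERVAL * eras
    return blocks * block_minutes / (60 * 24 * 365.25)

# 10-minute blocks: ~132 years of issuance (2009 + ~132 lands near 2140).
# 5-minute blocks with the same interval: half that, so the halving
# interval (and therefore the whole schedule) would have to be re-tuned.
```

This is why a block-time change is not a free parameter: either the ~2140 end-of-issuance estimate moves, or the halving interval and subsidy schedule move with it.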
...
as for propagation: if you actually time how long it takes, it is fast; it's only a couple of seconds.
this is because at transaction relay, it takes about 14 seconds for transactions to get around 90% of the network, validated and set into the mempool. as for a solved block: because full nodes already have (the majority of) the transactions in their mempool, they just need the block header data and the list of txs, not the tx data, and then just ensure all the numbers (hashes) add up, which takes just 2 seconds.
....
having home users with 0.5mb internet trying to connect to 100 nodes causes a bottleneck for those 100 nodes, as they are only getting data streaming at 0.005mb (0.5/100).
whereas a home user with 0.5mb internet and just 10 connections gives a 0.05mb stream speed.

imagine you're a business NEEDING to monitor millions of transactions because they are your customers.
you have fibre.. great. you set your node to only accept 10 connections, but find those 10 connections are home users who are each connecting to 100 nodes. you end up only getting data streamed to you at a combined 0.05mb. (bad)

but if those home users also decided to only connect to 10 nodes, you'd get data streams at 0.5mb.
that's 10x faster.

if you do the 'degrees of separation' math for a network of 10k nodes and, say, 1.5mb of block data:

the good propagation: 10 connections (0.5mb combined stream)
  10  *  10   *   10   *   10 = 10,000
3sec + 3sec + 3sec + 3sec = 12 seconds for the network

the bad propagation: 100 connections (0.05mb combined stream)
  100  *   100   = 10,000
30sec + 30sec  = 1 minute
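The trade-off above can be put into a small helper: more peers means fewer hops to reach the whole network, but each outbound stream gets a thinner slice of the uplink. This is a sketch under the post's simplified assumptions (an uplink split evenly across peers, a uniform fan-out; real relay uses compact blocks, so far less than the full block crosses each link):

```python
def fanout(uplink_mbs, peers, nodes):
    """Per-peer bandwidth share and 'degrees of separation' hop count."""
    per_peer = uplink_mbs / peers   # each stream's share of the uplink
    hops, reached = 0, 1
    while reached < nodes:          # how many relay hops cover all nodes
        reached *= peers
        hops += 1
    return per_peer, hops

# 10 connections: 0.05 MB/s per stream, 4 hops to cover 10,000 nodes
assert fanout(0.5, 10, 10_000) == (0.05, 4)
# 100 connections: only 2 hops, but each stream crawls at 0.005 MB/s
assert fanout(0.5, 100, 10_000) == (0.005, 2)
```

Whether the 10-peer or 100-peer topology wins overall depends on how per-hop transfer time scales with the thinner streams, which is exactly the comparison the figures above are making.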

alot of people think that connecting to as many nodes as possible is good. when infact it is bad.
the point i am making is:
home users dont need to be making 120 connections to nodes to "help the network". because that infact is causing a bottleneck

also sending out 1.5mb of data to 100 nodes instead of just 10 nodes is a waste of bandwidth for home users.
also if a home user only has bottomline 3g/0.5mb internet speeds as oppose to fibre. those users are limiting the fibre users that have 50mb.. to only get data at 0.5mb due to the slow speed of the sender.

So the network is better off centralised around
10,000 business fibre users who NEED to monitor millions of transactions
rather than
10,000 home users who just need to monitor 2 addresses.

Yes, of course, for independence have home users be full nodes, but the network topology should be such that slow home users sit on the last 'hop' of the relay, not at the beginning/middle.

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
franky1 (Legendary)
November 24, 2018, 10:26:48 PM
Last edit: November 24, 2018, 10:55:36 PM by franky1
#67

The topic creator is talking about splitting the population/data in half.

To split the block data in half, each half has to keep its own traceability: basically, 2 chains. Yes, you split the population in half, but the community already tried that with all the forks.
(I should have explained this in an earlier post; my methodology works backwards.)

With all that said, it's not just forking the coin but making it atomically swappable.

The other thing the topic creator has not thought about is not just how to atomic-swap, but that the mining is then split across 2 chains instead of 1, weakening both instead of having just 1 strong chain.

It's also that, to ensure both chains comply with each other, a new "master"/"super" node has to be created that monitors both chains fully, which ends up back where things started, except this time the master node is juggling two data-chain lines instead of one.

So now we have a new FULL NODE covering 2 data chains,
and a sub-layer of lighter nodes that only work as full nodes for a particular chain.

And then we end up discussing the same issues with the new master (full) node in relation to data storage, propagation and validation. Like I said, full circle: instead of splitting the network/population in half, which just weakens the network data, the new node layer doesn't change the original problem of the full node (now masternode).

(LN, for instance, wants to be a master node monitoring coins like bitcoin and litecoin and vertcoin and all other coins that are LN compatible.)

This is why my methodology is backwards: I ran through some theoretical scenarios, skipped through the topic creator's idea, and went full circle back to addressing the full node (masternode) issues.

And it's why, if you're going to have masternodes that do the heavy work, you might as well skip weakening the data by splitting it and just call a masternode a full node. After all, that's how it plays out when you run through the scenarios.

franky1 (Legendary)
November 24, 2018, 10:47:36 PM
Merited by aliashraf (2)
#68

On the last point raised by aliashraf about my idea of using SPV: it was not to actually be SPV. It was just to use the SPV mechanism for the first-time load screen for fresh users, and then after 10 seconds become the full node.

Think of it this way:
would you rather download a 300GB game via torrent, wait hours, and then play,
or
download a small free-roam level that, while you play it, downloads the entire game via torrent in the background?

My idea was not just to (analogy) download a free-roam level and stop there.
It was to use the SPV mechanism for the first loading screen, to make the node useful in the first 10 seconds, so that while the node then downloads the entire blockchain people can at least do something while they wait, instead of frustrating themselves waiting for the sync.
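The bootstrap ordering described above ("SPV loading screen first, full node second") might be sketched like this. All names are hypothetical and the sleeps stand in for real network time; it only illustrates the sequencing, not any actual client code.

```python
import threading
import time

def spv_sync():
    """Fetch headers plus the wallet's own filtered txs: enough to
    show a balance within seconds (simulated here by a short sleep)."""
    time.sleep(0.1)
    return {"balance_visible": True}

def full_sync(done):
    """Download and validate the whole chain in the background
    (simulated; in reality this is the hours-long part)."""
    time.sleep(0.5)
    done.set()

def start_node():
    view = spv_sync()                      # usable almost immediately
    finished = threading.Event()
    threading.Thread(target=full_sync, args=(finished,), daemon=True).start()
    return view, finished

view, finished = start_node()
assert view["balance_visible"]             # user can already do something
finished.wait()                            # later: the node is now a full node
```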

aliashraf (Hero Member)
November 24, 2018, 10:59:35 PM
Last edit: November 24, 2018, 11:30:26 PM by aliashraf
#69

franky,
Splitting is far different from forking. Forks inherit the full history and the state; shards don't. @mehanikalk has done a good job on an idea similar to the OP's, and his topic Blockreduce is trending (by this subforum's measures) too. In both topics we are dealing with sharding, neither forks nor side-chains.

I do agree that using atomic swaps (with recent advancements in HTLC) and forks has something to do with scaling, the problem being price as a free variable. It would be interesting, though, to have a solution for this problem.

Back to your recent post:
The other thing about reducing blocktime (facepalm): reducing blocktime has these issues:
1. it shrinks the 10-minute interval available for all the propagation concerns you highlight later in your post
2. it's not just mining blocks in 5 minutes; the reward, the difficulty, and the timing of the reward halving all have to change
3. changing these affects the estimate of when all 21 million coins are mined (year ~2140)
I'm not offering block time reduction as an ultimate scaling solution; of course it is not. I'm just saying that, as a moderate improvement to bitcoin's parameters, it is far better than a comparable block size increase. They may look very similar, but there is a huge difference: a reduction in block time helps with mining variance and supports small pools/farms. The technical difficulties involved are not big deals, as everything could be adjusted easily: block reward, halving threshold, ...

Quote
As for propagation: if you actually time how long it takes to propagate, it is fast, only a couple of seconds.
This is because of transaction relay: it takes about 14 seconds for a transaction to reach roughly 90% of the network, get validated and sit in mempools. For a solved block, because full nodes already have the majority of transactions in their mempools, they only need the block header data and the list of tx IDs, not the tx data, and then just check that the hashes add up, which takes just 2 seconds.
Right, but it adds up as you go to the next and next hops; that is why we call it the proximity premium. The bitcoin p2p network is not a complete graph, and it takes something like 10 times longer for a block to be relayed to all miners. When you double or triple the number of txs, the proximity flaw gets worse by just a bit less than two or three times, respectively.

Quote
having home users of 0.5mb internet trying to connect to 100 nodes is causing a bottleneck for those 100 nodes.as they are only getting data streaming at 0.005mb (0.5/100)
...
yes of course. for independance have home users be full nodes but the network topology should be that slow home users be on the last 'hop' of the relay. not at the beginning/middle.
No disputes. I just have to mention that it is infeasible to engineer the p2p network artificially; AFAIK the current bitcoin networking layer allows nodes to drop slow/unresponsive peers, and if you could figure out an algorithm to help with a more optimized topology, it would be highly appreciated.

On the other hand, I think partitioning/sharding is a more promising solution for most of these issues. Personally I believe in sharding the state (UTXO), which is a very challenging strategy as it sits on the edge of forking.
aliashraf (Hero Member)
November 24, 2018, 11:13:57 PM
#70

On the last point raised by aliashraf about my idea of using SPV: it was not to actually be SPV. It was just to use the SPV mechanism for the first-time load screen for fresh users, and then after 10 seconds become the full node.

Think of it this way:
would you rather download a 300GB game via torrent, wait hours, and then play,
or
download a small free-roam level that, while you play it, downloads the entire game via torrent in the background?

My idea was not just to (analogy) download a free-roam level and stop there.
It was to use the SPV mechanism for the first loading screen, to make the node useful in the first 10 seconds, so that while the node then downloads the entire blockchain people can at least do something while they wait, instead of frustrating themselves waiting for the sync.

As you may have noticed, I merited this idea of yours, and as you know I have a lot to add here. Most importantly: a better idea than getting the UTXO set via a torrent download, which implies trust (you need the hash) and exposure to sybil attacks, would be implementing it in bitcoin as what I've described in this post.
franky1 (Legendary)
November 25, 2018, 12:58:39 AM
Last edit: November 25, 2018, 01:15:27 AM by franky1
#71

shards don't.

I already looked into sharding months ago, played around and ran scenarios, and, like I said a few posts ago, once you wash away all the buzzwords it all just comes full circle.

Many sharding concepts exist.
Some are:
master chain (single), where every 10 blocks each block is designated to a certain region/group
    - this way no group can mine 10 blocks in one go; they get 1 block in, then have to wait 9 blocks for another chance

master node (multichain), where there are multiple chains that swap value
    - I say master node because, although some sharding concepts pretend not to need one,
      inevitably, without a master node, the region/group nodes end up having to "trust" the other chain when sending UTXOs

and many more concepts.
The issue is that the "trust" of data, once it is not all in one place, becomes its own weakness.
Even the LN devs have noticed this and realised that LN full nodes would need to be master nodes downloading and monitoring bitcoin, litecoin and vertcoin.
It seems some devs of sharding projects have not yet seen the dilemma play out.
(5 weak points are more prone to attack than 1 strong point.
E.g.
it is easier to 51% attack one of 5 points of 5 exahash than it is to 51% attack one point of 50 exahash; so if one of the 5 weak points gets hit, the damage is done.)

I do agree that using atomic swaps (with recent advancements in HTLC) and forks has something to do with scaling, the problem being price as a free variable. It would be interesting tho, having a solution for this problem.

No, atomic swaps and HTLC are BADDDDDDD. Think of the UTXO set (atomic swaps are about 2 tokens and pegging).
As I originally said, it is better to double-mine (bc1q -> sc1): that way bitcoin sees the bc1q as spent, so there is no extra UTXO,
and no holding of a large UTXO set of locked unspents (the sc1 just vanishes and is not counted in btc's UTXO set, as btc can't spend an sc1)...
But again the whole need for a master node to monitor both chains comes up and circles around.

So yes: LN, sharding, sidechains, multichains all end up back, full circle, at needing a "masternode" (essentially a full node) that monitors everything. Which ends up as the debate: if a master node exists, just call it a full node and get back to the root problem.

I could waffle on about all the weaknesses of the "trust" of relying on separate nodes holding separate data, but I'll try to keep my posts short.

block time reduction ..  a moderate improvement in bitcoin parameters it is ways better than a comparable block size increase. They may look very similar but there is a huge difference: A reduction in block time helps with mining variance and supports small pools/farms. The technical difficulties involved are not big deals as everything could be adjusted easily, block reward, halving threshold, ...

Nope.
Transactions are already in people's mempools before a block is made.
Nodes don't need to send the block's transactions again; they just send the block header.
This is why the stats show transactions take 14 seconds but a block only takes 2: a block header is small, and the whole verification is just joining the header to the already-obtained transactions.
Again:
all the nodes are looking for is the block header that links everything together, and the header doesn't take much time at all.

Transactions relay in ~14 seconds, which is plenty of time within a 10-minute window.
If that 10-minute window is reduced to 5 minutes, there is less time for transactions to relay.
I say this not about the average tx of up to 500 sigops, but about cases where a tx has 16,000 sigops, which I warned about a few pages ago.
A 5-minute interval will be worse than a 10-minute interval.
Math: a 16k-sigop tx will take 7.5 minutes to relay across the network. So if a pool sees the tx first, relays it out, starts mining a 5-minute block, solves it, and relays out the block header,
the block header will have reached everyone in 5 minutes 2 seconds, but the bloated transaction (degrees of separation) has only reached 1,000 nodes, as the last 'hop' from those 1,000 to their 10 nodes each has not had time to deal with the bloated tx.
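The timing race described here is simple arithmetic. The sketch below uses the post's own figures (7.5 minutes for a 16k-sigop tx to cross the network, ~2 seconds for a header); the helper name is illustrative, and real propagation times vary.

```python
# Figures taken from the post (toy model).
BLOATED_TX_RELAY_MIN = 7.5   # minutes for a 16k-sigop tx to cross the network
HEADER_RELAY_MIN = 2 / 60    # block headers propagate in ~2 seconds

def header_outruns_tx(block_interval_min):
    """True if the next block's header reaches the network before the
    bloated transaction has finished relaying (the problem case)."""
    return block_interval_min + HEADER_RELAY_MIN < BLOATED_TX_RELAY_MIN

print(header_outruns_tx(10))  # False: a 10-min interval gives the tx time to relay
print(header_outruns_tx(5))   # True: the header arrives before the bloated tx
```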

It's also partly why the 16k-sigop limit exists, to keep things under 10 minutes, though it foolishly allows such a large amount that it doesn't keep txs down to seconds.

Yes, a solution would be to bring the tx sigop limit down when reducing the block time.
But that alone brings:
the difficulty retarget down to weekly, so discussions of moving it to 4032 blocks to bring it back to fortnightly;
reward halving every 2 years, meaning all 21 million coins in less time, unless you move it to 420,000 blocks for a 4-year halving;
and, as I said, a 5-minute interval which, without reducing the tx sigop limit, will hurt propagation.
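Those ripple effects are straightforward arithmetic. A sketch using Bitcoin's current intervals (2016-block retarget, 210,000-block halving) shows where the 4032 and 420,000 figures come from:

```python
CURRENT_INTERVAL_MIN = 10
RETARGET_BLOCKS = 2016       # ~2 weeks at 10-minute blocks
HALVING_BLOCKS = 210_000     # ~4 years at 10-minute blocks

def rescaled_params(new_interval_min):
    """Block counts needed to keep the same wall-clock schedule."""
    factor = CURRENT_INTERVAL_MIN / new_interval_min
    return {
        "retarget_blocks": int(RETARGET_BLOCKS * factor),
        "halving_blocks": int(HALVING_BLOCKS * factor),
    }

print(rescaled_params(5))  # {'retarget_blocks': 4032, 'halving_blocks': 420000}
```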

So reducing block time is NOT simpler than just increasing block size; there are a lot more ripple effects from reducing block time than from increasing block size.
Also, what do end users gain from a ~5-minute confirm? Standing at a cashier's desk waiting for a confirm is no less frustrating; it would take a 2-second confirm to make waiting in line at a cashier not be a frustration.
Even 30 seconds feels like the same eternity as 10 minutes when you watch an old lady counting change.

Quote
Right, but it adds up as you go to the next and next hops; that is why we call it the proximity premium. The bitcoin p2p network is not a complete graph, and it takes something like 10 times longer for a block to be relayed to all miners. When you double or triple the number of txs, the proximity flaw gets worse by just a bit less than two or three times, respectively.
I kinda explained this a few paragraphs ago in this post.

Quote
No disputes. I just have to mention it is infeasible to engineer the p2p network artificially and AFAIK current bitcoin networking layer allows nodes to drop slow/unresponsive peers and if you could figure out an algorithm to help with a more optimized topology, it would be highly appreciated.

You have kind of found your own solution:
have 15 potential nodes and pick the best 10 by ping/speed. The network naturally finds its own placement, where the slower nodes sit on the outer rings and the faster ones at the centre.
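That "probe 15, keep the 10 fastest" heuristic could be sketched as below. The ping figures are made up for illustration; a real node would measure round-trip times itself.

```python
def select_peers(candidates, keep=10):
    """candidates: {address: ping_ms}. Keep the lowest-latency peers."""
    return sorted(candidates, key=candidates.get)[:keep]

pings = [12, 300, 45, 9, 700, 88, 31, 15, 250, 60, 22, 410, 7, 95, 130]
candidates = {f"node{i}": p for i, p in enumerate(pings)}

best = select_peers(candidates)
print(best)  # the 10 fastest; the 700 ms peer ("node4") is dropped
```

Repeating the probe periodically lets the topology self-organise, with slow peers naturally pushed toward the network's outer ring.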

Wind_FURY (Hero Member)
November 25, 2018, 06:08:37 AM
#72

@franky1
This is what we call being proactive and anticipating. The example you give about the SegWit roadmap from 2014 is one example. Are we forced to use SegWit? As DoomAD says, they cannot integrate everyone's wishes, but they anticipate, to make Bitcoin usable with various convenient solutions. It's like complaining because someone is working to improve Bitcoin, and talking about consensus: a consensus of the masses could turn into a 10-year-old kid's decision.

No, we are not forced to use Segwit. However, anyone who chooses not to use Segwit is penalized by paying higher fees. This may only amount to pennies at the moment, but it can add up. If BTC starts to get used even more, many casual users will then be compelled to use LN to avoid prohibitive fees.

Or be forced to use Bitcoin Cash. I believe that was their idea of why they split from Bitcoin, right? But apparently not that many people in the community believed that bigger blocks for scalability were a good trade-off against decentralization.

The social consensus remains "Bitcoin is Bitcoin Core".

Why should anyone be forced to settle for something less secure? So far LN is still in the alpha testing stage, and the risk of losing funds is too high ATM. Maybe when they improve their network I'll want to use it. BCH has always had less hash rate and is therefore less secure. I think people should be able to use the most secure network out there in an affordable manner, not be forced to settle for some less secure stuff. Even if the Lightning Network gets its act together, a second-layer solution will be second best when it comes to security. So I guess the BTC blockchain will only be secure vip2vip cash. The riffraff can settle for less secure crap.  Cheesy

VIP2VIP cash? Bitcoin will remain an open system that anyone in the world can use. What is "VIP" about Bitcoin? Nothing.

Are the fees so constantly high that they discourage everyone from using Bitcoin? I don't believe they are. The fees have been low since the increasing adoption of Segwit.

Plus, about sharding: franky1, do you agree that bigger blocks are inherently centralizing, and that "sharding" just prolongs the issue instead of solving it?


bones261 (Legendary)
November 25, 2018, 06:32:24 AM
#73


VIP2VIP cash? Bitcoin will remain an open system that anyone in the world can use. What is "VIP" about Bitcoin? Nothing.

Are the fees so constantly high that they discourage everyone from using Bitcoin? I don't believe they are. The fees have been low since the increasing adoption of Segwit.

Plus, about sharding: franky1, do you agree that bigger blocks are inherently centralizing, and that "sharding" just prolongs the issue instead of solving it?

We are discussing the scaling issue.  Roll Eyes Do you really think the blockchain fee is still going to be low if and when demand is 100x higher than it is currently? Let's hope that, if and when that ever happens, LN will somehow ease the risk of losing coins, whether from your channel partner closing a channel in an earlier state without you catching it, or from you having a system error and closing a channel in an earlier state by mistake and getting a penalty (or closing it in an earlier state not in your favor). BTW, can someone get a penalty if they close a channel in an earlier state that is not in their favor? It appears that way to me. Talk about adding insult to injury. I sure hope I am dead wrong about that; otherwise the penalty system is a joke.

Wind_FURY (Hero Member)
November 25, 2018, 08:29:46 AM
Merited by bones261 (1)
#74


VIP2VIP cash? Bitcoin will remain an open system that anyone in the world can use. What is "VIP" with Bitcoin? Nothing.

Are the fees constantly high that it discourages everyone from using Bitcoin? I don't believe it is. The fees have been low since the increasing adoption of Segwit.

Plus about sharding. Franky1, do you agree that bigger blocks are inherently centralizing, and that "sharding" is just prolonging the issue instead of solving it?

We are discussing the scaling issue.  Roll Eyes


Then "VIP2VIP cash" is the wrong terminology. Bitcoin remains an open system.

Quote

Do you really think the blockchain fee is still going to be low if and when demand is 100x higher than currently?


No. I already said that users will be forced to use Bitcoin Cash. Other, more secure altcoins would be better, though.

Quote

Let's hope if and when that ever happens, LN will somehow ease the risk of losing coins either due to your channel partner closing a channel in an earlier state and you not catching it or you having a system error and closing a channel in an earlier state in error, and getting a penalty. (Or closing it in an earlier state not in your favor.)


As with any software development project, it may succeed or it may fail. But Lightning has been developing well; let's hope that continues.

Quote

BTW, can someone get a penalty if they close a channel in an earlier state that is not in their favor? It appears that way to me. Talk about adding insult to injury. Sure hope that I am dead wrong about that. Otherwise the penalty system is a joke.


There have been misinformation attempts against the Lightning Network everywhere, made by the people who want all transactions to be processed on-chain, in big blocks, and by all nodes. That is not scalable.

But I will ask around and find a good answer for you.


aliashraf (Hero Member)
November 25, 2018, 09:17:31 AM
Last edit: November 25, 2018, 10:29:54 AM by aliashraf
#75

many sharding concepts exist.
...
issues are the "trust" of data if its not in one place becomes its own weakness
...
(5 weak points more prone to attack than 1 strong point.
Security is good, but too much security is a nightmare, as it comes with costs, and costs have to be paid somehow. Sharding is what we need in over-secure situations where we can safely split.

In the context of bitcoin and cryptocurrencies, security is not defined as an absolute measure linearly dependent on the cost of a 50%+1 attack; that is just an unfortunate misunderstanding. Bitcoin has been secure since the first day, while the cost of carrying out such an attack has increased substantially, from a few bucks to hundreds of millions of dollars.

Security is not quite an 'indexable' measure; saying 'this coin is less secure' or 'that coin is more secure' is absurd in a cryptocurrency context. The way I understand bitcoin, there is no "less" or "more" security: you are secure or you are not ... and wait ... there is a third state:
you may be ridiculously overpaying to be secure against threats that don't even exist, e.g. the current situation with bitcoin!

A proper sharding/splitting/partitioning would not put anybody in danger if it is applied to an over-secure blockchain, and I'm not talking about an overloaded one, like what the OP proposes.

As for your other arguments regarding propagation delay and my block-time-decrease idea, I choose not to go through an endless debate over this for now, but to be clear, I dispute almost everything you say in this regard. Let's do it later, somewhere else.
franky1 (Legendary)
November 25, 2018, 12:40:54 PM
Last edit: November 25, 2018, 02:08:37 PM by franky1
Merited by bones261 (2)
#76

Scaling bitcoin is not a "1MB base block or 1 gigabyte base block" argument,
so let's demyth that old PR campaign right away.

It's 1, 2, 4, 8, 16, 32 and so on; here is the important thing: over time.
Just like 2009-2017 (2009: 0.25 -> 0.5 up to 2013, then 2013: 0.75 -> 1MB up to 2017).

So when you mention costs, I have to ask: at what cost?
People do not keep hold of, and demand to only ever use, their Windows XP computer forever; they naturally upgrade.
Things progress over time.
Trying to keep things at floppy-disk space / dial-up internet speed forever is not natural.

The whole PR campaign narrative of "Visa by midnight or else fail" is a fail in itself. The Visa stats are not from one network but from multiple networks, with the numbers combined to assume one network and to claim that bitcoin, as one network, needs to be that powerful ASAP.

So many people think scaling bitcoin means that as soon as the code activates, servers and centralisation occur, all because people imagine 1MB -> 1GB overnight.

..
As for sharding:

Imagine there are 5 regions with 5 maintainers per region, where the final region (5) is the important one everyone wants to attack.

5   5   5   5   5     taking over one region is easy

3   3   3   3   5
                8     the last region is now being 160% attacked

4   4   4   4   5
                4     the last region is now being 80% attacked

Imagine an outsider has 6 friends:
5   5   5   5   5
                6     the last region is now being 120% attacked

So what happens is a master node that takes in all 5 regions, where breaking the masternode's rules now requires more than 25 malicious participants, because the masternode can just reject blocks made by the 160% (8), 80% (4) and 120% (6) attackers while accepting the (5), wasting the attackers' time
and keeping the (5) region alive and acceptable.
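The percentages in the example follow from a one-line ratio (toy numbers, matching the post's figures):

```python
def attack_ratio(honest_maintainers, attackers):
    """Attack strength on a region relative to its honest maintainers."""
    return attackers / honest_maintainers

# four regions lend 2 maintainers each (5 -> 3) to gang up on region 5
print(attack_ratio(5, 8))   # 1.6 -> "160% attacked"
# each region lends just 1 (5 -> 4)
print(attack_ratio(5, 4))   # 0.8 -> "80% attacked"
# an outsider brings 6 friends
print(attack_ratio(5, 6))   # 1.2 -> "120% attacked"
```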

This is why bitcoin came into being: although there are (now) 'pools', there is 1 rule that all nodes/pools (regions) have to abide by.
Sharding does not solve the Byzantine generals problem; sharding undoes the Byzantine generals solution and takes the debate back a decade, to what the cypherpunks couldn't solve before satoshi came up with the solution, because the pools end up with separate rules.

For instance, in the 5 regions with 5 separate maintainers, without the oversight of a master rule the maintainers can change the rules of their region, which can affect the other 4 regions.
Imagine one region decides not to accept transactions from another particular region. They can do it, because there is no masternode whose rules require each region to accept the others.

Once you wash away all the buzzwords created by the "sharding" community and play out scenarios as real-world usage, not the utopian 'it will always work', you will see issues arise.
Too many projects only run tests on 'how it's intended to work' and never run 'hammer it until it breaks to find the weaknesses' tests.

Take Visa: that's sharding in basic form (washing away the buzzwords). America could decide not to accept transactions from Russia, because the Visa system is separate networks, and one network can change the rules and just cut off another network.

However, if there were a single rule that all transactions are acceptable, then Russia would be treated the same as America, and America couldn't do a thing about it.

Bitcoin's beauty is that it solves having multiple generals ruling multiple regions while ensuring they all comply with one rule,
and it solves how those generals abide by that one rule without having one general.

The answer was: everyone watches everything and rejects the malicious individuals.
We have had "sharding" systems in the real world for decades. Sharding is DE-inventing what makes bitcoin, bitcoin.

franky1 (Legendary)
November 25, 2018, 03:06:02 PM
#77

Bitcoin has always been secure since the first day while the costs of carrying out such an attack has increased substantially from a few bucks to hundreds of million dollars.

Security is not quietly an 'indexable' measure, saying 'this coin is less secure', 'that coin is more secure' is absurd in cryptocurrency context, the way I understand bitcoin there is no "less" or "more" security, you are secure or you are not ...

Bitcoin and crypto are not "secure"; it's why difficulty exists.
Yes, bitcoin is secure against a CPU attack, as it would require trillions of PCs to match/overtake it.
But it is not secure against certain things, which is why it has to keep evolving.

only last month there was a bug that could have DDoSed the network

Too many people have the mindset that once the Titanic is built it's too big to fail,
once banks are in power they are too big to fail,
once bitcoin was made in 2009 it's too big to fail.

The mindset should be: look for weaknesses, find weaknesses, solve weaknesses, then repeat.
This is why so many ICOs and sharding projects don't launch: they spread a utopian dream, and instead of finding and solving problems they double down on promoting the utopia and try to shut people up if they mention weaknesses.

True developers want to hear about weaknesses so they can fix them; bad developers only want to hear "great job, now you can retire rich".

bones261 (Legendary, Activity: 1680, Merit: 1702)
November 25, 2018, 03:07:58 PM #78

We are discussing the scaling issue.  Roll Eyes


Then "VIP2VIP cash" is the wrong terminology. Bitcoin remains an open system.


People must have a short memory. During the prolonged tx backlog event back in 2017, it certainly seemed that way. Since the tx fee is based on tx size and not the amount sent, people wanting to move around smaller amounts were getting eaten alive with fees. Although a fee of 300 sats per byte is trivial for someone wanting to move around 1 BTC, it was prohibitive for someone wanting to move around 1 million sats.
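To put rough numbers on the point above, here is a quick sketch; the 225-byte transaction size is an assumed typical single-input, two-output size, not a figure from this thread:

```python
# Fees are charged per byte, not per amount sent, so the same fee is a
# far bigger percentage of a small payment than of a large one.

TX_SIZE_BYTES = 225   # assumed typical tx size (hypothetical)
FEE_RATE = 300        # sat/byte, the backlog-era rate cited above

fee = TX_SIZE_BYTES * FEE_RATE   # same absolute fee for both senders

for amount in (100_000_000, 1_000_000):   # 1 BTC vs 1 million sats
    print(f"sending {amount:>11,} sats: fee is {100 * fee / amount:.3f}% of the amount")
```

The fee works out to 67,500 sats either way: a rounding error on 1 BTC, but several percent of a 1-million-sat payment.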

Quote:

You really think the blockchain fee is still going to be low if and when demand is 100x higher than it is currently?


No. I already said that users will be forced to use Bitcoin Cash. Other more secure altcoins would be better though.

So the riffraff have to settle for a 3rd rate shitcoin network? Sounds like a vip2vip attitude to me.  Cheesy

Quote:

Let's hope if and when that ever happens, LN will somehow ease the risk of losing coins either due to your channel partner closing a channel in an earlier state and you not catching it or you having a system error and closing a channel in an earlier state in error, and getting a penalty. (Or closing it in an earlier state not in your favor.)


As any software development project, it may succeed, or it may fail. But Lightning has been developing well, let's hope that continues.

How long has this been in development? I may not be from Missouri, but you still have to show me. Perhaps I will be less critical when and if I see a product that is actually usable and less prone to me losing funds for computer/human error.


DooMAD (Legendary, Activity: 2100, Merit: 1342)
November 25, 2018, 03:11:18 PM #79

scaling bitcoin is not a 1mb-base-block vs 1-gigabyte-base-block argument,
so let's demyth that old PR campaign right away.

it's 1, 2, 4, 8, 16, 32 and so on.. here is the important thing: over time.

It doesn't have to be an integer, so let's get rid of that myth too.  Why not:

1.25mb base/5mb weight,
1.5mb base/6mb weight,
1.75mb base/7mb weight
2mb base/8mb weight
and so on?  

It's not just about it happening "over time", it's also about sensible increments.  Based on what you've witnessed to date, it should be more than obvious that most BTC users are in no rush to double or quadruple the base.
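Since a SegWit block's weight limit is four times its base-size limit, the fractional pairs above follow mechanically; a quick illustration (the function name is mine):

```python
# Each step raises the base-size limit by 0.25 MB; under SegWit the
# block weight limit is 4x the base size, giving the base/weight pairs.

def schedule(start_mb=1.25, step_mb=0.25, steps=4):
    pairs = []
    base = start_mb
    for _ in range(steps):
        pairs.append((base, base * 4))   # (base MB, weight in MB-equivalent)
        base += step_mb
    return pairs

for base, weight in schedule():
    print(f"{base}mb base / {weight:g}mb weight")
```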

franky1 (Legendary, Activity: 2520, Merit: 1462)
November 25, 2018, 03:18:17 PM #80

There have been misinformation attempts on the Lightning Network everywhere, made by people who want all transactions to be processed on-chain, in big blocks, and by all nodes. That is not scalable.

But I will ask around and find a good answer for you.

or... there are issues that the LN devs themselves admit. but some people who want LN to be a success don't want the positive PR train to stop, so they will argue endlessly that LN is utopia.

here are the LN devs themselves talking about issues with LN that won't be fixed:
https://youtu.be/8lMLo-7yF5k?t=570

and yes, factories are the next evolution of LN concepts, where factories will be the new masternodes housing lots of people's data and also monitoring multiple chains, because they know LN is not bitcoin. it's a separate network for multiple coins, where LN wishes to be the main system and leave bitcoin and litecoin as just boring shards/data stores.

franky1 (Legendary, Activity: 2520, Merit: 1462)
November 25, 2018, 03:31:42 PM #81

anyway, once people play out the scenarios, sharding will eventually lead back around to needing masternodes (full nodes).
and then, once people play out the scenario where a masternode is needed that stores all shard data, why have it separate? the regional nodes are now just lower down the network and of less importance.
and if it's not separate, we are back to a single codebase of nodes monitoring everything.. it just comes full circle: the strongest network is one that is united under one ruleset.

It doesn't have to be an integer, so let's get rid of that myth too.  Why not:

1.25mb base/5mb weight, requires hard fork to move to
1.5mb base/6mb weight,  requires hard fork to move to
1.75mb base/7mb weight  requires hard fork to move to
2mb base/8mb weight requires hard fork to move to
and so on?  

It's not just about it happening "over time", it's also about sensible increments.  Based on what you've witnessed to date, it should be more than obvious that most BTC users are in no rush to double or quadruple the base.

fixed that for you.
however, having a case where we remove the witness scale factor and have code set in consensus of:

4mb pool policy / 32mb consensus,
4.25mb pool policy / 32mb consensus,
4.5mb pool policy / 32mb consensus

means no hard fork per increment, just some code for when the blocks soft-increment by 0.25.
again, to demyth the PR campaign:
this is not about 32mb blocks.
this is about avoiding 3 years of debate just to perform one hard fork, then another 3-year debate to perform the next one.
blocks will not be 32mb. they will increment at the pool policy amounts, all monitored and adhered to by nodes enforcing that pools don't go over the policy amount until a coded trigger soft-activates a 0.25 increment.
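A hypothetical sketch of the rule described above, with the increment trigger left abstract; none of this is an existing Bitcoin implementation, just the two-limit idea in code:

```python
# Hypothetical: consensus fixes a hard ceiling once (32 MB here), while
# nodes enforce a lower pool-policy limit that soft-increments by
# 0.25 MB whenever some agreed trigger fires (left abstract).

CONSENSUS_MAX_MB = 32.0   # set once, by a single hard fork
STEP_MB = 0.25

def effective_limit(policy_mb):
    # a block must satisfy the lower of the two limits
    return min(policy_mb, CONSENSUS_MAX_MB)

def accept_block(block_mb, policy_mb):
    return block_mb <= effective_limit(policy_mb)

def soft_increment(policy_mb):
    # the "coded thing" that grows the policy limit, capped at consensus
    return min(policy_mb + STEP_MB, CONSENSUS_MAX_MB)

policy = 4.0
assert accept_block(4.0, policy) and not accept_block(4.1, policy)
policy = soft_increment(policy)   # now 4.25, with no hard fork
```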

again to demyth the PR campaign
this is not about EB (trying to turn the debate into mentioning a certain group of people)
this is about allowing progressive growth without hard forks and 3 year debates per increment

DooMAD (Legendary, Activity: 2100, Merit: 1342)
November 25, 2018, 04:20:49 PM (last edit: November 25, 2018, 04:35:58 PM by DooMAD) #82

this is not about 32mb blocks.
(...)
this is not about EB

How do you propose something and then basically say "this is not about the thing I'm literally proposing right now"?   Roll Eyes

Perhaps it would allow us to forego the continual hardfork drama, but it's still not remotely as simple and clear-cut as you're making it out to be.  There are very good reasons why people are opposed to such a system and if you aren't even going to attempt to overcome the objections and only talk about the positives, then don't expect people to take this proposal seriously.

franky1 (Legendary, Activity: 2520, Merit: 1462)
November 25, 2018, 04:35:38 PM (last edit: November 25, 2018, 05:17:41 PM by franky1) #83

this is not about 32mb blocks.
(...)
this is not about EB

How do you propose something and then basically say "this is not about the thing I'm literally proposing right now".   Roll Eyes

Perhaps it would allow us to forego the continual hardfork drama, but it's still not remotely as simple and clear-cut as you're making it out to be.  There are very good reasons why people are opposed to such a system and if you aren't even going to attempt to overcome the objections and only talk about the positives, then don't expect people to take this proposal seriously.

because "EB" is a buzzword.
EB is one particular limited proposal.

the way EB handles increments is one way, but i can think of dozens. so again, it's not about EB.. but about increments without hard forks.
just like mentioning 32mb: suddenly your mind instantly thinks of an existing proposal.
this is not about those specific proposals.

the 32mb is about something entirely different, which is technical, and which certain proposals latched onto. if you ignore the proposals that came second to the 32mb thing and concentrate on the 32mb as its own thing, from which many concepts and proposals can develop, you will see that i am not talking about resurrecting old proposals, but about getting to the root issue of hard forks and the 32mb issue.
again, try not to make this about old proposals, but about how to scale bitcoin given the known things that need to be addressed.

aliashraf (Hero Member, Activity: 896, Merit: 656)
November 25, 2018, 05:08:01 PM #84
Merited by bones261 (2), ETFbitcoin (1)

Bitcoin has always been secure since the first day, while the cost of carrying out such an attack has increased substantially from a few bucks to hundreds of millions of dollars.

Security is not quite an 'indexable' measure; saying 'this coin is less secure' or 'that coin is more secure' is absurd in a cryptocurrency context. The way I understand bitcoin, there is no "less" or "more" security: you are secure or you are not ...

bitcoin and crypto are not inherently secure. it's why difficulty exists.
yes, bitcoin is secure against a CPU attack, as it would require trillions of PCs to match/overtake the hashrate.
but it's not secure against certain things, which is why it has to keep evolving.
Nope. Bitcoin doesn't need to evolve because of security. Who says that? It is already secure, way more than necessary, as I mentioned above. It needs evolution for new problems that have been raised recently, the most important one being what we are discussing here: mass adoption and scaling. It is not a security question for me.

There is and there has been no security crisis in bitcoin. Forget about bugs; we are not discussing bugs here, are we?

Security of bitcoin is guaranteed by its elegant use of game theory in its core model. It is defined as the result of an equilibrium between the costs of an attack and the maximum incentives for committing it. It is not just about one side; there is no one-sided equilibrium. Do you really need this explained?

As the bitcoin price surges, the threats escalate and so does the difficulty. That keeps bitcoin safe, just safe, not safer. It does not make sense to say bitcoin is getting safer when the threats are getting stronger! The difficulty rise keeps us safe despite the escalated threats. Again: safe, not safer.

franky1 (Legendary, Activity: 2520, Merit: 1462)
November 25, 2018, 07:06:10 PM (last edit: November 25, 2018, 07:43:40 PM by franky1) #85

here is another way i see a system where shards could be used (dismissing my own concern that masternode monitoring is inevitable):

imagine one chain as the master financial audit chain, where the transactions are smaller:

  input:  FFFFF AAAAAAA  ->  output 1:  FFFFF AAAAAAA
                              output 2:  FFFFF AAAAAAA

in byte count that is (5 + 7) + (5 + 7) + (5 + 7) = 36 bytes.

each F is an identifier byte: 5 bytes allow over 1 trillion identifiers.
each A is a coin-amount byte: 7 bytes allow values up to 72 quadrillion, so it's easy to store numbers up to the 2.1 quadrillion satoshi supply cap.

a shard then stores an ID chain:
  e.g. FFFFF = bc1q....   (say, less than 50 bytes per entry)
and another shard stores the signatures (say, under 100 bytes per entry).

essentially this makes the financial chain, which audits coins right back to the coin reward (creation), use only 36 bytes of data per minimal tx instead of 225 bytes, and a 2-in 2-out multisig becomes 48 bytes instead of 300+ bytes.

this not only fits more tx's per mb, but brings the utxo set down to 12 bytes per 'address' and coin amount.
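The 36-byte layout above can be sketched directly; the field widths are franky1's, while the big-endian packing and the example values are my own assumptions:

```python
# Three (5-byte identifier, 7-byte amount) pairs: one input funding two
# outputs. 2**40 exceeds 1 trillion identifiers; 2**56 sats comfortably
# covers the 2.1 quadrillion satoshi supply cap.

def pack_pair(ident: int, sats: int) -> bytes:
    assert 0 <= ident < 2**40 and 0 <= sats < 2**56   # 5- and 7-byte ranges
    return ident.to_bytes(5, "big") + sats.to_bytes(7, "big")

def pack_tx(inp, out1, out2) -> bytes:   # each arg is an (id, amount) tuple
    return b"".join(pack_pair(*p) for p in (inp, out1, out2))

tx = pack_tx((1, 150_000_000), (2, 100_000_000), (1, 49_990_000))
assert len(tx) == 36   # vs ~225 bytes for a typical on-chain tx today
```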

Nop. Bitcoin doesn't need to evolve because of security. Who says that? It is already secure ways more than necessary,

ok, imagine it: everything got locked down tomorrow. hashrate doesn't evolve, difficulty locks, developers retire, and we stay with 10,000 full nodes..
how long do you think it would be before things start to go bad?

mechanikalk (Member, Activity: 84, Merit: 20)
November 25, 2018, 08:15:38 PM #86

bitcoin's beauty is how it solves having multiple generals ruling multiple regions while ensuring they all comply with one rule,
and how those generals abide by that one rule without having one general.

the answer was: everyone watches everything and rejects the malicious individuals.
we have had "sharding" systems in the real world for decades. sharding is DE-inventing what makes bitcoin, bitcoin.

Franky, based on all of your comments and discussion, much of which I agree with, I think you should look at BlockReduce.

There is also a discussion thread on it specifically.

Basically it uses Proof-of-Work to create a hierarchy of merge-mined blockchains. It allows incremental work (lower-chain blocks) to be used to efficiently group and propagate transactions to the entire network. Of many of the people here, I think you would get it and be able to provide constructive feedback.
aliashraf (Hero Member, Activity: 896, Merit: 656)
November 25, 2018, 09:38:50 PM #87

Nop. Bitcoin doesn't need to evolve because of security. Who says that? It is already secure ways more than necessary,

ok imagine it. everything got locked down tomorrow. hashrate doesnt evolve. difficulty locks, developers retire and we stay with 10,000 full nodes..
how long do you think it will be before things start to go bad
This is not an argument. I think you should spend a couple of minutes reading my post; it is not a wall of text, after all.

Hashrate and price basically follow each other in a complicated socioeconomic game, while price linearly determines how intense the security threats are. That is how bitcoin becomes a stable dynamic system that regulates itself to remain in an equilibrium state. Doubling or tripling the hashrate doesn't make bitcoin "more secure"; it would just be a waste of electricity.

Developers have a lot of problems to solve, we are already late in terms of versioning, nobody should apply for premature retirement, and we need to recruit even more.
Wind_FURY (Hero Member, Activity: 1218, Merit: 812)
November 26, 2018, 05:42:48 AM #88

We are discussing the scaling issue.  Roll Eyes


Then "VIP2VIP cash" is the wrong terminology. Bitcoin remains an open system.


People must have a short memory. During the prolonged tx backlog event back in 2017, it certainly seemed that way. Since the tx fee is based on tx size and not the amount sent, people wanting to move around smaller amounts were getting eaten alive with fees. Although a fee of 300 sats per byte is trivial for someone wanting to move around 1 BTC, it was prohibitive for someone wanting to move around 1 million sats.


It "seemed" that way. But where are we now? Fees are as low as ever in satoshis/byte. You have to consider that Bitcoin is not PayPal. "Satoshi's vision" of peer-to-peer digital cash cannot be achieved on-chain. It is not scalable.

Quote:

You really think the blockchain fee is still going to be low if and when demand is 100x higher than it is currently?


No. I already said that users will be forced to use Bitcoin Cash. Other more secure altcoins would be better though.

So the riffraff have to settle for a 3rd rate shitcoin network? Sounds like a vip2vip attitude to me.  Cheesy


It is reality, and what the market has to offer until a solution is found. What the Core developers will not do is give way to a group of people who want a hard fork to bigger blocks, which will be problematic on its own.

Quote:

Let's hope if and when that ever happens, LN will somehow ease the risk of losing coins either due to your channel partner closing a channel in an earlier state and you not catching it or you having a system error and closing a channel in an earlier state in error, and getting a penalty. (Or closing it in an earlier state not in your favor.)


As any software development project, it may succeed, or it may fail. But Lightning has been developing well, let's hope that continues.

How long has this been in development? I may not be from Missouri, but you still have to show me. Perhaps I will be less critical when and if I see a product that is actually usable and less prone to me losing funds for computer/human error.


As long as it should take if it has to work well for everyone.

Lightning nodes and channels are increasing. I believe there is more than $1.5 million of liquidity in Lightning, and growing.


bones261 (Legendary, Activity: 1680, Merit: 1702)
November 26, 2018, 06:34:37 AM #89


It "seemed" that way. But where are we now? Fees are as low as ever in satoshis/byte. You have to consider that Bitcoin is not PayPal. "Satoshi's vision" of peer-to-peer digital cash cannot be achieved on-chain. It is not scalable.


You act as if that is set in stone. Like it can't be changed for any reason. If Bitcoin is still around in 2118, is it still going to be this way?


It is reality, and what the market has to offer until a solution is found. What the Core developers will not do is give way to a group of people who want a hard fork to bigger blocks, which will be problematic on its own.


Asking people to compromise with a less secure solution just isn't acceptable. From what I can tell, the BTC network is the most secure by leaps and bounds over any other altcoin.


As long as it should take if it has to work well for everyone.

Lightning nodes and channels are increasing. I believe there is more than $1.5 million of liquidity in Lightning, and growing.

So the Core team is going to stick to this plan, no matter what? I really doubt they are that inflexible if it appears they can't find an acceptable solution to people losing coins.

franky1 (Legendary, Activity: 2520, Merit: 1462)
November 26, 2018, 07:41:25 AM (last edit: November 26, 2018, 07:59:38 AM by franky1) #90

LN is not the solution for bitcoin, because LN is not bitcoin.

LN is a separate network <- emphasis

LN is a separate network which many coins will use.

i'm surprised that people would prefer to defend developers by saying bitcoin is broken and won't scale and that the only option is another network.
even when it has been made clear that LN has flaws that won't be fixed, people continue to defend a dev, call bitcoin broken, and then promote an alternative network that is not even a blockchain and not even directly tied to bitcoin.

LN is, in simple words, a separate project that is just using bitcoin as its trial test coin, not a bitcoin feature itself.

bitcoin has been modified to be LN-compatible, much like litecoin, vertcoin and other coins.
LN has not been made to only fit bitcoin (which would make it a feature of bitcoin), which is why it's not a bitcoin feature but a cryptocurrency feature.
they are simply using bitcoin as their test coin to play off the fame of bitcoin.

EG if LN had used vertcoin as the initial test coin, LN devs wouldn't be paid (as much) to develop LN, but LN would still be the same network it is today. again, for emphasis: it's not a bitcoin feature, it's a separate network that's just playing off the fame of bitcoin, because the LN network allows bitcoin access to it (now that bitcoin has been modified to fit LN).

i'm not saying LN doesn't have a niche, but i'm clarifying that those who want bitcoin to remain the main currency of utility are not realising that LN will hurt bitcoin.
(imagine all the locked UTXOs)
anyway, here's a video of the LN devs themselves actually trying to remind people that the utopia of LN is not as perfect as promoted:
https://youtu.be/8lMLo-7yF5k?t=570
(do not reply to my post unless you have watched the video)
(do not reply to my post unless you are replying to defend the bitcoin network (not the devs but the bitcoin network))
(do not reply to my post to derail the conversation into personal attacks on me, as the video itself is doing the bashing, not me)
(do not reply to my post just to promote other networks, because that is not about scaling bitcoin)

Wind_FURY (Hero Member, Activity: 1218, Merit: 812)
November 27, 2018, 08:34:19 AM #91


It "seemed" that way. But where are we now? Fees are as low as ever in satoshis/byte. You have to consider that Bitcoin is not PayPal. "Satoshi's vision" of peer-to-peer digital cash cannot be achieved on-chain. It is not scalable.


You act as if that is set in stone. Like it can't be changed for any reason. If Bitcoin is still around in 2118, is it still going to be this way?


What has the higher chance of a "decentralized Bitcoin" in 2118: 1mb blocks or 32mb blocks?

You also act as if the spammer flooding the mempool can keep it up until 2118.

Quote:

It is reality, and what the market has to offer until a solution is found. What the Core developers will not do is give way to a group of people who want a hard fork to bigger blocks, which will be problematic on its own.


Asking people to compromise with a less secure solution just isn't acceptable. From what I can tell, the BTC network is the most secure by leaps and bounds over any other altcoin.


I'm not asking. It is reality. Cool

By the way, what is your standpoint on Segwit and the Core developers' firm stance on smaller blocks?

Quote:

As long as it should take if it has to work well for everyone.

Lightning nodes and channels are increasing. I believe there is more than $1.5 million of liquidity in Lightning, and growing.

So the Core team is going to stick to this plan, no matter what? I really doubt they are that inflexible if it appears they can't find an acceptable solution to people loosing coins.


I can't speak for the developers, but I believe a layered architecture is the direction it's taking.


bones261 (Legendary, Activity: 1680, Merit: 1702)
November 28, 2018, 02:14:25 AM #92


What has the higher chance of a "decentralized Bitcoin" in 2118: 1mb blocks or 32mb blocks?

You also act as if the spammer flooding the mempool can keep it up until 2118.


In 100 years, I think a toaster will probably be able to handle 32mb blocks. No detrimental effect on decentralization. Also, someone can attempt to "flood the mempool" right now with ultra-low-fee and zero-fee transactions. The only way to combat this is for nodes to set a minimum fee that they will accept and relay to other nodes. It has absolutely nothing to do with the capacity of the blocks.
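The per-node fee floor described here can be sketched in a few lines (Bitcoin Core exposes a similar knob via its `-minrelaytxfee` option; the rate and transaction sizes below are illustrative):

```python
# Sketch of a per-node relay policy: each node picks its own fee floor
# and simply refuses to relay transactions that pay less per vbyte.

MIN_FEE_RATE = 1.0   # sat/vbyte, chosen by this node's operator

def should_relay(fee_sats: int, vsize_vbytes: int) -> bool:
    return fee_sats / vsize_vbytes >= MIN_FEE_RATE

assert should_relay(fee_sats=250, vsize_vbytes=225)       # ~1.1 sat/vb
assert not should_relay(fee_sats=100, vsize_vbytes=225)   # ~0.4 sat/vb
```

Note this is purely relay policy, not consensus: a block containing a cheaper transaction is still valid, which is why it has nothing to do with block capacity.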


I'm not asking. It is reality. Cool

By the way, what is your standpoint on Segwit and the Core developers' firm stance on smaller blocks?


I think it is interesting that you mention Segwit and smaller blocks when part of what Segwit did was introduce increased block capacity. Perhaps they will be able to do something similar with future improvements to BTC.


Wind_FURY (Hero Member, Activity: 1218, Merit: 812)
November 28, 2018, 04:47:27 AM #93


What has the higher chance of a "decentralized Bitcoin" in 2118: 1mb blocks or 32mb blocks?

You also act as if the spammer flooding the mempool can keep it up until 2118.


In 100 years, I think a toaster will probably be able to handle 32mb blocks. No detrimental effect on decentralization. Also, someone can attempt to "flood the mempool" right now with ultra-low-fee and zero-fee transactions. The only way to combat this is for nodes to set a minimum fee that they will accept and relay to other nodes. It has absolutely nothing to do with the capacity of the blocks.


Then what's the hurry? I believe Bitcoin will need to hard fork to bigger blocks later at any rate, if it gets consensus.

But Bitcoin Core will not bow to a group of people demanding bigger blocks now "because reasons". It is politics, and some group of people will always want control. Remember "2X".

Quote:

I'm not asking. It is reality. Cool

By the way, what is your standpoint on Segwit and the Core developers' firm stance on smaller blocks?


I think it is interesting that you mention Segwit and smaller blocks when part of what Segwit did was introduce increased block capacity. Perhaps they will be able to do something similar with future improvements to BTC.


Then why aren't you happy? Cool


▄▄▄████████▄▄▄
▄██████████████████▄
▄██████████████████████▄
██████████████████████████
████████████████████████████
██████████████████████████████
██████████████████████████████
██████████████████████████████
██████████████████████████████
██████████████████████████████
████████████████████████████
██████████████████████████
▀██████████████████████▀
▀██████████████████▀
▀▀▀████████▀▀▀
   ███████
██████████
██████████
██████████
██████████
██████████
██████████
██████████
██████████
██████████
██████████
██████████
███████
BTC  ◉PLAY  ◉XMR  ◉DOGE  ◉BCH  ◉STRAT  ◉ETH  ◉GAS  ◉LTC  ◉DASH  ◉PPC
     ▄▄██████████████▄▄
  ▄██████████████████████▄        █████
▄██████████████████████████▄      █████
████ ▄▄▄▄▄ ▄▄▄▄▄▄ ▄▄▄▄▄ ████     ▄██▀
████ █████ ██████ █████ ████    ▄██▀
████ █████ ██████ █████ ████    ██▀
████ █████ ██████ █████ ████    ██
████ ▀▀▀▀▀ ▀▀▀▀▀▀ ▀▀▀▀▀ ████ ▄██████▄
████████████████████████████ ████████
███████▀            ▀███████ ▀██████▀
█████▀                ▀█████
▀██████████████████████████▀
  ▀▀████████████████████▀▀ 
✔️DICE           
✔️BLACKJACK
✔️PLINKO
✔️VIDEO POKER
✔️ROULETTE     
✔️LOTTO
bones261 (Legendary, Activity: 1680, Merit: 1702)
November 28, 2018, 05:39:52 AM #94


What has the higher chance of a "decentralized Bitcoin" in 2118: 1mb blocks or 32mb blocks?

You also act as if the spammer flooding the mempool can keep it up until 2118.


In 100 years, I think a toaster will probably be able to handle 32mb blocks. No detrimental effect on decentralization. Also, someone can attempt to "flood the mempool" right now with ultra-low-fee and zero-fee transactions. The only way to combat this is for nodes to set a minimum fee that they will accept and relay to other nodes. It has absolutely nothing to do with the capacity of the blocks.


Then what's the hurry? I believe Bitcoin will need to hard fork to bigger blocks later at any rate, if it gets consensus.

But Bitcoin Core will not bow to a group of people demanding bigger blocks now "because reasons". It is politics, and some group of people will always want control. Remember "2X".


      I'm not in a hurry. I just hope Bitcoin Core isn't thinking the scaling problem is solved with LN. They should be working on optimizations. It would be nice if they came up with a way that you could run a pruned node or a sharded node, and not have to devote a bunch of HD space to the initial sync. Not sure why you can't prune as you go on the initial sync.
      I also remember the 2x. When they were supposed to attempt the fork, it locked up. Obviously the software wasn't adequately vetted and tested.
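On the pruning point: Bitcoin Core's `prune` option does discard old block files as the initial sync progresses — you still download and verify every block, but disk usage stays bounded. A minimal config sketch (550 MiB is the minimum allowed value):

```
# bitcoin.conf -- keep only the most recent ~550 MiB of block files,
# deleting older ones as the sync goes (full download/validation still required)
prune=550
```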


I'm not asking. It is reality. Cool

By the way, what is your standpoint on Segwit and the Core developers' firm stance on smaller blocks?


I think it is interesting that you mention Segwit and smaller blocks, when part of what Segwit did was increase the effective block capacity. Perhaps they will be able to do something similar with future improvements to BTC.
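For reference, SegWit's capacity increase comes from the block-weight rule in BIP 141: weight = 3 × base_size + total_size, capped at 4,000,000 weight units. Witness bytes count once while non-witness bytes count four times, so blocks can exceed 1 MB in total size when witness data is present. A quick sanity check:

```python
# BIP 141 block weight: witness bytes count once, non-witness bytes four times.
def block_weight(base_size, total_size):
    """base_size: serialized size without witness data; total_size: with it."""
    return 3 * base_size + total_size

MAX_BLOCK_WEIGHT = 4_000_000

# A legacy block has no witness data, so base == total and the old
# 1 MB size limit maps exactly onto the 4M weight cap.
assert block_weight(1_000_000, 1_000_000) == MAX_BLOCK_WEIGHT

# A SegWit block can exceed 1 MB total and still fit under the cap,
# e.g. 800 kB of non-witness data plus 700 kB of witness data.
assert block_weight(800_000, 1_500_000) == 3_900_000
```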


Then why aren't you happy? Cool

I'm not happy with LN for several reasons.
1) Requires you to be online at all times. Offline time=risk of funds. Online time=funds in hot wallet=risk of funds.
2) You have to hope your channel partner is online at all times in order for the channel to function optimally.
3) When making a transaction requiring hops, you have to hope no one disconnects or transfers funds along the route before your transaction completes. Otherwise, your funds are locked up for days.



ETFbitcoin
Legendary

Activity: 1764
Merit: 2023

November 28, 2018, 07:48:07 PM
 #95

Then what's the hurry? I believe Bitcoin will need to hard fork to bigger blocks later at any rate. If it gets consensus.

But Bitcoin Core will not bow to a group of people demanding bigger blocks now because reasons. It is politics, and some group of people will always want control. Remember "2X".
      I'm not in a hurry. I just hope bitcoin core isn't thinking the scaling problem is solved with LN. They should be working on optimizations. It would be nice if they came up with a way that you could run a pruned node or a sharded node, and not have to devote a bunch of HD to the initial sync. Not sure why you can't prune as you go on the initial sync.
      I also remember the 2x. When they were supposed to attempt the fork, it locked up. Obviously the software wasn't adequately vetted and tested.

I agree, but a few developers are working on MuSig/Schnorr signatures, which should help on-chain scaling. There are other proposals such as MAST, but I can't find information on whether they are in active development or not.
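As a rough illustration of why key aggregation helps on-chain scaling, here is a toy Schnorr signature over a small multiplicative group: two signers' keys collapse into one public key and one signature on chain. This is an insecure demo with made-up parameters, not real MuSig (MuSig adds per-signer coefficients to defend against rogue-key attacks) and not Bitcoin's secp256k1:

```python
import hashlib

# Toy Schnorr over a small multiplicative group -- illustration only, NOT secure.
p = 2**61 - 1          # group modulus (a Mersenne prime; demo size)
g = 3                  # generator (assumed for the demo)
q = p - 1              # exponent arithmetic works mod p-1 (Fermat's little theorem)

def H(*parts):
    """Hash-to-integer challenge, mod q."""
    h = hashlib.sha256("|".join(str(x) for x in parts).encode()).digest()
    return int.from_bytes(h, "big") % q

def sign(x, msg, k):
    """Schnorr sign msg with private key x and nonce k."""
    R = pow(g, k, p)
    e = H(R, msg)
    s = (k + e * x) % q
    return R, s

def verify(P, msg, R, s):
    """Check g^s == R * P^e, i.e. the signature matches public key P."""
    e = H(R, msg)
    return pow(g, s, p) == (R * pow(P, e, p)) % p

# Naive key aggregation: summing private keys multiplies public keys,
# so one signature covers both signers.
x1, x2 = 1234, 5678
P_agg = (pow(g, x1, p) * pow(g, x2, p)) % p
R, s = sign((x1 + x2) % q, "pay Alice 1 BTC", k=999)
assert verify(P_agg, "pay Alice 1 BTC", R, s)
```

On chain, a verifier only ever sees the single aggregate key and signature, which is where the space saving for multisig-style spends comes from.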

As for 2x, AFAIK that happened because there were very few active developers and testers/reviewers.

I'm not happy with LN for several reasons.
1) Requires you to be online at all times. Offline time=risk of funds. Online time=funds in hot wallet=risk of funds.
2) Hope your channel partner is online at all times in order to function optimally.
3) When making a transaction requiring hops, hope no one happens to disconnect or transfer funds along the hop, before your transaction completes. Otherwise, your funds are locked up for days.

I'm also not happy with LN in its current state, but:
1. This can be solved by using a watchtower (software/service which holds all the HTLCs and watches the blockchain 24/7). But this either requires trusting a 3rd party, or another server/device which runs 24/7 and acts as the watchtower.
In the case where the user runs their own watchtower, there's no hot wallet, as the wallet only connects to the watchtower server over a local connection.
2. Yes, but in some cases (such as when the channel partner only makes payments and never receives them), being online at all times isn't too important.
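The watchtower's core loop can be sketched roughly as follows. This is a deliberate simplification with hypothetical names: real LN watchtowers receive encrypted penalty blobs keyed by a commitment-txid hint, so they cannot read channel state, but the scan-and-react shape is the same:

```python
# Conceptual watchtower sketch (hypothetical names, simplified protocol):
# scan each new block for revoked commitment transactions and, on a match,
# broadcast the prebuilt penalty transaction that claims the cheater's funds.
def watch_block(block_txids, revoked_commitments, broadcast):
    """revoked_commitments maps a revoked commitment txid -> its penalty tx."""
    for txid in block_txids:
        if txid in revoked_commitments:
            broadcast(revoked_commitments[txid])

# Usage: the tower sees a block containing a revoked commitment ("deadbeef")
# and reacts by publishing the stored penalty transaction.
published = []
watch_block(
    ["aa11", "deadbeef"],
    {"deadbeef": "penalty-tx-hex"},
    published.append,
)
assert published == ["penalty-tx-hex"]
```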