Bitcoin Forum
Author Topic: A different approach to Bitcoin's scalability issues?  (Read 424 times)
ISRBitcoin11 (OP)
Newbie
Activity: 4 | Merit: 19
May 26, 2022, 08:38:21 PM
Merited by Welsh (4), o_e_l_e_o (4), BlackHatCoiner (4), pooya87 (3), ABCbits (2), odolvlobo (1), DdmrDdmr (1)
#1

A different approach to Bitcoin's scalability issues?

Opening
I have a question that I have recently been thinking about, and I wanted to share it and get feedback from people who probably have much more knowledge of Bitcoin and blockchain than I do.
This is an honest thought that I compiled into these paragraphs, just trying to 'think outside the box', as they say.
I am not an expert on Bitcoin or blockchain, so it is quite probable that there is something I am missing or miscalculating. It is also possible that this solution has already been suggested before I thought of it.
Any feedback would be great. Thanks in advance.

Alignment
This is just a moment to review the things I know and understand (or think I do) about Bitcoin's mechanism.
I am separating it from the concept below, so it is clear what I am basing my reasoning on. It is also possible that I misunderstood something that changes everything at its core.
So, some basics:
    1) Bitcoin can handle about 7 tx/sec at most, and that is the best-case scenario. This is due to the 1 MB block size limit.
    2) The BCH hard fork happened because it was claimed that making the block size bigger would cause a more centralized network that would be much harder for regular people to participate in.
    3) The network adapts the difficulty of mining a block in relation to the hashing power of the system, roughly every two weeks.

Basic concept
So, given that Bitcoin uses a self-adapting mechanism that adjusts the block-mining difficulty in relation to the network's hashing power, why not use a similar mechanism to make the block size bigger?
Now more specifically:
What if the block size adjusted itself in relation to the network's hashing power as well (using a calculation that could differ from the one used for mining difficulty)?
This concept is based on one major assumption - in general (not on a specific date or over a short period of time), more hashing power means one of two things:
     1) More people are joining the network
     2) The same people/mining farms etc. got more power
Either of these two options suggests that the network (through the price of its coin, bitcoin) is profitable enough for people to invest in mining power.
So, if the coin's value is high enough, it could imply that more people are adopting the network or the coin - eventually resulting in a higher transaction rate.
With that settled, it means there could be a certain relation between the network's hashing power and its transaction rate.
This relation does not have to be accurate to the decimals, just enough to give a general trend. In the end, the difference between 10,000 tx/sec and 10,001 tx/sec is not that big of a deal.
I have to stop and say - I am not even close to being qualified to say anything practical about the numeric side of this relation. I am just trying to lay out a concept.
The calculation mechanism doesn't have to be applied to the network every two weeks.

So, to sum up the basics of the concept - a calculation/data-based mechanism would change the block size in relation to the network's total hashing power. This is equivalent (assuming everything mentioned above holds) to saying that the block size would change depending on the coin's and the network's adoption around the globe.
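
To make the concept a bit more concrete, here is a minimal illustrative sketch (not a proposal for real consensus code). Only the hashrate estimate derived from difficulty is a standard approximation; the reference constant, the base size, and the square-root dampening are made-up placeholders.

Code:
# Illustrative sketch only: a toy rule that scales the size cap with estimated hashrate.
# BASE_HASHRATE and the square-root dampening are arbitrary choices for this example.

BASE_BLOCK_SIZE = 1_000_000      # bytes, the historical 1 MB baseline
BASE_HASHRATE = 2e20             # H/s, arbitrary reference point for the sketch

def estimate_hashrate(difficulty: float, block_interval: float = 600.0) -> float:
    """Standard approximation: expected hashes per block divided by the target block time."""
    return difficulty * 2**32 / block_interval

def proposed_block_size(difficulty: float) -> int:
    """Scale the cap with estimated hashrate relative to the reference point."""
    ratio = estimate_hashrate(difficulty) / BASE_HASHRATE
    return int(BASE_BLOCK_SIZE * max(ratio, 0.0) ** 0.5)

# Example: at a difficulty around 80e12 the estimated hashrate is roughly 5.7e20 H/s,
# so this toy rule would allow a block of roughly 1.7 MB.
print(proposed_block_size(80e12))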

Advantages
This mechanism has some big advantages that I can see, also compared to the Lightning Network (which I might not understand well enough, so correct me if I'm wrong):
      1) No reliance on another network - The mechanism is an integral part of the Bitcoin network and does not require a special, custom-tailored network to be built.
          It should also be mentioned that depending on another layer to process transactions makes the whole ecosystem more vulnerable, because two networks now have to work perfectly all the time
          instead of just one.
      2) Fees - As the block size increases roughly in line with adoption and transaction rate, it should smooth the fees out and keep them at about the same level.
          This is very different from the Lightning Network, which must have increased fees in order to be economically justified.
      3) It works both upwards and downwards - Meaning that when fear or pessimism (or regulation) causes people to abandon the network, leading to lower hash rates, the network will adjust the
          block size back down - making it easier for everyone to mine blocks again.
      4) Another advantage I see in relation to the Lightning Network is that on the Lightning Network you must have open channels across many people simultaneously in order to make some complicated
          transactions. Think about it - if you want to buy from a seller in the US and you live in the UK, you basically have two options for making this transaction happen on the Lightning Network:
               a) Open a direct, single channel with that seller. This method is practically inefficient, because you would have to open and close many channels with many sellers throughout your life, which would cost a lot of fees.
               b) Use already-open channel links that can somehow connect you both. While I can see this method working efficiently in small communities or groups, what are the odds that someone you know/have a
                   channel with has a channel with someone who has a channel with someone... (you can see where this is going) that will connect you with people in other countries? Statistically I believe it could be done,
                   but the real question is how much would have to be paid in network fees for such a long route?
        
Disadvantages
       1) The frequency with which the mechanism updates the block size must be chosen carefully enough to perform well even in extreme conditions and not collapse the whole network.
           For example, if a major country like China forbids mining in its territory and mining giants flee the country or shut down, the network shouldn't fall as hard or for as long.
           *Although this might be an interesting obstacle along the way, I believe that a smart calculation and algorithm could solve it.
        2) Basing the mechanism on the hash rate could cause over- or under-estimation of the network's transaction rate, since it predicts a trend in the network rather than the actual transaction rate.
 

This is what I have come up with so far. What are your thoughts?
odolvlobo
Legendary
Activity: 4522 | Merit: 3426
May 26, 2022, 09:47:53 PM
Merited by o_e_l_e_o (4), Welsh (3), ABCbits (3)
#2

I don't think basing the block size cap on the total difficulty is a good idea because the difficulty is not a measure of a need for more (or less) block space.

Any computation that determines the maximum size of a block must have the same result no matter who calculates it or when they calculate it, and the inputs to the calculation must be indisputable. Otherwise, the chain will fork.

Eventually, mining will be completely dependent on transaction fees and the maximum size of a block affects the value of those fees. So, a proposal for varying the maximum size should include some analysis on the effect on fees in potential future scenarios.

It can be argued that a 1 MB fixed cap may not be optimal (or even sufficient), but until a clearly better method can be demonstrated, I think it is likely to remain.
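
To illustrate that determinism requirement with a toy example (the functions below are hypothetical, not real validation code): a cap derived purely from data already committed in block headers gives every node the same answer, while anything based on a node's local view does not.

Code:
# Toy illustration of the determinism requirement (not real validation code).

def max_block_size_from_headers(prev_headers: list) -> int:
    """Hypothetical rule: the cap is a pure function of the difficulty stored in
    the last 2016 headers. Every node has identical headers, so every node
    derives an identical cap (the formula itself is just a placeholder)."""
    window = prev_headers[-2016:]
    avg_difficulty = sum(h["difficulty"] for h in window) / len(window)
    return 1_000_000 + int(avg_difficulty // 1_000_000_000)  # placeholder arithmetic

def max_block_size_from_local_view(my_mempool_bytes: int) -> int:
    """Not consensus-safe: two honest nodes with different mempools would
    enforce different caps and could reject each other's otherwise-valid blocks."""
    return my_mempool_bytes // 10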

pooya87
Legendary
Activity: 3668 | Merit: 11103
May 27, 2022, 03:03:21 AM
Merited by ABCbits (4), o_e_l_e_o (4), P2PECS (3), Welsh (2), BlackHatCoiner (2), d5000 (1), DdmrDdmr (1)
#3

    1) Bitcoin can handle about 7 tx/sec at most, and that's also best case scenario. This is due to the 1MB block size limit.
We use weight now; the limit is 4 million weight units (up to about 4 MB), and the total depends on transaction sizes. For example, block 738,060 was 1.4 MB with about 3,000 transactions, while block 367,853 was 0.99 MB with 12,239 transactions. But in general, based on the transaction sizes most people use, especially transactions that consolidate many inputs or have many outputs, sizes are higher, hence the average is about 5 tx/sec.
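
For reference, a minimal sketch of the BIP 141 accounting described here: block weight = (size without witness data) × 3 + (full size including witness data), and consensus requires the weight to stay at or below 4,000,000 units, so a block with no witness data tops out near 1 MB while witness-heavy blocks can approach 4 MB.

Code:
# Minimal sketch of BIP 141 block-weight accounting.
MAX_BLOCK_WEIGHT = 4_000_000  # weight units

def block_weight(base_size_bytes: int, total_size_bytes: int) -> int:
    """Non-witness bytes effectively count 4x; witness bytes count 1x."""
    return base_size_bytes * 3 + total_size_bytes

def fits_in_block(base_size_bytes: int, total_size_bytes: int) -> bool:
    return block_weight(base_size_bytes, total_size_bytes) <= MAX_BLOCK_WEIGHT

# A legacy-only block (no witness data) of 1,000,000 bytes weighs exactly 4,000,000 units.
print(block_weight(1_000_000, 1_000_000))  # 4000000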

Quote
    2) The BCH hard fork, happened because it was being claimed that making the block size bigger, would cause a more centralized network, that would be much harder for the regular people to participate in.
BCH forked to create a new altcoin so that those behind it could make money, whether through mining, pump-and-dumps, or other ways, such as scamming people by selling BCH to newbies who wanted to buy bitcoin via the website known as bitcoin.com.

Quote
    3) The network adapts the difficulty of mining a block in relation to the hashing power of the system, every 2 weeks.
The adjustment takes place every 2016 blocks.
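
For reference, the retarget rule works roughly as sketched below (simplified; the real code in Bitcoin Core also handles an off-by-one in how the timespan is measured and caps the target at the proof-of-work limit).

Code:
# Simplified sketch of Bitcoin's difficulty retarget, run every 2016 blocks.
RETARGET_INTERVAL = 2016                    # blocks
TARGET_TIMESPAN = RETARGET_INTERVAL * 600   # seconds (~two weeks at 10 min/block)

def next_target(old_target: int, actual_timespan: int) -> int:
    """New proof-of-work target; the adjustment is clamped to a factor of 4
    in either direction per retarget period."""
    clamped = max(TARGET_TIMESPAN // 4, min(actual_timespan, TARGET_TIMESPAN * 4))
    return old_target * clamped // TARGET_TIMESPAN

# Blocks found twice as fast as intended -> target halves (difficulty doubles).
print(next_target(2**220, TARGET_TIMESPAN // 2) == 2**219)  # True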

Quote
This concept is based on one major assumption - in general (not in a specific date or short period of time), more hashing power means one of two things:
     1) More people are joining the network
     2) The same people/mining farms etc. got more power
Either one of these two options, suggests that the network (by the price of it's coin - aka bitcoin) is profitable enough for people to invest in mining power.
So, if the coin's value is high enough, it could imply that more people are adopting the network or the coin - resulting eventually to higher transactions rate.
This is not the most correct assumption.
First of all, a hashrate increase could happen because someone made much more efficient hardware for mining bitcoin (in simple terms, a better ASIC). That doesn't have to mean more adoption.
Another reason could be that a very cheap source of electricity was found and more miners got on board in that place. For example, the electricity price for home users is ridiculously low where I live ($0.002), but people generally don't know or care about bitcoin mining, not to mention that government regulation forces any bitcoin miner to register and pay a higher electricity price (equal to the export price, which is about $0.05 I believe).
Imagine if that changed, and ASIC production wasn't a bottleneck. The hashrate would shoot up to the moon.

On the other hand, we saw how a price crash, combined with China banning mining and further price drops, could lead to the hashrate dropping a lot.

In both of these cases everything else about the network remains the same. It takes the same amount of time for nodes to download and verify each block, the number of nodes remains the same, and so on, while the hashrate jumps up or down.
So, as you can see, hashrate is not a good characteristic to use to determine the block size.

Generally speaking, any dynamic way of computing the block size cap is a bad idea.

Quote
      1) Not relating on another network - The mechanism is an integral part of the bitcoin network, and does not require a special and new custom-tailored network to be made.
          It should also be mentioned, that the dependency on another layer to process transactions, makes the whole ecosystem more vulnerable because now 2 networks has to work constantly perfect
          instead of just one.
The bottom line is that we can never reach the desired scalability using block size alone, simply because there is always a cap and blocks can be filled, whether with legitimate transactions or spam. But with a second layer that doesn't have the same limits, we can achieve a lot of scaling.

Quote
          This is in a huge difference from the lightning network, which must have increased fees in order to be economically justified.
Fees have to be low for people to use LN; otherwise it would make very little sense to even open a channel if you had to pay, for example, $100.

NotATether
Legendary
Activity: 1820 | Merit: 7476
May 27, 2022, 04:42:34 AM
Merited by Welsh (2), ABCbits (1)
#4

I think the variable-blocksize method has already been proposed/attempted before. Possibly SegWit2x, for example. Or maybe I'm just imagining that one.

The thing is, when the block size is based on difficulty, there's a chance for the block size to keep rising in proportion to difficulty without ever coming back down. This would put the size at e.g. 100 MB for a 100x increase in difficulty.

Even if someday disks do get the kind of capacity needed to store a blockchain with blocks that large (and not eat up space from everything else), network speeds and internet prices would still be a bottleneck to downloading such a chain.
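
As a rough back-of-the-envelope illustration of the storage side (simple arithmetic, assuming ~144 blocks per day and consistently full blocks):

Code:
# Rough storage growth at a hypothetical 100 MB block size, assuming full blocks.
block_size_mb = 100
blocks_per_day = 24 * 60 // 10   # ~144 blocks at a 10-minute average interval

print(block_size_mb * blocks_per_day / 1000, "GB per day")              # 14.4
print(block_size_mb * blocks_per_day * 365 / 1_000_000, "TB per year")  # ~5.26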

ISRBitcoin11 (OP)
Newbie
Activity: 4 | Merit: 19
May 27, 2022, 06:26:30 AM
#5

I think the variable-blocksize method has already been proposed/attempted before. Possibly Segwit2x for example. Or maybe I'm just imagining about that one.

The thing is when the blocksize is based on a difficulty, there's a chance for the blocksize to rise exponentially in proportion to difficulty without ever going down. This would keep the size at e.g. 100MB for a 100x increase in difficulty.

Even if someday, disks do get that kind of capacity to store a blockchain with that large blocks (and not eat up space from everything else), network speeds and internet prices would still be a bottleneck to downloading such a chain.




Thanks for replying. I totally agree, and that's exactly what I was aiming at with the first disadvantage I mentioned. This might happen, but, and I say this very carefully since I am not qualified to create any kind of formula, I believe an adjusting constant factor could be added to the calculation to soften increases and decreases in block size. It would produce a more moderate increase, and there could even be a limit on the maximum block size so it cannot go far beyond what today's technology can handle efficiently (which, of course, could be adjusted again over time as things change).
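
A minimal sketch of what such a softening factor plus hard limit could look like (every constant and name here is invented purely for illustration, not a concrete proposal):

Code:
# Illustrative only: dampened, clamped block-size adjustment.
MIN_SIZE = 1_000_000   # bytes, floor
MAX_SIZE = 8_000_000   # bytes, hard ceiling chosen arbitrarily for the sketch
ALPHA = 0.1            # softening constant: apply only 10% of the indicated change

def adjusted_size(current_size: int, indicated_size: int) -> int:
    """Move only a fraction ALPHA of the way toward what the indicator suggests,
    then clamp, so a sudden hashrate swing cannot whipsaw the cap."""
    step = ALPHA * (indicated_size - current_size)
    return int(min(MAX_SIZE, max(MIN_SIZE, current_size + step)))

# A 10x jump in the indicator only moves the cap modestly per adjustment period.
print(adjusted_size(1_000_000, 10_000_000))  # 1900000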
pooya87
Legendary
Activity: 3668 | Merit: 11103
May 27, 2022, 07:00:40 AM
#6

I think the variable-blocksize method has already been proposed/attempted before. Possibly Segwit2x for example. Or maybe I'm just imagining about that one.
No, SegWit2x was a one-time hard fork to increase the size to a new fixed value. You might be thinking of Bitcoin Unlimited, which basically gave miners all the power to change the block size at any time, to any value, as many times as they wanted.

ISRBitcoin11 (OP)
Newbie
Activity: 4 | Merit: 19
May 27, 2022, 07:05:55 AM
#7

I don't think basing the block size cap on the total difficulty is a good idea because the difficulty is not a measure of a need for more (or less) block space.

Any computation that determines the maximum size of a block must have the same result no matter who calculates it or when they calculate it, and the inputs to the calculation must be indisputable. Otherwise, the chain will fork.

Eventually, mining will be completely dependent on transaction fees and the maximum size of a block affects the value of those fees. So, a proposal for varying the maximum size should include some analysis on the effect on fees in potential future scenarios.

It can be argued that a 1 MB fixed cap may not be optimal (or even sufficient), but until a clearly better method can be demonstrated, I think it is likely to remain.

Thanks for replying.
I agree that the difficulty isn't really a measure of transaction demand. However, and I might be wrong about this one as well, I believe that a relation (not necessarily a linear one) might exist.
I went to check some data to get a more visual picture.
I checked these two charts:
https://ycharts.com/indicators/bitcoin_transactions_per_day
https://www.blockchain.com/charts/hash-rate

I have seen two significant things:
     1) It appears that there is no linear relation between the hash rate and the transactions per day.
         One thing that I think should be taken into account is that today we are already facing the consequences of the scalability issue. Meaning that I think it's possible to say, with at least a little confidence, that if
         transaction fees weren't as high, we would see many more transactions occurring. So it's possible that the data we are seeing today is "manipulated" by the problem itself.
      2) When major changes in trends occur, they show up in both the hash rate and the transaction 'demand'. You can look at the end of June 2021, end of April 2021, end of October 2020, end of May 2020,
          and end of March 2020. Although not identical, the downward trend is similar. It should be said that there may be a third factor I didn't take into account that is affecting both of these rates.

So to sum this one up - I don't believe the relation between the two factors has to be linear. It could be inverse, fractional, etc.

Regarding your point that the calculation must give matching results regardless of time or place - on the who and where, I agree; on the time factor, that's the whole point here.
Who and where - shouldn't matter, as long as the formula/algorithm is embedded in the network.
Time - that's what the calculation is all about: to give a specific value (which doesn't have to be accurate to the decimal point) that represents the relation between demand (transaction volume) and network power.
This value has to change over time, or nothing will ever change in the network (hence the block size would remain the same).
The algorithm might take multiple calculations made by many nodes and average them to get the best all-around estimate. This way there wouldn't be a hard fork. This is only theoretical; I am not sure that at the mathematical level it would be a good idea.

ISRBitcoin11 (OP)
Newbie
Activity: 4 | Merit: 19
May 27, 2022, 08:18:55 AM
#8

Thanks for the reply.
I agree with most of the things that you said.

Quote
This is not the most correct assumption.
First of all the hashrate increase could be because someone made a much more efficient hardware to mine bitcoin (in simple terms a better ASIC). That doesn't have to mean more adoption.
Another reason could be that a very cheap source of electricity were found and more miners got on board in that place. For example the electricity price for home users is ridiculously low where I live ($0.002) but people generally don't know or care about bitcoin mining not to mention that the government regulation that forces any bitcoin miner to register and pay a higher electricity price (equal to export price which is about $0.05 I believe).
Imagine if that changed, and the ASIC production wasn't a bottleneck. The hashrate would shoot up to the moon.

On the other hand we saw how the price crash accompanied with China banning mining and more price crash could lead to hashrate dropping a lot.

In both of these 2 cases everything else about the network remains the same. It takes the same amount of time for nodes to download and verify each block. The number of nodes remain the same, etc. while hashrate jumps up or down.
So, as you can see hash rate is not a good characteristic to use in order to determine the block size.

I totally agree with the China example. I mentioned it as well, as a problem to be solved. Conceptually speaking, we don't need the block size to change every 2016 blocks (roughly two weeks). It could change once a month or even less often (the interval should be picked more carefully, of course). If so, the algorithm could collect many calculations from different nodes over, let's say, a month, and take some sort of average in order to represent a general trend in the network.
Now, this all rests on the claim that there is a good enough relation between the hash rate and the transaction volume.
I went to check some data to get a more visual picture.
I checked these two charts:
https://ycharts.com/indicators/bitcoin_transactions_per_day
https://www.blockchain.com/charts/hash-rate

I have seen two significant things:
     1) It appears that there is no linear relation between the hash rate and the transactions per day.
         One thing that I think should be taken into account is that today we are already facing the consequences of the scalability issue. Meaning that I think it's possible to say, with at least a little confidence, that if
         transaction fees weren't as high, we would see many more transactions occurring. So it's possible that the data we are seeing today is "manipulated" by the problem itself.
      2) When major changes in trends occur, they show up in both the hash rate and the transaction 'demand'. You can look at the end of June 2021, end of April 2021, end of October 2020, end of May 2020,
          and end of March 2020. Although not identical, the downward trend is similar. It should be said that there may be a third factor I didn't take into account that is affecting both of these rates.

So to sum this one up - I don't believe the relation between the two factors has to be linear. It could be inverse, fractional, etc.
For example, a constant factor could be added to the formula; let's name it, for fun, the BS factor (block size). This factor would express the relation we agree on between the two indicators. It could be negative, a fraction, etc.
The role of this factor is to moderate or amplify the calculated value, toward whatever the network decides best represents its needs. The factor can also be changed over time if needed.

I also agree that the hash rate indicator might not be the perfect one to use - I chose it because it was the most logical one I knew of and thought could work. There might be another indicator that works better.
But overall, I am optimistic about the ability to find a relation that is good enough and that will enable us to change the block size according to need.


Quote
Bottom line is that we can never reach the desired scalability using block size ever simply because there is always a cap and the block could be filled whether with legitimate transactions or spam ones. But with a second layer that doesn't have the same limits, we can achieve a lot of scaling.

I agree and disagree with you on this one. The 'theoretically' unlimited transaction capacity is also achievable with changing block sizes (because you can keep them growing as much as you need). Like the LN, the only limitations are those we and the technology impose.
Another deal breaker for me is that the LN is off-chain. The Bitcoin network is some kind of miracle. I believe we should only move to layer-2 solutions after we are 100000% sure that there is nothing else we can do on-chain.
The LN is operated by a company, which is prone to failure, as we all know and have suffered even with big corporations such as Facebook. What if someone hacks the company? What if it fails? Or anything else disastrous happens? The beauty of bitcoin is that it operates without needing anything except a few nodes.
This can be an issue for the LN. For example, if I am in the US and I want to pay someone in Japan, I need one of two scenarios: either we open a direct channel (which is not efficient, since you pay double fees), or a long chain of people on the LN is active, and if one of those links closes I might not be able to perform the transaction.
Another thing is that on the Bitcoin network I can send coins to a wallet even when the other party isn't online. They will just see the coins the next time they connect to the internet.
On the LN, both parties have to be online in order to make and receive transactions (they have to have the channel active). This might not be a huge deal, but we already have a system that works seamlessly. So why downgrade ourselves?

I am curious to hear your thoughts about these things.






ABCbits
Legendary
Activity: 3094 | Merit: 8176
May 27, 2022, 11:35:13 AM
Merited by pooya87 (2), NeuroticFish (1)
#9

While your approach has some flaws, which have already been mentioned, I appreciate your effort to think up a new approach and write it up neatly.

I think the variable-blocksize method has already been proposed/attempted before. Possibly Segwit2x for example. Or maybe I'm just imagining about that one.

Definitely not SegWit2x, but I think you are referring to BIP 104, 106 and 107.

The thing is when the blocksize is based on a difficulty, there's a chance for the blocksize to rise exponentially in proportion to difficulty without ever going down. This would keep the size at e.g. 100MB for a 100x increase in difficulty.

Even if someday, disks do get that kind of capacity to store a blockchain with that large blocks (and not eat up space from everything else), network speeds and internet prices would still be a bottleneck to downloading such a chain.

CPU and RAM would be a bigger concern than internet speed/price.

I agree that the difficulty isn't really a measure for transaction demand. However, and I might also be wrong about this one, I believe that a relation (not necessarily a linear one) might exist.
I went to check some data to see something more visual.
I checked these 2 tables:
https://ycharts.com/indicators/bitcoin_transactions_per_day
https://www.blockchain.com/charts/hash-rate

I have seen 2 significant things:
     1) It appears that there is no linear relation between the hashing rate and the transactions per day.
         One thing that I think should be taken into account, is that today we are already facing the scalability issue's consequences. Meaning that I think it's possible to say with just a little of confidence, that if
         transaction fees weren't as high - we would see much more transactions occurring. So basically saying that it might be possible that the data we are seeing today is "manipulated" by the problem itself.
      2) When major changes in trends occur, it is seen in both the hashing rate and the transaction 'demand'. You can look at the end of June 2021, end of April 2021, end of October 2020, end of May 2020,
          end of March 2020. Although not identical, the downwards trend is similar. It should be said that it's possible that there a 3rd factor I didn't take into account that is effecting both of these rates.

So to sum this one up - I don't believe that the relation between the two factors should be linear. It could be inverted, divided, and etc.

1. Correlation doesn't necessarily imply causation.
2. Have you used other data to compare against Bitcoin transactions per day? IIRC, big Bitcoin price swings correlate more strongly with transactions per day, since holders either buy and withdraw from an exchange or deposit to an exchange and sell.

pooya87
Legendary
Activity: 3668 | Merit: 11103
May 28, 2022, 04:30:42 AM
#10

The LN is operated by a company,
The Lightning Network is another peer-to-peer network running on top of bitcoin. It is neither centralized nor controlled by any company.

Quote
Another thing is that on the bitcoin network, I can even send the coins to a wallet, when the other party isn't online. He will just see the coins next time he connects to the internet.
On the LN, both parties have to be always online in order to make and receive the transactions (they have to have the channel active). This might not be a huge deal, but we already have a system that works seamlessly. So why downgrade ourselves?
There are ways to receive a payment without being online, but ignoring that, LN still provides good utility. Imagine you want to pay your bills, deposit/withdraw bitcoin to/from your exchange, purchase something from a merchant, etc. In all these cases the receiver is already online and has an open channel, and you too are online with an open channel. All you have to do is make your payment.

In other words a lot of transactions that were happening on chain could happen on LN, freeing up on-chain space.

BlackHatCoiner
Legendary
Activity: 1736 | Merit: 8448
May 28, 2022, 06:01:19 AM
#11

But how is this better than just raising the block size? It's more complicated and therefore leaves room for possible failures. If your approach to solving scalability has to do with the block size, simply increase it. Don't mess with variables such as difficulty, which exist solely to determine something else.

This concept is based on one major assumption - in general (not in a specific date or short period of time), more hashing power means one of two things:
     1) More people are joining the network
     2) The same people/mining farms etc. got more power
As pooya said above, there are more factors that determine difficulty. There have been times in the past when the price was dropping while the difficulty was increasing.

NotATether
Legendary
Activity: 1820 | Merit: 7476
May 29, 2022, 04:10:16 AM
#12

Thanks for replying. I totally agree and that's exactly what I was aiming in the 1st disadvantage I mentioned. This might happen, but, and I will say this very carefully since I am not qualified yet to create any kind of formulas, but I believe that an adjusting constant factor could be added into the calculation - to soften increases and decreases in block size. Firstly it would create a somewhat moderate increase, and even giving a limit for the max block size so it cannot go way higher than today's technology could handle efficiently (that of course throughout time could be adjusted again as times are changing).

Try multiplying blocksize by the base-10 logarithm of difficulty.

i.e. log10(difficulty)

That way, difficulty can be in the exahash range, as it is now, and the factor would still be no higher than 15-18, i.e. a 15 MB-18 MB final size for that difficulty.
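
One way to read that suggestion as a formula (a toy sketch; the base size and the lower bound below are my own placeholder choices):

Code:
import math

# Toy sketch of a log-scaled cap: 1 MB times log10(difficulty).
# At a difficulty around 8e13 this comes out near 14 MB, and even a further
# 100x difficulty increase would only add about 2 MB.

def log_scaled_size(difficulty: float, base_size: int = 1_000_000) -> int:
    return int(base_size * math.log10(max(difficulty, 10.0)))

print(log_scaled_size(8e13))   # ~13.9 MB
print(log_scaled_size(8e15))   # ~15.9 MB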

rlirs
Newbie
Activity: 13 | Merit: 0
June 01, 2022, 12:21:40 AM
#13

Why not make the block size simply depend on some fraction of the mempool size, say 1/10? Add a field to each block recording what the miner sees as its mempool size, then average it over the last two weeks. Then approximately every transaction in the mempool would get included within the next 10 blocks.
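
A minimal sketch of how I read this suggestion (the field and the rule below are hypothetical, not part of any real block format):

Code:
# Hypothetical rule: each block carries the miner's claimed mempool size (bytes);
# the next period's cap is one tenth of the average claim over the last 2016 blocks.
WINDOW = 2016
FRACTION = 10

def next_size_cap(claimed_mempool_bytes: list) -> int:
    window = claimed_mempool_bytes[-WINDOW:]
    return sum(window) // len(window) // FRACTION

# 2016 claims of ~40 MB each would yield a ~4 MB cap for the next period.
print(next_size_cap([40_000_000] * 2016))  # 4000000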
odolvlobo
Legendary
Activity: 4522 | Merit: 3426
June 01, 2022, 12:55:35 AM
#14

Why not make the block size simply depend on some fraction of mempool size...

Simply stated, there is no "the mempool". Every node has a mempool and every mempool is assumed to be different.

rlirs
Newbie
Activity: 13 | Merit: 0
June 01, 2022, 02:25:11 AM
#15

Why not make the block size simply depend on some fraction of mempool size...

Simply stated, there is no "the mempool". Every node has a mempool and every mempool is assumed to be different.

Of course every miner will set the mempool size to a different value before mining the next block. That is why I said the mempool as the miner (who produces the next block) sees it. Once a block is mined, that value becomes the mempool size for the blockchain at that block height.
odolvlobo
Legendary
Activity: 4522 | Merit: 3426
June 01, 2022, 03:54:31 AM
Merited by ABCbits (1)
#16

Why not make the block size simply depend on some fraction of mempool size...
Simply stated, there is no "the mempool". Every node has a mempool and every mempool is assumed to be different.
Of course every miner will set mempool size to a different value before mining the next block.
A node cannot validate the size of the miner's mempool, and if the miner can choose any size, then there is no point.

rlirs
Newbie
Activity: 13 | Merit: 0
June 01, 2022, 04:37:17 AM
#17

Why not make the block size simply depend on some fraction of mempool size...
Simply stated, there is no "the mempool". Every node has a mempool and every mempool is assumed to be different.
Of course every miner will set mempool size to a different value before mining the next block.
A node cannot validate the size of the miner's mempool, and if the miner can choose any size, then there is no point.
There is no need to validate a precise value that will change anyway. When averaging over 2016 blocks, the result will get close to the average mempool size that other nodes see.
pooya87
Legendary
Activity: 3668 | Merit: 11103
June 01, 2022, 04:40:43 AM
#18

There is no need to validate any precise value that will change anyway. When averaging over 2016 blocks the result will get close to average mempool size that other nodes see.
You can't even measure the mempool size at older blocks to check whether the claimed value was close or not, because, as I said in my post above, the mempool is not stored anywhere. Even if you could, when calculating a target we aren't looking for a number that is close to the correct one; we are looking for exact values.

odolvlobo
Legendary
Activity: 4522 | Merit: 3426
June 01, 2022, 11:48:19 AM
#19

Why not make the block size simply depend on some fraction of mempool size...
Simply stated, there is no "the mempool". Every node has a mempool and every mempool is assumed to be different.
Of course every miner will set mempool size to a different value before mining the next block.
A node cannot validate the size of the miner's mempool, and if the miner can choose any size, then there is no point.
There is no need to validate any precise value that will change anyway. When averaging over 2016 blocks the result will get close to average mempool size that other nodes see.
There is no point in having a requirement that does not have to be followed.

rlirs
Newbie
Activity: 13 | Merit: 0
June 01, 2022, 01:31:14 PM
#20

There is no need to validate any precise value that will change anyway. When averaging over 2016 blocks the result will get close to average mempool size that other nodes see.
There is no point in having a requirement that does not have to be followed.

Additionally, it opens the opportunity to:
1. Bloat the blockchain (by claiming a big mempool).
2. DoS attack Bitcoin nodes (by claiming a big mempool).
3. Cause network congestion (by claiming a small mempool).

Using the average mempool size is no different from using the average difficulty. Some blocks are mined in much less than 10 minutes, some in more than 10 minutes, but the average value comes out fine. In addition, there is no financial incentive for miners to attack the blockchain.