Author Topic: What's the best solution for the scalability problem?  (Read 1010 times)
Blawpaw (OP)
Legendary
Activity: 1596
Merit: 1027
July 30, 2015, 02:47:04 PM
#1

I was wondering... with a lot of projects being developed to address the BTC scalability problem, what do you think could be the best solution to this widely known issue?
AtheistAKASaneBrain
Hero Member
Activity: 770
Merit: 509
July 30, 2015, 03:01:48 PM
#2

Quote from: Blawpaw on July 30, 2015, 02:47:04 PM
I was wondering... with a lot of projects being developed to address the BTC scalability problem, what do you think could be the best solution to this widely known issue?

In my opinion, after countless hours of reading devs arguing with each other... we don't really know. But what seems most logical to me personally is the Gavin approach, i.e. keep increasing the block size in increments of 8 MB over the next 40 years or so. Technological advancement should provide the space. LN should be added in as well; we need both approaches.
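
For reference, Gavin's actual draft (BIP 101, as I understand it) starts the cap at 8 MB in early 2016 and doubles it every two years for twenty years rather than growing by a flat 8 MB; a minimal sketch of that schedule (illustrative Python, simplified to whole-step doublings) would be:

Code:
# Rough sketch, not the actual BIP 101 code: start the cap at 8 MB in 2016
# and double it every two years for twenty years.
def blocksize_schedule(start_year=2016, start_mb=8, doublings=10, interval_years=2):
    """Yield (year, max block size in MB) for a doubling schedule."""
    size = start_mb
    for i in range(doublings + 1):
        yield start_year + i * interval_years, size
        size *= 2

for year, mb in blocksize_schedule():
    print(f"{year}: {mb} MB")   # 2016: 8 MB ... 2036: 8192 MB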
Blawpaw (OP)
Legendary
Activity: 1596
Merit: 1027
July 30, 2015, 03:19:30 PM
#3

Quote from: AtheistAKASaneBrain on July 30, 2015, 03:01:48 PM
Quote from: Blawpaw on July 30, 2015, 02:47:04 PM
I was wondering... with a lot of projects being developed to address the BTC scalability problem, what do you think could be the best solution to this widely known issue?

In my opinion, after countless hours of reading devs arguing with each other... we don't really know. But what seems most logical to me personally is the Gavin approach, i.e. keep increasing the block size in increments of 8 MB over the next 40 years or so. Technological advancement should provide the space. LN should be added in as well; we need both approaches.

Other than Gavin's approach, I believe we should consider sidechains as a possible way forward. I have heard about the Blockcypher proposal and the Bitcoin Lightning Network proposal as well, which present off-chain solutions to the scalability issue. I think this is also something to be considered!
LiteCoinGuy
Legendary
Activity: 1148
Merit: 1014
In Satoshi I Trust
July 30, 2015, 03:23:23 PM
#4

Increase the block size (we have to do this in any case and will do it, and Satoshi would have done it too)

+ sidechains

+ Lightning Network

+ software improvements

+ things we don't know

BillyBobZorton
Legendary
Activity: 1204
Merit: 1028
July 30, 2015, 03:54:20 PM
#5

Quote from: Blawpaw on July 30, 2015, 03:19:30 PM
Quote from: AtheistAKASaneBrain on July 30, 2015, 03:01:48 PM
Quote from: Blawpaw on July 30, 2015, 02:47:04 PM
I was wondering... with a lot of projects being developed to address the BTC scalability problem, what do you think could be the best solution to this widely known issue?

In my opinion, after countless hours of reading devs arguing with each other... we don't really know. But what seems most logical to me personally is the Gavin approach, i.e. keep increasing the block size in increments of 8 MB over the next 40 years or so. Technological advancement should provide the space. LN should be added in as well; we need both approaches.

Other than Gavin's approach, I believe we should consider sidechains as a possible way forward. I have heard about the Blockcypher proposal and the Bitcoin Lightning Network proposal as well, which present off-chain solutions to the scalability issue. I think this is also something to be considered!

This is the roadmap for a blocksize increase schedule over the course of the following decades:

[image: blocksize increase schedule chart]

I honestly don't see the problem. As time passes, HD storage will get cheaper, so it shouldn't be an issue. Lightning Network is a must as well; there is room for both methods to be used in harmony. We must take over the world and we must retain low fees; anything else is not trying hard enough.
noideacoin
Newbie
Activity: 52
Merit: 0
July 30, 2015, 04:02:56 PM
#6

https://i.imgur.com/SCua8dz.jpg
gentlemand
Legendary
Activity: 2590
Merit: 3015
Welt Am Draht
July 30, 2015, 04:06:15 PM
#7


Quote from: LiteCoinGuy on July 30, 2015, 03:23:23 PM
+ things we don't know


I vote for this. It addresses every single problem so far and it's the only one I feel qualified to comment on.
Blawpaw (OP)
Legendary
Activity: 1596
Merit: 1027
July 30, 2015, 08:10:16 PM
#8

Quote from: BillyBobZorton on July 30, 2015, 03:54:20 PM
Quote from: Blawpaw on July 30, 2015, 03:19:30 PM
Quote from: AtheistAKASaneBrain on July 30, 2015, 03:01:48 PM
Quote from: Blawpaw on July 30, 2015, 02:47:04 PM
I was wondering... with a lot of projects being developed to address the BTC scalability problem, what do you think could be the best solution to this widely known issue?

In my opinion, after countless hours of reading devs arguing with each other... we don't really know. But what seems most logical to me personally is the Gavin approach, i.e. keep increasing the block size in increments of 8 MB over the next 40 years or so. Technological advancement should provide the space. LN should be added in as well; we need both approaches.

Other than Gavin's approach, I believe we should consider sidechains as a possible way forward. I have heard about the Blockcypher proposal and the Bitcoin Lightning Network proposal as well, which present off-chain solutions to the scalability issue. I think this is also something to be considered!

This is the roadmap for a blocksize increase schedule over the course of the following decades:

[image: blocksize increase schedule chart]

I honestly don't see the problem. As time passes, HD storage will get cheaper, so it shouldn't be an issue. Lightning Network is a must as well; there is room for both methods to be used in harmony. We must take over the world and we must retain low fees; anything else is not trying hard enough.

Even so... I don't think that increasing the block size can be the best answer. Even if storage space becomes cheaper, having a 1 GB block size by the year 2032 seems simply unfeasible...
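
For a sense of scale, some back-of-the-envelope arithmetic (hypothetical numbers, assuming every block were full at 1 GB and the usual ~144 blocks per day):

Code:
# Back-of-the-envelope figures for hypothetical 1 GB blocks.
block_size_gb = 1.0
blocks_per_day = 24 * 6                          # one block every ~10 minutes

daily_gb = block_size_gb * blocks_per_day        # ~144 GB of new block data per day
yearly_tb = daily_gb * 365 / 1000                # ~52.6 TB of chain growth per year
sustained_mbit_s = block_size_gb * 8000 / 600    # ~13.3 Mbit/s just to keep up with blocks

print(f"{daily_gb:.0f} GB/day, {yearly_tb:.1f} TB/year, {sustained_mbit_s:.1f} Mbit/s")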
dodgecharger
Hero Member
Activity: 1005
Merit: 500
July 30, 2015, 08:51:28 PM
#9

Increasing the block size needs to be done. The current block size was appropriate for Bitcoin's early days, but as Bitcoin has evolved into more of a payment method, more transactions will occur every second, which means more space is needed in each block.
unamis76
Legendary
Activity: 1512
Merit: 1012
July 30, 2015, 09:00:19 PM
#10

I also think simple block size increases are the best solution, at least for now... Storage and bandwidth will always be one step ahead of the blockchain size and block sizes...
PolarPoint
Hero Member
Activity: 672
Merit: 500
July 30, 2015, 09:22:57 PM
#11

"What is the best solution?" This is the reason why we are stuck in this perpetual argument and consensus is not met. There is no best solution! There are advantage and disadvantages to each solution proposed. Devs, miners and users all see different "best" solution for them.

We don't have time to discuss a "best" solution, we need a quick and easy fix now! I support a quick 2M to 8M limit raise to solve our immediate problem and devs can take another few years to figure out the "best" solution, sidechain or not.
coinpr0n
Hero Member
Activity: 910
Merit: 1000
July 30, 2015, 09:42:09 PM
#12

I lean towards a conservative increase in the block size (2 MB - 4 MB) with the option to raise it as time goes on - this is what most proposals suggest anyway, but I think it should be done sooner rather than later (within reason). I think that 2017 is too far in the future. I support other solutions like LN as well - they aren't mutually exclusive.

VirosaGITS
Legendary
Activity: 1302
Merit: 1068
July 30, 2015, 10:03:44 PM
#13

Wouldn't the block size scale well if it were recalculated as a percentage based on need, the way difficulty is recalculated?

And about a blockchain reaching petabyte size - weren't people talking about making the blockchain smaller by using checkpoints?
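
A toy sketch of what a difficulty-style retarget of the cap could look like (illustrative only, not taken from any proposal or client; the function and its parameters are invented):

Code:
# Toy sketch, not from any real client: recompute the cap at each retarget
# from the median size of the blocks in the previous window, the same way
# difficulty is recomputed from recent timestamps.
from statistics import median

def next_max_block_size(recent_sizes, current_max,
                        growth_factor=2.0,     # cap targets 2x the recent median
                        floor=1_000_000):      # never drop below 1 MB
    """recent_sizes: actual sizes in bytes of the blocks in the last window."""
    target = int(growth_factor * median(recent_sizes))
    target = min(target, current_max * 2)      # at most double per retarget
    target = max(target, current_max // 2)     # at most halve per retarget
    return max(floor, target)

# example: blocks averaging ~900 kB would move a 1 MB cap to ~1.8 MB
print(next_max_block_size([900_000] * 2016, current_max=1_000_000))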


cryptworld
Hero Member
Activity: 714
Merit: 503
July 30, 2015, 10:46:27 PM
#14

The best approach would be to implement an incremental solution, with the block size increasing every year or two.
Bitcoininspace
Sr. Member
Activity: 574
Merit: 250
July 30, 2015, 11:25:05 PM
#15

Wouldn't a system where the block size depends on transaction volume work? It would allow the block size to grow when the network is being spammed while maintaining the 10-minute average confirmation time.
Amph
Legendary
Activity: 3248
Merit: 1070
July 31, 2015, 11:11:52 AM
#16

The best solution would be to integrate an algorithm into the client's code that raises the MB limit when we have saturated it and the transaction queue has become too big and troublesome, like it was during the test. With this, the client would adjust itself under any conditions and there would be no need for future forks.

I'm not a code expert, but I presume such a thing is possible to implement in the code.
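
A toy illustration of that idea (not Bitcoin Core code; the function name and thresholds are invented): raise the cap only when recent blocks have been essentially full and a large backlog has built up in the mempool:

Code:
# Toy illustration, names and thresholds invented: bump the limit only when
# recent blocks have been essentially full and a large transaction backlog
# has built up in the mempool.
def maybe_raise_limit(current_limit, recent_block_sizes, mempool_bytes,
                      full_ratio=0.95, backlog_blocks=20, step=1.2):
    """Return a possibly increased block size limit, in bytes."""
    blocks_full = all(s >= full_ratio * current_limit for s in recent_block_sizes)
    big_backlog = mempool_bytes > backlog_blocks * current_limit
    if blocks_full and big_backlog:
        return int(current_limit * step)       # raise the cap by 20%
    return current_limit                       # otherwise leave it unchanged

# example: ten nearly-full blocks plus a ~25-block backlog raise a 1 MB limit to 1.2 MB
print(maybe_raise_limit(1_000_000, [990_000] * 10, 25_000_000))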
Mickeyb
Hero Member
Activity: 798
Merit: 1000
Move On !!!!!!
July 31, 2015, 02:26:47 PM
#17

Increase the block size limit please! Or even better, change the algo so this happens automatically in the future, so we don't have to go through this s**t again in a few years' time!
LiteCoinGuy
Legendary
Activity: 1148
Merit: 1014
In Satoshi I Trust
July 31, 2015, 02:57:14 PM
#18

Quote from: Mickeyb on July 31, 2015, 02:26:47 PM
Increase the block size limit please! Or even better, change the algo so this happens automatically in the future, so we don't have to go through this s**t again in a few years' time!

We will do that with XT. Be patient, dude. ;)

AtheistAKASaneBrain
Hero Member
Activity: 770
Merit: 509
July 31, 2015, 03:03:45 PM
#19

Quote from: Blawpaw on July 30, 2015, 08:10:16 PM
Quote from: BillyBobZorton on July 30, 2015, 03:54:20 PM
Quote from: Blawpaw on July 30, 2015, 03:19:30 PM
Quote from: AtheistAKASaneBrain on July 30, 2015, 03:01:48 PM
Quote from: Blawpaw on July 30, 2015, 02:47:04 PM
I was wondering... with a lot of projects being developed to address the BTC scalability problem, what do you think could be the best solution to this widely known issue?

In my opinion, after countless hours of reading devs arguing with each other... we don't really know. But what seems most logical to me personally is the Gavin approach, i.e. keep increasing the block size in increments of 8 MB over the next 40 years or so. Technological advancement should provide the space. LN should be added in as well; we need both approaches.

Other than Gavin's approach, I believe we should consider sidechains as a possible way forward. I have heard about the Blockcypher proposal and the Bitcoin Lightning Network proposal as well, which present off-chain solutions to the scalability issue. I think this is also something to be considered!

This is the roadmap for a blocksize increase schedule over the course of the following decades:

[image: blocksize increase schedule chart]

I honestly don't see the problem. As time passes, HD storage will get cheaper, so it shouldn't be an issue. Lightning Network is a must as well; there is room for both methods to be used in harmony. We must take over the world and we must retain low fees; anything else is not trying hard enough.

Even so... I don't think that increasing the block size can be the best answer. Even if storage space becomes cheaper, having a 1 GB block size by the year 2032 seems simply unfeasible...

Why not? That's 10+ years from now. Do you know how much things change over time?

It will be about as annoying as having a 40 GB blockchain is right now, that's all. I don't see the big fuss about it. By 2025 the average computer will come with an HDD big enough to handle it, IMO.
LiteCoinGuy
Legendary
Activity: 1148
Merit: 1014
In Satoshi I Trust
July 31, 2015, 03:06:48 PM
#20

But you must understand that some people have no foresight. ;)
