Author Topic: Luke Jr's 300kb blocks  (Read 885 times)
gmaxwell
February 13, 2019, 03:06:11 PM
Merited by 1Referee (1), BobLawblaw (1)
 #21

Quote
and later to earn a pretty penny by scooping up routing fees
That's almost certainly not the case. The only time when fees could at all be high is when there aren't many people doing it.  What we've seen so far in lightning (and previously in joinmarket) is that fees rapidly race to pretty low values in competition.

Quote
and Lightning provides that incentive, especially in form of these plug-n-play physical nodes.
At the moment, but eliminating any need to run a node is a major focus of development effort for lightning developers.

There is an inherent incentive: radically improved security and privacy.  But it's only enough to overcome a certain (low) level of cost... thus the concern about managing that cost.

As an aside, a lot of that "node hardware" being sold won't keep up for that long due to limited memory/storage/speed.
1Referee
February 13, 2019, 03:53:33 PM
 #22

Quote
That's almost certainly not the case. The only time when fees could at all be high is when there aren't many people doing it.  What we've seen so far in lightning (and previously in joinmarket) is that fees rapidly race to pretty low values in competition.
Definitely agree with more competition resulting in lower fees, but it comes down to transactional volumes in the end. A node sitting on many hops can make a pretty penny (pretty enough to keep running the node) after a month or so. I strongly believe that Lightning is capable of that with enough adoption.
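As a rough back-of-envelope sketch of that claim (every number below — routed volume, fee rates, payment count — is an assumption for illustration, not a figure from this thread), routing income scales with routed volume times the fee rate, which is why volume matters more than the per-hop fee:

Code:
# Hypothetical Lightning routing-income estimate (all inputs are assumptions).
daily_routed_volume_btc = 0.5      # BTC routed through the node per day (assumed)
fee_rate = 0.0005                  # 0.05% proportional routing fee (assumed)
base_fee_btc = 0.000001            # 1 sat base fee per forwarded payment (assumed)
payments_per_day = 200             # number of forwarded payments per day (assumed)

daily_income_btc = daily_routed_volume_btc * fee_rate + payments_per_day * base_fee_btc
monthly_income_btc = daily_income_btc * 30
print(f"~{monthly_income_btc:.5f} BTC/month at these assumed volumes")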

Quote
As an aside, a lot of that "node hardware" being sold won't keep up for that long due to limited memory/storage/speed.
That's a valid concern. These physical nodes indeed have a shelf life, which I seem to have ignored. Thanks for pointing that out.
aliashraf
February 13, 2019, 04:31:19 PM
 #23

My final take on this thread:

Do something, anything, about fast sync, not in Luke's approach but with his spirit: No SPVs, more full nodes.

Since the OP started this thread I've been banging my head over and over again.


jubalix
February 17, 2019, 12:06:11 AM
 #24

I saw this as well.

Is his argument that "full nodes are dropping" and that he does not want centralisation, in order to keep the network strong?

Where can we get a figure on how many full nodes there are? I saw Coin Dance had node counts but could not see the full nodes specifically.




jubalix
February 17, 2019, 12:10:20 AM
Last edit: February 17, 2019, 09:29:04 AM by jubalix
 #25

Quote
It's a really old proposal. I thought it was trying to be too smart/subtle.


The idea was to reduce to 300kB base size, but also set a graduated increase schedule, based on absolute block heights (the 300kb step was set to take place at a blockheight back in 2017 IIRC). It finally reached 1MB base size again in 2024, and continued at a percentage rate (also IIRC). In other words, if the proposal was adopted today, we'd be past the 300kB stage already.

This was partly a psychologically based proposal, which is why people reacted badly, lol. I think Luke knew that 300kB base size would get laughed off, but he figured that since the blockchain grows constantly, that the closer we get to 2024 (when 1MB base would be reached again), the more people might begin to realise that reducing from the 1MB base size was smarter than it sounded back when the blockchain was a more manageable size.

Note that all of this is using base block figures, the real possible block size would be x2-4 the base block (so 300kb would in fact be 600-1200kB if all transactions in a given block are segwit txs).

Bear in mind that as we're still not in 2024, Luke's plan may actually work, and reducing the base from 1MB to whatever the schedule would be stepped to now (which has of course increased beyond 300kB) might look good to some people. It would probably still take some convincing, but there's still 5 years left on the clock.


OK, if this is the case it's identical to my argument for some sort of increase over time at some rate ax^n, where n is 0.05 or some such. You could probably work n out as a function of blockspace, network load, usage, user base (estimates, of course) and the tech improvement curves for HD space and bandwidth.

Edit

Sorry, I mean the logistic function (S-curve), $f(x) = \frac{L}{1 + e^{-k(x - x_0)}}$, for block size, and I have been saying that now for about 2 years.
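For concreteness, here is a minimal sketch of what such an S-curve schedule could look like as a function of block height. The ceiling L, midpoint x0 and steepness k below are illustrative assumptions only, not part of any actual proposal:

Code:
import math

def logistic_block_size(height, L=8_000_000, k=0.00001, x0=1_000_000):
    """Hypothetical base block size (bytes) at a given block height.

    f(x) = L / (1 + e^(-k*(x - x0))): starts well below L, ramps up around
    height x0, and flattens out at the ceiling L instead of growing forever.
    All parameters here are illustrative assumptions, not a real proposal.
    """
    return L / (1 + math.exp(-k * (height - x0)))

for h in (600_000, 800_000, 1_000_000, 1_200_000, 1_400_000):
    print(h, round(logistic_block_size(h)), "bytes")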

jubalix
February 17, 2019, 12:13:42 AM
 #26

It's not as bad as it seems. If you're running a decent setup, the sync time is pretty reasonable actually. I set up a second node myself earlier this year, and it synced in like 11-12 hours, and that while I expected it to take a day at least. RPI's are a different story, but then again, run a decent setup and you don't have these problems.

In the end, the average person won't run a node even with a very small block size. Why? They just don't give a fuck. People who do give a fuck, and merchants, will continue to spec out their hardware to run their node in the most stable possible manner.
Are you arguing like "it is just fine, people should sit on the backbone (like me) and boot in half a day if they got real incentives, being a bitcoin whale (like me)" ? And you expect Luke to appreciate your argument and back-off?  Grin

Average users have incentives to join, like you and other bitcoin whales Tongue they just can't and it is getting worse as the time passes. A UTXO commitment/reconciliation protocol could change the scene radically, imo.



What LJr really wants to do is be PeerCoin

cellard
February 17, 2019, 05:15:56 AM
 #27

Quote from: jubalix on February 17, 2019, 12:06:11 AM
I saw this as well.

Is his argument that "full nodes are dropping" and that he does not want centralisation, in order to keep the network strong?

Where can we get a figure on how many full nodes there are? I saw Coin Dance had node counts but could not see the full nodes specifically.





I don't see how his proposal would make people suddenly make the effort to run full nodes. The current growth of the blockchain is not that big of a deal with the current settings. My data folder is around 235 GB. A 4TB drive is pretty cheap these days, so it should have you covered for years, and during those years I assume that disk sizes will keep growing as well.

How realistic is it that hosting the blockchain on a single drive becomes impossible? I don't see it happening at the current, roughly linear growth.
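To put rough numbers on that (the average block size and drive size below are assumptions; the 235 GB figure is the one quoted above), the chain adds on the order of 50-60 GB per year at full 1 MB base blocks, a bit more with heavy witness data:

Code:
# Back-of-envelope chain-growth estimate; the inputs are assumptions.
blocks_per_year = 6 * 24 * 365          # ~52,560 blocks at one block per 10 minutes
avg_block_mb = 1.2                      # assumed average block size incl. witness data
current_chain_gb = 235                  # figure quoted above
drive_gb = 4000                         # 4 TB drive

growth_gb_per_year = blocks_per_year * avg_block_mb / 1000
years_of_headroom = (drive_gb - current_chain_gb) / growth_gb_per_year
print(f"~{growth_gb_per_year:.0f} GB/year -> ~{years_of_headroom:.0f} years before a 4 TB drive fills")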

Maybe it would be cool to have a way to host the blockchain across different HDDs. I don't see why this wouldn't be possible; the client just needs to know where it left off in the last file to keep downloading and validating onto the next drive. The entire blockchain would still be hosted, so it would count as a full node.

Anyway, my point was that his 300kB idea will not change people's minds about making the effort to run a full node. It's a matter of mentality, not whether we have 1MB or 300kB. The difference is not that big of a deal imo. People without the right mentality to run a full node will stay on Electrum or whatever.
jubalix
February 17, 2019, 09:34:23 AM
 #28

Quote from: gmaxwell on February 13, 2019, 03:06:11 PM
... snip ...

I have asked this before, and not really had a straight answer: why should the block size not be increased with an S-curve function? (https://en.wikipedia.org/wiki/Logistic_function) If I recall, you put me to rights on the demand-side issue, and sort of said yeah, maybe... but my memory is sketchy and it was a long time ago.

I would like a square answer as to why not.

It could end the whole blocksize issue forever and a day, with almost nil impact on the current Core philosophy.

What is the negative technical argument?

Or any negative argument to this at all?


cellard
February 18, 2019, 03:43:17 AM
 #29

Quote from: jubalix on February 17, 2019, 09:34:23 AM
... snip ...
I have asked this before, and not really had a straight answer: why should the block size not be increased with an S-curve function?



I don't think anything but linear is safe... you don't really know how hardware will progress over time, how much it will cost and so on. I don't see any solution to so-called "on-chain scaling"; that's why Bitcoin has become de facto digital gold and not something that can be used realistically at scale (as in global usage) on-chain. This doesn't mean research in the field should stop, you never know... however, what's clear is that most of the effort in Bitcoin should be spent reviewing already-existing code rather than on more exotic stuff. I mean, Core had that inflation bug recently while other clients weren't affected. So Luke should be reviewing code instead of attempting stuff that will probably never have any consensus anyway.
jubalix
February 24, 2019, 02:06:05 AM
 #30

Quote from: cellard on February 18, 2019, 03:43:17 AM
I don't think anything but linear is safe... you don't really know how hardware will progress over time, how much it will cost and so on.
... snip ...

Would you agree that if usage is up, HD space cost goes down, and bandwidth cost goes down, then we are effectively seeing the 1MB becoming smaller, for no reason?

I.e. we can afford larger blocks, at least to the extent that bandwidth and HD space costs fall and CPU power per $ goes up?
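Put as a one-line estimate (assuming full 1 MB base blocks and counting storage only; bandwidth scales the same way): with $p(t)$ the storage price in \$/GB at time $t$, the yearly cost of keeping up with the chain is roughly

$$C(t) \approx \underbrace{52{,}560}_{\text{blocks/yr}} \times 0.001\,\text{GB} \times p(t) \approx 53\,\text{GB} \times p(t),$$

so a fixed 1 MB cap gets cheaper, i.e. effectively "smaller", every year that $p(t)$ falls.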




Wind_FURY
February 24, 2019, 08:23:08 AM
 #31

Quote from: jubalix on February 24, 2019, 02:06:05 AM
... snip ...
Would you agree that if usage is up, HD space cost goes down, and bandwidth cost goes down, then we are effectively seeing the 1MB becoming smaller, for no reason? I.e. we can afford larger blocks, at least to the extent that bandwidth and HD space costs fall and CPU power per $ goes up?


Have you recently tried doing the 200GB+ initial blockchain download? It is a pain. It might be easy with your bandwidth, but not all Bitcoin users will have access to high bandwidth, or be able to upgrade to higher bandwidth. I believe they would quit.
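For a sense of scale (the connection speeds below are assumptions; the ~200 GB chain size is the figure mentioned above), the raw download time alone, before any of the validation work that dominates on slow hardware, is roughly:

Code:
# Rough initial-block-download time at different connection speeds (assumed values).
chain_gb = 200                    # approximate chain size mentioned above
for mbit_per_s in (5, 20, 100):   # assumed download speeds in Mbit/s
    hours = chain_gb * 8000 / mbit_per_s / 3600
    print(f"{mbit_per_s:>3} Mbit/s -> ~{hours:.0f} hours just to download {chain_gb} GB")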

BitcoinFX
February 24, 2019, 10:40:08 AM
 #32

Quote from: aliashraf on February 13, 2019, 04:31:19 PM
My final take on this thread:

Do something, anything, about fast sync, not in Luke's approach but with his spirit: No SPVs, more full nodes.

Since the OP started this thread I've been banging my head over and over again.




... snip ...

And apparently shrinking the blocksize is a solution now  Cheesy

Whereas the alternative solution is to make money (a digital cash!) 'heavier'? ...

- https://news.mlh.io/i-hacked-the-middle-out-compression-from-silicon-valley-06-16-2015

"... Please let me know if I overlooked anything that could make me a member of the Three Comma Club. I want a boat. And doors that open vertically..."

- https://www.hoover.org/research/middle-out-economics

"... in which he advanced a middle-out thesis for economic growth: “The fundamental law of capitalism is, if workers don’t have any money, businesses . . . don’t have any customers.” ..."

  Roll Eyes

   Cheesy

"Bitcoin OG" 1JXFXUBGs2ZtEDAQMdZ3tkCKo38nT2XSEp | Bitcoin logo™ Enforcer? | Bitcoin is BTC | CSW is NOT Satoshi Nakamoto | I Mine BTC, LTC, ZEC, XMR and GAP | BTC on Tor addnodes Project | Media enquiries : Wu Ming | Enjoy The Money Machine | "You cannot compete with Open Source" and "Cryptography != Banana" | BSV and BCH are COUNTERFEIT.
cellard
April 02, 2019, 03:19:17 AM
 #33

Quote from: jubalix on February 24, 2019, 02:06:05 AM
... snip ...
Would you agree that if usage is up, HD space cost goes down, and bandwidth cost goes down, then we are effectively seeing the 1MB becoming smaller, for no reason? I.e. we can afford larger blocks, at least to the extent that bandwidth and HD space costs fall and CPU power per $ goes up?





Yes, I agree that we could afford doubling the blocksize right now and it would be far from the end of the world. However, the main point being discussed here by those who consider all the game theory involved is: HOW do you make a blocksize increase without ending up in a clusterfuck of two competing "Bitcoins", with all the drama that always carries (exchanges listing one or the other, the price crashing, miners speculating with hashrate, everyone claiming they own the real Bitcoin...)? Because of that, I don't see how hard forks are possible anymore at all; no matter what the hard fork is about, there wouldn't be enough consensus, so you would end up with two competing coins.
Carlton Banks
April 02, 2019, 08:43:55 AM
 #34

I don't see how hard forks are possible anymore at all

There are a lot of other hardfork changes that are totally non-controversial, so you're exaggerating.

spartacusrex
April 03, 2019, 10:35:13 AM
 #35

Can I be cheeky and say: this problem has already been fixed? We just need to survive until the fix is implemented.

(I'm a programmer, not a cryptographer. Slap me down if I'm in error.)

It's all to do with the new zk-STARKs. Like zk-SNARKs, but a faster, smaller, better, hash-based, quantum-secure version?

They still take hours to compute and are still far too large, but that'll be 'fixed'. The pace of improvement is just going so fast at the moment Smiley

When it is (in the next 10-20 years), we'll be able to use a recursive, fixed-size zero-knowledge proof that proves the latest block is valid, that it is linked to its parent, that the parent's proof is valid, and the cumulative PoW...  Roll Eyes

Some of this tech is already out there in various PoPoW (Proof of Proof-of-Work) forms using the original SNARKs.

So, in 51 years, you'll have a zk-proof that the last 50 years of the blockchain are valid, with the cumulative total PoW, in about 10-20 MB, and then the normal chain for the last year.
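A conceptual sketch of the data flow being described (this is only a toy illustration of the recursion structure: ChainProof and prove_step below are hypothetical placeholders, not a real STARK prover, and nothing here performs actual zero-knowledge proving):

Code:
from dataclasses import dataclass

@dataclass
class ChainProof:
    tip_hash: bytes        # hash of the latest proven block header
    cumulative_work: int   # total proof-of-work claimed up to the tip
    proof: bytes           # recursive ZK proof attesting to everything above

def prove_step(prev: ChainProof, new_header_hash: bytes, new_work: int) -> ChainProof:
    """Hypothetical: a real system would run a (zk-STARK) prover here, showing that
    (a) the new block is valid and links to prev.tip_hash, and
    (b) prev.proof itself verifies, so one constant-size proof covers the whole chain."""
    new_proof = b"<recursive proof over prev.proof and the new block>"  # placeholder only
    return ChainProof(new_header_hash, prev.cumulative_work + new_work, new_proof)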

jubalix
April 03, 2019, 01:25:33 PM
 #36

Quote from: gmaxwell on February 13, 2019, 03:06:11 PM
... snip ...

S-curve: why not?

