Bitcoin Forum
Pages: « 1 2 3 [4]  All
Author Topic: blocksize solution: longest chain decides  (Read 2113 times)
Peter R
Legendary
*
Offline Offline

Activity: 1162
Merit: 1007



View Profile
September 04, 2015, 09:41:23 PM
 #61

The hassle of a hard fork is an important feature of the process, not a burden.
Ever wondered why evolutionists cannot wrap their heads around the fact that sexual reproduction consumes so much energy?
Well, here is your answer. Erecting the new 8Mb limit is definitely pointing you in the right direction, Peter R. Wink

Haha I think you're right!

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
poeEDgar
Sr. Member
****
Offline Offline

Activity: 299
Merit: 250



View Profile
September 04, 2015, 10:33:25 PM
Last edit: September 05, 2015, 07:03:12 AM by poeEDgar
 #62

I'm disappointed that no one is taking this opportunity to discuss solutions to spam attacks (dust transactions w/ maximum outputs). That's actually the issue that is forcing this debate -- not organic growth in transaction volume, which on average, is nowhere near the limit.

I'll admit that I'm not entirely familiar with the issue.  Can you give an example?

I thought there was a dust limit of 5500 satoshis. Is that set by the miners by consensus, or is it part of the protocol?

AFAIK, the minimum is slightly less than 550 satoshis, or < $0.0013. I believe the last Coinwallet stress test involved outputs of 0.00001 -- approximately double the current definition of dust.

In theory, nodes could observe the standard output of stress test transactions, then simply alter their conf files to not relay transactions with outputs of that size (or smaller). The only loss here is that people cannot quickly send $0.002 transactions. The gain is that above the agreed upon dust thresholds, the standard fee should be adequate (unless I am approaching this incorrectly).

It's not a protocol issue. It affects which transactions an unmodified bitcoind/bitcoin-qt client will relay on the network. Miners can put as many non-standard transactions in their blocks as they want, but without further modification the reference client will not broadcast or relay those transactions to the miners. It's completely configurable. If the current definition of dust is not economical, miners/nodes can just change their conf files.
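As a concrete (hedged) sketch of what "just change their conf files" might look like: the 2015-era reference client derived its dust cutoff from the relay fee floor, so raising minrelaytxfee in bitcoin.conf raises the dust line along with it. The value below is illustrative, not a recommendation:

```
# bitcoin.conf
# Default at the time was 0.00001 BTC/kB. Raising the relay fee floor
# makes the node classify larger outputs as dust and decline to relay
# transactions containing them.
minrelaytxfee=0.00005
```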

One idea would be to increase the client-coded threshold that defines an output as dust, which would increase the aggregate amount of bitcoins required to push large spam transactions with dust outputs. Another idea would be to require an additional fee to push transactions on a per-dust-output basis.
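To make the first idea concrete, here is a minimal sketch of how the reference client's dust rule works, assuming the 2015-era default minrelaytxfee of 0.00001 BTC/kB (1000 satoshis per 1000 bytes) and typical P2PKH sizes; the function name and parameters are illustrative, not the client's actual API. An output is dust if spending it would cost more than a third of its value:

```python
def dust_threshold(output_size=34, input_size=148, min_relay_fee_per_kb=1000):
    """Smallest non-dust output value, in satoshis.

    output_size:          serialized size of a P2PKH output (bytes)
    input_size:           approximate size of the input that later spends it (bytes)
    min_relay_fee_per_kb: node's relay fee floor, satoshis per 1000 bytes
    """
    total_bytes = output_size + input_size            # bytes the output adds overall
    fee_to_spend = total_bytes * min_relay_fee_per_kb // 1000
    return 3 * fee_to_spend                           # dust if value < 3x spend cost

print(dust_threshold())                               # 546 satoshis with the defaults
print(dust_threshold(min_relay_fee_per_kb=5000))      # raising the fee floor: 2730
```

This matches the "slightly less than 550 satoshis" figure above, and shows why bumping the relay fee floor raises the aggregate cost of pushing spam transactions with dust outputs proportionally.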

The issue has other implications in regards to bloat.

Well, no. You already know that dust has nothing to do with blockchain size but EVERYTHING to do with UTXO bloat. Dust (or anything with a probability of being dust approaching 1) won't be spent, so it remains in the UTXO set. For normal economic transactions the UTXO set only grows linearly (related more to the number of discrete entities than to total transactions over time). This is good, because the UTXO set is a far more critical resource than the unpruned blockchain. It is highly likely that in the future most full nodes won't even maintain the full blockchain, but they will need the entire UTXO set.

Having tiny, worthless outputs which last forever in the UTXO set lowers the efficiency of every node, forever.

Dust outputs won't get spent. 100BTC outputs will get spent and pruned.
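The asymmetry above can be shown with a toy model (not Bitcoin code; the block sizes and output counts are made up for illustration). Ordinary outputs churn through the set, while dust, which is never economical to spend, only accumulates:

```python
# Toy UTXO-set model: ordinary outputs get spent (and pruned) one step
# after creation; dust outputs are never spent and linger forever.
utxo = set()
next_id = 0

def add_outputs(n, dust):
    global next_id
    created = []
    for _ in range(n):
        utxo.add(next_id)
        created.append((next_id, dust))
        next_id += 1
    return created

def spend(outputs):
    for oid, is_dust in outputs:
        if not is_dust:          # dust is uneconomical to spend, so it stays
            utxo.discard(oid)

pending = []
for _ in range(1000):            # 1000 "blocks"
    spend(pending)               # prune last block's ordinary outputs
    pending = add_outputs(10, dust=False) + add_outputs(2, dust=True)

# 12,000 outputs were created, but the set holds only the 2,000 dust
# outputs plus the 10 ordinary outputs still in flight.
print(len(utxo))                 # 2010
```

Ordinary volume leaves the set roughly constant in size; the dust term grows without bound, which is the "forever" cost being described.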

Moore's Law isn't magic.  It reflects that people push back against wastes of CPU time, disk storage, and other limitations to push technology to its physical limits.  Processor speeds don't just magically increase.  Disk storage doesn't just magically increase.  The assumptions underlying Moore's Law (which isn't a real law) involve bean-counters identifying and eliminating wastes of resources that slow down systems and break them, just as much as they involve people inventing new technologies.

The threat from dust transactions may be on the un-glamorous, bean-counting side of technological improvement, but that doesn't mean it doesn't exist.


Quote from: Gavin Andresen
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.
jonald_fyookball (OP)
Legendary
*
Offline Offline

Activity: 1302
Merit: 1004


Core dev leaves me neg feedback #abuse #political


View Profile
September 05, 2015, 12:34:38 AM
 #63

What's the connection to the spam limit Satoshi introduced as the 1 MB blocksize?

Why is that spam-limit blocksize not that important anymore?

poeEDgar
Sr. Member
****
Offline Offline

Activity: 299
Merit: 250



View Profile
September 05, 2015, 12:42:57 AM
 #64

What's the connection to the spam limit Satoshi introduced as the 1 MB blocksize?

Why is that spam-limit blocksize not that important anymore?

I actually haven't seen an in-depth discussion of this and would love for someone more knowledgeable than me to weigh in.

Quote from: Gavin Andresen
I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.