Bitcoin Forum
Author Topic: The minimum transfer fee is not trivial anymore  (Read 9911 times)
jgarzik (Legendary)
April 02, 2013, 03:08:29 PM  #41

Nothing stops a pool from running a subscription-only bitcoind instance via which users can submit any transaction they want directly.

Certainly.  Eligius will mine non-standard transactions that include a certain, Eligius-set fee.  Because non-standard transactions are not relayed, you must submit such transactions directly to Eligius.

I wish more miners would offer services like this.


Jeff Garzik, Bloq CEO, former bitcoin core dev team; opinions are my own.
Visit bloq.com / metronome.io
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
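
A minimal sketch of what such direct submission could look like, assuming the pool exposes an ordinary bitcoind JSON-RPC endpoint and that you already hold a signed raw transaction in hex. The host, port, and credentials are placeholders, not anything Eligius actually published:

Code:
import requests  # third-party HTTP library

def submit_raw_tx(raw_tx_hex, host="pool.example.com", port=8332,
                  user="rpcuser", password="rpcpassword"):
    """Push a signed raw transaction straight to a miner's bitcoind via
    the standard sendrawtransaction JSON-RPC call, bypassing the normal
    p2p relay network (and its standardness checks)."""
    payload = {
        "jsonrpc": "1.0",
        "id": "direct-submit",
        "method": "sendrawtransaction",
        "params": [raw_tx_hex],
    }
    resp = requests.post("http://%s:%d/" % (host, port),
                         json=payload, auth=(user, password))
    resp.raise_for_status()
    result = resp.json()
    if result.get("error"):
        raise RuntimeError(result["error"])
    return result["result"]  # txid if the node accepted it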
caveden (Legendary)
April 02, 2013, 03:14:21 PM  #42

Because we are a decentralized network. :)  You can instead connect to 20,000 peers, and send one tiny spam transaction through each.

D&T made the same remark, to which I responded above. You'd have to filter which transactions you relay, and then priority vs. fees kicks in. The only difference is that it's not a binary decision (has/hasn't enough fees to get relayed); it's just a way of sorting what gets relayed from what does not. You may end up relaying a transaction that would not be relayed today, if there's enough "room" for it.

1) Memory-limit the memory pool -- the set of transactions waiting in memory, eligible to be included in a block. Matt Corallo has been working on that.  The limit should be a small multiple of the median block size of the last few hundred blocks.

2) Use the same algorithm/parameters/etc for adding transactions to the memory pool that we use to fill blocks.

3) Only relay transactions that fit into your memory pool.  This is the DoS prevention, your transaction won't get relayed if your node doesn't think it will end up in a block soon.

4) Estimate the minimum transaction fee / priority needed to get into a block, based on:
    a) At startup:  the transactions in the last few blocks
    b) If you've been running long enough to "warm up" your memory pool:  transactions in the memory pool

That's nice: better than what is done today and better than what I was saying above. Much more flexible. Nice. :)

There is one more change I'd like to make that is independent; re-define "dust" based on the floating transaction fee (e.g. a dust output is any output with a value of less than 1/4 the minimum fee-per-kb required to get into one of the next 6 blocks).  And make any transactions with dust outputs non-standard, so they're not included in the memory pool or relayed.

Why not apply the same logic you described above? (They only get dropped if they can't fit in your memory pool.)
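
A rough sketch of the kind of estimate item 4 of the quoted plan describes. The fee-rate lists, the "warmed up" flag, and the percentile are all illustrative placeholders, not anything from the actual patch Gavin was working on:

Code:
def estimate_min_feerate(block_feerates, mempool_feerates,
                         mempool_warmed_up, percentile=0.25):
    """Estimate the fee-per-kB needed to get into a block soon.

    block_feerates:   fee rates (BTC/kB) of transactions in the last few blocks
    mempool_feerates: fee rates of transactions currently in the memory pool
    Use the memory pool once it has had time to warm up; otherwise fall
    back to recent blocks, as described in item 4 above.
    """
    feerates = sorted(mempool_feerates if mempool_warmed_up else block_feerates)
    if not feerates:
        return 0.0  # no data at all: no fee pressure
    # Use a low percentile of the observed rates as the cutoff.
    index = min(int(len(feerates) * percentile), len(feerates) - 1)
    return feerates[index]

# Example with made-up numbers: at startup, estimate from recent blocks.
print(estimate_min_feerate([0.0005, 0.0002, 0.001, 0.0004],
                           mempool_feerates=[], mempool_warmed_up=False))
# -> 0.0004 (roughly the 25th-percentile rate among the sampled transactions)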
Gavin Andresen (Legendary), Chief Scientist
April 02, 2013, 05:13:51 PM  #43

Quote
There is one more change I'd like to make that is independent; re-define "dust" based on the floating transaction fee (e.g. a dust output is any output with a value of less than 1/4 the minimum fee-per-kb required to get into one of the next 6 blocks).  And make any transactions with dust outputs non-standard, so they're not included in the memory pool or relayed.

Why not apply the same logic you described above? (They only get dropped if they can't fit in your memory pool.)

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

If I could go back in time, I would go back and try to convince Satoshi to make them non-standard to begin with....

How often do you get the chance to work on a potentially world-changing project?
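
As a worked example of the rule quoted above (an output is dust if its value is below 1/4 of the floating minimum fee-per-kB), a small sketch; the fee rate used in the example is made up, and the estimator it would come from is the hypothetical one sketched earlier:

Code:
def dust_threshold(min_feerate_per_kb):
    """Gavin's proposed rule: an output is dust if its value is less than
    1/4 of the fee-per-kB currently needed to get into one of the next
    few blocks."""
    return min_feerate_per_kb / 4.0

def is_dust(output_value, min_feerate_per_kb):
    return output_value < dust_threshold(min_feerate_per_kb)

# Example: with a floating minimum of 0.0005 BTC/kB, anything under
# 0.000125 BTC would be treated as dust and the transaction as non-standard.
assert is_dust(0.0001, 0.0005)
assert not is_dust(0.0002, 0.0005)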
nonnakip (Hero Member)
April 02, 2013, 06:44:04 PM  #44

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

It is not difficult to spend dust. The problem is how bitcoind chooses coins to include. I modified my bitcoind to ignore "dust coins" when choosing coins for a transaction. Then, after choosing enough non-dust coins, it adds as many "dust coins" to the inputs as possible without pushing the transaction into the next KiB in size. This keeps my wallet relatively dust-free automatically and does not cause extra tx fees.
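
A sketch of the selection strategy described above, assuming a simple stand-in Coin type with a value and a serialized size; the dust threshold, target, and base size are placeholders:

Code:
from collections import namedtuple

Coin = namedtuple("Coin", ["value", "size_bytes"])  # illustrative wallet coin
KIB = 1024

def select_coins(coins, target, dust_threshold, base_tx_size=200):
    """First fund the payment with non-dust coins only, then pack in extra
    dust coins as long as the transaction stays within the same KiB bucket,
    so the swept dust costs nothing in per-KiB fees."""
    non_dust = sorted((c for c in coins if c.value >= dust_threshold),
                      key=lambda c: c.value, reverse=True)
    dust = sorted((c for c in coins if c.value < dust_threshold),
                  key=lambda c: c.size_bytes)

    selected, total, size = [], 0.0, base_tx_size
    for coin in non_dust:
        if total >= target:
            break
        selected.append(coin)
        total += coin.value
        size += coin.size_bytes
    if total < target:
        raise ValueError("insufficient non-dust funds")

    # Sweep dust opportunistically without crossing into the next KiB.
    kib_budget = -(-size // KIB) * KIB   # round size up to the KiB boundary
    for coin in dust:
        if size + coin.size_bytes > kib_budget:
            break
        selected.append(coin)
        total += coin.value
        size += coin.size_bytes
    return selected, total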
DeathAndTaxes (Donator, Legendary), Gerald Davis
April 02, 2013, 10:12:09 PM  #45

Why not apply the same logic you described above? (They only get dropped if they can't fit in your memory pool.)

Because dust has a unique cost.  It will likely never be spent; it just remains part of the UTXO set forever.  So more dust is continually being produced, but none (or very little) of it is being used in new transactions.  Eventually, on a long enough timeline, the UTXO set (the pruned database) will run into hundreds of TBs.  That makes Bitcoin far less scalable than if the UTXO set contained only outputs that will actually be spent.
DeathAndTaxes (Donator, Legendary), Gerald Davis
April 02, 2013, 10:16:39 PM  #46

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

It is not difficult to spend dust. The problem is how bitcoind chooses coins to include. I modified my bitcoind to ignore "dust coins" when choosing coins for a transaction. Then, after choosing enough non-dust coins, it adds as many "dust coins" to the inputs as possible without pushing the transaction into the next KiB in size. This keeps my wallet relatively dust-free automatically and does not cause extra tx fees.

Well, that isn't exactly true.  It is possible that including a dust input lowers the transaction's priority enough to make it low-priority and thus subject to the mandatory minimum fee.  While improving coin selection is a good thing, each input uses roughly 200 bytes, so you can't include many dust outputs without increasing the tx size.   Merely removing one or two dust outputs from the UTXO set every couple of days is like emptying the ocean with a teaspoon; a single martingale SatoshiDice player can generate a thousand or more new dust outputs in an hour or so.

There is no realistic reason why someone would need to send an amount less than the min fee.  It is like mailing a penny to someone (at a cost of 46 pennies).  That small limitation would make the UTXO set more efficient (a higher % of the outputs in it would actually be used in a future tx).  Note that I am not saying coin selection shouldn't improve, or that priority shouldn't take unspent-output reduction into account, but something bigger is needed.
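
A quick back-of-envelope of the size arithmetic above: the ~200 bytes per input is the figure from the post, while the base transaction size is just an illustrative placeholder:

Code:
INPUT_SIZE = 200          # rough bytes per input, per the post
BASE_TX_SIZE = 300        # illustrative size of the rest of the transaction

def dust_inputs_in_slack(tx_size=BASE_TX_SIZE, input_size=INPUT_SIZE):
    """How many extra dust inputs fit before the transaction spills into
    the next fee-relevant KiB bucket."""
    slack = 1024 - (tx_size % 1024)
    return slack // input_size

# With a ~300-byte transaction there are ~724 bytes of slack: room for only
# 3 extra ~200-byte dust inputs before the size (and fee) steps up.
print(dust_inputs_in_slack())  # -> 3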
DeathAndTaxes (Donator, Legendary), Gerald Davis
April 02, 2013, 10:18:04 PM  #47

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

If I could go back in time, I would go back and try to convince Satoshi to make them non-standard to begin with....

Hopefully alt-coin developers take note of the lessons learned.  Or wait, never mind: there is no innovation toward making a better coin; they are just weak "I want to be rich" attempts.  Well, someday someone will try to make a superior coin, and hopefully they will spend 6-12 months studying Bitcoin to actually make a better one.
caveden (Legendary)
April 03, 2013, 06:59:47 AM  #48

Quote
There is one more change I'd like to make that is independent; re-define "dust" based on the floating transaction fee (e.g. a dust output is any output with a value of less than 1/4 the minimum fee-per-kb required to get into one of the next 6 blocks).  And make any transactions with dust outputs non-standard, so they're not included in the memory pool or relayed.

Why not apply the same logic you described above? (They only get dropped if they can't fit in your memory pool.)

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

What they're worth and how they're supposed to be used is not up to any single individual or group to decide.
Dust may be used as a way to send messages. Dust may be used in Smart Property contracts. Whatever.

My point is that what you're making here is a value judgement, and that's not up to the Bitcoin infrastructure to make. It's the users who should choose how they spend their coins. As long as they pay for the service being provided (and whether they're paying enough or not is up to miners to decide), and as long as the infrastructure is protected against attacks (DoS), everything is fine.

Please don't embed value judgements in Bitcoin's backend.

If I could go back in time, I would go back and try to convince Satoshi to make them non-standard to begin with....

I thought it was you who came up with the concept of standard and non-standard transactions... was it there since the beginning?
Mike Hearn (Legendary)
April 03, 2013, 04:27:48 PM  #49

Satoshi invented that concept later.

That said, I think we should be able to get wallets defragmenting themselves. Miners have an incentive to keep the output set small, so there's no reason not to accept transactions that consolidate coins if their CPU is otherwise idle, which most of the time it is. It just needs smarter rules and software.
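
A rough sketch of the kind of self-defragmentation a wallet could do, operating on nothing but a list of its own UTXO values; the thresholds are illustrative, and how the resulting sweep transaction actually gets built and signed is left out:

Code:
def plan_consolidation(utxo_values, max_value=0.001, min_coins=20):
    """Pick the wallet's small coins and plan one transaction that sweeps
    them all back into a single output owned by the wallet.  Returns the
    values to spend and the consolidated amount, or None if the wallet is
    not fragmented enough to bother."""
    small = [v for v in utxo_values if v <= max_value]
    if len(small) < min_coins:
        return None
    # Miners could accept such a transaction at low or zero fee when
    # otherwise idle, since shrinking the unspent output set is in their
    # own interest.
    return small, sum(small)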
chrisrico (Hero Member)
April 08, 2013, 09:11:27 AM  #50

Maybe the priority calculation should only come into play if you're receiving more transactions than you can relay without breaking the max KB/s value, or if your memory pool is full (with a configurable max size for the pool).

Then how would you know when sending a transaction if you need to include a fee or not?

edit... I hadn't yet read Gavin's first post. That does sound like a much more elegant solution.
Spekulatius (Legendary)
April 08, 2013, 11:30:33 AM  #51

It is obscene that minimum tx fees are fixed BTC/KiB recommendations. This must be determined by individual miners! Miners must decide for themselves what tx fees they will accept. Miners need to be competing with each other on this. That was the original idea of Bitcoin. This will, by itself, solve the problems caused by variable Bitcoin exchange prices!

The clients have the blockchain. They can see which tx fees are being accepted, and how fast. Clients can use this as a measure to decide what tx fee should be used.

The problem is that current client/relay/miner software has no automation to evaluate what is actually happening in the blockchain, so users are forced to set tx fees manually. This must be improved. Let the Bitcoin marketplace decide minimum tx fees. Not developers with fixed-guess recommendations. And certainly not fixed default settings in the software.

+1.000001
Zaih (Hero Member)
April 08, 2013, 12:34:48 PM  #52

I agree x10000! I was thinking the exact same thing just before.
NikolaTesla (Newbie)
April 08, 2013, 01:38:49 PM  #53

It is obscene that minimum tx fees are fixed BTC/KiB recommendations. This must be determined by individual miners! Miners must decide for themselves what tx fees they will accept. Miners need to be competing with each other on this. That was the original idea of Bitcoin. This will, by itself, solve the problems caused by variable Bitcoin exchange prices!

The clients have the blockchain. They can see which tx fees are being accepted, and how fast. Clients can use this as a measure to decide what tx fee should be used.

The problem is that current client/relay/miner software has no automation to evaluate what is actually happening in the blockchain, so users are forced to set tx fees manually. This must be improved. Let the Bitcoin marketplace decide minimum tx fees. Not developers with fixed-guess recommendations. And certainly not fixed default settings in the software.

+1.000001
+1.000011
jgarzik (Legendary)
April 08, 2013, 06:36:14 PM  #54

The problem is that current client/relay/miner software has no automation to evaluate what is actually happening in the blockchain, so users are forced to set tx fees manually. This must be improved. Let the Bitcoin marketplace decide minimum tx fees. Not developers with fixed-guess recommendations. And certainly not fixed default settings in the software.

+1.000001

The developers agree with you.  Where is the code submission?  Pull requests to do this are welcome.


Jeff Garzik, Bloq CEO, former bitcoin core dev team; opinions are my own.
Visit bloq.com / metronome.io
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj
Qoheleth (Legendary)
Spurn wild goose chases. Seek that which endures.
April 08, 2013, 06:42:33 PM  #55

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

If I could go back in time, I would go back and try to convince Satoshi to make them non-standard to begin with....

Hopefully alt-coin developers take note of the lessons learned.  Or wait, never mind: there is no innovation toward making a better coin; they are just weak "I want to be rich" attempts.  Well, someday someone will try to make a superior coin, and hopefully they will spend 6-12 months studying Bitcoin to actually make a better one.

Oh, rest assured, I'm taking notes.

(Not that I'm an altcoin developer... yet.)

If there is something that will make Bitcoin succeed, it is growth of utility - greater quantity and variety of goods and services offered for BTC. If there is something that will make Bitcoin fail, it is the prevalence of users convinced that BTC is a magic box that will turn them into millionaires, and of the con-artists who have followed them here to devour them.
ArticMine (Legendary), Monero Core Team
April 08, 2013, 07:10:06 PM  #56

Quote
There is one more change I'd like to make that is independent; re-define "dust" based on the floating transaction fee (e.g. a dust output is any output with a value of less than 1/4 the minimum fee-per-kb required to get into one of the next 6 blocks).  And make any transactions with dust outputs non-standard, so they're not included in the memory pool or relayed.

Why not apply the same logic you described above? (They only get dropped if they can't fit in your memory pool.)

Because dust outputs are more trouble than they're worth. They bloat wallets, cost more in fees to spend than they're worth (unless you go to ridiculous lengths to spend them), and are abused as a side-channel-in-the-blockchain-communication-mechanism.

If I could go back in time, I would go back and try to convince Satoshi to make them non-standard to begin with....

One suggestion here is to have an option in the GUI to add the dust created by a transaction to the transaction fee.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
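
A sketch of that GUI option, assuming the wallet already knows the dust threshold and the change amount it is about to create; the function name and values are purely illustrative:

Code:
def finalize_change(change_value, dust_threshold, fee, add_dust_to_fee=True):
    """If the change output would be dust and the user opted in, drop the
    output and add its value to the transaction fee instead."""
    if add_dust_to_fee and 0 < change_value < dust_threshold:
        return None, fee + change_value   # no change output created
    return change_value, fee

# Example: 0.00005 BTC of change with a 0.0001 BTC dust threshold is simply
# absorbed into the fee.
change, fee = finalize_change(0.00005, 0.0001, fee=0.0005)
assert change is None and abs(fee - 0.00055) < 1e-12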
chriswilmer (Legendary)
April 23, 2013, 11:26:06 AM  #57

Satoshi invented that concept later.

That said, I think we should be able to get wallets defragmenting themselves. Miners have an incentive to keep the output set small, so there's no reason not to accept transactions that consolidate coins if their CPU is otherwise idle, which most of the time it is. It just needs smarter rules and software.

+1
TierNolan (Legendary)
April 23, 2013, 11:50:47 AM  #58

Because dust has a unique cost.  It will likely never be spent; it just remains part of the UTXO set forever.  So more dust is continually being produced, but none (or very little) of it is being used in new transactions.  Eventually, on a long enough timeline, the UTXO set (the pruned database) will run into hundreds of TBs.  That makes Bitcoin far less scalable than if the UTXO set contained only outputs that will actually be spent.

The only way around that is to have some kind of charge for storing coins in the unspent output set.

Maybe add a 1 satoshi charge on every unspent transaction output, paid into the mining fees.

This actually wouldn't be that hard to work out.  The UTXO set is already held in memory as some kind of map; you just add that count times 1 satoshi to the maximum allowable mining fee.

Second, when an output is spent, you just have to look at which block the transaction was in to see how much value it has lost.

Another option would be to block old coins from being spent unless the network is notified first.  There could be a specific protocol message for that.

Having said that, miners would probably unload dust transactions to disk anyway rather than store them in RAM.  If you tried to spend them, miners might have to look for the input in a slower data store.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
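
A sketch of the accounting described above, assuming the charge is 1 satoshi per unspent output per block (which is one reading of the proposal); the counts and heights are placeholders:

Code:
SATOSHI = 1  # charge per unspent output per block, per the post

def extra_allowable_fee(utxo_count):
    """Per block, the miner may claim one extra satoshi for every output
    currently sitting in the UTXO set."""
    return utxo_count * SATOSHI

def value_remaining(output_value, created_height, spend_height):
    """When an output is finally spent, the storage charge it accrued is
    simply its age in blocks times one satoshi."""
    charge = (spend_height - created_height) * SATOSHI
    return max(output_value - charge, 0)

# An output worth 10,000 satoshis left unspent for 2,000 blocks has lost
# 2,000 satoshis to the storage charge by the time it is spent.
assert value_remaining(10_000, 250_000, 252_000) == 8_000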