Author Topic: A clean solution to ALL Bitcoin problems: SatoshiDice, Block size, future fees.  (Read 4289 times)
Sergio_Demian_Lerner (OP), Hero Member
February 26, 2013, 05:27:05 PM
Last edit: February 26, 2013, 06:29:16 PM by Sergio_Demian_Lerner
 #1

The idea is simple and requires a hardfork, but it has minimal impact on the code and on the economics.

Solution: Require that the set of fees collected in a block has a maximum dispersion. Use, for example, the Coefficient of Variation (http://en.wikipedia.org/wiki/Coefficient_of_variation). If the CoVar is higher than a fixed constant, the block is invalid.

The Coefficient of Variation is computed as the standard deviation over the mean value, so it's very easy to compute (if the mean is zero, we define CoVar=0). Note that the CoVar function does not depend on the scale, so it is just what a coin with a floating value requires.

This means that if there are many transactions containing high fees in a block, then free transactions cannot be included.
The core devs should tweak the transaction selection algorithm to take this maximum bound into account.

Example

If the transaction fee set is: 0,0,0,0,5,5,6,7,8,7
The CoVar is 0.85
Suppose we limit the CoVar to a maximum of 1.

Suppose the transaction fee set is: 0,0,0,0,0,0,0,0,0,10
Then the CoVar is 3.0

In this case the miner would have to either drop the "10" from the fee set or drop some zeros. Obviously the miner will drop some zeros and choose the set 0,10, which has a CoVar of exactly 1.
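
For concreteness, a minimal sketch of the check as described above (population statistics; MAX_COVAR is a placeholder for the fixed constant, not a value anyone has agreed on):

Code:
import math

MAX_COVAR = 1.0  # placeholder; the real constant would be fixed by the protocol

def covar(fees):
    """CoVar of a block's fee set: population std dev over the mean."""
    n = len(fees)
    if n == 0:
        return 0.0
    mean = sum(fees) / n
    if mean == 0:
        return 0.0  # all-free block: defined as CoVar = 0
    variance = sum((f - mean) ** 2 for f in fees) / n
    return math.sqrt(variance) / mean

def block_fee_set_valid(fees):
    return covar(fees) <= MAX_COVAR

print(round(covar([0, 0, 0, 0, 5, 5, 6, 7, 8, 7]), 2))  # 0.85
print(covar([0] * 9 + [10]))                            # 3.0
print(covar([0, 10]))                                   # 1.0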

Why does it solve the SD problem?

With this small modification, SatoshiDice users would be forced to pay higher fees only if the rest of the community raises its fees. And miners won't be able to include an enormous amount of spam txs.

Why does it solve the future fee problem? (and the tragedy-of-the-commons "problem")

Because miners are forced to keep the CoVar low, so if people raise their fees to confirm faster than spam txs, spam txs automatically become less likely to appear in blocks.

Why does it solve the block size problem?

Because if we increase the block size, miners acting irrationally still won't be able to fill the block with spam txs. Edit: This is not a solution against an attacker-miner.

Best regards,
  Sergio.

Edit: PLEASE don't confuse the acronym CoVar I used here with co-variance.





misterbigg, Legendary
February 26, 2013, 05:30:12 PM
 #2

NICE!!!! A brand new idea!!!
Technomage, Legendary
February 26, 2013, 05:37:18 PM
 #3

Good job Sergio. That is an interesting idea. Right now I have nothing else to say but maybe later as I think about it more.

Sergio_Demian_Lerner (OP), Hero Member
February 26, 2013, 05:38:21 PM
 #4

Note that a miner could also drop transactions with very high fees in order to collect many transactions with lower fees, but I expect that to be very rare in practice, since transactions consume block space.

I also think that while the high-to-low fee selection algorithm is a fast greedy O(n), an optimal algorithm that maximizes fees under the dispersion restriction would be O(n^2), where n is the number of transactions to test.
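
A sketch of what such a greedy pass could look like (an illustration of the idea, not the client's actual selection code): take fees from highest to lowest and skip any transaction whose inclusion would push the block's CoVar over the bound.

Code:
from statistics import mean, pstdev

def covar(fees):
    return pstdev(fees) / mean(fees) if fees and mean(fees) > 0 else 0.0

def greedy_select(mempool_fees, max_covar=1.0):
    """High-to-low greedy selection under the dispersion bound.
    Not necessarily fee-optimal: an O(n^2) search could do better."""
    chosen = []
    for fee in sorted(mempool_fees, reverse=True):
        if covar(chosen + [fee]) <= max_covar:
            chosen.append(fee)
    return chosen

print(greedy_select([0, 0, 0, 0, 0, 0, 0, 0, 0, 10]))  # [10, 0]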


 
justusranvier, Legendary
February 26, 2013, 05:41:16 PM
 #5

Why do miners need to be forced to adopt a fee policy? If it's a good idea there's nothing stopping them from implementing it right now.

How do you decide on the optimum value for the magic constant that you want to embed into the block validation rules?
misterbigg, Legendary
February 26, 2013, 05:41:53 PM
 #6

Why do miners need to be forced to adopt a fee policy? If it's a good idea there's nothing stopping them from implementing it right now.

Because the network would need to reject blocks if the CoVar is too high.
markm, Legendary
February 26, 2013, 05:52:09 PM
Last edit: February 26, 2013, 06:12:44 PM by markm
 #7

The same crowd who argue for infinite (no limit) max block size will use the same arguments against this.

To them it is just another arbitrary restriction preventing all out class-warfare from playing itself out to its natural conclusion of "the biggest bullies win".

-MarkM-

AsymmetricInformation, Member
February 26, 2013, 06:06:06 PM
 #8

Sergio provided an example where the policy caused a low-fee transaction to be dropped.

Can anyone think of a counterexample where the policy causes something dramatic and horrible to happen?

I personally think I might have one, involving miners who themselves create transactions to perfectly meet the CoVar, ensuring they recapture the fees and the block, but I haven't worked it out. It might be impractical. (Also, I'm not sure I understand my own intuition, so if anyone sees where I'm going with this, please write it down and save me the trouble.)

Why do miners need to be forced to adopt a fee policy? If it's a good idea there's nothing stopping them from implementing it right now.

How do you decide on the optimum value for the magic constant that you want to embed into the block validation rules?

Does anyone know the historical value of the CoVar? Ideally a graph of CoVar(fees in block t) against blocks (i.e. time)? By some miracle it could be very stable, and only increase in association with the rise of 'unwanted' things (SatoshiDice).


It is a cool new idea though. Looking forward to the discussion.

blueadept, Full Member
February 26, 2013, 06:24:02 PM
 #9

One issue might be transactions that spam by means of their size. Perhaps a modification taking transaction size into account, too? Something like CoVar(txfee_i/txsize_i)?

I haven't had enough time yet to try thinking of other ways of gaming this.
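
For illustration, a sketch of this size-normalized variant (the (fee, size) pairs below are made-up numbers):

Code:
from statistics import mean, pstdev

def covar(xs):
    return pstdev(xs) / mean(xs) if xs and mean(xs) > 0 else 0.0

def covar_fee_per_byte(txs):
    """CoVar over fee density instead of absolute fee, so a bloated tx
    paying the same absolute fee as a small one no longer looks equal."""
    return covar([fee / size for fee, size in txs])

# Two normal txs and one 20x-larger tx paying the same absolute fee:
print(round(covar_fee_per_byte([(5000, 250), (5000, 250), (5000, 5000)]), 2))  # 0.66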

Sergio_Demian_Lerner (OP), Hero Member
February 26, 2013, 06:25:51 PM
 #10


How do you decide on the optimum value for the magic constant that you want to embed into the block validation rules?

A historical analysis of the block chain must be done, but I don't have the right tools to do it (a ready-to-run development environment).
I would choose the maximum historical CoVar as the fixed constant (probably after removing the top 5% of values).

But this value will be clearer when we have the historical data.

Maybe someone who develops BitcoinJ or the Satoshi client can provide us with a CoVar/time graph?

Maybe there is some hidden information related to SD and spamming txs there that we have overlooked.


Sergio_Demian_Lerner (OP), Hero Member
February 26, 2013, 06:27:19 PM
 #11

One issue might be transactions that spam by means of their size. Perhaps a modification taking transaction size into account, too? Something like CoVar(txfee_i/txsize_i)?

I haven't had enough time yet to try thinking of other ways of gaming this.

Rational miners choose smaller transactions because:

1. They leave extra space to put more transactions in the block.
2. They reduce the propagation time.

So there is enough incentive to include smaller txs.
twolifeinexile, Full Member
February 26, 2013, 06:45:25 PM
 #12

Quote from: Sergio_Demian_Lerner
The idea is simple and requires a hardfork, but it has minimal impact on the code and on the economics. [...] (the full original post quoted)

Great thought, I will vote this up to get more people to think about it.
Just, is it possible that at some point in time no CoVar-threshold-compliant block can be found? Would the whole chain then just wait for more transactions?
Also, given this CoVar requirement, what algorithm would a miner run to select transactions?
Maybe the logic could be: if the block is smaller than a certain size, then the CoVar check is not needed.
All in all, it sounds like a very elegant solution.
Sukrim, Legendary
February 26, 2013, 06:53:29 PM
 #13

Can a miner be forced to include free transactions or to drop certain transactions by publishing "better" transactions with this approach?

Also I like it because it leaves the issue of block size aside. On the other hand maybe that might be a problem?

AsymmetricInformation, Member
February 26, 2013, 07:00:21 PM
 #14


How do you decide on the optimum value for the magic constant that you want to embed into the block validation rules?

A historical analysis of the block chain must be done [...] Maybe someone who develops BitcoinJ or the Satoshi client can provide us with a CoVar/time graph? [...]


What about merging this suggestion with Hazek's suggestion in a different post, which proposed increasing the block size limit if (subsidy + fees) > 50 BTC, and decreasing it if (subsidy + fees) < 25 BTC? Instead of changing the block size directly you could just change the CoVar.

misterbigg, Legendary
February 26, 2013, 07:01:01 PM
 #15

Also I like it because it leaves the issue of block size aside. On the other hand maybe that might be a problem?

If the maximum block size is increased, then this proposal will prevent fees from dropping to nothing. I like it!
misterbigg, Legendary
February 26, 2013, 07:03:23 PM
 #16

is it possible at some point in time no coVar threshold compliant block that can be found?

No, because a block with 1 or more transactions that all have the same fee always has a coefficient of variation of zero.
twolifeinexile, Full Member
February 26, 2013, 07:19:29 PM
 #17

Why does it solve the block size problem?
Because if we increase the block size, miners acting irrationally still won't be able to fill the block with spam txs. Edit: This is not a solution against an attacker-miner.

Trying to understand this further... If a miner means to force other miners out of the network by flooding large blocks, and makes many, many low-fee transactions, how is that prevented?
If the block has many low-fee transactions and no higher-fee txs, then the CoVar may still be low and pass the test, no?
Sergio_Demian_Lerner (OP), Hero Member
February 26, 2013, 07:34:34 PM
 #18

Why does it solve the block size problem?
Because if we increase the block size, miners acting irrationally still won't be able to fill the block with spam txs. Edit: This is not a solution against an attacker-miner.

Trying to understand this further... If a miner means to force other miners out of the network by flooding large blocks, and makes many, many low-fee transactions, how is that prevented?
If the block has many low-fee transactions and no higher-fee txs, then the CoVar may still be low and pass the test, no?

Yes, as I added to the original post: it's not a solution against malicious miners, only against selfish (and irrational) miners that do not think about the future of Bitcoin.
Peter Todd, Legendary
February 26, 2013, 08:04:05 PM
 #19

How does this prevent a miner from creating their own transactions to game the coefficients? The only cost is the orphan rate, which can be kept well under 1% for a miner with sufficient infrastructure.

gmaxwell, Moderator, Legendary
February 26, 2013, 08:16:11 PM
 #20

If the maximum block size is increased, then this proposal will prevent fees from dropping to nothing. I like it!
That isn't obvious to me. Say you mine a small block with some big-fee txn. Great. The moment after that there are no big-fee txn, but there is 10 GB of low-fee txn. So of course everyone mines those next. Alternatively, if the sum of low-fee transactions is greater than a high-fee one that would blow the CoVar, you'd drop the high-fee txn and take the gigabyte of dust; so instead of keeping fees high, in some cases it would encourage people to lower their fees.

I think it would exaggerate priority effects; I'll have to think more about it.
ArticMine, Legendary
February 26, 2013, 08:49:48 PM
 #21

I see a problem with this. Wouldn't one dominant source of transactions effectively set the transaction fees? For example, S.Dice would be able to set the transaction fee effectively at will in today's market. http://blockchain.info/

Zeilap, Full Member
February 26, 2013, 10:24:35 PM
 #22

This is just going to cause the transaction fees to bounce around wildly, and users will no longer have any real control over transaction priority.

At the moment, to increase your transaction priority, you increase the transaction fee. Pretty simple.

With this system, that is no longer true. Instead, the optimal fee to increase your transaction priority depends on the other transaction fees in the set of unprocessed transactions. Your goal is to set your fee so that it is similar in value to as many other transactions as possible, so that a large group with small variance is formed, becoming the most profitable to mine.
But if you set it to the same fee as a large group of other transactions, the group may get too large (to fit in a block) and so only a subgroup of that group will be used (or another group in a different range of fees), so you might want to set your fee to be slightly larger within the group, to increase your chances of placing in the subgroup. Too large, however, and it may be more profitable for the miner to include several smaller-paying transactions than your slightly larger one, etc.

To pick your fee you need to know what the most profitable group of transactions *will be*, i.e. you need to know what transactions other people are going to send before they send them.
Technomage, Legendary
February 26, 2013, 10:51:45 PM
Last edit: February 26, 2013, 11:50:00 PM by Technomage
 #23

Quote from: Zeilap
This is just going to cause the transaction fees to bounce around wildly, and users will no longer have any real control over transaction priority. [...]

I don't see it as that problematic. Fees would have variance from block to block, but that isn't such a big deal. Only the first ones announcing their tx are blind to what is going on and don't really know the optimal fee, but on the other hand they have the most power in defining the fee that's most useful for that block. The ones coming later would simply react to what kind of transactions are in the block already and choose the fee automatically.

The user can simply set an upper bound on how much he is willing to pay. If the next set of transactions is not looking good, the client would postpone the tx until that block clears, or the client could just send it right away, but it wouldn't be accepted into the next block because it has a fee that doesn't fit. Not sure if I'm all over the place with this, but I think it can work.

solex, Legendary
February 26, 2013, 11:19:51 PM
Last edit: February 26, 2013, 11:41:28 PM by solex
 #24

Sergio's idea sounds great, so I went away and examined two blocks.

SatoshiDice launched on April 24, 2012, so I randomly chose a large block two weeks earlier:

April 9, 2012
block 174997 (Tx = 154)
Fee           Count
0               116
0.0005           35
0.001             4
0.00677853        1
0.01              1
CoVar = 4.00

Then I chose a similar-sized block which went through in the last hour:

Feb 26, 2013
block 223287 (Tx = 151)
Fee           Count
0                13
0.0005           61
0.001            70
0.0015            1
0.002             2
0.005             4
CoVar = 0.93
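
Both figures can be reproduced from the fee histograms above; a quick sketch, using population statistics:

Code:
from statistics import mean, pstdev

def covar_from_histogram(hist):
    """Expand a list of (fee, count) pairs and compute stddev/mean."""
    fees = [fee for fee, count in hist for _ in range(count)]
    return pstdev(fees) / mean(fees)

block_174997 = [(0, 116), (0.0005, 35), (0.001, 4), (0.00677853, 1), (0.01, 1)]
block_223287 = [(0, 13), (0.0005, 61), (0.001, 70), (0.0015, 1), (0.002, 2), (0.005, 4)]

print(round(covar_from_histogram(block_174997), 2))  # 4.0
print(round(covar_from_histogram(block_223287), 2))  # 0.93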

First, a wider analysis is warranted as this is a very narrow test. However, in the meantime I am not sure there is a problem to be solved! In the April 2012 example most transactions did not have fees; in the block today they do. SD is paying a lot in fees. It seems bitcoin is improving its economic "health" organically, or at least through the current incentives for fees. All it needs is more time to see whether the fees climb or level off.

I am wondering whether fees are tangential and the real issue with the max block limit is block propagation/verification time, as Gavin indicated at the start!

twolifeinexile, Full Member
February 26, 2013, 11:26:49 PM
 #25


I am wondering whether fees are tangential and the real issue with the max block limit is block propagation/verification time, as Gavin indicated at the start!

The max block limit is the real question in my opinion, since that determines how much free riding there could potentially be, and then block propagation/verification has a long-term effect on miner competition.
solex, Legendary
February 26, 2013, 11:44:27 PM
 #26


I am wondering whether fees are tangential and the real issue with the max block limit is block propagation/verification time, as Gavin indicated at the start! [...]

The max block limit is the real question in my opinion [...]

Yes, but the 1 MB limit is nothing more than a psychological constraint at present, yet fees are rising anyway. The concern is about when it becomes a physical constraint (due to transaction volumes).

streblo, Full Member
February 26, 2013, 11:58:57 PM
 #27

A Nash equilibrium at a 1-satoshi txn fee ruins this idea. There is no incentive to ever submit a transaction with more than a 1-satoshi fee.
misterbigg, Legendary
February 27, 2013, 12:15:58 AM
 #28

the optimal fee to increase your transaction priority is dependent on the other transaction fees in the set of unprocessed transactions.

This is true even in today's Bitcoin implementation.
misterbigg, Legendary
February 27, 2013, 12:17:26 AM
 #29

A Nash equilibrium at a 1-satoshi txn fee ruins this idea. There is no incentive to ever submit a transaction with more than a 1-satoshi fee.

Not true: if everyone included only 1 satoshi as the fee, the confirmation time would grow without bound.
Zeilap, Full Member
February 27, 2013, 12:22:23 AM
 #30

I don't see it as that problematic. Fees would have variance from block to block but that isn't such a big deal.
Let's consider the analogous situation in real life:
[Customer   ] Hi, I'd like to buy this item, would you deliver it to this address
[Shop owner] No problem
[Customer   ] When can I expect delivery?
[Shop owner] I don't know, possibly immediately, possibly never.
[Customer   ] WTF!? Can I pay you a little extra to guarantee I get it by tomorrow?
[Shop owner] Sorry, that's not how the system works.
[Customer   ] So how does it work then?
[Shop owner] Well, I'm thinking of a number between 1 and 100, give me some money between $1 and $100 and if the amount you choose is closer than the other customers, you'll get it delivered tomorrow.


Only the first ones announcing their tx are blind to what is going on and don't really know the optimal fee, but on the other hand they have the most power in defining the fee that's most useful for that block. The ones coming later would simply react to what kind of transactions are in the block already and choose the fee automatically.
No, what will happen is that multiple transactions will appear, all at different values, early on, and only one of them will end up being in the winning group. The result is that announcing early gives you the least control over getting your transaction processed. The incentive then is to announce your transaction as late as possible, but not so late as to miss getting into the block.


If this were to come into force, people would end up realizing that it is in their interest to send their transaction multiple times with different fees. Only one can be processed due to double spends, but if they have transactions at all different fees, they are more likely to get one of them processed.

Further, I could easily set up a service to game the system for people who actually care about getting their transactions processed. How it would work is that they send me multiple versions of the transaction they would like to broadcast, each version with a different fee. I would hold the transactions until I got enough to game the system, then I dump them all at once on the network, creating the best group of transactions, or joining the current best group if I can't beat it.
Sergio_Demian_Lerner (OP), Hero Member
February 27, 2013, 04:19:16 AM
 #31

Zeilap: This is getting really interesting...

This is just going to cause the transaction fees to bounce around wildly, and users will no longer have any real control over transaction priority.

...

With this system, that is no longer true. Instead, the optimal fee to increase your transaction priority depends on the other transaction fees in the set of unprocessed transactions. Your goal is to set your fee so that it is similar in value to as many other transactions as possible, so that a large group with small variance is formed, becoming the most profitable to mine.

This can be very easily avoided. Before calculating the CoVar, sort the transactions by fee and remove from the CoVar computation those in the top 10% by fee.

Then you'll always be able to get a transaction into the next block: send a fee high enough to be in that percentile.

But I'm unsure whether a high-fee transaction would ever be discarded. I will simulate this.
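
A sketch of that trim (the 10% figure and the rounding of the cutoff are placeholders):

Code:
from statistics import mean, pstdev

def covar(xs):
    return pstdev(xs) / mean(xs) if xs and mean(xs) > 0 else 0.0

def trimmed_covar(fees, trim=0.10):
    """CoVar computed after discarding the highest-paying `trim` fraction,
    so one very generous fee can't blow the bound for everyone else, and a
    user can always buy into the next block by paying into the excluded top."""
    fees = sorted(fees)
    keep = len(fees) - int(len(fees) * trim)
    return covar(fees[:keep])

# One whale fee no longer matters: the top 10% (2 of 20) is dropped.
print(trimmed_covar([1] * 18 + [2, 100]))  # 0.0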
Sergio_Demian_Lerner (OP), Hero Member
February 27, 2013, 04:22:57 AM
 #32

How does this prevent a miner from creating their own transactions to game the coefficients? The only cost is the orphan rate, which can be kept well under 1% for a miner with sufficient infrastructure.

It just goes against his own interests. Why would he do that? There is no memory between blocks, so a block cannot influence the next.
bpd, Member
February 27, 2013, 04:37:01 AM
 #33

Couldn't someone DoS the network for a period by submitting one high-fee transaction at a time? So high that including any other normal-fee transactions would violate the CoVar max, and high enough that it is greater than the sum of all the rest of the transactions in the pool?

Yes, it would cost bitcoins to do this, but an entity with deep pockets could just keep buying BTC on the open market to do this for as long as they wanted.
Sergio_Demian_Lerner (OP), Hero Member
February 27, 2013, 04:41:46 AM
 #34

This is just going to cause the transaction fees to bounce around wildly, and users will no longer have any real control over transaction priority. [...]

Here is an example of what Zeilap says:

Suppose the set of transaction fees available is:
1,1,1,1,1,1,1,1,1,1,1,1,2,10

Suppose the maximum CoVar is 1.

We compare two possible sets to choose, one containing 10 and another without it:

(A) 1,1,1,1,1,1,1,1,1,1,1,1,2
(B) 1,2,10


Results for A:

Sum=14.00
Mean= 1.08
Variance= 0.07
StdDev= 0.27
CoVar= 0.25

Results for B:

Sum=13.00
Mean= 4.33
Variance=16.22
StdDev= 4.03
CoVar= 0.93

So the set A pays more fees, and so it will be chosen.

I would discard the top 10% of fees, as I said.
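
(Those numbers can be checked mechanically; a small sketch using population statistics:)

Code:
from statistics import mean, pstdev

def covar(xs):
    return pstdev(xs) / mean(xs) if xs and mean(xs) > 0 else 0.0

set_a = [1] * 12 + [2]
set_b = [1, 2, 10]

for name, s in (("A", set_a), ("B", set_b)):
    print(name, sum(s), round(mean(s), 2), round(pstdev(s), 2), round(covar(s), 2))
# A 14 1.08 0.27 0.25
# B 13 4.33 4.03 0.93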
Sergio_Demian_Lerner (OP), Hero Member
February 27, 2013, 04:50:59 AM
 #35

Couldn't someone DoS the network for a period by submitting one high-fee transaction at a time? [...]

Yes.
One solution would be to use robust statistics: the median instead of the mean to divide the standard deviation by, or the median absolute deviation divided by the median. Let's call it CoVar*. CoVar* will work like CoVar, but won't be affected by outliers.
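
A sketch of that robust variant (one possible definition, MAD over median):

Code:
from statistics import median

def covar_star(fees):
    """Robust dispersion: median absolute deviation over the median.
    A single outlier fee barely moves either statistic."""
    if not fees:
        return 0.0
    med = median(fees)
    if med == 0:
        return 0.0
    mad = median(abs(f - med) for f in fees)
    return mad / med

print(covar_star([0, 0, 0, 0, 5, 5, 6, 7, 8, 7]))  # 0.5
print(covar_star([1] * 12 + [2, 10]))              # 0.0: the lone 10 is ignored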
Maged, Legendary
February 27, 2013, 07:09:01 AM
 #36

Be very careful with this. There are valid use cases where free transactions can be completely acceptable because they are paid for some other way. For example, chaining transactions so that the recipient pays the fee instead of the sender. This can be worked around by making sure that the algorithm is aware of that and considers the chain as a single unit. However, my point is that these cases need to all be known at the time of writing the algorithm because it would take a hard fork to change it later.

Also, I'm assuming that every reference to "transaction fee" really means transaction fee per KB; otherwise that opens up the potential for people to use mega-transactions to split a single fee between multiple parties (transactions larger than 500 KB make sense now, don't they  Wink ), instead of mega-transactions just receiving a marginally lower fee than normal transactions would.

Finally, does this really require a hard fork? It might be used with one, sure, but could this be done at a later point via a soft fork? I'm trying to figure out if I'm missing anything about your proposal...

Zeilap, Full Member
February 27, 2013, 07:17:35 AM
 #37

Quote from: Sergio_Demian_Lerner
This can be very easily avoided. Before calculating the CoVar, sort the transactions by fee and remove from the CoVar computation those in the top 10% by fee. Then you'll always be able to get a transaction into the next block: send a fee high enough to be in that percentile. [...]
I'm not sure what you mean here (in bold), or maybe I do but don't understand how it solves the problem.
Also, I get the impression that you're assuming that the distribution of fees will still be unimodal. By changing the way the system works like you propose, you change the way people are going to set their fees, and you will likely end up with many clusters of transactions at different fees.
Note that it isn't enough to simply join the most profitable cluster - block size limits mean that there will be competition within the largest clusters. Thus the centres of these clusters will move over the course of one block as people fight for position, and may merge or split - fuck knows.
You can only know what the optimal fee is at the moment you send your transaction, which would be fine if you were the last player to make his move - but you won't be, so the next person will find a slightly different optimal fee (because your transaction has changed the optimal value), and so on, until one minute later your transaction is far enough away from optimal that it won't be processed this block.

I applaud you for thinking outside the box on this (and a lot of things actually), but I think you've just come up with an interesting topic for game theorists Smiley
farlack, Legendary
February 27, 2013, 07:30:35 AM
 #38

No one takes into consideration that a large bank-backed player could come in, pour a ton of money into hardware, and mine fees for profit, powered off solar panels. Google, the US government, PayPal, banks, etc.

Wouldn't that speed things along?
Granted, it would have a huge startup cost and a long-term profit ratio, but the big banks will always be around, even if the USD collapses and bitcoins are the new currency.
Maged, Legendary
February 27, 2013, 07:43:29 AM
 #39

Quote from: Sergio_Demian_Lerner
This can be very easily avoided. Before calculating the CoVar, sort the transactions by fee and remove from the CoVar computation those in the top 10% by fee. Then you'll always be able to get a transaction into the next block: send a fee high enough to be in that percentile. [...]
But doesn't that break the whole system, because even honest miners would be encouraged to add spam transactions (possibly generated locally) in order to make the top 10% bigger? In theory, they could artificially expand the set of transactions so that all "real" transactions fit within the top 10%.

Atruk, Hero Member
February 27, 2013, 07:52:25 AM
 #40

Quote from: Maged
Be very careful with this. There are valid use cases where free transactions can be completely acceptable because they are paid for some other way. [...] Finally, does this really require a hard fork? [...]

In my experience it is very easy to just start paying proportional fees like this on one's own. It has been a while since my transactions haven't made it into the next block.

What I would really like to see out of the OP's proposal is a formal whitepaper. There's a lot of mathematical language in the original post, and I think a more formal proposal would be very useful. I may be hoping for a bit too much because SDL usually only makes abbreviated posts, but more than for any other post he has made, I want to see the white paper for this one.

phelix, Legendary
February 27, 2013, 08:58:12 AM
Last edit: February 27, 2013, 02:25:13 PM by phelix
 #41

How does this prevent a miner from creating their own transactions to game the coefficients? [...]

It just goes against his own interests. Why would he do that? There is no memory between blocks, so a block cannot influence the next.
As long as there is room in the block he can influence the coefficient by sending coins back to himself. See below.


Code:
Fee set                                    Sum    Mean  Var    StdDev  CoVar
0,0,0,0,5,5,6,7,8,7 (OP example)           38.00  3.80  11.51  3.39    0.89
1 x12, 2, 10 (full set)                    24.00  1.71  5.76   2.40    1.40
1 x12, 2 (set A)                           14.00  1.08  0.08   0.28    0.26
1, 2, 10 (set B)                           13.00  4.33  24.33  4.93    1.14
full set + 5 miner txs of 5.00 each,
  with the 25.00 returned to the miner     24.00  2.58  6.37   2.52    0.98

(my openoffice seems to come to slightly different values than you - it uses the sample variance, n-1, rather than the population variance)

Edit: Data is based on Sergio's example:
Quote from: Sergio_Demian_Lerner
Suppose the set of transaction fees available is: 1,1,1,1,1,1,1,1,1,1,1,1,2,10. Suppose the maximum CoVar is 1. We compare two possible sets, (A) 1 x12, 2 and (B) 1, 2, 10. [...] So set A pays more fees, and so it will be chosen.


A while ago I suggested a maximum spread for transaction fees, which is somewhat similar: https://bitcointalk.org/index.php?topic=67900.msg804916#msg804916

One problem with both solutions is that with a couple of high-fee transactions you can stall the whole network.

Still, there might lie a solution in this direction up ahead.
Timo Y, Legendary
February 27, 2013, 01:16:30 PM
 #42

One issue might be transactions that spam by means of their size. Perhaps a modification taking transaction size into account, too? Something like CoVar(txfee_i/txsize_i)?

I haven't had enough time yet to try thinking of other ways of gaming this.

Even better:

CoVar(txfee * priority)

Priority is already used by the protocol. It's defined as:

priority = sum(input_value_in_base_units * input_age) / size_in_bytes

Giving a higher-priority transaction a bigger weight makes sense intuitively, since we already use priority as a criterion for excluding certain transactions.
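
A sketch of this weighting (the tx representation is invented for illustration; inputs are (value_in_base_units, age_in_blocks) pairs):

Code:
from statistics import mean, pstdev

def priority(inputs, size_in_bytes):
    """The client's priority measure: sum(input value * input age) / tx size."""
    return sum(value * age for value, age in inputs) / size_in_bytes

def covar_weighted(txs):
    """Timo Y's variant: dispersion of fee * priority instead of raw fee."""
    xs = [fee * priority(inputs, size) for fee, inputs, size in txs]
    m = mean(xs)
    return pstdev(xs) / m if m > 0 else 0.0

# e.g. two 500-byte txs paying fee 10000, each with one 50 BTC input,
# aged 10 and 5 blocks respectively:
print(covar_weighted([(10000, [(5_000_000_000, 10)], 500),
                      (10000, [(5_000_000_000, 5)], 500)]))  # ~0.33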

Sergio_Demian_Lerner (OP), Hero Member
February 27, 2013, 02:21:03 PM
 #43

What I would really like to see out of the OP's proposal is a formal whitepaper. [...]

You are completely right. I can't promise I'll do it in the near future, but I'll start writing it down in a formal way. I will also take into consideration block-chain statistics and the excellent critiques the idea has received.

But if anyone is interested in writing this paper instead of me, be my guest.
Maybe a game-theoretic thesis can come out of this analysis.

Sergio_Demian_Lerner (OP), Hero Member
February 27, 2013, 02:32:17 PM
 #44


A while ago I suggested a maximum spread for transaction fees, which is somewhat similar: https://bitcointalk.org/index.php?topic=67900.msg804916#msg804916

One problem with both solutions is that with a couple of high-fee transactions you can stall the whole network. [...]

Yes, the same idea. And it's a pity it passed unnoticed. I would prefer that everybody write their ideas in the Bitcoin wiki, so that common knowledge is increased.

My idea of discarding the transactions with the top 10% of fees before computing the CoVar is sound. Obviously miners can manipulate everything, but the point is that they always act rationally, so why would they insert transactions to change the CoVar? And how much time would they invest in finding the correct transactions to insert? That problem is, probably, very hard.

phelix, Legendary
February 27, 2013, 03:51:57 PM
 #45


Quote from: Sergio_Demian_Lerner
My idea of discarding the transactions with the top 10% of fees before computing the CoVar is sound. Obviously miners can manipulate everything, but the point is that they always act rationally, so why would they insert transactions to change the CoVar? And how much time would they invest in finding the correct transactions to insert? That problem is, probably, very hard. [...]


Should it not be easy to just add a couple of txs with fees at the mean of the normal fees?

Maybe I am missing something, but it worked in the example  Smiley

blueadept, Full Member
February 27, 2013, 08:13:54 PM
 #46

Quote from: Timo Y
Even better: CoVar(txfee * priority). Priority is already used by the protocol. [...]

This is, at first glance, almost certainly better than my original modification. It makes it more expensive to game the system with low-priority coins. And miners can't pay too much to themselves in fees, for fear of competitors intentionally orphaning their block to take the fee (even despite retep's idea of nLockTime in every transaction) when the disparity between fees and difficulty is too large.

conv3rsion, Sr. Member
February 27, 2013, 09:46:10 PM
 #47

Also I like it because it leaves the issue of block size aside. On the other hand maybe that might be a problem?

If the maximum block size is increased, then this proposal will prevent fees from dropping to nothing. I like it!


Yea this is good stuff right here.
phelix, Legendary
February 28, 2013, 08:19:47 AM
 #48


Quote from: Sergio_Demian_Lerner
My idea of discarding the transactions with the top 10% of fees before computing the CoVar is sound. Obviously miners can manipulate everything, but the point is that they always act rationally, so why would they insert transactions to change the CoVar? [...]


Disregarding the 10% highest fees can easily be skirted by the miners: they just add a couple of high-fee txs.

The concept will not work like this, because with a couple of bitcoins the whole network can be stalled. Say, 100 txs with a fee of 1 BTC every ten minutes: at 600 BTC per hour I can bring everything to a grinding halt, or at least slow it down until there are enough low-fee txs (should enough fit into a block).

Maybe there is a more robust formula. What about this: fees must be higher than half the median of all fees.

Naah, I cannot think of anything tamper-proof.

The miners have to dictate / compete for the fees.

Sukrim, Legendary
February 28, 2013, 01:20:23 PM
 #49

Disregarding the 10% highest fees can easily be skirted by the miners: they just add a couple of high-fee txs. [...]

The reason against miners mining their own money as fees is that if a block gets orphaned, some other miner can claim the money from these transactions.

High fees to drive out lower transactions would work in probably any limitation scenario. If you invest 600 BTC per hour in the current network, you're probably able to drive out a LOT of transactions.

Sergio_Demian_Lerner (OP), Hero Member
February 28, 2013, 01:42:38 PM
 #50


Maybe there is a more robust formula. What about this: fees must be higher than half the median of all fees.

That formula does not work; it just keeps raising fees.

What about this alternative:

1. Fees equal to or greater than the average fee of the last 50 blocks are always accepted.
(This allows any user to put whatever high fee he may want, and it's predictable.)

2. Fees lower than the average are passed through the CoVar test (including the highest part).
(This reduces the chances of spam txs with almost zero fees.)

The key difference is that the average is not taken over the last block, but over a number of past blocks.
I will call this method Restricted-CoVar for reference.

This method looks self-healing:
- if the 50-block average fee goes down, many more transactions with higher fees are included, so the average goes back up.
- if the 50-block average goes up too much, then everybody will send lower transaction fees and the average goes down, but not too fast, since the CoVar limits it.
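
A sketch of Restricted-CoVar under this reading (the trailing average would come from the previous 50 blocks; max_covar stays a placeholder):

Code:
from statistics import mean, pstdev

def covar(xs):
    return pstdev(xs) / mean(xs) if xs and mean(xs) > 0 else 0.0

def block_valid_restricted(fees, avg_fee_last_50, max_covar=1.0):
    """Fees at or above the trailing 50-block average are always accepted;
    only the below-average remainder must pass the dispersion test."""
    below_avg = [f for f in fees if f < avg_fee_last_50]
    return covar(below_avg) <= max_covar

# A whale fee above the trailing average is exempt from the test:
print(block_valid_restricted([0, 0, 1, 1, 50], avg_fee_last_50=2))  # True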
phelix, Legendary
February 28, 2013, 03:00:32 PM
Last edit: February 28, 2013, 03:23:21 PM by phelix
 #51

Disregarding the 10% highest fees can easily be skirted by the miners: they just add a couple of high-fee txs. [...]

The reason against miners mining their own money as fees is that if a block gets orphaned, some other miner can claim the money from these transactions.
[...]
Thanks for bringing that up. I had not thought of it and missed it in the discussion above. I assume a typical orphan rate is 1.5%


Maybe there is a more robust formula. What about this: fees must be higher than half the median of all fees.

That formula does not work; it just keeps raising fees.

yup  Roll Eyes

Quote from: Sergio_Demian_Lerner
What about this alternative:
1. Fees equal to or greater than the average fee of the last 50 blocks are always accepted.
2. Fees lower than the average are passed through the CoVar test (including the highest part). [...]
This method looks self-healing. [...]
My gut says this will mostly slow things down, but not much more. Miners can still drive up the minimum fees (as long as they don't lose too much because of orphaned blocks).

What about this:
1.) In each block miners vote for a minimum tx fee via the lowest tx fee included. This txFeeMin is averaged over 50 blocks.

Of course miners have an incentive to drive it as high as possible. So to keep up the competition:

2.) txs with fees down to 0.5 * txFeeMinAvg are valid.
misterbigg, Legendary
February 28, 2013, 04:11:27 PM
 #52

1. Fees equal to or greater than the average fee of the last 50 blocks are always accepted.
(This allows any user to put whatever high fee he may want, and it's predictable.)

Do we also allow the block size to exceed the 1 megabyte limit, by not counting the size of these transactions with above-average fees towards the total block size?
Sergio_Demian_Lerner (OP), Hero Member
February 28, 2013, 05:24:53 PM
 #53

1. Fees equal to or greater than the average fee of the last 50 blocks are always accepted. [...]

Do we also allow the block size to exceed the 1 megabyte limit, by not counting the size of these transactions with above-average fees towards the total block size?

Good, a new idea. Nevertheless there must be a hard limit (say 10 MB) for DoS-protection reasons.
Peter Todd, Legendary
February 28, 2013, 05:39:44 PM
 #54

Do we also allow the block size to exceed the 1 megabyte limit, by not counting the size of these transactions with above-average fees towards the total block size?

...which means large miners can offer "direct-submission" contracts where they accept transactions with above-average fees, then refund the fees after they get them mined, minus the orphan rate and some small cut of profit. It'd be a great way for satoshidice to operate, for example, and you could submit the same transaction to multiple miners at once if the miners agree to sign their coinbases. (You'd have to audit that the orphans are real, but to do so the miner just has to keep track of every block they produce, orphaned or not.)

misterbigg, Legendary
February 28, 2013, 05:55:58 PM
 #55

...which means large miners can offer "direct-submission" contracts where they accept transactions with above average fees, then refund the fees after they get them mined, minus the orphan rate and some small cut of profit.

Damnit, I didn't think of that. You're devious!
phelix, Legendary
March 03, 2013, 05:54:31 PM
Last edit: March 17, 2013, 10:19:45 AM by phelix
 #56

Let me repost this, as I think it might be a way to balance a single miner's short-term interest against all miners' long-term interest in holding up transaction fees. It allows for something like a limited-extent miners' union.

1.) In each block miners vote for a minimum tx fee via the lowest tx fee included. This txFeeMin is averaged over 50 blocks or more (2016?).
2.) txs with fees down to 0.5 * txFeeMinAvg are valid.

The averaging could be done via a moving average (or even together with the difficulty changes).

Of course the factor does not have to be 0.5 - its choice is critical.
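
A sketch of the voting rule, reading 2.) as "a tx is valid if its fee is at least factor * txFeeMinAvg" (window and factor are the open parameters):

Code:
def tx_fee_min_avg(recent_blocks):
    """Each block votes with the lowest fee it included; average the
    votes over a trailing window (50 blocks, 2016, ...)."""
    votes = [min(fees) for fees in recent_blocks]
    return sum(votes) / len(votes)

def tx_valid(fee, fee_min_avg, factor=0.5):
    # A tx is valid if it pays at least `factor` times the voted minimum.
    return fee >= factor * fee_min_avg

blocks = [[10, 12, 30], [8, 15], [12, 12, 40]]  # made-up fee sets
avg = tx_fee_min_avg(blocks)                    # (10 + 8 + 12) / 3 = 10.0
print(tx_valid(5, avg), tx_valid(4, avg))       # True False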

