Bitcoin Forum
Author Topic: [XMR] Monero Improvement Technical Discussion  (Read 14743 times)
ArticMine
Legendary
Activity: 2282
Merit: 1050
Monero Core Team
July 16, 2016, 03:26:23 AM
Last edit: July 16, 2016, 03:50:06 AM by ArticMine
#141

...
I'm confused, though I think I'm putting it together. I see you're threading the needle of making the fee adaptive while giving the fee the ability to serve its intended purpose of preventing blocksize expansion. I think you were on to something before with using the difficulty as an on-chain surrogate of external value. I think the need for that factor will exist at any stage of the chain's life: during the initial distribution curve and during the tail emission. Your second scenario above seems focused on the tail-emission portion of the coin's existence, which doesn't happen for another ... however many years.




No, the second scenario is determined by the median blocksize becoming larger than 60,000 bytes, not by Monero reaching the tail emission. One can consider Bitcoin here as a special case where the penalty is infinite and the blocksize is fixed at 300,000 bytes. This was to a large degree the situation with Bitcoin until the spring of 2013.

In either case the top three fee tiers would be determined by the blocksize penalty formula. There is really not much choice here if these fee tiers are going to challenge the blocksize penalty formula. There is no reason, however, why the bottom two fee tiers could not be based on a difficulty ratio, as was my original idea, even after tail emission. The one proviso here would be to cap the lower two tiers at some percentages of the third tier. If the difficulty were to increase over time due to much more efficient hardware, this could create a spread in the fees over time that would make Monero transactions very affordable when there was no pressure on the blocksize. Spammers would still be blocked by the third and higher tiers.

Here is some research on Metcalfe's Law which to a large degree supports the case for not relying solely on the blocksize scaling / penalty formula for the lower two fee tiers, since the research indicates n log(n) rather than n as the rate of growth of the value of a network with n nodes. The extrapolation here is to replace the size of the network with the number of transactions in a given period of time. http://spectrum.ieee.org/computing/networks/metcalfes-law-is-wrong
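For a concrete sense of why n log(n) rather than n matters for fees, here is a quick illustration with my own hypothetical numbers (not from the research): under linear growth the value per transaction stays flat, while under n log(n) each transaction carries a log(n) factor that keeps growing with the network.

```python
import math

# Hypothetical illustration: value per transaction under two growth models.
# If network value ~ n, value per transaction is constant; if value ~ n*log(n),
# each transaction is worth a log(n) factor that grows as the network grows.
for n in (1_000, 100_000, 10_000_000):
    per_tx_linear = n / n                   # always 1
    per_tx_nlogn = (n * math.log(n)) / n    # equals log(n), grows with n
    print(f"n={n:>10,}  linear: {per_tx_linear:.2f}  n*log(n): {per_tx_nlogn:.2f}")
```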

Edit 1: There was an effective maximum blocksize in Bitcoin of around 256 KB until the spring of 2013.
Edit 2: A better distinction here is between those fee tiers at the lower end that are not challenging the penalty and those above that are, rather than between the two scenarios I indicated before.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
GingerAle (OP)
Legendary
Activity: 1260
Merit: 1008
July 16, 2016, 03:53:14 PM
#142

...

No, the second scenario is determined by the median blocksize becoming larger than 60,000 bytes, not Monero reaching the tail emission. ...

In either case the top three fee tiers would be determined by the blocksize penalty formula. ... There is no reason, however, why the bottom two fee tiers could not be based on a difficulty ratio as was my original idea even after tail emission. ...

I'm going to copy the formula you posted earlier to make sure I get it -

For simplicity I will define:
BlkSize = (1 + B) * M_N
BaseReward = R_base
Penalty (for a given B) = P_B
NewReward (for a given B) = R_B

I'm assuming that B is the current block? Is it the size of the block in bytes? KB? I think it's the (1 + B) that is throwing me off. Does that just mean "the next block"?

The penalty for a given B becomes:
P_B = R_base * B^2
While the new reward for a given B becomes:
R_B = R_base * (1 - B^2)
The first derivative of P_B with respect to B is
dP_B / dB = 2 * R_base * B

I apologize that I can't parse the equations. It's even worse as I type now with the bbcode or whatever formatting it is.
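For what it's worth, the definitions can also be written out directly. This is just a sketch with hypothetical numbers, taking B as the fractional increase of the new block over the median M_N, ranging from 0 to 1:

```python
def penalty(r_base: float, b: float) -> float:
    """P_B = R_base * B^2: reward forfeited for growing the block by B."""
    return r_base * b ** 2

def new_reward(r_base: float, b: float) -> float:
    """R_B = R_base * (1 - B^2): reward the miner actually receives."""
    return r_base * (1 - b ** 2)

def marginal_penalty(r_base: float, b: float) -> float:
    """dP_B / dB = 2 * R_base * B: how fast the penalty grows at B."""
    return 2 * r_base * b

# Hypothetical: base reward 10 XMR, block 10% over the median (B = 0.1).
print(penalty(10.0, 0.1))           # ~0.1 XMR forfeited
print(new_reward(10.0, 0.1))        # ~9.9 XMR kept
print(marginal_penalty(10.0, 0.1))  # penalty grows ~2 XMR per unit of B here
```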

Quote
Here is some research on Metcalfe's Law which to a large degree does support the case for not relying solely on the blocksize scaling / penalty formula for the lower two fee tiers, since the research indicates n log(n) rather than n as the rate of growth of the value of a network with n nodes. The extrapolation here is to replace the size of the network with the number of transactions in a given period of time. http://spectrum.ieee.org/computing/networks/metcalfes-law-is-wrong

So if I'm reading this right, you're using hashrate as an indicator of the size of the network? Or just using something other than the blocksize scaling / penalty formula for the first two tiers?

I apologize for being obtuse in my understanding.

The way that I understand things is that we need a way to match the internal cost (xmr) with the external value of xmr for adding data to the blockchain. With the multi-tiered system you propose, something is used to adjust the xmr cost for the first two tiers, and then tiers 3, 4, and 5 use a component of the blocksize penalty. I think what I'm not getting is that as the network becomes more valuable, the internal xmr cost has to go down to maintain the usability of the network. Now, if the change in the transaction fee is coupled to a transaction priority system and is dependent on the block penalty... wouldn't all those things imply that the transaction fee is increasing?

< Track your bitcoins! > < Track them again! > <<< [url=https://www.reddit.com/r/Bitcoin/comments/1qomqt/what_a_landmark_legal_case_from_mid1700s_scotland/] What is fungibility? >>> 46P88uZ4edEgsk7iKQUGu2FUDYcdHm2HtLFiGLp1inG4e4f9PTb4mbHWYWFZGYUeQidJ8hFym2WUmWc p34X8HHmFS2LXJkf <<< Free subdomains at moneroworld.com!! >>> <<< If you don't want to run your own node, point your wallet to node.moneroworld.com, and get connected to a random node! @@@@ FUCK ALL THE PROFITEERS! PROOF OF WORK OR ITS A SCAM !!! @@@@
ArticMine
Legendary
Activity: 2282
Merit: 1050
Monero Core Team
July 16, 2016, 06:28:27 PM
Last edit: July 16, 2016, 06:42:48 PM by ArticMine
#143

...

The way that I understand things is that we need a way to match the internal cost (xmr) with the external value of xmr for adding data to the blockchain. ... Now, if the change in the transaction fee is coupled to a transaction priority system and is dependent on the block penalty... wouldn't all those things imply that the transaction fee is increasing?

No apologies needed. You are not being obtuse; on the contrary, you are being very helpful. These are far from easy concepts to understand and explain. Furthermore, I am not aware of any other cryptocurrency that has even seriously looked at these issues, let alone proposed a solution or set of solutions, even though these are issues that every cryptocurrency faces.

You are of course correct that "we need a way to match the internal cost (xmr) with the external value of xmr for adding data to the blockchain".

Monero, however, imposes a significant linear correlation between the internal cost (xmr) and the external value of adding data to the blockchain by virtue of the penalty function for blocksize scaling. I say linear because, for a given base reward, the cost in xmr of adding a particular transaction of a given size in KB to a particular part of the penalty area falls linearly with the median of the blocksize. For example, if M_N is 10x larger the cost per transaction falls by a factor of 10, since there are 10x as many transactions paying for a given amount of penalty. In this example I am assuming the transactions are all of the same size for simplicity.

B is the relative increase in the blocksize over the median blocksize. It can range from 0 (no increase) to 1 (100% increase / doubling of the blocksize). The critical point here is that B attracts the same penalty in XMR for an increase in blocksize from say 1 MB to 1.1 MB as for say 1 GB to 1.1 GB, since in both cases B = 0.1. In the latter case there are 1024x more transactions to absorb the cost of the penalty, so the cost per transaction falls by a factor of 1024. Again, for simplicity, I am assuming the same distribution of transactions by size in the 1.1 MB and 1.1 GB blocks.

Now here is where it can get interesting. If the natural relationship between network value and network size is say M_N log(M_N) rather than M_N, then it is possible for the cost of a transaction, in real terms, in the penalty area to rise with log(M_N), at least for a period of time. This can happen if, for example, the market responds to this difference by optimizing transactions in order to minimize paying the penalty. This would occur because not all transactions have the same priority. It is for this reason that there can be very significant merit in using a different scaling formula (difficulty adjusted for block reward) for the low fee tiers than for the high fee tiers, where the fees are effectively set by the base reward and median blocksize.
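A quick numeric check of the linearity claim, with hypothetical numbers: the penalty for a given B is the same amount of XMR whatever the median is, so a 10x larger median spreads it over 10x as many (equal-sized) transactions.

```python
def penalty_per_tx(r_base, b, median_bytes, tx_bytes):
    """Share of the penalty P_B = R_base * B^2 borne by each transaction,
    assuming the extra b * median_bytes of block space is filled with
    equal-sized transactions (the simplifying assumption from the post)."""
    txs_in_penalty_area = (b * median_bytes) / tx_bytes
    return (r_base * b ** 2) / txs_in_penalty_area

# Same B = 0.1 and same 2 KB transactions; only the median differs.
small = penalty_per_tx(10.0, 0.1, 1_000_000, 2_000)    # 1 MB median
large = penalty_per_tx(10.0, 0.1, 10_000_000, 2_000)   # 10 MB median
print(small / large)  # the per-transaction cost falls by the same 10x factor
```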

smoothie
Legendary
Activity: 2492
Merit: 1474
LEALANA Bitcoin Grim Reaper
July 16, 2016, 07:51:09 PM
#144

One of the ideas I had which I'm not sure how much demand there will be for is a selectable list of fees & mixing for a tx.

It would calculate the tx fee based on different mixins and you could select the "best" tx fee for the best mixin.

I've noticed this is all determined on the fly, as outputs to mix with are selected in a "random" fashion using a triangular distribution (at least that's what I remember after reviewing the source).

But one thing I noticed is that the tx fee can actually be higher with a lower mixin than a higher mixin in some cases.

In terms of efficiency I can see people wanting to see a range of mixins and fees that go along with it when creating the tx and select the one they want to actually execute based on their determination of what is "efficient" for them.

This may not end up saving users much money on tx fees now but it could in the long term if they can see a range of mixins/fees to pick from.

Obviously this would make the most sense to incorporate into a GUI rather than the command line, as it would be that much more confusing to the average user.

. ★☆ WWW.LEALANA.COM        My PGP fingerprint is A764D833.                  History of Monero development Visualization ★☆ .
LEALANA BITCOIN GRIM REAPER SILVER COINS.
 
GingerAle (OP)
Legendary
Activity: 1260
Merit: 1008
July 17, 2016, 12:35:53 AM
#145

...

P_B = R_base * B^2
R_B = R_base * (1 - B^2)

...

B is the relative increase in the blocksize over the median blocksize. It can range from 0 (no increase) to 1 (100% increase / doubling of the blocksize). ...

...

It is for this reason that there can be very significant merit in using a different scaling formula (difficulty adjusted for block reward) for the low fee tiers than for the high fee tiers, where the fees are effectively set by the base reward and median blocksize.

Okay. So in the equations stated above, I think you're just laying out the existing penalty fee as it exists in Monero.

From an earlier post, you state

Quote
In this scenario per-KB fees are proportional to the base reward divided by the median of the blocksize over the last N blocks, R_base / M_N.

I think this is the only place I see a clear exposition of how per-KB fees will be calculated, and it is directly tied to M_N, so the blocksize penalty mechanics are sort of encapsulated by M_N. The base reward will eventually hit 0.6 XMR (at the current 2-minute blocktime), and at that point the median blocksize is the main driver.

Quote
I say linear because, for a given base reward, the cost in xmr of adding a particular transaction of a given size in KB to a particular part of the penalty area falls linearly with the median of the blocksize. For example, if M_N is 10x larger the cost per transaction falls by a factor of 10, since there are 10x as many transactions paying for a given amount of penalty. In this example I am assuming the transactions are all of the same size for simplicity.
With this I can see where your head is at: basically, we assume that if M_N has grown, the network has become more valuable due to the increase in activity. The increased external valuation has to be countered with an internal decrease in the xmr cost to add to the chain, though it must be done with the proper safeguards to prevent a bloat attack (or something). In your approach, this is done by directly scaling the blockchain fee with the penalty, which is directly coupled to M_N.

So, in practice (for humans using the network), we will have a statistic that tracks M_N in real time, and the software client will make per-KB fee suggestions based on M_N.

Meanwhile, the consensus protocol creating a given block will have calculated M_N for a given span and determined the optimal size of the block to create. If the optimal size is <= the previous block, any transaction meeting the current blockchain-fee threshold can be included in that block. If the optimal size is > the previous block, transactions with lower fees can be included in addition to those meeting the threshold.
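That inclusion rule can be sketched roughly like this (all names hypothetical; the mempool format, `optimal_size`, and the fee threshold are assumed inputs, and a real client would differ):

```python
def select_txs(mempool, optimal_size, prev_size, fee_per_kb_threshold):
    """If the target block is no larger than the previous one, admit only
    transactions meeting the per-KB fee threshold; if it is larger, allow
    lower-fee transactions to fill the extra space as well."""
    allow_low_fee = optimal_size > prev_size
    block, used = [], 0
    # Consider higher fee-per-byte transactions first (a common heuristic).
    for tx in sorted(mempool, key=lambda t: t["fee"] / t["size"], reverse=True):
        if used + tx["size"] > optimal_size:
            continue
        fee_per_kb = 1000 * tx["fee"] / tx["size"]
        if fee_per_kb >= fee_per_kb_threshold or allow_low_fee:
            block.append(tx)
            used += tx["size"]
    return block

# Hypothetical mempool: one threshold-paying tx, one low-fee tx (2 KB each).
pool = [{"fee": 0.01, "size": 2000}, {"fee": 0.0001, "size": 2000}]
same_size = select_txs(pool, optimal_size=4000, prev_size=4000, fee_per_kb_threshold=0.002)
grown = select_txs(pool, optimal_size=4000, prev_size=2000, fee_per_kb_threshold=0.002)
print(len(same_size), len(grown))  # the low-fee tx only gets in when the block grows
```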

My brain hurts just imagining how the optimal block size calculations will look.

I wonder if we should pull the training wheels off with the 60 KB thing and just let the protocol do its thing; then we could see how the blocksize penalty actually affects things. Though that would force miners to not mine any transactions until the mempool is stuffed. Hrm...

ArticMine
Legendary
Activity: 2282
Merit: 1050
Monero Core Team
July 17, 2016, 02:26:20 AM
#146

...

I wonder if we should pull the training wheels with the 60 kb thing and just let the protocol do its thing, then we could see how the blocksize penalty actually affects things. Though that would force miners to not mine any transactions until the mempool is stuffed. Hrm...

Setting the fees in the penalty area is actually the easy part.

It is coming up with an optimal, or at least close to optimal, algorithm for determining the transactions to include in a block from the miner's point of view that I am still wrapping my head around. I do have some ideas at this point but nothing concrete yet. By the way, we have already had blocks that triggered the penalty, going all the way back to 2014. Furthermore, there was an attack in 2014 that produced a fair number of blocks in the penalty area, at a time when M0 was 20,000 bytes. This was before the recent fork to 2 min blocks. So in effect we already have experience with the training wheels taken off. While the penalty works as designed, the economics are very far from optimal, with only one fixed fee tier sitting approximately at the low end of the penalty range and basically little if any optimization on the miner side.
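One naive version of that miner-side problem might look like the following. This is my own greedy sketch, not ArticMine's algorithm: take transactions in fee-per-byte order and grow the block only while each transaction's fee covers the extra penalty it triggers, with P_B = R_base * B^2 and B capped at 1.

```python
def greedy_fill(mempool, median, r_base):
    """Greedy sketch: add each transaction only if its fee covers the
    marginal penalty its bytes would cause."""
    def penalty(size):
        b = max(0.0, (size - median) / median)
        return r_base * b ** 2

    size, chosen, profit = median, [], 0.0  # assume the first `median` bytes are penalty-free
    for tx in sorted(mempool, key=lambda t: t["fee"] / t["size"], reverse=True):
        new_size = size + tx["size"]
        if new_size > 2 * median:            # hard cap: B <= 1
            continue
        extra = penalty(new_size) - penalty(size)
        if tx["fee"] >= extra:               # only profitable additions
            chosen.append(tx)
            size, profit = new_size, profit + tx["fee"] - extra
    return chosen, profit

# Hypothetical numbers: 100 KB median, 10 XMR base reward.
pool = [{"fee": 1.0, "size": 10_000}, {"fee": 0.0001, "size": 50_000}]
chosen, profit = greedy_fill(pool, median=100_000, r_base=10.0)
print(len(chosen))  # the dust transaction doesn't pay for its penalty
```

Greedy selection is not guaranteed optimal here (the quadratic penalty makes this a knapsack-like problem), which is presumably why the optimization is worth thinking hard about.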

GingerAle (OP)
Legendary
Activity: 1260
Merit: 1008
July 17, 2016, 02:51:34 AM
#147

...

Setting the fees in the penalty area is actually the easy part.

It is coming up with an optimal or at least close to optimal algorithm for determining the optimal transactions to include in a block from the miner's point of view that I am still wrapping my head around. ...

Re: bolded section - I would argue that we shouldn't invest too much into optimizing this. This is an algorithm that the market will figure out. Pool ops will claim they have a better block maker than other pool ops, etc. etc.

Indeed I know we've seen blocks trigger the penalty, but we haven't seen a movement out of the median.

Ultimately I think I like what you're putting down, but for it to have any effect on things now we'd either have to remove the 60 kB pseudo-window thing or come up with something different for the current era of pure speculation without significant financial activity on the chain.

< Track your bitcoins! > < Track them again! > <<< [url=https://www.reddit.com/r/Bitcoin/comments/1qomqt/what_a_landmark_legal_case_from_mid1700s_scotland/] What is fungibility? >>> 46P88uZ4edEgsk7iKQUGu2FUDYcdHm2HtLFiGLp1inG4e4f9PTb4mbHWYWFZGYUeQidJ8hFym2WUmWc p34X8HHmFS2LXJkf <<< Free subdomains at moneroworld.com!! >>> <<< If you don't want to run your own node, point your wallet to node.moneroworld.com, and get connected to a random node! @@@@ FUCK ALL THE PROFITEERS! PROOF OF WORK OR ITS A SCAM !!! @@@@
smooth
Legendary
*
Offline Offline

Activity: 2968
Merit: 1198



View Profile
July 17, 2016, 03:17:07 AM
 #148

Indeed I know we've seen blocks trigger the penalty, but we haven't seen a movement out of the median.

We have on multiple occasions. Not with the sort of smarter algorithms that ArticMine discusses though.

I'm not sure I agree we can ignore miner optimization. Such behavior may affect the overall incentives. Transaction creators and miners are in a sort of conversation, with fees as the language. You probably can't understand one side of this conversation, or the conclusion of it, without considering the other.
GingerAle (OP)
Legendary
*
Offline Offline

Activity: 1260
Merit: 1008


View Profile WWW
July 17, 2016, 03:51:11 AM
Last edit: July 17, 2016, 04:24:40 AM by GingerAle
 #149

Indeed I know we've seen blocks trigger the penalty, but we haven't seen a movement out of the median.

We have on multiple occasions. Not with the sort of smarter algorithms that ArticMine discusses though.

I'm not sure I agree we can ignore miner optimization. Such behavior may affect the overall incentives. Transaction creators and miners are in a sort of conversation, with fees as the language. You probably can't understand one side of this conversation, or the conclusion of it, without considering the other.


miner optimization as in miners creating better algos to include transactions in blocks to reap most reward?

So this gives us some choices - we either think of the most optimal block inclusion algo based on the parameters set in place by the auto-fee adjuster (can we give this thing a name?), or we make the auto-fee adjuster parameters such that gaming them doesn't totally bork the system.

edited to add - so its engineering perfection vs. fault tolerance.

smooth
Legendary
*
Offline Offline

Activity: 2968
Merit: 1198



View Profile
July 17, 2016, 05:52:40 AM
 #150

Indeed I know we've seen blocks trigger the penalty, but we haven't seen a movement out of the median.

We have on multiple occasions. Not with the sort of smarter algorithms that ArticMine discusses though.

I'm not sure I agree we can ignore miner optimization. Such behavior may affect the overall incentives. Transaction creators and miners are in a sort of conversation, with fees as the language. You probably can't understand one side of this conversation, or the conclusion of it, without considering the other.


miner optimization as in miners creating better algos to include transactions in blocks to reap most reward?

So this gives us some choices - we either think of the most optimal block inclusion algo based on the parameters set in place by the auto-fee adjuster (can we give this thing a name?), or we make the auto-fee adjuster parameters such that gaming them doesn't totally bork the system.

In either case, you still have to consider what strategies might be used.
GingerAle (OP)
Legendary
*
Offline Offline

Activity: 1260
Merit: 1008


View Profile WWW
July 18, 2016, 02:27:28 PM
 #151

I dunno why this is in my head. I think because I recently saw xthin blocks mentioned somewhere.

But I'm curious if another level of thinness, and speed, could be achieved in a blockchain consensus network.

As with the thin block approaches, an index of transactions is used - a hash of each transaction. However, instead of a daemon sending out this index and then forcing a downstream daemon to request the transactions to flesh out the block, why not just mine in the hashes? Consensus can be reached in a fraction of the time, and the actual transaction data can be filled in later. There's a block window where monero transactions have to mature anyway, so we have that window to fill in the data.

node 1 - finds block, sends out meta-block of block header and transaction hashes.
node 2 - accepts block, starts solving next block. While solving, the rest of the block data is filled in and validated. If the filled-in data is not valid, the block is rejected and node 2 waits to receive another block candidate or reverts to finding its own.

Obvious attack vectors are to spam with hashes that don't have corresponding transactions. We would also need to come up with some way to prevent things from being filled in when they shouldn't be. E.g., the meta-block contains a hash of something that isn't in anyone's mempool at the time of block forming (a malicious miner just put it in there). The meta-block gets added, nodes then request data to fill that index, and the malicious miner then transmits the matching transaction...

I guess my point is there exists a certain network buffer due to possibility of re-organizations, and we can use this buffer to increase the speed of consensus state independent of the transfer of data.
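The fill-in flow could look something like this. A minimal sketch with hypothetical names (MetaBlock, fill_in, and tx_hash are my own inventions for illustration, not actual protocol messages or Monero code):

```python
import hashlib
from dataclasses import dataclass, field

def tx_hash(raw_tx: bytes) -> bytes:
    # Stand-in for the real transaction hash (Monero uses Keccak-256).
    return hashlib.sha3_256(raw_tx).digest()

@dataclass
class MetaBlock:
    """Header plus transaction hashes only -- no transaction bodies."""
    header: bytes
    tx_hashes: list = field(default_factory=list)

def fill_in(meta: MetaBlock, mempool: dict):
    """Resolve a meta-block against the local mempool.

    Returns (resolved, missing): resolved maps hash -> raw tx for the
    hashes we already have; missing lists the hashes we must request
    from peers before the block can be fully validated."""
    resolved, missing = {}, []
    for h in meta.tx_hashes:
        if h in mempool:
            resolved[h] = mempool[h]
        else:
            missing.append(h)
    return resolved, missing
```

The point being: a node can start mining on the meta-block immediately and reject it later if any hash in `missing` is never answered within the maturity window.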

hashes are so cool.

it seems that the only thing, block protocol wise, that would need to change is how the block ID is hashed. Instead of the transactions themselves, the Merkle root is just computed from the transaction hashes.

5. Calculation of Block Identifier

   The identifier of a block is the result of hashing the following data
   with Keccak:

      - size of [block_header, Merkle root hash, and the number of
        transactions] in bytes (varint)

      - block_header,

      - Merkle root hash,

      - number of transactions (varint).

   The goal of the Merkle root hash is to "attach" the transactions
   referred to in the list to the block header: once the Merkle root
   hash is fixed, the transactions cannot be modified.


5.1  Merkle Root Hash Calculation

   Merkle root hash is computed from the list of transactions as
   follows: let tx[i] be the i-th transaction in the block, where 0 <= i
   <= n-1 (n is the number of transactions) and tx[0] is the base
   transaction. Let m be the largest power of two, less than or equal to
   n. Define the array h as follows:

      h[i] = H(h[2*i] || h[2*i+1])
        where 1 <= i <= m-1 or 3*m-n <= i <= 2*m-1,
      h[i] = H(tx[i-m])
        where m <= i <= 3*m-n-1,
      h[i] = H(tx[i-4*m+n])
        where 6*m-2*n <= i <= 4*m-1.

from: https://cryptonote.org/cns/cns003.txt
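For anyone who wants to play with this, the CNS003 index scheme boils down to a short routine. Here's a sketch operating on transaction hashes; note that sha3_256 is only a stand-in for Monero's legacy Keccak-256 (the padding differs), so outputs won't match real block IDs, but the tree shape is the CNS003 one:

```python
import hashlib

def H(data: bytes) -> bytes:
    # Stand-in for Keccak: Monero uses legacy Keccak-256, whose padding
    # differs from NIST SHA3-256; this sketch only shows the tree shape.
    return hashlib.sha3_256(data).digest()

def tree_hash(tx_hashes):
    """Merkle root over a list of transaction hashes, following the
    index scheme of CNS003 section 5.1: with m the largest power of two
    <= n, the first 2m-n hashes pass through unchanged and the last
    2(n-m) are hashed in pairs, giving a perfect tree of m leaves."""
    n = len(tx_hashes)
    m = 1
    while m * 2 <= n:
        m *= 2
    row = list(tx_hashes[: 2 * m - n])            # pass-through leaves
    for i in range(2 * m - n, n, 2):              # paired leaves
        row.append(H(tx_hashes[i] + tx_hashes[i + 1]))
    while len(row) > 1:                           # reduce pairwise to root
        row = [H(row[j] + row[j + 1]) for j in range(0, len(row), 2)]
    return row[0]
```

With one transaction the root is the transaction hash itself; uneven counts get padded into a perfect tree by pairing the tail, which is exactly what the 3*m-n / 6*m-2*n index juggling above expresses.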

GingerAle (OP)
Legendary
*
Offline Offline

Activity: 1260
Merit: 1008


View Profile WWW
July 21, 2016, 02:08:47 PM
 #152

i can't help it, i drink coffee.

Peerfriends Metanode Clusters-

what: self-organizing node clusters that share blockchain and transaction data (somehow).

how - the daemon monitors peers, finds ones that send good data and have good connectivity.

The clusters will self-organize. Individual nodes maintain block-header chains and request data from peerfriends if needed. Small cluster sizes (relative to the network, perhaps) will ensure that blockchain data doesn't "go missing". Peerfriends share information about their peer connections, so there is no overlap: each member of the cluster is less likely to receive duplicate information, because members receive information from different parts of the network. This lightens the load on individual nodes by reducing the number of transactions coming in per node, and may also reduce the cost of relaying larger blocks.
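The "no overlap" selection could be something as simple as a greedy set-cover pass. A sketch with hypothetical names (pick_peerfriends is my own; real peer lists would come from the daemon's peer store):

```python
def pick_peerfriends(my_peers, candidates, k=3):
    """Greedily pick up to k peerfriends whose own peer lists overlap
    least with the peers we (and already-chosen friends) cover, so each
    cluster member hears from a different part of the network.

    candidates: dict mapping node id -> set of that node's peers."""
    covered = set(my_peers)
    friends = []
    for _ in range(k):
        best, best_new = None, -1
        for node, peers in candidates.items():
            if node in friends:
                continue
            new = len(peers - covered)    # peers only this candidate adds
            if new > best_new:
                best, best_new = node, new
        if best is None:
            break
        friends.append(best)
        covered |= candidates[best]
    return friends
```

A candidate whose peers we already cover adds nothing, so it only gets picked once the more "distant" candidates are exhausted.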



btw, this website is awesome: https://sketch.io/sketchpad/

GingerAle (OP)
Legendary
*
Offline Offline

Activity: 1260
Merit: 1008


View Profile WWW
July 25, 2016, 11:56:51 PM
Last edit: July 26, 2016, 12:40:47 PM by GingerAle
 #153


For simplicity I will define:
BlkSize = (1 + B) * M_N
BaseReward = R_base
Penalty (for a given B) = P_B
NewReward (for a given B) = R_B

Where M_N = the median size of the last N blocks.

The penalty for a given B becomes:
P_B = R_base * B^2
While the new reward for a given B becomes:
R_B = R_base * (1 - B^2)
The first derivative of P_B with respect to B is
dP_B / dB = 2 * R_base * B

Where B is the relative increase in the block size over the median blocksize

-------

@arcticmine, would you say all the above equations are your current state of thinking on the matter? If so, I might start toying around with some simulations / scenarios for determining optimum block inclusion algos. I might stick with bash, but it could end up in R. Either way, it'll be on github. I just wanna be sure I got the right formulas before I dive into it. I figure I'll use a toy data set first, and then try it against a real data set (I think one of us linked to a litecoin dataset somewhere, and bitcoin should be obtainable).
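If it helps the simulation effort, the formulas above translate directly into code. A toy sketch (my own naming, assuming the protocol's 2x-median cap so that 0 <= B <= 1):

```python
def penalty(B, R_base):
    """P_B = R_base * B^2, for 0 <= B <= 1 (block at most 2x the median)."""
    assert 0.0 <= B <= 1.0
    return R_base * B * B

def new_reward(B, R_base):
    """R_B = R_base * (1 - B^2)."""
    return R_base - penalty(B, R_base)

def marginal_penalty(B, R_base):
    """dP_B/dB = 2 * R_base * B -- the rate a marginal size increase
    must be paid for in fees to leave the miner no worse off."""
    return 2.0 * R_base * B
```

E.g. at B = 0.5 with R_base = 10, the penalty is 2.5 and the reward drops to 7.5, while the marginal penalty is already equal to the full base reward per unit of B.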

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
July 26, 2016, 01:54:00 AM
 #154

Yes. The above is my current understanding.

This is actually a discrete optimization problem.

For a small number of transactions one can test all the possibilities to find the optimum; however this may become computationally expensive to the miner if the number of transactions to include becomes very large.

For a very large number of transactions, with each individual transaction size very small when compared to the blocksize, a simple solution would be to add transactions in order of per KB fees, while testing each addition for an increase in revenue (total fees - penalty). When adding transactions causes a decrease in revenue, stop adding transactions.

It may be simplest to choose between one of the two cases above depending on the number of transactions and their individual size relative to the block size.
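The large-n heuristic described above is easy to sketch. A toy version (my own naming, assuming the penalty P_B = R_base * B^2 from earlier in the thread, with B measured against the median M_N, zero at or below it, and hard-capped at B = 1):

```python
def penalty(block_size, median, base_reward):
    """Block reward penalty: base_reward * B^2, where B is the relative
    growth over the median (zero at or below the median)."""
    B = max(0.0, block_size / median - 1.0)
    return base_reward * B * B

def greedy_fill(txs, median, base_reward):
    """txs: list of (fee, size) pairs. Add transactions in descending
    fee-per-byte order, stopping once an addition would decrease miner
    revenue (total fees - penalty), per ArticMine's stopping rule."""
    txs = sorted(txs, key=lambda t: t[0] / t[1], reverse=True)
    size, fees, chosen = 0, 0.0, []
    for fee, tx_size in txs:
        new_size = size + tx_size
        if new_size > 2 * median:          # hard cap: B <= 1
            continue
        old_rev = fees - penalty(size, median, base_reward)
        new_rev = fees + fee - penalty(new_size, median, base_reward)
        if new_rev < old_rev:
            break                          # marginal revenue turned negative
        size, fees = new_size, fees + fee
        chosen.append((fee, tx_size))
    return chosen, fees - penalty(size, median, base_reward)
```

With transactions small relative to the block, the first rejected transaction is a good place to stop, since everything after it pays even less per byte; for a handful of large transactions this greedy pass can miss the optimum, which is the exhaustive-search case above.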

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
GingerAle (OP)
Legendary
*
Offline Offline

Activity: 1260
Merit: 1008


View Profile WWW
August 03, 2016, 04:09:13 PM
 #155

arcticmine, sorry I haven't followed up on the adaptive blocksize algo. Par for the course with my manic tendencies. Ah well.

I've been following the bitfinex hack (as I'm sure we all have), and came across the Vault Address idea: http://fc16.ifca.ai/bitcoin/papers/MES16.pdf

I find it interesting and think it would be something that would add a lot of value to Monero. Based on my limited knowledge of both Monero and this "covenant" concept, it seems like Monero could pull it off, but I'm not sure. The more basic text on the concept comes from here: http://hackingdistributed.com/2016/02/26/how-to-implement-secure-bitcoin-vaults/


Quote
Operationally, the idea is simple. You send your money to a vault address that you yourself create. Every vault address has a vault key and a recovery key. When spending money from the vault address with the corresponding vault key, you must wait for a predefined amount of time (called the unvaulting period) that you established at the time you created the vault -- say, 24 hours. When all goes well, your vault funds are unlocked after the unvaulting period and you can move them to a standard address and subsequently spend them in the usual way. Now, in case Harry the Hacker gets a hold of your vault key, you have 24 hours to revert any transaction issued by Harry, using the recovery key. His theft, essentially, gets undone, and the funds are diverted unilaterally to their rightful owner. It’s like an “undo” facility that the modern banking world relies on, but for Bitcoin.

Now, the astute reader will ask what happens when Harry is really really good, and he lies in wait to steal not just your vault key, but also your recovery key. That is, he has thoroughly pwnd you and, as far as the network is concerned, is indistinguishable from you. Vaults protect you even in this case. The recovery keys have a similar lock period, allowing you to perpetually revert every transaction Harry makes. Unfortunately, at this point, Harry can do the same and revert every transaction you make. To avoid a perpetual standoff, the recovery keys can also burn the funds, so no one gets the money. The upshot is that Harry is not going to be able to collect a dime of proceeds from his theft. And this, in turn, means that Harry is unlikely to target vaults in the first place, because there is no positive outcome where he gets to keep the proceeds.

The guts of the paper go into detail about the bitcoin scripting that is necessary... and because Monero uses less scripting, it might not work? I dunno.

The important thing, though, is that this type of blockchain technology improvement requires a hardfork, which we're all fine with in Monero.
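The vault life cycle from the quoted description can be sketched as a tiny state machine. All names here are my own; the actual paper implements this with Bitcoin script covenants, not wallet-side logic:

```python
from dataclasses import dataclass

UNVAULT_PERIOD = 24  # hours; fixed when the vault is created

@dataclass
class VaultSpend:
    amount: float
    age_hours: float = 0.0
    state: str = "unvaulting"   # unvaulting -> final | reverted | burned

    def tick(self, hours: float):
        self.age_hours += hours
        if self.state == "unvaulting" and self.age_hours >= UNVAULT_PERIOD:
            self.state = "final"          # window passed: funds unlocked

    def recover(self):
        """Recovery key claws the spend back, but only inside the window."""
        if self.state == "unvaulting":
            self.state = "reverted"

    def burn(self):
        """Last resort against an attacker who also holds the recovery
        key: destroy the funds so the thief gains nothing."""
        if self.state in ("unvaulting", "reverted"):
            self.state = "burned"
```

The deterrence argument is entirely in that last method: because the owner can always reach "burned", the attacker's best case is zero proceeds, so targeting vaults is pointless.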


ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
August 03, 2016, 11:14:03 PM
Last edit: August 04, 2016, 02:25:57 AM by ArticMine
 #156

arcticmine, sorry I haven't followed up on the adaptive blocksize algo. Par for the course with my manic tendencies. Ah well.

I've been following the bitfinex hack (as I'm sure we all have), and came across the Vault Address idea: http://fc16.ifca.ai/bitcoin/papers/MES16.pdf

I find it interesting and think it would be something that would add a lot of value to Monero. Based on my limited knowledge of both Monero and this "covenant" concept, it seems like Monero could pull it off, but I'm not sure. The more basic text on the concept comes from here: http://hackingdistributed.com/2016/02/26/how-to-implement-secure-bitcoin-vaults/


Quote
Operationally, the idea is simple. You send your money to a vault address that you yourself create. Every vault address has a vault key and a recovery key. When spending money from the vault address with the corresponding vault key, you must wait for a predefined amount of time (called the unvaulting period) that you established at the time you created the vault -- say, 24 hours. When all goes well, your vault funds are unlocked after the unvaulting period and you can move them to a standard address and subsequently spend them in the usual way. Now, in case Harry the Hacker gets a hold of your vault key, you have 24 hours to revert any transaction issued by Harry, using the recovery key. His theft, essentially, gets undone, and the funds are diverted unilaterally to their rightful owner. It’s like an “undo” facility that the modern banking world relies on, but for Bitcoin.

Now, the astute reader will ask what happens when Harry is really really good, and he lies in wait to steal not just your vault key, but also your recovery key. That is, he has thoroughly pwnd you and, as far as the network is concerned, is indistinguishable from you. Vaults protect you even in this case. The recovery keys have a similar lock period, allowing you to perpetually revert every transaction Harry makes. Unfortunately, at this point, Harry can do the same and revert every transaction you make. To avoid a perpetual standoff, the recovery keys can also burn the funds, so no one gets the money. The upshot is that Harry is not going to be able to collect a dime of proceeds from his theft. And this, in turn, means that Harry is unlikely to target vaults in the first place, because there is no positive outcome where he gets to keep the proceeds.

The guts of the paper go into detail about the bitcoin scripting that is necessary... and because Monero uses less scripting, it might not work? I dunno.

The important thing, though, is that this type of blockchain technology improvement requires a hardfork, which we're all fine with in Monero.


My first comment is that reversibility is no remedy for poor security. This applies equally to both "banks" such as exchanges and individuals. The reality, when it comes to security, is that crypto currency by design turns the clock back over 50 years, if not 100. So the security mindset of the 1960's or earlier is actually appropriate here; even better, the security mindset of the 1890's. If all the bank's gold was stolen the bank went bust and the depositors lost all of their deposits. End of story. Monero, after all, is by design a form of digital gold.

Many people today would be shocked to find out that as recently as 50 years ago, most fiat transactions were not reversible. Cash and bearer instruments were the norm. Those transactions that were reversible were only reversible for a very limited period of time. Cheques, for example, could only be reversed while they were in transit. Once the post office delivered the cheque and it was cashed, it was no longer reversible. Reversibility of transactions on a large scale started with the use of credit cards for card-not-present transactions, first by telephone, fax and mail and then over the Internet. When a credit card is used over the Internet or the phone, for example, there is no way to legally bind the payer to the transaction. Even with fax or mail there is no way for the merchant to verify that the signature provided is valid, since they do not see the card. This callous lack of security in the banking system has also been extended to many other transactions. For example, in Canada, if one deposits a cheque in an ATM the banks do not verify that the account to which the cheque was deposited is actually that of the named payee on the cheque.

If the above is not enough, this is all compounded by the design of the most popular proprietary operating systems. Security of the end user, especially for "consumer" products, is not the primary design goal. The primary design goals are instead the deterrence of copyright infringement (DRM) and the collection of information from the end user for commercial exploitation. The Microsoft Windows registry was designed primarily to make it very difficult to copy an installed application from one computer to another, while the primary design goal of the security in iOS is to prevent the copying of installed apps from one phone to another. The result is that most people, with good reason, do not trust their computers, and instead in many cases keep their crypto currencies in centralized exchanges. This to a large degree defeats the whole point of Bitcoin, and to a much larger degree of Monero.

For individuals, basic security starts with using a FLOSS OS such as GNU/Linux or FreeBSD. One has of course to practice sound computer security and "harden" the system. There are already hardened distributions such as Qubes OS https://www.qubes-os.org/, but even a mainstream GNU/Linux distribution such as Ubuntu, combined with some simple common-sense security practices, is orders of magnitude ahead of what Microsoft and Apple sell to consumers.

For "banks" I would start with finding a retired banker over the age of 80, preferably over the age of 90, and getting some training on the security practices that were in place in the banking system say 50-60 years ago. I mean, let's start here with some of the basics, such as multiple signatures for amounts over a certain threshold. I mean say 10,000 USD rather than 50,000,000 USD. I mean a hot wallet with over 100,000 XBT in it? Seriously. The regulators also have a role to play here in setting standards for security and internal controls.

After my lengthy security rant, I will address the articles. The concept here is to create a time-locked transaction that can be reversed by the sender for a fixed period of time. There is some merit to this. It would require multi-signature transactions that are time locked. My instinct is that this could be done in Bitcoin with what the Lightning Network is proposing. Multi signatures will be possible in Monero after RingCT is implemented. I am not very clear on how time-locked signatures could be implemented in Monero, so someone more expert on this than myself may be able to comment. In the meantime Monero "banks" will have to focus on preventing the "gold" from being stolen in the first place, just as in the 19th century.

aminorex
Legendary
*
Offline Offline

Activity: 1596
Merit: 1030


Sine secretum non libertas


View Profile
September 02, 2016, 05:21:21 PM
 #157

Post-quantum motivation: https://www.technologyreview.com/s/602283/googles-quantum-dream-may-be-just-around-the-corner/

TL;DR: End of 2017, a 50-qubit machine is possible, perhaps even likely.

Give a man a fish and he eats for a day.  Give a man a Poisson distribution and he eats at random times independent of one another, at a constant known rate.