Bitcoin Forum
Author Topic: What will keep transaction fees up?  (Read 13446 times)
caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
November 20, 2010, 10:55:58 PM
 #21

It's not a tragedy of the commons because generators will have interest in seeing the network continue to function.
Just like the herders have an interest in having the common pasture continue to grow grass?


No, there will be institutions with a high value to protect.  Costs of protection are a 'tax' of sorts, but they are unavoidable.  They are not a tragedy of the commons scenario, each is still looking out for his own interests, and his interests benefit others.  It's a positive externality.

I think db has a point. If the block reward isn't profitable, that does look like a tragedy of the commons. It's true that if it happens the least efficient miners will give up, which will decrease the difficulty for the most efficient... but a lower difficulty is bad for the network...

18rZYyWcafwD86xvLrfuxWG5xEMMWUtVkL
asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 20, 2010, 11:05:47 PM
 #22

The maximum block size must be continuously adjusted to keep the transaction prices stable. The only way to change the maximum block size is through a lengthy political process of debate, decree, network fragmentation and majority agreement.

This is a bad idea. The generators will collude to keep the block size small and transactions scarce, gouging the market.
MoonShadow
Legendary
*
Offline Offline

Activity: 1666



View Profile
November 20, 2010, 11:18:45 PM
 #23

It's not a tragedy of the commons because generators will have interest in seeing the network continue to function.
Just like the herders have an interest in having the common pasture continue to grow grass?


No, there will be institutions with a high value to protect.  Costs of protection are a 'tax' of sorts, but they are unavoidable.  They are not a tragedy of the commons scenario, each is still looking out for his own interests, and his interests benefit others.  It's a positive externality.

I think db has a point. If the block reward isn't profitable, that does look like a tragedy of the commons. It's true that if it happens the least efficient miners will give up, which will decrease the difficulty for the most efficient... but a lower difficulty is bad for the network...

Which is why major institutions will still be willing to contribute clock-cycles at or just below a break-even point.  Because there are more forms of economic motivation than just profit.  I'm really surprised that so many who seem so well educated on economic issues can't wrap their head around this simple concept.  If you have something valuable to protect, have you ever paid the rental fee on a safety deposit box?  The cost of the box rental is tiny compared to the value of the object within, but that's not a tragedy of the commons!  People do it all the time!  It's a cost of security, not a resource access issue!  The tragedy of the commons parable is a limited resource issue!

"The powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent meetings and conferences. The apex of the systems was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations. Each central bank...sought to dominate its government by its ability to control Treasury loans, to manipulate foreign exchanges, to influence the level of economic activity in the country, and to influence cooperative politicians by subsequent economic rewards in the business world."

- Carroll Quigley, CFR member, mentor to Bill Clinton, from 'Tragedy And Hope'
MoonShadow
Legendary
*
Offline Offline

Activity: 1666



View Profile
November 20, 2010, 11:24:38 PM
 #24

The maximum block size must be continuously adjusted to keep the transaction prices stable. The only way to change the maximum block size is through a lengthy political process of debate, decree, network fragmentation and majority agreement.

This is a bad idea. The generators will collude to keep the block size small and transactions scarce, gouging the market.

Some may try.  Keep in mind that generators have no sustainable monopolies on generation, not even as a group.  If the major generators collude to keep block sizes small amongst themselves; say by keeping their own max block sizes at 1 meg, but the regular users' clients all have a max block size limit of 3 megs, then the rising backlog of lower fee transactions will attract new players into generation.  Maybe forcing the colluding generators to change, maybe not, but a natural price balance will be maintained.  Perhaps the occasional blockchain split fight is necessary.

RHorning
Full Member
***
Offline Offline

Activity: 210


View Profile
November 21, 2010, 12:08:04 AM
 #25


If there is a situation where there is a huge pile of transactions backing up, perhaps the ones with fees could be moved to the front of the pack, so to say, but the rest would generally be cleared in subsequent blocks.


Such a priority ranking system is in the current release client.



I presume this has been used in the test network, but are there any blocks where that has been necessary in the regular client?  I'm not asking somebody to deliberately force enough messages to be processed to do that, but more as a general question, is it something which has been done?

Yes, I realize that it is in the current release.

1FLK3uUT3Vup5JtkGJVXKHAoS3AZWPcKdv
asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 12:08:47 AM
 #26

I just thought of something. The time it takes to generate a hash is proportional to the number of transactions you're hashing, right? So, it'll take twice as long (on average) to generate a block with 1000 transactions as one with 500. You're not going to waste precious hashing time on small fee transactions, they'll just decrease your hash/s for negligible gain.

Say, for example, there are 999 transactions with a 0.1 BTC fee to process and one transaction with a 1 BTC fee. It's more profitable to just process the one transaction and ignore the rest, because your hash/s will be 1000 times faster!

Now if you have 10 transactions with a 1 BTC fee, you're best off processing all 10 of them, because there's a small overhead in processing a hash. But including an 11th transaction for a 0.9 BTC fee probably won't be worth it. A 0.99 BTC fee, maybe. It depends on the size of the overhead.

This implies that generators will only process the transactions with the highest fee! I'm still thinking through the implications of this. Ideas anyone?

caveden
Legendary
*
Offline Offline

Activity: 1106



View Profile
November 21, 2010, 12:12:14 AM
 #27

The time it takes to generate a hash is proportional to the number of transactions you're hashing, right?

No, not really. It's related to the difficulty factor only.

asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 12:14:55 AM
 #28

The time it takes to generate a hash is proportional to the number of transactions you're hashing, right?

No, not really. It's related to the difficulty factor only.

Are you sure? Given any difficulty, you still have to crunch the numbers to solve a block. The more transactions, the more numbers to crunch, thus the longer it takes to compute a given hash.
MoonShadow
Legendary
*
Offline Offline

Activity: 1666



View Profile
November 21, 2010, 12:19:13 AM
 #29

I just thought of something. The time it takes to generate a hash is proportional to the number of transactions you're hashing, right? So, it'll take twice as long (on average) to generate a block with 1000 transactions as one with 500. You're not going to waste precious hashing time on small fee transactions, they'll just decrease your hash/s for negligible gain.

I wouldn't assume that it's as straightforward as that, and you are probably overthinking it anyway.  Feel free to try it, though.

theymos
Administrator
Legendary
*
Offline Offline

Activity: 2492


View Profile
November 21, 2010, 12:24:34 AM
 #30

Are you sure? Given any difficulty, you still have to crunch the numbers to solve a block. The more transactions, the more numbers to crunch, thus the longer it takes to compute a given hash.

Block hashes are only hashes of the fixed-size 80 byte block header, which contains a hash of the transactions. Transactions only have a small one-time CPU cost for adding.
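A minimal sketch of what theymos describes, using Python's hashlib; the field values below are placeholders, not real block data:

```python
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's block hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# An 80-byte block header: version, previous block hash, Merkle root,
# timestamp, difficulty bits, nonce. The values here are illustrative.
header = struct.pack(
    "<I32s32sIII",
    1,                  # version
    b"\x00" * 32,       # hash of the previous block
    b"\x11" * 32,       # Merkle root (summarizes ALL transactions)
    1290297874,         # timestamp
    0x1b04864c,         # difficulty bits
    0,                  # nonce
)

assert len(header) == 80
# However many transactions the block contains, the generator only ever
# hashes these 80 bytes; the transactions are folded into the 32-byte
# Merkle root once, up front.
print(double_sha256(header).hex())
```

Since the transaction set only changes the 32-byte Merkle root field, the per-attempt hashing cost stays constant regardless of block size.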

1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 12:30:45 AM
 #31

I just thought of something. The time it takes to generate a hash is proportional to the number of transactions you're hashing, right? So, it'll take twice as long (on average) to generate a block with 1000 transactions as one with 500. You're not going to waste precious hashing time on small fee transactions, they'll just decrease your hash/s for negligible gain.

I wouldn't assume that it's as straightforward as that, and you are probably overthinking it anyway.  Feel free to try it, though.

Well, it's pretty simple really, unless I'm missing something, or hashes don't actually work like I think they do. The more data you have to hash, the longer it will take to compute the hash. Makes sense, right? I'm pretty sure the computation time is linear with respect to data size, so double the number of transactions and you double the time to compute the hash.

This is a huge problem now, because why would anyone hash more than one transaction to generate a block? They get 50 BTC either way, so you might as well just hash one transaction, giving you the optimal hash/s.

I hope I'm wrong about this, I'd really like more feedback from you guys. Perhaps we need a minimum block size or something.
asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 12:34:30 AM
 #32

Are you sure? Given any difficulty, you still have to crunch the numbers to solve a block. The more transactions, the more numbers to crunch, thus the longer it takes to compute a given hash.

Block hashes are only hashes of the fixed-size 80 byte block header, which contains a hash of the transactions. Transactions only have a small one-time CPU cost for adding.

Ahhh, okay then. Thanks for the reassurance.
db
Sr. Member
****
Offline Offline

Activity: 279



View Profile
November 21, 2010, 12:37:03 AM
 #33

Which is why major institutions will still be willing to contribute clock-cycles at or just below a break-even point.  Because there are more forms of economic motivation than just profit.  I'm really surprised that so many who seem so well educated on economic issues can't wrap their head around this simple concept.  If you have something valuable to protect, have you ever paid the rental fee on a safety deposit box?  The cost of the box rental is tiny compared to the value of the object within, but that's not a tragedy of the commons!  People do it all the time!  It's a cost of security, not a resource access issue!  The tragedy of the commons parable is a limited resource issue!
This is not like individual safety deposit boxes. This is like one big collective vault in which it is free for anyone to place their valuables and paying is optional.
asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 12:52:06 AM
 #34

how about making the max block size a function of the difficulty? If there are too few generators, then the max block size will decrease making transactions scarce. This will drive up the txfee and create incentive for new generators to enter the market. vice-versa.
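A toy sketch of that proposed rule; the linear form and the scaling constant k are invented purely for illustration:

```python
def max_block_size(difficulty: float, k: float = 1000.0) -> int:
    """Toy rule coupling block space to difficulty.

    When generators leave, difficulty falls, block space shrinks,
    transactions become scarce, and fees rise, attracting new
    generators; when generators flood in, the reverse happens.
    The constant k is a made-up scaling factor.
    """
    return max(1, int(k * difficulty))

# Fewer generators -> lower difficulty -> smaller blocks -> higher fees.
assert max_block_size(2.0) > max_block_size(1.0)
```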
MoonShadow
Legendary
*
Offline Offline

Activity: 1666



View Profile
November 21, 2010, 12:52:20 AM
 #35

I'm pretty sure the computation time is linear with respect to data size,


That's true on most calculations, but most certainly not so with data structures that self-reference, and a conditional is a self-reference.  I'm not sure if it would be true with hashing algorithms or not, but I wouldn't assume that the algorithm is particularly linear in nature.

MoonShadow
Legendary
*
Offline Offline

Activity: 1666



View Profile
November 21, 2010, 12:54:01 AM
 #36

Which is why major institutions will still be willing to contribute clock-cycles at or just below a break-even point.  Because there are more forms of economic motivation than just profit.  I'm really surprised that so many who seem so well educated on economic issues can't wrap their head around this simple concept.  If you have something valuable to protect, have you ever paid the rental fee on a safety deposit box?  The cost of the box rental is tiny compared to the value of the object within, but that's not a tragedy of the commons!  People do it all the time!  It's a cost of security, not a resource access issue!  The tragedy of the commons parable is a limited resource issue!
This is not like individual safety deposit boxes. This is like one big collective vault in which it is free for anyone to place their valuables and paying is optional.


I'm afraid you missed the analogy that I was trying to present, and yours is more than a bit flawed as well.

asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 12:56:16 AM
 #37

I'm pretty sure the computation time is linear with respect to data size,


That's true on most calculations, but most certainly not so with data structures that self-reference, and a conditional is a self-reference.  I'm not sure if it would be true with hashing algorithms or not, but I wouldn't assume that the algorithm is particularly linear in nature.


I meant:
"I'm pretty sure the computation time is linear with respect to data size, for hash functions"

Anyway, as theymos pointed out, you only have to hash the transaction once, not on every attempt. So, this isn't a problem.
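A sketch of that point in Python: the Merkle root is computed once, and each mining attempt then rehashes only the fixed-size header with a new nonce. The header layout is simplified to placeholders here:

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    """Double SHA-256, as used for Bitcoin block hashes."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(txids):
    """Pair-wise hash transaction ids up to a single 32-byte root."""
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:          # duplicate the last entry if odd
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# 1000 dummy transaction ids; folding them into the root is a one-time
# cost paid before mining starts.
txids = [sha256d(i.to_bytes(4, "little")) for i in range(1000)]
root = merkle_root(txids)

# The mining loop itself touches only the fixed-size header, so the
# hash rate is the same whether the block holds 1 transaction or 1000.
header_prefix = b"\x00" * 44 + root   # placeholder fields + Merkle root
for nonce in range(10):
    h = sha256d(header_prefix + nonce.to_bytes(4, "little"))
```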
MoonShadow
Legendary
*
Offline Offline

Activity: 1666



View Profile
November 21, 2010, 12:56:27 AM
 #38

how about making the max block size a function of the difficulty? If there are too few generators, then the max block size will decrease making transactions scarce. This will drive up the txfee and create incentive for new generators to enter the market. vice-versa.

Absolutely not.  The max block size would increase to the point that too much room was available for spamming, and that would be too much of a temptation for some.

Now that's an example of the Tragedy of the Commons issue!  The amount of space available to free transactions in the block.

asdf
Hero Member
*****
Offline Offline

Activity: 527


View Profile
November 21, 2010, 01:09:05 AM
 #39

The maximum block size must be continuously adjusted to keep the transaction prices stable. The only way to change the maximum block size is through a lengthy political process of debate, decree, network fragmentation and majority agreement.

This is a bad idea. The generators will collude to keep the block size small and transactions scarce, gouging the market.

Some may try.  Keep in mind that generators have no sustainable monopolies on generation, not even as a group.  If the major generators collude to keep block sizes small amongst themselves; say by keeping their own max block sizes at 1 meg, but the regular users' clients all have a max block size limit of 3 megs, then the rising backlog of lower fee transactions will attract new players into generation.  Maybe forcing the colluding generators to change, maybe not, but a natural price balance will be maintained.  Perhaps the occasional blockchain split fight is necessary.

Okay, I'm starting to like this idea. It could be a bit messy with blockchain splits, but it seems like it will work.
db
Sr. Member
****
Offline Offline

Activity: 279



View Profile
November 21, 2010, 10:11:38 AM
 #40

I'm afraid you missed the analogy that I was trying to present, and yours is more than a bit flawed as well.

Let's drop the analogies then and go straight at the problem.

The maximum block size is big, having room for all or most transactions.
Therefore, transaction fees are (close to) zero.
Therefore, total block transaction fees are also (close to) zero.
Therefore, all for-profit block generation ceases.
Therefore, difficulty drops.

If for-profit block generation was the only generation going on then difficulty would be very low and double spending very easy. The only thing that can save the system is people generating at a loss.

Now we have a lot of bitcoin holders that would lose greatly if payments become unreliable and confidence in the system drops. Will they contribute to their common good? If they do, it will not be out of self-interest. If you contribute to the system's reliability you lose your contribution and benefit very little, because the benefit is shared with everyone. If you use the system without contributing, you benefit from everyone else's contributions anyway.

The usual sad result is that everyone tries to live off of everyone else's contributions, very little is actually contributed, and everyone loses.

Classic tragedy of the commons.