Bitcoin Forum
Author Topic: Individual Block Difficulty Based on Block Size  (Read 4625 times)
laurentmt (Sr. Member)
February 17, 2015, 08:16:42 PM  #21

I think you're misunderstanding the problem, as this isn't about transaction selection. The issue is: should someone who is mining a more difficult block (say, 2x the normal difficulty) continue mining against an older block even after hearing about a new 1x block? To a naïve first-order approximation they stand to benefit from doing so, because a 2x block would beat out the 1x block and they would be able to steal the fees of the transactions in the 1x block. Of course there are a lot of assumptions wrapped up in that naïve assessment, and it isn't clear that it is a reflectively stable outcome (if the 1x miners knew you would do this, how would that change their strategy? How would that change of strategy affect the profitability of this attack? etc.)
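
To make that naïve assessment concrete, here is a toy expected-value model (all numbers are hypothetical, and it ignores the strategic response of the 1x miners entirely):

Code:
# Toy first-order payoff model: keep mining a k-x difficulty block
# after a competing 1x block appears on the same parent, hoping our
# heavier block displaces it and captures its transaction fees.
# Assumes chain selection by accumulated work; all values hypothetical.

def ev_continue(q, subsidy, stolen_fees, k=2.0):
    # Race of two exponential clocks: our k-x search completes at rate
    # q/k, the rest of the network extends the 1x block at rate 1 - q.
    p_win = (q / k) / (q / k + (1 - q))
    return p_win * (subsidy + stolen_fees)

def ev_switch(q, subsidy):
    # Baseline: concede, and mine a fresh 1x block on top of the 1x tip.
    return q * subsidy

# With fat fees in the 1x block, continuing can look attractive:
print(ev_continue(0.3, 25.0, 30.0))  # ~9.7
print(ev_switch(0.3, 25.0))          # 7.5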

jonny1000's comment raises an interesting point.

According to the selected polynomial, we get very different ratios of normalized difficulty between the smallest and biggest blocks (around ×6 in the first model, around ×20 for n=100/p=0.99).

Am I wrong if I say that, with the latter, a chain of 6 (or 7, ...) small blocks should be considered less secure, since a single big block could orphan them?

Usually, people consider a transaction "safe" after 6 blocks. With this model, we really should think in terms of work produced after the block embedding the tx.

I guess the choice of the polynomial plays an important role in mitigating this effect.
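
A small illustration of counting work rather than blocks, using the ×20 spread from the n=100/p=0.99 model (the numbers are just the normalized difficulties above):

Code:
# "Six confirmations" measured as accumulated work instead of depth.
# Assumes per-block difficulty scales with block size; the x20 spread
# comes from the n=100/p=0.99 model mentioned above.

SMALL, BIG = 1.0, 20.0  # normalized difficulty of smallest/biggest blocks

work_six_small = 6 * SMALL  # work burying a tx after six minimal blocks
work_one_big   = 1 * BIG    # work in a single maximal competing block

print(work_one_big > work_six_small)  # True: one big block outweighs six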
Gavin Andresen (Legendary, Chief Scientist)
February 17, 2015, 08:19:02 PM  #22

Interesting idea, but I'm afraid I disagree with your premise.

There is no tragedy-of-the-commons race to zero transaction fees, because miners do not have infinite bandwidth, memory or CPU to accept and validate transactions.

We used to have a tragedy-of-the-commons situation with zero-fee transactions, but we solved that by rate-limiting them based on priority. And we have a working market for zero-fee transactions (see the graph here).

Assuming network bandwidth is the eventual bottleneck, and assuming there is demand for transactions to fill the available network-wide bandwidth (even if that demand is transaction spammers), nodes will start dropping transactions before they relay them. Prioritizing them based on fee paid and dropping the lowest fee/kb transactions will result naturally in a working market for fee-paying transactions.
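
A minimal sketch of that prioritize-and-drop behaviour (the structure and size cap are hypothetical, not any real node's policy):

Code:
import heapq

# Minimal sketch of fee-rate-based relay policy: keep the pool bounded
# and drop the lowest fee-per-kB transactions first. The class layout
# and the size cap are hypothetical.

MAX_POOL_BYTES = 300_000_000

class Mempool:
    def __init__(self):
        self.heap = []          # min-heap keyed on fee rate
        self.total_bytes = 0

    def accept(self, txid, fee, size):
        heapq.heappush(self.heap, (fee / size, size, txid))
        self.total_bytes += size
        dropped = []
        # Evict the cheapest transactions once the pool overflows.
        while self.total_bytes > MAX_POOL_BYTES:
            rate, sz, tx = heapq.heappop(self.heap)
            self.total_bytes -= sz
            dropped.append(tx)
        return dropped          # these are neither kept nor relayed

Under sustained demand, the fee rate needed to stay in the pool becomes the market-clearing price.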

As justusranvier points out, off-the-blockchain deals between transaction creators and miners don't change that logic, because low-fee transactions that are not broadcast break the O(1) block propagation assumption and have a direct cost to the miner.


I think you are trying to solve a different problem: I think you are trying to ensure that "enough" fees are paid to secure the network as the block subsidy goes away. Yes?

jonny1000 (Member)
February 17, 2015, 11:46:04 PM  #23

I think you are trying to solve a different problem: I think you are trying to ensure that "enough" fees are paid to secure the network as the block subsidy goes away. Yes?

I think this is the problem foolish_austrian is trying to solve, yes.

We used to have a tragedy-of-the-commons situation with zero-fee transactions, but we solved that by rate-limiting them based on priority. And we have a working market for zero-fee transactions (see the graph here).

If you have a chance, could you please explain the graph in a bit more detail? I am not sure I understand it.

There is no tragedy-of-the-commons race to zero transaction fees, because miners do not have infinite bandwidth, memory or CPU to accept and validate transactions.

Gavin, you may well be right here, I do not know.  Although I still think there is some risk of a tragedy-of-the-commons race to zero/lower transaction fees, you and others have convinced me this risk is smaller than I thought.  I think the theory assumes that the marginal cost of bandwidth, memory and CPU for each additional transaction added to a block becomes so low that there is no longer any relevant marginal cost.  The argument for this is similar to the reasons you put forward for an increase in the block size limit being OK, namely that these costs may decline exponentially over time.

Thinking about the above further, I don't think the race to zero is necessarily the issue; what matters is the overall mining cost curve, and whether there is a sufficient differential in the marginal cost of adding transactions across miners.  If all miners have the same marginal cost of adding transactions to blocks, e.g. zero, then I think there is a risk that a race-to-the-bottom scenario could occur, however unlikely one may think this is.  foolish_austrian's proposal is interesting because it ensures miners have different marginal costs of adding transactions, although it causes many other problems.

I advocate keeping a block size limit not because it directly solves the differential-marginal-cost problem, but as an alternative mechanism to ensure fees are high enough that we don't run into this potential issue.
Sergio_Demian_Lerner (Hero Member)
February 18, 2015, 02:25:05 AM (last edit: February 20, 2015, 12:21:18 PM)  #24

I think Bitcoin with and without the subsidy works very differently, with very different incentives and equilibria.

The idea of block difficulty based on block size is interesting (and old). It is one of several ideas for how to regulate, smooth or create a market we still know very little about, since we have our subsidy and we'll have it for a long time.
I suggest doing nothing until we actually get into the no-subsidy trouble, if we ever get into it.

I recall some other similar ideas I had in the past:

- The Co-var fee restriction: https://bitcointalk.org/index.php?topic=147124.0

- Spreading the fee of a transaction over future blocks (the first miner gets only a % of the fee, the following miners get another %, in geometric or linear steps); a toy sketch follows the list.

- Creating an open market where miners FIRST announce their fee/kilobyte (e.g. in the coinbase field) and are then bound to that price (they cannot include transactions with a lower fee/kilobyte). This also requires miners to pre-announce a pseudo-identity.

- Pre-announcing transactions in blocks so everyone has the same view of the market (a global tx pool). Transactions would be included in a special part of the block reserved for additional data and would not be "executed" yet. The miner who first publishes the transaction would take 50% of the fees when the tx is executed. A following miner would specify the transactions to execute (by tx-ids, to save space) and claim the remaining 50% of the fees. Non-executed transactions would be dropped from the global tx pool after a fixed number of blocks.
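
For the second idea, a toy sketch of a geometric fee split (the ratio and horizon are hypothetical):

Code:
# Toy sketch of spreading a transaction fee over future blocks in
# geometric steps: the miner i blocks after inclusion earns
# fee * r * (1 - r)**i, so no single miner captures the whole fee.
# The ratio r and the horizon are hypothetical.

def fee_schedule(fee, r=0.5, horizon=10):
    payouts = [fee * r * (1 - r) ** i for i in range(horizon - 1)]
    payouts.append(fee - sum(payouts))  # last block sweeps the remainder
    return payouts

print(fee_schedule(1000))  # [500.0, 250.0, ..., 1.953125, 1.953125]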

Maybe someone wants to dig into them, or combine them with the dynamic-difficulty idea, and see whether they are worth a math paper or the trash.

Best regards, Sergio
instagibbs (Member)
February 18, 2015, 02:45:09 AM  #25

stuff

The key point is that all those fees will just go towards non-hashing costs. The security of the network will most likely die with an infinite block size unless a minimum fee is imposed by miners via soft fork.
justusranvier (Legendary)
February 19, 2015, 03:46:49 PM  #26

The key point is that all those fees will just go towards non-hashing costs. The security of the network will most likely die with an infinite block size unless a minimum fee is imposed by miners via soft fork.
There's a lot more to security than just hash rate.

I think there's a large amount of cargo culting going on regarding the hash rate rather than useful analysis of threat models, attacker capabilities, and exactly what proof of work accomplishes.
Gavin Andresen (Legendary, Chief Scientist)
February 19, 2015, 05:06:47 PM  #27

I think there's a large amount of cargo culting going on regarding the hash rate rather than useful analysis of threat models, attacker capabilities, and exactly what proof of work accomplishes.

I agree.

My guess is that we will end up with a very secure system with a modest amount of hashing in the future, because PoW hashing does three things:

1) Gives a steady 10-minute 'heartbeat' that limits how quickly new coins are produced
2) Makes it expensive to successfully double-spend confirmed transactions
3) Makes it expensive to censor transactions

The first becomes less important over time as the block subsidy halves.

I think we could do a lot to mitigate the second (see https://gist.github.com/gavinandresen/630d4a6c24ac6144482a for a partly-baked idea).

And I think the third might be mitigated naturally as we scale up and optimize the information sent across the network (there will be strong incentives to create "boring" blocks that don't include or exclude transactions everybody else is excluding or including).

justusranvier (Legendary)
February 19, 2015, 07:14:31 PM  #28

We need most of the planet's double-SHA256 capability devoted to Bitcoin mining instead of doing something else.

As long as that condition holds, the actual hash rate isn't very important.
grau (Hero Member)
February 19, 2015, 08:55:05 PM  #29

I think we could do a lot to mitigate the second (see https://gist.github.com/gavinandresen/630d4a6c24ac6144482a for a partly-baked idea).

There is a contradiction between Bitcoin's use as a settlement system and its unconstrained ability to revise its own history (reorgs).

The ability to revise history will have to be constrained, as society will not accept a "technical subtlety" overriding the settlement of contracts that are considered final with "reasonable care".

Settlement cycles in the current banking system typically repeat a few times a day. Bitcoin would not be deemed suitable in the same context without a similar frequency of final settlement. This means reorgs that revise history more than a few hours back would not be accepted, at least not automatically. This makes me think that automated Bitcoin reorgs should be limited to somewhere between 6 and 72 blocks.
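
A minimal sketch of such a cap on automatic reorgs (the depth limit and the shape of the chain objects are hypothetical; no released client behaves this way):

Code:
from dataclasses import dataclass

# Sketch of a capped automatic reorg rule: prefer the chain with the
# most accumulated work, but refuse to rewind the active chain by more
# than MAX_REORG_DEPTH blocks without operator intervention.
# The constant and the Chain shape are hypothetical.

MAX_REORG_DEPTH = 72

@dataclass
class Chain:
    tip_height: int
    total_work: int

def should_auto_reorg(active: Chain, candidate: Chain, fork_height: int) -> bool:
    if candidate.total_work <= active.total_work:
        return False                    # not the heaviest chain
    depth = active.tip_height - fork_height
    return depth <= MAX_REORG_DEPTH     # deeper forks need manual review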
jonny1000 (Member)
February 19, 2015, 08:55:31 PM  #30

Mining is extremely important and core to Bitcoin.  Let me try to describe some of the desirable characteristics of Bitcoin mining, which are often overlooked:

  • Miners are always hashing, using significant real-world resources, 24 hours a day, 365 days a year
  • Miners always ensure they are working on the block they think is in their own financial interest, and are constantly re-evaluating which block to mine on
  • Miners can change their mind about which block to mine on at any time, and are therefore always contributing to the determination of the longest chain

There is significant potential for weaknesses in mining incentives which could adversely impact some of the desirable characteristics of mining.  In order for miners to keep doing the above, they need to be incentivised.  Mining is crucial for determining the longest chain.

We need most of the planet's double-sha256 capability devoted to Bitcoin mining instead of doing something else.
As long as that condition holds, the actual hash rate isn't very important.

Maybe this is right, but even if most of the world’s hash rate is devoted to Bitcoin mining because there are not many other uses for it, if miners cannot earn a contribution towards their marginal costs, why would they care about helping to build the longest chain?

I appreciate I could be wrong here; maybe I have too much of a short-term mindset when assessing Bitcoin game theory.  As Mike Hearn recently said, "Miners are not incentivised to earn the most money in the next block possible. They are incentivised to maximise their return on investment" [in the long run].
instagibbs (Member)
February 20, 2015, 01:41:21 PM  #31

We need most of the planet's double-sha256 capability devoted to Bitcoin mining instead of doing something else.

As long as that condition holds, the actual hash rate isn't very important.

That's not clear at all. We need, at a minimum, a majority of *active* hashrate to not "attack" the network. 

Your economic theories aside, there is very little consensus on how we will achieve this.

I'm an optimist. I'd like to think that regardless of block size, a million people will donate a little bit of hashing from their USB sticks at a loss, leading to an extremely hard-to-censor network. The other equilibria aren't nearly so clean.
justusranvier (Legendary)
February 20, 2015, 06:07:25 PM  #32

That's not clear at all.

https://gist.github.com/oleganza/8cc921e48f396515c6d6

Quote
"Message X should be provably recent and alternatives should be practically impossible to produce."

Practical impossibility can be reframed in terms of "opportunity cost": there are limited physical resources, and those should have been largely allocated to X rather than to Y, so we can see that X sucked in all the resources from any alternatives. Because if it didn't, then there is huge uncertainty about whether the remaining resources are being used for an alternative Y or do not interfere with the voting process. Is it possible that X did not suck in a lot of resources while alternatives are still not possible? Then it would mean that X logically follows from whatever previous state of the system, and there is no voting process needed.

Therefore: message X should be provably recent and should have employed a provably big amount of resources, big enough that there are not enough resources left to produce any alternative Y in a reasonably short time frame. Also, the message X should always be "recent" and always outcompete any alternative, because we cannot reliably compare "old" messages: is Y an "old" one that was just delivered now, or was it produced just now after the resources spent on X were released?

This logically leads us to the following: we should accept only the messages with the biggest proof-of-work attached, and that proof-of-work should be the greatest possible, so there would be no possibility for any alternative to be produced in the short window of time. And that proof-of-work must be constantly reinforced, or the value of the previous consensus begins to fade quickly as the opportunity for alternatives grows.

If most of the planet's double-SHA256 capability is being used to mine Bitcoin, then choosing messages with the most accumulated proof of work satisfies the requirements for consensus.
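
For concreteness, a simplified sketch of comparing chains by accumulated work, each block weighted by roughly 2^256/(target+1) expected hashes (the targets are illustrative):

Code:
# Comparing chains by accumulated proof of work: each block contributes
# roughly 2**256 / (target + 1) expected hashes, and the chain with the
# larger sum wins. Targets below are illustrative placeholders.

def block_work(target):
    return 2**256 // (target + 1)

def chain_work(targets):
    return sum(block_work(t) for t in targets)

easy, hard = 2**236, 2**233          # the hard block needs 8x the hashes
print(chain_work([easy] * 7) > chain_work([hard]))  # False: 7 < 8

Note this echoes laurentmt's concern upthread: under size-dependent difficulty, seven easy blocks can be outweighed by a single sufficiently hard one.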

We need, at a minimum, a majority of *active* hashrate to not "attack" the network.
You're trying to accomplish more than what is possible.

The best achievable guarantee for mining is: "ought to find it more profitable to play by the rules."

There is no technical solution available to counteract attacks by highly-resourced entities who do not care about profit.

The only thing we can do is to make Bitcoin so valuable that the opportunity cost of not playing by the rules is high enough to deter such attackers.
JeromeL (Member)
May 26, 2015, 07:30:15 PM  #33

Sorry for reviving this topic, but it's still pretty topical given the max block size discussions.

How would this blend with the rule: in case of conflicting chains, the chain with the maximum proof of work wins?

This looks like a clever way to link the maximum block size to transaction fees.

jonny1000 (Member)
May 29, 2015, 08:30:06 AM (last edit: May 29, 2015, 08:41:06 AM)  #34

JeromeL

This proposal does seem to help resolve the economic problems associated with the maximum block size issue.  However, it also appears to potentially undermine the core longest-chain-rule consensus mechanism: if blocks have different difficulty targets, which is the longest chain?

There is another potential solution I have proposed which avoids this, involving a separate "mini proof of work" for each transaction, in addition to the traditional proof of work for the whole block.  However, if this were implemented, more powerful miners would have a comparative advantage over smaller miners, and mining centralisation would be dramatically encouraged; a toy sketch follows.
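
A toy sketch of that per-transaction "mini proof of work" (the target and nonce encoding are hypothetical); it also hints at the centralisation problem, since the per-transaction grinding parallelises exactly like block hashing does:

Code:
import hashlib

# Toy per-transaction "mini proof of work": grind a nonce for each
# transaction, on top of the normal block-level PoW. The target is
# hypothetical and deliberately easy so this runs in milliseconds.

TX_TARGET = 2**244   # ~4096 expected double-SHA256 attempts

def mini_pow(tx_bytes):
    nonce = 0
    while True:
        h = hashlib.sha256(hashlib.sha256(
            tx_bytes + nonce.to_bytes(8, "little")).digest()).digest()
        if int.from_bytes(h, "big") < TX_TARGET:
            return nonce
        nonce += 1

print(mini_pow(b"example-tx"))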

Still, I think we are on the right track, and this type of thinking could hopefully lead us somewhere useful.