2801  Bitcoin / Bitcoin Discussion / Re: Why the Bitcoin rules can't change (reading time ~5min) on: February 26, 2013, 07:10:10 AM

Fortunately, Bitcoin already has a built-in mechanism to counteract overcrowded blocks: transaction fees.


Which is why this type of proposal is being offered...

https://bitcointalk.org/index.php?topic=145754.msg1551072#msg1551072
2802  Bitcoin / Development & Technical Discussion / Re: Best method of changing the maximum block size on: February 26, 2013, 04:07:02 AM
Would a "sorry this code is out of date" shutdown for nodes that fail to update in time be "a trainwreck" or just get such nodes out of the way so those that do upgrade in time can smoothly synch in whatever the new system turns out, by then, to be?

...
-MarkM-


This depends on the time-frame. On a six-month time-frame a major change like this could lead to serious problems. On a two-year time-frame it would likely not be an issue at all. Wasn't there recently a time-out on a delayed change, implemented by Satoshi, that made certain old clients obsolete?

The key point is that delay could cause very serious problems, because we will need to roll this out with at least a 12-month lead time, or better still 18 months or longer.

When the blue line touches 345,000 the cow-catcher on Bitcoin's train connects with the buffers...

https://blockchain.info/charts/n-transactions?showDataPoints=false&show_header=true&daysAverageString=7&timespan=all&scale=1&address=
2803  Bitcoin / Development & Technical Discussion / Re: Best method of changing the maximum block size on: February 25, 2013, 10:18:00 PM
That's a ridiculous argument because this risk has existed for four years already. You are describing the effect of a 51% attack where more than half the hashing power wants to harm bitcoin! If that is the case the existing situation is worse because 80% of the blocks could also be padded to 1Mb and that would be enough to cause indefinite transaction backlogs.

There is very little cost in padding to 80%, so a potential cartel has an advantage.

What about this rule instead:

Add a soft cap at MAX_BLOCK_SIZE * 0.5.

Blocks above the soft cap are allowed, but only if 25% of the fees (minting + tx) is paid to "true".

This means that the miner for the next block gets the money.

If 75% of the blocks in the last 2016 exceed the soft cap, then the cap is increased by 15%.

This means that a miner who votes "yes" for increasing the block size incurs a cost.  A cartel of miners is now harder to form, since it is in the interests of each member of the cartel to defect.

Tier, I too think this is a good refinement.
It is still a simply implemented limit, which means it can be done sooner rather than later. This has to be the primary concern: allowing the most time for nodes to upgrade, which reduces the non-zero probability of a self-inflicted bitcoin train wreck.
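
To make the mechanics concrete, here is a minimal sketch of the soft-cap rule as I read it. This is illustrative pseudocode only, not taken from any client source; the constant names and the assumption that the vote is evaluated once per difficulty retarget are mine.

Code:
# Illustrative sketch of TierNolan's soft-cap idea (all names/constants are mine).
MAX_BLOCK_SIZE = 1_000_000      # current hard limit, bytes
SOFT_CAP_FRACTION = 0.5         # soft cap = 50% of the hard limit
FORFEIT_FRACTION = 0.25         # share of reward forfeited above the soft cap
VOTE_THRESHOLD = 0.75           # 75% of a 2016-block window must exceed the soft cap
INCREASE_FACTOR = 1.15          # hard limit grows 15% per successful vote

def block_is_valid(block_size, total_reward, amount_paid_to_true):
    """A block above the soft cap is only valid if 25% of its reward
    (subsidy + fees) is paid to an anyone-can-spend ("true") output,
    which the miner of the next block can then claim."""
    soft_cap = MAX_BLOCK_SIZE * SOFT_CAP_FRACTION
    if block_size <= soft_cap:
        return True
    return amount_paid_to_true >= FORFEIT_FRACTION * total_reward

def retarget_max_block_size(last_2016_block_sizes, max_block_size):
    """Raise the hard limit 15% if at least 75% of the previous 2016
    blocks exceeded the soft cap."""
    soft_cap = max_block_size * SOFT_CAP_FRACTION
    votes = sum(1 for s in last_2016_block_sizes if s > soft_cap)
    if votes >= VOTE_THRESHOLD * len(last_2016_block_sizes):
        max_block_size = int(max_block_size * INCREASE_FACTOR)
    return max_block_size

The point of the forfeited output is that voting "yes" (by producing an oversized block) costs the voter real fees, which is what makes a cartel unstable.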
2804  Bitcoin / Development & Technical Discussion / Re: Best method of changing the maximum block size on: February 25, 2013, 10:18:59 AM
Thanks, that's really encouraging. If these are the best arguments against it then I am happy now that it is a good idea.
2805  Bitcoin / Development & Technical Discussion / Re: Best method of changing the maximum block size on: February 25, 2013, 09:58:53 AM
20% only when 2016 blocks are over 80% full. That cannot be gamed by anyone.

Sure it can.  Miners just have to pad all blocks to > MAX_BLOCK_SIZE * 0.8.

The effect is that it can be increased if 80% of miners agree. 

That's a ridiculous argument because this risk has existed for four years already. You are describing the effect of a 51% attack where more than half the hashing power wants to harm bitcoin! If that is the case the existing situation is worse because 80% of the blocks could also be padded to 1Mb and that would be enough to cause indefinite transaction backlogs.
2806  Bitcoin / Development & Technical Discussion / Re: Best method of changing the maximum block size on: February 25, 2013, 09:41:19 AM
20% only when 2016 blocks are over 80% full. That cannot be gamed by anyone.
2807  Bitcoin / Development & Technical Discussion / Re: Best method of changing the maximum block size on: February 25, 2013, 09:16:26 AM
This is not a criticism of the voting strategy above or any adaptive strategy for changing the maximum block size, just an interim solution if there is no consensus and time is drifting on.

This simple solution does the minimum necessary to preserve the status quo until a permanent strategy is implemented.

It is suggested that in the next version of bitcoin-qt and bitcoind the 1Mb block size limit constant becomes a variable that increases by 20% if the average block within the previous difficulty period is >80% of the max size limit.

The importance of an interim solution is that as many nodes as possible can be upgraded at the convenience of their owners. This drastically reduces the odds of forking the bitcoin blockchain with many users "on each side", which may well result if an emergency change were required after a sharp growth in transaction volumes caused 1Mb block saturation.
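
For clarity, here is a rough sketch of the interim rule described above. It is not proposed patch code; the function name and the once-per-difficulty-period evaluation are my own shorthand.

Code:
# Sketch of the interim rule: +20% when the previous difficulty period
# averaged more than 80% full (constants and names are illustrative).
BLOCKS_PER_PERIOD = 2016
FULLNESS_TRIGGER = 0.8
INCREASE_FACTOR = 1.2

def adjust_block_size_limit(previous_period_block_sizes, current_limit):
    """Run once per difficulty period: if the average block exceeded
    80% of the limit, raise the limit by 20%; otherwise leave it alone."""
    average = sum(previous_period_block_sizes) / len(previous_period_block_sizes)
    if average > FULLNESS_TRIGGER * current_limit:
        return int(current_limit * INCREASE_FACTOR)
    return current_limit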
2808  Bitcoin / Development & Technical Discussion / Re: The MAX_BLOCK_SIZE fork on: February 25, 2013, 05:02:54 AM
It is clearly the right way to go to balance the interests of all concerned parties.  Free markets are naturally good at that.

By this logic, we should leave it up to the free market to determine the block subsidy. And the time in between blocks.


And that type of argument takes us nowhere. There have been thousands of comments on the subject and we need to close in on a solution rather than spiral away from one. I have seen your 10-point schedule for what happens when the 1Mb blocks are saturated. There is some probability you are right, but it is not near 100%, and if you are wrong then the bitcoin train hits the buffers.

Please consider this and the next posting:
https://bitcointalk.org/index.php?topic=144895.msg1556506#msg1556506

I am equally happy with Gavin's solution which zebedee quotes. Either is better than letting a huge unknown risk become a real event.
2809  Bitcoin / Development & Technical Discussion / Re: How a floating blocksize limit inevitably leads towards centralization on: February 24, 2013, 10:07:08 PM
Okay, but is jumping directly to ten megabytes in one fell swoop the best way to go, instead of, for example, a series of doublings?

-MarkM-


No, no, no. I am the last person to advocate jumping straight to 10Mb.

What I think we need is a flexible limit along the lines of the many suggestions which attempt to take account of what we know to be important, especially encouraging fees and eliminating free micro-transactions. Perhaps it should also adapt to block propagation times, constraining the limit if network performance degrades too much.  I appreciate that it is impossible to get this perfect on the first attempt and agreement on the detail may take time.

So a two step plan seems best:

1. In the next version the limit constant becomes a variable that increases 20% if the average block within the previous difficulty period is >80% of the limit (or a simple variation of this).

2. In a subsequent version activate the best adaptive/algorithmic block limit strategy which has found a consensus.

The advantage of step 1 is that it gets an early solution into the field, achieving huge risk mitigation, and it will have more time to be adopted by the greatest number of nodes. Step 2 is the attempt at a permanent solution, which may happen even before step 1 has its first automatic increase, or it could be delayed for a year through analysis paralysis; either way it won't matter as much.
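
For reference, a quick upper-bound calculation of how fast step 1 could grow the limit if the 80% trigger fired at every single retarget (a purely hypothetical worst case, not a prediction):

Code:
# Worst-case growth under step 1: a 20% bump at most once per ~2-week
# difficulty period, i.e. roughly 26 opportunities per year.
limit_mb = 1.0
for _ in range(26):
    limit_mb *= 1.2
print(f"after one year of non-stop increases: ~{limit_mb:.0f} MB")   # ~114 MB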
2810  Bitcoin / Development & Technical Discussion / Re: How a floating blocksize limit inevitably leads towards centralization on: February 24, 2013, 09:13:31 PM

The absolute minimum transaction size(1) is for single-input, single-output transactions. They are 192 bytes each, 182 if transaction combining is aggressively used. 1 MiB / 182 bytes / 10 minutes ≈ 9.6 tx/s


Fine, here you have the definition of a transaction unit. Of course I understand that there are multiple inputs/outputs possible, which makes bitcoin transfers much more efficient than in conventional payment systems. I was just using terminology common on sites such as blockchain.info. This makes 4 tx/s a valid description of the real-world situation.

It is no good having theoretical efficiency available if it is not being physically used! You say it could be used, and this is true; however, optimum transaction efficiency cannot be achieved by a wide user-base overnight. My concern all along is that bitcoin usage is growing exponentially, at a much faster rate than transaction efficiency is being adopted or complementary off-chain solutions are appearing.

There are a lot of smart people on this forum who fully understand exponential growth curves. Yet it seems some of these people think that a hard limit can seamlessly force adoption of efficiencies and complementary solutions which are developing linearly in comparison. 

There can be no seamless transition in that scenario, which is why an adaptive/algorithmic block size limit is needed sooner than later.
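
To show where the headline figures come from, here is the back-of-the-envelope arithmetic. The 182-byte figure is the theoretical minimum quoted above; the ~417-byte "typical" transaction size is an assumption chosen to match the roughly 2,400 transactions seen in near-full 1Mb blocks in my earlier analysis in this thread.

Code:
# Throughput arithmetic behind the 9.6 tx/s and ~4 tx/s figures.
BLOCK_INTERVAL_SECONDS = 600
MAX_BLOCK_BYTES = 1_048_576                       # 1 MiB

minimal_tx_bytes = 182                            # 1-in/1-out, aggressive combining
print(f"{MAX_BLOCK_BYTES / minimal_tx_bytes / BLOCK_INTERVAL_SECONDS:.1f} tx/s")  # ~9.6

typical_tx_bytes = 417                            # assumption: ~2,400 tx per full 1Mb block
print(f"{1_000_000 / typical_tx_bytes / BLOCK_INTERVAL_SECONDS:.1f} tx/s")        # ~4.0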
2811  Bitcoin / Bitcoin Discussion / Re: Fidelity-bonded banks: decentralized, auditable, private, off-chain payments on: February 24, 2013, 09:40:16 AM
retep, I just want to say that I am impressed by the amount of thought going in to off-chain systems like this. It is far easier to criticize detail than to put together such a structured concept. I hope that they become a reality as an available service one day.
2812  Bitcoin / Legal / Re: I disagree that Bitcoin is money, currency. on: February 24, 2013, 09:26:30 AM
gopher, perhaps you aim to help, but the thrust of the OP is such that I can't see how it helps.

There is a fundamental of economics that only products can buy products. Money is a medium of exchange between products, and develops the properties of a store of value and unit of account. Bitcoin excels at all these. It has particular strength for remote transactions and as an inelastic monetary base.

I recommend this paper which I am sure you will find interesting too:

http://dev.economicsofbitcoin.com/mastersthesis/mastersthesis-surda-2012-11-19b.pdf

With regard to governments: there are 200 nation-state jurisdictions in the world. Bitcoin transcends all their boundaries. None of them has attempted to curtail its usage, so bitcoin should not be hobbled by its community just on the theoretical basis that one of them might make a fuss one day. Onward!
2813  Economy / Economics / Re: UK Downgraded from LOL to LMAO on: February 24, 2013, 09:02:52 AM
Bitcoin is never going to replace the £.

Yes, and PC word processors would never replace typewriters, and email would never replace letters, digital cameras would never make film obsolete, online video would never kill off high-street video rental. Yes, very good that we have all those long-standing things still with us...
2814  Bitcoin / Development & Technical Discussion / Re: How a floating blocksize limit inevitably leads towards centralization on: February 24, 2013, 06:40:51 AM
I'm pretty sure that the 250KB limit has never been broken to date.

block 187901 499,245 bytes
block 191652 499,254 bytes
block 182862 499,261 bytes
block 192961 499,262 bytes
block 194270 499,273 bytes

These are the biggest 5 blocks up to the checkpoint at block 216,116.  Interesting that they're all 499,2xx bytes long, as if whoever is mining them doesn't want to get too close to 500,000 bytes long.

I understand that at least one miner has their own soft-limit, probably Eligius and probably at 500kb.

I take that back because these blocks are version 1 and Eligius is supposedly producing all version 2 (http://blockorigin.pfoe.be/top.php)

However, I have extracted the transaction counts and they average 1190 each. Looking at a bunch of blocks maxing out at 250Kb they are in the region of 600 transactions each, which is to be expected. Obviously, there is a block header overhead in all of them. But this does mean that the 1Mb blocks will be saturated when they carry about 2400 transactions. This ignores the fact that some blocks are virtually empty as a few miners seem not to care about including many transactions.

So 2,400 transactions per block * 144 blocks per day = 345,600 transactions per day, which means Bitcoin's maximum sustained throughput is just 4 transactions per second.
This is even more anemic than the oft-quoted 7 tps!
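
The same arithmetic, spelled out (nothing here beyond the numbers already in the post):

Code:
# 2,400 tx per full block * 144 blocks per day = 345,600 tx/day -> ~4 tx/s
tx_per_block = 2400
blocks_per_day = 144
tx_per_day = tx_per_block * blocks_per_day
print(tx_per_day)            # 345600
print(tx_per_day / 86_400)   # 4.0 transactions per second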

2815  Bitcoin / Development & Technical Discussion / Re: there is no max block size problem on: February 23, 2013, 07:16:02 PM
There is a lot of merit to this concept. It is robust and highly scalable.

My question is: do there really need to be multiple variations of bitcoins with multiple fx rates between them all?
Preserving the integrity of the 21m issue limit is important, so TierNolan's alternate proposal needs to be made known here:

Bitcoin alt-chains which have no block reward, where mining is funded just by fees, and which are funded with bitcoins from the primary chain. Miners could mine any chain they wanted to, but any alt-chain that suffered a 51% attack would be locked out of being able to send back to the main chain.

TierNolan's proposal in more detail:
https://bitcointalk.org/index.php?topic=144895.msg1549816#msg1549816

While these solutions might work well in the long-term, bitcoin would still need an adaptive block size limit to progress in the short-term.
2816  Bitcoin / Development & Technical Discussion / Re: review of proposals for adaptive maximum block size on: February 23, 2013, 10:17:39 AM
markm,

you are introducing a new debate which is about the maximum block size the network can handle. This needs to be kept separate from the max block size constant debate which is big enough already.

It might be that the network begins seriously degrading when the average block size is 700Kb, because blocks are not being received/retransmitted effectively; or it might be that this only happens when blocks average 12Mb each. If it is the lower figure then we have been debating the wrong issue lately. If it is the upper figure then, hopefully, the network will be wealthier (I am assuming that most owners of full nodes have BTC holdings) and some of that could be converted to fiat to buy better hardware.

This debate really needs to be in a new thread, and the results of models/testing made known there.
2817  Bitcoin / Development & Technical Discussion / Re: Should bitcoin lower the transaction fee? on: February 23, 2013, 05:45:53 AM
In the end, who cares what the values of the outputs are. The only thing that matters is the space transactions take up in the block chain, and if fees are based upon that, then this nonsense resolves itself.

A more relevant measure than space in the block chain is the size of the list of unspent transaction outputs ("UTXO", list of spendable bitcoins).  It keeps growing and growing, with near-zero-value outputs.

RE "who cares what the values are"  One follows from the other.

If your purchases are typically 4-5 decimal places at most, and the remainder has sub-$0.001 value, the remainder is simply less likely to be swept up and spent... thereby taking up space in the block chain without being prunable.

This is why losing SatoshiDICE bets are so harmful to the system.  You get back 0.00000001 or so on a losing bet.  With such high transaction volumes, SatoshiDICE is handing out lots of difficult-to-spend dust spam.



This is a very important point.
So (open question): has anyone asked SatoshiDICE whether they can change their model to internalize their transactions?
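
To illustrate why such outputs just sit in the UTXO set, here is a rough dust calculation. The fee rate is an assumption (roughly the default minimum relay fee at the time, 0.0001 BTC per kB), and the input size is approximate for a standard pay-to-pubkey-hash spend.

Code:
# Why a 1-satoshi "losing bet" payout is effectively unspendable dust.
output_value_btc = 0.00000001        # 1 satoshi returned on a losing bet
input_size_bytes = 148               # approx. bytes a spend of this output adds to a tx
fee_rate_btc_per_kb = 0.0001         # assumed minimum relay fee of the era

fee_to_spend_btc = fee_rate_btc_per_kb * input_size_bytes / 1000
print(fee_to_spend_btc)                          # 0.0000148 BTC, ~1,480x the output's value
print(fee_to_spend_btc > output_value_btc)       # True: never worth sweeping up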
2818  Economy / Speculation / Re: Wall Observer - MtGoxUSD wall movement tracker on: February 23, 2013, 01:00:32 AM
I don't get this "lower low on the hourly chart exploit". Can someone explain?

But please, I do this for fun. It's not professional in any way.
Last 2 times it happened with a bigger ratio, a nice sell-off happened. Looks like it's not happening today.

I don't think there is any lower low on the hourly. There needs to be a retrace higher before a 2nd low can be looked for. All that has happened at 31 is the market got ahead of itself and is taking a breather. It may consolidate for a while in the 30-31 range. It still looks to have legs. We have not seen the price spike to a level where it has been firmly kicked out. The last time that happened was the spike to $19.75 which led to the last real bearish mini-trend.
2819  Bitcoin / Development & Technical Discussion / Re: review of proposals for adaptive maximum block size on: February 22, 2013, 11:03:09 PM
hazek, I like this very much. As you say, it is impossible to determine the perfect algorithm in advance, but the market will adapt to it to some degree. I think this approach does a lot to secure bitcoin's long-term future.
2820  Bitcoin / Development & Technical Discussion / Re: How a floating blocksize limit inevitably leads towards centralization on: February 22, 2013, 10:56:44 PM
I'm pretty sure that the 250KB limit has never been broken to date.

block 187901 499,245 bytes
block 191652 499,254 bytes
block 182862 499,261 bytes
block 192961 499,262 bytes
block 194270 499,273 bytes

These are the biggest 5 blocks up to the checkpoint at block 216,116.  Interesting that they're all 499,2xx bytes long, as if whoever is mining them doesn't want to get too close to 500,000 bytes long.

I understand that at least one miner has their own soft-limit, probably Eligius and probably at 500kb.