Author Topic: How a floating blocksize limit inevitably leads towards centralization  (Read 71582 times)
n8rwJeTt8TrrLKPa55eU (Hero Member) | February 20, 2013, 02:18:15 AM (last edited 03:38:31 AM) | #141

Doctors have the right idea: Primum non nocere.  Or if you prefer: if it ain't broke, don't fix it.

Clearly Bitcoin, as currently implemented, 1MB limitation included, is doing something right.  ...


Except that Satoshi had foreseen the limit being a temporary requirement and gave an example of raising it by block 115,000 (in the future). That is now the past.

I have 25 years experience in commercial systems, much of that processing $US billions per day, and am fully aware that software issues ignored invariably blow up badly. The human body is different as it has the capability to fix itself (in fact that is 99% of modern medicine).

Satoshi isn't infallible.

And like you, I also have many years' experience developing commercial software products used by millions of people, and can attest to many fun anecdotes where trying to fix a bug or make an "obvious" improvement resulted in disasters far worse and more costly than the original problem.

Changing something as fundamental and delicate as this constant should require pretty hardcore quantitative data and at least 80% consensus.  Not gut feelings.

Ultimately, I'd suggest Gavin do an informal poll at the San Jose conference.  If he can't get at least 80% agreement there, then he'll know that there is a definite risk of a community split between those who prioritize Bitcoin's transactional attributes and those who prioritize its potential for cast-iron independent value storage.

The world already has plenty of fast, high-capacity media of exchange.  But only one digital, decentralized, and private store of value.  To me, risking the strength of the latter attributes to enhance the former, seems shortsighted.
misterbigg (Legendary) | February 20, 2013, 03:34:08 AM | #142

Changing something as fundamental and delicate as this constant should require pretty hardcore quantitative data and at least 80% consensus.  Not gut feelings.

Maybe the Bitcoin Foundation can start up a university outreach program that encourages research on Bitcoin's difficult problems, like MAX_BLOCK_SIZE.
thinkweis (Newbie) | February 20, 2013, 03:35:28 AM | #143

Startbitcoin.com is offering the blockchain delivered on DVD. As the blockchain continues to grow, this will make life a lot easier.

http://startbitcoin.com/blockchain-on-dvd/
ldrgn (Member) | February 20, 2013, 03:37:34 AM | #144

Consider the two scenarios:

1. Everyone has equal hashing power and bandwidth except for one person who has a half-speed connection.  The slowpoke's mining speed and profitability is decreased and eventually they stop mining because they are no longer turning a profit.  (This is what retep described in his first post.)

2. Everyone has equal hashing power and bandwidth except for one person who hashes at half the speed of everyone else.  The slowpoke's mining speed and profitability is decreased and eventually they stop mining because they are no longer turning a profit.

Why are they being treated differently?  Nobody cares about catering to the second person's lack of profitability so why are people suddenly concerned about the first?  If you're saying "oh, it's because we want non-miners to be able to validate in time" - take a look at retep's post, this isn't about the convenience of non-miners.  A well-connected miner with a fast upload speed adds value to the network just like hashing does - fast dissemination of blocks lets everyone get faster information about the latest updates to the ledger.  Requiring the lowest common denominator of connections doesn't add much value and takes away a whole lot more.
Nagato (Full Member) | February 20, 2013, 03:55:29 AM | #145

Everyone has equal hashing power and bandwidth except for one person who hashes at half the speed of everyone else.  The slowpoke's mining speed and profitability is decreased and eventually they stop mining because they are no longer turning a profit.

No. If the slowpoke is using the most efficient form of mining (ASICs at the moment), his running and capital costs will be half of everyone else's, so he will be just as profitable as the bigger miners.

Once you start requiring a high barrier to entry for mining via a large block size which residential connections cannot keep up with, you will naturally see independent mining and p2pool failing, with a migration towards centralised mining pools. If you look at the Bitcoin globe, is it any surprise that hardly any full nodes (usually independent miners) exist in countries with slow internet connections (most of Asia, Africa, South America)? Bandwidth caps will also make mining unfeasible in many developed countries (I believe many US/Canadian/NZ/Australian ISPs have caps).

gmaxwell (Moderator, Legendary) | February 20, 2013, 04:07:14 AM | #146

PS: and if the "1MB-is-doom" scenario is correct, the beauty of not tackling problems until they are a clear and present danger, is that if a hard fork must take place, then it will be much easier to reach 100% consensus
Yea, two prongs of my doomsday-risk argument are "dorking with the rules will (rightfully) undermine confidence" and "if the limit is lifted without enough transaction volume we'll destroy the fee market"; both of those go away if the size is only increased in response to a situation where the necessity is obvious.

Though part of that also means that, if we're really to reason about it in the future, we get to have this debate every time it comes up, so that we don't create an expectation that it must be increased even absent a need.
gmaxwell (Moderator, Legendary) | February 20, 2013, 04:16:58 AM | #147

Changing something as fundamental and delicate as this constant should require pretty hardcore quantitative data and at least 80% consensus.  Not gut feelings.
It's not like a San Jose poll would be terribly representative of Bitcoin users.  A lot of the tech startups have big booming enthusiasm far outpacing their basic technical and economic competence, and I expect them to be well represented there. It takes all kinds, sure, but if you ask people who have been presenting these crazy double exponential graphs to VCs all week if they want there to be MOAR TRANSACTIONS, of course they're going to say "YES. IF WE NEED MILLIONS FOR ANOTHER VALIDATING NODE, I KNOW JUST THE VC TO CALL" Tongue  (And these folks and their opinions are, of course, quite important... but that doesn't mean that a poll is a great way to get a thoughtful response that reflects their real interests.)
markm (Legendary) | February 20, 2013, 04:42:27 AM | #148

Yeah, maybe we should also do polls in deepest darkest Africa: both of groups of peasants who have not managed to afford a community phone yet, and of groups of the individuals who have managed to afford a phone and are thus able to go into the "letting someone use my phone briefly for a fee" business. Ask them whether something that requires the vast expense of a 24/7 internet-connected 386-to-586-grade machine to get into seems a better or worse grassroots, empower-the-people thing than something that even the latest and greatest throwaways being handed out at furniture banks cannot handle.

Sample bias much?

-MarkM-

Maged (Legendary) | February 20, 2013, 06:38:54 AM | #149

Gavin and co, drop your silly suggestions of using an idea that can't be globally synchronized without a central authority. This suggestion does exactly what you are looking for in a decentralized manner and it's globally synchronized:
I actually posted the below in the max_block_size fork thread but got absolutely no feedback on it, so rather than create a new thread to get exposure on it, I am reposting it here in full as something to think about with regards to moving towards a fairly simple process for creating a floating blocksize for the network: one that is conservative enough to avoid abuse and will work in tandem with difficulty, so no new mechanisms need to be made. I know there are probably a number of holes in the idea, but I think it's a start and could be made viable, so that we get a system that allows blocks to get bigger but doesn't run out of control such that only large miners can participate, and that also avoids situations where manipulation of difficulty could occur if there were no max blocksize limit. Ok, here goes.

I've been stewing over this problem for a while and would just like to think aloud here....

I very much think the blocksize should be network regulated much like difficulty is used to regulate propagation windows based on the amount of computation cycles used to find hashes for particular difficulty targets. To clarify, when I say CPU I mean CPUs, GPUs, and ASICs collectively.

Difficulty is very much focused on the network's collective CPU cycles to control propagation windows (1 block every 10 mins), avoid 51% attacks, and distribute new coins.

However, the max_blocksize is not related to the computing resources needed to validate transactions and propagate regular blocks; it is geared much more to network speed, the storage capacity of miners (and even of non-mining full nodes), and verification of transactions (which, as I understand it, means hammering the disk). What we need to determine is whether the nodes supporting the network can quickly and easily propagate blocks while not having this affect the propagation window.

Interestingly there is a connection between CPU resources, the calculation of the propagation window with difficulty targets, and network propagation health. If we have no max_blocksize limit in place, it leaves the network open to a special type of manipulation of the difficulty.

The propagation window can be manipulated in two ways as I see it. One is creating more blocks, as we classically know: throw more CPUs at block creation and we transmit more blocks; more computation power = more blocks produced, and the difficulty ensures the propagation window doesn't get manipulated this way. The difficulty adjustment uses the timestamps in the blocks to determine whether more or fewer blocks were created in a certain period and whether difficulty goes up or down. All taken care of.
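For reference, the retarget rule this leans on is roughly the following (2016 blocks per period, a 600-second target spacing, and the adjustment clamped to a factor of four in either direction; a lower target means a higher difficulty):

$$\text{new\_target} = \text{old\_target} \times \frac{t_{\text{actual}}}{2016 \times 600\,\text{s}}, \qquad \tfrac{1}{4} \le \frac{t_{\text{actual}}}{2016 \times 600\,\text{s}} \le 4$$

where $t_{\text{actual}}$ is how long the last 2016 blocks actually took.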

The propagation window could also be manipulated in a more subtle way though, that being transmission of large blocks (huge blocks in fact). Large blocks take longer to transmit, longer to verify, and longer to write to disk, though this manipulation of the number of blocks being produced is unlikely to be noticed until a monster block gets pushed across the network (in a situation where there is no limit on blocksize that is). Now because there is only a 10 minute window the block can't take longer than that I'm guessing. If it does, difficulty will sink and we have a whole new problem, that being manipulation of the difficulty through massive blocks. Massive blocks could mess with difficulty and push out smaller miners, causing all sorts of undesirable centralisations. In short, it would probably destroy the Bitcoin network.

So we need a maximum block size that is high enough that the vast majority of nodes are comfortable with it, and isn't so big that it can be used to manipulate the difficulty by artificially slowing propagation across the network with massive blocks. With the help of the difficulty's maintenance of the propagation window, we may be able to determine whether the propagation of blocks is slowing and whether the max_blocksize should be adjusted down to ensure the propagation window remains stable.

Because the difficulty can be potentially manipulated this way we could possibly have a means of knowing what the Bitcoin network is comfortable with propagating. And it could be determined thusly:

If the median size of the blocks transmitted in the last difficulty period is bumping up against the max_blocksize (say, the median is within 20% of the max_blocksize; the median being chosen to avoid situations where one malicious entity, or a few, try to arbitrarily push up the max_blocksize limit), and the difficulty is "stable", increase the max_blocksize (say by 10%) for the next difficulty period. But if the median size of blocks for the last period is much lower (say less than half the current blocksize_limit), then lower the size by 20% instead.

However, if the median size of the blocks transmitted in the last difficulty period is bumping up against the max_blocksize and the difficulty is NOT stable, don't increase the max_blocksize, since there is a possibility that the network is not currently healthy and increasing or decreasing the max_blocksize is a bad idea. Or, alternatively, in those situations lower the max_blocksize by 10% for the next difficulty period anyway (not sure if this is a good idea or not though).

In either case, 1 MB should be the lowest the max_blocksize can go if it continued to shrink. Condensing all that down to pseudocode...

Code:
IF(Median(blocksize of last difficulty period) is within 10% of current max_block_size 
AND new difficulty is **higher** than previous period's difficulty),
    THEN raise max_block_size for next difficulty period by 10%

otherwise,

Code:
IF(Median(blocksize of last difficulty period) is within 10% of current max_block_size 
AND new difficulty is **lower** than previous period's difficulty),
    THEN lower max_block_size for next difficulty period by 10% UNLESS it is less than the minimum of 1mb.
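Taken together, the two rules above amount to a single adjustment function. A minimal Python sketch of that logic, purely as an illustration; the 10% band, 10% step and difficulty test come from the pseudocode, while the function and variable names and the explicit 1 MB floor constant are assumptions:

Code:
# Sketch of the proposal above: adjust max_block_size once per difficulty
# period, based on the median block size and which way the difficulty moved.
MIN_BLOCK_SIZE = 1_000_000  # 1 MB floor, as in the post

def next_max_block_size(max_block_size, median_block_size,
                        new_difficulty, old_difficulty):
    """Return the max block size to use for the next difficulty period."""
    near_limit = median_block_size >= 0.9 * max_block_size  # "within 10%" of the limit
    if near_limit and new_difficulty > old_difficulty:
        # Blocks are nearly full and the network kept up: allow 10% more.
        return max_block_size * 1.10
    if near_limit and new_difficulty < old_difficulty:
        # Blocks are nearly full but difficulty fell, a hint that propagation
        # is suffering: shrink by 10%, but never below the 1 MB floor.
        return max(max_block_size * 0.90, MIN_BLOCK_SIZE)
    return max_block_size  # otherwise leave the limit alone

(Note that the prose above uses a 20% band and a 20% downward step in places while the pseudocode uses 10% for both; that inconsistency is left as the author presents it.)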


Checking the stability of the last difficulty period against the next one is what determines whether the network is spitting out blocks at a regular rate or not. If the median size of blocks transmitted in the last difficulty period is bumping up against the limit and the difficulty is going down, it could mean a significant number of nodes can't keep up: blocks aren't getting to all the nodes in time, and hashing capacity is getting cut off because nodes are too busy verifying the blocks they received. If the difficulty is going up and the median block size is bumping up against the limit, then there's a strong indication that nodes are all processing the blocks they receive easily, and so raising the max_blocksize limit a little should be OK. The one thing I'm not sure of, though, is determining whether the difficulty is "stable" or not; I'm very much open to suggestions on the best way of doing that. The argument that what is deemed "stable" is arbitrary, and could still lead to manipulation of the max_blocksize just over a longer and more sustained period, is I think possible too, so I'm not entirely sure this approach could be made foolproof. How does the calculation of difficulty targets take these things into consideration?

OK, guys, tear it apart.
In plain English, this means that if over 50% of the mining power (which shouldn't be only a single miner by definition since we'd be totally screwed anyway) think that they can make more money in overall fees by allowing the maximum block size to increase, they can each vote for this increase by hitting (close to) the limit in each block they make, which in turn proves that the network can handle the increase, especially if we use this idea:
So we need a maximum block size that is high enough that the vast majority of nodes are comfortable with it, and isn't so big that it can be used to manipulate the difficulty by artificially slowing propagation across the network with massive blocks. With the help of the difficulty's maintenance of the propagation window, we may be able to determine whether the propagation of blocks is slowing and whether the max_blocksize should be adjusted down to ensure the propagation window remains stable.

A measure of how fast blocks are propagating is the number of orphans.  If it takes 1 minute for all miners to be notified of a new block, then on average, the number of orphans would be 10%.
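A rough back-of-the-envelope for that figure, assuming block discovery is a Poisson process with mean interval T = 600 s and that every block takes d = 60 s to reach the other miners:

$$P(\text{orphan race}) \approx 1 - e^{-d/T} = 1 - e^{-60/600} \approx 9.5\% \approx \frac{d}{T} = 10\%$$

i.e. roughly the fraction of each block interval spent propagating.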

However, a core of miners on high speed connections could keep that down and orphans are by definition not part of the block chain.

Maybe add an orphan link as part of the header field.  If included, the block links back to 2 previous blocks, the "real" block and the orphan (this has no effect other than proving the link).  This would allow counting of orphans.  Only orphans off the main chain by 1 would be acceptable.  Also, the header of the orphan block is sufficient, the actual block itself can be discarded.

Only allowing max_block_size upward modification if the difficulty increases seems like a good idea too.

A 5% orphan rate probably wouldn't knock small miners out of things.  Economies of scale are likely to be more than that anyway.

Capping the change by 10% per 2 week interval gives a potential growth of 10X per year, which is likely to be at least as fast as the network can scale.

So, something like

Code:
if ((median of last 2016 blocks < 1/3 of the max size && difficulty_decreased) || orphan_rate > 5%)
 max_block_size /= 8th root of 2
else if(median of last 2016 blocks > 2/3 of the max size && difficulty_increased)
 max_block_size *= 8th root of 2 (= 1.09)

The issue is that if you knock out small miners, a cartel could keep the orphan rate low, and thus prevent the size from being reduced.
So, no increase in the maximum block size could ever hurt the miners more than a small amount. And if a miner doesn't think they are making enough in fees and that they won't be able to make up the difference in volume? They simply fill their blocks to just under the point where they would be considered "full" for the purposes of this calculation. There are even more economic factors here, but it makes my head hurt to even think of them.
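A minimal, runnable rendering of the orphan-aware rule quoted a few lines above, purely as an illustration; the 2016-block window, the 1/3 and 2/3 median tests, the 5% orphan threshold and the 2^(1/8) step are taken from the quoted pseudocode, while the names and the 1 MB floor are assumptions:

Code:
# Sketch of the quoted orphan-aware adjustment: step max_block_size up or down
# by a factor of 2**(1/8) (~1.09) once per 2016-block retarget period.
# Compounded over the ~26 periods in a year that is 2**(26/8), roughly 9.5x,
# in line with the "10X per year" growth ceiling mentioned above.
STEP = 2 ** (1 / 8)          # ~1.0905 per retarget period
MIN_BLOCK_SIZE = 1_000_000   # assumed 1 MB floor

def adjust_max_block_size(max_block_size, median_block_size, orphan_rate,
                          difficulty_increased, difficulty_decreased):
    if (median_block_size < max_block_size / 3 and difficulty_decreased) \
            or orphan_rate > 0.05:
        # Blocks mostly small, or too many orphans: step the limit down.
        return max(max_block_size / STEP, MIN_BLOCK_SIZE)
    if median_block_size > 2 * max_block_size / 3 and difficulty_increased:
        # Blocks mostly full and the network keeping up: step the limit up.
        return max_block_size * STEP
    return max_block_size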

- snip -
My objection I don't see answered is: what stops other miners from spontaneously building a longer chain with smaller blocks that propagate more easily? (In the absence of said cartel.) . . .
The proof-of-work system.
How, I'm asking again? The proof of work will naturally favor smaller blocks that can be spread faster to the majority of the network (and thus support decentralization). Yes, the worst-connected nodes will be left behind, but as long as there are plenty of "middle class" nodes that naturally favor blocks under a certain size, and the slow propagation of oversized blocks makes them more likely to become orphans, I don't see a problem.

The point of this debate is that the incentive for miners with faster bandwidth is to intentionally pad out their blocks to be so big that only slightly more than half of the hashing power on the network can receive them before the next block is discovered.  This leaves almost 50% of the hashing power on the network continually working on old blocks never to earn any rewards while those with the highest bandwidth leave them behind and increase their own rewards.  Doing so forces those with lower bandwidth out of the market, allowing the process to repeat, consolidating the mining into only those with the absolute highest bandwidth on the network.

The proof of work prevents the miners with slower bandwidth from solving a block any faster than those with higher bandwidth, and the bandwidth issue keeps them from working on the correct block.
And the neat part about the suggestion I quoted in this post is that any hashing power on the network continually working on old blocks is creating orphan blocks, which are directly considered in the maximum block size algorithm. Network security, therefore, is considered more important than the use of Bitcoin as a payment system, but a compromise between those two ideas is allowed when it benefits ALL users through more security and cheaper individual transactions.

Doctors have the right idea: Primum non nocere.  Or if you prefer: if it ain't broke, don't fix it.

Clearly Bitcoin, as currently implemented, 1MB limitation included, is doing something right.  Otherwise demand wouldn't be growing exponentially.  Gavin: with each Bitcoin purchase, users are implicitly "voting" that they approve of its current embedded constants (or at least, they don't mind them, yet).
No, they aren't. They're voting for a cryptocurrency that they think of as "Bitcoin". Every piece of Bitcoin documentation involving the 1 MB limit has been clear that it was temporary, in order to protect the network from attacks while it was in its infancy. As for the people who didn't read the documentation, they would have no idea that this limit even currently exists, since we aren't ever hitting it. Therefore, any *coin that has a permanent 1 MB limit cannot, by definition, be called "Bitcoin".

markm (Legendary) | February 20, 2013, 07:00:15 AM | #150

So that particular calculation for automatically "adapting" is simply an example of why I am doubtful that automation is the way to go. Basically all it amounts to is carte blanche for all miners who want mining to be limited to a privileged elite to simply spout the biggest blocks permitted, constantly, thus automatically driving up the size permitted, letting them spout out even bigger blocks, and so on as fast as possible, until the few of them left are able to totally control the whole system in a nice little cartel or oligarchy or plutarchy or kakistocracy or whatever the term is for such arrangements. (Sounds like an invitation to kakistocracy, actually, maybe?)

Basically, automatic "adaptation" seems more like automatic acquiescence to whatever the cartel wants, possibly happening along the way to leave them with incentives to maintain the appearance of controlling less of the network than they actually do, so that if/when they do achieve an actual monopoly it will appear to the public as pretty much any number of "actors" the monopoly chooses to represent itself as for public relations purposes. (To prevent panics caused by fears that someone controls 51%, for example.)

-MarkM-

thanke (Member) | February 20, 2013, 07:04:46 AM | #151

What was the talk about "O(n^2) scaling problems" about? I fail to see how it is relevant here. The total number of transactions is O(n), where n is the number of users. The number of transactions per user should be a finite constant and not depend on n. Or does anybody have a different assumption?

Note that this is meant to be the long term assumption, for the scenario in which each user conducts all his transactions with bitcoin. Before that happens, only a certain percentage of the people that a given user transacts with actually uses bitcoin. With growing n this percentage will also grow, giving you the illusion of a growth similar to O(n^2). But as the percentage is bounded (by 100%) this can only be a temporary phenomenon.
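One way to write that argument down, with r the (constant) per-user transaction rate and p(n) the fraction of a given user's counterparties already using bitcoin (both symbols introduced here only for illustration):

$$T(n) = n \cdot r \cdot p(n), \qquad 0 \le p(n) \le 1 \;\Rightarrow\; T(n) \le r\,n = O(n).$$

During adoption p(n) itself grows with n, so for a while T(n) can climb roughly like n^2, but once p(n) saturates at 1 the growth falls back to O(n); that is the temporary phenomenon described above.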

If the number of transactions can be assumed to be O(n) then it is reasonable to factor hardware improvements into the decision. You would not need Moore's Law to conclude that wristwatches will be able to do all the world's verifying in real-time.

That said, in reality, "the constant does matter". And it doesn't suffice as an argument for unlimited block size.

markm (Legendary) | February 20, 2013, 07:16:01 AM | #152

I strongly suspect that, even with Moore's Law thought by some to be running out, an increase in the block size could not be kept limited to 50% at first and then another 50% each calendar year thereafter.

Some folk are already talking about 100-megabyte blocks as if that might even be the smallest block size that any change should jump to right away; and since those same folk seem to argue that it is exponentially increasing need that drives them to that number, maybe we can expect 100 times the size per year thereafter, too.

It is ridiculously easy to spin off as many chains as there are fiat currencies, to provide a cryptocurrency landscape as rich and possibly as varied as the fiat landscape everyone is already used to, and with Ripple, transparent automatic exchange as part of transfers should be even easier than it will be with the fiat currencies.

But hey, let's give it a try: all in favour of a 50% increase in block size to start with, followed by further 50% increases yearly thereafter?

Edit: Oops, I misremembered the actual Moore's Law numbers typically cited. Maybe doubling the size now, then doubling it again every year and a half, is more palatable?
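For scale, the two schedules compound quite differently; a quick check over four years, using the 50%-per-year and doubling-every-18-months figures from the post:

$$1.5^{4} \approx 5.1\times \qquad \text{vs.} \qquad 2^{4/1.5} \approx 6.3\times,$$

while the "100 times per year" pace mentioned above would be $100^{4} = 10^{8}\times$ over the same span.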

-MarkM-

Maged (Legendary) | February 20, 2013, 07:30:03 AM (last edited 07:41:49 AM) | #153

So that particular calculation for automatically "adapting" is simply an example of why I am doubtful that automation is the way to go. Basically all it amounts to is carte blanche for all miners who want mining to be limited to a privileged elite to simply spout the biggest blocks permitted, constantly, thus automatically driving up the size permitted, letting them spout out even bigger blocks, and so on as fast as possible, until the few of them left are able to totally control the whole system in a nice little cartel or oligarchy or plutarchy or kakistocracy or whatever the term is for such arrangements. (Sounds like an invitation to kakistocracy, actually, maybe?)

Basically, automatic "adaptation" seems more like automatic acquiescence to whatever the cartel wants, possibly happening along the way to leave them with incentives to maintain the appearance of controlling less of the network than they actually do, so that if/when they do achieve an actual monopoly it will appear to the public as pretty much any number of "actors" the monopoly chooses to represent itself as for public relations purposes. (To prevent panics caused by fears that someone controls 51%, for example.)

-MarkM-

If they manage to do that in such a way that keeps global orphan rates down and the difficulty at least stable (so this would have to be done slowly) all while losing boatloads of money by essentially requiring no transaction fee ever, good for them. Other 51% attacks would be more economical, especially since this attack would be trivial to detect in the non-global-state side of things. For example, people would notice a large amount of previously-unheard transactions in these blocks, or extremely spammy transactions. Worst-case, they could get away with including all legitimate transactions and a small amount of their own and not be detected, raising the limit to have a block be able to contain just above the amount of typical transactions made per block period.

However, other considerations can be added. That suggestion is by no means final. Some extreme ideas (not outside the box, I know, but just to prove that this attack can be prevented with more constraints):
*To increase max block size, global orphan rate must be below 1%.
*To increase max block size, 95% of the blocks in the last difficulty period must be at/very near the current max size.
*Max block size can only increase by a factor of 2 over four years.
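Codifying those three constraints as a gate; a sketch only, with all names assumed and the 2x-per-four-years cap expressed as a per-retarget-period factor (roughly 104 two-week periods in four years):

Code:
# Sketch: gate any increase of the max block size on the three constraints above.
MAX_GROWTH_PER_PERIOD = 2 ** (1 / 104)  # ~0.67% per period, i.e. 2x over four years

def increase_allowed(global_orphan_rate, blocks_near_max, blocks_in_period):
    """True if raising the max block size is permitted this retarget period."""
    return (global_orphan_rate < 0.01                        # orphan rate below 1%
            and blocks_near_max / blocks_in_period >= 0.95)  # 95% of blocks at/near max

def next_limit(max_block_size, *checks):
    # Even when an increase is allowed, the step is capped by the growth constraint.
    return max_block_size * (MAX_GROWTH_PER_PERIOD if increase_allowed(*checks) else 1)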

For more ideas, think about the process of raising the block size limit in manual terms. What would/should we consider before manually raising the block size limit? Let's see if we can codify that...


markm (Legendary) | February 20, 2013, 07:42:47 AM | #154

Well, for starters, they need not use "previously unheard-of transactions" to pad blocks; they can spam their spam like Satoshi Dice does. Heck, they can even buy shares of Satoshi Dice, promote Satoshi Dice, and lobby the Foundation for the inalienable right of every two-bit loser's losses to be primary-blockchain news items (which is kind of, like, way the heck more massively globally important than mere front-page-news items).

Where does the idea that one has to hide one's padding come from? From the pathetically trivial transaction fees, whose justification increasing the block size kind of undermines?

Observe the historical behavior of miners: spending ten million dollars up front in order to have a slight edge a year down the line, for example as they have done with Butterfly Labs in the ASIC saga. And still they are enthusiastic, still thinking it's a good thing they blew that ten million way back when, because this year, finally, they might actually start to see some of that hoped-for edge. Meanwhile, just holding coins would have appreciated how much in that same span of time?

I think it is clear that throwing vast sums of money into slim chances of even a slight edge is not at all unlikely behavior for miners.

So we could start out with a guess at how much spam ten million dollars would buy, then maybe allow for bitcoin appreciation: those who did not blow the ten million on that saga back when it was many more bitcoins than it would be now, or maybe will be soon, and who still smart from failing to get in on that saga from the moment it became possible to attempt it, could have much, much more itching to be spent on the next slim-chance-of-an-edge saga. Unlimited block size just by paying pathetically low transaction fees? How many millions is that gonna cost? An edge is an edge, man. Anyone want to throw money at me so I can build myself a city of datacentres in Iceland? (I'll pay 7% a day in dividends once I control the entire network! Wink Smiley)

-MarkM-

Maged (Legendary) | February 20, 2013, 08:03:48 AM | #155

I think it is clear that throwing vast sums of money into slim chances of even a slight edge is not at all unlikely behavior for miners.
Agreed, which is why the requirements for increasing the maximum block size need to be difficult, but not impossible. Luckily, if we fail, we can just implement a soft-fork with either a manual limit that's lower than the automatic limit, or we can add new requirements for raising the maximum block size.

In fact, that just gave me another idea: we can make it so that whatever we end up doing expires at a certain block a few years in the future and reverts to unlimited block size. That way, if we screw up, we can keep trying again via a soft-fork at that expiration point.

markm (Legendary) | February 20, 2013, 08:30:57 AM (last edited 08:49:23 AM) | #156

TL;DR: Maybe factor in exchange rates, since the more massively and swiftly the value of a coin skyrockets, the more enthusiasts who picked up a few coins while they were still available for less than a hundred bucks each might become rich enough to be able to consider setting themselves up a full node...


If bitcoin exchange rates continue to skyrocket, at some point even people who only managed to scrape together a fraction of a coin (or, let's say, such a huge fraction of a coin that we might as well call it a whole coin to keep our numbers simple) could be in a position to blow that entire coin, which might have taken them many months to accumulate by applying all their included-in-rent electricity and all the hardware they could scrape up, on a full-node-capable home-internet system, *if* we don't raise the bar faster than even such devoted enthusiasts are able to keep up.

Do we want a full node to be forever out of such a devoted follower's reach, or would it be preferable that that carrot hovering before their eyes be attainable if they are actually able to make use of their existing hardware budget and pre-paid electricity efficiently (and never spend a satoshi of the satoshis p2pool doles out to them)?

Just trying to get some soundings of various data-points. Right now, without increasing the maximum block size, and actually maybe even if we do increase it in a Moore's Law fashion of 50% per year or 100% per eighteen months, it does not seem inconceivable that a truly dedicated and technically competent enthusiast could dream of eventually buying a full node with their strictly hoarded bitcoins...

-MarkM-

Nagato (Full Member) | February 20, 2013, 08:52:03 AM (last edited 09:08:21 AM) | #157

*To increase max block size, global orphan rate must be below 1%.
*To increase max block size, 95% of the blocks in the last difficulty period must be at/very near the current max size.
*Max block size can only increase by a factor of 2 over four years.

If you always increase block size to meet all demand, TX fees will inevitably fall towards 0.
But as I've said, I like the direction of having an automated block adjustment algorithm if it does not compromise on security and decentralisation of mining.

How about:
*To increase the max block size by n%, more than 50% of fee-paying transactions (which must meet a minimum fee threshold to be counted) during the last difficulty window were not included within the next X blocks. Likewise, we reduce the max block size by n% (down to a minimum of 1 MB) whenever 100% of all paying transactions are included within the next X blocks.

Basically it means that we only increase the block size if the majority of paying TXs are not being included within X blocks. The minimum fee threshold could be the average of all fees paid in the last difficulty window, with a hardcoded floor like it is now (0.0005), to prevent abuse. If the percentage of fee-paying TXs not being included within X blocks is less than 50%, then we do not increase the block size, to allow competition for express block inclusion.

Obviously the 50% number could be higher or lower, but I chose a middle ground. 50% means Bitcoin will adapt the block size to confirm at least half of all paying TXs as fast as possible.

X should be small enough for express to mean express, but it cannot be so small that 1-2 miners could collude and manipulate the block size. Given Satoshi's proof of the binomial random walk for double spends, I suggest X be 6 blocks.
n could be a conservative number like 5% to prevent any revenue shock to miners.
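A minimal Python sketch of that rule, purely as an illustration; only the 50%/100% thresholds, X = 6 and n = 5% come from the post, while the names, the data plumbing and the 1 MB floor are assumptions:

Code:
X = 6          # "express" confirmation target, in blocks
N_PCT = 0.05   # adjustment step per difficulty window
MIN_BLOCK_SIZE = 1_000_000  # assumed 1 MB floor

def adjust_for_fees(max_block_size, paying_txs):
    """paying_txs: (broadcast_height, confirm_height or None) pairs for the
    transactions that met the minimum fee threshold during the last window."""
    late = sum(1 for b, c in paying_txs if c is None or c - b > X)
    if late > 0.5 * len(paying_txs):
        # Most paying transactions are not confirming within X blocks: grow by n%.
        return max_block_size * (1 + N_PCT)
    if late == 0:
        # Every paying transaction confirmed within X blocks: shrink by n%.
        return max(max_block_size * (1 - N_PCT), MIN_BLOCK_SIZE)
    return max_block_size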

caveden (Legendary) | February 20, 2013, 08:52:57 AM | #158

I'm sorry for cross-posting, but unfortunately I have a hard time seeing this thread as a "Technical Discussion". This is much more an economic discussion than a technical one by now. And in another thread I said what I think about it while replying to hazek:

You don't understand. I use Bitcoin because it is built upon certain principles and built in such a way that those principles can't be "legislated" away with a rule change. If this isn't the case I have no use for Bitcoin.

The main economical principles of Bitcoin are not being threatened. If they were, I would be among the "resistance" for sure.
What happens is that there's this damn constant limit Satoshi inserted in the code because he couldn't think of something better at the moment he was coding, and now this thing is haunting us.

Wanting to cripple Bitcoin's scalability just because you fear that a super-cartel of pool operators would pop up without such a handicapping limit? Seriously?
That looks very much to me like the argument for anti-trust laws. This talk that high-bandwidth pool operators would voluntarily pad their blocks with useless data in a desperate attempt to kick out low-bandwidth pools*, thereby risking an increase in their propagation time and thus also in the chance of losing their reward... to me it sounds just like the argument that price dumping works. History and economic theory show us that price dumping doesn't work.

That summarizes it in economical jargon for me: the resistance against dropping the limit believes that "dumping techniques" can work, and fear that dropping the limit would be like dropping anti-trust laws. They are wrong.

* By the way, for totally different reasons, it's already not really possible to be a low-bandwidth pool. The reason for this is not the Bitcoin protocol per se, but DDoS attacks from bot operators mining on different pools. Unfortunately most major pools have already been DDoSed. Ultimately the only defense against such attacks is large bandwidth.
notig (Sr. Member) | February 20, 2013, 08:54:41 AM | #159

For the blocksize limit removal proponents: What are you afraid of?

That you cannot make transactions for basically free anymore? Don't you get it?
The limit is there for a reason... some of which were already mentioned in this thread.

Maybe once we reach the limit there will finally be some steady income for miners (read: not related to the initial block reward).
If the limit is reached consistently there would finally be a market for transactions which wasn't there before. You preach free markets and shit, but when it comes down to it you are afraid of them.

So leaving the limit in place and forcing most people's transactions to go through clearinghouses is going to suddenly make miners rich or give them an incentive to mine? Do people here really want to see transaction fees having to be 20 or more dollars each on the bitcoin network? And sure, the fees might be enormous, and the users will take the brunt of that blow. But will miners actually gain from such large fees? Doesn't the amount miners are "incentivized" to mine come from the product of the size of the transaction fees and their quantity, and wouldn't the quantity be reduced if you forced the majority of real-world usage transactions to take place totally off the bitcoin blockchain?

markm (Legendary) | February 20, 2013, 08:55:30 AM | #160

Quote
* By the way, for totally different reasons, it's already not really possible to be a low-bandwidth pool. The reason for this is not the Bitcoin protocol per se, but DDoS attacks from bot operators mining on different pools. Unfortunately most major pools have already been DDoSed. Ultimately the only defense against such attacks is large bandwidth.

Or p2pool?

-MarkM-
