Bitcoin Forum
Pages: « 1 2 3 4 5 [6] 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 »
Author Topic: How a floating blocksize limit inevitably leads towards centralization  (Read 71512 times)
markm
Legendary
Activity: 2940 | Merit: 1090
February 19, 2013, 03:25:34 PM (last edit: 03:42:39 PM by markm)
#101

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

"But miners can just include never-broadcast, fee-only transactions to jack up the fees in the block!"

Yes... but if their block gets orphaned then they'll lose those "fake fees" to another miner. I would guess that the incentive to try to push low-bandwidth/CPU miners out of the network would be overwhelmed by the disincentive of losing lots of BTC if you got orphaned.

That might work, but it is probably worthwhile not to un-cap the size limit, just to raise it to something that wouldn't cause problems if someone *did* choose to blow a few hundred or thousand bitcoins on blowing up the network. Someone who hacked half a million coins some years back could easily be out there who'd love to blow up the whole shebang for just a few tens of thousands of coins...

-MarkM-

EDIT: As to eight decimals, it provides price granularity, so your transactions of at least one whole bitcoin have plenty of granularity for representing many different fractions of prime or almost-prime larger numbers. It need not mean transactions of less than a whole coin are to be encouraged.
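The minimum-reward rule quoted above can be sketched as a block-validation check (a sketch with hypothetical helper names, not actual Bitcoin validation code):

```python
def meets_min_reward(block_size_bytes: int, subsidy_btc: float, fees_btc: float,
                     min_reward_per_mb: float = 50.0) -> bool:
    """Reject blocks larger than 1 MB whose total reward (subsidy + fees)
    falls below 50 BTC per megabyte of block size."""
    ONE_MB = 1_000_000
    if block_size_bytes <= ONE_MB:
        return True  # small blocks are always acceptable under this rule
    size_mb = block_size_bytes / ONE_MB
    return (subsidy_btc + fees_btc) >= min_reward_per_mb * size_mb

# A 2 MB block needs at least 100 BTC of total reward:
# meets_min_reward(2_000_000, 25.0, 35.0) -> False (60 BTC < 100 BTC required)
```

Note the orphan-risk argument in the post: a miner can always stuff fake fees into its own block to pass this check, but those fees become real payouts to a competitor if the block is orphaned.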

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
Mike Hearn
Legendary (expert)
Activity: 1526 | Merit: 1129
February 19, 2013, 03:31:17 PM
#102

I don't think we want magic numbers for how much in fees is "required". We don't actually know how much mining is needed today, except that whatever we need, we currently have too much (I don't see any malicious re-orgs that double-spend against merchants).

If speeds fall too far and people start getting systematically ripped off by malicious miners, the people who care (i.e., merchants) can club together to create fee-only transactions until hashing reaches the speeds they're happy with.

misterbigg
Legendary
Activity: 1064 | Merit: 1001
February 19, 2013, 03:32:24 PM
#103

My favorite quotes from the thread:

Storing value securely and moving significant value securely should not be sacrificed for the ability to tell the whole world about every penny-ante gambler's every wager and every child's purchase of a popsicle.

Blockchains are just simply bad at fast and cheap. They are a zero-information-hiding global flooding network (==costly) that depends on achieving a worldwide consensus (==slow). When a butterfly in China buys a dumpling with Bitcoin, processors in Nebraska must whirl, and everyone must wait until we've had time to hear back about all the whirling, or risk having their transactions reversed. And in spite of being slow, costly, and with a heaping helping of the market-failure-maker of externalized costs... it turns out that blockchains can actually do a whole lot. An O(N^2) algorithm can be quite fast when N is small. And with enough thrust (compromise) even a pig can fly (scale). But we shouldn't mistake that for scaling without compromise - because, at least as of yet, no one has figured out how.

I believe that Bitcoin plus a nice mixture of off-chain transaction processing can be all things to all people: it can provide unbreakable trust based on cryptographic proof, and it can provide infinitely scalable transactions. The chain doesn't scale great, but it can scale enough to handle high-value transactions (I pay $30 for an international wire today) and enough to interlink infinitely scalable additional mechanisms. People can pick their blend of security vs cost by choosing how they transact.

justusranvier
Legendary
Activity: 1400 | Merit: 1009
February 19, 2013, 03:36:51 PM
#104

The changes in the last year were "soft forks" -- forks that required all miners to upgrade (if they don't, their blocks are ignored), but that do not require merchants/users to upgrade.
For this change the distinction is hardly relevant, since it won't happen unless the merchants/users who run full nodes upgrade first.
johnyj
Legendary
Activity: 1988 | Merit: 1012
Beyond Imagination
February 19, 2013, 03:53:18 PM
#105

I don't see how a hard fork is somehow not Bitcoin. It's simply an upgrade. A bigger upgrade, but an upgrade nonetheless. If it is done in the face of serious controversy and there is a real split, then there is a naming problem. Otherwise it's just as much Bitcoin, just a new version.

It could be Bitcoin version 2.0 for example. If there is a permanent split, then the other network would be called something else.

I'm going to side with the version where the block size limit is raised, that is for sure. It's a big change, though, and it needs to be thought through carefully if it is to be done in a smart way. I don't advocate "just raising it"; I advocate planning the change well and thinking about the long-term future as well.

Changing the protocol is not the same as changing the client. Software became so cheap, even free, only because vendors could always come up with a new version with new features. That is the view of a service provider, not of a central bank.

Bitcoin gained its big success because it can be trusted, but people certainly do not want that trust to be easily changed by a group of designers at will.

Before, I supported changing the protocol in a carefully planned way to improve the end-user experience, but recently I discovered that after a hard fork you can double-spend on both the original chain and the new chain. That means the promise of double-spend prevention and limited supply is broken, which is much more severe than I thought.


Pieter Wuille
Legendary (qt)
Activity: 1072 | Merit: 1174
February 19, 2013, 04:04:17 PM
#106

The changes in the last year were "soft forks" -- forks that required all miners to upgrade (if they don't, their blocks are ignored), but that do not require merchants/users to upgrade.
For this change the distinction is hardly relevant, since it won't happen unless the merchants/users who run full nodes upgrade first.

A soft fork and a hard fork are not comparable.

In a soft fork, it is about getting the _majority_ of _miners_ behind the rule. Every piece of old software keeps working. Depending on the change, it may be advisable for merchants to upgrade to get the extra rules enforced, but those who don't just get dropped to SPV-level security. Nothing will break as long as somewhat more than 50% of hash power enforces the new rule.

In a hard fork, it is about getting _all_ of _everyone_ to change the rule at exactly the same time. Doing a hard fork where not everyone is on the same side is an outright disaster. Every coin that existed before the fork will be spendable once on each side of the split. If this happens, it is economic suicide for the system. Sure, it may recover after a short while, once people realize they should pick the side most others chose, but it is not something I want to see happening.

The only way a hard fork can be done, is when there is reasonable certainty that all players in the network agree.
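The asymmetry can be illustrated with a toy rule set (the 500 kB and 2 MB figures here are illustrative only, not actual proposals):

```python
def old_rules_valid(block_size: int) -> bool:
    # Original consensus rule: blocks up to 1 MB.
    return block_size <= 1_000_000

def soft_forked_valid(block_size: int) -> bool:
    # A soft fork *tightens* the rules: every block valid under the new
    # rules is still valid under the old ones, so old nodes follow along
    # (at SPV-level security) as long as most hash power enforces it.
    return block_size <= 500_000

def hard_forked_valid(block_size: int) -> bool:
    # A hard fork *loosens* the rules: some new-rule blocks are invalid
    # under the old rules, so old nodes reject them and the chain splits.
    return block_size <= 2_000_000

block = 1_500_000
print(old_rules_valid(block), hard_forked_valid(block))  # False True -> a split point
```

Any block accepted by `soft_forked_valid` is also accepted by `old_rules_valid`, which is why a soft fork only needs a miner majority, while a hard fork needs everyone.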

I do Bitcoin stuff.
Peter Todd (OP)
Legendary (expert)
Activity: 1120 | Merit: 1150
February 19, 2013, 04:04:33 PM
#107

If you think that the block size should stay at 1 megabyte forever, then you're saying the network will never support more than 7 transactions per second, and each transaction will need to be for a fairly large number of bitcoins (otherwise transaction fees will eat up the value of the transaction).

If transactions are all pretty big, why the heck do we have 8 decimal places for the transaction amount?

Why not? They're 64-bit numbers, might as well give plenty of room for whatever the price turns out to be.

More to the point, if I'm correct and in the future we're paying miners billions of dollars a year, that implies Bitcoin is probably transferring trillions of dollars a year in value, on and off chain. In that scenario the market cap is probably tens of trillions of dollars, so 1 BTC could easily be worth something like $10,000 USD. Thus the $20 fee is 0.002 BTC. That's pretty close to current fees in terms of BTC - you might as well ask why we have 8 decimal places now.

One reasonable concern is that if there is no "block size pressure" transaction fees will not be high enough to pay for sufficient mining.

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

"But miners can just include never-broadcast, fee-only transactions to jack up the fees in the block!"

Yes... but if their block gets orphaned then they'll lose those "fake fees" to another miner. I would guess that the incentive to try to push low-bandwidth/CPU miners out of the network would be overwhelmed by the disincentive of losing lots of BTC if you got orphaned.

You know, it's not a crazy idea, but figuring out the right BTC value is a tough, tough problem, and so is changing it later if your estimates of the BTC/USD rate, mining-hardware costs, or the orphan rate turn out to be wrong. You'll also create a lot of incentive for miners to build systems that crawl the whole Bitcoin network, discover every single node, and then connect to every node at once. With such a system, the second you find your block you'd broadcast it to every node immediately. Of course, the biggest miner who actually succeeds in doing this will have the lowest orphan rate, and thus much more room to increase block sizes, because faking tx fees costs them a lot less than it costs miners who haven't spent so much effort. (Effort that, like all this stuff, diverts resources from what really keeps the network secure: hashing power.)

This same system can also be used to launch Sybil attacks on the network. You could use it to launch double-spend attacks, or to monitor exactly where every transaction is coming from. Obviously this can be done already - blockchain.info already has such a network - but the last thing we need is to give people incentives to build these systems. As it is, we should be doing more to ensure that each peer a node connects to comes from a unique source run by an independent entity, for example by using P2Pool PoWs linked to IP addresses.

TierNolan
Legendary
Activity: 1232 | Merit: 1083
February 19, 2013, 04:19:03 PM
#108

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

That effectively sets a lower limit on the fees per transaction.
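To see the implied floor, assume (purely for illustration) an average transaction of ~250 bytes, so a full 1 MB block holds ~4,000 transactions; once the subsidy is negligible, 50 BTC spread over them is the minimum average fee:

```python
MIN_REWARD_PER_MB = 50.0   # BTC, from the quoted proposal
BLOCK_SIZE_MB = 1.0
AVG_TX_BYTES = 250         # assumed average transaction size

txs_per_block = int(BLOCK_SIZE_MB * 1_000_000 / AVG_TX_BYTES)
min_fee_per_tx = MIN_REWARD_PER_MB * BLOCK_SIZE_MB / txs_per_block
print(txs_per_block, min_fee_per_tx)  # 4000 transactions, 0.0125 BTC each
```

While the subsidy is still large it covers most of the 50 BTC, so the effective fee floor only bites as the subsidy halves away.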

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
markm
Legendary
Activity: 2940 | Merit: 1090
February 19, 2013, 04:27:11 PM
#109

Maybe we could start off with a really conservative schedule of increases, like every time the minting rate halves the maximum block size must at least double?

That would give us an already-overdue doubling from last year's halving of the minting, and provide that we will double the size again in less than four years if it isn't at least 4 megabytes by then.

That would give us an as-soon-as-we-can-reasonably-get-it-done doubling, along with a few years in which to argue about whether we can survive until the next halving of the minting or will have to do an emergency increase before then.

Meanwhile we can also start propagandising the idea that bitcoins ought rightfully to be worth thousands of dollars each, that it is kind of frivolous to use them for transactions of less value than that, and that there are plenty of merged-mined chains to choose from to start stabilising some other coin at some lower value per coin for smaller transactions...
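The schedule above can be sketched as a simple function of halvings elapsed (assuming the ~four-year subsidy halving interval; this is an illustration of the proposal, not any agreed rule):

```python
def max_block_size_mb(halvings_elapsed: int, initial_mb: float = 1.0) -> float:
    """Max block size at least doubles at each subsidy halving."""
    return initial_mb * (2 ** halvings_elapsed)

# One halving so far (late 2012) -> 2 MB now, 4 MB after the next, etc.
for h in range(4):
    print(h, max_block_size_mb(h))  # 0 1.0 / 1 2.0 / 2 4.0 / 3 8.0
```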

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
hazek
Legendary
Activity: 1078 | Merit: 1002
February 19, 2013, 04:27:18 PM
#110

If you think that the block size should stay at 1 megabyte forever, then you're saying the network will never support more than 7 transactions per second, and each transaction will need to be for a fairly large number of bitcoins (otherwise transaction fees will eat up the value of the transaction).

If transactions are all pretty big, why the heck do we have 8 decimal places for the transaction amount?

Why not? They're 64-bit numbers, might as well give plenty of room for whatever the price turns out to be.

More to the point, if I'm correct and in the future we're paying miners billions of dollars a year, that implies Bitcoin is probably transferring trillions of dollars a year in value, on and off chain. In that scenario the market cap is probably tens of trillions of dollars, so 1 BTC could easily be worth something like $10,000 USD. Thus the $20 fee is 0.002 BTC. That's pretty close to current fees in terms of BTC - you might as well ask why we have 8 decimal places now.

Not to mention off-chain transaction service providers will need the accuracy of 8 decimal places, or even more, when they eventually have to push the sums of off-chain transactions onto the chain.

My personality type: INTJ - please forgive my weaknesses (Not naturally in tune with others feelings; may be insensitive at times, tend to respond to conflict with logic and reason, tend to believe I'm always right)

If however you enjoyed my post: 15j781DjuJeVsZgYbDVt2NZsGrWKRWFHpp
caveden
Legendary
Activity: 1106 | Merit: 1004
February 19, 2013, 04:37:01 PM
#111

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

As a protocol rule?
I find that worse than an automatic adjustment of the max block size. You can't set prices like that. You can't predict what 50BTC will represent in the future.
Technomage
Legendary
Activity: 2184 | Merit: 1056
Affordable Physical Bitcoins - Denarium.com
February 19, 2013, 04:43:24 PM
#112

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

As a protocol rule?
I find that worse than an automatic adjustment of the max block size. You can't set prices like that. You can't predict what 50BTC will represent in the future.

No, it does make some sense. With this type of system in place, the security of the network would be based on a BTC based reward. This way we would always have similar security level relative to the actual value of the monetary base.

Bitcoin is bound to become less and less secure relative to the value of bitcoins, since the total reward for miners is going down. Right now it doesn't matter at all because our network, just like Mike Hearn said, is actually extremely secure relatively speaking. It will continue to be that way for a while yet, but not forever. At some point the security level might become questionable. If there were something in place along the lines of Gavin's suggestion, the security of the network would be pretty much guaranteed.

I do agree with Mike again that it's a bit questionable to just set it at 50 BTC. We really don't know how much mining is "high security", we only know that what we have now is quite enough. It would be wasteful to pay more fees for more mining if we don't actually need it.

Denarium closing sale discounts now up to 43%! Check out our products from here!
justusranvier
Legendary
Activity: 1400 | Merit: 1009
February 19, 2013, 05:08:46 PM
#113

In a hard fork, it is about getting _all_ of _everyone_ to change the rule at exactly the same time. Doing a hard fork where not everyone is on the same side is an outright disaster. Every coin that existed before the fork will be spendable once on each side of the split. If this happens, it is economic suicide for the system. Sure, it may recover after a short while, once people realize they should pick the side most others chose, but it is not something I want to see happening.
This isn't a hard problem to solve at a technical level. Have the nodes keep track of the version numbers they see on the network. When X% of the network has upgraded to a version which supports the new rules, and when Y% of the miners indicate their support via coinbase flags, switch to the new rules. Until then, use the old rules. Let the users decide when and if to switch.
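That trigger can be sketched as follows (the thresholds and the whole shape of the check are hypothetical, taken only from the post above; real deployment mechanisms came later and differ):

```python
def new_rules_active(node_versions: list[int], miner_support_flags: list[bool],
                     min_version: int, x_pct: float = 90.0, y_pct: float = 80.0) -> bool:
    """Switch to the new rules only once X% of observed nodes run an
    upgraded version AND Y% of recent blocks signal support in their
    coinbase. Until then, keep enforcing the old rules."""
    if not node_versions or not miner_support_flags:
        return False
    node_pct = 100.0 * sum(v >= min_version for v in node_versions) / len(node_versions)
    miner_pct = 100.0 * sum(miner_support_flags) / len(miner_support_flags)
    return node_pct >= x_pct and miner_pct >= y_pct
```

DannyHamilton's objection below still applies: a node that only ever sees a non-upgraded subset of peers would measure the wrong `node_pct`.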
Nagato
Full Member
Activity: 150 | Merit: 100
February 19, 2013, 05:14:39 PM (last edit: 06:30:44 PM by Nagato)
#114

Second half-baked thought:

One reasonable concern is that if there is no "block size pressure" transaction fees will not be high enough to pay for sufficient mining.

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

"But miners can just include never-broadcast, fee-only transactions to jack up the fees in the block!"

Yes... but if their block gets orphaned then they'll lose those "fake fees" to another miner. I would guess that the incentive to try to push low-bandwidth/CPU miners out of the network would be overwhelmed by the disincentive of losing lots of BTC if you got orphaned.

The issue with setting an arbitrary number like this is that it may not make economic sense to pay 50 BTC in fees for ~3600 transactions (~0.0138 BTC each), depending on how much Bitcoin is worth in the future. Should Bitcoin reach $100k, a transaction on average would cost in excess of $1000.

I'm all for a floating block size limit, but unlike difficulty, a change in the current block size affects *all* parties involved (clients, miners, all users via tx fees).
The core difference is that when someone wants to increase their hash rate, they bear the cost of investing in additional hardware. At most it will raise the difficulty and force other miners to mine more efficiently, or drive the less efficient ones out of business, while increasing the security of the network. Users benefit (for "free") from increased mining competition via a more secure network. Free market forces will drive down mining profitability/transaction costs and increase hash rates to an equilibrium, and everyone wins.

Now when there is a block size increase (whether floating or hard-forked as a rule), things get messy.

Pros
- Transactions are verified cheaply and processed as fast as they are today

Cons
- Increased storage costs for everyone (full nodes, miners)
- Increased bandwidth requirements (everyone, including SPV clients)
- Reduced network hash rate == reduced security of Bitcoin (due to lower transaction fees, miners invest less in hashing power)

So let's break this down.

Storage
I'm surprised nobody is talking about storage issues. Sometimes when I launch the reference client after a few days, I start thinking how absurd it is that each SD bet has become a permanent cost (time to download, disk space, network usage) for every user of the full client, now and in the future. Even at 1 MB/block, we are looking at block chain growth of about 55 GB a year (including misc indexing files/USX DB). Increase this to 10 MB and you start requiring a dedicated hard disk (500+ GB) for every year's worth of blocks. At 10 MB it starts requiring a considerable investment to run a full node after a year. This means your average user WILL NEVER run a full node after a year of this change. After a few years, running a full node becomes the domain of medium-sized companies.
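The storage arithmetic above is easy to check from the ~10-minute block target (index overhead is ignored here, which is why the post's ~55 GB figure is slightly higher):

```python
BLOCKS_PER_YEAR = 365 * 24 * 6          # one block every ~10 minutes
for block_mb in (1, 10):
    raw_gb = BLOCKS_PER_YEAR * block_mb / 1000
    print(block_mb, round(raw_gb, 1))   # 1 MB -> ~52.6 GB/yr; 10 MB -> ~525.6 GB/yr
```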

Solution:
Now let's assume that pruning is implemented and we start storing only the unspent outputs; at 10 MB, a 2 GB unspent-output DB starts to seem reasonable. A few archival nodes containing the full blocks could be run by donation-financed sites or the big Bitcoin businesses (MtGox, blockchain, etc.). Or the full client could be modified to include a DHT mode, instead of or in addition to the pruning mode, to allow the average user to store a subset of the block chain.

Network Bandwidth
As easy as it is for Mike to say that 100 Mbit connections are widely available and bandwidth is not an issue, the fact is that not everyone lives in Kansas or Korea. If you look at Asia (excluding Japan/Korea/Singapore/Taiwan/HK), there are not many countries where you can get a stable 512 kbps connection. Speed aside, even developed countries like the US/Canada/Australia/New Zealand have many ISPs with puny bandwidth caps of 50-100 GB per month, charging above $70. Parts of Europe have extremely expensive internet with poor connectivity as well. This may or may not change. Some countries have government-imposed monopolies allowing poor service and high prices, while others lack the government investment or economies of scale to warrant building out internet infrastructure.

Still, having only miners worry about network bandwidth is fine in my opinion, as it is a competitive business.
A full node used for verification should not need to worry about a 1-2 minute block download time, as it is not in a race to find the next block. But it does mean that full nodes starting afresh may not be able to catch up with the current block if their download speeds are too slow. For a 1 Mbit connection, even a 10 MB block size would be pushing it. To me it becomes a serious issue when half the people in the world are unable to run a full node because the blocks are too large for them to catch up.

Security
This is something we cannot account for, because we have not had a precedent-setting breach of security. Still, a single incident could be fatal to Bitcoin's reputation as a secure form of money, something it may not be able to recover from (in fact this may lead to a loss of confidence in the currency and cause a collapse of its value akin to hyperinflation scenarios). I think this point should not be taken lightly; we know there are many parties who would benefit from Bitcoin's demise and would not mind mounting an attack at a loss.

My Take
I'm with retep and max on this one, for a couple of reasons. Even if it is technically feasible, we should be extremely conservative in raising the block size limit (if at all), simply because of security.

On a more philosophical level, I dislike wasteful systems.
I find it absurd that with ever more powerful hardware, software is getting slower and more bloated. In the past couple of decades there has been a culture of waste in computing as we abuse Moore's Law.
Let's run virtual machines because the hardware is getting faster.
But then we need 1 GB of RAM and a 1 GHz dual-core processor with a fast GPU to swipe 16 icons on screen smoothly.
Compare this with the Genesis/Megadrive, which ran the Sonic series on a 7 MHz processor with 64 KB of RAM and no GPU, and you start to realise just how inefficient and wasteful today's software has become.

Bitcoin as a decentralised P2P system is extremely inefficient compared to a centralised system like Visa, as has been pointed out by multiple posters. Now in Bitcoin's case the inefficiency is a requirement of maintaining a decentralised system, a necessary evil if you will. Competing centralised services built atop Bitcoin to cater for micropayments will not only be more efficient and cheaper but also instant, something Bitcoin will not be able to compete with and should not. Advocating the use of Bitcoin for volumes it is not optimised for just seems extremely wasteful to me.

It is important to understand that these centralised services will be more akin to the tech industry than to today's banking industry. Anyone can start a tech company in his basement or garage if he wants; you can't start a bank or a fiat payment processor like Visa unless you have connections with people of power (regulators, governments, big banks), because of many artificial barriers to entry. Unlike fiat currency, Bitcoin is an open platform, and anybody is free to build services atop it for use cases Bitcoin itself is not well suited to.
Likewise, anyone who can host a website can start a micropayment processor. The big exchanges and hosted wallet services would (and already do) do it for free, to save on Bitcoin tx fees and allow instant confirmation. Moreover, this is largely going to be for micropayments, where customers maintain a small pre-paid deposit and processors clear balances regularly (hourly to daily) with other processors. The losses in the event of a dishonest processor would be minimal.

In summary, leave the block size alone and let the free market work around it. Bitcoin's primary role was to liberalize money, and that goal should not be compromised to support a micropayment network it is ill-suited for.

Technomage
Legendary
Activity: 2184 | Merit: 1056
Affordable Physical Bitcoins - Denarium.com
February 19, 2013, 05:16:17 PM
#115

I like jojkaart's idea, can you guys comment more on it?

"How about tying the maximum block size to mining difficulty?

This way, if the fees start to drop, this is counteracted with the shrinking block size. The only time this counteracting won't be effective is when usage is actually dwindling at the same time.
If the fees start to increase, this is also counteracted with increasing the block size as more mining power comes online.

The difficulty also goes up with increasing hardware capabilities, I'd expect that the difficulty increase due to this factor will track the increase of technical capabilities of computers in general."
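One naive sketch of such a coupling (the logarithmic scaling and reference difficulty here are purely illustrative assumptions, not part of jojkaart's proposal, which does not specify a formula):

```python
import math

def max_block_bytes(difficulty: float, base_bytes: int = 1_000_000,
                    base_difficulty: float = 1_000_000.0) -> int:
    """Grow the block size limit with the log of difficulty relative to a
    reference point, so the limit rises as mining power (and hence the
    fee revenue paying for security) rises, and never drops below 1 MB."""
    if difficulty <= base_difficulty:
        return base_bytes
    return int(base_bytes * (1 + math.log2(difficulty / base_difficulty)))
```

The sublinear (log) growth reflects the concern raised elsewhere in the thread that hashing power historically grows much faster than bandwidth and storage.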

Denarium closing sale discounts now up to 43%! Check out our products from here!
DannyHamilton
Legendary
Activity: 3388 | Merit: 4653
February 19, 2013, 05:30:36 PM
#116

This isn't a hard problem to solve at a technical level. Have the nodes keep track of the version numbers they see on the network. When X% of the network has upgraded to a version which supports the new rules, and when Y% of the miners indicate their support via coinbase flags, switch to the new rules. Until then, use the old rules. Let the users decide when and if to switch.

So if my client only ever connects to a subset of clients that have chosen not to upgrade, then my client won't know to make the switch when the rest of the network switches?
caveden
Legendary
Activity: 1106 | Merit: 1004
February 19, 2013, 06:04:57 PM (last edit: 06:30:05 PM by caveden)
#117

As a protocol rule?
I find that worse than an automatic adjustment of the max block size. You can't set prices like that. You can't predict what 50BTC will represent in the future.

No, it does make some sense. With this type of system in place, the security of the network would be based on a BTC based reward. This way we would always have similar security level relative to the actual value of the monetary base.

You can't predict how much the monetary base will be worth nor how much security is enough. You can't set prices like that.

I do agree with Mike again that it's a bit questionable to just set it at 50 BTC. We really don't know how much mining is "high security", we only know that what we have now is quite enough. It would be wasteful to pay more fees for more mining if we don't actually need it.

That's what I'm saying, you can't know what's enough or more than enough, so you can't simply set a mandatory amount per Mb.
You seem to be contradicting yourself in the same post...
Technomage
Legendary
Activity: 2184 | Merit: 1056
Affordable Physical Bitcoins - Denarium.com
February 19, 2013, 06:13:23 PM
#118

That's what I'm saying, you can't know what's enough or more than enough, so you can't simply set a mandatory amount per Mb.
You seem to be contradicting yourself in the same post...

True, but I haven't formed any final opinions on the whole issue, so that sort of explains it. I'm still throwing around ideas and scenarios in my mind, like many others here I suppose.

Denarium closing sale discounts now up to 43%! Check out our products from here!
cjp
Full Member
Activity: 210 | Merit: 124
February 19, 2013, 06:32:37 PM
#119

Half-baked 1:
why the heck do we have 8 decimal places for the transaction amount?
That's just a design decision that had to be made in an early stage, when it wasn't clear yet what the potential of Bitcoin would be. I think it was a good decision at the time, since the number of bits for storing values is still small compared to e.g. scriptsigs. As other posters have mentioned, it can still be useful in the future, even when we will never have really small (satoshi-sized) transactions in the block chain.

Half-baked 2:
Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.
How is that arbitrary limit better than a "number of transactions" limit?
Also, you chose quite a high number: it would be about 0.01 BTC/transaction. How is that going to allow "satoshi-sized" transactions?

About coupling the transaction limit to difficulty:
Interesting idea. For now I only see one problem: it is possible (and likely) that in the future processing power will increase at a different rate than network+storage, so you could still run into trouble. But since it's a "better approximation" of the optimal transaction limit than a constant limit, you'll probably run into trouble "less often".
Just make sure it's set low enough so that people can run full nodes at moderately small investments (maybe not for the average consumer, but at least for a non-profit hobbyist).

Donate to: 1KNgGhVJx4yKupWicMenyg6SLoS68nA6S8
http://cornwarecjp.github.io/amiko-pay/
markm
Legendary
Activity: 2940 | Merit: 1090
February 19, 2013, 07:05:01 PM (last edit: 07:36:40 PM by markm)
#120

Once ASIC production gets down pat, churning out chips whose research and setup costs have long been covered, and so many are in use that they are only borderline profitable (and even then maybe only in places where electricity is dirt cheap and/or the heat produced is needed or recycled back into more electricity), what level of difficulty will we be looking at?

A megabyte per decimal digit of difficulty would already be seven megabytes right now, since difficulty hit 1,000,000. I am not sure offhand how many leading zeroes that is in the binary form actually used by the code.
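Both figures are easy to check (a sketch; "decimal digits" here means digits of the integer difficulty, and the leading-zero count uses the fact that difficulty D adds roughly log2(D) zero bits to the target):

```python
import math

difficulty = 1_000_000
decimal_digits = len(str(difficulty))             # 7 digits -> 7 MB under that rule
extra_zero_bits = round(math.log2(difficulty))    # ~20 extra leading zero bits beyond difficulty 1
print(decimal_digits, extra_zero_bits)            # 7 20
```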

But fundamentally we just cannot really know what the future needs will look like, which is why I still favour going through this whole should-we-shouldn't-we, is-everyone-on-board hard-fork process each time it starts to seem to some as if the limit does need to be raised.

How long is that process likely to take? We seem to have oodles of spare space currently; if we take our time about maybe adding a whole 'nother megabyte, we still might find we have more than enough space by the time it even comes into effect.

I have even read here and there claims that it is mostly Satoshi Dice that has made block sizes climb enough to notice; if that is so, I'd say we should first correct the failure to discourage such frivolous waste before considering increasing the block size.

Since I last wrote, the bitcoin price has risen significantly again, so it is clearer by the hour that the block size limit is not discouraging use or driving away adoption enough to hit us in the pocketbooks, if at all. Maybe the people investing in the dream of $10,000 per bitcoin actually like the idea that transaction fees will be high enough to ensure each and every one of their coins is safer than houses.


-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/