Bitcoin Forum
May 23, 2024, 09:16:49 AM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
  Show Posts
Pages: « 1 2 3 4 5 6 [7] 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 ... 95 »
121  Bitcoin / Development & Technical Discussion / Re: Please do not change MAX_BLOCK_SIZE on: June 04, 2013, 09:43:10 AM
The block size will be raised; the overwhelming consensus among the people who are actually writing code and using Bitcoin for products and services is that it needs to happen.

And there is a tiny minority of people who will loudly proclaim that isn't true, and that the core developers are going to destroy Bitcoin if the block size is raised.

If you want to be helpful, please organize a list of objections to raising the block size limit and responses to those objections.

I believe the last objection raised was that a higher block size limit would make it impossible to mine anonymously, but I think that has been debunked with the notion of "read the firehose of transactions non-anonymously, then broadcast just new block header + coinbase + listof(truncated transaction hashes) anonymously."

I'll soon be writing up a plan for how we can safely raise the block size limit.


RE: central planning:

No central planning is why I would like to eliminate the hard, upper blocksize limit entirely, and let the network decide "how big is too big."

RE: "the plan"  :   The plan from the beginning was to support huge blocks.  The 1MB hard limit was always a temporary denial-of-service prevention measure.


Thank you Gavin for keeping common sense on this matter.

Little suggestion to the dev team: when dropping the block size limit, also consider implementing "replacement code" that would give block generators the ability to control the block limit themselves through soft limits. Soft limits could be useful not only on the size of the block, but also on the amount of "unknown transactions" seen in a new block: spammer-miners would have to fill their blocks with transactions that are not in other generators' memory pools. In other words, don't forget to provide the tools for block generators to collectively control the block size.
You may bet that, as soon as there's no longer a hard limit on block sizes, some of these "Bitcoin should be crippled and limited to a SWIFT 2.0" people will attempt to spam the network with gigantic blocks, if only to "prove their point".
122  Bitcoin / Bitcoin Discussion / Re: Peter Vessenes: Take a step back and F*** YOUR OWN FACE!!! on: June 03, 2013, 06:13:53 AM
I'm glad I never donated to Bitcoin Foundation.

+1

This Vessenes guy is... a "bad influence", to say the least.

I won't be surprised the day he proposes mandatory white-listing of addresses.
I hope Gavin knows that's the moment to jump off.
123  Bitcoin / Bitcoin Discussion / Re: The Holy Grail! I wish I could kiss the author of Bitmessage on his face. on: June 03, 2013, 05:55:39 AM
Are colored coins even economical in 0.8.2? I heard some concerns that the stifling of dust transactions would affect them.

You could either waste the arbitrary amount of 5430 satoshis on every colored coin, or, with the collaboration of a single mining pool, get colored-coin transactions through even with 1-satoshi outputs. Obviously miners would request some fee to be paid; that's why I wonder whether an alternate chain wouldn't be a better idea, as fees could be paid with the tokens themselves. But anyway, colored coins on the main chain are already implemented, while an alternate chain would have to be created from scratch.
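To make the trade-off concrete, here's a small sketch of what the 5430-satoshi relay threshold means for a token issuer. This is my own illustration, not actual colored-coins or Bitcoin client code; the function names are hypothetical.

```python
# Illustration of the dust constraint discussed above (not real client code):
# under the 0.8.2 relay rules, standard outputs below the dust threshold
# (5430 satoshis at the time) are not relayed, so a colored-coin issuer must
# either pad every token output up to that amount or hand the transaction
# directly to a cooperating mining pool.

DUST_THRESHOLD = 5430  # satoshis, the 0.8.2-era default relay policy

def output_relayable(value_satoshis: int) -> bool:
    """Would ordinary nodes relay an output of this value?"""
    return value_satoshis >= DUST_THRESHOLD

def padding_cost(n_token_outputs: int, token_value: int = 1) -> int:
    """Satoshis 'wasted' padding tiny token outputs up to the threshold."""
    return n_token_outputs * max(0, DUST_THRESHOLD - token_value)

print(output_relayable(1))    # False: needs a cooperating pool instead
print(padding_cost(100))      # 542900 satoshis wasted on 100 token outputs
```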
124  Bitcoin / Bitcoin Discussion / Re: Why does there need to be a limit on amount of transactions? on: May 31, 2013, 12:24:33 PM
Except that the block size issue is not akin to free market dynamics.

In a free market, when someone makes a purchase, the only parties directly affected are the purchaser and the seller.

In the blockchain world, when someone publishes a 1GB block, the price is paid by every full node while the reward is only
collected by the publisher. This dynamic is closer to private profit, public loss.

Dude, this is a recurring scenario in economic theory: the "fear" that free markets cannot internalize costs when there's the possibility of a tragedy of the commons. It's similar to security in meatspace.
And guess what? It's perfectly possible to eliminate the tragedy-of-the-commons risk through spontaneous order, as long as property rights are established and respected.

And btw, private profit with public loss only happens when the state comes onto the scene.

There is much incentive for a well connected miner to publish a large block for 3 reasons
i) He and only he gets more txn fees. The larger the block, the more revenue he gets.

Nagato, if there are real transactions paying large fees to get included, this represents real demand. Miners had better serve it, or Bitcoin jams!

And as I've explained twice in this thread already, the risk of hitting soft limits would make miners prudent about this. They would only increase their block sizes when there's enough consensus, or when demand is so strong that it's worth the risk. In both cases we are fine.

P2Pool has an insignificant share of hashing power even though miners get to keep 100% of all earnings vs mining pools which take a cut or txn fees.

Why?
Because the cost of running a full node outweighs the revenue loss from mining with a pool.

Please, I and many others run a full node without getting anything in return.

AFAIK P2Pool is not very popular because it allegedly has large stale rates. I don't know if this claim is factual.

Personally, I think keeping the Bitcoin protocol decentralised is much more important than keeping its direct transactional capabilities decentralised.

Both things will always be the case, if you remove the hardcoded constant limit.

Ideally, the community takes the middle ground and increases the block size slowly to keep pace with bandwidth increases.

But that's precisely what I'm saying! Block size should be controlled by everybody, through their choices and plans, not by a centrally imposed formula.
125  Bitcoin / Bitcoin Discussion / Re: Why does there need to be a limit on amount of transactions? on: May 31, 2013, 06:39:22 AM
but to counterbalance people who appear to seriously want to dump the limit altogether in one go, and hope things will just work out, with no empirical evidence beyond thought experiments and forum debates.

Sigh...
It's not a "hope", it's an aprioristic certainty. You don't need central planning to avoid "nefarious market cartelization"; just study economics if you don't believe me.

Talking Bitcoin specifics: it's easy to spot a spamming attempt by another miner, since its blocks will contain a large percentage of unknown transactions. So, just create soft limits to censor blocks with many unknown transactions. Say, if a block contains more than 10% unknown transactions, don't build on top of it unless it's already 1 block deep. If it contains >20%, wait for it to be 2 blocks deep, etc. Obviously the percentages and depths should be configurable.
You can also add such limits on the block size itself. Larger than X, wait for Y depth at least. Multiple (X,Y) configurable pairs.
Oh, and as a bonus, such soft limits would create an incentive for miners to relay and broadcast the transactions they receive; today they have no incentive other than the counter-incentive of having to patch the software. If they keep transactions they receive from SPV nodes to themselves, they might get their block orphaned.
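A minimal sketch of the depth-based soft limit proposed above. The first two thresholds mirror the post's 10%/20% examples; the rest extend its "etc." with invented values, and everything would be configurable as the post says.

```python
# Depth-based soft limit on "unknown transactions", as proposed above.
# The first two rules mirror the post (>10% unknown: wait 1 deep,
# >20%: wait 2 deep); the later rules are illustrative placeholders.

DEPTH_RULES = [          # (max unknown fraction, confirmations required)
    (0.10, 0),           # <=10% unknown: build on top immediately
    (0.20, 1),           # >10%: wait until it's 1 block deep
    (0.30, 2),           # >20%: wait until it's 2 blocks deep
    (1.00, 3),           # worse than that: wait 3 deep (made-up value)
]

def required_depth(block_txids, mempool_txids):
    """How many confirmations the block needs before we'll extend it."""
    unknown = sum(1 for txid in block_txids if txid not in mempool_txids)
    fraction = unknown / len(block_txids)
    for max_fraction, depth in DEPTH_RULES:
        if fraction <= max_fraction:
            return depth

def should_extend(block_txids, mempool_txids, current_depth):
    """Mine on top only once the block is buried deep enough."""
    return current_depth >= required_depth(block_txids, mempool_txids)
```

A block whose transactions all came from our own mempool is extended immediately; a block full of never-seen transactions has to earn several confirmations first.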

It's quite visible that miners would only slightly raise their limits when they believe the gains from adding more paying transactions would outweigh the potential losses from orphaning. That's spontaneous regulation: transaction-space supply adapting to its demand.
And it's absurd to claim that a remote, unclear chance of kicking out very-low-bandwidth miners would be so attractive as to make large-bandwidth miners risk losing their entire block to orphaning.

Please, people, being p2p is the greatest feature of Bitcoin. P2P is all about spontaneous order - an actual verifiable fact, not a mere "hope". How can you claim to support the first and largest p2p currency and yet be against spontaneous order?

126  Bitcoin / Bitcoin Discussion / Re: Why does there need to be a limit on amount of transactions? on: May 30, 2013, 08:36:09 PM
Agreed, higher bandwidth connections will be more common in the future; however, if 1% of potential users have a 1 Gbps connection and that becomes the minimum, then you have reduced the potential full nodes to <1% of the planet.  The numbers also aren't as rosy as they seem at first glance.  A node by definition needs connections to multiple peers, so a node connected to 8 peers will rebroadcast a tx it receives to 7 peers.  Now 8 is the minimum; for network security we really want a huge number of nodes with high levels of connectivity (20, 30, 500+ connections).  So let's look at 20.
...

Come on, D&T... I know that you know that a node should only need to broadcast a tx to all of its peers if it's the very first to receive and validate it. Nodes can first send an "I have this new tx" message, which is small (the size of the tx hash), and then upload the tx only to the peers that requested it. Not all of your peers will request it from you; they're connected to other nodes too.

I used the number 10 in a conservative way... I don't think a node would upload the same transaction 10 times on average; that seems a high number to me.

But it'd be interesting to see statistics on how many times a node has to upload a tx, proportionally to its amount of connections. I never saw any.
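The announce-then-fetch pattern described above can be sketched like this. Class and method names are my own illustration, not the actual Bitcoin wire-protocol code: the point is simply that the expensive full upload happens only for peers that request the transaction after the cheap announcement.

```python
# Sketch of announce-then-fetch relay: send a tiny "I have tx X" message
# (hash-sized) to every peer, and upload the full transaction only on
# request. Illustrative names; not the real Bitcoin P2P implementation.

class Peer:
    def __init__(self, name):
        self.name = name
        self.known_txids = set()

    def wants(self, txid):
        """A peer requests the tx only if it hasn't seen it already."""
        return txid not in self.known_txids

    def receive(self, txid):
        self.known_txids.add(txid)

def relay(txid, peers):
    """Announce to everyone; count how many full uploads were needed."""
    uploads = 0
    for peer in peers:
        if peer.wants(txid):      # cheap announcement, hash-sized
            peer.receive(txid)    # expensive full upload
            uploads += 1
    return uploads
```

A peer that already received the tx from someone else never triggers the full upload, which is why total upload bandwidth stays well below "full tx size times number of connections".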

The last issue is what Nagato mentions above (although his numbers are low due to the need to broadcast to multiple peers).  

I've already answered Nagato above. (and I know that you knew that too...)

BTW despite the post I am bullish on Bitcoin, solutions can be found however those advocating dropping all limits because of "centralization" need to realize at the far extreme it just leads to another form of centralization.  When only a tiny number of players can afford the cost of running a mining pool (and 1, 10, 50 Gbps low latency connectivity) or run a full node you have turned the p2p network into a group of a few hundred highly connected peers.

I'm confident that spontaneous order can easily tackle block size control. Miners can implement soft limits, not only on block size per se, but also on the percentage of unknown transactions in a block as I said above (normally you should have most transactions of the new block in your pool, if you don't, it might represent a spamming attempt).
Just look at miners today: they're already extra-conservative, only to ensure the fastest possible propagation.

Guess what: modern banking IS a peer-to-peer network of a few hundred highly connected peers.   The fact that you can't be a peer on the interbank networks doesn't mean the network doesn't exist.  The barriers (legal, regulatory, and cost) just prevent you from becoming a peer.

Banking is an industry in symbiosis with the state. The problem with it is the regulations: they are the barrier to entry that makes it so hard to hop in. The cost of the business per se shouldn't be that high; taking care of people's money (which is mostly digital today) has no reason to be a more costly business than, for instance, a complex factory.
Just take a look at the number of competitors that show up in places where banking regulations are less burdensome, like Panama, and compare it with other places (relative to the country's population and GDP).
127  Bitcoin / Bitcoin Discussion / Re: Why does there need to be a limit on amount of transactions? on: May 30, 2013, 08:10:34 PM
What many people don't realise is that the bandwidth numbers quoted on the wiki and by you only apply to keeping up with the block generation rate. An independent miner will need 100x - 1000x more bandwidth to mine at all.

1 MB block size produced ONCE every 10 minutes, NOT over 10 minutes.
If I'm a miner, I want to download that new block as fast as possible to reduce my idle time.
Let's use 1% idle time as your target (means your entire mining farm sits idle for ~6s while you download the block).
...

That's not the case. If you were online since before that block started to be built, you already received all its transactions; they're all in your transaction pool. There's no actual need to download them again (that's a performance improvement suggested on the scalability page, by the way).
To start mining on the next block, all you need is the header of the previous one and a sorted list of transaction hashes to build the Merkle tree. That's much less data than the entire block.
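For illustration, rebuilding the Merkle root from just the ordered transaction hashes looks like this: a sketch of Bitcoin's double-SHA256 pairing rule, where a level with an odd count duplicates its last entry.

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes):
    """Fold an ordered list of 32-byte tx hashes into the Merkle root.

    With only the previous block's header and this hash list, a miner can
    start working on the next block without re-downloading transactions
    it already holds in its pool."""
    level = list(tx_hashes)
    assert level, "a block always contains at least the coinbase tx"
    while len(level) > 1:
        if len(level) % 2:                 # odd count: duplicate the last
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```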

Unless of course the block contains lots of transactions that are not on the memory pool, in that case you'll have to download these unknown transactions.

And there you have it: an easy way to detect if a spamming attempt is in progress. If a significant share of the transactions in the new block was not present in your memory pool, you should consider that block a spamming attempt by a miner and refuse to mine on top of it, unless of course it's already more than X blocks deep, in which case you accept it (soft limits).
If the spamming miner decides to broadcast his spamming transactions instead, he'd hit anti-spam fee policies and end up needing to pay other miners in the network to include his spam.

Just to clarify im not opposed to an increase in block size as long as decentralisation is not compromised by ensuring that the block size remains small enough for average residential broadband connections/commodity PCs to mine with.

Almost everybody agrees with that. The argument is between those who think an arbitrary formula should be invented and imposed via the protocol, and those who believe that spontaneous order (or p2p, free market, freedom - pick your term) can implement better and safer control of block size without a centralized formula. Well, there's also a third group that thinks the 1MB limit should be kept, but I can't take them seriously...
Not only do I believe spontaneous order would reach better results, I also agree with D&T when he says that setting a formula is technically (and politically) complicated, and potentially error-prone (it might introduce bugs).
128  Bitcoin / Bitcoin Discussion / Re: Why does there need to be a limit on amount of transactions? on: May 30, 2013, 07:23:56 AM
The bottleneck is more in this order (from most critical to least critical):
bandwidth (for residential connections the upload segment tends to be rather constrained)
memory (to quickly validate txs & blocks the UTXO set needs to be kept in memory; sure, pulls from disk are possible and ... painfully slow)
storage (as much as people worry about storage it is a free market unlike residential last mile and HDD capacities already have a massive "headstart")
cpu (with moore's law I don't see this ever being a problem but as pointed out non CPU solutions are possible)

I agree with your bottleneck order. Bandwidth will probably be the first, particularly with SSDs getting cheaper (you can store your UTXO set on an SSD for better I/O performance). CPU can be dramatically improved, as you say. Storage is not such a big deal. And if memory becomes a big deal, good caching strategies together with SSDs could circumvent it.

Let's talk bandwidth then... It seems people in Kansas City already have 1 Gbit/s available in their homes, up and down. Assuming the 400-byte average for a Bitcoin transaction that I read somewhere, that's more than 300K tps if I'm not mistaken. That's a shitload of transactions. Even if transaction sizes were to multiply by 3 due to more usage of multi-signature features (something that I hope will happen), that would still be more than 100K tps. What's the average number of times a full node has to upload the same transaction? It shouldn't be much, due to the high connectivity of the network. But even if you have to upload the same transaction 10 times, Google Fiber would probably allow you to handle more transactions than Visa and Mastercard combined! We're obviously not hitting such numbers anytime soon. Until then, there might be much more than 1 Gbit/s available for residential links.
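Here's the back-of-the-envelope arithmetic behind those figures; the 400-byte average transaction size and the 10x upload redundancy are the post's own assumptions, not measured values.

```python
# Back-of-the-envelope throughput of a residential link, per the figures
# above: average tx size and redundancy factor are the post's assumptions.

def max_tps(link_bits_per_sec, avg_tx_bytes, upload_redundancy=1):
    """Transactions per second the link could upload."""
    bytes_per_sec = link_bits_per_sec / 8
    return bytes_per_sec / (avg_tx_bytes * upload_redundancy)

print(max_tps(1e9, 400))       # 312500.0 -> the "more than 300K tps" figure
print(max_tps(1e9, 1200))      # ~104167  -> still >100K tps with 3x larger txs
print(max_tps(1e9, 400, 10))   # 31250.0  -> uploading each tx 10 times over
```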

All these desperate attempts to hold the block limit become ridiculous when we look at the numbers.

The average Joe hasn't even started using Bitcoin today.  The requirements of a full node will only increase.  Today users are already encouraging new/casual users towards lite nodes and eWallets.  That trend will only accelerate. 

I'm not sure. The greatest issue for new users is having to wait for the initial sync. If the client were to operate as an SPV node in the meanwhile, and switch to full mode once the initial sync is complete, I guess many more people would be OK with running a full node. Well, some would complain about how slow their computer got after they installed this bitcoin-thing, and might be turned off. But not as much as today.
129  Alternate cryptocurrencies / Altcoin Discussion / Re: How does Ripple consensus deal with forks? on: May 30, 2013, 06:56:26 AM
this could only occur if people are using totally disparate, non-overlapping UNLs which will not happen in practice due to defaults and universally trusted nodes.
...
people running clients who are not validators need not worry about it. they may use the default list of UNLs.
Also, it is all about UNL overlap.

So, people are expected to trust a set of "default validators". Everybody.

Don't you believe that gives too much power to these validators? Okay, you may argue they could be in different jurisdictions, some of them could be anonymous entities on a darknet etc, but still.

Not to mention this "default list". I'm supposing it'd be attached to the client. What power that gives to these clients' editors...

regardless of that childlike campaign, the simple fact remains: the bigger the block, the longer the transmission time. we are essentially relying on Moore's Law in network capacity here, which does not apply across the board in computing. still, there are other issues, described above.

Have you ever read this? https://en.bitcoin.it/wiki/Scalability
Mike Hearn has already made lots of good posts on this subject too, as well as Gavin. Search for their posts if you're interested.
130  Bitcoin / Bitcoin Discussion / Re: The Holy Grail! I wish I could kiss the author of Bitmessage on his face. on: May 30, 2013, 06:43:42 AM
@dscotese

The whole point here is not to save fiat currency. If you're going to hold fiat-backed tokens, well, you're accepting all the problems you talk about. The whole point here is to considerably improve the process of exchanging fiat for bitcoins. Bitcoins do not suffer from the problems you cite: you can be cryptographically sure your "bank" has your coins. But unfortunately we need - and we'll keep needing for a long time - to exchange bitcoins for fiat and vice-versa. If we could render such a process much more reliable and censorship-resistant, that'd be a good thing, wouldn't it? That's the point here.
131  Alternate cryptocurrencies / Altcoin Discussion / Re: How does Ripple consensus deal with forks? on: May 29, 2013, 10:07:18 AM
Of course they'd be "ignored"; that's the very definition of a "split": each part of the network believes in its own version of the ledger. But how do they get to common ground? How do you fix the split and merge everything back, without allowing for dangerous double-spends?

I'm not talking about lying, I'm talking about honest splits; please read the OP.
If my node's UNL doesn't share a large intersection with your node's UNL, we may split. Especially considering the high settlement frequency Ripple supposedly works with. At least that's my understanding of the protocol.
since everything is signed you know generally who the other validators are and can prevent splits/merge them.
Quote
If a validator cannot keep up with the load of transactions being voted into the consensus set, it will switch to issuing partial validations. These let those who trust it know that it is not split off from the network (and potentially validating other ledgers). In addition, validators will raise the transaction fees they demand to prevent the network from shrinking to a small set of "super nodes", as that would increase the risk of collusion.
remember, this is a distributed network, so each validator comes into play to reach consensus. If one node's unique node list agrees on one ledger, that node will also agree on that ledger. Since every node has a unique node list and all ledgers/validators are signed, the network can never fork in a significant manner since the network essentially knows all the node players and knows which node players signed what.

This is not clear to me.
It seems you're saying that for it to work, all nodes would have to set basically the same UNL, or at least not set UNLs which are meaningfully different.

That's not a desirable requirement. More than that, I don't see how you could guarantee it.

I'm correct to assume that the validation consensus need not be unanimous, right? If 10% of your UNL doesn't agree with the ledger, you simply ignore them and trust the other 90%, right?
Then how do you guarantee a split won't happen?
Or else, how do you fix a split once it happens? Manual reconfiguration cannot be an option: it could take too long and leave people vulnerable to double-spends.
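To make the worry concrete, here is a toy way to quantify UNL overlap between two nodes. This is purely illustrative: Ripple's actual agreement rules are more involved, and any "safe" overlap threshold would be an assumption.

```python
# Toy UNL-overlap measure for the split concern discussed above.
# Illustrative only; not Ripple's actual consensus math.

def unl_overlap(unl_a, unl_b):
    """Fraction of the smaller list that both nodes share."""
    a, b = set(unl_a), set(unl_b)
    if not a or not b:
        return 0.0
    return len(a & b) / min(len(a), len(b))

alice = {"v1", "v2", "v3", "v4", "v5"}
bob   = {"v4", "v5", "v6", "v7", "v8"}
print(unl_overlap(alice, bob))   # 0.4: low overlap, hence a split risk
```

Two honest nodes with an overlap this low could each see their own list agree on a ledger while disagreeing with each other, which is exactly the split scenario the question is about.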

ordinary users would not have to know what a validator is

You say what's above, then you say this:

not really, the slightest change could get you distrusted and ignored in the validator pool.  

But how, if most people don't even know what a validator is? Who would distrust the validator implementing "slight changes"?

i wrote "spammers" not "scammers."

Sorry, my bad. (There's no actual spamming problem in the blockchain, though... these dust rules were mostly unnecessary IMO.)

this just means Bitcoin cannot update to become what Ripple is - cannot simply update once something catastrophic happens.

Of course it can, we saw it happening this year with the 0.7 Berkeley DB issue.


I'm aware of retep's desperate attempts to spread this false idea. That doesn't make it correct, though.
132  Alternate cryptocurrencies / Altcoin Discussion / Re: How does Ripple consensus deal with forks? on: May 29, 2013, 08:55:55 AM
The forkers would be ignored and cut out.

Of course they'd be "ignored"; that's the very definition of a "split": each part of the network believes in its own version of the ledger. But how do they get to common ground? How do you fix the split and merge everything back, without allowing for dangerous double-spends?

Since every validation must be signed, it is blatantly apparent who is lying.

I'm not talking about lying, I'm talking about honest splits; please read the OP.
If my node's UNL doesn't share a large intersection with your node's UNL, we may split. Especially considering the high settlement frequency Ripple supposedly works with. At least that's my understanding of the protocol.

It is all reputation based - so if I have a reputation as a good validator people will choose to trust my node as a validator.

I'm not sure ordinary users would even bother to know what a validator is, let alone a good one, but anyway, that's not the question.

Imagine taking out all of the spammers, "dust bunnies," and bad guys from the Bitcoin protocol

You'll only take out scammers and the like if you introduce chargebacks. But then you actually just push the problem onto merchants, who can take no action to counter these scammers (only their direct victims could potentially have done so).

Even if 80% of validators that somehow managed to gain trust decided to fork Ripple dishonestly the major exchanges, institutions, and people who matter in commerce would cut them out immediately and would never trust them again as validators based on their signatures.

You might be right. On the other hand, protocol changes can be introduced without people having to change their clients. If 80% of validators decide to introduce some change in the protocol, your node will automatically consent to it. In Bitcoin, you'd need to take the action of updating to the new protocol version.
I wonder if that's not a potential vulnerability. How many big actors would really change their validators? Wouldn't introducing dangerous changes little by little be a possibility?

What's more is that Bitcoin cannot change its core without becoming an entirely different protocol.

I consider that an advantage. The contract doesn't change without your explicit and active consent. In Ripple, you may "consent" without even knowing it (passively).

If we increase the size of blocks we move toward centralization

That's false and FUD.

Even if Moore's law results in better connectivity and speed, we are still wasting all this electricity and processing power on mining just to prevent the 51% attack. Ripple helps solve these problems and is a happy medium.  

Granted, mining is costly. But I'm still not 100% convinced Ripple actually "solves it". Nobody has yet answered this topic's question: how does ripple deal with splits/forks?

I would rather embrace a distributed, technically unified protocol that will transition from centralized to decentralized than a fragmented, bottlenecked protocol which will transition from decentralized to centralized.

Bitcoin is neither.
133  Bitcoin / Bitcoin Discussion / Re: The Holy Grail! I wish I could kiss the author of Bitmessage on his face. on: May 29, 2013, 08:55:24 AM
Datz, I answered your post in the appropriate thread. If you could transfer the discussion to that thread, I'd appreciate it. Thanks.
134  Bitcoin / Bitcoin Discussion / Re: The Holy Grail! I wish I could kiss the author of Bitmessage on his face. on: May 29, 2013, 06:16:51 AM
Ripple does more than just solve the "multiple currency" transaction problem - it provides a more efficient validation and confirmation ledger platform. Mining is unnecessary and inefficient in the long run.

Perhaps, but I'm still not 100% convinced. For example, what's the answer to this question?

We will see higher and higher transaction fees as Bitcoin adoption increases if the block size does not increase.

I agree with that. And that's why the block size limit has to be lifted. Even if it implies a chaotic hard fork due to the stubborn ones who want to cripple Bitcoin... so be it. The majority will prefer the scalable and affordable Bitcoin, not the SWIFT 2.0 one.
135  Bitcoin / Project Development / Re: P2PX. Using SSL dumps as proof of money transfer on: May 27, 2013, 12:11:56 PM
2. The other problem is the technical problem of "recording" only *part* of the SSL session, so as not to expose the most sensitive data. I don't think the user could first login to internet banking, and then switch on a proxy to start rerouting traffic via the escrow's machine (who would then dump all decrypted SSL output). That's what we want, but it sounds somewhere between tricky and impossible. There needs to be a solution to this.
Dan's fallback of changing your password after the audit is a great idea for a last resort (or for the sensibly paranoid), but it can't be the main idea there.

Hum, I thought the recording was being done on localhost, but only now I realize that if the escrow doesn't perform the SSL handshake himself and record everything himself in real time, he can't really be sure the recorded data wasn't forged. That's correct, right? Only the handshake is properly signed? Individual messages are not signed by the server?

Tough.

On a slight tangent, I personally don't mind if others see my name and account number. As far as I'm concerned, that's semi-public knowledge.
...
so really the worst that happens from giving someone your statement is that they see your bank balance.

Of course, it's not the name + account number that's sensitive (at least not IMO), it's the balance and transaction history.
136  Bitcoin / Project Development / Re: P2PX. Using SSL dumps as proof of money transfer on: May 27, 2013, 08:53:32 AM
To answer the question "why should my escrow see that" - the difficulty here is we are trying to prove whether the BTC seller has received the promised USD in his account or not. That can only be resolved by seeing a list of transactions for the given date.

You could show only the details of the relevant transaction. At least every internet banking site I've used so far does display a "confirmation page" after a transaction is finished, and this confirmation page contains all the data relevant to the escrow (source and target accounts, amount, date, description).
The only "inconvenience" is that the plugin wouldn't be able to automatically recognize the confirmation page (well, it could learn if it were really fancy software with AI and all, but let's keep it simple, please Wink), meaning the user would have to manually indicate "this is the right page to save". That's why I mentioned a print-screen-like button. And as long as the internet banking site displays the details of past transactions, the escrow could always instruct his clients on how to obtain proof of transfer in case of a dispute, meaning that in "everything goes well" use cases, people wouldn't even need to bother installing the plugin.
137  Bitcoin / Project Development / Re: P2PX. Using SSL dumps as proof of money transfer on: May 27, 2013, 07:26:30 AM
Sorry for not reading the entire topic, I'm just answering the OP. If what I'm saying has already been said, just ignore me. Wink

It's a powerful idea, but there's only one thing that bothers me:

4. After that all further traffic between user<<-->>bank gets redirected by the plugin through the escrow agent's proxy

All traffic? As soon as I log into my internet banking my balance is displayed, together with latest transactions. Why should my escrow see that?

Can't the plugin have a sort of "print-screen" button, and only send the content of the current page when this button is pressed (together with the SSL session information, of course)? The user could then send only the receipt page for the transfer, which should be enough.
138  Bitcoin / Bitcoin Discussion / Re: Bitcoin Island/City and More on: May 27, 2013, 07:04:54 AM
As best I can tell, the core belief of a Libertarian is if they can rip off someone, it means ipso-facto that they are superior and deserve to have the victims money by virtue of that alone.  In practice, at least.

Troll harder.
139  Bitcoin / Bitcoin Discussion / Re: The Holy Grail! I wish I could kiss the author of Bitmessage on his face. on: May 25, 2013, 10:56:50 PM
Actually I meant what I said.

Have you tried libertyreserve.com lately?

I never tried it.
Wasn't aware of it. Thanks for the link.
140  Bitcoin / Bitcoin Discussion / Re: Why does there need to be a limit on amount of transactions? on: May 25, 2013, 10:47:15 PM
This is always a political and economic discussion instead of a technical one. In one word: scarcity creates value, abundance destroys value. As a currency you want it to have the highest value possible, so transaction capacity should be scarce; people will adjust their behavior accordingly.

What?? You're mixing the value of the currency with the value (cost) of transacting it.

The scarcity of bitcoin is set in stone: 21M units. No more.

Making Bitcoin transactions more scarce - thus more expensive - is more likely to reduce the value of the currency due to its reduced utility (if it's more expensive to transfer it around, it's certainly less useful).