johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
May 24, 2013, 08:28:07 PM |
Scarcity only matters to the extent that it contributes to value. The value of the currency comes from the demand to supply ratio. Demand will be limited if transactions are limited. The supply is 21 million BTC regardless of how many transactions are allowed, so more transactions = more demand for each unit of BTC.
Will you change banks just because one bank opens 24/7 and another opens only 4 days a week? I guess the priority should be based on the security and the interest that banks provide. Currently the bitcoin bank pays you interest of 1000%+ per year, and you still complain that you have to wait in line to withdraw.

The value of a currency comes from trust and consensus, not supply and demand; this is the most significant difference between money's valuation and anything else's. The demand for money is always endless, given enough trust in that money.
jdbtracker
May 24, 2013, 08:53:34 PM |
Scarcity only matters to the extent that it contributes to value. The value of the currency comes from the demand to supply ratio. Demand will be limited if transactions are limited. The supply is 21 million BTC regardless of how many transactions are allowed, so more transactions = more demand for each unit of BTC.
Will you change banks just because one bank opens 24/7 and another opens only 4 days a week? I guess the priority should be based on the security and the interest that banks provide. Currently the bitcoin bank pays you interest of 1000%+ per year, and you still complain that you have to wait in line to withdraw. The value of a currency comes from trust and consensus, not supply and demand; this is the most significant difference between money's valuation and anything else's. The demand for money is always endless, given enough trust in that money.

Very true, we have conceded to value this currency... could it be that the exchanges are hijacking its value? We have had speculators flooding the market this year, but wow! Have you seen how steady the price has been across all the exchanges lately? So how do we concede to value Bitcoin then, rationally and backed by science, instead of by scarcity? The fees as well; there is a market on fees... but more importantly it seems to be guided by luck. Maybe you're lucky and you get an ASIC miner to take care of your transaction instead of an efficient FPGA miner. And what about the ecosystem of the other alt coins? How does that change its valuation? The factors for Bitcoin are complex and deep, but very entertaining to discuss. I don't have answers, just questions.
If you think my efforts are worth something; I'll keep on keeping on. I don't believe in IQ, only in Determination.
RationalSpeculator (OP)
Sr. Member
Offline
Activity: 294
Merit: 250
This bull will try to shake you off. Hold tight!
May 25, 2013, 04:52:55 AM |
Why the wink? Yes I saw that, but there is a strong argument against this. As long as an ordinary personal computer can deal with it, it will remain decentralised. But since ordinary personal computers go up in power exponentially, so can the blocksize limit/amount of transactions go up exponentially, right?

VISA is transferring inflationary fiat money for the purpose of spending; people will always spend an inflationary currency first, so even if bitcoin's network were capable of handling a huge number of transactions per second, it would still seldom be used to buy milk every day.
I think that is a strong argument against bitcoin that initially came up for me. However this is counterbalanced by the fact that people always want to receive sound money in the first place. Sound money, like standardised gold and silver coins, came into widespread usage not because people insisted on spending those, but because people insisted on receiving those.

This is always a political and economic discussion instead of a technical one. In a word: scarcity creates value, abundance destroys value; as a currency you want it to have the highest value possible, so transaction capacity should be scarce, and people will adjust their behavior accordingly.

Thanks for explaining that scarcity element applied to the number of transactions. I don't follow that reasoning though. Like many others I think the value of the currency goes up thanks to the scarcity of bitcoins, but goes down if there is a scarcity of transactions. People will adapt by using another currency for smaller transactions, lowering the value of bitcoin and bitcoins.
caveden
Legendary
Offline
Activity: 1106
Merit: 1004
May 25, 2013, 10:38:09 PM |
caveden I think you misunderstood I wasn't talking about miners but full nodes.
Indeed, I thought you were talking about miners. I don't view a smaller number of full nodes as a problem. A single honest full node is enough to spot any attempt at fraud. There'll always be multiple honest full nodes running.

If eventually the network handles 100,000 tps but the demands on a full node are so high that only the largest of the largest of companies operating in massive datacenters can run a full node then you might as well call those "full nodes" .... banks.
Come on. You know you'll never need a "largest of the largest company operating in massive datacenters" to run a full node. 100k tps? What's that? Visa apparently operates at around 4,000 tps according to the scalability page. How could Bitcoin possibly outgrow Visa by 25 times in any reasonable time? The world has ~7 billion people. A day has 86.4k seconds. With 100k tps, that'd be 8.64 billion transactions per day if I'm not mistaken. That's more than 1 transaction per person in the world per day (even if we consider population growth, it'd still be close to one transaction per person per day at least). I mean... Bitcoin would already be the king of currencies if it ever reached such amounts. It'd be something so freaking important in the world that you can rest assured there would be more than enough honest nodes running. Remember: one single honest node can spot fraud (like breaking the 21M limit, for example). And also, by that time, we can't even imagine how powerful hardware will be.

The idea that miners would soft fork blocks which are "too big" is equally disturbing.
Why disturbing? Btw, that's not something that can be prevented by the protocol.

First of all it is unlikely it would work. Nobody is going to make a block larger than what all but the tiniest fraction of miners says is "too big". If 10% of miners say they will soft fork blocks over 5MB, only an idiot miner would make a block larger than 5MB. You are taking a 10% orphan chance by making the block even a few bytes over the limit.
Yes, the blocksize would only be increased if there's a large consensus, or if the monetary incentives to take more transactions outweigh the risks/losses from eventual "orphanage". In this last case, that's supply adapting to strong demand. See, market coordination.

Another element is that Bitcoin works on the concept (at least in theory) that miners are independent actors. If various miners soft fork at various levels you have created a disincentive to remain independent. You NEED to know what all your peer miners are doing. We shouldn't be building a system where independent miners (the desired state) are at a disadvantage to miners in coordination.
It depends how you see it. Today there's a protocol rule imposing a 1MB limit. That's 100% coordination; everybody follows the same rule. By dropping it, you allow a bit more "independence". Some miners may try a limit slightly larger. It's unavoidable: there must be some planning concerning what a "too big block" is, because otherwise the network is vulnerable to flooding. I personally prefer "planning by the many", and that's miners coordinating what's reasonable and what's not. They will tend to adapt to actual demand. Attempts at "central planning" (elaborating some formula etc.) might not produce ideal results and, as you noted yourself, are difficult and error-prone.

Lastly, if various miners do have differing soft fork levels then an attacker can exploit that to degrade the effective hashrate of the network. This would open the network to a 51%-style attack with less than 51% of hashpower. What matters is effective hashpower (i.e. hashpower applied to the longest chain). If miner X has a soft fork level of x MB and miner Y has a soft fork level of y, then by planting various blocks of differing size the attacker could fragment the good miners into working on competing chains. Anytime miners are on multiple chains, the network is only as strong as the hashpower on the chain with the most hashpower. Note that "miner" in this case is the entity (the pool, for most hashers) that is making the strategic decision of what chain to extend and what tx to include. Hash processors aren't truly miners in that they are already following an authority (the pool server).
If I understand it correctly you're talking about a race attack with the intention to double-spend some payments, not a total overtake, because an overtake wouldn't work out this way in the long run. It might give somewhat higher chances of a race attack, but I doubt it'd be anything meaningful. By the way, I'm not sure I get how this attack would take place... Say I generate a block that I know somehow will hit the soft limits of 50% of the network (in hashpower). That means I split the network: half of it is mining on top of my block, the other half on the previous one. Now if I have >25% of the network's hashpower (a lot!) I can outpace either of the competing chains. But the thing is that whichever chain I choose not to mine on will likely outpace the other. Even if it's the chain with the large block, eventually it will be accepted by those who initially refused it. It would still be very difficult to get any double-spend on anything with a meaningful number of confirmations. High value transactions should always wait for a significant number of confirmations. And double-spending small transactions is not worth the investment.
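caveden's per-person arithmetic earlier in this post checks out; here is a quick sketch of it (hypothetical Python, not from the thread):

```python
# Check the arithmetic: at 100k tps, how many transactions per day,
# and how does that compare to the world's population?
SECONDS_PER_DAY = 24 * 60 * 60            # 86,400 seconds ("86.4k")
tps = 100_000
tx_per_day = tps * SECONDS_PER_DAY        # transactions per day
world_population = 7_000_000_000          # ~7 billion people

per_person = tx_per_day / world_population
print(tx_per_day)              # 8640000000 -> the "8.64 billion" in the post
print(round(per_person, 2))    # 1.23 -> "more than 1 tx per person per day"
```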
caveden
Legendary
Offline
Activity: 1106
Merit: 1004
May 25, 2013, 10:47:15 PM |
This is always a political and economic discussion instead of a technical one. In one word: Scarcity create value, aboundant destroy value, as a currency you want it to have highest value possible, so transaction capacity should be scarce, people will adjust their behavior accordingly
What?? You're mixing the value of the currency with the value (cost) of transacting it. The scarcity of bitcoin is set in stone: 21M units. No more. Making Bitcoin transactions more scarce, and thus more expensive, is more likely to reduce the value of the currency due to its reduced utility (if it's more expensive to transfer it around, it's certainly less useful).
johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
May 26, 2013, 11:08:32 PM |
As long as an ordinary personal computer can deal with it, it will remain decentralised. But since ordinary personal computers go up in power exponentially, so can the blocksize limit/amount of transactions go up exponentially, right?
The personal computer's performance has not gained significantly in recent years, and it is already a pain to download the whole blockchain with an average block size of less than 256KB. The bitcoin client's setup time is already magnitudes higher than it was 2 years ago, but you won't have magnitudes higher hard drive capacity and network bandwidth in 2 years.

I think that is a strong argument against bitcoin that initially came up for me. However this is counterbalanced by people always want to receive sound money in the first place. Sound money like standardised gold and silver coins, came into widespread usage, not because people insisted on spending those, but because people insisted on receiving those.

An interesting view... I suppose that those who have the money decide if they want to spend it. Of course merchants will give some discount for bitcoin payment; maybe it will work. Then we're back to the already-too-big blockchain problem.

Thanks for explaining that scarcity element applied to amount of transactions. I don't follow that reasoning though. Like many others I think the value of the currency goes up thanks to scarcity of amount of bitcoins, but goes down if there is a scarcity of amount of transactions. People will adapt by using another currency for smaller transactions, lowering the value of bitcoin and bitcoins.
The VISA transaction fee is very expensive, but that doesn't affect the value of the USD. Same for bitcoin: the blocksize limit is there to ensure the value of bitcoin (kept as decentralized as possible), but if the transaction cost is high, people will just stop using it for small transactions.
amincd
May 27, 2013, 03:40:23 AM |
The VISA transaction fee is very expensive, but that won't affect the value of USD. Same for bitcoin, the blocksize limit is there to ensure the value of bitcoin (decentralized to maximum), but if transaction cost is high, people will just stop use it to do small transactions
Bitcoin's transaction fee being low is one of its major advantages over the dollar, which has to use relatively expensive third party payment processors like Visa for digital transactions. If bitcoin transactions become too expensive for most uses, then bitcoin will be just like fiat currency, where you are forced to use large private third party intermediaries, like Visa, for transaction processing, who can hike up their fees because they are the only processors with large enough private networks to be useful.
Abdussamad
Legendary
Offline
Activity: 3696
Merit: 1584
May 28, 2013, 11:37:49 AM |
The average Joe is not running a qt client.
Then what are they using? I think most people google for bitcoin and end up on bitcoin.org where they see this: Bitcoin-Qt is an app you can download for Windows, Mac, and Linux. Bitcoin Wallet for Android runs on your phone or tablet.
http://bitcoin.org/en/choose-your-wallet

So yeah, they end up using bitcoin-qt. Then they complain that the blockchain download is taking ages. I know, that's exactly what happened to me.
RationalSpeculator (OP)
Sr. Member
Offline
Activity: 294
Merit: 250
This bull will try to shake you off. Hold tight!
May 29, 2013, 11:19:55 AM Last edit: May 30, 2013, 08:09:18 AM by RationalSpeculator |
As long as an ordinary personal computer can deal with it, it will remain decentralised. But since ordinary personal computers go up in power exponentially, so can the blocksize limit/amount of transactions go up exponentially, right?
The personal computer's performance did not get any significant gain during recent years and it is already a pain to download the whole blockchain with an average block size less than 256KB. The bitcoin client's setup time is already magnitudes higher than it was 2 years ago but you won't have magnitudes higher harddrive capacity and network bandwidth in 2 years

This article says the opposite: that the number of transactions can still go up dramatically for a standard home PC: https://en.bitcoin.it/wiki/Scalability

"As we can see, this means as long as Bitcoin nodes are allowed to max out at least 4 cores of the machines they run on, we will not run out of CPU capacity unless Bitcoin is handling 100 times as much traffic as PayPal"

"Let's assume an average rate of 2000tps, so just VISA. ... That means that you need to keep up with around 8 megabits/second of transaction data. ... This sort of bandwidth is already common for even residential connections today, and is certainly at the low end of what colocation providers would expect to provide you with."

What do you think of the arguments and calculations presented?
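The wiki's bandwidth figure quoted above can be reproduced with quick arithmetic (a hypothetical Python sketch; the ~500-byte average transaction size is my back-calculation from the quote's numbers, not something the wiki quote states):

```python
# Reproduce the wiki's "around 8 megabits/second" figure for 2000 tps.
tps = 2000                     # "an average rate of 2000tps, so just VISA"
avg_tx_bytes = 500             # assumed average transaction size
mbits_per_second = tps * avg_tx_bytes * 8 / 1_000_000
print(mbits_per_second)        # 8.0 -> "around 8 megabits/second"
```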
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
May 29, 2013, 02:58:19 PM |
That math is valid and the CPU really isn't the bottleneck. If it ever became a bottleneck, the use of OpenCL GPU acceleration or even dedicated ASIC processors capable of high-speed ECDSA verification and SHA256 hashing is certainly a possibility (eventually the "easy money" from producing ASIC miners will dry up). That may still happen anyway to reduce power consumption, heat, and processor load (much like many servers use dedicated ASICs for SSL or TCP offloading).
The bottleneck is more in this order (from most critical to least critical):
- bandwidth (for residential connections the upload segment tends to be rather constrained)
- memory (to quickly validate txs & blocks the UTXO set needs to be kept in memory; sure, pulls from disk are possible and... painfully slow)
- storage (as much as people worry about storage, it is a free market, unlike the residential last mile, and HDD capacities already have a massive "headstart")
- cpu (with Moore's law I don't see this ever being a problem, but as pointed out, non-CPU solutions are possible)
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
May 29, 2013, 03:04:29 PM |
The average Joe is not running a qt client.
Then what are they using? I think most people google for bitcoin and end up on bitcoin.org where they see this:

The average Joe hasn't even started using Bitcoin today. The requirements of a full node will only increase. Today users are already encouraging new/casual users towards lite nodes and eWallets. That trend will only accelerate. The demands on a full node are certainly manageable, but there will always be a cost to being an equal peer in a global transaction network. Many users will opt out of that cost by using lighter solutions.
caveden
Legendary
Offline
Activity: 1106
Merit: 1004
May 30, 2013, 07:23:56 AM |
The bottleneck is more in this order (from most critical to least critical):
- bandwidth (for residential connections the upload segment tends to be rather constrained)
- memory (to quickly validate txs & blocks the UTXO set needs to be kept in memory; sure, pulls from disk are possible and... painfully slow)
- storage (as much as people worry about storage, it is a free market, unlike the residential last mile, and HDD capacities already have a massive "headstart")
- cpu (with Moore's law I don't see this ever being a problem, but as pointed out, non-CPU solutions are possible)
I agree with your bottleneck order. Bandwidth will probably be the first, particularly with SSDs getting cheaper (you can store your UTXO set on an SSD for better I/O performance). CPU can be dramatically improved, as you say. Storage is not such a big deal. And if memory becomes a big deal, good caching strategies together with SSDs could work around it. Let's talk bandwidth then... It seems people in Kansas City already have 1Gbit/s available in their homes, up and down. Assuming the 400-byte average for a bitcoin transaction that I read somewhere, that's more than 300k tps if I'm not mistaken. That's a shitload of transactions. Even if transaction sizes were to triple due to more usage of multi-signature features (something that I hope will happen), that would still be more than 100k tps. What's the average number of times a full node has to upload the same transaction? It shouldn't be much, due to the high connectivity of the network. But even if you have to upload the same transaction 10 times, Google Fiber would probably allow you to handle more transactions than Visa and Mastercard combined! We're obviously not hitting such numbers anytime soon. Until then, there might be much more than 1Gbit/s available for residential links. All these desperate attempts to hold the block limit become ridiculous when we look at the numbers.

The average Joe hasn't even started using Bitcoin today. The requirements of a full node will only increase. Today users are already encouraging new/casual users towards lite nodes and eWallets. That trend will only accelerate.
I'm not sure. The greatest issue for new users is having to wait for the initial sync. If the client were to operate as an SPV node in the meanwhile, switching to full mode once the initial sync is complete, I guess many more people would be OK with running a full node. Well, some would complain about how slow their computer got after they installed this bitcoin-thing, and might be turned off. But not as much as today.
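The Google Fiber arithmetic earlier in this post can be checked with a quick sketch (hypothetical Python; the 400-byte average, the 3x multi-signature factor, and the 10x relay factor are the post's own assumptions):

```python
# caveden's arithmetic: 400-byte transactions over a 1 Gbit/s link.
link_bps = 1_000_000_000       # 1 Gbit/s, up and down
tx_bits = 400 * 8              # assumed 400-byte average transaction

tps = link_bps / tx_bits
print(int(tps))                # 312500 -> "more than 300k tps"
print(int(tps / 3))            # 104166 -> still "more than 100k tps" with 3x txs
print(int(tps / 10))           # 31250  -> even uploading each tx 10 times over
```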
Nagato
May 30, 2013, 10:00:06 AM |
Let's talk bandwidth then... It seems people in Kansas City already have 1Gbit/s available in their homes, up and down. Assuming the 400bytes average for bitcoin transaction that I read somewhere, that's more than 300Ktps if I'm not mistaken. That's a shitload of transactions. Even if transaction sizes were to multiple by 3 due to more usage of multi-signature features (something that I hope will happen), that would still be more than 100Ktps. What's the average number of times a full node has to upload the same transaction? It shouldn't be much, due to the high connectivity of the network. But even if you have to upload the same transaction 10 times, Google Fiber would probably allow you to handle more transactions than Visa and Mastercard combined! We're obviously not hitting such numbers anytime soon. Until there, there might be much more than 1Gbit/s available for residential links.
All these desperate attempts to hold the block limit become ridiculous when we look at the numbers.
What many people don't realise is that the bandwidth numbers quoted on the wiki and by you only apply to keeping up with the block generation rate. An independent miner will need 100x-1000x more bandwidth to mine at all.

A 1 MB block is produced ONCE every 10 minutes, NOT over 10 minutes. If I'm a miner, I want to download that new block as fast as possible to reduce my idle time. Let's use 1% idle time as your target (meaning your entire mining farm sits idle for ~6s while you download the block):

To download 1MB over 6s, you need about a 1.7 Mbps connection (seems reasonable for most people in developed countries). 10MB block size: 17 Mbps (even I do not have a 17 Mbps connection at home, though it is affordable enough if I need it). 100MB block size: 170 Mbps (most countries are at least 5-10 years away from having affordable fibre internet). And that is assuming 1% is the market-determined edge you can afford to lose and remain profitable.
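Nagato's download-time figures can be sketched as follows (hypothetical Python; note the exact arithmetic gives ~1.3 Mbit/s for 1 MB in 6 s, so the ~1.7 figure in the post presumably allows some headroom for protocol overhead):

```python
# Link speed needed to fetch a fresh block within ~6 s, i.e. 1% of the
# 10-minute block interval, per the idle-time argument above.
def required_mbps(block_mb, window_s=6.0):
    """Mbit/s needed to download block_mb megabytes in window_s seconds."""
    return block_mb * 8 / window_s

for size_mb in (1, 10, 100):
    print(size_mb, "MB ->", round(required_mbps(size_mb), 1), "Mbit/s")
```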
Nagato
May 30, 2013, 10:07:48 AM |
Just to clarify, I'm not opposed to an increase in block size as long as decentralisation is not compromised: the block size must remain small enough for average residential broadband connections/commodity PCs to mine with.
justusranvier
Legendary
Offline
Activity: 1400
Merit: 1013
May 30, 2013, 05:09:50 PM |
1 MB block size produced ONCE every 10 minutes NOT over 10 minutes

This could be solved by pre-announcing blocks: as soon as a miner decides on a list of transactions to include in a block, they start broadcasting that list in parallel with hashing, and then broadcast the nonce once they've found it.
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
May 30, 2013, 05:18:38 PM |
Let's talk bandwidth then... It seems people in Kansas City already have 1Gbit/s available in their homes, up and down. Assuming the 400bytes average for bitcoin transaction that I read somewhere, that's more than 300Ktps if I'm not mistaken. That's a shitload of transactions. Even if transaction sizes were to multiple by 3 due to more usage of multi-signature features (something that I hope will happen), that would still be more than 100Ktps. What's the average number of times a full node has to upload the same transaction? It shouldn't be much, due to the high connectivity of the network. But even if you have to upload the same transaction 10 times, Google Fiber would probably allow you to handle more transactions than Visa and Mastercard combined! We're obviously not hitting such numbers anytime soon. Until there, there might be much more than 1Gbit/s available for residential links.

Agreed, higher bandwidth connections will be more common in the future; however, if 1% of potential users have a 1 Gbps connection and that becomes the minimum, then you have reduced the potential full nodes to <1% of the planet. The numbers also aren't as rosy as they seem at first glance. A node by definition needs connections to multiple peers, so a node connected to 8 peers will rebroadcast a tx it receives to 7 peers. Now 8 is the minimum; for network security we really want a huge number of nodes with high levels of connectivity (20, 30, 500+ connections). So let's look at 20. 1 Gbps / (400 bytes per tx * 19 peers to relay * 8 bits per byte) = ~16,000 tps. Now 16,000 tps is still a huge number. However, that would limit full node participation to those with 1 Gbps. The real problem, though, is real bandwidth vs marketing. 1 Gbps sounds great until you saturate your uplink at 1 Gbps 24/7, continually, every second. In no time flat the ISP is going to cut you off or throttle you. Even if they have no hard bandwidth caps, all ISP agreements have "reasonable usage" guidelines.
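The 20-connection relay arithmetic above, as a quick sketch (hypothetical Python, using the same assumed figures):

```python
# D&T's relay estimate: a node with 20 connections relays each incoming
# tx to the other 19 peers; how many tps saturate a 1 Gbit/s uplink?
link_bps = 1_000_000_000
tx_bytes = 400                 # assumed average transaction size
relay_peers = 19

tps = link_bps / (tx_bytes * 8 * relay_peers)
print(int(tps))                # 16447 -> the "~16,000 tps" figure
```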
Residential bandwidth is shared. No company could offer (even at cost) 1 Gbps for $100 per month if every user (or even a small minority) actually maxed it out. If you want real pricing, take a look at what most datacenters charge for bandwidth. 1 Gbps is going to cost a LOT more than $100 per month, probably more than $1,000 per month (although the cost does get cut in half every 12-18 months). The last issue is what Nagato mentions above (although his numbers are low due to the need to broadcast to multiple peers). For miners, outgoing bandwidth is "bursty". A miner needs to broadcast his found block very quickly to as much of the network as possible. Every 6 seconds of delay increases the orphan rate by ~1%. If targeting a 3 second window to send a 10 MB block to 50 connected peers, we are looking at 10 MB * 8 bits per byte * 50 peers / 3 seconds = ~1,300 Mbps. Lower connectivity will put the miner at a disadvantage to better connected miners. If this barrier is too high, you will see even more migration to the largest pools, as they can afford the high levels of connectivity needed. Slower pools will essentially have a 1% to 3% or more "hidden" orphan tax. As miners discover that, they will migrate to the better paying pools.

All these desperate attempts to hold the block limit become ridiculous when we look at the numbers.

When the average user has "true" 1 Gbps connectivity at a reasonable cost and the average miner can obtain "true" 10 Gbps connectivity, then maybe. BTW, despite this post I am bullish on Bitcoin; solutions can be found. However, those advocating dropping all limits because of "centralization" need to realize that at the far extreme it just leads to another form of centralization. When only a tiny number of players can afford the cost of running a mining pool (and 1, 10, 50 Gbps low latency connectivity) or running a full node, you have turned the p2p network into a group of a few hundred highly connected peers.
Guess what: modern banking IS a peer-to-peer network of a few hundred highly connected peers. The fact that you can't be a peer on the interbank networks doesn't mean the network doesn't exist. The barriers (legal, regulatory, and cost) just prevent you from becoming a peer.

The greatest issue for new users is having to wait for the initial sync. If the client were to operate as an SPV in the meanwhile, and switching to full mode once initial sync is complete, I guess many more people would be OK with having a full node. Well, some would complain about how slow their computer got after they've installed this bitcoin-thing, and might be turned off. But not that much as today.
Today, maybe. But let's look at just a 10MB block for a node with only 8 peers (dangerously low IMHO). That requires about 64 Mbps sustained. Due to the bursty nature, for this peer to provide any value relaying blocks, the peak bandwidth would need to be 10x higher (640 Mbps). The larger obstacle isn't sustained or peak speeds (more than achievable from a technical standpoint). The larger obstacle is how much burden it would put on ISP networks (which are generally massively oversubscribed). Total bandwidth used by this peer is ~350 GB per month. Most ISPs will cap a user long before that. The biggest ISP, Comcast, IIRC starts throttling at ~200 GB per month (less on cheaper plans). The first time a casual user either has his download speeds cut 80% or gets a warning from his ISP about paying overage fees, he is likely going to pull the plug. Maybe not every user, but at least some users will.
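D&T's burst and monthly-volume figures above can be reproduced with quick arithmetic (hypothetical Python; block size and peer counts are the ones assumed in the posts):

```python
# Burst: send a freshly found 10 MB block to 50 peers within 3 seconds.
block_mb = 10
burst_mbps = block_mb * 8 * 50 / 3
print(round(burst_mbps))        # 1333 -> the "~1,300 Mbps" figure

# Monthly volume: relay 10 MB blocks to 8 peers, ~144 blocks per day.
blocks_per_month = 30 * 144     # ~4,320 blocks at one per 10 minutes
peers = 8
gb_per_month = blocks_per_month * block_mb * peers / 1000
print(gb_per_month)             # 345.6 -> the "~350 GB per month" figure
```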
RationalSpeculator (OP)
Sr. Member
Offline
Activity: 294
Merit: 250
This bull will try to shake you off. Hold tight!
May 30, 2013, 05:48:06 PM |
Thank you DeathAndTaxes and all the others for your technical explanations.
Do you believe that bitcoin will continue to be used for microtransactions as it is today? Or do you think this will fade out over time due to technical limitations?
If you believe it will fade out, do you think another cryptocurrency will be used for microtransactions, or do you think a top layer built on bitcoin, with off-chain transactions, will win out for microtransactions?
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
May 30, 2013, 06:32:42 PM |
I guess it depends on what you mean by microtransactions. I mean, look at the tempest in a teacup about setting the default dust limit at 5430 satoshis (~0.5 cents). If you mean <$0.10 (in 2013 USD), then probably not. If you mean $0.10 to a couple of bucks, then it will likely be some time before those transactions are not economically viable.
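For reference, the dust-limit figure works out as follows (hypothetical Python sketch; the ~$100/BTC rate is my assumption, roughly the May 2013 price, not a figure from the post):

```python
# Convert the default dust limit from satoshis to USD.
dust_satoshis = 5430
satoshis_per_btc = 100_000_000
btc_usd = 100.0                 # assumed exchange rate, mid-2013 ballpark

dust_usd = dust_satoshis / satoshis_per_btc * btc_usd
print(dust_usd)                 # 0.00543 -> roughly half a cent
```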
BTW I am not opposed to solving the "block size" problem; I am just pointing out that the situation is slightly more complex than some make it out to be. There is always some level of centralization; an unlimited blockchain simply pushes us towards a different kind of centralization.
How many tx the network eventually handles, what the costs will be, and how big Bitcoin gets are all unknown. My guess is likely no better than anyone else's. I do believe an alt-coin built from the ground up around low-cost microtransactions could carve out a niche. I also think off-blockchain transactions aren't that scary. I don't like the idea of web wallets holding massive wealth, but I couldn't care less about the security implications of using off-blockchain transactions to buy a cup of coffee or some discounted Steam game.
caveden
Legendary
Offline
Activity: 1106
Merit: 1004
May 30, 2013, 08:10:34 PM Last edit: May 30, 2013, 08:36:50 PM by caveden |
What many people don't realise is that the bandwidth numbers quoted on the wiki and by you only apply to keep up with the block generation rate. An independant miner will need 100x - 1000x more bandwidth to mine at all.
1 MB block size produced ONCE every 10 minutes NOT over 10 minutes If im a miner, i want to download that new block as fast as possible to reduce my idle time. Lets use 1% idle time as your target(Means your entire mining farm sits idle for ~6s while you download the block) ...
That's not the case. If you were online since before that block started to be built, you already received all its transactions. They're all in your transaction pool. There's no actual need to download them again (that's a performance improvement suggested by the scalability page, by the way). To start mining on the next block, all you need is the header of the previous one, and a sorted list of transaction hashes to build the Merkle tree. That's much less data than the entire block. Unless of course the block contains lots of transactions that are not in the memory pool, in which case you'll have to download these unknown transactions. And there you have it: an easy way to detect if a spamming attempt is in progress. If a sizable share of the transactions in the new block was not present in your memory pool, you should consider that block a spamming attempt by a miner and refuse to mine on top of it, unless of course it's already more than x blocks deep, in which case you accept it (soft limits). If the spamming miner decides to broadcast his spamming transactions, he'd run into anti-spam fee policies, and end up needing to pay other miners in the network to include his spam.

Just to clarify im not opposed to an increase in block size as long as decentralisation is not compromised by ensuring that the block size remains small enough for average residential broadband connections/commodity PCs to mine with.
Almost everybody agrees with that. The argument is between those who think an arbitrary formula should be invented and imposed via the protocol, and those who believe that spontaneous order (or p2p, free market, freedom, pick your term) can implement a better and safer control on block size without the use of a centralized formula. Well, there's also a third group that thinks the 1 MB limit should be kept, but I can't take them seriously... Not only do I believe spontaneous order would reach better results, I also agree with D&T when he says that setting a formula is technically (and politically) complicated, and potentially error-prone (it might introduce bugs).
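caveden's claim above, that a node already holding a block's transactions needs only the previous header plus an ordered list of tx hashes to start building the next block, can be illustrated with a minimal sketch of the Merkle-root computation. The helper names are hypothetical, and real Bitcoin serializes txids in little-endian, which this sketch omits:

```python
import hashlib

def dsha256(b: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(tx_hashes: list) -> bytes:
    """Compute a Merkle root from an ordered list of 32-byte tx hashes,
    duplicating the last entry on odd-sized levels as Bitcoin does."""
    level = list(tx_hashes)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# ~2,000 transactions is roughly a full 1 MB block, but the hash
# list alone is only 2,000 * 32 bytes = 64,000 bytes.
hashes = [dsha256(str(i).encode()) for i in range(2000)]
root = merkle_root(hashes)
print(len(root), 2000 * 32)  # 32 64000
```

The point is the size ratio: the data needed to start mining on top of a freshly announced block (one 80-byte header plus ~64 kB of hashes) is a small fraction of the 1 MB block itself, provided the transactions are already in the memory pool.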
|
|
|
|
caveden
Legendary
Offline
Activity: 1106
Merit: 1004
|
|
May 30, 2013, 08:36:09 PM |
|
Agreed, higher bandwidth connections will be more common in the future; however, if 1% of potential users have a 1 Gbps connection and that becomes the minimum, then you have reduced the potential full nodes to <1% of the planet. The numbers also aren't as rosy as they seem at first glance. A node by definition needs connections to multiple peers, so a node connected to 8 peers will rebroadcast a tx it receives to 7 peers. Now, 8 is the minimum for network security; we really want a huge number of nodes with high levels of connectivity (20, 30, 500+ connections). So let's look at 20. ...
Come on, D&T... I know that you know that a node only needs to broadcast a tx to all its peers if it's the very first to receive and validate it. Nodes can first send an "I have this new tx" message, which is small (the size of the tx hash), and then upload the tx only to the peers that request it. Not all of your peers will request it from you; they're connected to other nodes too. I used the number 10 conservatively... I don't think a node would upload the same transaction 10 times on average, that seems high to me. But it'd be interesting to see statistics on how many times a node has to upload a tx, relative to its number of connections. I've never seen any. The last issue is what Nagato mentions above (although his numbers are low due to the need to broadcast to multiple peers).
I've already answered Nagato above. (And I know that you knew that too...) BTW, despite the post, I am bullish on Bitcoin; solutions can be found. However, those advocating dropping all limits because of "centralization" need to realize that at the far extreme it just leads to another form of centralization. When only a tiny number of players can afford the cost of running a mining pool (and 1, 10, 50 Gbps low-latency connectivity) or running a full node, you have turned the p2p network into a group of a few hundred highly connected peers.
I'm confident that spontaneous order can easily tackle block size control. Miners can implement soft limits, not only on block size per se, but also on the percentage of unknown transactions in a block, as I said above (normally you should have most transactions of the new block in your pool; if you don't, it might represent a spamming attempt). Just look at miners today: they're already extra-conservative, only to ensure the fastest possible propagation. Guess what: modern banking IS a peer-to-peer network of a few hundred highly connected peers. The fact that you can't be a peer on the interbank networks doesn't mean the network doesn't exist. The barriers (legal, regulatory, and cost) just prevent you from becoming a peer.
Banking is an industry in symbiosis with the state. The problem with it is the regulations: that's the barrier to entry that makes it so hard to hop in. The cost of the business per se shouldn't be that high. Taking care of people's money (which is mostly digital today) has no reason to be a more costly business than, for instance, a complex factory. Just look at the number of competitors that show up in places where banking regulations are less burdensome, like Panama, and compare it with other places (relative to the country's population and GDP).
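The inventory-based relay caveden describes above ("I have this new tx" announcements, with the full transaction uploaded only to peers that request it, as in the inv/getdata/tx messages of the Bitcoin P2P protocol) changes the upload cost roughly as follows. A back-of-the-envelope sketch with assumed numbers; the average tx size, peer count, and number of requesting peers are illustrative, not measured:

```python
TX_SIZE = 500   # bytes, assumed average transaction size
INV_SIZE = 36   # bytes, roughly one inventory entry (type + 32-byte hash)
PEERS = 20      # connections, per D&T's example above

def naive_upload(tx_size=TX_SIZE, peers=PEERS):
    """Push the full transaction to every connected peer."""
    return tx_size * peers

def inv_upload(tx_size=TX_SIZE, peers=PEERS, requests=3):
    """Announce the hash to every peer; only `requests` of them
    (those who didn't already get it elsewhere) ask for the full tx."""
    return INV_SIZE * peers + tx_size * requests

print(naive_upload())  # 10000 bytes
print(inv_upload())    # 2220 bytes
```

Under these assumptions the announce-then-serve scheme uploads a few times less data per transaction than naive flooding, which is why D&T's "rebroadcast to all peers" multiplier overstates the bandwidth cost; the real unknown, as caveden notes, is how many peers actually request each tx on average.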
|
|
|
|
|