Bitcoin Forum
  Show Posts
Pages: « 1 2 3 4 5 6 7 8 9 10 11 [12] 13 14 15 16 17 18 19 20 21 22 23 24 »
221  Bitcoin / Bitcoin Discussion / Re: Fidelity-bonded banks: decentralized, auditable, private, off-chain payments on: March 11, 2013, 08:14:30 PM
Okay, so we can say that the number of users N with significant holdings scales roughly with the exchange rate R (measured in today's USD) as

N = 200*R

If we say the blockchain should accommodate 10 transactions per month for each of these users, then the block size target should be

B = 10*N txs/month * 500 bytes/tx / (144*30 blocks/month) ≈ 1.2*N bytes/block

or

B = 230*R bytes/block

This reaches 1MB when R is around $4000, roughly 80 times today's rate, i.e. 800,000 users with "significant" holdings.

Fair enough.  Qualitatively we agree then that the block size limit should scale with the number of users.  Using the exchange rate as a proxy for this seems quite reasonable.
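The arithmetic above can be sketched quickly; a hypothetical Python version, with every constant (200 users per USD of exchange rate, 10 txs/user/month, 500 bytes/tx) taken from the posts rather than from any real measurement:

```python
def target_block_size(R, users_per_usd=200, txs_per_user_month=10, tx_bytes=500):
    """Block size (bytes) needed so N = users_per_usd*R users can each
    make txs_per_user_month on-chain transactions per month."""
    N = users_per_usd * R           # users with "significant" holdings
    blocks_per_month = 144 * 30     # ~144 blocks/day
    return txs_per_user_month * N * tx_bytes / blocks_per_month

print(target_block_size(50))    # ~11.6 kB blocks at today's ~$50 rate
print(target_block_size(4000))  # ~0.93 MB, i.e. near the 1 MB limit
```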
222  Other / Politics & Society / Re: New CIA Director takes oath on draft Constitution—without Bill of Rights on: March 11, 2013, 02:45:40 PM
https://www.youtube.com/watch?v=6IRxpjEZveQ
223  Bitcoin / Bitcoin Discussion / Re: Fidelity-bonded banks: decentralized, auditable, private, off-chain payments on: March 11, 2013, 02:29:54 PM
The problem I have with portraying off-blockchain transaction systems like this as a way to avoid dealing with large blocks and the problems that may arise from them is that they don't avoid them.  Don't get me wrong, they are helpful, but we can't rule out success, and an inaccessible blockchain is also a security risk to users.  For example, with a billion users and a refusal to significantly raise the block size limit, the vast majority of users would be virtually shut out from engaging in blockchain transactions at all. *  That means they couldn't use the blockchain as a trust-free, infrequently accessed savings store, or even engage in runs on their banks when they had to.  It also means that because only a relative trickle of transaction flow goes through the blockchain, and because the greater system's security relies on this flow continuing, the damage/cost ratio for a flooding attacker can become very high if the growth of the number of users outstrips the growth of block sizes.

Bottom line is: we don't get to decide what the optimally secure block size limit is.  It's defined by some balance between computing technology's enabling of decentralization on the one hand, and the number of users and their need for reasonable access to the blockchain on the other.  So in the event of success, we'll have to find ways to work with large blocks one way or another.

* To give a sense of how inaccessible the blockchain would become: 2000 txs/block * 144*365 blocks/year / 1B users = 0.1 txs/year/user.  Ridiculous as one blockchain transaction every ten years already is, keep in mind that wealth is not distributed equally, and so probably 90% of these people would be cut off altogether.
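The footnote's arithmetic, written out (all figures are the post's own assumptions):

```python
# At a fixed 2000 txs/block, how often could each of a billion users
# touch the chain?
txs_per_block = 2000
blocks_per_year = 144 * 365
users = 1_000_000_000

txs_per_user_per_year = txs_per_block * blocks_per_year / users
print(txs_per_user_per_year)  # ~0.105: roughly one tx per user per decade
```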
224  Bitcoin / Project Development / Re: Pledging coins for ultimate blockchain compression on: March 11, 2013, 10:47:35 AM
The only reasonable way I've seen to get something in between SPV and full mode is d'aniel's suggestions for various kinds of fraud proofs, but the whole point of a fraud proof is that you get one when the best chain is no longer following the rules .... making fraud proofs that rely on trusted commitments to the state of the UTXO set pointless (they would be a part of the fraud).
The SPV clients would be auditing commitments to the UTXO set as well, of course.

Edit: One way I can see to do this is for each block, full nodes keep handy all the branches of the UTXO tree that get updated in the next block.  If they're pruning, then they only have to keep this data a reasonable distance back in the chain.  To audit commitments to the UTXO set in a given block, an SPV node would download transaction merkle branches in this block and the relevant branches of the UTXO tree from the previous block, then check that the commits were done properly, i.e. that they produce the correct root of the current UTXO tree.
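The branch-audit step described above boils down to recomputing a Merkle root from a leaf and its branch, then comparing it against the committed root.  A minimal sketch, where the double-SHA256 hashing and the branch format are illustrative assumptions rather than the actual UTXO-tree design:

```python
import hashlib

def h(data: bytes) -> bytes:
    """Double-SHA256, as used elsewhere in Bitcoin's Merkle trees."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def branch_root(leaf: bytes, branch):
    """Recompute the root from a leaf and its Merkle branch.
    branch: list of (sibling_hash, sibling_is_right) pairs, leaf to root."""
    node = h(leaf)
    for sibling, sibling_is_right in branch:
        node = h(node + sibling) if sibling_is_right else h(sibling + node)
    return node

# Toy two-leaf tree: the same root must be reachable from either leaf.
a, b = h(b'utxo-a'), h(b'utxo-b')
root = h(a + b)
assert branch_root(b'utxo-a', [(b, True)]) == root
assert branch_root(b'utxo-b', [(a, False)]) == root
```

An SPV node would run this check once for the transaction branches and once for the UTXO-tree branches, accepting the commitment only if both roots match.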
225  Bitcoin / Development & Technical Discussion / Re: Blocking the creation of outputs that don't make economic sense to spend on: March 10, 2013, 06:20:43 PM
I'll assume RAM is replaced every 5 years, and that Moore's law for it holds for at least 5 more years.

16GB costs $115 today and consumes about 5W of power. *  Take $0.10/kWh for electricity, and it costs

(5/1000 kW)*(5*365*24 hours)*$0.10/kWh + $115 = $137

to buy and run 16GB of RAM for 5 years.  Assuming this price falls by a factor of 4 after 5 years (capacity per dollar doubling every 2.5 years), we have $171 for 16GB for 10 years.

Assuming 125 bytes per utxo on average, this is 128M utxos per 16GB of RAM, giving

$0.0000013 to store a utxo for 10 years.

But we have to scale this by the number of full nodes, say 100K: $0.13 for the whole network to store a utxo for 10 years.

I only went to 10 years because I don't know if Moore's law holds for longer.  But we can safely say it's roughly $0.13 for a network of 100,000 nodes to hold a utxo for as long as Moore's law for memory holds.

*
http://www.newegg.com/Product/Product.aspx?Item=N82E16820104302
http://www.kingston.com/dataSheets/KHX1600C10D3B1K2_16G.pdf

I did this quickly so please correct me if I did something stupid...

Edit: I overheard sipa say that the average utxo size is a bit more than 100 bytes, so I changed the number from 50 to 125 bytes.
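The whole estimate above, redone in one place so the assumptions are easy to swap out ($115 per 16GB, 5W draw, $0.10/kWh, price falling 4x per 5-year replacement cycle, 125 bytes/utxo, 100K full nodes; all from the post, none measured):

```python
def utxo_network_cost(gb16_price=115.0, watts=5.0, kwh_price=0.10,
                      bytes_per_utxo=125, nodes=100_000):
    hours_5yr = 5 * 365 * 24
    power_cost = (watts / 1000) * hours_5yr * kwh_price  # ~$21.90
    cost_5yr = gb16_price + power_cost                   # ~$137 per 16GB
    cost_10yr = cost_5yr * (1 + 1/4)                     # ~$171 (RAM 4x cheaper)
    utxos_per_16gb = 16e9 / bytes_per_utxo               # 128M utxos
    per_utxo = cost_10yr / utxos_per_16gb                # ~$0.0000013
    return per_utxo * nodes                              # whole-network cost

print(utxo_network_cost())  # ~$0.13 to store one utxo for 10 years
```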
226  Bitcoin / Project Development / Re: Pledging coins for ultimate blockchain compression on: March 10, 2013, 05:07:49 PM
There have been lots of discussion with the devs, though the most interesting ones I've had were on IRC.  The biggest concern was expressed by gmaxwell, which was that he doesn't see it being acceptable to further expand the computational requirements of miners, despite the benefits that are offered by this idea.  It risks creating further centralization, when miners with less-powerful hardware are pushed out.  That doesn't mean it couldn't exist in a side-chain, it just means that he doesn't think it could ever be accepted into the core protocol -- which I think would be a desirable goal for this in the long term.
Does gmaxwell still think this?  I've noticed lately he seems keen on the idea:
https://bitcointalk.org/index.php?topic=88208.msg1599126#msg1599126
https://bitcointalk.org/index.php?topic=137933.msg1599093#msg1599093

It's also in his alt-coin wishlist: https://en.bitcoin.it/wiki/User:Gmaxwell/alt_ideas

I'm thinking he may have changed his mind because of its application to partial verification and fraud proofs.  Essentially this can be seen as part of an overall SPV trust model upgrade.  He explains this well in this post: https://bitcointalk.org/index.php?topic=137933.msg1596626#msg1596626
227  Bitcoin / Project Development / Re: Pledging coins for ultimate blockchain compression on: March 10, 2013, 03:26:40 PM
The final step of this project is "integrate with lightweight clients", but this is vague - what does it mean? Which clients?

I don't think I'd be interested in accepting code for it into bitcoinj, because I'm actually not sure what the goal of this is. As far as I can tell we have already achieved the scalability goals of that proposal in a much simpler and more efficient way (by supporting Bloom filtering of the chain).

It's probably a good idea to just refund the pledges.
The benefit that I was seeing in this was that SPV nodes could do partial verification of blocks, helping to find and broadcast fraud proofs.  This seems important for scaling.
228  Bitcoin / Development & Technical Discussion / Re: Blocking the creation of outputs that don't make economic sense to spend on: March 10, 2013, 03:14:13 AM
Also, look at it this way: if your special txout represents something worth less than a transaction fee, moving it incurs a net loss anyway. If it's worth more than that, having to keep a transaction fee worth of coins sitting quickly becomes a minor cost.
Yep, that's exactly the reason why I said I'd prefer your solution if the exotic uses can be achieved without dust sized txouts Smiley
229  Bitcoin / Development & Technical Discussion / Re: Blocking the creation of outputs that don't make economic sense to spend on: March 10, 2013, 02:33:04 AM
How can you tell the difference between spam and a legit use for an anonymous transaction?
By how much they're willing to pay to create the small valued txout.

The point is, a transaction that will never be spent costs the network more than one that will be spent because the former requires expensive, high-speed storage (currently RAM, and maybe SSDs in the future) while the latter doesn't even have to be stored at all by the majority of validating nodes with pruning. To charge the same price for both, which is what the current fee system does, is crazy.
Oh, we very much agree on this.

Actually, if all the exotic uses of dust sized coins can be done equally well with non-dust sized ones, then I prefer your solution.

Otherwise I proposed one that still permits small valued coins here: https://bitcointalk.org/index.php?topic=151177.0
230  Bitcoin / Development & Technical Discussion / Re: Blocking the creation of outputs that don't make economic sense to spend on: March 10, 2013, 02:19:00 AM
Can't we block the spam without nuking legitimate uses for small valued coins?

How can you tell the difference between spam and a legit use for an anonymous transaction?
By how much they're willing to pay to create the small valued txout.
231  Bitcoin / Development & Technical Discussion / Re: Blocking the creation of outputs that don't make economic sense to spend on: March 10, 2013, 02:16:07 AM
Can't we block the spam without nuking legitimate uses for small valued coins?
232  Bitcoin / Development & Technical Discussion / Exploring the future character of Bitcoin on: March 10, 2013, 12:06:26 AM
I made a simple model to help explore how people might interact with the Bitcoin blockchain in the future.  It assumes that the number of users grows via a logistic function (common for population growth and diffusion of innovation modeling).  For example:



And with log scaling:



It lets you set a desired number of blockchain transactions per month users should have access to on average, and gives you the block size limit required to achieve this.  It compares this with the block size limit that grows at the same expected rate as bandwidth.  For example:



This "natural" block size limit results in some number of blockchain transactions per month users will have access to on average:



Here's the MATLAB/Octave code if you want to play around with the parameters:
Code:
n = 2;      % Target average number of txs per user per month.
P0 = 500E3; % Number of users today.
K = 2E9;    % Expected number of users after full diffusion.
a = 0.5;    % Current (exponential) annual growth rate of users.
b = 0.2;    % Expected annual bandwidth growth rate.
s = 1000;   % Average tx size in bytes.

t = 0:0.1:40;
% Number of users after t years assuming logistic growth function.
P = K*P0*(1+a).^t ./ (K + P0*((1+a).^t - 1));  
% Target block size limit in MB.
B = n/(30*144)*s*P/1E6;
% Block size limit in MB assuming exponential growth alongside bandwidth.
Bnat = (1+b).^t;
% Average number of txs per user per month assuming exponential growth of
% block size limit alongside bandwidth.
nnat = 30*144*1E6*Bnat/s./P;

figure(1);
plot(t, P);
xlabel('Years from now')
ylabel('Expected number of users')

figure(2);
semilogy(t, P);
xlabel('Years from now')
ylabel('Expected number of users')

figure(3);
semilogy(t, B, t, Bnat);
legend([num2str(n), ' txs per user per month target'], 'Natural', 'Location','southeast');
xlabel('Years from now')
ylabel('Block size (MB)')

figure(4);
plot(t, nnat);
xlabel('Years from now')
title('Natural average number of txs per user per month')

Edit: My 2 cents on the block size limit is that we shouldn't be targeting bandwidth growth rate, but user growth rate, and the algorithm should follow a logistic function.  Then we try to deal externally with whatever centralization might result from larger blocks, which are at least bounded in size going forward.

Also added the average tx size explicitly into the code.
233  Bitcoin / Development & Technical Discussion / Re: Dust Collection on: March 09, 2013, 08:59:00 PM
True, it doesn't require any protocol changes... but it does require miners to care.
Not just care, but care enough to forgo some income for it: they'd have to take a txn that consumes more utxos but pays lower fees per byte over one that consumes fewer utxos but pays more.
I think there needs to be a mining manual called "Responsible Blockchain Stewardship" Smiley

If it does turn out to be a tragedy of the commons type situation, then I would think that miners would enact block discouraging rules to solve the problem, once it becomes painful enough.  Hopefully they don't end up acting like idiots in the aggregate, though...
234  Bitcoin / Development & Technical Discussion / Re: Dust Collection on: March 09, 2013, 08:38:25 PM
Permanent storage isn't the (significant) scarce resource that requires block space to be rationed, bandwidth is.
Downloading an amount of transactions equal to Visa, based on today's bandwidth costs, would be less than $20/day.
So you agree that bandwidth is a scarce resource?

High frequency trading especially uses up a lot of it...
235  Bitcoin / Development & Technical Discussion / Re: Dust Collection on: March 09, 2013, 08:28:44 PM
Your reasoning would make sense to me if UTXO set size was the only scarce resource being rationed, but block space needs to be rationed at the same time.
True at the moment, but in the long term block space does not need to be rationed. It's not even strictly necessary for anyone to keep every transaction all the way back to the Genesis block.

The only data that absolutely must persist forever for the network to function is the UTXO set.
Permanent storage isn't the (significant) scarce resource that requires block space to be rationed, bandwidth is.
236  Bitcoin / Development & Technical Discussion / Re: Discouraging dust without hurting exotic uses of small valued coins on: March 09, 2013, 08:04:52 PM
I don't see a way to distinguish between dust spam that originates from SatoshiDICE versus dust spam that appears that way because it is really the result of colored coin operations.
There isn't.  The idea is to set w0 high enough to discourage dust spam, but not so high that the creation of legitimate small valued coins is discouraged.  Notice that it's only their creation that incurs the higher fee; transferring them costs roughly the same as transferring normal valued coins.

Quote
In fact one could argue that colored coins meet the definition of economically unviable transaction spam.
One could argue this, but some miners may disagree.  Ultimately they're the "deciders".

Edit: A miner that doesn't like colored coins can simply set w0 prohibitively high.
237  Bitcoin / Development & Technical Discussion / Re: Dust Collection on: March 09, 2013, 07:53:05 PM
doesn't this depend on how much Bitcoin appreciates assuming of course that as that appreciation occurs fees are reduced?
Yes, absolutely.  This would mean the UTXO set size scales roughly with the number of users, as it should, ideally.

seems to me that as # users increases, Bitcoin price increases, fees decrease, and UTXO decreases.
Um...  Bitcoin price increases -> fees decrease -> smallest economical txout value decreases -> more UTXOs can be created -> UTXO set size increases.
238  Bitcoin / Development & Technical Discussion / Re: Dust Collection on: March 09, 2013, 07:44:14 PM
doesn't this depend on how much Bitcoin appreciates assuming of course that as that appreciation occurs fees are reduced?
Yes, absolutely.  This would mean the UTXO set size scales roughly with the number of users, as it should, ideally.
239  Bitcoin / Development & Technical Discussion / Re: Dust Collection on: March 09, 2013, 07:20:24 PM
Without any changes to the protocol itself miners could make it economical to create transactions which reduce the UTXO set by changing their transaction rules to favor transactions which reduce dust. Once this happens, clients could be programmed to do that automatically.

One way it could be done: if all outputs are above the dust cutoff, and if N inputs are below the dust cutoff, prioritize the transaction as if it included N*the minimum fee in addition to fees which are actually present.


I don't think the BTC-size of the inputs should matter.  Simply look at whether it increases or reduces the global UTXO set all full nodes will have to track.  Even huge transactions should be free if they combine dozens of inputs into one or two outputs.  As long as it is very expensive to go the other way.

In fact, you could look at the contraction efficiency of a transaction:  how many kB did it use to remove X UTXOs from the set?  Above a certain efficiency, those tx should be free.  Any neutral or negative tx will be charged a fee, with the highest fees going to those who dramatically expand the set.  But the rules should be balanced so that it's not cheaper to send 100 single-output transactions instead of a single 100-output tx.
One problem with ignoring coin value altogether is that we're more likely to end up with coins that are uneconomical to ever spend, and thus more likely to bloat the UTXO set indefinitely.

Your reasoning would make sense to me if UTXO set size was the only scarce resource being rationed, but block space needs to be rationed at the same time.  This means transaction priority can't be coin value-agnostic.

Edit: Additionally, the UTXO set size is naturally bounded if uneconomical coin values are discouraged.  Of course this brings up the problem of discouraging exotic uses of small valued coins.  I addressed this here though if you're interested: https://bitcointalk.org/index.php?topic=151177.0.
240  Bitcoin / Development & Technical Discussion / Discouraging dust without hurting exotic uses of small valued coins on: March 09, 2013, 05:46:40 PM
The idea is kind of simple, so apologies if it's already been proposed.

The UTXO set size can be controlled by weighting fees with the following function:

W(tx) = sum(w(txinval)) / sum(w(txoutval))

w(v) = 1   if v >= v0
       w0  if v < v0

where v0 defines what is considered a "small" valued txout, and w0 is the extra weight given to small valued txins and txouts.

With v0 = 0 the sums reduce to the number of txins and txouts.  But a finite v0 means outputs that cost more to spend than they are worth can be discouraged from being created.  For exotic uses of small valued outputs like colored coins, this essentially sets an extra fee for creating them, but not much extra for simply transferring them.  Recombining small txouts gives a fee "bonus".

For example, consider the tx where one small valued txout is created, and the rest of a single normal valued txin is sent as a fee to miners.  Then W(tx) = 1/w0, meaning a fee w0 times larger is required for it to be accepted on par with the same tx had the txout been of normal value.  This adds an extra cost to expanding the UTXO set into practically unbounded territory.

Another example: one txin of normal value spent to fees, and a small valued txin spent to a single small valued output.  Then W(tx) = 1 + 1/w0 (~= 1 for large w0), i.e. not much extra fee is required because the UTXO set is not being expanded.

Finally consider the example of adding a small valued txin to a normal one, and spending this to a single normal valued txout.  Then W = 1 + w0, meaning the usual fee to spend a single normal valued txin to a single normal valued txout is attenuated by a factor of 1/(1 + w0) (~= 1/w0 for large w0) as a bonus for contracting the UTXO set size.

v0 should be set somewhere on the order of the minimum tx fee, and w0 will need to be adjusted by miners to whatever (minimum) value is required to prevent abuse of the UTXO set.
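The weighting and the three worked examples above can be sketched directly; the particular v0 and w0 values here are only for illustration, not recommendations:

```python
V0 = 0.0005   # "small" txout threshold, on the order of the min tx fee
W0 = 100      # extra weight given to small valued txins/txouts

def w(v, v0=V0, w0=W0):
    return 1 if v >= v0 else w0

def W(txin_values, txout_values, v0=V0, w0=W0):
    """Fee weight: effective priority scales with fee * W, so W < 1 demands
    a larger fee and W > 1 grants a bonus for contracting the UTXO set."""
    return (sum(w(v, v0, w0) for v in txin_values) /
            sum(w(v, v0, w0) for v in txout_values))

# The three examples from the post:
print(W([1.0], [0.0001]))          # 1/w0: creating dust costs w0x the fee
print(W([1.0, 0.0001], [0.0001]))  # 1 + 1/w0: transferring dust is ~normal
print(W([1.0, 0.0001], [1.0]))     # 1 + w0: bonus for recombining dust
```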