Bitcoin Forum
September 26, 2016, 10:28:02 PM *
News: Latest stable version of Bitcoin Core: 0.13.0 (New!) [Torrent]. Make sure you verify it.
 
  Show Posts
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 [26] 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 ... 107 »
501  Bitcoin / Armory / Re: Armory - Discussion Thread on: September 07, 2015, 06:11:42 PM
0.91??? Upgrade please.
502  Bitcoin / Armory / Re: Negative balance after restore on: September 07, 2015, 12:42:44 PM
I'm stupid. The transaction history you posted is invalid. Can you give me the transaction hash instead? If you do not want to post this data publicly, pm me instead, or create a support ticket. Say goatpig sent you, I'll take care of it on that end.

Quote
I'm still reading and researching to try to figure out where I went wrong, as I don't want to repeat this mistake and end up losing a larger amount of funds that wouldn't be so easily written off.

Do you use cold storage or just a regular encrypted wallet?
503  Bitcoin / Armory / Re: Negative balance after restore on: September 07, 2015, 10:09:25 AM
Those coins were sent to an address starting with 3. That's a P2SH address. Armory only uses P2SH addresses for lockboxes right now. If you moved your money to a lockbox, you need to give that lockbox backup to Armory. If you do not have the backup, you need to recreate the Lockbox with the addresses you used.

Otherwise, you need to investigate who has access to your private keys.
504  Bitcoin / Development & Technical Discussion / Re: Really not understanding the Bitcoin XT thing... on: September 06, 2015, 09:14:14 AM
If the information required to be sent grows logarithmically with block size, then the market would be exactly on the border between healthy and non-healthy.  

Not really. It follows the same growth pattern, not necessarily the same rate. Linear growth will always surpass logarithmic growth, regardless of the rates. Once both follow the same function (k log(x)), suddenly all that matters is the rate (k). There is no indication we would sit exactly on that border; we could be way below or way above it. What do the laws of physics have to do with block space demand again? Note that I am not arguing its effect on supply.
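The growth claim is easy to verify numerically. A toy sketch (not part of the original exchange): whatever the rates, a linear function eventually overtakes a logarithmic one.

```python
import math

def crossover(a, k, limit=10**9):
    """Return the first power-of-two x where a*x > k*log(x), or None.

    Illustrates that any linear growth a*x eventually exceeds any
    logarithmic growth k*log(x), no matter how small a or how large k.
    """
    x = 2
    while x < limit:
        if a * x > k * math.log(x):
            return x
        x *= 2
    return None

# Even a tiny linear rate overtakes a huge logarithmic rate at some finite x.
x = crossover(a=0.001, k=1000.0)
assert x is not None and 0.001 * x > 1000.0 * math.log(x)
```

Only when both sides follow the same function shape does the rate constant decide which dominates, which is the point being made above.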

Quote
"any TXs that you know aren't part of other's mempool".  I agree.  But if we imagine that 95% of the transactions are common between mempools, then the number of missing TXs still grows linear with block size.  

The only way this can happen is if a miner is purposefully creating his own transactions and not emitting them publicly. The mempool hit rate is just another propagation mechanism; with time it will always tend toward 100%. The scenario you describe, where the mempool hit rate constantly hovers at 95%, is only possible if a miner is purposefully creating transactions at a steady pace and withholding them. That makes no sense, as it increases his propagation time (reducing his profit) and motivates other miners to eventually ignore him. On the contrary, I would speculate all miners will make a point of sticking to well-propagated transactions, particularly to suppress their orphan rate.

Quote
Furthermore, you still need to communicate the order in which the transactions are sorted in a block (even if all the TXs were common between mempool).  Communicating the order of a list of transactions requires an amount of information linear in the number of transactions in that list.  

What makes you believe that information cannot be communicated before the solution is found? Also look at the IBLT implementation. The intent is to give other miners the rules you build your own blocks with. There is no reason a miner propagating an IBLT should stray from the pattern it describes. Again, miners in fast relay networks have a clear incentive to use as predictable a tx set as possible. That means ignoring transactions that are not fully propagated and sticking to your IBLT.

Quote
you still need to get 100% of the contributing hash power to go along with it.  If only 90% does, the fee market remains healthy

No, I only need 51%. The rest of the network will get steamrolled out of business pronto. Part of your paper relies on how rational miners should account for propagation time in their tx set choice (the model you propose is naive, but let's keep it at that). Now somehow you are arguing two things to me:

1) You are suggesting rational miners won't take every opportunity to reduce propagation time.

You state yourself that a high propagation time will result in lower revenue. Rational miners are motivated by profit; why would they stay away from any coding gain?

2) You are suggesting irrational miners have a stake in this game.

Clearly they don't; their only fate is to get eaten alive by rational miners, or to bleed money from an external source ad nauseam (or until that source depletes, I guess).

Quote
I think you are looking for ways around a Law of Physics here.  The only way the market will not be healthy (according to the definition in my paper) is if exactly zero information about the transactions included in a block is communicated with the block solution announcement and this holds for every block solution announcement.

You always have to propagate the block header and your coinbase tx at solution announcement. But there is no indication we have to stick to a network that propagates every other bit of information in a block along with the solution. We have solutions being developed that allow miners to only ever emit the header and coinbase tx. The Nash equilibrium will keep all miners doing just that as long as propagation time inversely affects profit (read: forever).

So the amount of data emitted is static while network bandwidth keeps growing until it reaches the physical limit of the carrier. Therefore the propagation time tends toward 0 (remember math 101: tending is not reaching). So we do have a situation where blocks propagate in a time insignificant compared to the block emission period, without the poles inverting.

Quote
You missed the rest of the sentence

No I purposefully omitted it, replacing it with ellipses to signify that it was an excerpt from a sentence or paragraph, as common citation etiquette dictates.
505  Bitcoin / Development & Technical Discussion / Re: Proposal: dynamic max blocksize according to difficulty on: September 06, 2015, 08:25:19 AM
Remember that currently the market for transactions is distorted severely because of the block reward subsidy. As it goes down, transactions fees become much more significant and will further force the miner to consider which transaction to add to his block. More hashing power makes the bitcoin network stronger, so everyone benefits. More users can lead to increase of bitcoin price, etc

Higher price commands higher difficulty. The other way around is not true. You cannot deduce that price has increased because difficulty went up. The only thing it means is that profitability per hash has increased. A wealth of factors can affect that, not just price.
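Profitability per hash can be made concrete as expected reward per hash times price. A back-of-the-envelope sketch with entirely hypothetical figures, showing that price is only one of several inputs:

```python
def revenue_per_hash(price_usd, reward_btc, difficulty):
    """Expected USD revenue of a single hash attempt.

    A single hash wins a block with probability 1 / (difficulty * 2**32),
    so expected revenue per hash = price * reward / (difficulty * 2**32).
    """
    return price_usd * reward_btc / (difficulty * 2**32)

# Hypothetical figures: profitability per hash rises even though price fell,
# because difficulty dropped faster than price.
before = revenue_per_hash(price_usd=250.0, reward_btc=25.0, difficulty=6e10)
after = revenue_per_hash(price_usd=200.0, reward_btc=25.0, difficulty=4e10)
assert after > before  # higher profit per hash despite a lower price
```

This is why difficulty movements alone cannot be read back as price movements.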

An increase in price does not in itself equate to an increase in block space demand. The tps can go up when price goes down and vice versa. There is no economic evidence to the contrary.

Your proposal has to account for situations when reality contradicts the meaning you give your indicators. I agree that in general, in the absence of a coinbase reward, a growth in difficulty will mean an increase in fee subsidy, and thus block space demand.

However, that does not always hold true, and you need a contingency plan for when it doesn't. The favorable case is actually pretty uncommon right now, as fees make up about 1% of miner revenue. How much of a solution is your proposal if it only works when the stars align? Some people choose to discuss proposals for the now; others prefer to work on solutions that can also withstand the absence of inflation. At this rate, you are designing a solution that will only start to make sense in some 20-odd years.

Quote
Unfortunately we cannot use transaction data (volume, fees etc) or block sizes as a metric because these can be manipulated by the miners themselves leading to tragedy of commons scenarios.

Not really. The emergence of relay networks and IBLT largely increases the cost of these attacks. To fake transaction volume under these conditions, a miner needs to emit txs to himself to recover his own fees. This implies he can't publish these transactions publicly until they are mined, or other miners will pick them up and end up making the ill-intentioned miner actually pay for the attack. However, relay networks and IBLT produce a coding gain in block propagation by forwarding block content before a solution is found, i.e. each miner is telling others what tx set they are working on.

A miner deliberately slowing down his propagation is exposing himself to a lot of natural orphaning, and past a certain point he is so easy to identify and such a nuisance that other miners have motivation to just outright orphan him on purpose. Add a decay function on top of it, and not only will miners have to pay for spam attacks like any other spammer, but their effort will also be lost in the mid term, as the decay function will correct the effect of the attack.

Quote
But if we could some how force the miners to always include transactions that had appeared it the mempool, or at least make it a risk not too then this problem would be solved.

That's inane. The first and best spam filter is letting miners pick transactions based on economic factors. As long as the incentives are not completely warped, this system works. What you are proposing makes any tx spam attack completely trivial. It would also make DoS attacks by high-OP-count, low-size txs very effective, much more so than currently, since right now they are easily thwarted by their own nature: low miner profit.
506  Bitcoin / Development & Technical Discussion / Re: Dynamically Controlled Bitcoin Block Size Max Cap on: September 06, 2015, 07:49:30 AM
As I can see, you have talked about various numbers, like 66~75%, 20% etc. These appears to be magic number to me, like BIP 101's 8mb or BIP 103's 4.4% & 17.7%. How do you derive them ?

I've stated that all these figures need to be discussed. I believe these thresholds need to exist, but a decent value, or a decent way to compute these values, needs to be discussed. I can give you the reason these values need to exist, but I have not done the research to determine which figures are the most appropriate, or whether there is a way to set them dynamically as well.

The 66~75% figure is a proposal for the block space usage threshold at which a resizing should be tested against secondary conditions. The rationale is that organic market growth will always burst through any threshold (until it eventually hits a hard cap), whereas an attacker won't necessarily. Raising the usage threshold increases the effort required by ill-intentioned parties and doesn't change a thing for natural market growth. As a reminder, the current threshold is 50%.

The 20% figure denotes the fee growth threshold, i.e. a resizing should only occur if fees have moved X% either way compared to the previous period. Currently there is no such threshold, making it trivial for any attacker to push up the block size and keep it high.

As long as these thresholds are in place and tight enough, an effective decay function can be implemented. The goal is to distinguish between organic growth in demand and spam attacks, and use a safety net mechanism (the decay function) to correct all growth that is not supported by actual demand. It would actually mimic commodity prices in a speculative market: large speculators can pump the price for a while but eventually the market will always correct itself, with the valid demand as its baseline.

The first threshold is not critical. It will always be reached first when demand climbs, so its particular value does not matter much. It could be 90% for all I care, because fees won't start climbing until blocks are nearing max capacity. It just needs to be > 50% to make room for the decay function.

The threshold that truly needs to be discussed is the second one. It can't be so low that an attacker can throw a couple extra BTC at the network and trigger a size growth on the cheap under the right conditions. It can't be so high that the network gets clogged with a massive backlog before it resizes. However, an increase in fees is an increase in revenue, which will eventually translate into increased mining power. It can be expected that such tight thresholds will result in bursty cap growth, which is another reason for a decay function, but generally I believe we are better off with high values than low ones.
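One way to picture the interplay of the two thresholds and the decay function. All thresholds, rates and factors below are placeholders for the discussion, not proposed values:

```python
def next_max_size(current_max, baseline, avg_usage_ratio, fee_growth,
                  usage_threshold=0.70, fee_threshold=0.20,
                  grow_factor=1.10, decay_rate=0.05):
    """One resizing step for a dynamic block size cap (sketch).

    Grow the cap only if block space usage AND fee growth both crossed
    their thresholds over the period; otherwise decay the cap back
    toward its baseline, correcting growth unsupported by fee demand.
    """
    if avg_usage_ratio > usage_threshold and abs(fee_growth) > fee_threshold:
        return current_max * grow_factor
    return max(baseline, current_max * (1 - decay_rate))

# A spammer fills blocks (95% usage) but fees barely move (2% growth):
# the cap decays instead of growing, so the attack has to be paid for
# continuously just to stand still.
cap = next_max_size(current_max=2_000_000, baseline=1_000_000,
                    avg_usage_ratio=0.95, fee_growth=0.02)
assert cap < 2_000_000
```

The decay step is what mimics the market correction described above: pumped growth bleeds away unless real fee demand keeps supporting it.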
507  Bitcoin / Armory / Re: Negative balance after restore on: September 05, 2015, 09:07:46 PM
In the transactions log, I can see a transaction that seems to take the funds from my wallet and send it to a different address (change address I assume?)

The transaction log does not display outputs funding change addresses. It is likely what you are seeing is an actual transaction moving your funds.
508  Bitcoin / Armory / Re: Negative balance after restore on: September 05, 2015, 04:27:53 PM
Which Armory version is being used? Extend address chain option wasn't always available, I think it was only added recently.

It predates the time I started working on Armory. You can only get to this dialog if you turn on the Expert user mode, which is set to Advanced by default.
509  Bitcoin / Armory / Re: Negative balance after restore on: September 05, 2015, 03:51:36 PM
https://imgur.com/olqpnjf
510  Bitcoin / Development & Technical Discussion / Re: Why I support Jeff's BIP100 on: September 05, 2015, 03:37:35 PM
(1), (4), (5) and (6) imply that all miners are withholding all hash power until the total fee in the mempool hits a desired threshold. I don't think that equilibrium can exist. Indeed, any miner has the opportunity to orphan the current tip to "steal" the transactions in that block while the network is waiting for the mempool to fill. Consequently, any miner has to "defend" his block after finding a solution by mining on top of it, and will naturally add all fee-paying transactions he can get on the way, forcing every other miner to commit at least some hashing power as soon as a new block is propagated.

You could rework your example with the assumption that miners are throttling their hash rate based on total fees in the mempool but even that may not stand in view of this previous counter argument.

Quote
13. Suppose that these 10-satoshi-fee transactions are distributed uniformly with time.

(13) coupled with (6) will reduce the average time it takes for every miner to start committing hash power, not just Alice. Generally, it means (7) is true with or without Alice. It also means that Alice may never hit her expected fee density.

In the Bitcoin network, block space suppliers compete for market share only by lowering prices, since the notion of quality does not apply to block bytes. You are speculating that Alice can exist in this market at a higher sell price only by waiting for demand to periodically outweigh supply. However, (5) contradicts that strategy, as it suggests supply is infinite for all intents and purposes. As stated previously, other miners will not sit at this equilibrium.

Assume my counter argument does not stand and Alice can still build "fat" blocks periodically regardless of the implications of (5). If she chooses to stick to this strategy despite the current equilibrium, the rest of the network will have a double incentive to orphan her:

1) Because of the first counter argument, as other miners know she is withholding all hash power until the mempool is attractive enough (according to (12), tx emitters know of her strategy, so there is no reason to believe other miners won't)
2) Because her blocks have higher than average fee density.
511  Bitcoin / Development & Technical Discussion / Re: Really not understanding the Bitcoin XT thing... on: September 05, 2015, 01:13:06 PM
The paper shows that as long as a non-zero amount of information about the transactions included in a block is communicated (on average) during block solution propagation, then the fee market will be healthy, according to the paper's definition of a healthy market. This assumption about information propagation has held over the entire history of bitcoin, will hold if the entire network uses the Corallo relay network, or adopts any implementation of IBLTs I can imagine.  The fee market exists. 

The whole point of "anti-friction" measures like relay networks and IBLT is to communicate as much data as possible about a block "out of band" (while the block is still being mined). This doesn't simply mean it takes fewer bytes to communicate a block; it also means the bytes-sent to bytes-used ratio won't necessarily grow linearly anymore.

If you have to communicate as many bytes as you use in a block, then your analysis stands. If a coding gain can be achieved in the form of lossless, flat-percentage compression, then the sent/used ratio will remain roughly static, describing linear growth. Again the model will stand.

If, however, the only data you have to transmit at propagation time is what you cannot communicate out of band, i.e. the block header, coinbase transaction and any txs that you know aren't part of others' mempools, then the data sent will grow only logarithmically with the data used, which is the same as saying that propagation speed (note speed, not time: higher speed means faster propagation) will increase exponentially with block size. This would put us in the case you describe as the "non-existent market".
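The three regimes can be told apart with a toy model. The sizes and the mempool hit rate below are assumptions for illustration only, not measured values:

```python
# Toy model of bytes emitted at solution time under three relay regimes.
HEADER = 80       # block header bytes
COINBASE = 250    # rough coinbase tx size (assumption)
AVG_TX = 500      # rough average tx size (assumption)

def bytes_sent(n_txs, regime, hit_rate=0.99):
    """Bytes a miner must emit when announcing a solved block."""
    block_bytes = HEADER + COINBASE + n_txs * AVG_TX
    if regime == "full":          # whole block sent at solution time
        return block_bytes
    if regime == "compressed":    # flat lossless compression: still linear
        return int(block_bytes * 0.5)
    if regime == "out_of_band":   # only what peers don't already have
        missing = int(n_txs * (1 - hit_rate))
        return HEADER + COINBASE + missing * AVG_TX
    raise ValueError(regime)

# As the hit rate tends toward 100%, doubling the block no longer
# changes the cost of announcing it.
small = bytes_sent(2_000, "out_of_band", hit_rate=1.0)
large = bytes_sent(4_000, "out_of_band", hit_rate=1.0)
assert small == large == HEADER + COINBASE
```

In the limit only the header and coinbase remain, which is the static-data situation described a few posts up.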

Quote
In hindsight, perhaps I should have chosen a word other than "healthy" that would have been less controversial.

Maybe "organic" or "self-sustained" are better wordings. Semantics are relevant in a research paper.

Quote
Nevertheless, I define precisely what I mean by a heathy market in Section 7.

This is from the introduction:

Quote
"...a healthy transaction fee market would develop which charges users the full cost to post transactions..."

The full cost to post transactions is not explicitly defined so it is reasonable to expect it means "the cost to maintain proper difficulty" (which is largely accepted as the main purpose of transaction fees as inflation diminishes), and this is the assumption I was operating on until I reached page 7.

Quote
You could argue that the market is "not healthy" (using a different definition) because the equilibrium block size in a free market would result in some negative externality (centralization--although I don't buy that).

If there exists a market condition where a miner's profitability per hash increases with total hash rate, then there is a centralizing pressure on mining, as larger miners will out-earn their competition. Either we're talking pools, and hash rate providers will centralize around a single large pool, or we're talking actual miners, and the largest of them all will win the "arms race" by default.

If you doubt propagation time can have this effect, then consider that Q* grows with a miner's hash rate, as they have to propagate their solution to a decreasing portion of the network to have it validated. After all, they only need to propagate to a cumulative 51% of the network's hash rate.

Quote
And then you would propose that a quota on the production of block space needs to be put into place, for some greater good.  However, before we go down the road of production quotas, we should take a long look at other economies where they've been implemented.  The most famous is of course the USSR, but even in Canada we have quotas on the production of eggs.  These were originally implemented to ensure that farmers could earn a living wage and the public could enjoy a reliable supply of eggs; however, they now serve to keep small famers out of the commercial egg market and result in all sorts of lobbying and palm greasing (and higher prices for consumers).  He who controls the quota wields the power to pick winners and losers...

If the block size limit is a production quota, then the coinbase reward is minimum wage or a government grant. Both are externalities and both have devastating effects on an economy. Yet both are currently in place in the Bitcoin network (and every altcoin out there has a mix of both), and the only true centralizing force at the moment is the heavily subsidized cost of electricity in certain parts of the world. Clearly the macroeconomics of classical commodity markets do not apply (at least as-is) to cryptocurrencies.

Quote
The model simple no longer applies as R/T -> 0.  It's neither right nor wrong.  It's sort of like an equation where you have 0 divided by 0.  It's undefined.  You need to evaluate the limit.

And yet that situation is defined outside of your model: some transaction emitters would mine their own transactions, and others will pay them to mine theirs. If your model cannot come to this conclusion then it is at least lacking something. Do you realize that according to your results, no one would ever mine on the testnet chain?
512  Bitcoin / Armory / Re: Negative balance after restore on: September 04, 2015, 08:44:49 PM
Start Armory. Pick User -> Expert Mode

Go to your wallet's properties dialog, click on the figure next to Addresses Used.

Pick an amount to extend the address chain with then click compute. It will rescan the wallet, then you should see your balance.
513  Bitcoin / Armory / Re: Negative balance after restore on: September 04, 2015, 06:22:19 PM
Try Help -> Rebuild and Rescan
514  Bitcoin / Development & Technical Discussion / Re: Really not understanding the Bitcoin XT thing... on: September 03, 2015, 11:11:30 AM
This doesn't happen.

May well in some future scenario, but certainly not now.

I am claiming it would happen with large propagation times, so we are not contradicting each other.

It's incendiary one way or another.

To get a second opinion, find someone who has worked in dynamic display ads and ask them how much they love data.  High value data like precise transaction logs probably has a broad range of opportunities for monetization which go far beyond simply spamming people with ads, but I suspect that just that alone would be enough to support a very robust monetary framework infrastructure and reason to work hard on gaining the largest footprint possible.

I am not doubting the value of data-mining financial transaction logs, but you would have to prove Hearn is trying to modify the network in a way that makes this feasible on the blockchain, and that such a method can remain private to a few select people in order to turn a profit from it first hand. After all, this dataset is public and there is no intuitive link between addresses and individuals.

Possibly open a book on what their sfII character of preference is  Cheesy Unless they're one of those "wildcard" assholes  Angry  ( Grin)

I like Ken
515  Bitcoin / Development & Technical Discussion / Re: Why I support Jeff's BIP100 on: September 03, 2015, 10:58:32 AM
Even with no block subsidy and assuming all other miners always sweep the mempool I expect F will be greater than zero.  I think Alice from the example above could still find a viable mining strategy in only accepting 10-satoshi transactions and waiting for the mempool to grow sufficiently "plump" before beginning to hash.

That's assuming block space demand is greater than block space supply. Otherwise the mempool would essentially be empty after every block. Clearly there is a debate on projected supply vs demand, but in the absence of a block size limit, I'm expecting the technological gain from fast relay networks will keep supply way ahead of demand pretty much forever.

My expectation is that as long as there is a realistic block size limit in place, the Nash equilibrium will put upward pressure on fees. In the absence of a realistic limit, the Nash equilibrium will induce the opposite effect and fees will not be sufficiently high to support proper difficulty.

So my response to this:

Quote
It may be profitable for a miner to increase the minimum fee they will accept

would be "only if the Nash equilibrium supports it", which is the same as saying that demand will outgrow supply, which implies there is not enough block space to wipe the mempool of fee-paying transactions.

The corollary to this statement would be that if a miner can wipe the mempool, then competing miners cannot afford to do any less.
516  Bitcoin / Development & Technical Discussion / Re: Proposal: dynamic max blocksize according to difficulty on: September 03, 2015, 10:38:28 AM
When the hash rate increases because of new ASIC, the difficulty rises, this will allow for the max block size to increase. This increases supply so transaction fees go down, which will cause an increase in block space demand. All of which if you think it over, it makes sense. The bitcoin network has gotten stronger because of a massive increase of hash rate due to new ASIC, therefore it stands to reason it should support more transaction throughput.

Again, there is no link between increased hash rate and block space demand. If a new ASIC tech is unleashed on the market tomorrow, why would transaction volume go up? Your local bank gets a new paint job, a larger parking lot, a McDonald's right next to it and a larger safe, and your reaction is "ima purchase more goods and services"? Where is your extra purchasing power coming from? Isn't the fact that your bank is upgrading the result of the fees it's been charging its customers? What leads you to believe said customers are somehow wealthier as a result? If anything, shouldn't it be the opposite?

You have demonstrated yourself that if your algorithm had been applied from day one, the current block size would be completely bloated and unrealistic. Your argument in support of your method is that ASIC technology has caught up on its technological debt and is now bound by Moore's law.

The implication is simple: difficulty growth is only a valid metric within certain bounds (that's your proposition, not mine; I'm just deducing). So again, how does your algorithm deal with situations where difficulty grows outside of its "healthy" bounds?

Quote
Intel has been around for decades, bitcoin only 6 years, of which 3 has shown dramatic growth.

That's not my point. My point is that Intel is not building Bitcoin ASIC miners right now. If Bitcoin's market cap grows big enough for Intel to start building mining hardware, chances are difficulty will grow much faster than Moore's law dictates for a few extra years.

Quote
It doesn't matter really at which rate block space demand grows. If it grows slowly, then transaction fees stay low and investment in mining will be low.

The fee market is not an end in and of itself. It is a means to support certain network mechanics, one of which is to pay miners enough to provide acceptable blockchain security. Not just acceptable, but high enough that it is more profitable for a brute-force attacker to just mine blocks than to try and rewrite block history.

You should design your proposal with the purpose of the fee market as your goal, not to simply sustain any fee market.

Quote
Unfortunately we don't have any other metric to determine the max blocksize.

There are plenty of other metrics in the blockchain that represent block space demand. Over a difficulty period, consider total fees paid, average fee density, UTXO set size, total value transferred, average value per UTXO, or the straight-up average block size. Plenty of stuff to get creative with.
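Most of these metrics are simple aggregations over raw block data. A sketch with illustrative field names (a plain dict-per-block model, not an actual Core API):

```python
def demand_metrics(blocks):
    """Aggregate block-space demand indicators over a difficulty period.

    Each block is a dict with 'size' (bytes), 'fees' (satoshis) and
    'value_out' (satoshis) -- illustrative fields, not a real format.
    """
    total_fees = sum(b["fees"] for b in blocks)
    total_size = sum(b["size"] for b in blocks)
    return {
        "total_fees": total_fees,
        "avg_fee_density": total_fees / total_size,   # satoshis per byte
        "total_value_transferred": sum(b["value_out"] for b in blocks),
        "avg_block_size": total_size / len(blocks),
    }

# Two made-up blocks for demonstration.
period = [{"size": 500_000, "fees": 10_000_000, "value_out": 2_000_000_000},
          {"size": 700_000, "fees": 21_000_000, "value_out": 3_000_000_000}]
m = demand_metrics(period)
assert m["avg_block_size"] == 600_000
```

Any of these aggregates (or a combination) could serve as the demand signal a resizing rule tests against.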
517  Bitcoin / Development & Technical Discussion / Re: Synchronizing wallet balance from a pruned node on: September 03, 2015, 10:09:07 AM
Yeah using the p2p layer could be used, with those bloom filters. The BitcoinJ implementation is bugged and doesnt provide privacy and reading through the documents it's not clear to me how that could be fixed.

I'm suggesting using the P2P layer with your local node (instead of RPC only) for the added functionality. Unless pruned nodes don't accept bloom filters (I have no idea whether they do or not), this is the easiest path to your functionality. And since the node is local, the privacy issue goes away.

I don't think there is a technical limitation preventing pruned nodes from fulfilling a bloom filter request. After all, they store the UTXO set, and the payment address can be extracted from each entry. There is no real difference in that regard compared to a full node, besides the smaller set of TxOuts.
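As a sketch of the idea, here is a minimal generic bloom filter (not BIP 37's exact construction) matched against UTXO scripts, the way a pruned node could serve a wallet's filter from its UTXO set alone:

```python
import hashlib

class BloomFilter:
    """Minimal bloom filter sketch -- probabilistic set membership."""

    def __init__(self, n_bits=1 << 16, n_hashes=4):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, item):
        # Derive n_hashes bit positions from salted SHA-256 digests.
        for i in range(self.n_hashes):
            h = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(h[:4], "little") % self.n_bits

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

# A pruned node tests each UTXO's script against the wallet's filter;
# scripts here are placeholder byte strings.
wallet = BloomFilter()
wallet.add(b"wallet_script_1")
utxo_scripts = [b"wallet_script_1", b"someone_else"]
matches = [s for s in utxo_scripts if s in wallet]
assert b"wallet_script_1" in matches
```

Bloom filters allow false positives but never false negatives, so the wallet gets every relevant UTXO plus occasional noise it can discard locally.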

Quote
Presumably I should read through the bitcoin dev mailing list to figure out how the developers imagined using a pruned node would work.
I don't see a way around requiring a pruning node to redownload the entire blockchain whenever a wallet is imported. Presumably this might end up as the accepted way of doing things, users should recover or import new wallets rarely.

Depends on the operating mode. If they do away with the wallet history feature and only stick to balances, it should be pretty straightforward. Or they could bootstrap a wallet à la SPV, once they replace these useless bloom filters with committed maps (I think that's the name).
518  Bitcoin / Development & Technical Discussion / Re: Why I support Jeff's BIP100 on: September 02, 2015, 11:45:30 PM
Imagine a world where basically all fees are <= 1 satoshi.  Suppose Alice is a miner with 5% of the network's total hashrate.  Alice could advertise that she will no longer be processing all transactions but only those that pay at least 10 satoshis.  Each bitcoin user now has the option of paying 9 extra satoshi to reduce the expected first transaction waiting time by about 30 seconds.  Supposing this extra utility is worth the 9 satoshi in enough cases, Alice would increase her revenue.

Not necessarily. You should look at the problem the other way around. If all miners only mine transactions with fee F, and suddenly one of them decides to just indiscriminately wipe the mempool for every block he finds, then the average fee will go down.

You should also consider the tapering of inflation. Currently the coinbase reward makes up the grand majority of miner revenue, so miners can afford to mine small blocks as a result of refusing to integrate any transaction with fee < F. That will certainly push the average fee up. However, as the coinbase reward keeps diminishing, we will eventually reach an equilibrium where a miner cannot afford to mine too small a block (based on the fee density he expects) and will either have to take on transactions paying below F, or not mine blocks until the mempool is "plump" enough (which is not viable).
519  Bitcoin / Development & Technical Discussion / Re: Synchronizing wallet balance from a pruned node on: September 02, 2015, 10:40:42 PM
One issue I've just realised is the gettxout call also requires a numeric vout value. That's the number that goes with the txid.
There's no way to tell how many outputs a transaction has, so best you could do is try all numbers from zero to about 30 or 40 (?) And then you're wasting a lot of time and still might miss outputs.

As long as you have a tx size, you can guesstimate the upper bound for the TxOut count per tx. Short of that, block size could give you a broader range, but then you would have to resolve tx hashes to block heights.
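The guesstimate can be sketched as follows. The minimal output size and the fixed overhead are rough assumptions for illustration, not exact serialization rules:

```python
def max_vout_count(tx_size_bytes):
    """Upper bound on the number of outputs in a tx of a given size.

    Assumptions (rough, not exact serialization): a minimal output is
    ~9 bytes (8-byte value + 1-byte script length, ignoring the script
    itself), and the tx carries ~51 bytes of fixed overhead (version,
    locktime, counts, one minimal input).
    """
    MIN_OUTPUT = 9
    OVERHEAD = 51
    return max(0, (tx_size_bytes - OVERHEAD) // MIN_OUTPUT)

# Instead of probing vout 0..40 blindly, cap the probe range by tx size:
for vout in range(max_vout_count(300)):
    pass  # a gettxout(txid, vout) RPC call would go here
```

The bound is loose (real outputs carry scripts and are much larger), but it turns an open-ended probe into a bounded one.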

Not sure how much of that data is available through the RPC in pruned mode. I'm very familiar with block chain analysis but I work directly with raw block data, never through the RPC. Maybe you are better off using the P2P layer in an attempt to query more relevant data.
520  Bitcoin / Development & Technical Discussion / Re: Synchronizing wallet balance from a pruned node on: September 02, 2015, 06:41:48 PM
3) is a bad idea, since you should expect a DB engine to lock access to a single process by default. I would not base my code on this assumption, which implies you would need to bootstrap your own history DB without Core running, then use a different code path to maintain your own DB straight from block data.

2) is tedious and what are the chances that would be merged into Core?

1) is how I would do it: pull all blocks and check each for relevant UTXOs. This process can be very fast if you parallelize it, but you don't necessarily need to, since only the original bootstrapping will be resource intensive. Maintenance won't be nearly as costly and can use the exact same code path with bounds on block height.
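The bootstrap scan described in 1) can be sketched like this. The block and tx fields are illustrative stand-ins, not Core's actual block format:

```python
from concurrent.futures import ThreadPoolExecutor

def scan_block(block, wallet_scripts):
    """Return (txid, vout, value) for every output paying a wallet script.

    'block' is a pre-parsed block as a dict of txs; field names are
    illustrative, not Core's real serialization.
    """
    hits = []
    for tx in block["txs"]:
        for vout, out in enumerate(tx["outputs"]):
            if out["script"] in wallet_scripts:
                hits.append((tx["txid"], vout, out["value"]))
    return hits

def bootstrap(blocks, wallet_scripts, workers=4):
    """Parallel bootstrap scan; maintenance can reuse scan_block on new
    blocks only, bounded by height."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda b: scan_block(b, wallet_scripts),
                                blocks))
    return [hit for block_hits in results for hit in block_hits]

# Two toy blocks; only "s1" belongs to the wallet being bootstrapped.
blocks = [{"txs": [{"txid": "aa", "outputs": [{"script": "s1", "value": 50}]}]},
          {"txs": [{"txid": "bb", "outputs": [{"script": "s2", "value": 10}]}]}]
assert bootstrap(blocks, {"s1"}) == [("aa", 0, 50)]
```

Since `map` preserves input order, the hits come back in block order, which keeps the maintenance path (scan only new blocks above a height) identical to the bootstrap path.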