Bitcoin Forum
2041  Bitcoin / Armory / Re: [ANN] Armory 0.93 Official Release on: February 21, 2015, 08:46:15 PM
Great work as always.  I particularly appreciate the highly accessible source code (last night, GnuPG's source made me a very sad panda).
Any thoughts on BIP0039?  I'm not so worried as I've written the functionality I desire myself; I'm just curious.
Personally I wouldn't implement it; I consider it an ill-advised and harmful feature.  Keep in mind that there can be a BIP for anything someone wants to use; having a BIP is not a mark of quality.
2042  Bitcoin / Development & Technical Discussion / Re: Are very old clients still able to participate on the bitcoin network? on: February 20, 2015, 05:57:08 PM
EDIT: I think I have found the answer.
Current clients can only talk to other clients which have at least version:

static const int GETHEADERS_VERSION = 31800;


However, Bitcoin Client v0.3.24 has only version 10300 and therefore it cannot connect to other peers.
0.3.24 has version 32400, but nodes that old will almost certainly get stuck pretty much right away due to old bugs.
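The version gate above can be sketched in a few lines; `GETHEADERS_VERSION` is the constant quoted from the source, the helper function name is my own:

```python
# Sketch of the peer-version gate discussed above.
GETHEADERS_VERSION = 31800  # constant quoted from the source code above

def can_sync_headers(peer_version: int) -> bool:
    """Modern nodes only request headers from peers at or above this version."""
    return peer_version >= GETHEADERS_VERSION

assert can_sync_headers(32400)       # 0.3.24 advertises protocol version 32400
assert not can_sync_headers(10300)   # a peer this old would be ignored
```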
2043  Bitcoin / Development & Technical Discussion / Re: Why Reduce the Block Reward? on: February 20, 2015, 09:29:12 AM
by the block reward.
By subsidy and transaction fees. You're missing part of it. Smiley

Quote
and transactions need to expire after some time, a year for example,
And then miners can simply confiscate anyone's money by ignoring their transactions until they expire and take it for themselves.

Quote
Expired transactions, and all previous transactions with no other dependencies, can then be pruned from the blockchain.
No expiration is needed to prune data. This is explained in the Bitcoin whitepaper (and implemented in an even more effective form in Bitcoin Core).
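The whitepaper-style pruning mentioned above works because transactions hang off a Merkle tree: spent transactions can be discarded and replaced by their interior hash without changing the root. A toy sketch (double-SHA256 and odd-leaf duplication as in Bitcoin; the four fake transactions are placeholders):

```python
import hashlib

def h(b: bytes) -> bytes:
    """Bitcoin-style double SHA256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(leaves):
    level = leaves[:]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate last hash on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

txs = [h(bytes([i])) for i in range(4)]  # four placeholder transactions
root = merkle_root(txs)

# Prune tx0 and tx1: discard them, keep only their combined interior hash.
stub = h(txs[0] + txs[1])
pruned_root = h(stub + h(txs[2] + txs[3]))
assert pruned_root == root  # the root is unchanged, so the block still verifies
```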
2044  Bitcoin / Development & Technical Discussion / Re: bitcoind JSON RPC performance declining over the blockchain? on: February 20, 2015, 09:24:28 AM
Why are you assuming the performance has anything to do with bitcoind? JSON is just inherently a bit slow, but the performance shouldn't depend on where the transactions are located; it should put out a more or less equal number of transactions per second.  Your database, on the other hand, will slow down sharply as you insert more records into it.  A quick test here shows it reading the same number of transactions per second at height 150k and 300k.

General purpose databases tend to perform very poorly for Bitcoin applications, especially if you're carrying specialized indexes... as the bitcoin data implies a large number of really tiny records for most ways of splitting it out.
2045  Bitcoin / Development & Technical Discussion / Re: getblocktemplate on testnet giving wrong target and bits? on: February 19, 2015, 11:02:59 PM
It is also possible that everything is fine, and jerks are just generating testnet blocks two hours in the future to mess with me.
Well, not to mess with you but because they can. Of course, if you generate a single block at the full difficulty you'll reorg out a wad of those blocks that came after you... so they shouldn't be blocking you from mining, only potentially from mining at difficulty 1.
2046  Bitcoin / Development & Technical Discussion / Re: getblocktemplate on testnet giving wrong target and bits? on: February 19, 2015, 06:59:46 PM
Getblocktemplate is working correctly there.

Getdifficulty is telling you the difficulty of the prior block. "Difficulty" is not an interface that is suitable for use in mining; it's fundamentally imprecise. It's a human-friendly number for display.
2047  Bitcoin / Development & Technical Discussion / Re: Pruning OP_RETURNs with illegal content on: February 19, 2015, 02:11:34 AM
There is no reason for you, personally, to keep around old transactions for things buried in the blockchain. Pruning already removes old transactions and signatures, and a full verifying node can happily be run this way.
2048  Bitcoin / Development & Technical Discussion / Re: Multi-language consensus library on: February 19, 2015, 02:08:39 AM
Are there any languages which can't call a static external c library?  I think this is a solid solution and one of the things I am excited about in the latest release.  To my knowledge C# (.net), Java, go, and python all support calling c libraries.  Maybe we can put together some requirements (data types, etc) to ensure the library remains easily callable in a variety of languages.  I hope to see libconsensus expanded significantly in the future.  It is the first step forward in ensuring the safe development of alternative full nodes.
Kinda. There are hosting providers that will only allow you to run code written in some trendy language or another, with no native code libraries (I don't know the details as to why). There have been some large and high-profile Bitcoin services running in those hosting environments and thus "unable" to run native code, and thus very interested in a complete reimplementation in other languages. I don't know how relevant that kind of motivation will be in the future.

Quote
Anyone know if bitcoinj and other libraries intend to integrate libconsensus?
I think it's really too immature to say right now. At the moment it's just script.


Quote
Quote
Beyond libconsensus there is the idea of reducing the consensus code to a bytecode with a trivial interpreter. We're not yet sure how well this will work, but it's something people are also working towards. Libconsensus is a necessary first step which is useful even if the bytecode path doesn't work out.
Interesting.  Do you have any links?  
It's mostly been IRC discussion over the last couple of years-- it's a pretty low-priority effort, especially since libconsensus is a hard prerequisite: it's unreasonable to put a whole implementation in a slow bytecode, so first the consensus parts must be completely isolated into pieces with limited interaction. There has been some experimental work which has had some payoffs, e.g. http://moxielogic.org/blog/real-world-multiply.html.

The idea is simple enough: you can create a C-targetable load/store machine instruction set which can be run with a <1000 line switch statement (moxie is such an example), one which is simple enough to formally specify and even prove that multiple distinct implementations match the specification.  The consensus code just gets compiled to a bytecode and then everyone can use the same bytecode.  The challenge is that a simple machine may have unacceptably low performance, while adding a general JIT-like thing to your VM carries insane risks and makes it much harder to reason about or implement exactly. One possible solution is extending the architecture with some crypto blocks, similar to how many embedded processors have multimedia accelerators-- macroscopic hardcoded units that do things like perform a whole 8x8 DCT-- so the instruction set is a big switch statement with a small amount of special-case handling that does things like compute SHA256 with native code. It's relatively easy to be quite confident that an implementation of SHA256's compression function is correct; other crypto implementations, less so.  Hopefully it's possible to add just enough native accelerators to get acceptable performance without greatly increasing the implementation complexity/risk.  Otherwise the pure bytecode approach will be slow enough that people would either JIT it or replace it with a native implementation and defeat the safety gains.
2049  Bitcoin / Development & Technical Discussion / Re: Get all Bitcoin Addresses with balance >= 0.00000001 on: February 18, 2015, 06:47:15 PM
The system itself doesn't have balances in any direct way, so that's one reason it's not as simple as you might guess.

What precisely are you trying to accomplish?

"What precisely are you trying to accomplish?"

I work for the NSA and I need to be able to get this list ASAP, we are trying to crack Bitcoin.

Just kidding Cheesy I just got into Bitcoin technically recently and I'm trying to figure out how everything works...

Basically what I'm searching for is for a way to query the blockchain locally.

But I find it very difficult to find a way. I use the blockchain.info API, but when you want to make millions of requests the API is useless, since there are request limits and obviously it's slow.

I would like to do complex queries, but if I am just able to get all the current addresses with a balance > 0 I will be happy enough.

Maybe this info is available somewhere?

Any ideas?
People often ask questions like that out of ignorance about how the system works, so it's useful to ask why they're asking; as mentioned the data you seek is not relevant to the operation of the system.  You still haven't explained what you're trying to accomplish. Knowing it can help people provide more useful answers.
2050  Bitcoin / Development & Technical Discussion / Re: Get all Bitcoin Addresses with balance >= 0.00000001 on: February 18, 2015, 11:24:45 AM
The system itself doesn't have balances in any direct way, so that's one reason it's not as simple as you might guess.

What precisely are you trying to accomplish?
2051  Bitcoin / Development & Technical Discussion / Re: Multi-language consensus library on: February 17, 2015, 08:42:35 PM
However, not all clients are written in c++.
Libconsensus is intentionally C-callable, so it can be used from any language that can call an external library.

Beyond libconsensus there is the idea of reducing the consensus code to a bytecode with a trivial interpreter. We're not yet sure how well this will work, but it's something people are also working towards. Libconsensus is a necessary first step which is useful even if the bytecode path doesn't work out.
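To illustrate "any language that can call an external library": a hedged sketch of binding libconsensus from Python via ctypes. The function name and flag value mirror my understanding of `bitcoinconsensus.h`; the library path and its availability on your system are assumptions, so treat this as a shape, not a drop-in binding:

```python
import ctypes

# Flag values as I understand them from bitcoinconsensus.h (verify against
# your header before relying on them).
VERIFY_NONE = 0
VERIFY_P2SH = 1 << 0

def verify_script(lib_path: str, script_pubkey: bytes, tx: bytes,
                  input_index: int, flags: int = VERIFY_P2SH):
    """Sketch: load libbitcoinconsensus and verify one input's script.

    lib_path is system-dependent, e.g. "libbitcoinconsensus.so" on Linux.
    """
    lib = ctypes.CDLL(lib_path)
    err = ctypes.c_int(0)
    ok = lib.bitcoinconsensus_verify_script(
        script_pubkey, len(script_pubkey),
        tx, len(tx),
        input_index, flags, ctypes.byref(err))
    return bool(ok), err.value
```

The same pattern applies in Java (JNI/JNA), Go (cgo), C# (P/Invoke), and so on, which is the point of keeping the interface plain C.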
2052  Bitcoin / Development & Technical Discussion / Re: Risks of big changes to Bitcoin Core on: February 16, 2015, 10:09:09 PM
There are some scary code in the reference implementation
What "scary code" are you referring to?
Quote
and it has already happened that a new version has hard forked the blockchain because of a bug.
This is incorrect. Versions prior to 0.8 were inconsistent with _themselves_: block verification was non-deterministic for some large blocks, with acceptance depending on the precise layout on disk of some database data structures. The fork was triggered by a miner that changed their settings to produce larger blocks than typical, and would have happened without any new versions in play.  The precise nature of the issue was initially misunderstood as being 0.8 vs. earlier, because all the 0.8 nodes were on one side, but in fact most of the split was pre-0.8 vs. pre-0.8.

This may or may not have much relevance to your thinking, but please get your facts straight. It's irritating to see this misinformation/misunderstanding continually repeated.

Quote
Let's say ~50% of the mining capacity uses the reference implementation and ~50% uses another implementation.
It's not what miners are using that matters. If you are mining and your blocks are being rejected by the users' systems because they fail validation, then you're not actually mining, regardless of how much hashrate you have.

Suppose a big change was actually made, there’s nothing forcing miners to take the update. What are some of the risks of having miners on the network using different versions of the protocol? Not necessarily malicious clients, just outdated ones?
(since you appear to be asking about intentional changes:)

When changes are adopted to the blockchain rules they're made in a way which is intentionally backwards compatible with old versions, called a "soft fork". This is accomplished by constructing the change so that it strictly narrows the set of permissible blocks. Because nothing invalid became valid, old nodes also accept these blocks.  To avoid issues where old nodes would create blocks that get forked off, an effort is made to only narrow the validity of blocks in ways that an old node wouldn't have constructed an invalid block, and the new rules are only activated when a strong super-majority of miners has signaled an intent to enforce them.
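The "strictly narrows the set of permissible blocks" property can be shown in a toy sketch. The predicates here are hypothetical stand-ins for real consensus checks; the point is only the containment relation:

```python
# Toy sketch of soft-fork narrowing: new rules = old rules AND an extra rule.
# These predicates are made up for illustration, not real consensus checks.
def old_valid(block) -> bool:
    return block["size"] <= 1_000_000

def extra_rule(block) -> bool:
    return block["version"] >= 2

def new_valid(block) -> bool:
    # A soft fork only ever ANDs on restrictions, never relaxes one.
    return old_valid(block) and extra_rule(block)

blocks = [{"size": s, "version": v} for s in (500, 2_000_000) for v in (1, 2)]

# Every block the upgraded nodes accept, the old nodes accept too,
# so old nodes keep following the same chain.
assert all(old_valid(b) for b in blocks if new_valid(b))
```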

Keep in mind that the vast majority of all conceivable changes are actually bad and open up new attacks: Most things people post about (myself included) turn out to be bad ideas on further reflection.  The 'mitigate 51% attacks' link you provided is, I think, one such example: that approach fails to prevent any interesting attacks (attacker can easily meet the criteria by including many of the prior transactions) and also opens up new attacks where none existed before (by intentionally preparing two forks and broadcasting to half the network simultaneously the network can be cheaply split and work against itself; this is a general flaw pattern in approaches that make block preference depend on multiple strong tie-breakers).
2053  Bitcoin / Development & Technical Discussion / Re: Why is difficulty a float number? on: February 16, 2015, 08:19:52 AM
It isn't: in the Bitcoin p2p protocol and blockchain there is no "difficulty", difficulty is just a display convention.  What the network uses is the 'bits' number which is a compressed representation of a 256 bit integer target which the block hash is compared to. To be valid the hash must be less than the target.

The difficulty number is a relative number which makes comparison easier than the huge target or the cryptically encoded bits field. Now that it's over 4 billion it's not as useful a unit as it was back when the difficulty was 100k.

There are some ignorantly constructed mining tools which try to use the for-humans 'difficulty' number for actually important purposes. There be dragons. (Among other things, errors related to converting block hashes to 'effective difficulty' have been blamed for miners discarding valid blocks in the past.)

It is my understanding difficulty is "the number of bits in the resulting hash". This is a very nice and fairly defined number in my opinion.
That's not correct, nor would it be very useful: if it were a straight leading-zero-bit check, difficulty could only change in huge doubling/halving increments, which could not adequately control the rate of blocks.
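The bits/target relationship described above can be sketched as follows (positive targets assumed; 0x1d00ffff is the well-known difficulty-1 compact value):

```python
def bits_to_target(bits: int) -> int:
    """Decode the compact 'bits' field into the full 256-bit target.
    Assumes a positive target (the sign bit 0x00800000 is unset)."""
    exponent = bits >> 24
    mantissa = bits & 0x007FFFFF
    return mantissa * 256 ** (exponent - 3)

MAX_TARGET = bits_to_target(0x1D00FFFF)  # the difficulty-1 target

def difficulty(bits: int) -> float:
    """The for-humans display number: how much harder than difficulty 1."""
    return MAX_TARGET / bits_to_target(bits)

# A hash is valid iff int(hash) < target; difficulty is only a ratio of targets.
assert bits_to_target(0x1D00FFFF) == 0xFFFF << 208
assert difficulty(0x1D00FFFF) == 1.0
```

Note that because the mantissa is only 3 bytes, the target (and hence work) is adjustable in fine steps, unlike a pure count of leading zero bits.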
2054  Bitcoin / Development & Technical Discussion / Re: Individual Block Difficulty Based on Block Size on: February 15, 2015, 11:45:29 PM
No, they can't.  They can only influence users' expectations of required fees in proportion to their hashrate, which you specified was small.
Only if block sizes are limited. If they are unlimited, even a tiny portion of the hashrate can continually clear the market at effectively no cost to themselves; that's the whole point of this post-- creating a cost for doing so.

Some useful thought fodder... Monero/Bytecoin/etc. do something like this, but they require the miner to throw away some of the coins they'd receive in their coinbase txn. The problem with that is that once the subsidy is small, miners can simply bypass it by accepting fees "out of band" (e.g. with additional txouts). Difficulty scaling avoids that trap.
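A rough sketch of the Bytecoin/Monero-style penalty mentioned above. The quadratic shape and the 2x cutoff follow my understanding of that design, but take the exact formula as illustrative rather than a faithful reproduction of any deployed coin:

```python
def penalized_reward(base_reward: float, size: int, median: int) -> float:
    """Subsidy penalty for oversize blocks: free up to the median size,
    quadratically penalized above it, rejected past 2x the median."""
    if size <= median:
        return base_reward
    excess = size / median - 1.0
    if excess > 1.0:
        return 0.0  # block over twice the median: invalid / worthless
    return base_reward * (1.0 - excess ** 2)

assert penalized_reward(10.0, 500, 1000) == 10.0   # under the median: no cost
assert penalized_reward(10.0, 1500, 1000) == 7.5   # 50% over: lose 25% of subsidy
```

Since the cost is paid out of the coinbase subsidy, it vanishes as the subsidy does, which is why out-of-band fees defeat it; a difficulty penalty is paid in work instead and can't be routed around that way.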
2055  Bitcoin / Development & Technical Discussion / Re: Individual Block Difficulty Based on Block Size on: February 15, 2015, 08:30:12 PM
tl;dr Non-mining nodes can in principle receive compensation for transaction handling without radical changes to Bitcoin.
I don't see why you think this is relevant. Other nodes' relaying plays no role in the economics of mining: miners can (and already do, and have since 2011) advertise locations which can receive transactions for them, and can do so trivially and in a jamming-proof way by listing them in the blocks they mine. Transactions then go straight from users to miners, which is inherently more efficient, since the multi-hop flooding network relay was just pure overhead (a product of trying to achieve decentralization at great bandwidth cost). Anyone can "in principle" pay anyone else, sure. But there is no obvious reason for anyone to do so.

foolish_austrian, I'm very interested in that kind of closed-loop control of difficulty... In the past when I worked on this I struggled to produce a difficulty-size relation which was well justified; there seems to be an infinite class of functions that work just about as well. "Why this shape and not this other one?"  Have you considered an absolute floor (e.g. the size limit can't go below 1MB regardless of what the average size is) and whether that creates weird incentives?

I tried doing some work where I assumed there was a unimodal optimum quantity and tried to prove (for some function) that the system would always find equilibrium there, but I was unable to.  All these 'quadratic curve'-like systems do have a nice property that differences in miner efficiency turn into differences in the block sizes they produce, thus equalizing their participation somewhat-- more efficient miners produce larger blocks than typical rather than just driving people out of the market quite so strongly (though this is less powerful when you assume transaction fees per byte are power-law rather than uniformly distributed)... but I was unable to prove that, even with homogeneous-cost rational miners, if there is a single supply level at which miner income is maximized, then the system has an equilibrium there. I think it would be really good to be able to show that.

Back to your writeup: can you check your figures? The function you've shown there doesn't obey the invariant that the result should be 1 when s,d = 1. Using 1/2 as both your coefficients does, but that disagrees with your illustrations; did you have some scale and offset you're not listing here?

Subsidy can be addressed by scaling it with the function (e.g. 2x difficulty gets you 2x the subsidy) and just letting it move the inflation schedule in or out somewhat, so the subsidy is neutral to the process... though it's annoying because it makes things more complex and harder to reason about.   Your notion of having coin-holders influence the target hashrate growth is interesting; though if you do so internally to the network with transactions, miners can censor them to change the result, so some care and thought must be taken there. It sort of combines a proposal I liked before (basically only increasing the block size if there are enough coin-days-destroyed voting for it) but thought perhaps was too costly to use directly.

As far as implementation goes... using floating point in consensus would be an unmitigated disaster. But that's no problem: a reasonable, cheap, fixed-point approximation will do-- especially for the limited range this function needs to operate over, a rational polynomial fit will work fine.
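A minimal sketch of what "fixed point instead of floating point" means here. The Q32.32 scale and the polynomial coefficients are made up for illustration; the point is that every intermediate value is an exact integer, so all nodes get bit-identical results on any platform:

```python
SCALE = 1 << 32  # Q32.32 fixed point: 32 integer bits, 32 fractional bits

def to_fixed(x: float) -> int:
    """Convert a constant to fixed point once, at specification time."""
    return int(round(x * SCALE))

def poly_fixed(x_fixed: int, coeffs_fixed) -> int:
    """Horner evaluation of a polynomial entirely in integer arithmetic.
    No rounding-mode or FPU quirks can make two nodes disagree."""
    acc = 0
    for c in coeffs_fixed:
        acc = (acc * x_fixed) // SCALE + c
    return acc

# Hypothetical curve 0.5*x^2 + 0.25*x + 1, coefficients highest power first.
coeffs = [to_fixed(c) for c in (0.5, 0.25, 1.0)]
assert poly_fixed(to_fixed(2.0), coeffs) == to_fixed(3.5)
```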
2056  Bitcoin / Bitcoin Discussion / Re: Don't Let Anyone Tell You Satoshi's Identity is NOT Important on: February 14, 2015, 09:18:22 AM
Click bait article; don't be suckered.

Bitcoin is a system governed by its software -- mathematical in nature, its operation is fundamentally transparent.  If Bitcoin is good, its goodness is available for your discovery and analysis, and likewise if it is not.   It does not matter if it was created by the CIA or whatever.  Any alternative identities of its creator are fundamentally uninteresting in any practical sense, as the whole value of the system is its autonomy, its independence from its creation (and from any other person or institution). If you think the details of its creation matter, you've failed to understand the system.

The only utility of that information is pointless tabloid gossip, or fodder for Bitcoin's political opponents who would use the humanity behind its origins to discredit it in the eyes of a public which has been too long steeped in opaque trust-based systems, where the integrity of the origin has considerable predictive power.

More fundamentally, it just isn't any of our business. That people are so constantly cavalier with these insane allegations (just about every long-time Bitcoin user has been accused of being its creator at one time or another, including some who barely understand the technology), without any consideration of the physical risk of harm these allegations can bring, is a constant source of disappointment for me.

If you want to show respect for the Bitcoin system and demonstrate a real understanding of its nature; say no to speculative bitcoin-creation-myth tripe and save your drama-gawking for the scam of the week where it might actually do some good.
2057  Bitcoin / Development & Technical Discussion / Re: What about kill or fill transactions? on: February 12, 2015, 11:05:37 PM
An invariant we've tried to avoid breaking is reorg safety.

Right now, if you receive a coin and all the participants in that coin are honest (and we fix/ignore malleability), no* reorg will make the coin become invalid. (*This is broken for recently created coinbase coins, but 'fixed' by making them unspendable for 100 blocks; so at least all coins you see confirmed are equivalent in their reorg safety for reorgs up to 100 blocks.)

Absent this you can get fun problems where a reorg can break an exponentially widening cone of transactions by accident or malice (on the part of someone who wasn't a participant).  To avoid it you might want to refuse to accept a coin which has had an unsafe event in its recent history, but absent the network informing you of this (e.g. by not allowing confirmation in the case of coinbase spends) you have to do potentially exponential work to check for these events, and checking-- of course-- makes the coins less fungible.
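The "widening cone" above is just transitive descendant invalidation. A toy sketch with a hypothetical spend graph (the tx ids are made up):

```python
from collections import defaultdict, deque

# Hypothetical spend graph: tx -> parents whose outputs it spends.
spends = {"b": ["a"], "c": ["a"], "d": ["b", "c"], "e": ["d"], "f": ["x"]}

children = defaultdict(list)
for tx, parents in spends.items():
    for p in parents:
        children[p].append(tx)

def invalidated_cone(root: str) -> set:
    """If `root` is reorged out and becomes invalid, everything that
    directly or transitively spends it dies with it."""
    dead, queue = set(), deque([root])
    while queue:
        tx = queue.popleft()
        if tx in dead:
            continue
        dead.add(tx)
        queue.extend(children[tx])
    return dead

# One broken coin takes down four descendants; "f" is untouched.
assert invalidated_cone("a") == {"a", "b", "c", "d", "e"}
```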

FOK can already be handled by successfully conflicting with another spend... and you already wanted to get a transaction confirmed in any case; if you didn't care, you wouldn't mind that it hasn't gone through yet. In that case only parties to the unconfirmed transaction can kill it, and I think that's generally better than letting miners (or chance) handle it.

But at the same time, some useful things are incompatible with reorg safety (e.g. using the blockchain as a random beacon in transactions) and I can't say with confidence that the improved fungibility of strong reorg safety justifies not having those properties... it's just something to think carefully about.  It may well be that some kind of coinbase-like or state-flagging approach (e.g. you can spend an immature unsafe coin, but your spend has to propagate a time-till-mature counter which is the max of the inputs' counters) would be an adequate balance for these concerns.
2058  Alternate cryptocurrencies / Altcoin Discussion / Re: Lightweight Cryptos without Block Chain Bloat (Coq Development and Paper) on: February 09, 2015, 12:34:45 AM
Greetings, interesting work.

I'm not sure if you're aware of it, but Bitcoin Core hasn't used the blockchain for verification for years; instead it uses a compact state summary and differences, which we call the UTXO set (unspent transaction outputs). Among other things, this allows you to run a Bitcoin full node with under 1GB of storage.

The UTXO set is not authenticated in Bitcoin Core today; it's just locally generated-- though the utility of a committed UTXO set has been known for years (also, somewhat later, https://bitcointalk.org/index.php?topic=88208.0), it's a rather expensive commitment (e.g. a single spend/create update requires a factor log(utxo) more IO) and perhaps not as useful as it seems-- if nodes do not construct their own but jump in mid-stream they only have SPV security for the history.  SPV security is a substantial step down from the Bitcoin security model (in particular, people expect the full node rules to limit the damage of dishonest miners) but it might still be reasonable-- but if you're willing to accept SPV security, why not go all the way and use an SPV node at a massive cost savings?
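The UTXO-set style of verification described above can be sketched with a toy ledger; the outpoints and values here are made up:

```python
# Toy UTXO set: the validator keeps only unspent outputs, not history.
utxos = {("coinbase0", 0): 50}  # (txid, output index) -> value

def apply_tx(txid: str, inputs, outputs) -> None:
    """Consume the spent outpoints and create the new ones.
    A missing outpoint (double spend / unknown input) raises KeyError."""
    value_in = sum(utxos.pop(outpoint) for outpoint in inputs)
    assert value_in >= sum(outputs), "outputs exceed inputs"
    for i, value in enumerate(outputs):
        utxos[(txid, i)] = value

apply_tx("tx1", [("coinbase0", 0)], [30, 20])
assert ("coinbase0", 0) not in utxos   # spent output is gone for good
assert utxos[("tx1", 1)] == 20         # only the live outputs remain
```

Each update is a small delta to the set, which is why the full historical chain isn't needed for ongoing verification.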

With respect to proof of stake, I'm disappointed to see it just applied here with no analysis of the actual security model being used. It appears to be impossible to achieve the same security model as Bitcoin with POS; existing systems compromise with centralized block signers or other external sybil-proof beacon assumptions (which seemingly could just replace the blockchain entirely if you really trusted them). For some applications these might be reasonable trade-offs, but if they're ignored or pretended to be the same trade-offs as Bitcoin, that's just sloppy cryptography. I'd encourage you not to take these things for granted.

I haven't reviewed your Coq development yet, but have you benchmarked the performance of your extractions?  Having formal statements of the system is interesting indeed, but perhaps of low utility if there isn't a way to get extractions with acceptable computational/memory performance.

Quote
Create an initial "genesis" ledger in which each address holds assets corresponding to the unspent transaction outputs that can be spent by that address. (I don't know a general way to handle P2SH, so I will ignore these.)
The parenthetical at the end suggests that you could use some improvement in how you're thinking about the signature system in Bitcoin.  You shouldn't think in terms of addresses; addresses are just human-friendly (compact) ways to encode a scriptPubKey.  A scriptPubKey is a public key for the script cryptosystem, and as in other signature systems, a public key identifies the criteria a signature must satisfy to authenticate. A UTXO-oriented system should track scriptPubKeys, not addresses. (And P2SH itself shows how to make those entries constant size, with computational security.) What Bitcoin does is much more general than basic multisignature-- in particular, atomic swaps are not possible in a model which is constrained to simple thresholds. (Also, thresholds can be done much more efficiently than multisignature in any case, if supporting only thresholds was really the goal.)
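To make "an address is just an encoding of a scriptPubKey" concrete, here is the standard P2PKH script assembled from a 20-byte hash160 (the opcodes are the real Bitcoin values; the all-zero hash is a placeholder):

```python
# Standard pay-to-pubkey-hash scriptPubKey:
#   OP_DUP OP_HASH160 <20-byte hash160> OP_EQUALVERIFY OP_CHECKSIG
OP_DUP, OP_HASH160, OP_EQUALVERIFY, OP_CHECKSIG = 0x76, 0xA9, 0x88, 0xAC

def p2pkh_script(hash160: bytes) -> bytes:
    """Build the scriptPubKey a '1...' address decodes to."""
    assert len(hash160) == 20
    return (bytes([OP_DUP, OP_HASH160, 20]) + hash160
            + bytes([OP_EQUALVERIFY, OP_CHECKSIG]))

script = p2pkh_script(bytes(20))  # placeholder hash160
assert len(script) == 25
assert script[0] == OP_DUP and script[-1] == OP_CHECKSIG
```

A ledger keyed on such scripts handles every output type uniformly, including ones (like P2SH) that no "address math" model captures.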

What you're calling a 'snapshot' has been described before (by PeterR) as 'spinoffs'.
2059  Alternate cryptocurrencies / Altcoin Discussion / Re: Trustless, Tradable, Reality-Based Contracts on cryptonote coins on: February 07, 2015, 11:57:10 PM
hm, previously that response directed me to the other thread, but once I responded there the text I was responding to moved here. Confusing! Smiley

2) My use of the term "reality based" is a reference to the fact that each multisig script on Bitcoin is independent from any other script on the blockchain. There is no link between what should be two identical contracts, and so two identical contracts can potentially be scored differently
::Sigh:: The common method of key-reveal for binary contracts involves no per-contract processing by the oracle; they reveal one key or the other in public. In your language, "the data scores the contract".
2060  Bitcoin / Bitcoin Discussion / Re: Permanently keeping the 1MB (anti-spam) restriction is a great idea ... on: February 07, 2015, 07:27:31 AM
In the end, the whole block of transactions must be present on the blockchain at the most distant end of the network in 3 minutes to allow newly discovered blocks to be added upon it. Ideally, you need to transmit 20MB data in 1-2 minutes. Maybe it is possible to use multi-threaded P2P downloading to accelerate the data transfer
Blocks are just transaction data, almost all of which has already been relayed through the network. All one has to send is a tiny set of indexes indicating which of the txn in circulation were included and in what order, and there are already alternative transports that do this (or even less-- just a difference between a deterministic idealized list and the real thing).  The data still has to be sent over the network, so this doesn't fundamentally improve scaling (just a constant factor), but it gets block size pretty much entirely out of the critical path for miners.
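The index-based relay idea above can be sketched in a few lines; the shared transaction pool and tx ids are hypothetical, and real transports additionally handle peers whose pools differ:

```python
# Toy sketch of index-based block relay: peers already hold the transactions,
# so a new block can be announced as indexes into a shared, agreed ordering.
mempool_order = ["txA", "txB", "txC", "txD", "txE"]  # hypothetical shared view
index_of = {tx: i for i, tx in enumerate(mempool_order)}

def encode_block(block_txs):
    """A few small integers instead of megabytes of transaction data."""
    return [index_of[tx] for tx in block_txs]

def decode_block(indexes):
    return [mempool_order[i] for i in indexes]

block = ["txD", "txA", "txC"]
wire = encode_block(block)
assert decode_block(wire) == block  # receiver reconstructs the exact block
```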