Bitcoin Forum
June 24, 2024, 01:21:20 AM *
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 [40] 41 42 43 44 45 46 »
781  Bitcoin / Pools / Re: Network latency, stale shares, and p2pool on: February 28, 2012, 07:25:34 PM
It does make a difference. What happens to your share if it becomes an orphan block on the share chain? Other miners work on the competing block and you lose out on any dividends from your share. If my analysis in the OP is correct, this should be happening an alarming 9% of the time. A traditional pool, on the other hand, would count both shares as valid and pay dividends to both miners.

Again, if p2pool implemented a means of "merging" divergent share chains, that would adequately address this problem. Until then and ignoring p2pool subsidies, it's more profitable to mine an (honest) traditional pool.
782  Bitcoin / Pools / Re: Network latency, stale shares, and p2pool on: February 27, 2012, 07:28:20 AM
Stale/orphan shares on YourTraditionalPool can also be valid block solutions. That's not the issue (the chance of your orphan block being a valid solution is just under 0.15%). The issue is that stale data occurs approx. every 10 minutes for a traditional pool, but every 10 seconds for p2pool while the network latency presumably remains the same (and therefore is of 60x more consequence). The discrepancy is because traditional pool operators are not required to linearly order shares into an auxiliary blockchain as p2pool does.

Put differently, when some other miner finds a share in a traditional pool, I don't have to update my unit of work unless that share is a valid bitcoin block solution. But with p2pool I do--every time--and often enough that network latency has a significant dampening effect. Traditional pools have a "wasted work" overhead of about 0.15%, but as this effect is small and nearly the same across the board, it's not something we talk about much. However, p2pool is unique in requiring synchronization 60x as often, making this effect very real, measurable, and significant.

Put differently again, if my overclocked GPU is reporting 400MH/s and I signed up for deepbit, to name one pool just for the sake of argument, my useful contribution--after subtracting time spent on stale data that doesn't have a chance of being accepted--would be closer to 399.4MH/s. That's such a small difference that it's easily lost in the noise and hardly worth talking about... except that with p2pool, my useful contribution would be 356MH/s--a full 43MH/s less. That is significant, and worth complaining about.
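The effective-hashrate comparison above can be sketched in a few lines. This is an illustration only, assuming wasted work per round is half the p2p propagation delay; it reproduces the ~399.4 figure for a traditional pool and gives a p2pool number in the same ballpark as the one quoted:

```python
# Illustrative sketch: useful hashrate after subtracting work done on stale
# data, assuming stale time per round is half the p2p propagation delay.
def useful_hashrate(reported_mhs, propagation_s, interval_s):
    stale_fraction = (propagation_s / 2) / interval_s
    return reported_mhs * (1 - stale_fraction)

print(useful_hashrate(400, 1.9, 600))  # traditional pool (~10 min rounds): ~399.4 MH/s
print(useful_hashrate(400, 1.9, 10))   # p2pool share chain (~10 s rounds): ~362 MH/s
```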


BTW, there is a solution, although it is not easy: create a procedure within p2pool for the merging of divergent/orphan chains.


EDIT: Put even more simply: what happens to orphan "blocks"/shares in p2pool? Therein lies the problem.
783  Bitcoin / Pools / Re: Network latency, stale shares, and p2pool on: February 23, 2012, 06:47:25 PM
That's not how the bitcoin wiki describes p2pool...

Quote
P2Pool shares form a "sharechain" with each share referencing the previous share's hash... The chain continuously regulates its target to keep generation around one share every ten seconds, just as Bitcoin regulates it to generate one block every ten minutes.

The wiki also says:

Quote
On P2Pool stales refer to shares which can't make it into the sharechain. Because the sharechain is 60 times faster than the Bitcoin chain many stales are common and expected. However, because the payout is PPLNS only your stale rate relative to other nodes is relevant; the absolute rate is not.

Which is misleading. Stales don't affect your payout relative to other p2pool nodes (assuming everyone is equally well connected, which is actually not a good assumption), but they do affect the overall payout per unit of work crunched of p2pool vs. other pools.

EDIT: clarified wording
784  Bitcoin / Pools / Network latency, stale shares, and p2pool on: February 23, 2012, 05:58:02 PM
My understanding is that p2pool operates an alternative blockchain with a 10-second target interval, where each found block constitutes one share. Somewhat coincidentally, I have previously looked at the properties of an altchain with 10-second intervals, and what I found was that network latency/p2p propagation time was a significant factor. On bitcoin proper it takes about 1.9 seconds for a newly minted block to find its way to all the bitcoin peers. Assuming you're solo mining with long-poll enabled, that means you're working on a stale block 0.15% of the time, on average (since you can assume the time it takes a block to reach just you is half the latency). But on a 10-second round, that likelihood is now 9%--several times larger than even the most exorbitant pool operator fees.

Is this accurate, or am I doing something wrong with my math?
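For what it's worth, here is the arithmetic I'm describing, as a short sketch (assuming, as above, that on average a new block has been in flight for half the full propagation time before it reaches you):

```python
# Probability of working on stale data: on average a freshly found block
# takes propagation_s / 2 to reach you, out of a round lasting interval_s.
def stale_probability(propagation_s, interval_s):
    return (propagation_s / 2) / interval_s

print(f"{stale_probability(1.9, 600):.3%}")  # bitcoin's 10-minute blocks -> 0.158%
print(f"{stale_probability(1.9, 10):.1%}")   # a 10-second share chain -> 9.5%
```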

EDIT: I can't find any reorglog statistics for p2pool online, which we could use to determine p2pool's network latency... if someone has kept track of this, let me know.


EDIT2: P2Pool is safe to use! My analysis above, while not incorrect, is incomplete and only half of the story. The amount paid per share that *does* make it into the share chain also increases by approximately a corresponding amount, negating the effect of stale work.
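A toy calculation of why the two effects cancel (a sketch under the assumptions that every miner has the same stale rate and the pool's block-finding rate is unaffected; the numbers are made up for illustration):

```python
# If a fraction `stale` of everyone's shares is orphaned, both my accepted
# shares and the pool-wide total shrink by the same factor, so my slice of
# each block reward -- and thus payout per hash -- is unchanged.
def payout_per_hash(my_hashrate, pool_hashrate, stale, block_reward=50.0):
    my_shares = my_hashrate * (1 - stale)
    total_shares = pool_hashrate * (1 - stale)
    return block_reward * my_shares / total_shares / my_hashrate

print(payout_per_hash(400, 4000, stale=0.00))  # 0.0125
print(payout_per_hash(400, 4000, stale=0.09))  # 0.0125 -- the stale rate cancels
```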
785  Bitcoin / Armory / Re: Armory - The most advanced Bitcoin Client in existence! on: January 27, 2012, 03:00:56 AM
Ehh, I don't agree.
I live in the US too, so "Armory" is the preferred spelling for me as well, and such a change is a feature that would not benefit me at all.  At the least, I suggest this might come up again, especially if the code attracts other contributors interested in localizing for other languages.

Actually, translations are something I want to support.  In fact, maybe I should make that a high priority, given your optimism about widespread adoption Smiley

The easiest way I can think of would be to offload basically every string/message in the program to a dictionary in a separate file (i.e. msgWarnDeleteWallet['en']="...").  Then users can replace the file with translated versions, which can be bundled with the program and selected through command-line options.

Is there an "official" way to do/support this?  Or is my theoretical technique sufficient?

"gettext" is what you are looking for.
786  Bitcoin / Armory / Re: Armory - The most advanced Bitcoin Client in existence! on: January 03, 2012, 05:21:24 PM
This may fly for blocks buried deep enough in the chain. But for recent blocks you need to check that they are actually valid before propagating them further. Otherwise you get into a situation where nobody does any checking, because everyone trusts that everyone else will do the checking and only feed them clean data.
That's what the Satoshi client does, which this client interfaces with. All he's doing is trusting that the Satoshi client is doing its job.
787  Bitcoin / Wallet software / Re: libbitcoin on: December 31, 2011, 11:39:37 PM
I, too, would use and contribute to this project if it were released under a more permissive license (BSD preferably, LGPL if you must). As it stands I cannot even look at this code due to the possibility of legal risk to my company.
788  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 30, 2011, 07:29:28 AM
ArtForz hits my main complaints. As I said, either very rushed or not very competent. Given that we don't know the facts I'll be polite and assume it was the former. Maybe he really rushed to meet that new years deadline three years ago.
789  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 27, 2011, 11:56:22 PM
yes.
790  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 27, 2011, 07:52:40 PM
Guys, let's stay on topic. If you want to debate that, start a new thread.
791  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 27, 2011, 08:05:37 AM
@BP, yes it's the same company, but that post is not representative of our current projects.
792  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 25, 2011, 09:16:21 AM
While I generally agree with your sentiment, I take issue with a few of the details. Satoshi is an interesting case in that he clearly thought out many of the advanced use cases of bitcoin, and took baby steps toward implementing/enabling them before releasing the client into the wild. As an example, all of the contracts/scripting code was not necessary for bitcoin to do what bitcoin did at launch, and indeed some features were clearly untested, as they did not work as advertised. However, Satoshi was either very rushed or not a very competent programmer by industry standards, and that shows up in his code. Much of the work Gavin and others have done is just cleaning up that mess. Also, Satoshi demonstrates ignorance of the state of the art in academic computer science, unnecessarily cutting off valuable future enhancements. Finally, while Satoshi had the foresight to imagine complex use cases in the future, he lacked the imagination to think up many applications outside of tech and finance, which the current implementation unnecessarily hinders.

I have the utmost respect for Satoshi. He single-handedly took anonymous crypto-cash, which we've been talking about since the 90's, from concept to practicum. Beyond my respect as a fellow engineer, my company hopes to profit greatly from that. But... deification doesn't help anybody. It only obscures existing flaws and potential for future improvements.

The other aspect is that the economic underpinnings of bitcoin are fringe and esoteric, and its application in practice as a currency is clunky and, well, not very practical (I'm being honest and stating my informed opinion--not trying to start a flamewar!). Don't underestimate how much of a turn-off that is for the vast majority of developers and business people, who probably have real-world experience using e-payment solutions like Braintree, WePay, Stripe, or Square, or were trained in Keynesian economics. To these people (I count myself among them), the bitcoin project is seen as a combination of childish optimism and inexcusable ignorance. From that standpoint, the people bitcoin attracts are 1) the fringe intellectuals (Austrian-school economists, extremist libertarians, etc.), 2) engineers who recognize the significance and disruptive nature of this technology, 3) those with something to hide (tax evasion, drugs, money laundering, general profiteering), and 4) the ignorant and/or naïve. While I of course count all the good sirs I have conversed with here in the first two categories, unfortunately bitcoin has attracted predominantly #3's and #4's, and in this subforum that is more true than in others.
793  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 25, 2011, 06:34:42 AM
2013 will be the year that bitcoin goes mainstream, but not a way that you would currently recognize. 2012 will be a year of transformation, the impetus of which will not come from the Satoshi client (Gavin's great work notwithstanding).

Your beliefs are nice— but where is the evidence?

...

So, yea, speculate that the advancement will come from elsewhere if you want... but it's just speculation without some evidence to back it up.

That's because we're not public yet.

I'm CTO of a Mountain View-based startup that is working on a number of products using a bitcoin-derived protocol. Actually, when we first started a little more than a year ago, we were trying to solve a problem in a completely unrelated field. The primary technical challenge and the key to monetizing the product basically boiled down to building a distributed, peer-to-peer time-stamping service for notarizing changes in ownership. Sound familiar? Unfortunately none of us had heard of bitcoin until the bubble over the summer, when bitcoin mania spilled into the mainstream media. Naturally it was refreshing to find a ready-made solution to our problem, with most of the kinks already worked out.

So why am I telling this long, uninteresting, and self-absorbed story, you might ask? Because once we saw that bitcoin solved our problem, we wondered what else it could do. And lo and behold, just about everywhere we look, in every industry (even--especially--outside of tech and finance), there is low-hanging fruit ready to be optimized or made obsolete by the introduction of products based on bitcoin P2P technology. We've set aside our original project for now and are working on a number of products that are all much easier, and many of them much larger in potential impact (and revenue)*.

It took me a while to convert (not the least because bitcoin's economic underpinnings are populist bullcrap and it will never take hold as a viable currency--but please start a new thread if you want to debate me on this), but now I am convinced that without exaggeration bitcoin is the most disruptive technology to emerge since the World Wide Web. Just not for the reasons most on this forum probably think.

In time the truth will show itself. Now if you'll excuse me, it's Christmas day and my present to myself is a long block of uninterrupted coding Wink

* You'll have to forgive me for being cagey, but I'm under NDA, and of course all potential businesses we've identified are company secrets until we launch or decide for sure never to pursue them.

(Oh, and before anyone cynical wonders out loud why I'm posting this (and I've said the same thing as the OP before), it's because we could actually use some competition. The potential applications are diverse enough, and it'd actually validate our business in the eyes of investors.)
794  Alternate cryptocurrencies / Altcoin Discussion / Re: Innovation in the alt chains on: December 25, 2011, 12:55:38 AM
2013 will be the year that bitcoin goes mainstream, but not a way that you would currently recognize. 2012 will be a year of transformation, the impetus of which will not come from the Satoshi client (Gavin's great work notwithstanding).
795  Bitcoin / Development & Technical Discussion / Re: BIP 2112 on: December 15, 2011, 09:42:33 AM
Interesting. I've already got this working within our own (not yet publicly released) bitcoin-derived protocol. We also chose LISP as the language for writing "chain definitions", as we call them (but I like your phrasing better), as well as for replacing the opcode-based scripting system for transactions. We're also using the bittorrent protocol for the P2P overlay network and DHT capabilities, so I can report that it works well (and better than bitcoin, I believe, although we don't have the metrics yet). I would add that the prospectus could include rules for accepting or rejecting future modifications. That's how we're handling it, combined with a PKI infrastructure.
796  Bitcoin / Development & Technical Discussion / Re: Suggested MAJOR change to Bitcoin on: November 12, 2011, 06:39:57 AM
The math presented in this thread is incomplete, IMHO.  All a block is in this regard is a unit of measurable proof-of-work performed.  Just because such proof-of-work can be conveniently modeled around the unit doesn't mean that the unit doesn't still represent the work that is done to create a block; and the work done is a continuous stream of proof-of-work.
Meaningless and inconsequential--blocks don't have inherent difficulties. If the pool operator gives me a block it could take me a billion hashes to find a solution--or I could get it on the first try. How long it takes is just dumb luck in choosing the right (or wrong) starting nonce.
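The "dumb luck" point is just the memorylessness of hashing: every attempt is an independent coin flip, so the attempt count is geometrically distributed and past work confers no progress toward a solution. A quick simulation (illustrative parameters, nothing bitcoin-specific):

```python
import random

# Each hash "succeeds" independently with probability p, so the number of
# attempts to find a solution is geometric: first-try wins and long
# droughts both happen, and the mean is simply 1/p.
def attempts_until_success(p, rng):
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(1)
p = 1e-3
samples = [attempts_until_success(p, rng) for _ in range(5000)]
print(min(samples), max(samples))   # lucky vs. unlucky extremes
print(sum(samples) / len(samples))  # lands near 1/p = 1000
```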
797  Bitcoin / Development & Technical Discussion / Re: Suggested MAJOR change to Bitcoin on: November 12, 2011, 01:17:21 AM
This is a misleading description.  You're happily waving away the _cost_ of a particular attack.

I am doing no such thing. Rather, I am recognizing that the practical reality of the situation is such that much of the cost of mounting an attack would be up-front/sunk costs (buying hardware, building a botnet, etc.)--so much so that in any realistic scenario, the difference in cost between merely preparing an attack and actually mounting it would be relatively small.

In the absence of a cloud-based burst-compute utility suitable for hash-computations at a scale to rival bitcoin and at a reasonable price (which doesn't, and likely will never exist), utility computation is a terrible model for the economics of attacking bitcoin.

Take bitcoin as it is.... if you'd like to buy computing time to mine a six block fork you should pay 300 BTC (plus some premium to get it all at once, I suppose). Under the 2.5/minute 12.5 BTC model, the cost would be 75.5 BTC.

(Before you protest that burst computing capacity in excess of bitcoin violates the security model— realize that is effectively an argument that honest bitcoin mining must consume a majority of all fungible computing power in the universe,  as compared to the far more limited requirement that the majority of sustained hash power applied to bitcoin be honest.    It's also important to keep in mind that not all the attacks we guard against are races against the public network. It's pretty easy to isolate single nodes via Sybil attacks, then slowly feed them blocks which have no chance of surviving on the main network)

The assumptions also require that blocks are not frequently found within the network latency. The 1s number being thrown around is nuts, and obviously and observably not true today. Keep in mind that forwarding nodes also check the validity of blocks before propagating them; some of our larger recent blocks are hitting a second of validation in computation alone. This is why it takes several hours to sync up a new client now.

1) such burst computing capacity does not exist
2) you're assuming the price-per-hash of utility compute will be comparable to the expenditures of your average bitcoin miner. in reality, it will be orders of magnitude more with something like AWS.
3) bitcoin needn't "consume a majority of all fungible computing power"--just a majority of the discretionary computing power/excess compute capacity. As stated that's a false argument, because...
4) you're ignoring the opportunity cost of that burst computing capacity. for small amounts of compute power this is zero. but once you exceed excess capacity you will be competing with existing users, driving up the price far in excess of actual costs.
5) if you can isolate a node, a 10-minute block interval won't provide much more protection than an X-second interval.
6) network latency is 2.1 seconds today, with very little optimization. i expect average network latency for well-connected nodes to go down, not up, in the long term, but that is really a topic for a different thread.

Then there is the point which I made in my first post which seems to have been ignored— part of why we _wait_ is not just to gain confirmations, but to reduce the risk that there was a network partition with considerable hash power on both sides.  E.g. If EU and the US lose internet connectivity people can spend on both sides. In order to be secure you must wait long enough that such a split is unlikely and/or long enough that if a split were to happen people would have an opportunity to detect it intervene (e.g. miners could stop processing new txn, sites could switch to higher confirmation limits— but only if they've detected the partition, which takes time if you are to avoid false positives).  Both criteria require absolute time, the distribution of blocks is irrelevant.

Perhaps I missed it, but I haven't really seen anyone address the point I made that this kind of change (10->2.5minutes) is a change in value but not in character. It doesn't suddenly make the system suitable for direct POS usage. There is a fairly narrow range of use cases where that would constitute an improvement and I suspect almost all of those uses could be adequately addressed by the solutions employed for POS processing.  (I assume everyone agrees that switching to _two second_ blocks needed for comfortable direct POS processing (keeping in mind the variance) most certainly would break it, even if you care to debate my arguments against decreases in general)

And those are valid points which I agree with. Bitcoin should remain at 10 min block intervals, IMHO--but for these reasons, not because it offers any additional protection against hidden-adversary double-spend attacks (it doesn't).
798  Bitcoin / Development & Technical Discussion / Re: Suggested MAJOR change to Bitcoin on: November 11, 2011, 11:08:21 PM
I understand that, but that is not a meaningful distinction in the context of this thread or in the practicalities of attacking bitcoin. If I have 30% of the global hash power and overnight bitcoin changes from 10min blocks to 2min blocks, my likelihood of executing a double-spend after X confirmations is negligibly different before and after.

Yes, it would be 5 times harder if the honest network magically increased to 5x its original size, but that's not what would actually happen if the block interval were changed.
That's not at all what I said.  And no, simply switching the target interval from 10 to 2 does not mean that the confirmations are as secure after any particular number of blocks.  At two minutes, 30 confirmations are roughly as secure as 6 are now.  The confirmations themselves are not magic, they only represent an amount of time passed since your transaction was recorded.  It's the time (multiplied by the difficulty) that creates the security.
Show me, mathematically, why that is the case.

I have done the math (summarized earlier), and done simulations to verify this fact: as long as BLOCK_INTERVAL is significantly larger than NETWORK_LATENCY, one's chance of forcing a re-org to undo a transaction depends only on the number of confirmations and the percent of global hash power controlled.

So please, either bust out the math to show me where I'm wrong, or tell me how you're defining 'security'. Because from where I sit, breaking the security model means undoing a transaction, and the chance of doing that depends only on your relative hash rate and the number of confirmations.

But heck, don't take my word for it. Page 6 from Satoshi's whitepaper:

Quote from: Satoshi
p = probability an honest node finds the next block
q = probability the attacker finds the next block
q_z = probability the attacker will ever catch up from z blocks behind

q_z = (1 if p <= q) or ( (q/p)^z if p > q )

EDIT: For what it's worth, I agree with what has been posted earlier that decreasing the block interval yields little benefit unless you can get it down to single-digit seconds. 10 minutes remains a very good compromise. But nevertheless, decisions should be made on the facts.
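Satoshi's formula above is easy to evaluate directly; note the block interval appears nowhere in it, which is the whole point (the 30% figure below is just the example attacker share used earlier in the thread):

```python
# Probability that an attacker with hash share q (honest share p = 1 - q)
# ever erases a z-block deficit, per page 6 of the whitepaper.
def catch_up_probability(q, z):
    p = 1 - q
    return 1.0 if q >= p else (q / p) ** z

for z in (1, 3, 6):
    print(z, catch_up_probability(0.3, z))  # drops geometrically with z
```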
799  Bitcoin / Development & Technical Discussion / Re: Suggested MAJOR change to Bitcoin on: November 11, 2011, 06:59:21 PM

No, it's not. The proof-of-work is a Poisson generator, meaning that the expectation value for the attacker follows a decaying exponential, which is itself a function of the number of Poisson events--i.e., the number of confirmations. So the probability of overtaking the honest chain after 6 blocks at 2-min intervals is exactly the same as after 6 blocks at 10-min intervals--hashes/block doesn't figure into the equations anywhere at all.

The global hash rate is important, it's just assumed to be static in the comparisons because there is no way to know what the proper hash rate should be or would be.  But if one assumes that the pool of honest hashing power is the same regardless of the target interval, a 2 minute block does represent roughly one-fifth the brute force security of a 10 minute block.  The statistical analysis of a confirmed block does matter somewhat, but isn't the most important factor in the security of the blockchain, the difficulty of reversing the honest block is.  No matter how you look at it, the difficulty of reversing a confirmed 10 minute block is about 5 times harder than reversing a confirmed 2 minute block...

I understand that, but that is not a meaningful distinction in the context of this thread or in the practicalities of attacking bitcoin. If I have 30% of the global hash power and overnight bitcoin changes from 10min blocks to 2min blocks, my likelihood of executing a double-spend after X confirmations is negligibly different before and after.

Yes, it would be 5 times harder if the honest network magically increased to 5x its original size, but that's not what would actually happen if the block interval were changed.

...Latency will matter if Bitcoin is ever successful enough to process significant numbers of transactions comparable to Visa or Paypal, particularly for the sole miners and end user nodes on the edges of the network.  The core miners (and pools) will probably be very well connected to one another, but the edges is where the latency will be greatest.

If bitcoin ever reaches Visa-level of adoption, we will likely see many federated, industrial mining operations connected by direct fiber-optic connections. Latency will be no higher than it currently is with bitcoin or Visa (a few seconds, typically).

to ensure your safety, you would have to wait an hour anyway.
it does not matter if you wait 6*10min or 30*2min.

it's the same security; the blockchain is only proof of time.
No, that's simply not true by any meaningful measure (see my previous post), although it's a commonly repeated myth on these forums.

That statement is equating security/safety with number of hash operations needed to undo a transaction. But that would only be true if work stopped on the honest chain. In actuality, the only thing that matters (and the math shows this) is percentage ownership of the global hash rate, and the likelihood of pulling off that attack decreases geometrically with the number of confirmations, regardless of block interval length.
800  Bitcoin / Development & Technical Discussion / Re: Suggested MAJOR change to Bitcoin on: November 11, 2011, 09:27:27 AM
there's a tradeoff between gambler's ruin initial deficit and honest hash power dilution due to detrimental effects on the orphan rate and network communication.
That's also a good point, although it is still a bit unclear how large this effect would be in reality. Maybe a good idea for setting up some simulations.
The power dilution is approximately the network latency of the dishonest network minus the network latency of the global (honest) network, divided by the expected block interval. As a consequence, it makes little difference if we're talking about dishonest pools. Given that the current network latency is about 2 seconds, it has very little effect at all. We can get well under a minute before it even starts being measurable.
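That estimate can be put in code (a sketch; the latency figures are illustrative, with 2 s for the honest network as quoted above and a deliberately badly-connected dishonest network for contrast--for a well-connected dishonest pool the latency difference, and hence the dilution, is near zero):

```python
# Relative dilution of the dishonest network, per the estimate above:
# (dishonest latency - honest latency) / expected block interval.
def relative_dilution(dishonest_latency_s, honest_latency_s, interval_s):
    return (dishonest_latency_s - honest_latency_s) / interval_s

for interval in (600, 60, 10):
    print(interval, f"{relative_dilution(8.0, 2.0, interval):.1%}")
```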

The global hash rate is relevant in relation to how much hash power the attacker should have so that the proportion between honest hash power and malicious hash power is small enough to allow carrying out the double-spending attack. Assuming that this proportion is some constant number, it's false to say that only the number of confirmations matters, because of hash power dilution of the honest miners with faster blocks. In other words, if the attacker has e.g. 30% of the total hash power, waiting for 6 blocks with average 2mins blocks is less secure than waiting for 6 blocks with average 10mins blocks. Which wiki is wrong on this point? Please cite the paragraph that's wrong?
No, it's not. The proof-of-work is a Poisson generator, meaning that the expectation value for the attacker follows a decaying exponential, which is itself a function of the number of Poisson events--i.e., the number of confirmations. So the probability of overtaking the honest chain after 6 blocks at 2-min intervals is exactly the same as after 6 blocks at 10-min intervals--hashes/block doesn't figure into the equations anywhere at all.