Bitcoin Forum
May 30, 2024, 01:52:24 PM
Pages: « 1 2 3 4 5 6 [7] 8 9 10 11 12 13 14 15 16 17 »
121  Bitcoin / Meetups / Re: Israel Bitcoin Meetup Group on: August 10, 2012, 09:09:25 PM
Got anything planned for September? I'm going to be in Israel :)
122  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 10, 2012, 12:32:10 AM
The "difficulty" of a block could be set by each individual miner who "bids" on the difficulty they want to get. So a winning block at difficulty 4 would be H(difficulty: 4 zeros || nonce || header) -> 0000xxxxxxxxx. Instead of taking the largest chain, we would take the chain that has the largest sum-difficulty (according to the difficulty bids, not the actual hash value).
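A toy sketch of that fork-choice rule, assuming each header carries a miner-declared bid counted in leading zero hex nibbles (all names and the nibble convention here are illustrative assumptions, not a spec):

```python
import hashlib

def block_hash(bid_zeros: int, nonce: int, header: bytes) -> str:
    # H(difficulty bid || nonce || header), hex-encoded
    data = bid_zeros.to_bytes(4, "big") + nonce.to_bytes(8, "big") + header
    return hashlib.sha256(data).hexdigest()

def meets_bid(digest_hex: str, bid_zeros: int) -> bool:
    # A block "wins" at its bid if its hash starts with that many zero nibbles.
    return digest_hex.startswith("0" * bid_zeros)

def sum_difficulty(bids):
    # Chain weight is the sum of the declared bids' expected work
    # (a bid of z zero nibbles represents ~16**z hashes), not the
    # actual hash values that happened to come out.
    return sum(16 ** z for z in bids)

# Prefer the fork with the largest sum-difficulty: three blocks bid at
# 4 zeros outweigh twenty cheap blocks bid at 2 zeros.
fork_a = [4, 4, 4]
fork_b = [2] * 20
best = max([fork_a, fork_b], key=sum_difficulty)
```
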

There should be a disincentive for miners to spam the network with low-difficulty blocks. Each miner should have stake in the game - the miner himself could place down a bet, effectively a fee for the storage of his block header. If his block is accepted, then he gets the money back. If his fork gets swept up by a higher-difficulty fork, then he loses his bet and the fees he would have earned.

Isn't wasted hashing power disincentive enough?

The hash-value-highway makes it efficient to evaluate the sum-difficulty of a fork, and also to merge forks by cherry picking valid transactions. Long chains of low-difficulty blocks will tend to get preempted by higher-difficulty forks. If each transaction includes a 'minimum difficulty', then you'll only be included in the difficult blocks which are less likely to be reordered.

I don't expect this part of the explanation to be totally clear, especially since I haven't elaborated on how a fork is 'swept up' by the main chain, but I will get to that later. But since you've been working on similar ideas, I think you might have the right intuition for how this works.

So this "sum-difficulty" is the total difficulty of a chain, including blocks that were merged in from other forks? If the block is merged into another chain, how would the original miner lose his transaction fees from that solved block?
123  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 09, 2012, 05:35:20 AM
All of those 'hacks' can be independently improved without affecting the Bitcoin protocol, and the additional functions of an economy are intended to be separate from the Bitcoin protocol.  It's simply not necessary to include those things in the protocol.  Sure, hard drive tech is always improving, but techies have been trying to get the Internet to upgrade to IPv6 for well over a decade, and that is not going to happen until the majority of users have personal evidence that IPv4 is broken.  The same is true of the Bitcoin protocol: I don't want you screwing with a good thing, and my opinion actually matters because you can't change anything without the consent of the majority of node owners.

Again, I don't want to mess with the current Bitcoin implementation at all, and I'm completely on-board with the careful and conservative changes the core devs are making. But there are alternative blockchains which don't jeopardize bitcoin in any way. If one of them scratches an important itch, many people will adopt it, and it will compete for some resources. I think the bitcoin ecosystem will eventually be much more robust if there are multiple competing blockchains around instead of just one.

The reason some of these economic functions will be based on bitcoin ideas is that they need distributed chronological ordering, and the bitcoin ideas are the best way we've currently got to do that.
124  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 09, 2012, 05:02:10 AM
It's precisely because the system is so stable that modifications to that system are verboten in the majority of minds here.  The old adage, "if it ain't broke, don't fix it," applies here.

This is pretty antithetical to the hi tech world. Are hard drives broken now? They work amazingly well. Yet, lots of people are out there making improvements to them. I just don't see how it makes sense to say that a piece of technology works well, so it shouldn't be improved. Moreover, bitcoin is broken when it comes to using it in a commercial setting. Ten to thirty minute waits just aren't acceptable. Sure there are ways to deal with that, but those are hacks.

Bitcoin takes care of money. What about the rest of the financial transactions that manage economies, like exchanges, stocks, bonds, bets, options, futures? You need higher resolution timing for those, and probably lots of other improvements as well. I don't think anybody is recommending baking this stuff into the Satoshi client. There will be alternative coins, and they'll compete on the market.
125  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 08, 2012, 09:59:12 PM
Average block solving time will never be reduced to anywhere near 10 seconds.

Care to elaborate? It'll be easier to do if we know why it's impossible.

It's not impossible, just astronomically improbable.  Those are arbitrary parameters decided by Satoshi, and in all the time since there has not been any proposal for altering them that could offer more than a marginal improvement, if that.  For example, the 10 minute block interval could have been 6 minutes, 12 minutes, 15 minutes or any other arbitrary number of seconds; but it could not have been a small number of seconds, because the interval was chosen as a balance between transaction confirmation time and the expected future network propagation latency.  10 seconds, or even 120, is too fast and would result in a great many more network splits & orphaned blocks, potentially causing congestion feedback.

Also, the block interval time cannot be variable, due to the interlocking structure of the blockchain's design & purpose.  If block intervals were variable: 1) the block reward would either have to be variable as well, which would make verifying that the block reward was a valid amount quite difficult for an individual node, or the distribution rate of newly issued bitcoins into circulation would no longer be predictable, and that would be bad; and 2) the relatively common blockchain splits would be significantly more difficult to resolve in any automatic fashion, which is the only way it could be done.

Of course your points make perfect sense, but isn't this circular reasoning? You're saying that the target time can't be reduced because of network splits and orphaned blocks, but those are exactly the kind of things these proposals are trying to address. Satoshi is a genius, and it's uncanny how well his somewhat arbitrary constants worked right off the bat. It would have been unwise of him to add in extra complexity to reduce those numbers. But time has passed, the system has proven itself stable, and it's time to consider even more amazing things that can be done with these core concepts. Of course, pressing issues shouldn't be neglected, but some people are going to want to work on more experimental stuff, and I think that's great.
126  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 08, 2012, 06:46:26 PM
My mission is to eliminate every last hard-coded global parameter in Bitcoin, so that it grows into an indisputably scalable and universal protocol. On the chopping block are "10 minutes," and "difficulty adjustment every 2016 blocks."

Two of the things I'm going to propose next (absorbing orphaned forks, and self-adjusted difficulty without timestamps) could potentially create a much larger number of headers, so I wanted to explain my solution to that first, especially starting with efficient validation for lite clients. If it's not interesting, then no need to read ahead - but save your place.

I'm looking forward to reading this.
127  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 08, 2012, 06:43:40 PM
Average block solving time will never be reduced to anywhere near 10 seconds.

Care to elaborate? It'll be easier to do if we know why it's impossible.
128  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 08, 2012, 05:13:50 PM
How does a lite client know which block chain is the largest? By iterating through the whole chain from front to back? Ick!

I stopped reading at this point.   Ten years of headers is about 40 megabytes. One hundred years is 400 megabytes. A boring computer _today_ could validate 100 years of headers in a second or two. And even this could be skipped up to a fixed reference point. There are interesting challenges in Bitcoin, reading headers isn't one of them.

Also, from this discussion it sounds like some people are confused wrt difficulty. The difficulty of a block is its target, not the hash value.


If you stopped reading, how do you know? But seriously, I believe the idea here isn't to address practical problems with the current Bitcoin parameters, but to explore concepts that will be needed if Bitcoin morphs in interesting ways. For instance, if average block solving time is reduced 60-fold to ten seconds, can your smartphone still keep up with all those headers?
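For scale, the arithmetic behind both figures (the quoted ~40 MB per decade, and the 60-fold faster version) is easy to check:

```python
# Back-of-envelope check of the header-load claims: 80-byte headers,
# one block per 10 minutes, versus a 60x faster 10-second chain.
HEADER_BYTES = 80
blocks_per_year = 6 * 24 * 365                 # 52,560 at 10-minute blocks
decade_mb = 10 * blocks_per_year * HEADER_BYTES / 1e6            # ~42 MB
fast_decade_gb = 60 * 10 * blocks_per_year * HEADER_BYTES / 1e9  # ~2.5 GB
```

So a 10-second chain turns a trivial download into gigabytes of headers per decade, which is exactly the lite-client burden being debated here.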

129  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 08, 2012, 02:06:22 AM
Blocks don't currently have a concept of height.  If you add that too, you have more information, but not that much more.

And backing up doesn't give a new walk, ever.  Every block from X+1 to Y has a skiphash value of X, every block from Y+1 to Z has a skiphash value of Y.  If you back up one block to Z-1 you don't get a new walk, you get the exact same one again.

kjj, you seem to be confusing two concepts here: the skip hash (the most recent block with a higher value than the previous one) and the highest-value block in the chain. In your example, X, Y, Z are skip hashes, but for the traversal stuff you'd be working with highest-value blocks. If I find the highest-value block (hvb) in the chain, clearly there are two intervals: blocks before the hvb and blocks after it. If we look at each of these intervals, excluding the hvb, we will find a different hvb in each. We can choose to go either up or down; that's the second block in the path.
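The traversal can be sketched like this, representing each block as a (height, value) pair; this is illustrative pseudocode for the idea, not the actual data structure:

```python
def highway_path(chain, directions):
    """Walk the hash-value highway: at each step find the highest-value
    block in the current interval, then recurse into the sub-interval
    before it (direction 0, "look low") or after it (1, "look high").
    `chain` is a list of (height, value) pairs in height order."""
    lo, hi = 0, len(chain) - 1
    path = []
    for d in directions:
        if lo > hi:
            break  # interval exhausted
        # index of the highest-value block in chain[lo..hi]
        pivot = max(range(lo, hi + 1), key=lambda i: chain[i][1])
        path.append(chain[pivot])
        if d == 0:
            hi = pivot - 1   # blocks before the hvb
        else:
            lo = pivot + 1   # blocks after the hvb
    return path
```
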
130  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 08, 2012, 12:01:51 AM
Hmm, how about this:

Let's hypothetically assume current difficulty for all blocks for the sake of this illustration. Your client chooses a random number (k) from 0 to current block height (n). That number is a binary number with log(n) digits (zero-padded). A zero means "look low", a 1 means "look high". You send me that number and I now have to give you all the blocks I get from traversing the highway according to those instructions. If I wanted to feed you fake blocks, I'd have to either dream up all log(n) blocks on the path for you on the fly, or have all possible paths ready to go, which is basically the same as hashing the entire blockchain from scratch.
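The verifier's side of that challenge can be sketched as follows (the function name and framing are mine, not from the post); the prover must then supply the ~log(n) blocks along the path those digits describe:

```python
import math
import random

def challenge_directions(n: int):
    # Verifier picks k uniformly in [0, n); its zero-padded binary
    # digits become the walk instructions: 0 = "look low", 1 = "look high".
    bits = max(1, math.ceil(math.log2(n)))
    k = random.randrange(n)
    return [(k >> (bits - 1 - i)) & 1 for i in range(bits)]
```
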

In order to hash the blocks on the path in a matter of seconds, I'd need 60 * log(n) more hashing power than the entire network.

The problem with this is that I can try to say that the difficulty was historically ridiculously low. But since I'm feeding you all the highest value blocks, if you're not happy with the overall difficulty you just won't deal with me, because I'm probably a crook. Since you can be off by at least two orders of magnitude and still feel safe dealing with me, there probably won't be a lot of false negatives.
131  Bitcoin / Development & Technical Discussion / Re: The High-Value-Hash Highway on: August 07, 2012, 06:27:42 PM
I love this stuff. I was contemplating something similar, where you'd store n=log(block_height) hashes in each header instead of just the last block hash. So in the current block, you'd store the hash for the previous block, the hash for the block before that, the hash 4 blocks back, 8 blocks back, 16 blocks back, all the way to the genesis block. That's about 18 hashes per block currently. This would allow some kind of SPV client to only store log n headers instead of n.  I haven't really thought through how this works yet, but it sounds similar to what you proposed. Your idea is more elegant and has some advantages, though.
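The log-spaced back-pointer scheme described above can be sketched as (the helper name is mine):

```python
def back_pointer_heights(height: int):
    # Heights whose hashes a header at `height` would carry:
    # 1, 2, 4, 8, ... blocks back toward the genesis block.
    ptrs, step = [], 1
    while step <= height:
        ptrs.append(height - step)
        step *= 2
    return ptrs
```

At mid-2012 heights (around block 195,000) this yields 18 back-pointers per header, matching the "about 18 hashes" figure above.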
132  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 07, 2012, 06:03:21 PM
You can start with https://en.bitcoin.it/wiki/Proof_of_Stake but there are some new ideas I still need to write about.

Thanks, very interesting!
133  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 06, 2012, 06:11:04 PM
You're trusting the recipient anyway to deliver the goods. You don't use the processor because you trust him, you use him because he allows you to send payments everywhere with a single channel, without knowing in advance who you're going to pay.

Without wandering too far off topic, I'm not disputing that this is a very cool idea, or that the trust in the processor is a minor issue compared to trusting someone with the whole escrow. I just wanted to point out that there are drawbacks to each solution which have to be considered, and there's no obvious best solution which trumps the others being discussed.


...The real issue is race attacks, which are certainly feasible. Minor blocks would make race attacks a lot more expensive than they currently are because you'd have to put in the effort to solve minor blocks (assuming the merchant is willing to wait an average of 10 seconds).
I don't see how it really helps security against race attacks. With the current system you can query each of your peers to see if they know of a conflicting transaction. If none do, you can be pretty sure the network recognizes the transaction.

There's a big assumption there that you're well connected to peers, that those peers are well connected to miners, and that there's no sybil attack. Being properly connected is an extra precaution that has to be managed carefully by vendors, as opposed to being baked into the system. If you wait for a few minor blocks, at least you know for sure that the attacker had to pay for a considerable amount of hashing power. I feel that's a big win.

Also, proof of stake proposals strive to make the network immune to >50% hashrate attacks. I've had a discussion about PoS lately and I think I have a new framework to think about it, your suggestion could fit in as well.

I'd love to hear your thoughts on this, is there somewhere I could read about this?

There are advantages to the hybrid minor/major block approach (high time resolution, faster confirmations, while retaining the longer term safety of the major blocks), but I'm not at all convinced a Bitcoin fork with a 10 second interval couldn't work better. Maybe I'm missing something but the wasted hash power (assuming 7% orphan blocks) and header overhead don't seem like a huge deal to me.
Header overhead is a pretty big deal. Most people won't run full nodes, and hopefully they'll choose the more secure lightweight clients over eWallets. The resource cost of this is proportional to the number of headers.

That's a very good point. This merits a lot more discussion, but off the top of my head I don't see why the minor/major block paradigm couldn't be used here so that lightweight clients would only be looking at major blocks, at least for historical transactions. In fact, since old transactions have such a massive amount of buried hashing power protecting them, one could consider a scheme that would decrease the number of blocks needed per time interval as you go back in time, potentially making that header chain even smaller.
134  Bitcoin / Press / 2012-08-06 newscientist.com - Silk Road sells $2m worth of drugs a month on: August 06, 2012, 05:33:08 PM
Quote
New analysis of the anonymous online marketplace Silk Road suggests that purveyors of illegal drugs and other black-market goods are raking in the equivalent of nearly $2 million of Bitcoins per month, which is roughly 20 per cent of exchanges from Bitcoins to US dollars that take place on the online currency's main exchange, Mt. Gox.

http://www.newscientist.com/blogs/onepercent/2012/08/silk-road-bitcoin-illegal.html
135  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 06, 2012, 05:47:26 AM
I also don't buy the suggestion that quick confirmations aren't necessary in Bitcoin. Your proposal is very interesting, but there are lots of tradeoffs involved, including trusting an established payment processor, and putting funds in escrow. These are serious considerations, and if there's a way to do this using the elegance of the blockchain idea, while retaining low trust requirements and no escrow, I think that would be a big win.
I don't know how much you've followed the suggestion but it does not require you to trust the processor. The funds tied up on the channel are not at the mercy of the processor, the customer gets them back even if the processor is malicious or negligent. The amount that is trusted with the processor is the amount of a single payment by a single customer - or even less, if the customer splits the payment to multiple pieces and receives confirmation from the seller on every one (since payments are direct this isn't expensive).

I was just pointing out that there are tradeoffs to each system. I didn't say you had to trust the processor for the entire escrow, I said you had to trust the processor, and you do, for the amount of a single transaction. That's still trust. If you didn't need to trust the processor, you wouldn't need the processor at all, the recipient could just be the processor.
136  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 05, 2012, 07:44:22 PM
I don't think this works.

Either the minor blocks are authoritative or they're not. If they are then it's just like having block frequency of 10 seconds with all the disadvantages (wasted hashpower, block header resource cost). If they're not then they do not add to security, a Finney attacker could just hold on to a double-spending major block, and then release it no matter how many minor blocks have been found in the mean time, likewise for >50% attacker.

By the way, p2pool is a way to reduce mining variance with fair reward distribution. It was never touted as a way to get faster security for transactions, and it probably won't work for this purpose for the reasons outlined above.

Also, having very quick confirmations isn't necessary, the blockchain is for final settlement and actual payments will more likely be in some form of Trustless, instant, off-the-chain Bitcoin payments.

Related ideas are Dynamic block frequency and Unfreezable blockchain.

Thanks Meni, I hadn't seen some of those links and others merited a re-read -- fascinating stuff. I have a few comments about your reply:

Since my main motivation is time resolution, it doesn't matter if the minor blocks are a priori authoritative, only that they reliably fix the ordering of transactions in time once the major block is broadcast. Even so, I find talk about Finney attacks and 51% attacks a bit distracting. Finney attacks are almost impossible to reliably pull off, and 51% attacks are common to every Bitcoin modification I've heard of. The real issue is race attacks, which are certainly feasible. Minor blocks would make race attacks a lot more expensive than they currently are because you'd have to put in the effort to solve minor blocks (assuming the merchant is willing to wait an average of 10 seconds). That means they can't be used to get a free cup of coffee at a cafe. If you're buying a car, the dealer can wait for a few major blocks. On the other hand, it would be great if you didn't have to hang around the dealership for half an hour waiting for the payment to clear.

I also don't buy the suggestion that quick confirmations aren't necessary in Bitcoin. Your proposal is very interesting, but there are lots of tradeoffs involved, including trusting an established payment processor, and putting funds in escrow. These are serious considerations, and if there's a way to do this using the elegance of the blockchain idea, while retaining low trust requirements and no escrow, I think that would be a big win. Dynamic block frequency is very cool. With my arbitrary 10 second interval, I was trying to hit the minimum time that would make sense for current network and processing speeds, but a market approach would be very interesting. Perhaps there could be competing blockchains, each optimized for a different use case. Maybe that's what's beginning to happen with litecoin. Maybe we need somebody to throw a 10 second blockchain out there and see what happens?

There are advantages to the hybrid minor/major block approach (high time resolution, faster confirmations, while retaining the longer term safety of the major blocks), but I'm not at all convinced a Bitcoin fork with a 10 second interval couldn't work better. Maybe I'm missing something but the wasted hash power (assuming 7% orphan blocks) and header overhead don't seem like a huge deal to me.
137  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 05, 2012, 03:48:13 AM
On average p2pool blocks are orphaned 7% of the time, so not too much branching.

Awesome! That's a really useful data point.

I was not imagining the minor blocks as forming a chain. They'd all just be orphaned/ignored eventually but in the moment you can say to yourself, "Ok, 3 out of 4 minor blocks included the tx I'm interested in and none of them included a different one using the same input, sounds good to me."

The missing piece to me is why should miners broadcast these minor blocks?

If minor blocks are just regular blocks, and part of the chain, miners would publish them to get rewards, just like major blocks. As I mentioned, this would establish 10 second resolution for transactions, and it would be very useful for more advanced financial blockchains. If minor blocks are really only orphaned about 7% of the time, this could really be feasible. Frankly, with numbers like that, you might not even need to do a dual target, you might be able to get away with just 10 second blocks.
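Since minor and major blocks come out of the same hashing loop, classifying each candidate hash against two targets is trivial for the miner. A minimal sketch with toy targets (all numbers are illustrative, far easier than real difficulty):

```python
import hashlib

# Toy targets on a 256-bit hash: the minor target is 64x easier than
# the major one, so roughly one in 64 minor-block hashes is also a
# major block. Real targets would be vastly harder than these.
MINOR_TARGET = 1 << 240
MAJOR_TARGET = 1 << 234

def classify(header: bytes):
    h = int.from_bytes(hashlib.sha256(header).digest(), "big")
    if h < MAJOR_TARGET:
        return "major"   # every major block also meets the minor target
    if h < MINOR_TARGET:
        return "minor"
    return None          # keep hashing
```

Broadcasting a qualifying hash costs the miner nothing extra, which is why ignoring minor blocks would leave rewards on the table.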
138  Bitcoin / Meetups / Re: New Hampshire Bitcoin Meetup -- Weekly on: August 04, 2012, 07:27:43 PM
Meetup this week is on Sunday, 4 August at 6:30pm at Strange Brew Tavern in Manchester, NH.

https://www.facebook.com/events/208856109243508/
139  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 04, 2012, 01:26:04 PM
In the OP, there are 60 minor blocks in a typical 10 minute window. These will not be in a single tidy sequential chain but multiple splitting branches and twigs. Imagine the trunk of the tree as the last major block and then various branches, smaller branches and twigs sprouting from it.

Note also that this tree is 'burnt to the ground' when a next major block appears. Its structure does not get used by the major blocks.

If my transaction is on the 3rd twig, 4th branch on the left, what does it mean? It means a miner saw it, it is well formed, and THAT miner has put it into the self-immolating minor block tree. There may well be a competing transaction (i.e. a potential double spend) on the 7th twig, 1st branch on the right. Which will get into the next major block? The minor block tree gives no indication.

It is an innovative idea, but I am not sure it gives much more than knowing the miners saw it, i.e. network propagation info.

That's a pretty good account of what this would look like, however you may be overestimating the "frothiness" factor. I'd be interested to see what the P2Pool structure looks like. It's certainly the case that observing a minor block is not as safe as observing a major block. It's also the case that observing a minor block is way better than looking at a 0/conf transaction floating around. Another thing you could do is to track all the minor blocks, not just the longest chain. I don't really think there will be all that many. Also, the 10 second time frame isn't set in stone, it could be made a bit longer if that makes this a lot more manageable. It probably depends on network speeds and how fast miners verify and jump to a newly released minor block.

Lastly, the main motivation behind this was to gain much greater time resolution after the fact. So we'd want to have that 10 second resolution once the major block has resolved the one true blockchain. I'd love to hear comments on that part, too.
140  Bitcoin / Development & Technical Discussion / Re: High Resolution, Dual-Difficulty Blockchain on: August 04, 2012, 05:16:01 AM
This already exists today!

It's called P2Pool.

It even follows the very same ten second rule you speak of.

Wow, it's embarrassing that I didn't know about that, thanks casascius! I'm pretty stoked that this appears to be working in real life. My main motivation for this was to get higher time resolution for stuff like stock trades and exchanges, and for that it's important that the minor blocks remain in the blockchain. So, is there any reason why this must remain a miners' hack, or could it be floated as an alternate blockchain? Clearly, another benefit that comes out of this is that the new blockchain would have a much reduced need for pools.

*If* the overwhelming majority of mining power on the network supported this, then you would have a much better indicator after just a few seconds regarding the risk of accepting a transaction.  More analysis would be needed, but I'd say it's very likely that the economics are such that for transactions of as much as $100, you could be very safe accepting a transaction after 10 seconds…and for $1000 to $10000, a minute or two would probably push the risk into the realm of insignificance (particularly if it's a transaction between parties that have just a small amount of trust or recourse).

Yeah, this is a big deal for vendors.

The problem I see is that it's difficult to predict what the "Nash equilibrium" of the system will be, or even if an equilibrium exists. Will all miners start making only the lowest-difficulty blocks? Or will they ignore them and always mine the ones with higher rewards?

One possible solution may be to fix the number of low-difficulty blocks that must appear before a high-difficulty block (say in a 10:1 ratio). But then you will have some (predictable) time intervals where there are no fast confirmations.

A miner would be highly incentivized not to ignore the minor blocks. As part of the hashing process, they will find many minor blocks for every major block, so why not take the trivial extra step of broadcasting them for a good chance at the reward? This is even more true for major blocks: while hashing minor blocks, miners will occasionally hit on a major block, and it would be crazy not to broadcast that and lose out on the big reward. That said, I'm not sure the 10:1 ratio is necessary.

The announced minor blocks only need to build on the last major block (they don't need to build on each other).

In order to get the time resolution, the miners would need to build the blockchain on minor blocks. This would also get you exponential protection against fake blocks instead of linear protection. Is there an obvious reason this chaining wouldn't work? That said, a different blockchain operating as you described could solve a lot of problems, too.