Luke-Jr (OP)
Legendary
Offline
Activity: 2576
Merit: 1186
|
|
March 13, 2012, 11:31:39 PM Last edit: March 16, 2012, 01:50:33 AM by Luke-Jr |
|
Unfortunately, with Deepbit adding BIP 16 support, the possibility of BIP 17 being implemented is pretty much gone. I therefore regret to announce the official Withdrawal of BIP 17. If anyone else wants to take over as its "champion", feel free to re-open it, but I am convinced it is a lost cause at this point. Short of something major happening within the next day or two, I will be switching Eligius (my pool) over to BIP 16 and merging backported BIP 16 support into the future 0.4.5 and 0.5.4 stable releases.

However, the situation isn't completely irreparable: I am proposing BIP 18 as the next step forward. This proposal is 100% protocol-compatible with BIP 16 and requires no software changes at all. It is simply a formal rewriting of the specification in a more consistent manner, and implies that developers will make a full commitment to P2SH, using it for all new address/transaction types (without breaking compatibility with "legacy" addresses).

Finally, Gavin's BIP 16 backport does not merge cleanly into 0.4.x/0.5.x and seems to have fallen behind on a few fixes made in the "master" branch. I have attempted to resolve this, but would appreciate as many reviews of it as possible before merging to stable. Unlike BIP 17, the code implementing BIP 16 is very complicated and hard to follow, so there is plenty of room for error.

Edit: Gavin noted the backport doesn't need to actually mine P2SH into blocks, so here is a simpler patch that only validates them. Please audit this one instead.
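For reference, the output template BIP 16 matches is simple enough to sketch. The following is an illustrative recognizer (not the reference client's code), assuming the serialized form OP_HASH160 <20-byte hash> OP_EQUAL:

```python
# Sketch of the BIP 16 template match (not the reference implementation):
# a pay-to-script-hash output is recognized purely by its serialized form,
# OP_HASH160 <20-byte hash> OP_EQUAL, i.e. exactly 23 bytes: a9 14 ... 87.

OP_HASH160 = 0xA9
OP_EQUAL = 0x87

def is_p2sh(script_pubkey: bytes) -> bool:
    """Return True if the output script matches the BIP 16 template."""
    return (
        len(script_pubkey) == 23
        and script_pubkey[0] == OP_HASH160
        and script_pubkey[1] == 0x14          # push of exactly 20 bytes
        and script_pubkey[22] == OP_EQUAL
    )

# A dummy 20-byte value stands in for HASH160(redeem script) here.
dummy_hash = bytes(20)
print(is_p2sh(bytes([OP_HASH160, 0x14]) + dummy_hash + bytes([OP_EQUAL])))  # True
print(is_p2sh(b"\x76\xa9\x14" + dummy_hash + b"\x88\xac"))  # pay-to-pubkey-hash: False
```

When a spend provides a script matching this template's hash, BIP 16 then evaluates that serialized script as the real spending condition.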
|
|
|
|
maaku
Legendary
Offline
Activity: 905
Merit: 1012
|
|
March 14, 2012, 12:08:09 AM |
|
I just had a few minutes to skim BIP 18, but I like it so far. Is there any reason we would not want to deprecate the scriptPubKey/scriptSig system? In other words, is there any use case for knowing the actual script ahead of time?
|
I'm an independent developer working on bitcoin-core, making my living off community donations. If you like my work, please consider donating yourself: 13snZ4ZyCzaL7358SmgvHGC9AxskqumNxP
|
|
|
theymos
Administrator
Legendary
Offline
Activity: 5376
Merit: 13410
|
|
March 14, 2012, 12:14:28 AM |
|
I don't see a point in "officially" deprecating something that must be supported forever anyway.
|
1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD
|
|
|
finway
|
|
March 14, 2012, 02:55:14 AM |
|
Good!
|
|
|
|
Gavin Andresen
Legendary
Offline
Activity: 1652
Merit: 2301
Chief Scientist
|
|
March 14, 2012, 12:19:38 PM |
|
I don't see a point in "officially" deprecating something that must be supported forever anyway.
I agree.
|
How often do you get the chance to work on a potentially world-changing project?
|
|
|
ribuck
Donator
Hero Member
Offline
Activity: 826
Merit: 1060
|
|
March 14, 2012, 01:36:37 PM |
|
I don't see a point in "officially" deprecating something that must be supported forever anyway.
The reason for official deprecation is so that developers know that future enhancements are unlikely to be developed for the deprecated part of the system. This keeps the actively-developed portion of the software as small as possible. This reduces the risk of bugs and security holes being introduced in the future, and makes coding and testing easier. If the old way of doing things is completely subsumed by a new and better way, the old way should definitely be deprecated.
|
|
|
|
realnowhereman
|
|
March 14, 2012, 03:05:55 PM |
|
I assume we're actually talking about "deprecation", not "depreciation" ("depreciation" being pretty meaningless in this context). Deprecation is not obsolescence, and it seems the perfect word for it to me. From WordNet (r) 3.0 (2006) [wn]:
deprecate v 1: express strong disapproval of; deplore 2: belittle; "The teacher should not deprecate his student's efforts" [syn: {deprecate}, {depreciate}, {vilipend}]
Sense 1 is the obvious definition here. "Deprecate" is used all over the place in software; it means a feature whose use is discouraged. Perfect.
|
1AAZ4xBHbiCr96nsZJ8jtPkSzsg1CqhwDa
|
|
|
piuk
|
|
March 14, 2012, 03:34:19 PM |
|
I agree with theymos and Gavin. Although I agree with the reasoning behind BIP 18, the fact that the old scripting system cannot be deprecated doesn't make the solution any cleaner. E.g., why would hashScriptCheck contain opcodes if it's not a script?
Also, I don't see the advantage in making all payments to a scriptHash; the need to save the script adds extra complexity.
|
|
|
|
Maged
Legendary
Offline
Activity: 1204
Merit: 1015
|
|
March 14, 2012, 05:02:45 PM |
|
I don't see a point in "officially" deprecating something that must be supported forever anyway.
I do. By officially deprecating it, that could mean that we will remove the ability to make new legacy transactions in a future hard fork. Eventually, another hard fork could remove it altogether, destroying any still-unspent legacy outputs.
|
|
|
|
Luke-Jr (OP)
Legendary
Offline
Activity: 2576
Merit: 1186
|
|
March 14, 2012, 05:05:38 PM |
|
By officially deprecating it, that could mean that we will remove the ability to make new legacy transactions in a future hard fork.
This wouldn't even require a hard fork, really.
Eventually, another hard fork could remove it altogether, destroying any still-unspent legacy outputs.
Legacy outputs could be safely converted to P2SH in such a hard fork.
|
|
|
|
etotheipi
Legendary
Offline
Activity: 1428
Merit: 1093
Core Armory Developer
|
|
March 14, 2012, 05:41:18 PM |
|
Also I don't see the advantage in making all payments to a scriptHash, the need to save the script adds extra complexity.
^This. Although I see the value in P2SH or any similar system, I am deeply concerned about the backup issues. I put in a lot of work to make paper backups in Armory, to make it as simple as possible for users to make one backup ever and never have to worry about it again. "Regular users" who must rely on a recurring backup routine will fail. It doesn't happen. No matter what you do, they will forget to set it up, not realize their backup drive was disconnected, not set it up again after a system restore or OS reinstall, etc.

But with proposals that require everything to be hidden behind script hashes, your wallet (or at least the script information) must be backed up after every single transaction. If your system dies right after a transaction, before you've had a chance to save the script, you'll never recover it. This is something I'm battling with multi-sig scripts, but I was comforted by the fact that at least regular users who want to avoid all of it don't have to deal with it and can use regular addresses, which can be scanned for in the blockchain. That's what this thread was about. Gavin addressed the issue for multi-sig and escrow.
|
|
|
|
Luke-Jr (OP)
Legendary
Offline
Activity: 2576
Merit: 1186
|
|
March 14, 2012, 05:46:45 PM |
|
Also I don't see the advantage in making all payments to a scriptHash, the need to save the script adds extra complexity.
^This. Although I see the value in P2SH or any similar system, I am deeply concerned about the backup issues. I put in a lot of work to make paper backups in Armory, to make it as simple as possible for users to make one backup ever and never have to worry about it again. "Regular users" who must rely on a recurring backup routine will fail. It doesn't happen. No matter what you do, they will forget to set it up, not realize their backup drive was disconnected, not set it up again after a system restore or OS reinstall, etc. But with proposals that require everything to be hidden behind script hashes, your wallet (or at least the script information) must be backed up after every single transaction. If your system dies right after a transaction, before you've had a chance to save the script, you'll never recover it. This is something I'm battling with multi-sig scripts, but I was comforted by the fact that at least regular users who want to avoid all of it don't have to deal with it and can use regular addresses, which can be scanned for in the blockchain. That's what this thread was about. Gavin addressed the issue for multi-sig and escrow.
I don't see why this is any harder to do with P2SH...
|
|
|
|
maaku
Legendary
Offline
Activity: 905
Merit: 1012
|
|
March 14, 2012, 07:16:19 PM |
|
There are some possible scripts whose properties depend on being public. David Schwartz describes one here that achieves anonymous donations/transfers, although it requires new opcodes for the scripting system. Such a system would not be possible under P2SH.
|
I'm an independent developer working on bitcoin-core, making my living off community donations. If you like my work, please consider donating yourself: 13snZ4ZyCzaL7358SmgvHGC9AxskqumNxP
|
|
|
Maged
Legendary
Offline
Activity: 1204
Merit: 1015
|
|
March 14, 2012, 07:23:06 PM |
|
By officially deprecating it, that could mean that we will remove the ability to make new legacy transactions in a future hard fork. This wouldn't even require a hard fork, really.
Baah, you're right. That could be a backwards-compatible protocol change.
Eventually, another hard fork could remove it altogether, destroying any still-unspent legacy outputs. Legacy outputs could be safely converted to P2SH in such a hard fork.
Protocol-wise, that's true. If we completely replace scriptPubKey with scriptHash in the protocol (which is how I'd suggest doing it), we can convert all of the legacy scripts to P2SH. Practically, someone would have to keep the old script data around, but thinking it over, the Satoshi client already saves that in the wallet anyway. So yes, a straight conversion would work.
etotheipi, what backup case hasn't been considered?
|
|
|
|
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
|
|
March 14, 2012, 07:31:36 PM |
|
There are some possible scripts whose properties depend on being public. David Schwartz describes one here that achieves anonymous donations/transfers, although it requires new opcodes for the scripting system. Such a system would not be possible under P2SH.
A script can be public without being part of the transaction. Post the script somewhere public and you have a public copy of it. A more elegant solution would be a cloud-based repository of scripts; a p2p solution would be a DHT that users can submit scripts to and query by hash.
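The repository idea is essentially content addressing: scripts are stored and fetched by their own hash. A minimal in-memory sketch (class and method names are hypothetical; real P2SH uses HASH160, i.e. RIPEMD-160 of SHA-256, rather than plain SHA-256):

```python
import hashlib

# Minimal sketch of a content-addressed script repository. Anyone holding
# the hash from a P2SH-style output can fetch the full public script.
# SHA-256 stands in for Bitcoin's HASH160 here.

class ScriptStore:
    def __init__(self):
        self._scripts = {}

    def publish(self, script: bytes) -> str:
        """Store a script under its own hash and return that key."""
        key = hashlib.sha256(script).hexdigest()
        self._scripts[key] = script
        return key

    def lookup(self, key: str) -> bytes:
        """Retrieve a previously published script by its hash."""
        return self._scripts[key]

store = ScriptStore()
script = b"\x51\xae"  # placeholder script bytes
key = store.publish(script)
assert store.lookup(key) == script
```

Because the key is derived from the content, a malicious node cannot substitute a different script without the mismatch being detectable; a DHT version of this would shard `_scripts` across peers by key.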
|
|
|
|
etotheipi
Legendary
Offline
Activity: 1428
Merit: 1093
Core Armory Developer
|
|
March 14, 2012, 08:57:13 PM |
|
Protocol-wise, that's true. If we completely replace scriptPubKey with scriptHash in the protocol (which is how I'd suggest doing it), we can convert all of the legacy scripts to P2SH.
Practically, someone would have to keep the old script data around, but I suppose, thinking it over, the Satoshi client already saves that in the wallet, anyway. So yes, a straight conversion would work.
etotheipi, what backup case hasn't been considered?
Sorry, I misunderstood the statement that "all transactions should be P2SH scripts." That's what I get for rushing through the description... My concern was that moving from vanilla OP_CHECKMULTISIG to P2SH meant it was no longer possible to make just one backup of your [deterministic] wallet: you have to continually back up scripts and script hashes after every multi-sig transaction. But there are good reasons to use P2SH, and I feel it's the lesser of two evils.

I will still complain about the extra backup complexity, though. The scripts to be backed up are not sensitive, but they are important, and many users may not want to set up automatic backups of their entire wallet, encrypted or not. So it will require saving the scripts in separate files and maintaining a separate backup system for them, executed after every transaction. And I need to build a system for restoring an entire wallet from a private-key-only file plus a script-backup file, and figure out how to recover as much information as possible if the script-backup file is not recoverable. Again, that's what that other thread is for.

When I heard "all tx to use P2SH" I immediately assumed all transactions would suffer the same backup complexity. But the proposal is really about changing the format of the standard scripts, so you will still be able to scan the blockchain for regular transactions; they will just have a different form. I was thinking of proposals where even a regular tx is "obfuscated", requiring you to save some critical information in addition to your private keys in order to identify and redeem it.
|
|
|
|
realnowhereman
|
|
March 15, 2012, 10:06:28 AM |
|
Just for information: there is a mapping from a current bitcoin public key to a P2SH receiving address, just as there is a mapping from a public key to a non-P2SH address. There's no need to keep scripts around on disk, nor really to worry about all your addresses changing. Both can be handled with a bit of clever software and access to the public key (which is available in current scriptPubKey standard scripts).

Perhaps I'm misreading Maged's concerns, but there is no need to worry that one deterministic backup isn't enough. The public and private keys remain the same; what changes is the way the public key is converted to an address. For a given public key it will always be the hash of the script (or equivalent):

OP_PUSH <signature> OP_CODESEPARATOR OP_PUSH <public key> OP_CHECKSIGVERIFY

Note that since the <signature> bytes will not (cannot) be included in the hash of this script, no part of it changes between transactions; so it is always the same hash and therefore always the same receiving address.
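The point above can be sketched directly: the hashed script contains only the public key and opcodes (the signature is supplied at spend time), so a fixed key always yields the same hash and therefore the same address. SHA-256 stands in for Bitcoin's HASH160 (RIPEMD-160 of SHA-256) here, and the single-key script form is illustrative:

```python
import hashlib

# Sketch: the script hashed for a P2SH-style address is built only from the
# public key and opcodes. Signatures are supplied at spend time and are not
# part of the hashed script, so the address never changes per transaction.
# SHA-256 is used in place of Bitcoin's HASH160 for portability.

OP_CHECKSIGVERIFY = 0xAD

def script_hash_for_pubkey(pubkey: bytes) -> bytes:
    """Hash of the single-key redeem script: <push pubkey> OP_CHECKSIGVERIFY."""
    redeem_script = bytes([len(pubkey)]) + pubkey + bytes([OP_CHECKSIGVERIFY])
    return hashlib.sha256(redeem_script).digest()

pubkey = bytes([2]) * 33  # placeholder compressed public key
# Computed twice, as if for two separate incoming payments -- identical:
assert script_hash_for_pubkey(pubkey) == script_hash_for_pubkey(pubkey)
```

So a deterministic wallet that can regenerate its public keys can also regenerate every such receiving address, with no per-transaction state to back up.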
|
1AAZ4xBHbiCr96nsZJ8jtPkSzsg1CqhwDa
|
|
|
da2ce7
Legendary
Offline
Activity: 1222
Merit: 1016
Live and Let Live
|
|
March 16, 2012, 02:55:01 AM |
|
Luke, I like your proposal, upon first read. It looks like an elegant and effective solution to the BIP16/17 drama we previously had.
|
One off NP-Hard.
|
|
|
Tril
|
|
March 16, 2012, 03:34:35 PM |
|
Hi, please bear with me; I have not had time to study the code and the bitcoin protocol deeply enough to understand how scripts work, but as a bitcoin user and miner I have been wondering since BIP 16 and 17 were introduced:
Why does BIP 16 NOT vastly increase the options for pre-computing hash collisions? If we're not sending scripts anymore, but hashes of scripts, then the more expressive the script language, the more ways there are to try to make a script that hashes to something known: to scan the blockchain for unspent transactions and attempt to come up with a matching hash that pays the attacker. Even with something as simple as a script paying multiple bitcoin addresses, you can pre-generate a billion bitcoin addresses, then find some combination and ORDERING of the script paying those addresses whose hash collides with another script's. The more opcodes and the longer you allow the script to be, the more dangerous this becomes.
If new opcodes are ever added, can we guarantee there will be restrictions preventing the new opcodes from working in scripts generated before the opcodes were added, so that retroactive additions to the script language can't be used to brute-force collisions for old addresses?
Does the existing script language require strict sorting of addresses in advance to prevent the above attack? Does it disallow inserting no-op opcodes to increase the number of scripts you can attempt to collide with? Possibly I'm just missing something and the script language is far less expressive than it seems, but it looks like it contains a stack, a no-op, and the ability to include "user-supplied data" (a bitcoin address). Anything that allows inserting other script hashes into the script makes it exponentially more dangerous. I'm hoping I'm completely off base here, but I haven't seen this addressed anywhere.
|
|
|
|
jojkaart
Member
Offline
Activity: 97
Merit: 10
|
|
March 16, 2012, 03:45:28 PM |
|
Hi, please bear with me; I have not had time to study the code and the bitcoin protocol deeply enough to understand how scripts work, but as a bitcoin user and miner I have been wondering since BIP 16 and 17 were introduced:
Why does BIP 16 NOT vastly increase the options for pre-computing hash collisions? If we're not sending scripts anymore, but hashes of scripts, then the more expressive the script language, the more ways there are to try to make a script that hashes to something known: to scan the blockchain for unspent transactions and attempt to come up with a matching hash that pays the attacker. Even with something as simple as a script paying multiple bitcoin addresses, you can pre-generate a billion bitcoin addresses, then find some combination and ORDERING of the script paying those addresses whose hash collides with another script's. The more opcodes and the longer you allow the script to be, the more dangerous this becomes.
If new opcodes are ever added, can we guarantee there will be restrictions preventing the new opcodes from working in scripts generated before the opcodes were added, so that retroactive additions to the script language can't be used to brute-force collisions for old addresses?
Does the existing script language require strict sorting of addresses in advance to prevent the above attack? Does it disallow inserting no-op opcodes to increase the number of scripts you can attempt to collide with? Possibly I'm just missing something and the script language is far less expressive than it seems, but it looks like it contains a stack, a no-op, and the ability to include "user-supplied data" (a bitcoin address). Anything that allows inserting other script hashes into the script makes it exponentially more dangerous. I'm hoping I'm completely off base here, but I haven't seen this addressed anywhere.
First, I'll note that I'm not an expert on this subject. However, I think you are correct that the number of options to try increases if you want to attempt hash collisions. That does not, however, make much of a difference, if any, to the expected time to find a collision. This is because the number of options, even before BIP 16, was already vastly greater than the average number of tries needed to find a collision; merely enlarging the option space therefore changes little.
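A back-of-envelope sketch of this point, assuming a 160-bit script hash: an attacker trying to match any one of N existing output hashes expects on the order of 2^160 / N attempts, so even a billion targets (or a vastly enlarged space of candidate scripts) barely moves the exponent:

```python
import math

# Back-of-envelope for the argument above: matching any one of N existing
# 160-bit script hashes takes about 2**160 / N expected attempts. Growing N
# (or the space of candidate attacker scripts) only subtracts log2(N) bits
# from an exponent of 160.

HASH_BITS = 160

def expected_work_bits(num_targets: int) -> float:
    """log2 of the expected number of hash attempts to hit any target."""
    return HASH_BITS - math.log2(num_targets)

print(expected_work_bits(1))        # 160.0 -- attacking one specific output
print(expected_work_bits(2 ** 30))  # 130.0 -- ~a billion outputs: still infeasible
```

2^130 attempts remains far beyond any plausible computing capacity, which is why enlarging the candidate space does not meaningfully weaken the hash.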
|
|
|
|
|