Topic: Fixed length loops for scripts
TierNolan (OP)
May 01, 2013, 01:57:01 PM  #1

One of the reasons not to allow loops is that they make it harder to work out how much processing a script requires.

If loops had a clearly defined maximum number of iterations, this would not be an issue.

For example, you could define a loop as:

[max loops] LOOP ..... [true/false] ENDLOOP

LOOP would pop [max loops] off the stack.

ENDLOOP would jump back to LOOP if the top of the stack contains true/1 and the maximum number of iterations hasn't been reached.
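
To make the semantics concrete, here is a minimal interpreter sketch of the idea in Python. It is illustrative only: the token format, the opcode names, and the do-while behaviour of the body (it always runs at least once) are assumptions, not Bitcoin's actual script engine.

Code:
# Sketch of the proposed LOOP/ENDLOOP semantics (illustrative only).
# Integers are pushes; strings are opcodes.

def run(script, stack):
    pc = 0
    loops = []  # stack of (body_start_pc, iterations_remaining)
    while pc < len(script):
        op = script[pc]
        if isinstance(op, int):
            stack.append(op)
        elif op == "LOOP":
            loops.append((pc + 1, stack.pop()))  # pop [max loops]
        elif op == "ENDLOOP":
            body, remaining = loops.pop()
            flag = stack.pop()                   # pop [true/false]
            if flag and remaining > 1:           # repeat only if true
                loops.append((body, remaining - 1))
                pc = body                        # and budget remains
                continue
        else:
            raise ValueError("unknown opcode: %s" % op)
        pc += 1
    return stack

# [3] LOOP [7] [1] ENDLOOP runs the body exactly 3 times:
print(run([3, "LOOP", 7, 1, "ENDLOOP"], []))  # prints [7, 7, 7]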

This would be a pay-to-multisig script using a loop:

[Hash(pub-key1)] [Hash(pub-key2)] [Hash(pub-key3)] [3] LOOP [3] OP_PICK OP_HASH160 OP_EQUALVERIFY [5] OP_PICK [3] OP_PICK OP_CHECKSIGVERIFY [1] ENDLOOP

All sigops between LOOP and ENDLOOP would have their weight multiplied by 3.  If nested loops are allowed, an operation's weight would be multiplied by the product of the maxima of all the loops enclosing it.
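
Because the maximum is a literal push immediately before LOOP, a node could compute this weight by scanning the script without executing it. A sketch of such a scan follows; the token format and the idea of counting each sigop once are assumed for illustration and differ from the protocol's actual sigop accounting.

Code:
# Hypothetical static scan: weight each sigop by the product of the
# declared maxima of every loop enclosing it. No execution needed.

SIGOPS = {"OP_CHECKSIG", "OP_CHECKSIGVERIFY"}

def static_sigop_weight(script):
    multiplier = 1      # product of maxima of all currently open loops
    open_maxima = []
    total = 0
    for i, op in enumerate(script):
        if op == "LOOP":
            # The max must be a literal push directly before LOOP,
            # so it can be read without running the script.
            if i == 0 or not isinstance(script[i - 1], int) or script[i - 1] < 1:
                raise ValueError("LOOP needs a literal max count >= 1")
            open_maxima.append(script[i - 1])
            multiplier *= script[i - 1]
        elif op == "ENDLOOP":
            multiplier //= open_maxima.pop()
        elif op in SIGOPS:
            total += multiplier
    return total

# The multisig script above has one OP_CHECKSIGVERIFY inside a [3] LOOP,
# so it counts as 3 sigops; a [2] LOOP nested inside would count as 6.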

nybble41
May 01, 2013, 06:20:23 PM  #2

A simpler approach might be to simply assign a CPU and memory cost to each operation, and fail the script if it goes over preset limits. The precise costs and limits would need to be defined at the protocol level to ensure compatibility. The script language could then be opened up to permit arbitrarily complex processing (loops, subroutines, recursion, etc.) without worrying about complexity.
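
A sketch of that kind of metering: the opcode costs, the budget value, and the Meter interface are invented for illustration; a real protocol would fix the actual numbers.

Code:
# Hypothetical cost metering: charge every executed operation against a
# fixed budget and fail the script the moment the budget is exceeded.

OP_COST = {"OP_HASH160": 10, "OP_CHECKSIGVERIFY": 100}  # default cost 1
BUDGET = 10000

class BudgetExceeded(Exception):
    pass

class Meter:
    def __init__(self, budget=BUDGET):
        self.remaining = budget

    def charge(self, op):
        # Called from the interpreter's dispatch loop for every
        # operation executed, including repeated loop iterations.
        self.remaining -= OP_COST.get(op, 1)
        if self.remaining < 0:
            # An attacker can burn at most BUDGET worth of work per
            # script, no matter how it loops or recurses.
            raise BudgetExceeded("script exceeded its cost budget")

meter = Meter()
meter.charge("OP_CHECKSIGVERIFY")   # remaining budget: 9900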
TierNolan (OP)
May 01, 2013, 07:29:36 PM  #3

Quote from: nybble41
A simpler approach might be to simply assign a CPU and memory cost to each operation, and fail the script if it goes over preset limits. The precise costs and limits would need to be defined at the protocol level to ensure compatibility. The script language could then be opened up to permit arbitrarily complex processing (loops, subroutines, recursion, etc.) without worrying about complexity.

However, that requires actually running the script.  With an explicit limit, you could just scan the script and look at the number before each LOOP, which is easier.

The protocol already has hard-coded values for all the operations (sigops, for example, are counted against a fixed per-block limit).

nybble41
May 01, 2013, 08:31:56 PM  #4

Quote from: TierNolan
However, that requires actually running the script.  With an explicit limit, you could just scan the script and look at the number before each LOOP, which is easier.
Sure, but under what circumstances would anyone care about the cost of a script which they wouldn't be running anyway?
TierNolan (OP)
May 01, 2013, 11:19:43 PM  #5

Quote from: nybble41
Sure, but under what circumstances would anyone care about the cost of a script which they wouldn't be running anyway?

It would allow a transaction to be dropped quickly without having to actually run the script.

nybble41
May 02, 2013, 12:41:16 AM  #6

Quote from: TierNolan
Quote from: nybble41
Sure, but under what circumstances would anyone care about the cost of a script which they wouldn't be running anyway?
It would allow a transaction to be dropped quickly without having to actually run the script.
The savings from that approach seem negligible. You still have to design the system to handle the worst case, which is a script just short enough to get past the filter. A malicious attacker would design the script to use the maximum amount of time without triggering the filter, so you have the cost of the script plus the cost of checking its complexity. The only case where you would save anything would be where someone accidentally created a script which requires too much time or memory, which should be rare (akin to creating malformed scripts which fail on valid input for other reasons).
TierNolan (OP)
May 02, 2013, 09:02:15 AM  #7

Quote from: nybble41
The savings from that approach seem negligible.

Running a loop 100 times just to measure its cost is slower than simply multiplying together the maximum loop lengths declared in the script.  However, specifying a cost per operation and failing validation if the total goes over the limit would work too.
