Bitcoin Forum
Author Topic: Funding of network security with infinite block sizes  (Read 24526 times)
zebedee
Donator
Hero Member
*
Offline Offline

Activity: 668
Merit: 500



View Profile
April 09, 2013, 06:09:36 AM
 #181

You're working under the assumptions that everyone mining coins wants what's best for Bitcoin / cares about Bitcoin's future, and that all miners are equal - they are not; some will realize that producing large blocks that others can't handle is very much in their interest. Others may have the aim of just causing as many problems as possible because Bitcoin is a threat to them in some way, who knows? All you're doing is creating a vulnerability and expecting no one to take advantage.

We aren't all hippies living on a fucking rainbow - problems don't just fix themselves because in your one-dimensional view of the world you can't imagine anyone not acting rationally and for the greater good.
You are assuming things are problems with no evidence.  It's quite easy to ignore troublemakers, e.g. with the anti-spam rules.  Think outside the box a bit more.  Central planners almost never get anything right; stop wanting others to solve your problems.

Despite all the claims about trouble-making large blocks, the only one we really had over 500k was recent, and the only reason it caused a problem was entirely unrelated, owing to limitations of untested parts of the software, untested so late only because of all these special-case rules.  Let miners be free and figure it out for themselves.  I suspect a common core of rule(s) would be generally agreed (anti-spam etc.) and others would all do their own thing on the fringes, which is actually the case now.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2348


Eadem mutata resurgo


View Profile
April 10, 2013, 07:56:02 AM
Last edit: April 10, 2013, 09:05:58 AM by marcus_of_augustus
 #182

Okay, I think this will be my final word on this (if anyone cares or is listening).

Basically it comes down to a philosophical call, since the number of variables and unknowns going into the decision makes the analysis incalculable; at least, I don't see any good way to reduce the variable set to get a meaningful result without over-simplifying the problem with speculative, weak assumptions.

In that case, as Gavin is saying, we should just leave it up to the market, our intuitions, our future ingenuity, trust in our ability to solve anything that comes up, and basically hope for the best.

So now I'm thinking: let the max_block_size float up without bound, but put a practical limit on how fast it can rise, a rate limiter of some sort, so it can grow as fast as the network grows but not so fast that an attacker could game it over the short/medium term to force smaller users off the network with unrealistic hardware upgrade requirements. Ultimately, though, have no hard cap on max block size in the long term, as Mike Hearn says.

At first blush, in pseudo-code:

Code:
## Weekly check, i.e. every 1008 blocks.
## Gavin A.'s rule: let MAX_BLOCK_SIZE float proportionally to recent median block size history.
NEW_MAX_BLOCK_SIZE = multiple * median(size of the last 1008 blocks)

if (MAX_BLOCK_SIZE < NEW_MAX_BLOCK_SIZE)

    ## Jeff G.'s rule: a weekly time-based rate limit on MAX_BLOCK_SIZE,
    ## capping growth at 2^(1/52) per week, i.e. yearly doubling (Moore's law).
    if (NEW_MAX_BLOCK_SIZE < (2^(1/52)) * MAX_BLOCK_SIZE)
        MAX_BLOCK_SIZE = NEW_MAX_BLOCK_SIZE
    else
        MAX_BLOCK_SIZE = (2^(1/52)) * MAX_BLOCK_SIZE
    endif

endif

Infinite max block size growth is possible, but rate-limited to a yearly doubling.
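
A quick sanity check of the rate limiter (a Python sketch, assuming one retarget per 1008-block week, i.e. 52 per year):

Code:
weekly_cap = 2 ** (1 / 52)
print(weekly_cap ** 52)   # 2.0: 52 capped weekly increases compound to one doubling per year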


notig
Sr. Member
****
Offline Offline

Activity: 294
Merit: 250


View Profile
April 10, 2013, 08:37:45 AM
 #183

Oh goodie, more Google conspiracy theories. Actually I never had to ask for approval to use 20% time on Bitcoin. That's the whole point of the policy - as long as there's some justifiable connection to the business, you can do more or less whatever you want with it and managers can't tell you not to unless it's clearly abusive. That's how we ensure it's usable for radical (i.e. unexpected) innovation.

But even if I was being paid to work on Bitcoin full time by Google, the idea that I'd want Bitcoin to grow and scale up as part of some diabolical corporate master plan is stupid. Occam's Razor, people! The simplest explanation for why I have worked so hard on Bitcoin scalability is that I want it to succeed, according to the original vision laid down by Satoshi. Which did not include arbitrary and pointless limits on its traffic levels.

The idea that Bitcoin can be a store of value with a 1mb block size limit seems like nonsense to me. That's reversing cause and effect. Bitcoin gained value because it was useful; it didn't gain use because it had value - that can't be the case, because it started out with a value of zero. So if Bitcoin is deliberately crippled so that most people can't use it, it will also cease to have much (if any) value. You can't have one without the other. The best way to ensure Bitcoin is a solid store of value is to ensure it's widely accepted and used on an everyday basis.

If Bitcoin were banned in a country then I think it's obvious its value there would be close to zero. This is one of the most widely held misconceptions about Bitcoin: that it's somehow immune to state action. A currency is a classic example of network effects: the more people use it, the more useful it becomes. But it goes without saying that you have to actually know other people are using it to be able to use it yourself. If there were immediate and swift punishment of anyone who advertised acceptance of coins or interacted with an exchange, you would find it very hard to trade, and coins would be useless/valueless in that jurisdiction.

The reason I'm getting tired of these debates is that I've come to agree with Gavin - there's an agenda at work and the arguments are a result of people working backwards from the conclusion they want to try and find rationales to support it.

Every single serious point made has been dealt with by now. Let's recap:

  • Scalability leads to "centralization". It's impossible to engage in meaningful debate with people like Peter on this because they refuse to get concrete and talk specific numbers for what they'd deem acceptable. But we now know that with simple optimisations that have been prototyped or implemented today, Bitcoin nodes can handle far more traffic than the world's largest card networks on a single computer; what's more, a computer so ordinary that our very own gmaxwell has several of them in his house. This is amazing - all kinds of individuals can, on their own, afford to run full nodes without any kind of business subsidisation at all, including bandwidth. And it'll be even cheaper tomorrow.
  • Mining can't be anonymous if blocks are large. Firstly, as I already pointed out, if mining is illegal in one place then it'll just migrate to other parts of the world, and if it's illegal everywhere then it's game over and Bitcoin is valueless anyway, so at that point nobody cares anymore. But secondly, this argument is again impossible to really grapple with because it's based on an unsupported axiom: that onion networks can't scale. Nobody has shown this. Nobody has even attempted to show this. Once again, it's an argument reached by working backwards from a desired conclusion.
  • Mining is a public good and without artificial scarcity it won't get funded. This is a good argument but I've shown how alternative funding can be arranged via assurance contracts, with a concrete proposal and examples in the real world of public goods that get funded this way. It'll be years before we get to try this out (unless the value of Bitcoin falls a lot), but so far I haven't seen any serious rebuttals to this argument. The only ones that exist are of the form, "we don't have absolute certainty this will work, so let's not try". But it's not a good point because we have no certainty the proposed alternatives will work either, so they aren't better than what I've proposed.

Are there any others? The amount of time spent addressing all these arguments has been astronomical and at some point, it's got to be enough. If you want to continue to argue for artificial scaling limits, you need to get concrete and provide real numbers and real calculations supporting that position. Otherwise you're just peddling vague fears, uncertainties and doubts.

+1
gmaxwell
Moderator
Legendary
*
expert
Offline Offline

Activity: 4158
Merit: 8382



View Profile WWW
April 10, 2013, 09:53:41 AM
Last edit: April 10, 2013, 10:52:16 AM by gmaxwell
 #184

Okay, I think this will be my final word on this (if anyone cares or is listening).

Meh.

Any amount of algorithms will just be guesswork that could be wildly wrong— a small error in the exponent makes a big difference: if computers' ability to handle blocks doubles every 18 months instead of every 12, then over ten years the limit grows 1024x while capacity grows only ~101x— blocks 10x too big to handle.
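
To make that arithmetic concrete (a quick Python check; the 12-vs-18-month doubling periods are taken from the post above):

Code:
years = 10
limit_growth = 2 ** years                  # rule bakes in doubling every 12 months: 1024x
capacity_growth = 2 ** (years * 12 / 18)   # hardware actually doubles every 18 months: ~101.6x
print(limit_growth / capacity_growth)      # ~10.1: the limit ends up ~10x what nodes can handle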

And "median of blocks" basically amounts to "miner's choose"— if you're going to have miners choose, I'd rather have them insert a value in the coinbase and take the median of that... then at least they're not incentivized to bloat blocks just to up the cap, or forced to forgo fees to their competition just to speak their mind on making it lower. Smiley  But miners choose is ... not great because miners alignment with everyone else is imperfect— and right now the majority of the mining decision is basically controlled by just two people.

There are two major areas of concern from my perspective: the burden on unpaid full nodes making Bitcoin not practically decentralized if it grows faster than technology, and a race to the bottom on fees and thus PoW difficulty making Bitcoin insecure (or dependent on centralized measures like bank-signed blocks). (And then there are some secondary ones, like big pools using large blocks to force small miners out of business.)

If all of the cost of being a miner is in the handling of terabyte blocks and none in the POW, an attacker who can just ignore the block validation can easily reorg the chain. We need robust fees— not just to cover the non-adaptive "true" costs of blocks, but to also cover the POW security which can adapt as low as miners allow it.

The criteria you have only speak to the first of these indirectly— we HOPE computing will follow some exponential curve— but we don't know the exponent.  They don't speak to the second at all unless you assume some pretty ugly mining cartel behavior to fix the fee via a majority orphaning valid blocks by a minority (which, even if not too ugly for you on its face, would make people have to wait many blocks to be sure the consensus was settled).

If you were to use machine criteria, I'd add a third: it shouldn't grow faster than (some factor of) the difficulty change.  This directly measures the combination of computing efficiency and miner profitability.  Right now difficulty change looks silly, but once we're comfortably settled w/ asics on the latest silicon process it should actually reflect changes in computing... and should at least prevent the worst of the difficulty-tends-to-1 problem (though it might not keep up with computing advances and we could still lose security).  Also, difficulty as a limit means that miners actually have to _spend_ something to increase it— the median-of-size thing means that miners have to spend something (lost fees) to _decrease_ it...

Since one of my concerns is that miners might not make enough fees income and the security will drop, making it so miners have to give up more income to 'vote' for the size to go down... is very not good. Making it so that they have to burn more computing cycles to make it go up would, on the other hand, be good.

Code:
every 2016 blocks:

    new_max_size = min( old_size * 2^(time_difference / year),
                        max( max(100k, old_size/2),
                             median(miner_coinbase_preferences[last 2016 blocks]) ),
                        max(1MB, 1MB * difficulty / 100million),
                        periodic_hard_limit (e.g. 10 Mbytes)
                      )
(the 100m number there is completely random; we'll have to see what things look like in 8 months, once the recent difficulty surge settles some, otherwise I could propose something more complicated than a constant there.)
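
For concreteness, here is the same rule as a minimal Python sketch; the function and argument names are assumptions, units are bytes, and the constants are the same placeholders as in the pseudocode above:

Code:
from statistics import median

def new_max_size(old_size, years_elapsed, coinbase_prefs, difficulty,
                 periodic_hard_limit=10_000_000):
    # A sketch only: each term is one of the four caps described above.
    time_cap = old_size * 2 ** years_elapsed                 # at most a yearly doubling
    miner_vote = max(max(100_000, old_size // 2),            # floor: 100k or half the old size
                     median(coinbase_prefs))                 # median of coinbase preferences
    difficulty_cap = max(1_000_000, 1_000_000 * difficulty // 100_000_000)
    return min(time_cap, miner_vote, difficulty_cap, periodic_hard_limit)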

Mind you— I don't think this is good by itself. But the difficulty-based check improves it a lot. If you were to _then_ augment these three rules with a ConsentOfTheGoverned hard upper limit that could be reset periodically if people thought the rules were doing the right thing for the system and decentralization was being upheld...  well, I'd run out of things to complain about, I think. Having those other limits might make agreeing on a particular hard limit easier— e.g. I might be more comfortable with a higher one knowing that there are those other limits keeping it in check. And an upper boundary gives you something to test software against.

I'm generally more of a fan of rule-by-consent— public decisions suck and they're painful, but they actually measure what we need to measure and not some dumb proxies that might create bad outcomes or weird motivations— there is some value which is uncontroversial, and it should go to that level. Above that might be technically okay but controversial, and it shouldn't go there.   If you say that a publicly set limit can't work— then you're basically saying you want the machine to set behavior against the will and without the consent of the users of the system, and I don't think that's right.
 
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
April 10, 2013, 11:53:13 AM
 #185

What about directly targeting fees, similar to Gavin's suggestion that blocks must have fee > 50 * (size in MB)?

If the average (or median) block fee (including minting) for the last 2016 blocks is more than 65 BTC, then increase the block size by 5%.  If it is less than 35 BTC, then drop it by 5%.  This gives a potential increase in block size of about 3.5x per year (5% compounded over 26 retargets).  It also guarantees that the miner fees stay between 35 and 65 BTC per block, so it keeps the network secure.

Ofc, if a decreased block size causes decreased usage, which causes lower tx fees, then you could get a downward spiral, but the change is limited to a factor of 3.5 per year in either direction.
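
A sketch of that retargeting loop in Python (the 35/65 BTC band and 5% step are from the post; the function name and everything else are assumptions):

Code:
def adjust_max_size(max_size, avg_block_fee):
    # Every 2016 blocks (~2 weeks): nudge the size limit 5% either way to
    # keep per-block fees (including minting) inside the 35-65 BTC band.
    if avg_block_fee > 65:
        return max_size * 1.05   # fees high: allow bigger blocks
    if avg_block_fee < 35:
        return max_size * 0.95   # fees low: shrink blocks, raise scarcity
    return max_size

# 26 retargets per year: 1.05 ** 26 ≈ 3.56, the ~3.5x/year bound above.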

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
fornit
Hero Member
*****
Offline Offline

Activity: 991
Merit: 1008


View Profile
April 10, 2013, 01:52:40 PM
 #186

It also guarantees that the miner fees stay between 35 and 65 BTC per block, so it keeps the network secure.

it also guarantees that bitcoin always costs more than 8.7%-16.2% of its current market cap in fees. sounds like a solid plan. at 100 billion dollar market cap we need to start thinking about invading some small countries for mining rig storage space though.
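
Where those percentages come from (a quick check, assuming the eventual 21M coin supply and one block per ~10 minutes):

Code:
blocks_per_year = 6 * 24 * 365                 # 52,560 blocks/year
for fee in (35, 65):
    print(fee * blocks_per_year / 21_000_000)  # 0.0876 and 0.1627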
Gavin Andresen
Legendary
*
qt
Offline Offline

Activity: 1652
Merit: 2216


Chief Scientist


View Profile WWW
April 10, 2013, 04:29:29 PM
 #187

What about directly targeting fees, similar to Gavin's suggestion that blocks must have fee > 50 * (size in MB)?
Nah, I now think that's a dumb idea.

Responding to gmaxwell:

RE: burden of unpaid full nodes: for the immediately foreseeable future, that burden is on the order of several hundred dollars a year to buy a moderately beefy VPS somewhere.

I understand the "let's engineer for the far future" ... but, frankly, I think too much of that is dumb. Successful projects and products engineer for the next year or two, and re-engineer when they run into issues.

Maybe the answer will be "validation pools" like we have mining pools today, where people cooperate to validate part of the chain (bloom filters, DHTs, mumble mumble....). Maybe hardware will just keep up.

RE: race to the bottom on fees and PoW:

sigh.  Mike explained how that is likely to be avoided. I'm 100% convinced that if users of the network want secure transactions they will find a way to pay for them, whether that is assurance contracts or becoming miners themselves.

How often do you get the chance to work on a potentially world-changing project?
ripper234
Legendary
*
Offline Offline

Activity: 1358
Merit: 1003


Ron Gross


View Profile WWW
April 12, 2013, 01:05:59 PM
 #188

Posted the wiki article to reddit.

I haven't read the thread in a while. Can I assume the wiki is relatively up to date, or does it need updating?

Please do not pm me, use ron@bitcoin.org.il instead
Mastercoin Executive Director
Co-founder of the Israeli Bitcoin Association
Mike Hearn (OP)
Legendary
*
expert
Offline Offline

Activity: 1526
Merit: 1128


View Profile
April 12, 2013, 02:21:11 PM
 #189

The wiki page is basically a copy of what I wrote with a few edits to make it sound more wiki-ish.
ripper234
Legendary
*
Offline Offline

Activity: 1358
Merit: 1003


Ron Gross


View Profile WWW
April 12, 2013, 02:25:19 PM
 #190

The wiki page is basically a copy of what I wrote with a few edits to make it sound more wiki-ish.

Coolness - keep up the good work Mike!

Please do not pm me, use ron@bitcoin.org.il instead
Mastercoin Executive Director
Co-founder of the Israeli Bitcoin Association
Stampbit
Full Member
***
Offline Offline

Activity: 182
Merit: 100



View Profile
April 12, 2013, 05:57:07 PM
 #191

Can someone explain why limited block sizes limit the number of transactions that can be placed? Wouldn't 10 100kb blocks get solved in as much time as one 1mb block?
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
April 12, 2013, 07:06:23 PM
 #192

Can someone explain why limited block sizes limit the number of transactions that can be placed? Wouldn't 10 100kb blocks get solved in as much time as one 1mb block?

No, a block takes the same time to solve, no matter how many transactions are in it.  You only have to solve the block header, which is 80 bytes.  You run the header through the sha-256 hash function twice to see if you "win".  If not, you increment an integer in the header and try again.  Mining is repeating the hash function until you get an integer that "wins".

Basically, once you have generated all the transactions you want to add, you calculate the merkle root.  This is a hash of all the transactions.  You include that number in the header, and it is always the same size (32 bytes).

Larger blocks will potentially cause bandwidth issues when you try to propagate them though, since you have to send all the transactions.
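
As a toy illustration of that loop (a sketch, not a real miner; real mining also rolls the timestamp and extraNonce once the 32-bit nonce space is exhausted):

Code:
import hashlib, struct

def mine(header76: bytes, target: int) -> int:
    # Double-SHA256 the 80-byte header (76 fixed bytes + 4-byte nonce),
    # bumping the nonce until the hash, read as a little-endian integer,
    # falls below the target.
    for nonce in range(2 ** 32):
        header = header76 + struct.pack('<I', nonce)
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        if int.from_bytes(digest, 'little') < target:
            return nonce
    raise RuntimeError('nonce space exhausted')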

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
April 12, 2013, 07:52:27 PM
 #193

The wiki page is basically a copy of what I wrote with a few edits to make it sound more wiki-ish.

So, the procedure is basically, create a transaction:

Inputs:
2: <valid auth> + SIGHASH_ANYONECANPAY

Outputs
10: OP_RETURN <so dead>

This transaction's output cannot be spent.  However, because of SIGHASH_ANYONECANPAY, each signature ignores all the other inputs when authenticating.  This means that if someone modified it to

Inputs:
2: <valid auth> + SIGHASH_ANYONECANPAY
8: <valid auth> + SIGHASH_ANYONECANPAY

Outputs
10: OP_RETURN <so dead>

It becomes valid.  The tx is validated twice, once with each input, and the keys all have to be valid.

It seems that you can get the same effect without OP_RETURN by having a transaction that pays to OP_TRUE OP_VERIFY.

Also, what is to stop a miner adding the extra 8 BTC to get it over the limit?  In that case, the block is funded anyway.

The disadvantage is that the miner must add a 2nd transaction to direct the OP_TRUE output to his actual address.  However, no hard fork is required.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Stampbit
Full Member
***
Offline Offline

Activity: 182
Merit: 100



View Profile
April 12, 2013, 08:24:01 PM
 #194


Larger blocks will potentially cause bandwidth issues when you try to propagate them though, since you have to send all the transactions.

When you propagate them, do you send them to all other miners or only a subset of miners?
Peter Todd
Legendary
*
expert
Offline Offline

Activity: 1120
Merit: 1149


View Profile
April 12, 2013, 08:27:06 PM
 #195

It seems that you can get the same effect without OP_RETURN by having a transaction that pays to OP_TRUE OP_VERIFY.

Actually you just need OP_TRUE as the scriptPubKey, or leave it empty and have the miner provide OP_TRUE in the scriptSig. I mentioned the trade-offs of the approach on IRC a few weeks ago.

Also, what is to stop a miner adding the extra 8 BTC to get it over the limit?  In that case, the block is funded anyway.

Ouch... not much of an assurance contract then is it? It's basically just a donation at random - that's very clever of you. There is of course the risk that your block gets orphaned and another miner takes the fees instead, but that happens something like 1% of the time now, and potentially a lot less in the future. Miners can deliberately orphan the block too, but we really want to implement mechanisms to discourage that behavior for a lot of reasons.

You could use nLockTime: basically you would all participate in authoring an nLockTime'd assurance contract transaction. Some time before the nLockTime deadline approaches, if the assurance contract is *not* fully funded, IE a miner might pull the "self-assure" trick, you double-spend the input that would have gone to the contract so your contribution to it is no longer valid. On the other hand, that means anyone can DoS attack the assurance contract process itself by doing exactly that, and if they time it right, they can still pull off the "self-assure" trick. Risky though - it's hard to reason about what is the correct strategy there, although it's obviously easier to pull off all those attacks if you control a large amount rather than a small amount of the overall hashing power.

With a hardfork we can fix the issue though, by making it possible to write a scriptPubKey that can only be spent by a transaction following a certain template. For instance it could say that prior to block n, the template can only be a mining fee donation of value > x, and after, the spending transaction can be anything at all. It'll be a long time before that feature is implemented though.

In any case all these tricks would benefit an attacker trying to depress the overall network hash rate, either to launch double-spend attacks, attack Bitcoin in general, or just keep funds from flowing to the competition if they are a miner.

TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
April 12, 2013, 08:32:20 PM
 #196

When you propagate them, do you send them to all other miners or only a subset of miners?

All full nodes you are connected to.  They will verify the block and, if it is ok, forward it to all full nodes they are connected to.  However, if they receive the block a 2nd (or later) time, they don't forward it.

This sends the block to all nodes on the network.  However, large blocks will propagate more slowly.
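
Schematically, the relay rule described above looks like this (a sketch; the node, block, and peer objects are assumptions):

Code:
def on_block(node, block, seen_hashes):
    # Flood-fill relay: forward a block only the first time it arrives.
    if block.hash in seen_hashes:
        return                     # a 2nd (or later) receipt is dropped
    seen_hashes.add(block.hash)
    if node.verify(block):         # verify before relaying
        for peer in node.peers:
            peer.send(block)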

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
April 12, 2013, 09:09:26 PM
 #197

Also, what is to stop a miner adding the extra 8 BTC to get it over the limit?  In that case, the block is funded anyway.
Ouch... not much of an assurance contract then is it? It's basically just a donation at random - that's very clever of you. There is of course the risk that your block gets orphaned and another miner takes the fees instead, but that happens something like 1% of the time now, and potentially a lot less in the future. Miners can deliberately orphan the block too, but we really want to implement mechanisms to discourage that behavior for a lot of reasons.

Yeah, there is a risk, esp. if the amount you add is very high.

A similar situation was discussed in another thread (ish).

The current rule is: mine the longest chain.  However, if a miner included a payment-to-true in the block, it would encourage other miners to build on his block.

If the chain was

A -> B -> C ->

but B has a large payment to true and C doesn't, then miners would be encouraged to keep mining against B, rather than accept the new block. 

This means that you have a tradeoff.  If you create C' and it pays a lot of the reward from B onward to true, then you weaken the incentive.

An equilibrium would form where there is an incentive to include some payment to true.  This means that tx fees are effectively smeared.

I assumed there were basically two strategies:

1) Mine against the longest chain
2) Mine against one of the top-2 blocks, whichever pays the highest to true

Depending on the payout for the top-2 blocks, neither strategy wins outright; a certain fraction will follow each of them.

Quote
You could use nLockTime: basically you would all participate in authoring an nLockTime'd assurance contract transaction. Some time before the nLockTime deadline approaches, if the assurance contract is *not* fully funded, IE a miner might pull the "self-assure" trick, you double-spend the input that would have gone to the contract so your contribution to it is no longer valid.

You can only spend your own input.  All other inputs would still be valid.  In effect, all participants would have to cancel.

It does offer protection for those who actually care.  The miner would have to publish the tx a few days (or hours) before the deadline, so he couldn't add it to his block.

Quote
On the other hand, that means anyone can DoS attack the assurance contract process itself by doing exactly that, and if they time it right, they can still pull off the "self-assure" trick.

The only thing that would accomplish is to reduce the total, since the inputs from anyone else who contributed would still be valid.

Quote
With a hardfork we can fix the issue though, by making it possible to write a scriptPubKey that can only be spent by a transaction following a certain template. For instance it could say that prior to block n, the template can only be a mining fee donation of value > x, and after, the spending transaction can be anything at all. It'll be a long time before that feature is implemented though.

What would be good would be the timestamping thing.  For example, requiring that a hash of the tx be included at least 10 blocks previously.

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
gglon
Member
**
Offline Offline

Activity: 64
Merit: 10


View Profile
April 12, 2013, 10:35:33 PM
 #198


Hi guys, being not that deep into bitcoin I was trying to make some sense of Mike's proposal. I wrote a comment on reddit describing it to normal people as best I could. But I'm still not sure if it's more or less right. It would be nice if someone translated the proposal into more accessible language, as the way we will fund the network in the future is pretty important, and more people should know what the options are.
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1083


View Profile
April 12, 2013, 11:00:59 PM
 #199


Hi guys, being not that deep into bitcoin I was trying to make some sense of Mike's proposal. I wrote a comment on reddit describing it to normal people as best I could. But I'm still not sure if it's more or less right. It would be nice if someone translated the proposal into more accessible language, as the way we will fund the network in the future is pretty important, and more people should know what the options are.

It is an assurance contract.  You say "Pay the miner who includes this transaction 100BTC, signed by me for 10BTC, allow extra signatures".  This is an invalid transaction, since you haven't included enough money.

However, if others create more transactions, you end up with something like

Pay the miner who includes this transaction 100BTC, signed by me for 10BTC, allow extra signatures
Pay the miner who includes this transaction 100BTC, signed by person2 for 60BTC, allow extra signatures
Pay the miner who includes this transaction 100BTC, signed by person3 for 30BTC, allow extra signatures

These can be combined, since they are identical except for the signatures, to give:

Pay the miner who includes this transaction 100BTC, signed by me for 10BTC, signed by person2 for 60BTC, signed by person3 for 30BTC, allow extra signatures

This is only possible if you explicitly allow extra signatures.  

The final transaction is valid, so can be submitted to the main network.  If a miner includes the transaction in the block, they get 100BTC.

The idea is that lots of people could add a few BTC.  However, it is only valid if the total is 100BTC (or more).
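
A toy model of that pooling logic (a sketch; the real mechanism is identical SIGHASH_ANYONECANPAY transactions being merged, not a dict of pledges):

Code:
def combine(pledges, target=100):
    # Merge identical "pay the miner 100BTC" contributions; the result is
    # only a valid transaction once the pledges cover the full payout.
    total = sum(pledges.values())
    return {'pays_miner': target, 'inputs': dict(pledges)} if total >= target else None

print(combine({'me': 10, 'person2': 60, 'person3': 30}))  # funded: valid
print(combine({'me': 10, 'person2': 60}))                 # underfunded: None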

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
solex
Legendary
*
Offline Offline

Activity: 1078
Merit: 1002


100 satoshis -> ISO code


View Profile
April 12, 2013, 11:18:05 PM
 #200

Arguably OT, but still important to the thrust of the OP and much of the debate where network security = decentralization.
Astounding developments like this will drive forward Bitcoin's success.

I feel that just THIS makes me strong bull again ..



friedcat presented USB powered mini ASIC running 300MH/sec. That's what I call DECENTRALIZATION.

This looks great. Thread for interested lurkers: https://bitcointalk.org/index.php?topic=99497.3180
