Author Topic: Error code -22 on OP_RETURN tx
luv2drnkbr (OP)
February 23, 2014, 08:35:26 AM
#1

I'm learning to make an OP_RETURN tx, and I have a simple one, that has one input of 0.001 and one OP_RETURN output that receives no bitcoins.

I have made the tx

0100000001d4a37f5116d17a9a969752c979cdb138cb86fe4c89546dfda85bce376ce8241a010000000100ffffffff010000000000000000266a2454657374696e67204232344364354a5530706d394f3770787277376e277320477569646500000000

and signed it

0100000001d4a37f5116d17a9a969752c979cdb138cb86fe4c89546dfda85bce376ce8241a010000006c493046022100d9d80aa3c14d611b03e48376869125b07f133fb9ee81c5bdb8978c580b5a10b9022100d85a9bd0c94926878d8954f82871ec44a069fd54bc3b0676b13e805a1d92d61a0121029d4d4a7bbf9c52e8fb700a9c96a3c3a9d26c00d6630fc01efc27c956c40365f4ffffffff010000000000000000266a2454657374696e67204232344364354a5530706d394f3770787277376e277320477569646500000000

but when I try to broadcast it, I get error code -22, and similarly, neither Blockchain.info nor Eligius' pushtx websites will broadcast it.  I know it's non-standard, but I'm connected directly to miners who mine non-standard transactions and would like to send the darned thing.

Can somebody tell me if there's an error in the tx itself, or if there is another reason for the broadcasting failure?
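In case it helps anyone sanity-check what that output actually carries, here's a rough sketch (mine, Python 3, standard library only) that decodes the data push out of a bare OP_RETURN scriptPubKey like the one above. It only handles a single direct push, nothing fancier:

Code:
# Rough sketch (Python 3, standard library only): decode the data push from a
# bare OP_RETURN scriptPubKey. Only handles a single direct push (1-75 bytes),
# not OP_PUSHDATA1/2/4.
OP_RETURN = 0x6a

def decode_op_return(script_hex):
    script = bytes.fromhex(script_hex)
    if not script or script[0] != OP_RETURN:
        raise ValueError("not an OP_RETURN script")
    if len(script) == 1:
        return b""                      # bare OP_RETURN, no payload
    push_len = script[1]
    if push_len > 75:
        raise ValueError("OP_PUSHDATA prefixes not handled in this sketch")
    data = script[2:2 + push_len]
    if len(data) != push_len:
        raise ValueError("push length does not match the remaining bytes")
    return data

# scriptPubKey of the single output in the transaction above
script_hex = ("6a2454657374696e67204232344364354a5530706d394f37"
              "70787277376e2773204775696465")
print(decode_op_return(script_hex))     # b"Testing B24Cd5JU0pm9O7pxrw7n's Guide"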

Thank you all for your time!

gmaxwell
February 23, 2014, 08:51:35 AM
#2

Eligius won't mine transactions which appear to be abusing the system to store data.

Data in an OP_RETURN should be a hash in any case. Bitcoin is not a distributed data storage system, and the cost and risk associated with the data are an externality which you are not paying for.
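For what it's worth, a minimal sketch (my own illustration, not part of the post above) of what that looks like in practice: commit to the document by embedding only its 32-byte SHA-256 hash in the OP_RETURN script, instead of the raw text.

Code:
# Minimal illustration: put only a 32-byte SHA-256 hash of the document into
# the OP_RETURN scriptPubKey, rather than the raw text itself.
import hashlib

def op_return_commitment_script(document):
    digest = hashlib.sha256(document).digest()      # 32-byte commitment
    # OP_RETURN (0x6a) followed by a direct push of the 32-byte digest
    return bytes([0x6a, len(digest)]) + digest

script = op_return_commitment_script(b"Testing B24Cd5JU0pm9O7pxrw7n's Guide")
print(script.hex())
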
Peter Todd
February 23, 2014, 09:18:17 AM
#3

You don't need OP_RETURN to store data, particularly hashes. The idea behind it was to encourage people to store data in the least harmful way possible, in outputs that can be obviously pruned; like it or not, putting data in the blockchain is useful, so people will do it, and there's no good way to stop them from doing so. Of course, once data is prunable, it's no different from any other transaction in terms of impact on the system - you're still paying fees on exactly the same basis as for any other transaction.

If you're planning on doing more than just playing around, what Mastercoin does to embed data is reasonably cheap and censorship resistant, and will become more so in the future: https://github.com/mastercoin-MSC/spec#appendix-a--storing-mastercoin-data-in-the-blockchain

I'll add to that spec sooner or later with the P2SH scriptSig method of encoding; censoring that would require fairly invasive changes to the scripting language or blacklists. (fwiw I'm Mastercoin's Chief Scientist)
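To give a flavour of that kind of encoding, here is a deliberately simplified sketch of my own, NOT the actual Mastercoin scheme described in the linked spec (the real one also obfuscates the data and adjusts the fake keys): data packed into the "public key" slots of a 1-of-2 bare multisig output, with one real key kept so the output remains spendable and therefore prunable.

Code:
# Deliberately simplified sketch, NOT the actual Mastercoin encoding (see the
# spec linked above). Packs data into the "public key" slots of a 1-of-2 bare
# multisig scriptPubKey; one real key is kept so the output can be spent and
# therefore pruned from the UTXO set later.
OP_1, OP_2, OP_CHECKMULTISIG = 0x51, 0x52, 0xae

def fake_multisig_data_script(real_pubkey, data):
    assert len(real_pubkey) == 33, "expects a compressed public key"
    assert len(data) <= 32, "this sketch packs at most 32 bytes of data"
    # Fake "key": 0x02 prefix plus the data padded out to 32 bytes.
    fake_key = bytes([0x02]) + data.ljust(32, b"\x00")
    script = bytes([OP_1])
    for key in (real_pubkey, fake_key):
        script += bytes([len(key)]) + key        # direct push of each 33-byte key
    return script + bytes([OP_2, OP_CHECKMULTISIG])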

luv2drnkbr (OP)
February 23, 2014, 04:40:13 PM
Last edit: February 23, 2014, 04:51:23 PM by luv2drnkbr
#4

So the transaction itself is without errors then, correct?  And I just can't push it anywhere because everybody is trying to (rightly so) prevent me from clogging up the blockchain?

This is the first manually created transaction I've made, so mostly I just want to know I've done it correctly.

UPDATE:  I was about to double spend it, but it looks like somebody did actually put it in a block.  Did my tx broadcast even with the error -22 code or is it more likely somebody saw it here and broadcast it for me?

UPDATE 2:  Wow, ok, nevermind, I've answered my own question.  That tx that got into a block is actually NOT the one I posted.  It's an earlier version I made with a lower sequence number specifically so I could double-spend it easily if there were problems.  But that version went through.  So I know it must have been broadcast, because I hadn't posted that anywhere else.

Peter Todd
February 24, 2014, 12:29:37 AM
#5

Quote from: luv2drnkbr
So the transaction itself is without errors then, correct?  And I just can't push it anywhere because everybody is trying to (rightly so) prevent me from clogging up the blockchain?

This is the first manually created transaction I've made, so mostly I just want to know I've done it correctly.

UPDATE:  I was about to double spend it, but it looks like somebody did actually put it in a block.  Did my tx broadcast even with the error -22 code or is it more likely somebody saw it here and broadcast it for me?

UPDATE 2:  Wow, ok, nevermind, I've answered my own question.  That tx that got into a block is actually NOT the one I posted.  It's an earlier version I made with a lower sequence number specifically so I could double-spend it easily if there were problems.  But that version went through.  So I know it must have been broadcast, because I hadn't posted that anywhere else.

Ah, you're quite mistaken on a few levels. First of all, nSequence-based transaction replacement was removed from Bitcoin literally years ago because it's insecure and makes Bitcoin vulnerable to DoS attacks. The wiki pages that mention it shouldn't, but unfortunately we've got a developer who's still hung up on the idea so...

Secondly, I think the -22 error code you saw just means your local bitcoind instance doesn't support OP_RETURN; have you upgraded it to either the 0.9 release candidate, or the latest from git?

Looks like Eligius was the pool that mined that tx, which makes me suspect Luke still hasn't actually disabled OP_RETURN transactions...

Anyway, like I said, if you have a use for OP_RETURN, just make your application use it and have other data-encoding mechanisms as a fallback. The idea of "clogging up the blockchain" is nonsense.
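For anyone else hitting the same wall, a rough sketch (mine, Python 3 standard library; the rpcuser/rpcpassword values are placeholders for your own bitcoind config) of pushing a raw transaction over bitcoind's JSON-RPC interface, so you can see the actual error object rather than a bare -22:

Code:
# Rough sketch: submit a raw transaction via a local bitcoind's JSON-RPC
# "sendrawtransaction" call and print the error object (code + message) if it
# is rejected. RPC_USER / RPC_PASSWORD are placeholders for your own config.
import json
import urllib.error
import urllib.request

RPC_URL = "http://127.0.0.1:8332"
RPC_USER = "rpcuser"          # placeholder
RPC_PASSWORD = "rpcpassword"  # placeholder

def send_raw_transaction(raw_hex):
    payload = json.dumps({"jsonrpc": "1.0", "id": "pushtx",
                          "method": "sendrawtransaction",
                          "params": [raw_hex]}).encode()
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(None, RPC_URL, RPC_USER, RPC_PASSWORD)
    opener = urllib.request.build_opener(
        urllib.request.HTTPBasicAuthHandler(password_mgr))
    request = urllib.request.Request(
        RPC_URL, data=payload, headers={"Content-Type": "application/json"})
    try:
        with opener.open(request) as response:
            reply = json.loads(response.read())
    except urllib.error.HTTPError as exc:
        reply = json.loads(exc.read())  # bitcoind puts the JSON error in the body
    if reply.get("error"):
        print("rejected:", reply["error"])   # e.g. {'code': -22, 'message': ...}
    else:
        print("accepted, txid:", reply["result"])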

luv2drnkbr (OP)
February 24, 2014, 12:38:08 AM
#6

Thank you very much Mr. Todd for all of your help and information!

d'aniel
February 24, 2014, 01:58:33 AM
#7

Quote from: Peter Todd
You don't need OP_RETURN to store data, particularly hashes. The idea behind it was to encourage people to store data in the least harmful way possible, in outputs that can be obviously pruned; like it or not, putting data in the blockchain is useful, so people will do it, and there's no good way to stop them from doing so.
As you pointed out in a recent post*, P2SH^2 on the scriptPubKey side and MASTs on the scriptSig side would work to keep the blockchain from becoming a garbage dump.  Since both of these changes would be very good for both scalability and privacy, I don't understand why you were so quick to write them off as unlikely.
Quote
Of course, once data is prunable, it's no different than any other transaction in terms of impact on the system - you're still paying fees on the exact same basis as doing any other transaction.
The problem is that non-Bitcoin applications compete for scarce block space, driving up the cost of transactions for Bitcoin users.  As you know, this cost is either felt in the form of fees, or in increased centralization if the block size is simply raised to accommodate them.  I don't think it's unreasonable to say that Bitcoin users ought not to be burdened by arbitrary demands on their blockchain.

You mentioned it being better to just focus on the root of the problem - fundamental scalability - but the potential solutions I've seen proposed by you seem like very radical changes (far more so than P2SH^2 or MASTs), and the conservative approach appears to be to protect the blockchain from abuse to the greatest extent possible at least until a particular scalability upgrade has been 1) implemented, 2) well-tested, and 3) widely agreed upon.

Regarding OP_RETURN though, I agree that it's the proper way of doing damage control (though I worry that it sends the wrong signal to users that this usage is acceptable, and not abusive, and will encourage more of it), but only while P2SH^2 and MASTs are not yet implemented.


*
My advice for new projects is to support multiple encoding methods, the same way Mastercoin did, so you aren't dependent on the Bitcoin devs. Incidentally there's no practical way to stop all those methods - even P2SH^2, itself a very invasive change to the ecosystem which is unlikely to happen, can't stop encoding data in P2SH scriptSigs without merkleized abstract syntax tree support and risky changes to the scripting language... and that in turn carries the big risk of making future upgrades, perhaps needed because a crypto algorithm has been weakened, much more difficult to implement.

Of course, that's why I'm spending my time working on actually improving fundamental scalability rather than wasting time trying to tell people what to do with a trust-free decentralized system...
Peter Todd
February 24, 2014, 03:35:34 AM
#8

Quote from: d'aniel
You don't need OP_RETURN to store data, particularly hashes. The idea behind it was to encourage people to store data in the least harmful way possible, in outputs that can be obviously pruned; like it or not, putting data in the blockchain is useful, so people will do it, and there's no good way to stop them from doing so.
As you pointed out in a recent post*, P2SH^2 on the scriptPubKey side and MASTs on the scriptSig side would work to keep the blockchain from becoming a garbage dump.  Since both of these changes would be very good for both scalability and privacy, I don't understand why you were so quick to write them off as unlikely.

"Garbage dumbp"

Please, don't write off useful applications for proof-of-publication so quickly. Heck, just the other day I realized how it could be used to create decentralized digital asset exchanges with cryptographically verifiable pricing that was inherently resistant to market manipulation:  http://www.mail-archive.com/bitcoin-development@lists.sourceforge.net/msg03892.html

Quote
Of course, once data is prunable, it's no different than any other transaction in terms of impact on the system - you're still paying fees on the exact same basis as doing any other transaction.
The problem is that non-Bitcoin applications compete for scarce block space, driving up the cost of transactions for Bitcoin users.  As you know, this cost is either felt in the form of fees, or increased centralization if the block size is simply raised to accommodate.  I don't think it's unreasonable to say that Bitcoin users ought not be burdened by arbitrary demands of its blockchain.

What is or isn't a "Bitcoin application" is open to debate. I certainly think decentralized finance has the potential to be valuable, and it will be quite concretely based on Bitcoin as a medium of exchange in actual applications. More to the point, if I wish to, say, use a multisignature protected wallet, I'm creating transactions that take up significantly more blockchain space than those that don't. Of course, I pay for that space in fees, driving up costs for all Bitcoin users... So am I wrong for doing that?

Quote from: d'aniel
You mentioned it being better to just focus on the root of the problem - fundamental scalability - but the potential solutions I've seen proposed by you seem like very radical changes (far more so than P2SH^2 or MASTs), and the conservative approach appears to be to protect the blockchain from abuse to the greatest extent possible at least until a particular scalability upgrade has been 1) implemented, 2) well-tested, and 3) widely agreed upon.

Regarding OP_RETURN though, I agree that it's the proper way of doing damage control (though I worry that it sends the wrong signal to users that this usage is acceptable, and not abusive, and will encourage more of it), but only while P2SH^2 and MASTs are not yet implemented.

My worry is if we don't stick to a strict market-based allocation of proof-of-publication security we're going to see an acceptance in this community of rationing it based on perceived value. Frankly, this is much like net neutrality debates - I believe in applying freedom of choice and non-judgemental market principles rather than having a small group of devs and pool operators decide what is or isn't the valid use of the Bitcoin blockchain.

As for my advice, again, I know that P2SH^2 and MASTs are quite hard to deploy, which is why my advice to users who think they have a use-case for data-in-the-blockchain is to go ahead and do so now. They'll have a few years to show that their applications are in fact valuable, at which time a majority of hashing power may decide that implementing anti-data-in-chain techniques would be a mistake. Also, it increasingly looks like the scalability stuff I'm working on can be done transparently in a backwards-compatible manner - actually implementing that stuff may prove to be less of a disruption than forcing all Bitcoin software to use new address types.

d'aniel
February 24, 2014, 08:04:17 AM
#9

"Garbage dumbp"

Please, don't write off useful applications for proof-of-publication so quickly.
I don't mean to; there are definitely interesting things that can be done.  I just question whether it's necessary to do them on the Bitcoin blockchain.  A truly symbiotic application seems like it could work perfectly well using a separate merged mined chain, and atomic cross chain trading.  (And if the application is competing with Bitcoin, then all the more reason to keep it the hell off the Bitcoin blockchain, if possible.)

Quote
What is or isn't a "Bitcoin application" is open to debate. I certainly think decentralized finance has the potential to be valuable, and it will be quite concretely based on Bitcoin as a medium of exchange in actual applications. More to the point, if I wish to, say, use a multisignature protected wallet, I'm creating transactions that take up significantly more blockchain space than those that don't. Of course, I pay for that space in fees, driving up costs for all Bitcoin users... So am I wrong for doing that?
Obviously pretty much nobody would object to that example.  Yet consensus seems to be that using the blockchain for completely arbitrary data dumps is a bad idea.  So yes, the line is blurry, but that doesn't mean it doesn't exist or can safely be ignored.

Quote
My worry is if we don't stick to a strict market-based allocation of proof-of-publication security we're going to see an acceptance in this community of rationing it based on perceived value. Frankly, this is much like net neutrality debates - I believe in applying freedom of choice and non-judgemental market principles rather than having a small group of devs and pool operators decide what is or isn't the valid use of the Bitcoin blockchain.
This implies a rejection of any defined purpose for the blockchain: "it exists for whatever the highest bidders want to use it for".  For example, crowded-out users who just want to transfer bitcoins back and forth would IMHO be rightly upset by this kind of transformation.

Quote
Also, it increasingly looks like the scalability stuff I'm working on can be done transparently in a backwards compatible manner - actually implementing that stuff may prove to be less of a disruption than forcing all Bitcoin software to use new address types.
I hope so!  Better fundamental scalability is certainly the ideal solution.  Since I like to keep up-to-date on these topics, what is your current favoured way to go here?  Just the name of the proposal is fine, I can search through the chat logs/forums to get the details.  Thanks.
Peter Todd
February 24, 2014, 08:35:38 AM
#10

Quote from: d'aniel
I don't mean to; there are definitely interesting things that can be done.  I just question whether it's necessary to do them on the Bitcoin blockchain.  A truly symbiotic application seems like it could work perfectly well using a separate merged mined chain, and atomic cross chain trading.  (And if the application is competing with Bitcoin, then all the more reason to keep it the hell off the Bitcoin blockchain, if possible.)

If you think merge-mined chains represent proof-of-publication in a world of large pools, you misunderstand what the idea is. Fundamentally merge-mining is insecure without the participation of at least a very large fraction of the mining hashing power, which negates the scalability argument for merge-mining.

Quote from: d'aniel
Obviously pretty much nobody would object to that example.  Yet consensus seems to be that using the blockchain for completely arbitrary data dumps is a bad idea.  So yes the line is blurry, but that doesn't mean it doesn't exist or is safely ignored.

Even if there is consensus about that (and I myself am a solid counter-example), there isn't consensus about what exactly counts as arbitrary data.

Quote from: d'aniel
This implies a rejection of any defined purpose for the blockchain: "it exists for whatever the highest bidders want to use it for".  For example, crowded out users that just want to transfer bitcoins back and forth would IMHO be rightly upset by this kind of transformation.

Equally, those who have been putting a lot of effort into decentralized finance will be rightly upset by this kind of transformation. Which is why my advice to them is don't get sucked into the hype - blocking data in the blockchain is non-trivial and their applications don't have much to worry about. Furthermore, blocking proof-of-publication/timestamping for hashes is outright impossible, and for many of these applications (e.g. colored coins) hashes are all you need. In fact, with some cleverness I suspect I could make the entire Mastercoin and Counterparty protocols purely hash-based and unblockable by generic mechanisms; I should put some thought into this...

Quote from: d'aniel
I hope so!  Better fundamental scalability is certainly the ideal solution.  Since I like to keep up-to-date on these topics, what is your current favoured way to go here?  Just the name of the proposal is fine, I can search through the chat logs/forums to get the details.  Thanks.

#bitcoin-wizards is where this has been discussed. I'm working on a more formal paper for tree chains as well. I prefer not to discuss ideas here - much better to discuss them on open media like email lists that are archived by multiple entities.

d'aniel
February 24, 2014, 10:13:59 AM
#11

Quote from: Peter Todd
If you think merge-mined chains represent proof-of-publication in a world of large pools, you misunderstand what the idea is. Fundamentally merge-mining is insecure without the participation of at least a very large fraction of the mining hashing power, which negates the scalability argument for merge-mining.
Don't forget about all the non-mining full nodes that would avoid having to carry the extra burden.  Miners are getting financially rewarded for it, so it's much less of a problem for them.  And hashing costs dominate up to pretty large scale anyway.

Quote
In fact, with some cleverness I suspect I could make the entire Mastercoin and Counterparty protocols be purely hash based and unblockable by generic mechanisms; I should put some thought into this...)
Well if the system can be hijacked, then I'm sure some clever people will do it, and this will make all the previous talk about ethical blockchain usage moot.

Quote
#bitcoin-wizards is where this has been discussed. Working on a more formal paper for tree chains as well. I prefer not to discuss ideas here - much better to discuss ideas on open mediums like email lists that are archived by multiple entities.
Thanks, I saw the tree chain discussion, but haven't read through it carefully yet.  Are you not interested in the MMR TXO commitments idea anymore?  Looking forward to the paper, I hope you finish it before you begin working on the blockchain hijacking mechanism ;)
Peter Todd
February 24, 2014, 02:40:01 PM
#12

Quote from: d'aniel
If you think merge-mined chains represent proof-of-publication in a world of large pools, you misunderstand what the idea is. Fundamentally merge-mining is insecure without the participation of at least a very large fraction of the mining hashing power, which negates the scalability argument for merge-mining.
Don't forget about all the non-mining full nodes that would avoid having to carry the extra burden.  Miners are getting financially rewarded for it, so it's much less of a problem for them.  And hashing costs dominate up to pretty large scale anyway.

That's the problem! If miners are the only ones with the data, then it's not proof of publication. Of course, this shows a flaw in Bitcoin, but at least for now we've been able to paper over that flaw...

Quote
In fact, with some cleverness I suspect I could make the entire Mastercoin and Counterparty protocols be purely hash based and unblockable by generic mechanisms; I should put some thought into this...)
Well if the system can be hijacked, then I'm sure some clever people will do it, and this will make all the previous talk about ethical blockchain usage moot.

Exactly - I know it can and will be hijacked.

Quote
#bitcoin-wizards is where this has been discussed. Working on a more formal paper for tree chains as well. I prefer not to discuss ideas here - much better to discuss ideas on open mediums like email lists that are archived by multiple entities.
Thanks, I saw the tree chain discussion, but haven't read through it carefully yet.  Are you not interested in the MMR TXO commitments idea anymore?  Looking forward to the paper, I hope you finish it before you begin working on the blockchain hijacking mechanism Wink

TXO commitments are just a small part of solving scalability - they only help with long-term storage and actually make bandwidth scalability significantly worse. They do appear to be a good approach to blockchain sharding - tree-chains makes use of them - but the people who have been claiming they represent some scaling breakthrough misunderstand the technology. Incidentally, I'm inclined to drop the "MMR" from the approach - for TXOs, as opposed to timestamping, just using a variable-height sparse merkle tree indexed by insertion order is simpler and results in pretty similar proof lengths as with MMRs.

As for what I'm working on... everything at once, as usual. :) All this tech interrelates - I probably wouldn't have thought of the proof-of-publication theory if I hadn't been thinking about Mastercoin at the time.
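For readers trying to follow the "indexed by insertion order" part, here is a toy sketch of my own (nothing like the real proposals, which have to handle partial levels, updates, and spends much more carefully): an append-only merkle tree over TXOs keyed by insertion position, with a simple membership proof.

Code:
# Toy sketch (my own illustration, not Peter Todd's design): an append-only
# binary merkle tree over TXOs indexed by insertion order, with a simple
# membership proof. Real proposals (MMRs, sparse merkle trees) differ in how
# they handle partial levels and updates; this just shows the indexing idea.
import hashlib

def h(x):
    return hashlib.sha256(x).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:              # duplicate last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    # Sibling hashes from the leaf at `index` (insertion order) up to the root.
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

txos = [b"txo-%d" % i for i in range(5)]
root = merkle_root(txos)
assert verify(txos[3], merkle_proof(txos, 3), root)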

justusranvier
February 24, 2014, 05:15:05 PM
#13

Quote from: d'aniel
Don't forget about all the non-mining full nodes that would avoid having to carry the extra burden.  Miners are getting financially rewarded for it, so it's much less of a problem for them.
The thing about Bitcoin is that it's in no way tied to a particular P2P network.

Back in ancient times, there was a P2P filesharing network called "Mojo Nation" that attempted to use market based resource allocation for bandwidth, storage, and content indexing.

It didn't really work out for several reasons, but it's probably worth dusting off, fixing its price discovery problems, and repurposing for Bitcoin.
d'aniel
February 24, 2014, 08:20:30 PM
Last edit: February 24, 2014, 08:32:53 PM by d'aniel
#14

Quote from: Peter Todd
If you think merge-mined chains represent proof-of-publication in a world of large pools, you misunderstand what the idea is. Fundamentally merge-mining is insecure without the participation of at least a very large fraction of the mining hashing power, which negates the scalability argument for merge-mining.
Don't forget about all the non-mining full nodes that would avoid having to carry the extra burden.  Miners are getting financially rewarded for it, so it's much less of a problem for them.  And hashing costs dominate up to pretty large scale anyway.

That's the problem! If miners are the only ones with the data, then it's not proof of publication. Of course, this shows a flaw in Bitcoin, but at least for now we've been able to paper over that flaw...
If anything, not being forced to curate the "garbage dump" in addition to the Bitcoin blockchain would enable more people to run Bitcoin full nodes.  I understand that a lot of participants are unwilling/unable to fully verify, but hijacking the Bitcoin blockchain for a zillion other often unrelated and often unwelcome applications just makes the problem worse.

I get your point though about parasitic users being incentivised to be, well, parasitic.

Also, even if there is ultimately no technical means to prevent blockchain hijacking, this doesn't mean social pressures can't work to some useful degree.  Well-respected developers being vocal about it at the very least gives less abusive alternatives a PR advantage, or can correct the most egregious abuses - like e.g. Mastercoin's initial use of non-prunable outputs.

Quote from: Peter Todd
TXO commitments are just a small part of solving scalability - they only help with long-term storage and actually make bandwidth scalability significantly worse. They do appear to be a good approach to blockchain sharding - tree-chains makes use of them - but the people who have been claiming they represent some scaling breakthrough misunderstand the technology.
Right, I calculated a little while back something like a ~7 fold increase in bandwidth (for authenticated prefix trees), but my understanding was that they are useful because they enable partial verification/fraud discovery and distributed block construction, which spreads the increased bandwidth load over a much larger number of participants.  Is this a misunderstanding?  Or just an overestimation of the number of extra participants?

Quote from: justusranvier
Back in ancient times, there was a P2P filesharing network called "Mojo Nation" that attempted to use market based resource allocation for bandwidth, storage, and content indexing.

It didn't really work out for several reasons, but it's probably worth dusting off, fixing its price discovery problems, and repurposing for Bitcoin.
I guess I'm ancient, as I'm actually aware of Mojo Nation :)  Zooko et al. were talking about implementing accounting a while ago for their more modest successor project, Tahoe-LAFS, though I'm not sure if they're still planning to do that.
justusranvier
February 24, 2014, 08:40:50 PM
#15

Quote from: d'aniel
I guess I'm ancient, as I'm actually aware of Mojo Nation :)  Zooko et al. were talking about implementing accounting a while ago for their more modest successor project, Tahoe-LAFS, though I'm not sure if they're still planning to do that.
The subject came up on this forum a couple years ago, although it was in the context of generic file storage: https://bitcointalk.org/index.php?topic=86384.0;all

I remember emailing Zooko about it back then, but nothing really came of it.

Maybe it's better to look at a rebooted MojoNation as a replacement for Bitcoin's P2P layer which might incidentally turn out to be useful for generic storage too (maybe for all that extra data people want to stuff in the blockchain?)
Peter Todd
February 25, 2014, 02:18:28 AM
#16

Quote from: d'aniel
If anything, not being forced to curate the "garbage dump" in addition to the Bitcoin blockchain would enable more people to run Bitcoin full nodes.  I understand that a lot of participants are unwilling/unable to fully verify, but hijacking the Bitcoin blockchain for a zillion other often unrelated and often unwelcome applications just makes the problem worse.

But again, in the merge-mining for proof-of-publication case, unless mining hashing power is more decentralized the merge-mining is not proof of anything, resulting in insecure applications.

Quote from: d'aniel
I get your point though about parasitic users being incentivised to be, well, parasitic.

Also, even if there is ultimately no technical means to prevent blockchain hijacking, this doesn't mean social pressures can't work to some useful degree.  Well-respected developers being vocal about it at the very least gives less abusive alternatives a PR advantage, or can correct the most egregious abuses - like e.g. Mastercoin's initial use of non-prunable outputs.

You realize that the devs involved appear to have managed to lose that social respect overnight by making an about-face on the idea of OP_RETURN? You may not think so within this small community, but within the community of people making use of data- and metadata-carrying transactions, they have. Just a week or two ago I was arguing with someone - a paying client - that the OP_RETURN thing was certainly going to be released and that they should go to the effort of using it for their application. Well, it looks like I was wrong, and when I talked to them again earlier today their plan is to stick with what they had written initially and use UTXO-bloating outputs.

Quote from: d'aniel
TXO commitments are just a small part of solving scalability - they only help with long-term storage and actually make bandwidth scalability significantly worse. They do appear to be a good approach to blockchain sharding - tree-chains makes use of them - but the people who have been claiming they represent some scaling breakthrough misunderstand the technology.
Right, I calculated a little while back something like a ~7 fold increase in bandwidth (for authenticated prefix trees), but my understanding was that they are useful because they enable partial verification/fraud discovery and distributed block construction, which spreads the increased bandwidth load over a much larger number of participants.  Is this a misunderstanding?  Or just an overestimation of the number of extra participants?

Yup - they don't make distributed block construction possible by themselves. Which is a really serious problem actually, as what could happen is that they just act to paper over the scalability problem and leave us in a situation where a larger block size or similar looks scalable, further centralizing control of mining. I gotta admit, I was a bit reluctant to publish the idea initially for that reason, but decided it was a strict improvement on UTXO commitments so went ahead.
