IShishkin
Member

Offline
Activity: 87
Merit: 44
January 24, 2025, 09:42:50 AM
Quote:
Block size, for example, needs to be changed, because otherwise Bitcoin cannot fulfill its objectives and remain scalable.

Quote:
Why? You don't need bigger blocks to confirm more transactions. You need more non-interactive cut-through instead, where an "Alice -> Bob" transaction and a "Bob -> Charlie" transaction are combined by miners and confirmed as a single joined "Alice -> Charlie" transaction.

Don't you think this combined transaction will occupy a comparable amount of block space? It might have the same computational and network data cost. You still need to propagate and verify the transaction signed by Bob. It's a hidden cost.
ABCbits
Legendary
Offline
Activity: 3332
Merit: 8987
January 24, 2025, 09:54:44 AM
Quote from: IShishkin
Don't you think this combined transaction will occupy a comparable amount of block space? It might have the same computational and network data cost. You still need to propagate and verify the transaction signed by Bob. It's a hidden cost.

Cut-through actually merges multiple transactions into one bigger transaction, and that leads to a smaller overall size. The computation and network usage is a bit higher, but negligible on today's computers and servers.
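For readers new to the idea, here is a minimal, purely illustrative sketch (hypothetical data structures, not Bitcoin's real transaction format or any existing API) of what cut-through does: when one transaction spends an output of another, the intermediate output and the input consuming it cancel out, and only the net inputs and outputs remain.

Code:
# Hypothetical sketch of transaction cut-through (not real Bitcoin code).
from dataclasses import dataclass

@dataclass(frozen=True)
class Output:
    owner: str      # simplified stand-in for a scriptPubKey
    amount: int     # satoshis

@dataclass
class Tx:
    inputs: list    # Output objects being spent (stand-in for outpoints)
    outputs: list   # Output objects being created

def cut_through(tx_a: Tx, tx_b: Tx) -> Tx:
    """Merge tx_b into tx_a, cancelling outputs of tx_a that tx_b spends."""
    intermediate = [o for o in tx_a.outputs if o in tx_b.inputs]
    merged_inputs = tx_a.inputs + [i for i in tx_b.inputs if i not in intermediate]
    merged_outputs = [o for o in tx_a.outputs if o not in intermediate] + tx_b.outputs
    return Tx(merged_inputs, merged_outputs)

# Alice -> Bob, then Bob -> Charlie:
alice_coin = Output("Alice", 100_000_000)
tx1 = Tx([alice_coin], [Output("Bob", 50_000_000), Output("Alice", 49_999_000)])
tx2 = Tx([Output("Bob", 50_000_000)], [Output("Charlie", 10_000_000), Output("Bob", 39_999_000)])

merged = cut_through(tx1, tx2)
print(merged.inputs)   # only Alice's coin: Bob's intermediate output is gone
print(merged.outputs)  # Alice 0.49999 (change), Charlie 0.1, Bob 0.39999 (change)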
stwenhao
Quote from: IShishkin
Don't you think this combined transaction will occupy a comparable amount of block space?

No. Two separate transactions take more space than a single combined transaction, when you cut some data in the middle and prove in the signatures that it was done correctly. Amounts can be combined as well, so miners have an incentive to join transactions: they collect the same fees, but use less block space than before joining. More than that: if you can join any two matching transactions, then you can join N transactions as well, and then the only limits are things like the maximum standard transaction size.

Quote from: IShishkin
It might have the same computational and network data cost.

There are more computations needed, but they are temporary. Once the joined transactions are deeply confirmed, new nodes won't need that data during Initial Blockchain Download.

Quote from: IShishkin
You still need to propagate and verify the transaction signed by Bob.

Only as long as this transaction is unconfirmed. Once it is batched into other transactions, Bob can just keep an SPV-like proof, and every other node can forget about it.

Quote from: IShishkin
It's a hidden cost.

What do you think is better: a higher cost while mining recent blocks and a lower cost during Initial Blockchain Download, or the opposite, where each and every full node has to verify non-batched transactions over and over again? Very often you have non-batched transactions within the scope of the same block. That means the final outcome of the block is "consumed N inputs and created M outputs", but it is not expressed in the simplest way.
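To make the fee incentive concrete, here is a small arithmetic sketch using the vsizes and the 1000-satoshi fees from the regtest example that appears later in this thread (two 1-input/2-output P2WPKH transactions of about 140-141 vbytes versus one batched 1-input/3-output transaction of about 172 vbytes).

Code:
# Fee-rate arithmetic for batching, using the vsizes from the regtest example below.
separate_vsizes = [140, 141]     # vbytes (Alice->Bob, Bob->Charlie)
batched_vsize = 172              # vbytes (Alice->Charlie + Bob change + Alice change)
fee_per_tx = 1000                # satoshis, as assumed in the example

total_fee = fee_per_tx * len(separate_vsizes)     # miner collects the same 2000 sats
separate_rate = total_fee / sum(separate_vsizes)  # ~7.1 sat/vB over 281 vbytes
batched_rate = total_fee / batched_vsize          # ~11.6 sat/vB over 172 vbytes

print(f"separate: {sum(separate_vsizes)} vB at {separate_rate:.1f} sat/vB")
print(f"batched:  {batched_vsize} vB at {batched_rate:.1f} sat/vB "
      f"({sum(separate_vsizes) - batched_vsize} vB of block space freed)")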
IShishkin
Member

Offline
Activity: 87
Merit: 44
Quote from: stwenhao
No. Two separate transactions take more space than a single combined transaction, when you cut some data in the middle and prove in the signatures that it was done correctly. ~

1) Are you talking about the UTXO model?
2) It would be awesome if you supported your claims with proper mathematical calculations.
3) Don't forget about network data propagation delays.
4) Remember that blockchain security relies on full nodes that verify everything in full against all protocol rules. Full nodes never rely on any SPV-like proofs. There is no compromise here on what is better; there are no "options" to think about.
5) Finally: could you estimate what percentage of transactions occur in chains like "Alice -> Bob -> Charlie" within short time intervals, compared to stand-alone "Alice -> Bob" transactions? If transactions that could be batched are very rare, how can you get any non-marginal efficiency improvement here? (A rough way to estimate this from a node's mempool is sketched below.)
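Point 5 can at least be approximated from live mempool data: Bitcoin Core's getrawmempool RPC in verbose mode reports, for every unconfirmed transaction, which other unconfirmed transactions it depends on, so counting entries with a non-empty "depends" list gives a rough lower bound on how many transactions currently spend unconfirmed parents, i.e. candidates for this kind of batching. A sketch, assuming a local node with bitcoin-cli on the PATH:

Code:
# Rough estimate of how many mempool transactions spend unconfirmed parents,
# i.e. chains like "Alice -> Bob -> Charlie" that a cut-through scheme could batch.
import json
import subprocess

raw = subprocess.run(
    ["bitcoin-cli", "getrawmempool", "true"],   # verbose=true includes "depends"
    capture_output=True, text=True, check=True
).stdout
mempool = json.loads(raw)

total = len(mempool)
chained = sum(1 for entry in mempool.values() if entry["depends"])

print(f"{total} mempool transactions, "
      f"{chained} ({100 * chained / max(total, 1):.1f}%) spend an unconfirmed parent")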
IShishkin
Member

Offline
Activity: 87
Merit: 44
January 24, 2025, 11:24:24 AM
Quote from: ABCbits
Cut-through actually merges multiple transactions into one bigger transaction, and that leads to a smaller overall size. The computation and network usage is a bit higher, but negligible on today's computers and servers.

If network usage is negligible, then why do we have a block size problem?
NotATether (OP)
Legendary
Offline
Activity: 2058
Merit: 8803
Search? Try talksearch.io
January 24, 2025, 11:51:43 AM
Quote from: IShishkin
1) Are you talking about the UTXO model?
2) It would be awesome if you supported your claims with proper mathematical calculations.
~
For transactions at least, you can save an input field by joining them together (see https://en.bitcoin.it/wiki/Transaction); you save at least 40 bytes per output in the combined transaction. That's assuming, of course, that you were otherwise going to make a bunch of 1-input -> 1-output transactions.
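A quick back-of-the-envelope check of that figure, assuming P2WPKH inputs and outputs and a typical 72-byte signature (real signatures vary between roughly 70 and 72 bytes); this is only a sketch of the serialization sizes, not exact for every script type.

Code:
# Back-of-the-envelope serialized sizes (bytes) for P2WPKH transaction pieces,
# to sanity-check the "at least 40 bytes saved" figure.
OUTPUT = 8 + 1 + 22            # value + script-length varint + OP_0 <20-byte hash> = 31
INPUT_BASE = 32 + 4 + 1 + 4    # prev txid + vout + empty scriptSig len + sequence = 41
WITNESS = 1 + 1 + 72 + 1 + 33  # item count + sig push + sig + key push + key = 108 (counts as bytes/4 vbytes)

# Cutting Bob out removes one output from the first tx and one input from the second:
saved_vbytes = OUTPUT + INPUT_BASE + WITNESS // 4
print(f"roughly {saved_vbytes} vbytes saved per cut-through hop")  # ~99 vbytes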
stwenhao
January 24, 2025, 12:36:59 PM
Quote from: IShishkin
1) Are you talking about the UTXO model?

Yes. In general, I think that this:

Code:
+----------------------------------------------+
| Alice 1.00000000 BTC -> Bob   0.50000000 BTC |
|                         Alice 0.49999000 BTC |
+----------------------------------------------+
+----------------------------------------------+
| Bob 0.50000000 BTC -> Charlie 0.10000000 BTC |
|                       Bob     0.39999000 BTC |
+----------------------------------------------+

will take more on-chain space than this:

Code:
+------------------------------------------------+
| Alice 1.00000000 BTC -> Charlie 0.10000000 BTC |
|                         Bob     0.39999000 BTC |
|                         Alice   0.49999000 BTC |
+------------------------------------------------+

As you can see, the final outcome is exactly the same. The fees are identical and the amounts are identical, but the batched transaction takes less space than the two non-batched transactions, simply because in the first case you have to store Bob's data on-chain, while in the second case Bob can keep a proof locally, and nobody else needs that proof.

Quote from: IShishkin
2) It would be awesome if you supported your claims with proper mathematical calculations.

Let's assume that we use these regtest addresses:

Code:
bcrt1qtgaawjdc9ntf0gcujfu0rx25tnddh65zj7fyd2  Alice
bcrt1qgne9hgmelwfq25tv9p74hvrhpu54fp5hyhtku5  Bob
bcrt1q56n28k46acsalrah3vgqykv70s3crv8fk7k7pq  Charlie
bcrt1qhv22hzfzppuzuyfhamx8wujez3wpgjv6xvf2xr  AliceChange
bcrt1qlxyv80uq4c5mj6r6r05ue9h92n8hhx283sqhcp  BobChange

And let's assume that we pay 1000 satoshis per transaction as a fee. Then we have:

Code:
+----------------------------------------------+
| Alice 1.00000000 BTC -> Bob   0.50000000 BTC |
|                         Alice 0.49999000 BTC |
+----------------------------------------------+
| Weight: 560 WU                               |
+----------------------------------------------+

Transaction data:

Code:
decoderawtransaction 02000000000101757b4468801a1c03822502563e78f363c03678e6fc7f41dd3c9f9692e62947830000000000fdffffff0280f0fa020000000016001444f25ba379fb9205516c287d5bb0770f2954869798ecfa0200000000160014bb14ab892208782e1137eecc777259145c14499a0246304302202576b269429202db4638df0e5108924c52e93c3e39ee19a28fba899cd0f0d1f3021f5a234876825606b6597cbbccdda771d000894ce657a2c63dbb93474d3a1edd0121037d9652e4ab7eb49882c3a803ac412bd727002c717d33f33e389c6ad7014adc7a65000000
{
  "txid": "8823421a1086bd01c06220cbae230513d359e4e6c2eea5d1133c4d2dfb7402f7",
  "hash": "7e6c1fab5dc6575f0478036b03de68a24bdd8c28393e4875337595c19a1aa7f5",
  "version": 2,
  "size": 221,
  "vsize": 140,
  "weight": 560,
  "locktime": 101,
  "vin": [
    {
      "txid": "834729e692969f3cdd417ffce67836c063f3783e56022582031c1a8068447b75",
      "vout": 0,
      "scriptSig": { "asm": "", "hex": "" },
      "txinwitness": [
        "304302202576b269429202db4638df0e5108924c52e93c3e39ee19a28fba899cd0f0d1f3021f5a234876825606b6597cbbccdda771d000894ce657a2c63dbb93474d3a1edd01",
        "037d9652e4ab7eb49882c3a803ac412bd727002c717d33f33e389c6ad7014adc7a"
      ],
      "sequence": 4294967293
    }
  ],
  "vout": [
    {
      "value": 0.50000000,
      "n": 0,
      "scriptPubKey": {
        "asm": "0 44f25ba379fb9205516c287d5bb0770f29548697",
        "desc": "addr(bcrt1qgne9hgmelwfq25tv9p74hvrhpu54fp5hyhtku5)#3m4hqnkd",
        "hex": "001444f25ba379fb9205516c287d5bb0770f29548697",
        "address": "bcrt1qgne9hgmelwfq25tv9p74hvrhpu54fp5hyhtku5",
        "type": "witness_v0_keyhash"
      }
    },
    {
      "value": 0.49999000,
      "n": 1,
      "scriptPubKey": {
        "asm": "0 bb14ab892208782e1137eecc777259145c14499a",
        "desc": "addr(bcrt1qhv22hzfzppuzuyfhamx8wujez3wpgjv6xvf2xr)#e4k5ccyh",
        "hex": "0014bb14ab892208782e1137eecc777259145c14499a",
        "address": "bcrt1qhv22hzfzppuzuyfhamx8wujez3wpgjv6xvf2xr",
        "type": "witness_v0_keyhash"
      }
    }
  ]
}

Second transaction:

Code:
+----------------------------------------------+
| Bob 0.50000000 BTC -> Charlie 0.10000000 BTC |
|                       Bob     0.39999000 BTC |
+----------------------------------------------+
| Weight: 561 WU                               |
+----------------------------------------------+

Transaction data:

Code:
decoderawtransaction 02000000000101f70274fb2d4d3c13d1a5eec2e6e459d3130523aecb2062c001bd86101a4223880000000000fdffffff028096980000000000160014a6a6a3dabaee21df8fb78b1002599e7c2381b0e91856620200000000160014f988c3bf80ae29b9687a1be9cc96e554cf7b99470247304402205412c259d4dd8a783159f545509c086ea1fe85ea0a716ab98d45195fb1bc6b10022047c8a7faed2379f7d6e589301cd7e22b01c3dd642c4c9db8f0de6ee705cdb5b1012102248ba02ebf822efc4e118755efc6b2afb1e3208b071f3056e0d666a35459599465000000
{
  "txid": "53f552546ddac64d9e8cf061ef88af32e8bc180d91f546273711884d81746cf6",
  "hash": "945c80db2d8208a4388983500641fef5c4104e88526c336158335d2dce918148",
  "version": 2,
  "size": 222,
  "vsize": 141,
  "weight": 561,
  "locktime": 101,
  "vin": [
    {
      "txid": "8823421a1086bd01c06220cbae230513d359e4e6c2eea5d1133c4d2dfb7402f7",
      "vout": 0,
      "scriptSig": { "asm": "", "hex": "" },
      "txinwitness": [
        "304402205412c259d4dd8a783159f545509c086ea1fe85ea0a716ab98d45195fb1bc6b10022047c8a7faed2379f7d6e589301cd7e22b01c3dd642c4c9db8f0de6ee705cdb5b101",
        "02248ba02ebf822efc4e118755efc6b2afb1e3208b071f3056e0d666a354595994"
      ],
      "sequence": 4294967293
    }
  ],
  "vout": [
    {
      "value": 0.10000000,
      "n": 0,
      "scriptPubKey": {
        "asm": "0 a6a6a3dabaee21df8fb78b1002599e7c2381b0e9",
        "desc": "addr(bcrt1q56n28k46acsalrah3vgqykv70s3crv8fk7k7pq)#nje9902a",
        "hex": "0014a6a6a3dabaee21df8fb78b1002599e7c2381b0e9",
        "address": "bcrt1q56n28k46acsalrah3vgqykv70s3crv8fk7k7pq",
        "type": "witness_v0_keyhash"
      }
    },
    {
      "value": 0.39999000,
      "n": 1,
      "scriptPubKey": {
        "asm": "0 f988c3bf80ae29b9687a1be9cc96e554cf7b9947",
        "desc": "addr(bcrt1qlxyv80uq4c5mj6r6r05ue9h92n8hhx283sqhcp)#zyauga6q",
        "hex": "0014f988c3bf80ae29b9687a1be9cc96e554cf7b9947",
        "address": "bcrt1qlxyv80uq4c5mj6r6r05ue9h92n8hhx283sqhcp",
        "type": "witness_v0_keyhash"
      }
    }
  ]
}

Batched version:

Code:
+------------------------------------------------+
| Alice 1.00000000 BTC -> Charlie 0.10000000 BTC |
|                         Bob     0.39999000 BTC |
|                         Alice   0.49999000 BTC |
+------------------------------------------------+
| Weight: 685 WU                                 |
+------------------------------------------------+

Transaction data:

Code:
decoderawtransaction 02000000000101757b4468801a1c03822502563e78f363c03678e6fc7f41dd3c9f9692e62947830000000000fdffffff038096980000000000160014a6a6a3dabaee21df8fb78b1002599e7c2381b0e91856620200000000160014f988c3bf80ae29b9687a1be9cc96e554cf7b994798ecfa0200000000160014bb14ab892208782e1137eecc777259145c14499a0247304402206e84fee8ff1300776dfac9e506e9f3bb3bc2e2be992109a526c61d8ca4ffe9ac02204260554c76d8c637ba861b31af150faa4ba780a391ad4fe35c9dec4b348521f90121037d9652e4ab7eb49882c3a803ac412bd727002c717d33f33e389c6ad7014adc7a65000000
{
  "txid": "a7a857cddc4196baa72254fcd2b7d20416b3fbbde27ce5bd166c47886c2796b4",
  "hash": "97552eaf1b84bd0b6b06e15c1f4dd4d28d412b4f77ab8ffe827a9508e79616f4",
  "version": 2,
  "size": 253,
  "vsize": 172,
  "weight": 685,
  "locktime": 101,
  "vin": [
    {
      "txid": "834729e692969f3cdd417ffce67836c063f3783e56022582031c1a8068447b75",
      "vout": 0,
      "scriptSig": { "asm": "", "hex": "" },
      "txinwitness": [
        "304402206e84fee8ff1300776dfac9e506e9f3bb3bc2e2be992109a526c61d8ca4ffe9ac02204260554c76d8c637ba861b31af150faa4ba780a391ad4fe35c9dec4b348521f901",
        "037d9652e4ab7eb49882c3a803ac412bd727002c717d33f33e389c6ad7014adc7a"
      ],
      "sequence": 4294967293
    }
  ],
  "vout": [
    {
      "value": 0.10000000,
      "n": 0,
      "scriptPubKey": {
        "asm": "0 a6a6a3dabaee21df8fb78b1002599e7c2381b0e9",
        "desc": "addr(bcrt1q56n28k46acsalrah3vgqykv70s3crv8fk7k7pq)#nje9902a",
        "hex": "0014a6a6a3dabaee21df8fb78b1002599e7c2381b0e9",
        "address": "bcrt1q56n28k46acsalrah3vgqykv70s3crv8fk7k7pq",
        "type": "witness_v0_keyhash"
      }
    },
    {
      "value": 0.39999000,
      "n": 1,
      "scriptPubKey": {
        "asm": "0 f988c3bf80ae29b9687a1be9cc96e554cf7b9947",
        "desc": "addr(bcrt1qlxyv80uq4c5mj6r6r05ue9h92n8hhx283sqhcp)#zyauga6q",
        "hex": "0014f988c3bf80ae29b9687a1be9cc96e554cf7b9947",
        "address": "bcrt1qlxyv80uq4c5mj6r6r05ue9h92n8hhx283sqhcp",
        "type": "witness_v0_keyhash"
      }
    },
    {
      "value": 0.49999000,
      "n": 2,
      "scriptPubKey": {
        "asm": "0 bb14ab892208782e1137eecc777259145c14499a",
        "desc": "addr(bcrt1qhv22hzfzppuzuyfhamx8wujez3wpgjv6xvf2xr)#e4k5ccyh",
        "hex": "0014bb14ab892208782e1137eecc777259145c14499a",
        "address": "bcrt1qhv22hzfzppuzuyfhamx8wujez3wpgjv6xvf2xr",
        "type": "witness_v0_keyhash"
      }
    }
  ]
}

Then, instead of a total weight of 560 + 561 = 1121 weight units, you have a single transaction with a weight of 685.

Quote from: IShishkin
3) Don't forget about network data propagation delays.

You don't have to remove all the data right away. You can use a strategy similar to pruned nodes, and for example keep the full data for the last 288 blocks. But in the long-term scenario, you can remove the in-the-middle proofs, because they are not needed for Initial Blockchain Download.

Quote from: IShishkin
4) Remember that blockchain security relies on full nodes that verify everything in full against all protocol rules. Full nodes never rely on any SPV-like proofs. There is no compromise here on what is better; there are no "options" to think about.

SPV proofs are not needed to make sure that the system is honest. They are needed only to show who was inside. All signatures are valid in both versions, batched and non-batched. You need a valid ECDSA signature to move any coins anywhere, no matter what.

Quote from: IShishkin
5) Finally: could you estimate what percentage of transactions occur in chains like "Alice -> Bob -> Charlie" within short time intervals, compared to stand-alone "Alice -> Bob" transactions? If transactions that could be batched are very rare, how can you get any non-marginal efficiency improvement here?

I haven't compiled any on-chain statistics yet, but I can see that there are many unconfirmed transactions which could be batched. In the example above, with two typical one-input, two-output transactions, you can see how they can be batched into a single one-input, three-output transaction that does exactly the same thing. And note that batching just two transactions into one is the simplest case. If you batch, for example, 10 transactions, then you will save even more space, while consuming the same inputs and producing the same final outputs.
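The 560, 561 and 685 weight figures above can be reproduced from the segwit weight formula (non-witness bytes count 4 weight units each; witness bytes, including the marker and flag, count 1). Below is a small sketch for 1-input, k-output P2WPKH transactions; the signatures in the transactions above happen to be 70 and 71 bytes long.

Code:
# Reproduce the weights above for a 1-input, k-output P2WPKH transaction.
# weight = 4 * non_witness_bytes + witness_bytes; vsize = ceil(weight / 4)
from math import ceil

def p2wpkh_weight(n_outputs: int, sig_len: int = 71) -> int:
    non_witness = (4                  # version
                   + 1                # input count
                   + 32 + 4 + 1 + 4   # one input: prev txid, vout, empty scriptSig, sequence
                   + 1                # output count
                   + 31 * n_outputs   # each P2WPKH output: value (8) + script len (1) + script (22)
                   + 4)               # locktime
    witness = (2                      # segwit marker + flag
               + 1                    # witness stack item count
               + 1 + sig_len          # signature push
               + 1 + 33)              # public key push
    return 4 * non_witness + witness

print(p2wpkh_weight(2, sig_len=70))            # 560 (first transaction, 70-byte signature)
print(p2wpkh_weight(2, sig_len=71))            # 561 (second transaction)
print(p2wpkh_weight(3, sig_len=71))            # 685 (batched transaction)
print(ceil(p2wpkh_weight(3, sig_len=71) / 4))  # 172 vbytes, as reported above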
IShishkin
Member

Offline
Activity: 87
Merit: 44
January 24, 2025, 01:00:47 PM
Quote from: stwenhao
As you can see, the final outcome is exactly the same. The fees are identical and the amounts are identical, but the batched transaction takes less space than the two non-batched transactions, simply because in the first case you have to store Bob's data on-chain, while in the second case Bob can keep a proof locally, and nobody else needs that proof.

Quote from: stwenhao
Then, instead of a total weight of 560 + 561 = 1121 weight units, you have a single transaction with a weight of 685.

Quote from: stwenhao
You don't have to remove all the data right away. You can use a strategy similar to pruned nodes, and for example keep the full data for the last 288 blocks. But in the long-term scenario, you can remove the in-the-middle proofs, because they are not needed for Initial Blockchain Download.

Quote from: stwenhao
SPV proofs are not needed to make sure that the system is honest. They are needed only to show who was inside. All signatures are valid in both versions, batched and non-batched. You need a valid ECDSA signature to move any coins anywhere, no matter what.

Very well, good job. However, it means that the first transaction is not recorded on the blockchain before Bob submits the second transaction. The first transaction has to stay unconfirmed in the mempool for some time, and miners would have to be restricted from including it in the blockchain. While this transaction is unconfirmed, Bob is vulnerable to double-spend attacks. In this setting, Bob has to trust Alice. Bitcoin is already criticised for its long transaction confirmation times; in this scheme, the first transaction gets confirmed long after it was signed by Alice. I have seen businesses that try to join multiple transactions into a single one. However, it is inconvenient for their clients.
stwenhao
January 24, 2025, 01:25:02 PM
Quote from: IShishkin
In this setting, Bob has to trust Alice.

This is how it works today, where we have full-RBF without any restrictions. But if transactions are constructed with non-interactive batching in mind, then Alice won't be needed to sign a second version; instead, nodes will simply combine the two signed transactions and derive a signature for the batched version from them. So you have to trust Alice only as long as we are stuck with today's interactive version. In a non-interactive version, Alice does not sign new versions; nodes do that instead, during block template construction.

Quote from: IShishkin
However, it is inconvenient for their clients.

Because it is interactive, and because each new version has to be accepted by the earliest coin owner. With a non-interactive version, when you have hundreds of transactions with batching enabled, each node can do it on its own, without asking users to sign each and every version every time the next person makes the next transaction. That non-interactivity really makes a huge difference, which is why today's version is so inconvenient. But in general, as you can see, there are ways to process more transactions without increasing the block size, which is why I think these kinds of models should be explored further, instead of pushing for any block size increase.
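To be clear about feasibility: under the commonly used SIGHASH_ALL, signatures commit to the exact set of outputs, so third parties cannot merge signed Bitcoin transactions like this today. The following is therefore only a conceptual sketch of the node-side folding loop described above, assuming a hypothetical transaction format in which a miner may splice out an intermediate hop without invalidating the remaining signatures.

Code:
# Conceptual sketch of non-interactive batching during block template construction.
# NOTE: NOT possible under Bitcoin's current sighash rules; purely illustrative.

def build_template(mempool):
    """mempool: dict of txid -> {"spends": set of (txid, vout), "creates": list of outputs}."""
    txs = {txid: dict(tx) for txid, tx in mempool.items()}
    merged = True
    while merged:                       # keep folding until no more chains are found
        merged = False
        for parent_id, parent in list(txs.items()):
            # which unconfirmed transactions spend outputs of this parent?
            spenders = [cid for cid, c in txs.items()
                        if cid != parent_id and any(op[0] == parent_id for op in c["spends"])]
            if len(spenders) != 1:      # only fold simple Alice -> Bob -> Charlie chains
                continue
            child_id = spenders[0]
            child = txs[child_id]
            consumed = {op for op in child["spends"] if op[0] == parent_id}
            kept = [out for i, out in enumerate(parent["creates"])
                    if (parent_id, i) not in consumed]
            txs[child_id] = {
                "spends": parent["spends"] | (child["spends"] - consumed),
                "creates": kept + child["creates"],
            }
            del txs[parent_id]          # the intermediate hop never reaches the block
            merged = True
            break
    return list(txs.values())           # batched transactions ready for the block template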
ABCbits
Legendary
Offline
Activity: 3332
Merit: 8987
January 25, 2025, 08:11:27 AM
Quote from: ABCbits
--snip-- Cut-through actually merges multiple transactions into one bigger transaction, and that leads to a smaller overall size. The computation and network usage is a bit higher, but negligible on today's computers and servers.

Quote from: IShishkin
If network usage is negligible, then why do we have a block size problem?

See this old article: https://en.bitcoin.it/wiki/Block_size_limit_controversy.
Wind_FURY
Legendary
Offline
Activity: 3374
Merit: 2046
January 29, 2025, 04:01:21 AM
Quote:
There's also the adoption perspective. If there was a cryptocurrency with the level of adoption/network effects of Dogecoin and its mature market, that would probably be a very good merge-mine coin.

Yeah, I suppose that is one of the reasons why DOGE is merge-mined by LTC miners (they both use the same scrypt algorithm for their PoW). But I think the main incentive for miners is still profitability; I doubt the majority of miners care about the popularity of an altcoin. Although the two are not mutually exclusive: a popular coin can be more profitable as well. But a higher level of popularity = more network effects, and more network effects = a higher chance of retaining long-term value, LIKE Bitcoin. I believe that, if it were possible, a merge mine with Monero would be the best possible choice for Bitcoin. BUT that would merely complicate things. When the actual problem comes, and if the decision would mean Life or Death for Bitcoin, whether to increase the supply limit or not, then I'm VERY confident that we'll get community/network consensus to choose Life.
pooya87
Legendary
Offline
Activity: 3906
Merit: 11841
January 30, 2025, 04:58:51 AM
Quote from: Wind_FURY
But a higher level of popularity = more network effects, and more network effects = a higher chance of retaining long-term value, LIKE Bitcoin.

Not like Bitcoin, though, because it is far easier to create another useless memecoin that replaces the old one, or at least takes its market share, than it is to create another useful payment system like Bitcoin.

Quote from: Wind_FURY
I believe that, if it were possible, a merge mine with Monero would be the best possible choice for Bitcoin.

Merged mining works best for weaker projects that cannot stay alive and secure on their own (due to lack of interest) and need a bigger project to attach to in order to survive. Monero isn't like that; it has been able to attract enough interest on its own to stay alive and secure so far.
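For background on the mechanics being discussed: in the AuxPoW scheme used by merged-mined chains such as Namecoin and Dogecoin, the parent-chain miner commits to the auxiliary block header hash inside its coinbase transaction, and the auxiliary chain accepts the parent block header plus a Merkle path as its proof of work. Below is a rough sketch of that check, with illustrative field names rather than the exact AuxPoW serialization.

Code:
# Rough sketch of AuxPoW-style validation on the auxiliary (merged-mined) chain.
# Field names are illustrative; real AuxPoW defines the exact serialization.
import hashlib

def dsha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root_from_branch(leaf: bytes, branch: list, index: int) -> bytes:
    """Climb a Merkle branch: at each level, hash with the sibling on the correct side."""
    h = leaf
    for sibling in branch:
        h = dsha256(sibling + h) if index & 1 else dsha256(h + sibling)
        index >>= 1
    return h

def check_auxpow(aux_header_hash: bytes, parent_header: bytes, aux_target: int,
                 coinbase_tx: bytes, coinbase_branch: list, coinbase_index: int,
                 parent_merkle_root: bytes) -> bool:
    # 1. The parent chain's coinbase must commit to the auxiliary block header hash.
    if aux_header_hash not in coinbase_tx:
        return False
    # 2. That coinbase must really be in the parent block (Merkle branch check).
    if merkle_root_from_branch(dsha256(coinbase_tx), coinbase_branch,
                               coinbase_index) != parent_merkle_root:
        return False
    # 3. The parent block header must meet the auxiliary chain's difficulty target.
    parent_hash = int.from_bytes(dsha256(parent_header), "little")
    return parent_hash <= aux_target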
BlackHatCoiner
Legendary
Offline
Activity: 1792
Merit: 8677
January 30, 2025, 09:45:34 AM
Quote from: Wind_FURY
I believe that, if it were possible, a merge mine with Monero would be the best possible choice for Bitcoin.

It's not possible, because their mining algorithms are completely different. Monero uses RandomX, which is incompatible with ASIC miners. It is also realistically infeasible to create a Monero ASIC, as it would require building a better general-purpose CPU than those of Intel and AMD, both of which are worth hundreds of billions precisely because they specialize in researching and optimizing CPU performance.
d5000
Legendary
Offline
Activity: 4368
Merit: 9193
Decentralization Maximalist
January 30, 2025, 08:55:30 PM
Quote from: pooya87
Merged mining works best for weaker projects that cannot stay alive and secure on their own (due to lack of interest) and need a bigger project to attach to in order to survive.

I'm not sure about that. Dogecoin, for example, chose to become merge-mined with Litecoin even though it has a larger market cap than LTC (though Doge probably chose that because of its own volatility, which could become a risk if it stayed independent). So there are already signs that strong altcoins can develop a kind of symbiotic relationship if the move is planned correctly. This includes taking into account the "community side" of cryptocurrencies: at minimum, both project leaderships (devs, foundation if available...) shouldn't behave in a maximalist way, and should support the other coin. Afaik, however, BlackHatCoiner is correct about the infeasibility of a Bitcoin-Monero merged mining mechanism. It would probably be possible, though, to create a merge-mined SHA256 coin supporting ring signatures and other Monero-style privacy features. Such a coin could develop into a true Bitcoin sidechain if a dynamic federation mechanism (like Threshold's tBTC, for example) were added. In that case it could become strong enough to be valuable on its own and increase the income of Bitcoin miners.
Wind_FURY
Legendary
Offline
Activity: 3374
Merit: 2046
January 31, 2025, 04:32:52 AM
Quote from: pooya87
Quote from: Wind_FURY
But a higher level of popularity = more network effects, and more network effects = a higher chance of retaining long-term value, LIKE Bitcoin.
Not like Bitcoin, though, because it is far easier to create another useless memecoin that replaces the old one, or at least takes its market share, than it is to create another useful payment system like Bitcoin.

The point is, for a merge mine to work and sustain the long-term profitability of the miners, the merge-mined coin itself MUST also be valued by both miners and users.

Quote from: pooya87
Quote from: Wind_FURY
I believe that, if it were possible, a merge mine with Monero would be the best possible choice for Bitcoin.
Merged mining works best for weaker projects that cannot stay alive and secure on their own (due to lack of interest) and need a bigger project to attach to in order to survive. Monero isn't like that; it has been able to attract enough interest on its own to stay alive and secure so far.

It's possible that that's exactly what Bitcoin will become if the miners aren't incentivized to continue providing security for the network, no? 🤔

Quote from: BlackHatCoiner
Quote from: Wind_FURY
I believe that, if it were possible, a merge mine with Monero would be the best possible choice for Bitcoin.
It's not possible, because their mining algorithms are completely different. Monero uses RandomX, which is incompatible with ASIC miners. ~

Ser, please, kindly, and respectfully, get the context of the post before you reply.