Bitcoin Forum
Show Posts
501  Other / MultiBit / Re: I repaired my wallet now coins are unconfirmed SOS, and solution, reward in btc on: August 01, 2016, 11:00:24 AM
I had a similar problem, just fixed it https://bitcointalk.org/index.php?topic=1513497.0

If you are willing to send me by private message the wallet password and the main wallet file,
I can retrieve those coins for you (no wallet words required).

If anyone takes up this guy's offer, then I have a bridge to sell you...
502  Economy / Speculation / Re: $13,000 bitcoins by the end of 2016? $120,000 by 2018? (golden ratio fractals) on: July 26, 2016, 09:14:44 AM
The price is not going to be that high at the end of 2016, I am so sure of that; we have not even reached the thousands yet, so it would be very unlikely to be that high.
At the end of 2016 I believe that the price is going to be max $1000.

when it happens it happens fast, and hard...

http://bitcoincharts.com/charts/bitstampUSD#igDailyzczsg2012-11-01zeg2013-11-30ztgSzm1g10zm2g25zvzcvzl

nothing would surprise me.
503  Economy / Speculation / Re: Market Cycles - why 560k might be wrong! on: July 03, 2016, 11:53:25 PM
It begins
504  Bitcoin / Bitcoin Discussion / Re: Blockchain size - problem ? on: March 17, 2016, 10:12:45 AM
To run a Lightning Node requires those three things... you can see where I am going with this...
Apples and oranges.

You'll have to clarify in what way things are different.

This was my thinking:

A lightning transaction is a bitcoin transaction.

More on chain transactions => greater resource usage for bitcoin nodes
More LN transactions => greater resource usage for LN nodes

If greater resource usage is a centralising force for bitcoin nodes, then I would say the same could be said for LN nodes.
505  Bitcoin / Development & Technical Discussion / Re: Segwit details? SEGWIT WASTES PRECIOUS BLOCKCHAIN SPACE PERMANENTLY on: March 16, 2016, 09:32:05 PM

Script versioning is essentially about changing this consensus mechanism so that any change can be made without any consensus. Giving this control to anyone, even satoshi himself, entirely undermines the whole idea of bitcoin. *Decentralised* something something.
The content of your scriptpubkey, beyond the resource costs to the network, is a private contract between the sender of the funds and the receiver of the funds. It is only the business of these parties, no one else. Ideally, it would not be subject to "consensus", in any way/shape/form-- it is a _private matter_. It is not any of your business how I spend my Bitcoins; but unfortunately, script enhancing softforks do require consensus of at least the network hashpower.

Bitcoin Script was specifically designed because how the users contract with it isn't the network's business-- though it has limitations. And, fundamentally, even with those limitations it is already, at least theoretically, impossible to prevent users from contracting however they want. For example, Bitcoin has no Sudoku implementation in Script, and yet I can pay someone conditionally on them solving one (or any other arbitrary program).

Bitcoin originally had an OP_VER to enable versioned script upgrades. Unfortunately, the design of this opcode was deeply flawed-- it allowed any user of the network, at their unannounced whim, to hardfork the network between different released versions of Bitcoin. Bitcoin's creator removed it and in its place put in facilities for softforks. Softforks have been used many times to compatibly extend the system-- first by Bitcoin's creator, and later by the community. The segwit script versioning brings back OP_VER but with a design that isn't broken: it makes it faster and safer to design and deploy smart contracting/script improvements (for example, a recently proposed one will reduce transaction sizes by ~30% with effectively no costs once deployed), but doesn't change the level of network consensus required to deploy softforks; only perhaps the ease of achieving the required consensus, because the resulting improvements are safer.

This is a really good explanation, thanks for taking the time to write it up. My understanding of Bitcoin doesn't come directly from the code (yet!), so I have to rely on second-hand information. The information you just provided has really deepened my understanding of the purpose of the scripting system, over and above "it exists, and it makes the transactions work herp", which probably helps address your final paragraph...

If you're going to argue that you don't want a system where hashpower consensus allows new script rules for users to use to voluntarily contract with themselves, you should have left Bitcoin in 2010 or 2011 (though it's unclear how any blockchain cryptocurrency could _prevent_ this from happening).  Your views, if not just based on simple misunderstandings, are totally disjoint with how Bitcoin works. I don't begrudge you the freedom to want weird or even harmful things-- and I would call denying users the ability to choose whatever contract terms they want out of principle rather than considerations like resource usage both weird and harmful--, but Bitcoin isn't the place for them, and the restrictions you're asking for appear to be deeply disjoint with Bitcoin's day-one and every-day-since design, which has a huge amount of complexity in the original design for user (not consensus) determined smart contracting and where softforks (hashpower consensus) have been frequently used to extend the system.

As we have established, my understanding was, let's say, limited ;) so I don't think it's fair to say I am arguing against what it is for. I was arguing against what I thought it meant. Quite the opposite to wanting weird or harmful things, I was very much arguing that we shouldn't be allowing a harmful thing! If, as may be the case, that harmful thing is not an issue, then I have nothing to worry about!

I'm trying not to get (too) sucked into the conspiracy theories on either side; I'm only human though, so sometimes I do end up with five when adding together two and two.

A question that still niggles me is segwit as a soft fork. I know that just dredges up the same old discussion about the pros and cons of soft vs hard forks, but for a simpleton such as me, if the benefits of segwit are so clear, then compromising on the elegance of the implementation in order to make it a soft fork seems a strange decision.
506  Bitcoin / Development & Technical Discussion / Re: Segwit details? SEGWIT WASTES PRECIOUS BLOCKCHAIN SPACE PERMANENTLY on: March 16, 2016, 06:40:10 PM
I asked some of these questions 3 months ago.  Never got a decent answer.

Blockstream wants soft-forked SegWit to fix the malleability problems (that would be needed for the LN, if they ever get it to work), and to force ordinary p2p bitcoin users to subsidize the costs of complicated multisig transactions (ditto). But these reasons do not seem to explain the urgency and energy that they are putting into the SegWit soft fork. Maybe they have other undeclared reasons? Perhaps they intend to stuff more data into the extension records, which they would not have to justify or explain since, being in the extension part, "ordinary users can ignore it anyway"?

As for SegWit being a soft fork, that is technically true; but a soft fork can make some quite radical changes, like imposing a negative-interest (demurrage) tax, or raising the 21 million limit. One could also raise the block size limit that way. These tricks would all let old clients work for a while, but eventually everybody would be forced to upgrade to use coins sent by the new version.

A hard fork based consensus mechanism, far from being dangerous, is actually the solution to centralised control over consensus.

Script versioning is essentially about changing this consensus mechanism so that any change can be made without any consensus. Giving this control to anyone, even satoshi himself, entirely undermines the whole idea of bitcoin. *Decentralised* something something.

Script versioning
Changes to Bitcoin’s script allow for both improved security and improved functionality. However, the design of script only allows backwards-compatible (soft-forking) changes to be implemented by replacing one of the ten extra OP_NOP opcodes with a new opcode that can conditionally fail the script, but which otherwise does nothing. This is sufficient for many changes – such as introducing a new signature method or a feature like OP_CLTV, but it is both slightly hacky (for example, OP_CLTV usually has to be accompanied by an OP_DROP) and cannot be used to enable even features as simple as joining two strings.

Segwit resolves this by including a version number for scripts, so that additional opcodes that would have required a hard-fork to be used in non-segwit transactions can instead be supported by simply increasing the script version.
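To make the upgrade mechanics above concrete, here is a toy Python sketch (hypothetical code, not Bitcoin Core's) of the OP_NOP redefinition described. The soft-fork trick is that the new rule may only add a failure case and must leave the stack untouched, which is exactly why OP_CLTV needs a trailing OP_DROP:

Code:
def op_nop2(stack, tx_locktime, upgraded):
    """OP_NOP2, redefined as OP_CHECKLOCKTIMEVERIFY on upgraded nodes."""
    if not upgraded:
        return True                     # old nodes: a pure no-op
    if not stack:
        return False                    # new nodes may conditionally fail...
    return tx_locktime >= stack[-1]     # ...but must leave the stack alone,
                                        # hence the OP_DROP that follows

# Script: <locktime> OP_NOP2 OP_DROP <pubkey> OP_CHECKSIG
stack = [500000]
print(op_nop2(stack, tx_locktime=600000, upgraded=True))   # True: timelock met
print(op_nop2(stack, tx_locktime=400000, upgraded=True))   # False: too early
print(op_nop2(stack, tx_locktime=400000, upgraded=False))  # True: old node no-ops

Because upgraded nodes only ever reject more, blocks they accept are always accepted by old nodes, which is what makes the change a soft fork. Script versioning removes the need to squat on the remaining OP_NOPs at all.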

It doesn't matter where you stand on the blocksize debate, which dev team you support, or any of the myriad disagreements. As Gregory Maxwell himself states:

"Anyone who /understood/ it would [shut down bitcoin], if somehow control of it were turned over to them."
507  Bitcoin / Bitcoin Discussion / Re: Blockchain size - problem ? on: March 14, 2016, 01:24:16 PM
Well to run a node one requires three things:
1) Processing power (for validation).
2) Storage (for storing the blockchain).
3) Bandwidth (for relaying data).

Currently the block size limit is at 1 MB. Now when you try increasing that to, let's say, 2 MB, you are essentially increasing the requirements listed above. One could argue that decentralization is Bitcoin's biggest pro. Now if you make the requirements too large, then a decrease in the number of nodes will occur. This is why you've heard of "Bitcoin ending up on super nodes". We cannot let this happen.


So to avoid this, the plan is to be very conservative on the block size increases (only safe limits) to retain decentralization and build a second layer (e.g. the Lightning Network). Such a layer enables a much better (more efficient) way of scaling.
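As a back-of-the-envelope check of the storage side of that argument (the figures are ballpark assumptions that ignore headers, indexes and pruning), the worst-case resource growth scales linearly with the limit:

Code:
BLOCKS_PER_YEAR = 6 * 24 * 365   # one block per ~10 minutes

for limit_mb in (1, 2, 8):
    growth_gb = limit_mb * BLOCKS_PER_YEAR / 1024
    print(f"{limit_mb} MB limit -> up to ~{growth_gb:.0f} GB of chain growth per year")

# ~51 GB/yr at 1 MB vs ~103 GB/yr at 2 MB: doubling the limit doubles the
# worst case of all three requirements (validation, storage, bandwidth).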

To run a Lightning Node requires those three things... you can see where I am going with this...
508  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 12, 2016, 03:03:08 PM
The limit was a temporary fix.

Shortly before disappearing, Satoshi also repeated over and over that there were more ways that Bitcoin could be successfully DOS attacked than he could count.

The limit (in conjunction with transaction fees) was intended as a DoS-attack and spam deterrent. Do you have any evidence that those risks have been mitigated? Do you have any evidence that suggests that 2MB blocks won't be filled to capacity right away? That would leave us where we are today -- with no scaling solutions, wondering how much more un-optimized throughput the network can safely handle.

I don't believe a second, compatible implementation of Bitcoin will ever be a good idea.  So much of the design depends on all nodes getting exactly identical results in lockstep that a second implementation would be a menace to the network.

Satoshi's description of how to increase the block size limit was also clearly in the context of a software update. Not an incompatible implementation of bitcoin that essentially attacks the network and intentionally forks from all other client nodes. He even thought compatible alternatives were a bad idea.

One could make the argument that Satoshi thought we could raise the limit as/when needed (as if we needed his authority anyway). In that respect, the question of necessity is subjective and debatable, and it is not immediately clear that an increase to 2MB now, with no attempts to make throughput more scalable, is necessary.

One could not make the argument that Satoshi thought we should increase the block size limit through a contentious hard fork.

All of this is moot. We should be talking in terms of "what is best for bitcoin" -- with respect to users, nodes and miners -- not talking past each other with interpretations of things Satoshi said 5-6 years ago. Not only was Satoshi wrong about some things, but the state of bitcoin has changed a lot. Hell, if we left the codebase as he originally wrote it, those 184 billion bitcoins from the August 2010 value overflow incident would still be with us. Better to work towards bitcoin's principles than aimlessly trying to fit some arbitrary interpretation of Satoshi's words.

As there are already several implementations, the menace threat seems overblown. Even if you were to assume that Core was the "one true bitcoin", which version of it is gospel? If miners refuse to upgrade to 0.12, is this an attack? A menace?

It is possible to release an implementation right now with the limit removed. This still isn't an attack. Bitcoin's built in consensus mechanism protects the network.

There is evidence right now that the 1MB limit causes problems. There is only speculation that removing it causes problems.

The reason we can't talk in terms of what is best for bitcoin is because people have different opinions. People prefer speculative drama to mundane truth. The reason we can't work towards Bitcoin's principles is that these are also subject to people's opinions. If you refuse to accept that "what Satoshi said" has any importance, all we have is the court of public opinion. In which case your view is as valid as mine. How do we resolve that? What are bitcoin's principles when it comes to disagreement about what bitcoin is?

With regards to the "appeal to authority" defence: if something is a fact, and somebody happens to state that fact, any argument that relies on the fact cannot be dismissed by saying "we should not rely on what somebody said". The argument is being made because of the fact, not because of what was said.

The limit was a temporary anti-DoS measure; it is now restricting the growth of transactions. Those are facts.

IMHO what is best for bitcoin is to allow it to grow unencumbered by artificial limits.
509  Bitcoin / Mining / Re: Empty transaction blocks on: March 11, 2016, 08:37:01 AM
I've read some long discussions about the pros and cons of miner's publishing zero transaction blocks (as 'some' are doing now).

The TL;DR is that:

Let's say the blocksize was 10MB and it took 10 minutes to solve a block. Miners would work on solving it, but a large empty-block miner would quickly solve a zero-sized block. They would then publish it to their "large miner friends", who would start solving the next block.

Whoever solved the 10MB block would likely get their block orphaned if the empty block + next block was published immediately after the block was solved/broadcast. And whoever was working on the longer chain would already be working on the next empty block + next block.

In a way, publishing empty blocks seems to create an artificial cap on blocksize, does it not? As the blocksize gets larger and it takes longer to solve blocks, publishing empty blocks can cause more and more transactions to be orphaned.

Additionally, some (many?) people support miners publishing empty blocks. There seems to be some back and forth about whether this should be allowed at all. As it is part of the protocol, disallowing it would seem all but impossible at this point in time.

Doesn't this basically put a blocksize limit in the hands of the large miners? And isn't this just a way for the largest miners to gain an advantage over the others? Or is there some reason this is a "good thing"?


There are at least two ways that empty (1txn) blocks can be used.

The first is concerned with reducing idle time. If a block is solved and broadcast to the network then, as a miner, you are first made aware of the block header. This block header is all you need to start mining the next block. The rest of the block may take some time to propagate and will also take further time for you to validate and to adjust your mempool. There is a chance that during this time you may find a block. Given that the block reward is high and fees are low, the most profitable thing to do is publish the block immediately. As you have not yet validated the previous block, you cannot know which transactions to include from the mempool, so the only transactions you can include are ones that you know nobody else knows about - typically just the coinbase transaction.

The second is a natural disincentive for miners to produce blocks that are too expensive to propagate/validate. If such a block is produced then a miner may decide that validating it is taking too long and instead decide that they could make more profit by mining a smaller sibling block. This block would also be seen by other miners, who could decide whether to continue validating the "expensive" block or instead mine on top of the easy-to-validate block. These behaviours form a natural equilibrium in block size, obviating the need for any artificial limit.
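A toy expected-value calculation makes the first point concrete. All figures below are assumptions (a 20% pool, a 20-second validation window, the 25 BTC subsidy of the time), and block arrivals are treated as roughly Poisson at one per 600 seconds network-wide:

Code:
HASH_SHARE      = 0.20    # this miner's share of network hashpower (assumed)
VALIDATION_SECS = 20      # time to fetch and validate the previous block (assumed)
SUBSIDY_BTC     = 25.0    # block subsidy at the time
FEES_BTC        = 0.25    # total fees forgone by mining an empty block (assumed)

p_find = HASH_SHARE * VALIDATION_SECS / 600   # chance of finding a block in the window

print(f"pause hashing while validating: ~{p_find * SUBSIDY_BTC:.4f} BTC expected loss")
print(f"mine empty on the bare header:  ~{p_find * FEES_BTC:.4f} BTC expected loss")
# ~0.1667 vs ~0.0017 BTC: mining empty on the header wins for as long as
# fees are small next to the subsidy.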
510  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 10, 2016, 06:19:11 PM
The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).


I do not ask the question because an attacker intending to destroy the network would not produce such a block, because such a block does not destroy the network.

This transaction monopolization can only happen IF all the other mining pools choose to mine on top of that monopolizing mining pool's blocks. Yet, doing so will result in lower profitability for the other (the majority) of pools since they (having smaller validation capacity) must always mine 1-txn blocks and are therefore unable to reap transaction fees. This is an unstable situation - if a single mining pool chooses to ignore the large block and is able to find a small competing block while other pools are still validating a large block, it is in the other pools' best interest to switch to this new sibling. By switching, the other pools reduce the risk that they are mining on top of an invalid block, and can mine blocks with transactions. But if mining pools know that the majority will switch to a discovered sibling, it is rational for all pools except for the producer of the large block to search for a sibling rather than produce a 1-txn block.

Knowing that an expensive-to-validate block is mitigated as an attack vector, the question remains as to why somebody would produce such a block?

The implication being that if you have the money/hardware to produce blocks, your most profitable course of action is to just mine blocks honestly.

You are missing my point because you are assuming that the motivation of all participants is monetary profit. You can't just ignore the observation that decentralized cryptocurrency represents the only real threat to the central banksters in a very long time. The point is the same for 64MB as it is for 2MB, only more obvious in that the problems 64MB would create are much greater.


No such assumption was made. The quoted text describes a method by which *other* miners, who are motivated by money, act to thwart this kind of attack by mining a smaller sibling block, building on that and ultimately orphaning the attacker's block.

There are simpler ways that a theoretical bad actor with no profit motivation and lots of money can destroy bitcoin. Setting up a mining operation to craft malicious, expensive-to-validate blocks doesn't seem like a good angle to me!
511  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 10, 2016, 02:57:24 PM
The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).


I do not ask the question because an attacker intending to destroy the network would not produce such a block, because such a block does not destroy the network.

This transaction monopolization can only happen IF all the other mining pools choose to mine on top of that monopolizing mining pool's blocks. Yet, doing so will result in lower profitability for the other (the majority) of pools since they (having smaller validation capacity) must always mine 1-txn blocks and are therefore unable to reap transaction fees. This is an unstable situation - if a single mining pool chooses to ignore the large block and is able to find a small competing block while other pools are still validating a large block, it is in the other pools' best interest to switch to this new sibling. By switching, the other pools reduce the risk that they are mining on top of an invalid block, and can mine blocks with transactions. But if mining pools know that the majority will switch to a discovered sibling, it is rational for all pools except for the producer of the large block to search for a sibling rather than produce a 1-txn block.

Knowing that an expensive-to-validate block is mitigated as an attack vector, the question remains as to why somebody would produce such a block?

The implication being that if you have the money/hardware to produce blocks, your most profitable course of action is to just mine blocks honestly.
512  Alternate cryptocurrencies / Altcoin Discussion / Re: Am I crazy or is running a shit load of classic nodes on amazanaws.com dumb on: March 10, 2016, 09:02:11 AM
Something is wrong, I call foul

While a higher number of Classic nodes compared to Core nodes are on the Amazon cloud, the reason is the DDoS attacks, in which only Classic nodes are targeted and against which the Amazon cloud has good protection.

I doubt that; the DDoS can easily be defended against with fail2ban[1]. It would be easier to install a very common admin and protection tool that you should have anyway (even if you just use it to defend against SSH brute-force attacks) than to move the entire server.

Try running a Classic node at home and the chances are you won't be able to use the internet when you need it the most because of DDoS - so you're forced to set up a node on cloud hosting with good DDoS protection if you want to keep the Bitcoin network healthy.

I call DDoS criminal, not just foul.

An actually distributed DoS (as in run by multiple people and not a bought botnet) is seen as a form of political protest by some. I do agree however that hiring a botnet to attack Classic is a stupid idea, whoever is behind it.

[1] https://bitcointalk.org/index.php?topic=1380642.msg14103169#msg14103169

A DDoS that takes down routers at the ISP cannot be mitigated at the local server level.
513  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 10, 2016, 08:54:04 AM
You didn't logic it correctly. The block size limit was used to protect against "poison" blocks. The block size limit is in the way of natural growth.
There's nothing natural about it.

A different solution to "poison" blocks exists now, so the block size limit is redundant. Segwit has nothing to do with any of this
It isn't a solution. It is a workaround that prevents certain types of transactions.

If Bitcoin adoption continues then natural growth occurs! Ergo the block size limit is in the way. This is a reasonable assertion, so what followed still stands. If you refute that Bitcoin adoption is going to continue then you have a point. Do you refute that? Do you think Bitcoin will not grow?

Given that Bitcoin will continue to grow, that there is a known lead time to safely deploy code that removes the limit (75% activation and 28 days grace), that it cannot be reliably known that Bitcoin growth will not exceed current limits, and that exceeding those limits affects the normal operation of the network, that code must be deployed now to mitigate the risk of the network not being able to operate normally.

Certain types of transactions are the problem. Excluding those types of transactions is a solution.

It's a workaround the same way the 1MB block size limit was.

If it causes problems and it can be solved differently then it can be removed.
514  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 11:48:18 PM
You're wrong; follow the thread URL. The hard sigop limit is the way to fight such attacks. All times are for worst-case blocks that can be used for attacks, measured on the same computer to make the comparison fair.
I don't have time to follow that thread. So this is the worst case scenario? If we assume that this is true, then Segwit seems pretty good. Added capacity without an increase in the amount of hashed data and no additional limitations. Did I understand this correctly?

You didn't logic it correctly.

The block size limit was used to protect against "poison" blocks.

The block size limit is in the way of natural growth.

A different solution to "poison" blocks exists now, so the block size limit is redundant.

Segwit has nothing to do with any of this
515  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 11:41:31 PM
Thank you so much sgbett; that was a really great read. I must admit I didn't work through all of the math yet, but at first blush it appears OK until:
Quote
The Bitcoin network is naturally limited by block validation and construction times. This puts an upper limit on the network bandwidth of 60KB/sec to transmit the block data to one other peer.
Hmm, really?  There's no way ever to improve on block validation times?  Quantum computers?  Nothing?  That doesn't ring true.

I am totally with you. I read 60kb/s more as a theoretical *minimum* rate we can achieve :)
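For what it's worth, simple arithmetic on the paper's figure shows what it implies over one block interval:

Code:
rate_kb_s, interval_s = 60, 600
per_interval_mb = rate_kb_s * interval_s / 1024
print(f"{rate_kb_s} KB/s sustained ~= {per_interval_mb:.0f} MB per 10-minute interval")
# ~35 MB per interval to a single peer - a bound set by validation and
# construction speed on today's software, not by line speed or physics.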
516  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 09:23:59 PM
Set an arbitrary limit which is way above what we need right now, but closes the attack vector.
agreed.

and last I heard it's exactly how the attack remains mitigated in Classic...

As a BU supporter though, we don't need limits!

IMHO the financial incentives are strong enough that block size (in terms of both bandwidth to transmit and CPU to process) is self-limiting. Propagation time is a combination of the two things and, to (over)simplify, propagation time vs orphan risk is enough to make sure miners don't do stupid things, unless they want to lose money.

The full math is here - David you would probably be interested in this if you haven't already seen it.

http://www.bitcoinunlimited.info/resources/1txn.pdf

The paper also describes how the sigops attack is mitigated through miners simply mining 1-txn blocks whilst validating, then pushing those out to other miners whilst they are still validating the 'poison' block. Rational miners will validate the smaller block, and they will also be able to mine another block on top of it, orphaning the poison block.

The attacker would get one shot, and would quickly be shut out. If you have enough hash rate to be mining blocks yourself, it's really much more profitable to behave!
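The core of that propagation-vs-orphan-risk trade-off can be sketched in a few lines. With Poisson block arrivals (mean interval T), a block that takes tau seconds to reach the rest of the network is orphaned with probability roughly 1 - e^(-tau/T); the per-MB cost and revenue figures below are assumptions for illustration, not measurements:

Code:
import math

T = 600.0            # mean block interval, seconds
REVENUE_BTC = 25.0   # assumed subsidy + fees per block
SECS_PER_MB = 15.0   # assumed propagation + validation cost per MB

for size_mb in (1, 8, 64):
    p_orphan = 1 - math.exp(-size_mb * SECS_PER_MB / T)
    print(f"{size_mb:>3} MB block: P(orphan) ~ {p_orphan:5.1%}, "
          f"~{p_orphan * REVENUE_BTC:5.2f} BTC of revenue at risk")
# Bigger blocks put more of the reward at risk, which is the
# self-limiting incentive the 1txn.pdf argument leans on.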
517  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 09:13:39 PM
You and I think very much alike.  Lauda, can you point us at a really big but totally legit/non-abusive transaction?
I don't think that there are many transactions that are so large in nature (both 'abusive' and non). This is the one that I'm aware of. However, you'd also have to define what you mean by "big". Do you mean something quite unusually big (e.g. 100kB) or something that fills up the entire block? I'd have to do a lot more analysis to try and find one (depending on the type).

Segwit isn't a solution designed to fix the block size limit. It's a solution to another problem that right now is undefined, being sold as a solution to a problem that is being actively curated by those who refuse to remove a prior temporary hack.
TX malleability (e.g.) is 'undefined'? Segwit provides additional transaction capacity while carrying other benefits. How exactly is this bad?

What problem is it that requires signatures to be segregated into another data structure and not counted towards the fees? Nobody can give a straight answer to that very simple question. Why is witness data priced differently?
The question would have to be correct for one to be able to answer it. In this case, I have no idea what you are trying to ask.

Fixing TX Malleability is beneficial to everyone.

These *other benefits* include giving the ability to introduce consensus changes without hard forking. This is because we are told that a contentious hard fork is a terrible thing. How does anyone know this for sure!?

A hard fork is good. (Note the absence of the word contentious.) A hard fork establishes Nakamoto consensus, and is the only consensus vital to the ongoing successful operation of the bitcoin network. The incentives that drive this consensus mechanism are sound. The fear from those that do not see this is overwhelming. To subvert this is to destroy fundamental parts of bitcoin's architecture.

I thought you would understand what I meant when I asked the question, sorry if I have used the wrong terminology or something. I can make it a broader question, then perhaps we can investigate the specifics.

Why does segregated witness change the tx fee calculation?
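For concreteness, this is the fee accounting the question refers to, as the segwit proposal (BIP 141) specifies it: witness bytes are counted at a quarter of the cost of base bytes. The transaction sizes below are made-up example numbers:

Code:
import math

def virtual_size(base_size, total_size):
    """base_size: tx serialized without witness data; total_size: with it."""
    weight = base_size * 3 + total_size   # base byte = 4 WU, witness byte = 1 WU
    return math.ceil(weight / 4)          # fees are quoted per virtual byte

base, total = 200, 440                    # hypothetical tx, 240 bytes of witness
print(virtual_size(base, total))          # 260 vbytes: the 240 witness bytes
                                          # are billed as if they were 60
# The stated rationale: witness data is prunable and never enters the
# UTXO set, so it is priced below data that every node must store forever.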
518  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 08:43:34 PM
Setting a block size limit of 1MB was, and continues to be, a hacky workaround.
It is certainly not a hacky workaround. It is a limit that was needed (it still is for the time being).

Theory drives development, but in practice sometimes hacky workarounds are needed.
If it can be avoided, not really.

The block size limit was a hacky workaround to the expensive-to-validate issue, an issue that is now mitigated by other, much better solutions, not least a well-incentivised distributed mining economy that is now smart enough to route around such an attack, making it prohibitively expensive to maintain.
So exactly what is the plan, replace one "hacky workaround" with another? Quite a lovely way forward. Segwit is being delivered and it will ease the validation problem and increase the transaction capacity. What is the problem exactly?

Problem: an attacker can create a block that is so expensive to validate that other miners would get stuck validating the block.
Hack: Set an arbitrary limit which is way above what we need right now, but closes the attack vector.
Solution: 1 transaction blocks.

Problem: the block size limit is causing transactions to get stuck in the mempool
Hack: raise the block size limit to 2MB
Solution: remove the block size limit

Segwit isn't a solution designed to fix the block size limit. It's a solution to another problem that right now is undefined, being sold as a solution to a problem that is being actively curated by those who refuse to remove a prior temporary hack.

What problem is it that requires signatures to be segregated into another data structure and not counted towards the fees? Nobody can give a straight answer to that very simple question. Why is witness data priced differently?
519  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 05:22:43 PM
Satoshi was wrong about so many things.

He was right about things too.

<snip>
Bitcoin users might get increasingly tyrannical about limiting the size of the chain so it's easy for lots of users and small devices.
</snip>

520  Bitcoin / Bitcoin Discussion / Re: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" on: March 09, 2016, 10:41:21 AM
No. You don't get to define what we allow in the system and what we don't, certainly not when it was possible all this time. What Gavin proposed is a hacky workaround, nothing more

Setting a block size limit of 1MB was, and continues to be, a hacky workaround.

Theory drives development, but in practice sometimes hacky workarounds are needed.

I write code, so I'd prefer it were all perfect. I run a business, which means sometimes I have to consider the bottom line. If a risk is identified and a quick fix is available, it makes economic sense to apply the quick fix whilst working on a more robust long-term solution.

That this has not been done inevitably leads people to question why. It's the answers that have been given to those questions that are causing the most difficulty. The fact that when those answers are challenged the story changes. The fact that the answers are inconsistent with what seems logical to any reasonably minded impartial observer.

The most important thing is that until about a year ago there was near-unanimous agreement on what the purpose of the block size limit was, and how it would be dealt with. Yet here we are today with this action having not been taken, and a group of people actively trying to convince everyone that centralised enforcement of a block size limit is somehow the natural behaviour of the system, despite it having never been so in its entire history.

The block size limit was a hacky workaround to the expensive-to-validate issue, an issue that is now mitigated by other, much better solutions, not least a well-incentivised distributed mining economy that is now smart enough to route around such an attack, making it prohibitively expensive to maintain.

Individual economic self-interest is how Bitcoin is supposed to work.

It's time to remove the bandaid.

When the curtain is pulled back you will see how powerful the wizard really isn't.
