bitcoin 0.1.0 was deemed "production-ready" by satoshi.
and he subscribed to the same damn "make blocks bigger and bigger" scaling vision.
does that make him an idiot?
Firstly, there was no money on the line back then; today there's a $7 billion market cap to consider. Secondly, any scaling plan would have been purely theoretical at the time, given that Bitcoin had such low adoption and transaction volume that it had no risk of hitting scalability walls for the foreseeable future. Thirdly, Satoshi isn't here to defend himself from your (mis?)characterization of his words, nor to clarify his position in light of the actual technological and infrastructural improvements in latency and bandwidth since his disappearance. Fourthly, stop appealing to authority. You're building a strawman by suggesting that Satoshi's actions are in any way comparable to Gavin's, then knocking it down by appealing to Satoshi's authority. It's a pathetic, dishonest approach to debate. Just stop it.
XT failed because people didn't like the idea that nodes on the network would eventually cost thousands of dollars a year to run should the network grow to Visa-like TX/sec, not because it was technically flawed. They see increased node cost as having a centralizing effect. They are WRONG, but that's just my opinion... and few subscribe to it.
8MB was technically flawed to begin with. Refer to the links posted above.
You small blockers like to pretend everything would come to a grinding halt if blocks were huge (some of you would say 2MB is "technically impossible"), but that's not true; it just makes running a node more expensive.
Who is saying 2MB is technically impossible? One could just be sloppy like Gavin -- put a limit on sig ops instead of fixing quadratic scaling limitations, do no risk analysis whatsoever, and commit buggy/untested code into your releases.
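The quadratic-scaling problem mentioned above can be sketched with a toy model. This is not Bitcoin's actual serialization or sighash code; the byte sizes and the all-zero payload are illustrative assumptions. It only shows the shape of the cost: under legacy (pre-SegWit) signature hashing, a copy of roughly the whole transaction is hashed once per input, so total hashing work grows with (inputs × transaction size), i.e. roughly quadratically in the number of inputs.

```python
import hashlib

def sighash_work(num_inputs: int, input_size: int = 150) -> int:
    """Toy model of legacy signature-hashing cost.

    For each input being signed, the verifier hashes a serialized copy
    of (approximately) the entire transaction. Since transaction size
    itself grows with the number of inputs, total bytes hashed grow
    ~O(n^2). Sizes are assumed, not real serialization.
    """
    tx_size = num_inputs * input_size      # approximate tx size in bytes
    total_hashed = 0
    for _ in range(num_inputs):            # one sighash per input
        hashlib.sha256(b"\x00" * tx_size).digest()
        total_hashed += tx_size
    return total_hashed

# Doubling the inputs roughly quadruples the bytes hashed:
small = sighash_work(100)
large = sighash_work(200)
print(large / small)  # -> 4.0
```

This is why a bare block-size increase without reworking the sighash algorithm invites pathological transactions, and why capping sig ops treats the symptom rather than the quadratic cost itself.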
The questions are: Is it optimal? Is it necessary? Is it worth risking a contentious hard fork over? The answer is a resounding "No" on all fronts.