Bitcoin Forum
  Show Posts
361  Bitcoin / Bitcoin Discussion / Re: Segregated Witness has been merged! on: June 25, 2016, 08:11:58 AM
first it was bribery with the "fee discount" now its blackmail with the mining algorithm hardfork.
Idiot. Miners are service providers for the Bitcoin network. They order transactions for the network's nodes and are paid _handsomely_ for this service. If they start blocking transaction features in an attempt to demand control over the system, they're going to get fired. This isn't anything that I would do; it's the natural and obvious response to coercive behavior from miners, no different than the response to any other kind of censorship activity. Many people already call for POW changes due to heavy mining centralization; so far I've fought back against those calls to promote harmony and avoid disruption, but that would be a hard position to stick to in this kind of hypothetical.

But it's all moot, because that isn't what things actually look like. I talk to a LOT of miners (and, of course, I'm a miner too) and I don't expect there to actually be any issue there.
362  Bitcoin / Bitcoin Discussion / Re: Segregated Witness has been merged! on: June 25, 2016, 07:54:34 AM
Blockstream was created by several of the most active Bitcoin core developers. It represents over 10 Bitcoin core developers who provide together most commits to Bitcoin core.
This is a load of bullshit. Sure, we do a lot-- but mostly in contrast to the general failure of most of the Bitcoin industry to support infrastructure development. I'm not aware of any way of counting that concludes anything close to "most". There are a good hundred contributors to Core, and a great many very active ones who aren't at Blockstream.

Anyone talking about blocking segwit for some hardfork would be proving the big stinking lie of any claim that they were concerned with capacity. Core can't and isn't going to respond to threatening behavior. Fortunately, that isn't really the situation here-- except in the fevered minds of the /r/btc fudsters. Smiley I'm not concerned about activation having any deployment issues. (And if it is delayed by some miners, that's an issue the community can take up with them. I don't consider the segwit deployment urgent.)

The HK agreement between miners and blockstream (basically Core) represents consensus.
Blockstream has no such agreement, and it wouldn't matter if it did. (And, in fact, any effort to push such a thing would be an immediate actionable violation of my employment agreement; and trigger a parachute clause)

introducing additional features into Bitcoin that are increasingly complex
What do you think is increasingly complex? I think segwit makes Bitcoin simpler and safer. (though because the old things are still supported you can say there is some increase, but it seems pretty modest considering the simplicity it brings to people who are only using segwit and the number of problems it solves)
363  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 25, 2016, 07:31:31 AM
I also wonder, since they already released a "0.12.1" which was, if anything, an anti-0.12.1 (ripped out warnings related to the 0.12.1 features it lacked!)-- are they going to call their kludge port 0.13 and claim to be leaders in Bitcoin innovation-- while lacking the half year of development from dozens of people and dozens of features that will be in the real 0.13?
Why do things by half measures? Behold, Bitcoin Classic 1.1.0! Grin
Mark your calendars, lets see how long until there are sock accounts posting that Bitcoin Core is an outdated version, since Classic is 1.1.0... (while it's still actually a crappy barely maintained fork of an outdated copy of Bitcoin Core)

Predictably dishonest people are predictably dishonest.
364  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 25, 2016, 07:25:44 AM
among them the implementation of bitcoin's a scripting language on which LN will be dependant.
What are you talking about?


Re altcoins, they're welcome to take our open code, but most altcoins don't have active development teams... and most of them are based on seriously outdated Bitcoin Core codebases and have serious bugs (either ones we fixed long ago-- or ones they added themselves). The fact that no one has generally exploited them suggests they're not very useful test points.

And what does any of this have to do with classic other than classic being seriously out of date?

Quote
Apparently I just hold developers to a higher standard than you.
You do? Oh. So perhaps you can tell me who is funding classic developers Zander and Gavin?
365  Bitcoin / Bitcoin Discussion / Re: The Blocksize Debate & Concerns on: June 21, 2016, 09:50:47 AM
Out of curiosity, would you be willing to state your personal opinion on under what conditions it would be appropriate and/or beneficial to raise the blocksize limit?
I'd be glad to, and have many times in the past (also on the more general subject of hardforks), though I don't know that my particular views matter that much. I hope they do not.

Quote
What particular concerns, technical or otherwise, do you consider most important when it comes to considering alterations to this particular aspect of the system? Of course, blocksize is only one aspect of capacity (and a horribly inefficient method of scaling capacity), but I'm curious if there are any particular conditions under which you feel raising the blocksize would be the appropriate solution. Is it only a last-resort method of increasing capacity? And when it comes to capacity, to what extent should it be increased? Is, "infinite", capacity even something we should be targetting?
Since we live in a physical world, with computers made of matter and run on energy, there will be limits. We should make the system as _efficient_ as possible because, in a physical world, one of the concerns is that there is some inherent trade-off between blockchain load and decentralization. (Note that blockchain load != Bitcoin transactional load, because there are _many_ ways to transact that have reduced blockchain impact.) ... regardless of the capacity level we're at, more efficiency means a better point on that trade-off.

Not screwing up decentralization early in the system's life is a high priority for me: it is utterly integral to the system's value proposition in a way that few other properties are, and every loss of decentralization we've suffered over the system's life has been used to argue that the next loss isn't a big deal. Making progress backwards is hard: the system actually works _better_ in a conventional sense, under typical usage, as it becomes more centralized: upgrades are faster and easier, total resource costs are lower, behavior is more regular and consistent, some kinds of attacks are less commonly attempted, properties which are socially "desirable" but not rational for participants don't get broken as much, etc. Decentralization is also central to other elements that we must improve to preserve Bitcoin's competitiveness as a worldwide, open and politically neutral money-- such as fungibility.

But even if we imagine that we got near infinite efficiency, would we want near infinite capacity?

Bitcoin's creator proposed that security would be paid for in the future by users bidding for transaction fees. If there were no limit to the supply of capacity, then even a single miner could always make more money by undercutting the market and clearing it*. Difficulty can adapt down, so low fee income can result in security melting away.  This could potentially be avoided by cartel behavior by miners, but having miners collude to censor transactions would be quite harmful... and in the physical world with non-infinite efficiency, preventing miners from driving nodes off the network is already needed.  Does that have to work as a static limit?  No-- and I think some of the flexcap proposals have promise for the future for addressing this point.

Some have proposed that instead of fees, security in the future would be provided by altruistic companies donating to miners. The specific mechanisms proposed didn't make much sense-- e.g. they'd pay attackers just as much as honest miners, but that could be fixed... but the whole model of using altruism to pay for a commons has significant limitations, especially in a global anonymous network. We haven't shown altruism of this type to be successful at funding development or helping to preserve the decentralization of mining (e.g. p2pool). I currently see little reason to believe that this could be a workable alternative to the idea initially laid out of using fees... of course, if you switch it around from altruism to a mandatory tax, you end up with the inflationary model of some altcoins-- and that probably does "work", but it's not the economic policy we desire (and, in particular, without a trusted external input to control the inflation rate, it's not clear that it would really work for anyone).

So in the long term part of my concern is avoiding the system drifting into a state where we're all forced to choose between inflation or failure (in which case, a bitcoin that works is better than one that doesn't...).

As far as when, I think we should exercise the most extreme caution with incompatible changes in general. If a change is really safe and needed we can expect to see broad support... and it becomes easier to get there when efforts are made to address the risks, e.g. segwit was a much easier sell because it improved scalability while at the same time increasing capacity. Likewise, I expect successful future increases to come with or after other risk mitigations.

(* We can ignore orphaning effects for a few reasons: orphaning increases as a function of transaction load can be ~completely eliminated with relay technology improvements, and failing that, by miners centralizing... and if all a miner's income is going to pay for orphaning losses there will be no excess paying for competition and thus security, and-- finally-- if transaction fees are mostly uniform, the only disincentive from orphaning comes from the loss of subsidy, which will quickly become inconsequential unless Bitcoin is changed to be inflationary.)

Quote
It is my understanding that Core's implementation of segwit will also include an overall, "blocksize", increase (between both the witness and transaction blocks), though with a few scalability improvements that should make the increase less demanding on the system (linear verification scaling with segwit and Compactblocks come to mind). Do you personally support this particular instance of increasing the overall, "blocksize"?
I think the capacity increase is risky. The risks are compensated by improvements (both recent ones already done and immediately coming, e.g. libsecp256k1, compactblocks, tens of fold validation speed improvements, smarter relay, better pruning support) along with those in segwit.

I worry a lot that there is a widespread misunderstanding that blocks being "full" is bad-- block access is a priority queue based on feerate-- and at a feerate of ~0 there is effectively infinite demand (for highly replicated perpetual storage). I believe that (absent radical new tech that we don't have yet) the system cannot survive as a usefully decentralized system if the response to "full" is to continually increase capacity (such a system would have almost no nodes, and also potentially no way to pay for security). One of the biggest problems with hardfork proposals was that they directly fed this path-to-failure, and I worry that the segwit capacity increase may contribute to that too... e.g. that we'll temporarily not be "full" and then we'll be hit with piles of constructed "urgent! crash landing!" pressure to increase again to prevent "full" regardless of the costs.  E.g. a constant cycle of short term panic about an artificial condition pushing the system away from long term survivability.

But on the balance, I think the risks can be handled and the capacity increase will be useful, and the rest of segwit is a fantastic improvement that will set the stage for more improvements to come. Taking some increase now will also allow us to experience the effects and observe the impacts which might help improve confidence (or direct remediation) needed for future increases.
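To make the "priority queue based on feerate" point above concrete, here is a minimal sketch (in Python, and not Bitcoin Core's actual block-template code) of how a miner with limited space simply takes the highest-feerate transactions first; the block-size constant and the mempool entries are made up for illustration.

MAX_BLOCK_BYTES = 1000000   # illustrative size limit

def select_block(mempool):
    """mempool: list of dicts with 'fee' (satoshis) and 'size' (bytes)."""
    chosen, used = [], 0
    # Highest feerate first; the lowest bids simply wait for a later block.
    for tx in sorted(mempool, key=lambda t: t['fee'] / t['size'], reverse=True):
        if used + tx['size'] <= MAX_BLOCK_BYTES:
            chosen.append(tx)
            used += tx['size']
    return chosen

# At a feerate of ~0 there is effectively unbounded demand, but only the
# best-paying slice of it fits:
mempool = [{'fee': 20000, 'size': 250},     # 80 sat/byte
           {'fee': 500,   'size': 250},     # 2 sat/byte
           {'fee': 0,     'size': 999900}]  # free rider, nearly a full block by itself
print([tx['fee'] for tx in select_block(mempool)])   # [20000, 500]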

Quote
To be clear, I'm just asking as someone who'd like to hear your informed opinion. In the short-term I'm not exactly worried about transaction capacity--certainly not hardfork worried--as with segwit on the way and the potential for LN or similar mechanisms to follow, short-term capacity issues could well be on their way out (really all I want short-term is an easier way to flag and use RBF so I can fix my fees during unexpected volume spikes). What I'm more curious about at this point are your views on the long-term--the, "133MB infinite transactions", stage.
I think we don't know exactly how lightning usage patterns will play out, so the resource gains are hard to estimate with any real precision. Right now bidi channels are the only way we know of to get to really high total capacity without custodial entities (which I also think should have a place in the Bitcoin future).

Some of the early resource estimates from lightning have already been potentially made obsolete by new inventions. For example, the lightning paper originally needed the ability to have a high peak blocksize in order to get all the world's transactions into it (though such blocks could be 'costly' for miners to build, like flexcap systems) in order to handle the corner case where huge numbers of channels were uncooperatively closed all at once and all had to get their closures in before their timeouts expired. In response to this, I proposed the concept of a sequence lock that stops when the network is beyond capacity ("timestop"); it looks like this should greatly reduce the need for big blocks, at the cost of potentially delaying closure when the network is unusually loaded; though further design and analysis is needed.  I think we've only started exploring the potential design space with channels.

Besides capacity, payment channels (and other tools) provide other features that will be important in bringing our currency to more places-- in particular, "instant payment".

As much as I personally dislike it, other services like credit are very common and highly desired by some markets-- and that is a service that can be offered by other layers as well.

I'm sorry to say that an easy-to-use fee-bump replacement feature just missed the merge window for Bitcoin Core 0.13. I'm pretty confident that it'll make it into 0.14. I believe Green Address has a feebump feature available already (but I haven't tried it). 0.13 will have ancestor feerate mining ("child pays for parent"), so that is another mechanism that should help unwedge low fee transactions, though it's not as useful as replacement.
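As a rough illustration of the "child pays for parent" idea mentioned above (not Bitcoin Core's implementation, and with made-up numbers): a miner that evaluates a stuck parent and its high-fee child together looks at the combined ancestor feerate rather than the parent's feerate alone.

# Made-up numbers, just to show the arithmetic.
parent = {'fee': 226,   'size': 226}   # 1 sat/byte: stuck at current feerates
child  = {'fee': 22600, 'size': 226}   # 100 sat/byte, spends the parent's output

ancestor_feerate = (parent['fee'] + child['fee']) / (parent['size'] + child['size'])
print(ancestor_feerate)   # 50.5 sat/byte -- attractive enough to mine both together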

Quote
Of course, you're free to keep your opinions to yourself or only answer as much as you're comfortable divulging; it's not my intent to have you say anything that would encourage people to sling even more shit at you than they already have been lately Undecided
Ha, that's unavoidable.  I just do what I can, and try to remind myself that if no one at all is mad then what I'm doing probably doesn't matter.
366  Bitcoin / Bitcoin Discussion / Re: The Blocksize Debate & Concerns on: June 21, 2016, 07:05:29 AM
Malleability is already fixed. Segwit does not fix it further.
Technical point: This is very much not the case: Malleability is blocked in the relay network for a narrow class of transactions but anything clever is exposed, multisig is exposed to other-signer malleation, and all transactions are exposed to malleation by miners. Segwit fixes it in a deep and profound way for all purely segwit transactions.

doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol
The whitepaper doesn't say anything about 21 million coins-- or a great many other very important consensus rules. If the rule was intended to _simply_ be temporary it could have had an automatic phase out, but it didn't. If it was intended to go up as it filled, it could have-- just as difficulty adjusts, the mechanism is all there.
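For reference, the kind of automatic in-protocol adjustment being referred to-- the difficulty retarget-- looks roughly like the following sketch (simplified; the real rule operates on the compact target encoding and has a well-known off-by-one in how the timespan is measured):

RETARGET_INTERVAL = 2016        # blocks between adjustments
TARGET_SPACING = 600            # target seconds per block
EXPECTED_TIMESPAN = RETARGET_INTERVAL * TARGET_SPACING

def retarget(old_target, actual_timespan):
    # Clamp the adjustment to a factor of 4 in either direction, then scale
    # the target by how far the last interval was from the expected time.
    actual_timespan = max(EXPECTED_TIMESPAN // 4,
                          min(actual_timespan, EXPECTED_TIMESPAN * 4))
    return old_target * actual_timespan // EXPECTED_TIMESPAN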

Already Ethereum users make the incorrect argument that Bitcoin was hardforked in the past to fix the value overflow bug (it wasn't) and that it's thus okay for them to manually tamper with the ledger to claw back the funds the DAO lost and hand them over to other users.  You're seeing a first-hand demonstration of how quickly people cling to arguments of convenience.

The rule-by-math element of Bitcoin is essential to the value proposition; great care should be taken not to erode it based on short-term interests.
367  Bitcoin / Development & Technical Discussion / Re: Turing completeness and state for smart contract on: June 20, 2016, 07:55:40 AM
but I'm not seeing why exactly this means they shouldn't be written in a declarative language

How about now?

Though your example is making a distinction without a difference-- something like that could easily be converted to something more Bitcoin-script-like... but the real differences I'm highlighting are things like avoiding mutable state, which-- among other benefits-- would absolutely preclude vulnerabilities like that re-entrancy race in the DAO, which is currently in the process of proving to the world that Ethereum isn't the decentralized system it has claimed to be (so, in that sense, the error-prone smart contracting programming model has done the world a service).
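For readers unfamiliar with the DAO bug, here is a toy illustration of the mutable-state re-entrancy pattern-- written in Python rather than any contract language, with everything about it invented for the example: the state update happens only after control is handed to untrusted code, so that code can call back in and withdraw again first.

class ToyBank:
    def __init__(self):
        self.balances = {}

    def deposit(self, who, amount):
        self.balances[who] = self.balances.get(who, 0) + amount

    def withdraw_buggy(self, who, amount, send):
        if self.balances.get(who, 0) >= amount:
            send(amount)                      # external call first...
            self.balances[who] -= amount      # ...state update second: re-entrant callers drain funds

    def withdraw_fixed(self, who, amount, send):
        if self.balances.get(who, 0) >= amount:
            self.balances[who] -= amount      # update state before any external call
            send(amount)

bank = ToyBank()
bank.deposit("attacker", 10)

def evil_send(amount, depth=[0]):
    # Re-enter once: with the buggy ordering this withdraws 20 from a 10 balance.
    if depth[0] == 0:
        depth[0] += 1
        bank.withdraw_buggy("attacker", 10, evil_send)

bank.withdraw_buggy("attacker", 10, evil_send)
print(bank.balances["attacker"])   # -10: more was paid out than was deposited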
368  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 20, 2016, 07:35:33 AM
I know it's "Classic" (and /r/btc) style to spend most of the time talking about Ethereum; but how about we buck that trend and talk about classic instead?

In an ultra rare fit of software development, they've begun to try to add BIP9/68/etc. support to Classic: https://github.com/bitcoinclassic/bitcoinclassic/pull/177

But because their "single constant fork" remained anything but and they've diverged from Core and don't seem to know what they're doing-- they've made a fine hairball of it and the tests aren't passing. I wonder if they'll get it working before CSV is live on the network?

I also wonder, since they already released a "0.12.1" which was, if anything, an anti-0.12.1 (ripped out warnings related to the 0.12.1 features it lacked!)-- are they going to call their kludge port 0.13 and claim to be leaders in Bitcoin innovation-- while lacking the half year of development from dozens of people and dozens of features that will be in the real 0.13?

Only time will tell, but I guess that would be consistent with their "Classic" naming practice, enh?


369  Bitcoin / Development & Technical Discussion / Re: Smart contracts in bitcoin on: June 18, 2016, 07:43:11 PM
Maybe after seeing how the dao and ethereum have ended up we shouldnt really tempt fate too much by tinkering with bitcoin to allow such things. If its not broke dont fix it. Imo smart contract should be left to altcoins, Bitcoin as it stands works at what its supposed to (well within reason, its a good store of value anyway)

Bitcoin has had smart contracts since day one. The architecture of the system is much more fundamentally sound and safe than the system in Ethereum. People have lost money via misuse of Bitcoin script, however, and no one has suggested forcefully adjusting Bitcoin's rules to claw those funds-- or others lost in other incidents-- back.
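Purely as an illustration of "smart contracts since day one": one of the classic contract types, a 2-of-2 multisig, written out as the opcode sequence a script interpreter evaluates (the pubkey placeholders are obviously not real keys).

# Illustrative only: contract logic expressed directly in Bitcoin script.
two_of_two = [
    "OP_2", "<pubkey_a>", "<pubkey_b>", "OP_2", "OP_CHECKMULTISIG",
]
# Spending requires signatures from both keys; the rules live in the script
# itself and are enforced by every validating node, with no mutable global
# state for a counterparty to race against.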

370  Bitcoin / Development & Technical Discussion / Re: Synchronous transactions on: June 17, 2016, 02:08:18 AM
Sure.

Doing that is easy, just make a joint transaction.

(see the coinjoin thread)
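A rough sketch of what a joint transaction looks like (the field names are illustrative, not a real serialization): inputs belonging to both parties are spent in one transaction, so neither payment can happen without the other.

joint_tx = {
    "inputs": [
        {"outpoint": "alice_utxo:0", "signed_by": "alice"},
        {"outpoint": "bob_utxo:1",   "signed_by": "bob"},
    ],
    "outputs": [
        {"address": "merchant_1", "amount": 0.5},
        {"address": "merchant_2", "amount": 0.3},
    ],
}
# With the default SIGHASH_ALL flag, each signature commits to all of the
# outputs, so the transaction is only valid once both parties have signed --
# the payments confirm together or not at all.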
371  Bitcoin / Pools / Re: [14000Th] Eligius: 0% Fee BTC, 105% PPS NMC, No registration, CPPSRB on: June 13, 2016, 06:48:37 PM
Currently it is not practical to solo mine with CSV
Solo mining with it works fine. I'm doing so myself right now.  Please don't conflate issues in BFGminer's GBT support with "solo mining"
372  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 12, 2016, 11:27:34 PM
LOL @ maxwell complaining about r/btc when r/bitcoin has subscribed to outright censorship of all competing viewpoints as its ruling doctrine.
That hasn't been what I've seen: promoting Classic gets removed as offtopic altcoin promotion (minor eyeroll), but if you wanted to point out that Bitcoin Classic just ripped out warnings about hash-power behaving inconsistently with your node's rules, that would be fine.  I don't endorse all of /r/bitcoin's moderation practices, but false equivalences like this are rubbish.

It also shows that many of the arguments being used to advance "Classic" are convenient lies. If 75% of hashpower "deciding" what the network is, and node software failing to fully validate all transactions the same way as the hashpower, are such significant concerns, then why is it okay for Classic to ignore CSV (>98% hashpower supporting now) and rip out all the warnings?
373  Bitcoin / Bitcoin Discussion / Re: BurtW arrested (update: charges dropped!) on: June 12, 2016, 03:00:30 AM
I'm pretty shocked to see some of the comments in this thread.  BurtW has nothing to prove here, and those of you attacking him should be ashamed of yourselves for attacking a victim.
374  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 12, 2016, 12:55:28 AM
BIP-68 and friends, the new sequence lock feature that allows for relative locktimes, are getting close to activation on the network with 99.31% of the last 144 blocks signaling they are ready to activate.

As we've discussed a few times, Bitcoin Classic has still taken no move to support this-- Zander is suggesting they won't (with a rather confusing argument that this feature is not useful without segwit).  Meanwhile, /r/btc noticed the adoption and there was this gem of a post, after someone suggested that the rapid increase from 50% hashpower to 99% meant Classic's 75% threshold was fine:

Quote
But I've been told it takes some people a YEAR to upgrade (really slow typists, I guess).......

I responded, pointing out that yes indeed, in professional operations upgrades can and should take a long time... and also pointed out that this softfork has been released in Bitcoin Core for months (BIP-68 itself is a year old proposal, there is nothing surprising or hasty here), and even though Classic need only forward port their "trivial" single point of departure hardfork to the 0.12.1 codebase-- Classic has taken no action to do so at all. Meaning that if CSV were a 75% hardfork, every classic user (however many there actually are) would _now_ be forked off the network.

We saw this same kind of absentee maintenance with Bitcoin XT-- and, withholding judgement, the freedom to be absentee in this manner is one of the major reasons many people in the community prefer the conservative soft-fork process. Without it, running any implementation but the most popular would be a lot less wise. (Ignoring for the moment the lack of wisdom in running something with an intentionally programmed consensus inconsistency-- it's perfectly possible to make alternative implementations that are consensus consistent.)

It was only after I finished my reply that I realized the author of the post was Gavin from Bitcoin Classic.  So, what, do the classiccoin supporters here need to pool their funds to spring for a copy of Mavis Beacon Teaches Typing?

The standard /r/btc response is to bot-vote my replies to invisibility, so, sadly, the people who would most benefit from seeing my response won't see it... but perhaps it will bring some amusement to people here.
375  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 11, 2016, 09:42:49 AM
Latest "Classic" move,   https://www.reddit.com/r/btc/comments/4nkmzp/the_ultimate_defence_against_the_alleged/   "The ultimate defence against the alleged xthinblock attack is header-first mining"

So "unlimited" (and proposed for "classic" but classic, seems mostly dead) has an efficient block relay scheme (their homegrown analog of BIP152) with a design flaw.

The way it works is this: When I relay a block to you, I give you a list of the transaction IDs in the block so you can match them out of your mempool instead of getting them from me.  To save bandwidth, instead of sending the whole ID I send only the first couple of digits of it.  They reasoned that they sent enough digits that it would be really unlikely for two transactions in your mempool to have the same truncated IDs by chance.

What they didn't account for is the well known result, often called the "birthday paradox", that it is _much_ easier to compute two messages sharing the same short hash than you'd expect.  Because of this, with the scheme in Unlimited it's very easy for people to make pairs of transactions with matching short IDs and send them to the network. Any block that includes one of these transactions will propagate more slowly (because the reconstruction will fail, and it will have to take a round trip and retry with more data).

This flaw is something I spotted back in 2014 while working on some of the design work which later became part of BIP152, and I came up with a simple solution: Instead of truncating the txid, you hash it with a keyed value that isn't known to the attacker (we just have the sender pick one).

It's not the biggest deal in the world, but that fix shuts down some easily perpetrated vandalism (which could also potentially be performed for profit reasons) at basically no cost.
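Here's a minimal sketch of both the flaw and the fix (not the Unlimited or BIP152 wire formats; the short-ID width and hash construction are illustrative choices): a birthday search finds a colliding pair of truncated IDs almost instantly, while a sender-chosen secret salt makes the collisions unpredictable to the attacker.

import hashlib, os

SHORT_ID_BYTES = 4   # illustrative width

def txid(tx):
    return hashlib.sha256(hashlib.sha256(tx).digest()).digest()

def short_id_truncated(tx):
    # Flawed scheme: just truncate the txid.
    return txid(tx)[:SHORT_ID_BYTES]

def short_id_salted(tx, salt):
    # Fix: derive the short id under a secret, per-sender salt.
    return hashlib.sha256(salt + txid(tx)).digest()[:SHORT_ID_BYTES]

def find_truncated_collision():
    """Birthday search: roughly 2**16 tries for 4-byte ids, trivial to run."""
    seen = {}
    while True:
        tx = os.urandom(64)   # stand-in for an attacker-crafted transaction
        sid = short_id_truncated(tx)
        if sid in seen:
            return seen[sid], tx
        seen[sid] = tx

a, b = find_truncated_collision()
assert short_id_truncated(a) == short_id_truncated(b)        # reconstruction would fail
salt = os.urandom(8)
print(short_id_salted(a, salt) == short_id_salted(b, salt))  # almost certainly False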

The "classic" response?  If miners don't verify anything at all, well then it doesn't matter to the miners how long it takes for block data to reach them. And since big miners and companies are all that are classically important, and SPV wallets (which make a strong security assumption that miners validate) are not... why bother fixing the flawed protocol?

Never-mind the fact that classic's attempt at this was already aborted.
376  Bitcoin / Bitcoin Discussion / Re: Why Blockstream is against "contentious" hard forks - Control on: June 11, 2016, 05:18:54 AM
If average tx is twice median tx, will this not lead to under paying tx fees?
(i am asking 21.co also)
The 'average' tx mostly doesn't exist. The sizes of transactions are highly multi-modal. There are a whole lot of small ones (most of the ones people make) and a decreasing number of larger and larger ones that drag up the average. If you're joe-schmoe and make a transaction, it's most likely you'll make a median-sized one.

It's like saying that the average number of testicles an American has is 1... and yet relatively few people have exactly one testicle, even though it's the average.

[In no case should fees be under-paid... in competent wallets fees are configured per unit size and set accordingly. Smiley ]
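A small worked example of the bracketed point-- fees set per unit of size-- using an arbitrary example feerate and the ~226 byte median size discussed later on this page:

feerate = 60            # satoshis per byte (arbitrary example, not advice)
median_size = 226       # bytes, a typical median transaction size
large_size = 1000       # bytes, a bigger multi-input transaction

print(feerate * median_size)   # 13560 satoshis
print(feerate * large_size)    # 60000 satoshis -- more in total, same feerate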

Another way to think about it is that "a transaction" isn't really a great unit of capacity. Consider, what if everyone started using 4-party non-amount-matched coinjoins for all their transactions. We'd end up with 1/4 the number of "transactions" each of 4x the size. Would it be right to say that suddenly the Bitcoin network had 1/4th the capacity because now the tx were 4x larger?  No-- they're also doing 4x as much.
377  Bitcoin / Bitcoin Discussion / Re: Why Blockstream is against "contentious" hard forks - Control on: June 10, 2016, 05:20:24 AM
So why are you talking about the median? It is a completely worthless statistic with regards to this topic.
It's a great statistic if you're talking about typical transaction fees, which I believe was the original topic.
378  Bitcoin / Bitcoin Discussion / Re: Why Blockstream is against "contentious" hard forks - Control on: June 06, 2016, 05:10:48 PM
Correct. While I might have wrongly used 'average' instead of median somewhere (I don't recall all of the times that I've written about this), Maxwell never did.

but gmaxwel has

median transaction size of 226byte??
i think u meant minimum not median
That is the median size.

again the 226byte is MINIMUM.. not median, not average.. the median is about 500.. the average is similar
median means middle number. and is usually close to an average, well atleast in the same ball park.. its definetly not the minimum or the maximum.. but the amount between the two..

No. 226 is actually the _median_ transaction size.


# IPython session against a local node. Assumes: from bitcoinrpc.authproxy import AuthServiceProxy; import numpy
In [46]: pp = AuthServiceProxy("http://bitcoinrpc:password@127.0.0.1:8332")
In [47]: txa = [pp.getrawtransaction(x, 1) for x in pp.getblock(pp.getblockhash(415093), True)['tx'][1:]]  # every tx in block 415093 except the coinbase
In [48]: sum([x['size'] for x in txa])           # total size of those transactions, in bytes
Out[48]: 999724
In [49]: numpy.median([x['size'] for x in txa])  # the median transaction size
Out[49]: 226.0


Same story for pretty much every block.

(Minimum is 189 in that block FWIW).

Franky1, in this case you were just confused-- but you've got a number of other claims, like saying CT is part of Core's published roadmap, that are outright lies. I think you need to stop wasting everyone's time.
379  Bitcoin / Bitcoin Discussion / Re: Why Blockstream is against "contentious" hard forks - Control on: June 06, 2016, 05:32:19 AM
CT isn't part of Bitcoin Core's roadmap at this time; but somehow it's not shocking that you're vigorously opposed to it for unexplained reasons.  There are like a bazillion people on /r/btc who would love to hear your theories that Bitcoin Core is bad because the blocks will be _bigger_ under its plans, though-- I suggest you go share your theories there.
380  Bitcoin / Bitcoin Discussion / Re: Why Blockstream is against "contentious" hard forks - Control on: June 06, 2016, 04:36:18 AM
Bitcoin has specific affordances which were added to enable softforks: things like the NOPs in script (which were added to replace an earlier mechanism that caused random, uncoordinated hardforks) and transaction version numbers. Softforks were used by Bitcoin's creator several times early in the system's existence, e.g. to do things like fix script or add height-based nLockTime.
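As a sketch of why redefining a NOP is a softfork (simplified, and not Bitcoin Core's interpreter): old nodes treat the opcode as "do nothing", so every script new nodes accept is also accepted by old nodes-- the rules only ever get stricter.

class ScriptError(Exception):
    pass

def eval_nop2_old(stack, tx_locktime):
    return   # pre-fork behavior: OP_NOP2 has no effect on script evaluation

def eval_nop2_new(stack, tx_locktime):
    # Post-fork behavior: OP_NOP2 redefined as OP_CHECKLOCKTIMEVERIFY
    # (simplified; the real rule also checks operand type and input sequence).
    if not stack or tx_locktime < stack[-1]:
        raise ScriptError("locktime requirement not satisfied")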