Bitcoin Forum
July 02, 2016, 02:12:31 AM
News: Latest stable version of Bitcoin Core: 0.12.1 [Torrent]
  Show Posts
1  Bitcoin / Bitcoin Discussion / Re: Current SegWit code does not fix O(n^2) on: June 30, 2016, 07:33:06 AM
If you've seen his posts anywhere else before, you'd know that he calls it the "The SegWit Omnibus Changeset".
I actually haven't since he's on my ignore list (and I'm only replying to satisfy my masochistic tendencies), but I did Google the phrase and fail to find anything useful. I'm just going to assume it means "SegWit plus all other Core updates I don't understand or like".

Don't be so hard on him on this (every other reason is good to go).  SegWit Omnibus changeset sounds like something _I_ would say-- I'd use it to refer to the pull request that implemented the segwit consensus rules, the segwit wallet support, and the huge amount of testing infrastructure.
2  Bitcoin / Bitcoin Discussion / Re: Current SegWit code does not fix O(n^2) on: June 30, 2016, 07:25:32 AM
We layfolk are not party to the detailed development plans, and that is OK. However, with several Core supporters deriding alternative node implementations for limiting the effects of the O(n^2) issue, rather than solving it head on

Fundamental misunderstanding, conflating the protocol with non-normative implementation particulars.  The Bitcoin protocol has a design flaw where transaction validation can take time quadratic in the size of the transaction. No implementation can avoid this wasteful computation because it is a consensus rule normative to the protocol.

With an increase in blocksize this wasteful computation could easily be turned into a system halting denial of service.

Rather than fixing it, Bitcoin Classic implemented yet another useless hard limit on transaction sizes -- to keep the bleeding at a moderate level (still allowing blocks to trigger 1.2 _gigabytes_ of hashing).

Segwit's design addressed the issue in two ways. One is that the extra capacity in segwit is for witness data, which is not hashed by the signature hasher; because of this, even with no fix, the worst case possible is much less significant than a plain 2MB block. The other is that segwit changes the data structure which is hashed so that it no longer requires the quadratic computation: by making the part of the hashing that all signatures share identical, that computation can be reused, and the resulting structure can be hashed with O(n) work instead of O(n^2). These have both been done, integrated, and tested since 2015. Both are fundamental to segwit.
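To make the asymptotics concrete, here's a toy byte-counting model (the constants are illustrative round numbers, not the real serialization sizes):

```python
def legacy_sighash_bytes(n_inputs, input_size=150):
    # Pre-segwit signature hashing: each input's signature hashes a fresh
    # serialization of (roughly) the whole transaction, so the total bytes
    # hashed grow quadratically with the number of inputs.
    tx_size = n_inputs * input_size
    return n_inputs * tx_size

def segwit_sighash_bytes(n_inputs, input_size=150):
    # BIP143-style hashing: the components shared by every input (prevouts,
    # sequences, outputs) are hashed once and reused, so each input only
    # adds a small, roughly fixed-size message of its own.
    shared = n_inputs * input_size        # hashed once, shared by all inputs
    per_input = 200                       # illustrative constant
    return shared + n_inputs * per_input

for n in (100, 1_000, 10_000):
    print(n, legacy_sighash_bytes(n), segwit_sighash_bytes(n))
# Doubling the inputs quadruples the legacy hashing work, but only
# doubles the segwit hashing work.
```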

The point Peter Todd was making was that the segwit implementation in Bitcoin Core doesn't yet make use of that changed structure from the second improvement to actually save the computation it makes possible. There is an open pull request for it; it just isn't integrated yet. Btcd's implementation does, however. This is an implementation-specific difference: on my computer it does, on yours it doesn't, and we're totally compatible. Similarly, even ignoring segwit, Bitcoin Core is normally many times faster than btcd, but both are compatible.

Our focus is on correctness, compatibility, and ensuring flexibility, not in getting in every last possible optimization into the system on day one. Not changing the signature-hashing algorithm, just changing the data structure, made review for correctness easier, and also allowed compatibility testing (between the naive code, the unmerged optimization, and the btcd implementation).

The important thing is that the design flaw has been eliminated for segwit txn, allowing implementations to adopt the optimization at their leisure. It makes no difference at all whether anyone actually makes use of the new structure until segwit is activated on the network.
3  Bitcoin / Bitcoin Discussion / Re: Current SegWit code does not fix O(n^2) on: June 29, 2016, 09:35:34 PM
lets get to the short and curlies of it.

Can you explain to me what the change does and what significance it has?

I think that would do more to get at the details -- like making it clear whether your "concern" is motivated by harassment rather than actual concern.

But in regards to a publicly downloadable implementation that handles real bitcoin data on the real live bitcoin network:
will the optimization be included in the next release?
I expect it to be included in any release with segwit activated.
4  Bitcoin / Bitcoin Discussion / Re: Current SegWit code does not fix O(n^2) on: June 29, 2016, 07:36:49 PM
Weird thread.

Fixing the O(n^2) issue required a change in how signature hashes are computed so that the work of hashing the transaction can be reused between multiple signatures. Then it requires making use of that change via an optimization to actually reuse the computation. (This optimization has earned the cheeky name "hashcache".)

The segwit consensus rules do the former. The segwit PR to Bitcoin Core didn't include the latter optimization, because it isn't needed for the system to work (assuming we don't care that it can be slow), and packing everything in at once complicates review and increases risk. The correctness of the optimization is easier to verify as a change by itself, and the correctness of segwit was easier to verify without the optimization in the way.

But the code to actually make use of the format change is in the queue as well (the pull request was opened 5 days ago, but the code for it was written January 19th). This optimization is also part of the btcd implementation and has been since they originally wrote it.
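For the curious, the general shape of such a caching optimization can be sketched like this (illustrative names and structure only -- not Bitcoin Core's or btcd's actual code):

```python
import hashlib

def dhash(b: bytes) -> bytes:
    """Double-SHA256, as Bitcoin uses for its hashes."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

class SigHashCache:
    """Toy 'hashcache': the transaction components shared by every input's
    signature hash are computed once and memoized, so hashing work no
    longer scales with (inputs x transaction size)."""
    def __init__(self, prevouts: bytes, sequences: bytes, outputs: bytes):
        self._parts = {"prevouts": prevouts, "sequences": sequences, "outputs": outputs}
        self._cache = {}
        self.shared_hash_calls = 0  # counts how often shared parts are hashed

    def shared(self, name: str) -> bytes:
        if name not in self._cache:               # computed at most once each
            self._cache[name] = dhash(self._parts[name])
            self.shared_hash_calls += 1
        return self._cache[name]

    def sighash(self, input_data: bytes) -> bytes:
        # Each input hashes only a small message built from cached digests.
        msg = self.shared("prevouts") + self.shared("sequences") + input_data + self.shared("outputs")
        return dhash(msg)

cache = SigHashCache(b"prevouts...", b"sequences...", b"outputs...")
sigs = [cache.sighash(b"input %d" % i) for i in range(100)]
print(cache.shared_hash_calls)  # 3: shared parts hashed once, not once per input
```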

Considering this, the presentation of this as some kind of flaw or surprising find makes it look to me like people are desperate to find something wrong. I suppose that is good: more eyes may reduce the risk of actual issues going unfound. Hopefully the next time it won't be over a missing optimization which we've had an implementation of for six months. Smiley
5  Bitcoin / Development & Technical Discussion / Re: Turing completeness and state for smart contract on: June 27, 2016, 09:16:03 PM
The argument in Bitcoin is simple: If doing anything interesting with Bitcoin forces you back to a purely trusted third party, the value of Bitcoin as a permissionless asset is diminished.

Bitcoin's value comes partially from decentralization, and that value is maximized by minimizing the frequency with which you need to put those benefits aside. No more justification is needed.

Re other things copying Ethereum's smart contracting system: for me, the recent snafu in Ethereum is experimental confirmation of a long-held belief about that design. My view that the design is flawed wouldn't be changed by putting it in another altcoin or a sidechain. That doesn't mean I'm opposed to other people trying other things -- more experimental evidence.
6  Alternate cryptocurrencies / Altcoin Discussion / Re: [VNL] Vanillacoin, a quiet word of warning. on: June 27, 2016, 07:39:58 PM
It looks like someone was getting jealous of scammer Craig Wright's bogus patent drama and decided to start some of their own, with an extra helping of incoherent gibberish.
7  Bitcoin / Bitcoin Discussion / Re: Segregated Witness has been merged! on: June 25, 2016, 09:09:55 AM
The people who had signed that agreement, "vouched" to deliver a HF proposal
The antagonism and misrepresentations -- things like saying Core agreed, that Blockstream is Core, that people were planning on working on this _before_ segwit, that miners would refuse segwit (a capacity increase) unless there were some hardfork, etc. -- along with F2Pool not upholding their end seem to have really demoralized most of the people involved, and caused some loss of face in the technical community for them. They were working on some designs, but I think the threats caused them to deprioritize that effort. (Consider: if they post a proposal and it arms more threatening behavior, they'll take some blame for facilitating that -- a situation which would be bad for the value of their Bitcoin holdings as well as their professional reputations.) In effect, this work is currently pre-undermined because a large body of people won't touch it with a ten-foot pole while it's tied up with intolerable behavior, and that makes it a lot less interesting to work on.

Really the biggest lesson from the earlier chapters of this debacle is that you can't convince principled people in the community with threats. Probably the worst thing that could have been done in this blocksize nonsense was running to the mass media with "Bitcoin is forking!" before even writing a BIP... The whole value proposition of Bitcoin depends on Bitcoin being resistant to political coercion. Convincing people that we all have a common shared interest to cooperatively move in particular directions that benefit everyone is a strategy that works (which is why segwit itself has such broad support; even most people that prefer other things admit that it's a really good move). That works because it is a way to encourage change that, if it succeeds, doesn't come with too much risk of adverse change in the future. If threats and antagonism or PR campaigns based on ignorance (Make Bitcoin Great Again!) are shown to be a successful way to enact change (even otherwise good changes), then they could be abused in the future to make bad changes (such as transaction censorship or undermining the system's monetary policy), and that cannot be allowed to happen if Bitcoin is to remain valuable: if someone could credibly argue that in the past changes were pushed through via threatening actions by a small set of high hash-power miners, this would be powerful FUD that would erode the value proposition. I think a lot of people (including a lot of big miners) are committed to avoiding that outcome.
8  Bitcoin / Bitcoin Discussion / Re: Segregated Witness has been merged! on: June 25, 2016, 08:11:58 AM
first it was bribery with the "fee discount" now its blackmail with the mining algorithm hardfork.
Idiot. Miners are service providers for the Bitcoin network. They order transactions for the network's nodes and are paid _handsomely_ for this service. If they start blocking transaction features in an attempt to demand control over the system, they're going to get fired. This isn't anything that I would do; it's the natural and obvious response to coercive behavior from miners, no different than the response to any other kind of censorship activity. Many people already call for POW changes due to heavy mining centralization; so far I've fought back against those calls to promote harmony and avoid disruption, but that would be a position that would be hard to stick to in this kind of hypothetical.

But it's all moot, because that isn't what things actually look like. I talk to a LOT of miners (and, of course, I'm a miner too) and I don't expect there to actually be any issue there.
9  Bitcoin / Bitcoin Discussion / Re: Segregated Witness has been merged! on: June 25, 2016, 07:54:34 AM
Blockstream was created by several of the most active Bitcoin core developers. It represents over 10 Bitcoin core developers who provide together most commits to Bitcoin core.
This is a load of bullshit. Sure, we do a lot -- but mostly in contrast to the general failure of most of the Bitcoin industry to support infrastructure development. I'm not aware of any way of counting that concludes anything close to "most". There are a good hundred contributors to Core, and a great many very active ones who aren't at Blockstream.

Anyone talking about blocking segwit for some hardfork would be proving the big stinking lie of any claim that they were concerned with capacity. Core can't and isn't going to respond to threatening behavior. Fortunately, that isn't really the situation here -- except in the fevered minds of the /r/btc fudsters. Smiley I'm not concerned about activation having any deployment issues. (And if it is delayed by some miners, that's an issue the community can take up with them. I don't consider the segwit deployment urgent.)

The HK agreement between miners and blockstream (basically Core) represents consensus.
Blockstream has no such agreement, and it wouldn't matter if it did. (And, in fact, any effort to push such a thing would be an immediate actionable violation of my employment agreement; and trigger a parachute clause)

introducing additional features into Bitcoin that are increasingly complex
What do you think is increasingly complex? I think segwit makes Bitcoin simpler and safer. (though because the old things are still supported you can say there is some increase, but it seems pretty modest considering the simplicity it brings to people who are only using segwit and the number of problems it solves)
10  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 25, 2016, 07:31:31 AM
I also wonder, since they already released a "0.12.1" which was, if anything, an anti-0.12.1 (ripped out warnings related to the 0.12.1 features it lacked!)-- are they going to call their kludge port 0.13 and claim to be leaders in Bitcoin innovation-- while lacking the half year of development from dozens of people and dozens of features that will be in the real 0.13?
Why do things by half measures? Behold, Bitcoin Classic 1.1.0! Grin
Mark your calendars; let's see how long until there are sock accounts posting that Bitcoin Core is an outdated version, since Classic is 1.1.0... (while it's still actually a crappy, barely maintained fork of an outdated copy of Bitcoin Core)

Predictably dishonest people are predictably dishonest.
11  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 25, 2016, 07:25:44 AM
among them the implementation of bitcoin's scripting language on which LN will be dependent.
What are you talking about?

Re altcoins: they're welcome to take our open code, but most altcoins don't have active development teams... and most of them are based on seriously outdated Bitcoin Core codebases and have serious bugs (either ones we fixed long ago -- or ones they added themselves). That no one has generally exploited them suggests they're not very useful test points.

And what does any of this have to do with classic other than classic being seriously out of date?

Apparently I just hold developers to a higher standard than you.
You do? Oh. So perhaps you can tell me who is funding classic developers Zander and Gavin?
12  Bitcoin / Bitcoin Discussion / Re: The Blocksize Debate & Concerns on: June 21, 2016, 09:50:47 AM
Out of curiosity, would you be willing to state your personal opinion on under what conditions it would be appropriate and/or beneficial to raise the blocksize limit?
I'd be glad to, and have many times in the past (also on the more general subject of hardforks), though I don't know that my particular views matter that much. I hope they do not.

What particular concerns, technical or otherwise, do you consider most important when it comes to considering alterations to this particular aspect of the system? Of course, blocksize is only one aspect of capacity (and a horribly inefficient method of scaling capacity), but I'm curious if there are any particular conditions under which you feel raising the blocksize would be the appropriate solution. Is it only a last-resort method of increasing capacity? And when it comes to capacity, to what extent should it be increased? Is, "infinite", capacity even something we should be targeting?
Since we live in a physical world, with computers made of matter and run on energy there will be limits. We should make the system as _efficient_ as possible because, in a physical world, one of the concerns is that there is some inherent trade-off between blockchain load and decentralization. (Note that blockchain load != Bitcoin transactional load, because there are _many_ ways to transact that have reduced blockchain impact.) ... regardless of the capacity level we're at, more efficiency means a better point on that trade-off.

Not screwing up decentralization early in the system's life is a high priority for me: it is utterly integral to the system's value proposition in a way that few other properties are, and every loss of decentralization we've suffered over the system's life has been used to argue that the next loss isn't a big deal. Making progress backwards is hard: the system actually works _better_ in a conventional sense, under typical usage, as it becomes more centralized: upgrades are faster and easier, total resource costs are lower, behavior is more regular and consistent, some kinds of attacks are less commonly attempted, properties which are socially "desirable" but not rational for participants don't get broken as much, etc. Decentralization is also central to other elements that we must improve to preserve Bitcoin's competitiveness as a worldwide, open, and politically neutral money -- such as fungibility.

But even if we imagine that we got near infinite efficiency, would we want near infinite capacity?

Bitcoin's creator proposed that security would be paid for in the future by users bidding for transaction fees. If there were no limit to the supply of capacity, then even a single miner could always make more money by undercutting the market and clearing it*. Difficulty can adapt down, so low fee income can result in security melting away. This could potentially be avoided by cartel behavior among miners, but having miners collude to censor transactions would be quite harmful... and in a physical world with non-infinite efficiency, preventing miners from driving nodes off the network is already necessary. Does that have to work as a static limit? No -- and I think some of the flexcap proposals have promise for addressing this point in the future.
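The undercutting argument can be illustrated with a toy fee-market model (a gross simplification with made-up numbers -- real fee dynamics are more complicated):

```python
def equilibrium_fee(willingness_to_pay, capacity):
    """Toy model: with scarce block space, the going fee is set by the
    marginal included transaction's willingness to pay; with effectively
    unlimited space, nobody needs to outbid anyone and fees fall to ~0."""
    bids = sorted(willingness_to_pay, reverse=True)
    if capacity is None or capacity >= len(bids):
        return 0  # no competition for space, so no fee pressure
    return bids[capacity - 1]  # the marginal included bid sets the price

wtp = [100, 80, 60, 40, 20, 5, 1]  # what each pending tx would pay at most

# Scarce space: the fee is set by the 3rd-highest bid.
print(equilibrium_fee(wtp, 3))     # 60
# Unlimited space: the fee collapses.
print(equilibrium_fee(wtp, None))  # 0

# Miner fee revenue (the security budget) in each regime:
print(equilibrium_fee(wtp, 3) * 3)            # 180
print(equilibrium_fee(wtp, None) * len(wtp))  # 0: security melts away
```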

Some have proposed that instead of fees, security in the future would be provided by altruistic companies donating to miners. The specific mechanisms proposed didn't make much sense -- e.g. they'd pay attackers just as much as honest miners -- but that could be fixed... The whole model of using altruism to pay for a commons has significant limitations, though, especially in a global anonymous network. We haven't shown altruism of this type to be successful at funding development or helping to preserve the decentralization of mining (e.g. p2pool). I currently see little reason to believe that this could be a workable alternative to the idea initially laid out of using fees... Of course, if you switch it around from altruism to a mandatory tax, you end up with the inflationary model of some altcoins -- and that probably does "work", but it's not the economic policy we desire (and, in particular, without a trusted external input to control the inflation rate, it's not clear that it would really work for anyone).

So in the long term part of my concern is avoiding the system drifting into a state where we're all forced to choose between inflation or failure (in which case, a bitcoin that works is better than one that doesn't...).

As far as when: I think we should exercise the most extreme caution with incompatible changes in general. If it's really safe and needed we can expect to see broad support... and it becomes easier to get there when efforts are made to address the risks, e.g. segwit was a much easier sell because it improved scalability while at the same time increasing capacity. Likewise, I expect successful future increases to come with or after other risk mitigations.

(* We can ignore orphaning effects for a few reasons: orphaning increases as a function of transaction load can be ~completely eliminated with relay technology improvements, and if not that, by miners centralizing... and if all of a miner's income is going to pay for orphaning losses, there will be no excess paying for competition and thus security. And, finally, if transaction fees are mostly uniform, the only disincentive from orphaning comes from the loss of subsidy, which will quickly become inconsequential unless Bitcoin is changed to be inflationary.)

It is my understanding that Core's implementation of segwit will also include an overall, "blocksize", increase (between both the witness and transaction blocks), though with a few scalability improvements that should make the increase less demanding on the system (linear verification scaling with segwit and Compactblocks come to mind). Do you personally support this particular instance of increasing the overall, "blocksize"?
I think the capacity increase is risky. The risks are compensated by improvements (both recent ones already done and immediately coming, e.g. libsecp256k1, compactblocks, tens of fold validation speed improvements, smarter relay, better pruning support) along with those in segwit.

I worry a lot that there is a widespread misunderstanding that blocks being "full" is bad. Block access is a priority queue based on feerate -- and at a feerate of ~0 there is effectively infinite demand (for highly replicated perpetual storage). I believe that (absent radical new tech that we don't have yet) the system cannot survive as a usefully decentralized system if the response to "full" is to continually increase capacity (such a system would have almost no nodes, and also potentially have no way to pay for security). One of the biggest problems with hardfork proposals was that they directly fed this path-to-failure, and I worry that the segwit capacity increase may contribute to that too... e.g. that we'll temporarily not be "full" and then we'll be hit with piles of constructed "urgent! crash landing!" pressure to increase again to prevent "full" regardless of the costs. E.g. a constant cycle of short-term panic about an artificial condition pushing the system away from long-term survivability.
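What "block access is a priority queue based on feerate" means, in toy form (not Core's actual transaction selection code):

```python
import heapq

def select_transactions(mempool, max_block_size):
    """Toy block builder: space is allocated by feerate (fee per byte),
    so a 'full' block just means lower-feerate transactions wait."""
    # heapq is a min-heap, so push negated feerates to pop highest first.
    heap = [(-fee / size, size, fee, txid) for txid, fee, size in mempool]
    heapq.heapify(heap)
    chosen, used = [], 0
    while heap:
        neg_rate, size, fee, txid = heapq.heappop(heap)
        if used + size <= max_block_size:
            chosen.append(txid)
            used += size
    return chosen

# (txid, fee, size): feerates are 100, 2, and 40 respectively
mempool = [("a", 50_000, 500), ("b", 1_000, 500), ("c", 10_000, 250)]
print(select_transactions(mempool, 750))  # ['a', 'c'] -- 'b' waits, not lost
```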

But on the balance, I think the risks can be handled and the capacity increase will be useful, and the rest of segwit is a fantastic improvement that will set the stage for more improvements to come. Taking some increase now will also allow us to experience the effects and observe the impacts which might help improve confidence (or direct remediation) needed for future increases.

To be clear, I'm just asking as someone who'd like to hear your informed opinion. In the short-term I'm not exactly worried about transaction capacity--certainly not hardfork worried--as with segwit on the way and the potential for LN or similar mechanisms to follow, short-term capacity issues could well be on their way out (really all I want short-term is an easier way to flag and use RBF so I can fix my fees during unexpected volume spikes). What I'm more curious about at this point are your views on the long-term--the, "133MB infinite transactions", stage.
I think we don't know exactly how lightning usage patterns will play out, so the resource gains are hard to estimate with any real precision. Right now bidi channels are the only way we know of to get to really high total capacity without custodial entities (which I also think should have a place in the Bitcoin future).

Some of the early resource estimates from lightning have already been potentially made obsolete by new inventions. For example, the lightning paper originally needed the ability to have a very high peak blocksize (though such blocks could be made 'costly' for miners to build, like flexcap systems) in order to handle the corner case where huge numbers of channels were uncooperatively closed all at once and all had to get their closures in before their timeouts expired. In response to this, I proposed the concept of a sequence lock that stops when the network is beyond capacity ("timestop"); it looks like this should greatly reduce the need for big blocks, at the cost of potentially delaying closure when the network is unusually loaded -- though further design and analysis is needed. I think we've only started exploring the potential design space with channels.
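A sketch of the timestop idea, purely illustrative (this is a design concept under discussion, not deployed consensus code; the threshold rule is my hypothetical):

```python
def blocks_until_timeout(timeout_blocks, recent_block_fullness, congestion_threshold=0.95):
    """Toy 'timestop': a relative timeout only 'ticks' on blocks below a
    fullness threshold, so a flood of simultaneous channel closures can't
    race everyone's timeouts past them while blocks are congested."""
    ticks = sum(1 for fullness in recent_block_fullness
                if fullness < congestion_threshold)
    return max(timeout_blocks - ticks, 0)

# A 10-block timeout; 6 blocks pass, but 3 of them are congested,
# so only 3 count against the timeout.
recent = [0.5, 0.99, 0.6, 1.0, 0.98, 0.4]
print(blocks_until_timeout(10, recent))  # 7
```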

Besides capacity, payment channels (and other tools) provide other features that will be important in bringing our currency to more places-- in particular, "instant payment".

As much as I personally dislike it, other services like credit are very common and highly desired by some markets-- and that is a service that can be offered by other layers as well.

I'm sorry to say that an easy-to-use fee-bump replacement feature just missed the merge window for Bitcoin Core 0.13. I'm pretty confident that it'll make it into 0.14. I believe Green Address has a feebump available already (but I haven't tried it). 0.13 will have ancestor feerate mining ("child pays for parent"), so that is another mechanism that should help unwedge low-fee transactions, though it's not as useful as replacement.
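The idea behind ancestor feerate mining can be sketched like so (a toy structure, not Core's actual mempool code):

```python
def ancestor_feerate(txid, txs):
    """'Child pays for parent' scoring: a transaction is evaluated on the
    combined fee and size of itself plus its unconfirmed ancestors, so a
    high-fee child can pull a stuck low-fee parent into a block."""
    seen, stack = set(), [txid]
    total_fee = total_size = 0
    while stack:
        t = stack.pop()
        if t in seen:
            continue
        seen.add(t)
        fee, size, parents = txs[t]
        total_fee += fee
        total_size += size
        stack.extend(parents)          # walk up to unconfirmed ancestors
    return total_fee / total_size

# txid -> (fee, size, unconfirmed parents)
# parent stuck at 1 sat/byte; child pays 99 sat/byte on top of it
txs = {"parent": (250, 250, []), "child": (24_750, 250, ["parent"])}
print(ancestor_feerate("parent", txs))  # 1.0 -- hopeless on its own
print(ancestor_feerate("child", txs))   # 50.0 -- child + parent as a package
```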

Of course, you're free to keep your opinions to yourself or only answer as much as you're comfortable divulging; it's not my intent to have you say anything that would encourage people to sling even more shit at you than they already have been lately Undecided
Ha, that's unavoidable.  I just do what I can, and try to remind myself that if no one at all is mad then what I'm doing probably doesn't matter.
13  Bitcoin / Bitcoin Discussion / Re: The Blocksize Debate & Concerns on: June 21, 2016, 07:05:29 AM
Malleability is already fixed. Segwit does not fix it further.
Technical point: This is very much not the case: Malleability is blocked in the relay network for a narrow class of transactions but anything clever is exposed, multisig is exposed to other-signer malleation, and all transactions are exposed to malleation by miners. Segwit fixes it in a deep and profound way for all purely segwit transactions.
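A toy model of why moving signatures out of the txid fixes malleation for purely segwit transactions (illustrative hashing only, not the real serialization):

```python
import hashlib

def dhash(b: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

body = b"inputs|outputs|locktime"
sig_a = b"sig-encoding-A"
sig_b = b"sig-encoding-B"  # semantically equivalent signature, different encoding

# Pre-segwit: the txid commits to the signatures, so anyone who can produce
# an equivalent-but-different signature encoding changes the txid without
# changing what the transaction does -- that's malleation.
legacy_txid_a = dhash(body + sig_a)
legacy_txid_b = dhash(body + sig_b)
print(legacy_txid_a == legacy_txid_b)  # False

# Segwit: the txid commits only to the non-witness data; signatures live in
# the witness, which is committed to separately. The txid is now stable.
segwit_txid_a = dhash(body)
segwit_txid_b = dhash(body)
print(segwit_txid_a == segwit_txid_b)  # True
```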

doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol
The whitepaper doesn't say anything about 21 million coins-- or a great many other very important consensus rules. If the rule was intended to _simply_ be temporary it could have had an automatic phase out, but it didn't. If it was intended to go up as it filled, it could have-- just as difficulty adjusts, the mechanism is all there.

Already Ethereum users make the incorrect argument that Bitcoin was hardforked in the past to fix the value overflow bug (it wasn't) and that it's thus okay for them to manually tamper with the ledger to claw back the funds the DAO lost and hand them over to other users. You're seeing a first-hand demonstration of how quickly people cling to arguments of convenience.

The rule by math element of Bitcoin is essential to the value proposition, great care should be taken to not erode it based on short term interests.
14  Bitcoin / Development & Technical Discussion / Re: Turing completeness and state for smart contract on: June 20, 2016, 07:55:40 AM
but I'm not seeing why exactly this means they shouldn't be written in a declarative language

How about now?

Though your example is making a distinction without a difference, something like that could easily be converted to something more Bitcoin script like... but the real differences I'm highlighting are things like avoiding mutable state, which-- among other benefits-- would absolutely preclude vulnerabilities like that re-entrancy race in the DAO, which is currently in the process of proving to the world that ethereum isn't the decentralized system it has claimed to be (so, in that sense, the error prone smart contracting programming model has done the world a service).
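For those unfamiliar with the failure mode, here is the re-entrancy pattern in miniature (a toy Python model, obviously not Ethereum code):

```python
class VulnerableBank:
    """Toy model of the DAO-style re-entrancy flaw: the mutable balance is
    updated only AFTER the external call, so a malicious payee can re-enter
    withdraw() while the stale balance is still credited."""
    def __init__(self, balance):
        self.balances = {"attacker": balance}

    def withdraw(self, who, pay_out):
        amount = self.balances[who]
        if amount > 0:
            pay_out(amount)            # external call happens first...
            self.balances[who] = 0     # ...state is mutated too late

bank = VulnerableBank(10)
stolen = []

def attacker_payee(amount):
    stolen.append(amount)
    if len(stolen) < 5:                # re-enter before the balance is zeroed
        bank.withdraw("attacker", attacker_payee)

bank.withdraw("attacker", attacker_payee)
print(sum(stolen))  # 50: five withdrawals of a 10-unit balance
```

Without mutable state visible mid-execution (as in Bitcoin script, where a transaction either validates or it doesn't), this class of bug cannot be expressed at all.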
15  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 20, 2016, 07:35:33 AM
I know it's "Classic" (and /r/btc) style to spend most of the time talking about Ethereum; but how about we buck that trend and talk about classic instead?

In an ultra rare fit of software development, they've begun to try to add BIP9/68/etc. support to Classic:

But because their "single constant fork" remained anything but and they've diverged from Core and don't seem to know what they're doing-- they've made a fine hairball of it and the tests aren't passing. I wonder if they'll get it working before CSV is live on the network?

I also wonder, since they already released a "0.12.1" which was, if anything, an anti-0.12.1 (ripped out warnings related to the 0.12.1 features it lacked!)-- are they going to call their kludge port 0.13 and claim to be leaders in Bitcoin innovation-- while lacking the half year of development from dozens of people and dozens of features that will be in the real 0.13?

Only time will tell, but I guess that would be consistent with their "Classic" naming practice, enh?

16  Bitcoin / Development & Technical Discussion / Re: Smart contracts in bitcoin on: June 18, 2016, 07:43:11 PM
Maybe after seeing how the DAO and Ethereum have ended up, we shouldn't really tempt fate too much by tinkering with Bitcoin to allow such things. If it's not broke, don't fix it. IMO smart contracts should be left to altcoins; Bitcoin as it stands works at what it's supposed to (well, within reason -- it's a good store of value anyway)

Bitcoin has had smart contracts since day one. The architecture of the system is much more fundamentally sound and safe than the system in Ethereum. People have lost money via misuse of Bitcoin script, however, and no one has suggested forcefully adjusting Bitcoin's rules to claw those funds-- or others lost in other incidents-- back.

17  Bitcoin / Development & Technical Discussion / Re: Synchronous transactions on: June 17, 2016, 02:08:18 AM

Doing that is easy, just make a joint transaction.

(see the coinjoin thread)
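In toy form (illustrative structure only -- a real joint transaction also requires each party's signatures over all inputs before it is valid, which is what makes settlement atomic):

```python
def make_joint_transaction(parties):
    """Sketch of a joint (CoinJoin-style) transaction: each party
    contributes inputs and outputs, all merged into one transaction that
    either confirms for everyone or for no one."""
    tx = {"inputs": [], "outputs": []}
    for p in parties:
        tx["inputs"].extend(p["inputs"])
        tx["outputs"].extend(p["outputs"])
    in_total = sum(value for _, value in tx["inputs"])
    out_total = sum(value for _, value in tx["outputs"])
    assert in_total >= out_total, "outputs exceed inputs"
    return tx, in_total - out_total    # any remainder becomes the fee

alice = {"inputs": [("alice_utxo", 5)], "outputs": [("bob_addr", 5)]}
bob = {"inputs": [("bob_utxo", 3)], "outputs": [("alice_addr", 2), ("carol_addr", 1)]}
tx, fee = make_joint_transaction([alice, bob])
print(len(tx["inputs"]), len(tx["outputs"]), fee)  # 2 3 0
```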
18  Bitcoin / Pools / Re: [14000Th] Eligius: 0% Fee BTC, 105% PPS NMC, No registration, CPPSRB on: June 13, 2016, 06:48:37 PM
Currently it is not practical to solo mine with CSV
Solo mining with it works fine; I'm doing so myself right now. Please don't conflate issues in BFGMiner's GBT support with "solo mining".
19  Bitcoin / Bitcoin Discussion / Re: ToominCoin aka "Bitcoin_Classic" #R3KT on: June 12, 2016, 11:27:34 PM
LOL @ maxwell complaining about r/btc when r/bitcoin has subscribed to outright censorship of all competing viewpoints as its ruling doctrine.
That hasn't been what I've seen: posts promoting Classic get removed as off-topic altcoin promotion (minor eyeroll), but if you wanted to point out that Bitcoin Classic just ripped out warnings about hash-power behaving inconsistently with your node's rules, that would be fine. I don't endorse all of /r/bitcoin's moderation practices, but false equivalences like this are rubbish.

It also shows that many of the arguments being used to advance "Classic" are convenient lies. If 75% hashpower "decides" what the network is and node software failing to fully validate all txn the same as the hashpower are such significant concerns then why is it okay for Classic to ignore CSV (>98% hashpower supporting now) and rip out all the warnings?
20  Bitcoin / Bitcoin Discussion / Re: BurtW arrested (update: charges dropped!) on: June 12, 2016, 03:00:30 AM
I'm pretty shocked to see some of the comments in this thread.  BurtW has nothing to prove here, and those of you attacking him should be ashamed of yourselves for attacking a victim.