Lauda
Legendary
Activity: 2674
Merit: 2965
Terminated.
February 04, 2016, 05:49:22 PM
 #201

I have been so done with the block fight I don't even know what segwit is and why it can't work with 2mb blocks, if it can then why not both?
SegWit is kind of hard to explain to people who aren't good with technology. It changes how transaction data is stored, so to speak, and will be deployed as a soft fork. Transacting between SegWit clients becomes more efficient, and it should result in an effective block size of ~1.7 MB (which can and will grow depending on usage and the types of transactions). SegWit + 2 MB blocks would be almost the same as a 4 MB block size limit, which is not (yet) deemed safe.
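To make the arithmetic behind the ~1.7 MB figure concrete, here is a minimal sketch (not from the thread) using the BIP 141 block-weight accounting, weight = 3 × base size + total size with a 4,000,000-unit cap; the 55% witness share is an assumed, typical transaction mix rather than a measured value:

Code:
#include <cstdio>

int main() {
    // BIP 141 weight rule: weight = 3*base_size + total_size <= 4,000,000 units.
    const double MAX_BLOCK_WEIGHT = 4000000.0;
    const double witness_share    = 0.55;  // assumed fraction of block bytes that are witness data

    // base = (1 - witness_share) * total, so weight = total * (4 - 3*witness_share).
    const double total_bytes = MAX_BLOCK_WEIGHT / (4.0 - 3.0 * witness_share);
    std::printf("Effective block size: ~%.2f MB\n", total_bytes / 1e6);  // ~1.70 MB for this mix

    // Doubling the base limit to 2 MB doubles the weight cap, which is where the
    // "almost the same as 4 MB" worst-case figure above comes from.
    return 0;
}

The result scales with how much of a block is witness data, which is why the effective size "can and will grow depending on usage and the types of transactions".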

Throwing the term of shills all over the place does not lead to any sort of discussion that promotes advancing a coin, I think it is pretty obvious when someone is really a shill and they should just be ignored and to me franky isn't one.
I don't think you have analyzed the situation enough to understand what is going on. CIYAM is certainly not a person who would throw that word around for no reason. Look at a fine example that I will quote:

I think the greater community already agrees but core will never get approval from their corporate Blocksteam™ masters.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
CIYAM
Legendary
Activity: 1890
Merit: 1078
Ian Knowles - CIYAM Lead Developer
February 04, 2016, 05:51:23 PM
 #202

The community already does agree but core will never get approval from its corporate Blocksteam™/PWC wage masters.

Why do you post such blatant lies?

Prove that the community agrees or stop lying please.

(you are clearly a very dishonest person who wants to lie about what others think - in short you are a *****)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
Jet Cash
Legendary
Activity: 2744
Merit: 2462
https://JetCash.com
February 04, 2016, 05:52:35 PM
 #203

I have been so done with the block fight I don't even know what segwit is and why it can't work with 2mb blocks, if it can then why not both?

As I understand it, SegWit and increased block sizes are two completely separate issues. SegWit improves efficiency, and is the equivalent of a 2 MB block size, or better. It introduces a number of other possibilities as well. As there is no immediate need for a block size increase, it would be better to implement SegWit and then see what is needed for the future. Who knows, maybe the Litecoin and Bitcoin chains should be combined to facilitate exchanges. Smiley

Offgrid campers allow you to enjoy life and preserve your health and wealth.
Save old Cars - my project to save old cars from scrappage schemes, and to reduce the sale of new cars.
My new Bitcoin transfer address is - bc1q9gtz8e40en6glgxwk4eujuau2fk5wxrprs6fys
franky1
Legendary
Activity: 4256
Merit: 4532
February 04, 2016, 05:57:34 PM
 #204

I have been so done with the block fight I don't even know what segwit is and why it can't work with 2mb blocks, if it can then why not both?

As I understand it, SegWit and increased blocksizes are two completely separate issues. SegWit improves efficiency, and is the equivalent of a 2Mb blocksize, or better. It introduces a number of other possibilities as well. As there is no immediate need for a blocksize increase, it would be better to implement SegWit, and then see what is needed for the future. Who knows, maybe the Litecoin and Bitcoin chains should be combined to facilitate exchanges. Smiley

there was no immediate need for 1mb when blocks were only being filled to less than 500k in 2009-2013.. but the setting was there at 1mb.. AS A BUFFER to allow for growth without any rush or debate

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
MicroGuy (OP)
Legendary
Activity: 2506
Merit: 1030
Twitter @realmicroguy
February 04, 2016, 06:05:07 PM
 #205

I have been so done with the block fight I don't even know what segwit is and why it can't work with 2mb blocks, if it can then why not both?

As I understand it, SegWit and increased blocksizes are two completely separate issues. SegWit improves efficiency, and is the equivalent of a 2Mb blocksize, or better. It introduces a number of other possibilities as well. As there is no immediate need for a blocksize increase, it would be better to implement SegWit, and then see what is needed for the future. Who knows, maybe the Litecoin and Bitcoin chains should be combined to facilitate exchanges. Smiley

An irrational fear of hard forking to 2MB is not a valid justification for Blockstream's paid/owned core devs to ignore the wider community and strap on their experimental protocol-perverting sidechain.

Fact: Satoshi reduced the blocksize to an arbitrary 1MB as a temporary security measure only. It was never intended to stay fixed at 1MB and certainly never intended as a consensus rule. What we are seeing is a 'classic' example of a hostile corporate takeover that has fully co-opted command and control of Bitcoin Core.
VeritasSapere
Hero Member
Activity: 546
Merit: 500
February 04, 2016, 06:07:22 PM
Last edit: February 07, 2016, 12:10:53 AM by VeritasSapere
 #206

Just some spit-balling here, I'm sure I have some ordering and details wrong.
I am going to correct some of the things you said here, in the interest of truth.

Mike and Gavin argue for no limits and ~free transactions forever, regardless of the costs.
I think you might be lying here, since you should know better. Please supply a source where Gavin definitively states his support for free transactions forever.

Some in core suggest that if we really need to show the flexibility of a blocksize increase, 2MB could probably be sold and made safe enough (with planned future improvements) if we really had to.
Even today Core has not given the community a date for a hard-fork increase of the blocksize limit. This is unacceptable, and you should not expect the community to trust Core. This is why the community can fork the network themselves: if the interests of the "reference client" no longer reflect the will of the economic majority, then we are certainly justified in doing so.

Gavin gets some clue and realizes no limit at all makes no sense; Mike grudgingly agrees to go along with a 20MB proposal.
It's called compromise; Core could learn from this.

Miners reject that as unrealistically large.
Because the miners decide, their incentives aligning their interests with those of the economic majority. Ultimately it is this economic majority that decides, not a centralized group of technocrats in charge of one implementation of Bitcoin.

Gavin and Mike retrench with an 8MB proposal that rapidly ramps to gigabytes. Announce intent to adversarially fork the network in Bitcoin XT, a new client which is 99.99% code from Core but with the addition of the new expanded blocksize, and with Mike Hearn declared benevolent dictator of the project.
What you describe as 'rapidly' actually takes place over a twenty-year period. It is fine to be skeptical of the schedule within BIP 101, but you are dramatically exaggerating the facts with the language you choose to use.

Furthermore, you fail to acknowledge that Core is also effectively a dictatorship. Every implementation of Bitcoin is arguably a dictatorship; it is the nature of open-source projects. In Bitcoin the democratic will of the economic majority can be represented through the choice between multiple alternative implementations. This solves that particular problem within the governance of Bitcoin.

In spite of announced intentions by some, almost no one adopts Bitcoin XT and XT development sits at a near standstill compared to Core.
Again with the exaggeration: around ten percent of nodes have been running alternatives to Core since the launch of XT, and I would not call that nothing. When Classic is released I am sure it will take an even bigger share of nodes away from Core. Not to mention the supermajority of miners now supporting Classic.

Systems testing is finally performed on XT via testnet a month before it was intended to activate. Many performance problems are found with 8MB; the person testing it suggests that 4MB or 3MB initially may be more realistic.
Other tests showed that it was safe. I am not a technical expert myself, but there are technical experts besides yourself who say eight megabytes would be fine.

Core completes years-long development work which speeds signature validation >5x, invents a clean way to allow new efficiencies and security trade-offs, gets node pruning working, and comes up with a way to get roughly 2MB worth of capacity without the risky and coordination-problematic hardfork, while simultaneously fixing some long-standing bugs like malleability and misaligned incentives (UTXO bloat).
Hard forks allow us to test consensus and serve as a check against the power of any development team. They are not something to be avoided; they should be embraced. The work Core has done is good, aside from certain details in the code that discount some transaction types in favor of the Lightning Network being developed by Blockstream. It being open source, I am sure it can be adapted and adopted into another client which will scale Bitcoin to its true potential.

Core posts a capacity roadmap including these solutions, along with a plan for further development to allow more capacity into the future.
We have discussed this before; it comes down to a fundamental disagreement about the vision and future of Bitcoin. For you to say that Core stands for scaling Bitcoin directly is somewhat of a misnomer; at least Peter Todd was able to acknowledge this. Core seems to want to fundamentally change the economic policy of Bitcoin. Many people do not agree with this, having signed up to the original vision of Bitcoin. Not everyone accepts off-chain solutions as a way to scale Bitcoin.

Almost the entire development community and many in industry sign on an open letter in support of this plan. On the order of fifty people in all, it includes all of the most active contributors to Bitcoin Core and many other pieces of Bitcoin software.
This is a very misleading statement, especially considering that at the time three out of the five Core committers did not even sign the roadmap: Gavin Andresen, Jeff Garzik and Peter Todd. Prominent developers, to say the least. Not to mention the many companies that also disagree with the roadmap.

Gavin, Jeff, and a few other people (including people involved with the recently insolvent Cryptsy exchange) announce that they're creating "Bitcoin Classic"; a retry of the XT approach but with added popular voting on a centralized web voting site.
Explain to me how Classic is a retry of XT? The answer is that it is not; the only thing they have in common is that they increase the blocksize and that Gavin is involved. I think you are falsely equating these two projects.

Mike Hearn catches fire, slams Bitcoin with a one-sided attack piece in the NYT calling Bitcoin a failure. Some argue that Mike's position is driven by his employment at R3, a company adversarial to Bitcoin that works with major banks. Astute followers know this isn't true: Mike's misalignment with Bitcoin has existed forever.
Mike's loss of belief and going off to work for R3 is unfortunate. However, it is wrong to think that he was always a shill for R3 without evidence. You should know better than to throw around baseless accusations.

Bitcoin market price crashes significantly.
I have called you out on this before, relating these events to the markets; unless you are also an expert in markets, you are just fearmongering.

Core creates a public test network for the new improvements and many people are actively testing on it. Several wallets begin their integration process for the new improvements. Development moves rapidly, several standards documents are written.

Market price substantially recovers.
You are attributing the rise in price to the work done in Core? Seriously, you should know better.

Gavin finally announces code for the new "Classic", largely duplicating the XT functionality. Instead of the BIP101 rapid growth scheme, it features a 2MB hardfork, and none of the other improvements that are recently in core and in the works.

Bitcoin market price drops significantly again.
You are using people's fear for their monetary investment in order to sway them over to your ideological side. That is manipulation; what you are doing here is propaganda.

I'm hoping we get to the point where the market realizes it's being toyed with here and repeated XT reloaded attempts are pretty meaningless. We're not seemingly there yet.
What are you even trying to say here? That all alternative implementations that want to increase the blocksize are meaningless? It sounds like you have contempt for freedom of choice, and for the very check on the power of Core that it provides. Bitcoin is freedom, and I think that you are trying to control it.

Have I basically summarized the last year? Anyone want to add any bullets?
Classic will be released soon, having already gained supermajority support from the miners. The narrative has evolved, and more people are aware of the divergence of vision and the possible conflict of interest within Core. Bitcoin is freedom, and I am confident that the original vision of Satoshi will triumph. Smiley
CIYAM
Legendary
Activity: 1890
Merit: 1078
Ian Knowles - CIYAM Lead Developer
February 04, 2016, 06:08:09 PM
 #207

Fact: Satoshi reduced the blocksize to an arbitrary 1MB as a temporary security measure only. It was never intended to stay fixed at 1MB and certainly never intended as a consensus rule.

Fact: Satoshi's original software had 32MB as the limit.

So why isn't Gavin going for 32MB instead of the 2MB (after he has downsized a few times)?

franky1
Legendary
Activity: 4256
Merit: 4532
February 04, 2016, 06:11:49 PM
 #208


So why isn't Gavin going for 32MB instead of the 2MB (after he has downsized a few times)?


because he is following community demand rather than personal choice.
it was the community that first said 2mb was acceptable.. gavin followed afterwards..

so dont make it out to sound like the community who want 2mb real blocks are following gavin.. its the other way round.. gavin is giving in to the community

CIYAM
Legendary
Activity: 1890
Merit: 1078
Ian Knowles - CIYAM Lead Developer
February 04, 2016, 06:14:17 PM
 #209

so dont make it out to sound like the community who want 2mb real blocks, are following gavin.. its the other way round.. gavins giving into the community

Again - you have never provided proof of any such "community support" for 2MB (you just keep on repeating the word community like a fucking retard).

Hmm.. maybe that's because you are a fucking retard?

(now go and sulk because I called you a bad name)

David Rabahy
Hero Member
Activity: 709
Merit: 503
February 04, 2016, 06:14:58 PM
 #210

SegWit does seem good, although it is a little tricky, and I wonder if we will get it perfect the first time.

Increasing the blocksize limit seems good to me.  It seems trivial to get it right.

Doing both seems ok to me but does contradict the wisdom of only making one change at a time.  Can we change one and then a little later the other or is there some compelling reason to do them at the same time?

One thing I wonder about is how to motivate miners to fill up blocks; mining partial blocks when there's a backlog is pretty dumb.  Here's my thought: when a block is announced, look at the backlog, and if the backlog is big enough, keep working on the pre-announcement work, trying to find a fuller block.  Orphaning partial blocks motivates the miners to fill up the blocks.  Full blocks will attract subsequent work and eventually build a longer chain.  Everyone on the fuller (and hopefully longer) chain wins; everyone on chains with partial blocks loses.
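A rough, purely illustrative sketch of the policy being proposed here; the struct, function name, and thresholds below are all invented for illustration, and no real client behaves this way:

Code:
#include <cstddef>

// Illustrative only: on seeing a new block, keep extending the *previous* tip
// (trying to orphan the new block) when the mempool backlog is large but the
// announced block arrived nearly empty.
struct AnnouncedBlock {
    std::size_t blockBytes;    // serialized size of the freshly announced block
    std::size_t backlogBytes;  // bytes of fee-paying transactions waiting in the mempool
};

bool keepMiningOldTip(const AnnouncedBlock& b,
                      std::size_t maxBlockBytes    = 1000000,  // current 1 MB cap
                      std::size_t backlogThreshold = 5000000,  // "big enough" backlog (assumed)
                      double      fullnessCutoff   = 0.5) {    // "partial" = under half full (assumed)
    const bool backlogIsBig   = b.backlogBytes >= backlogThreshold;
    const bool blockIsPartial = b.blockBytes < static_cast<std::size_t>(fullnessCutoff * maxBlockBytes);
    // If both hold, ignore the new block and keep working on the pre-announcement
    // work, hoping a fuller block attracts the longer chain.
    return backlogIsBig && blockIsPartial;
}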
VeritasSapere
Hero Member
Activity: 546
Merit: 500
February 04, 2016, 06:17:20 PM
 #211

so dont make it out to sound like the community who want 2mb real blocks, are following gavin.. its the other way round.. gavins giving into the community
Again - you have never provided proof of any such "community support" for 2MB.
The only way to "prove" "community support" is through proof of work. The moment of truth is approaching.

Power to the people! Grin
franky1
Legendary
Activity: 4256
Merit: 4532
February 04, 2016, 06:19:47 PM
 #212

SegWit does seem good although it is a little tricky and wonder if we will get it perfect the first time.

Increasing the blocksize limit seems good to me.  It seems trivial to get it right.

Doing both seems ok to me but does contradict the wisdom of only making one change at a time.  Can we change one and then a little later the other or is there some compelling reason to do them at the same time?

2mb is just a change to a few lines of code that then sit there as a buffer.. just like 1mb sat as a buffer for years never hitting top capacity. segwit is a total change
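For what it's worth, a minimal sketch of the kind of change being described, assuming the single MAX_BLOCK_SIZE consensus constant from the pre-SegWit codebase; the helper function is hypothetical and this is an illustration, not an actual Core or Classic patch:

Code:
#include <cstddef>
#include <cstdint>
#include <vector>

// Sketch only. Pre-SegWit clients cap the serialized block size with one
// consensus constant (historically 1,000,000 bytes); a "2 MB" proposal is
// essentially this one-line change plus activation logic.
static const std::size_t MAX_BLOCK_SIZE = 2000000;  // was 1000000

// Hypothetical stand-in for the size check every validating node performs.
bool withinBlockSizeLimit(const std::vector<uint8_t>& serializedBlock) {
    // Nodes still enforcing the old 1 MB rule reject larger blocks as invalid,
    // which is why raising this constant is a hard fork.
    return serializedBlock.size() <= MAX_BLOCK_SIZE;
}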

One thing I wonder about is how to motivate miners to fill up blocks; partial blocks when there's a backlog is pretty dumb.  Here's my thought; when a block is announced then look at the backlog and if the backlog is big enough then keep working on the pre-announced work trying to find a fuller block.  Orphaning partial blocks motivates the miners to fill up the blocks.  Full blocks will attract subsequent work and eventually build a longer chain.  Everyone on the fuller (and hopefully longer) chain wins; everyone on chains with partial blocks lose.

orphaning off blocks like that would cause more problems than solutions. the main result is that the smaller miners who dont have hashpower will get orphaned off more, because they include less tx's to try to gain an advantage against the large farms. making the mining network less distributed and in favour of the large mining farms

CIYAM
Legendary
Activity: 1890
Merit: 1078
Ian Knowles - CIYAM Lead Developer
February 04, 2016, 06:21:34 PM
 #213

The only way to "prove", "community support" is through proof of work. The moment of truth is approaching.

Power to the people! Grin

You are seriously deluded (or more likely your account was bought).

The miners are not supporting 2MB.

(did you forget that I live in China?)

VeritasSapere
Hero Member
Activity: 546
Merit: 500
February 04, 2016, 06:24:26 PM
 #214

The only way to "prove", "community support" is through proof of work. The moment of truth is approaching.

Power to the people! Grin

You are seriously deluded (or more likely your account was bought).

The miners are not supporting 2MB.

(did you forget that I live in China?)
Better check your facts. There is no reason to insult me either; just check my post history and you will see that my writing style and views are consistent. I am also a miner and I support 2MB. Smiley

https://bitcoinclassic.com/
https://docs.google.com/spreadsheets/d/1Cg9Qo9Vl5PdJYD4EiHnIGMV3G48pWmcWI3NFoKKfIzU/edit?pref=2&pli=1#gid=0
David Rabahy
Hero Member
Activity: 709
Merit: 503
February 04, 2016, 06:26:46 PM
 #215

One thing I wonder about is how to motivate miners to fill up blocks; partial blocks when there's a backlog is pretty dumb.  Here's my thought; when a block is announced then look at the backlog and if the backlog is big enough then keep working on the pre-announced work trying to find a fuller block.  Orphaning partial blocks motivates the miners to fill up the blocks.  Full blocks will attract subsequent work and eventually build a longer chain.  Everyone on the fuller (and hopefully longer) chain wins; everyone on chains with partial blocks lose.
orphaning off blocks like that would cause more problems than solutions. the main result is the smaller miners who dont have hashpower will get orphaned off more, because they include less tx's to try gaining advantage against the large farms. making mining network less distributed and in favour of the large mining farms
Hmm, https://blockchain.info/block-index/1010926/0000000000000000001e7d9e5bf68fda5332ee606963b407353fc8b0f1f4d38b came from AntPool.
Lauda
Legendary
Activity: 2674
Merit: 2965
Terminated.
February 04, 2016, 06:27:19 PM
 #216

Core completes a years long development work which speeds signature validation >5x, invents a clean way to allow new efficiencies and security trade-offs, gets node pruning working, and comes up with a way to get roughly 2MB worth of capacity without a the risky and coordination problematic hardfork, while simultaneously fixing some long time bugs like malleability and misaligned incentives (utxo bloat).
The work Core has done is good, besides certain details within the code discounting some transaction types in favor of lighting network developed by Blockstream. It being open source I am sure it can be adapted and adopted into another client which will scale Bitcoin to its true potential.
This is what you label as 'good'? You obviously don't even have the slightest clue as far as the complexity is concerned. Another thing that you're wrong about is "LN developed by Blockstream". There are multiple implementations of the Lightning Network; the most advanced ones are:
Joseph, Tadge and roasbeef's version: https://github.com/LightningNetwork/lnd/
Rusty's version (Blockstream): https://github.com/ElementsProject/lightning

The only way to "prove", "community support" is through proof of work. The moment of truth is approaching.
Power to the people! Grin
Look at all the support for the 'forkers':
Quote

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
VeritasSapere
Hero Member
Activity: 546
Merit: 500
February 04, 2016, 06:32:02 PM
 #217

Core completes a years long development work which speeds signature validation >5x, invents a clean way to allow new efficiencies and security trade-offs, gets node pruning working, and comes up with a way to get roughly 2MB worth of capacity without a the risky and coordination problematic hardfork, while simultaneously fixing some long time bugs like malleability and misaligned incentives (utxo bloat).
The work Core has done is good, besides certain details within the code discounting some transaction types in favor of lighting network developed by Blockstream. It being open source I am sure it can be adapted and adopted into another client which will scale Bitcoin to its true potential.
This is what you label as 'good'? You obviously don't even have the slightest clue as far as the complexity is concerned. Another thing that you're wrong about is "LN developed by Blockstream". There are multiple implementations of the Lightning Network, the most advanced ones are:
Joseph, Tadge and roasbeef's version: https://github.com/LightningNetwork/lnd/  
Rusty'sversion (Blockstream): https://github.com/ElementsProject/lightning

The only way to "prove", "community support" is through proof of work. The moment of truth is approaching.
Power to the people! Grin
Look at all the support for the 'forkers':
Quote
I am not wrong in saying that the Lightning Network is being developed by Blockstream, and that SegWit arbitrarily favors the transaction type used by this technology over other transaction types; call it a subsidy if you will. Even if Blockstream is benign, the conflict of interest is clear. Just another example of Core attempting to apply a centrally planned economic policy onto the Bitcoin protocol.
David Rabahy
Hero Member
Activity: 709
Merit: 503
February 04, 2016, 06:34:07 PM
 #218

One thing I wonder about is how to motivate miners to fill up blocks; partial blocks when there's a backlog is pretty dumb.  Here's my thought; when a block is announced then look at the backlog and if the backlog is big enough then keep working on the pre-announced work trying to find a fuller block.  Orphaning partial blocks motivates the miners to fill up the blocks.  Full blocks will attract subsequent work and eventually build a longer chain.  Everyone on the fuller (and hopefully longer) chain wins; everyone on chains with partial blocks lose.
orphaning off blocks like that would cause more problems than solutions. the main result is the smaller miners who dont have hashpower will get orphaned off more, because they include less tx's to try gaining advantage against the large farms. making mining network less distributed and in favour of the large mining farms
Hmm, https://blockchain.info/block-index/1010926/0000000000000000001e7d9e5bf68fda5332ee606963b407353fc8b0f1f4d38b came from AntPool.
Meanwhile, I haven't seen the much smaller Slush pool put out such empty blocks.
franky1
Legendary
Activity: 4256
Merit: 4532
February 04, 2016, 06:36:12 PM
 #219

Just another example of Core attempting to apply a centrally planned economic policy onto the Bitcoin protocol.

don't worry. the blockstream fanboys have been blasting that bitcoinocracy link between all their friends.. it's not a fair view of the community.

hell, even hillary clinton can fake a poll by only asking her best friends to get on TV and show their support for her

David Rabahy
Hero Member
Activity: 709
Merit: 503
February 04, 2016, 06:37:04 PM
 #220

One thing I wonder about is how to motivate miners to fill up blocks; partial blocks when there's a backlog is pretty dumb.  Here's my thought; when a block is announced then look at the backlog and if the backlog is big enough then keep working on the pre-announced work trying to find a fuller block.  Orphaning partial blocks motivates the miners to fill up the blocks.  Full blocks will attract subsequent work and eventually build a longer chain.  Everyone on the fuller (and hopefully longer) chain wins; everyone on chains with partial blocks lose.
orphaning off blocks like that would cause more problems than solutions. the main result is the smaller miners who dont have hashpower will get orphaned off more, because they include less tx's to try gaining advantage against the large farms. making mining network less distributed and in favour of the large mining farms
Hmm, https://blockchain.info/block-index/1010926/0000000000000000001e7d9e5bf68fda5332ee606963b407353fc8b0f1f4d38b came from AntPool.
Meanwhile, I haven't seen the much smaller Slush pool put out such empty blocks.
Filling a block takes a tiny amount of time compared to finding a compliant nonce, for any size of pool or even a solo miner.
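A back-of-the-envelope sketch of that comparison; the difficulty value is an assumed example, not a quoted network figure:

Code:
#include <cstdio>
#include <cmath>

int main() {
    // The expected number of hashes needed to find a block is roughly difficulty * 2^32.
    const double difficulty     = 1.2e11;  // assumed example value
    const double expectedHashes = difficulty * std::pow(2.0, 32.0);

    std::printf("Expected hashes to find one block: ~%.1e\n", expectedHashes);  // ~5e20
    std::printf("Selecting ~1 MB of mempool transactions for a template: milliseconds.\n");
    return 0;
}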