Bitcoin Forum
August 01, 2021, 09:40:38 PM *
News: Latest Bitcoin Core release: 0.21.1 [Torrent]
 
  Show Posts
Pages: « 1 2 3 4 5 6 [7] 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 »
121  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 10, 2015, 06:28:07 PM
Because if he starts creating 5943-tx blocks, the penalty pool will be smaller, and so will the amount he collects per block. See scenario 1 here - if the big miner behaves as the small miners do, it's the same as if they're all small miners. He will get just 4.50 BTC per block, instead of 4.68. He can't have the cake and eat it too - not pay a penalty, but expect to reclaim it...
That's not the point. What prevents the competition from pulling from the pool without contributing to it? I understand that it's in the interest of a massive node or cartel of nodes to mine large blocks, but only if they aren't undermined by competitors, who can pull from the pool without paying any penalty themselves.

The nodes can't have their cake and eat it too... But they can eat it. Ultimately, there would be no pool.

If there are several big miners, they should all create supersized blocks, and then every miner will enjoy the supersized blocks of the others.
Yes, they should, and this would be a cartel, with each cooperating with the others for greater profit; and it would break up in the same fashion as other cartels: any member of the cartel profits more by not contributing to the penalty pool.
122  Bitcoin / Development & Technical Discussion / Re: Confidential Transactions, Content privacy for Bitcoin transactions on: June 10, 2015, 06:02:39 PM
Since this hides transaction amounts doesn't this make CoinJoin impossible at the same time?
Linking transactions between two people or businesses can be a serious breach of privacy even if the actual amounts are unknown.
123  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 10, 2015, 02:23:11 PM
I don't understand how your algorithm will give the entity with higher hashpower more profit by mining blocks that give less BTC per block, that doesn't make any sense to me. Why wouldn't the bigger miner mine smaller blocks and get the higher BTC per block, like your example was showing?

On the other hand, if it really is more profitable to mine larger blocks, then why wouldn't the smaller hashpower nodes do it? If they do, then the same problems I mentioned before would apply (The penalty would give a competitive advantage to the higher hashpower nodes).


If this is true:

Reward per block for 1% miner: 5943 * 0.0006885 + 3.2223 - 0.4589 = 6.8552 BTC (more than in scenario 1)
Reward per block for 90% miner: 7251 * 0.0006885 + 3.2223 - 3.5294 = 4.68521 BTC (less than 1% miners in this scenario; more than the miners in scenario 1).

Why isn't the 90% miner simply including 5943 transactions like the 1% miner is, in order to get about 2 BTC more per block?


At some point, the increase in penalty outweighs the increase in tx fees + collection, so he stops there. Whether one or more of these summands is more than 2 BTC is irrelevant, it's their sum that counts.
The problem is that, regardless of what the collection in the pool is, the profit per block is always higher by simply mining at the target block size.

If the major nodes have driven up the BTC in the collection pool to provide greater profits on the whole, then the first big miner that stops contributing to the pool will essentially drain it by mining blocks at the target block size. In the end, trying to maximize profits by relying on your competitors to "play fair" won't work.

This is actually analogous to cartelization: big companies cooperating to achieve greater profits. However, cartels, absent the power of the State, always break down, because the first one that defects can achieve greater profits at the expense of the cartel as a whole.
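The defection argument can be put into numbers. Here is a minimal sketch reusing the fee and penalty figures from the scenario quoted earlier; the two-miner split, the pool accounting, and the 50/50 hashrate shares are simplifying assumptions of mine, not part of Meni's proposal:

```python
# Two equally sized big miners: one cooperates (mines oversized blocks,
# pays the penalty into the pool), one defects (mines at the target
# size, pays nothing). Both collect from the pool by hashrate share.
# All numbers are illustrative assumptions.
FEE = 0.0006885                # fee per transaction (from the scenario)
T, BIG = 5000, 7251            # target block size; oversized block
PENALTY = 3.5294               # penalty for the oversized block

def reward(txs, penalty_paid, pool_per_block, share):
    # fees + this miner's cut of the rollover pool - penalty paid in
    return txs * FEE + share * pool_per_block - penalty_paid

# If both cooperate, every block feeds the pool; each gets half back.
both_cooperate = reward(BIG, PENALTY, PENALTY, 0.5)
# If one defects, only the cooperator's blocks feed the pool,
# yet the defector still collects half of it at no cost.
defector = reward(T, 0.0, PENALTY / 2, 0.5)
cooperator = reward(BIG, PENALTY, PENALTY / 2, 0.5)

assert defector > cooperator      # defection pays...
assert defector > both_cooperate  # ...even versus full cooperation
```

Under these assumptions the defector out-earns both the cooperator and the all-cooperate case, which is exactly why the cartel unravels.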

Quote
Weaknesses of cartels
Analysis demonstrates that a cartel is an inherently unstable form of operation:
If pooling resources is more profitable, then the cartel will merge into one company.
If it proves to be less profitable, the individual members of the cartel will break off.
If it doesn’t break from within, an outsider, noticing the enormous profitability, will enter the market, and this dooms the cartel.
Particularly likely to be restive under the imposed joint action will be the more efficient producers, who will be eager to expand their business rather than be fettered by shackles and quotas to provide shelter for their less efficient competitors. If the cartel does not break up from within, it is even more likely to do so from without. To the extent that it has earned unusual monopoly profits, outside firms and outside producers will enter the same field of production.
http://wiki.mises.org/wiki/Cartel
124  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 09, 2015, 08:52:36 PM
2. Explain what is wrong with the example (the one with real numbers) that shows the reward per block is smaller for big miners.

Right here;
Reward per block for 1% miner: 5943 * 0.0006885 + 3.2223 - 0.4589 = 6.8552 BTC (more than in scenario 1)
Reward per block for 90% miner: 7251 * 0.0006885 + 3.2223 - 3.5294 = 4.68521 BTC (less than 1% miners in this scenario; more than the miners in scenario 1).

No miner is going to mine a block that costs him more than 2 BTC to mine. Since it's not economically viable to mine larger blocks, you're right that there isn't an economic advantage given to the larger mining entity, and what I wrote earlier wouldn't apply. What I wrote would only apply to penalties that don't reduce the reward below the target block size reward.

Why would a miner mine a block at a loss just to accept more transactions? Regardless, any market participant that engaged in this behavior would just get out-competed by another that didn't.
125  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 09, 2015, 03:02:25 PM
Ok, so my post was nuked again (my computer's fault), so this will be short and sweet. I have been away, so I apologize for the delay;

Suppose there is a big miner with 33% of the hashpower, and there are 67 small miners with 1% of the hashpower each. Appealing to the Law of Large Numbers, and for simplicity, I'll assume that they all mine blocks in a predictable order;
BigMiner, SmallMiner[1], SmallMiner[2], BigMiner, SmallMiner[3], SmallMiner[4], .. , BigMiner, SmallMiner[66], SmallMiner[67]

I'll also assume that the rollover fee will be distributed over the next 100 blocks.

BigMiner can assume that it will recover 33% of the lost reward from future blocks. So if BigMiner can mine a block that exceeds T and still come out ahead of a block smaller than T - its effective reward being reward - penalty + 0.33*penalty - then it's at a net positive, whereas a small miner could only manage reward - penalty + 0.01*penalty. What is further problematic for the SmallMiner is that if it mines anything that pays a rollover fee, BigMiner will collect 33% of that reward while the SmallMiner will be lucky to collect 1% (not only is the reward smaller for the SmallMiner itself, but any rolled-over reward is more beneficial to its larger competitor(s)). In this way, BigMiner has a competitive advantage over SmallMiners.
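A minimal sketch of that asymmetry, under my simplifying assumption that each miner reclaims exactly its hashrate share of any penalty it pays (the 100-block payout window is collapsed into a single expected recovery):

```python
# Effective penalty for a miner that expects to win a hashrate-share
# fraction of the rollover payouts.
def effective_penalty(penalty: float, hashrate_share: float) -> float:
    # Pay the full penalty now; expect to recover `hashrate_share` of it.
    return penalty - hashrate_share * penalty

big = effective_penalty(1.0, 0.33)    # BigMiner: 33% of hashpower
small = effective_penalty(1.0, 0.01)  # SmallMiner: 1% of hashpower

assert abs(big - 0.67) < 1e-9 and abs(small - 0.99) < 1e-9
```

The same nominal 1 BTC penalty costs the small miner roughly half again as much as the big miner, which is the competitive advantage described above.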

I tried to come up with specific numbers that would show the attack on your algorithm, but I wasn't sure what values you were using to get your graph1, so I think it would end up being a straw man anyway. The bottom line is that penalties or rewards rolling over hurt the SmallMiner more than the BigMiner, since SmallMiners are less able to recover the fees over future blocks.

By "penalties or rewards" I mean that it doesn't make a difference whether you are rewarding the current larger block and rolling a bonus over to future blocks, or if you are penalizing a big block and then passing the difference to future blocks.

I didn't see a response to that problem, sorry if I missed it.

1https://i.imgur.com/EZvlJq7.png

Edit: T was defined as the target block size.
126  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 04, 2015, 07:52:01 PM
Quote from: Meni Rosenfeld
This is correct, and I hadn't given enough thought to this problem prior to posting.

Now that I've given it more thought, I think it can be significantly alleviated by making collection from the pool span a longer period, on the time scale of years. Relative hashrate is likely to change over these periods, so it may not be the best plan to publish excessively large blocks hoping to reclaim the fee (and publishing typical-sized blocks does not give big miners an advantage). Also, with a different function you can change the balance between the marginal and total penalty so that the actual penalty is small, nullifying the effect (it will require the cap to be a bit harder, but still more elastic than what we have now).

I agree that this calls for more analysis.
A longer time period over which the reward is distributed doesn't help, as the larger nodes or entities will still collect a larger share of the rolled-over fees, by definition.
Actually, making the rollover fees extend over only a couple of blocks would be more likely to mitigate the problem; but if you roll the fee over about 3 blocks or so, then it might be worth it for a miner to hold blocks and release 2 at a time, depending on fee-over-time. This, in turn, might exacerbate the selfish miner exploit1. The natural monopoly condition that already exists in Bitcoin2 seems to be exacerbated either way.

Getting around this would be tricky, if it's possible.

1http://fc14.ifca.ai/papers/fc14_submission_82.pdf
2https://bitcointalk.org/index.php?topic=176684.msg9375042#msg9375042

An examination of the prior art is warranted.
Pointing to Monero as an examination of prior art is asking a bit much. Are you expecting us to dig through the Monero source code? How do they get around the problem?

This is not very helpful;

Quote
The Basics
A special type of transaction included in each block, which contains a small amount of monero sent to the miner as a reward for their mining work.

https://getmonero.org/knowledge-base/moneropedia/coinbase
127  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 04, 2015, 04:46:10 PM
Isn't it precisely what is implemented in Monero? (except you don't have a rollover pool, the penalty is simply deducted from the block reward for good).
No idea what happens in Monero, but if so, more power to them.

Apparently, neither does Gavin.
He said he didn't want to talk to you until there was working code that does it?
Such code has been working for years, but people forget where the experimentation is occurring, the alts.

Can you explain a bit about the mechanism wherein the miner pays into the rollover pool, and why that is different from the 'original proposal'?
The difference is quantitative. In this version the rollover affects only blocks that exceed a size threshold.
128  Bitcoin / Development & Technical Discussion / Re: Elastic block cap with rollover penalties on: June 04, 2015, 01:51:27 PM
I think it's interesting to switch the conversation over to soft-failure modes rather than trying to find the appropriate block size for now.

That said, a problem with any kind of rollover fee is that it assumes that moving fees to future blocks is also moving fees to different nodes.

Put differently; centralizing nodes is a way of avoiding the penalties you're trying to introduce with this protocol.

Put differently again; Paying fees over consecutive blocks gives a competitive advantage to larger mining entities when making larger blocks.

Put triply differently; A node that can reliably get another block within X blocks is less penalized than a node that cannot, where "X" is the number of blocks over which the rollover fee is distributed.
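To put a number on "reliably": the chance of winning at least one of the next X blocks is 1 - (1 - p)^X for a miner with hashrate share p. A small sketch, with the window X = 3 as a purely illustrative assumption:

```python
# Chance that a miner with hashrate share p wins at least one of the
# next x blocks (and so can collect part of a rolled-over fee itself).
def chance_within(p: float, x: int) -> float:
    return 1.0 - (1.0 - p) ** x

X = 3  # assumed rollover window, purely illustrative
print(f"33% miner: {chance_within(0.33, X):.0%}")  # ~70%
print(f" 1% miner: {chance_within(0.01, X):.0%}")  # ~3%
```

The gap widens as the window shrinks, which is the centralization pressure being described.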

So if the goal is to avoid centralization, then the protocol does the opposite of the intention. If the goal is to make Bitcoin fail-safe, I'm not convinced that Bitcoin isn't already. When blocks fill, we will see higher transaction fees, potentially lengthier times before a transaction is included in a block, and as a result more 3rd party transactions.

Rereading Mike Hearn's article1, changing Bitcoin to prioritize the highest-fee transactions in the mempool should result in the behavior I described. An end user might see delays before their transaction is included in a block, but I wouldn't call that a "crash landing", considering that the sorts of transactions that would be done at these rates are not as concerned with speed of confirmation.

TLDR: How does a fee over "X" blocks not incentivize a centralization of nodes?

1https://medium.com/@octskyward/crash-landing-f5cc19908e32
129  Bitcoin / Development & Technical Discussion / Re: Reduce Block Size Limit on: May 26, 2015, 04:44:42 PM
[Redacted] I cut out my original response to a document to revise some bad math, then my computer immediately crashed. So never mind; dead-horse stuff anyway. I'm surprised Word didn't save it.
130  Bitcoin / Development & Technical Discussion / Re: Reduce Block Size Limit on: May 21, 2015, 07:03:03 PM
What is the problem with transaction fees plummeting? As long as fees per block remain constant (or higher) it shouldn't matter if fees per transaction drop.
The point is that the fees that an individual node will accept do not reflect the cost of a transaction to the system. So whatever the amount of fees per block, we can be sure that it would be too low to support all nodes once inflation stops... and blocks are too big... or demand is too low.

If the block size limit is not removed/increased, then people will transact bitcoins outside the blockchain (just like it happens inside Bitstamp). So people will implement centralized wallets which hold bitcoins in one address as collateral, while moving the amounts between users in a centralized database. And even though this looks like a smart solution, if bitcoins are not moving around, miners won't be getting those fees, which puts the whole ecosystem in danger.

If we want the miner to earn transaction fees, then transactions must be done in the blockchain and not outside.
The centralization of Bitcoin through small blocks is real, but it happens through a different mechanism. If the maximum number of transactions per second is saturated by a certain quantity of users, then the ratio of indirect to direct users increases. That doesn't represent atrophy of hashpower, or of the number of mining entities, or of the number of direct users. On one extreme we have centralization toward handling only the expensive transactions of the wealthy few who can bid their way in, and on the other extreme we have atrophy of nodes.

I don't want to see a small fraction of the world having total control over Bitcoin in either direction, and what I think a lot of people would like to see is an automatic way of finding that balance. Just making the argument to increase the block size misses the larger point, as DannyHamilton pointed out earlier. Developers don't want that responsibility, and the users don't want them to have it either.
131  Bitcoin / Development & Technical Discussion / Re: Reduce Block Size Limit on: May 21, 2015, 12:43:30 PM
You are mistaken. There are physical limits and costs that would prevent this. Each additional transaction increases the size of the block. There are costs associated with increasing the size of a block. At a minimum, there is a (very small) increase in the chance that the block will be orphaned. Miners (and pools) would need to consider the cost of that risk and weigh it against the revenue gained from adding transactions. Additionally, there are storage costs associated with holding the list of unconfirmed transactions in memory. Since some miners may choose to set a higher fee requirement, users would find that their transaction may not confirm quite as quickly with a lower fee.
Yes, mrvision was saying the same thing. "Epsilon" might have been a better way to put it. The fact is that the cost to a node is many orders of magnitude less than the cost to the network as a whole. Just saying, "it's not zero" whitewashes the fact that it is much much less than what's needed to compensate a highly redundant network. You're right, I'm guilty of hyperbole.

The hope is that the block space will be scarce enough to bid transaction fees high enough to cover some semblance of decentralization when inflation stops.

Yes, but that scarcity could potentially be controlled by the costs associated with adding a transaction to a block. However, at the moment, that size may be large enough to have other repercussions that need to be considered and dealt with. If I remember correctly, the 1 MB limit wasn't added to create scarcity; it was added to protect the blockchain from certain attack vectors, with an assumption that it would be increased significantly in the future.
It wasn't the intention, but ultimately that is the only thing that's going to prevent transaction fees from plummeting.

Whatever that limit should be we can only say for certain that it is not "infinite".

Obviously, it would be impossible for anyone to create (or broadcast) an "infinite" block.  The question that mrvision appears to be attempting to discuss is whether that limit should be arbitrarily chosen, should be based on some reasoning by the consensus of the community, or should be controlled by market forces based on the the physical limitations and costs associated with increasing the block size.
Same as above, I assumed from the way he was writing that his preference would be to let the market handle it entirely, without any imposed block size limit.
Some sort of algorithmically increasing blocksize limit is reasonable but at the same time very weird to consider when we don't even know what the Bitcoin network is going to look like once inflation stops at the current block size limit. What if it turns out, after the block size limit is increased (Or increased algorithmically), that transaction fees don't support the network? Are we going to see backpedaling at the glacial pace of a Bitcoin fork?

Edit: Er... Just noticed GMaxwell already said what I meant;
Quote from: GMaxwell
So I would refine DumbFruit's argument to point out that it isn't that "fees would naturally be priced at zero" but that the equilibrium is one where there is only a single full node in the network (whose bandwidth costs the fees pay for) and no POW security, because that is the most efficient configuration and there is no in-system control or pressure against it, and no ability to empower the users to choose another outcome except via the definition of the system. I believe this is essentially the point that he's making with "the most competitive configuration in a free market" - even to the extent those costs exist at all, they are minimized through maximal centralization.
132  Bitcoin / Development & Technical Discussion / Re: Reduce Block Size Limit on: May 20, 2015, 08:44:57 PM
I think you people believe...
If you want to have a discussion you should quote somebody and make your arguments based on their position.

I don't see where centralization is more likely to occur in this kind of ecosystem (which would be prepared to scale) than in a capped one, but I see lots of benefits in a free bitcoin ecosystem.
You were told by at least two people exactly why lifting the limit would necessarily lead to centralization.

We know there is more than enough transactional information in the world than we could ever hope to fit in a sufficiently decentralized blockchain using today's technology. There is zero incentive for miners to not fill the blocks entirely; almost any non-zero fee would be sufficient.

The hope is that the block space will be scarce enough to bid transaction fees high enough to cover some semblance of decentralization when inflation stops. Whatever that limit should be we can only say for certain that it is not "infinite".

If the block space is infinite then transaction costs, through competition, would fall to roughly the rate needed to operate a single node (when inflation stops), because this is the most competitive configuration in a free market.
133  Bitcoin / Development & Technical Discussion / Re: How many confirmations? on: May 20, 2015, 05:24:49 PM
-snip-
Due to the malleable nature of a bitcoin transaction ID, it is possible that a modified version of the first transaction which changes its hash will become confirmed, {...}

Can you please explain this part further? Thank you!

Transactions in Bitcoin have an ID that can be changed by third parties before their inclusion in a block, or if a reorganization happens, because the ID is a hash of the whole transaction, signatures included. This is an intended feature of Bitcoin which allows things like CoinJoin.

https://en.bitcoin.it/wiki/Transaction_Malleability
https://bitcointalk.org/index.php?topic=279249.0
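A minimal sketch of why the ID is malleable: a txid is the double-SHA256 of the serialized transaction, signatures included, so two encodings of the same signature yield two different IDs. The byte strings below are made-up stand-ins, not real transactions:

```python
import hashlib

def txid(raw_tx: bytes) -> str:
    # A txid is the double-SHA256 of the serialized transaction,
    # conventionally displayed in reversed byte order.
    h = hashlib.sha256(hashlib.sha256(raw_tx).digest()).digest()
    return h[::-1].hex()

# Two hypothetical serializations of the "same" transaction whose
# signature bytes differ only in encoding (stand-in values, not real
# transaction data).
tx_a = b"\x01\x00\x00\x00 inputs SIG=\x30\x44\x02\x20 outputs"
tx_b = b"\x01\x00\x00\x00 inputs SIG=\x30\x45\x02\x21 outputs"

assert txid(tx_a) != txid(tx_b)  # same payment intent, different IDs
```

Any unconfirmed chain of transactions that references the first ID breaks if the second version is the one that confirms.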
134  Bitcoin / Development & Technical Discussion / Re: Reduce Block Size Limit on: May 20, 2015, 01:17:53 PM
But of course they won't agree either, so ask them: is it 1mb a magical number? Why is it the correct size? Of course in a free block economy market forces would determine the correct amount, but that is soooo risky...
They sure would. Markets tend to be highly efficient: even in the most extreme situations (at large scales) they run at something like 10% profit margins (in financials, which surely aren't free markets anyway), and in more reasonable situations below 2%.

In Bitcoin's case, this would mean that competitors in this space would drive transaction prices down to whatever it costs them to receive those transactions, plus about 10%. However, the cost of each transaction to the network is actually several hundred thousand times the cost to one node... because there are lots of nodes that all need to duplicate the same transaction.

So in order for your idea to work (Presumably you want the limit removed entirely) we would need to see profit margins in a free market of something like 10,000,000%.
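The arithmetic behind that figure, with every number an illustrative assumption of mine (per-node cost, node count, margin):

```python
# Back-of-the-envelope version of the argument above. All figures
# are illustrative assumptions, not measurements.
node_cost_per_tx = 0.000001   # what one node spends to process a tx (BTC)
full_nodes = 100_000          # redundant copies of every transaction
competitive_margin = 0.10     # ~10% margin in a competitive market

# Competition drives the fee toward one node's cost plus a margin...
market_fee = node_cost_per_tx * (1 + competitive_margin)
# ...but the network as a whole bears the cost of every node.
network_cost = node_cost_per_tx * full_nodes

shortfall = network_cost / market_fee
print(f"network cost is ~{shortfall:,.0f}x the market-clearing fee")
```

Whatever the exact inputs, the ratio scales with the number of redundant nodes, which is the point being made.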

That's only the tip of the iceberg though, because obviously we wouldn't want to see all that profit go to a single entity, we'd want it spread to all miners in proportion to their work, and if you can figure out a way to do all that, then maybe we don't even need free markets.

TLDR; Something about cakes and eating them.
135  Bitcoin / Development & Technical Discussion / Re: Blockchain download speed singularity on: May 19, 2015, 04:15:19 PM
But as the services / products grow, eventually this one will become a concern.

Now it may need a few more years, up to 10-15 for this to become a serious issue, but its better to be prepared for everything.
Mike Hearn argues it'll be a "serious issue" by next year. Gerald Davis also made an impassioned post around here about how the Bitcoin block sizes need to be increased. Mircea Popescu argues they're both wrong. GMaxwell seems neutral: he has argued repeatedly that decentralization of Bitcoin is paramount, but also pointed out that actually hitting the limit won't cause any kind of devastation, as Mike Hearn argued it would.

I'd be willing to argue that the block size is actually already too big, or at least would like to see more discussion about the benefits of a more restricted block space. That the block limit needs to be raised is treated as a foregone conclusion with poor support, leaning way too heavily on FUD, appeals to emotion, and reliance on future undiscovered technology; all arising out of a vision for Bitcoin that competes with the likes of Visa on their terms, despite manifestly being unable to do so.

Greg Maxwell;
https://bitcointalk.org/index.php?topic=1054181.msg11318788#msg11318788
Gerald Davis;
https://bitcointalk.org/index.php?topic=946236.0
Mircea Popescu;
http://trilema.com/2015/gerald-davis-is-wrong-heres-why/
Mike Hearn;
https://medium.com/@octskyward/the-capacity-cliff-586d1bf7715e
136  Bitcoin / Development & Technical Discussion / Re: Blockchain download speed singularity on: May 19, 2015, 02:59:21 PM
The Bitcoin blockchain is not going to handle all transactions in the United States or the world with today's technology. Making hundreds of thousands of duplicates of every single transaction throughout the world is non-trivial. Even if Bitcoin blocks were massive we would run into the issue of transaction fees being too high to be reasonable, not to mention the outrageous bandwidth requirements that would be necessary.

That does not mean Bitcoin is dead or doomed (Why the FUD?). It means that other technologies must be used to facilitate the majority of "Bitcoin" transactions in the long run. This isn't ideal, but sometimes the perfect is the enemy of the good.

https://en.bitcoin.it/wiki/Off-Chain_Transactions
https://blog.coinbase.com/2013/08/06/you-can-now-send-micro-transactions-with-zero-fees/
https://lightning.network/
137  Alternate cryptocurrencies / Altcoin Discussion / Re: Could BitCoin fork into PoS? on: May 19, 2015, 02:33:40 PM
That would fundamentally break the decentralization model that Bitcoin is attempting to achieve. Yes it can be done, but it wouldn't be Bitcoin anymore. I still haven't figured out what PoS solves that centralized agencies don't solve better, but I guess that's a different topic.

Ye olde definitive criticism;
https://download.wpsoftware.net/bitcoin/pos.pdf
138  Bitcoin / Development & Technical Discussion / Re: [Theory] The optimal confirmation time on: May 18, 2015, 02:41:06 PM
https://bitcointalk.org/index.php?topic=977245.msg11370378#msg11370378

I'm a little skeptical about this, as even with fast transfer of messages,
any lags and connectivity problem on the user's end could lead to forking AFAICS.

But I'd be really interested in knowing whether the number of confirmations in Bitcoin is
merely a function of network propagation rate, or are there any deeper reasons
why the current implementation matters.
Network latency, bandwidth, block size, processor/memory/hard drive speed, geopolitical concerns, code efficiency, and fork convergence speed. Is there anything else?

Neither Bitcoin or any other clones I have inspected have replaced the original mempool technology or surrounding communications sub-system.
If you can improve the factors above then you can avoid the forks, and therefore avoid the loss of work incurred when confirmation times are reduced. My understanding is that the mempool has been highly optimized with a bloom filter, so I'm not sure what gains can be made there, but I'm also not a Bitcoin developer. I'm also skeptical.

Quote from: john-connor
The Bitcoin mempool is inconsistent in that very few peers have the same transactions. The Vanillacoin "transaction pool" is consistent in that all clients and peers know all transactions.
Huh?

https://talk.vanillacoin.net/topic/142/version-0-2-8-beta-release/5
139  Bitcoin / Development & Technical Discussion / Re: Bitcoin without bitcoins? on: May 14, 2015, 01:16:21 PM
I read back up on merge mining, and actually, since you can do fee-less blocks, it's possible to do this in the existing Bitcoin protocol, but it's different from merge mining in these ways;

1.) Merge mining does double mining, whereas in what I'm describing all mining resources are directed at the Proof of Work blockchain. The cryptocurrency blocks are built at the same time as a Proof of Work block.

2.) There is no pretense of being a Bitcoin block, so a dedicated Proof of Work block would be more efficient. No transactional information or scripting would be included.

3.) Bitcoin would be only optionally supported by miners.


Quote from: dumbfruit
The main advantages I see with this system is that it allows arbitrary experimentation in cryptocurrency designs while retaining PoW infrastructure, it creates a marketplace for Proof of Work hashes so that transaction costs can be computed by the actions of the market participants, and it allows centralization of cryptocurrencies while retaining decentralized consensus of their state. The Proof of Work provider could also include other services like "smart property", not just cryptocurrencies.

This allows for experimentation but less than with an alt, because you are limited to the protocol that exists.

The only limitations that I'm aware of are that the cryptocurrency must be a Proof of Work cryptocurrency (which is the only type of cryptocurrency that achieves decentralized consensus that I know of) and that the confirmation times would be longer. Are there other limitations?

Sidechains are more strictly limited by design. Merge mining, and this slight twist on it, are much more lenient.
140  Bitcoin / Development & Technical Discussion / Re: [Theory] The optimal confirmation time on: May 12, 2015, 08:22:26 PM
@Monsterer
Ok, you're glazing over the chasm of difference between Proof of Work and "Proof" of Stake, but whatever. What does this have to do with "optimal confirmation time"?

There is a fair amount of glazing going on here, yes, but for the purposes of the question I think the equivalence assumption is ok.

I'm trying to establish how the idealised, trustless crypto-currency would perform. Is the limiting factor in optimal confirmation time simply the time it takes a transaction to reach 50% network consensus?

No. A block in Bitcoin is confirmed the moment the correct hash is found for it, even if only one miner has confirmed it. However, that's no guarantee that it won't become an orphan block; even when 100% of the network has accepted the block, there is no guarantee it won't be orphaned.

So if I'm understanding you correctly: the amount of time it takes to reach 50% of the nodes is not the only limiting factor on optimal confirmation time. Also, the Bitcoin blockchain can't be reversed by a single party unless they control >50% of the hashrate, which is very different from saying it's irreversible once 50% of the network has reached consensus.
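The >50% point can be made quantitative with the calculation from section 11 of the Bitcoin whitepaper: the probability that an attacker with hashrate share q ever catches up from z confirmations behind.

```python
from math import exp, factorial

def attacker_success(q: float, z: int) -> float:
    """Probability an attacker with hashrate share q catches up from
    z confirmations behind (Bitcoin whitepaper, section 11)."""
    if q >= 0.5:
        return 1.0  # a majority attacker always catches up eventually
    p = 1.0 - q
    lam = z * (q / p)
    prob = 1.0
    for k in range(z + 1):
        # Poisson weight for the attacker having mined k blocks so far,
        # times the chance of ever closing the remaining z - k gap.
        poisson = lam ** k * exp(-lam) / factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

print(attacker_success(0.10, 6))  # ~0.0002428: 6 confs vs 10% hashrate
print(attacker_success(0.51, 6))  # 1.0: majority hashrate always wins
```

So "irreversible" is always probabilistic below 50% hashrate, and never holds at all above it.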