I never understood the argument for that card, considering that increases in difficulty favour the card that gives the most hash power per USD at the time of purchase.
In the short run (and with increasing difficulty), Hashes per USD on the price tag is THE only thing to look at.
In the longer run (and with constant or falling difficulty), Hashes per Watt used is the dominant measurement.
If you could get a device that costs only 50 USD, draws 3000 Watts and does 1000 MH/s, it would currently be a better investment than an FPGA that costs 500 USD, uses 20 Watts and does a few hundred MH/s.
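To make that concrete, here's a rough payback sketch comparing the two hypothetical devices under rising difficulty. All the constants (BTC price, electricity cost, starting difficulty, 20% weekly difficulty growth, 300 MH/s for the FPGA) are my own assumptions just for illustration, not figures from the thread:

```python
# Rough payback comparison under rising difficulty.
# All constants below are assumptions for illustration only.

BTC_USD = 15.0              # assumed exchange rate
KWH_USD = 0.10              # assumed electricity price per kWh
BLOCK_REWARD = 50.0         # BTC per block (pre-halving)
WEEKLY_DIFF_GROWTH = 0.20   # assumed 20% difficulty increase per week

def weekly_profit_usd(mhash, watts, difficulty):
    # Expected blocks/sec = hashrate / (difficulty * 2^32)
    hashes_per_week = mhash * 1e6 * 7 * 24 * 3600
    btc = hashes_per_week / (difficulty * 2**32) * BLOCK_REWARD
    power_cost = watts / 1000.0 * 7 * 24 * KWH_USD
    return btc * BTC_USD - power_cost

def weeks_to_payback(price_usd, mhash, watts, start_diff, max_weeks=104):
    diff, earned = start_diff, 0.0
    for week in range(1, max_weeks + 1):
        p = weekly_profit_usd(mhash, watts, diff)
        if p <= 0:
            return None       # device now loses money; it never pays back
        earned += p
        if earned >= price_usd:
            return week
        diff *= 1 + WEEKLY_DIFF_GROWTH
    return None

START_DIFF = 1_000_000.0      # assumed network difficulty at purchase

# Hypothetical $50 space heater: 1000 MH/s at 3000 W
cheap = weeks_to_payback(50, 1000, 3000, START_DIFF)
# Hypothetical $500 FPGA: 300 MH/s at 20 W
fpga = weeks_to_payback(500, 300, 20, START_DIFF)
print("cheap device pays back in:", cheap)   # weeks, or None
print("FPGA pays back in:", fpga)            # weeks, or None
```

Under these assumed numbers the cheap, power-hungry box recoups its $50 in the first week, while the FPGA's revenue decays with difficulty before it ever covers its $500 price tag; with flat or falling difficulty the picture reverses, which is exactly the short-run vs long-run distinction above.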
I really should plot out some numbers, since no one has done it publicly yet... but that could get quite depressing for 6990 owners/buyers!
Indeed. I've grown weary of trying to debunk these silly static statements, so it's nice to see someone pointing out that context plays a huge role in any decision. $/Mhash, Mhash/Watt, Mhash/Card, $/Watt, Price/Difficulty, Rate_of_Network_Growth, all play an important role in how you choose your path to profitability.
But no, let's just calculate 50% static difficulty increases based on nothing.