According to my calculations, on large-scale setups FPGAs are more profitable over a 10-month period than GPUs are.
But that math was based on very large-scale operations requiring professional cooling/power and so on (100 GHash/s and up).
I'll grant you that the GPU *may* have better resale value, but that assumes FPGA mining doesn't take off and create demand for secondhand FPGA cards, and that GPU improvements in the gaming market aren't devaluing your hardware too fast.
I think the big difference comes from a perception many miners convince themselves of: that space/cooling/power has a minimal overall impact with GPUs. But when you're talking about a 15-20x improvement in power consumption (plus the power consumed by cooling), it makes a big difference.
A worked example with very rough numbers:
A dual-5830 mining rig can be purchased for about $400 if you're careful, and it will produce 600 MHash/s easily.
At current difficulty that produces around 0.5 BTC per day.
At a rough average price over the past several months of $5 per BTC, that earns you roughly $75 per 30-day period.
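If you want to sanity-check or rerun that with today's figures, here's a rough Python sketch of the standard expected-reward formula. The ~1.2 million difficulty, the 50 BTC block reward and the $5 price are just my assumptions chosen to roughly match the numbers above; plug in whatever the network shows when you read this.

```python
# Rough sketch: expected BTC per day from hashrate and difficulty.
# Assumed values (my guesses to roughly match the post, not gospel):
hashrate_hs   = 600e6   # 600 MHash/s for the dual-5830 rig
difficulty    = 1.2e6   # assumed network difficulty
block_reward  = 50.0    # BTC per block (pre-halving)
btc_price_usd = 5.0     # rough average price used above

# A block takes on average difficulty * 2^32 hashes to find.
btc_per_day = hashrate_hs * 86400 / (difficulty * 2**32) * block_reward
usd_per_30_days = btc_per_day * 30 * btc_price_usd

print(f"{btc_per_day:.2f} BTC/day, ${usd_per_30_days:.2f} per 30 days")
# -> roughly 0.50 BTC/day and ~$75 per 30 days with these assumptions
```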
The current cost of electricity (at least where I am) is around $0.10 per kWh (it varies with time of use, usage volume and supplier, but it averages around $0.10).
So that 5830 rig consumes around 500W measured at the wall. That means in 30 days it will consume 360 kWh of electricity, costing $36. In addition it's generating heat, which requires cooling and therefore more electricity (this isn't noticeable with a single rig, but at 5-10 rigs you're paying for dedicated cooling solutions, adding to supporting hardware costs and adding as much as 50%-100% of the unit's own electricity in operational cost just to keep it cool).
But for the sake of easy comparison, let's ignore that for now (keeping it in the back of our minds) and treat electricity as the ONLY operational cost of either unit to keep it simple.
So the total "profit" per 30 days, after operational costs, is $39.
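For the electricity side, the same kind of back-of-the-envelope sketch, with the figures above baked in:

```python
# Rough sketch: electricity cost and net for the GPU rig over 30 days.
power_w      = 500.0    # measured at the wall
rate_usd_kwh = 0.10     # my local average rate
income_usd   = 75.0     # mining income from the calc above

kwh_30_days = power_w / 1000 * 24 * 30        # 360 kWh
power_cost  = kwh_30_days * rate_usd_kwh      # $36
net_30_days = income_usd - power_cost         # $39

print(f"{kwh_30_days:.0f} kWh, ${power_cost:.2f} power, ${net_30_days:.2f} net")
```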
An FPGA purchased for $500 produces 400 MHash/s.
At current difficulty that produces roughly 0.35 BTC per day.
At the same $5 per BTC, that's $52.50 per 30 days in mining income.
That FPGA consumes a mere 16W or so (maybe 18W measured at the wall with transformer loss), so total consumption is around 13 kWh of electricity in a 30-day period (using the higher 18W number), which costs $1.30 per 30 days.
This results in a "profit" of $51.20 per 30 days.
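Running the FPGA through the same arithmetic (using the rounded income figure above and the 18W at-the-wall number to be conservative):

```python
# Rough sketch: FPGA electricity and net over 30 days, using the rounded
# income figure above (0.35 BTC/day -> $52.50 per 30 days).
fpga_income_usd = 52.50
gpu_net_usd     = 39.00   # GPU rig net from the calc above
power_w         = 18.0    # conservative at-the-wall figure
rate_usd_kwh    = 0.10

kwh_30_days = power_w / 1000 * 24 * 30          # ~13 kWh
power_cost  = kwh_30_days * rate_usd_kwh        # ~$1.30
fpga_net    = fpga_income_usd - power_cost      # ~$51.20

print(f"{kwh_30_days:.1f} kWh, ${power_cost:.2f} power, ${fpga_net:.2f} net, "
      f"${fpga_net - gpu_net_usd:.2f} more than the GPU rig per 30 days")
```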
That means the FPGA earns $12.20 more per month than the GPU rig. And that's not considering that it's also only generating waste heat from 18W versus 500W, so that's another front of savings. AND an FPGA is FAR smaller and denser than a GPU rig; you can scale 100 of them off a single host PC (and soon might be able to mine standalone with them), reducing supporting infrastructure costs further.
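To tie that back to the 10-month claim at the top, here's a rough cumulative view that includes the hardware purchase. It ignores difficulty growth, price swings and cooling entirely, so treat it as an illustration only:

```python
# Rough sketch: cumulative position over several months, hardware included.
# Ignores difficulty changes, BTC price swings and cooling entirely.
gpu_hw,  gpu_net_month  = 400.0, 39.00
fpga_hw, fpga_net_month = 500.0, 51.20

for months in (6, 10, 12):
    gpu  = -gpu_hw  + months * gpu_net_month
    fpga = -fpga_hw + months * fpga_net_month
    print(f"month {months:2d}: GPU ${gpu:8.2f}   FPGA ${fpga:8.2f}")

# With these (static) numbers the FPGA line crosses the GPU line around
# month 8-9, and the FPGA pays off its higher sticker price slightly sooner.
```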
I realize these are very rough calcs; I have done them in far more detail when making a large-scale business case.
That said, if anyone sees any flaws in my reasoning I'd be glad to hear them and discuss your thoughts.
Also to consider: when the BTC payout per block drops to 25 (near the end of this year, I believe), that will have an unknown impact on the economics of mining. The theory is that things will stabilize, but it may make margins tight, and those mining with FPGAs have a thicker operational margin to work with.
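A quick worked example of why that margin matters, holding price, difficulty and power cost fixed (which of course they won't be):

```python
# Rough sketch: what the drop to 25 BTC/block does to each rig's margin,
# holding price, difficulty and power cost fixed (unrealistic, but it
# shows where the squeeze lands).
rigs = (("GPU",  75.00, 36.00),   # name, income/30 days, power cost/30 days
        ("FPGA", 52.50,  1.30))

for name, income, power in rigs:
    halved = income / 2            # block reward 50 -> 25 halves the income
    print(f"{name}: ${halved:.2f} income vs ${power:.2f} power "
          f"-> ${halved - power:.2f} net per 30 days")

# GPU:  $37.50 vs $36.00 -> $1.50 net   (barely above water)
# FPGA: $26.25 vs $1.30  -> $24.95 net  (still a healthy margin)
```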
Anyway, just my thoughts; I'm not trying to pressure anyone into anything. GPUs have their benefits for sure (and their place in the right type of mining op), but I'm just trying to provide information that I hope will be helpful.
Hope some of it helps!