There has been some discussion about the current costs of mining bitcoins. Obviously it depends on a few variables, but it would be useful to come up with a ballpark figure, or a range, for US cents per GHash/s at the current difficulty.
Take the most efficient card currently available, the AMD HD 5970, run it at its most efficient clocking/power consumption, and calculate the (GHash/s)/kilowatt: https://en.bitcoin.it/wiki/Mining_Hardware_Comparison
I suggest measuring power draw directly for your total rig, i.e. including mobo, monitor if you use one, etc.
Multiply by your local kWh electricity rate and divide by bitcoins per GHash/s at the current difficulty: http://www.alloscomp.com/bitcoin/calculator.php
That gives an estimate of current US cents per bitcoin. Add card amortisation on top of that (say one card lasts 12 months on average). Call this gross expenditure.
Add 30% on top for sysadmin and sundry overheads (more if you think necessary). That gives a final figure in US cents per bitcoin (or whatever your local currency units, if you prefer).
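The steps above can be sketched in a few lines of Python. All the numbers below are illustrative assumptions (placeholder hash rate, power draw, electricity rate, BTC yield, and card price), not measurements; plug in your own rig's figures and the calculator's output.

```python
# Rough cost-per-bitcoin estimate following the steps above.
# Every figure here is an assumed example value, not real data.

hashrate_ghs = 0.7        # total rig hash rate, GHash/s (assumed)
power_kw = 0.35           # wall-socket draw for the whole rig, kW (assumed)
electricity_rate = 0.12   # local rate, US$ per kWh (assumed)
btc_per_ghs_day = 0.05    # BTC per GHash/s per day at current difficulty
                          # (assumed; take this from the calculator linked above)
card_cost = 600.0         # card price in US$, amortised over 12 months (assumed)
overhead = 0.30           # 30% sysadmin and sundry overheads

btc_per_day = hashrate_ghs * btc_per_ghs_day
electricity_per_day = power_kw * 24 * electricity_rate          # US$/day
amortisation_per_day = card_cost / 365.0                        # US$/day
gross_per_btc = (electricity_per_day + amortisation_per_day) / btc_per_day
final_per_btc = gross_per_btc * (1 + overhead)

print(f"Gross expenditure: US$ {gross_per_btc:.2f} per bitcoin")
print(f"Final (incl. overheads): US$ {final_per_btc:.2f} per bitcoin")
```

Swap any of the assumed constants for your own measurements; the structure (electricity + amortisation, divided by daily BTC yield, plus overhead) is the same calculation described above.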
I'll do a spreadsheet if anybody wants it.