add my former card
Nvidia GTX 560 Ti, factory overclocked to 900/2000 MHz
86,700 khash/s
Win7 x64 and RPC Miner CUDA
OK, well, I guess I'll add it. Some more info wouldn't hurt, but there seems to be just enough here.
I'm running an 8600 GT at 7.3 Mh/s
2 Mh/s better than the ones posted on the wiki
GPU Shark says 43 W, so that's about 0.17 Mhash/W if that reading is correct
using poclbm with -w 128
1602 MHz shader clock
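For sanity-checking efficiency figures like the one above, the Mhash/W math is just hashrate divided by power draw. A minimal sketch, using the 7.3 Mh/s and 43 W numbers reported in this post (the wattage is the poster's GPU Shark reading, not a measured draw):

```python
def mhash_per_watt(mhash_s: float, watts: float) -> float:
    """Mining efficiency: hashrate in Mhash/s divided by power draw in watts."""
    return mhash_s / watts

# Figures reported above for the 8600 GT: 7.3 Mhash/s at a claimed 43 W.
print(round(mhash_per_watt(7.3, 43.0), 2))  # ~0.17 Mhash/W
```

If the real draw is closer to the 75 W suggested below, the efficiency drops to under 0.1 Mhash/W.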
The wattage is going to be off by a pretty good amount. If you go up by one model, the 8600 GTS has the same GPU clocked at 675 MHz core/1450 MHz shader clocks vs. the 8600 GT at 540 MHz core/1180 MHz shader clocks. The 8600 GTS is listed at 75 W. If your shader clocks are 1602 MHz, you're probably in that 75 W or greater ballpark, which reminds me...
I idle around 50 °C and can hit 90 °C under load if I push it hard enough.
Mining coins and boiling water at the same time, just 10 more to go
The cooler for your card was designed with a 47 W TDP in mind. Pushing those clocks may not kill your card today or a month from now, but it will definitely shorten its life. Your card being an 8600 GT, you may not be all that concerned, anyhow. I'll add it for now, but let me know if you are running a different clock than the one you have posted.
My computer jumps between 18.7 and 16.8 Mhash/s on my 9600 GT
More information would be helpful. Is this card running at the nVidia reference clocks? What miner are you using? What command-line arguments are you giving it? (Omit username and password, obviously.) The speed variance you're seeing is somewhat concerning. Does the rate drop while you're using the system, or does it fluctuate even when the machine is idle? Also, fire up GPU-Z to determine which specific GPU you have, as there were two versions of the 9600 GT: one based on the 65 nm G94 and another based on the 55 nm G94b. I've seen wattage specs for the 65 nm part, but haven't run across wattage specs for the 55 nm part (which would be lower).
I'm using a GT 240.
Using python poclbm-mod.py -d 0 -f 0 -a 10 -v -l, I regularly see between 21230 and 21255 khash/s. I've also noticed that -v (vectors) costs about 300 khash/s (21567 to 21579 without it), but I think there was an increase in invalid/stale shares; this may have been due to other factors, however. I also think there was a reduction in discovering multiple shares per getwork without vectors, but that may have been due to the same 'other factors'.
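The roughly 300 khash/s cost of -v can be checked from the two ranges quoted above; a quick sketch comparing the midpoints of the reported with-vectors and without-vectors rates:

```python
# Midpoints of the khash/s ranges reported above for the GT 240.
with_vectors = (21230 + 21255) / 2     # -v enabled
without_vectors = (21567 + 21579) / 2  # -v omitted

cost = without_vectors - with_vectors  # ~330 khash/s
pct = 100 * cost / without_vectors     # ~1.5%
print(f"{cost:.0f} khash/s ({pct:.1f}%)")
```

A ~1.5% throughput difference is small enough that a run of bad luck on stale shares could easily swamp it over a short measurement window.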
I don't know what the wattage draw is, but I have noticed the temperature of the card jump from 58 (idle) to 72 (mining). I'm guessing these values are in Celsius.
nVidia specs this card to run at 69 W. Also, try to determine whether this card runs at nVidia reference clocks; GPU-Z should do the trick, as should CPU-Z (Graphics tab, highest perf level). As far as your -v issue goes, you may have just had a run of bad luck. Who knows?
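Taking that 69 W spec together with the ~21240 khash/s reported above gives a rough efficiency figure for the GT 240 (assuming the card actually draws its full TDP while mining, which it may not):

```python
# GT 240 figures from this thread: ~21240 khash/s at nVidia's 69 W spec.
mhash_s = 21240 / 1000  # convert khash/s to Mhash/s
tdp_watts = 69.0        # spec-sheet TDP, not a measured draw

print(round(mhash_s / tdp_watts, 3))  # ~0.308 Mhash/W
```

That would put it well ahead of the 8600-series cards discussed above on a per-watt basis, though still far behind contemporary OpenCL-friendly AMD cards.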