Can someone answer me this?
All these miners - Phoenix, ufasoft, cgminer, Diablo... every one I've tried so far, with both AMD and nVidia GPUs, seems to run in a blind "game loop", consuming as much CPU power as it can get. None of them uses any intelligent control scheme to loop with less CPU; they just run at 100% full-time.
That is HUGE. Power consumption is the #1 problem with Bitcoin hashing; many people who don't account for it actually lose money by mining, earning a few bucks (and a warm fuzzy feeling) but then getting slapped with a power bill that eats it all away. I'm running my rig outside on the porch (2nd floor) to sidestep the 2:3 consumption-to-A/C-cooling ratio (for roughly every 2 watts the electronics consume, it takes another 3 watts of air-conditioning to remove the heat they produce). Even so, it draws 200 watts at 200 MHash/sec on a 6770 plus an underclocked (1.2GHz/200MHz/0.95v) Core 2 Quad. The meter says I've spent $4.50 in electricity to produce ~0.57 Bitcoin over the past week (yuck!). That doesn't count the losses from refining my mining methods, and it's due to improve, but that's a VERY tiny profit for the amount of environmental resources consumed to get there...
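For anyone who wants to sanity-check those figures, here's the back-of-the-envelope math (the $/kWh rate is back-solved from my meter reading, not something I looked up, so treat it as an assumption):

```python
# Rough sanity check on a week of continuous 200 W draw.
watts = 200                  # measured draw of the whole rig
hours = 24 * 7               # one week, running 24/7
kwh = watts * hours / 1000   # energy used over the week
rate = 0.134                 # assumed $/kWh, back-solved from $4.50 on the meter
cost = kwh * rate
print(f"{kwh:.1f} kWh, ${cost:.2f}")  # 33.6 kWh, $4.50
```

And remember the 2:3 cooling ratio: if that 200 W were indoors, the A/C would burn roughly another 300 W removing the heat, so the effective draw would be closer to 500 W.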
One of the bigger oversights in GPU mining is the CPU factor. A miner eating 100% of a single core makes the OS think an important process is running and ramps up the CPU clocks to "help" it. That's completely wasted in Bitcoin mining - the extra cycles just check/update the GPU's progress more often. A higher clock speed does NOTHING to increase the speed of the GPU work - maybe a 1-2% change at worst - while going from full clock to minimal clock cuts the CPU's power consumption by more than half!
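For what it's worth, the difference I'm describing looks roughly like this (a hypothetical sketch, not any particular miner's actual code - the `gpu_done` callback just stands in for "is the kernel finished yet?"):

```python
import time

def busy_poll(gpu_done):
    # "Game loop" style: spin checking the GPU as fast as possible.
    # Burns a full CPU core even though the GPU does all the real work.
    while not gpu_done():
        pass  # poll again immediately

def sleep_poll(gpu_done, interval=0.01):
    # Throttled style: a GPU kernel takes milliseconds to finish anyway,
    # so sleeping between checks costs essentially nothing in hash rate
    # but drops CPU usage from 100% to near zero.
    while not gpu_done():
        time.sleep(interval)
```

Better still, the driver APIs already offer blocking waits on events/sync objects, so in principle a miner could skip polling entirely and let the OS wake it when the kernel completes.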
I brought this up once in the Phoenix thread, but was pretty much ignored - the one reply I got suggested my nVidia drivers were at fault. Now that I'm using ATI and seeing exactly the same behavior (and also with an nVidia Quadro driver, which is completely different), I know it's not the driver. It's the miner's loop. So I've just got to wonder: has anyone taken steps to address this in their setup? And can the developers of these miners do anything about the "game loop" problem?
I have found that the ATI drivers released after version 11.6 are buggy: they "automatically" use 100% of a core while mining. Even though some of the "newer" drivers appear to perform better, that is not necessarily the case.
I have read that it can be remedied by using this variable in the args of your batch file.