September 06, 2012, 05:14:58 AM
I noticed a difference in CPU load and was wondering if it's normal. The boxes with nVidia GPUs (two 9500GTs and a GT210) show full usage on one core while mining. By comparison, the new mining box I set up with a Radeon HD 7750 has almost no CPU usage while pushing a much higher rate (~3.5 MH/s for the GT210, ~6 MH/s each for the 9500GTs, ~133 MH/s for the 7750). What would account for this discrepancy?
I'm running the most recent bfgminer on all machines, with the diablo kernel. All but one are running 64-bit Gentoo Linux. One of the 9500GTs is in my office computer, which runs 32-bit Windows XP SP3; CPU load on it behaves the same way as on the Linux boxen. Intensity is set to dynamic everywhere except the new mining box, where it's set to 11. CPUs are a mix of Core 2s and Athlon 64s.
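For reference, the invocation on the mining box looks roughly like this (hostname, port, and payout address are placeholders, not my actual setup; P2Pool conventionally takes a Bitcoin payout address as the username and ignores the password):

```shell
# Hypothetical invocation matching the settings described above:
# diablo kernel, static intensity 11, pointed at a P2Pool node.
bfgminer -o http://p2pool.local:9332 -u 1ExamplePayoutAddress -p x -k diablo -I 11

# The other boxes use dynamic intensity instead:
# bfgminer -o http://p2pool.local:9332 -u 1ExamplePayoutAddress -p x -k diablo -I d
```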
I know that nVidia GPUs are suboptimal for mining, but they're what I have. Two of them are in boxes that serve as MythTV frontends, and nVidia has historically provided better support for video decoding. I haven't given VAAPI (?) a shot on the 7750 yet; if it now works as well with MythTV as VDPAU does, maybe my HTPC could pull more serious double duty as a miner...at least until ASICs take over. (Playing 1080p H.264 only knocks about 300 kH/s off the GT210's hashrate.)
(As an aside: on the Windows box, guiminer gives me zero CPU load with the CUDA miner, but it doesn't pass the username parameter through to my P2Pool server properly for some reason.)