Title: Not seeing linear scale-up with addition of graphics card
Post by: johanatan on April 23, 2011, 12:07:39 PM

Hi,

I was mining at a rate of 300K khash/sec with a single ATI HD 5870 last week, running on an ASUS P8P67 WS Revolution and a 64-bit installation of Ubuntu 11.x with the ATI fglrx drivers. Something went wrong with the Ubuntu installation, so I re-installed 32-bit Kubuntu 10.10 and added two more cards, using the ATI proprietary driver (Catalyst) -- although it was installed from the fglrx dpkg package as well. Now my 3 cards are only averaging 150K khash/sec each. I've read elsewhere that bus speed should not be the issue. My CPU is an Intel i3 3.1 GHz (dual-core with hyperthreading), and 'top' shows 3 of the CPUs pegged. If I also start CPU mining, the rate on the GPUs drops to half of what it is currently. Is the CPU the bottleneck? And if so, why? I've also read elsewhere that this should be plenty of CPU. [I've also set the 'governor' to 'performance'.]

Thanks!

Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: rezin777 on April 23, 2011, 12:39:59 PM

The CPU isn't a bottleneck. Something else is wrong.
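The 'performance' governor setting mentioned above can be checked directly against sysfs. A minimal sketch, assuming the standard Linux cpufreq layout; the script is illustrative and not part of the original posts:

Code:
# Minimal sketch: confirm that every core really is on the 'performance'
# governor, assuming the standard Linux cpufreq sysfs layout.
import glob

paths = sorted(glob.glob(
    "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor"))
for path in paths:
    cpu = path.split("/")[5]  # e.g. 'cpu0'
    with open(path) as f:
        print("%s: %s" % (cpu, f.read().strip()))  # expect 'performance'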
Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: teknohog on April 23, 2011, 02:22:45 PM

Which SDK version are you using? Many people (including me) have had this scaling problem with newer SDKs, and 2.1 is the only one that works well.
Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: rezin777 on April 23, 2011, 02:54:59 PM

Quote from: teknohog
  Which SDK version are you using? Many people (including me) have had this scaling problem with newer SDKs, and 2.1 is the only one that works well.

You've lost half your hashing power per card due to adding cards and a newer SDK?

Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: teknohog on April 23, 2011, 08:09:32 PM

Quote from: rezin777
  You've lost half your hashing power per card due to adding cards and a newer SDK?

Yes, and not just me. http://bitcointalk.org/index.php?topic=4648.0

Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: johanatan on April 23, 2011, 11:37:21 PM

It was in fact the SDK version.
Changing from 2.3 to 2.1 (I tried 2.4 as well along the way) did fix the issue for 2 of the cards. However, the card which is running the display only runs at 2K khash/sec with the 2.1 libs (it was doing 150K khash/sec with 2.3, and it is also the original card which was doing 300K+ before the addition of the other two).

Strangely enough, I had a terminal window open for each of the 3 cards, and changing LD_LIBRARY_PATH to point to the 2.1 libs got two of them running fine. However, in the 3rd window (the one which was running the card with the problem above) I get 'cannot find libOpencl.so ...'. I can run that card from the other two terminals by specifying the proper -d value (and that's how I got the 2K khash/sec measurement). Is there any rationale for either of these issues that any of you know of? [Regarding the terminal weirdness, I am thinking it may be related to stale Python pre-compiled bytecode, but I think all 3 terminals should be using the same binaries.]

Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: yunk3r on April 23, 2011, 11:43:40 PM

If you stop mining with the CPU, your GPUs will start to work at full speed. I think the problem is that since the CPU is mining, it is unable to send the GPUs data or "feed" them.
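One way to confirm which OpenCL library (and hence which SDK) a given shell actually resolves after changing LD_LIBRARY_PATH is to print the platform version it reports. A minimal sketch, assuming pyopencl is installed; the version string in the comment is only an example of the kind of output to expect, not taken from this thread:

Code:
# Minimal sketch: print the OpenCL platform(s) the current environment
# resolves, which shows the SDK actually being loaded via LD_LIBRARY_PATH.
# Assumes pyopencl is installed.
import os
import pyopencl as cl

print("LD_LIBRARY_PATH = %s" % os.environ.get("LD_LIBRARY_PATH", "<unset>"))
for platform in cl.get_platforms():
    # e.g. something like 'ATI Stream | OpenCL 1.0 ATI-Stream-v2.1'
    print("%s | %s" % (platform.name, platform.version))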
Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: shivansps on April 24, 2011, 12:11:50 AM

The problem is because you added a 2nd card... once you do that, you start using 50% CPU instead of 0-1%, so if you do CPU mining, you will reduce the Mhash of the cards.

The only workaround I found is to set the affinity of both poclbm instances to "core 3" and do CPU mining on the other 3.

Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: johanatan on April 24, 2011, 12:49:45 AM

But the CPU is not mining currently.
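shivansps's affinity workaround can also be applied to an already-running miner from Python. A minimal sketch, assuming Python 3.3+ on Linux; the PID and the choice of core 3 are placeholders, not values taken from this setup:

Code:
# Minimal sketch of the affinity workaround: pin a running poclbm process to
# one core so CPU mining on the remaining cores doesn't starve the GPU feeder.
# Linux-only, Python 3.3+; MINER_PID is a placeholder (e.g. from 'pgrep -f poclbm').
import os

MINER_PID = 12345        # hypothetical PID of a poclbm instance
GPU_FEED_CORES = {3}     # reserve core 3 for feeding the GPUs

os.sched_setaffinity(MINER_PID, GPU_FEED_CORES)
print("affinity of %d: %s" % (MINER_PID, sorted(os.sched_getaffinity(MINER_PID))))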
Title: Re: Not seeing linear scale-up with addition of graphics card
Post by: johanatan on April 24, 2011, 05:34:39 AM

Doh! It actually was the CPU I mistook for a GPU. For some reason, the system decided to renumber my processors from when I first tried it.
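The device renumbering described in the last post can be caught by listing the OpenCL devices and their indices before choosing a -d value. A minimal sketch, assuming pyopencl; whether these indices match poclbm's own -d numbering on a given install is an assumption and should be checked against the miner's device listing:

Code:
# Minimal sketch: enumerate OpenCL devices so an index used with -d can be
# checked against real hardware. Device order can change between SDK installs,
# which is how a CPU ended up being benchmarked as a "GPU" here.
# Assumes pyopencl.
import pyopencl as cl

for platform in cl.get_platforms():
    for i, dev in enumerate(platform.get_devices()):
        kind = cl.device_type.to_string(dev.type)
        print("device %d: %s (%s)" % (i, dev.name, kind))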