Bitcoin Forum

Other => Beginners & Help => Topic started by: grantbdev on July 20, 2011, 03:05:54 AM



Title: GUIminer throttle for CUDA?
Post by: grantbdev on July 20, 2011, 03:05:54 AM
Quote
Q: My temperatures are too high. Can I throttle the GPU so it runs slower but cooler?
A: If you are mining using OpenCL you can use the -s flag with a value such as 0.01 to force the GPU to sleep for 0.01 seconds between runs. Increase or decrease this value until you have the desired GPU utilization.

I need a way to throttle my GTX 260 to prevent it from getting too hot. I noticed there is a flag for OpenCL users, so I was wondering if there is a flag I can add to my CUDA GUIminer. Thanks.
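For reference, here is a minimal sketch of what the OpenCL -s throttle does conceptually, in Python since GUIminer itself is written in Python. The mining loop and run_kernel_once below are hypothetical stand-ins for illustration, not GUIminer's actual internals.

Code:
import time

SLEEP_BETWEEN_RUNS = 0.01  # seconds; the value passed via -s

def mine(run_kernel_once):
    # run_kernel_once is a hypothetical stand-in for one GPU hashing pass.
    while True:
        run_kernel_once()
        # Sleeping between kernel runs lowers average GPU utilization,
        # which lowers temperature at the cost of hash rate.
        time.sleep(SLEEP_BETWEEN_RUNS)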


Title: Re: GUIminer throttle for CUDA?
Post by: haydent on July 20, 2011, 03:26:27 AM
What you may be thinking of is -f? That's priority, not throttle. One suggestion to keep temps down is to underclock or increase fan speed.


Title: Re: GUIminer throttle for CUDA?
Post by: grantbdev on July 20, 2011, 04:16:11 AM
Quote from: haydent on July 20, 2011, 03:26:27 AM
What you may be thinking of is -f? That's priority, not throttle. One suggestion to keep temps down is to underclock or increase fan speed.

What value should I set -f to for lower temperatures? Underclocking doesn't help that much, and my fan speed ramps up as the temperature rises, but that doesn't stop it from climbing.


Title: Re: GUIminer throttle for CUDA?
Post by: haydent on July 20, 2011, 04:28:56 AM
-f won't reduce your temps; it just helps with responsiveness while you're using the computer. How hot are we talking?


Title: Re: GUIminer throttle for CUDA?
Post by: grantbdev on July 20, 2011, 05:05:41 AM
Quote from: haydent on July 20, 2011, 04:28:56 AM
-f won't reduce your temps; it just helps with responsiveness while you're using the computer. How hot are we talking?

Beyond 85 Celsius. At idle it's 50 Celsius, and games put it in the 60-75 range.


Title: Re: GUIminer throttle for CUDA?
Post by: haydent on July 20, 2011, 05:12:34 AM
I'm not sure exactly what temps that card is rated for; you'd need to look that up. But as mentioned, your only options are underclocking or better cooling for the case and/or card. Either way, it's probably not worth worrying about too much with an Nvidia card. :P
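Since the CUDA miner has no -s equivalent, one workaround is an external watchdog that pauses mining when the card runs hot. Below is a rough sketch of that idea, assuming an nvidia-smi build that supports --query-gpu (older drivers only report temperature in the plain nvidia-smi -q text output). The miner_step hook is hypothetical; GUIminer does not expose one.

Code:
import subprocess
import time

TEMP_LIMIT = 80    # degrees Celsius; stop hashing above this
CHECK_EVERY = 5    # seconds between temperature checks

def gpu_temp():
    # Assumes nvidia-smi supports --query-gpu; older drivers
    # report temperature only in the `nvidia-smi -q` text dump.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader"])
    return int(out.decode().strip())

def throttled_mine(miner_step):
    # miner_step is a hypothetical callable doing one chunk of hashing;
    # this only illustrates the watchdog idea, not GUIminer's API.
    while True:
        if gpu_temp() >= TEMP_LIMIT:
            time.sleep(CHECK_EVERY)  # idle so the card can cool down
        else:
            miner_step()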