Author Topic: HOW TO SET UP OVERCLOCKING AND FAN CONTROL ON UBUNTU 16.04 FOR NVIDIA CARDS  (Read 54990 times)
thevictimofuktyranny (OP)
Legendary
Activity: 1092
Merit: 1004
December 03, 2017, 05:02:28 PM
#101

Thanks.  Is there anything I need to do to install that driver?  Do I need to remove the old one first?

It depends on how you are installing them; usually you need to be running the default non-NVIDIA Ubuntu driver, Nouveau, before installing the new one.
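
If it helps, here is a minimal sketch of one common way to do that on Ubuntu 16.04, assuming you install from the graphics-drivers PPA (the package name depends on the driver version you want):

Code:
# purge any existing proprietary driver so the system falls back to Nouveau
sudo apt-get purge 'nvidia-*'
sudo reboot

# after rebooting on Nouveau, add the PPA and install the new driver
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
sudo apt-get install nvidia-384
sudo reboot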
k3rt
Newbie
Activity: 18
Merit: 0
December 04, 2017, 05:19:48 PM
#102

This is the output from nvidia-smi -q -d CLOCK:

Code:
==============NVSMI LOG==============

Timestamp                           : Mon Dec  4 19:05:48 2017
Driver Version                      : 384.98

Attached GPUs                       : 2
GPU 00000000:01:00.0
    Clocks
        Graphics                    : 1657 MHz
        SM                          : 1657 MHz
        Memory                      : 3802 MHz
        Video                       : 1480 MHz
    Applications Clocks
        Graphics                    : N/A
        Memory                      : N/A
    Default Applications Clocks
        Graphics                    : N/A
        Memory                      : N/A
    Max Clocks
        Graphics                    : 1974 MHz
        SM                          : 1974 MHz
        Memory                      : 4004 MHz
        Video                       : 1708 MHz
    Max Customer Boost Clocks
        Graphics                    : N/A
    SM Clock Samples
        Duration                    : 4.36 sec
        Number of Samples           : 100
        Max                         : 1733 MHz
        Min                         : 1620 MHz
        Avg                         : 1682 MHz
    Memory Clock Samples
        Duration                    : 4.36 sec
        Number of Samples           : 100
        Max                         : 3802 MHz
        Min                         : 3802 MHz
        Avg                         : 3802 MHz
    Clock Policy
        Auto Boost                  : N/A
        Auto Boost Default          : N/A

- Does Graphics: 1657 MHz stand for CPU Clock?
- Does Max Clocks / Graphics: 1900 MHz mean I can raise it to 1900 without any harm to the GPU?

I've noticed that changing GPUGraphicsMemoryOffset changes Graphics exactly as expected. Changing GPUMemoryTransferRateOffset affects Memory too, but in a slightly odd way, e.g.:

Code:
 Attribute 'GPUMemoryTransferRateOffset' (ubuntu:0[gpu:0]) assigned value 400.

That will set Memory to 3999 (is 4000 the maximum value?).

Can someone with more experience provide some explanation?
thevictimofuktyranny (OP)
Legendary
Activity: 1092
Merit: 1004
December 06, 2017, 06:12:56 PM
#103

Quote from: k3rt on December 04, 2017, 05:19:48 PM
[nvidia-smi -q -d CLOCK output trimmed, see the post above]

- Does Graphics: 1657 MHz stand for CPU Clock?
- Does Max Clocks / Graphics: 1900 MHz mean I can raise it to 1900 without any harm to the GPU?

I've noticed that changing GPUGraphicsMemoryOffset changes Graphics exactly as expected. Changing GPUMemoryTransferRateOffset affects Memory too, but in a slightly odd way: assigning GPUMemoryTransferRateOffset a value of 400 sets Memory to 3999 (is 4000 the maximum value?).

Can someone with more experience provide some explanation?

So, let's say the memory is running at 7600 MHz (effective transfer rate). You put a 400 MHz offset on GPUMemoryTransferRateOffset to reach 8 GHz.
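
To make the numbers line up: nvidia-smi reports GDDR5 memory at half the effective transfer rate (the 3802 MHz shown is roughly 7604 MHz effective), while GPUMemoryTransferRateOffset applies to the transfer rate, so a +400 offset only moves the nvidia-smi reading by about 200 MHz. Graphics is the GPU core clock, not the CPU clock. A hedged sketch of applying and then checking the offset on the first card (the [3] performance level is what usually takes the offset on Pascal; adjust the gpu index for other cards):

Code:
# apply a +400 MHz transfer-rate offset on performance level 3 of gpu:0
nvidia-settings -a '[gpu:0]/GPUMemoryTransferRateOffset[3]=400'

# confirm the new memory clock (reported at half the transfer rate for GDDR5)
nvidia-smi -i 0 -q -d CLOCK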
jaromiradamek
Newbie
Activity: 23
Merit: 0
December 23, 2017, 05:08:18 PM
#104

Every time I try to overclock or change the fan speed, I get "The control display is undefined".

The only things that work are enabling persistence mode: "nvidia-smi -pm 1"
and setting the power limit (TDP): "nvidia-smi -i 0 -pl 151"



Fan speed control does not work:
-----------------------
root@ja:~# nvidia-settings -a '[gpu:0]/GPUFanControlState=1'
Unable to init server: Could not connect: Connection refused

ERROR: The control display is undefined; please run `nvidia-settings --help` for usage information.
------------------------


Overclocking does not work:
-----------------------
root@jatrovka:~# nvidia-settings -a '[gpu:0]/GPUGraphicsMemoryOffset[3]=100'
Unable to init server: Could not connect: Connection refused

ERROR: The control display is undefined; please run `nvidia-settings --help` for usage information.
--------------------------




Do you know what to do?

I'm using Ubuntu 17.10 with the proprietary Nvidia drivers:

root@jatrovka:~# nvidia-smi
Sat Dec 23 18:03:58 2017
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 384.90                 Driver Version: 384.90                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 1070    Off  | 00000000:04:00.0 Off |                  N/A |
| 69%   69C    P2   158W / 220W |    540MiB /  8111MiB |     99%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX 108...  Off  | 00000000:0C:00.0  On |                  N/A |
| 60%   70C    P2   243W / 250W |    601MiB / 11172MiB |     99%      Default |
+-------------------------------+----------------------+----------------------+
|   2  GeForce GTX 108...  Off  | 00000000:0D:00.0 Off |                  N/A |
| 65%   78C    P2   248W / 250W |    592MiB / 11172MiB |     99%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0       914      G   /usr/lib/xorg/Xorg                            10MiB |
|    0      1016      G   /usr/bin/gnome-shell                           8MiB |
|    0      1344      C   ./zm                                         509MiB |
|    1       914      G   /usr/lib/xorg/Xorg                            15MiB |
|    1      1344      C   ./zm                                         573MiB |
|    2       914      G   /usr/lib/xorg/Xorg                             7MiB |
|    2      1344      C   ./zm                                         573MiB |
+-----------------------------------------------------------------------------+


========================================
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 384.90  (buildmeister@swio-display-x86-rhel47-05)  Tue Sep 19 18:13:03 PDT 2017

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    Screen      1  "Screen1" RightOf "Screen0"
    Screen      2  "Screen2" RightOf "Screen1"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection
Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor2"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 1070"
    BusID          "PCI:4:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 1080 Ti"
    BusID          "PCI:12:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "Device2"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 1080 Ti"
    BusID          "PCI:13:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "AllowEmptyInitialConfiguration" "True"
    Option         "Coolbits" "31"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "AllowEmptyInitialConfiguration" "True"
    Option         "Coolbits" "31"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen2"
    Device         "Device2"
    Monitor        "Monitor2"
    DefaultDepth    24
    Option         "AllowEmptyInitialConfiguration" "True"
    Option         "Coolbits" "31"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
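
For what it's worth, the "Unable to init server: Could not connect" / "control display is undefined" errors above are what nvidia-settings prints when it cannot reach the running X server, which is typical over SSH or from a plain root shell. A hedged sketch of pointing it at the local display, assuming X is running on :0 (the Xauthority path is an example and depends on who owns the X session):

Code:
# tell nvidia-settings which X display to talk to
export DISPLAY=:0
export XAUTHORITY=/home/youruser/.Xauthority   # example path, adjust to the user running X

nvidia-settings -a '[gpu:0]/GPUFanControlState=1'

# the -c flag is an alternative to exporting DISPLAY
nvidia-settings -c :0 -a '[gpu:0]/GPUGraphicsMemoryOffset[3]=100'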

raoulus
Newbie
Activity: 3
Merit: 0
December 31, 2017, 11:19:49 PM
#105

I'm searching for a similar solution. How can I stop the system's xorg auto-reset on reboot? Maybe after that, editing xorg.conf will stay permanent and the attached display won't be needed...

In my case gpu-manager was rewriting xorg.conf. You can check the logs: /var/log/gpu-manager.log

I solved it by disabling gpu-manager at startup, adding nogpumanager to the grub loader.
See https://askubuntu.com/a/732004 for details.
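
A hedged sketch of what that grub change looks like, assuming a stock /etc/default/grub; the nogpumanager parameter stops gpu-manager from regenerating xorg.conf at boot:

Code:
# append nogpumanager to the default kernel command line, e.g.:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nogpumanager"
sudo nano /etc/default/grub

# regenerate the grub config and reboot
sudo update-grub
sudo reboot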
NetopyrMan
Member
Activity: 144
Merit: 10
January 11, 2018, 09:52:11 PM
#106

THIS!!!

You absolutely don't know how much I like this guide.
It solved everything...

Only one thing: I realized that (at least) for the 1070 Ti you need Coolbits set to 12; then I can OC every card.

THANKS, THANKS and again THANKS
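
For context on that value: Coolbits is a bitmask, and as far as I know adding 4 enables manual fan control while adding 8 enables the clock offsets, so 12 gives both; 31 turns on every bit, including overvoltage. The relevant xorg.conf line looks like this:

Code:
Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    Option         "Coolbits" "12"
EndSection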
zx_master
Newbie
Activity: 3
Merit: 0
January 19, 2018, 12:33:11 PM
#107

BTW, once you are able to control your fans manually, you can set up a tool to do it in a proper (smart) way. It was designed especially for rigs:
https://github.com/ktsol/karlson
penailija
Newbie
Activity: 3
Merit: 0
January 30, 2018, 08:59:19 PM
#108

Hi guys,

I've been trying to overclock my GPUs on Linux for way too long now and I'm about to lose my mind.

I've tried all of the advice and tips in this thread numerous times. I have tried running Ubuntu, Lubuntu, Kubuntu, Xubuntu and Debian, on both the 17.10 and 16.04 versions of all of these.

And no matter what I do and what version of Linux I'm running, the same problem persists: xorg.conf resets every time I reboot the system or even just log out.

Does anyone have any suggestions anymore? I have tried every possible solution I've come across while trying to troubleshoot this but nothing has worked.

My setup:

Asrock H110 pro btc+
Intel i7-7700K
4GB Crucial Ballistix DDR4-2400
5 x Zotac Geforce GTX1080ti (blower model)
2 x Corsair RM850X
32gb usb3.0 stick/some random hdd (tried running on both of these)

I'm literally clueless. All help is much appreciated.
dejan_p
Member
Activity: 132
Merit: 11
January 30, 2018, 09:08:14 PM
#109

^
How did you install the drivers? Did you just install them, or have you also blacklisted the other driver?

I've had that problem when I used to install nvidia+cuda via the provided downloaded file (just by running the .sh file, if I remember correctly): the system would work until rebooting.

I solved it by installing the drivers via the repository (apt-get & co.), NOT via the install file.
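
For the blacklisting part, a hedged sketch of what that usually looks like; the packaged drivers normally ship a blacklist file themselves, so treat this as a fallback:

Code:
# stop the Nouveau kernel driver from grabbing the cards at boot
echo "blacklist nouveau"         | sudo tee    /etc/modprobe.d/blacklist-nouveau.conf
echo "options nouveau modeset=0" | sudo tee -a /etc/modprobe.d/blacklist-nouveau.conf
sudo update-initramfs -u
sudo reboot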
pallas
Legendary
Activity: 2716
Merit: 1094
Black Belt Developer
January 30, 2018, 09:22:41 PM
#110

New driver out (390) but, apparently, still no nvidia-smi clock setting support :-/

penailija
Newbie
Activity: 3
Merit: 0
January 31, 2018, 01:34:31 PM
#111

Quote from: dejan_p on January 30, 2018, 09:08:14 PM

I've tried both ways of installing the drivers and it doesn't seem to have any effect. What exactly do you mean by blacklisting? I've just selected the proprietary Nvidia driver from the driver manager.

Should I be able to adjust the PowerMizer settings immediately after modifying the xorg.conf file with Coolbits? Or does the overclocking only get enabled once the system reboots with the modified xorg.conf file?
penailija
Newbie
Activity: 3
Merit: 0
February 01, 2018, 07:36:15 PM
#112

OK, I feel kind of noobish admitting this...

The problem was that I had connected my display to the motherboard while I was configuring the rig. I've done this with mining rigs before and it has never been a problem so far. When I switched it to one of the GPUs, everything started working perfectly.
miner.31
Newbie
Activity: 8
Merit: 0
February 13, 2018, 07:53:39 AM
#113

Hi guys.
I'm new and need some help.
I'm not practised with Ubuntu, but I have to go for it because, for some reason, Win10 only recognizes 4 GPUs.
Maybe I'm in the wrong place, but I hope someone will help me.
I have:
Asus Mining Expert
Intel G4400
4GB Ballistix RAM
5 x 1080 Ti

I want to mine Ethereum and of course overclock my GPUs...
Please help me, I will make a donation. Thanks
thevictimofuktyranny (OP)
Legendary
Activity: 1092
Merit: 1004
February 17, 2018, 03:57:14 PM
#114

Quote from: miner.31 on February 13, 2018, 07:53:39 AM

Normal stuff - set everything to PCI-Express Gen 1 or Gen 2.

Enable Above 4G Decoding.

Set 16GB of virtual memory.
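
On Linux, "virtual memory" here means swap; a hedged sketch of adding a 16 GB swap file (path and size are just an example):

Code:
# create and enable a 16 GB swap file
sudo fallocate -l 16G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# make it persistent across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab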

 
miner.31
Newbie
Activity: 8
Merit: 0
February 18, 2018, 12:05:20 AM
#115

Quote from: thevictimofuktyranny on February 17, 2018, 03:57:14 PM
 
Let's say I took your advice word for word. I'm mining with 5 GPUs, but the problem now is that I can't overclock all 5 GPUs, only the main one. I know I must change the xorg.conf file... I tried everything but I have no permission. Could you help me with the configuration of all the GPUs? Thanks
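
Two likely issues there: /etc/X11/xorg.conf is owned by root, so it has to be edited with sudo, and the file needs a Device section with Coolbits for every GPU, not just the first. A hedged sketch of regenerating it for all cards in one go (check nvidia-xconfig --help for your driver, and back up the current file first):

Code:
# back up the existing config, then generate a Device/Screen pair per GPU with Coolbits enabled
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.bak
sudo nvidia-xconfig -a --cool-bits=31 --allow-empty-initial-configuration
sudo reboot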
miner.31
Newbie
Activity: 8
Merit: 0
February 20, 2018, 06:42:39 PM
#116

Quote from: thevictimofuktyranny on February 17, 2018, 03:57:14 PM

Hi man... I need help, please.
I was mining regularly for about 24 hours when my ethminer crashed with this error: "Error CUDA mining: an illegal memory access was encountered".

Please help me... I don't know what to do... it was working so nicely...

WaveRiderx
Member
Activity: 168
Merit: 39
February 22, 2018, 04:45:19 AM
#117

Thanks for the post.

Is this still the proper way to do it, or is there a new way? I would like to get set up in Linux with my rigs.

Also, can you make OC settings for each card individually, or is it a global setting? A few of my rigs have cards with mixed frequencies; I had to just buy what I could because of the video card drought. Can you set the power limit with this too?
DrJury
Member
Activity: 122
Merit: 10
February 22, 2018, 08:42:02 AM
Last edit: February 22, 2018, 09:34:14 AM by DrJury
#118

Overclocking & fan control are only enabled for the first GPU?!

*********************************************
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 390.25  (buildmeister@swio-display-x86-rhel47-03)  Wed Jan 24 20:46:04 PST 2018

Section "ServerLayout"
    Identifier     "layout"
    Screen      0  "nvidia" 0 0
    Inactive       "intel"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "keyboard"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "intel"
    Driver         "modesetting"
    Option         "AccelMethod" "None"
    BusID          "PCI:0@0:2:0"
EndSection

Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    BusID          "PCI:1@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    BusID          "PCI:2@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    BusID          "PCI:3@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    BusID          "PCI:4@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    BusID          "PCI:5@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    BusID          "PCI:6@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Screen"
    Identifier     "intel"
    Device         "intel"
    Monitor        "Monitor0"
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "AllowEmptyInitialConfiguration" "on"
    Option         "IgnoreDisplayDevices" "CRT"
    Option         "ConstrainCursor" "off"
    SubSection     "Display"
        Depth       24
        Modes      "nvidia-auto-select"
    EndSubSection
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
    Monitor        "Monitor0"
    Option         "AllowEmptyInitialConfiguration" "on"
    Option         "IgnoreDisplayDevices" "CRT"
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
    Monitor        "Monitor0"
    Option         "AllowEmptyInitialConfiguration" "on"
    Option         "IgnoreDisplayDevices" "CRT"
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
    Monitor        "Monitor0"
    Option         "AllowEmptyInitialConfiguration" "on"
    Option         "IgnoreDisplayDevices" "CRT"
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
    Monitor        "Monitor0"
    Option         "AllowEmptyInitialConfiguration" "on"
    Option         "IgnoreDisplayDevices" "CRT"
EndSection

Section "Screen"
    Identifier     "nvidia"
    Device         "nvidia"
    Monitor        "Monitor0"
    Option         "AllowEmptyInitialConfiguration" "on"
    Option         "IgnoreDisplayDevices" "CRT"
EndSection
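
One thing that stands out in that file: every NVIDIA Device and Screen section uses the same Identifier "nvidia", and the ServerLayout only references a single screen, so X most likely only brings up the first GPU with working controls. Xorg expects unique identifiers and one Screen entry per card in the ServerLayout; regenerating the file with sudo nvidia-xconfig -a --cool-bits=31 --allow-empty-initial-configuration should produce that layout automatically. A hedged sketch of the per-card pattern:

Code:
Section "ServerLayout"
    Identifier     "layout"
    Screen      0  "nvidia0" 0 0
    Screen      1  "nvidia1" RightOf "nvidia0"
    # ...one Screen line per GPU...
EndSection

Section "Device"
    Identifier     "nvidia0"
    Driver         "nvidia"
    BusID          "PCI:1@0:0:0"
    Option         "Coolbits" "31"
EndSection

Section "Screen"
    Identifier     "nvidia0"
    Device         "nvidia0"
    Option         "AllowEmptyInitialConfiguration" "on"
EndSection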

thevictimofuktyranny (OP)
Legendary
Activity: 1092
Merit: 1004
March 04, 2018, 03:57:46 PM
#119

Quote from: WaveRiderx on February 22, 2018, 04:45:19 AM

You can set up individual overclocking and fan profiles per card via a .sh file for each card, loaded when you log into the OS via a startup program.
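
A hedged sketch of what one of those per-card scripts might contain, assuming Coolbits is already enabled for every GPU; the attribute names are the ones nvidia-settings uses on the 38x/390 drivers as far as I know, and the numbers are placeholders rather than recommendations:

Code:
#!/bin/bash
# oc-gpu0.sh - example profile for the first card; copy and change the indices per card

# power limit for this card (watts)
sudo nvidia-smi -pm 1
sudo nvidia-smi -i 0 -pl 180

# manual fan speed (percent)
nvidia-settings -a '[gpu:0]/GPUFanControlState=1' \
                -a '[fan:0]/GPUTargetFanSpeed=70'

# core and memory offsets on performance level 3
nvidia-settings -a '[gpu:0]/GPUGraphicsClockOffset[3]=100' \
                -a '[gpu:0]/GPUMemoryTransferRateOffset[3]=400'

One script per card (gpu:1/fan:1, gpu:2/fan:2, and so on) can then be added to Startup Applications so the profiles load after login.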
The Godfather
Newbie
Activity: 41
Merit: 0
March 04, 2018, 08:08:29 PM
#120

I reinstalled Ubuntu 20 times and never could overclock Nvidia cards... in nvidia-settings there is fan control but no core clock... tried everything... any new tutorial?