Bitcoin Forum
December 03, 2016, 09:54:38 PM *
News: Latest stable version of Bitcoin Core: 0.13.1  [Torrent].
 
Author Topic: 7x 7970's, PCI-E 16x-connected cards not working  (Read 10047 times)
NLA (Member, Activity: 88)
June 07, 2012, 05:06:01 AM  #1
I've built a nice dual-PSU rig with 7x 7970's in an open-air frame I made from spare parts. 5 of them are connected to the motherboard via 1x-to-16x riser cables, and 2 are connected via 16x-to-16x riser cables. The 5 connected via 1x-to-16x work just fine and can mine, but the 2 connected via 16x-to-16x have some sort of issue and are not available for mining. In Windows Device Manager, those 2 cards show "Code 43" as the error and are prevented from starting. I've done a clean driver re-install with ATIMan, and I'm using the 11.12 drivers (which support up to 8 GPUs), and I have no idea what has gone wrong. I wonder if I need powered 16x-to-16x risers from Cablesaurus (or somewhere cheaper, preferably).

Anyone have a similar issue?

  • GPUs: 7x 7970's (2 Diamond, 5 Sapphire)
  • Dual PSUs: 1x 1600W (powering 5 GPUs), 1x 750W (powering 2 GPUs)
  • Motherboard: GIGABYTE GA-970A-D3
  • OS: Windows 7 Ultimate 64-bit
  • AMD 11.12 drivers, for >5 GPU support
  • Clean driver re-install with ATIMan completed; same issue
  • 1x-to-16x connected cards recognized and functional
  • 16x-to-16x connected cards non-functional; Code 43 in Device Manager
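Not part of NLA's post: the Code 43 check above can be scripted rather than read off Device Manager by hand. On Windows, `wmic path Win32_VideoController get Name,ConfigManagerErrorCode /format:csv` dumps each adapter's PnP status code; a minimal sketch of filtering that output (the sample text below is invented for illustration, and real `wmic` CSV may carry extra blank lines that the filter here tolerates):

```python
# Sketch: find display adapters that Windows flagged with a nonzero
# ConfigManagerErrorCode (43 = "Windows has stopped this device because
# it has reported problems"). The sample is hypothetical.

SAMPLE_WMIC_CSV = """Node,ConfigManagerErrorCode,Name
RIG,0,AMD Radeon HD 7900 Series
RIG,0,AMD Radeon HD 7900 Series
RIG,43,AMD Radeon HD 7900 Series
RIG,43,AMD Radeon HD 7900 Series
"""

def failed_adapters(wmic_csv: str):
    """Return (error_code, name) for every adapter with a nonzero code."""
    lines = [l for l in wmic_csv.strip().splitlines() if l.strip()]
    header = lines[0].split(",")
    code_i = header.index("ConfigManagerErrorCode")
    name_i = header.index("Name")
    bad = []
    for line in lines[1:]:
        cols = line.split(",")
        if int(cols[code_i]) != 0:
            bad.append((int(cols[code_i]), cols[name_i]))
    return bad

if __name__ == "__main__":
    for code, name in failed_adapters(SAMPLE_WMIC_CSV):
        print(f"code {code}: {name}")
```

With all 7 cards plugged in, a dump like the sample would confirm exactly two adapters stuck at code 43, matching what Device Manager shows.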

Halp? :(

PS Please no one buy the dual-fan Sapphire cards: they look impressive, but the fans don't spin fast enough to cool worth a damn (~3300 RPM cap), and they're not designed to move heat out the back of the card (they just gently direct heat everywhere).

EDIT: Just connected one of the 16x-connected cards to another slot via a 1x-to-16x cable, and it works just fine. The issue is definitely either the cable or the 16x slot. Halp?

If my post helped you in some way, please donate to 1NP2HfabXzq1BB288ymbgnLcGoeBsF7ahP. :)
Fuzzy (Hero Member, Activity: 560)
June 07, 2012, 06:05:22 AM  #2

Some motherboards only let you use a certain number or configuration of slots.

For example, on one of my motherboards I can use the top of the two x16 slots plus the three x1 slots, or both x16 slots but none of the x1 slots.

This should be stated in the manual, so check there first.
NLA (Member, Activity: 88)
June 07, 2012, 06:17:21 AM  #3

Welp, oddly enough, I've just switched some cables around to make sure the cards themselves aren't to blame, and something odd happened: the card that isn't working now is one of the cards that was working previously, and the card connected via 16x to 16x is working just fine (I can adjust fan speed and clock rates). WHAT IS THIS. :|

Anyone out there ever hook up 7x GPU's and have it work?

Dalkore (Legendary, Activity: 1176)
June 07, 2012, 06:29:53 AM  #4

Quote from: Fuzzy
> Some motherboards only let you use a certain number or configuration of slots.
>
> IE, on one of my motherboards, I can use the top one of the two x16 slots and the 3 x1 slots, or both the x16 slots but none of the x1 slots.
>
> This should be stated in the manual, so check there first.

This sounds like your answer, in short. You're saying the ones that previously didn't operate now do, and the ones that worked before are now non-functioning? Is that correct?

Dalkore

NLA (Member, Activity: 88)
June 07, 2012, 07:26:47 AM  #5

Well, all 7 GPUs do show up in Device Manager, but only 5 work properly. The other 2 show Code 43: Windows prevented them from being used because they reported errors.

I disconnected one of the 16x-to-16x riser cables from a card, disconnected a 1x-to-16x riser cable, and plugged that card into the 1x-to-16x riser, leaving only 6 cards connected. I did this to make sure the card itself wasn't to blame. Sure enough, once Windows booted and finished installing drivers for the card, I could control it via software (my test: adjust the fan speed to see if it responds). What was odd was that the other 16x-to-16x card could now also be controlled via software, while one of the original 1x-to-16x cards, which was working fine before, now could not. It's as if the motherboard can only work with 5 devices at one time, which is absolutely ridiculous. Also, I checked through the manual a few minutes ago and didn't see anything about PCI lanes being disabled under certain configurations.

I've updated the BIOS, updated Windows, updated system drivers, and cleanly uninstalled and reinstalled the AMD 11.12 drivers (since everyone says they support 7970's and up to 8 GPUs), and it seems like Windows (or the motherboard) just does not want to let me work with more than 5 GPUs. :-\

I would be very curious to hear from others that have put together 6x or 7x GPU rigs!

AzN1337c0d3r (Full Member, Activity: 238)
June 07, 2012, 11:59:05 AM  #6

I assume you're using some kind of adapter to convert those PCI slots to 1x PCI-E.

I looked at your motherboard block diagram in the manual and it looks like it should work.

I vaguely remember reading somewhere that someone tried the same thing you did and had to go in and play with their IRQ settings to get it to work.

crazyates (Legendary, Activity: 938)
June 07, 2012, 12:31:05 PM  #7

I've heard of other people with those Gigabyte boards saying the MB can only allocate a certain number of devices. Try going into your BIOS and disabling anything you don't need. I'd start with the audio, the 2nd SATA controller, and USB 3.0. Disable anything you don't need, and then try it.

Tips? 1crazy8pMqgwJ7tX7ZPZmyPwFbc6xZKM9
Previous Trade History - Sale Thread
Dargo (Legendary, Activity: 1554)
June 07, 2012, 12:45:52 PM  #8

Did you change the latency timer in the BIOS? I've read on this forum that it needs to be increased from the default (64) for 5+ cards on Gigabyte boards. If I remember correctly, the theory was that you need to set the timer to 16 × the number of GPUs. I'm not sure about that, but you should try increasing it. I'd try 96 and some higher settings as well. 7 × 16 = 112, so try that too if you can.
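Dargo's rule of thumb is easy to tabulate. A throwaway sketch, purely to enumerate values worth trying (the 16 × GPUs rule is forum folklore, not documented chipset behavior; the 248 cap reflects the one-byte latency-timer register, which BIOSes usually expose in multiples of 8):

```python
def latency_candidates(num_gpus: int, default: int = 64):
    """PCI latency timer values worth trying, per the 16 * GPUs rule of thumb.

    Starts from the BIOS default (64) and includes the 96/128 steps Dargo
    suggests; values are capped at 248, the usual max for the 1-byte register.
    """
    rule = 16 * num_gpus
    candidates = sorted({default, 96, 128, rule})
    return [min(c, 248) for c in candidates if c >= default]

print(latency_candidates(7))   # 7-GPU rig -> includes 112 (= 7 * 16)
```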
Aseras (Hero Member, Activity: 658)
June 07, 2012, 12:59:23 PM  #9

IIRC Windows won't see more than 5 GPUs.
crazyates (Legendary, Activity: 938)
June 07, 2012, 01:25:14 PM  #10

Quote from: Aseras
> IIRC windows wont see more then 5 gpus

ATI drivers cap out at 8. You can't use 5 dual-GPU cards (5970, 6990, etc.), but you can use up to 8 single-GPU cards.

NLA (Member, Activity: 88)
June 07, 2012, 03:57:11 PM  #11

Quote from: AzN1337c0d3r
> I assume you're using some kind of adapter to convert those PCI slots to 1x PCI-E.
You assume correctly! Using two StarTech PCI1PEX1 PCI to PCI Express adapter cards to make the magic happen.

Quote from: AzN1337c0d3r
> I remember vaguely reading somewhere before that someone tried the same thing you did and having to go in and play with their IRQ settings to get it to work.
Interesting... anything in particular I would have to do with the IRQ settings? I've never had to mess with IRQ stuff before. Will look into this in the next few minutes and report back.

Quote from: crazyates
> Try going into your BIOS and disabling anything you don't need. I'd start with the audio, 2nd SATA controller, and USB3.0.
Hm, I already disabled the audio controller, but you're right, I might as well disable some other stuff while I'm in there. Will report back!

Quote from: Dargo
> Did you change the latency timer in BIOS? I've read on this forum that this needs to be increased from the default (64) for 5+ cards on gigabyte boards.
Latency timing, eh? Sounds reasonable! Will look into this in just a few minutes!

NLA (Member, Activity: 88)
June 07, 2012, 05:16:38 PM  #12

Welp, I looked through my available BIOS options, and while I see PCI-E clock options (raise or lower from 100 MHz), I see no PCI latency timer option. Should it be in my BIOS? Disabling the audio and USB 3.0 controllers hasn't helped. On boot, there are still 2 GPUs giving Code 43 in Device Manager.

Would powered riser cables fix this? Would raising the PCI-E clock above 100 MHz help? Or am I just stuck using a second computer for two of the cards?

AzN1337c0d3r (Full Member, Activity: 238)
June 07, 2012, 07:04:28 PM  #13

I take it you didn't have IRQ settings to mess with?

NLA (Member, Activity: 88)
June 07, 2012, 08:41:03 PM  #14

No IRQ settings in the BIOS that I could see. And I'm not sure how (or if I should) change IRQ settings in Windows.

AzN1337c0d3r (Full Member, Activity: 238)
June 07, 2012, 08:47:10 PM  #15

Quote from: NLA
> No IRQ settings in the BIOS that I could see. And I'm not sure how (or if I should) change IRQ settings in Windows.

That's a bummer. I guess everything is auto-configured with no option to change it (curses, motherboard manufacturer).

One last thing I remember: maybe play with the PnP OS option if it exists. See this KB article.

Clipse (Hero Member, Activity: 504)
June 07, 2012, 08:51:47 PM  #16

Do yourself a favour and configure it on a Linux machine.

Use kernel parameters to disable ACPI and self-assign IRQs to each GPU (ACPI can screw up horribly for some reason with so many GPUs).

I've had an 8-GPU setup like this temporarily, but it really puts a lot of strain on the board, and I could literally feel the slots heat up with all 8 running, so I don't really recommend this for a stable long-term setup.
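Not verified on this board: the kernel parameters Clipse mentions are passed on the boot line. A sketch of the sort of GRUB entry people used for multi-GPU rigs (the exact parameter choice is an assumption; `pci=noacpi` stops ACPI from doing PCI IRQ routing, and `pci=routeirq` makes the kernel route IRQs for all PCI devices itself):

```shell
# /etc/default/grub -- example boot line for a headless multi-GPU miner.
# pci=noacpi   : don't use ACPI for PCI IRQ routing
# pci=routeirq : have the kernel assign IRQs to all PCI devices itself
GRUB_CMDLINE_LINUX_DEFAULT="pci=noacpi pci=routeirq"
# Apply with: sudo update-grub && sudo reboot
```

Whether these help depends on how badly the board's ACPI tables misroute interrupts; `acpi=off` is the blunter hammer Clipse alludes to, at the cost of losing power management entirely.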

jjiimm_64 (Legendary, Activity: 1680)
June 07, 2012, 09:21:52 PM  #17



Quote from: Clipse
> Ive had 8 GPU setup like this temporarily but it really puts alot of strain on the board and I could literally feel the slots heatup with all 8 running thus I dont really support this for a stable longterm setup.

I would have to disagree with this. I have 5 rigs with 8 GPUs each, running 24/7 for about 8 months now. Very stable.

As for the original topic: I have 3 rigs with 5x 7970's. I tried to put a 6th card in one and could NOT get it to work.

I could see all 6 cards in the list of devices, but only 5 would be enabled at a time.

I hope this does not stop me from getting 4x 7990s working... can't wait for that.

1jimbitm6hAKTjKX4qurCNQubbnk2YsFw
ssateneth (Legendary, Activity: 1288)
June 08, 2012, 10:18:44 AM  #18

Sounds like this is a dedicated mining rig. Go into your BIOS and disable EVERY non-essential piece of hardware that you won't be using: audio, FireWire, all but 1 NIC, serial ports, parallel ports, all SATA ports/controllers except 1 if using a hard drive, all but 1 CPU core if using a multi-core CPU, and USB ports (unless you need a keyboard to get in for something). All of that is consuming "resources", I'm assuming, that could go to the hardware you actually want to use, i.e. the video cards. Also, using 16x > 16x risers may be causing a problem, because your board is trying to negotiate 16x speeds. Some boards, when negotiating to max slot speed, will turn off other slots entirely in order to have the full amount of bandwidth available (PCI-E lanes are not unlimited). Try using 1x risers across the whole shebang. If you're short on 1x risers, use the 16x risers in your PCI > PCI-E adapters, since those always run at 1x speed.

Also make sure your risers are not defective. I had a bunch of 1x/16x > 16x risers, and they all went bad from the stress of plugging them into the cards or bending the cables; they would either freeze the system when a load was applied, show errors in Device Manager, or not detect at all. I now use 1x > 1x risers with the edge sanded off so they accept anything bigger than 1x, and they have all worked no problem (I have 10 in use now), and it's effortless to plug a card in.

Edit: I didn't really read much of the post before. You say 16x > 16x doesn't work but 1x > 16x does. It could be a bad riser or bad speed negotiation. Use the 16x > 16x risers in your PCI > PCI-E adapters, since they always work at 1x. Powered risers shouldn't matter if you are underclocking/undervolting. Also, I already have experience with 6 cards on 1 board as well as with PCI > PCI-E adapters (I'm using one in a 4-PCI-E-slot board now, so it now has 5 slots).

Oh, almost forgot. You say you are using 2 PSUs. If you can, power on the PSU that ISN'T powering the motherboard FIRST, then the one that does SECOND. It's kind of worth a shot, I guess. I've forgotten to plug PCI-E power cords into a card before, and such cards won't detect, or they malfunction. If a card doesn't have proper power before the motherboard tries to detect it, that could be a problem too.
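Not from ssateneth's post: if the rig ever moves to Linux, the negotiated link width he describes is visible in `lspci -vv` output under `LnkSta` (versus the slot's capability under `LnkCap`). A small parser sketch, run here against invented sample output:

```python
import re

# Hypothetical excerpt of `lspci -vv`; on a real box you would capture
# the actual output, e.g.:  lspci -vv | grep -E "VGA|LnkSta"
SAMPLE_LSPCI = """\
01:00.0 VGA compatible controller: AMD Tahiti XT [Radeon HD 7970]
        LnkCap: Port #0, Speed 8GT/s, Width x16
        LnkSta: Speed 5GT/s, Width x1
02:00.0 VGA compatible controller: AMD Tahiti XT [Radeon HD 7970]
        LnkCap: Port #0, Speed 8GT/s, Width x16
        LnkSta: Speed 5GT/s, Width x16
"""

def negotiated_widths(lspci_text: str):
    """Map each VGA device address to its negotiated (LnkSta) link width."""
    widths = {}
    current = None
    for line in lspci_text.splitlines():
        m = re.match(r"^(\S+) VGA", line)
        if m:
            current = m.group(1)  # new device section begins
            continue
        m = re.search(r"LnkSta:.*Width (x\d+)", line)
        if m and current:
            widths[current] = m.group(1)
    return widths

print(negotiated_widths(SAMPLE_LSPCI))
```

A card on a 1x riser should report `Width x1` regardless of slot size, which is one way to confirm whether the board really is burning its lane budget on the two 16x-connected cards.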

NLA (Member, Activity: 88)
June 09, 2012, 04:08:48 AM  #19

Quote from: AzN1337c0d3r
> One last thing I remember is maybe playing with the PnP OS option if it exists: See this KB article.
Don't remember seeing this option in the BIOS, but will check again! Thanks for the heads-up!

Quote from: Clipse
> Do yourself a favour and configure it on a linux machine.
>
> Use kernel parameters to disable ACPI and self-assign IRQ's to each GPU (ACPI can stuffup horribly for some reason with so many GPU's)
I did attempt to set everything up in Linux initially (Ubuntu 12.04), but I remember having problems getting the drivers set up properly and having the cards show up as available for mining. I suppose I could sink some time into it and use Ubuntu 10.10 like I used to. Hm.

Also, if I set everything up in Linux, I doubt I could underclock the RAM as much as I do in Windows (thanks to a "bug" in the beta versions of Afterburner). I guess I could BIOS-edit, since the 7970's don't do hash checking like the 6XXX series and below did (not a lot of people know this, including the guy that makes RBE, sadly), but that's very uncharted territory at this point, and not many people have attempted it. Also, I've set the SATA type to AHCI in the BIOS as opposed to ACPI... is that fine? Or should I configure things under IDE?

Quote from: ssateneth
> Go in your BIOS and disable EVERY non-essential piece of hardware that you wont be using.
Did that. Disabled everything but USB, then disabled USB and gave that a try. Didn't help.

Quote from: ssateneth
> Also, using 16x > 16x risers may be causing a problem, because your board is trying to negotiate to 16x speeds. [...] Try using 1x risers across the whole shebang.
Seems reasonable. When all cards are plugged in via risers, I noticed the two cards that weren't available for mining were (coincidentally?) the two 16x-connected cards. Could I just cut the wires that make the cable a 16x cable, so that only enough wires remain for it to function as a 1x-to-16x, and use it that way? I'm willing to sacrifice a cable to make this all work! For science! ;)

Quote from: ssateneth
> Also make sure your risers are not defective.
The risers aren't defective; I unplugged two 1x-to-16x cables from two other cards, and after a reboot all of the remaining cards were recognized and functional (including the 16x cards, which were not functional before).

Quote from: ssateneth
> Powered shouldn't matter if you are doing underclock/volt.
But I'm not underclocking/undervolting; quite the opposite, I'm majorly overclocking and overvolting (1231 mV, 1270 MHz core).

Quote from: ssateneth
> Oh, almost forgot. You say you are using 2 PSU's. If you can, power on the PSU that ISN'T powering the motherboard FIRST, then the one that does SECOND.
Way ahead of you; I'm using this bad-boy: [image not preserved]
ssateneth (Legendary, Activity: 1288)
June 09, 2012, 05:55:30 AM  #20

If you want to convert a 16x riser into a 1x riser, yes, you could do it by cutting wires. Also make sure your motherboard plays nice: some boards require pin shorting to be able to use a 1x riser in certain slots (I sometimes had to short the presence-detect pins on the 1x and 16x slots of various boards).

Huge migraine right now; will try to help later.
