Author Topic: Newbie trying to set up a 6x7950 rig  (Read 13834 times)
thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 05, 2014, 12:37:35 AM
#1

Hi,

I have got everything needed to operate a 6x7950 rig, with powered 1x to 16x riser cables. I have two different setups:

- one with Win 8 pro loaded on an external HD
- one with Xubuntu on a USB drive

and right now I can't get either of them to detect my GPUs (currently only 2 are connected). I'm not sure if I've connected them properly: I plugged 2 6-pin power cables into each of them but not the molex connector of the riser.

I have the following doubts at this moment:
- How am I supposed to power 6 GPUs with the 2x750W Corsair PSUs if each of them only has four 6-pin PCI-e outputs? Can a card work with only one of these?
- I guess I need a PCI-e to molex adapter to power the risers. In that case, can I disconnect one or both of the PCI-e power connectors of the GPU?
- Do powered risers also work in non-powered mode?

Sorry if some of these questions are a bit silly but I'm still digesting all the information that's out there and can't see the light.
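
For the Xubuntu setup, this is how I've been checking whether the board even enumerates the cards on the PCI-e bus, before worrying about drivers. It's a minimal sketch, assuming lspci (pciutils) is installed, which it is by default on Xubuntu:

Code:
# Minimal sketch: count the display devices the PCI bus enumerates,
# independent of any AMD driver.  Assumes lspci (pciutils) is available.
import subprocess

def detected_gpus():
    out = subprocess.check_output(["lspci"]).decode()
    # AMD/ATI cards appear as "VGA compatible controller" or "Display controller".
    return [line for line in out.splitlines()
            if "VGA compatible controller" in line or "Display controller" in line]

if __name__ == "__main__":
    gpus = detected_gpus()
    print("%d display device(s) enumerated:" % len(gpus))
    for line in gpus:
        print("  " + line)

If a card that is seated and powered still doesn't show up here, the problem is upstream of the OS (riser, slot or power), not a driver issue.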

Cheers
repairguy
Sr. Member
Activity: 252
Merit: 250
January 05, 2014, 12:48:58 PM
#2

The cards will not start up if you do not connect power to the molex connector on the riser.  You must connect to every power connector on the cards, using splitters or adapters.

Why on earth would you use a PCI-e to molex adapter to power the riser? Huh? Just use the molex connectors on the power supply.
thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 05, 2014, 06:59:47 PM
#3

Quote from: repairguy on January 05, 2014, 12:48:58 PM

Thanks, so I understand I need to get some splitters to power all cards. The PSU doesn't have any molex connectors.
thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 05, 2014, 08:24:28 PM
#4

Quote from: thehun on January 05, 2014, 06:59:47 PM


Ok I think I have just answered myself and found the molex connectors, which simply look different to what I am used to seeing  Tongue




xhomerx10
Legendary
Activity: 3850
Merit: 8171
January 05, 2014, 09:51:42 PM
#5

Quote from: thehun on January 05, 2014, 08:24:28 PM



Hun,

I am failing to see the difference...
You can also use the 8-pin connectors, which might be a 6+2 config and come apart, OR you will have to cut two pins off of one side - make sure you cut off the right ones.
Exactly which PSUs do you have?  Corsair 750 what?
You might need some of these: [linked image not preserved]

versterk
Member
Activity: 93
Merit: 10
January 05, 2014, 10:09:15 PM
#6

2x750W PSUs won't be enough for 6 cards.

I got many issues due to power loss on the GPUs.

A rig with a Z77 board + 3x7970 works correctly (my 4 rigs are like this) with 1x750W for 2 GPUs and 1x650W for the board + 1 GPU; total power at the wall is near 1100W at full throttle (748 kH/s per GPU).

Hope this helps.
xhomerx10
Legendary
Activity: 3850
Merit: 8171
January 05, 2014, 10:21:19 PM
#7

Quote from: versterk on January 05, 2014, 10:09:15 PM

 This is true.  I have a 7950 and a 7970 running full out with voltages tweaked as low as possible and I'm at slightly over 600 watts at the wall.
So start with 4 cards on the two PSUs and if you can, check the power usage with a kill-a-watt device.  Do you know how to jumper the pins to start the PSU that isn't connected to a motherboard?
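
If it helps with the planning, here is a very rough budget check. The per-card and overhead wattages are assumptions extrapolated from the figures versterk and I quoted above, not measurements of your exact cards:

Code:
# Rough power-budget sketch for the 6x7950 plan on 2x750W PSUs.
# PER_CARD_W and OVERHEAD_W are assumptions based on the wall readings
# quoted in this thread, not measured values for these exact cards.
PER_CARD_W  = 250          # assumed wall draw per tuned 7950 while mining
OVERHEAD_W  = 150          # assumed board + CPU + RAM + drives + fans
PSU_RATED_W = [750, 750]
SAFE_LOAD   = 0.8          # keep sustained load under ~80% of the rating

def check(num_cards):
    need   = num_cards * PER_CARD_W + OVERHEAD_W
    usable = sum(w * SAFE_LOAD for w in PSU_RATED_W)
    return need, usable

for n in (4, 6):
    need, usable = check(n)
    verdict = "should fit" if need <= usable else "not enough headroom"
    print("%d cards: need ~%d W, have ~%d W usable -> %s" % (n, need, int(usable), verdict))

With those assumptions, four cards are already close to the limit of the two CX750s, and six would need either a third supply or bigger units.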

thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 06, 2014, 12:14:48 AM
Last edit: January 06, 2014, 12:28:40 AM by thehun
#8

Thanks, I'm starting to see the light now. I'll be quite happy for the moment if I can get 4 GPUs to work; only then will I start investigating how to use all 6 Smiley

I have Corsair CX750s (the cheapest category I think).

The only thing which ATM is not 100% clear is how I have to jumper the pins for the second PSU. I have seen there is a device called Add2PSU which solves this; do you think it's not necessary?

Edit:
I have found this old thread https://bitcointalk.org/index.php?topic=31357.0

with this picture [image not preserved]



Quote
If you want it to turn on with the mobo connected supply, wire the same green PS_ON pin from the mobo supply to your second supply's PS_ON pin, along with one of the adjacent COM (ground) pins.  This is how the cablesaurus adapter works.

If I could get an adapter delivered here fast enough (which isn't the case) I wouldn't mind paying for it to have it done the "clean" way, but I guess I'll have to make this work the "dirty" way
Bitcoinorama
Hero Member
Activity: 532
Merit: 500
January 06, 2014, 03:05:00 AM
#9

Quote from: thehun on January 06, 2014, 12:14:48 AM

On the top of the larger ATX black plastic male connector (where the clippy bit is), pins 4 & 5 are what you want to push a paperclip into.

Then apply some electrical tape around it to hold it in place.
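
For reference, this is what those two pins are on a standard 24-pin ATX main connector, counting along the latch side from the pin-13 end (my assumption about how the "pins 4 & 5" count is meant). PS_ON is active-low, so the paperclip just ties the green wire to an adjacent ground - never bridge it to +5 or +3.3:

Code:
# Latch-side pins of a standard 24-pin ATX main connector, numbered here
# from the pin-13 end.  Position 4 is the green PS_ON# wire (ATX pin 16);
# shorting it to any adjacent black COM/ground pin switches the PSU on.
LATCH_SIDE = {
    1: ("ATX pin 13", "+3.3V",  "orange"),
    2: ("ATX pin 14", "-12V",   "blue"),
    3: ("ATX pin 15", "COM",    "black"),
    4: ("ATX pin 16", "PS_ON#", "green"),
    5: ("ATX pin 17", "COM",    "black"),
    6: ("ATX pin 18", "COM",    "black"),
}

def jumper_turns_psu_on(a, b):
    """True if bridging these two latch-side positions will start the supply."""
    signals = {LATCH_SIDE[a][1], LATCH_SIDE[b][1]}
    return signals == {"PS_ON#", "COM"}

print(jumper_turns_psu_on(4, 5))   # True  - the paperclip trick described above
print(jumper_turns_psu_on(3, 4))   # True  - the ground on the other side works too
print(jumper_turns_psu_on(1, 4))   # False - never short PS_ON# to a voltage rail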

thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 06, 2014, 04:51:12 PM
#10

Quote from: Bitcoinorama on January 06, 2014, 03:05:00 AM

I'm having trouble keeping the clip in place as the second connector is in the air (it is all quite loose and the contact is weak). Is there maybe a way to short two pins to keep the second PSU in "always on" mode while I wait for my Add2PSU to arrive? Maybe pin PS_ON to +5 or +3.3?
thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 06, 2014, 08:59:32 PM
#11

Ok, after reading a bit I have shorted pins 4 and 5 and measured the voltages with a multimeter. Everything is plugged in; I try to start the system with 4 7950s (one plugged into the MB and 3 on risers) but nothing happens. The screen stays blank and the CPU cycle LEDs of the Z77A board are off (except the first two). The digits on the debug LCD move between "30" and "33".
thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 07, 2014, 01:03:11 AM
#12

Quote from: thehun on January 06, 2014, 08:59:32 PM

Answering myself again... it turns out multi-monitor support was disabled on my board, and coincidentally the wifi also got misconfigured (no remote access), so it was just the VGA output getting disabled every time I plugged in a GPU.

Now all I have to solve is why the GPUs connected to PSU #2 with the jumpered pins aren't being detected by Windows (even though they are powered, with fans spinning). Hopefully I can figure it out before my wife decides to move out (I hoped to set this up a lot faster and now I've been completely absent for the last few days with this).  Embarrassed
winner999
Newbie
Activity: 47
Merit: 0
January 07, 2014, 05:18:13 PM
#13

With 6 7950 cards, you should go with 2 rigs, both with 3x7950.
For each rig you should get a decent PSU, somewhere in the 800W range (not less than 750W) with a single 12V rail design.

The price difference between 2x750W PSUs and 1x1500W PSU (for 6x7950) is more than enough to buy a second motherboard, CPU, HDD and 4GB of RAM.

You will avoid any potential Windows problems when running 6 GPUs on one system (I haven't seen a single system with more than 5 GPUs), and you will have a couple of inches between cards, unlike with 6 cards where you can't get more than 1". That way you will have better cooling and a quieter system.
repairguy
Sr. Member
Activity: 252
Merit: 250
January 08, 2014, 02:32:41 AM
#14

Quote from: winner999 on January 07, 2014, 05:18:13 PM

Not entirely correct.  You could just use 2x750s for a single six-card unit. I run six GPUs with Win7 and no issues. However, the spacing is a valid point: to use 6 GPUs you have to have a custom case.
thehun (OP)
Legendary
Activity: 1212
Merit: 1037
January 08, 2014, 03:05:55 PM
#15

I have managed to get 4 cards to mine. It turns out that if there is one card plugged directly into an x16 slot, you won't be able to use riser-mounted cards in any other x16 slot, so I had to move them to the 1x slots.

I will try to borrow a kill-a-watt today and see how much power I'm drawing. However, overheating is definitely an issue. One of the cards (the one on the side) heats up to 86°C if I leave the room closed and to 80°C when open (but then we hear the noise in the whole flat, which makes my wife rather unhappy), with the fan running at 95%. I have to find a solution, especially if I want to add more cards.
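
Before adding more cards I want to try letting the miner itself manage fan speed and temperature targets. A sketch of the launch I have in mind - the flag names are from the cgminer 3.x GPU documentation as far as I can tell (verify with cgminer --help), and the pool URL and worker credentials are placeholders:

Code:
# Sketch of a cgminer launch with explicit temperature management, assembled
# as an argv list.  Flag names assume a cgminer 3.x build with GPU/scrypt
# support; the pool URL and credentials below are placeholders, not real values.
import subprocess

cmd = [
    "cgminer", "--scrypt",
    "-o", "stratum+tcp://pool.example.com:3333",   # placeholder pool
    "-u", "worker", "-p", "x",                     # placeholder credentials
    "--auto-fan",             # let cgminer drive the fans toward the target
    "--gpu-fan", "40-85",     # cap fan speed below the 95% it sits at now
    "--temp-target", "75",    # aim for 75C
    "--temp-overheat", "83",  # back off clocks above 83C
    "--temp-cutoff", "90",    # shut the card down above 90C
]
subprocess.call(cmd)

Capping the fans trades a little hashrate (the hot card will throttle sooner), but it should keep the noise and the 86°C peaks in check while I sort out airflow.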

drdops
Full Member
Activity: 144
Merit: 100
January 09, 2014, 02:22:54 PM
#16

Quote from: repairguy on January 08, 2014, 02:32:41 AM

That's interesting...
I just got 15x Sapphire R9 290 Tri-X and was going to do 5 rigs with Win 7. Now I think I'll try 3 rigs with 5 cards.
I already have 3x Gigabyte 990XA-UD3 AMD 990X (Socket AM3+) DDR3 motherboards and 3x Silverstone SST-ST1200-G Strider Evolution Gold Series 1200 Watt PSUs.

Could I run 5 cards on this mobo in Win 7? How much extra power would I need for each rig? Would I run 3 or 4 cards off the 1200W?
Could you tell me exactly what else I need to make it work, i.e. specific powered risers, things to connect the PSUs, etc.? (It's my first build, so I will need it spelled out.)

Thanks
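
Here's my rough math so far - the 250 W per-card figure is just a guess from what I've read about the 290 under load, so please correct it if it's off:

Code:
# Rough per-rig budget for R9 290 cards on a single 1200W PSU.
# PER_CARD_W and OVERHEAD_W are guesses, not measurements.
PER_CARD_W = 250      # assumed wall draw per R9 290 while mining
OVERHEAD_W = 100      # assumed board + CPU + RAM + fans
PSU_W      = 1200
SAFE_LOAD  = 0.8      # keep sustained draw under ~80% of the rating

for cards in range(1, 6):
    draw = cards * PER_CARD_W + OVERHEAD_W
    verdict = "fits" if draw <= PSU_W * SAFE_LOAD else "needs more PSU"
    print("%d cards: ~%d W -> %s" % (cards, draw, verdict))

By that guess, 3 cards per 1200W unit is comfortable and the other 2 per rig would need a second supply or an Add2PSU-style link.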
xhomerx10
Legendary
Activity: 3850
Merit: 8171
January 09, 2014, 05:07:04 PM
#17

Quote from: thehun on January 08, 2014, 03:05:55 PM


 Wow.  Good job!  You taught me something too - I had no idea about the X16 slot.
This will add a bit to your overall cost but you might try water cooling the GPUs.  For example - http://www.swiftech.com/KOMODO-HD7900.aspx
It might be worth the extra if your wife is pissed.

glendall
Legendary
Activity: 2100
Merit: 1018
January 10, 2014, 04:54:57 AM
#18

OP: you really should save yourself a big headache and simply buy another motherboard.  You could get a suitable used 3x-PCIe mobo, CPU, and minimal DDR2 RAM for probably 50 bucks. I don't see why you'd want to go the 6x route. It's plausible, and good on you if you want to do it, but man, two 3-video-card PCs would be so much easier.

repairguy
Sr. Member
Activity: 252
Merit: 250
January 10, 2014, 09:28:24 AM
#19

Quote from: catfish
OK just spent 45 minutes writing a reply on a Nexus 7 and the browser / forum lost it.  Angry

Anyway this may have been done to death, but I've got a question. What is going on with people building GPU mining rigs?

Is it the altruism - helping maintain the distributed safety of the network?
Is it the sheer tech geek fun factor - making mad machines for a laugh?
Is it for 'alternative' uses - from nice good Bitcoin mining to password bruteforcing / rainbow table / spook type stuff?
Is it because something has *fundamentally changed* to make GPU mining worthwhile recently???

OK, I'll front up - I've been around a while but was pretty much out of the picture in 2013 as I lost everything - health, job, family, home, assets, the whole shooting match. But this ain't a catfish sob story. This is my first return to the forum and for good reason...

As a few old timers will remember, I made a fair few insane home-made mining rigs, starting with lethal wooden contraptions with a couple of GPUs, onto bigger aluminium multi-GPUs-on-risers frames, and then to my silly 'shelf rigs' with three logic boards on each, all three with 4 overclocked GPUs on risers. I built a couple of those until the 1800W+ power consumption 24/7 at UK residential electricity rates, along with the insane noise and heat, really annoyed my girlfriend (we lasted 18 years, and Bitcoin was not the reason for it all ending...).

Actually, the whole big GPU rig paradigm started making no economic sense back then in 2012. I was barely breaking even, and that was ignoring the cost of dying equipment. I didn't overclock to insane levels (I underclocked RAM and overclocked core, plus volt tweaks like many others at the time), but 24/7 heavy lifting eventually ate Radeons and PSUs.

So I moved onto FPGAs, and even though our email styles (i.e. the owner and me!) didn't get on that well, I stuck with Ztex kit as I thought the quality was superb, and submitted a Mac OS X build for his then driver suite. I ended up with a few variations on 20 to 25 unit boards, all with crazy lights etc. and entertained at least a few on the 'show us your mining rigs' thread Smiley Sadly, all the photos are hosted on my servers, and having lost my home and the static IP the servers lived on, you'll only find the text on the threads since I've no way of re-hosting the servers until I get a new place of my own and some new money...

Even so, commercial concerns and ASICs pushed difficulty to new heights, so the private miner really couldn't make even 'hobbyist' money without very large capital investments.

So what has changed? Yes, I'm aware of the large price rise in the BTC - and it's surprised me even though I'm fully aware of the deflationary econometric paradigm. But the ASIC-led difficulty must have made mining profitability almost pointless for anything other than those right in the inner circle, or those making a commercial investment (and a big one).

Can anyone clue me up?


Anyway, another reason the catfish is back. Around a year ago, when the FPGAs were barely breaking even, I ordered one BFL ASIC (a small Jalapeño) with money I really didn't have. Well, that year passed, and I've read enough about BFL, so along with the other heavy events of 2013, I wrote that particular 'investment' off long ago.

Well I received an import tax demand last week - something from the USA. I haven't bought anything for a very long time. Turns out it was my BFL Jalapeño.... of course BFL are now selling 600TH devices so my little box is already well out of date, so I'll have to work out whether power consumption / likely income is worth my while. And I've also read the horror stories of non-USA-120V power supplies for these units so will be using a proper earthed UK PSU before firing them up...

So the catfish is back Bitcoin mining! Woo hoo Smiley


But if anyone could clue me up as to why the hell anyone is still building GPU rigs, I'd be very interested. Are today's GPUs as fast as ASICs? Surely that's mad? And they surely still run hot and eat power?

All I know is that I'm better off selling my 2011 Casascius error-print physical bitcoins somehow... only three were ever made with the prefix 1Pi so any mathematical obsessives out there may be interested in rare coins!!! (sorry, wasn't meaning a sales pitch, I've got all three of them).



Anyway apologies for the massive off-topic outburst - first time back for the catfish Smiley As to the OP's question - the largest number of GPUs I managed to run from a single logic board was 5. This logic board has 7 PCIe slots, and I was using mostly x16->x1 risers with a couple of x16->x16 risers. This is in the old days - cards were 5850s and 6950s, flashed, OCd/UCd as necessary, average performance 380-420MH per card. I needed two PSUs for that particular rig.

And the sheer unpredictable behaviour of the 5-GPU rig led me to my standard design, which used cheap 4-slot (1x16, 3x1) logic boards, 4 GPUs per logic board, as a modular design. Reliable, and the odd crash (I used a Linux build, scripted and posted here IIRC) only took out one module rather than bringing down a huge rig with loads of GPUs.

IIRC there was a monster project with a special server logic board that aimed for 8 top-end GPUs on the one board. Never found out if that actually ended up working or not - don't think the BIOS manufacturers ever expected anyone pulling 60W through *every* PCIe bus (even if the PCIe power cabling should have supplied the majority of the power)....

Welcome back, sorry to hear of your misfortune.  Other alternative coins have come about that use a different proof-of-work algorithm, scrypt.  A scrypt ASIC or FPGA has yet to be developed, so GPUs are making a comeback.  (It also adds cool points to brute-force a 12-digit Windows password in 3 hrs.)

The 8-GPU setup never materialized; I have seen the board, but never heard of one working.  The PCI-e risers now use a connector for external power, not drawn from the motherboard (when needed).  From what I read, most people use 5 GPUs per motherboard.

Also you kind of hijacked this thread Sad
repairguy
Sr. Member
Activity: 252
Merit: 250
January 10, 2014, 10:40:01 AM
#20

OS X is vulnerable; I am not sure about Linux.