Bitcoin Forum
Author Topic: VirtualCL / 2x Quad HD 7990  (Read 8231 times)
gpufreak (OP)
Newbie
Activity: 42
Merit: 0
June 23, 2013, 01:50:31 PM
 #1

Hello all,

I really wanted to post this question in the hardware section to give it a bit more attention, but here is my story.

I'm a programmer who works heavily with the OpenCL framework, mainly on AMD/ATI GPUs. What started as a simple project keeps getting bigger and bigger, and thus needs more resources. I asked a computer shop to put together a heavier system with more GPUs to satisfy my GPU processing needs.

My first system was a single HD 7970 for testing purposes, which simply worked with no cooling issues at all, but it didn't have enough power.

Then I decided to step it up a bit and go for a quad 7970 setup, but I immediately ran into cooling issues. The shop tried extra fans and more spacing, but still no luck (heat built up in the chassis). They took the system back and built a 3x 7990 (XFX type) machine with more spacing, a bigger case, and even more fans, but after running all GPUs at 100% for 2-3 minutes, temperatures hit 90°C, which is where my own fail-safe kicks in and shuts down the calculations.

Now, the reason I'm posting here is that lots of people here have experience cooling their rigs and building heavy-duty machines ;-) I have been browsing and learning a lot the past few days, but I need some advice.

I really prefer a closed chassis just to keep things tidy and clean, but fans alone just don't seem to cool these things enough, so it's time to reconsider my entire setup and do it properly... I'm open to all suggestions.

My end goal is this: 2 worker machines, each with 4 dual-GPU 7990 cards, running in a VirtualCL cluster. That way I can fire off jobs from my build server and let those machines do the heavy-duty work in parallel. Now, VirtualCL requires roughly 90 Mbit/s per GPU. With 2 GPUs per card × 4 cards per machine × 2 machines = 16 GPUs, the cluster would consume roughly 1500 Mbit/s in total. Since most motherboards out there only do 1000 Mbit, I should consider reserving a PCI slot for a 10 Gbit network card, or probably an InfiniBand card.
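For what it's worth, that bandwidth budget can be sketched as a quick back-of-envelope calculation (the 90 Mbit/s per GPU figure is my own rough estimate for VirtualCL, not a measured value):

```python
# Back-of-envelope VCL bandwidth budget, using the figures from the post.
GPUS_PER_CARD = 2          # each 7990 is a dual-GPU card
CARDS_PER_MACHINE = 4
MACHINES = 2
MBIT_PER_GPU = 90          # rough VirtualCL requirement per GPU (estimate)

total_gpus = GPUS_PER_CARD * CARDS_PER_MACHINE * MACHINES
total_mbit = total_gpus * MBIT_PER_GPU

# Gigabit Ethernet tops out at 1000 Mbit/s, so the cluster link
# needs 10GbE or InfiniBand.
print(total_gpus, total_mbit)  # prints: 16 1440
```

So "roughly 1500 Mbit/s" is actually ~1440 Mbit/s before any protocol overhead, comfortably beyond what a single gigabit link can carry.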

The current base system looks like this: an ASRock X79 Extreme11 with an i7-3930K CPU, 16 GB of RAM, 3x XFX 7990, and a LEPA G1600 PSU.

My concern with adding a 4th XFX card is that it would draw more power than the G1600 can handle under full load; as I understand it, a single 7990 can eat up 375 watts. So four cards could already consume 1500 watts, leaving not much headroom for the mobo/CPU/RAM/HDD.

I think the motherboard is a great pick, and the i7-3930K does great with CPU-intensive operations. 16 GB of RAM is just fine. I just need to figure out how to build two of those systems, keep them cool, and give them enough power. I can draw 2x 16 amps from my outlets, so that should not be a problem; the PSUs might just need a dual setup. I'm not sure yet, since I haven't been able to get a good measurement of the actual draw of a single 7990; the 375 watts mentioned earlier is just from a spec sheet at factory default settings.
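As a rough sanity check on those numbers, here is a hedged power-budget sketch. The 375 W figure is the spec-sheet board power quoted above; the 250 W system overhead and the 230 V mains voltage are assumptions for illustration, not measurements:

```python
# Hedged power-budget sketch using spec-sheet TDPs from the thread.
TDP_7990_W = 375         # spec-sheet board power per 7990, stock settings
NUM_CARDS = 4
SYSTEM_OVERHEAD_W = 250  # assumed CPU/mobo/RAM/HDD allowance (a guess)
PSU_RATED_W = 1600       # LEPA G1600

gpu_load_w = TDP_7990_W * NUM_CARDS       # GPUs alone
total_w = gpu_load_w + SYSTEM_OVERHEAD_W  # whole system estimate

# Wall side: two 16 A outlets, assuming 230 V circuits.
wall_w = 2 * 16 * 230

print(gpu_load_w, total_w, total_w > PSU_RATED_W)  # prints: 1500 1750 True
```

Under these assumptions the GPUs alone nearly saturate one G1600 and the full system overshoots it, which is why a second PSU (or one PSU per pair of cards) looks necessary, while the wall circuits themselves have plenty of headroom.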

So please let me know your thoughts and considerations! :)
gpufreak (OP)
Newbie
Activity: 42
Merit: 0
June 24, 2013, 03:24:07 PM
 #2

Going for a Lian Li PC-D8000 case with a secondary PSU to address the power concerns and the 4th XFX 7990. Does anyone here have experience cooling such a config?
Remember remember the 5th of November
Legendary
Activity: 1862
Merit: 1011
June 24, 2013, 03:26:04 PM
 #3

Most people here use custom cooling solutions, however improvised they may be.

Also, while I understand your question is not Bitcoin-related, you may want to pursue ASIC development for your goals. An ASIC is a much more efficient device that does more work for far less power.

BTC:1AiCRMxgf1ptVQwx6hDuKMu4f7F27QmJC2
gpufreak (OP)
Newbie
Activity: 42
Merit: 0
June 24, 2013, 04:19:11 PM
 #4

Hi,

To my understanding, ASICs are application-specific and would thus require an ASIC designed for my specific goals. As my code involves highly non-standard calculations (no double SHA), a far cheaper route is to go with GPUs. My goal is to build a 32-GPU cluster: each node with 4x 7990 = 8 GPUs, so with 4 machines I'm all set.

Now only the cooling is an issue, which I think would be solved by using water cooling (chillers) instead of open-air systems and heavy A/C units.
Gomeler
Hero Member
Activity: 697
Merit: 500
June 24, 2013, 05:11:12 PM
 #5

Look into some of the open-air frames that have been built. I know gigavps sells his frame in the marketplace here. By giving the cards a few inches of air gap, you can run 100% load on the stock cooler. Waterblocks are fantastic for compute-dense configurations, but there are all sorts of maintenance issues around them. For example, I just discovered that a large number of my HD 5970 waterblocks have warped acrylic covers. A pain in the ass that could have resulted in a dead motherboard and possibly dead video cards.

For simplicity, and if you have a location where noise isn't a concern, go with open-air rigs. If it must be cased, then look into waterblocks.
tom_o
Sr. Member
Activity: 308
Merit: 250
June 24, 2013, 05:44:48 PM
Last edit: June 24, 2013, 05:55:47 PM by tom_o
 #6

Hi,

To my understanding, ASICs are application-specific and would thus require an ASIC designed for my specific goals. As my code involves highly non-standard calculations (no double SHA), a far cheaper route is to go with GPUs. My goal is to build a 32-GPU cluster: each node with 4x 7990 = 8 GPUs, so with 4 machines I'm all set.

Now only the cooling is an issue, which I think would be solved by using water cooling (chillers) instead of open-air systems and heavy A/C units.

You wouldn't need a water chiller; a much simpler design for 4x 7990 would be a triple Laing DDC and a dual/triple 480mm radiator setup. Then you don't have to worry about chiller control, etc.

1 x Thread Laing DDC Triple Acrylic Pump Top : Plexi
3 x Laing DDC-1T Pro Pump 440 L/hr 10W
2-4 x Swiftech MCR420XP Extreme Performance 480mm Quad Radiator
8-16 x Zalman ZM-F3 120 x 120 x 25mm 1800RPM Case Fan
4 x EK FC 7990 SE Full Cover Waterblock for AMD Malta Series Reference Design HD 7990 Graphics Card : Copper Acetal

Plus some tubing and some high-quality hose clamps. A T-line is much easier than using a reservoir, and it means there is less to spill if something goes wrong. I'd still recommend distilled water and some kind of anti-conductive additive though.

Although expensive, it should keep the cards at a delta of 10-20 degrees max over ambient air at full load, depending on how many radiators you decide to use.
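One point worth separating out: the pump flow rate mostly sets the coolant temperature rise across the loop, while the card-to-ambient delta is set by the radiators and fans. A rough sketch with assumed figures (1500 W from four 375 W cards, and one DDC-1T's nominal 440 L/h with no head-loss correction):

```python
# Rough coolant temperature-rise check for the proposed loop.
# All inputs are assumptions: spec-sheet heat load and nominal pump flow.
HEAT_LOAD_W = 1500      # 4 x 375 W cards at full load (spec-sheet figure)
FLOW_L_PER_H = 440      # nominal flow of one Laing DDC-1T, ignoring head loss
C_WATER = 4186          # specific heat of water, J/(kg*K)
RHO_KG_PER_L = 1.0      # density of water

flow_kg_s = FLOW_L_PER_H * RHO_KG_PER_L / 3600
delta_t = HEAT_LOAD_W / (flow_kg_s * C_WATER)  # coolant rise across the loop

print(round(delta_t, 1))  # prints: 2.9
```

So even one pump at nominal flow keeps the water itself within ~3 K around the loop; the 10-20 degree card-to-ambient delta quoted above is dominated by how much radiator area and airflow you give it, which is why adding radiators helps and adding pumps mostly just adds redundancy.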
Gomeler
Hero Member
Activity: 697
Merit: 500
June 24, 2013, 05:53:09 PM
 #7

Hi,

To my understanding, ASICs are application-specific and would thus require an ASIC designed for my specific goals. As my code involves highly non-standard calculations (no double SHA), a far cheaper route is to go with GPUs. My goal is to build a 32-GPU cluster: each node with 4x 7990 = 8 GPUs, so with 4 machines I'm all set.

Now only the cooling is an issue, which I think would be solved by using water cooling (chillers) instead of open-air systems and heavy A/C units.

You wouldn't need a water chiller; a much simpler design for 4x 7990 would be a triple Laing DDC and a dual/triple 480mm radiator setup. Then you don't have to worry about chiller control, etc.

1 x Thread Laing DDC Triple Acrylic Pump Top : Plexi
3 x Laing DDC-1T Pro Pump 440 L/hr 10W
2-4 x Swiftech MCR420XP Extreme Performance 480mm Quad Radiator
8-16 x Zalman ZM-F3 120 x 120 x 25mm 1800RPM Case Fan

plus the waterblocks for your graphics cards, some tubing, and some high-quality hose clamps. A T-line is much easier than using a reservoir, and it means there is less to spill if something goes wrong. I'd still recommend distilled water and some kind of anti-conductive additive though.

Although expensive, it should keep the cards at a delta of 10-20 degrees max over ambient air at full load, depending on how many radiators you decide to use.

Get a silver kill coil and distilled water to handle any biocide requirements. What is an anti-conductive additive? Short of using something like Fluorinert, I don't think there is a way to prevent the water from ionizing, given the constant contact with copper.
tom_o
Sr. Member
Activity: 308
Merit: 250
June 24, 2013, 06:00:06 PM
 #8


Get a silver kill coil and distilled water to handle any biocide requirements. What is an anti-conductive additive? Short of using something like fluorinert I don't think there is a way to prevent the water from ionizing given the constant contact with copper.

I was posting from memory, but I think I actually meant a non-conductive additive (as in, it doesn't suddenly turn the water conductive). I once filled up a loop and the push-fit fittings were dodgy; I ended up soaking a 6800 Ultra + motherboard, but they both survived! Can't remember exactly what it was; it was blue, if that helps :P

Btw, a chiller to handle that sort of load would be ridiculously expensive. The good thing about a water loop is that if the temps are too high on the first try, you can just keep adding radiators or more powerful fans until the cooling is sufficient. You can also use it for underfloor air heating with rigs like that ;)
gpufreak (OP)
Newbie
Activity: 42
Merit: 0
June 24, 2013, 06:52:42 PM
 #9

Looks like water cooling is the way to go. I expect the cards and chassis to come in sometime next week. I got a very detailed e-mail from EK about their products, which also mentions the full-cover 7990 blocks, so I'll look into that and puzzle it together so that the cards, chassis, and hopefully the water cooling products arrive at the same time.

I also put in a secondary PSU to make sure I don't run into power issues when the four 7990s are running at max speed.

Regarding the liquids to use: I see they have copper and nickel plates. Any advice on what to use in combination with which liquid?
tom_o
Sr. Member
Activity: 308
Merit: 250
June 24, 2013, 07:01:59 PM
 #10

Looks like water cooling is the way to go. I expect the cards and chassis to come in sometime next week. I got a very detailed e-mail from EK about their products, which also mentions the full-cover 7990 blocks, so I'll look into that and puzzle it together so that the cards, chassis, and hopefully the water cooling products arrive at the same time.

I also put in a secondary PSU to make sure I don't run into power issues when the four 7990s are running at max speed.

Regarding the liquids to use: I see they have copper and nickel plates. Any advice on what to use in combination with which liquid?

Distilled water all the way; some people add small amounts of 'wetter', which reduces the surface tension of the water. Not sure about the benefits of nickel plating, but mixing metals isn't the best idea when it's alu/copper, because you essentially turn the loop into a battery! Not sure what the consensus is on mixing nickel/copper (pretty much all radiators are copper). A secondary PSU is a very good idea too!
YokoToriyama
Newbie
Activity: 58
Merit: 0
June 24, 2013, 09:11:41 PM
 #11

Try this guy on YouTube, he's really good at cooling:

http://www.youtube.com/watch?v=xnQASvYeRtE&feature=c4-overview-vl&list=PLWYPRHLWSBL4B04Ab1v3TNsj1akqyW_-E

SingularityComputers
Gomeler
Hero Member
Activity: 697
Merit: 500
June 25, 2013, 12:32:39 AM
 #12

Nickel/copper is fine. Nickel just doesn't tarnish like copper does, so your blocks won't turn brown. I have been running a mixed nickel/copper loop with a silver kill coil since ~2011 with no problems. I'm just running distilled water and shitty Home Depot 1/2" tubing. Get quality barbs and use zip ties or hose clamps, and you won't ever have problems with hoses stretching and leaking. I have been using Bitspower's 1/2" barbs with great success.
uuidman
Full Member
Activity: 121
Merit: 100
July 05, 2013, 04:14:06 PM
 #13

Looks like water cooling is the way to go. I expect the cards and chassis to come in sometime next week. I got a very detailed e-mail from EK about their products, which also mentions the full-cover 7990 blocks, so I'll look into that and puzzle it together so that the cards, chassis, and hopefully the water cooling products arrive at the same time.

I also put in a secondary PSU to make sure I don't run into power issues when the four 7990s are running at max speed.

Regarding the liquids to use: I see they have copper and nickel plates. Any advice on what to use in combination with which liquid?
Just a heads-up on the second PSU: be a little careful if you run number 2 outside the loop (by that I mean the PSUs don't start/stop together). For example, if PSU 1 stops, the whole system goes down and everything is OK. But if PSU 2 stops while it's only powering the cards, they will try to pull as much as they can from PSU 1 and from the motherboard. In my test case the cards' fans started spinning weirdly and the whole thing didn't feel right. I never checked how the mining changed; I felt a need to end the test rather quickly. This was with a couple of 6950s; with even more powerful cards there can be side effects. I would say that 3-4 dual-GPU cards makes for an even bumpier ride.
uuidman
Full Member
Activity: 121
Merit: 100
July 05, 2013, 04:20:23 PM
 #14

May I ask what kind of OpenCL computations you are doing? I think it could be better with 3x3 GPUs, if that doesn't hurt VirtualCL. The worker machines can be configured automatically; maybe you are already doing that.
Bitweasil
Sr. Member
Activity: 420
Merit: 250
July 05, 2013, 10:27:37 PM
 #15

I sent you a PM with some info on solutions a friend and I have available.

Looks like water cooling is the way to go. I expect the cards and chassis to come in sometime next week. I got a very detailed e-mail from EK about their products, which also mentions the full-cover 7990 blocks, so I'll look into that and puzzle it together so that the cards, chassis, and hopefully the water cooling products arrive at the same time.

I entirely disagree: a home-built water cooling system for production work is asking for trouble. I have known many people who have tried water cooling GPUs in the password cracking realm, and all of them have gone back to air cooling due to reliability issues. The 7970s are great cards with excellent coolers on them, provided they're in a chassis that can handle their airflow and power requirements. That's hard to do with standard builds, but we have a few options that fit this need perfectly (24/7/365 sustained high load).

Need high quality, rack mountable GPU clusters for OpenCL work or password auditing?  http://www.stricture-group.com/
sQueeZer
Sr. Member
Activity: 312
Merit: 251
July 06, 2013, 12:49:08 AM
 #16

The thread starter's needs are not realistic with air cooling.

A closed case + 6/8 GPUs in one machine can't work. You need to get that massive amount of heat out of the case. No way without water cooling.

For air cooling: open milk crates!
Bitweasil
Sr. Member
Activity: 420
Merit: 250
July 06, 2013, 01:17:40 AM
 #17

The thread starter's needs are not realistic with air cooling.

A closed case + 6/8 GPUs in one machine can't work. You need to get that massive amount of heat out of the case. No way without water cooling.

For air cooling: open milk crates!

I beg to differ. We have a number of 8-GPU systems deployed right now, on air cooling, that work just fine. They are designed for rack deployment, as they need rather more power than a 120 V 15 A circuit can provide, so we run them on 240 V most of the time.

We've got a box with six 7970s running fine (it had 8 previously, but we needed a few for some other testing and a slot for the InfiniBand controller), and we have a number of boxes with 4x 7990s running quite comfortably on air.

They're a bit pricier than "milk crate computers," but they're also supported with a warranty, you can put them in a data center (try racking up milk crate systems), and they're designed for 24/7/365 operation.


sQueeZer
Sr. Member
Activity: 312
Merit: 251
July 06, 2013, 11:53:46 AM
 #18

Can you show some pictures of those machines?
Bitweasil
Sr. Member
Activity: 420
Merit: 250
July 06, 2013, 04:25:08 PM
 #19

Here's an early iteration:



This system went through several iterations, eventually ending up without fans on the GPUs, because the case fans provided enough airflow to keep them cool, and removing the fan shrouds reduced flow resistance enough that the cards didn't need their own fans. It also went down to fewer cards, because we needed space for the InfiniBand controller (for our purposes, we found that while VCL's bandwidth use isn't that much higher than gigabit can handle, the latency reduction is well worth the cost of the IB cards/switches).

We've since moved beyond that design to a quad-7990 design with better cooling (we don't have any full photos of it, so I'm linking a build photo). This lets us fit the full 8 GPUs supported by the driver while still keeping space for the IB controller. Cooling is also better with the gaps between the cards, and we don't have to physically modify them.



Now, if your desired build cost is "hang cards off some scrap aluminum in a milk crate with a used mainboard you got off Craigslist," these aren't the systems for you. They're not aimed at Bitcoin mining, but I get the impression the OP isn't doing Bitcoin mining. For a professionally done VCL cluster that works reliably and can be run in a data center, we've got a number of options. We also have some systems with fewer GPUs that are a bit less expensive if you want to start smaller, and we typically also quote a dedicated cluster controller for the systems. Again, we have a lot of experience with VCL, and this is what we've found works best.

Gabi
Legendary
Activity: 1148
Merit: 1008
July 06, 2013, 09:31:56 PM
 #20

Most people here use custom cooling solutions, however improvised they may be.

Also, while I understand your question is not Bitcoin-related, you may want to pursue ASIC development for your goals. An ASIC is a much more efficient device that does more work for far less power.
Are you trolling or what?
