Bitcoin Forum
Author Topic: TriFire water cooling (was: Squeezing 2-slot cards into a single slot)  (Read 9740 times)
cicada (Full Member, Activity: 196)
November 07, 2011, 04:25:58 PM  #1

This is a (mostly) theoretical question, as doing the things I'm about to suggest will certainly void the warranty, and probably isn't worth it.  Basically, do not do this Wink  This isn't exactly specific to mining, but since miners are generally the only people crazy enough to try such things, I'm asking here.

I've recently decided to steal a couple 6950's from my mining rigs to build myself a sweet, sweet crossfire gaming rig.  It is truly awesome and all that jazz, however it's loud as all get out and hotter than hell, even with an open slot between my cards.  To alleviate those issues, I'm turning to water cooling.

Now, the motherboard I'm using (Gigabyte 990FX UD3) has 4 16x length slots, two at 16x and two at 8x.  It occurred to me while researching waterblocks that my two-slot cards would effectively become 1 slot cards, and I thought 'hey, I could squeeze 4 cards onto this board!'

Reality and disappointment struck soon after when I realized my cards have two DVI headers stacked on top of each other, as such:



The header is a solid-block rather than a ribbon, so the card still occupies two PCI slots even without the monster HSF.

Now, the crazy question - is it possible to desolder the DVI connectors entirely, removing this restriction?

Leave all the 'this is stupid, use extenders' or 'not worthwhile for mining', 'run it open air', or 'zomg wtf over' type responses at the door, I'm merely asking about the possibility and ramifications, not whether it's sane or worth my time or monies.

The little EE experience I have tells me the DVI pins are each just connected to individual traces on the board, and 'nothing plugged in' is basically the same as 'no header connected at all' - is this accurate?

As for mining applicability, this would allow squeezing 4 2-slot cards into a single motherboard, well-cooled, inside a case, which obviously has its appeal Wink

Team Epic!

All your bitcoin are belong to 19mScWkZxACv215AN1wosNNQ54pCQi3iB7
ElectricMucus (Legendary, Activity: 1540, "Drama Junkie")
November 07, 2011, 04:34:43 PM  #2

If you are not planning on water cooling, I can't see how this would work in that little space. But considering you are practically removing any resale value from the card, I don't think this is a good idea.

There are some water cooled 6990s around which I think could be single slot.
But again: Don't do it.

As for your question, yeah it could work.

First they ignore you, then they laugh at you, then they keep laughing, then they start choking on their laughter, and then they go and catch their breath. Then they start laughing even more.
cicada (Full Member, Activity: 196)
November 07, 2011, 04:40:05 PM  #3

I didn't think it was that TL;DR, but I did say I'm turning to water cooling Wink

My warranty is already technically voided since I've semi-permanently shorted the write-protect pins on the BIOS, so that's not of much concern.  Neither is resale value - this is now a 'gaming' rig, so I don't plan on getting rid of them anytime soon.  A virtually silent 1.4 GH/s enclosed rig that also heats my office would just be a nice bonus.

ElectricMucus (Legendary, Activity: 1540, "Drama Junkie")
November 07, 2011, 04:44:47 PM  #4

You didn't write how you are planning to cool it.

There are some 5770s and 6770s with passive heatsinks; I use one in my desktop and it is pretty quiet.
And those have huge heatsinks - are you planning to adapt your cards to something like that?

DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 07, 2011, 04:51:47 PM  #5

Not sure why you need them to be single slot - how many do you plan on installing?

If you want to watercool your best bet is to sell the 6950s for 6990s or 5970s.  

1) They have a different slot backplane, and some watercooling shops sell single slot versions.  You simply remove the DVI nuts (where the DVI plug screws in) and two small screws at the end of the PCB (you can see them at the top in the photo).  Then you do the reverse to install the single slot version.


2) Waterblocks are expensive and you will want full cover waterblocks (which also cool the VRMs).  Given the cost, getting 2 GPUs per waterblock is more economical.

3) With 5970s or 6990s you likely don't need to make them single slot.  The ability to put them back to back, 2 slots each, means you can fit up to 4 on one motherboard (8 GPUs total).  Remember 8 GPUs is going to be >1 kW, and it takes a significant radiator to cool that much.

4) Windows drivers limit you to 8 GPUs.  CrossFire is limited to 8 GPUs.

So the hardcore solution is simply 4x 5970s/6990s watercooled, each taking up 2 slots.  If you want to do it w/ 4x 6950 you can, however it is a lot of expense for roughly half the hashing power.

In either case you just need:
1) a case w/ 8 slots.  They do make them.  The 8th slot is sometimes called 7+1 because it allows the 7th MB slot to fit a double-width card.

2) a motherboard with 7 expansion slots

3) a motherboard layout like this:
Slot 1 - PCIe 16X
Slot 2 - anything
Slot 3 - PCIe 16X
Slot 4 - anything
Slot 5 - PCIe 16X
Slot 6 - anything
Slot 7 - PCIe 16X

If you only want the power of 4x 6950, I would sell them on eBay and buy a pair of 5970s, which you can water cool and fit into just about any motherboard.

Really, single slot watercooling only makes sense if you want to use >8 GPUs.  I have considered putting 7x 5970 into a single workstation, all watercooled, but the radiator would need to be massive, likely 9x 120mm in size, and it would only work in Linux.
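The radiator sizing above can be sanity-checked with quick arithmetic. This is a rough sketch, not from the thread: the per-card draw (~300 W for a 5970 at full mining load) and per-section dissipation (~250 W per 120mm radiator section with aggressive fans and a generous water/air delta) are both my assumptions.

```python
# Rough radiator sizing for the 7x 5970 idea above.
# Assumed figures (not from the thread): ~300 W per 5970 under
# full mining load; ~250 W dissipated per 120mm radiator section
# with aggressive fans and a generous water/air temperature delta.
def radiator_sections(num_cards, watts_per_card=300.0, watts_per_120mm=250.0):
    total_heat = num_cards * watts_per_card
    sections = -(-total_heat // watts_per_120mm)  # ceiling division
    return total_heat, int(sections)

heat, sections = radiator_sections(7)
print(f"{heat:.0f} W total -> at least {sections}x 120mm sections")
```

Which lands right around the 9x 120mm monster mentioned; quieter fans or a smaller temperature delta push the count higher still.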





cicada (Full Member, Activity: 196)
November 07, 2011, 04:54:34 PM  #6

You didn't write how you are planning to cool it.

But.. I did.  Twice...

... it's loud as all get out and hotter than hell, even with an open slot between my cards.  To alleviate those issues, I'm turning to water cooling.
...
... It occurred to me while researching waterblocks ...

I am planning to water cool the cards; there's no other way to fit them in a single PCI slot space.

Specifically, I'm either looking at full-coverage 6870 waterblocks (these XFX cards share the exact same PCB layout), or going the slightly cheaper route of universal waterblocks and a full-coverage heatsink from swiftech:

Full block:

 
or
 
Universal + heatsink:


cicada (Full Member, Activity: 196)
November 07, 2011, 05:00:27 PM  #7

Not sure why you need them to be single slots how many do you plan on installing?

If you want to watercool your best bet is to sell the 6950s for 6990s or 5970s.  

The important factors here are

1) the motherboard 16x slot spacing prevents more than two 2-slot cards from being fitted directly to the board (and therefore fitting reasonably inside a case)
2) I already own these cards, and selling them all used I'd maybe afford a single 6990 or possibly two 5970s
3) I'm looking at ~$550 for a budget watercooling setup already; I'd prefer not to sink even more money in if I can avoid it

cicada (Full Member, Activity: 196)
November 07, 2011, 05:02:04 PM  #8

This is all about squeezing what I've already got into the most compact, crazy setup I can create - not about efficiency or overall performance.  I don't want to buy even more components, the wifey would have my nuts.

DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 07, 2011, 05:07:50 PM  #9

Not sure why you need them to be single slots how many do you plan on installing?

If you want to watercool your best bet is to sell the 6950s for 6990s or 5970s.  

The important factors here are

1) the motherboard 16x slot spacing prevents more than two 2-slot cards from being fitted directly to the board (and therefore fitting reasonably inside a case)
2) I already own these cards, and selling them all used I'd maybe afford a single 6990 or possibly two 5970s
3) I'm looking at ~$550 for a budget watercooling setup already; I'd prefer not to sink even more money in if I can avoid it

Remember the cost of full-coverage waterblocks - they aren't cheap.  2x 5970 gives you pretty close to the performance of 4x 6950s, and it will only require 2 waterblocks, not 4.  If you can get a pair of used 5970s for within $200 of what you can sell your 4x 6950s for, you likely are coming out ahead and won't need to do any desoldering.  The universal waterblocks might be fine for gaming, but mining 24/7 @ 100% load puts insane pressure on the VRMs.  Without active cooling they are going to be 110C+ and that will put a limit on your hashrates.
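A rough check of the "2x 5970 vs 4x 6950" comparison above. The hashrates here are ballpark era figures I'm assuming (~350 MH/s per 6950, ~650 MH/s per 5970), not numbers from the thread; the waterblock count is the main cost driver:

```python
# Ballpark comparison of the two configurations discussed above.
# Hashrates are assumptions (~350 MH/s per 6950, ~650 MH/s per
# 5970), not figures from the thread.
rigs = {
    "4x 6950": {"mhash": 4 * 350, "blocks": 4},
    "2x 5970": {"mhash": 2 * 650, "blocks": 2},
}
for name, rig in rigs.items():
    print(f"{name}: ~{rig['mhash']} MH/s, {rig['blocks']} full-cover waterblocks")
```

Similar hashrates, half the waterblocks - which is exactly the point being made.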

You probably would be fine desoldering the top DVI port as long as you don't get the PCB too hot, but I don't know of anyone who has tried that.

Alternatively, if this is your board, you can go with tri-fire (3x 6950) using 2 slots per card.
http://www.gigabyte.com/products/product-page.aspx?pid=3894#ov

You just need a case with 7+1 slots to fit a double card in the 7th slot of the MB.




cicada (Full Member, Activity: 196)
November 07, 2011, 05:38:44 PM  #10

Without active cooling they are going to be 110C+ and that will put a limit on your hashrates.

The swiftech heatsink looks pretty beefy anyway, and I'll likely be mounting a couple 120mm fans to blow across the sinks.  If I can keep the VRMs under ~90C that's about what I'm getting now, which doesn't seem unreasonable considering the tiny little heatsink currently affixed to them.  I'd prefer the 'universal' because a) it's about half the price of a full block, and b) I can reuse it in the future, assuming the standard mounts don't change significantly.
 
You probably would be fine desoldering the top DVI port as long as you don't get the PCB too hot but I don't know of anyone who has tried that.

I'd be using a directed hot-air gun and solder wick to do this, so the heat would be pretty localized.  The trick here is there's no 'top' or 'bottom' header - it's a single block, so to remove one header I have to remove both.

Alternatively, if this is your board, you can go with tri-fire (3x 6950) using 2 slots per card.
http://www.gigabyte.com/products/product-page.aspx?pid=3894#ov

You just need a case with 7+1 slots to fit a double card in the 7th slot of the MB.

It's exactly that board, actually.  In all honesty I'm starting with water cooling just the two currently installed cards and working up from there.  I think I can go tri-fire with my current case with a quick Dremel mod - I believe I've got enough clearance between the bottom slot and the PSU, even though I'm lacking the 8th slot.  Especially if the bottom card is also water-cooled.

DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 07, 2011, 06:20:23 PM  #11

Well it sounds like you have a plan.  I would start with 2 cards keeping dual slots, then try three cards modding the case.  If you want to go more extreme you could always mod the cards at that point to get four cards.  If you buy a larger radiator than you need now you can always expand with little more cost than getting an extra waterblock and doing the slot modification.

By "top" I mean the DVI port sitting in the slot position that the actual PCIe connector is not (i.e. next to the XFX air cutout).

Here is a custom single slot bracket for the HD 6990.  Just eyeballing it, it looks like it would work for your 6950 too:
http://www.frozencpu.com/products/13220/pci-12/PCI_Single_Slot_Bracket_-_HD6990_PCI-BCKT-6990.html




cicada (Full Member, Activity: 196)
November 07, 2011, 07:43:12 PM  #12

Here's an image I took when I did the write-protect BIOS mod:



You can sort of see what I mean when I say the header is one 'block' - both DVI ports are a single 'thing', inseparable.  Any cards I mod this way would need a custom bracket, as both DVI ports will be gone.  I'd either just cut the current bracket and block the empty DVI hole, or somehow craft a new one with just the HDMI/DisplayPort cutouts.

For the curious, you'll also notice the VRM heatsink on the left - this heatsink is the only difference between the 1GB 6950 and a 6870 layout (aside from the GPU, obviously).  The little green circle is around the (sloppily) modded BIOS chip.

DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 07, 2011, 07:46:53 PM  #13

Ouch. I gotcha now.  Yeah, that is a rough mod.  You would need to desolder that dual DVI block and then find a compatible single DVI block and solder it on.  Not something I would want to consider.

So it looks like triple dual slot w/ case mod is the easier route. 
If you want to squeeze in 4+ GPUs I would go with a dual GPU card before trying to desolder and resolder the DVI block, unless you don't mind losing both DVI ports and just connecting via the DP plugs.  They make DP -> DVI cables.  Of course the resale value of the card is essentially $0 at that point.   Grin
cicada (Full Member, Activity: 196)
November 07, 2011, 07:54:15 PM  #14

Of course the resale value of the card is essentially $0 at that point.   Grin

Heh, yeah that's a given.  If I were really good I could potentially put it back on later, but the scorch marks might become a bit obvious Wink

I wouldn't miss the DVI, I'm using just the single HDMI header on the top card, and can extend to two more monitors with the DP connections.  I'd say any more monitors than that is overkill, but here I am trying to stuff 4 GPUs in a closed case Cheesy

Actually, I checked when I ran home for lunch, and the PCI-e slot layout lets me squeeze them all in with only modding two of the 4 cards, so the 'main' card in the crossfire setup will still have all its display headers intact.

DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 07, 2011, 07:58:03 PM  #15

Well, sounds like you got the game plan.  Watercooling is spouse-friendly.
My wife asked me why I didn't do it years ago.   Grin

Now that the weather is getting cool she commented it is like a really quiet heater.  This summer I am going to try and rig some "shelf" to put my radiator outside the office window and dump the heat directly outside. 

One thing I would recommend is "Tygon Silver" anti-microbial tubing.  It is soft and flexible but still retains its shape.  I run pure distilled water w/ Tygon tubing and a silver coil in the reservoir.  No need for toxic goop, and the water is still clean 9 months later.
cicada (Full Member, Activity: 196)
November 07, 2011, 08:13:40 PM  #16

I was going to go with the DangerDen tubing packs - they seem to be highly regarded, nearly or just as flexible as the Tygon, and way cheaper.

I'm planning on getting one of those 9x120mm monster radiators so I'll have plenty of headroom for nutzo overclocking and such, and the CPU will be part of my loop as well.  Good call on hanging it outside during the summer!

DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 07, 2011, 08:15:00 PM  #17

Yeah I am thinking of getting one of those someday.  Expensive but certainly has a lot of cooling capacity.
bulanula (Hero Member, Activity: 518)
November 07, 2011, 08:59:38 PM  #18

Nice watercooling tips here, but too bad it is not really that cheap.
catfish (Sr. Member, Activity: 270, "teh giant catfesh")
November 08, 2011, 10:23:24 PM  #19

...
So the hardcore solution is simply 4x 5970s/6990s watercooled each taking up 2 slots.  If you want to do it w/ 4x 6950 you can however it is a lot of expense for roughly half the hashing power. 
...
Hardcore?

This is hardcore (sorry Jarvis) - rip the whole badly-designed BS off these cards and install them single-slot-style, in a logic board suitably modded to accept 4 cards without bumping into heatsinks etc. (or remove the heatsinks from the logic board).

Then dunk the entire logic board and cards into a tub full of some liquid that has both *incredibly* low electrical conductivity and *incredibly* high heat conductivity. Some pipes, a pump, a feck-off big heat exchanger and a very large, quiet fan (or multiple fans - whatever is required to pull air through the heat exchanger *quietly* but efficiently and, importantly, move a LOT of air) - job's a carrot's second cousin.

It could be that the heat density of both the CPU and GPU cores, plus certain components on the graphics boards that I'm *sure* cause me all the aggro (e.g. XFX put a nice heatsink on the GPU, but the bits that overheat are bare, uncooled RAM chips or VRMs), is simply too high for convection in a bath of inert fluid to be adequate cooling.

Would this be so? Hell - even with a broken fan, the components will hit over 100˚C but have logic built in to prevent 400˚C accidents. I'm not talking about using water (polar, too conductive even if 5 9s pure), or burn-the-house-down diethyl ether (hehehehehe) - but something that boils significantly over 100˚C and doesn't conduct electricity even when hot.

Even if the heat density causes hotspots, this can be solved easily enough because with the cards perpendicular to the logic board, a fluid pump positioned sensibly should cause a steady flow of cool (straight from the heat exchanger) fluid over the cards.

I'm not talking cryogenic fluids here (that'd be *extreme*, not *hardcore* Wink ) - the entire rig should work at room temperature. A high boiling point for the inert coolant is desirable to minimise evaporation, and a sealed system is also more feasible if the fluid isn't prone to high vapour pressure in the computation tubs - if the coolant isn't anywhere near boiling, a sealed system would be a LOT easier, since pressurisation isn't anywhere near as high on the list of engineering problems to solve.

Apart from the warranty issues - the whole idea of home-brew water-cooling using waterblocks, home-made piping and domestic pumps / heat exchangers seems insane to me. Some of the biggest hot-spots on GPUs are small components like VRMs and capacitors. Most water-cooling systems put copper waterblocks over the GPU and memory. None of them can give a cool surface to ALL components, especially not the small but high-heat-density ones. So I'm bemused as to why the hardcore haven't tried to use the full-immersion cooling system more frequently.

Is it *really* that hard to find a suitable liquid coolant that is both electrically non-conductive (even accounting for potential contaminants, e.g. if the fluid is hygroscopic) and conducts heat well enough to serve as a coolant in a heat engine?


I've seen the few *extreme* types with bare PCs inside tubs of liquid nitrogen, etc. but those tend to be built 'because I can' rather than because the system really needed it. One of the old Cray supercomputers actually *had* the requirement for full immersion liquid cooling, rather than just the CPU. High-density computing such as bitcoin mining with lots of modern GPUs consuming many kilowatts in less than a cubic metre... I'd say that is getting very close to the point where full-immersion cooling would present *real* benefits.

Is anyone doing this in anger? Are there kits available, and have the bugs been worked out? I'd be keen to remove all the heatsinks and fans off my GPUs and put the *whole lot* into a tub full of inert coolant if the job was feasible. There's more heat capacity in a tub of coolant too... if the pump were to fail then I'd have more time for the machines to shut down before the coolant boiled off - it's not like our GPUs are nuclear fuel rods, but with just convected air, a powered GPU will burn out before software can stop it (hence the panic switches *on the silicon*)...


Bit of a hardcore option, but if these are *dedicated* miners.... Just chucking some ideas around, given my problems with cooling at the moment...

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
DeathAndTaxes / Gerald Davis (Donator, Legendary, Activity: 1218)
November 08, 2011, 10:30:49 PM  #20

Full coverage waterblocks cover the entire card, including all VRMs, the GPU, the PCIe bridge, and other components.

Immersion cooling works but isn't economical.  There are very few non-conductive fluids that have higher thermal conductivity than plain ole water, and the few that do have incredibly high price tags.  Mineral oil is a medium-cost heat transfer fluid, but it is roughly 10x the cost of distilled water and has 1/4 the heat capacity.

Not sure where you get the idea that waterblocks can't keep video cards cool.  My GPUs are <50C and VRMs are <70C with an 18% overvolt.

What limits further overclocking isn't temp, so some ultra-complicated and expensive immersion rig isn't going to get higher throughput.

What limits further overclocking is:
1) I don't have any desire to push higher voltage through the card, which is necessary to push clocks over 1 GHz.
2) As voltage increases, power consumption increases by the square, so I don't max overclock my cards anymore due to simple economics.
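The "increases by the square" point can be made concrete with the standard CMOS approximation: dynamic power scales roughly as frequency times voltage squared. The 18% overvolt figure is from this post; the helper function is just an illustrative sketch:

```python
# Dynamic power scales roughly as f * V^2 (standard CMOS
# approximation). relative_power() returns the power multiplier
# for a new frequency/voltage pair relative to a baseline.
def relative_power(v_new, v_old, f_new=1.0, f_old=1.0):
    return (f_new / f_old) * (v_new / v_old) ** 2

# An 18% overvolt at the same clock costs ~39% more power:
print(round(relative_power(1.18, 1.00), 2))
```

So chasing clocks past a point burns far more watts than it gains in hashrate, which is the economics argument above.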

Quote
Is anyone doing this in anger? Are there kits available, and have the bugs been worked out? I'd be keen to remove all the heatsinks and fans off my GPUs and put the *whole lot* into a tub full of inert coolant if the job was feasible.

Thermodynamics tells us that would be bad.  Any immersion fluid, while it may have higher conductivity than air, has much less conductivity than copper.  You would need an astronomical flow rate (think giant sump pump) to avoid the chip cooking before it can transfer that thermal energy into the fluid.

Air is a fluid.  You already have immersion cooling with air.  We use copper heatsinks because they can transfer that thermal energy away from the die very quickly.  Even waterblocks, which exploit water's high thermal conductivity, have a decent amount of copper to spread out that point energy source.  Remember a GPU not only emits 200W+ of thermal energy, it does so from an area tinier than your fingernail.
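The point-source argument above can be sketched as a heat-flux estimate. The die area and the natural-convection figure are order-of-magnitude assumptions of mine, not numbers from the thread:

```python
# Heat-flux check for the immersion argument above: the problem is
# flux density at the die, not the tub's bulk heat capacity.
# Assumed figures: ~200 W from a die of ~3 cm^2; bare natural
# convection in oil carries very roughly ~0.5 W/cm^2.
die_watts = 200.0
die_area_cm2 = 3.0
convection_w_per_cm2 = 0.5  # order-of-magnitude guess

flux = die_watts / die_area_cm2
print(f"die flux ~{flux:.0f} W/cm^2, "
      f"~{flux / convection_w_per_cm2:.0f}x what bare convection carries")
```

Hence the heatsink or waterblock: copper spreads that concentrated flux over enough surface area that the fluid, whether air, water, or oil, can actually carry it away.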


