Bitcoin Forum
Author Topic: A journey of extreme watercooling: Cooling a rack of GPU servers without AC.  (Read 27312 times)
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 05:36:06 AM
Last edit: April 25, 2012, 02:54:42 AM by DeathAndTaxes
 #1

Finally got it stable and hashing.  It was brutal trying to get 4x 5970s working in Linux.  Sadly BAMT doesn't work (no dice for 8 GPUs on a 32-bit kernel).  I tried Xubuntu but hosed something up installing SDK 2.1.  Tried to restore from an image I made and it wouldn't boot.  Ended up grabbing LinuxCoin (x64) and dropping in cgminer.  I don't like it but it works for now.

Code:
 cgminer version 2.3.1 - Started: [2012-02-28 12:30:03]
--------------------------------------------------------------------------------
 (5s):2935.9 (avg):2952.6 Mh/s | Q:11590  A:3272  R:57  HW:0  E:28%  U:15.06/m
 TQ: 8  ST: 9  SS: 26  DW: 1950  NB: 17  LW: 15248  GF: 0  RF: 0
 Connected to http://192.168.0.189:9332 with LP as user user
 Block: 000007200ebc4183c7cefd4ed93eea81...  Started: [16:04:46]
--------------------------------------------------------------------------------
 [P]ool management [G]PU management [S]ettings [D]isplay options [Q]uit
 GPU 0:  52.0C  960RPM | 378.0/378.7Mh/s | A:445 R: 7 HW:0 U: 2.05/m I: 8
 GPU 1:  52.5C  960RPM | 378.0/378.7Mh/s | A:399 R:10 HW:0 U: 1.84/m I: 8
 GPU 2:  49.0C  960RPM | 378.1/378.7Mh/s | A:431 R: 4 HW:0 U: 1.98/m I: 8
 GPU 3:  54.0C  960RPM | 378.0/378.7Mh/s | A:403 R: 3 HW:0 U: 1.85/m I: 8
 GPU 4:  59.0C  960RPM | 352.8/353.3Mh/s | A:396 R:10 HW:0 U: 1.82/m I: 8
 GPU 5:  58.0C  960RPM | 378.0/378.6Mh/s | A:407 R: 9 HW:0 U: 1.87/m I: 8
 GPU 6:  55.0C  960RPM | 378.1/378.6Mh/s | A:413 R: 8 HW:0 U: 1.90/m I: 8
 GPU 7:  53.5C  960RPM | 327.8/328.2Mh/s | A:378 R: 6 HW:0 U: 1.74/m I: 8
--------------------------------------------------------------------------------

Clocks are somewhat conservative but it is running @ ~3 GH/s and pulling 1112W at the wall (120V).  Once it is stable it will move to the 240V PDU, so it should be 20-25W less there.  One of the eight cores wouldn't clock to 835 MHz (went sick, 0 MH/s, instantly); since I was tired I just left it @ 750 MHz for now.  It is pointed at p2pool, which makes share count, U, etc. "weird" due to dynamic share difficulty.

Total System Load: 1112W
Total System Hashing Rate: 2.95 GH/s
System efficiency: 2.67 MH/W
Measured (not calculated) no-GPU (not even installed in rig) system idle: 190W
GPU AC load: 230W ea
GPU DC load: ~200W ea
Total System thermal load on water loop: 800W
GPU efficiency: 3.21 MH/W
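
For anyone who wants to double-check those numbers, here is the back-of-the-envelope math as a quick Python sketch (just the round figures from above, nothing more):

Code:
# Back-of-the-envelope check of the efficiency figures above (round numbers, not gospel).
total_wall_watts = 1112        # measured at the wall on 120V
avg_hashrate_mhs = 2952.6      # cgminer (avg), in MH/s
gpu_ac_watts = 230             # AC draw per 5970 (GPU only)
gpu_dc_watts = 200             # approximate DC/thermal load per 5970 after PSU losses
num_gpus = 4                   # 4x 5970 (8 GPU cores)

system_efficiency = avg_hashrate_mhs / total_wall_watts        # ~2.7 MH/W
gpu_efficiency = avg_hashrate_mhs / (num_gpus * gpu_ac_watts)  # ~3.2 MH/W
water_loop_watts = num_gpus * gpu_dc_watts                     # ~800W dumped into the loop

print(f"System: {system_efficiency:.2f} MH/W | GPUs: {gpu_efficiency:.2f} MH/W | loop: {water_loop_watts}W")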

On edit: old pic removed (better pics below)

Note:
I don't have 30 GH/s; that figure is the potential: 3 GH/s per 4U rig, 45U in a rack.  1U for a switch, 2U for PDUs, and 2U for a "watchdog" server leaves enough space for ten 4U rigs.
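
The rack math, spelled out as a tiny sketch (same assumptions as the note above):

Code:
# Potential hashrate for a full rack, using the assumptions in the note above.
rack_u = 45
overhead_u = 1 + 2 + 2          # switch + PDUs + "watchdog" server
rig_u = 4
rig_ghs = 3.0                   # per 4U rig

rigs_per_rack = (rack_u - overhead_u) // rig_u    # 10 rigs
potential_ghs = rigs_per_rack * rig_ghs           # 30 GH/s
print(rigs_per_rack, "rigs ->", potential_ghs, "GH/s potential")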
Starlightbreaker
Legendary
Activity: 1764
Merit: 1006
February 28, 2012, 05:39:54 AM
 #2

daaaaaaaaamn

oscer
Member
Activity: 73
Merit: 10
February 28, 2012, 05:40:48 AM
 #3

Are you keeping that in a datacenter?

Xrnetworks.com Web Hosting
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 05:45:10 AM
 #4

Are you keeping that in a datacenter?

No, although my office is looking more and more like a datacenter.  Got to get some sleep, but 24 5970s produce a lot of heat and AC cuts into my profits.  If this one-rig test goes well, my goal is to put 6 of these in a standard server rack with a heat exchanger to a secondary cooling loop which runs outside to a very large radiator.  Dump 6 kW of heat directly outside.  We will see how this one-unit experiment goes.
oscer
Member
Activity: 73
Merit: 10
February 28, 2012, 05:55:05 AM
 #5

Are you keeping that in a datacenter?

No, although my office is looking more and more like a datacenter.  Got to get some sleep, but 24 5970s produce a lot of heat and AC cuts into my profits.  If this one-rig test goes well, my goal is to put 6 of these in a standard server rack with a heat exchanger to a secondary cooling loop which runs outside to a very large radiator.  Dump 6 kW of heat directly outside.  We will see how this one-unit experiment goes.

How much did that thing cost you to build?

Xrnetworks.com Web Hosting
Splirow
Full Member
Activity: 164
Merit: 100
February 28, 2012, 06:42:50 AM
 #6

Why are your GPU temps only around 42-44C?

bravetheheat
Sr. Member
Activity: 457
Merit: 251
February 28, 2012, 08:34:45 AM
 #7

Why are your GPU temps only around 42-44C?


I would guess from the pic that it would be due to water cooling.  Cool
ZPK
Legendary
Activity: 1302
Merit: 1021
February 28, 2012, 09:26:46 AM
 #8

Three of the video cards have no power plugged in... why, or how?
And which motherboard are you using?

Novacoin POS mining only now
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 12:03:28 PM
Last edit: February 28, 2012, 04:12:19 PM by DeathAndTaxes
 #9

Three of the video cards have no power plugged in... why, or how?
And which motherboard are you using?
Oops, that was an early pic from when I was testing each card.  BAMT only worked w/ 3 cards, so I unplugged one card, mined, powered down, and plugged in a different one to test all 4.

The MB is the "miner classic": the MSI 890FXA-GD70.
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 12:12:02 PM
Last edit: February 28, 2012, 04:17:18 PM by DeathAndTaxes
 #10

How much did that thing cost you to build?

Well, I already had 3 of the waterblocks, the tubing, fittings, connectors, and the radiator.  Plus my "standard" air-cooled rigs are 3x 5970, MSI 890FXA-GD70, 2GB of RAM, a Sempron, a USB stick, and a Corsair/Seasonic 1200W/1250W PSU, which all went in here.

So the incremental cost for the "test rig" was just the case, pump, and one waterblock: ~$400.

Full conversion of my other rigs would cost ~$900 ea (4x waterblock, fittings, connectors, tubing, heat exchanger, pump, silver kill coil, and case) unless I can get some volume discounts.

To build it from scratch would be ~$1300 plus the cost of the GPUs.  When I bought most of them it was more like $300-$350 ea; now they are insanely expensive, but hopefully that will change.  I would strongly discourage someone from trying this unless they are already a confident miner AND have experience with liquid cooling.  I am not sure I would try this if I didn't already own the air-cooled rigs. Smiley

The reasons for doing it are to:
a) improve the efficiency (getting >3 MH/W now at >3 GH/s, and with underclocking/undervolting that can rise to 5 MH/W if necessary over time to stay profitable)
b) push the cards higher.  I think 3.2 GH/s per rig is possible, allowing me to pick up 20% more revenue before the reward cut
c) eliminate roughly $4000 per year in AC costs to be more competitive in the face of a rising number of FPGAs
d) keep the temps more stable (50C @ 99% load 24/7 all year long shouldn't be a problem)
e) the wife acceptance factor.  She has been a "trooper" with this mad scientist and his 14 GH/s of whirling, buzzing, heat-belching fun
f) maybe someday provide "free" hot water and heating for the entire house (~$1000 per year).
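
To give a feel for point c): removing heat with AC costs roughly (heat load / COP) in extra electricity, running 24/7.  Here is a rough sketch; the COP and electricity rate below are placeholder assumptions, not my actual numbers:

Code:
# Rough illustration of the AC penalty in point c).
# COP and electricity rate are placeholder assumptions for illustration only.
heat_load_kw = 6.0           # ~6 kW of GPU heat for a loaded rack
cop = 2.5                    # assumed air-conditioner coefficient of performance
rate_usd_per_kwh = 0.19      # assumed electricity rate
hours_per_year = 24 * 365

ac_power_kw = heat_load_kw / cop
annual_ac_cost = ac_power_kw * hours_per_year * rate_usd_per_kwh
# With these placeholder numbers the result lands near the ~$4000/yr figure above.
print(f"AC draw ~{ac_power_kw:.1f} kW -> ~${annual_ac_cost:,.0f}/yr")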
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 02:13:18 PM
Last edit: February 28, 2012, 02:44:51 PM by DeathAndTaxes
 #11

Some updated pics.


Not your average watercooling job.  Very "industrial".  No lights, UV dyes, clear tubing, or bling-bling leet-gamer nonsense.  Maybe I am weird, but I think it looks nicer than some flashy setups.

The PSU is a Seasonic 1250W 80-Plus Gold.  I only have one of the 80mm fans in the back hooked up.  The Sempron puts off almost no heat, so it might be fine just to use the PSU intake as an exhaust (yes, computers did that at one time Smiley ).  The front left (upper right) is the 3x 5.25" bay.  The front center houses a 3.5" bay w/ 80mm intake fan (removed).  The front right houses a 120mm fan for airflow across the GPUs.  The top cover also has 2x 120mm fans but I don't think I will be using them.  This case is sold as a CUDA rack, but honestly I don't see it having enough airflow for air cooling 4x Teslas.

Yeah, the wiring job is horrible.  I haven't decided which way the tubing will go.  My first thought is to mount the heat exchanger (should arrive by this weekend) in the front and run "cold loop" lines w/ quick disconnects out the 5.25" bay.


Closeup of the 4x 5970s.  Finding a rackmount case w/ 8 expansion slots is tricky; the few that exist are $500+.  Luckily Chenbro makes this case and it wasn't too expensive.  The waterblocks are DangerDen because they are the cheapest full-coverage waterblocks.  Watercooling is expensive, but "saving" money using non-full-coverage blocks is useless as the VRMs get too hot.  5970s are nice because one block cools two GPUs.  A 7990 would be even nicer, but by the time they are affordable FPGAs will likely have killed that idea.  I likely will seal the bridges with silicone sealant and apply plumber's tape to all the threads.  I want it as maintenance-free as possible.


Front view w/ filter/door open.  On the left is a 120mm intake (0.3A); I would guesstimate it at ~60 CFM.  It provides a slight breeze over the cards, which is all that is needed for the non-waterblocked components like the caps.  On the right is a dual-bay reservoir which also mounts the MCP655 pump.


With the door closed.  It is currently hooked to this "test radiator".  The end goal (assuming all testing goes well) would be to mount a water-to-water heat exchanger inside each rig, with quick disconnects to attach it to an outer "cold loop" which runs to an outdoor radiator.

I am kinda surprised this radiator is holding up, because 4x 120mm is way undersized for an ~800W thermal load.  Ambient temp is ~23C and the cards are running at ~55C after 8 hours of hashing.  Strangely, the load "bounces" more than I have seen on other rigs, going from 1070W to 1120W.  Need to investigate a little further.
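
For reference, a crude radiator sanity check is effective thermal resistance: (GPU temp - ambient) / heat load.  A quick sketch with the numbers above:

Code:
# Crude radiator sanity check: effective thermal resistance of block + loop + radiator.
ambient_c = 23.0
gpu_temp_c = 55.0        # after 8 hours of hashing
heat_load_w = 800.0      # thermal load on the water loop

effective_c_per_w = (gpu_temp_c - ambient_c) / heat_load_w    # ~0.04 C/W
print(f"Effective thermal resistance: {effective_c_per_w:.3f} C/W")
# The undersized 4x120mm "test radiator" only has to hold ~0.04 C/W here,
# which may be why it is (barely) keeping up.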

Thoughts, comments, suggestions?
BFL-Engineer
Full Member
Activity: 227
Merit: 100
February 28, 2012, 03:26:49 PM
 #12

If you had chosen BitFORCE, you could've built the same system (3328 MH/s, about 328 MH/s higher than the actual solution) with only 4 units, costing a total of 2400 USD (+ shipping) and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units).  Virtually silent compared to GPUs (26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,

BF Labs Inc.  www.butterflylabs.com   -  Bitcoin Mining Hardware
rjk
Sr. Member
Activity: 448
Merit: 250
1ngldh
February 28, 2012, 03:29:43 PM
 #13

If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping) and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units).  Virtually silent compared to GPUs (26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,
This is more fun Grin

And, he can run bitforces off of it!

Mining Rig Extraordinaire - the Trenton BPX6806 18-slot PCIe backplane [PICS] Dead project is dead, all hail the coming of the mighty ASIC!
Photon939
Sr. Member
Activity: 452
Merit: 250
February 28, 2012, 03:31:34 PM
 #14

If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping) and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units).  Virtually silent compared to GPUs (26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,

And it probably wouldn't arrive until 2013.  Don't get me wrong, I'd like to purchase one myself, but spamming this guy's build thread is in quite poor taste IMO.


To the OP: I wish I had gone with rackmount cases for my little miners I have here, your setup looks good.
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 03:34:08 PM
Last edit: February 28, 2012, 04:05:24 PM by DeathAndTaxes
 #15

If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping)

And I would get it in 4-6 weeks (more like months)?

If you are going to spam/advertise in my thread it would be nice to read it first.  The MB, PSU, RAM, CPU, and 24x 5970s (the most expensive part) were purchased between 12 and 6 months ago.  They are already a sunk cost, as I have no desire to try and unload all that equipment.

While I am interested in migrating to FPGAs, I have summer temps arriving in <3 months, so I am looking for a solution to extend the longevity of my EXISTING HARDWARE (which has produced, if my math is right, ~250 quadrillion valid hashes so far).  Hopefully I can get another ~250 quadrillion hashes as I mine these cards into the ground.
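
Back-of-the-envelope on that hash count, assuming the farm ran somewhere near its current ~14 GH/s (it ramped up over time, so this is only a rough bound):

Code:
# Rough sanity check on "~250 quadrillion valid hashes" at the farm's current rate.
total_hashes = 250e15         # 250 quadrillion
farm_rate_hs = 14e9           # ~14 GH/s (current rate; it was lower while ramping up)

seconds = total_hashes / farm_rate_hs
days = seconds / 86400
print(f"~{days:.0f} days of hashing at 14 GH/s")    # roughly 200 days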

Quick questions:
If I place an order for 4 BFL Singles today, can you guarantee a delivery date?  What date would that be?  Tell you what: if you are willing to publicly guarantee delivery by 1 April on this forum (w/ a $200 penalty paid by BFL for non-delivery), I will buy 4 today.
BFL-Engineer
Full Member
Activity: 227
Merit: 100
February 28, 2012, 03:34:51 PM
 #16

If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping) and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units).  Virtually silent compared to GPUs (26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,

And it probably wouldn't arrive until 2013.  Don't get me wrong, I'd like to purchase one myself, but spamming this guy's build thread is in quite poor taste IMO.


To the OP: I wish I had gone with rackmount cases for my little miners I have here, your setup looks good.

None was intended.  My apologies if it appears so...


Regards,

BF Labs Inc.  www.butterflylabs.com   -  Bitcoin Mining Hardware
rjk
Sr. Member
Activity: 448
Merit: 250
1ngldh
February 28, 2012, 03:35:31 PM
 #17

Damnit DeathAndTaxes, you are going to make it hard on me keeping up with the joneses Grin

First things first: what kind of rad are you planning on using to dump 6+ kW of heat?  Most oil coolers are rated in horsepower, which is 746 watts per horsepower.  You can get a compact (~1 square foot) rad that is good for 10 HP, but only with a loud cooling fan.  However, running no fan or a low-speed fan derates it to 8 HP (plenty).
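
To put numbers on that, the conversion is straightforward:

Code:
# Horsepower-to-watts conversion for sizing the outdoor rad.
watts_per_hp = 746
heat_load_w = 6000                        # ~6 kW from a rack of rigs

required_hp = heat_load_w / watts_per_hp  # ~8 HP
print(f"{heat_load_w}W is about {required_hp:.1f} HP of cooling")
# So the 8-10 HP oil cooler mentioned above is in the right ballpark.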

Second: heat exchangers.  Any plan to make them pluggable, or some way to add/remove them without disturbing the whole cluster?  Also, which ones? Most I have seen won't handle the heat from 4x 5970s.

Third: Power. Is that PSU going to be good enough for continuous duty? Not sure what you run now, but perhaps it will be.

Mining Rig Extraordinaire - the Trenton BPX6806 18-slot PCIe backplane [PICS] Dead project is dead, all hail the coming of the mighty ASIC!
mtminer
Member
Activity: 86
Merit: 10
February 28, 2012, 03:48:40 PM
 #18

Pretty impressive.
jamesg
VIP
Legendary
Activity: 1358
Merit: 1000
AKA: gigavps
February 28, 2012, 03:58:29 PM
 #19

e) the wife acceptance factor.  She has been a "trooper" with this mad scientist and 14GH/s of whirling, buzzing, heat belching fun.

I had to get a warehouse. Consider yourself lucky.
DeathAndTaxes (OP)
Donator
Legendary
Activity: 1218
Merit: 1079
Gerald Davis
February 28, 2012, 04:02:04 PM
 #20

Damnit DeathAndTaxes, you are going to make it hard on me keeping up with the joneses Grin

Sorry about that.  I have been planning this for a while (Dec was great, but June will be brutal w/ 6 kW of heat).  Seeing your out-of-the-box thinking got my behind moving.

Quote
First things first: what kind of rad are you planning on using to dump 6+ kW of heat?  Most oil coolers are rated in horsepower, which is 746 watts per horsepower.  You can get a compact (~1 square foot) rad that is good for 10 HP, but only with a loud cooling fan.  However, running no fan or a low-speed fan derates it to 8 HP (plenty).

I haven't spent too much time on the "outer loop" yet.  I want to stress test this single rig first; if it fails I won't have wasted too much time and money.  I am thinking big (maybe 24" x 24") with a 16" fan or maybe 2x 10" fans.

There are a couple of options: car radiator, oil cooler, or industrial heat exchanger.  I don't know exactly which yet, but it likely will be big.

I found another company which makes really nice custom units for cooling lasers and other high-temp components, but their prices are insane (way outside my budget).  They have very detailed charts for C/W, which give me a ballpark idea of where I need to be aiming (surface area and CFM).

http://www.lytron.com/Heat-Exchangers/Standard/Heat-Exchangers-Tube-Fin

Quote
Second: heat exchangers.  Any plan to make them pluggable, or some way to add/remove them without disturbing the whole cluster?  Also, which ones? Most I have seen won't handle the heat from 4x 5970s.

Yeah, the tubing on the "cold side" of the exchanger will have quick disconnects.  This will allow removing one rig from the rack.  For troubleshooting, I am planning on keeping my "test radiator" from the photo above, with a pump and reservoir.  That way I can connect a sick rig to the baby radiator for diagnosis.

I found some brazed flat-plate exchangers that, with good flow (2 GPM), can handle 2 kW+ with a 10C rise over the cool-side inlet temp.  Remember the AC load of a single 5970 (GPU only) is ~230W, so the DC (thermal) load is closer to 200W.  I bought one and will test it out this weekend.  I am thinking of a canister high-lift aquarium or pond pump for the "cold" (outer) loop, which should keep flow rates high.
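
For the curious, the water-side temperature rise across the rig loop follows from Q = m * cp * deltaT.  A quick sketch with the flow and load above (approximate constants):

Code:
# Water temperature rise across the rig loop: deltaT = Q / (m_dot * c_p).
heat_load_w = 800.0                   # ~4x 200W DC load per rig
flow_gpm = 2.0                        # flow through the flat-plate exchanger
c_p = 4186.0                          # J/(kg*K) for water

flow_kg_s = flow_gpm * 3.785 / 60.0   # 1 US gallon of water is ~3.785 kg
delta_t_c = heat_load_w / (flow_kg_s * c_p)
print(f"~{delta_t_c:.1f}C rise across the loop at {flow_gpm} GPM")   # ~1.5C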

Quote
Third: Power. Is that PSU going to be good enough for continuous duty? Not sure what you run now, but perhaps it will be.

I think so.  Seasonic is solid and their customer support is great.  I paid for units w/ a 5-year warranty, so I will be using it.  If they start to fail I will need to think about alternate loading.