Bitcoin Forum
Author Topic: Mining Farm Cooling  (Read 21584 times)
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
September 23, 2011, 05:13:44 PM
 #61

The upside is Koolance coolant is non-conductive, and that's important. I had a small leak develop and it didn't hurt any components. It was messy as hell but didn't cost me anything to fix. Engine coolant isn't designed to be non-conductive, so you could end up with a very expensive experiment if you're not careful.

Distilled water is non-conductive, costs <$1 per gallon and has higher thermal conductivity.
P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
September 23, 2011, 06:39:55 PM
 #62

Distilled water will short-circuit things after a while; it picks up ions from the loop and becomes conductive. It's been tested, I think by Tom's Hardware. I'll see if I can find the link later.

I do remember reading about nanofluid additives you could mix into water or oil to improve thermal conductivity. They were still in the development and testing phase a few years ago, but looked promising and were being designed specifically for use in computers, datacenters and some other key niche markets. Not sure if they've reached the market yet, let alone what they would cost.

jjiimm_64
Legendary
*
Offline Offline

Activity: 1876
Merit: 1000


View Profile
September 23, 2011, 09:15:40 PM
 #63


Air is a fluid
Air is free
Air is easy to move around

Give the rigs plenty of fresh air and that is that. You guys are overthinking it.

Get the hot air out of the room and bring fresh air in. Even if that fresh air is 100°F, it is still a lot cooler than the air the video card just expelled.

1jimbitm6hAKTjKX4qurCNQubbnk2YsFw
P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
September 23, 2011, 10:30:52 PM
 #64


Air is a fluid
Air is free
Air is easy to move around

It also has a thermal conductivity that is 25x lower than water.
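For anyone who wants to check that figure, here's a quick back-of-the-envelope comparison using standard handbook values at room temperature (these numbers are textbook values I'm assuming, not measurements from this thread):

Code:
# Air vs. water thermal conductivity, handbook values at ~25 C (assumed).
k_air = 0.026     # W/(m*K)
k_water = 0.60    # W/(m*K)

print("water conducts heat %.0fx better than air" % (k_water / k_air))
# -> roughly 23x, so "about 25x" is the right ballpark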

RandyFolds
Sr. Member
****
Offline Offline

Activity: 448
Merit: 250



View Profile
September 23, 2011, 10:37:41 PM
 #65


Air is a fluid
Air is free
Air is easy to move around

It also has a thermal conductivity that is 25x lower than water.

so use 25x as much air...problem solved.
kirax
Member
**
Offline Offline

Activity: 77
Merit: 10


View Profile WWW
September 23, 2011, 11:32:40 PM
 #66


Air is a fluid
Air is free
Air is easy to move around

It also has a thermal conductivity that is 25x lower than water.

so use 25x as much air...problem solved.

At a certain point, the 25x airflow becomes a noise and possibly a cost problem: I mean, sure, high-CFM fans are cheap, but 25 is a damn big multiplier.

If the profit margins on mining had stayed at the (clearly ridiculous, but a guy can dream) levels of $30/BTC, it would almost be worth cooling with Fluorinert. However, 3M wants... a lot for that stuff. A lot a lot. You could make it nearly silent. There is at least one company I read about specializing in data centre cooling with Fluorinert, and the savings in cooling power are impressive. You have to remember, of course, that it depends on scale: what is easy to air-cool in your house might not be easy (or possible) to air-cool if you have 5, 15, or 50 of them.
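On the airflow side, the fan requirement scales with the heat load and how much temperature rise you'll tolerate. A rough sizing sketch using the standard-air rule of thumb CFM ≈ 3.16 × watts / ΔT(°F); the rig wattage and allowed rise below are just assumed example numbers, not anyone's actual setup:

Code:
# Rough fan sizing for an air-cooled rig (standard-air approximation).
def cfm_needed(watts, delta_t_f):
    # delta_t_f = allowed exhaust-minus-intake temperature rise in deg F
    return 3.16 * watts / delta_t_f

# Example with assumed numbers: a 1200 W rig, allowing a 20 F rise.
print(round(cfm_needed(1200, 20)))   # ~190 CFM
# Halve the allowed rise and the required airflow doubles, which is where
# "just move 25x more air" starts to hurt in fan count and noise.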

VPS, shared, dedicated hosting at: electronstorm.ca. No bitcoin payment for that yet, but bitcoins possible for general IT, and mining/GPGPU rigs. PM for details.
mackminer
Sr. Member
****
Offline Offline

Activity: 348
Merit: 251



View Profile
September 23, 2011, 11:38:50 PM
Last edit: September 24, 2011, 12:01:03 AM by forumuse84
 #67

Has anyone ever heard of a datacentre using equipment other than air conditioning? I haven't.

For peace of mind I run my expensive hardware investment with an air conditioner that always keeps the room at 19°C. I can go to work, go on holidays, watch TV and be assured that my machines are at a steady temp.

My rigs draw 5kW and my air con can remove 7kW of heat using just over 2kW of power. Why invest big and then skimp on cooling? Cooling needs to be factored into the cost and is as important as power surges or dirty power; heat will destroy your hardware!
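For what it's worth, those figures imply a coefficient of performance of about 3.5, which is plausible for a decent split unit. A quick sanity check (the cooling and input figures are the ones quoted above, everything else is assumed):

Code:
# Sanity check on the AC figures above.
rig_heat_kw   = 5.0   # heat the rigs dump into the room
ac_cooling_kw = 7.0   # rated cooling capacity
ac_input_kw   = 2.0   # "just over 2kW" of electrical input

print("COP ~ %.1f" % (ac_cooling_kw / ac_input_kw))            # ~3.5
print("total draw ~ %.1f kW" % (rig_heat_kw + ac_input_kw))    # ~7 kW
# Cooling adds roughly 40% on top of the rigs' own consumption here,
# assuming the AC runs flat out whenever the rigs do.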

With air conditioning you can't introduce separate systems as it messes with the airflow. The GPUs on my 6990s run at around 70°C or under continuously. This is with 3x 6990s in each rig running at a standard 830MHz per GPU. They also run inside HAF X cases with a modified side panel (professionally designed and cut by an engineering company) to fit 4x 120mm Delta fans for airflow. These fans have dust filters too. Each rig draws about 1.2kW and there are four in total. Just to add, the fan speed set on the 6990s themselves is 70%, though they can manage with no problem at all on 60%.

If you're serious then professional is the way to go. Jobs isn't still running his company out of his garage. Business (i.e. investment) requires industry standards. Otherwise it's a joke.

I see people running custom cooling solutions that are only just keeping them under 85°C. What if it's a hot day and they pop up to 95? Do you panic? Do you call in sick to work? Do you fly back from your holidays?

I am also considering downclocking my GPUs rather than having them maxed out at 830MHz all the time; a few less hashes isn't going to kill me. With regard to overclocking, I wouldn't consider it. Cards are certainly not designed to be maxed out all the time, never mind overclocked.

1BFf3Whvj118A5akc5fHhfLLwxYduMmq1d
RandyFolds
Sr. Member
****
Offline Offline

Activity: 448
Merit: 250



View Profile
September 23, 2011, 11:54:05 PM
 #68


Air is a fluid
Air is free
Air is easy to move around

It also has a thermal conductivity that is 25x lower than water.

so use 25x as much air...problem solved.

At a certain point, the 25x airflow becomes a noise and possibly a cost problem: I mean, sure, high-CFM fans are cheap, but 25 is a damn big multiplier.

If the profit margins on mining had stayed at the (clearly ridiculous, but a guy can dream) levels of $30/BTC, it would almost be worth cooling with Fluorinert. However, 3M wants... a lot for that stuff. A lot a lot. You could make it nearly silent. There is at least one company I read about specializing in data centre cooling with Fluorinert, and the savings in cooling power are impressive. You have to remember, of course, that it depends on scale: what is easy to air-cool in your house might not be easy (or possible) to air-cool if you have 5, 15, or 50 of them.

I have regularly handled ventilation for anywhere up to 25,000 watts' worth of lighting, often in cramped spaces. It was easily accomplished without Fluorinert, just good flow and a properly sized AC. You would still need to run a comparable compressor to chill your cooling liquid anyway.
kirax
Member
**
Offline Offline

Activity: 77
Merit: 10


View Profile WWW
September 24, 2011, 12:10:04 AM
 #69

Has anyone ever heard of a datacentre using equipment other than air conditioning? I haven't.

For peace of mind I run my expensive hardware investment with an air conditioner that always keeps the room at 19°C. I can go to work, go on holidays, watch TV and be assured that my machines are at a steady temp.

My rigs draw 5kW and my air con can remove 7kW of heat using just over 2kW of power. Why invest big and then skimp on cooling? Cooling needs to be factored into the cost and is as important as power surges or dirty power; heat will destroy your hardware!

With air conditioning you can't introduce separate systems as it messes with the airflow. The GPUs on my 6990s run at around 70°C or under continuously. This is with 3x 6990s in each rig running at a standard 830MHz per GPU. They also run inside HAF X cases with a modified side panel (professionally designed and cut by an engineering company) to fit 4x 120mm Delta fans for airflow. These fans have dust filters too. Each rig draws about 1.2kW and there are four in total. Just to add, the fan speed set on the 6990s themselves is 70%, though they can manage with no problem at all on 60%.

If you're serious then professional is the way to go. Jobs isn't still running his company out of his garage. Business (i.e. investment) requires industry standards. Otherwise it's a joke.

I see people running custom cooling solutions that are only just keeping them under 85°C. What if it's a hot day and they pop up to 95? Do you panic? Do you call in sick to work? Do you fly back from your holidays?

I am also considering downclocking my GPUs rather than having them maxed out at 830MHz all the time; a few less hashes isn't going to kill me. With regard to overclocking, I wouldn't consider it. Cards are certainly not designed to be maxed out all the time, never mind overclocked.

Google "Data center liquid cooling": 3 million results. It isn't common, but it's not unheard of. Keep in mind, a bitcoin miner with 4 or even 8 GPUs per system is a lot of heat output per square foot, well into or past the more exotic datacenter setups like blade servers.

Also, on the note of 25kW of lighting, keep in mind that a high-density server cabinet can hit 30kW all by itself right now, with projections of up to 50kW coming in a couple of years (source: http://www.42u.com/liquid-cooling-article.htm). Was all 25kW of lighting confined in a 42U-sized space and, also important, all stacked on top of itself? And remember, that is one cabinet; you can get a lot of those in one room.

Also, most data centers these days are looking into a lot of other options to improve PUE (heat wheels, liquid cooling, etc.); at a certain point it isn't efficient to just throw more AC at it.
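For reference, PUE is just total facility power divided by the power that actually reaches the IT gear. A toy calculation with invented numbers, not figures from any site mentioned in this thread:

Code:
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# Illustrative numbers only.
it_power_kw      = 5.0   # the rigs themselves
cooling_power_kw = 2.0   # AC, fans, pumps
misc_power_kw    = 0.3   # lighting, network gear, etc.

pue = (it_power_kw + cooling_power_kw + misc_power_kw) / it_power_kw
print("PUE ~ %.2f" % pue)   # ~1.46; the big operators report figures near 1.2 or lower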


VPS, shared, dedicated hosting at: electronstorm.ca. No bitcoin payment for that yet, but bitcoins possible for general IT, and mining/GPGPU rigs. PM for details.
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
September 24, 2011, 03:24:49 AM
 #70

Has anyone ever heard of a datacentre using equipment other than air conditioning? I haven't.

Water cooling was much more common in early computing history. As the power density of a rack fell below 5kW it became easy to cool with air. However, power densities are rising, and for high-performance computing (which can easily reach 8kW/rack or more) water cooling is starting to come back. As someone else said, just search Google. Many supercomputers are water cooled because they have high power densities.

Mining blows those densities out of the water (no pun intended). A 4U case could hold 8 GPUs. 1200 to 1500W per system. 11 systems per rack = up to 15kW/rack. Ouch.
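Spelling that arithmetic out (same assumptions as above, nothing measured):

Code:
# Rack density for GPU mining, using the figures above.
gpus_per_4u      = 8
systems_per_rack = 11
low_w, high_w    = 1200, 1500     # per-system draw range

print("%.1f to %.1f kW per rack, %d GPUs"
      % (systems_per_rack * low_w / 1000.0,
         systems_per_rack * high_w / 1000.0,
         systems_per_rack * gpus_per_4u))
# ~13-16.5 kW and 88 GPUs per rack -- the ~15 kW/rack ballpark above.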

Quote
I see people running custom cooling solutions that are only just keeping them under 85°C. What if it's a hot day and they pop up to 95? Do you panic? Do you call in sick to work? Do you fly back from your holidays?

I am also considering downclocking my GPUs rather than having them maxed out at 830MHz all the time; a few less hashes isn't going to kill me. With regard to overclocking, I wouldn't consider it. Cards are certainly not designed to be maxed out all the time, never mind overclocked.

Heat is what matters, not load. I overclock my 5970s 37% to a 1000MHz core. You know what temp they run at 24/7? 40°C. The fans on the radiator are very low-noise too. Most people air-cool at stock and it sounds like a vacuum cleaner. My cards run very cool at a 37% higher hashrate and the entire system is quieter than a single oscillating fan.

As far as overheating goes, there is software out there to halt the miner, or even shut down the system, if the temp gets too high.
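A minimal sketch of that idea, assuming a Linux box with AMD's aticonfig available and a miner you can kill by process name (both of those are assumptions; adjust to your own setup):

Code:
# Poll GPU temperatures and stop the miner if any card runs too hot.
import re, subprocess, time

TEMP_LIMIT_C = 90    # assumed threshold; pick one that suits your cards
POLL_SECONDS = 30

def read_gpu_temps():
    # Assumes aticonfig prints lines like "Sensor 0: Temperature - 72.00 C".
    out = subprocess.check_output(["aticonfig", "--odgt", "--adapter=all"])
    return [float(t) for t in re.findall(r"Temperature\s*-\s*([\d.]+)\s*C", out.decode())]

def stop_miner():
    # Assumes the miner's command line contains "miner"; change to match yours.
    subprocess.call(["pkill", "-f", "miner"])

while True:
    temps = read_gpu_temps()
    if temps and max(temps) >= TEMP_LIMIT_C:
        stop_miner()
        break
    time.sleep(POLL_SECONDS)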

Now, nothing beats the cost of air, so if you have a lot of space, don't mind open rigs, and don't care about noise then you should air-cool. However, if you want high power density (i.e. 3x 5970 in a quiet closed case) then water cooling can't be beat. Two of my rigs sit in the garage and are air-cooled, but one of my rigs is also my workstation. It sits indoors in my office. My wife wouldn't be happy to have fans as loud as a leaf blower running 24/7 (and they still would do a piss-poor job of cooling). Even if she didn't mind, I couldn't get 3x overclocked 5970s cooled to 40°C without water cooling.
mackminer
Sr. Member
****
Offline Offline

Activity: 348
Merit: 251



View Profile
September 24, 2011, 12:34:47 PM
 #71

OK, I stand corrected, but I wonder about heat exchangers... Although not as capable of cooling densely packed blade servers, my 3x 6990 rigs have enough airflow with 4 Deltas that draw a total of 80 watts.

I am looking into ducting the exhaust heat directly to a radiator with cold water running through it - the air in the room stays the same but the conditioner does not need to cool as much.

I suppose my real question is... how much does water cooling cost to implement and maintain, and what are the risks? My initial cost for air conditioning, including the install, was about 1500 euros. It costs me about 750 euros or so in electricity every month, so the yearly running cost is 9000 euros.
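Laying that cost math out explicitly (the figures are the ones stated above; the electricity bill covers rigs and AC together):

Code:
# Annual running cost from the figures above.
install_eur     = 1500    # one-off air-con purchase + install
electricity_eur = 750     # monthly electricity (rigs + AC together)

yearly_running = electricity_eur * 12
print("running cost: %d EUR/year" % yearly_running)                          # 9000
print("first year incl. install: %d EUR" % (yearly_running + install_eur))   # 10500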

I thought that reducing the clock would mean less stress on the GPUs, not just less heat. I'm sure Xeons are designed for durability, but why would AMD GPUs be?

1BFf3Whvj118A5akc5fHhfLLwxYduMmq1d
mackminer
Sr. Member
****
Offline Offline

Activity: 348
Merit: 251



View Profile
September 24, 2011, 12:49:59 PM
 #72

Show me one datacenter that runs a bank of 6990's at full bore all the time! You won't find one. Datacenters are designed for zero downtime storage of critical data and are savagely overcooled for that purpose. Not a single person bitcoin mining could ever come close to affording that level of design.

Designed for zero-downtime storage? Storage? Storage is easily achieved with hard disks and SANs; servers are for processing. I have the same level of design as a datacenter regarding cooling. I have airflow and AC. Granted, I don't have complex issues such as hotspots and rows of racks to take into consideration, but I have the two key foundations and they work for my server room.

My home HVAC system can't even keep up with the heat produced by all my rigs running full bore 24/7. On the hottest days this summer my Carrier froze up and needed to be turned off to defrost (granted, we were cooking at that time too). Don't tell me about how old my unit is, either; the house is only 6 years old. You must live in Alaska for your AC to keep your house at 66.2°F all summer running that many 6990s.

Not sure what point you are making here, but we have a moderate climate with an average temp of 19°C over the course of the year. Sucking air out and pulling it in is still makeshift.

BTW: I don't want my fucking teeth to chatter while I'm watching TV either.

Just one ac for the datacenter here.

1BFf3Whvj118A5akc5fHhfLLwxYduMmq1d
P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
September 24, 2011, 01:09:59 PM
 #73

Designed for zero downtime storage? Storage?

http://www.youtube.com/watch?v=ioCZojN4A0g
http://www.youtube.com/watch?v=BH8X8w8a4f4

Maybe the NSA and IBM have no clue though :)

mackminer
Sr. Member
****
Offline Offline

Activity: 348
Merit: 251



View Profile
September 24, 2011, 04:02:59 PM
 #74

Show me one datacenter that runs a bank of 6990's at full bore all the time! You won't find one. Datacenters are designed for zero downtime storage of critical data and are savagely overcooled for that purpose. Not a single person bitcoin mining could ever come close to affording that level of design.

Designed for zero-downtime storage? Storage? Storage is easily achieved with hard disks and SANs; servers are for processing. I have the same level of design as a datacenter regarding cooling. I have airflow and AC. Granted, I don't have complex issues such as hotspots and rows of racks to take into consideration, but I have the two key foundations and they work for my server room.

My home HVAC system can't even keep up with the heat produced by all my rigs running full bore 24/7. On the hottest days this summer my Carrier froze up and needed to be turned off to defrost (granted, we were cooking at that time too). Don't tell me about how old my unit is, either; the house is only 6 years old. You must live in Alaska for your AC to keep your house at 66.2°F all summer running that many 6990s.

Not sure what point you are making here, but we have a moderate climate with an average temp of 19°C over the course of the year. Sucking air out and pulling it in is still makeshift.

BTW: I don't want my fucking teeth to chatter while I'm watching TV either.

Just one ac for the datacenter here.


Yes, but those processors are not running a full workload 24/7 in a residence. I work in facilities for a large multinational consulting firm and we have many server datacenters in the 24 storeys of the 32-storey building we occupy. They have very carefully controlled, huge HVAC systems, and the individual servers share the workload so that no unnecessary excess heat is produced. Just a few weeks ago I was talking to our IT director about some new rackmount APCs he wanted me to order for them. He was worried about the heat the units might generate because they were not originally designed into the system, and these units will just sit there doing nothing but waiting for a power failure. My point is: datacenters are carefully controlled specialty environments designed per application by a team of engineers and can't be even remotely compared to what we are doing here.

I misunderstood you. If you have a separate server room with independent refrigeration then you are very lucky and very rich. Why are you bitcoin mining and not just putting your refrigeration costs into buying bitcoin? The last AC unit I had replaced at my old home was a 5-ton unit; I paid just under $6,000 for it ($7,000 with installation and air handler) and needed to get a home equity loan to cover the cost. You may be able to get a smaller 2-ton (24,000 BTU) unit that doesn't short-cycle from the excess heat installed for only a couple of thousand dollars. A good short-cycle experiment to test your AC system's capacity (learned this from an HVAC guy) is to turn on your oven, open the door, set it to 175°F and leave it for 24 hours. If there is a problem with the system you will find it out in that one-day test.

"moderate climate with average temp of 19 celcius over the course of the year" Ok, this is an important missing piece of information. My annual average temp may be 10°C but at no time of the year do I maintain a constant 66.2°F/19°C. That would be just too cold to live in but in a seperate built-in server room, like you have, that might be just about right.



Well I have been saying datacenter but hashing farm is more apt.

1BFf3Whvj118A5akc5fHhfLLwxYduMmq1d
jamesg (OP)
VIP
Legendary
*
Offline Offline

Activity: 1358
Merit: 1000


AKA: gigavps


View Profile
September 27, 2011, 12:50:37 PM
 #75

In case anyone was wondering what I am cooling, here are some pics.




P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
September 28, 2011, 08:41:17 AM
 #76

Do I count 90 GPUs there? :o

jamesg (OP)
VIP
Legendary
*
Offline Offline

Activity: 1358
Merit: 1000


AKA: gigavps


View Profile
September 28, 2011, 10:09:30 AM
 #77

Do I count 90 GPUs there? :o

69 on the rack + another 13 elsewhere.
jamesg (OP)
VIP
Legendary
*
Offline Offline

Activity: 1358
Merit: 1000


AKA: gigavps


View Profile
September 28, 2011, 04:15:55 PM
 #78

How often do you have a card fail? What is your replacement cycle and cost? Are they all the same card model? Would be good to know what the best buy is in cards.

Cards I use:

MSI 6950s x 15
Sapphire 5830s x 50
AMD 5970s x 12

Everything is stable now, but I popped 4 5970s before I started using extenders with the Molex connectors. I also popped 3 6950s from when I first started and thought it was cool to run the cards at 100°C. I also had 3 5830s that didn't want to be overclocked, so I sent them back and got new ones. I have also sent back 6 PSUs and have probably 5 motherboards that still need to be RMA'd.
Big Time Coin
Sr. Member
****
Offline Offline

Activity: 332
Merit: 250



View Profile
September 29, 2011, 09:32:55 AM
 #79

Sweet! You finally put up some pics. That is a very impressive build: clean and standardized-looking. Good cable management too. :D

Big time, I'm on my way I'm making it, big time, oh yes
- Peter Gabriel
fivebells
Sr. Member
****
Offline Offline

Activity: 462
Merit: 250


View Profile
September 29, 2011, 11:46:54 AM
 #80

That NSA computer is oooold, though.