Bitcoin Forum
Poll
Question: I would be interested in buying this frame
Bare, in a kit (less than $150) - 32 (29.9%)
Bare, assembled ($150) - 4 (3.7%)
Complete (with MB and PSU trays), in a kit ($175-$190) - 40 (37.4%)
Complete (with MB and PSU trays), Assembled (more than $190...) - 31 (29%)
Total Voters: 107

Author Topic: Custom frame for BTC/LTC mining (Up to 24 GPU ) $150 - $190 - $200+  (Read 22617 times)
YipYip (Hero Member | Activity: 574 | Merit: 500)
April 30, 2013, 08:39:11 AM  #21

Quote
Damnit, where were you 2 years ago?  Grin

Quote from: matt_trade
I was designing other stuff that allowed me to do the design of this frame now.  Wink

Looks interesting... what mobo are you using that can run 6 x single-GPU cards (i.e. 7950s)?

Spendulus (Legendary | Activity: 2898 | Merit: 1386)
April 30, 2013, 11:17:09 AM  #22

Quote from: Spendulus
I think if you removed every other card in the GPU arrays, cooling would be possible. An attempt to direct airflow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow: the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, and at distances of 1/16 to 1/4 inch, laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the sum of it is: put them further apart...

Quote from: matt_trade
We will test with an enclosure around the rack; we do want to try to force the air through it and see what happens.

I have done some air cooling tests on SLI/Crossfire setups. I'm not saying it's perfect, but you can keep your cards at reasonable temperatures under full load if you have a fan blowing directly onto them.
Here is a good example of a setup with a 20" box fan and very little spacing (0.06" -- standard): http://www.youtube.com/watch?v=2nDTBN_cPs0
The spacing between brackets (not cards; for the cards it should be slightly more) in the rack is 0.35", or roughly 3/8". It's not a lot, but it is about 1/4" more than in a standard Crossfire/SLI setup (0.06"). I think the fresh air will be able to get in between and reach the GPU fan, which will push it to the top.
I also count on some help from naturally occurring convection, which will create a chimney pull effect. Silverstone's FT02 uses this concept and the results are very encouraging. And like I said, this is not new technology; similar arrangements have been used in high-end applications. I'm just trying to adapt it for our application.
But I think only testing will show whether this approach works. Results to be published in about two weeks!

If you want to stack the rack in a server rack, you will have to put shrouds on the top and bottom of the rack, as depicted in one of the pictures. The shrouds will have to be much larger than 1/4" because they will have to contain the fan(s) that push (/pull). One idea is that we could sandwich the rack between two fans to improve airflow.

Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means the correct design makes the spaces between cards the spaces the air preferentially flows through. You could test this with one graphics card by putting a flat plate on both sides of it at various distances, then turning the card on.

Another issue comes from the card positioning: the airflow output of one card becomes the input of the next. Thus in a row of cards, each receives warmer intake air than the previous one. The fix for this is flat-plate, or preferably curved, diverters separating each card and directing the airflow out of the row and upwards.
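
To put rough numbers on that laminar-flow point, here is a quick back-of-the-envelope in Python. The 0.06" and 0.35" gaps are the figures from this thread; the ~2 m/s velocity through the gap is an assumption, not a measurement, so treat this as a sketch:

Code:
# Reynolds number for air flowing in the gap between adjacent cards.
# Hydraulic diameter of a thin parallel-plate channel is ~2x the gap.
# The 2 m/s face velocity is an assumed figure, not a measured one.

NU_AIR = 1.5e-5   # kinematic viscosity of air at ~20C, m^2/s
INCH = 0.0254     # meters per inch

def gap_reynolds(gap_inches, velocity_ms):
    d_h = 2 * gap_inches * INCH
    return velocity_ms * d_h / NU_AIR

for gap in (0.06, 0.35):
    re = gap_reynolds(gap, velocity_ms=2.0)
    regime = "laminar" if re < 2300 else "transitional/turbulent"
    print('%.2f" gap: Re ~ %.0f (%s)' % (gap, re, regime))

At that assumed velocity the stock 0.06" gap sits deep in the laminar regime (Re around 400), where the frictional losses mentioned above dominate, while the 0.35" gap lands near the Re ~ 2300 transition. So the wider spacing genuinely changes the flow regime, not just the cross-sectional area.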
Cablez (Legendary | Activity: 1400 | Merit: 1000)
"I owe my soul to the Bitcoin code..."
April 30, 2013, 12:16:49 PM  #23

OK I am back with only a few minor comments.

1) You might consider attaching the GPUs to the upper side of the support, so as not to cover any of the cards' rear vents; I have found that covering them increases temps, even when they are only marginally obstructed.

2) I have an FT02 that I used for mining at one time and found it insufficient to handle large heat loads, even in that orientation. What does help, though, is a large amount of air being pushed from the bottom (i.e. not just the three 180mm fans it came with). Make sure to have lots of static pressure as well, and a fairly open exit.

3) The shroud for the top could do with being more open, that is, not such a steep curve on the exit. A steep curve will lead to some pooling of hot air that will be more difficult to remove.

That's it for now, if I think of anything else I will get back to you. Nice setup and keep improving. Smiley

matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
April 30, 2013, 03:32:27 PM  #24

Quote from: Spendulus
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means the correct design makes the spaces between cards the spaces the air preferentially flows through. You could test this with one graphics card by putting a flat plate on both sides of it at various distances, then turning the card on.

Another issue comes from the card positioning: the airflow output of one card becomes the input of the next. Thus in a row of cards, each receives warmer intake air than the previous one. The fix for this is flat-plate, or preferably curved, diverters separating each card and directing the airflow out of the row and upwards.

Here is a picture of the rack from the bottom. This is what the air will be forced into. I guess we could test and see how much gap is needed to have the perfect setup. But as I said, there are physical constraints and we are at the limits... Don't forget that natural convection and, most importantly, the GPU fans will create a natural upward flow and a "low pressure zone" between the cards, into which the fresh air coming from the bottom will want to rush.
http://www.awtti.com/images/LTCminingrig13.png

I think we are thinking the same thing as far as card positioning, although I'm a bit confused. Here is the drawing I posted about stacking the racks. It's just one idea among many possible setups.
http://www.awtti.com/images/LTCminingrig10.png
I would not recommend having the air go through one rack directly into the next, as it would seriously warm up the second rack.
If I'm not understanding what you mean, could you possibly do a little schematic of what you think the problem is/will be?
matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
April 30, 2013, 03:37:47 PM  #25

Quote from: YipYip
Looks interesting... what mobo are you using that can run 6 x single-GPU cards (i.e. 7950s)?

I'm working on that as we speak.
You can find some info here: https://bitcointalk.org/index.php?topic=186877.20
On the second page I posted a list of threads that deal with that. I'm working with Boozer to find a stable setup.
I will also be working with the Linux developer to test the setup under Linux and optimize it for 6 GPUs.
matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
April 30, 2013, 04:03:09 PM  #26

Quote from: Cablez
OK I am back with only a few minor comments.

1) You might consider attaching the GPUs to the upper side of the support, so as not to cover any of the cards' rear vents; I have found that covering them increases temps, even when they are only marginally obstructed.

2) I have an FT02 that I used for mining at one time and found it insufficient to handle large heat loads, even in that orientation. What does help, though, is a large amount of air being pushed from the bottom (i.e. not just the three 180mm fans it came with). Make sure to have lots of static pressure as well, and a fairly open exit.

3) The shroud for the top could do with being more open, that is, not such a steep curve on the exit. A steep curve will lead to some pooling of hot air that will be more difficult to remove.

That's it for now, if I think of anything else I will get back to you. Nice setup and keep improving. Smiley

Excellent feedback!!! Thank you. Wink

1) I will probably switch the upper support to a 0.5" (0.5" x 1") profile so that the rear vents are not obstructed. That's an easy fix, but I HAD to go with the profile I used because the other one was not in stock. The rack is designed so that this particular profile (the one holding the graphics cards) can be toyed with, replaced, modified, moved back and forth, etc. It's the only portion of the rack that will be adjustable.
http://thumbs2.ebaystatic.com/m/mm7Ao65B0ktfUqII7gOv5ig/140.jpg

2) In the case of the FT02, which is a great example, you really "only" have 1 x 180mm fan (of the three) blowing on your SLI/tri-SLI setup (150 CFM / 3 cards). With the rack, I'm counting on 300 CFM / 3 cards minimum (20" box fan), or 500 CFM with a 24" industrial fan.
http://i792.photobucket.com/albums/yy208/dangcjr/DSC_0086.jpg

For the cooling of the rack, I did consider something with high static pressure: a PT Cruiser radiator fan:
http://i.ebayimg.com/t/New-Radiator-Fan-Cooling-Chrysler-PT-Cruiser-2005-2004-2003-2002-2001-CH3115118-/00/s/NTAyWDUzMA==/z/AyAAAMXQ0v1RcOmo/$(KGrHqNHJEQFDjwVMzBBBRcOmoMJ9g~~60_12.JPG
http://images1.carpartsdiscount.com/auto/archive/pictures/135625/600/1/P/420D614/chrysler_pt_cruiser_radiator_cooling_fan_oem_5017407ab.jpg

I have a few questions on your setup. Did you run the case open? If so, did it make a difference versus the case closed? What was the spacing between your cards (the standard 0.06")? Did you mod it or switch the bottom fans to get better cooling?

3) I drew the shroud on top with Word, so it is just a concept. I think we could test various angles and depths to make sure it does not restrict the airflow.

I'm looking for all possible suggestions to improve cooling and setup. Anything you suggest will be used to better the rack.
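
A rough energy balance turns those CFM-per-card figures into a bulk air temperature rise, via dT = Q / (rho * cp * V). The airflow numbers are from the post above; the ~250 W per 7950 is an assumption:

Code:
# Bulk temperature rise of the cooling air: dT = Q / (rho * cp * V).
# Assumes all of the quoted airflow actually passes over the cards.

RHO_CP = 1.2 * 1005        # air: density (kg/m^3) x specific heat (J/kg.K)
CFM_TO_M3S = 4.719e-4      # cubic meters per second in one CFM

def delta_t(card_watts, cfm_per_card):
    return card_watts / (RHO_CP * cfm_per_card * CFM_TO_M3S)

# ~250 W per 7950 is an assumed figure; the CFM splits are from the post.
for label, cfm in (("FT02, 150 CFM / 3 cards", 50),
                   ("20in box fan, 300 CFM / 3 cards", 100),
                   ("24in industrial, 500 CFM / 3 cards", 167)):
    print("%s: bulk air rise ~%.1f C" % (label, delta_t(250, cfm)))

Even the FT02's 50 CFM per card only warms the bulk air by about 9 C, so when cards run hot in these setups it is usually a distribution problem (air bypassing the gaps) rather than a shortage of total CFM, which is exactly what the shroud discussion is about.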
GuiltySpark343 (Member | Activity: 98 | Merit: 10)
April 30, 2013, 04:39:03 PM  #27

Oil cooling, only way to go for this setup:
http://www.maximumpc.com/article/features/hardcorepc_reactor

matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
April 30, 2013, 04:48:39 PM  #28

Quote from: GuiltySpark343
Oil cooling, only way to go for this setup:
http://www.maximumpc.com/article/features/hardcorepc_reactor

Well, I guess we'll know pretty soon.
I like the oil cooling technique because, in the case of the rack, it will allow for heat recycling all year long with the use of an oil/water heat exchanger (preheating a hot water heater's intake, use as a pool heater, etc.). And it will make the rack absolutely quiet and very stable.
CartmanSPC (Legendary | Activity: 1270 | Merit: 1000)
April 30, 2013, 07:35:19 PM (last edit: April 30, 2013, 09:39:01 PM)  #29

Quote from: matt_trade
My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer, so that it fits standard fans (20" and 24").
So I built the rack based on these limitations.

I am looking at having something like this built myself. I need to put it into a standard 19" rack though. When you say 19" wide, do you really mean around 17.8", since that is the maximum that will fit in a 19" rack? I was going to do 17.5" myself. Most server enclosures are 16.9-17.1" to accommodate the rails.

I was shooting for only 8 cards across that 17.5", knowing that the most an affordable MB could use is 7. Thinking 7-8 cards across 17.5" would allow enough space between them for airflow. I could reduce the number of cards as necessary.

My design was going to have the cards on top and the MB under them. The PSUs were going to be under the MB. I was going to have the top, bottom, and sides enclosed, with the front and back open for push/pull fans. The front would push in cold air and the back would pull out hot air. This is for a hot/cold aisle in the server rack.

These are just my thoughts. Haven't gotten as far as you in designing it. Keep up the good work!  Cool
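
A quick pitch check on those numbers (the 17.5" usable width is from the post above; the 1.5" dual-slot card thickness is a nominal assumption):

Code:
# Card-to-card gap for a given usable width and card count.
# 17.5" usable width is from the post; 1.5" per dual-slot card is nominal.

usable_width_in = 17.5
card_thickness_in = 1.5

for n_cards in (7, 8):
    pitch = usable_width_in / n_cards
    gap = pitch - card_thickness_in
    print('%d cards: pitch %.2f", gap %.2f"' % (n_cards, pitch, gap))

So 7-8 cards across leaves roughly 0.7" to 1.0" between cards, two to three times the 0.35" gap in the 24-card frame.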


matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
April 30, 2013, 09:45:39 PM (last edit: May 01, 2013, 12:06:52 AM)  #30

Quote from: matt_trade
My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer, so that it fits standard fans (20" and 24").
So I built the rack based on these limitations.

Quote from: CartmanSPC
I am looking at having something like this built myself. I need to put it into a standard 19" rack though. When you say 19" wide, do you really mean around 17.8", since that is the maximum that will fit in a 19" rack? I was going to do 17.5" myself. Most server enclosures are 16.9-17.1" to accommodate the rails.

I was shooting for only 8 cards across that 17.5", knowing that the most an affordable MB could use is 7. Thinking 7-8 cards across 17.5" would allow enough space between them for airflow. I could reduce the number of cards as necessary.

My design was going to have the cards on top and the MB under them. The PSUs were going to be under the MBs. I was going to have the top, bottom, and sides enclosed, with the front and back open for push/pull fans. The front would push in cold air and the back would pull out hot air. This is for a hot/cold aisle in the server rack.

These are just my thoughts. Haven't gotten as far as you in designing it. Keep up the good work!  Cool

Hi Cartman,
Unfortunately, it's 19" total width... If we did half the number of cards, we could make it smaller, but we would also have half the density. Somebody mentioned they could make it fit with a 19" width... So it will fit inside the rack, but you might have to remove the front rails to set it up. Not ideal, but the best we can do for now considering the space we need. NB: the limitation comes from the space between the MB and the CPU, the space between the MB and its tray, and the space between the tray and the PSU. If we can trim 1.25" total, or 0.625" on each side, then you will be able to fit it on rails.
Right now, you do have to consider that we have a limitation of 6 cards (maybe 7, but hard to pull off) per motherboard, so we have to do multiples of 6.
ssateneth (Legendary | Activity: 1344 | Merit: 1004)
April 30, 2013, 10:00:16 PM  #31

There's so much hype about oil cooling in this thread... When GPU mining becomes unprofitable, you will be left with thousands of dollars in paperweights. Nobody on eBay will want to buy an oily graphics card. They'll think, "Wtf, this is wet, I don't want to plug this in. It will fry all my equipment!"

On topic though, I would be interested if each card had 3 slots of space. I don't want my equipment to suffocate.

matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
May 01, 2013, 12:32:04 AM  #32

Quote from: ssateneth
There's so much hype about oil cooling in this thread... When GPU mining becomes unprofitable, you will be left with thousands of dollars in paperweights. Nobody on eBay will want to buy an oily graphics card. They'll think, "Wtf, this is wet, I don't want to plug this in. It will fry all my equipment!"

On topic though, I would be interested if each card had 3 slots of space. I don't want my equipment to suffocate.

ssateneth, if you want to sell a card that was submerged in oil, it must go through a soap-and-water cleaning (by hand or dishwasher). If the card went into the oil when new, it will stay that way. Once it is clean, you will not be able to tell the difference between the oil-cooled one and the air-cooled one. Actually, the air-cooled one will be dusty and might have overheated in some spots where the air cooling was not adequate (typical with dust accumulation). With the oil-cooled one, you know the entire board was kept at a fairly uniform, somewhat low (46C-55C) temperature for its entire life. There is a reason Green Revolution Cooling (http://www.grcooling.com/) is building a multi-million dollar business on it.

"When GPU mining becomes unprofitable"...: there are already cryptocurrencies that can only be mined on CPU/GPU. So it might take a while before GPU mining is unprofitable for all the other cryptocurrencies, including the new ones that will surely come out. If you think about it, when GPU mining becomes unprofitable for a currency, it will damage it, because most of the people who support the currency, the mass of miners, are GPU mining. When they all leave, replaced by a few ASIC pools, you lose redundancy, you lose security, and you lose interest and the free hype that the miners generate. That is literally the foundation LTC is based on: trying to avoid this particular phenomenon.

If you read about oil cooling on GRC's website, you'll also notice that oil cooling has been claimed to improve energy efficiency by up to 20%, which would keep GPU mining profitable for a longer period. For the few of us who will be able to recycle some of the waste heat, it might keep GPU mining viable even longer. So far, my calculations for a 24-GPU rack show that you can build an oil cooling system for around $900. Not cheap, but doable if it saves you 15-20% on electricity costs (and improves your mining capacity). Finally, ASICs were developed to mine Bitcoin and that's all they will do. Your GPU rack will probably have many more applications developed for it.
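
A rough payback sketch for that $900 figure; the 6.4 kW load is this thread's 4 x 1600W PSU ceiling, while the $0.10/kWh rate and 24/7 duty cycle are assumptions:

Code:
# Payback time of a ~$900 oil cooling loop from a 15-20% energy saving.
# Electricity price and 24/7 duty cycle are assumptions, not measurements.

system_cost = 900.0               # USD, from the post
rack_kw = 6.4                     # 4 PSUs x 1600 W
price_kwh = 0.10                  # USD per kWh -- assumed
hours_per_year = 24 * 365

for saving in (0.15, 0.20):
    saved = rack_kw * hours_per_year * saving * price_kwh
    print("%d%% saving: $%.0f/yr, payback ~%.1f years"
          % (saving * 100, saved, system_cost / saved))

At full load and typical residential rates, the loop pays for itself in roughly a year, before counting any recycled heat.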

If you want to space out your cards, you can skip every other slot. I intend to prove to you that this rack will be able to manage the cards just as they are, with the right amount of air. Keep following the thread; I will post updates from the beta test.
Spendulus (Legendary | Activity: 2898 | Merit: 1386)
May 01, 2013, 11:49:49 AM  #33

Quote from: Spendulus
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means the correct design makes the spaces between cards the spaces the air preferentially flows through. You could test this with one graphics card by putting a flat plate on both sides of it at various distances, then turning the card on.

Another issue comes from the card positioning: the airflow output of one card becomes the input of the next. Thus in a row of cards, each receives warmer intake air than the previous one. The fix for this is flat-plate, or preferably curved, diverters separating each card and directing the airflow out of the row and upwards.

Quote from: matt_trade
Here is a picture of the rack from the bottom. This is what the air will be forced into. I guess we could test and see how much gap is needed to have the perfect setup. But as I said, there are physical constraints and we are at the limits... Don't forget that natural convection and, most importantly, the GPU fans will create a natural upward flow and a "low pressure zone" between the cards, into which the fresh air coming from the bottom will want to rush.

I think we are thinking the same thing as far as card positioning, although I'm a bit confused. Here is the drawing I posted about stacking the racks. It's just one idea among many possible setups.
I would not recommend having the air go through one rack directly into the next, as it would seriously warm up the second rack.
If I'm not understanding what you mean, could you possibly do a little schematic of what you think the problem is/will be?
The problem is/will be/always is variations in heat transfer between different parts of the boards.

You can figure this pretty easily: put the whole thing inside an 18"x18" by 8' tube made of, say, building-supply 1/2" foam, then measure the airflow (which can be done by checking the RPM of a fan blade with a strobe) and the temperature in and out.
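
Those two measurements are enough to close the energy balance; a small sketch (the example readings are invented):

Code:
# Heat carried out of the duct: Q = rho * cp * V * dT.
# If this comes out well below the rig's wall power, air is
# leaking around the cards instead of flowing through them.

RHO_CP = 1.2 * 1005       # air: density x specific heat, J/(m^3.K)
CFM_TO_M3S = 4.719e-4

def watts_removed(cfm, t_in_c, t_out_c):
    return RHO_CP * cfm * CFM_TO_M3S * (t_out_c - t_in_c)

# Invented example readings: 400 CFM measured, 18C in, 26C out.
print("~%.0f W carried away" % watts_removed(400, 18, 26))

Compare that figure against the rig's wall wattage to see how much of the flow is doing useful work.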

The increase in cooling efficiency as you move things apart is exponential, not linear, for at least the first inch or so. The best way to handle a row of cards would be to put flat-plate diverters between each card, made from plastic or cardboard, not metal. That would somewhat eliminate card-to-card radiative heating.
matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
May 01, 2013, 04:47:31 PM  #34

Here it is, ready to ship!
http://www.awtti.com/images/LTCminingrig14.jpg
rob1313 (Full Member | Activity: 162 | Merit: 100)
May 01, 2013, 06:13:15 PM  #35

First thing to do: drill holes and mount caster wheels.  Smiley

ISAWHIM (Hero Member | Activity: 504 | Merit: 500)
May 07, 2013, 01:39:54 PM  #36

You really need to show it populated with actual equipment, not theoretical equipment.

90% of that structure is useless, and it's expensive for such a simple structure.

Missing things... any kind of cover, to actually allow air to "flow" over the components; any form of exhaust-vent attachment for the 12 kW heater/GPU exhaust (you will not be cooling that with an AC unit); a filter attachment; and a blower fan. (A free-flow fan will not suffice; that is why they don't use them in professional setups. You need a blower with high static pressure, since this is not "unrestricted airflow". Free-flow fans are just a cheap and ineffective alternative to what is actually needed.)

I'll sell you my old fridge, which is better suited for 2x the volume of your structure. Same price, insulated, sealed, ready for ducting, and has built-in supplemental cooling for moisture-extraction of ambient outside air. $150 OBO, pickup only. Tongue
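
The static-pressure point above is worth quantifying: a free-flow fan only delivers its rated CFM at near-zero back-pressure. A toy operating-point calculation (both curves are invented for illustration; real fans publish measured P-Q curves):

Code:
# Where a fan's pressure-flow curve meets the system's resistance
# curve (dP = k * Q^2) is the flow you actually get. Both curves
# below are invented for illustration.

def fan_dp(q_cfm):
    # Toy linear fan curve: 60 Pa at zero flow, 500 CFM free-flowing.
    return 60.0 * (1 - q_cfm / 500.0)

def system_dp(q_cfm):
    # Toy resistance of a packed card array.
    return 0.0012 * q_cfm ** 2

lo, hi = 0.0, 500.0
for _ in range(50):             # bisect for fan_dp(q) == system_dp(q)
    mid = (lo + hi) / 2
    if fan_dp(mid) > system_dp(mid):
        lo = mid
    else:
        hi = mid
print("Operating point: ~%.0f CFM of the rated 500 CFM" % lo)

With these made-up curves, the "rated 500 CFM" fan actually moves about 180 CFM through the restriction; a blower with a steeper curve holds far more of its rating against the same resistance, which is the point about professional setups.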
matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
May 07, 2013, 02:27:20 PM  #37

Quote from: ISAWHIM
You really need to show it populated with actual equipment, not theoretical equipment.

90% of that structure is useless, and it's expensive for such a simple structure.

Missing things... any kind of cover, to actually allow air to "flow" over the components; any form of exhaust-vent attachment for the 12 kW heater/GPU exhaust (you will not be cooling that with an AC unit); a filter attachment; and a blower fan. (A free-flow fan will not suffice; that is why they don't use them in professional setups. You need a blower with high static pressure, since this is not "unrestricted airflow". Free-flow fans are just a cheap and ineffective alternative to what is actually needed.)

I'll sell you my old fridge, which is better suited for 2x the volume of your structure. Same price, insulated, sealed, ready for ducting, and has built-in supplemental cooling for moisture extraction of ambient outside air. $150 OBO, pickup only. Tongue

Hi Isawhim,

I thought I would address portions of this post.
I'm going to show it populated with actual equipment; this is the objective of the beta test phase.

I'm not getting what you are saying about 90% of the structure being useless. There will be a lot of weight attached to the structure, and each piece plays a role in either supporting or reinforcing the frame so that it can support the weight of the hardware and PSUs. Do you have some kind of engineering study to support your claim? Do you mind sharing your insight on how you would remove 90% of the components and still have something that would work and correctly support and protect the equipment?
As far as price goes, I'm doing my best to keep it down, but the aluminum profiles are not cheap, the fasteners are expensive, and you still have to pay somebody to cut and drill every piece. It's made here in the USA, so labor is not that cheap. Right now, it costs less than $50 per 6 GPUs, which is not too bad if the cooling is adequate. The idea is that the frame will help put together a setup that is easier to cool (and saves energy, allowing the rack to pay for itself).
The system is designed to handle 6 kW, not 12 kW. I'm not sure how you got that number, knowing that the limitation is 4 PSUs, or 6400W max... If you want to enclose the frame, you absolutely can. I already plan on doing that when I run the oil cooling test.
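
For the record, the simple budget behind those numbers (the PSU ceiling is from this thread; the BTU conversion is standard):

Code:
# Power and heat budget for the full rack: 4 PSUs at 1600 W each.
psus, watts_each = 4, 1600
total_w = psus * watts_each            # 6400 W ceiling

btu_per_hr = total_w * 3.412           # 1 W = 3.412 BTU/h
tons_of_ac = btu_per_hr / 12000.0      # 1 ton of cooling = 12,000 BTU/h

print("Max load: %d W = %.0f BTU/h (~%.1f tons of AC)"
      % (total_w, btu_per_hr, tons_of_ac))

So at the 6400W ceiling you would need roughly a 2-ton air conditioner to fight the heat indoors, or, better, duct it outside (or into the house in winter).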
If you want to use a fan with high static pressure, you also can. But I will start testing with regular fans and see what happens.

You are talking about using a refrigerator that is twice the volume, which defeats the purpose of the rack design (a compact setup). We already know that tight spacing with the right airflow can work:
https://encrypted-tbn2.gstatic.com/images?q=tbn:ANd9GcRMk0ZteKoFZO0kqXhSdfCn1uLCdaJKOUc6j8FLFqYoEqmay9l5
Debating this issue is pointless until I start testing with the current spacing. My theory is that if you can make it work with the cards tightly packed together, why couldn't you with an additional 10mm between them?

I would love to see what happens in the insulated enclosure if the fan stops working; this would be very interesting.

Update: I will receive the first frames on the 8th. I will post some pictures then.


matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
May 07, 2013, 02:29:33 PM  #38

The first test will be on the bare frame. I'm still waiting on the acrylic trays.
As soon as they are made, I will mount them and redo some testing to make sure they are not affecting the airflow. I did make a modification to them to allow better airflow under the motherboard, by creating an opening in front of the PSU fan. It will also reduce any possible airflow restriction to the PSU, and it reduces the overall material cost (acrylic) for all the trays.
matt_trade (OP) (Newbie | Activity: 28 | Merit: 0)
May 11, 2013, 12:40:58 AM  #39

FIRST PRE-TESTING RESULTS / PRE-BETA
Hi everybody, I'm happy to report that everything is going fairly well and that the first results are positive.
I have so far mounted 4 cards (7950 Gigabyte WF3 with F43 BIOS, running BAMT); the 5th is coming this afternoon. I found the following:
Depending on the ambient air temperature (tested at 17-18C outside), I got the thing stable at 67C-72C and 1080W. Hash rates averaged 580-600 kH/s during testing.
For cooling, I used a 24" industrial fan (on low speed), which only really pushes air at its edge (a design flaw), which is not good at all. It seems a shroud would help a lot to push the air in between the cards rather than around the frame.
I'm going to see if I can get some spacers to keep the cards correctly positioned. Right now I used a foam spacer, because otherwise some of them touch (you'll understand when you see the pictures).
Posting again soon.

The 5-card setup so far. Waiting for #6 and some special parts to customize the rack.
http://www.awtti.com/images/LTCrig1.JPG
http://www.awtti.com/images/LTCrig2.JPG
http://www.awtti.com/images/LTCrig3.JPG
http://www.awtti.com/images/LTCrig4.JPG
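
Working the numbers from that run: the post doesn't say whether 580-600 kH/s is per card or for the whole rig, so the sketch below assumes per card (typical for a 7950 on scrypt):

Code:
# Efficiency from the pre-beta run: 4 cards, 1080 W at the wall.
# Assumes the quoted 580-600 kH/s is per card; if it is the rig
# total, divide the result by 4.

wall_watts = 1080.0
cards = 4
khs_per_card = (580 + 600) / 2.0

watts_per_card = wall_watts / cards
print("%.0f W/card at the wall, ~%.2f kH/s per watt"
      % (watts_per_card, khs_per_card / watts_per_card))

If the per-card reading is right, the cards are hashing at full speed, so the 67-72C temperatures aren't costing any performance yet.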
ISAWHIM (Hero Member | Activity: 504 | Merit: 500)
May 11, 2013, 04:26:16 PM  #40

Fridges come in all sizes...

and as for your statement... "I would love to see what happens in the insulated enclosure if the fan stops working, this would be very interesting."

Same thing that would happen in an open-air design... if the fan stops working, it shuts down that card, or the whole rig. In my 25-plus years building electronics, I have NEVER had a fan die. I don't buy cheap crap fans from China... I also don't use stock crap fans.

As for an enclosure, which would actually be better suited for your design: try a large microwave; junkyards are full of them, for free. Larger? Upgrade to a dryer and gut it. Larger still? Use a chest fridge (remove the insulation if you desire, it is just foam).

All I was saying is that you are building a "core" without consideration for any actual evacuation of heat. That unit will be putting out 3000+ watts, heating a whole house in minutes.

Hot air does NOT only exhaust out the end of a card. It exhausts in all directions, which is why they expect you to contain it and evacuate it with additional wattage-sucking fans. (Hidden costs.)

Or you contain the cards, like I do, in a shell: rip off the PSU/mobo/video-card fans and use ONE external 120V/240V fan to replace them all. That makes more power available to the graphics cards and the system, and provides quiet, better-structured cooling that actually works.

Your hidden cost is all the expensive stuff beyond the "frame". For the frame, $150 is a great price. But that isn't a "turn-key" frame; it still requires additional setup, installation, and building. If buyers can manage that other building, then I am sure they can manage building the simple frame.

Still, you are "manufacturing" based on "theory"... Thus: show it populated and working. (Funny that you keep showing a "failed project" as an example. That was never working either; it was the one that was half-loaded because he couldn't get all 16 cards working on one motherboard. Completely unrelated to your build.)

If the cards have a shroud and an evacuation path, you can remove the existing shrouds and place them 1mm apart. Stock, it is best to spread them out so they can MIX air better, mixing more cool air with the hot exhaust they spew in all directions. (Which then still has to be evacuated, now that it is mixed half hot/cool air.)