Bitcoin Forum
Author Topic: Extending the mining pool, err.. more rigs  (Read 1367 times)
shat
Newbie
*
Offline Offline

Activity: 21


View Profile
May 18, 2011, 12:53:58 AM
 #1

Alright, so I'm currently in the process of extending a personal mining pool that I operate with two other friends. I just received more Sapphire Radeon 5850s and plenty of mobos, CPUs, memory, PSUs, etc. to build out. However, I'm curious. I was originally planning on running only two cards in each machine, or perhaps three, but if I run three on the motherboard, I'm worried about the heat on one of the cards, because it will sit directly next to another card.

New hardware:
http://s4.postimage.org/30tx7i6zo/photo_1.jpg

Additionally, I've thought about putting 4 to a machine. 

The motherboards are MSI K9A2 Platinums.

My question is this: each card, running at 850/900 at 1.088V (stock voltage), consumes in the neighborhood of 250W (give or take 20W) at full load. Do you think the heat would be too much? Currently all my 5850s are two to a box, at the core/memory clocks and voltage mentioned above, and operating at 60-62C without any issues. Granted, they're running on a datacenter floor with plenty of cooling, so I'm not worried about the room temperature; it's already quite cool, ~64F.
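As a sanity check on those numbers, here's a rough sketch in Python. Only the 250W-per-card figure comes from this post; the system overhead and PSU efficiency values are my own ballpark assumptions.

```python
# Rough wall-draw estimate for a multi-card mining box.
# 250 W per 5850 at full load is the figure from the post above;
# the 100 W CPU/mobo/drive overhead and 82% PSU efficiency are guesses.

def rig_wall_draw(cards, watts_per_card=250.0, system_overhead=100.0,
                  psu_efficiency=0.82):
    """Estimated AC draw in watts for the whole rig."""
    dc_load = cards * watts_per_card + system_overhead
    return dc_load / psu_efficiency

def amps_at(watts, volts=120.0):
    """Current drawn from the mains circuit at the given voltage."""
    return watts / volts
```

Under these assumptions a four-card box lands around 1340W at the wall, a bit over 11A on a 120V circuit, which is worth knowing before loading up a rack.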

This is the proposed configuration of four:
http://s2.postimage.org/26o1ijtyc/photo.jpg

Do you think it will be an issue?

All in all this will add another 3GH/s to our pool, which is always great! :)

allinvain
Legendary
*
Offline Offline

Activity: 1988



View Profile
May 18, 2011, 02:24:36 AM
 #2

Damn, I'm jealous. Wish I had all that hardware to play with :P

60-62C for your existing dual-5850 setup is a very good temperature range. I fear that if you put four of them right next to each other like that, you will have some major problems cooling them. First off, heat will bleed into the card next door (typically the one above it), and the net effect is that the top card will be baked like a cookie. Second, I'd recommend you leave a nice gap between the cards; the fans need some space to intake air.

I can't quite tell from that picture, but are those 5850s using a turbine (blower) cooler or a non-reference design (i.e. aftermarket)? If open fans instead of a turbine, then I think you're asking for trouble putting them that close together with no room to breathe. If a turbine, you may be able to get away with it if you, I dunno, jam something in between them to open up at least a tiny gap. Also, if you do this, put two high-CFM fans at the back where the intake vents are and run them at full blast to ensure that cool air is force-fed into them.

If you can get each card to run without going above 80-90C, you're golden :)


1bitc0inplz
Member
**
Offline Offline

Activity: 112


View Profile
May 18, 2011, 02:39:45 AM
 #3

Dang, that's a lot of processing power. I'm jealous. :D

Mine @ http://pool.bitp.it - No fees, virtually 0 stales, what's not to love!
Chat with us @ #bitp.it on irc.freenode.net
Learn more about our pool @ http://forum.bitcoin.org/index.php?topic=12181.0
rezin777
Full Member
***
Offline Offline

Activity: 154


View Profile
May 18, 2011, 03:20:50 AM
 #4

I think those particular cards (meaning their coolers) will perform poorly when stacked together like that. I've only had luck with the reference design when sandwiching is involved. I'm interested to see your results, though.
shat
Newbie
*
Offline Offline

Activity: 21


View Profile
May 18, 2011, 03:23:35 AM
 #5

Quote from: allinvain on May 18, 2011, 02:24:36 AM

Damn, I'm jealous. Wish I had all that hardware to play with :P

60-62C for your existing dual-5850 setup is a very good temperature range. I fear that if you put four of them right next to each other like that, you will have some major problems cooling them. First off, heat will bleed into the card next door (typically the one above it), and the net effect is that the top card will be baked like a cookie. Second, I'd recommend you leave a nice gap between the cards; the fans need some space to intake air.

I can't quite tell from that picture, but are those 5850s using a turbine (blower) cooler or a non-reference design (i.e. aftermarket)? If open fans instead of a turbine, then I think you're asking for trouble putting them that close together with no room to breathe. If a turbine, you may be able to get away with it if you, I dunno, jam something in between them to open up at least a tiny gap. Also, if you do this, put two high-CFM fans at the back where the intake vents are and run them at full blast to ensure that cool air is force-fed into them.

If you can get each card to run without going above 80-90C, you're golden :)



I'm not sure if they're reference or not; they're Sapphire Radeon HD 5850s, specifically the Sapphire Radeon 5850 on TigerDirect. They appear to have an open fan, not a turbine of any type. I can't tell whether they take air in through the back or exhaust out the back. I'm looking at a rig now that's been running for a while, and it's at 66C on both GPUs, with one fan at 75% and the other at 53%. The one at 75% is because it's just as close to the PSU (on the bottom) as these four cards will be to one another. Then again, they're not that close to each other, and I know the GPU puts off more heat than anything else in the case.

I might test it, but I might just put three in anyway, or perhaps only two, and be stuck with a spare card alone in another machine until more come in.
fpgaminer
Hero Member
*****
Offline Offline

Activity: 546



View Profile WWW
May 18, 2011, 03:28:35 AM
 #6

I have Sapphire Radeon 5850's from TigerDirect. Three to a rig. See my post here.

They are right next to each other, as you propose, and after tweaking the setup they run just fine, <70C. To summarize here:

Move the PSU away from the cards if you have a case that puts them close. If you're caseless, well, no worries!

Use some non-conductive material to spread the cards apart at the back (behind the fan, making sure not to block it!). This allows more air to flow to the middle card(s). Folded paper, cardboard, or plastic all work.

Drop the memory clock to 300MHz or lower.

If you have a case, make sure to put some fans on the side panel blowing on the cards.

EDIT: I've tested 850/285 on one rig and she runs fine. In fact, that's on the rig that isn't even set up correctly (PSU too close and no spacers between the cards).
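For anyone automating the checklist above, here's a minimal temperature-watchdog sketch. The sensor read is a placeholder; on a real rig you'd pull the temperatures from the driver (e.g. aticonfig) rather than passing them in by hand.

```python
# Flag any card running hotter than the <70C target mentioned above.
# temps_c is a list of per-card temperatures in Celsius; in practice
# you'd populate it from your driver or monitoring tool of choice.

def cards_over_limit(temps_c, limit_c=70.0):
    """Return (index, temperature) pairs for cards exceeding limit_c."""
    return [(i, t) for i, t in enumerate(temps_c) if t > limit_c]
```

Run it on a cron or in a polling loop and alert (or kill the miner) when the returned list is non-empty.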

1bitc0inplz
Member
**
Offline Offline

Activity: 112


View Profile
May 18, 2011, 03:57:07 AM
 #7

Drop the memory clock to 300MHz or lower.

Does lowering the memory clock just shave off some of the power consumption? Or is this a heat thing too?

Either way, would this apply to all Radeons? Sorry, just curious.

allinvain
Legendary
*
Offline Offline

Activity: 1988



View Profile
May 18, 2011, 04:32:50 AM
 #8

Quote from: shat on May 18, 2011, 03:23:35 AM

I'm not sure if they're reference or not; they're Sapphire Radeon HD 5850s, specifically the Sapphire Radeon 5850 on TigerDirect. They appear to have an open fan, not a turbine of any type. I can't tell whether they take air in through the back or exhaust out the back. I'm looking at a rig now that's been running for a while, and it's at 66C on both GPUs, with one fan at 75% and the other at 53%. The one at 75% is because it's just as close to the PSU (on the bottom) as these four cards will be to one another. Then again, they're not that close to each other, and I know the GPU puts off more heat than anything else in the case.

I might test it, but I might just put three in anyway, or perhaps only two, and be stuck with a spare card alone in another machine until more come in.

@shat They do not exhaust out the back; they exhaust inside the case. In your case, maybe running caseless ;) is best?

Well, if you can get away with four running at 66C, that's pretty nice :) Yes, the GPUs are the real ovens, not so much the PSU; at full load most good PSUs run at 50C max.

You said you're putting the mining machines in a datacenter, so I'm not sure if you can go caseless there. Maybe a good side-panel fan is called for: 140mm if possible, 90 CFM minimum (higher is better, but higher = louder).
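That 90 CFM figure lines up with the standard electronics-cooling rule of thumb, CFM ≈ 1.76 × watts / ΔT(°C) for sea-level air. A sketch, where the heat load and allowed temperature rise are whatever your particular rig dissipates:

```python
# Airflow needed to carry away a given heat load at a given temperature
# rise above intake, using the rule of thumb CFM ~= 1.76 * W / dT_C.

def cfm_required(watts, delta_t_c):
    """Approximate airflow in CFM to hold the exhaust delta_t_c above intake."""
    return 1.76 * watts / delta_t_c
```

For example, a ~1000W four-card box allowed a 20C rise works out to about 88 CFM, right around the 90 CFM suggested above.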

@1bitc0inplz

It lowers both power consumption and heat output, since the memory doesn't have to work as hard and therefore draws less power. Yes, it applies to all video cards, not just Radeons. Lowering the memory to 300 MHz (or lower in some cases) is a very common trick among miners.
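To first order, dynamic power in CMOS scales linearly with clock frequency (P ≈ C·V²·f), which is why the downclock helps. A sketch of that scaling; the baseline memory-power figure here is purely illustrative, not a measured number:

```python
# Linear frequency scaling of dynamic power: dropping memory from the
# 900 MHz mentioned earlier in the thread to 300 MHz cuts the memory's
# dynamic power to about a third (assuming voltage stays the same).

def mem_dynamic_power(base_watts, base_mhz, new_mhz):
    """Dynamic power at new_mhz, given base_watts measured at base_mhz."""
    return base_watts * new_mhz / base_mhz
```

Since Bitcoin hashing barely touches the memory bus, this saving comes essentially for free.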

shat
Newbie
*
Offline Offline

Activity: 21


View Profile
May 18, 2011, 05:02:12 AM
 #9

I can go caseless on a shelf/tray in a rack. We have ~15 racks; not much free space, but enough to get by. Currently, I just tried powering a 5850 with those silly 4-pin Molex to 6-pin PCIe adapters that ship with the units... the video card gets power and the fan spins, but it doesn't show up in Windows.

Not sure why; I tried multiple cards with multiple adapter cables. I was hoping to use some of these smaller 650W PSUs I have lying around.

Damnit... ideas?
allinvain
Legendary
*
Offline Offline

Activity: 1988



View Profile
May 18, 2011, 12:26:45 PM
 #10

Quote from: shat on May 18, 2011, 05:02:12 AM

I can go caseless on a shelf/tray in a rack. We have ~15 racks; not much free space, but enough to get by. Currently, I just tried powering a 5850 with those silly 4-pin Molex to 6-pin PCIe adapters that ship with the units... the video card gets power and the fan spins, but it doesn't show up in Windows.

Not sure why; I tried multiple cards with multiple adapter cables. I was hoping to use some of these smaller 650W PSUs I have lying around.

Damnit... ideas?

So how many cards did you try at once with the Molex to 6-pin PCIe adapters? It may mean the PSU ran out of juice on the 12V rail, or that it can't hold the voltage up under load.

You can use a 650W PSU, but for no more than three cards. I run two 5870s off a 600W PSU and it holds up just fine, though it doesn't have much room left for expansion (438W at full load). I too use those stupid Molex to 6-pin PCIe cables. Wish I didn't have to, but I'd need another power supply otherwise (what I'm using is a leftover from an older desktop of mine).
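The "room for expansion" arithmetic above can be sketched like so. The 80% sustained-load ceiling is a common conservative guideline I'm assuming here, not a spec:

```python
# Headroom left on a PSU given a measured DC load, keeping sustained
# draw under a conservative fraction of the rated capacity.

def psu_headroom(psu_rated_watts, measured_load_watts, ceiling=0.8):
    """Watts of safe capacity remaining (negative means over budget)."""
    return psu_rated_watts * ceiling - measured_load_watts
```

For example, a 600W unit at the 438W measured load mentioned above has only ~42W of headroom under this rule, i.e. nowhere near enough for another ~150-250W card.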

Did you also try connecting the monitor directly to that card (the one powered by the Molex adapters)? If you're not running in CrossFire mode, Windows will shut off whatever video card isn't connected to a display (monitor), so if that's the case you're going to need some VGA dummy plugs.

