Bitcoin Forum
Author Topic: Mining server room (cooling development help)  (Read 22790 times)
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 07:26:49 AM
 #1

Images included. This is a scale model of the room in development. We are currently waiting on a 200 amp drop to the location to be able to feed this beast, but other than that, hardware is trickling in from various sources. For reference's sake, let's assume all GPUs are 5850s. Light grey are PSUs, black are HDDs, and the light blackish unit on the fifth rack down is the switch. One window to the room, one door. Units are staggered so there is more chance of airflow, and the 5850s are front and rear exhaust. What we need to know is this:

Should we opt for air conditioning? If so, how should it be routed? Or should we opt for air exchange?

We are leaning toward air exchange as it would be simpler, but we need to know how much air should be displaced and how it should be routed.

There is about 810 ft³ in this room.

We have no idea of the actual heat the cards are putting out, because it depends on a lot of factors such as the room air temperature. But in open air with fans running at 100%, we sit at about 62 °C when the outside air is about 17 °C. Outside temperatures here range from a summer max of ~37 °C down to a winter min of -30 °C.

Roughly how much air should we be moving? ;) I'm sure I could set up a thermostat to regulate the fans on and off, but it would still be nice to know how many CFM the inline fans should move to cool the room at a max outside temperature of 37 °C.


Images are oversized, sorry, and detail is lacking; they're just so you can get a panned-out view. If anyone uses Maya and wants the Maya file, let me know.


dikidera
Full Member
***
Offline Offline

Activity: 126
Merit: 100


View Profile
May 24, 2011, 07:31:36 AM
 #2

I like all them GPUs, but you will increase the diff a lot dude. Think about the other users.
TurdHurdur
Full Member
***
Offline Offline

Activity: 216
Merit: 100


View Profile
May 24, 2011, 07:57:36 AM
 #3

I like all them GPUs, but you will increase the diff a lot dude. Think about the other users.
That's not how Capitalism works.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 07:59:26 AM
 #4

You are right, I will increase the difficulty, but the difficulty will continue to increase regardless. Whether it's me or, say, 40 other users really doesn't matter; the end result is a rise.
cschmitz
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
May 24, 2011, 08:02:09 AM
 #5

we have no idea the actual heat that the cards are putting out

I would say that if that is the depth of knowledge you have on the issue, don't invest in an installation of that size. The heat output of each card is precisely defined and can even be calculated quite accurately; going with the TDP and leaving some buffer seems to be a good rule of thumb.
Not knowing about TDP, and thinking that heat output in watts is in any way related to the environmental temperature, suggests you should do some more reading before building your private datacenter.
You should not confuse the heat output per card with the card's ability to stay within its thermal envelope inside your projected building; these are two different issues.

proud 5.x gh/s miner. tips welcome at 1A132BPnYMrgYdDaRyLpRrLQU4aG1WLRtd
cschmitz
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
May 24, 2011, 08:03:55 AM
 #6

I like all them GPUs, but you will increase the diff a lot dude. Think about the other users.

You actually think someone with 3×16 = 48 PCs, with say 3× 5850 each, will cause a huge diff increase? Don't be so naive; that's only about 46 GH.

proud 5.x gh/s miner. tips welcome at 1A132BPnYMrgYdDaRyLpRrLQU4aG1WLRtd
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 08:09:46 AM
 #7

Heat is the only issue, and you're right, I don't know much about it. The rest of the development is under control :)

I wouldn't ask if I didn't need to know :)
Raulo
Full Member
***
Offline Offline

Activity: 238
Merit: 100


View Profile
May 24, 2011, 08:12:00 AM
 #8

I'm not sure what the wattage of your whole setup is, but if I understand it correctly, it is 10 kW (50 boxes at about 200 W each). You will not cool it without AC; even 1 kW noticeably heats a room. And you will likely not cool it with ventilation only, unless it's a cold winter.

The volumetric heat capacity of air is about 1.2 kJ/(m³·K), so assuming a 10 K difference between the incoming and exhaust air, you need to move 10000/(1200×10) ≈ 0.83 m³ of air per second. This is a lot: you would need to exchange all the air in the room in less than 1.5 minutes, whereas normal ventilation exchanges all the air in 1-2 hours. Not only that, but you need to make sure the air is removed properly from the whole room and that there are no hot spots. And you'll still have a room that is at least 10 K warmer than the outside temperature, which, even if it doesn't make the cards throttle, will make them degrade quicker.

Get an AC unit rated for heat removal at least equal to the heat produced by the setup. Or start with a smaller installation if you don't have enough knowledge yet.
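Raulo's back-of-the-envelope figure can be reproduced in a few lines; all the inputs below are his assumed values from the post, not measurements:

```python
# Required ventilation airflow to carry away a given heat load.
# Assumptions (from the post above): 10 kW total heat load, volumetric
# heat capacity of air ~1.2 kJ/(m^3*K), 10 K allowed rise between
# intake and exhaust air.
heat_load_w = 10_000          # total heat output, watts
air_heat_cap = 1_200          # J per (m^3 * K)
delta_t = 10                  # allowed temperature rise, K

airflow_m3s = heat_load_w / (air_heat_cap * delta_t)   # ~0.83 m^3/s
airflow_cfm = airflow_m3s * 2118.88                    # 1 m^3/s ~ 2118.88 CFM
print(f"{airflow_m3s:.2f} m^3/s ~ {airflow_cfm:.0f} CFM")
```

So the ventilation-only route needs on the order of 1,800 CFM of sustained airflow just to hold a 10 K rise over outside temperature.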

1HAoJag4C3XtAmQJAhE9FTAAJWFcrvpdLM
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 08:17:31 AM
 #9

I can get a 1260 CFM inline blower that will exchange the room's air in under a minute. Would that not work? And yes, I understand the hot spot issue. AC is certainly an option; I'm just weighing each.
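For what it's worth, the claim can be checked directly against the ~0.8 m³/s target from the previous reply (a sketch using only figures quoted in the thread):

```python
# Sanity-check one 1260 CFM blower against the 810 ft^3 room and against
# the ~0.83 m^3/s airflow target for a 10 kW load at a 10 K rise.
room_ft3 = 810
blower_cfm = 1260

exchange_s = room_ft3 / blower_cfm * 60   # seconds per full air exchange
blower_m3s = blower_cfm / 2118.88         # convert CFM to m^3/s
print(f"full exchange every {exchange_s:.0f} s, moving {blower_m3s:.2f} m^3/s")
```

One blower does turn the room over in roughly 40 seconds, but it moves only about 0.6 m³/s, somewhat short of the ~0.83 m³/s target; a second blower (or one intake plus one exhaust, as proposed later in the thread) would clear it.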
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2348


Eadem mutata resurgo


View Profile
May 24, 2011, 08:18:01 AM
 #10

How many GPUs on each shelf (and/or rig)?

Pretty hard to estimate heat without knowing that.

jasonk
Full Member
***
Offline Offline

Activity: 168
Merit: 100


View Profile
May 24, 2011, 08:22:09 AM
 #11

Wow, this is crazy, especially considering the massive difficulty increases. I've only been mining for 3 weeks and I've already seen the network more than double!

---------------------------

It's a bad idea to try to cool the room with only an AC, considering 10 kW of heat! Your best bet would be to run some kind of ducting to each computer to vent the air out of the room.

The idea is that you don't want to cool the heat, but instead get rid of the heat ASAP and then add cool air into the room.

--------------------------

Either way, I think this is an insane project to attempt, and I believe we will see a bitcoin economy crash soon enough if people like you start going overboard.

--------------------------

I can get a 1260 CFM inline blower that will exchange the room's air in under a minute. Would that not work? And yes, I understand the hot spot issue. AC is certainly an option; I'm just weighing each.

If you have that kind of CFM going into the room, can you also evacuate that amount of air from the rest of the room?
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 08:24:57 AM
 #12

There will be 36 by this Friday, and more added the following week.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2348


Eadem mutata resurgo


View Profile
May 24, 2011, 08:26:55 AM
 #13

there will be 36 by this friday and more added the following week

36 on each shelf?

BitterTea
Sr. Member
****
Offline Offline

Activity: 294
Merit: 250



View Profile
May 24, 2011, 08:30:58 AM
 #14

Why not put that waste heat to good use?

http://en.wikipedia.org/wiki/Waste_heat#Electrification_of_waste_heat
Hawkix
Hero Member
*****
Offline Offline

Activity: 531
Merit: 505



View Profile WWW
May 24, 2011, 12:34:03 PM
 #15

Have you seen this YouTube video of a custom exhaust for mining rigs? http://www.youtube.com/watch?v=G5f_e4P6gMA

Donations: 1Hawkix7GHym6SM98ii5vSHHShA3FUgpV6
http://btcportal.net/ - All about Bitcoin - coming soon!
Basiley
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
May 24, 2011, 12:47:40 PM
 #16

If you had the money to invest in that amount of hardware, you probably have the money for a professional air conditioning contractor who can design and engineer the cooling for your solution. Check your local newspaper for such ads.
If you're short on cash, pick an industrial air conditioner of the appropriate type and power (consult with a dealer).
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 03:29:31 PM
 #17

As you can see, this is how it will be built: 16 machines to a shelf, currently 3 shelves, with the possibility of 2 more shelves going in down the line.

The biggest project concerns are heat and power. We're in the process of contracting an electrical engineer to run a 150 amp drop just to this server room, which by itself is proving to be a pain. The inline air movers are rated at 1260 CFM each; one would be intake and the other exhaust. If that were the way to go, there would be an Arduino-based temperature controller taking data from multiple sensors to figure out how high or low to adjust the fans to maintain a certain temperature.

This option is still more cost effective than AC at the moment, and working within the limited power constraints, exhaust fans would be more effective than AC. But I do understand the issue of hot spots. What about heat dispersal in the room: setting up perimeter box fans to create a cyclone effect? Thoughts?
acamus
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
May 24, 2011, 03:33:58 PM
 #18

You should definitely close the system. Put the miners near one another and do an intake/exhaust cycle on each card. Use ducts in a closed system to get all that air OUT of the room. A few of these things might be able to get the air moving, although they throw off a bunch of heat themselves:

http://www.steam-brite.com/store/images/Mytee_2200_air_mover.jpg

There shouldn't really be too many hotspot issues with a ducted system like this.
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 24, 2011, 03:53:36 PM
 #19

From your image it looks like you're going to have one row ingesting the other's exhaust. It would make sense to borrow some of the concepts employed in datacenters, like hot/cold aisles and enclosures.

Basically, create a partition of some kind that separates and contains the hot and cold airflows so that you can handle them more easily. Exhaust each of the units into a central aisle/chimney with a return in the roof. The cold aisle is then fed by whatever you decide on to provide the 6 or so tons of cooling that setup would require.

Have you considered using a colo of some kind? Given the amount of money you're spending, it might make sense to drop the gear into one and forget about the power/cooling headaches altogether.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 04:05:38 PM
 #20

Yes, we have considered a colo, but the ones available to us in Edmonton are limited, and there are pricing issues with minimum bandwidth charges. We have also considered leasing commercial space. Neither is financially feasible enough to justify at this time: the power situation is about a $3200 upgrade, and it's one-time, whereas colocation would be a continuous cost. I do like the partition idea, though, and hadn't even considered it; that will certainly go into the design model if we opt for exhaust over AC.
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 24, 2011, 04:13:52 PM
 #21

I do like the partition idea and hadn't even considered it; that will certainly go into the design model if we opt for exhaust over AC.

I don't follow.

Partitioning would be a benefit regardless. It's not a solution in itself; it's something that makes whatever else you're doing more efficient.

$3200 isn't bad. That would cover a single rack for 2 months or less at a typical colo.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 04:16:21 PM
 #22

This is currently a very rough model at the moment, but everything is to scale, down to the monitors and the lumber. We are certainly very open to suggestions and would be willing to provide dimensions to anyone interested in suggesting cheap, efficient, and expandable design ideas. There is an upper limit, but that is to be determined by what we can get for power.

Another possible issue is fire suppression, but that will be a different topic.

Keep in mind that with the current difficulty increases there will be a point where mining will not be financially viable. That said, we just wish to pay off hardware and expenses, and we shall be happy.

The room will be used for other projects in the future and possibly leased out for projects.

-J


We just hadn't considered a partition, and it will go into the model for sure.
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 24, 2011, 04:26:58 PM
 #23

Seems as good a place as any to say this...

Most of the setups I've seen use standard cases and populate the motherboards directly. I wonder if anyone has experimented with using a PCIe splitter/extender to run 4× as many cards off a single motherboard? The benefit would be removing the overhead of motherboard/CPU/RAM and adding the flexibility to place the GPUs away from the mobo.
xf2_org
Member
**
Offline Offline

Activity: 98
Merit: 13


View Profile
May 24, 2011, 04:31:35 PM
 #24

Get an AC unit that is rated for the heat removal at least equal to the heat produced by the setup. Or start with a smaller installation if you have not enough knowledge.

Does anybody have any sort of numbers on heat produced by, say 2x 5870?

Any ballpark numbers on per-rig heat produced?

keybaud
Full Member
***
Offline Offline

Activity: 120
Merit: 100


View Profile
May 24, 2011, 04:32:40 PM
 #25

Seems as good a place as any to say this...

Most of the setups I've seen are using standard cases and populating the motherboards directly. I wonder if anyone has experimented with using a PCIE splitter/extender to run 4x as many cards off a single motherboard? The benefit would be removing the overhead of motherboard/cpu/ram and adding flexibility to place the GPUs away from the mobo.

You'll have a power problem: a PCI-e slot is rated for up to 75 W, and the GPUs draw power through the slot as well as through the separate connectors.

You can use single extenders though:
http://blog.zorinaq.com/?e=42
keybaud
Full Member
***
Offline Offline

Activity: 120
Merit: 100


View Profile
May 24, 2011, 04:34:04 PM
 #26

Get an AC unit that is rated for the heat removal at least equal to the heat produced by the setup. Or start with a smaller installation if you have not enough knowledge.

Does anybody have any sort of numbers on heat produced by, say 2x 5870?

Any ballpark numbers on per-rig heat produced?



For electrical devices, the wattage drawn to power the item is effectively the heat output, as there is no mechanical loss; so approximately 190 watts per 5870.
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 24, 2011, 04:43:26 PM
 #27

Seems as good a place as any to say this...

Most of the setups I've seen are using standard cases and populating the motherboards directly. I wonder if anyone has experimented with using a PCIE splitter/extender to run 4x as many cards off a single motherboard? The benefit would be removing the overhead of motherboard/cpu/ram and adding flexibility to place the GPUs away from the mobo.

You'll have a power problem, as the PCI-e bus is rated for up to 75W and the GPUs use this power as well as the separate connectors.

You can use single extenders though:
http://blog.zorinaq.com/?e=42

Interesting. I was under the impression that with the advent of hungrier cards the whole draw had been relocated to the 6/8-pin connectors, not that it was still divided with the mobo slot.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 24, 2011, 04:48:50 PM
 #28

Get an AC unit that is rated for the heat removal at least equal to the heat produced by the setup. Or start with a smaller installation if you have not enough knowledge.

Does anybody have any sort of numbers on heat produced by, say 2x 5870?

Any ballpark numbers on per-rig heat produced?



For electrical devices, wattage to power the item is effectively the heat output, as there is no mechanical loss, so 190 Watts per 5870 approx.

Is that 190 W an actual measured consumption, or the design max from a datasheet?
keybaud
Full Member
***
Offline Offline

Activity: 120
Merit: 100


View Profile
May 24, 2011, 04:55:43 PM
 #29

Get an AC unit that is rated for the heat removal at least equal to the heat produced by the setup. Or start with a smaller installation if you have not enough knowledge.

Does anybody have any sort of numbers on heat produced by, say 2x 5870?

Any ballpark numbers on per-rig heat produced?



For electrical devices, wattage to power the item is effectively the heat output, as there is no mechanical loss, so 190 Watts per 5870 approx.

is that 190W an actual measured consumption or the design max from a data sheet ??

The manufacturer's figure is 188 W, and that is about what they draw at 100% load; I have 3 of them. My rig (excluding monitor) draws 690 watts at the plug for 3 HD5870s (at 950/300).

You'll be much more efficient with directed cooling, as you need to increase the airflow to the GPU fans, not just reduce the temperature. As an example: taking my case off and having cold air blow onto the rig, the GPUs reached 100 degrees (as they are close together). Putting the case on and adding 2 fans that blow air directly onto the GPUs dropped the temperature to 76 degrees. You may not have this issue if you have lots of space, but directed cooling may help keep cooling cost effective by reducing the volume of air you need to cool.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 24, 2011, 05:11:25 PM
 #30


is that 190W an actual measured consumption or the design max from a data sheet ??

Manufacturer is 188W, but that is about what they draw at 100% load as I have 3 of them. My rig (exc monitor) draws 690 Watts at the plug for 3 HD5870 (at 950/300). 

You'll be much more efficient with directed cooling, as you need to increase the air flow to the GPU fan, not just reduce the temperature. As an example, taking my case off and having a cold wind blow onto the rig, the GPUs reached 100 degrees (as they are close together). Putting the case on and adding 2 fans that blow the air directly onto the GPUs dropped the temperature to 76 degrees. You may not have this issue if you have lots of space, but it may help with cost effective cooling, by reducing the volume of air you need to cool.

Just wondered; I was doing a BTU conversion.

Your rig is about 2356 BTU/hr of heat.
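bobR's conversion is the standard one; since essentially all of a rig's electrical input ends up as heat, watts convert directly to BTU/hr:

```python
# Convert a rig's at-the-plug electrical draw to heat output in BTU/hr.
# 1 W of continuous draw ~ 3.412 BTU/h of heat to remove.
watts = 690                   # measured draw of the 3x HD5870 rig above
btu_per_hr = watts * 3.412
print(f"{btu_per_hr:.0f} BTU/hr")   # ~2354 BTU/hr
```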
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 24, 2011, 05:48:59 PM
 #31

Consider that fans/forced air only work if the ambient air is cooler than what you want to cool. If it's 100 degrees outside, it's most likely warmer inside, and forcing 100+ degree air onto the GPUs isn't going to cool much.

Also, from past experience: if it's not an airtight enclosure, it's better to exhaust hot air than to blow in cooler air. Sealing makes the difference.
Basiley
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
May 24, 2011, 06:02:32 PM
 #32

For starters: don't put it together that way, or you'll need something like a 2× 3400 mm "fan" just to ensure consistent airflow (let alone exhaust). Instead:
1. Put/mount it in a standard 19" rack, close it, and attach it to an air conditioner.
2. Or use ordinary off-the-shelf midi-tower PC cases and attach 120 mm PVC/composite tubes for the hot exhaust, backed by external fans, at each step.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 24, 2011, 06:15:04 PM
 #33

For starters: don't put it together that way, or you'll need something like a 2× 3400 mm "fan" just to ensure consistent airflow (let alone exhaust). Instead:
1. Put/mount it in a standard 19" rack, close it, and attach it to an air conditioner.
2. Or use ordinary off-the-shelf midi-tower PC cases and attach 120 mm PVC/composite tubes for the hot exhaust, backed by external fans, at each step.

Just what were you trying to say? Your options may not be reasonable or cost effective. It's not like I'm getting paid to design a system.
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 24, 2011, 06:31:22 PM
 #34

For starters: don't put it together that way, or you'll need something like a 2× 3400 mm "fan" just to ensure consistent airflow (let alone exhaust). Instead:
1. Put/mount it in a standard 19" rack, close it, and attach it to an air conditioner.
2. Or use ordinary off-the-shelf midi-tower PC cases and attach 120 mm PVC/composite tubes for the hot exhaust, backed by external fans, at each step.

I like the mid-tower case option. You can get pretty decent (for this purpose) cases for $35 each. It wouldn't add a huge amount to the overall cost, but it would make arranging the units much easier and provide a mount for exhaust tubes.

It's a shame the Supermicro GPU-centric server cases are too expensive and geared toward Tesla. I imagine some old Dell 6x50 cases could be turned into a nice GPU rackmount with a little effort.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 24, 2011, 06:47:51 PM
 #35

For starters: don't put it together that way, or you'll need something like a 2× 3400 mm "fan" just to ensure consistent airflow (let alone exhaust). Instead:
1. Put/mount it in a standard 19" rack, close it, and attach it to an air conditioner.
2. Or use ordinary off-the-shelf midi-tower PC cases and attach 120 mm PVC/composite tubes for the hot exhaust, backed by external fans, at each step.

I like the mid-tower case option. You can get pretty decent (for this purpose) cases for $35 each. It wouldn't add a huge amount to the overall cost but it would make arranging the units much easier and provide a mount for exhaust tubes.

It's a shame the Supermicro GPU-centric server cases are to expensive and geared towards Tesla.  I imagine some old Dell 6X50 cases could be turned into a nice GPU rackmount with a little effort.

It all depends on how many, and how much power.
Will a dozen cased systems with dryer hose out the window do the job all year? What works in New York is a bust in California.
We can hardly guess if the person asking for help can't provide any details.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 06:51:20 PM
 #36

Open board is the way we would like to keep it; cases are not an option. We have found that small intake fans with cowling provide a steady air intake to the cards, and open air cools more efficiently than a case would. We like the open-air concept both for the space savings and for the modular ease of replacing dead hardware. I think that by increasing the air movement in the room we can keep a mean ambient average and keep hot spots minimal. Again, AC is certainly an option, but I think we can cool effectively at this point with an intake/exhaust system. I like the partition idea simply because segmentation gives us better control of the temperature. I shall talk with my partner tonight and do some more 3D modeling while we wait for the extra hardware to arrive, as there is a bunch more to include in the design.

To the last person who posted: please see the original post. In Edmonton, Alberta, Canada, the temperature swings from +30 °C in summer to -30 °C in winter ;) so it's not super easy to design for.

36 5850s clocked to 900 and 13 5870s at 950 are at present waiting to run.

The current "rack" dimensions will support up to 3 rigs per shelf, for a total of 72 machines. I think the idea, though, is to stay under 50, depending on the power our provider will give us, because at (I think) 48 machines the calculated load average was 98 amps? I'm going off memory from my partner.
Basiley
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
May 24, 2011, 07:05:59 PM
 #37

I mean that using an open stand is a no-go: no cooling benefits, no management benefits (except visual inspection), no dust/bio-proofing, etc., and it's less energy efficient to cool down.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 24, 2011, 07:17:55 PM
 #38

Open board is the way we would like to keep it; cases are not an option. We have found that small intake fans with cowling provide a steady air intake to the cards, and open air cools more efficiently than a case would. We like the open-air concept both for the space savings and for the modular ease of replacing dead hardware. I think that by increasing the air movement in the room we can keep a mean ambient average and keep hot spots minimal. Again, AC is certainly an option, but I think we can cool effectively at this point with an intake/exhaust system. I like the partition idea simply because segmentation gives us better control of the temperature. I shall talk with my partner tonight and do some more 3D modeling while we wait for the extra hardware to arrive, as there is a bunch more to include in the design.

To the last person who posted: please see the original post. In Edmonton, Alberta, Canada, the temperature swings from +30 °C in summer to -30 °C in winter ;) so it's not super easy to design for.

36 5850s clocked to 900 and 13 5870s at 950 are at present waiting to run.

The current "rack" dimensions will support up to 3 rigs per shelf, for a total of 72 machines. I think the idea, though, is to stay under 50, depending on the power our provider will give us, because at (I think) 48 machines the calculated load average was 98 amps? I'm going off memory from my partner.

Sounds like a plan to me.
Consider filters/covers on the intake side; you don't need any dust.
In the winter, exhaust back into the room, as you need the heat;
in summer, exhaust it outside.
smooth
Legendary
*
Offline Offline

Activity: 2968
Merit: 1198



View Profile
May 24, 2011, 07:24:06 PM
 #39

in the winter exhaust back into the room you need the heat

Just reduce the flow rate.  Get a thermostat (and a relay) for the fans.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 07:33:48 PM
 #40

in the winter exhaust back into the room you need the heat

Just reduce the flow rate.  Get a thermostat (and a relay) for the fans.


An Arduino that takes temperature readings across multiple probes and adjusts the fan speed based on a set base temperature :)
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 07:35:03 PM
 #41

Another thought was to incorporate it into the existing heating system at the location to warm it during the winter Smiley rather then waste all the heat
smooth
Legendary
*
Offline Offline

Activity: 2968
Merit: 1198



View Profile
May 24, 2011, 07:54:03 PM
 #42

An Arduino that takes temperature readings across multiple probes and adjusts the fan speed based on a set base temperature :)

Yes, just hook up the 1260 CFM fans to one of the output pins on the Arduino, use analogWrite(), and you are good to go. Don't forget the flyback diode.

:)
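Joking aside (an Arduino pin can't drive blowers that size directly; it would switch them through a relay or a PWM motor driver), the control logic itself is simple. Here is a minimal sketch of the probe-to-duty-cycle mapping, in Python rather than Arduino C for brevity; the setpoint numbers are made up for illustration:

```python
def fan_duty(probe_temps_c, setpoint_c=30.0, full_on_c=40.0):
    """Map the hottest probe reading to a fan duty cycle in [0.0, 1.0].

    Below setpoint_c the fans idle; at or above full_on_c they run flat
    out; in between, duty scales linearly with temperature.
    """
    hottest = max(probe_temps_c)
    if hottest <= setpoint_c:
        return 0.0
    if hottest >= full_on_c:
        return 1.0
    return (hottest - setpoint_c) / (full_on_c - setpoint_c)
```

With the defaults, `fan_duty([28.0, 35.0, 31.0])` returns 0.5, since the hottest probe sits halfway between the two thresholds; on the Arduino side that fraction would be scaled to `analogWrite(pin, int(duty * 255))`.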
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 24, 2011, 07:57:39 PM
 #43

I mean that using an open stand is a no-go: no cooling benefits, no management benefits (except visual inspection), no dust/bio-proofing, etc., and it's less energy efficient to cool down.

I tend to agree with this.

Years of working in poorly designed/maintained datacenters, and a few good ones, has taught me that "open air" only works with an overabundance of space and airflow (inefficient and more inefficient).
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 24, 2011, 08:02:42 PM
 #44

Interesting. OK, I'm no pro, so I don't have a leg to stand on. I shall talk to my partner, give open air a shot, see the results, and test as necessary to find the optimal setup. Thanks for the input.
Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 25, 2011, 04:07:21 AM
 #45

lol, you are so in over your head; I hope you actually make it :D

Each system is 690 watts ≈ 2400 BTU/hr (a drastically small estimate IMO).
× 48 machines = 115,000 BTU/hr.
So you would need 8 of these window air conditioners:
http://www.amazon.com/LG-Electronics-LW1511ER-Window-Conditioner/dp/B004U4MLYU

I would just cool with outside air: build a wall of box fans, remove the fans from their boxes and grills, make a custom wall with round holes to increase fan efficiency, and run air in one window and out another. Prepare for the FEDs to show up a few weeks later due to the energy usage and IR imaging...

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 07:59:06 AM
 #46

lol, you are so in over your head; I hope you actually make it :D

Each system is 690 watts ≈ 2400 BTU/hr (a drastically small estimate IMO).
× 48 machines = 115,000 BTU/hr.
So you would need 8 of these window air conditioners:
http://www.amazon.com/LG-Electronics-LW1511ER-Window-Conditioner/dp/B004U4MLYU

I would just cool with outside air: build a wall of box fans, remove the fans from their boxes and grills, make a custom wall with round holes to increase fan efficiency, and run air in one window and out another. Prepare for the FEDs to show up a few weeks later due to the energy usage and IR imaging...

Now why the hell would you make a statement like that? You don't know me, what I am capable of, or the budget I'm doing this on, so why would you even assume I'm in over my head?

I do have a cooling solution, one that will work just fine. What I'm asking for is ideas on possible alternatives that may be cheaper and more cost effective.

Secondly, did you not see from reading my posts that the power company is coming to run a new 200+ amp line to the place, as the current service is only 100 amp? With that, combined with an electrical engineer and city permits, I do not have to worry about the "FEDs" kicking in my door. This has already been thought of: both the power company and local law enforcement have been notified of the goings-on and are welcome to come out and inspect, provided they have a warrant for entry. We nixed their "reasonable probable grounds" for entry by notifying them of what was transpiring, so they cannot unlawfully enter without a warrant now.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 12:23:25 PM
 #47

lol, you are so in over your head; I hope you actually make it :D

Each system is 690 watts ≈ 2400 BTU/hr (a drastically small estimate IMO).
× 48 machines = 115,000 BTU/hr.
So you would need 8 of these window air conditioners:
http://www.amazon.com/LG-Electronics-LW1511ER-Window-Conditioner/dp/B004U4MLYU

I would just cool with outside air: build a wall of box fans, remove the fans from their boxes and grills, make a custom wall with round holes to increase fan efficiency, and run air in one window and out another. Prepare for the FEDs to show up a few weeks later due to the energy usage and IR imaging...

 "drastically small estimate" -- NOT -- 1 W = 3.412 BTU/hr
 "x48 machines" -- NOT -- that's 48 GPUs, not 48 machines

The 690 W you misquote and use for your bad calculation was for a complete system with three 5870s.
A 5870 draws 188 W; a 5850 draws almost 10% less, about 170 W.

48 GPUs at 3 per system = 16 systems
Even 16 x 2,355 BTU/hr ≈ 37,700 BTU/hr, less than a third of your 115,000 BTU
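A quick sanity check of the two competing estimates. The per-part wattages here are the thread's own figures (170 W per 5850, 690 W per 3-card system, i.e. roughly 180 W of non-GPU overhead), not measurements:

```python
# Rough heat-load check for the disputed numbers above.
W_TO_BTU_HR = 3.412        # 1 watt of electrical load = 3.412 BTU/hr of heat

gpus = 48
gpu_w = 170                # per-5850 draw cited above
system_overhead_w = 180    # rest of each 3-card system (690 W total - 3x170 W)
systems = gpus // 3        # 16 systems

total_w = gpus * gpu_w + systems * system_overhead_w
total_btu_hr = total_w * W_TO_BTU_HR

print(f"{systems} systems, {total_w} W total")
print(f"heat load = {total_btu_hr:,.0f} BTU/hr")
```

With the system overhead included this lands right around bobR's ~37,700 BTU/hr; the GPU-only portion alone would be about 27,800 BTU/hr.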


miner249er
Newbie
*
Offline Offline

Activity: 24
Merit: 0


View Profile
May 25, 2011, 01:30:36 PM
 #48

I'm looking at your load (non-cooled) to be around 75 amps @ 120 V, assuming the quantity of graphics cards noted by the OP - this comes out to about 9,000 watts.

At this density you need about 2.5 tons of AC to cool this properly (that's a 30,000 BTU unit), which will draw roughly 40 amps @ 120 V (though most likely it will come in a 230 V configuration).

Your total consumption is about 115 amps, leaving around 50-60 amps or so before you reach your new 200 amp limit.

In short, sure, it's mechanically feasible. Looking at the project and the overhead required, along with the lower-density (read: not as efficient) machines, there is probably some room for improvement.

Personally, I'd say your "finished" cost per MH should be around $1 - that is, a machine that does 1.2 GH/s should cost about $1,200.
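That arithmetic, sketched; the 9,000 W load and 40 A AC draw are the post's own assumptions. One extra step that may explain the "50-60 amps" of headroom: continuous loads are normally held to about 80% of a service's rating:

```python
# Sketch of the load/cooling arithmetic above, using the post's own figures.
load_w = 9000        # assumed non-cooled load
volts = 120

load_amps = load_w / volts     # 75 A
ac_amps = 40                   # rough draw of a 30,000 BTU (2.5 ton) AC unit

total_amps = load_amps + ac_amps     # 115 A
usable = 200 * 0.8                   # continuous loads: ~80% of the 200 A service
headroom = usable - total_amps       # ~45 A left

print(f"load {load_amps:.0f} A + AC {ac_amps} A = {total_amps:.0f} A; "
      f"~{headroom:.0f} A of continuous headroom")
```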

Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 25, 2011, 02:08:46 PM
 #49

Now why the hell would you make a statement like that ? you don't know me or what I am capable of or not .. or the budget in which im doing this so why would you even assume im in over my head ?

secondly did you not see by reading my posts that the power company is coming to run a new 200+ amp line to the place as currently the service is only 100 amp ? so with that combined with a electrical engineer and city permit's I do not have to worry about the "FED's" kicking in my door this has already been thought of and both the power company and the local Law Enforcement have been notified of the goings on and are welcome to come out and inspect provided they have a warrant for entry as we nixed there "Reasonable probable grounds" of entry by notifying them of what was transpiring so they can not unlawfully enter without warrant now  

I don't have to know anything about you beyond the fact that you are spending $60,000 on a mining setup that may never pay for itself. Given that for you to break even bitcoin is going to have to increase in value to $50 or $100, you should simply be buying bitcoins.

Do you think the people raided for drugs are using illegal power drops connected to the grid? They are not; having a legal drop means nothing. Although contacting them is a good idea, I wouldn't rule out getting raided. Law enforcement is notorious for not communicating internally.

drastically small estimate -- NOT -- 1W=3.414 btu
 x48 machines -- NOT -- 48 gpu's not 48 machines

the 690w you misquote and use for your bad calculations Was a complete system with three 5870's
5870 draw 188w - 5850 draw almost 10% less 170w

48 gpu's 3 per system = 16 systems
even 16 x 2356 btu/hr = 37,632 btu/hr  less than 1/3 of your 115,000 btu

The OP pic shows 48 machines with what I was assuming is 3 GPUs each.

bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 02:29:29 PM
 #50

Now why the hell would you make a statement like that ? you don't know me or what I am capable of or not .. or the budget in which im doing this so why would you even assume im in over my head ?

secondly did you not see by reading my posts that the power company is coming to run a new 200+ amp line to the place as currently the service is only 100 amp ? so with that combined with a electrical engineer and city permit's I do not have to worry about the "FED's" kicking in my door this has already been thought of and both the power company and the local Law Enforcement have been notified of the goings on and are welcome to come out and inspect provided they have a warrant for entry as we nixed there "Reasonable probable grounds" of entry by notifying them of what was transpiring so they can not unlawfully enter without warrant now 

I don't have to know anything about you knowing you are spending $60,000 on a mining setup that may never pay for itself, given that for you to break even bitcoin is going to have to increase in value to $50 or $100, you should simply be buying bitcoins.

Do you think people raided for drugs are using illegal power drops connected to the grid?  They are not, having a legal drop means nothing, although contacting them is a great idea I wouldn't count getting raided out.  Law enforcement is notorious for not communicating internally.

drastically small estimate -- NOT -- 1W=3.414 btu
 x48 machines -- NOT -- 48 gpu's not 48 machines

the 690w you misquote and use for your bad calculations Was a complete system with three 5870's
5870 draw 188w - 5850 draw almost 10% less 170w

48 gpu's 3 per system = 16 systems
even 16 x 2356 btu/hr = 37,632 btu/hr  less than 1/3 of your 115,000 btu

The OP pic shows 48 machines with what I was assuming is 3 GPUs each.

Maybe you should reread http://forum.bitcoin.org/index.php?topic=9621.msg139579#msg139579

Gee, if using 100 amps of power got you a visit from the authorities,
they would be visiting every small business & factory in the country.
Now back to REALITY
Basiley
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
May 25, 2011, 03:36:01 PM
 #51

Also consider buying more expensive cards (in terms of MH/s per dollar invested, and of wasted PCIe slots) without active coolers/fans. Right now that's something like a 6870 or below.
JJG
Member
**
Offline Offline

Activity: 70
Merit: 20


View Profile
May 25, 2011, 04:11:18 PM
 #52

Whatever your plan is, I hope your top priority is to get as many of these machines up and running within your current cooling/power limits as you possibly can.

Especially with the next difficulty jump at over 70% (!!)
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 04:41:54 PM
 #53

Cutting some 4x8 sheets of composite board with some U-channel guides top & bottom
would make some cheap side panels/doors.
Six 20" box fans should give plenty of air flow even at low speed.
Cheap furnace filters in some U-channel on the fan intake side would handle dust.
Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 25, 2011, 05:09:00 PM
 #54

Maybe you should reread http://forum.bitcoin.org/index.php?topic=9621.msg139579#msg139579

Gee if using 100 amps of power gets you a visit from the authorities
they are visiting every small business & factory in the country
Now back to REALITY

Yeah, I missed that one; no worries with that size of a setup. I am curious, what do you estimate your earnings to be in 20 days when difficulty is 1.2M, or in 30 days when it is 2M?
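For anyone wanting to put numbers on that question: expected earnings scale inversely with difficulty. A sketch, assuming 48 5850s at roughly 300 MH/s each (a guess at the rig's total hashrate, not a figure from this thread) and the 50 BTC block reward:

```python
# Back-of-envelope mining earnings at a given difficulty (no pool fees).
def btc_per_day(hashrate_hs: float, difficulty: float, reward: float = 50.0) -> float:
    hashes_per_block = difficulty * 2**32   # expected hashes per block found
    blocks_per_day = hashrate_hs * 86400 / hashes_per_block
    return blocks_per_day * reward

rig = 48 * 300e6                            # assumed ~14.4 GH/s total
for diff in (1.2e6, 2.0e6):
    print(f"difficulty {diff:,.0f}: {btc_per_day(rig, diff):.2f} BTC/day")
```

Under those assumptions the rig drops from roughly 12 BTC/day at 1.2M difficulty to roughly 7 BTC/day at 2M.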

acamus
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
May 25, 2011, 05:25:21 PM
 #55

He could possibly break even in a few months. Then he's just wasted the opportunity cost of his time.
bitcool
Legendary
*
Offline Offline

Activity: 1441
Merit: 1000

Live and enjoy experiments


View Profile
May 25, 2011, 05:27:59 PM
 #56

cuting some 4x8 sheet of composite board with some u channel guides top & bottom
would make some cheep side panels-doors
six 20' box fans should give plenty of air flow even at low speed
cheep furnace filters in some u channel for filtering the fan intake side for dust
Don't forget to buy a large fire-insurance policy, from a company that has no swap trades with AIG.
You'll start a fire in no time.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 05:32:40 PM
 #57

First off, my cost is nowhere near $60k lol, try more like $12k haha. Like I said, this is a cheap setup; check YouTube.com/warweed for what we were running before. That's all being consolidated into one room and tripled, then add cooling on top of that.


And like I said, it's EPCOR (the power company) that flags a house for excessive usage. They have already been contacted, same with the local law; if they proceed we will sue them.
And don't worry, we will recover our costs easily in hardware, and when power starts to cost too much in relation to payout we shall lease out the GPU cluster to university researchers Smiley or something to that effect.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 05:35:42 PM
 #58

cuting some 4x8 sheet of composite board with some u channel guides top & bottom
would make some cheep side panels-doors
six 20' box fans should give plenty of air flow even at low speed
cheep furnace filters in some u channel for filtering the fan intake side for dust
Don't forget buying a large fire insurance, from a company that has no swap trade with AIG.
You'll start a fire in no time.

How cheap is ASSHOLE insurance Huh
What exactly did I state that is a fire hazard Huh
Troll, troll, go away;
play your games another way.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 05:39:57 PM
 #59

He does make a valid point about that, though, but I already have the equipment around for a CO2/argon fire-suppression system Smiley
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 05:48:56 PM
Last edit: May 25, 2011, 06:02:14 PM by warweed
 #60

We have a source for complete single-slot PCIe machines, 50 machines a skid at a cost of $20 each; then add $133 (plus 5% tax, on the $133 only) for the GPU, and $30 for a PSU that powers the GPU and board sufficiently. That's $9,600 just for the machines, which is cheaper by far than what 90% of you even pay for a single 5870.

See bobR, it's not who you know but who knows you! That's how you get good deals: buy in volume with a corporate account.

Don't underestimate the cost advantage of volume purchases of end-of-life hardware.
pwnyboy
Full Member
***
Offline Offline

Activity: 125
Merit: 100


View Profile
May 25, 2011, 05:57:35 PM
 #61

Now why the hell would you make a statement like that ?

You're coming to a forum to ask for advice on how to cool this room of yours.  A forum!  Not to an engineer, not even to an HVAC tech, but a forum, where the overwhelming majority of the users are not HVAC design engineers.  At best, you might find someone familiar with the laws of physics (see above) or someone who's actually designed and built datacenters for a living (I happen to fall into that category).

Quote
you don't know me or what I am capable of or not .. or the budget in which im doing this so why would you even assume im in over my head ?

I've seen this time and again in the web hosting business.  The people who run to forums for advice _first_ are the ones who _do not_ have a budget to talk to an engineer, and _are indeed_ over their head.

Quote
I do have a cooling solution one that will work just fine .. what I'm asking for is Idea's on possible alternatives that may be cheaper and more cost effective

You have yet to define what "work just fine" means exactly.  If that means 5 degrees (Celsius) above outside air temp, not likely.  If that means 20 degrees above outside air temp, you might have a shot.  

Quote
I do not have to worry about the "FED's" kicking in my door this has already been thought of and both the power company and the local Law Enforcement have been notified of the goings on and are welcome to come out and inspect provided they have a warrant for entry as we nixed there "Reasonable probable grounds" of entry by notifying them of what was transpiring so they can not unlawfully enter without warrant now  

That's ridiculous.  All you've done is put yourself on their radar.  Please do report back to us how your project goes, and at what point the feds do show up.
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 25, 2011, 06:05:27 PM
 #62

We have a source for complete single slot pcie machines in the amount of 50 machines a skid at a cost of 20 each then add 133 plus tax only on 133 of 5% for the gpu add another 35 for a psu that powers the gpu and board sufficiently for 30 that's 9600 just for the machines that's cheaper by far then what 90% of you even pay for a 5870

Be careful with those pallets.

My employer would auction off over a thousand Dell Optiplex machines every year as they fell off maintenance.

It was common practice for the technicians to rip out the RAM, and sometimes the CPUs, prior to sending them off. Old DDR and DDR2 can be very expensive to procure these days.
bitcool
Legendary
*
Offline Offline

Activity: 1441
Merit: 1000

Live and enjoy experiments


View Profile
May 25, 2011, 06:06:02 PM
 #63

It really depends on your location (latitude). If you are north of 40N, the highest ambient temperature is probably 40C in the summer, and CPUs & GPUs can work up to 90~100C, so you don't even need a room; a true open-air operation would be best:
http://t1.gstatic.com/images?q=tbn:ANd9GcTLX5-9eUfY97PMipkLQXn0MgXyn8WHQ9MJUpFnGYizpgXSIroVCA
Alaska, anyone?
acamus
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
May 25, 2011, 06:11:19 PM
 #64

wanna try and screw this guy? upvote. http://www.reddit.com/r/reddit.com/comments/hjylq/got_an_ati_video_card_start_generating_bitcoins/
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 06:13:58 PM
 #65


Quote
I do not have to worry about the "FED's" kicking in my door this has already been thought of and both the power company and the local Law Enforcement have been notified of the goings on and are welcome to come out and inspect provided they have a warrant for entry as we nixed there "Reasonable probable grounds" of entry by notifying them of what was transpiring so they can not unlawfully enter without warrant now 

That's ridiculous.  All you've done is put yourself on their radar.  Please do report back to us how your project goes, and at what point the feds do show up.

pwnyboy, can you at least read before sticking your foot in your mouth?

It would be a nice trick for "the FEDs" to bag a CANADIAN.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 06:20:27 PM
 #66

Yup, again, I am Canadian. I am aware of the busts made in Canada for suspected grow-op operations, and I also have experience dealing with the law, including successfully suing. I have done my due diligence and have made sure to follow all regulations. And sure, you know what, it is a shoddy setup, but it is a to-code setup. I'm not running to the forums for help; I'm asking for advice, suggestions, and even previous experiences.


My calculated average load is 96 amps at 120 volts, plus cooling. And lastly:

Those machines I got cheap were all inspected prior to purchase. They all have procs, PSUs, 80 GB SATA drives, and a single gig of RAM each. The CPU cooler had to be modded to accept the GPU, as it was in the way and the case top would not clear, hence open air.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 06:28:38 PM
 #67

I'm not denying that it is a shit show; it is an on-the-fly idea. But it is a shit show that I wouldn't even attempt if I hadn't at least previously covered my hardware costs. I have already paid off my 13 5870s, mining since difficulty was at 72k, and paid off what I'm investing now; the rest is gravy. And if you want to call my bluff you can look at Block Explorer: I have been steadily cashing out 1000 a day to LR for the last 19 days and shall keep doing so.
exahash
Sr. Member
****
Offline Offline

Activity: 278
Merit: 250



View Profile
May 25, 2011, 06:30:20 PM
 #68

@warweed - your projected 150 or 200 amps is not going to be enough. 

I am running an almost identical set up, with roughly one of your racks and am drawing about 150 amps.  I have 3- and 4-card boards with sempron 140's and those same sapphire xtreme 5850's.  The 3-card boards are each pulling 5.5 amps (measured with a kill-a-watt at the outlet) and the 4-card boards are pulling almost 7 amps (at full load with slight oc of course).  Doing the math... you're looking to run 48 boards @5.5 amps each for a total of 264 amps.  Better make that 300 amp service if you want to run a bunch of fans or a/c units in that room.

I'm running my setup in an office which already has a/c and it can't keep up.  I've put a large window fan in to exhaust the hot air and another large intake fan to bring cool air in from the rest of the building.  The heat from those cards is ejected laterally, so instead of having fans blow in the usual front-to-back orientation like in a computer case, I have box fans blowing the length of the shelves, to move the heat toward the window.

I also found that it's cheaper and slightly lower power to go with USB stick drives instead of regular HDs. An Ubuntu Server install plus everything you need fits on a 4 GB stick.

Good luck, I hope electricity is free there!
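Scaling those kill-a-watt readings up to the planned room gives a wall-side heat number (the 5.5 A per 3-card board is exahash's measurement; the rest is arithmetic):

```python
# Scale the measured per-board draw to 48 boards.
boards = 48
amps_per_board = 5.5           # measured at the outlet, 3-card board, 120 V

total_amps = boards * amps_per_board       # 264 A
total_watts = total_amps * 120             # ~31.7 kW, all of it ends up as heat
btu_hr = total_watts * 3.412               # 1 W = 3.412 BTU/hr

print(f"{total_amps:.0f} A @ 120 V = {total_watts/1000:.1f} kW "
      f"= {btu_hr:,.0f} BTU/hr")
```

Because this is measured at the wall it includes PSU and motherboard losses, which is why it lands near the 115,000 BTU/hr figure from earlier rather than the GPU-only estimate.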
pwnyboy
Full Member
***
Offline Offline

Activity: 125
Merit: 100


View Profile
May 25, 2011, 06:39:39 PM
 #69

pwnyboy can you at least read before sticking your foot in your mouth

Would be a nice trick for "The Fed's" to bag a CANADIAN

The OP mentioned "FEDS". I knew he was Canadian and rightly assumed he was referring to the Royal Canadian Mounted Police, a federal police service.

It is you who's put his foot in his mouth. Good day, sir.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 06:51:15 PM
 #70

pwnyboy can you at least read before sticking your foot in your mouth

Would be a nice trick for "The Fed's" to bag a CANADIAN

The OP mentioned "FEDS".  I knew he was Canadian and rightfully assumed he was referring to the Royal Canadian Mounted Police, a federal police service.

It is you who'se put his foot in his mouth.  Good day sir.

Yep, and you were loads of help with his problem.
Get a life & stop trolling.
bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 07:11:30 PM
 #71

Why would it be a problem if he wasn't doing anything illegal in the first place?

It wouldn't,
but we have trolls that seem to have nothing better to do than scan what has been posted, adding nothing but drivel, just to see their name or say how great they are.
There should be a special twit thread just for the KIDS that need to do that.
Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 25, 2011, 07:12:08 PM
 #72

Why would it be a problem if he wasn't doing anything illegal in the first place?

It's not a problem in the long run, but the Feds busting down your door at 2am can be some scary shit lol; more of an inconvenience. Unknown to most, local police, the DEA, and the Feds use IR drones to detect heat signatures and monitor houses' power usage to detect grow houses, since the lighting used to grow pot plants draws a lot of electricity.

Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 25, 2011, 07:18:06 PM
 #73

First off my cost is no were near 60g lol try more like 12g haha like I said this is a cheap setup check YouTube.com/warweed for what we were running before that's all brig consolidated into one room and tripled then add cooling ontop of that

And like I said it's epcor (power company) that flags a house for excessive usage they have already been contacted same with local law if the proceed we will sue the
 And don't worrie we will recover our costs easily in hardware and when power starts to be to mch in relation to payout we shall lease out the gpu cluster to university researchers Smiley or something to that effect

Cool man. If you would like to share, I would still love to see your cost projections and ROI. But as you say, since it sounds like all of your next setup is funded by bitcoins, there is nothing to lose in effect; still, that's $12k that could be in your bank...

bobR
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
May 25, 2011, 07:22:10 PM
 #74

Why would it be a problem if he wasn't doing anything illegal in the first place?

its not a problem in the long run, but the feds busting down your door at 2am can be some scary shit lol, more of an inconvenience.  unknown to most that local police, DEA, and FEDs use IR drones to detect heat signatures and monitor houses power usage to detect grow houses, since lighting uses a lot of electricity to grow pot plants.

You guys need to get a life.
Stop believing in Santa and the sci-fi fairy;
most drug busts come from informant tips.
If it were IR imaging and power usage, every factory and most any business would be on the radar.

AND this has little to do with the subject at hand:
someone doing legitimate business... in Canada... where the US DEA HAS NO AUTHORITY.
Enough already with this BS.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 07:49:15 PM
 #75

Exactly, and my dad is aviation enforcement and maintenance enforcement chief for the prairie region of Canada. The local aerial surveillance flying over is sophisticated enough to detect whether the heat is consistent with a signature from a metal-halide or high-pressure-sodium bulb.

Secondly, Edmonton has a very large police force, and there is a big difference between the RCMP and municipal police (like comparing a highway cop to a city cop).

Thirdly, no, the draw should not exceed 100 amps at 120 V,
plus AC or cooling.

So if anyone else has concerns I will address them now; then I respectfully request we get off this tangent and back to the main topic.
pwnyboy
Full Member
***
Offline Offline

Activity: 125
Merit: 100


View Profile
May 25, 2011, 07:53:57 PM
 #76

yep and you were loads of help to his problem
get a life & stop trooling

My post was loads of help to anyone considering doing the same thing - quite simply - to go it alone is ridiculous.  It was based on the insights gained by building and operating Internet datacenters for the past 6 years.  And it'll be even more helpful when the OP comes back and says "omgosh, I had to shut off rigs during the day because my room was 40 degrees hotter than the outside air during the peak of summer!".  It is you who are trolling, but your naiveté precludes you from realizing the same.
cschmitz
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
May 25, 2011, 08:04:50 PM
 #77

We have a source for complete single slot pcie machines in the amount of 50 machines a skid at a cost of 20 each then add 133 plus tax only on 133 of 5% for the gpu add psu that powers the gpu and board sufficiently for 30 that's 9600 just for the machines that's cheaper by far then what 90% of you even pay for a 5870

See bobR it's not who you know but who knows you ! That's how you get good deals buy in volume with a corporate account

Don't under estimate the cost and moving factor of volume purchases of end of life hardware

Given the way you structured your sentence and your hardware logic, I can only say it's not the best idea in the universe, really. Triple-slot AM3 boards with 5850s are your best bet if you want to expand massively. Your single-slot shit PCs have no resale value and will really put a hurting on your administrative overhead with that number of cards.

proud 5.x gh/s miner. tips welcome at 1A132BPnYMrgYdDaRyLpRrLQU4aG1WLRtd
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 08:13:41 PM
 #78

Oh, and in addition, it's the power company that tips off the cops about large usage. They also run audits for illegal taps, checking with some weird laser at the feed into the house and the feed from the meter to verify all is good. There is a threshold above normal household consumption which raises flags, and anything above it is then investigated.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 08:18:46 PM
 #79

We have a source for complete single slot pcie machines in the amount of 50 machines a skid at a cost of 20 each then add 133 plus tax only on 133 of 5% for the gpu add psu that powers the gpu and board sufficiently for 30 that's 9600 just for the machines that's cheaper by far then what 90% of you even pay for a 5870

See bobR it's not who you know but who knows you ! That's how you get good deals buy in volume with a corporate account

Don't under estimate the cost and moving factor of volume purchases of end of life hardware

Given the way you structured your sentence and your hardware logic, i can only say its not the best idea in the universe, really. Triple slot am3 with 5850s are your best bet if you want to expand massively. your single slot shitpcs have no resale value and will really put a hurting on your administrative overhead with that amount of cards.


Have you seen the costs of the PSUs that are needed for a 3-GPU machine? On top of that, what happens when a board or PSU has a single fault? Three cards go down.
Even daisy-chaining PSUs is pointless and stupid.

And guys, quit fucking bickering, seriously. All your advice and suggestions are being taken into consideration.

And your input is appreciated, because you're right, some of you have a lot more experience.

Jesus Christ, are we 14, trying to start a flame war? Grow the fuck up.

warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 08:23:36 PM
 #80

We have a source for complete single slot pcie machines in the amount of 50 machines a skid at a cost of 20 each then add 133 plus tax only on 133 of 5% for the gpu add psu that powers the gpu and board sufficiently for 30 that's 9600 just for the machines that's cheaper by far then what 90% of you even pay for a 5870

See bobR it's not who you know but who knows you ! That's how you get good deals buy in volume with a corporate account

Don't under estimate the cost and moving factor of volume purchases of end of life hardware

Given the way you structured your sentence and your hardware logic, i can only say its not the best idea in the universe, really. Triple slot am3 with 5850s are your best bet if you want to expand massively. your single slot shitpcs have no resale value and will really put a hurting on your administrative overhead with that amount of cards.


Oh, and surprisingly I have already sold all the legit CD keys for XP Pro, for double the cost of the PC itself.

But I do get your concerns, you are right; English isn't my first language, so I'm sorry, I know my grammar is poor.
Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 25, 2011, 09:13:33 PM
 #81

We have a source for complete single slot pcie machines in the amount of 50 machines a skid at a cost of 20 each then add 133 plus tax only on 133 of 5% for the gpu add psu that powers the gpu and board sufficiently for 30 that's 9600 just for the machines that's cheaper by far then what 90% of you even pay for a 5870

See bobR it's not who you know but who knows you ! That's how you get good deals buy in volume with a corporate account

Don't under estimate the cost and moving factor of volume purchases of end of life hardware

Given the way you structured your sentence and your hardware logic, i can only say its not the best idea in the universe, really. Triple slot am3 with 5850s are your best bet if you want to expand massively. your single slot shitpcs have no resale value and will really put a hurting on your administrative overhead with that amount of cards.


Oh and surprisingly I have sold all the legit cd keys for xp pro already for double the cost of the pc itself 

But I do get your concerns you are right and English isn't my first language so I'm sorry I know I carry no grammer

He has a point though. Given that the difficulty is going well over a million in 20 days, it is quickly going to become all about efficiency, and having one card per CPU is a drawback; just as CPU mining isn't worth it, next to go will be Nvidia GPUs. How much is each system, without the video card, going to use?

smooth
Legendary
*
Offline Offline

Activity: 2968
Merit: 1198



View Profile
May 25, 2011, 10:39:12 PM
 #82

it is quickly going to become all about efficiency

Efficiency is subjective
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 11:01:38 PM
 #83

I don't disagree that running a single GPU per mobo consumes more power, but the proc is hardly used and the HDD won't be spinning constantly, so the actual benefit is minimal, just like if I were to run my PSUs on 240 V instead of 120 V. But yeah, I get what you're saying; when we generate some more profit we may consider switching up hardware and power, using the singles as backups and transitional boards while funding bigger setups. Who knows, this is a fly-by-the-seat-of-our-pants operation lol
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 25, 2011, 11:03:43 PM
 #84

We have a source for complete single slot pcie machines in the amount of 50 machines a skid at a cost of 20 each then add 133 plus tax only on 133 of 5% for the gpu add psu that powers the gpu and board sufficiently for 30 that's 9600 just for the machines that's cheaper by far then what 90% of you even pay for a 5870

See bobR it's not who you know but who knows you ! That's how you get good deals buy in volume with a corporate account

Don't under estimate the cost and moving factor of volume purchases of end of life hardware

Given the way you structured your sentence and your hardware logic, i can only say its not the best idea in the universe, really. Triple slot am3 with 5850s are your best bet if you want to expand massively. your single slot shitpcs have no resale value and will really put a hurting on your administrative overhead with that amount of cards.


Oh and surprisingly I have sold all the legit cd keys for xp pro already for double the cost of the pc itself  

But I do get your concerns you are right and English isn't my first language so I'm sorry I know I carry no grammer

he has a point though, given that the difficulty is going well over a million in 20 days, it is quickly going to become all about efficiency, and having 1 card per cpu is a drawback, just as cpu mining isn't worth it, next will be nvidia gpus.  how much is each system w/o the vid card going to use?

I imagine the losses in efficiency are effectively offset by the fact that the PCs cost $20. A minimal system sans GPU is going to consume <100 W in any case, and PCs old enough to be going for $20 aren't going to be particularly power hungry.

Someone building from scratch with new equipment is looking at a lot more than $20 for their CPU, RAM, PSU and 3-slot mobo, probably $100 or more per node. That's a difference of $4,000 across 50 machines. Those 50 inefficient machines will probably eat no more than an extra $300-400 per month in power over a 3-slot build. We can hardly predict what the bitcoin world will look like in the 6+ months before the two would converge.

Someone could point to the potential resale of newer gear at that point, but I think they'd actually be mistaken. You'll have an easier time moving, or possibly donating, semi-complete systems than selling a pile of low-end, outdated parts.
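That break-even comparison, sketched with the post's own rough dollar figures:

```python
# When does the cheap fleet's upfront saving get eaten by its power bill?
nodes = 50
upfront_saving = (100 - 20) * nodes   # $100/node new build vs $20/node used
extra_power_per_month = 350           # midpoint of the $300-400 estimate

months_to_converge = upfront_saving / extra_power_per_month
print(f"${upfront_saving} saved up front; parity after "
      f"~{months_to_converge:.1f} months")
```

At the midpoint power estimate, parity arrives in roughly a year, well past the "6+ months" horizon where the comparison already stops being predictable.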
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 11:19:48 PM
 #85

Not to worrie about resale as I previously stated when to cost to mine out weights the benefits of mining we will be using the gpu "cluster" persay to work on some other projects and lease it out to researchers I'm sure there are alot of people that would love cheap access to that much power Smiley

Who knows, lol. I sure don't; we'll play it by ear


Smiley
allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 25, 2011, 11:27:01 PM
 #86

I don't disagree that running a single GPU per mobo consumes more power, but the proc is hardly used and the HDD won't be spinning constantly, so the actual penalty is minimal, just like if I were to run my PSUs on 240 instead of 120. But yeah, I get what you're saying. When we generate some more profit we may consider switching up hardware and power, and use the singles as backups and transitional boards while funding bigger setups. Who knows, this is a fly-by-the-seat-of-our-pants operation lol

Don't use a HD at all. Use a bootable USB stick with LinuxCoin on it. Also running linux ensures that you won't have that nasty 100% cpu usage issue windows miners have.

For PSUs I'd try to find 80 Plus Gold rated units if you're concerned about power efficiency - which you should be. You'd save 10 to 20 percent per machine in otherwise wasted energy.

Yep, fly by the seat of your pants operations are more fun Smiley Hey if you can have fun while doing it and make some BTC why not do it Smiley

allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 25, 2011, 11:29:34 PM
 #87

No worries about resale. As I previously stated, when the cost to mine outweighs the benefits of mining, we will use the GPU "cluster", per se, to work on some other projects and lease it out to researchers. I'm sure there are a lot of people who would love cheap access to that much power Smiley

Who knows lol I sure don't will play it by ear


Smiley

You live somewhere in Alberta right? Lease the computational power to the University of Calgary or some other western university.

w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 25, 2011, 11:32:03 PM
 #88

No worries about resale. As I previously stated, when the cost to mine outweighs the benefits of mining, we will use the GPU "cluster", per se, to work on some other projects and lease it out to researchers. I'm sure there are a lot of people who would love cheap access to that much power Smiley

Who knows lol I sure don't will play it by ear


Smiley

I don't know about that.

You've got a couple of things to overcome:

1) There are already brokers for general-purpose cloud-compute capacity, but nothing for GPGPU yet as far as I know.

2) Scientific computing tends toward Nvidia. Nvidia has been working on making itself a player in that area for several years, while AMD has just started. People with CUDA code aren't going to have any interest in your cluster.

I expect that #1 will be resolved before long if it isn't underway already. Once that happens, #2 will disappear too.

It would be very interesting to see bitcoin miners end up as next-generation cloud providers.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 11:35:53 PM
 #89

Yeah, Edmonton, and I was thinking about U of A researchers Smiley Not having the actual numbers, what do you figure the power usage would be for a single machine running a HDD versus a thumb drive? And I haven't used LinuxCoin yet, but is it fairly easy to overclock in? I would need to find a cheap source for a bunch of thumb drives in order to do this
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 25, 2011, 11:48:44 PM
 #90

You could certainly be right. I guess it depends on the group though; we shall see. At the very least I can think of a few small groups who would lease rendering time, as well as a few guys doing work on MD5 cracking and WPA stuff
JJG
Member
**
Offline Offline

Activity: 70
Merit: 20


View Profile
May 26, 2011, 12:06:35 AM
 #91

No worries about resale. As I previously stated, when the cost to mine outweighs the benefits of mining, we will use the GPU "cluster", per se, to work on some other projects and lease it out to researchers. I'm sure there are a lot of people who would love cheap access to that much power Smiley

Who knows lol I sure don't will play it by ear


Smiley

I don't know about that.

You've got a couple of things to overcome:

1) There are already brokers for general-purpose cloud-compute capacity, but nothing for GPGPU yet as far as I know.

2) Scientific computing tends toward Nvidia. Nvidia has been working on making itself a player in that area for several years, while AMD has just started. People with CUDA code aren't going to have any interest in your cluster.

I expect that #1 will be resolved before long if it isn't underway already. Once that happens, #2 will disappear too.

It would be very interesting to see bitcoin miners end up as next-generation cloud providers.

As someone who has worked with clusters and Universities in the past, I'd say he actually has a lot of obstacles to overcome.

For one, this is as garage-shop as it gets. That's perfectly fine for Bitcoin mining, but it's nowhere near the infrastructure required to quickly and easily load a client's software. Furthermore, the hardware outside of the GPUs is sketchy, to say the least; I doubt it has ECC memory or other dependable hardware. The overclocked GPUs are a non-starter as well, although that's easy to fix. Finally, he's going to need some serious bandwidth to his house to allow anyone to connect and move a reasonable amount of data into and out of the cluster. And if the room heats up too much and crashes a few machines in the middle of someone's compute time, it's all over.

Again, absolutely perfect for bitcoin work but a non-starter for academic purposes. Especially with Amazon EC2 providing quick and reliable instances that are billed hourly and available instantly.
allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 26, 2011, 12:27:32 AM
 #92

Yeah, Edmonton, and I was thinking about U of A researchers Smiley Not having the actual numbers, what do you figure the power usage would be for a single machine running a HDD versus a thumb drive? And I haven't used LinuxCoin yet, but is it fairly easy to overclock in? I would need to find a cheap source for a bunch of thumb drives in order to do this

All you would need is 4 GB USB thumb drives at most, which sell for like $12-20 each; I guess if you buy in bulk you can get a discount. Maybe something like this would be suitable:

http://www.newegg.ca/Product/Product.aspx?Item=N82E16820139291

A low-RPM HD should not eat that much power, but a thumb drive eats even less. We're talking watts vs milliwatts here. For example, a Samsung F1 640GB is rated 0.7A @ 12V and 0.5A @ 5V - that means about 11 watts max power consumption. A USB stick on the other hand will consume much, much less than that - 1 watt max. SSDs are pretty power efficient too, but they're more expensive.
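The rated-current arithmetic above can be checked directly: multiply each rail's amps by its volts and sum.

```shell
# Max draw of a drive rated 0.7 A @ 12 V plus 0.5 A @ 5 V
awk 'BEGIN { printf "%.1f W\n", 0.7 * 12 + 0.5 * 5 }'
```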

Bear in mind that a USB stick will typically be slower than a HD when booting from it, but who cares, because once the OS is booted you won't be using the drive much (maybe just once in a while for system logs and other such miscellaneous items, and even that can be disabled and optimized away).

The latest version of LinuxCoin contains all the tools needed to overclock and manage ATI cards. As for ease of use, it's as easy as using a command prompt. I don't think there are any GUI-based utils for Linux yet, though.

marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2348


Eadem mutata resurgo


View Profile
May 26, 2011, 01:38:45 AM
 #93


Definitely out of his depth ... but it is the best way to learn how to swim with the big boys .... good luck ... go for it!

Hint: blow the heat out up through the ceiling as much as possible and bring the new air in through the floor ... heat rises, the top stacks will be hottest ... good luck.
Q = ṁ · Cp · ΔT  (heat removed = mass flow rate of air × specific heat × temperature difference)

If you can't increase the temperature difference, you've got to move more fluid (air).
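Plugging rough numbers into that relation gives a ballpark fan size, using the usual imperial shortcut CFM = BTU/hr ÷ (1.08 × ΔT°F), with 3.412 BTU/hr per watt. The 9 kW load and 20 °C allowed rise below are assumptions for illustration, not measurements from the room:

```shell
# Rough exhaust-fan sizing from Q = m*Cp*dT
awk 'BEGIN {
    watts = 9000              # total heat dumped into the room (assumed)
    dT_F  = 20 * 9 / 5        # allowed 20 C rise over intake air, in Fahrenheit
    cfm   = watts * 3.412 / (1.08 * dT_F)
    printf "Need roughly %.0f CFM of through-flow\n", cfm
}'
```

Halve the allowed rise and the required airflow doubles, which is why hot summer days hurt so much.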

warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 26, 2011, 04:34:24 AM
 #94

Bah, I give up Tongue lol. I know that heat rises and cold stays low, lol. Give me a bit of a break


smooth
Legendary
*
Offline Offline

Activity: 2968
Merit: 1198



View Profile
May 26, 2011, 04:58:19 AM
 #95

bah i give up Tongue lol i know that heat rises and cold stays low lol give me a bit of a break

He's right about deltaT though.  It's going to be really hard to keep the room at a reasonable temperature when it's 35+ outside.  One approach is to just plan for a bit of downtime when it gets too hot. 
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 26, 2011, 05:22:54 AM
 #96

bah i give up Tongue lol i know that heat rises and cold stays low lol give me a bit of a break

He's right about deltaT though.  It's going to be really hard to keep the room at a reasonable temperature when it's 35+ outside.  One approach is to just plan for a bit of downtime when it gets too hot. 


What you really need for such an operation is a centralized throttle that communicates with an agent on each node to dial the clocks back on some or all of the installation as needed, instead of losing time entirely by shutting down. Ideally this would all be automated via feedback from temperature sensors spread around the room.
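A sketch of what such a throttle policy could look like, as a plain shell function. The trip point, step size, and clock numbers here are made up for illustration, and the aticonfig calls that would actually read temperatures and apply the clocks are left out:

```shell
# Hypothetical throttle policy: map a temperature reading (C) to a core clock (MHz),
# stepping down 50 MHz per degree above an 80 C trip point, never below 500 MHz.
throttle_clock() {
    temp=$1
    max_clock=900; trip=80; step=50; floor=500
    if [ "$temp" -le "$trip" ]; then
        echo "$max_clock"       # cool enough: run at full speed
        return
    fi
    over=$((temp - trip))
    clock=$((max_clock - over * step))
    [ "$clock" -lt "$floor" ] && clock=$floor
    echo "$clock"
}

throttle_clock 75   # -> 900 (below trip point, full speed)
throttle_clock 84   # -> 700
throttle_clock 99   # -> 500 (clamped at the floor)
```

An agent would run this in a loop, feed it the hottest sensor reading, and pass the result to whatever clock-setting tool the node uses.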
Gameover
Member
**
Offline Offline

Activity: 92
Merit: 10

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE


View Profile WWW
May 26, 2011, 05:40:11 AM
 #97

HDs are typically 5-15W, where a USB stick will be less than 1W; plus you might save some more watts by not having the HDD controller in use.

allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 26, 2011, 05:47:40 AM
 #98

bah i give up Tongue lol i know that heat rises and cold stays low lol give me a bit of a break

He's right about deltaT though.  It's going to be really hard to keep the room at a reasonable temperature when it's 35+ outside.  One approach is to just plan for a bit of downtime when it gets too hot. 


He needs to move this operation to the Northwest Territories (just above Alberta, where he is) or as close to the north pole as possible Tongue. When it's -30 outside I don't think cooling will be a problem. I'm being facetious here, but lol, someone crazy enough may do this Smiley

allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 26, 2011, 05:50:44 AM
 #99

HDs are typically 5-15W, where a USB stick will be less than 1W; plus you might save some more watts by not having the HDD controller in use.

He could also massively underclock each CPU and the RAM. If they're all AMD boards, then turn on that Cool'n'Quiet junk. If he wants to be even more extreme, maybe underclock the CPUs so much that he can passively cool them?

warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 26, 2011, 06:56:32 AM
 #100

What I was reading in the linuxcoin thread was this list

Enable overclocking ..
Code:
aticonfig --enable-od

next lets get the clocks

Code:
aticonfig --odgc --adapter=youradapter

Set them to what you want.

Code:
aticonfig --odsc 900,900 --adapter=youradapter

And commit them

Code:
aticonfig --odcc --adapter=youradapter

And set the fan speed to 70% (a good setting for most 5XXX&6XXX cards)

Code:
DISPLAY=:0.youradapter aticonfig --pplib-cmd="set fanspeed 0 70"

Here's a little bash magic to test your card temps.


Code:
while true; do
        aticonfig --adapter=0 --od-gettemperature | tail -n1 | awk '{print "Current temp: " $5}' ;
        aticonfig --adapter=1 --od-gettemperature | tail -n1 | awk '{print "Current temp: " $5}' ;
        echo $(aticonfig --odgc --adapter=0| grep GPU);
        echo $(aticonfig --odgc --adapter=1| grep GPU);
# Next lines are to check your balance if solo mining
#        BALANCE=$(bitcoind getbalance)
#        echo -ne "Bitcoin Balance: ${BALANCE}\r";
        sleep 35;
        clear
done

But I will look into adding AMDoverdrivectl. Diablo miner will be added on the next version and I will take a look at hashkill.

The MD5 hash doesn't match the file I downloaded from this site today.
my hash is   MD5: d50c4d3a38a1349111bac5f69b9571f1  linuxcoin-v0.1b-ati-cd.iso
while it says CD MD5: 71e5253bc5bda47003c57d71c52db869  linuxcoin-v0.1b-ati-cd.iso
My file size is 684 MB (717,387,776 bytes) on win7 32.
Who is wrong?

The md5sum for the CD version is 28bc5b424e3f33779f67fbe4fe1a9a67. I'll update now.

Don't forget the USB version is uploaded now Wink
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 26, 2011, 07:00:42 AM
 #101

With this, you're right, I could fairly easily adjust the clock speed to suit the cooling needs. The usual temp in Edmonton hovers around 25 in the summer and hits maybe 32-ish for two weeks; the bonus is that both sides of the room face north or south, so no direct sunlight Smiley


warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
May 26, 2011, 07:02:59 AM
 #102

I forgot to ask, though: I know Linux a bit, but to be honest I'm pretty much a n00b. I can make my way around the terminal and such, not much for GUIs etc., but is there any good software in the LinuxCoin distro for overclocking past what Overdrive will let you?

Assuming I can figure out why I'm getting RPC errors?
namley
Member
**
Offline Offline

Activity: 62
Merit: 10


View Profile
May 26, 2011, 07:09:28 AM
 #103

I know this is a bit off topic, but back to the beginning of the thread.

I am planning something similar, but on a much smaller scale

Do you think it would be more cost-efficient to isolate the racks in a semi-sealed environment and use portable air conditioners?

By that I mean: isolate an entire shelf with insulating plastic, available at most hardware stores, use velcro strips to make a door, and buy portable air conditioning units from Walmart.

Make an outlet to vent and keep your airflow (CFM) up.

I'm not an engineer by any means, if you haven't noticed yet.

Basiley
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
May 26, 2011, 07:24:18 AM
 #104

Use tubes attached to ordinary desktop/midtower case fans to duct the heat out rather than mixing it into the room and fighting the aerodynamics. PVC tubing isn't strong enough thermally, but the rest can do the trick.

If you prefer to cool the room entirely, as is usual for 19'' rack-based data centers, keep in mind that's not as energy-efficient (even counting the air losses in the tubes).
noobboon
Newbie
*
Offline Offline

Activity: 25
Merit: 0


View Profile
May 26, 2011, 11:14:16 AM
 #105

burn!!!!!!!!!

don't forget to call 911 Smiley
w128
Newbie
*
Offline Offline

Activity: 14
Merit: 0



View Profile
May 26, 2011, 12:18:07 PM
 #106

I know this is a bit off topic, but back to the beginning of the thread.

I am planning something similar, but on a much smaller scale

Do you think it would be more cost-efficient to isolate the racks in a semi-sealed environment and use portable air conditioners?

By that I mean: isolate an entire shelf with insulating plastic, available at most hardware stores, use velcro strips to make a door, and buy portable air conditioning units from Walmart.

Make an outlet to vent and keep your airflow (CFM) up.

I'm not an engineer by any means, if you haven't noticed yet.

Yes.

This is what I was talking about with partitioning early on.
bitcool
Legendary
*
Offline Offline

Activity: 1441
Merit: 1000

Live and enjoy experiments


View Profile
May 26, 2011, 02:17:13 PM
 #107

He needs to move this operation to the Northwest Territories (just above Alberta, where he is) or as close to the north pole as possible Tongue. When it's -30 outside I don't think cooling will be a problem. I'm being facetious here, but lol, someone crazy enough may do this Smiley
Here is the idea:
1. Gather all mining equipment you can get
2. Get one of these portable nuclear power station. http://www.hyperionpowergeneration.com/
3. Find a remote place in NWT at the cost of $1/acre. http://www.ehow.com/facts_6775117_homestead-act-canada.html
4. Build biosphere 3, powered by mining rigs.
5. The birth of Bitcointopia!

allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 26, 2011, 02:55:55 PM
 #108

I forgot to ask thou I know Linux a bit but to be honest I'm pretty much a n00b I can make my way around terminal and such not much for GUIs ect but is there any good software from overclocking past what overdrive will let you ? In the linuxcoin distro ?

Assuming I can figure out why I'm getting rpc errors ?

Well, you should be able to set any clock rate you want with AMDoverdrivectl or even with aticonfig. To be honest I haven't tried to overclock my cards under LinuxCoin, but theoretically it should work just the same as in Windows, except that you're doing it via the command line.

RPC errors mean your miner lost connectivity to the master bitcoin daemon. Are you mining solo or in a pool? This usually means there is a network connectivity issue or massive packet loss/lag.

allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
May 26, 2011, 02:57:59 PM
 #109

He needs to move this operation to the Northwest Territories (just above Alberta, where he is) or as close to the north pole as possible Tongue. When it's -30 outside I don't think cooling will be a problem. I'm being facetious here, but lol, someone crazy enough may do this Smiley
Here is the idea:
1. Gather all mining equipment you can get
2. Get one of these portable nuclear power station. http://www.hyperionpowergeneration.com/
3. Find a remote place in NWT at the cost of $1/acre. http://www.ehow.com/facts_6775117_homestead-act-canada.html
4. Build biosphere 3, powered by mining rigs.
5. The birth of Bitcointopia!



That Hyperion mini nuclear reactor is really cool. I'd definitely get one if I were super rich. I bet the thing costs millions of dollars, so I don't think it's within the reach of even the most hardcore bitcoin miner.

pwnyboy
Full Member
***
Offline Offline

Activity: 125
Merit: 100


View Profile
May 29, 2011, 06:56:37 AM
 #110

Yes.

This is what I was talking about with partitioning early on.

To add to this, I would think regular plywood for the partitions, assuming the original scheme as depicted in the OP's drawings is used. You could go as far as to space the plywood off of the shelving with a 2x4; this would give you almost 2" that you could use to staple insulation to the inside of the plywood. You can hinge the plywood at the top, where it fastens to the 2x4 beam, to make the shelves accessible by simply flipping the plywood up. If the shelves are made of a solid material as depicted, I'd envision ventilation ducts for input and output attaching to each side of the shelf ends (at a 90-degree angle to the plywood, which runs the length of the shelf: input at one end, output at the other). This would also be extremely easy to build up and test, because you could do it with just one level of one shelf. If you settled on a good vendor for the fans (some place akin to Walmart) which sold fans with a rated CFM, you could test, adjust, test some more, and have a pretty good idea of how your system will perform using the delta-T calculation provided above.

But that's merely conjecture, and math is hard.  It's much easier to just go shopping for video cards and ignore the details.. yanno?

mathx
Newbie
*
Offline Offline

Activity: 29
Merit: 0


View Profile
May 29, 2011, 07:30:48 AM
 #111

Oh geez, I didn't even read the whole thread. In a former life I ran a 10,000 sq ft data centre and built cluster computers for many different outfits.

Luckily we had real budgets, so we bought ACs and proper server rooms.

The cheaper solution is to use environmental cooling; see Intel's site for more info on that.

You can move regular-temperature air in; you just have to move it faster (much faster) as the temperature difference between the outside air and your cards shrinks (of course, the cards will just get hotter if you aren't cooling them enough, then fail or worse). Obviously having them throttle down based on operating temperature will keep your investment safe.

Try to keep your exhaust air as separate from the incoming air as possible - duct away from the cards as far as you can. Beware, however: if the incoming air fails for any reason, your ducting may insulate your cards from new cool air, and you'll quickly achieve burnout (or possibly fire). One way to handle this is to keep the two air-moving systems on the same circuit as the rig itself, so if the power goes out then everything is off, not just the AC/air mover. You don't want your gear in a small box with a tiny air buffer that it can heat to very hot in a short time while still powered. Thermal trips on the mobos will also help, of course; they can just shut off the whole system if it gets too hot.

Ducting upwards is a good way to keep the exhaust from one row away from the next (of course you need to get it back away from the shelf as per the diagrams first, so back, then up). So you need incoming air coming from the third dimension: in from the side, through the back, then up and out. Then there's no mixing of air, and no exhaust air becoming input air for the next row. (You could also bring air in through the bottom, but now you're talking raised floor.)

I'm thinking that at approximately 0.7-0.9 W/W to cool with a professional chiller, it isn't worth it. It's cheaper to throttle on hot days and use ambient air; 1.8x the cost of power is a big increase in cost.

A 1-ton chiller will cool 12,000 BTU/hr, or about 3,500 W of cooling, ish; that will cool 4,000-5,000 W of dissipated heat. A 1-ton chiller is about $5,000 USD new in places; I see some for $3K used. Install is going to be at least that (then count shipping).

 http://www.dimplexthermal.com/UsedChillerStore.aspx

15 tons for $18.5K, add $5-8K for shipping/install, assuming you already have some power busses in place for 3-phase etc. Pushing $25K gives you cooling using ~50 kVA for a 60-75,000 VA load (depending on many factors). 50 kVA is about $6.25/hr in my market (assuming you're not enjoying industrial rates, in which case it's about 60-90% of that), which is ~$30/day or ~$1K/mo. Not huge, but the capital and install cost will beat the operating cost in amortization for 24 months or more.
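The ton-to-watt conversion used above, spelled out (12,000 BTU/hr per ton of cooling, 3.412 BTU/hr per watt):

```shell
# Convert chiller tonnage to watts of heat removal
awk 'BEGIN {
    tons   = 15
    btu_hr = tons * 12000         # standard BTU/hr per ton
    watts  = btu_hr / 3.412       # BTU/hr per watt
    printf "%d tons = %d BTU/hr = about %.0f kW of heat removal\n", tons, btu_hr, watts / 1000
}'
```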

However, I think bigger blowers will give you more ambient environmental air; you just have to clean your gear more often. It'll turn black if you're in a city or near any roads, due to diesel particulate. Pretty gross.
Filtering the air just means lower airflow due to backpressure/drag; cleaning is cheaper Smiley. It doesn't affect operation for the most part, unless your fans are getting gummed up. Watch for humidity issues, however, in very dry climates/winter: you could get static and sparks and dead gear. Wetness won't be an issue if the room is always 20 °F warmer than outside... however, gear may really suffer when it's 90 °F out and 110-130 °F inside. So will the humans.



Chucksta
Full Member
***
Offline Offline

Activity: 168
Merit: 100



View Profile
May 29, 2011, 08:13:32 AM
 #112

LOL, mug... you've only got a month at best before this becomes unprofitable... that is, if the price does not change and we continue getting similar increases in difficulty.

I hope I am wrong, as it would be nice to make a bit more than a few hundred dollars profit from this.
CalibrataBG
Newbie
*
Offline Offline

Activity: 38
Merit: 0


View Profile
May 29, 2011, 09:08:36 AM
 #113


For mining, that's OVERKILL!

Question is, are you stupid enough to invest in this at this time, or do you know something we don't? Please do tell!
All distributed computing projects I know of are unpaid, made for dumb people who don't get that they're being used, so maybe I'm missing something...

PS. I wonder why you ask about cooling... when the answer is obvious...
colossus
Full Member
***
Offline Offline

Activity: 121
Merit: 100

Obey me and live or disobey and die.


View Profile
June 01, 2011, 05:37:37 AM
 #114

Remember to put this on YouTube after you're done building it.
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
June 02, 2011, 07:14:30 AM
 #115

It's all in the works. 27 miners are up ATM, but they're all in cases ATM; the others are a work in progress. We're having issues getting enough power to our location ATM, so we're looking into other options. As for the last poster: WTF are you talking about, man, are you on glue?

My power bill comes in at $700/mo right now.

At the current difficulty I'm mining an average of 35 BTC a day...

Do the math: it is certainly worth it for me to mine and expand my mining operations, and I put aside a certain %, dependent on a few variables, for expanding my mining operation to stay ahead of the game. Right now I'm operating at zero loss whatsoever and a rather large net profit.

So yes, I know something you obviously don't: mining is profitable
inh
Full Member
***
Offline Offline

Activity: 155
Merit: 100


View Profile
June 02, 2011, 01:14:34 PM
 #116

It's all in the works. 27 miners are up ATM, but they're all in cases ATM; the others are a work in progress. We're having issues getting enough power to our location ATM, so we're looking into other options. As for the last poster: WTF are you talking about, man, are you on glue?

My power bill comes in at $700/mo right now.

At the current difficulty I'm mining an average of 35 BTC a day...

Do the math: it is certainly worth it for me to mine and expand my mining operations, and I put aside a certain %, dependent on a few variables, for expanding my mining operation to stay ahead of the game. Right now I'm operating at zero loss whatsoever and a rather large net profit.

So yes, I know something you obviously don't: mining is profitable

So each machine only gets about 500 MH/s? Seems like there is a LOT of room for improvement here.
pwnyboy
Full Member
***
Offline Offline

Activity: 125
Merit: 100


View Profile
June 02, 2011, 01:28:59 PM
 #117


So each machine only gets about 500 Mh/s ? Seems like there is a LOT of room for improvement here.

The OP decided on the one-card-per-box strategy for all of the new deployments because he could get machines for something like $20 each (see previous pages in this thread).  It could be argued that, multiplying that figure by 4, he could've bought one of those MSI boards with slots for 4 cards, plus a low-power Sempron, for maybe $240 versus the $80.  But the power savings would likely take 6 or so months to pay back the difference... and one could argue that it's a wise move to pay the difference in power in the short term if his goal is to get as many rigs up as he can within a given budget constraint.  I'd likely employ the same strategy if I were him.
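The payback argument can be put in numbers. Everything below is an assumption for illustration (three extra ~100 W barebones boxes per four GPUs, $0.10/kWh power), not figures from the thread:

```shell
# Back-of-envelope payback for a 4-slot board vs four cheap single-slot boxes
awk 'BEGIN { extra_capital = 240 - 80; extra_watts = 3 * 100; kwh_price = 0.10; monthly = extra_watts / 1000 * 24 * 30 * kwh_price; printf "Premium $%d, extra power $%.2f/mo, payback %.1f months\n", extra_capital, monthly, extra_capital / monthly }'
```

With these assumed numbers the break-even lands in the same ballpark as the 6-or-so-month figure above; cheaper power stretches it out, pricier power shortens it.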
inh
Full Member
***
Offline Offline

Activity: 155
Merit: 100


View Profile
June 02, 2011, 01:37:27 PM
 #118

But you also decrease your footprint by 4 times, which allows for more efficient cooling; then you not only gain the power savings of going from four systems to one, but also the gain of going from four crappy PSUs to one decent one. It's all a trade-off. Personally I'd go for a more dense system.
Meatball
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250



View Profile
June 02, 2011, 01:46:54 PM
 #119

But you also deacrease your footprint by 4 times as well, which allows for more efficient cooling and then you not only gain the power savings of going from four to one systems, but also the gain in going from four crappy PSUs to one decent. it's all a trade off. Personally i'd go for a more dense system.

I don't know if I agree with that.  I've noticed that cramming 4 cards into one box puts off a lot more heat than a single card in the box.  Might be a wash with the fewer CPUs/PSUs, though...
airdata
Hero Member
*****
Offline Offline

Activity: 1148
Merit: 501



View Profile
June 02, 2011, 01:59:40 PM
 #120

http://www.newegg.com/Product/Product.aspx?Item=N82E16896808070&cm_re=air_conditioner_portable-_-96-808-070-_-Product

8000 BTU portable air conditioner, $229 + free shipping

allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
June 03, 2011, 12:30:32 AM
 #121

I think he's going to need more BTUs. Maybe two of these:

http://www.newegg.com/Product/Product.aspx?Item=N82E16896865291

Smiley

warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
June 03, 2011, 06:26:14 AM
 #122

pwnyboy hit the nail on the head: it's not about the long-term savings, it's about getting as many machines up as quickly as possible to combat the aggressive rise in difficulty. The other issue with running 4 cards rather than single cards is proper cooling (for us).

We opted for air displacement for cooling, and thus far we are maintaining an average of 65 degrees across all the cards.

Sadly we have to make do with our 100 amp drop, as EPCOR wants to charge a fortune for 100+ amp drops: transmission fees, the cost of dropping the line, and a bunch of other crap. It's ridiculous. Quick math actually shows that if we want a bunch of cards running ASAP, it's cheaper to rent an apartment with free utilities and house the new machines there along with the ones we are currently running. Though with the forming of a small Edmonton-based bitcoin group, I'm going to set up a meeting to see if people would be interested in leasing commercial space from a fellow I know, with a 450 amp drop to the location, already built for housing servers, with security, fire suppression and cooling.

-J

This weekend I shall try to put together a video showing all the rigs
allinvain
Legendary
*
Offline Offline

Activity: 3080
Merit: 1080



View Profile WWW
June 03, 2011, 05:24:00 PM
 #123

Cool. Can't wait to see the video or any future pics Smiley


phillipsjk
Legendary
*
Offline Offline

Activity: 1008
Merit: 1001

Let the chips fall where they may.


View Profile WWW
June 03, 2011, 06:24:47 PM
Last edit: June 03, 2011, 07:08:10 PM by phillipsjk
 #124

Sadly we have to make do with our 100 amp drop, as EPCOR wants to charge a fortune for 100+ amp drops: transmission fees, the cost of dropping the line, and a bunch of other crap. It's ridiculous. Quick math actually shows that if we want a bunch of cards running ASAP, it's cheaper to rent an apartment with free utilities and house the new machines there along with the ones we are currently running. Though with the forming of a small Edmonton-based bitcoin group, I'm going to set up a meeting to see if people would be interested in leasing commercial space from a fellow I know, with a 450 amp drop to the location, already built for housing servers, with security, fire suppression and cooling.

You are learning that the cheapest kilowatt (as in kilojoules per second) is the one you don't use.

With careful mapping of your wiring, you may not need 200 Amp service. Your service is 100Amp, 240V. Each "branch" can supply 80 amps continuously. If you know for a fact that different outlets are on different sides of the branch circuits, you probably don't even need 240V plugs. You can check this by using a multimeter to check the voltage between the "hot" contacts in the sockets. If they are on the same side of the circuit, the voltage difference will be less than a volt. If they are on different sides of the circuit, the voltage difference will be 240V.

One thing you will have to watch for is power factor. Most of my computers without active power factor correction have a power factor of about 0.67, which means that for every amp of current delivering real power, there is roughly half an amp of "wasted" current just heating the wiring. What that means for you is that you should derate each circuit by the power factor of the load. For example, on a 15 amp circuit you are allowed to draw 12 amps continuously (15 × 0.8). If your load has a power factor of 0.67, multiply that 12 amp rating by 0.67, giving about 8 amps delivering actual power (roughly 960 W at 120 V). I recently bought a 380 W power supply with active power factor correction; it cost just over $45, a little over your budgeted $20 per machine.
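The derating arithmetic above can be sketched as a small helper. This is just an illustration, assuming the 80% continuous-load derate and the ~0.67 power factor quoted for supplies without active PFC; the function name is made up for this example.

```python
# Hypothetical helper illustrating the circuit-derating arithmetic above.
# Assumptions: 80% continuous-load derate, and a 0.67 power factor for
# supplies without active power factor correction.

def usable_watts(breaker_amps, volts=120, derate=0.8, power_factor=0.67):
    """Real power (watts) a branch circuit can deliver continuously."""
    continuous_amps = breaker_amps * derate     # e.g. 15 A -> 12 A
    real_amps = continuous_amps * power_factor  # current doing real work
    return real_amps * volts

print(usable_watts(15))  # ~965 W on a 15 A / 120 V circuit
```

With active PFC (power factor near 1.0) the same 15 A circuit would deliver roughly 1440 W, which is why the $45 PFC supply can pay for itself in usable circuit capacity.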

You can free up about 40 amps (well, 32 amps after derating for continuous load) by replacing your electric stove with a gas stove. You could then get an electrician to repurpose that 240 volt circuit for the AC, or even for running some of the machines.

Of course, renting actual commercial space is an option. However, you should really decide whether you consider this a hobby or a business. If it is a hobby, you don't need every last machine running; you may even want to keep a few on standby and only run them if the network hash rate (and with it the difficulty) drops for whatever reason, or in the event of hardware failure. If you consider it a business, you may need to look into getting a business license. Miners should also be aware that there are only about 50,000 "winners" (blocks) every year.
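The "about 50,000 blocks per year" figure follows directly from Bitcoin's 10-minute target block interval; a quick back-of-envelope check:

```python
# Back-of-envelope check of the "~50,000 blocks per year" figure,
# assuming Bitcoin's 10-minute target block interval.
blocks_per_year = 365 * 24 * 60 // 10
print(blocks_per_year)  # 52560
```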

James' OpenPGP public key fingerprint: EB14 9E5B F80C 1F2D 3EBE  0A2F B3DE 81FF 7B9D 5160
warweed (OP)
Full Member
***
Offline Offline

Activity: 130
Merit: 100


View Profile
June 03, 2011, 09:08:07 PM
 #125

Thanks for the input Smiley it is appreciated. To be honest, I work full time, so for me this is a hobby; for my business partner this is his full-time job, and he is sustaining himself on bitcoins! He has 100% paid off his Visa and line of credit, bought a load of investments, and pays rent and utilities entirely with exchanged bitcoins Smiley


And he will continue to do so for as long as he can.

The reason I suggested commercial space is the need for constant expansion to keep up with difficulty. Unfortunately, the last few massive difficulty jumps dropped the earnings a fair amount, but with the rise in exchange value it evens out fairly well Smiley
Basiley
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
June 11, 2011, 05:06:44 PM
 #126

I'm not sure about mobile nuclear power stations (though Russia plans to start building and selling some soon), but using obsolete military bomb shelters and other well-built former industrial underground structures for a data center or mining operation is a good idea, investment, and decision,
especially if the site is genuinely bomb-proof, i.e. able to withstand at least older bombs (if not modern deep penetrators with active propulsion).
WiseOldOwl
Full Member
***
Offline Offline

Activity: 238
Merit: 100



View Profile WWW
June 11, 2011, 06:11:03 PM
 #127

Seal each rack's shelf:
intake cold air at a bottom corner and exhaust it at the top, kitty-corner from the intake.
I use this method for 1000 W grow lights.
It forces the cold air to rise through and across the hot cards, and the hottest air collects at the top, where the other fan is sucking it out.
Also, by sealing off the shelves you reduce the volume of air you need to vent every minute
to what's inside the shelves rather than the whole room (the room will still need some venting, but it won't be so hot you can't stand it), and you thus increase how often that air gets "cycled" or "vented", which is good.
A 6 or 8 inch duct to and from each shelf, moving a solid CFM, should do the trick. The intake air would most likely need to be chilled, but that depends on where you are: in northern Finland you could just pull from outside, while in Florida you will need some cooling, my friend, because it is summer.
Might I suggest building one shelf first and measuring the heat produced and the work necessary to eliminate it.
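A rough way to put a number on "a solid CFM" per shelf: the standard sensible-heat relation for air is BTU/h ≈ 1.085 × CFM × ΔT(°F). The sketch below assumes a hypothetical 1200 W heat load per shelf and a 20 °F allowed temperature rise; it is a ballpark estimate, not an HVAC design.

```python
# Rough airflow estimate per sealed shelf -- a sketch, not an HVAC design.
# Assumes the standard sensible-heat relation BTU/h = 1.085 * CFM * dT(F),
# a hypothetical 1200 W load per shelf, and a 20 F allowed temperature rise.

def required_cfm(watts, delta_t_f):
    btu_per_hour = watts * 3.412          # electrical watts -> BTU/h of heat
    return btu_per_hour / (1.085 * delta_t_f)

print(round(required_cfm(1200, 20)))  # ~189 CFM per shelf
```

Halving the allowed temperature rise doubles the required airflow, which is why pulling in cold winter air (large ΔT) needs far less fan capacity than cooling with 37 °C summer air.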

All in all it is a cool idea, and I wish I could do it.
No matter what, though, it can be done: just look at some serious data centers or huge multi-level grow operations. Both have extreme heat issues, and both run 24/7.
fcmatt
Legendary
*
Offline Offline

Activity: 2072
Merit: 1001


View Profile
June 11, 2011, 07:20:42 PM
 #128

MovinCool Office Pro 60. I have one in a colo room that needed extra cooling, and it is top of the line.
I paid around nine grand by shopping around. It is a 60,000 BTU, 5-ton unit that keeps its resale value, with very flexible options
for directing the cooling. Venting is easy. A real workhorse.
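To sanity-check whether a unit like that is oversized for a rig room, convert the electrical load to cooling load: essentially every watt the rigs draw ends up as heat, at 3.412 BTU/h per watt, and a "ton" of cooling is 12,000 BTU/h. The 4,000 W load below is a hypothetical figure for illustration.

```python
# Quick cooling-load check: does a given rig load fit under a 60,000 BTU
# (5-ton) unit? Assumes 3.412 BTU/h per watt and 12,000 BTU/h per ton;
# the 4000 W load is a hypothetical example figure.

WATTS = 4000
btu_per_hour = WATTS * 3.412     # heat the rigs dump into the room
tons = btu_per_hour / 12000
print(round(btu_per_hour), round(tons, 2))  # roughly 13648 BTU/h, ~1.14 tons
```

So a 4 kW rig room needs only a little over one ton of cooling; a 5-ton unit leaves ample headroom for expansion (or for hotter intake air).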