Author Topic: Colocation vs Small Warehouse  (Read 2754 times)
tagged (OP)
Sr. Member
Activity: 406
Merit: 251
December 21, 2013, 03:05:15 PM
#1

Is anyone colocating their rigs in data centers or running small warehouse mining operations?

I have 8 rigs, each running 4x R9 280X cards on 1300W PSUs, and each rig pulls 11 amps at the wall. There are a few in my house, a few at neighbors', and a few at friends'. I just ordered 8 more rigs worth of equipment, and I cannot keep spreading them out. Plus, come summertime people may not be as willing as they are now for the free heat :) (I pay them power money, they save on not heating their houses; it works out great right now!). My cousin also just ordered 8 rigs worth of equipment. He has a commercial mechanic garage that he CAN run these in, but he would rather consider whatever solution I come up with. His garage is not the most dust-free of places!
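
To sanity-check the load before shopping for space, here is a rough back-of-the-envelope sketch. The 120 V figure and the 16-rig total are my assumptions (8 rigs now plus the 8 on order); the 11 A per rig is the measured number above.

Code:
# Rough wall-power estimate for the rigs described above.
# Assumptions: 120 V outlets (standard US circuits), 16 rigs total once the
# second batch of 8 arrives; 11 A per rig is the measured figure from the post.

VOLTS = 120
AMPS_PER_RIG = 11
RIGS = 16

watts_per_rig = VOLTS * AMPS_PER_RIG        # ~1,320 W per rig
total_kw = watts_per_rig * RIGS / 1000      # ~21 kW for all 16 rigs

print(f"{watts_per_rig} W per rig, {total_kw:.1f} kW total")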

I am looking into several datacenters local to me in Charlotte, and am talking to one DC operator who is suggesting Arizona or Portland and shipping my rigs there. I would rather keep them close to home, but I am willing to consider anything depending on pricing.

So, what are you doing? 

1.
If you are in a warehouse, would you share some detailed pics of how you exhaust your heat and bring your power in? Can you share some of your costs to upfit the space for your operation and wiring? How much do you pay in electric and rent?

2.
If you are in a datacenter, would you post a pic of your rigs? (Are they rack mountable? How many U does each take, and how many can you fit into a rack? Do you pay by the cabinet, or what pricing have you found?)

3.
How are you making your rigs rack mountable, if you are? Your cost for materials? Your design for airflow?


Thank you all!
joshv06
Hero Member
Activity: 991
Merit: 500
December 21, 2013, 03:06:46 PM
#2

Dude, I was in the same boat as you like 2 weeks ago.

I am in Houston, TX. I was so close to getting a warehouse, but I thought about it, and while it's nice and cold now, I figured when summer rolled around it would be hard to cool 120,000 BTUs of heat. I also took into account possible internet/electricity downtime. I called around to datacenters; of the ~20 I talked to, only 2 could accommodate my custom rig.

Send the rigs down here, I could get you a good deal since I am already in and set up XD. They still have yet to cage my rig. This is 40,000 watts of electricity.
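
If anyone wants to double-check the cooling math, here is a quick conversion sketch; 3.412 BTU/hr per watt is the standard factor, and the 40 kW is my load from above (treat the "tons of A/C" figure as a ballpark only).

Code:
# Convert a continuous electrical load into the cooling load it creates.
# 1 W of load ~= 3.412 BTU/hr of heat; 1 "ton" of A/C = 12,000 BTU/hr.

WATTS = 40_000
BTU_PER_HR_PER_WATT = 3.412
TON_BTU_PER_HR = 12_000

btu_per_hr = WATTS * BTU_PER_HR_PER_WATT    # ~136,500 BTU/hr
tons = btu_per_hr / TON_BTU_PER_HR          # ~11.4 tons of cooling

print(f"{btu_per_hr:,.0f} BTU/hr, roughly {tons:.1f} tons of A/C")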




knedle
Member
Activity: 99
Merit: 10
December 21, 2013, 05:54:32 PM
#3

How does the cooling of those GPUs work? I'm asking because it looks like the fans are completely blocked, and in my case I need a rather big space for them to work and cool my GPUs correctly.

Also, answering the question in the topic:

I have another room rented in my office building with cheap and stable electricity (the power plant is 500 meters from my building), and two separate internet connections just in case.
mackminer
Sr. Member
Activity: 348
Merit: 251
December 21, 2013, 08:52:32 PM
#4

Are you using those GPUs for bitcoin mining?

tagged (OP)
Sr. Member
Activity: 406
Merit: 251
December 21, 2013, 08:54:41 PM
#5

Quote from: knedle on December 21, 2013, 05:54:32 PM

How many rigs do you have, and in how much space? I was considering office space but am concerned with heat.

Thanks for posting!
gamefixer
Sr. Member
Activity: 440
Merit: 250
December 21, 2013, 11:46:38 PM
#6

Wow, those pics of the hardware in the DC are pretty cool!
TheWoodser
Full Member
Activity: 188
Merit: 100
December 24, 2013, 12:10:59 AM
#7

Quote from: tagged on December 21, 2013, 03:05:15 PM

You have asked a lot of questions here and I hope to hit them all.

The main thing that you are paying for with a datacenter is their redundancy. Not only power, but internet connection and cooling as well. Like others have mentioned, the cooling is not such a big deal in the winter, but come summertime you could spend TONS of money running the A/C.

It really boils down to your power requirements and your tolerance for downtime. I recently purchased a TerraMinerIV and plan to sell shares. It is going to be worth it for me to put it in a datacenter, as the power where I live tends to go out in a storm. I also don't want to pay for the UPS systems or failover generator that would be required, and all of those expenses still would do nothing for network redundancy.

I plan on putting my TerraMinerIV at NetAccess in New Jersey. I got a real good deal, and I think it will be good for my shareholders... once I get around to selling the shares. MOST datacenters are going to want you to have something rack mountable. They will also charge you for space OR power draw, usually whichever is MORE expensive. Some of the cheaper datacenters only have between 4-20 kW per cabinet. If your rig draws 12 kW and they only have 10 kW per cabinet, they will charge you for 2 cabinets even if they are not completely full. (I hope that makes sense.) The racks at NetAccess have heat chimneys, so the cool air from the room gets drawn over your machines and then sucked into the return for the HVAC. Your best bet would be a design that is rack mountable, that intakes air from the front and blows it out the back.
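
To make the space-vs-power billing concrete, here is a minimal sketch of that math. The $/cabinet and $/kW rates below are made-up placeholders for illustration, not quotes from NetAccess or any other facility.

Code:
import math

def monthly_colo_cost(load_kw, kw_per_cabinet, price_per_cabinet, price_per_kw):
    """Bill for whichever is MORE expensive: cabinet space or power draw."""
    cabinets = math.ceil(load_kw / kw_per_cabinet)   # 12 kW into 10 kW cabinets -> 2
    space_cost = cabinets * price_per_cabinet
    power_cost = load_kw * price_per_kw
    return max(space_cost, power_cost)

# Example from above: a 12 kW rig in a facility with 10 kW cabinets.
# Placeholder pricing (assumed): $800 per cabinet and $120 per kW, per month.
print(monthly_colo_cost(12, 10, 800, 120))   # -> 1600, i.e. billed for 2 cabinets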

Let me know if you have any more questions.  Sorry, I don't have any photos.

Woodser

TheWoodser
Full Member
Activity: 188
Merit: 100
December 24, 2013, 12:13:23 AM
#8

JoshV,

Impressive setup! What is your airflow like? It looks like you did a great job getting those cards stacked in there, but there does not seem to be a solid airflow plan.

Woodser

joshv06
Hero Member
Activity: 991
Merit: 500
December 24, 2013, 02:19:16 PM
#9

Thanks guys!

The cards are supposed to be on risers and "floating" with about 1.5" of space between each other, with 4 cards per rig. I had some problems with the long risers I bought and ordered some replacements. The cards stay at around 85 degrees each right now, which is pretty good for being right next to each other. The DC stays pretty cool.

If anyone is around Houston, Texas, hit me up! I'm sure I can get you in for a good price! They are going to put a cage around us soon.

rudyo
Full Member
Activity: 170
Merit: 100
December 25, 2013, 08:31:55 AM
#10

A couple of thoughts. You have 42 cards? What kind of GPUs are they?

Not sure how you are getting to 42,000 watts; that is WAY more than 42 cards' worth.

I run 4 R9 280X cards per rig, and it draws about 1200 watts, or roughly 300 watts per GPU. Not many cards draw much more than that (excepting the 7990). You should be around 10,000-12,000 watts total for 42 GPUs, so 12 kW.

How much are you paying? Every data center I have dealt with has always been more concerned about power, more so than space, cooling, or even bandwidth. Power is the driving factor in a data center. I have seen some datacenter "specials" as low as $99 per kW, so that would put you around $1,200 a month. Is this in the ballpark?
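
Worked out, that estimate looks like this (the 300 W per GPU and the $99/kW "special" are the figures from above; everything else is just multiplication):

Code:
# Rough power and colo cost for a 42-GPU farm at the numbers quoted above.

GPUS = 42
WATTS_PER_GPU = 300        # approximate draw of an R9 280X under load
PRICE_PER_KW = 99          # the cheapest colo "special" mentioned, $/kW per month

total_kw = GPUS * WATTS_PER_GPU / 1000      # 12.6 kW
monthly_cost = total_kw * PRICE_PER_KW      # ~$1,250 per month

print(f"{total_kw:.1f} kW -> about ${monthly_cost:,.0f}/month at ${PRICE_PER_KW}/kW")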

On another note, 85C is pretty hot for the GPUs, and I would be a little concerned about their longevity if they are run continuously at that level. Spreading them apart and maybe undervolting a little will work wonders for them... just a thought.

joshv06
Hero Member
Activity: 991
Merit: 500
December 25, 2013, 06:16:37 PM
#11

Quote from: rudyo on December 25, 2013, 08:31:55 AM



Lol, did you not pay attention to most of my post?

I said the cards are going on risers to have them separated, and they will run cooler. They are only running next to each other temporarily.

108 total cards, with ~80 being used in the pic. They are R9 290s. My rigs actually run at 1300W rather than 1400W as I thought, so a total of about 35 kW.

I will not get into the price to host at the data center, but I will say it's about ~1.5x the original electricity cost, give or take.

For some reason, every time I tell someone the price I pay for any service, I get blasted and told I'm paying too much.
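
Just to put ballpark numbers on that "~1.5x the electricity cost" figure, here is a rough sketch. The $0.10/kWh rate is an assumed placeholder, not my actual rate, and I am reading "1.5x" as 1.5 times the raw power bill.

Code:
# Ballpark of what "hosting ~1.5x the raw electricity cost" means at ~35 kW.
# The electricity rate below is an assumed placeholder, not an actual quote.

LOAD_KW = 35
HOURS_PER_MONTH = 24 * 30
RATE_PER_KWH = 0.10          # assumed $/kWh, for illustration only
HOSTING_MULTIPLIER = 1.5

electricity = LOAD_KW * HOURS_PER_MONTH * RATE_PER_KWH   # ~$2,520/month in power
hosting = electricity * HOSTING_MULTIPLIER               # ~$3,780/month all-in

print(f"~${electricity:,.0f} in power, ~${hosting:,.0f} hosted, per month")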

TheWoodser
Full Member
Activity: 188
Merit: 100
December 25, 2013, 08:39:06 PM
#12

rudyo: What are you getting for the $99 per kW? Does that include a certain amount of space, or just total power?

I am going to go out on a limb here, but JoshV is either in a private datacenter or one that is in the business of "web wholesale," where they are selling their own hardware to host websites (AKA not colocating other people's hardware).

I have visited MANY datacenters in my time, and I can tell you that if they do ANY colocation service at all, everything is in cages or is sold by the rack. (They have to provide security within the data center to maintain SAS 70 or Uptime Institute validation.)

While JoshV might have a great situation (no one knows what deals he can pull for himself), either way you are going to be at the mercy of what you can get your hands on. Is JoshV's situation better than mine? I can't answer that. You really get what you pay for with a datacenter.

You are paying for redundancy in power, internet, and UPS. It depends on what you are willing to lose. The data center I have my stuff in has an SLA of 100%, BUT you are going to pay a shitload for that. During "Superstorm Sandy," when everyone else was out of power, my data center was renting desk space. They were on backup power and had fuel contracts in place to stay online for months if the grid didn't come back up... Maybe 99.999% is OK for you... Only you can answer that.
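
For reference on what those uptime numbers actually mean, here is a quick sketch of the downtime each SLA level allows per year:

Code:
# Downtime budget per year for a few common availability (SLA) levels.

MINUTES_PER_YEAR = 365.25 * 24 * 60

for sla in (0.99, 0.999, 0.9999, 0.99999, 1.0):
    downtime_min = MINUTES_PER_YEAR * (1 - sla)
    print(f"{sla:.3%} uptime -> {downtime_min:8.1f} minutes of downtime per year")

# 99.999% works out to roughly 5.3 minutes a year; a true 100% SLA means the
# provider owes you credits for any outage at all.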

Let me know what other colo questions you have. If you are interested in colo in the Northeast, I have contacts that may be of some help. (My friends own a Tier 2, a Tier 3+, and a Tier 4 datacenter.)

Woodser
