Couple of thoughts. You have 42 cards? What kind of GPUs are they?
Not sure how you are getting to 42,000 watts; that is WAY more than 42 cards' worth.
I run 4 R9 280X cards per rig, and each rig draws about 1200 watts, or roughly 300 watts per GPU. Not many cards draw much more than that (excepting the 7990). You should be around 10,000 - 12,000 watts total for 42 GPUs, so roughly 12 kW.
How much are you paying? Every data center I have dealt with has always been more concerned about power, more so than space, cooling, or even bandwidth. Power is the driving factor in a data center. I have seen some data center "specials" as low as $99 per kW, so that would put you around $1200 a month. Is this in the ballpark?
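Just to show my math, here's the estimate as a quick sketch. The 300 W/GPU figure and the $99/kW rate are from my own rigs and the hosting deals I've seen, not universal numbers:

```python
# Rough power and hosting-cost estimate for a GPU mining deployment.
# Assumptions: ~300 W per R9 280X-class GPU under load, and a
# data-center "special" rate of $99 per kW per month (both from my
# own experience, quoted above -- your numbers may differ).

WATTS_PER_GPU = 300
NUM_GPUS = 42
RATE_PER_KW_MONTH = 99.0  # USD

total_kw = NUM_GPUS * WATTS_PER_GPU / 1000   # 12.6 kW
monthly_cost = total_kw * RATE_PER_KW_MONTH  # ~$1,250/month

print(f"Total draw: {total_kw:.1f} kW")
print(f"Estimated hosting cost: ${monthly_cost:,.2f}/month")
```

So at those rates 42 cards lands right around $1200-$1250 a month, which is where my ballpark came from.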
On another note, 85 °C is pretty hot for the GPUs, and I would be a little concerned about their longevity if they run continuously at that level. Spreading them apart and maybe undervolting a little will work wonders for them... just a thought.
Lol did you not pay attention to most of my post?
I said the cards are going on risers so they're separated, and they'll run cooler. They're only packed next to each other temporarily.
108 total cards, ~80 being used in the pic. They're R9 290s. My rigs actually run at 1300 W, rather than the 1400 W I thought, so a total of about 35 kW.
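Redoing the earlier math with my actual numbers gets you to that ~35 kW figure. Here's a sketch of the arithmetic; the 4-cards-per-rig layout is carried over as an assumption from the earlier post, since I didn't state my rig size:

```python
# Re-running the power estimate with the corrected figures:
# 108 R9 290s total, and a measured 1300 W per rig (not the
# 1400 W I originally quoted). Cards-per-rig (4) is an assumption
# carried over from the earlier post in this thread.

CARDS = 108
CARDS_PER_RIG = 4      # assumed rig layout
WATTS_PER_RIG = 1300   # measured draw per rig

rigs = CARDS // CARDS_PER_RIG            # 27 rigs
total_kw = rigs * WATTS_PER_RIG / 1000   # 35.1 kW

print(f"{rigs} rigs, {total_kw:.1f} kW total")
```

That's how ~35 kW falls out of the measured per-rig draw, versus the 42 kW figure questioned above.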
I will not get into the exact price to host at the data center, but I will say it's about ~1.5x the original electricity cost, give or take.
For some reason, every time I tell someone the price I pay for any service, I get blasted and told I'm paying too much.