hey dogie,
thanks for taking the time to write all that out. I'm curious how hot the computer cases will get. As you said, we may need to open our cases and put a large box fan right over where the front panel would have been.
I'm going to wait for the cards to drop in price and probably pick up two. I'm into gaming and have two 6900s right now, which are barely keeping up. I'd like to go with two Maltas, but we'll see how other people fare first.
It will take even the most experienced vets by surprise just how much heat 600W is when it's no longer exhausted. The actual metal of my machine's case hits 80C, because the limit of cooling is the ambient temperature. So many fans on high, and it still can't get rid of that much heat and keep the case even under 40C.
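As a rough sanity check on why 600W of unexhausted heat overwhelms case fans, here is a small sketch of the airflow needed to carry a given heat load at a target temperature rise over ambient. The air properties and the 15C target are illustrative assumptions, not figures from the post.

```python
# Rough estimate of the case airflow needed to remove a heat load at a
# target temperature rise. Assumes sea-level air: density ~1.2 kg/m^3,
# specific heat ~1005 J/(kg*K). Numbers are illustrative only.

RHO_AIR = 1.2              # kg/m^3
CP_AIR = 1005.0            # J/(kg*K)
M3S_PER_CFM = 0.000471947  # 1 cubic foot/min in m^3/s

def cfm_required(power_w, delta_t_c):
    """Airflow (CFM) needed to carry power_w of heat at delta_t_c over ambient."""
    m3_per_s = power_w / (RHO_AIR * CP_AIR * delta_t_c)
    return m3_per_s / M3S_PER_CFM

# 600 W dumped into the case, air allowed to run 15 C above ambient:
print(round(cfm_required(600, 15)))  # ~70 CFM of actual through-case flow
```

Rated fan CFM is free-air flow; against real case restriction the delivered flow is much lower, which is why even many fans struggle with a fully recirculated 600W.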
|
|
|
I also have 2 non-Malta 7990s.
I get 1280 per board with 1050 cores, although also undervolted. Power consumption was 850ish from the wall, so minus the system and PSU losses that's about 310W per GPU.
So even though my partner boards have 3x 8-pins, they have the same power draw as these 'new' cards, and were cheaper. These are $1000 release cards, so £800 with VAT added. I paid £680ish, although the new ones come with more games.
However, I see a serious problem with these cards: they've been put in dual-slot coolers. At stock clocks they were loading just below the 80s, horizontally mounted in an open-air, single-card system, under a GPGPU compute load. ZERO air is exhausted; it's all left to be removed by the case.
My 3-slot cards and their 3kg of heatsink struggle together, even with a riser'd slot gap. The bottom card loads at 45-50C above ambient, and the top card at 55-65C above ambient. [Note: I don't have voltage control of the 2nd core on each card for some reason; hopefully this will arrive with newer drivers, as I do have control when there is only ONE card. This would improve temps slightly.] This is with a 1050 core.
Now I can honestly say there just isn't going to be any way you could put 2 of these cards in a case and expect the top one not to overheat. I have 3x 250mm and 5x 240mm of fans removing heat from cards that semi-exhaust; these don't exhaust at all. You will have to go open air, you just have to.
I'm more than happy with my 3-slot non-Maltas; I think I got it right.
Tldr:
- Partner cards (PCs) with the same hardware but better cooling already existed.
- The PCs have a higher extreme range due to 525W rather than 325W power delivery.
- The PCs struggle to dual-card [hottest cards touching 90C, throttling at 94ish]; these new cards will be REALLY REALLY hard.
- The PCs had triple-slot coolers; these are dual-slot.
- The PCs exhausted a good portion of the heat; these don't.
- Same performance; I get 1240Mh per card at 1050 cores.
- More expensive than the PCs, surprisingly...
|
|
|
And the usual question, why?
|
|
|
Schmidt is the guy who wants to make personal ownership of drones illegal.
Authoritarian from the word go.
Google + Bitcoin is not a good fit. Western Union makes sense - even Amazon. But not Google.
Ex-Googler here: things like this do interest Google; in fact, Bitcoin is exactly up their street. Anything that uses technology for 'the greater good' is a primary target. They will have a small team watching and waiting.
|
|
|
Possible reasons:
1) Overheating
2) Silly driver
3) A high-CPU-usage process just kicked in
4) Intensity setting not high enough
5) Crossfire being silly, if they are in CF
6) Custom clocks resetting; every time they reset they'll stop hashing for a bit.
|
|
|
It's really an interesting idea for keeping a mining rig cool during the summer.
But here are some of my concerns. I might be wrong in my calculation, because I've never dealt with datacenter space rental.
From the data you provided:
1U of rackspace comes with 90W of power and costs $20/month. Considering only the power (not the rack space) for now: powering a 4-video-card mining rig requires 7-9 blocks of 90W depending on the cards, so that's $20 for the first 90W block plus another 6-8 blocks at $15 each, which comes to about $90-$120.
So power alone will cost around $100/month/rig.
I've read through the forums here a little and saw that one GPU can use around 200-250W of power. The all-inclusive cost (conditioned power with UPS, bandwidth, rackspace) in the model I proposed is around $0.30/kWh ($0.30 * 0.09kW * 744h in a month = $20.09). Comparison: domestic/residential power in the US is around $0.10-0.17/kWh (source: http://www.eia.gov/electricity/monthly/epm_table_grapher.cfm?t=epmt_5_06_b), but don't forget air-conditioning expenses (without the optimization of a datacenter you are looking at a PUE of at least 2, so $0.20-0.34/kWh) and bandwidth (mining uses almost nothing, but you still have a minimum monthly internet bill to pay at home). That's not even counting the initial investment in a UPS and an air-conditioning unit. Now, back to space in the rack: a full-size video card is about 5 inches tall and a typical 2U is about 3.5 inches, so I don't know if there is enough room inside to accommodate a typical 4-video-card mining rig.
Might be a good idea for people who are using FPGAs, though?
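The cost comparison above can be sketched as a quick calculation. The ~1.1kW rig load (four ~250W cards plus ~100W for the rest of the system), the $0.12/kWh home rate, and the PUE of 2 are assumptions taken from the ranges in the post, for illustration only.

```python
# Back-of-envelope monthly power cost for a ~1.1 kW mining rig, comparing
# the quoted all-inclusive colo rate with home power plus A/C overhead.
# Load, rates, and PUE are illustrative assumptions from the post's ranges.

HOURS_PER_MONTH = 744  # 31-day month, as used in the quote above

def monthly_cost(load_kw, rate_per_kwh, pue=1.0):
    """Monthly cost in dollars for a constant load at a given $/kWh rate."""
    return load_kw * pue * HOURS_PER_MONTH * rate_per_kwh

rig_kw = 4 * 0.25 + 0.1  # four ~250 W cards plus the rest of the system

colo = monthly_cost(rig_kw, 0.30)          # colo rate already includes cooling
home = monthly_cost(rig_kw, 0.12, pue=2)   # home power with A/C overhead (PUE 2)

print(f"colo: ${colo:.0f}/month, home: ${home:.0f}/month")
```

Under these assumptions the two come out within a few tens of dollars of each other, which is why the decision ends up hinging on rack space and cooling rather than the raw power bill.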
Usually in rackmount servers a video card sits sideways, 90 degrees from the motherboard (facing the floor/top), connected to the motherboard by a riser card. In a 1U you can fit up to 2 PCI slots in a single case, depending on the case. In a 2U you can usually fit 4 or 5 half-height, full-length cards. Supermicro also has servers that go up to 4x GPU PCIe slots distributed over the chassis in 1U or 2U configs. Have fun cooling 1kW of heat in a 2U server. Ah yes, that's right: you can't. The whole idea fails on actually being able to do what you described, plus the fact it would cost about $1.5k a year per computer.
|
|
|
...but it was the Google factor which made me change my mind; as he said it was engineered by Google, I thought this was the best thing I could have in the world...
Google is well known for using commodity hardware; they don't use much specialized stuff. Moreover, if Google did buy stuff from them, even a simple component, there is no way this guy would be telling you about it, even if the said component was a simple screw and he was really drunk... In fact, especially if it were "designed by Google", this guy wouldn't even know he was selling to Google; it would be routed through a proxy. AFAIK Google builds their own systems in-house, and is very secretive about their hardware specs and numbers (and locations). Hope you get your money back... or at least teach the guy a lesson... I work(ed) for Google, and probably the only hardware they buy off the shelf is laptops and desktops. Everything else is custom built, outsourced to very specific top-range, secrecy-bound manufacturers, and reassembled in the US. NOTHING they buy would be in this form, as a one-off. It's a scam.
|
|
|
Overvolting is the only way. Mineral oil cooling is interesting; got any links on where to get the materials and how to do it properly? I'm thinking of dropping all my kit in oil...
Oil dipping does NOT work for 24/7 operation. The basic principle is that the oil has so much heat storage that you can game for 1-4 hours without temps rising significantly, and then slowly release the heat over the other 20-23 hours. If you're permanently mining, then heat in > heat out, and you'll have to turn it off a day a week.
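The thermal-mass argument above can be made concrete. The tank size (100L), oil properties, and a 300W net heat input (heat generated minus what the tank sheds passively) are assumed figures for illustration, not numbers from the post.

```python
# Why oil immersion buffers short sessions but fails 24/7: the oil's heat
# capacity only delays the temperature rise, it doesn't remove heat.
# Assumptions: 100 L mineral oil tank, ~0.85 kg/L, cp ~1670 J/(kg*K),
# 300 W of heat input in excess of what the tank sheds passively.

OIL_DENSITY = 0.85  # kg/L
OIL_CP = 1670.0     # J/(kg*K)

def temp_rise_c(net_power_w, hours, tank_litres):
    """Oil temperature rise (C) after `hours` of a constant net heat input."""
    mass_kg = tank_litres * OIL_DENSITY
    return net_power_w * hours * 3600 / (mass_kg * OIL_CP)

for h in (1, 4, 24):
    print(f"{h:>2} h: +{temp_rise_c(300, h, 100):.1f} C")
```

A few hours of load gives a rise the oil can absorb; run the same net input around the clock and the temperature climbs without bound, which is the heat-in > heat-out problem the post describes.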
|
|
|
DFI Lanparty UT Ultra-D NF4 and Zalman 9500CNPS-A LED. Oh, those were the days; I miss my Opteron 165 ![Sad](https://bitcointalk.org/Smileys/default/sad.gif)
|
|
|
I am indeed. It would still require escrow and release upon collection in person, thanks. Which corner of this fine country are you located in for collection? I don't think I can bring out the 1000 BTC offers, but I can still leave a dent.
|
|
|
PM price and basic logistical details please ![Smiley](https://bitcointalk.org/Smileys/default/smiley.gif)
|
|
|
Considering selling my 90 oz of silver bullion. I am located in Norway, so we can meet up, or I will ship anywhere in the world, but it makes the most economic sense to purchase from within Europe. Payment in BTC, BTC-E redeemable code, NOK, EUR, or USD. This is what I am offering: 90 x 1oz coins - Austrian Philharmonic rounds (they are in tubes of 20 each; you get the tubes). They are all obviously 99.99% pure silver. Sturles (http://bitcoin-otc.com/viewratingdetail.php?nick=sturles) has volunteered to be escrow if necessary. The price I want is a 10% markup above spot for fiat, 7.5% for BTC/mtgoxUSD. It is negotiable, but won't deviate much from that price. So let's get this straight: you're offering to sell us, at a 10% premium, silver we can buy at any other outlet in the universe? I feel honoured.
|
|
|
I'm waiting on mine from Asia, and if they don't arrive by Friday, I'm screwed.
It's really strange that NO ONE in the UK sells them, not one.
|
|
|
Hi, I've been running several 7950s at around 580Mh/s each for a while now. During the night, one card hung, so I restarted the rig in the morning. Now I'm only getting around 250Mh/s on that card, no matter what I do. Any ideas what is wrong?
Thanks
For completeness: I had a similar thing. It's caused by all the clocks resetting to their lowest possible values for some reason after a driver crash. So although the card looked like it was hashing fine, it was likely hashing at a 300MHz core clock. Fixed by manually setting the clocks back or reinstalling the drivers, as you found.
|
|
|
I interned at Google, and let's just say, Google > DDoS.
Nothing can touch those servers; it's practically impossible.
|
|
|
Contracts Sold
5193/6000
really?
this seems too good to be true
Of course it is. If you believed in this, you'd have shown £100k in share capital or attached it to an existing business. I do that with mine; no, in fact I do all my high-risk trading as a sole trader, which gives me unlimited liability. If you don't give, you don't get.
|
|
|
I've entered the website details and the details of the scam into the UK fraud squad's online reporting form.
Fraud is illegal.
I don't think there's a need to involve the authorities. This only affects the Bitcoin community; our responsibility is to alert others in the community. Problem solved, without potentially giving Bitcoin bad press. Criminals are criminals regardless of the area they operate in. This will be the same guy trying to scam old ladies because their 'Windows is broken'.
|
|
|
What's today's price? Shipping to the UK possible?
|
|
|