There's something wrong with the pool server: the miners keep getting empty work queues and sitting idle. I've been forced to switch back to BTC Guild for the time being until the Coinotron pool servers respond correctly. At other pools right now I get zero idle time and no empty queues. I'm starting to feel like I've wasted 1.5 days mining here, which means I've missed out on 2 BTC I should have earned. What is going on?
|
|
|
Linux question here. I've got the following code to grab health stats, and each command works on its own when I log in as a regular user. Some boxes execute it OK without an active X session on my console (via ssh); others require the X session. The GPUs are running fine and my mining is fine. The problem is when I try to run these commands from a shell script: they all complain about "No protocol specified". How can I fix this so they will run from a shell script via crontab?

export DISPLAY=:0
export AMDAPPSDKROOT=/opt/AMD-APP-SDK-v2.4-lnx64
export AMDAPPSDKSAMPLESROOT=/opt/AMD-APP-SDK-v2.4-lnx64
export LD_LIBRARY_PATH=${AMDAPPSDKROOT}/lib/x86_64:${LD_LIBRARY_PATH}
DEVICE=0   # adapter index; the original snippet used $DEVICE without setting it
FAN=`/usr/bin/aticonfig --pplib-cmd "get fanspeed 0" | grep "Fan Speed" | awk -F: '{print $3}'`
TEMP=`/usr/bin/aticonfig --odgt --adapter=$DEVICE | grep -o '[0-9][0-9]\.[0-9][0-9]' | sed 's/\.[0-9][0-9]//g'`
CLOCK=`/usr/bin/aticonfig --odgc --adapter=$DEVICE | grep "Current Clock" | grep -o '[0-9][0-9][0-9]' | head -n 1`
MEM=`/usr/bin/aticonfig --odgc --adapter=$DEVICE | grep "Current Clock" | grep -o '[0-9][0-9][0-9][0-9]' | tail -n 1`
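"No protocol specified" is X refusing an unauthorized client, which fits the cron symptom (cron jobs don't inherit the X session's credentials). A minimal sketch of the usual workaround, pointing the script at the session owner's authority file - the path here is an assumption, not something from the post:

```shell
#!/bin/sh
# Sketch: give the cron job the X session's credentials so aticonfig
# can talk to the display. The .Xauthority path is hypothetical; use
# whichever user actually owns the X session on :0.
export DISPLAY=:0
export XAUTHORITY="$HOME/.Xauthority"
```

With both variables exported at the top of the script, the aticonfig commands should behave the same under crontab as they do in an interactive login.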
|
|
|
I didn't build a fancy case because I thought it was cool;
I built it because I needed a way to mount a lot of equipment in a small space and vent it outside...
I'm saving a significant amount of money on AC by doing so, and my living room is more than 15F cooler.
This is the difference - I do not have my 16 GPUs in my living room, so my circumstances are different and thus require a server rack. If I were keeping my gear in my living room I would also go open-air.
|
|
|
Really? You're asking for tech support on a system we didn't sell? That's BS; you can't cool 3x 6990s in a conventional case.
Our Pro Rig with 3x 6990s typically runs under 80C in a 25C room. That's plenty cool enough for 6990s. I'm going to be honest here: I don't see how any of your rigs, especially the rental one, is anything close to cost effective. You're charging $100 over retail for 6990 cards, and the rental rigs would barely break even at 3 months for the cost of the contract. Maybe I'm wrong - maybe you can explain the financial benefit and cost/performance analysis that shows them being profitable for the buyer at the current difficulty level.
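The break-even argument above can be made concrete with back-of-the-envelope arithmetic. Every figure below is an illustrative assumption (not a price or revenue number from either poster), just to show the shape of the calculation:

```shell
#!/bin/sh
# Sketch of a break-even calculation. All three inputs are hypothetical:
card_cost=750        # assumed 6990 purchase price, USD
daily_revenue=9      # assumed daily mining revenue per card, USD
daily_power=1        # assumed daily electricity cost per card, USD
days=$(( card_cost / (daily_revenue - daily_power) ))
echo "break-even in roughly $days days"
```

The point of the exercise: at a fixed difficulty the payoff is linear in net daily revenue, so a $100 markup or a rising difficulty pushes the break-even date out fast.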
|
|
|
And the longer you wait.. with diff going up... it just gets worse and worse.
As long as you bring in more than you pay in electric then it's all good.
|
|
|
People need to put more Ghash into the BTC pool at CoinoTron - we're never going to solve a block if it's just me and one 600 Mhash guy, and we have a lot of shares actively submitted that will never pay out if no one mines BTC. Maybe some people can put 50% of their Mhash into BTC? Please!
BTW, the stats are reporting my Mhash incorrectly. It says I have 1.037 Ghash, but I'm actually running (confirmed on my cards and proxy) 3.2 Ghash/s. What is the averaging time period for the stats' Mhash/s value?
I appreciate your work, shotgun. Actually, I'm considering some kind of lonely-miner award or title for you: the Last of the Mohicans. You are totally right. People got mad about namecoins. When the difficulty rises over 100 tomorrow, "the last will be first, and the first will be last". My only hope is that they stay and mine BTC. Regarding your hashrate - I see ca. 2400 Mhash in BTC and 300 in NMC. The averaging period is 10 minutes.

Well, at this point I'm just going to keep my Ghash there and hope I find a block soon... I have 4 more cards coming next week, so expect that hashrate to jump a bit.
|
|
|
Take a look through the "Pictures of your Rig" thread. As far as I'm concerned any money spent on fancy boxes and desks is money not spent on GPU cards - and hence, wasted.

That assumes everyone has the same values and purpose. The guys running 42 cards from one rig in their mom's basement with 6 different fans from Home Depot, because they're 17 years old and don't know dick about distributed computing or clusters - those are one type of miner. There are others who approach it professionally and have experience in datacenters and HPC/clustering; those are the guys who don't consider infrastructure a waste of money.

Cost is relative. Personally I don't mind dropping several hundred on a managed switch for my rack, because I can watch port statistics and run multiple VLANs, but then there are other miners who cobble together a switch and router from a yard sale. It all depends on what you want out of your infrastructure and how much you are willing to invest in quality. Putting all your money into GPUs is one thing, but spending it all at the expense of a stable and reliable infrastructure is stupid.
|
|
|
"I'd rather commit suicide than hear the stock cooler during the night."

So don't keep the computer in your bedroom. Problem solved.
|
|
|
I'm already looking into it; my local retailer is looking into some servers... He found a few, so soon enough I'll have the prices and relay them to you guys so you can toss your ideas at them.
In regard to servers - you can build the same ones I have in my rack. If you want the Newegg parts list, let me know.
|
|
|
The DIY aluminum case (in the post you linked to) is pretty slick -- the main problem I have with it is time. I don't have that level of free time for mechanical tinkering. That thing costs *quite a bit* if you factor in even $8/hour for your time. True, there's the "learning experience" factor, which we've all indulged in while building rigs. But you only have so much time to learn.

I agree. I don't have the time to sit around cutting and drilling and measuring. A $50-60 4U rackmount case and a couple of 120mm fans is faster to deploy, easier to resell, easier to manage, and all around more professional. Imagine trying to resell that: "I made this case in my garage! It's really awesome, I swear."
|
|
|
That means the 3-GPU rig costs 20% more.
You can fit 2x (58xx or 6xxx, drawing a total of 300 watts through PCIe power connectors) and 2x 5770 (150 watts of PCIe power) cards into one case. I know this because I'm doing it. That keeps the cost down significantly, because you can run a 600-700 watt PSU depending on CPU. And once you get over the 3-box hump, you can't realistically put them in one location (unless you feel like spreading parts around on a large table or baking rack and filling the room with oscillating fans) without looking like a degenerate who doesn't care about consistency and stability. 4 cards per box is the best use of space when you're dealing with 3+ boxes. There's also the power distribution, networking cables, environmental cooling and heat exhaust, and administrative work to consider. Having all of your gear in an orderly arrangement keeps you sane. I know a lot of people here love open cases, but coming from a datacenter work environment, it just makes me think you're playing with toys when you could be approaching this in a serious manner.
|
|
|
How much does that thing weigh?
The rack without any equipment is about 100 lbs, maybe 125. It has wheels on it that can be removed, and I can move it around even fully loaded up.
|
|
|
I've been having some table-locking issues with the current database, since the schema creates all of the tables as MyISAM. As a MySQL DBA by profession, I would recommend that anyone running more than a handful of miners switch to InnoDB. You can run the following commands on the mysql command line to convert your tables:

alter table pool engine=innodb;
alter table settings engine=innodb;
alter table submitted_work engine=innodb;
alter table work_data engine=innodb;
alter table worker engine=innodb;
alter table worker_pool engine=innodb;

The schema file can be changed to create tables as InnoDB from the start by changing each line that says "ENGINE=MYISAM" to "ENGINE=INNODB".
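The schema-file edit described above can also be scripted rather than done by hand. A minimal sketch - the file path and the stand-in one-line schema below are placeholders, not the pool software's real schema:

```shell
#!/bin/sh
# Sketch: flip every table definition to InnoDB before loading the schema.
# A tiny stand-in schema is written here so the command can be shown
# end-to-end; in practice you would point sed at the pool's real schema file.
printf 'CREATE TABLE pool (id INT) ENGINE=MYISAM;\n' > /tmp/schema.sql
sed -i 's/ENGINE=MYISAM/ENGINE=INNODB/g' /tmp/schema.sql
cat /tmp/schema.sql
```

Running the substitution once against the schema file means fresh installs start on InnoDB and never need the per-table ALTER statements.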
|
|
|
Why buy a case? I run practically the same exact rig, sitting on its motherboard box, without any external fans, and the cards run right at 69C at 70%. Putting it in a case is silly and just jacks up the cost, because you'll need a more robust motherboard (like you pointed out) due to heat, plus the case and additional fans. Open air is the way to go for a mining rig.
An open rig works great until you run out of table space and actually need to organize more than a couple of open-air rigs. Density efficiency is important for large operations. Just ask Dimitri.
|
|
|
I certainly wouldn't run any of my cards without heatsinks. My advice would be to put the stock cooler back on if your aftermarket one doesn't fit.
|
|
|
Where did you buy the rack? And have you had any sort of overheating problems with it?
Thanks.
I bought the rack from Ray at http://www.rhfive.com - he's about an hour drive from me so I just drove on down and picked up the rack and drove it home. I bought the rack and the Sun servers from him before I started mining. The rack was $200, and he has a bunch of them. I've seen other racks on craigslist for $100 and $150 if you're not into the super awesome Sun Microsystems version. Overheating has not been an issue as all of the miners have multiple 120mm case fans running full blast. My cards are all running 62-68C at the moment.
|
|
|
Ok, care to explain that shotgun?
Sure thing. Here's my method. Install LinuxCoin on all boxes. On one box, or a separate box, install Bitcoin Mining Proxy ( https://en.bitcoin.it/wiki/Bitcoin-mining-proxy ), then set up all workers to go through the proxy. Now you never have to manually change pools. If you want to monitor the temps of the workers, run this via an ssh loop on your boxes:

#!/bin/bash
maxtemp="80"
running=true
while $running; do
  if [ "`aticonfig --odgt --adapter=0 | grep -o '[0-9][0-9]\.[0-9][0-9]' | sed 's/\.[0-9][0-9]//g'`" -gt "$maxtemp" ]; then
    echo "TOO HOT, shutting down miner"
    pkill -9 phoenix.py
    running=false
  fi
  sleep 10   # without a pause this loop busy-waits and hammers the driver
done

If you have ssh keys set up, you don't have to enter passwords to run the temp monitor. So now your boxes are easy to admin remotely, and you don't need anything other than a laptop or gaming box to admin them as needed. Then you put them in a server rack and put the server rack in your garage. Like this:

Ultra40 #1 (2x 6870 (MSI and ASUS))
Ultra40 #2 (2x 6870 (MSI and ASUS))
Quad0 box (currently 2x 6950 MSI, room for two more single-slot 5770s). Running a Sapphire Pure Black MB with 4x PCIe slots.
Sun X4600-M2 - not running
Sun X2100 #1 - not running, but maybe it will provide H/A for the next box...
Sun X2100 #2 - running BTC-Mining-Proxy, apache/mysql, monitoring apps
Tres0 (2x 6870 MSI Hawk, 1x XFX 5770), room for one more single-slot 5770. Running a Sapphire Pure Black MB with 4x PCIe slots.
Tres1 (2x 6870 MSI Hawk, 1x XFX 5770), room for one more single-slot 5770. Running a Sapphire Pure Black MB with 4x PCIe slots.
APC Smart-UPS 1500VA

I'm planning on removing the Ultra40 boxes and replacing them with duplicates of the custom builds (the tres0 and tres1 boxes). And of course I'll be removing the X4600-M2, since it's taking up space and I'm not utilizing it... so that and the rest of the empty space will be filled with more boxes like tres0.
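The monitor above only watches adapter 0; on the multi-card boxes in that rack list, each adapter needs its own check. A sketch of the same over-temperature test extended across several adapters - the adapter list is an assumption, and the aticonfig output format is assumed to match the script above:

```shell
#!/bin/sh
# Sketch: run the same 80C check over a few adapters instead of just one.
# Adapter indices 0-3 are a guess at a 4-card box; adjust to your hardware.
maxtemp=80
for adapter in 0 1 2 3; do
  t=$(aticonfig --odgt --adapter="$adapter" 2>/dev/null \
        | grep -o '[0-9][0-9]\.[0-9][0-9]' | sed 's/\.[0-9][0-9]//')
  if [ -n "$t" ] && [ "$t" -gt "$maxtemp" ]; then
    echo "adapter $adapter too hot (${t}C), shutting down miner"
    pkill -9 phoenix.py
  fi
done
exit 0
```

Absent adapters simply produce no temperature reading and are skipped, so the same script can be pushed unchanged to boxes with different card counts.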
|
|
|
Why wouldn't you want to use remote administration software? Wait... that means you compute near your rigs? Hmm. I never understood why people don't put them in a server rack and shove it in the garage or a closet; you can get a cheap used one for $100-150.
|
|
|
|