NOTE: Some info in this post is out of date. Read the thread for the most up-to-date info. I may create a new thread soon that has the info in a more concise format.

NOTE: All the parts and accessories for this build have been sold.
So I have been talking about this in other unrelated threads around here, and I figured it would probably be better to start my own topic for this project, to keep all the information in one place. There are obviously a large number of hurdles in this project that must be overcome before it will be useful, not the least of which is the driver limit of 8 GPUs. I will attempt to address these as I meander through the project - this is definitely a weekender since I am rather busy with daily work, so don't expect magical things to happen right away. Pics of the project will be kept in an Imgur album located here: http://imgur.com/a/0oist
EDIT: Parts list and cost table will be kept updated in the second post in this thread!
This is my quest to get the insane 18-slot PCI Express backplane from Trenton Technology working with video cards for Bitcoin mining. I was able to pick up the board for $600 - Trenton quoted me almost double that in single quantities. The one I got is unused.
To go with it, I need a Single Board Computer (SBC)/System Host Board (SHB) - Trenton prefers the term SHB, so I will stick to that when referencing it. Basically, the board with all the slots is just an expansion board - the actual computing power gets plugged into it just the same as all the video cards (or whatever else someone wishes to plug in). Since I do not need a lot of power to run miners, and I was trying to be cheap, I went with a refurbished NLT6313 (also from Trenton), which I obtained for $200. When it arrived, it appeared to be brand new; I couldn't tell that it had ever been used and then refurbished. To go with it, I got 8 GB of Hynix ECC DDR2 (4x 2 GB) for $30. This board has two LV (Low Voltage) Xeon server processors, which should also help save some power.
For power, I got myself a Dell 2360-watt power supply from their current-generation blade servers - it cost me $95. I still need to figure out how to turn it on. I will add more as necessary - this model is just a hair over 90% efficient at 50% load. [PDF link]
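As a quick sanity check on those PSU numbers, here is a short sketch of the wall-draw arithmetic at the quoted 50% load point. It assumes the efficiency is exactly 90% there (real curves vary with load and input voltage, and the spec sheet is the authority, not this snippet):

```python
# Rough wall-draw estimate for the Dell blade PSU described above.
# Assumption: efficiency is exactly 0.90 at the 50% load point.

RATED_W = 2360        # PSU rated output, watts (from the post)
EFFICIENCY = 0.90     # ~90% efficient at 50% load (from the post)

load_w = RATED_W * 0.5          # 50% load point: 1180 W delivered
input_w = load_w / EFFICIENCY   # power drawn from the wall
waste_w = input_w - load_w      # dissipated as heat inside the PSU

print(f"Output at 50% load: {load_w:.0f} W")
print(f"Input from the wall: {input_w:.0f} W")
print(f"Waste heat: {waste_w:.0f} W")
```

So even at half load, the supply itself sheds on the order of 130 W as heat - worth remembering when planning airflow for a rig this dense.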
Here are some pics to kick off the thread - click them for full-size images:
This is the SHB and part of the backplane.
This shows where the SHB plugs into the backplane.
Full frontal including the SHB plugged into the backplane.
Mission control, we have a problem! No way in hell are we going to fit a card into that last slot there....
IT MINES! CD drive and HDD disconnected a few minutes later; running on BAMT with a single 5850 for now.
Did you notice that it wasn't in the bottom slot? Yeah...
The PSU connector gets in the way of the bottom slot. Have we been reduced to only 16 usable slots? Not so fast....
It also obscures one of the ATX 12V connectors, and that is only a 5850. Imagine what a 5870 or 5970 would do!
Why am I not worried about the missing slots? Simple: the PSU connectors currently in use for testing are only the poor-man's connectors. See all the black terminal blocks across the back of the board in the third picture? That is where I will be putting all my power in, and those blocks are the same height as the PCIe connectors. (Read: everything is going to fit just fine.)

Oh, and what about those pesky CPU fansinks getting in the way of the last slot? Well, I have a couple of choices there: I could use that slot for a smaller card, or for a network/RAID/whatever card, or I could do away with the fansinks and replace them with something smaller. (Waterblocks, anyone?)