Bitcoin Forum
December 12, 2017, 01:25:49 PM *
News: Latest stable version of Bitcoin Core: 0.15.1  [Torrent].
 
Author Topic: Crazyness Build Time  (Read 3741 times)
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 17, 2011, 10:14:08 AM
 #1

Ok, so I'm now back on the "I will build the most badass hashing machine ever to grace the planet" kick again.
This time I am looking over this post here https://bitcointalk.org/index.php?topic=8573.msg578827#msg578827 to see what the possibilities are of adding some more GPUs via USB. Now this is where I ask you all to chime in.

Host systems are the following boards.

Msi 890-gd70
Msi 890-gd65
Foxconn POS that I am too lazy to check the model of, but it has 2x 16x and 2x 1x on it.
Msi 790-??? 2x 1x
Msi 840-g45

Now the way I see it, the power on these things works via a floppy connector with a 5v/12v connection to get the juice for the PCIe connection. I plan on supplying my 12v 6-pin power to either a 5830 or 5770 from an aux PSU in the form of a correctly balanced 80+ Silver unit, or running off of a PSU like https://cablesaurus.com/index.php?main_page=product_info&cPath=1&products_id=37&zenid=4dc76ae8853451f4a25565fdff6fd727 which is 12v @ 4.? amps and runs on 115v @ 360w. (If anyone wants to give me a 5830/PSU math check, I'm thinking 1 or 3 5770s.) So by my math, this could be a theoretical system build if this works.
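For a rough math check on the PSU question above, here is a small Python sketch. The card wattages are assumed ballpark TDPs (HD 5830 ~175 W, HD 5770 ~108 W at stock clocks), not measured values, and the 80% derating factor is just a conservative rule of thumb:

```python
# Rough PSU math check for the aux-PSU idea above.
# Card power draws are assumed ballpark TDPs, not measured figures.
CARD_WATTS = {"5830": 175, "5770": 108}

def psu_headroom(psu_watts, cards, derate=0.80):
    """Return (total card draw, usable budget after derating the PSU label)."""
    draw = sum(CARD_WATTS[c] for c in cards)
    budget = psu_watts * derate
    return draw, budget

# The 360 W Cablesaurus-style unit: one 5830, or one-to-three 5770s?
for cards in (["5830"], ["5770"], ["5770"] * 3):
    draw, budget = psu_headroom(360, cards)
    print(cards, draw, "W draw vs", budget, "W budget:",
          "ok" if draw <= budget else "too much")
```

On these assumed numbers, one 5830 or one 5770 fits comfortably, but three 5770s (~324 W) would exceed a derated 360 W unit.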

Msi 890-Gd65
2x Thermaltake 750w
6x 5830s on PCIe connections (4x 1x / 2x 16x)
2x 5830s on a USB boxy thing with a VisionTek 450w Juicebox for the last 2 (this will power 2 5830s, I have them, I run them, just don't put a 5970 on it.....RIP Sparky. EDIT: Sparky was the PSU, I didn't trash a $500 card)
DDR3 RAM (box o' RAM)
Sempron 130/140 or Athlon X2-ish thingies
SATA HDD because I love Win7

Now my concerns lie with the USB bandwidth (not in use by anything else on this machine) and of course with whether it really will work over USB. They have connections, I guess over mini HDMI, to an ExpressCard for laptops on PCMCIA, but looking at the hardware, I may not need it.

Outside of that, I have been looking at a few other options like the Magma ExpressBox 7 - x4G1 (starts at $2799; it would be gutted on site, fitted to a shelf with PCIe risers plus a PSU fitted with 130cfm fans, and attached to a 1U server for control). The only part I would have to buy would be the nearly $3k piece of hardware, which may sound crazy, but I found out they go for much cheaper sometimes.

So let me know so I can let my good buddy over at BitEgg and BitPizza order me 10-20 of these things and go nuts!
MadHacker
Full Member
***
Offline Offline

Activity: 207



View Profile
October 17, 2011, 03:37:16 PM
 #2

I think USB3 would have more than enough bandwidth.
I just don't know of any USB-to-PCIe dongles.
plastic.elastic
Full Member
***
Offline Offline

Activity: 168


View Profile
October 17, 2011, 07:45:40 PM
 #3

What makes you want to invest a large sum in mining hardware now? Don't you have to pay for electricity?

Shouldn't you just buy BTC outright at the current price?

Tips gladly accepted: 1LPaxHPvpzN3FbaGBaZShov3EFafxJDG42
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 18, 2011, 12:15:54 AM
 #4

What makes you want to invest a large sum in mining hardware now? Don't you have to pay for electricity?

Shouldn't you just buy BTC outright at the current price?



A) I run Shades Minoco Mining Contractors.
B) No, I do not pay for electricity directly; I have a datacenter and pay a hosting rate.
C) People keep wanting to buy mining power without the hassle, so I will keep making more rigs.
JoelKatz
Legendary
*
Offline Offline

Activity: 1582


Democracy is vulnerable to a 51% attack.


View Profile WWW
October 18, 2011, 12:25:03 AM
 #5

How do you connect a graphics card to a USB port?!

I am an employee of Ripple. Follow me on Twitter @JoelKatz
1Joe1Katzci1rFcsr9HH7SLuHVnDy2aihZ BM-NBM3FRExVJSJJamV9ccgyWvQfratUHgN
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 18, 2011, 01:36:15 AM
 #6

How do you connect a graphics card to a USB port?!

Well, it looks like you can take this 1x-to-ExpressCard item I linked to at the top of my post and connect it via USB. If that's not the case, you can use a PCMCIA interface for sure, as that has been confirmed, and with the use of cgminer you can OC the cards.

I'm looking at the hardware in the pictures and I see a female USB connection (odd that it is female) and I am hoping it will connect via that. I do have PCMCIA -> USB and PCMCIA -> PCI adapters at my disposal.

Edit: Thought I linked it up above, LIE
http://www.harmonicinversion.com/index.php?page=shop.product_details&product_id=193&vmcchk=1&option=com_virtuemart&Itemid=3

Want to order a bunch of them.
mmortal03 claims to have them working.
plastic.elastic
Full Member
***
Offline Offline

Activity: 168


View Profile
October 18, 2011, 02:38:30 AM
 #7

Using any PCI-E expansion board is not cost effective at all.

Either get an 8-PCI-E-slot board or use dual-GPU cards on a 4-PCI-E-slot board.

The driver limits you at 8 GPUs.

Tips gladly accepted: 1LPaxHPvpzN3FbaGBaZShov3EFafxJDG42
PatrickHarnett
Hero Member
*****
Offline Offline

Activity: 518



View Profile
October 18, 2011, 03:16:15 AM
 #8

I'll ask a question, having looked very hard at external PCIe connectors a while ago: how do you connect your mobo to the PCMCIA interface on the ExpressCard, which is a laptop-centric interface?

(That is, it is not a USB connection from the mobo - as far as I know, those don't exist yet. There is something like a PCIe splitter I think I saw a few months back - Cablesaurus maybe??)
plastic.elastic
Full Member
***
Offline Offline

Activity: 168


View Profile
October 18, 2011, 05:06:16 AM
 #9

I'll ask a question, having looked very hard at external PCIe connectors a while ago: how do you connect your mobo to the PCMCIA interface on the ExpressCard, which is a laptop-centric interface?

(That is, it is not a USB connection from the mobo - as far as I know, those don't exist yet. There is something like a PCIe splitter I think I saw a few months back - Cablesaurus maybe??)

There is a PCI-E card for ExpressCard. Tons on eBay.

This should be the least of your worries, though.

Tips gladly accepted: 1LPaxHPvpzN3FbaGBaZShov3EFafxJDG42
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218


Gerald Davis


View Profile
October 18, 2011, 03:06:43 PM
 #10

The larger issue is you are paying $55 per card for connectivity (plus the existing computer), and there is a limit of 8 GPUs in Windows (not sure about Linux).

Take a motherboard with 4 PCIe 16x slots, say $150, plus $20 RAM + $30 CPU = $200 for 4 slots = ~$50 per slot.

Your method costs more for more complexity.
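As a quick sanity check on the arithmetic above, a rough Python sketch using only the figures quoted in this post ($55 per USB adapter; $150 board + $20 RAM + $30 CPU for 4 slots):

```python
# Cost-per-slot comparison from the post above, using the thread's own prices.

def cost_per_slot_usb(adapter=55):
    # Every extra card needs its own USB/ExpressCard adapter.
    return adapter

def cost_per_slot_mobo(board=150, ram=20, cpu=30, slots=4):
    # One cheap board plus minimal RAM/CPU, amortised over its PCIe slots.
    return (board + ram + cpu) / slots

print(cost_per_slot_usb())    # 55
print(cost_per_slot_mobo())   # 50.0
```

So the adapter route costs slightly more per slot before you even count the extra complexity.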
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 18, 2011, 06:20:59 PM
 #11

The larger issue is you are paying $55 per card for connectivity (plus the existing computer), and there is a limit of 8 GPUs in Windows (not sure about Linux).

Take a motherboard with 4 PCIe 16x slots, say $150, plus $20 RAM + $30 CPU = $200 for 4 slots = ~$50 per slot.

Your method costs more for more complexity.


As I said, I don't care about cost here. Just looking at whether it works.
JoelKatz
Legendary
*
Offline Offline

Activity: 1582


Democracy is vulnerable to a 51% attack.


View Profile WWW
October 18, 2011, 06:37:26 PM
 #12

How do you connect a graphics card to a USB port?!
Well, it looks like you can take this 1x-to-ExpressCard item I linked to at the top of my post and connect it via USB. If that's not the case, you can use a PCMCIA interface for sure, as that has been confirmed, and with the use of cgminer you can OC the cards.
The USB port on the adapter is not usable to connect the graphics card. It only connects the adapter. (Much like the USB ports on many televisions.)

I am an employee of Ripple. Follow me on Twitter @JoelKatz
1Joe1Katzci1rFcsr9HH7SLuHVnDy2aihZ BM-NBM3FRExVJSJJamV9ccgyWvQfratUHgN
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 19, 2011, 12:28:42 AM
 #13

Ok, well I have 4 PCMCIA connectors so I'm going to try it out on them when they arrive.
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218


Gerald Davis


View Profile
October 19, 2011, 12:59:04 AM
 #14

Ok, well I have 4 PCMCIA connectors so I'm going to try it out on them when they arrive.

The link you provided is for an ExpressCard adapter. ExpressCard is the replacement for PCMCIA but is incompatible.

http://en.wikipedia.org/wiki/ExpressCard
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 19, 2011, 01:17:57 AM
 #15

Ok, well I have 4 PCMCIA connectors so I'm going to try it out on them when they arrive.

The link you provided is for an ExpressCard adapter. ExpressCard is the replacement for PCMCIA but is incompatible.

http://en.wikipedia.org/wiki/ExpressCard

This is why I post. Good catch on that one.
JoelKatz
Legendary
*
Offline Offline

Activity: 1582


Democracy is vulnerable to a 51% attack.


View Profile WWW
October 19, 2011, 01:54:11 AM
 #16

Ok, well I have 4 PCMCIA connectors so I'm going to try it out on them when they arrive.
PCMCIA cannot be adapted to PCI Express. Only a purely generic bus or something that is basically already PCI can be adapted to PCI Express.

I am an employee of Ripple. Follow me on Twitter @JoelKatz
1Joe1Katzci1rFcsr9HH7SLuHVnDy2aihZ BM-NBM3FRExVJSJJamV9ccgyWvQfratUHgN
PatrickHarnett
Hero Member
*****
Offline Offline

Activity: 518



View Profile
October 19, 2011, 02:33:45 AM
 #17

Ok, well I have 4 PCMCIA connectors so I'm going to try it out on them when they arrive.
PCMCIA cannot be adapted to PCI Express. Only a purely generic bus or something that is basically already PCI can be adapted to PCI Express.

I thought they were trying to use an interface that is used in some laptop connections to link to PCIe - there are some cumbersome methods to do that.

If money is really no object, they should be adding an external PCIe enclosure.
sadpandatech
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 19, 2011, 02:53:44 AM
 #18

Ok, well I have 4 PCMCIA connectors so I'm going to try it out on them when they arrive.

The link you provided is for an ExpressCard adapter. ExpressCard is the replacement for PCMCIA but is incompatible.

http://en.wikipedia.org/wiki/ExpressCard

This is why I post. Good catch on that one.


  There are also 2 different-size ExpressCard slots: ExpressCard/54 and ExpressCard/34. Not much of a fan of laptops..
  In any event, there is an adapter that will convert ExpressCard/54 to USB. Its intended purpose was for laptops with older slots to be able to use ExpressCard over USB. Can't be assed to find you a link, sorry. ;p

  Big Bang Marshal ftw if cost is no object.
  
  If you have a free PCI slot and PCMCIA, as you say, that you can use for external, there are these here: http://www.adexelec.com/cb.htm#PCICBI   I did not see anything listed, but this company has a HUGE selection of PCIe risers, etc. Might be worth a call to see if they make a 'splitter' of sorts.

If you're not excited by the idea of being an early adopter 'now', then you should come back in three or four years and either tell us "Told you it'd never work!" or join what should, by then, be a much more stable and easier-to-use system. - GA
It is being worked on by smart people. -DamienBlack
catfish
Sr. Member
****
Offline Offline

Activity: 270


teh giant catfesh


View Profile
October 19, 2011, 08:05:15 AM
 #19

Any particular reason why you want so many GPUs connected to one logic board? You have a ton of issues to contend with - firstly the actual connectivity (PCI express can't just be magicked from USB without another bridge board that is a rare requirement, low volume and hence expensive), then the power supply considerations (I've done multiple PSUs and I simply don't trust them) - and eventually, going to the extremes, aren't you going to hit cable length limits at some point? The electronic engineers here can put me straight on this one but I'm sure that PCI express in particular isn't fond of metre-long cables. Could be wrong - bitcoin mining doesn't stress the memory bandwidth between GPU and board anyway.

It's a lot easier to standardise on a micro-ATX board with 4 PCIe slots, with 4 GPU cards of your choice. The setup fits into a cube, requiring a PSU somewhere else.

I've nearly finished my Shelf Rig Mk II - using this exact concept. The picture is below (poor quality - only got it up and running late last night) but there are three Gigabyte H61M-D2-B3 boards, each with four GPUs (of different types in my case). The boards are inexpensive and REALLY reliable - no messing about, all four PCIe slots accept GPUs with x1 -> x16 extenders without any fuss, and the CPU / board / northbridge can be tuned to use very little power, leaving most of the PSU juice for the GPUs.

With cash not a consideration, I'd run quad 6990s or something insane like that Smiley As Messhead pointed out, the drivers limit you to 8 devices anyway so there's little point going further. A Marshal Big Bong or whatever filled with dual-GPU cards simply won't work unless you write your own drivers. Caveat - I'm using Linux and the 11.6 Catalyst proprietary drivers... the 8-device limit may be lifted in the latest drivers. YMMV.

Modular approaches like this also mean that a catastrophic failure of a PSU (e.g. sending a huge spike down all the 12V cables) wipes out a board and 4 graphics cards, not a hugely expensive rack of custom-hardware exotica...

My design is cheap, as you'd expect from a hobbyist. A professional refactor of the basic concept presented here would be cheap to build and easy to maintain.


...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
sadpandatech
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 19, 2011, 11:39:11 AM
 #20

  That's pretty damn slick in design and spec, catfish. Mad props! 

If you're not excited by the idea of being an early adopter 'now', then you should come back in three or four years and either tell us "Told you it'd never work!" or join what should, by then, be a much more stable and easier-to-use system. - GA
It is being worked on by smart people. -DamienBlack
sadpandatech
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 22, 2011, 03:51:55 AM
 #21

  To add to my post before about external connections: if your only option is to connect with that ExpressCard adapter, there is, as I stated, an ExpressCard-to-USB adapter (universal, both EC54 and EC34).
 Since I did not provide a proper link before, I hunted one down. Only $15.99 too.. There is a catch though, and that is that the ExpressCard adapter you use from the vid card cannot be the PCIe type. They say it will just not detect. That can likely be hacked if the guys making the other adapters have not already thought of that....

  http://www.amtron.com/expresscard/usbexp54b.htm


 
  I have been searching my nuts off for a straight-up PCIe-to-USB converter. They were seen as useless for the most part when companies were researching external plugs, due to the low bandwidth of USB 2.0 (~480Mbps) versus ExpressCard or custom solutions being very close to PCIe.

  But, we know mining uses very little bandwidth. And, with USB 3.0 having much more bandwidth, hopefully some of these options can be reconsidered.

  I wonder how hard it would be for the community here to design a board that converted PCIe to USB, with varying numbers of PCIe slots? I know it would add very little in the way of efficiency to existing PC power consumption. But it would be more efficient nonetheless. Especially on very low-power, underclocked laptops....
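To put numbers on the bandwidth point above, a back-of-envelope Python sketch. The work-unit size and request rate are assumptions for getwork-style mining, not measured figures:

```python
# Back-of-envelope: how much bus bandwidth does mining actually need,
# versus what USB offers? Work-unit numbers below are rough assumptions.
USB2_MBPS = 480      # USB 2.0 signalling rate
USB3_MBPS = 5000     # USB 3.0 signalling rate

def mining_mbps(work_bytes=4096, exchanges_per_sec=10):
    """Assumed: a few KB of work data exchanged a handful of times a second."""
    return work_bytes * exchanges_per_sec * 8 / 1e6

need = mining_mbps()
print(f"mining needs ~{need:.2f} Mbps")
print(f"USB 2.0 headroom: {USB2_MBPS / need:.0f}x")
```

Even with generous assumptions, mining traffic is a fraction of a megabit per second, so on these numbers either USB generation has orders of magnitude of headroom; the hard part is the PCIe protocol bridging, not the bandwidth.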


  Cheers

If you're not excited by the idea of being an early adopter 'now', then you should come back in three or four years and either tell us "Told you it'd never work!" or join what should, by then, be a much more stable and easier-to-use system. - GA
It is being worked on by smart people. -DamienBlack
catfish
Sr. Member
****
Offline Offline

Activity: 270


teh giant catfesh


View Profile
October 22, 2011, 12:02:06 PM
 #22

To be frank, I can't see any demand for it. USB doesn't supply enough power either - PCIe is itself meant to supply up to 75W.

Firewire, however, is a different kettle of fish and I could see Firewire being adapted to daisychain a load of GPUs...

And thinking of other Mac technologies - that Thunderbolt thing is more or less DisplayPort and probably has the bandwidth to support multiple external GPUs. That's probably where the tech is going.



What would be more useful to me as a bitcoin miner would be a PCIe lane splitter. My favourite 'old faithful' logic board, the Gigabyte H61M-D2-B3, isn't the best example because it's ideal for mining with little wastage. It has one x16 slot, and three x1 slots. I use x1 -> x16 extenders exclusively in my mining rigs, so having lots of x1 slots is useful.

However some of my logic boards have multiple x16 slots - for 'crossfire' and whatnot. Now, if a PCIe *slot* is an x16 spec, does this mean it has 16 lanes? And if a PCIe *slot* is x1 spec, it has 1 lane? If so, then there must be some electronic possibility to connect the pins from the x16 slot into 16 separate x1 PCIe lanes, surely?

It'd obviously need powered risers because each *slot* needs to deliver 75W (or the daughterboard that converted the single x16 slot into 16 separate x1 slots could have its own adapter from the PSU).

But if PCIe lanes actually work like this, and having 16 lanes in one slot can be split up into separate slots, then *very* cheap and small logic boards with only ONE PCIe x16 slot could be made into 'ultimate' mining boards.

I'd be willing to pay for this, if it worked. The logic board situation would become trivial - I'd buy those AMD Hudson integrated thingies - cheap as chips - and then plug in the daughterboard for the multiple x1 slots. It could even be designed such that the x1 slots were actually x16 length and spaced apart properly - maybe no connections to the 'end' pins, but it'd allow many GPUs to be plugged into slots without needing extender cables, and mounting the daughterboard solidly would make all these complex mining rig builds unnecessary.

Anyone know enough to tell me that I'm talking rubbish? Even if the x16 slot can only be split into 4 x1 slots, that'd still be a win if the daughterboard was built to accept properly spaced GPUs and had a long enough cable to plug into the logic board's x16 slot - with my Gigabyte board, I'd end up having 7 PCIe x1 slots - four of which wouldn't require extender cables...

Would this be better off in its own thread?

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218


Gerald Davis


View Profile
October 22, 2011, 01:57:32 PM
 #23

Yes and no.

PCIe has a concept of both lanes and ports. So the southbridge has those 16 PCIe lanes configured as a single port. When you see a board that has 2 slots and can run as 1x PCIe x16 or 2x PCIe x8, it means the PCIe switch inside the southbridge is capable of dynamically configuring the port-to-lane assignments.

Just splitting the lanes physically will do no good. The PCIe switch inside the southbridge won't "know" there should be 3 or 7 more logical ports.

So what you need is a PCIe switch (a chip which can route PCIe lanes to ports), and that would require a custom PCB.

There are existing solutions which provide PCIe expansion, but they are way too expensive. Here is one example:
http://www.magma.com/expressbox16basic.asp
It adds 16 PCIe slots in an external chassis from a single 16x port. So hypothetically, take a MB w/ 4 PCIe 16x slots, connect each one to a 16-bay expansion chassis, and gain 64 slots for GPUs.

Two problems. One, the chassis and expander is an insane $4500. Two, AMD stupidly limits you to 8 GPUs.

Given the 8 GPU limit, I have found the easiest, fastest, and most efficient setup is a motherboard with 3x 16x slots spaced two apart, with 3x 5970s for 6 total GPUs. No other method seems to come close in terms of cost.
shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 22, 2011, 05:42:50 PM
 #24

I've been looking into Magma products for a while now, and while I love the company's products, with them you are paying for enclosure and cabling.
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218


Gerald Davis


View Profile
October 22, 2011, 10:33:53 PM
 #25

I've been looking into Magma products for a while now, and while I love the company's products, with them you are paying for enclosure and cabling.

A couple of other companies make bare backplanes. Price is still astronomical. Like >$2K for 16 slots and >$1K for 4 slots. IIRC Magma doesn't manufacture their own backplanes; they are just a value-added reseller putting OEM parts together with warranty and service.
PatrickHarnett
Hero Member
*****
Offline Offline

Activity: 518



View Profile
October 23, 2011, 06:09:30 AM
 #26

Don't know if it is still current, but I was looking at this last year (for a different application) and found some Compaq stuff. 4x 16x slots for $3.5k from memory - there were cheaper ways to do this, and having boards with 4 GPUs is OK for what I'm playing with.
catfish
Sr. Member
****
Offline Offline

Activity: 270


teh giant catfesh


View Profile
October 23, 2011, 12:48:44 PM
 #27

Yes and no.

PCIe has a concept of both lanes and ports. So the southbridge has those 16 PCIe lanes configured as a single port. When you see a board that has 2 slots and can run as 1x PCIe x16 or 2x PCIe x8, it means the PCIe switch inside the southbridge is capable of dynamically configuring the port-to-lane assignments.

Just splitting the lanes physically will do no good. The PCIe switch inside the southbridge won't "know" there should be 3 or 7 more logical ports.

So what you need is a PCIe switch (a chip which can route PCIe lanes to ports), and that would require a custom PCB.

There are existing solutions which provide PCIe expansion, but they are way too expensive. Here is one example:
http://www.magma.com/expressbox16basic.asp
It adds 16 PCIe slots in an external chassis from a single 16x port. So hypothetically, take a MB w/ 4 PCIe 16x slots, connect each one to a 16-bay expansion chassis, and gain 64 slots for GPUs.

Two problems. One, the chassis and expander is an insane $4500. Two, AMD stupidly limits you to 8 GPUs.

Given the 8 GPU limit, I have found the easiest, fastest, and most efficient setup is a motherboard with 3x 16x slots spaced two apart, with 3x 5970s for 6 total GPUs. No other method seems to come close in terms of cost.
OK so it's possible, just expensive.

The trouble with *your* solution, DAT, is that finding 5970 cards has become rather difficult, not to mention rather expensive in hash per £.

The video card vendors, certainly here in the UK, seem to know damn well that bitcoin miners want the 58xx/5970 cards at any cost and are pumping up the prices. The cheapest I've seen for a usable card (over 200 MH/s for me) is £80 for the 5770. At those prices, when the dual-GPU single cards are costing £600 or more, it's almost worth buying two of my favourite Gigabyte 4-slot boards and running 8 of the 5770s (which all reliably clock up to 220 MH/s each) - the disadvantage being space, and potentially efficiency.

I must say that I'm surprised at the cost of the 'lane splitter' solutions. Entire logic boards (like my beloved Gigabyte H61M-D2-B3) cost around £55 inc VAT these days. There's a southbridge on there which splits the PCIe bandwidth into one x16 slot and three x1 slots. Hence the chip logic to split PCIe lanes into different configurations can't be *that* expensive, can it?

I've got an Asus logic board which I've given up on - I bought it for its 5 PCIe slots - and three well-spaced x16 slots (it'd be a candidate for your preferred format, since you'd get your three 5970s onto the board with no extender cables required). A feature in its BIOS allows you to split the bandwidth across the physical PCIe slots - you can disable the two x1 slots and give maximum bandwidth (x16, x8, x8) for three GPUs, or you can have two full-fat x16s with the rest disabled, or the 'x1' config I was trying to get to work, where all slots are available at x1 bandwidth.

This board caused me so many problems (trying to get all 5 slots working with GPUs) that I've taken the CPU out and put it in another Gigabyte board - that's four Gigabyte boards I've got now in my farm Smiley They just work.

However, this Asus board suggests that the BIOS can configure which slots get however many PCIe lanes... so if the software and hardware can already split PCIe bandwidth across the available slots, why are these extension systems so damned expensive?

After all, if the solution I want costs thousands, then I can just bite my tongue and order a few of those damn 'extreme gamer' Big Bong boards which already have 7 PCIe slots on the board (and probably waste 50W or so just running the logic board, requiring more powerful PSUs as well, adding to the cost) Sad
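The hash-per-£ point can be sketched quickly with the prices quoted in this post. The 5970 hashrate here is an assumption of ~700 MH/s, since the post gives no figure for it:

```python
# Hash-per-pound comparison using the thread's UK prices:
# £80 per HD 5770 at ~220 MH/s, vs ~£600 for a dual-GPU 5970
# (assumed ~700 MH/s; the post does not state a 5970 hashrate).

def mhs_per_pound(mhs, price):
    return mhs / price

print(f"5770: {mhs_per_pound(220, 80):.2f} MH/s per pound")   # 2.75
print(f"5970: {mhs_per_pound(700, 600):.2f} MH/s per pound")  # 1.17
```

On these numbers the cheap 5770s win on hash per £ by more than 2x, at the cost of slots, space, and power overhead per card.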

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
Brian DeLoach
VIP
Full Member
*
Offline Offline

Activity: 166


View Profile
October 25, 2011, 12:52:36 AM
 #28

B) No I do not pay for elec, I have a datacenter, I pay a hosting rate

Is this to say you own a datacenter, or are you just leasing from one? Since you say you're not paying for electricity, leasing has got to be much more expensive?