Bitcoin Forum
Author Topic: running more than 6 GPUs  (Read 2099 times)
MadHacker
October 31, 2011, 07:27:33 PM
 #1

Anybody have any luck running more than 6 GPUs?
I have only tried in Windows 7 64-bit, but whenever I add a 7th GPU my hash rate drops down to 50%-75%.
I have more than enough PSU capacity to do the job, so I'm not sure why there is a problem.

Tested with seven 5830s (7 individual cards),
and with one 6990, one 5870 & four 5830s.

Each time I had to remove one of the 5830s from the machine to get the full hash rate.

Initial post here with hardware specs: https://bitcointalk.org/index.php?topic=39911
DeathAndTaxes
October 31, 2011, 07:49:48 PM
 #2

I did it with 4x 5970s; not sure if the fact that it was only 4 "slots" made a difference.  I found it too difficult to keep the #2 card cool enough without extenders, so now I just run 4 rigs with 3x 5970s each (6 GPUs per rig).
MadHacker
October 31, 2011, 07:56:44 PM
 #3

I have a 7-slot PCIe board (4 x16, 3 x1) and I was initially hoping I could get a card in each slot.
Now that 5970s are becoming a bit more available, I have 3 on the way...
I use extenders, so cooling for now won't be a problem.
I plan to get waterblocks for them so that I can better control their cooling.
Each time I have tried to get 7 cards, or 6 cards (1 dual GPU), it didn't want to work right.

I just don't have a clue why.
Well, when my 3 5970s come in I'll try it with my extra 5830 and hope it will work.
That may not be for a week or more... USPS isn't the fastest mailing system, especially to Canada.
DeathAndTaxes
October 31, 2011, 08:00:49 PM
 #4

I have found used 5970s on eBay to be a good deal.  If you build any new rigs, look for boards with a layout like:

Slot 1 --- 16x slot
Slot 2 --- anything
Slot 3 --- anything
Slot 4 --- 16x Slot
Slot 5 --- anything
Slot 6 --- anything
Slot 7 --- 16x Slot

Makes it easy to drop in 3x 5970s.  They get plenty of air.  No need for extenders or custom frames.  Can get 2300 MH/s easily on < 1 kW at the wall.

I had a hodgepodge of various cards, boards, and configurations.  Slowly traded them in for nothing but 5970s from eBay (and Newegg when they were $400 each).
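As a rough sanity check of those numbers: 2300 MH/s on just under 1 kW works out to a little over 2.3 MH/s per watt. A minimal Python sketch (the 2300 MH/s and 1000 W figures are the ones from the post above; 1000 W is taken as an upper bound):

Code:
# Rough efficiency check for a 3x 5970 rig (figures from the post above).
def mh_per_watt(hashrate_mhs, wall_watts):
    """Hashing efficiency in MH/s per watt, measured at the wall."""
    return hashrate_mhs / wall_watts

rig_hashrate = 2300.0   # MH/s claimed for 3x 5970
rig_wall_draw = 1000.0  # "< 1 kW at the wall", taken as an upper bound

print(f"{mh_per_watt(rig_hashrate, rig_wall_draw):.2f} MH/s per watt")
# -> 2.30 MH/s per watt; better if the rig actually draws less than 1 kW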
MadHacker
October 31, 2011, 08:07:52 PM
 #5

Quote from: DeathAndTaxes, post #4

I plan to slowly trade all my 5830s for 5970s eventually...
but with 7 slots on each board, it would be nice to have seven 5970s in each one...
Of course, I would have to run Linux... so learn Linux.
But it will be a few months until I get there...

I already have the extenders, as well as frames made out of some wood to hold the cards.
I just need to figure out why I have a 6 GPU limit... perhaps it's the board? I could understand that with 7 cards, but not 6.
Dunno.
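One way to check whether the missing seventh card is a driver/OS limit rather than a board problem is to count the GPUs the OpenCL runtime actually exposes. A minimal sketch using pyopencl, assuming a working AMD/Catalyst OpenCL installation (the script is an illustration, not something from the thread):

Code:
# Count the GPUs the OpenCL runtime actually exposes.
# Requires pyopencl and a working Catalyst / AMD APP OpenCL installation.
import pyopencl as cl

total = 0
for platform in cl.get_platforms():
    try:
        gpus = platform.get_devices(device_type=cl.device_type.GPU)
    except cl.LogicError:
        gpus = []   # this platform exposes no GPU devices
    for dev in gpus:
        print(platform.name, "-", dev.name)
    total += len(gpus)

print("GPUs visible to OpenCL:", total)
# If this prints 6 with 7 cards installed, the missing card is a
# driver/OS problem rather than a mining-software problem.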
DeathAndTaxes
October 31, 2011, 08:11:47 PM
 #6

Were the extenders powered?

The PCIe spec allows up to 75W per slot (from the motherboard).  Now, high-end cards tend not to draw more than ~25W to 30W through the slot, but low-end cards might be pulling more (since they only have a single PCIe power connector).

It might be that the motherboard simply can't deliver, say, 6 x 40W = 240W through the PCIe slots.  If the MB manufacturer assumed it would never be more than ~150W, well, that isn't going to work.

It is possible to buy powered PCIe extenders which have a Molex connector to feed power directly to the card via the slot connector (and thus lower the draw on the MB).  I have heard some people have also jury-rigged their own.

Maybe that is why I had no problem with 4x 5970.  That is 8 GPUs, but they only pull ~30W per slot, so ~120W from the MB.  Just a theory.
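To put rough numbers on that theory, the sketch below adds up the estimated slot draw for a few configurations. The per-card wattages are the estimates from the post above; the 150 W board budget is a hypothetical example threshold, not a measured spec:

Code:
# Estimate total power drawn through the motherboard's PCIe slots.
PCIE_SLOT_LIMIT = 75    # watts per slot allowed by the PCIe spec
BOARD_BUDGET = 150      # hypothetical total the board can actually deliver

def slot_draw(cards, watts_per_card):
    total = cards * watts_per_card
    ok = total <= BOARD_BUDGET
    print(f"{cards} cards x {watts_per_card} W = {total} W from the board "
          f"({'within' if ok else 'over'} a {BOARD_BUDGET} W budget)")
    return total

slot_draw(4, 30)   # 4x 5970 (8 GPUs): ~120 W -> fine
slot_draw(6, 40)   # 6 lower-end cards: ~240 W -> could exceed the budget
slot_draw(7, 40)   # 7 cards: ~280 W -> well over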
MadHacker
October 31, 2011, 08:27:38 PM
 #7

Quote from: DeathAndTaxes, post #6
Yes, my extenders were powered.
I run the 4 x16 PCIe slots unpowered, but on all the x1 slots I make them powered.
I also wrap them in metal tape to minimize EMI.
likuidxd
October 31, 2011, 09:41:04 PM
 #8

Yes! I'm back up to eight 5870s, running a custom Linux ISO a friend made because BAMT stopped recognizing the eighth GPU. BAMT does run 7 just fine, though.

https://bitcointalk.org/index.php?topic=7216.msg590121#msg590121
https://bitcointalk.org/index.php?topic=37880.msg590112#msg590112

MadHacker
October 31, 2011, 09:47:55 PM
 #9

I thought Linux supported 16 GPUs?
likuidxd
October 31, 2011, 09:56:32 PM
 #10

BAMT is custom coded; I don't know what went wrong with it, and aaron wolfe was unresponsive / didn't want to help. I think Linux supports up to 12. Windows 7 is supposed to support 8, but CCC has serious problems when going over 4 GPUs and will BSOD after a few minutes.

MadHacker
October 31, 2011, 10:36:36 PM
 #11

Quote from: likuidxd, post #10
I guess 12 will have to do.
That would be six 5970s per machine.
catfish
October 31, 2011, 11:40:39 PM
 #12

It's utterly dependent on the luck of the draw, and the logic board you have.

For a start, the designers would have needed to think of some mad hackers Wink plugging extender cables into every single PCIe slot, with every single PCIe slot pulling significant 12V current - i.e. plugging top-end GPUs into EVERY slot.

Hardly ANYONE does this apart from us lot. And most of 'us lot' don't build past 4 GPUs / logic board (I spent so much time and money trying (and finally succeeding) to get 5 GPUs running on one board, that I designed a modular 4-GPU per board system, with very cheap 4-PCIe slot boards).

Going to the big rigs is SO much more expensive - once you go above 1000W PSUs, the price goes up exponentially (sort-of - don't pick at my maths eh? Wink ), as does the price of logic boards. My old faithful Gigabyte H61M-D2-B3 with 4 PCIe slots costs around £50 these days. I've got four of the things - one entire shelf rig uses nothing but them. Three in a row, each with 4 GPUs plugged in via x1-x16 extenders. Unpowered. They all work. 5830s, 5850s, 6950s. A wooden tower of power at 2 kW.

Yes, I could get the requisite 12 GPU cores by using a 6-slot Magnum Big Bong Extreme Marshal Mathers Whatever and plugging six 5970s into the board. That'd be ONE system instead of having to admin three Linux systems. But those Slim Shady boards are bloody expensive in the UK. Then again, that sounds like needing 2.5 kW of PSUs - that'll be two 1200W units then, which are £200 each. Ahem. Then, in the UK, find me some 5970s. OK, forget that, let's have 6990s instead. The damn things are £500 *each*.

Hence the Slim Shady solution ends up costing £3k in GPUs, say £600 in board and PSUs. If just one of those GPUs locks up, you can freeze the whole rig. If just one bit of home-soldering (for ganging the two PSUs) goes wrong, or a PSU fails.... that's £3k down the drain.

Sorry Mad Hacker, but I've been called exactly that epithet in my distant youth and it was for projects *exactly* like this. It's just fucking insane. You'll need just as much room as my shelf rig to keep the systems cool because they'll give around the same hashrate and consume around the same power - dissipating the same heat. I don't see any gain from these fancy big single rigs, other than ease of control and the ability to perhaps play games (but don't game players use that system called 'windows' or whatever and the drivers limit you to 8 GPUs?).

I've had too many of my GPUs and one PSU blow up. I can just about handle the disappointment because the PSUs all cost £100 or less, and the GPUs maxed out at £180. If I'd invested in a trio of 6990s and the £250 PSU blew up, taking the whole lot down, I'd be pretty pissed off.

Look, chaps (and ladies, if present) - bitcoin mining is NOT normal use of ATI GPUs. Certainly the reference designs may be able to cope non-stop at non-overclocked rates... but the OEM boards sure as HELL aren't designed to be running tight OpenCL kernels 24/7. And put your hand up if you're overclocking the core, underclocking the memory and messing with the card voltage... we're a right old extreme case of nutters and the hardware IS NOT DESIGNED FOR WHAT WE DO.

Hence we're all going to break things eventually. I'd rather break a few slower inexpensive cards / PSUs than blow up a single God Box that cost more than my old PowerMac Quad G5 (fuck me, I can still remember the invoice - modern Macs are cheap as chips in comparison)... blimey.

Sorry for the rant, too much brew and no aircon, I'm roasting like a pig on a spit surrounded by testing multi-GPU mining boards..... Smiley

...so I give in to the rhythm, the click click clack
I'm too wasted to fight back...


BTC: 1A7HvdGGDie3P5nDpiskG8JxXT33Yu6Gct
worldinacoin
October 31, 2011, 11:42:20 PM
 #13

That is Mad! Smiley

Quote from: MadHacker, post #1

MadHacker
November 01, 2011, 12:17:38 AM
 #14

Myself, I'm working on power efficiency:
the more GPUs I can get on a single motherboard, the less power I consume per GPU.
As for cooling...
I plan to watercool,
buying waterblocks slowly over time.
I'm not in it to make money...
I'm in it to have fun.
If I make some money in the process, great...
However, I've now spent all 500 of the bitcoins I have mined... on more video cards Smiley

Quote from: catfish, post #12
Quote from: worldinacoin, post #13 ("That is Mad!")

why?
DeathAndTaxes
November 01, 2011, 12:25:37 AM
 #15

Quote
I've had too many of my GPUs and one PSU blow up. I can just about handle the disappointment because the PSUs all cost £100 or less, and the GPUs maxed out at £180. If I'd invested in a trio of 6990s and the £250 PSU blew up, taking the whole lot down, I'd be pretty pissed off.

Why do you think a good power supply comes with a 5-year warranty?

Quote
Look, chaps (and ladies, if present) - bitcoin mining is NOT normal use of ATI GPUs. Certainly the reference designs may be able to cope non-stop at non-overclocked rates... but the OEM boards sure as HELL aren't designed to be running tight OpenCL kernels 24/7. And put your hand up if you're overclocking the core, underclocking the memory and messing with the card voltage... we're a right old extreme case of nutters and the hardware IS NOT DESIGNED FOR WHAT WE DO.

I have five 3x 5970 rigs running, now at 7 months of 24/7 operation.  The reference design cards are built much better than the modified (usually to cut cost) models.  The high-end cards have beefier heatsinks, stronger VRMs, and better fans.

Quote
Hence we're all going to break things eventually. I'd rather break a few slower inexpensive cards / PSUs than blow up a single God Box that cost more than my old PowerMac Quad G5 (fuck me, I can still remember the invoice - modern Macs are cheap as chips in comparison)... blimey.

The 5970 is more efficient and lower cost per MH in the US than a bunch of low-end junk.  I can get >2.2 MH/W and >730 MH/s for $360 used.  Higher-end components for less money and with higher efficiency, combined with easier installs and higher densities.
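Those two figures reduce to a cost per MH/s and an implied per-card power draw; a quick worked example using only the used-5970 numbers quoted above:

Code:
# Cost and power efficiency for a used 5970 (figures from the post above).
price_usd = 360.0          # used price quoted
hashrate_mhs = 730.0       # ">730 MH/s" per card
efficiency_mh_per_w = 2.2  # ">2.2 MH/W"

cost_per_mh = price_usd / hashrate_mhs
watts_per_card = hashrate_mhs / efficiency_mh_per_w

print(f"${cost_per_mh:.2f} per MH/s")        # ~ $0.49 per MH/s
print(f"~{watts_per_card:.0f} W per card")   # ~ 332 W implied draw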

jaybones
November 13, 2011, 02:42:19 AM
 #16

DeathAndTaxes,

What motherboard and PSU do you use for your 3x 5970 rig setup?


jay
DeathAndTaxes
November 13, 2011, 05:05:46 AM
 #17

Quote from: jaybones, post #16

MB
http://www.newegg.com/Product/Product.aspx?Item=N82E16813130274

PSU
http://www.newegg.com/Product/Product.aspx?Item=N82E16817171055

Any MB with a PCIe x16 in slots 1, 4, & 7 will work. 

I got that PSU because it was on sale.  Cooler Master isn't the best brand, but I later bought the same components to keep all 5 rigs identical.  Haven't had any significant problems in 9 months now. 

I also have a watercooled workstation with 3x 5970 using the same MB, but this PSU:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817194092

I like this unit.  It is solid and well built, with heavy cables (lower resistive losses at high current).  It is probably overkill for a mining rig, though.

If you are running only 3x 5970 with a Sempron CPU (and are willing to go into the BIOS and turn off what you don't need), you can probably get by with a 1000W unit.  At the time I wasn't sure if I was going to expand to 4x 5970, so that is why I got the 1200W models.  The rigs (not the workstation) pull ~870W at the wall.  If we back out 12% for PSU inefficiency, that is only ~765W DC.  A solid 80-Plus Gold 1000W power supply should be fine.  If I had to do it again, I likely would have gotten a nicer-brand 1000W PSU (Seasonic or Enermax), as I never did expand to 4 cards.
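The sizing arithmetic in that last paragraph, written out (the 870 W wall draw and ~12% loss are the figures from the post; the 80% utilisation guideline is an added assumption for illustration):

Code:
# Back out the DC load from the measured wall draw, then check PSU headroom.
wall_watts = 870.0   # measured at the wall for a 3x 5970 rig
psu_loss = 0.12      # ~12% PSU inefficiency assumed in the post

dc_load = wall_watts * (1 - psu_loss)   # ~765.6 W delivered to the components
print(f"DC load: {dc_load:.0f} W")

# Hypothetical headroom check: keep the PSU below ~80% of its rating.
for psu_rating in (1000, 1200):
    utilisation = dc_load / psu_rating
    print(f"{psu_rating} W PSU runs at {utilisation:.0%} of rating")
# A 1000 W unit sits around 77%, which matches the "should be fine" conclusion.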
jjiimm_64
November 13, 2011, 05:32:48 AM
 #18


I love the 5970s.

I have 4 rigs with 4 each.  Each rig gets 2900 MH/s while pulling less than 1200 watts at the wall, on an MSI GD70 and two Seasonic 750W PSUs.

6 weeks and counting. (I got all the $400 Newegg models I could Smiley)

https://plus.google.com/u/0/photos/112408294399222065988/albums/5658727447810944545


1jimbitm6hAKTjKX4qurCNQubbnk2YsFw
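Those numbers imply roughly the same efficiency as the 3-card rigs discussed earlier; a quick check using the per-rig figures quoted above:

Code:
# Per-rig and whole-farm figures for 4 rigs of 4x 5970 (numbers from the post above).
rigs = 4
rig_hashrate_mhs = 2900.0
rig_wall_watts = 1200.0   # "less than 1200 watts", taken as an upper bound

print(f"Per rig: {rig_hashrate_mhs / rig_wall_watts:.2f} MH/s per watt")
print(f"Farm:    {rigs * rig_hashrate_mhs:.0f} MH/s on ~{rigs * rig_wall_watts:.0f} W")
# -> about 2.42 MH/s per watt and 11600 MH/s total, in line with the
#    >2.2 MH/W figure quoted earlier in the thread.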
catfish
November 13, 2011, 05:38:38 AM
 #19

Quote from: DeathAndTaxes, post #15
D&T... not being a cock or anything, but you *must* have noticed the different currency symbol I have a habit of using...

Your economics are simply impossible in the UK. If you convert my numbers straight into USD (I'm assuming your $ sign is an American one, and that you live in the USA, rather than the dollars being Canadian, Australian or what-have-you) then it looks like I'm being silly. But if you turn it the other way round, and convert your numbers into GBP... it's pie-in-the-sky stuff.

Both hardware and electricity are significantly more expensive over here, and on the hardware front it's not just a matter of cost, but a matter of availability. New 5850s are available from VERY few shops. I bought a load of XFX cards and got a 50% failure rate. These cards will cost me a fortune just sending them back (if the vendor will accept a return - and the best I'll get is another junk XFX card in return, postage paid for by me twice) - I will never buy XFX again.

Reference 5870s? Well the only ones I've seen available new at retail are selling for around £350. That's $562. Non-reference cards vary in price but are usually over £200. The cheap XFX cards I bought (still over £150 each) were 50% DOA.

Don't get me wrong - I agree with you, and standardising on a known-good high-quality set of components is the most sensible way to go about the process. I started with a massive hodgepodge of components - I've now at least standardised on logic boards, CPUs, RAM, PSUs and all the ancillaries and software setup.

However, equipping my rigs with *reference* 5870s is a hopeless task in the UK. If I were to acquire some from the same places you get yours, and have them shipped from the USA to the UK, then I get hit with import duty, then 20% VAT on top of all that. Not financially viable.


As a result I've had to learn the hard way - by trying anything I could get a feasible supply of, and then finding out myself whether the OEM made a high quality card or made junk. It's not immediately obvious from looking at pictures of the board (though it can help). There are UK sites that sell GPUs with a photo of a different card - small print on their 'terms and conditions' page states that the photos may not represent the true product (which is shabby in its own right) - but I blew up the thumbnail photos on my Mac and some of the photos were clearly photoshopped (using the photo of a 5850 with the better heatpipes / fansink, for example, and editing the '50' to say '30' - so I thought I was getting a 5830 with the 'old' style full-length fansink / heatpipe setup, but received some crap spiral heatpipe thing which bends in the middle).

I'm trying to standardise on GPUs, like you. But I can't find 5870s at a price that makes *any* sense for mining, unless I'm looking in the wrong places. My best cards are the Sapphire *original* 5850 - the long card with 5 heatpipes, and the Asus 6950 DirectCU II. Both are 1GB versions and a LOT cheaper than 5870s. The Asus 6950 has a trick BIOS and a great heatsink and is pretty much guaranteed to run 420 MH/sec once hacked (it's a 6970 really) - and they're available in the UK in quantity. The Sapphire *original* 5850 (not the 'Extreme') is fantastic but bloody hard to find. I ran into 2 last week and bought them immediately. They're running like champs right now.

It's a numbers game and there's some merit in the cheaper cards too. Whilst I'd like to have entire rigs filled with my two favourite cards as above, the 6950 costs £200 and the 5850 (if you can find one) is around £150 minimum. That all made sense earlier this year, but with the current BTC exchange rate... not so much now. To add to the confusion, even though I hate XFX now and their appalling quality, they also make a single-slot 5770 (and 6770, assuming it's the same - it looks identical). This has a rear exhaust to confuse matters, but they're available in quantity for £80 each and are reliable, 200 MH/sec workers. You can squeeze a lot of these into a PC case.

But when you're building open frame custom rigs, like my mad shelf unit, there's room and cooling for fat cards. So I'm slowly trying to get away from 5830s, anything from XFX, and consolidate with Sapphire 5850s and Asus 6950s.

Apart from your 'reference cards only' preference, do you have any other opinions / advice for English chaps like myself who can't find reference cards at anything other than ludicrous prices, and prefer 5850s over 5870s due to sheer hash per pound (erm, sorry, very very bad UK / USA joke reference there - I mean hashpower per GBP)?


PS. the only failures I've had since taking mining seriously have been XFX graphics cards. Everything else works, and has been going 24/7 since July...


ETA - DOH - you actually said 'in the US' - sorry, ignore this post unless you can help with the last question....
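The hash-per-pound comparison above is easy to tabulate for the two cards whose price and hashrate are both quoted; a small sketch (the prices and hashrates are the ones given in the post, nothing else assumed):

Code:
# MH/s per GBP for the two cards with both price and hashrate quoted above.
cards = {
    "Asus 6950 DirectCU II (unlocked)": (420.0, 200.0),  # MH/s, price in GBP
    "XFX single-slot 5770":             (200.0,  80.0),
}

for name, (mhs, gbp) in cards.items():
    print(f"{name}: {mhs / gbp:.2f} MH/s per GBP")
# 6950 -> 2.10 MH/s per GBP, 5770 -> 2.50 MH/s per GBP: the cheap card wins
# on hash-per-pound, but needs more slots, boards and PSUs per MH/s.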

likuidxd
November 13, 2011, 06:30:15 AM
 #20

Quote from: catfish, post #19
Apart from your 'reference cards only' preference, do you have any other opinions / advice for English chaps like myself who can't find reference cards at anything other than ludicrous prices, and prefer 5850s over 5870s due to sheer hash per pound (erm, sorry, very very bad UK / USA joke reference there - I mean hashpower per GBP)?

You could look at my post...
https://bitcointalk.org/index.php?topic=51621.0

2.6 MH/s per dollar, convert to GBP or EUR, for each card. They run cool in my open rigs; with the fans running at 55%, I top out at 63°C.
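Converting that figure into GBP or EUR is a one-line multiplication; a sketch assuming a rough late-2011 rate of 1.6 USD per GBP (the exchange rate is an assumption, only the 2.6 MH/s per dollar figure is from the post):

Code:
# Convert a cost-efficiency figure quoted in MH/s per USD into MH/s per GBP.
mh_per_usd = 2.6
usd_per_gbp = 1.6   # assumed late-2011 exchange rate, not from the post

mh_per_gbp = mh_per_usd * usd_per_gbp
print(f"{mh_per_gbp:.1f} MH/s per GBP")   # ~4.2 MH/s per GBP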
