Oh yes, I drew over 3600 watts, which would have spiked at 7200 when it all kicked in. That never did anything to the electrics
from 1 plug socket. UK 230-240V, mind you, but still. 1500W is peanuts. Plug it in, connect the black and green cables together (pins 3-4, going along the top row) and the power stays on
The bit that plugs into the motherboard, the 24-pin one. That just sends the 'on' signal.
MASSIVE SPIDER just ran at me haha. Jeepers
I just lost my mobo with 4 PCIe x1 sockets and 1 x16 on it, and they have no more
PCI to PCI-E adapters are expensive
Well, £20 haha. But still, just got my Akasa 1200 up. Was going to test the 5 cards, but my other mobo only has 2 PCIe x1 and 1 PCIe x16 - 3 slots.
Ordered 1 PCI converter but need another. This is mint if true though, as I will get 5 cards per system - with a 1200/1250W PSU there ain't a problem. REALLY wanted to see 5 cards running 400+ Mhash/s each too
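Back-of-envelope on whether 5 cards really fit on the Akasa 1200 - a rough sketch only, since every wattage figure here is a guess rather than a measurement:

    # Rough per-rig power budget - all figures are guesses, not measurements.
    cards = 5
    card_watts_low, card_watts_high = 150, 250   # unknown real draw per 5850 at full mining load
    base_system_watts = 150                      # mobo, CPU, RAM, drives - rough guess
    psu_watts = 1200                             # the Akasa 1200

    print("Best case : %d W of %d W" % (cards * card_watts_low + base_system_watts, psu_watts))   # 900 W - fine
    print("Worst case: %d W of %d W" % (cards * card_watts_high + base_system_watts, psu_watts))  # 1400 W - over budget

So it hinges entirely on what a 5850 actually pulls flat out.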
My 'RIG' now comprises 10 cards when complete, not 8
4 Ghash/s a system, costing about 1700-1800 GBP. I don't think that's too bad. It earns £20 a day, £140 a week - a card paid for
That's conservative, and I reckon the value's going to go up. It rose for the first weekend ever this weekend, I believe
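Quick payback sums on those figures - just the arithmetic from the post above, so treat the £20/day as the moving part (difficulty and exchange rate will shift it):

    # Payback arithmetic for one rig, using the figures quoted above.
    rig_cost_gbp = 1750          # midpoint of the 1700-1800 GBP build cost
    earnings_per_day_gbp = 20    # quoted (conservative) daily earnings
    card_cost_gbp = 175          # what the retail Sapphire 5850 cost

    weekly = earnings_per_day_gbp * 7
    print("Weekly earnings: %d GBP" % weekly)                                   # 140 - roughly a card a week
    print("Weeks to pay off one card: %.1f" % (card_cost_gbp / float(weekly)))  # 1.2
    print("Days to pay off the whole rig: %.0f" % (rig_cost_gbp / float(earnings_per_day_gbp)))  # ~88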
What I mean by the Quad G5 PSU being big is that Apple had to sell them with special cables for EU regulations - standard kettle leads aren't rated for the power the thing draws. Presumably the *input* power is huge. I'll dig the specs out for you - I remember it was extreme because Apple built that box for serious pros - it has 4 PCIe slots and could drive 8 monitors at the time (and cost £4,500 as the first 64-bit desktop workstation). Pros buy Apple so they don't have to tinker or upgrade PSUs, etc. - if you make your living with Photoshop, and buy a couple more screens, you go to Apple, get more GPUs, open the case, bolt them in, job done. No worrying about PSU capacity because Apple sized the PSU for the maximum. That's four ridiculously hot G5 cores, a full water cooling system, and up to 4 graphics-cards-of-the-day.
Anyway this is all horseshit because the Apple G5 PSU doesn't have ATX connectors for a PC logic board. I'm sure there's a way to hack it, but that CoolerMaster 600W unit cost £88 and I can live with that.
Back to those 5850 Extreme cards. I simply do NOT believe they are as good as the original-design Sapphire 5850. My first Bitcoin GPU was a retail boxed Sapphire 5850 from PC World, I paid £175. It was a big long card with 5 visible 1-cm-diameter copper heatpipes snaking along the top of the card. It's a good 5 cm longer than the Extreme cards.
The 'old' Sapphire card had dual DVI plugs - which *should* reduce airflow, since the Extremes (which only have one DVI plug) use all of that bracket area for exhaust air. However, the Extreme cards don't have all those visible heatpipes. I'm not saying that heatpipes are the be-all and end-all, but cost of materials must make the 'old' Sapphire 5850 a much more expensive card to produce.
I reckon the 'Extreme' edition cards aren't 'Extreme' in performance terms - simply 'extremely cheap' - Sapphire have done a cost-cutting job on the cards. The two completely different PCB designs that I own suggest that it's been worth Sapphire's R&D time to further cut costs and crank out volume (surely Bitcoin miners aren't a big enough market to justify this?).
Basically - the 'old' 5-heatpipe Sapphire 5850 overclocks like a champ. I've had it higher, but I'm happy with the fact that on *standard voltage* and on a test board with NO additional cooling, it cranks out 402 Mh/s at 64.5 deg C - with the fan at a comfortable 70%. If I bang 100% fan on, then temps drop to 60.5 to 61.0 deg C, but it roars like a giant dog eating catfish. The settings are simply 999 core clock and 275 memory clock. No other messing around, on a completely standard Ubuntu installation. No BIOS tweaking, nothing. The card is BRILLIANT.
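In case anyone wants to try the same clocks: roughly what the aticonfig incantation looks like on a stock Ubuntu/Catalyst install - a sketch only, since adapter numbers, the pplib fan command and the exact --odgt output format vary between driver versions, so check against your own setup:

    #!/usr/bin/env python
    # Sketch: apply 999/275 clocks and 70% fan via aticonfig, then read the temp back.
    # Flags and output format vary by Catalyst version - verify on your own system.
    import re
    import subprocess

    def run(cmd):
        return subprocess.check_output(cmd, shell=True).decode()

    run("aticonfig --od-enable")                       # unlock OverDrive
    run("aticonfig --od-setclocks=999,275")            # core MHz, memory MHz
    run('aticonfig --pplib-cmd "set fanspeed 0 70"')   # fan 0 to 70%

    temp_out = run("aticonfig --odgt")                 # reports something like "Temperature - 64.50 C"
    match = re.search(r"Temperature\s*-\s*([\d.]+)", temp_out)
    if match:
        print("GPU temp: %s C" % match.group(1))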
I was hoping that the 'Extreme' cards would be better. However they're *clearly* much lower build quality. The cooling setup is clearly inferior, obviously cheaper as there's a lot less copper and only two visible heatpipes. The 'Extreme' cards are much lighter (when I rebuild the systems I'll weigh them for real numbers).
Also, they don't overclock anywhere *near* as easily. Using an *identical* installation process to the board with the 'old' Sapphire 5850 - the same Linux, same scripts - everything - one card wouldn't get past 925 core clock (around 380 Mh/s) and the other wouldn't get past 950 core clock (around 389 Mh/s). Not too shabby, but still maxed out and with a huge desk fan blowing on them. Temps for both cards are well under 70 deg C, so this isn't a heat issue. The GPUs crash at higher core clocks, even if the entire core is only 61 deg C...
Which is which? Well, the slower card is Card A above - horizontal power leads, P/N 299-2E174-000SA. The faster card (950 MHz core) is Card B above - vertical power leads, P/N 299-1E174-230SA.
Due to PSU issues, I'm wary of messing with GPU voltage, since power consumption rises roughly with the square of the core voltage (on top of whatever the extra clock adds).
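To put rough numbers on that - an illustration only, assuming dynamic power scales with voltage squared times clock; the voltages below are examples, not readings from these cards:

    # Dynamic power scales roughly with (voltage^2 * clock).
    # Example figures only - not measured from these cards.
    v_stock, v_bumped = 1.088, 1.20     # hypothetical core voltages
    f_stock, f_bumped = 999, 1050       # core clocks in MHz

    ratio = (v_bumped / v_stock) ** 2 * (f_bumped / f_stock)
    print("Power multiplier: %.2fx" % ratio)   # ~1.28x - roughly 28% more power for ~5% more clock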
Anyone know how to measure *directly* the power consumption of a specific *entire* Radeon card whilst it's running? I've got a multimeter etc. If these cards are only drawing 150W flat out, then my 600W supply should give plenty of headroom. If they're running 250W each, then I haven't got the option of bolting in my 5770 as well.
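Rough sums on that headroom question - every per-card figure here is a guess, which is exactly why I want a proper measurement (a card pulls 12V through the PCIe slot plus its two 6-pin connectors, so clamp/shunt readings on those lines are what's needed):

    # Sketch of the headroom arithmetic - per-card wattages are guesses, not measurements.
    psu_watts = 600
    base_system_watts = 150        # mobo, CPU, RAM, drives (assumed)
    hd5770_watts = 110             # rough figure for the 5770, if it goes in

    for card_watts in (150, 250):
        total = 2 * card_watts + base_system_watts
        spare = psu_watts - total
        fits = "fits" if spare >= hd5770_watts else "does not fit"
        print("%d W per 5850: %d W used, %d W spare - the 5770 %s" % (card_watts, total, spare, fits))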