Keefe
|
|
December 22, 2013, 10:15:52 PM |
|
There's another reason to use more than 2 cables: reducing the power bill.

Let's assume the wires from the PSU to the rig are 18 gauge and 2 ft long, and the rig uses 600W. With two cables (don't use a single cable with two plugs!), we have 6 wires for each of + and -, each carrying 8.3A of current. Each of those 12 wires has a resistance of 0.0128 ohms*, so the voltage drop at 8.3A is 0.106V and the wasted power is 0.88W per wire, for a total waste of 10.6W. If you figure on running these rigs for 6 months and your power cost is $0.15/kWh (mine's twice that), you waste $6.95 just to heat the wires. The connectors probably add significant resistance and therefore waste even more power.

Now let's run the numbers for 4 cables instead of 2: 24 wires, 0.0128 ohms, 4.15A, 0.053V, 0.22W/wire, 5.3W total wasted on heating the wires. As a general rule, doubling the number of wires (or their cross-section area, e.g. 15 gauge vs 18 gauge) sharing the same load halves the power wasted. You save at least $3.47 over 6 months and reduce the risk of fire due to overloaded connectors.

Since my power costs twice as much as in the examples above, I went a little overkill and made cables for my modular PSUs with one 12 gauge wire instead of three 18 gauge wires, with male PCIe plugs on one end to fit the PSU and ring terminals on the other, for a total of eight 12 gauge wires (four +, four -) per rig.

* http://en.wikipedia.org/wiki/American_wire_gauge
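If you want to rerun these numbers for your own wire gauge, cable length, rig wattage, or power rate, here's a quick Python sketch of the same arithmetic. The values below are just the assumptions from this post (the 0.0128 ohm figure comes from the 18 AWG entry in the table linked above); swap in your own.

    # Rough estimate of power wasted heating the PSU-to-rig wires on a 12V load.
    # Assumptions from the post above: 18 AWG wire, 2 ft runs, 600W rig, 6 months at $0.15/kWh.
    OHMS_PER_1000FT_18AWG = 6.385   # per the AWG table linked above

    def wire_waste_watts(rig_watts, cables, volts=12.0, wires_per_polarity_per_cable=3,
                         length_ft=2.0, ohms_per_1000ft=OHMS_PER_1000FT_18AWG):
        wires = cables * wires_per_polarity_per_cable * 2        # count both + and - wires
        amps_per_wire = (rig_watts / volts) / (wires / 2)        # load splits across one polarity
        r_per_wire = ohms_per_1000ft * length_ft / 1000.0        # ~0.0128 ohms for 2 ft of 18 AWG
        return wires * amps_per_wire ** 2 * r_per_wire           # sum of I^2 * R over every wire

    for cables in (2, 4):
        watts = wire_waste_watts(600, cables)
        cost = watts / 1000 * (24 * 365 / 2) * 0.15              # kWh over 6 months at $0.15/kWh
        print(f"{cables} cables: {watts:.1f} W wasted, about ${cost:.2f} over 6 months")

Doubling the cable count (or the copper cross-section) halves both numbers, which is the rule of thumb above.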
|
|
|
|
Doff
|
|
December 22, 2013, 11:24:59 PM |
|
There's another reason to use more than 2 cables: reducing the power bill. [...]

So you just made me realize I need to buy two of those Extenders. Is it OK to connect two ring connectors to each terminal like that? At least that's the only way I see to connect the PCIe cables.
|
|
|
|
Keefe
|
|
December 22, 2013, 11:48:36 PM |
|
Is it OK to connect two ring connectors to each terminal like that?

I have 4 ring terminals on each screw. I'm using 4 of these for a rig:
|
|
|
|
twmz
|
|
December 23, 2013, 12:00:47 AM |
|
I have 4 ring terminals on each screw. I'm using 4 of these for a rig:

Don't you lose most of the benefit of distributing the load across many more wires if you don't use all 6 wires from each PCIe connector? Your photo looks like you're only using one + and one GND from the PCIe connector.
|
|
|
|
Keefe
|
|
December 23, 2013, 12:09:28 AM |
|
Don't you lose most of the benefit of distributing the load across many more wires if you don't use all 6 wires from each PCIe connector? Your photo looks like you're only using one + and one GND from the PCIe connector.

No, each splice has three 18 gauge wires (with extra thick insulation) from the PCIe plug on one end and one 12 gauge wire on the other end.
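For anyone checking the splice math, here's the same comparison in a couple of lines: three 18 gauge wires in parallel versus a single 12 gauge run, using the per-1000-ft resistances from the same AWG table linked earlier (table values, not measurements of the actual cables).

    # Compare three 18 AWG wires in parallel against one 12 AWG wire of the same length.
    R18_PER_1000FT = 6.385   # ohms per 1000 ft, 18 AWG
    R12_PER_1000FT = 1.588   # ohms per 1000 ft, 12 AWG

    three_18_in_parallel = R18_PER_1000FT / 3   # ~2.13 ohms per 1000 ft
    print(f"three 18 AWG in parallel: {three_18_in_parallel:.2f} ohms/1000ft")
    print(f"one 12 AWG:               {R12_PER_1000FT:.2f} ohms/1000ft")
    # The single 12 AWG run has lower resistance, so it wastes less power than the
    # three 18 AWG wires it replaces -- hence "a little overkill" in the earlier post.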
|
|
|
|
Anddos
|
|
December 23, 2013, 12:20:22 AM |
|
Why is the price so damn high for 400GH?
|
|
|
|
|
Doff
|
|
December 23, 2013, 01:11:07 AM |
|
Why is the price so damn high for 400GH?
They're just for looks; they don't actually want to sell them. (Sorry, couldn't help myself.)
|
|
|
|
Keefe
|
|
December 23, 2013, 01:56:48 AM |
|
|
|
|
|
Doff
|
|
December 23, 2013, 02:18:07 AM |
|
So one last question. It should be safe to have two PCIe cables plugged in on the front of my V3 board, and two more going to the terminals on the back of the board, all from the same PSU of course?
|
|
|
|
Keefe
|
|
December 23, 2013, 02:19:40 AM |
|
So one last question. It should be safe to have two PCIe cables plugged in on the front of my V3 board, and two more going to the terminals on the back of the board, all from the same PSU of course?
Yes
|
|
|
|
Trongersoll
|
|
December 23, 2013, 08:23:07 PM |
|
*kicks Dave's desk* Hey! Wake up! You've got orders waiting to be shipped!
|
|
|
|
allinvain
Legendary
Offline
Activity: 3080
Merit: 1080
|
|
December 24, 2013, 09:31:05 AM |
|
So still no word on any possible price reductions?
|
|
|
|
Beans
|
|
December 24, 2013, 09:36:06 AM |
|
None of my units run properly. I've tried everything, but the cards just drop offline if they show up in the first place. Then slots start going dead entirely and cards never show up on them again. I thought maybe I had a bad component. I just got two new 100GH units today, which I bought specifically to get my October unit working finally. These ones are just doing the same thing, so there goes that plan. They don't even run for 2 minutes without dropping cards. I've lost so much time and money on these things it's disgusting. I've averaged about 50GH on my October unit, and I'm talking about 15 cards, not 1.
|
|
|
|
allinvain
Legendary
Offline
Activity: 3080
Merit: 1080
|
|
December 24, 2013, 09:51:12 AM |
|
None of my units run properly. I've tried everything, but the cards just drop offline if they show up in the first place. [...]

Hmm, yeah, they can be finicky/flaky, but did you make sure the cards are not wobbling? If they're left freestanding in the slots and you have a powerful fan blowing on them, they will sway and wobble like crazy, and this can cause them to work themselves loose from their respective slots. If you don't have a case for them, I suggest securing them by some method; some have even used tape on the top of the cards. Also pay close attention to the slot arrangement: make sure you have the cards distributed as evenly as possible across the various banks, and place the worst-performing card at the end of the card chain.
|
|
|
|
Beans
|
|
December 24, 2013, 11:37:12 AM |
|
Hmm, yeah, they can be finicky/flaky, but did you make sure the cards are not wobbling? [...]

I've tried securing them a few ways; they don't seem to move. Even with just about no airflow I get the same result. When I do find a card combination that seems to work, it's not long before a card drops off, and even a reboot or reseating the cards doesn't fix the problem. I've tried hundreds of card combinations; I pretty much have to rearrange them every time I restart, which is like 20 times a day. If I don't, eventually only 1 or 2 cards will be left running, if any at all.
|
|
|
|
ShadesOfMarble
Donator
Hero Member
Offline
Activity: 543
Merit: 500
|
|
December 24, 2013, 11:50:14 AM |
|
When I do find a card combination that seems to work, it's not long before a card drops off, and even a reboot or reseating the cards doesn't fix the problem. I've tried hundreds of card combinations; I pretty much have to rearrange them every time I restart, which is like 20 times a day. If I don't, eventually only 1 or 2 cards will be left running, if any at all.

I know that behaviour very well (from a total of 7 units, so at least it's consistent...). What I did to mitigate this was to follow the steps posted by punin: https://bitcointalk.org/index.php?topic=250249.msg3455392#msg3455392

Although I waited 10-15 mins between the power off/power on in Solution B. Only waiting a short time resulted in cards still not hashing after the hard reset. Maybe there is some kind of fuse that needs to reset, which takes some time. Dunno. Nevertheless, the units are now hashing at 400+ most of the time.

Also note that reseating the cards never changed/helped anything for me. Only a hard reset with the unit turned off for at least 10 minutes got all cards back to hashing, and the config change proposed by punin (turning off auto-tuning) helped keep them hashing...
|
|
|
|
allinvain
Legendary
Offline
Activity: 3080
Merit: 1080
|
|
December 24, 2013, 01:05:10 PM |
|
Here is what you need to do. Take out 4 cards. Run your full rigs with 12 cards MAX. The mix and matched full rigs are unstable. 16 card full rigs will eventually become glitchy due to SPI bugs.
|
|
|
|
ktbken
|
|
December 24, 2013, 01:59:01 PM |
|
I have also found bfgminer to be much more stable than chainminer; it will run for days with no issues.
|
|
|
|
ShadesOfMarble
Donator
Hero Member
Offline
Activity: 543
Merit: 500
|
|
December 24, 2013, 02:43:18 PM |
|
Here is what you need to do. Take out 4 cards. Run your full rigs with 12 cards MAX. The mix and matched full rigs are unstable. 16 card full rigs will eventually become glitchy due to SPI bugs.
Losing 25% hashrate doesn't sound like a good option.
|
|
|
|
klondike_bar
Legendary
Offline
Activity: 2128
Merit: 1005
ASIC Wannabe
|
|
December 24, 2013, 04:23:15 PM |
|
Here is what you need to do. Take out 4 cards. Run your full rigs with 12 cards MAX. The mix and matched full rigs are unstable. 16 card full rigs will eventually become glitchy due to SPI bugs.
Losing 25% hashrate doesn't sound like a good option.

Agreed. Another thought: unless you have a lot of heatsinking and airflow, the cards can become unstable above the 35-37GH range (some will do 40GH but require a lot of tuning and cooling effort). It's possible your cards are heating up during operation, eventually getting into the 35-37GH reported speed, and then switching off due to heat.
|
|
|
|
|