|
April 27, 2013, 04:04:39 AM |
|
Switch to a 220v setup... It runs the same power at half the amps. I.e., fewer breakers required, twice the available capacity per circuit. Not to mention, 220v PSU's are about 5x cheaper!
If you set them up on a 220v circuit, you MUST label it 220v, then simply flip the voltage switch on your PSU to 220v and plug it back in. (Don't flip the switch to 220v on a 110v line, or you will fry your PSU as it "tries" to regulate with half the input voltage. Had to say it, because some noob will think, "Hey, let's just flip the switch, and it now runs at half power!" No. Well, yes, but that kills it, because the other half of the voltage isn't there. Running on an actual 220v line only makes the PSU more efficient, so it draws fewer amps and makes less heat.)
Since you are asking from a "power" concern... and logistics...
The AMP rating on a power-bar is a safety limit for the GAUGE of wire in the cord, the internal circuits (if any), and the length of wire used, based on "standard power and draws" from UL records/rules per "individual outlet". You will not find a "legal", UL/CSA-listed 20A breaker on a power-strip that is actually "insured", but it may have a nice warning about fire hazards stamped on the plug. Those (the 20A-30A breakers) are for "intermittent" use in a consumer world. The "cord" may be UL/CSA approved for 30A, but the breakers will not be, at 110v.

15A = 1650 watts and 20A = 2200 watts, at 110v. Many times the electric company reduces the line voltage to around 95v (to "make more money" from your meter, so the story goes). At a constant wattage, lower voltage means higher amps: the same 2200-watt load that draws 20A at 110v draws about 23A at 95v. Thus it "trips" 20A circuits in the summer-time, but not a 25A circuit, so it goes undetected most times.
As opposed to 2200 watts on a 220v line, which is only 10A, not 20A.
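For reference, every amperage figure above follows from the same relation, I = P / V; here's a quick sketch (a minimal illustration, using the 2200-watt example load from this post):

```python
# Sanity check of the amperage math above: for a constant-power load,
# current scales inversely with line voltage (I = P / V).

def amps(watts: float, volts: float) -> float:
    """Current drawn by a constant-power load at a given line voltage."""
    return watts / volts

load_w = 2200  # the "20A at 110v" example load

for volts in (110, 95, 220):
    print(f"{load_w}W at {volts}v -> {amps(load_w, volts):.1f}A")
```

Running it prints roughly 20.0A at 110v, 23.2A at 95v, and 10.0A at 220v, which is exactly why a voltage sag trips a 20A breaker and why the 220v line only needs half the amps.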
|