Originally Posted by CMcAllister
Length? An electric guy once told me the length is more important than the gage in determining amp capacity. I believe him.


I'd find a new electric guy. Both are important. The wire gauge determines resistance per unit length. Obviously you then multiply by length to get total resistance! Then you figure out voltage drop by multiplying the resistance by the current. (Ohm's law).
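If you want to plug your own numbers in, here's a minimal back-of-the-envelope sketch in plain Python. The 0.25 milliohm/ft figure for #4 copper comes from the numbers below; the 6 ft length and 100 A load are just assumed for illustration:

    # Voltage drop across a wire run, straight Ohm's law (V = I * R).
    ohms_per_foot = 0.00025    # 0.25 milliohm per foot, roughly #4 copper at 77F
    length_ft = 6.0            # assumed cable length
    current_amps = 100.0       # assumed load current
    resistance = ohms_per_foot * length_ft     # total resistance, ohms
    voltage_drop = current_amps * resistance   # Ohm's law
    print(f"{voltage_drop:.3f} V dropped")     # -> 0.150 V dropped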

A very short piece of small gauge wire is known as a "fuse", and it works by melting.

In response to Cab's question: at 77F, 4 ga. wire has a resistance of 0.25 milliohms per foot. (At 122F it's about 0.28).
So... how much voltage drop are you willing to tolerate, how long is the wire, and how much current is passing through it?

Incidentally, a "12 volt" system with an alternator runs around 13.8 volts, which also varies with temperature (higher voltage in colder weather). If you're not running an alternator, then your battery voltage will be dropping with time as well.

Edited to add: say your charging system is at 14.5 volts, take a SWAG (Scientific Wild-Ass Guess) that you're OK with a 0.5 volt drop, and the cable is six feet long. That's 0.25 * 6 = 1.5 milliohms.
1.5 milliohms will drop 0.5 volt if 333 amps are flowing (0.5 / 0.0015 = 333).
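Same arithmetic turned around, if anyone wants to check it or swap in their own numbers (just a quick Python sketch of the calculation above):

    # Max current for a given voltage-drop budget: I = V / R
    allowed_drop = 0.5               # volts you're willing to lose
    resistance = 0.00025 * 6         # 0.25 milliohm/ft * 6 ft = 1.5 milliohm
    max_amps = allowed_drop / resistance
    print(f"{max_amps:.0f} amps")    # -> 333 amps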
Unless you're using it for starter cable, you most likely don't even need #4.

Last edited by DrCharles; 01/31/20 11:56 PM.