This question has been around for decades in various forms. I'll try to clear up the fog surrounding it.

When cranking an engine, especially at lower temperatures, the starter imposes so much load on the battery that the voltage drops (as low as 8 to 9 volts). This reduced potential (voltage, E) would reduce the ignition system's spark output and make cold starting difficult. In a nutshell, the factory designed the ignition system (breaker points and electronic alike) to produce full output at this reduced cranking voltage. Then, under normal running conditions, a ballast resistor drops the supply back down to that design point. This is why there are two different ignition wires (and switch terminals), "start" and "run" (or "ign 1" and "ign 2").
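
To put rough numbers on it: the ballast and the coil primary form a simple voltage divider. The sketch below uses illustrative resistances (my assumptions, not factory figures) just to show why the coil ends up near the same voltage whether you're cranking on a sagging battery with the ballast bypassed, or running with the ballast in circuit:

```python
# Minimal sketch of the ballast/coil voltage divider.
# Values are illustrative assumptions, not pulled from any factory spec.
# At DC steady state (points closed, primary current settled) the coil
# primary looks roughly like its DC resistance, so:
#   V_coil = V_supply * R_coil / (R_coil + R_ballast)

R_COIL = 1.5      # ohms, typical OEM primary resistance (assumed)
R_BALLAST = 1.3   # ohms, typical ballast value (assumed)

def coil_voltage(v_supply, r_coil=R_COIL, r_ballast=R_BALLAST):
    """Steady-state voltage across the coil primary."""
    return v_supply * r_coil / (r_coil + r_ballast)

# Running: charging system holds ~14 V, ballast in circuit.
print(coil_voltage(14.0))                  # ~7.5 V -- near the design point

# Cranking: battery sags to ~9 V, but the "start" wire bypasses the
# ballast, so the coil sees the full (sagged) battery voltage.
print(coil_voltage(9.0, r_ballast=0.0))    # 9.0 V -- still near the design point
```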

Bypassing the ballast resistor will overtax all ignition components (ECU or breaker points, and the ignition coil), increasing spark energy (output) while markedly reducing their reliability.
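
Here's a rough feel for why, using the same assumed resistances as above and looking at the worst case (points closed or ECU output on, low RPM, engine running at ~14 V). Dwell limiting at higher RPM softens this somewhat, but the trend holds:

```python
# Rough sketch of the extra current and heat when the ballast is bypassed.
# All resistances are assumptions for illustration, not factory figures.
V_RUN = 14.0      # volts, typical charging-system voltage
R_COIL = 1.5      # ohms (assumed)
R_BALLAST = 1.3   # ohms (assumed)

def primary_current(r_total, v=V_RUN):
    """Worst-case (points-closed, steady-state) primary current."""
    return v / r_total

def coil_power(i, r_coil=R_COIL):
    """Heat dissipated in the coil primary at that current."""
    return i**2 * r_coil

i_with = primary_current(R_COIL + R_BALLAST)   # ~5.0 A through points/ECU
i_without = primary_current(R_COIL)            # ~9.3 A through points/ECU

print(coil_power(i_with))      # ~37.5 W in the coil with the ballast
print(coil_power(i_without))   # ~130 W with it bypassed -- roughly 3.5x the heat
```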

Factory hi-po ECUs ("gold box", etc.) were designed for race use and used lower-resistance ballasts; their internal components and heat sinking were built to handle the extra current. Even so, they were never recommended for street use or extended idling.

Little, if any, of the above applies to aftermarket ignition systems and components. Many are designed to run on full battery potential; they have either internal regulators or DC-to-DC converters. Even with a factory OEM ECU, if your replacement coil has a higher internal resistance than the OEM 1.4 to 1.8 ohm value, you may be able to safely reduce the ballast's value.
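
One way to think about that last point (again, a sketch with assumed values, not a manufacturer procedure): keep the total primary resistance about the same as stock, and let the higher-resistance coil take over part of the ballast's job:

```python
# Hypothetical sizing of a smaller ballast for a higher-resistance
# replacement coil, by holding the total primary resistance constant.
# Every value here is an illustrative assumption.

R_COIL_OEM = 1.5        # ohms, middle of the 1.4-1.8 ohm OEM range
R_BALLAST_OEM = 1.3     # ohms, assumed original ballast
R_COIL_NEW = 2.2        # ohms, a hypothetical higher-resistance coil

r_total_target = R_COIL_OEM + R_BALLAST_OEM      # 2.8 ohms total, as stock

# The new ballast only needs to make up the difference.
r_ballast_new = max(r_total_target - R_COIL_NEW, 0.0)
print(r_ballast_new)    # ~0.6 ohm -- the coil itself now does more of the
                        # current limiting, so the ballast can shrink
```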

Do not lose sight of this fact: if there's no misfiring (at any usable RPM) in your present configuration, no ignition system swap or upgrade will increase output. For example, I witnessed a factory Chrysler engine dyno session where the subject engine produced over 150 HP/L on dead-stock wasted-spark ignition (even the wires, coil, etc. were stock). Every aftermarket system tried produced zero gain.

Hope this helps,

Rick E.