Andrew is mostly correct.

It's true that power = voltage * amps (P = V * I). So, for the same delivered power (in watts), if the voltage goes up, the amperage goes down by the same ratio: double the voltage, and the amperage is halved.
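To put numbers on that, here's a quick Python sketch of the constant-power case (the 14.4w figure is just an illustrative design power, matching the example below):

# Constant-power case: same watts delivered, different supply voltages.
power_w = 14.4  # illustrative design power in watts

for volts in (12, 24):
    amps = power_w / volts  # I = P / V
    print(f"{volts}v -> {amps:.2f}A for {power_w}w delivered")

# Output:
# 12v -> 1.20A for 14.4w delivered
# 24v -> 0.60A for 14.4w delivered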

Assuming a static 10 ohm load (Ohm's law: I = V / R):
At 12v, this will draw 1.2A, for a power delivered of 14.4w.
At 24v, this will draw 2.4A, for a power delivered of 57.6w (!)

If we change the load to 40 ohms at 24v, the current drops to 0.6A and the power delivered comes back to 14.4w, just like the 10ohm@12v circuit: twice the voltage, half the current, the same delivered power, but 4 times the resistance. The benefit here is that the wires can be much smaller, since they're carrying less current than at 12v.
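A quick Python check of those three cases, using only Ohm's law and P = V * I:

# Ohm's-law check of the resistive-load examples above.
cases = [(12, 10), (24, 10), (24, 40)]  # (volts, ohms)

for volts, ohms in cases:
    amps = volts / ohms   # I = V / R
    watts = volts * amps  # P = V * I (equivalently V^2 / R)
    print(f"{ohms} ohm at {volts}v: {amps:.1f}A, {watts:.1f}w")

# Output:
# 10 ohm at 12v: 1.2A, 14.4w
# 10 ohm at 24v: 2.4A, 57.6w
# 40 ohm at 24v: 0.6A, 14.4w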

A starter designed for 12v will pull twice its rated current at 24v (and, with roughly the same winding resistance, about four times its rated power), and will eventually burn up.
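Treating the starter winding as a roughly fixed resistance (a simplification of a real motor; the rated current below is made up just for illustration), the overload works out to twice the current and four times the power, which follows directly from P = V^2 / R when R stays fixed:

# Rough overload estimate for a 12v starter fed 24v, treating the
# winding as a fixed resistance (a simplification of a real motor).
rated_volts = 12.0
rated_amps = 150.0  # made-up rated draw, just for illustration
resistance = rated_volts / rated_amps  # R = V / I

over_volts = 24.0
over_amps = over_volts / resistance  # twice the rated current
over_watts = over_volts * over_amps  # four times the rated power

print(f"At {over_volts:.0f}v: {over_amps:.0f}A vs {rated_amps:.0f}A rated")
print(f"At {over_volts:.0f}v: {over_watts:.0f}w vs {rated_volts * rated_amps:.0f}w rated")

# Output:
# At 24v: 300A vs 150A rated
# At 24v: 7200w vs 1800w rated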

Back to the fan: assuming it is running at one speed and drawing constant power, then yes, as the voltage goes up, the amperage should go down.

But your test was in the kitchen, not pulling air through a radiator, right? If the fan has a target rotation speed for each speed setting (I'm not sure it does), it would need more power to hold that speed once it's pulling through the radiator/grille in the vehicle.

There's also the case where the fan is trying to spin faster than the supplied voltage and current allow; once the voltage increases, it can finally speed up and draw more current. In that case, more voltage may indeed lead to more current.

Edit: yep, for the same wattage, the current halves when the voltage doubles.

