Electrical – Is Driving a Motor With PWM Inherently Less Efficient than Using a Lower Voltage

efficiency, mosfet, motor, pwm

This is a hypothetical question, but it's been bugging me for a while.

Let's say I hook up a DC motor with a propeller on it to a 10 V battery and let it run.

Then let's say I hook up a second, identical DC motor and prop to a 100 V battery, but through a PWM controller, and let's say I tune the PWM duty cycle so that the RPMs of the two motors are identical.
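As a rough sanity check on that tuning (a sketch only, assuming the motor's RPM simply tracks the average applied voltage; the real duty cycle would also depend on load torque and winding resistance):

```python
# First-order estimate of the duty cycle needed so the PWM'd motor sees the
# same average voltage as the 10 V motor. Assumes RPM tracks average voltage,
# which ignores load torque and winding resistance.
V_low = 10.0        # V, battery in the plain setup
V_high = 100.0      # V, battery behind the PWM controller

duty = V_low / V_high                # on-time fraction of each PWM period
print(f"Duty cycle ~ {duty:.0%}")    # -> 10%, i.e. ~10 V average applied
```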

The power output of the props should now be identical as well, since it depends only on the RPM.

Let's also say that, hypothetically, the MOSFETs in the PWM controller have zero on-resistance and switch instantly.

Will both systems be identical in efficiency? Or are there some inherent losses due to the PWM throttling?

Best Answer

Let's also say that, hypothetically, the MOSFETs in the PWM controller have zero on-resistance and switch instantly.

Will both systems be identical in efficiency? Or are there some inherent losses due to the PWM throttling?

You haven't stated the PWM frequency, but assuming it's around 100 kHz, the result is very close to the same as if you hadn't used PWM at all.
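To put a rough number on "very close", here is a back-of-the-envelope ripple estimate; the winding inductance and back-EMF values below are assumptions for illustration, not figures from the question:

```python
# Rough peak-to-peak ripple-current estimate for a PWM-driven DC motor.
# Assumed values (not from the question): 100 V supply, 10% duty cycle,
# 1 mH winding inductance, ~10 V back-EMF at the matched RPM.
V_supply = 100.0    # V
duty = 0.10         # tuned so the average applied voltage is ~10 V
L = 1e-3            # H, assumed winding inductance
V_bemf = 10.0       # V, assumed back-EMF at the operating RPM

for f_pwm in (2e3, 20e3, 100e3):
    t_on = duty / f_pwm
    # During the on-time the winding inductance sees (V_supply - V_bemf)
    delta_i = (V_supply - V_bemf) * t_on / L
    print(f"f = {f_pwm/1e3:>6.0f} kHz -> ripple ~ {delta_i:.2f} A peak-to-peak")

# The ripple shrinks roughly as 1/f, so at 100 kHz the motor current is
# nearly DC and the extra I^2*R (copper) loss from ripple is negligible.
```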

If it's in the MHz-to-GHz region, then you can expect some of the energy to be radiated away, because some of your wires will act like antennas. I'd call this a "loss".

If it's in the sub-20 kHz range, then you can expect to hear the PWM whine, which might drive you mad. I'd call this a "loss".

If it's in the 30 kHz range, you won't go mad, but your dogs might (it's like giving them tinnitus). I'd call this a "loss".


When your transistors are off, the motor (if it is still rotating) will act like a generator, pushing current back into your system. This means you will need clamping (freewheeling) diodes to guarantee that the terminals don't reach unsafe voltages. So that's some loss, depending on how you handle that excess current.
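For a feel of how large that loss can be if you simply let the current freewheel through a diode (all values assumed for illustration):

```python
# Back-of-the-envelope freewheeling-diode dissipation, with assumed numbers
# (not from the answer): 5 A average motor current, 0.7 V diode drop,
# 10% duty cycle.
I_motor = 5.0     # A, assumed average motor current
V_f = 0.7         # V, assumed diode forward drop
duty = 0.10       # on-time fraction of the PWM period

# While the transistor is off (1 - duty of the time), the motor current
# flows through the diode and dissipates V_f * I there.
P_diode = V_f * I_motor * (1.0 - duty)
print(f"Freewheeling diode dissipation ~ {P_diode:.1f} W")   # ~3.2 W
```

In practice, controllers often use a second switch as a synchronous rectifier instead of a plain diode, which recovers most of that conduction loss, so how you "handle that excess current" really does decide how much is lost.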