Electronic – Series inductor to limit motor current

current-limiting, dc motor, power supply

Short question: Would it be possible to use a series inductor to limit the inrush current when switching on (or briefly loading) a DC motor?

Longer question: I have a battery-powered motorized toy which, for convenience, I'd like to run from a wall socket instead (to avoid having to recharge and replace batteries). It currently uses four 1.5 V batteries, and in steady state it draws about 0.8 A at about 6 V. The toy's motor usually runs continuously under very low load, and occasionally it briefly gets some load. I don't know exactly what kind of motor is used, but assume it's cheap and simple.

I bought a (cheap) 6 V power supply which is allegedly capable of delivering 1.2 A, but its current protection kicks in when the motor is switched on. As I understand it (please forgive my very basic knowledge here!), the motor presents an extremely low resistance when starting up, and so tries to draw a lot more than its usual 0.8 A. When running from batteries, I guess they have a maximum current they're able to provide, so everything gets limited automatically. When the motor is up to speed, its "effective" resistance rises due to its motion, and it reaches its steady state of drawing 0.8 A. I expect that when the motor experiences some mechanical load, the drawn current also temporarily rises, but again the batteries can only provide what they can provide.
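The intuition above is essentially right, and can be sketched with a simple brushed-motor model: current is (supply voltage minus back-EMF) over winding resistance. The winding resistance and back-EMF value below are invented round numbers chosen to reproduce the 0.8 A measurement, not properties of the actual toy:

```python
# Illustrative brushed DC motor model: I = (V_supply - back_EMF) / R_winding.
# R_WINDING and the back-EMF figures are assumed round numbers, not measurements.
V_SUPPLY = 6.0       # volts
R_WINDING = 1.0      # ohms (assumed stall/winding resistance)

def motor_current(back_emf):
    """Current drawn for a given back-EMF (which is proportional to speed)."""
    return (V_SUPPLY - back_emf) / R_WINDING

stall = motor_current(0.0)      # rotor stopped: no back-EMF, maximum current
running = motor_current(5.2)    # up to speed: back-EMF close to supply voltage

print(f"stall current:   {stall:.1f} A")    # far above a 1.2 A supply limit
print(f"running current: {running:.1f} A")  # matches the measured 0.8 A
```

At stall the model draws 6 A, which is why a supply limited to 1.2 A trips, while the batteries simply sag and ride it out.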

When it's connected to the mains power supply, the motor tries to draw much more than 0.8 A (I'm not sure exactly how much more), trips the supply's protection, and the supply cuts out. The motor slows down, the supply recovers and switches back on, but then the motor draws too much again and it cuts out once more.

I've read about "motor inrush current" and "current limiting", but for this simple toy I'm hoping I won't need to build a custom driver circuit with transistors or MOSFETs, and I can't believe a thermistor-based solution is ideal (partly because of the wasted heat, partly because my intermittent load is not temperature-related). From what (very) little I know about inductors, it seems that one of those in series would "fight against" the sudden change in current and develop a voltage across itself at startup. So I'm hoping this would temporarily limit the current to, say, 1 A. In steady state the voltage across the inductor would drop to zero and it would then act as a pure low resistance, which is exactly what I think I want.
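As a sanity check on the idea, the RL step response gives a rough size for such an inductor. With a series inductor L and motor resistance R, the current rises as i(t) = (V/R)(1 − e^(−tR/L)); to keep it below a ceiling until the motor spins up, you solve that for L. Every number below (supply voltage, stall resistance, spin-up time) is an assumed round figure, not a measurement:

```python
import math

# RL step response: i(t) = (V/R) * (1 - exp(-t*R/L)).
# All component values are assumed round numbers for illustration only.
V, R = 6.0, 1.0      # volts, ohms (assumed supply and stall resistance)
I_MAX = 1.0          # amps - desired current ceiling
T_SPINUP = 0.1       # seconds for the motor to reach speed (assumed)

# Require i(T_SPINUP) = I_MAX:
#   1 - exp(-T*R/L) = I_MAX*R/V   ->   L = -T*R / ln(1 - I_MAX*R/V)
L_needed = -T_SPINUP * R / math.log(1 - I_MAX * R / V)
print(f"inductance needed: {L_needed:.2f} H")
```

Under these assumptions the answer comes out around half a henry, rated for several amps of DC, which is a physically large and expensive component. That, rather than any flaw in the reasoning, is the usual argument against the series-inductor approach.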

It doesn't need to be super-precise or to have a perfectly flat 1 A current limit; curving around 0.9 A or 1.1 A would be fine. But I don't know of an easy way to measure what's going on with a simple DC ammeter.

Could this be the right way to go, or is it too simplistic? And how would I calculate what kind of inductor would be suitable?

Best Answer

I'd go with an off-the-shelf DC-DC converter with a built-in current limit, and feed that from a plug-pack with a higher voltage.

eg:

https://www.aliexpress.com/store/product/DC-DC-CC-CV/1326062_32803647489.html

The three brass screws control the output parameters: one sets the output voltage, one sets the output current limit, and the other sets the input current limit. Set the voltage screw for 6 V, then set the input-current screw low enough that the plug-pack's over-current protection does not activate.

You'll have to feed this thing with more than 6 V, though, since the output is always lower than the input; 9 V or 12 V would work fine.
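A quick way to pick the input-current setting is to estimate the converter's draw from the output power. The efficiency and input voltage below are assumed round numbers, not specifications of this particular module:

```python
# Rough input-current estimate for a buck converter:
# P_in ≈ P_out / efficiency, so I_in ≈ (V_out * I_out) / (efficiency * V_in).
# EFFICIENCY and V_IN are assumed round numbers for illustration.
V_OUT, I_OUT = 6.0, 0.8   # the toy's measured steady-state operating point
V_IN = 12.0               # assumed plug-pack voltage
EFFICIENCY = 0.90         # assumed converter efficiency

i_in = (V_OUT * I_OUT) / (EFFICIENCY * V_IN)
print(f"steady-state input current: {i_in:.2f} A")
```

Under these assumptions the plug-pack sees well under half an amp in steady state, leaving plenty of headroom below a 1.2 A protection threshold; the converter's own limit then absorbs the motor's inrush.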

You're going to need a voltmeter so you can set the output voltage.
