Adjustable LDO post-regulator for bench supply design

I'm designing a small programmable bench power supply (well, I say "small", but the AOT430 FETs will handle 10A+ in their linear region without thermal runaway so…).

A switching regulator provides coarse voltage step-down (24 Vdc to 0-20 Vdc, isolated) while a MOSFET-based LDO handles the fine control; a small DSP oversees everything else.

I want to have both programmable current and voltage limits. As for the control loop, it's probably going to end up as a mix of analog and software control. I can either regulate the voltage across the load with discretes and handle the current control in software, or regulate the current through the load with discretes and handle the voltage regulation in software.

I would like to know what sort of advantages/disadvantages (if any) either solution would have in the context of a variable bench supply. Or should I try something else altogether, like an all-analog or all-digital control loop?

At the moment I'm leaning more towards analog voltage control. I expect that analog current control might produce bigger voltage spikes/sags when the load changes, while the DSP readjusts the current setting to bring the output voltage back under control. But at the same time, I'm thinking the analog current version would have (much) faster current limiting and short-circuit protection.

[Schematic: conceptual circuits, drawn in CircuitLab]

(Note: These are only conceptual schematics and have been greatly simplified)

Best Answer

Since you're using a linear post-regulator after a switcher, the switcher can afford to have more ripple than you want on the output. This allows for very simple control schemes for the switcher.

I haven't really thought this through carefully, but my first knee-jerk reaction is to run a fairly dumb switcher controlled by the micro. Possibly the firmware doesn't even get involved with the switcher control once the PWM generator is set up.

The micro would take care of the user interface, reading the voltage and current setpoint knobs, getting commands from a communication interface, displaying values, etc. It then creates the reference voltage via PWM, which is low-pass filtered and presented to the analog section. Likewise, it makes a current-limit setpoint voltage.
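
A minimal sketch of how that setpoint generation might look in firmware, assuming a 12-bit PWM, 0-20 V and 0-10 A setpoint ranges, and a hypothetical pwm_set_duty() HAL call (none of these details come from the answer itself):

    #include <stdint.h>

    /* Hypothetical HAL call: set the duty register of one PWM channel. */
    extern void pwm_set_duty(unsigned channel, uint16_t duty);

    #define PWM_FULL_SCALE  4095u    /* assumed 12-bit PWM                  */
    #define VOUT_MAX_MV     20000u   /* assumed 0-20 V output range         */
    #define ILIM_MAX_MA     10000u   /* assumed 0-10 A current-limit range  */

    /* Map a setpoint (0..max_units) onto the full PWM duty range.  The
     * RC-filtered PWM rail and the output feedback divider are assumed to
     * be scaled so that full duty corresponds to a full-scale setpoint. */
    static uint16_t setpoint_to_duty(uint32_t value, uint32_t max_units)
    {
        if (value > max_units)
            value = max_units;
        return (uint16_t)((value * PWM_FULL_SCALE) / max_units);
    }

    void set_voltage_reference(uint32_t vout_mv)   /* channel 0: voltage reference      */
    {
        pwm_set_duty(0, setpoint_to_duty(vout_mv, VOUT_MAX_MV));
    }

    void set_current_limit(uint32_t ilim_ma)       /* channel 1: current-limit setpoint */
    {
        pwm_set_duty(1, setpoint_to_duty(ilim_ma, ILIM_MAX_MA));
    }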

The voltage controller is a simple opamp with a FET follower. A PNP transistor is used to detect when the switcher output is 700 mV or so above the actual output. This produces a binary signal that drives the shutdown input of the PWM generator in the micro.

The current loop uses a high-side current sense between the switcher and the output FET. From that, a ground-referenced current-magnitude signal is created and compared against the filtered current-limit voltage from the micro; the comparator also shuts down the PWM generator when the current is above the setpoint.
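
To put illustrative numbers on that sense path, here is the scaling with an assumed 10 milliohm shunt and a gain-of-20 sense amplifier; both values are made up, but they show the scale the micro's current-limit PWM has to match:

    #include <stdint.h>

    /* Worked example of the high-side sense scaling.  The assumed shunt and
     * amplifier give 200 mV per amp at the comparator, so a 0-2 V filtered
     * PWM from the micro would cover a 0-10 A current-limit range.
     */
    #define SHUNT_MILLIOHM   10u     /* assumed shunt resistance            */
    #define SENSE_AMP_GAIN   20u     /* assumed current-sense amplifier gain */

    /* Comparator input in millivolts for a load current in milliamps:
     * mA * milliohm = microvolts, times gain, divided by 1000 -> millivolts.
     * Example: 5000 mA -> 5000 * 10 * 20 / 1000 = 1000 mV.
     */
    static inline uint32_t sense_mv_from_ma(uint32_t ma)
    {
        return (ma * SHUNT_MILLIOHM * SENSE_AMP_GAIN) / 1000u;
    }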

For extra credit, have the micro continually read the raw input voltage. It then adjusts the PWM duty cycle for the optimum value based on the input voltage, the output voltage setpoint, and the maximum current that the switcher needs to be able to deliver. This dynamically tweaks the efficiency a bit, and should help avoid inductor saturation.
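
A rough sketch of that adjustment, assuming an ideal buck relation (duty = Vout/Vin), a fixed headroom target, and 10-bit PWM resolution; none of these are specified above, and the load-current/loss term is left out:

    #include <stdint.h>

    #define HEADROOM_MV      1500u   /* assumed margin for LDO dropout plus switcher ripple */
    #define DUTY_FULL_SCALE  1023u   /* assumed 10-bit PWM resolution                       */
    #define DUTY_MAX         ((DUTY_FULL_SCALE * 95u) / 100u)   /* always leave some off-time */

    /* Pick a duty cycle so the switcher output sits HEADROOM_MV above the
     * linear stage's setpoint, using the ideal buck relation D = Vout/Vin. */
    uint16_t switcher_duty(uint32_t vin_mv, uint32_t vout_set_mv)
    {
        uint32_t target_mv = vout_set_mv + HEADROOM_MV;
        uint32_t duty;

        if (vin_mv == 0u || target_mv >= vin_mv)
            return DUTY_MAX;          /* can't reach the target: run near full duty */

        duty = (target_mv * DUTY_FULL_SCALE) / vin_mv;
        if (duty > DUTY_MAX)
            duty = DUTY_MAX;
        return (uint16_t)duty;
    }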

This scheme of fixed PWM with shutdown will cause more ripple on the switcher output, but since that's followed by a linear regulator, it shouldn't matter much. The big advantages of this method are that it is very simple, and inherently stable over the wide range of operating points a bench supply must be able to handle.