Electronic – Programmable Power Supply Design: Issues with variable load and voltage stability

dac, load, power-supply, voltage-regulator

TL;DR: How can I stabilize/regulate an output voltage coming from a DAC and use it to drive a load?

Long-time web-searcher/lurker, first-time poster.

I've been designing an automated test station for work, and it's sort of turned into a pet project of mine that I spend a lot of time on outside of work. I guess that happens when your hobbies and work mix. I'm learning tons, but I obviously still have a lot to learn.

The point is to run one of our products through a series of regression tests. One of the requirements is adjustable (programmable) power. The catch is that I designed this to test n units simultaneously (currently 8), so buying a programmable supply for each one is out of the question, not to mention the space constraints. To that end I architected the system around one main control board, which can issue commands to a theoretically unlimited number of "cards"; each card contains the hardware (DAC, latches, etc.) needed to remember the state the board told it to be in with regard to power level, ADC level, inputs, etc…
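To make the card/control-board split concrete, here's a purely illustrative sketch of a latched per-card state and a "set power level" command. None of these names, registers, or the frame layout come from the post; it's just one way such a bus might be organized.

```c
#include <stdint.h>

typedef struct {
    uint8_t card_addr;   /* which card on the shared bus             */
    uint8_t power_code;  /* 8-bit DAC code for the programmable rail */
    uint8_t adc_mux;     /* which signal the card's ADC reads back   */
    uint8_t inputs;      /* latched digital stimulus to the DUT      */
} card_state_t;

/* Hypothetical register map for an [addr][register][value] frame; the card
   latches the value so it holds state until told otherwise. */
enum { REG_POWER = 0x01, REG_ADC_MUX = 0x02, REG_INPUTS = 0x03 };

static void send_frame(uint8_t addr, uint8_t reg, uint8_t value)
{
    uint8_t frame[3] = { addr, reg, value };
    /* bus_write(frame, sizeof frame);  -- SPI/UART/whatever the backplane uses */
    (void)frame;
}

static void set_power_level(card_state_t *card, uint8_t dac_code)
{
    card->power_code = dac_code;                       /* remember what we asked for */
    send_frame(card->card_addr, REG_POWER, dac_code);  /* tell the card to latch it  */
}

int main(void)
{
    card_state_t card3 = { .card_addr = 3 };
    set_power_level(&card3, 120);   /* e.g. code 120 ~= 12 V at ~100 mV/step */
    return 0;
}
```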

I'm handling the programmable power section with a high-voltage 8-bit DAC made out of transistors. The output can sit anywhere between 0 and 90 VDC, though it's typically 12-24 V. This part works surprisingly well and is very accurate, with ~100 mV steps.
Check out the simulation link at the bottom of this post.
The resistor dividers are to prevent exceeding the max Vgs rating of the transistors while still keeping them in saturation.
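For reference, an 8-bit DAC gives 256 codes, so the step size simply scales with whatever full-scale voltage the ladder sees (the post doesn't state the full-scale explicitly, so the numbers below are my assumption):

$$\Delta V_{LSB} \approx \frac{V_{FS}}{2^{8}-1}$$

A full-scale around 25.5 V works out to the ~100 mV steps mentioned above; at the full 90 V it would be closer to 350 mV per step.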

DAC Circuit

The only problem is actually using that voltage to drive a load. Because this is a resistor network, it's really, really sensitive to impedance changes. The best solution I've found so far is to use a double Sziklai pair (simulation at the bottom of the post). This works reasonably well and drives the load, but it has a few issues.

  • Small currents disable the pair and the output voltage spikes to its maximum. The device under test can draw anywhere between 0.5 mA and 500 mA, so I had to add a resistor (R1 in the next image) to make sure the transistors stay on (a rough sizing sketch follows this list).
  • The output voltage is wildly unstable. It's great when only drawing tens of milliamps, but when the current draw jumps up to 300-500 mA, the output voltage can sometimes drop by a full 2 V.
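On the bleeder: assuming R1 is a minimum-load resistor from the output to ground (which is how I read the description), it has to sink enough current at the lowest programmed voltage to keep the output pair conducting. The 5 mA minimum is my assumption, not a figure from the post:

$$R_1 \lesssim \frac{V_{out,\,min}}{I_{bleed}} = \frac{12\ \text{V}}{5\ \text{mA}} = 2.4\ \text{k}\Omega$$

It then dissipates $V_{out}^2/R_1 \approx 240\ \text{mW}$ at 24 V, so it needs a suitably rated part.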

Power Delivery Circuitry

I could deal with this problem by using feedback to adjust the voltage continuously (I have a line for reading back the set voltage), but I'd really like to figure out a way to make this work without any trickery.
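For what it's worth, that "trickery" can be pretty tame. Below is a minimal sketch of such a correction loop, assuming a millivolt readback on the sense line and an 8-bit DAC-code write; every function name and the fake droop model are mine, since the post shows no firmware.

```c
#include <stdint.h>
#include <stdio.h>

#define DAC_MAX 255u
#define LSB_MV  100u    /* ~100 mV per DAC step, per the post */

/* Stand-ins for the real card I/O (DAC latch write, ADC readback).
   The model just fakes the ~2 V sag seen under heavy load. */
static uint32_t sim_output_mv;

static void write_dac_code(uint8_t card, uint8_t code)
{
    (void)card;
    sim_output_mv = (uint32_t)code * LSB_MV - 2000u;   /* pretend the rail sags 2 V */
}

static uint32_t read_output_mv(uint8_t card)
{
    (void)card;
    return sim_output_mv;
}

/* Nudge the DAC code one LSB at a time until the measured output is
   within half an LSB of the request. Bounded so it can't hunt forever. */
static uint8_t trim_output(uint8_t card, uint32_t target_mv, uint8_t code)
{
    for (int i = 0; i < 32; i++) {
        uint32_t meas = read_output_mv(card);
        if (meas + LSB_MV / 2 < target_mv && code < DAC_MAX)
            write_dac_code(card, ++code);
        else if (meas > target_mv + LSB_MV / 2 && code > 0)
            write_dac_code(card, --code);
        else
            break;
    }
    return code;
}

int main(void)
{
    write_dac_code(0, 120);                      /* ask card 0 for 12.0 V open-loop */
    uint8_t code = trim_output(0, 12000u, 120);  /* then trim against the readback  */
    printf("settled at code %u -> %u mV\n", (unsigned)code, (unsigned)read_output_mv(0));
    return 0;
}
```

The half-LSB deadband and the bounded loop keep it from hunting forever around the target.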

I've tried other methods to make this work, such as an LM317T regulator with a digital pot for control and programmable buck converters, but I can't find any combination of devices that works within both the current draw and the voltage range.
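For context on why the LM317-plus-digital-pot route fights you here (my reasoning; the post doesn't say why it failed), the adjustable regulator's output is set by

$$V_{OUT} = V_{REF}\left(1 + \frac{R_2}{R_1}\right) + I_{ADJ}R_2, \qquad V_{REF}\approx 1.25\ \text{V},\ I_{ADJ}\approx 50\ \mu\text{A}$$

so the resistor from ADJ to ground sees nearly the full output voltage. Most digital pots are only rated for a few volts across their terminals (a few parts reach ±15 V or 30 V), and the LM317 itself is limited to a 40 V input-to-output differential, which rules out anything near the 90 V end of the range.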

So, with all the long-winded stuff out of the way: is there a way to stabilize the voltage here, or am I totally out to lunch with this design?

Falstad Simulations:

Best Answer

I revisited the op-amp config, and it works great! I still get a voltage drop of around 300 mV at high current, but that's more than acceptable for what I'm doing. Thanks to all who pitched in. Cheers!