Electronic – DAC/ADC Precision & Noise: PD Resistor/Capacitor effect

adc, dac, noise, operational-amplifier, pulldown

I'm developing a device that uses both an ADC and a DAC, and I have realized that my DAC output, which is fed into an op amp, is de facto almost floating, since the op amp has a high input impedance. Both the DAC and the ADC are now on the MCU (STM32G071KBT6), although for some time they were discrete components, which I guess doesn't matter much.

I tried to google some stuff, and only found out that lines that end up in a Hi-Z state are susceptible to noise, which makes sense to me, but nothing more specific.

Also, some discrete DACs specify a maximum output capacitance in their datasheet, usually on the order of 1000 pF, some a bit more, some a bit less (others never mention output capacitance at all).

I tried to google my specific question, but couldn't find any reasonable answer.

Should I put a large pulldown resistor on the DAC output to give it some stability and noise resistance? Say 47k (which is everywhere on my PCB already, so another one won't hurt). Why, or why not? (I would also really love to be able to go close to 0 V output.)

What about the ADC? My ADC is fed from a voltage divider after the current sense amplifier; the total resistance of that one is ~15k (5V->3V3, 4.7k/9.12k to produce 3.299 V from 5 V and stay in range; I will also have to account for tolerances, probably in software).

If I fed my ADC input from the op amp directly, it would create the same problem as with the DAC output into the op amp. Would I need a pulldown resistor there? What about a capacitor on the ADC line?

Summary of questions:

ADC input: capacitors and pulldown resistors? Yes/no, when, why?

DAC output: capacitors and pulldown resistors? Yes/no, when, why?

Best Answer

ADC input:
The output impedance of your voltage divider is the parallel combination of the two resistors, in this case about 3.1k. That is a fairly low impedance for driving the ADC, so depending on your application you may not need any capacitor. A pulldown would only load the output of the voltage divider and give you a slightly low reading. A capacitor to ground forms a lowpass filter with the output impedance of the divider. You can use this to filter out higher frequencies and make sure you fulfil the Nyquist criterion (the sampling rate of the ADC must be at least double the highest frequency in the signal). *1
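As a quick sanity check (a sketch using the resistor values from the question, 4.7k over 9.12k from 5 V), the divider's output impedance and nominal output voltage work out like this:

```python
# Sanity check with the divider values from the question.
R_top = 4.7e3   # upper divider resistor, ohms
R_bot = 9.12e3  # lower divider resistor, ohms
V_in = 5.0      # current-sense amplifier output, volts

# Thevenin (output) impedance of the divider: parallel combination
R_out = (R_top * R_bot) / (R_top + R_bot)

# Nominal divider output voltage
V_out = V_in * R_bot / (R_top + R_bot)

print(f"R_out = {R_out:.0f} ohm")  # about 3.1k
print(f"V_out = {V_out:.3f} V")    # about 3.3 V, inside the 3V3 ADC range
```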

DAC output:
While the OpAmp input may be high impedance, the output of the DAC counts as low(-ish) impedance. It is not floating; it is actively driven to the output voltage.
Simply placing a pulldown resistor will only load the output and drop the voltage a little, while gaining you very little. Depending on your signal frequency and your DAC sampling rate, you might want to place a series resistor with a capacitor to ground as a lowpass filter (also called a reconstruction filter) to eliminate noise from the DAC. If you output only DC (or very low frequencies), that may not be necessary.
Additional remark: you want the output of the DAC to go as close as possible to ground. The DAC of your µC has an internal buffer that lowers the output impedance from something like ~18k to something between 2k and 3k. But I would not enable that in your case! You are driving a high-impedance OpAmp input, so you don't need the low output impedance. And the internal buffer can only go down to about 0.2 V, while the DAC itself can drive the output directly to ground.
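To see why a 47k pulldown mostly just loads the output, here is a rough sketch using the approximate output impedances quoted above (the DAC is modeled as an ideal source behind its output impedance, which is a simplification):

```python
# Rough illustration: a 47k pulldown forms a divider with the DAC's
# output impedance, so part of the intended voltage is lost.
R_pulldown = 47e3

def loaded_fraction(r_source, r_load):
    """Fraction of the DAC code voltage that survives the divider
    formed by the DAC output impedance and the pulldown."""
    return r_load / (r_source + r_load)

# Buffer disabled: roughly 18k output impedance (value from above)
frac_unbuffered = loaded_fraction(18e3, R_pulldown)
# Buffer enabled: roughly 2.5k output impedance (value from above)
frac_buffered = loaded_fraction(2.5e3, R_pulldown)

print(f"unbuffered: {frac_unbuffered:.1%} of the set voltage")  # ~72%
print(f"buffered:   {frac_buffered:.1%} of the set voltage")    # ~95%
```

So without the buffer the pulldown would cost you roughly a quarter of your output range, which is why it yields little benefit here.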


*1
Whether or not you really need a lowpass filter in front of your ADC depends a little (as it always does) on your exact specs. If the current you measure is almost constant and you are quite sure that the sampling rate of the ADC is faster than any changes in current, you can probably skip it. But be aware that coupled-in noise or fast-changing current waveforms can lead to completely wrong measurements.
If you don't want to take that risk, you need the filter. You can follow this procedure to design it:

  • Set the corner frequency of the filter to \$ f_c \leq \frac{f_s}{2} \$, with \$f_s\$ being the sampling frequency. If your ADC takes one sample every ms, your filter has to cut off at a maximum frequency of 500 Hz.
  • Choose a reasonable combination of resistor and capacitor to fulfil \$ f_c = \frac{1}{2 \cdot \pi \cdot RC}\$. Because the input impedance of your ADC is about 50k, you want the signal impedance to be no higher than about 1/10 of that. If we set R = 5k, this gives us a C of 64 nF for 500 Hz; I would round up to about 100 nF. Because the voltage divider already gives you 3.1k, you only need to add 1.9k in front of the capacitor.
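The two steps above can be checked with a short sketch (assuming the 1 ms sample period and R = 5k from the example):

```python
import math

# Step 1: corner frequency from the sampling rate (1 ms period assumed)
f_s = 1000.0       # sampling frequency, Hz
f_c_max = f_s / 2  # corner frequency must not exceed fs/2 -> 500 Hz

# Step 2: capacitor for fc = 500 Hz with R = 5k (3.1k divider + 1.9k added)
R = 5e3
C = 1 / (2 * math.pi * R * f_c_max)
print(f"C = {C * 1e9:.1f} nF")  # about 64 nF, round up to 100 nF

# Rounding up to 100 nF lowers the actual corner frequency, which is fine
# since the requirement is only fc <= fs/2:
f_c = 1 / (2 * math.pi * R * 100e-9)
print(f"f_c = {f_c:.0f} Hz")    # about 318 Hz
```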

This procedure gives only a rough estimate. There are two important factors to keep in mind:

  1. The filter has a finite roll-off; it does not attenuate everything above the corner frequency to zero. The further \$f_c\$ is below \$f_s\$, the better.
  2. Even when the signal impedance is only 1/10 of the ADC's input impedance, you will get some voltage drop (the signal impedance and the ADC input form a second voltage divider). You have to determine how much voltage drop you can allow; this depends on your specs for measurement resolution.
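Point 2 can be quantified with the numbers used above. Note that modeling the ADC input as a plain 50k resistance is an assumption for this sketch; the real input is a switched-capacitor circuit and behaves somewhat differently:

```python
# Rough estimate of the loading error from the second voltage divider.
# Assumption: ADC input modeled as a simple 50k resistance to ground.
R_signal = 5e3  # total source impedance in front of the ADC
R_adc = 50e3    # assumed ADC input impedance

fraction = R_adc / (R_signal + R_adc)
error = 1 - fraction
print(f"signal reaches {fraction:.1%} of its value, i.e. reads ~{error:.1%} low")
```

Even at the 1/10 ratio this simple model predicts a drop of several percent, so if that matters for your resolution you have to correct for it (e.g. in software) or lower the source impedance.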