Output coupling capacitors in dual-supply op amp circuits

audio, operational-amplifier

Consider a dual-supply op amp circuit used to process (but not amplify) line-level audio; for the sake of simplicity, imagine a unity-gain follower. With the input AC coupled and referenced to ground, only a small DC offset, caused by the op amp itself, appears at the output, and this seems negligible at first glance.

In this scenario, is it necessary to also put a coupling capacitor at the output? Here are some of my thoughts so far.

Pros of having an output coupling capacitor:

  • If a later stage is DC coupled and has gain, the coupling capacitor would prevent the DC offset from being amplified.
  • If a fault develops within the device and the op amp starts outputting a rail voltage (or any substantial DC level for that matter), the output capacitor would prevent this condition from negatively affecting later stages.

Cons of having an output coupling capacitor:

  • Since the input impedance of a later stage is unknown, a fairly large capacitance is required to achieve the desired low-frequency response.
  • The capacitor would have to be terminated with a resistor to ground. To avoid loading the output, it would have to be a high value resistor and this would introduce additional noise at the output.

Is there anything I'm missing here? As much as I'd like to omit the cap and the accompanying resistor for obvious reasons, I really wouldn't like to design a product that causes trouble for its users and interacts badly with other devices.

Best Answer

The capacitor would have to be terminated with a resistor to ground. To avoid loading the output, it would have to be a high value resistor and this would introduce additional noise at the output.

Not really. A 1 Mohm resistor generates about 18 uV RMS of thermal noise across the full 20 kHz audio bandwidth (see this calculator) at an ambient temperature of 20 degC, but that figure only applies into an open circuit. The resistor would not actually "see" an open circuit: if the input impedance of the following stage is, say, 10 kohm, the noise source is effectively the parallel combination of 1 Mohm and 10 kohm (about 9.9 kohm), which brings the 18 uV down to roughly 1.8 uV. And even then, within the audio band the driving amplifier presents a substantially lower impedance of a few ohms through the coupling capacitor, reducing the noise further still.
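If you want to sanity-check those numbers yourself, here is a minimal Python sketch of the thermal-noise arithmetic; the 1 Mohm shunt, the 10 kohm next-stage impedance and the 20 kHz bandwidth are the assumptions used above, not fixed requirements:

```python
from math import sqrt

k = 1.380649e-23          # Boltzmann constant, J/K
T = 293.15                # 20 degC in kelvin
B = 20e3                  # assumed audio bandwidth, Hz

def johnson_noise(r_ohms):
    """RMS thermal (Johnson) noise voltage of a resistor: sqrt(4*k*T*R*B)."""
    return sqrt(4 * k * T * r_ohms * B)

r_shunt = 1e6             # the termination resistor
r_next = 10e3             # assumed input impedance of the next stage

print(johnson_noise(r_shunt))                   # ~1.8e-05 V, i.e. 18 uV open circuit
r_par = r_shunt * r_next / (r_shunt + r_next)   # parallel combination, ~9.9 kohm
print(johnson_noise(r_par))                     # ~1.8e-06 V, i.e. 1.8 uV when loaded
```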

A 10 uF series output cap and a 10 kohm shunt give a low-frequency -3 dB point of 1.6 Hz, so there is no need to resort to an electrolytic; a neat little SMD cap will do the job.

So, if the input impedance of the unknown next stage were 1 kohm, the low-frequency cut-off would rise to 16 Hz - is that really an issue?
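Both corner frequencies follow directly from the standard high-pass formula f = 1/(2*pi*R*C); a quick sketch using the 10 uF value assumed above:

```python
from math import pi

C = 10e-6                 # 10 uF series output capacitor

def cutoff_hz(r_ohms):
    """-3 dB point of the series-C / shunt-R high-pass: 1 / (2*pi*R*C)."""
    return 1 / (2 * pi * r_ohms * C)

print(cutoff_hz(10e3))    # ~1.6 Hz into a 10 kohm load
print(cutoff_hz(1e3))     # ~15.9 Hz into a 1 kohm load
```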

And how big could the offset of a unity-gain op-amp be - maybe 5 mV? Even if that were impressed across a 10 ohm load, it would only drive 0.5 mA of DC, which would not trouble virtually any op-amp capable of working to 20 kHz.
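For completeness, the same offset arithmetic as a one-liner (the 5 mV offset and pessimistically low 10 ohm load are the assumptions from the paragraph above):

```python
v_offset = 5e-3           # assumed worst-case output offset, V
r_load = 10.0             # pessimistically low load, ohms

print(v_offset / r_load)  # 5e-4 A, i.e. 0.5 mA of standing DC current
```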

Is there anything I'm missing here?

Putting things into perspective? Considering reality? Provoking me to answer your question (by raising it LOL)?