Electronic – Gain vs Output Impedance

amplifier, analog, analysis, integrated-circuit, transistors

I am currently trying to improve my understanding of analogue IC design and my textbook kind of brushes over this problem.

It is made quite clear that a low output impedance is desirable so that most of the output voltage appears across the load impedance rather than being lost inside the amplifier. However, in design (specifically in small-signal analysis) a very common way of improving the gain is to increase the output resistance, since transistors amplify by delivering an output current that has to be converted back into a voltage.
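For example, as far as I understand it for a common-source stage with a drain resistor \(R_D\) and transistor output resistance \(r_o\) (a sketch under the usual small-signal assumptions), both quantities are set by the same resistance at the output node:

$$ A_v \approx -g_m\,(R_D \parallel r_o), \qquad R_{out} \approx R_D \parallel r_o $$

so making that resistance larger to raise the gain directly raises the output impedance as well.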

I understand that both can be achieved in multi-stage designs, e.g. with a buffer amplifier stage at the output.

My question is: how do I reconcile these design goals, particularly in single transistor circuits?

Cheers

Best Answer

In a single amplifier stage you can't have low output impedance and high gain at the same time. The active device (bipolar or MOSFET) is voltage controlled and generates a current; to turn that current back into a voltage you need a resistance at the output node. The same resistance sets both the gain and the output impedance, so you have to trade one against the other.
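To make the tradeoff explicit (a sketch under the usual first-order small-signal assumptions, writing \(R_{out}\) for the total resistance seen at the output node):

$$ |A_v| \approx g_m R_{out} \quad\Rightarrow\quad \frac{|A_v|}{R_{out}} \approx g_m $$

For a given transconductance, more voltage gain necessarily comes with a proportionally higher output impedance; within one stage the only lever left is to spend more bias current (or device size) on \(g_m\).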

The output resistance is only of concern when resistive loads are driven. In analog IC design the loads are often capacitive and OTAs (amplifiers with a high output resistance) are perfectly fine for such a task.
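A rough way to see why the high output resistance does not hurt with a capacitive load \(C_L\) (assuming a single dominant pole at the output):

$$ A_0 \approx g_m R_{out}, \qquad f_{-3\,\mathrm{dB}} \approx \frac{1}{2\pi R_{out} C_L}, \qquad \mathrm{GBW} \approx A_0 \, f_{-3\,\mathrm{dB}} = \frac{g_m}{2\pi C_L} $$

Raising \(R_{out}\) raises the DC gain and lowers the bandwidth by the same factor, so the gain-bandwidth product the capacitive load actually sees is set by \(g_m\) and \(C_L\), not by the output resistance.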

If a resistive load does have to be driven, either a buffer stage (emitter/source follower) or feedback around the amplifier is needed to lower the output impedance, which usually means a multi-stage design.
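Roughly, again under first-order small-signal assumptions: a source or emitter follower presents an output resistance of about \(1/g_m\), and negative feedback divides the open-loop output resistance by the loop gain,

$$ R_{out,\mathrm{follower}} \approx \frac{1}{g_m}, \qquad R_{out,\mathrm{closed}} \approx \frac{R_{out,\mathrm{open}}}{1 + A\beta} $$

which is why both approaches end up as multi-stage designs: the gain is made at a high-impedance node in one stage, and the low output impedance is provided by the follower or by the loop gain.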