LTSpice simulation trouble

ltspice, spice

I'm simulating a bandpass amplifier stage (roughly 80–200 MHz) with two transistors in LTSpice. When I simulate the transfer function I get a maximum gain at a given frequency, let's say 100 MHz. Then, as a way to check stability, I run a transient analysis with the input frequency swept from 5 to 500 MHz, and I find two odd things:
– the maximum gain occurs at a different frequency than in the transfer function analysis; which one is right, and why do they differ?
– if I run the transient for several frequency-sweep cycles, the frequency of maximum gain changes from cycle to cycle (the sweep is, I believe, slow enough that the response is almost quasi-static)
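
For concreteness, the two runs described above look roughly like this in LTSpice directives (node names, amplitude, sweep time, and timestep here are placeholders, not my actual netlist):

    * --- Run 1: small-signal transfer function ---
    * LTspice finds the DC operating point once, linearizes the circuit
    * there, and sweeps the frequency of a small AC source.
    V1 in 0 AC 1
    .ac dec 200 5Meg 500Meg

    * --- Run 2: transient with a swept-frequency input ---
    * Replace V1 with a linear chirp, 5 MHz -> 500 MHz over 20 us
    * (only one of the two sources drives the input in a given run).
    B1 in 0 V=10m*sin(2*pi*(5Meg*time + (495Meg/(2*20u))*time*time))
    .tran 0 20u 0 0.2n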

Best Answer

This sounds like your bias point is moving, or your bias circuit is being excited by the input waveform. An AC analysis first computes the DC operating point and then uses those bias values for a small-signal sweep (that's why it is so fast). A transient analysis, of course, recomputes the bias as the signal changes. The fact that running different transient analyses with different stimuli gives differing results is a big clue that your operating point is shifting as well.
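
One quick way to check this, sketched below with a placeholder node name ("base" stands for whichever node sets your transistors' quiescent point), is to compare the average bias voltage early and late in the transient sweep:

    * The .op point LTspice computes before the AC sweep is the reference bias.
    .op

    * In the transient run, average the voltage at a bias node over an
    * early and a late window of the sweep (windows assume a 20 us run).
    .meas TRAN Vbias_early AVG V(base) FROM 1u TO 3u
    .meas TRAN Vbias_late  AVG V(base) FROM 17u TO 19u

If the two measured values differ appreciably, the input signal is dragging the operating point around, and the gain peak seen in the transient sweep will not line up with the small-signal one.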