I'm simulating a band-pass amplifier stage (roughly 80–200 MHz) built with two transistors in LTspice. When I run an AC analysis of the transfer function, I get a gain peak at a given frequency, say 100 MHz. Then, as a rough stability check, I run a transient analysis with the input frequency swept from 5 to 500 MHz, and I find two odd things:
– the gain peak is at a different frequency than in the AC analysis; which one is right, and why do they differ?
– if I run the transient for several sweep cycles, the peak-gain frequency changes from cycle to cycle (the sweep is, I believe, slow enough for a quasi-static response).
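For reference, a swept-sine (linear-chirp) transient input like the one described can be generated in LTspice with a behavioral source. This is only a sketch: the node name `in`, the 10 mV amplitude, and the 100 µs sweep time are my own assumptions, not taken from the question.

```spice
* Hypothetical linear-chirp input: instantaneous frequency runs f0 -> f1 over T
.param f0=5Meg f1=500Meg T=100u
* Phase of a linear chirp is 2*pi*(f0*t + (f1-f0)*t^2/(2*T)),
* so the "frequency" factor inside sin() is f0 + (f1-f0)*t/(2*T)
B1 in 0 V=10m*sin(2*pi*(f0 + (f1-f0)*time/(2*T))*time)
.tran 0 {T}
```

Note that the instantaneous frequency of this source is f0 + (f1-f0)*t/T, twice the slope of the factor written inside sin(), which is the usual chirp-phase subtlety when reading peak positions off the transient waveform.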
LTSpice simulation trouble
ltspice
Best Answer
This sounds like your bias point is moving, or is being excited by the input waveform. An AC analysis first computes a DC operating point and then performs a small-signal sweep about that fixed bias (that's why it is so fast). A transient analysis, by contrast, computes the full large-signal response, so the bias is free to shift as the signal changes. The fact that transient runs with different stimuli give different results is a strong clue that your operating point is indeed shifting.
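One quick way to test whether the bias really moves is to compare the static operating point against the time-averaged node voltages during the transient. A minimal sketch, assuming the collector bias node is called `vc` (a hypothetical name for your circuit):

```spice
* Static bias for comparison
.op
* During the .tran run, measure drift of the (hypothetical) bias node vc:
.meas TRAN VcAvg AVG V(vc)  ; average should match the .op value if the bias is static
.meas TRAN VcPP  PP  V(vc)  ; a large peak-to-peak value means the bias node is being driven
```

Running the transient one fixed tone at a time (e.g. `V1 in 0 SINE(0 10m {fin})` stepped with `.step param fin list 50Meg 100Meg 150Meg`) also removes the sweep dynamics and makes the comparison with the AC analysis much cleaner.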