Just how does high VSWR damage RF amplifiers?


Just how is it that a high VSWR can damage the final transistors in an RF power amplifier?

Is the transmission line significant beyond the effect it has on transforming the impedance of the load at the other end? Or would an equivalent lumped impedance directly at the amplifier's output be just as damaging?

Are all of the possible impedances that result in a given VSWR equally bad?

Is the reflected power "absorbed" by the amplifier? For example, if I'm getting 100W reflected power, is that more or less the same as putting a 100W heater on the amplifier?

I've also read that excessive voltage can be the mechanism leading to damage. How is it that a voltage higher than the supply voltage can appear? Is there a limit to how high this voltage can be in the presence of an arbitrary mismatch?

Best Answer

Just how is it that a high VSWR can damage the final transistors in an RF power amplifier? Is it simply the wrong impedance (after transformation by the feedline) appearing at the terminals or is the transmission line in particular important?

It depends on the design of the amplifier you're using.

If the reflection coefficient seen by the amplifier is -1 (thus \$\rm{VSWR}\approx\infty\$), that's equivalent to driving a short circuit, and you can see why that would be an overload condition for just about any type of amplifier.
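To make that concrete, here's a small Python sketch (Z0 = 50 Ω and the example loads are my own assumptions, not values from the question) relating load impedance, reflection coefficient, and VSWR:

```python
# Reflection coefficient and VSWR for a few example loads.
# Z0 = 50 ohm and the specific loads are my own illustrative assumptions.

Z0 = 50.0  # reference (characteristic) impedance, ohms

def gamma(zl, z0=Z0):
    """Voltage reflection coefficient of load zl against z0."""
    return (zl - z0) / (zl + z0)

def vswr(zl, z0=Z0):
    """VSWR from |gamma|; effectively infinite as |gamma| approaches 1."""
    g = abs(gamma(zl, z0))
    return float('inf') if g > 1 - 1e-9 else (1 + g) / (1 - g)

for name, zl in [("matched 50 ohm",     50 + 0j),
                 ("short circuit",       0 + 0j),    # gamma = -1
                 ("12.5 ohm resistive", 12.5 + 0j),
                 ("pure reactance 50j",  0 + 50j)]:  # |gamma| = 1, absorbs nothing
    print(f"{name:18s} gamma = {gamma(zl):.2f}  VSWR = {vswr(zl):.1f}")
```

Note that the short and the purely reactive load both give \$\rm{VSWR}\approx\infty\$ even though they stress the amplifier in different ways, which is why impedances sharing the same VSWR are not all equally bad.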

If the reflection coefficient is +1 (again \$\rm{VSWR}\approx\infty\$), that's equivalent to driving an open circuit. If your amplifier's output stage looks like a common-emitter amplifier with a resistive pull-up (for example, a CML buffer), that's not going to be a problem at all. In some other amplifier configuration with reactive elements, the increased output voltage could cause breakdown of the output devices, for example.
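That's also where a voltage higher than the supply can come from. A sketch with assumed numbers (a 50 Ω system and 100 W of forward power): the voltage at a standing-wave antinode is the forward-wave peak times \$(1+|\Gamma|)\$, so total reflection can double the peak voltage the output network was designed around:

```python
import math

# Peak RF voltage on the line vs. reflection magnitude.
# Z0 and the forward power are my own assumed numbers.

Z0 = 50.0       # characteristic impedance, ohms (assumed)
P_fwd = 100.0   # forward power, watts (assumed)

v_fwd_peak = math.sqrt(2 * P_fwd * Z0)   # peak voltage of the forward wave alone (100 V here)

for g in (0.0, 0.5, 1.0):                # |gamma| from matched to total reflection
    v_max = v_fwd_peak * (1 + g)         # voltage at a standing-wave antinode
    print(f"|gamma| = {g:.1f}: peak voltage on the line = {v_max:.0f} V")
```

For a passive mismatch \$|\Gamma|\le 1\$, so the line voltage itself is bounded at twice the forward-wave peak. Bear in mind that many output networks (an RF-choke-fed collector or drain, for instance) already swing above the supply rail even when matched, so the extra factor from the mismatch eats directly into the device's breakdown margin.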

Is it reflected power being absorbed and dissipated in the transistors or something else?

If your amplifier's output impedance has a real part, that implies it absorbs (and dissipates) at least part of the reflected wave.
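As a rough answer to the "100 W heater" comparison, here's a sketch that treats the amplifier's output as a simple linear termination with its own reflection coefficient \$\Gamma_s\$ (a real large-signal PA is not that simple, so take this as a first-order picture only): the amplifier only absorbs the part of the returning wave that its own output match lets in:

```python
# Rough "heater" estimate: treat the amplifier's output as a linear termination
# with reflection coefficient gamma_s and see how much of the returning wave it absorbs.
# The 100 W figure is taken from the question; gamma_s values are assumptions.

P_back = 100.0                          # power arriving back at the amplifier, watts
for gamma_s in (0.0, 0.3, 0.7, 1.0):    # output match: perfect ... total re-reflection
    absorbed = P_back * (1 - gamma_s**2)
    print(f"|gamma_s| = {gamma_s:.1f}: absorbed {absorbed:5.1f} W, "
          f"re-reflected {P_back - absorbed:5.1f} W")
```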

However, the reflected wave will likely be coherent with the outgoing wave the amplifier is producing, so interference between the two can either increase or reduce the stress on the amplifier, depending on the phase relationship between them.

If you're driving a long line, then small changes in the signal frequency, or even the temperature of the line, could change the reflected wave phase significantly, so it would probably not be a good idea to try to design on the assumption that you can control the phase of the reflection.
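To put numbers on that, here is a sketch assuming \$|\Gamma| = 0.5\$ at the load and a lossless line: the reflection coefficient the amplifier actually sees is the load's \$\Gamma\$ rotated by twice the electrical length of the line, so the same mismatch can raise or lower the voltage at the amplifier depending on line length, and therefore on frequency and temperature:

```python
import cmath, math

# How the reflection the amplifier sees depends on electrical line length.
# Illustrative assumptions: |gamma| = 0.5 at the far-end load, lossless line.

gamma_load = 0.5    # reflection coefficient at the load (assumed real)

for l_wavelengths in (0.0, 0.125, 0.25, 0.375, 0.5):
    beta_l = 2 * math.pi * l_wavelengths             # electrical length, radians
    gamma_in = gamma_load * cmath.exp(-2j * beta_l)  # rotated back to the amplifier
    v_rel = abs(1 + gamma_in)                        # |V| at the amp, relative to the forward wave
    print(f"l = {l_wavelengths:.3f} lambda: gamma at amp = {gamma_in:.2f}, "
          f"|V|/|V+| = {v_rel:.2f}")
```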

If you're driving a short line, then controlling the phase of a reflection by controlling the line length is common practice, done every time we use a series or shunt stub in a matching network, for example.
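For instance (a sketch assuming an ideal lossless 50 Ω short-circuited stub), the stub presents \$Z_{in} = jZ_0\tan(\beta l)\$, so choosing its length is exactly how you choose the reactance, i.e. the deliberate reflection, used for matching:

```python
import math

# Input reactance of an ideal short-circuited stub: Z_in = j * Z0 * tan(beta * l).
# Z0 = 50 ohm and the lengths below are my own illustrative choices.

Z0 = 50.0
for l_wavelengths in (0.05, 0.125, 0.20):
    beta_l = 2 * math.pi * l_wavelengths
    x_in = Z0 * math.tan(beta_l)     # purely reactive; sign and size set by the length
    print(f"shorted stub, l = {l_wavelengths:.3f} lambda: Z_in = j{x_in:.1f} ohm")
```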