Electronic – What exactly limits the signal frequency on transmission lines

electromagnetism · signal · transmission-line

In practice, the frequency of a signal seems to have a significant impact on whether or not it can reasonably be transmitted over a line. As an example, consider the IEEE 802.3ab standard for Gigabit Ethernet (1000BASE-T), which deliberately employs a symbol rate of just \$125\text{ MBaud}\$.
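
For context: this modest symbol rate still delivers \$1\text{ Gbit/s}\$ because 1000BASE-T sends PAM-5 symbols carrying two information bits on each of the four pairs simultaneously. A quick sanity check:

```python
# Sanity check: how 1000BASE-T reaches 1 Gbit/s at only 125 MBaud.
symbol_rate = 125e6   # symbols per second on each pair
bits_per_symbol = 2   # PAM-5 carries 2 information bits/symbol (rest is coding overhead)
pairs = 4             # all four twisted pairs transmit simultaneously

print(f"{symbol_rate * bits_per_symbol * pairs / 1e9:.0f} Gbit/s")  # -> 1 Gbit/s
```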

Intuitively, I understand that you cannot use a conventional twisted-pair cable to transmit a \$100\text{ GHz}\$ signal. But what exactly is it that keeps us from doing so? In real-world scenarios, what are the equations that, when populated with these frequencies, yield infeasible results?

Here are my thoughts so far:

  1. With increasing signal frequency, the parasitic inductance as well as the parasitic capacitance will probably lead to significant damping. Is this effect captured in sufficient detail by the real part of the characteristic impedance
    $$\sqrt{\frac{R'+j\omega L'}{G'+j\omega C'}}\;?$$ And if so: wouldn't this mean that lossless transmission lines (with \$R'=G'=0\$) are unaffected by this frequency-dependent effect? (See the numeric sketch after this list.)
  2. According to these slides, higher signal frequencies "tend to radiate better." What exactly does this mean? Is this related to the damping that the resulting EM wave experiences during its propagation?
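
To put numbers on point 1: the damping per unit length is actually captured by the real part of the propagation constant \$\gamma=\sqrt{(R'+j\omega L')(G'+j\omega C')}\$, not of the characteristic impedance. Below is a minimal numeric sketch; the per-metre \$R'L'G'C'\$ values are illustrative assumptions for a generic twisted pair (not measured cable data), with \$R'\$ growing as \$\sqrt{f}\$ to mimic the skin effect and \$G'\$ growing linearly with \$f\$ to mimic dielectric loss:

```python
import cmath
import math

# Illustrative per-metre parameters for a generic twisted pair.
# These are assumptions for the sketch, not measured cable data.
L = 500e-9        # series inductance, H/m
C = 50e-12        # shunt capacitance, F/m
R0 = 0.2          # series resistance at 1 MHz, ohm/m
TAN_DELTA = 2e-4  # assumed dielectric loss tangent (polyethylene-like)

def attenuation_db_per_100m(f):
    """Attenuation from Re(gamma), with gamma = sqrt((R'+jwL')(G'+jwC'))."""
    w = 2 * math.pi * f
    R = R0 * math.sqrt(f / 1e6)  # skin effect: R' grows as sqrt(f)
    G = w * C * TAN_DELTA        # dielectric loss: G' grows linearly with f
    gamma = cmath.sqrt((R + 1j * w * L) * (G + 1j * w * C))
    return gamma.real * 8.686 * 100  # nepers/m -> dB per 100 m

for f in (1e6, 100e6, 10e9):
    print(f"{f/1e6:>8.0f} MHz: {attenuation_db_per_100m(f):8.2f} dB / 100 m")
```

Even with these optimistic numbers, the attenuation climbs from roughly 1 dB per 100 m at 1 MHz to well over 100 dB per 100 m at 10 GHz. Note also that with \$R'=G'=0\$ the attenuation constant is indeed zero at every frequency; it is precisely the loss terms that bring the frequency dependence in.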

Best Answer

The skin effect in the metal conductors contributes losses. This loss mechanism can be compensated to some extent by using large-diameter conductors (which can be hollow for weight savings).
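
To get a feel for the numbers: the skin depth in a good conductor is \$\delta=\sqrt{\rho/(\pi f \mu)}\$. A minimal sketch for copper:

```python
import math

RHO_CU = 1.68e-8       # resistivity of copper, ohm*m
MU_0 = 4e-7 * math.pi  # permeability of free space, H/m

def skin_depth_m(f):
    """Skin depth delta = sqrt(rho / (pi * f * mu)) in metres."""
    return math.sqrt(RHO_CU / (math.pi * f * MU_0))

for f in (1e6, 1e9, 100e9):
    print(f"{f/1e9:>8.3f} GHz: skin depth = {skin_depth_m(f)*1e6:7.3f} um")
```

Since the current is squeezed into roughly one skin depth below the surface (about \$65\,\mu\text{m}\$ at \$1\text{ MHz}\$, about \$0.2\,\mu\text{m}\$ at \$100\text{ GHz}\$), the effective AC resistance of a round conductor grows roughly as \$\sqrt{f}\$, which is why a large (possibly hollow) conductor helps.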

Most of the loss, however, is caused by the dielectric material between the conductors. All practical dielectrics that can be used with coax, for example, show increased loss at high frequencies. This loss mechanism can be compensated somewhat by using dielectrics that have voids in them, with the goal of approximating an air dielectric.

Why do dielectrics have increased loss at higher frequencies? At the molecular level, the cause is the repetitive motion of polar molecules. When exposed to an electric field, the molecules in these dielectrics rotate to align their dipoles with the field. Because the field alternates, these rotations occur at the same frequency as the signal in the cable. As the frequency increases, the molecules rotate faster and faster, generating more and more heat in the dielectric. The power required to drive this rotation comes at the expense of the signal propagating down the line.
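
For a TEM line, this dipolar loss shows up as an attenuation that grows linearly with frequency, \$\alpha_d=\pi f\sqrt{\epsilon_r}\tan\delta/c\$ (in Np/m). A minimal sketch comparing a solid dielectric with a foamed one; the \$\epsilon_r\$ and \$\tan\delta\$ values are ballpark assumptions, not datasheet numbers:

```python
import math

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def dielectric_loss_db_per_100m(f, eps_r, tan_delta):
    """TEM-line dielectric attenuation: alpha = pi*f*sqrt(eps_r)*tan_delta/c."""
    alpha_np_per_m = math.pi * f * math.sqrt(eps_r) * tan_delta / C0
    return alpha_np_per_m * 8.686 * 100  # nepers/m -> dB per 100 m

# Assumed ballpark material values: (eps_r, tan_delta)
materials = {
    "solid PE ": (2.3, 3e-4),
    "foamed PE": (1.5, 1e-4),
}

for f in (100e6, 1e9, 10e9):
    for name, (eps_r, tand) in materials.items():
        loss = dielectric_loss_db_per_100m(f, eps_r, tand)
        print(f"{f/1e9:5.1f} GHz  {name}: {loss:7.2f} dB / 100 m")
```

The linear growth with \$f\$ (versus roughly \$\sqrt{f}\$ for the skin effect) is why the dielectric ends up dominating the loss budget at high frequencies, and why the foamed, mostly-air dielectric fares better on both \$\epsilon_r\$ and \$\tan\delta\$.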

Incidentally, waveguides can also be used for RF transmission. For normal waveguides, air is the dielectric. In some cases, waveguides are actually evacuated so that there is no dielectric (other than the vacuum inside the sealed waveguide). This allows transmission of high power RF with low losses (for example for radar on naval vessels).

Technically, coax is also a waveguide. But what I mean above is a hollow metal waveguide carrying higher-order (non-TEM) modes, like this one: https://www.fairviewmicrowave.com/popup.aspx?src=/images/Product/large/SMF112S-12.jpg
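
As a side note on sizing: a hollow rectangular waveguide only propagates above the cutoff frequency of its dominant \$\text{TE}_{10}\$ mode, \$f_c=c/(2a)\$, where \$a\$ is the broad-wall width. A minimal sketch, assuming the linked part is a WR-112 waveguide with \$a\approx 28.5\text{ mm}\$:

```python
# Cutoff of the dominant TE10 mode of an air-filled rectangular waveguide.
C0 = 299_792_458.0  # speed of light, m/s

def te10_cutoff_hz(a_m):
    """f_c = c / (2a) for the TE10 mode, broad-wall width a in metres."""
    return C0 / (2 * a_m)

a = 28.5e-3  # assumed WR-112 broad-wall width, m
print(f"TE10 cutoff: {te10_cutoff_hz(a)/1e9:.2f} GHz")  # ~5.26 GHz
```

Below cutoff the mode is evanescent and does not propagate, which is why each waveguide size covers only a band of frequencies.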
