Electronics – Radio frequency propagation: near-field/far-field

Tags: field, propagation, RF

What is the difference between near-field and far-field radio-frequency radiation?

Is it true that near-field radiation is caused by the energy dissipated at the end of the antenna connected to the oscillator, while far-field radiation and propagation are caused by the repeated, energy-shifting interaction between the electric field and the magnetic field?

Best Answer

I tend to think of it this way: free space (i.e., the "far field") has a certain inherent impedance that's determined by the relationship between its electrical permittivity, ε0, and its magnetic permeability, µ0. Together, they dictate an impedance of about 377 Ω, and this determines the magnitude relationship between the E-field and the M-field of an electromagnetic wave.
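For reference, that 377 Ω figure follows directly from the two constants:

$$Z_0 = \sqrt{\frac{\mu_0}{\varepsilon_0}} = \sqrt{\frac{4\pi \times 10^{-7}\ \mathrm{H/m}}{8.854 \times 10^{-12}\ \mathrm{F/m}}} \approx 376.7\ \Omega$$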

However, in the presence of conductors and dielectrics (including the antenna itself), which have very different values for either or both of these constants, the impedance changes, and the balance between E-field and M-field is different. The usual convention is to start considering these effects anywhere within about 1 wavelength of the objects.
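As a quick illustration of that rule of thumb, here's a minimal sketch of my own. Note that the one-wavelength boundary is just the convention described above; other conventions, such as λ/2π for the reactive near field or 2D²/λ for large antennas, are also in common use.

```python
# Rough near-field boundary using the ~1-wavelength rule of thumb above.
C = 299_792_458.0  # speed of light, m/s

def near_field_boundary_m(freq_hz: float) -> float:
    """Distance (m) within which near-field effects are assumed to matter."""
    return C / freq_hz  # one wavelength

for f in (1e6, 100e6, 2.4e9):
    print(f"{f/1e6:>8.1f} MHz -> boundary ~ {near_field_boundary_m(f):.3f} m")
```

At 1 MHz the boundary is about 300 m, while at 2.4 GHz it shrinks to roughly 12 cm, which is why near-field effects dominate very different spatial scales at different frequencies.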

With conductors in particular, you also need to take into account the current and voltage distributions in those conductors — whether driven by RF sources or not. The net effect at any given point will be the sum of all of the effects of each "quantum" of current. It's very difficult to get closed-form equations for anything but the simplest cases, so these problems are generally solved using numerical approximations.
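To make that superposition idea concrete, here is a toy sketch of my own (not how production field solvers work): it splits a wire into short segments, treats each one as an ideal Hertzian dipole observed broadside, and sums the complex contributions. Real method-of-moments tools such as NEC also solve for the current distribution and the segments' mutual coupling, which this sketch simply assumes.

```python
import numpy as np

ETA0 = 376.73      # impedance of free space, ohms
C = 299_792_458.0  # speed of light, m/s

def total_field(segments, obs, freq_hz):
    """Superpose the field contributions of short current segments at one
    observation point.

    Toy sketch only: each segment is modeled as an ideal Hertzian dipole
    observed broadside (sin(theta) = 1), and mutual coupling between
    segments is ignored. `segments` is a list of
    (position (3,) array, complex current I in A, length dl in m) tuples.
    """
    k = 2 * np.pi * freq_hz / C        # wavenumber, rad/m
    E = 0j
    for pos, I, dl in segments:
        r = np.linalg.norm(obs - pos)  # distance to observer, m
        # Hertzian-dipole far-field magnitude with propagation phase e^{-jkr}
        E += 1j * ETA0 * k * I * dl * np.exp(-1j * k * r) / (4 * np.pi * r)
    return E                           # complex E-field phasor, V/m

# Example: a half-wave dipole at 100 MHz split into 11 segments with an
# assumed sinusoidal current taper, observed 100 m away broadside.
f = 100e6
lam = C / f
z = np.linspace(-lam / 4, lam / 4, 11)
segments = [(np.array([0.0, 0.0, zi]),
             np.cos(np.pi * zi / (lam / 2)),  # assumed current distribution
             z[1] - z[0])
            for zi in z]
obs = np.array([100.0, 0.0, 0.0])
print(f"|E| at 100 m: {abs(total_field(segments, obs, f)):.4f} V/m")
```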

Does this help, or have I misunderstood the question?