Electrical – Electric field measured in the far field of an antenna

antenna, electromagnetic

I came across a question:

The electric field measured in the far field of an antenna at a distance r = 50 m is 1 V/m. Find the electric field at a distance of 500 m from the antenna.

The solution starts with: E is proportional to 1/r.

I know E = V/r. Is it assumed that the potential is the same at 500 m, so that E is proportional to 1/r? I thought E is proportional to 1/r^2, as per Coulomb's formula, if the charge is assumed constant. Where did I go wrong?

Best Answer

Far field = 1/r. Does it suggest that the potential is equal everywhere in the far field?

No, it does not. The presence of an electric field implies that there is a potential difference; in other words, no potential difference (i.e. a constant potential) would imply no electric field. You are wrong in assuming \$E = \frac{V}{r}\$ for arbitrary r. Instead, $$E = -\frac{dV}{dr},$$ so \$E \propto \frac{1}{r}\$ implies a logarithmic potential (i.e. a logarithmic dependence on r) rather than a constant potential, as you suggest.
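To make the logarithmic dependence explicit, you can integrate the \$\frac{1}{r}\$ field (here \$E_0\$ and \$r_0\$ are just reference values introduced for illustration, not quantities from the question): $$V(r) = -\int E\,dr = -\int \frac{E_0 r_0}{r}\,dr = -E_0 r_0 \ln r + \text{const},$$ which clearly varies with r, so the potential is not constant in the far field.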
If you want to know why \$E \propto \frac{1}{r}\$, imagine a sphere of radius r centered at the source. The power leaving the sphere is $$P_{rad} = (4\pi r^2)P_d,$$ where \$P_d\$ is the power density of the field. Since the same total radiated power must flow through every such sphere regardless of r, \$P_d \propto \frac{1}{r^2}\$. And since the power density is proportional to the square of the electric field magnitude, the electric field falls off inversely with distance: $$P_d \propto E^2 \propto \frac{1}{r^2} \implies E \propto \frac{1}{r}$$
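Applied to the numbers in the question (assuming both distances lie in the far field): $$E(500\,\text{m}) = E(50\,\text{m})\cdot\frac{50}{500} = 1\ \text{V/m}\times\frac{1}{10} = 0.1\ \text{V/m}$$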

In general, the power density will be of the form $$P_d = \frac{C_1}{r^2} + \frac{C_2}{r^3} + \frac{C_3}{r^4}+\dots$$ For large r only the radiated term (the first term) matters, as the others are negligible; for small r the radiated term is small and the field is dominated by the remaining terms, which constitute the near field.
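A minimal numerical sketch of that last point, with made-up coefficients \$C_1, C_2, C_3\$ (their actual values depend on the particular antenna, so the ones below are purely illustrative), showing how the radiated term dominates at large r:

```python
# Illustrative only: hypothetical coefficients, not taken from any real antenna.
C1, C2, C3 = 1.0, 1.0, 1.0

def power_density(r):
    """Truncated expansion P_d = C1/r^2 + C2/r^3 + C3/r^4."""
    return C1 / r**2 + C2 / r**3 + C3 / r**4

for r in (0.1, 1.0, 50.0, 500.0):
    # Fraction of the total power density contributed by the radiated (1/r^2) term.
    radiated_fraction = (C1 / r**2) / power_density(r)
    print(f"r = {r:6.1f} m -> radiated term is {radiated_fraction:.4%} of P_d")
```

At r = 0.1 m the radiated term is a tiny fraction of the total (near field), while at 50 m and 500 m it accounts for essentially all of it, which is why the simple \$E \propto \frac{1}{r}\$ rule applies in the question.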