How to show field strength varies as 1/r


This is a question from a radio amateur that should have a straightforward answer, but I cannot find one on the Web.

Is there a robust but simple explanation, not needing advanced maths, as to why the field strength in the far field falls off as \$1/r\$ and not \$1/r^2\$?

Best Answer

Picture a transmitting antenna like a lightbulb. All the power emitted by the bulb can be thought of as passing through a sphere at any particular distance. The surface area of that sphere grows with the square of the distance, so the illumination hitting a piece of paper on that sphere goes down with \$1/r^2\$.
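To make that concrete, here is the power-density form of the argument for an ideal isotropic radiator of total power \$P_t\$ (the symbols here are illustrative, not from the answer above):

\$\$ S(r) = \frac{P_t}{4\pi r^2} \$\$

where \$S\$ is the power flux density in W/m² on a sphere of radius \$r\$ centred on the antenna. Doubling \$r\$ quadruples the sphere's area \$4\pi r^2\$, so \$S\$ drops to a quarter.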

However, that was power. Field strength is like voltage, in that power is proportional to the square of the field strength. If the square of the field strength falls off as \$1/r^2\$, then the field strength itself must be falling off as \$1/r\$.
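As a sketch of that last step, one can use the standard far-field relation between flux density and (RMS) electric field strength, with \$\eta_0 \approx 377\ \Omega\$ the impedance of free space (this relation is not stated in the answer above, but it makes the dependence explicit):

\$\$ S = \frac{E^2}{\eta_0} \quad\Rightarrow\quad E(r) = \sqrt{\eta_0\, S(r)} = \sqrt{\frac{\eta_0 P_t}{4\pi}}\;\frac{1}{r} \$\$

so the \$1/r^2\$ law for power density becomes a \$1/r\$ law for field strength.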