We all know the Faraday cage effect: light waves get through the screen on the front of your microwave oven, because their wavelength is much less than the size of the holes, whereas
the microwaves don't get out, because their wavelength is much greater. Yet despite many hours of searching and dozens of discussions with colleagues, I've been unable to
find an analysis of the mathematics of this effect.
Presumably there's a simple argument that shows some kind of exponential attenuation depending on the ratio of wavelength to hole size. Can anyone point me to the literature on this subject?
Best Answer
The following forum post might be interesting for you: Mathematical derivation of the Faraday cage from the Maxwell Equations.
In particular, see post #4 from Astronuc (please have a look at the link above for the complete citation):
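As a supplement, the exponential attenuation the question asks about can be sketched with the standard "waveguide below cutoff" heuristic: each hole is treated as a short waveguide whose cutoff wavelength is proportional to the hole size (for a circular hole of diameter d, the dominant TE11 mode has cutoff wavelength of roughly 1.706 d). For wavelengths above cutoff the fields inside the hole are evanescent and decay like exp(-γz) with γ = √(k_c² − k²). This is a simplified sketch, not a full shielding-effectiveness calculation (it ignores reflection at the screen and coupling between holes); the function name and the choice of a single circular-hole mode are my own illustration, not from the cited forum post.

```python
import math

def attenuation_db(wavelength, hole_diameter, screen_thickness):
    """Approximate attenuation (in dB) of a wave passing through a single
    circular hole, modeled as a waveguide below cutoff.

    All lengths in meters. Returns 0.0 if the wave is above cutoff
    (i.e., it propagates through the hole essentially unattenuated,
    as visible light does through a microwave-oven screen).
    """
    # Cutoff wavelength of the dominant TE11 mode of a circular waveguide.
    cutoff_wavelength = 1.706 * hole_diameter
    k = 2 * math.pi / wavelength          # free-space wavenumber
    k_c = 2 * math.pi / cutoff_wavelength # cutoff wavenumber
    if k >= k_c:
        return 0.0  # above cutoff: propagating mode, no evanescent decay
    # Evanescent decay constant; field falls off as exp(-gamma * z).
    gamma = math.sqrt(k_c**2 - k**2)
    # Convert the field ratio exp(gamma * t) to decibels (20 log10 e = 8.686).
    return 8.686 * gamma * screen_thickness

# Microwave oven example: 2.45 GHz (wavelength ~12.2 cm) vs. visible light
# (~500 nm), through 2 mm holes in a 0.5 mm thick screen.
microwave_db = attenuation_db(0.122, 0.002, 0.0005)
light_db = attenuation_db(500e-9, 0.002, 0.0005)
```

Note that for wavelengths far above cutoff, γ ≈ k_c = 2π/λ_c, so the attenuation per unit depth depends almost entirely on the hole size, not the wavelength; this is why the screen blocks microwaves over a broad band. The sketch also shows the qualitative point in the question: the same screen gives essentially zero attenuation for visible light.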