Electronic – Noise in crystal detectors (historical)

detector, diodes

I'm reading over the history of British WWII radar systems, and one thing that really struck me was the improvement in crystal detectors. They were a vital component of microwave radars, and while the Germans had a magnetron literally fall into their hands in 1943, the lack of a useful receiver meant they were still struggling to build a working system when the war ended.

One point is interesting to me. While Crystal Fire recounts the development of diodes, mostly at UChicago IIRC, the British already had military-grade versions in 1940. The difference appears to be the noise level. The British version used in the Type 271 radar had 20 dB of noise, while the US model in the Type 277 was 14 dB.

Can someone familiar with these early devices explain where the noise was coming from, and how it was reduced?

And for comparison, can anyone offer a figure for the end-to-end noise in a modern radar system? I see that systems in the 1960s ran at tens of megawatts, while today they are replaced by sets using as little as 25 kW, and I suspect that is due largely to receiver improvements. Is that correct?

Best Answer

Long pulses and intra-pulse modulation are also your friend. Instead of a 4 µs pulse from the magnetron, you can transmit much longer pulses at much lower peak power, then down-convert and digitize the in-phase and quadrature (I/Q) channels in the receiver. Once digitized, do whatever signal processing (correlation, despreading, bandwidth reduction) you need.

Low Probability (of) Intercept airborne radars, on fighters and bombers, use this approach.
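To make the correlation step above concrete, here is a minimal sketch of pulse compression with a linear-FM (chirp) pulse: a long, low-power pulse is transmitted, and the receiver correlates its digitized I/Q samples against a replica, concentrating the echo energy into a narrow peak. All the parameter values (sample rate, pulse length, bandwidth, delay) are illustrative assumptions, not figures from any actual radar.

```python
import numpy as np

fs = 1e6            # sample rate, Hz (assumed)
T = 100e-6          # 100 us pulse instead of a short high-power one
B = 200e3           # swept bandwidth, Hz

# Complex (I/Q) linear-FM "chirp" reference pulse.
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)

# Simulate a weak echo delayed by 250 samples, buried in noise.
rng = np.random.default_rng(0)
delay = 250
rx = np.zeros(1024, dtype=complex)
rx[delay:delay + chirp.size] += 0.1 * chirp
rx += 0.01 * (rng.standard_normal(rx.size) + 1j * rng.standard_normal(rx.size))

# Matched filter: correlate the received samples with the replica
# (np.correlate conjugates the second argument for complex inputs).
mf = np.abs(np.correlate(rx, chirp, mode="valid"))
print(int(np.argmax(mf)))   # estimated echo delay, in samples
```

The time-bandwidth product here is B*T = 20, so the compressed peak is about 13 dB above what the same echo would give with an unmodulated pulse of equal peak power, which is the trade the answer describes: pulse length and processing gain substitute for transmitter power.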
