Electronic – RC time constant and diode detector

Tags: demodulation, modulation

For good detection of the modulated signal by the diode detector shown below, one requirement is that the time constant of the RC filter satisfies:

$$ \frac{1}{\omega_{c}} \leq RC \leq \frac{ \sqrt{1-\mu^{2}}}{\omega_{m}\mu} $$

where:

  • \$\omega_c\$ is the angular frequency of the carrier
  • \$\omega_m\$ is the angular frequency of the modulating (information) signal, and
  • \$\mu\$ is the modulation index.

How exactly can we prove that?

[Circuit diagram: AM diode detector, with a diode feeding a parallel RC filter]

Best Answer

The formula is derived from practical experience rather than from mathematical first principles. It isn't provable in a strict sense; it is best justified by being practical and thinking about what a diode detector has to achieve.

Firstly, the formula states that RC has to be equal to or greater than \$\dfrac{1}{\omega_c}\$.
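To get a feel for the numbers in the inequality, here is a quick sketch. The component values are my own assumed example (a 1 MHz AM broadcast carrier, a 5 kHz modulating tone and \$\mu = 0.5\$), not taken from the question:

```python
import math

# Assumed example values, not from the question:
f_c = 1e6    # carrier frequency, Hz (typical AM broadcast)
f_m = 5e3    # modulating (information) frequency, Hz
mu  = 0.5    # modulation index

w_c = 2 * math.pi * f_c   # carrier angular frequency, rad/s
w_m = 2 * math.pi * f_m   # modulation angular frequency, rad/s

rc_min = 1 / w_c                             # lower limit on RC
rc_max = math.sqrt(1 - mu**2) / (w_m * mu)   # upper limit on RC

print(f"RC must lie between {rc_min:.3e} s and {rc_max:.3e} s")
```

With these values the window spans roughly 0.16 µs to 55 µs, so there is plenty of room to pick an RC that satisfies both ends.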

If the RC time constant were too short, there would be significant ripple at the carrier frequency on the output. That is not what is wanted from a diode detector (or from an AC rectifier in a power supply). However, the filter is never going to be a perfect brick wall, so some degree of carrier ripple has to be accepted.

Personally, I would like to see the RC time constant 5 times greater than \$\dfrac{1}{\omega_c}\$.
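A crude way to see why a larger RC helps with ripple: if the diode conducts only briefly at each carrier peak, the capacitor discharges exponentially for roughly one carrier period \$T_c = 2\pi/\omega_c\$ between peaks. The sketch below (my own simplification; a real detector recharges near each peak, so actual ripple is smaller) compares the fraction of the peak voltage remaining after one carrier period for RC equal to 1, 5 and 50 times \$\dfrac{1}{\omega_c}\$:

```python
import math

def fraction_remaining(k):
    """Fraction of the peak voltage left after one carrier period
    T_c = 2*pi/w_c, assuming RC = k/w_c and pure exponential decay."""
    return math.exp(-2 * math.pi / k)

for k in (1, 5, 50):
    remaining = fraction_remaining(k)
    print(f"RC = {k:>2}/w_c -> {remaining:.1%} of the peak remains")
```

At RC = 1/ωc the capacitor has almost fully discharged before the next peak arrives, which is why treating the lower limit as a bare minimum (and aiming well above it) is sensible.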

At the other end of the scale, RC cannot be too big or it will start to significantly attenuate the high frequencies in the "detected" analogue waveform, whose timescale is represented by \$\dfrac{1}{\omega_m}\$.
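One crude model of that attenuation: while the diode is off, the RC network behaves like a first-order low-pass filter, with gain \$1/\sqrt{1+(\omega_m RC)^2}\$ at the modulation frequency. As a sketch (same assumed example values as before, \$f_m\$ = 5 kHz and \$\mu = 0.5\$; this simplified model is mine, not from the answer), note that at the upper limit \$RC = \sqrt{1-\mu^2}/(\omega_m\mu)\$ the model's gain works out to exactly \$\mu\$:

```python
import math

f_m = 5e3    # assumed modulating frequency, Hz
mu  = 0.5    # assumed modulation index
w_m = 2 * math.pi * f_m

rc_max = math.sqrt(1 - mu**2) / (w_m * mu)   # upper limit from the formula

def lowpass_gain(rc):
    """Gain of a first-order RC low-pass at the modulation frequency."""
    return 1 / math.sqrt(1 + (w_m * rc) ** 2)

for rc in (0.1 * rc_max, rc_max, 10 * rc_max):
    print(f"RC = {rc:.2e} s -> gain at f_m = {lowpass_gain(rc):.3f}")
```

The identity holds for any \$\mu\$: substituting \$\omega_m RC = \sqrt{1-\mu^2}/\mu\$ gives \$1 + (\omega_m RC)^2 = 1/\mu^2\$, so the gain is \$\mu\$. Well below the limit the modulation passes almost unattenuated; well above it the detected audio is badly rolled off.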

Here is a picture that hopefully explains:

[Diagram: detector output versus time, showing the RC discharge failing to follow the falling envelope when RC is too long]

This picture was taken from here. It is basically saying that if the modulation index is too high for the chosen value of RC, there will come a point during detection where the RC time constant is too long for the output to follow the falling envelope.

You should also note that as the modulation index approaches 1, the RC time constant theoretically has to be very small, and this is likely to clash with the requirement that it be significantly greater than \$\dfrac{1}{\omega_c}\$.
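That clash can be seen numerically. Here is a sketch (same assumed 1 MHz carrier / 5 kHz modulation example as above) of how the upper limit of the RC window collapses towards the lower limit as \$\mu\$ approaches 1:

```python
import math

f_c, f_m = 1e6, 5e3          # assumed carrier / modulation frequencies, Hz
w_c = 2 * math.pi * f_c
w_m = 2 * math.pi * f_m
rc_min = 1 / w_c             # lower limit, fixed by the carrier

for mu in (0.5, 0.9, 0.99, 0.9999):
    rc_max = math.sqrt(1 - mu**2) / (w_m * mu)   # upper limit shrinks with mu
    print(f"mu = {mu:<6} -> RC window: {rc_min:.2e} s .. {rc_max:.2e} s "
          f"(ratio {rc_max / rc_min:.0f}x)")
```

At \$\mu = 0.5\$ the window spans more than two decades; by \$\mu = 0.9999\$ the upper limit is only about 3 times the lower limit, leaving essentially no usable design margin.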