Electronics – Phase Noise (dBc/Hz) and Jitter


As Wikipedia explains, jitter is the undesired deviation in the periodicity of a clock, and phase noise is the random fluctuation in the phase of a waveform caused by jitter. Digging deeper into these terms, jitter's units of measure (femtoseconds or nanoseconds) make the deviation in periodicity easy to understand.

But when I look at the unit of measure for phase noise, it is dBc/Hz.

Please explain the insight behind dBc/Hz and how it is correlated with jitter.

Best Answer

Jitter can be thought of as phase/frequency modulation of the signal. In the frequency domain, modulation creates sidebands around the carrier. Thus, if you observe a jittered signal on a spectrum analyzer, you'll see the carrier frequency along with upper and lower sideband components produced by the jitter. It's important to remember that jitter occurs at different rates (frequencies). You can think of this as "how slowly or rapidly is the edge being moved from its ideal location in time?". Different jitter 'frequencies' will result in frequency-domain sidebands at different offset frequencies (offsets from the carrier frequency).
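This sideband picture is easy to demonstrate numerically. The sketch below (all numbers are illustrative, not from the question) phase-modulates a 100 kHz carrier at a 10 kHz "jitter rate" and inspects the FFT; for small peak phase deviation beta, narrowband-FM theory predicts sidebands at fc ± fm sitting roughly 20·log10(beta/2) below the carrier.

```python
import numpy as np

# A jittered clock modeled as phase modulation: a 100 kHz carrier whose
# phase wobbles at a 10 kHz rate (illustrative numbers only).
fs = 1_000_000          # sample rate, Hz
fc = 100_000            # carrier frequency, Hz
fm = 10_000             # jitter (modulation) rate, Hz
beta = 0.05             # peak phase deviation, radians (narrowband FM)

n = 10_000                      # 10 ms of signal; fc and fm land on exact FFT bins
t = np.arange(n) / fs
x = np.sin(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))

spec = np.abs(np.fft.rfft(x)) / n
freqs = np.fft.rfftfreq(n, 1 / fs)

def level_at(f_hz):
    """Spectral magnitude at the bin closest to f_hz."""
    return spec[np.argmin(np.abs(freqs - f_hz))]

carrier = level_at(fc)
# Each sideband is offset from the carrier by exactly the jitter rate fm,
# at a level of roughly beta/2 relative to the carrier.
lower_dbc = 20 * np.log10(level_at(fc - fm) / carrier)
upper_dbc = 20 * np.log10(level_at(fc + fm) / carrier)
print(f"sidebands at fc +/- {fm} Hz: {lower_dbc:.1f} / {upper_dbc:.1f} dBc")
```

Raising `beta` (more jitter) raises the sidebands; changing `fm` moves them to a different offset, which is exactly why phase noise is specified as a function of offset frequency.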

Jitter is often spec'd in the time domain. Phase noise is spec'd as the spot magnitude of the sideband at a specific offset frequency. For example, you might see a spec like -90 dBc/Hz at 10 kHz offset. This says that the sideband level is 90 dB down from the carrier magnitude when measured 10 kHz away from the carrier frequency, normalized to a 1 Hz measurement bandwidth.
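The "per Hz" normalization is just a bandwidth correction: to predict what an analyzer would actually display, you add 10·log10(RBW) to the spot level. A minimal sketch (the helper name and the 100 Hz RBW are my own choices, assuming L(f) is flat across the measurement bandwidth):

```python
import math

def sideband_dbc(l_dbc_hz, rbw_hz):
    """Convert a spot phase-noise level L(f) in dBc/Hz to the sideband
    power seen through an analyzer resolution bandwidth of rbw_hz,
    assuming L(f) is flat across that bandwidth."""
    return l_dbc_hz + 10.0 * math.log10(rbw_hz)

# The -90 dBc/Hz example viewed through a 100 Hz RBW:
# 10*log10(100) = 20 dB is added back onto the 1 Hz-normalized value.
level = sideband_dbc(-90.0, 100.0)
print(level, "dBc")  # -70.0 dBc
```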

So, you can't directly compare a dBc/Hz level with a jitter spec. The only way to relate them is to consider the complete phase noise characteristic in the frequency domain, and integrate the total amount of power in the phase noise sidebands. It is then possible to relate this total integrated phase noise power to the jitter value.
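That integration can be sketched in a few lines. The standard conversion is: linearize L(f), integrate over the offset range, double it for both sidebands to get the phase-error power in rad², take the square root, and divide by 2π·f_carrier to land back in seconds. The 100 MHz carrier and flat -120 dBc/Hz floor below are hypothetical example values, not from the answer.

```python
import numpy as np

def rms_jitter_seconds(offsets_hz, l_dbc_hz, carrier_hz):
    """Integrate a single-sideband phase-noise curve L(f), given in
    dBc/Hz at the supplied offset frequencies, and convert to RMS jitter.

    The factor of 2 accounts for both sidebands; dividing the RMS phase
    error (radians) by 2*pi*f_carrier converts radians to seconds.
    """
    l_lin = 10.0 ** (np.asarray(l_dbc_hz, dtype=float) / 10.0)   # dBc/Hz -> linear per Hz
    df = np.diff(np.asarray(offsets_hz, dtype=float))
    # Trapezoidal integration over the (typically log-spaced) offsets
    phase_power = 2.0 * np.sum(0.5 * (l_lin[1:] + l_lin[:-1]) * df)   # rad^2
    return np.sqrt(phase_power) / (2.0 * np.pi * carrier_hz)

# Hypothetical example: a 100 MHz clock with a flat -120 dBc/Hz floor,
# integrated from 1 kHz to 10 MHz offset.
f = np.logspace(3, 7, 500)
jit = rms_jitter_seconds(f, np.full_like(f, -120.0), 100e6)
print(f"RMS jitter ~= {jit * 1e12:.1f} ps")
```

Note that the result depends on the chosen integration limits, which is why jitter specs derived from phase noise always state the offset band (e.g. "12 kHz to 20 MHz").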