Electronic – Definition of signal bandwidth


From my understanding, the bandwidth of an input signal to an ADC is defined as the range of frequencies the signal consists of, calculated as the difference between the highest and lowest frequency within the signal band. So far, so good. But what about a time-to-digital converter (TDC)? How can one define the bandwidth there?

A TDC takes two inputs, usually called the Start and Stop signals, and measures the time difference between them. For example, the input stream might look like this:

[Figure: example timing diagram of Start and Stop input pulses]
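To make the Start/Stop picture concrete, here is a minimal sketch of an ideal counter-based TDC in Python (the 1 ns reference clock and the `tdc_measure` helper are my own illustration, not from any particular paper). It just counts whole reference-clock periods between the two edges:

    # Minimal sketch of an ideal counter-based TDC (illustrative only).
    # It quantizes the Start-to-Stop interval to whole periods of a
    # reference clock, as a basic coarse TDC would.

    def tdc_measure(t_start, t_stop, t_ref=1e-9):
        """Return the Start-to-Stop interval quantized to t_ref (1 ns here)."""
        counts = int((t_stop - t_start) // t_ref)  # counter value at Stop
        return counts * t_ref                      # quantized time difference

    # A stream of Start/Stop pairs like the one sketched above:
    pairs = [(0.0, 3.7e-9), (10e-9, 12.2e-9), (20e-9, 28.9e-9)]
    for start, stop in pairs:
        print(f"true = {stop - start:.2e} s, measured = {tdc_measure(start, stop):.2e} s")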

What does input signal bandwidth mean here?

EDIT

What confuses me is this Sigma-Delta TDC paper, for example. It says:

In the MASH Sigma-Delta TDC, there is also a special relation between the OSR and full scale input range. The bandwidth of the input signal, BW, is set to 100 kHz in this design. The sampling clock of the TDC system is then equal to 2 · BW · OSR. Due to the input signal's timing nature, the peak-to-peak full scale input signal amplitude has to be smaller than one period of the sampling clock.

Now according to this, if OSR = 25, the sampling clock works out to 2 · 100 kHz · 25 = 5 MHz, which means the peak-to-peak full-scale input range has to be smaller than 1/(5 MHz) = 200 ns. So I'm not sure what the physical interpretation of BW is here.
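Spelling out that arithmetic (BW and OSR are the values from the quote; the variable names are mine):

    # Numbers from the quoted paper: BW = 100 kHz, OSR = 25.
    BW = 100e3            # input signal bandwidth, Hz
    OSR = 25              # oversampling ratio
    f_s = 2 * BW * OSR    # sampling clock of the TDC system
    T_s = 1 / f_s         # one sampling-clock period

    print(f"f_s = {f_s / 1e6:.1f} MHz")  # 5.0 MHz
    print(f"T_s = {T_s * 1e9:.0f} ns")   # 200 ns, the upper bound on the
                                         # peak-to-peak full-scale input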

In another paper I found a table comparing different architectures:
[Table: performance comparison of different TDC architectures]

The papers indexed 26 and 28 have the same 50 MS/s sampling frequency, but the BW in the former case is 10 times higher. The full-scale input range in both cases is more or less the same. Anyway, I cannot wrap my head around how, or in what manner, BW could limit system performance.

Best Answer

'Bandwidth' is one of those words that gets applied to many different things, so it doesn't have a concrete, context-free definition.

With a TDC, one context is the maximum and minimum frequencies that can be applied to the inputs, and still have the timing behave properly.

Another one is the reciprocal of the smallest time difference that can be resolved between the edges. Of course you need to consider the difference between resolution and accuracy here.
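As a rough illustration of that reciprocal relation (the 10 ps resolution is an assumed figure, purely for example):

    # If the smallest resolvable Start-to-Stop difference is, say, 10 ps,
    # the 'bandwidth' in this sense is simply its reciprocal.
    t_res = 10e-12          # assumed resolution, seconds
    bw = 1 / t_res          # 100 GHz
    print(f"BW ~ {bw / 1e9:.0f} GHz")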

It's also worth considering this context when thinking about metastability in latches: you need a bandwidth approaching infinity if you are going to resolve whether a data edge is before or after a clock edge by an infinitesimal amount. In fact, when designing a TDC, metastability in the input circuitry is a key concern.

Yet another context is the rate at which successive readings are taken, which may be the same as the input frequency, but need not be. This sets the bandwidth of phase modulation on the signal that can be detected. Even if there is no deliberate phase modulation, phase noise at large offsets will be aliased down to baseband.
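As a sketch of that last point: if phase readings are taken at rate f_r, a phase-modulation tone at an offset above f_r/2 folds down to the distance from the nearest multiple of f_r (the numbers below are made up):

    # How phase noise at a large offset aliases into baseband when
    # phase readings are taken at a finite rate f_r (values assumed).
    f_r = 1e6          # readout rate: one phase reading per microsecond
    f_offset = 3.3e6   # phase-modulation tone well above f_r / 2

    # Standard aliasing: the tone appears at the distance to the
    # nearest integer multiple of the readout rate.
    f_alias = abs(f_offset - round(f_offset / f_r) * f_r)
    print(f"{f_offset / 1e6:.1f} MHz offset shows up at {f_alias / 1e3:.0f} kHz")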
