What quantity changes when the signal is amplified without changing SNR in this case


This is a conceptual question, and I want to ask it via an example I have in mind, so I will neglect any other factors for the sake of simplicity. Let's assume an analog signal coming through a channel has 1V of variation and a noise level of 1mV.

And let's assume this analog signal will be sampled by a 10-bit ADC with a 10V input range.

We can couple this 1V signal directly into the ADC, or we can first amplify it by 10 to match the ADC's full input range.

If we assume the SNR remains the same before and after the amplification, what do we gain by matching the signal to the ADC range through amplification? What improves? Can you give some insight using my example?

Best Answer

I assume you mean a noise level of 1mV RMS.

A 10-bit ADC with a 10V input range will have quantization steps of roughly 10mV (10V/1024 ≈ 9.8mV). If you treat the quantization error as random noise (which may underestimate the effect), its RMS value is one step divided by √12, which works out to about 2.9mV. Since uncorrelated noise sources add in root-sum-square fashion, the total noise comes to about 3.1mV RMS. So your 1V signal, 1mV noise input will have a 60dB SNR going in to the ADC, and only around a 50dB SNR coming out.
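For reference, here is a minimal Python sketch of that arithmetic (the variable names are mine; the 10-bit, 10V, 1V, and 1mV figures are just the example values from the question):

```python
import math

bits = 10
full_scale = 10.0               # V, ADC input range from the example
q = full_scale / 2**bits        # quantization step, ~9.8 mV

sigma_q = q / math.sqrt(12)     # RMS quantization noise, ~2.9 mV
input_noise = 1e-3              # 1 mV RMS channel noise
signal = 1.0                    # 1 V signal level

# Uncorrelated noise sources combine as root-sum-square.
total_noise = math.hypot(input_noise, sigma_q)

snr_in = 20 * math.log10(signal / input_noise)    # ~60 dB
snr_out = 20 * math.log10(signal / total_noise)   # ~50 dB

print(f"step = {q*1e3:.2f} mV, quantization noise = {sigma_q*1e3:.2f} mV")
print(f"SNR in  = {snr_in:.1f} dB")
print(f"SNR out = {snr_out:.1f} dB")
```

Setting `signal = 10.0` and `input_noise = 10e-3` reproduces the amplified case discussed next.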

If, however, you amplify your signal to 10V signal + 10mV noise, then the total noise after quantization will be the equivalent of about 10.4mV RMS (√(10² + 2.9²) mV), for a 60dB SNR going in and roughly a 59.7dB SNR coming out. That third of a dB of degradation is practically negligible, unless you're locked in a specifications duel with a competing vendor.
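If you'd rather not trust the step/√12 model, here is a rough simulation sketch (my own construction, not from the original answer): it quantizes a 1V peak-to-peak test tone plus 1mV RMS Gaussian noise, with and without the ×10 gain, and measures the SNR at the ADC output. Note it reports SNR as an RMS ratio, so the absolute figures land a few dB below the amplitude-ratio numbers above; the roughly 9dB gap between the two couplings is the point.

```python
import numpy as np

rng = np.random.default_rng(0)
bits, full_scale = 10, 10.0
q = full_scale / 2**bits

t = np.arange(1_000_000)
tone = 0.5 * np.sin(2 * np.pi * 0.0003 * t)     # 1 V peak-to-peak test tone
noise = 1e-3 * rng.standard_normal(t.size)      # 1 mV RMS channel noise

def adc_snr_db(gain):
    x = gain * (tone + noise)
    # Ideal mid-tread quantizer over a bipolar +/- full_scale/2 range;
    # noise peaks may clip very slightly at full scale in the x10 case.
    codes = np.clip(np.round(x / q), -(2**bits) // 2, (2**bits) // 2 - 1)
    y = codes * q
    error = y - gain * tone                      # everything that isn't signal
    return 20 * np.log10(np.std(gain * tone) / np.std(error))

print(f"direct coupling: {adc_snr_db(1.0):.1f} dB")   # ~41-42 dB
print(f"with 10x gain:   {adc_snr_db(10.0):.1f} dB")  # ~50-51 dB
```

Either way you slice it, the amplified signal gives up only a fraction of a dB to quantization, while the directly coupled signal gives up nearly 10dB.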