Electrical – What matters when choosing an ADC for digitization, the RMS or peak value of the analog signal?

adc, analog, digital-logic, rms, voltage

I have a 23-bit ADC with a voltage reference of 1.4 V.
My input analog signal is AC at 50 Hz, with an RMS value ranging from 0.2 Vrms to 1.3 Vrms, so its peak voltage varies from about 0.28 V to 1.838 V.
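(A minimal Python sketch of that sqrt(2) conversion, just to show the arithmetic I used; the variable names are only illustrative.)

```python
import math

V_RMS_MIN = 0.2   # lowest expected input, Vrms (from the question)
V_RMS_MAX = 1.3   # highest expected input, Vrms
VREF = 1.4        # ADC reference voltage, volts

# For a sine wave, peak = sqrt(2) * RMS.
v_peak_min = math.sqrt(2) * V_RMS_MIN   # ~0.283 V
v_peak_max = math.sqrt(2) * V_RMS_MAX   # ~1.838 V

print(f"Peak range: {v_peak_min:.3f} V to {v_peak_max:.3f} V (Vref = {VREF} V)")
```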

Now, what does the ADC's maximum digitizable value (1.4 V) actually mean?
Should I compare the ADC's range against my signal's RMS values or its peak values to decide whether this ADC will work for my inputs? If the RMS value is what matters, it would probably work; but if the ADC is concerned with the maximum peak of the incoming analog signal, then inputs with peaks above 1.4 V won't be digitized correctly.

Please guide me: when choosing an ADC, is it the RMS or the peak of the analog signal that matters?

And if it is the peak, what happens when the analog value exceeds the ADC's specified maximum? Will all values higher than 1.4 V be treated as the same digital value?

Thanks in advance.

Best Answer

A sine wave of 1.3 volts RMS has a peak-to-peak value of 2 × \$\sqrt2\$ × 1.3 volts, i.e. 3.677 volts p-p. Not only does your ADC have to accommodate that whole range, you also need to DC bias the AC signal so that when the AC signal is at 0 volts, the DC level sits at 3.677 volts / 2, i.e. 1.838 volts.
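As a quick check of those numbers, here is a small Python sketch (variable names are illustrative) computing the peak-to-peak span and the mid-point bias for the worst-case 1.3 Vrms input:

```python
import math

V_RMS_MAX = 1.3                       # worst-case sine input, Vrms
VREF = 1.4                            # ADC reference from the question, volts

v_pp = 2 * math.sqrt(2) * V_RMS_MAX   # peak-to-peak span: ~3.677 V
v_bias = v_pp / 2                     # DC bias to centre the sine: ~1.838 V

print(f"Peak-to-peak: {v_pp:.3f} V, required DC bias: {v_bias:.3f} V")
print(f"ADC input span: 0 V to {VREF} V")
```

Comparing the two printed figures makes it obvious how much range the 3.677 V p-p signal needs relative to the 1.4 V input span.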

If you don't do this you will receive a clipped digital version of your input analogue waveform and you may also damage your ADC.
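To see what "clipped" means at the digital output, here is an idealized sketch (my own model, not any particular part's behaviour; how a real ADC responds to over-range inputs, including possible damage, is defined by its data sheet). It quantizes an un-biased 1.3 Vrms sine against a 0 V to 1.4 V input span, using the 23-bit figure from the question:

```python
import math

VREF = 1.4
N_BITS = 23
FULL_SCALE = (1 << N_BITS) - 1        # highest code of an ideal unipolar ADC

def ideal_adc(v_in):
    """Idealized unipolar ADC: out-of-range inputs saturate at 0 or full scale."""
    code = round(v_in / VREF * FULL_SCALE)
    return max(0, min(FULL_SCALE, code))

# Un-biased 1.3 Vrms, 50 Hz sine: every sample below 0 V reads code 0 and
# every sample above 1.4 V reads the same full-scale code, i.e. clipping.
for i in range(8):
    t = i / (8 * 50)                                  # 8 samples over one 50 Hz cycle
    v = 1.3 * math.sqrt(2) * math.sin(2 * math.pi * 50 * t)
    print(f"{v:+.3f} V -> code {ideal_adc(v)}")
```

In this idealized model, every sample above 1.4 V maps to the same full-scale code, which is the clipping described above.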

You should also read any prospective ADC's data sheet and understand the effects of gain and zero-offset errors, and take these into account so that you don't get digital clipping.
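As a rough illustration of that last point (a toy error model with made-up numbers; real parts specify these errors differently, so rely on the actual data sheet), a small positive gain error shrinks the input window that converts without digital clipping:

```python
VREF = 1.4
GAIN_ERROR = 1.01        # hypothetical +1 % gain error
OFFSET_ERROR_V = 0.005   # hypothetical +5 mV zero-offset error

# Toy model: the converter behaves as if it saw GAIN_ERROR * v_in + OFFSET_ERROR_V.
# Inputs must stay inside this window to avoid clipping at either end of the scale.
v_low = (0 - OFFSET_ERROR_V) / GAIN_ERROR
v_high = (VREF - OFFSET_ERROR_V) / GAIN_ERROR
print(f"Usable input window: {v_low:.3f} V to {v_high:.3f} V")
```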