Audio ADC voltage level SNR question

adc, audio, input, snr, voltage

I keep looking at audio-specific ADCs, and the majority of these devices run their analog circuitry at low voltages (5V, for example).

Let's say we want to process a higher-voltage audio input signal (say, 10Vrms). Then a signal conditioning circuit must be placed between the input and the ADC to reduce the input voltage.

Then, after the DAC, the signal must be restored to its nominal level (in our example, 10Vrms).
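For scale, a rough sketch of the numbers involved, assuming a hypothetical ADC whose analog input accepts at most 5Vpp:

```python
import math

# Hypothetical numbers: a 10 Vrms sine input, and an ADC whose analog
# input accepts at most 5 Vpp.
v_in_rms = 10.0
v_in_pp = 2 * math.sqrt(2) * v_in_rms  # ~28.3 Vpp for a sine wave
adc_full_scale_pp = 5.0

attenuation = adc_full_scale_pp / v_in_pp  # ~0.177, about -15 dB
restore_gain = 1 / attenuation             # ~5.66 to get back to 10 Vrms

print(f"attenuate by {attenuation:.3f} "
      f"({20 * math.log10(attenuation):.1f} dB), "
      f"then restore with a gain of {restore_gain:.2f}")
```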

  • Won't this gain reduction at the input and signal restoration at the output increase the overall SNR of the circuit?

  • Why do manufacturers use such a low voltage at the analog input of the ADC?

Best Answer

ADCs work at the voltage levels typical of the circuitry they must interface with.

You often have 5V or 3.3V as a power rail in digital circuitry. The analog sections of the ADC can also only work within the available power rails. If you've only got +5VDC as a rail, then the opamps and other analog parts of the ADC cannot work with anything above 5V or below ground.

To work with a 10V signal, you need a power rail of 10V for your analog circuitry. Generating that is a nuisance, and can introduce more noise into other parts of the system - getting 10V out of 5V involves a switching power supply, and they are notoriously noisy. Yes, you can clean up the noise. Better if you don't have it there to begin with.

As for "increasing the signal to noise ratio" - that doesn't happen. I think you meant to say "decrease the signal to noise ratio." There is a bit of confusion in the wording. If it were expressed as a fraction, then I think you'd understand better.

A signal to noise ratio of 20dB means that the noise is 0.1 times the signal level (SNR in dB is 20·log10(Vsignal/Vnoise), and 20·log10(1/0.1) = 20dB). So, for a signal level of 1Vpp, you would have a noise level of 0.1Vpp.

If you want to increase the signal to noise ratio, you must either lower the noise or increase the signal. Let's take the example above and improve the signal to noise ratio to 40dB. That means that the noise is 0.01 times the signal level. So, for a signal of 1Vpp, the noise would be 0.01Vpp.
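To make the dB arithmetic concrete, here is a minimal Python sketch of the convention used above (the function names are mine, purely illustrative):

```python
import math

def snr_db(v_signal, v_noise):
    """SNR in dB for voltage amplitudes: 20*log10(signal/noise)."""
    return 20 * math.log10(v_signal / v_noise)

def noise_for_snr(v_signal, snr):
    """Noise amplitude giving the requested SNR for a given signal level."""
    return v_signal / 10 ** (snr / 20)

print(snr_db(1.0, 0.1))          # 20.0 dB: noise is 0.1x the signal
print(noise_for_snr(1.0, 40.0))  # 0.01 Vpp: noise is 0.01x the signal
```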

Now that you understand what the signal to noise ratio is, let's go back to your question.

Every time you amplify a signal, you add noise to it. Every amplifier makes a little noise, so when you pass a signal through it the signal becomes noisier. The signal to noise ratio gets worse because the fraction that is noise increases.

Take our 20dB SNR example. Say our signal has an SNR of 20dB and we pass it through a crappy amplifier with a gain of 1 that adds 0.1Vpp of noise to the signal. We already have 0.1Vpp of noise because the signal to noise ratio is 20dB. The amplifier's noise is uncorrelated with the noise already there, so the two add in power: √(0.1² + 0.1²) ≈ 0.14Vpp of total noise, and the signal to noise ratio drops to about 17dB. A lower number, and a worse ratio.
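A small sketch of that calculation, using the hypothetical "crappy amplifier" numbers from the paragraph above:

```python
import math

def snr_db(v_signal, v_noise):
    return 20 * math.log10(v_signal / v_noise)

v_signal = 1.0     # Vpp
v_noise_in = 0.1   # Vpp: the noise already on the signal (20 dB SNR)
v_noise_amp = 0.1  # Vpp: noise added by the hypothetical unity-gain amplifier

# Uncorrelated noise sources add in power, i.e. root-sum-square in amplitude.
v_noise_out = math.hypot(v_noise_in, v_noise_amp)  # ~0.141 Vpp

print(snr_db(v_signal, v_noise_in))   # 20.0 dB before the amplifier
print(snr_db(v_signal, v_noise_out))  # ~17.0 dB after the amplifier
```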

So, every time you attenuate your signal from 10Vpp down to 5Vpp and then amplify it back to 10Vpp, you add noise to it and make the signal to noise ratio worse.
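A sketch of that round trip, with made-up but plausible noise figures. Note that an ideal attenuator scales signal and noise together and leaves the SNR alone; it's the restoring amplifier's own noise that does the damage:

```python
import math

def snr_db(v_signal, v_noise):
    return 20 * math.log10(v_signal / v_noise)

# Hypothetical numbers: a 10 Vpp signal carrying 10 mVpp of noise (60 dB SNR),
# halved to suit the ADC, then amplified back to full level after the DAC.
v_sig, v_noise = 10.0, 0.010

# An ideal attenuator scales signal and noise alike, so the SNR is untouched...
v_sig, v_noise = v_sig * 0.5, v_noise * 0.5

# ...but the restoring amplifier adds its own noise (say 5 mVpp, input-referred),
# uncorrelated with what is already there, so amplitudes add root-sum-square.
v_noise = math.hypot(v_noise, 0.005)

# A gain of 2 restores the level; signal and total noise both scale by 2.
v_sig, v_noise = v_sig * 2, v_noise * 2

print(snr_db(10.0, 0.010))     # 60.0 dB going in
print(snr_db(v_sig, v_noise))  # ~57.0 dB coming out: a 3 dB hit
```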

This isn't usually a problem. Attenuators (to go from high level to low level) and amplifiers (to go from low level to high level) can be designed so as not to add a noticeable amount of noise to the signal.

Of course, the more times you attenuate or amplify, the worse the signal to noise ratio gets. You avoid that by only doing those things when you must, and by using good (low noise) amplifiers in those cases.
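Finally, a sketch of how the damage accumulates, assuming a chain of hypothetical, equally noisy unity-gain stages:

```python
import math

def snr_db(v_signal, v_noise):
    return 20 * math.log10(v_signal / v_noise)

# Hypothetical figures: 1 Vpp signal, 1 mVpp of noise to start (60 dB SNR),
# and 1 mVpp of fresh noise added by each unity-gain stage in the chain.
v_sig, v_noise, stage_noise = 1.0, 0.001, 0.001

for stage in range(1, 6):
    v_noise = math.hypot(v_noise, stage_noise)  # uncorrelated noise adds in power
    print(f"after stage {stage}: SNR = {snr_db(v_sig, v_noise):.1f} dB")
# 60 dB in; roughly 57.0, 55.2, 54.0, 53.0, 52.2 dB after successive stages
```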