Electronic – Shrinking an input signal for the ADC


I'm looking to sample a 0 to +15 V analog input signal; it is not sinusoidal but impulse driven. I don't need a high sample rate (under 1 kHz), but I do need to sample over the full range of the signal. The way I see it, there are two options:

  1. Buy a more expensive ADC with a ±10 V range and try to bias the input to fit into that swing. I think this would require two supply voltages, though I may be wrong…
  2. Attenuate the input signal so its swing fits within the range of ordinary low-cost ADCs.

While option 2 seems more difficult to design, it certainly looks like the better value based on what I've seen from Analog Devices' and Linear's offerings.

Do I risk losing anything by attenuating the signal, though? I was thinking that if the ADC has the same sample bit width as the larger-swing ADCs, the samples could be scaled digitally in software so that it appears the original signal voltages were sampled.
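To illustrate the scaling idea, here is a minimal sketch, assuming a hypothetical 12-bit ADC with a 5.0 V reference fed through a divide-by-3 attenuator (all of those numbers are assumptions for the example, not from a specific part):

```python
# Sketch: recover the original 0-15 V input from an attenuated ADC reading.
# Assumed (hypothetical) setup: 12-bit ADC, 5.0 V reference, divide-by-3 divider.
ADC_BITS = 12
V_REF = 5.0
DIVIDER_RATIO = 3.0  # the input was divided by 3 before reaching the ADC

def code_to_input_volts(code: int) -> float:
    """Convert a raw ADC code back to the original input voltage."""
    v_adc = code * V_REF / (2 ** ADC_BITS - 1)  # voltage at the ADC pin
    return v_adc * DIVIDER_RATIO                # undo the attenuation in software

# A full-scale code corresponds to the full 15 V input:
print(code_to_input_volts(4095))  # 15.0
print(code_to_input_volts(0))     # 0.0
```

The resolution in volts-per-count is coarser by the divider ratio, but the number of distinct levels across the range is unchanged.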

Best Answer

Elementary, Watson. You sort of had the idea with #2, except that you don't want a negative gain but rather a gain between 0 and 1. In other words, you want to attenuate the 0-15 V input signal to match the input range of your A/D.

This is easily accomplished with two resistors in a "resistor divider" configuration. If your A/D has a native range of 0-5 V, then you want to divide the input voltage by 3. This can be accomplished, for example, with 2 kΩ in series followed by 1 kΩ to ground.
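As a quick sanity check on the divider arithmetic, using the 2 kΩ / 1 kΩ values above:

```python
# Resistor divider: Vout = Vin * R_bottom / (R_top + R_bottom)
R_TOP = 2000.0     # 2 kOhm in series with the input
R_BOTTOM = 1000.0  # 1 kOhm from the ADC pin to ground

def divider_out(v_in: float) -> float:
    """Voltage at the tap of the divider for a given input voltage."""
    return v_in * R_BOTTOM / (R_TOP + R_BOTTOM)

print(divider_out(15.0))  # 5.0 -> a 15 V input lands exactly at a 5 V ADC's full scale
```

Any pair of resistors in a 2:1 ratio works; the absolute values trade off loading of the source against noise and bandwidth, as discussed below.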

Anything you do to a signal will change it slightly. In this case, some of the high-frequency content will be lost. However, at impedances of tens of kΩ, this won't be a problem at a sample rate of 1 kHz or less: that implies an upper frequency limit of 500 Hz at most, and rather less in practice. Even hundreds of kΩ used in the resistor divider should pass such low frequencies without losing the part you care about.
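To see why the divider's impedance is harmless here, one can estimate the RC corner formed by the divider's Thevenin resistance and the ADC's input capacitance. This is a rough sketch; the 10 pF input capacitance is an assumed placeholder, so check your ADC's datasheet for the real figure:

```python
import math

# Thevenin resistance of the 2k/1k divider as seen from the ADC pin
R_TOP, R_BOTTOM = 2000.0, 1000.0
r_th = R_TOP * R_BOTTOM / (R_TOP + R_BOTTOM)  # about 667 Ohm

C_IN = 10e-12  # assumed ADC input capacitance, 10 pF (placeholder value)

# Single-pole RC corner frequency: f = 1 / (2 * pi * R * C)
f_corner = 1.0 / (2.0 * math.pi * r_th * C_IN)
print(f"{f_corner / 1e6:.1f} MHz")  # tens of MHz, vastly above the 500 Hz of interest
```

Even scaling the resistors up by a factor of 100 only pulls the corner down by the same factor, which still leaves it orders of magnitude above the signal band.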