I'm looking to sample a 0 to +15 V analog input signal. It is not sinusoidal; rather, it is impulse driven. I don't need to sample at a high rate (< 1 kHz), but I do need to capture the full range of the signal. The way I see it, there are two options:
- Buy a more expensive ADC with a ±10 V range and bias the input to fit into that swing. I think this would require two voltage supplies, though I may be wrong…
- Attenuate the input signal so its swing fits within the range of a normal low-cost ADC
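To make the second option concrete, here's a rough sketch of the resistor divider I have in mind. The 3.3 V full-scale input and the 100 kΩ top resistor are just assumptions for illustration, not values from a specific part:

```python
# Hypothetical resistor divider attenuating a 0-15 V signal
# down to an assumed 0-3.3 V ADC input range.
V_IN_MAX = 15.0    # worst-case input voltage
V_ADC_MAX = 3.3    # assumed ADC full-scale input (depends on the actual part)

# Required attenuation ratio so V_IN_MAX maps to V_ADC_MAX
ratio = V_ADC_MAX / V_IN_MAX  # 0.22

# Pick the top resistor arbitrarily, then solve
# ratio = R_bottom / (R_top + R_bottom) for R_bottom.
R_top = 100e3  # 100 kOhm, an arbitrary choice
R_bottom = R_top * ratio / (1 - ratio)

print(f"attenuation ratio = {ratio:.3f}")
print(f"R_bottom = {R_bottom / 1e3:.1f} kOhm")  # ~28.2 kOhm
```

In practice the bottom resistor would be rounded to a standard E-series value, and the divider's output impedance has to be low enough (or buffered) for the ADC's sample-and-hold input.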
While the second option (attenuation) seems more difficult to design, it certainly appears to have a better cost-benefit ratio based on what I've seen from Analog Devices' and Linear Technology's offerings.
By attenuating the signal, though, do I risk losing anything? I was thinking that if the ADC has the same bit width as the larger-swing ADCs, the samples could be scaled back up digitally in software so that it appears the original signal voltages were sampled.
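Here's the back-of-the-envelope comparison I'm picturing. Assuming 12-bit converters for both options (the bit width, ±10 V range, and 3.3 V range are all assumptions for illustration), the attenuated path's step size, referred back to the 0-15 V input, actually comes out slightly finer than the biased ±10 V path, since the latter only uses 15 V of its 20 V span:

```python
BITS = 12
CODES = 2 ** BITS  # 4096 codes for an assumed 12-bit converter

# Option 1: bipolar +/-10 V ADC, signal offset to fit.
# The 20 V span gives the step size; biasing adds no gain,
# so the input-referred step is the same.
lsb_bipolar = 20.0 / CODES  # volts per code

# Option 2: 0-3.3 V ADC with the signal attenuated by k = 3.3/15.
k = 3.3 / 15.0
lsb_adc = 3.3 / CODES             # volts per code at the ADC pin
lsb_input = lsb_adc / k           # step size referred to the 0-15 V input

# Digital rescaling: multiply each raw code by the input-referred LSB
# to recover the original voltage.
code = 2048                        # example raw sample
v_in = code * lsb_input            # reconstructed input voltage

print(f"bipolar ADC step (input-referred):    {lsb_bipolar * 1e3:.2f} mV")
print(f"attenuated ADC step (input-referred): {lsb_input * 1e3:.2f} mV")
print(f"code {code} -> {v_in:.3f} V")
```

If that arithmetic is right, attenuation doesn't cost resolution at equal bit width; the real risks would be elsewhere (divider tolerance and tempco affecting gain accuracy, and added noise referred to the input being multiplied up by the same scale factor).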