Electronic – Analog signal amplification before ADC reading

Tags: adc, amplifier, sensor

I'm working on a system with a distance sensor. As the distance increases, the output voltage decreases (Fig. 2, page 5: http://www.farnell.com/datasheets/1657845.pdf?_ga=1.97138143.1773643066.1472611965).

Is it better to let the ADC read the raw voltage, or should I amplify it when the output voltage is too low (and account for the amplification in software)?


Edit :

How is it possible to know the distance when the output is 1 V? (Both 2 cm and 27 cm correspond to this voltage.)

Any ideas? Or should I simply avoid the scenario where the object is closer than 10 cm?

Best Answer

You have to tell us how the device is to be used before anyone can answer this meaningfully.

You've got 0.5 V of change over the range of the device. This is a significant amount for a 10-bit A/D, so you wouldn't need to amplify if you were only trying to tell the difference between the low end and the high end of the distance range.

Assuming the output were linear with distance, if you were trying to resolve 1 mm of motion, the voltage change would be 0.5 V / 700 mm, or about 0.7 mV. That is a bit much to expect a 10-bit A/D to handle, and you would need amplification.
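
As a rough sanity check of those numbers, here is a minimal sketch; the 5 V reference, 0.5 V output swing and ~70 cm range are assumptions taken from the figures above. It compares the per-millimetre signal change with one ADC LSB:

```c
#include <stdio.h>

int main(void)
{
    /* Assumed values from the discussion above */
    const double vref_v     = 5.0;    /* ADC reference voltage             */
    const double adc_counts = 1024.0; /* 10-bit converter                  */
    const double v_span_v   = 0.5;    /* sensor output swing over range    */
    const double d_span_mm  = 700.0;  /* usable distance range, ~70 cm     */

    double lsb_mv    = vref_v / adc_counts * 1000.0;  /* ~4.9 mV per count */
    double mv_per_mm = v_span_v / d_span_mm * 1000.0; /* ~0.7 mV per mm    */

    printf("1 LSB          : %.2f mV\n", lsb_mv);
    printf("1 mm of travel : %.2f mV\n", mv_per_mm);
    printf("mm per LSB     : %.1f\n", lsb_mv / mv_per_mm);
    return 0;
}
```

So without amplification, one raw ADC count corresponds to roughly 7 mm of travel, which is why 1 mm (and even 1 cm) resolution is out of reach at unity gain.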

Assuming a 5 V reference, 1 LSB on a 10-bit A/D is a bit under 5 mV. A 1 cm change in distance would cause a change of about 7 mV, so if you're trying to resolve 1 cm, this is still pushing it. I'd recommend amplification by a factor of about 4 (more than that risks saturation). You'd obviously need to remove offsets to keep your signal in the 0-5 V range.
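
For the amplified case, here is a sketch of how the software side might undo that gain and offset. The gain of 4 and the offset value are assumptions for illustration; substitute whatever your analog front end actually applies:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical front-end parameters -- match them to your actual circuit. */
#define VREF_MV       5000.0f  /* ADC reference, millivolts                */
#define ADC_FULLSCALE 1023.0f  /* 10-bit converter, maximum code           */
#define GAIN          4.0f     /* op-amp gain applied before the ADC       */
#define OFFSET_MV     400.0f   /* offset removed before amplification      */

/* Recover the sensor's original output voltage from a raw 10-bit ADC code. */
static float adc_to_sensor_mv(uint16_t code)
{
    float pin_mv = ((float)code / ADC_FULLSCALE) * VREF_MV; /* voltage at the ADC pin */
    return pin_mv / GAIN + OFFSET_MV; /* undo the gain, then restore the offset */
}

int main(void)
{
    /* Example: a mid-scale reading */
    printf("code 512 -> %.1f mV at the sensor\n", adc_to_sensor_mv(512));
    return 0;
}
```

From there you would map the recovered voltage back to distance using the curve in the datasheet (e.g. via a lookup table), since the sensor's response is not linear.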

If you're trying to resolve 10 cm, you probably don't need amplification.