I know the topic is immensely broad, but I don't have practical knowledge on the following issue:
Let's say I have a very long (about 1 km) BNC cable carrying a DC analog voltage signal of around 1 V from a transducer, and the cable runs straight into an ADC with a 100 MΩ input impedance at the far end.
If the measurement should be very accurate, do we need to buffer this signal?
Even a 1 km BNC cable has very low resistance compared with the ADC's huge input impedance, so from that point of view I would think a buffer isn't needed. But how would this be implemented in practice for a 1 V DC level, and for 10 mV DC?
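The loading argument in the question can be put into numbers. The sketch below computes the simple resistive-divider error formed by the cable's series resistance and the ADC's input impedance; the 36 Ω/km loop resistance is an assumed, RG-58-like illustrative value, not from any datasheet (and this ignores the offset, leakage, and noise issues that usually dominate in practice):

```python
# Sketch: DC loading error of a long coax run into a high-impedance ADC.
# Assumption: ~36 ohm total loop resistance for 1 km of RG-58-style coax
# (center conductor plus shield return); value is illustrative only.

def divider_error(r_series_ohm, r_load_ohm, v_in):
    """Voltage reaching the load through a series (cable) resistance,
    and the relative error versus the source voltage."""
    v_out = v_in * r_load_ohm / (r_load_ohm + r_series_ohm)
    return v_out, (v_in - v_out) / v_in

r_cable = 36.0   # ohms, assumed 1 km loop resistance
r_adc = 100e6    # ohms, ADC input impedance from the question
v_out, rel_err = divider_error(r_cable, r_adc, 1.0)
print(f"V at ADC: {v_out:.9f} V, relative error: {rel_err:.2e}")
```

With these assumed numbers the error is well under 1 ppm, which supports the intuition that pure resistive loading is not the problem here.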
Best Answer
Texas Instruments buffer example: link
Your source is the transducer's voltage, and the 50 ohm source resistance shown should be replaced with something like 10 M (for example). The 50 ohm output resistor matches the characteristic impedance of your cable (50 or 75 ohm, for example). At the far end you should place a termination resistor of the same value (50 or 75 ohm).
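One consequence of this matched arrangement is worth making explicit: at DC, the series back-termination resistor and the far-end termination resistor form a 2:1 divider, so the ADC sees half the buffer's output and the gain (or the ADC reference) has to account for that. A quick check, with all values assumed for illustration:

```python
# Matched-line DC divider: the back-termination resistor at the driver and the
# termination resistor at the receiver halve the DC level when they are equal.
# Cable series resistance is ignored here; values are illustrative.

def terminated_level(v_buffer, r_series, r_term):
    """DC voltage appearing across the termination resistor."""
    return v_buffer * r_term / (r_series + r_term)

print(terminated_level(1.0, 50.0, 50.0))  # 1 V buffer output -> 0.5 V at the ADC
```

So for the 10 mV case the terminated level is only 5 mV, which is another reason a low-offset buffer (and possibly gain) at the transducer end matters.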