Electronic – Instrumentation Amplifier – how to use it correctly

Tags: adc, instrumentation-amplifier, voltage-measurement

I am trying to perform some measurements based on a special technique used to detect corrosion in the rebars of concrete blocks. The idea is to measure the potential difference between an electrode and the rebar:


I tried it with a conventional voltmeter and it worked just fine. The potential difference lies between 0 V and 1 V.

Now I want to do it with a microcontroller. My idea was to connect the two potentials (electrode and rebar) to the inputs of an instrumentation amplifier and then read its output with an ADC input of the microcontroller.

To do so, I used the INA118.

I left the RG pins unconnected to have unity gain.

V+ = 5V
V- = GND (therefore "single supply operation")
Vo = ADC input (with 100nF capacitor to GND)
Ref = GND

I also connected a 1 MΩ resistor from each input (Vin+ and Vin−) to GND to provide a return path for the input bias currents.

The problem is that I get nothing useful at the output of the INA118: it sits at a constant 30 mV, even when I short-circuit Vin+ and Vin−.

What is wrong with my circuit?

Should I add an offset voltage to the inputs of the INA118?
I have read that: "With single supply operation, V+ and V– must be both 0.98V above ground for linear operation."

Or is it that the input bias current is too low?

Thank you

Image of the circuit:

Best Answer

This is an input voltage range issue. With 0 V and +5 V supplies, the input common-mode range is only about 1 V to 4 V. Your inputs sit near ground, below the range where this amplifier works. Check the "linear input range" specification in the datasheet.

One fix is to supply a negative voltage on V−. Another is to bias one end of the voltage measurement: for example, bias the low side to 1.5 V and leave the high side unbiased/high impedance, so the 0–1 V signal rides on top of the bias and both inputs stay inside the linear range.