Electronic – How to make an IR proximity detector immune to daylight

infrared proximity-sensor

I am trying to make an infrared proximity measurement device.

I want it to have a range of about 10 cm or 4" (maybe 15 cm?). The modulation frequency I use is 10 kHz. Here is the circuit I used, except that I have used 1 nF capacitors and resistors that suit them for band-passing 10 kHz. I have used an LM358A for the op-amp, and I don't know the part ID of my IR diode.
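As a rough check of the component choice, the resistor that pairs with a 1 nF capacitor to place a first-order RC corner at the 10 kHz carrier follows from f = 1/(2πRC). This is a sketch of that calculation only; the actual band-pass in the linked circuit may use a different topology:

```python
import math

f = 10e3   # target centre frequency, Hz
C = 1e-9   # chosen capacitor value, F

# For a first-order RC stage, the corner sits at f = 1/(2*pi*R*C),
# so the matching resistor is:
R = 1 / (2 * math.pi * f * C)
print(f"R = {R / 1e3:.1f} kOhm")  # ~15.9 kOhm
```

A standard value such as 15 kΩ or 16 kΩ would land close enough for this purpose.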

To increase the sensitivity and remove the offset, I added a difference amplifier with a gain of 10 using the other op-amp inside the LM358A. I've used a potentiometer to set the voltage to be subtracted from the output of the circuit below.
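The idea of that stage can be sketched with an idealised difference amplifier (gain and example voltages here are assumptions; a real LM358A also limits the output swing near the rails):

```python
def diff_amp_out(v_signal, v_offset, gain=10.0):
    """Ideal difference amplifier: gain * (v+ - v-)."""
    return gain * (v_signal - v_offset)

# e.g. a 1.25 V detector output with the pot set to 1.20 V:
print(diff_amp_out(1.25, 1.20))  # 0.5 V of usable, offset-free signal
```

Subtracting the pot voltage first and then amplifying is what keeps the static offset from being multiplied by 10 as well.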

It works, with reasonable linearity! However, the voltage levels change with the daylight intensity.

Is there any way to make this device immune to daylight using an LDR? I've tried connecting the LDR in parallel with the offset-removing potentiometer, but, as might be expected, that didn't give good, logical results. I don't have any IR filters, and it is really expensive to get them from Farnell or similar distributors in Turkey.

Schematic

From here.

Edit:

Here is my schematic:

My Schematic

Best Answer

I don't think that using the signal of an LDR can do much because the circuit already has some kind of ambient light suppression: it's the high pass filter at capacitor C8.
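To illustrate why the high-pass already rejects slow daylight variations: a first-order RC high-pass passes the 10 kHz modulated signal while strongly attenuating near-DC ambient changes. The component values below are assumptions chosen to put the corner near the carrier:

```python
import math

C8 = 1e-9   # coupling capacitor, F (assumed value)
R  = 16e3   # resistance seen by C8, ohm (assumed value)
fc = 1 / (2 * math.pi * R * C8)   # ~10 kHz corner

def hp_gain(f):
    """First-order RC high-pass magnitude response."""
    ratio = f / fc
    return ratio / math.sqrt(1 + ratio ** 2)

print(f"fc = {fc:.0f} Hz")
print(f"gain at 100 Hz (slow daylight change): {hp_gain(100):.3f}")
print(f"gain at 10 kHz (modulated signal):     {hp_gain(10e3):.3f}")
```

Slow ambient drift is attenuated by roughly two orders of magnitude, which is why adding an LDR signal on top brings little benefit.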

I agree with MikeJ-UK that the signal probably is saturated by ambient light.

If you just want to get the proximity sensor working with more ambient light, I'd suggest putting an IR filter in front of the detector.

If this is too easy (or you also have a lot of ambient IR light, e.g. because the sun is shining on the detector), you have to solve the problem of the signal being totally jammed by the ambient light.

Let's suppose the photocurrent caused by the signal is a few microamps or less, while the ambient light already contributes some 0.1 mA. Then there is only a very small signal voltage at the input voltage divider (D1/R10), and the more current (caused by ambient light) flows in the voltage divider, the smaller your signal will be.
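A toy model of this saturation effect: once the ambient DC photocurrent has dropped nearly the whole supply across the load resistor, there is no headroom left for the signal swing. Supply, resistor, and current values here are assumptions for illustration:

```python
V_CC = 5.0   # supply voltage, V (assumed)
R10  = 10e3  # load resistor, ohm (assumed value)

def signal_swing(i_ambient, i_signal=2e-6, v_cc=V_CC, r=R10):
    """AC signal voltage left across the load resistor once the ambient
    photocurrent has consumed part of the headroom (idealised model)."""
    headroom = v_cc - i_ambient * r          # DC margin before saturation
    return min(i_signal * r, max(headroom, 0.0))

for i_amb in (100e-6, 499e-6, 520e-6):
    mv = signal_swing(i_amb) * 1e3
    print(f"ambient {i_amb * 1e6:.0f} uA -> signal swing {mv:.1f} mV")
```

With moderate ambient light the 2 µA signal still yields its full 20 mV, but as the ambient current approaches V_CC/R10 the usable swing collapses to zero: the detector is jammed.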

Just increasing the amplification doesn't help, because the noise will be amplified too; at that point the signal-to-noise ratio is what you have to take care of.

So instead of having a voltage divider at the detector a better approach would be to utilize a transimpedance amplifier:
(Transimpedance amplifier schematic)

Its output voltage is proportional to the photocurrent, so this will give you at least a constant signal level, no matter how much ambient light you have (see also this article about this problem by Bob Pease).
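The ideal transimpedance relation is simply Vout = −I_photo · R_f. The feedback resistor value below is an assumption; ambient light still adds a DC term at the output, but the band-pass removes it while the signal term stays constant:

```python
def tia_output(i_photo, r_feedback):
    """Ideal transimpedance amplifier output: Vout = -I_photo * R_f."""
    return -i_photo * r_feedback

# A 2 uA signal current through an assumed 100 kOhm feedback resistor:
print(tia_output(2e-6, 100e3))  # -0.2 V, regardless of the ambient DC level
```

Unlike the resistive divider, the photodiode here sees a virtual ground, so the signal current is converted to voltage at full scale until the op-amp output itself saturates.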

Of course this is only true within limits: if your amplifier is jammed, you can't do much.

So the amplification before band-pass filtering must not be too large. But if you make your band-pass filter narrow enough, you can apply huge amplification afterwards (as in radio receivers).
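The benefit of narrowing the filter can be quantified: for roughly white noise, the SNR improves with the square root of the bandwidth reduction, i.e. 10·log10(BW_wide/BW_narrow) in dB. The bandwidths below are illustrative assumptions:

```python
import math

def snr_gain_db(bw_wide, bw_narrow):
    """SNR improvement from shrinking the noise bandwidth (white noise)."""
    return 10 * math.log10(bw_wide / bw_narrow)

# Narrowing from a 5 kHz band to a 100 Hz band around the 10 kHz carrier:
print(f"{snr_gain_db(5e3, 100):.1f} dB")  # ~17 dB
```

That recovered margin is exactly what lets the later gain stage be "huge" without amplifying the detector's broadband noise along with the signal.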