Electrical – Purpose of attenuation before op-amp input

amplifier, attenuation

What is the purpose of putting an attenuation circuit before an op-amp input? I can't find the actual reason for this topology. Is there any reason behind it? Why do we need to attenuate the incoming signal when, in the end, we amplify it anyway?

Signal input -> attenuation circuit (reduces incoming signal level) -> op-amp input (amplifies signal level)


Sorry for the late response.

I have attached my circuit diagram. Basically, it is just a simple voltage source (from a microphone) plus a band-pass filter with a positive rectifier (roughly 5 kHz bandwidth), and this signal is fed to an ADC for detection.

Based on the circuit below, I confirmed the attenuation and gain:

Attenuation level: −15.56 dB
Gain level: 29 dB

Based on the above levels, can we just set the gain of the amplifier to 13.44 dB (gain level minus attenuation level)?

I'm thinking of reducing the component count by removing the attenuation circuit before the amplifier input.
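As a quick check of that arithmetic, here is a minimal sketch (Python), assuming the attenuator and amplifier are ideal, non-interacting stages so their dB figures simply add; the numbers are the ones quoted above.

```python
# Minimal sketch of the cascaded-gain arithmetic, assuming ideal,
# non-interacting stages (no loading between attenuator and amplifier).
attenuation_db = -15.56   # attenuator stage, from the measurement above
gain_db = 29.0            # amplifier stage, from the measurement above

# Gains in dB simply add when stages are cascaded.
net_db = attenuation_db + gain_db
print(f"Net gain: {net_db:.2f} dB")                        # 13.44 dB

# The same result expressed as linear voltage ratios.
atten_lin = 10 ** (attenuation_db / 20)                    # ~0.167 V/V
gain_lin = 10 ** (gain_db / 20)                            # ~28.2 V/V
print(f"Net linear gain: {atten_lin * gain_lin:.2f} V/V")  # ~4.70 V/V
```

This only shows that the two numbers combine to 13.44 dB; whether a single 13.44 dB stage is actually acceptable also depends on whether the unattenuated signal still fits within the amplifier's input range.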

[Circuit diagram attached: microphone source, band-pass filter with positive rectifier, attenuator, and amplifier feeding the ADC]


Best Answer

In addition to Bimpelrekkie's answer, here is another very common scenario:

[Schematic: a monitored power rail feeding a high-value resistive divider, with the divider tap buffered by an op-amp driving the ADC input]


Here we are monitoring a system power rail (typical in avionics). As most ADCs have reference voltages of 5 V (or less), we need to scale the supply rail being monitored down into that range.

We don't want to waste power, so I am using high-value resistors. However, the source resistance driving an ADC input typically needs to be quite low (usually a few kilohms at most), so the sense point of the voltage divider is buffered so it can properly drive the ADC.
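To make that trade-off concrete, here is a small sketch of the divider arithmetic. The rail voltage, reference voltage, and resistor values below are illustrative assumptions, not the values from the schematic above.

```python
# Illustrative numbers only -- the actual schematic values aren't reproduced
# in the text, so the rail voltage and resistors below are assumptions.
v_rail = 28.0        # monitored supply rail (e.g. a 28 V avionics bus)
r_top = 470e3        # high-value divider resistors to keep wasted power low
r_bottom = 82e3

# Divider output: must sit below the ADC reference (here assumed 5 V).
v_sense = v_rail * r_bottom / (r_top + r_bottom)
print(f"Sense voltage: {v_sense:.2f} V")                 # ~4.16 V

# Power burned in the divider -- high resistances keep this tiny.
p_divider = v_rail**2 / (r_top + r_bottom)
print(f"Divider dissipation: {p_divider*1e3:.2f} mW")    # ~1.4 mW

# Thevenin (source) resistance seen at the tap: far too high to drive an
# ADC input directly, which is why the op-amp buffer is needed.
r_thevenin = r_top * r_bottom / (r_top + r_bottom)
print(f"Tap source resistance: {r_thevenin/1e3:.1f} kOhm")  # ~69.8 kOhm
```

With a tap resistance of tens of kilohms, an unbuffered ADC input would not settle properly during sampling, which is exactly why the op-amp buffer sits between the divider and the ADC.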

I haven't taken the trouble here to map the signal onto the full 0 V to Vref span, and in fact that is rarely necessary in a situation like this.