I am building a small project and used an inverting amplifier to amplify an AC signal.
When R2 = 220 Ω and R1 = 33 kΩ, it works with input signals of up to 600 mV, but when I reduced R2 to 100 Ω and R1 to 15 kΩ, the maximum AC input voltage I can apply is 60 mV. I am driving the input from a function generator (could there be an impedance issue between the generator and my op-amp input?).
I want to reduce the resistors because of input noise. The noise is noticeably better with R2 = 100 Ω, but I don't understand why the signal clips when I increase the input amplitude.
I'd appreciate it if you could help me understand this phenomenon.
I attached a scope capture taken with R2 = 100 Ω and an input > 60 mV. CH1 is the input signal, CH2 is the output signal. The coupling capacitor is 10 µF.
Best Answer
The gain of your amplifier is −R1/R2. In both cases the magnitude is 150, so that's not the issue.
The impedance is lower in the second case: the feedback resistor R1 loads the output, so the op-amp has to supply more than twice the current for the same output voltage (33 kΩ versus 15 kΩ). However, even a 15 kΩ load is fine for most op-amps, so this is unlikely to be the problem either.
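As a quick sanity check of the two claims above, here is a small sketch (assuming an ideal inverting amplifier, where the closed-loop gain magnitude is R1/R2 and the output drives R1 into the virtual-ground node):

```python
# Compare the two resistor choices for an ideal inverting amplifier.
# Gain magnitude is R1/R2; the output current into the feedback
# network at a given output voltage is roughly Vout / R1, since the
# far end of R1 sits at the virtual ground.

def gain(r1, r2):
    return r1 / r2

def feedback_current_ma(vout, r1):
    return vout / r1 * 1e3  # output current in mA at Vout

for r1, r2 in [(33e3, 220), (15e3, 100)]:
    print(f"R1={r1/1e3:.0f}k R2={r2:.0f}R  gain={gain(r1, r2):.0f}  "
          f"I at 1 V out = {feedback_current_ma(1.0, r1):.3f} mA")
```

Both cases print a gain of 150, and the 15 kΩ case draws about 2.2 times the output current of the 33 kΩ case, matching the reasoning above.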
There is therefore something else going on. Something is not as you think it is, but with the sparse information you have supplied I can only guess at the possibilities. Supply some real data and maybe we can figure this out.
Added
Now that you've shown a scope trace, we can see that there doesn't appear to be anything wrong. You are putting in a signal of about 70 mVp-p, and your amplifier has a gain of −150. That would require 10.5 Vp-p out, but your amplifier can't deliver that since it is running from a 0 to 3.3 V supply. As a result, it clips.
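The clipping arithmetic can be sketched numerically. This is a simple model, not a circuit simulation; the 1 kHz test tone and the mid-supply output bias are assumed values for illustration:

```python
import numpy as np

GAIN = -150          # inverting amplifier, |gain| = R1/R2 = 150
VPP_IN = 0.070       # ~70 mV peak-to-peak input, as on the scope trace
VDD = 3.3            # single 3.3 V supply
V_MID = VDD / 2      # output assumed biased at mid-supply

t = np.linspace(0, 2e-3, 1000)                     # 2 ms window
vin = (VPP_IN / 2) * np.sin(2 * np.pi * 1e3 * t)   # 1 kHz test tone
ideal = V_MID + GAIN * vin                         # what the gain demands
vout = np.clip(ideal, 0, VDD)                      # the rails limit the swing

print(f"ideal swing:  {ideal.max() - ideal.min():.2f} Vp-p")
print(f"actual swing: {vout.max() - vout.min():.2f} Vp-p")
```

The ideal output swing comes out to about 10.5 Vp-p, while the clipped output is limited to 3.3 Vp-p, reproducing the flat-topped waveform on channel 2.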
How is any of this not as expected?