Series LCR experiment

Tags: ac, capacitance, inductance, oscillator

I am doing a series LCR experiment to measure the resonant frequency, taking R = 100 Ω, L = 40 mH, and C = 0.1 µF. I am using a 10 kHz audio oscillator with a 10 V peak-to-peak output as the input.

I have fixed the input voltage at 3 V (rms). But on increasing the frequency (say, from 100 Hz), the input voltage keeps dropping until I reach the resonant frequency (approx. 2 kHz), and then it rises again as I increase the frequency further.
For example:
At 100 Hz, Vi = 3 V;
At 2000 Hz, Vi = 2.2 V;
At 10 kHz, Vi = 2.4 V.
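(As a quick sanity check, added for clarity and not part of the original question: the theoretical series resonant frequency from these component values is

$$f_0 = \frac{1}{2\pi\sqrt{LC}} = \frac{1}{2\pi\sqrt{40\times10^{-3}\,\text{H}\times 0.1\times10^{-6}\,\text{F}}} \approx 2.5\ \text{kHz},$$

which is in the same region as the ~2 kHz dip observed.)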

Is something wrong with the circuit? What should be done to stabilise the input voltage, Vi?

Best Answer

Most signal generators have an output impedance of 50 ohms. Some can be set to 600 ohms, and some can be set to zero ohms, although those set to zero ohms will not be able to supply amps of signal; they will limit in some haphazard way.

At resonance, the impedance of L + C + R in series will be just R, because Xc and XL are identical in magnitude but opposite in sign, so the two reactances cancel and the net impedance is R:
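In symbols (the standard series-RLC impedance, written out for clarity):

$$Z = R + j\left(\omega L - \frac{1}{\omega C}\right), \qquad \omega_0 = \frac{1}{\sqrt{LC}} \;\Rightarrow\; Z(\omega_0) = R$$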


R then loads the output of the signal generator: away from resonance the series combination presents a high impedance, so you see nearly the full signal at low and high frequencies, but at resonance the load collapses to R, which forms a voltage divider with the generator's output impedance, so you get only some fraction of the signal.
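Here is a minimal sketch of this loading effect, assuming a 50 ohm generator output impedance (R_out below is an assumed value, not stated in the question) and treating the 3 V rms setting as the generator's open-circuit voltage:

```python
import numpy as np

# Component values from the question
R = 100.0      # series resistor, ohms
L = 40e-3      # inductance, henries
C = 0.1e-6     # capacitance, farads

# Assumptions for illustration (not given in the question)
R_out = 50.0   # generator output impedance, ohms
V_s = 3.0      # generator open-circuit voltage, V rms

for f in [100, 500, 1000, 2500, 5000, 10000]:
    w = 2 * np.pi * f
    # Series LCR impedance: Z = R + j(wL - 1/(wC))
    Z = R + 1j * (w * L - 1.0 / (w * C))
    # R_out and Z form a voltage divider; Vi is measured across Z
    Vi = V_s * abs(Z / (Z + R_out))
    print(f"{f:6d} Hz : |Z| = {abs(Z):8.1f} ohm, Vi = {Vi:.2f} V")
```

At resonance Z collapses to 100 ohms, so with these assumed values this predicts Vi ≈ 3 V × 100/(100 + 50) = 2.0 V, close to the 2.2 V you measured; far from resonance |Z| is in the kilohm range and Vi returns towards 3 V.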