Electronic – Measuring a soundcard output voltage

function-generator, oscilloscope, voltage-measurement

I have been experimenting with a Soundcard Oscilloscope and I am having some problems understanding what's going on! I have attached a screenshot of my digital scope setup. My hardware setup is simple. I have a male to male TRS jack running from my line out into my mic input. I have been experimenting with a simple sine wave.

Screenshot of Soundcard Oscilloscope Setup

When I measure the output voltage of my line out (stereo TRS plug) with a voltmeter, I get a reading of 26 mV for the right channel and 12 mV for the left channel (the left and right audio channels, not the left and right scope channels). However, the scope is reporting an effective voltage of 508 mV RMS for the left channel, and the right channel (not in the screenshot) is saturated at 783 mV RMS.

Why do the 26 mV and 12 mV readings on the line out remain constant when I adjust the sine wave's amplitude or change the volume and balance controls of my PC soundcard? The voltmeter reading never changes. The voltage changes on the scope, but not on the voltmeter. What am I missing?

My second question has to do with the maximum voltage readings I am getting from the soundcard. When I boost the frequency or amplitude of the wave, it saturates at 2 volts. I am guessing +2 V and -2 V is the max input/output of my soundcard? The research I have done would agree with this conclusion, but I would appreciate some input! If this is the case, why can't I get a 2 volt reading on the voltmeter instead of this millivolt jitter?

Thanks for your time!

Best Answer

Firstly, as mentioned, a multimeter in DC mode will only give you the DC level of the signal, which, if it's swinging symmetrically around 0 V, will be 0 V (or close enough). You would need a half-decent meter with a low-range AC mode to get a reasonable reading.
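
A quick sketch of why the two instruments disagree: the DC reading is essentially the time average of the waveform, which is close to zero for a sine centred on 0 V, while the RMS value is not. The 0.72 V amplitude below is just an assumption chosen to land near the 508 mV RMS figure from the question:

```python
import numpy as np

# One second of a 1 kHz sine swinging +/-0.72 V around 0 V, sampled at 48 kHz.
fs = 48_000
t = np.arange(fs) / fs
v = 0.72 * np.sin(2 * np.pi * 1_000 * t)

dc_level = v.mean()                    # what a DC voltmeter effectively reports
rms_level = np.sqrt((v ** 2).mean())   # what a true-RMS AC meter reports

print(f"DC (mean): {dc_level * 1e3:.2f} mV")   # ~0 mV
print(f"RMS:       {rms_level * 1e3:.1f} mV")  # ~509 mV (0.72 V / sqrt(2))
```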

Secondly, the sound card oscilloscope software will almost certainly need to be calibrated. All it receives from the card is a raw sample value; e.g. a 16-bit card gives 65,536 possible values.
It does not know how these values translate into voltage without calibration (the software may have a default setting based on typical sound card ranges, e.g. +/-2 V or whatever, which may be what you are seeing now, and may or may not be accurate).
For example, if your sound card's range is +/-1 V, then a full-scale sample would equate to +1 V. If the software is set to a default of +/-2 V for full range, it will report that same sample as 2 V, when in fact the actual signal level is 1 V.
The fact that you have your line out feeding the mic in may also make the default calibration and reported levels quite a way off, as line-level output is considerably higher than what a mic input expects.
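
As a rough illustration of how the full-scale assumption skews the reported voltage, here is a minimal sketch; the +/-1 V and +/-2 V figures are assumptions for the example, not your card's actual specs:

```python
# The software only sees raw sample values and must assume a full-scale voltage
# to convert them to volts.
FULL_SCALE_COUNTS = 32767           # max positive value of a signed 16-bit sample

def counts_to_volts(sample, assumed_full_scale_volts):
    return sample / FULL_SCALE_COUNTS * assumed_full_scale_volts

sample = 16384                      # a raw sample at about half of full scale

# If the card really clips at +/-1 V, this sample is really about 0.5 V...
print(counts_to_volts(sample, 1.0))   # ~0.50 V (correct for a 1 V full scale)
# ...but software defaulting to a +/-2 V full scale reports double that.
print(counts_to_volts(sample, 2.0))   # ~1.00 V (off by the scale error)
```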

If the software is any good it should have a calibration setting which you can use to set things up correctly. This will probably involve feeding a signal of a known level into the card and telling the software what the level is, then it can work the rest out. Since most soundcards have a DC blocking capacitor you will need an AC signal for this.
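
In essence, the calibration step boils down to working out one volts-per-count factor. A minimal sketch, assuming you can record the raw sample counts while a reference sine of known RMS voltage is fed into the input (the `calibrate` helper here is hypothetical, not part of any particular scope software):

```python
import numpy as np

# raw_samples: array of sample counts recorded while the known reference sine
# was applied; known_rms_volts: the RMS voltage of that reference signal.
def calibrate(raw_samples, known_rms_volts):
    measured_rms_counts = np.sqrt(np.mean(np.square(raw_samples.astype(float))))
    return known_rms_volts / measured_rms_counts   # volts per count

# Every subsequent reading is then scaled by that factor:
#   volts = raw_counts * volts_per_count
```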