Why is the oscilloscope's calibration signal amplitude 10 times higher than stated?

amplitude · calibration · oscilloscope · probe

I'm trying to calibrate my HAMEG HM507 oscilloscope.

The manual says that the calibration signal is 0.2 Vpp. With the probe's attenuation set to x10, volts/division set to 5 mV/cm, and time/division set to 0.2 ms/cm, I should see a square wave with a 4 cm amplitude.
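As a sanity check, here is a minimal sketch of the arithmetic behind that expectation (Python, with hypothetical names; not from the manual):

```python
def trace_height_cm(signal_vpp, probe_attenuation, volts_per_cm):
    """Expected peak-to-peak trace height in cm.

    An x10 probe divides the signal by 10 before it reaches the
    scope input; the deflection coefficient then maps volts to cm.
    """
    volts_at_input = signal_vpp / probe_attenuation
    return volts_at_input / volts_per_cm

# 0.2 Vpp cal signal, x10 probe, 5 mV/cm -> 4 cm, as the manual states
print(trace_height_cm(0.2, 10, 0.005))  # 4.0
```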

But that's not the case. I have to set volts/div. to 50 mV in order to see the waveform correctly, because the amplitude is actually 10 times bigger! And indeed, when reading the calibration signal with the attenuation set to x1, the scope shows a square wave with an amplitude of 2 volts.

But both the manual and the label on the scope say that the amplitude of the signal is 0.2 Vpp. And in the settings I can only set the frequency of the test signal.

Is this an error in the manual? Is there some hidden zoom function that stretches the y-axis? (That can't be, because the x-magnification also adjusts the time/div. indicator.) Or maybe there's a function that multiplies the signal by 10 to match the probe's attenuation?

Best Answer

I did not find what type of probes were delivered with your scope, but I did find the manual.

With 0.2 Vpp, it makes sense that you want a "final" scale of 50 mV/cm: 200 mVpp / 50 mV/cm gives you the final 4 cm peak-to-peak.
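Equivalently, working in probe-tip volts (i.e. with the readout already including the probe factor), the same one-line check:

```python
# Tip-referred check: 200 mVpp / 50 mV/cm = 4 cm peak-to-peak
print(0.200 / 0.050)  # 4.0
```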

Since you actually get the correct signal with the correct setting using your probe in x10 mode, I can only assume that the oscilloscope is also configured in this mode:

[Front-panel photo from the HM507 manual showing the channel input pushbuttons]

As you can see, each input has an x1/x10 setting, which you can toggle by holding the corresponding button.

Probe factor selection is performed by pressing and holding the pushbutton. This selects the indicated deflection coefficient of channel I displayed in the readout, between 1:1 and 10:1. In the 10:1 condition, the probe factor is indicated by a probe symbol displayed in the readout in front of the channel information (e.g. "probe symbol", Y1...).
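Putting the two together: when the probe factor is set to 10:1, the readout multiplies the actual deflection coefficient by 10 so that it reads in volts at the probe tip. A small sketch of that relationship (Python, hypothetical function name):

```python
def indicated_volts_per_cm(actual_volts_per_cm, probe_factor):
    """Readout value when the scope scales the actual deflection
    coefficient by the configured probe factor (1:1 or 10:1)."""
    return actual_volts_per_cm * probe_factor

# Actual input sensitivity 5 mV/cm, probe factor set to 10:1:
print(indicated_volts_per_cm(0.005, 10))  # 0.05 -> readout shows 50 mV
```

That matches what you observed: the trace is the expected 4 cm tall, but the readout reports 50 mV rather than 5 mV, because it is already compensating for the x10 probe.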