Electronics – What is the VAR knob used for, and why should it usually be in the CAL position on analog scopes?


I'm trying to understand the uses of the VAR/CAL control on analog scopes.

Almost all the texts I read mention these only briefly and do not explain them well.

Here are some examples:

CAUTION! There is usually a small knob marked VAR (or VARIABLE) that
allows adjustment between the “clicks” of the HORIZONTAL control knob.
When measuring times from the screen, ensure that the VAR knob is in
the CAL (or CALIBRATE) position!


Turn to the position marked Cal – completely clockwise.


Sensitivity calibration. This knob is used to change the vertical
scale. If it is not turned all the way clockwise, the scope will be
uncalibrated and your data will be worthless. Check this knob
frequently as you take data.

Is there an easy way to explain or illustrate what the VAR knob is used for, and why it should be in the CAL position on analog scopes?

Best Answer

When the "Var" knob is set for "CAL", then the vertical sensitivity (or other function) matches what you see on the screen. If you have the range set for 1 volt per division, then "CAL" = 1V/div.

But the "Var" knob allows you to adjust the trace on the screen to something less than the calibrated value. This may be useful for examining the waveform or comparing it to another waveform, etc.

The reason for setting the "Var" knob for "CAL" is where you are using the waveform on the screen to actually MEASURE the VOLTAGE amplitude on the screen. (Or the X-axis TIMING on the screen.) If you are not MEASURING the voltage or timing on the screen, then you can set the "Var" adjustment to wherever is convenient.