Measurement – Vector Voltmeters vs Oscilloscopes Comparison

Tags: measurement, oscilloscope, phase shift, voltmeter

I have come across some electronic engineering papers from the mid-1970s in which the authors were testing tunable phase shifters.
To measure the phase shift they used a vector voltmeter. As this is not a bit of kit I have come across, nor is it as readily available around the university as an oscilloscope, I am curious what advantages or drawbacks a vector voltmeter might have compared with an oscilloscope when it comes to phase measurements.

Is there any particular reason you might choose one over the other, or might this simply have been the equipment that was available at the time and suitable for the purpose?

Best Answer

The 1970s were long before I started engineering work, but at that time an oscilloscope with bandwidth beyond 100 MHz (maybe even 10 MHz) would have been quite an expensive instrument, if it was available at all. So oscilloscopes were rarely used for RF work.

Furthermore, 1970s oscilloscopes were entirely analog instruments and would give you no assistance in estimating the phase of a signal. Measuring phase with a 'scope came down to 'eyeballing' the positions of the traces on the screen, most likely using an XY display and estimating the aspect ratio of the ellipse formed there. You might achieve 10 degrees of precision with this method, and both the precision and the accuracy would depend on the experience of the operator making the measurement.
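
To make the ellipse method concrete, here is a minimal Python sketch of the arithmetic behind it (my own illustration with made-up signal parameters, not anything from the papers): for two equal-frequency sinusoids shown in XY mode, sin φ equals the trace's y-value where it crosses x = 0 divided by its peak y-value, so the phase can be read off the ellipse's proportions.

```python
import numpy as np

# Illustrative sketch (hypothetical values): recovering phase from an
# XY (Lissajous) ellipse, the quantity you would "eyeball" on an analog scope.
# For x = sin(wt) and y = sin(wt + phi), the trace is an ellipse with
# sin(phi) = (y where the trace crosses x = 0) / (peak y).

fs = 1e6                      # sample rate, Hz
f = 1e3                       # signal frequency, Hz
phi_true = np.deg2rad(37.0)   # phase shift we will try to recover

t = np.arange(0, 0.01, 1 / fs)
x = np.sin(2 * np.pi * f * t)
y = np.sin(2 * np.pi * f * t + phi_true)

cross = np.where(np.diff(np.sign(x)) != 0)[0]  # samples adjacent to x = 0
B = np.max(np.abs(y[cross]))                   # y-intercept of the ellipse
A = np.max(np.abs(y))                          # peak y excursion

# arcsin gives phi only up to the ambiguity phi vs. 180 - phi; on the
# screen you resolve that by noting which way the ellipse is tilted.
phi_est = np.degrees(np.arcsin(B / A))
print(f"true: {np.degrees(phi_true):.1f} deg, estimated: {phi_est:.1f} deg")
```

Even with clean synthetic data the method is only as good as your reading of B and A off a phosphor screen, which is where the roughly-10-degree figure comes from.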

Even nowadays, oscilloscopes for RF frequencies remain relatively expensive, so if you were dedicating an instrument to making this measurement every day (for example, on a production line), you'd still likely use a network analyzer, one component of which is essentially an automated vector voltmeter.
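
The reason the vector voltmeter (and the network analyzer receiver descended from it) is so much better suited to this job is that it measures phase directly by synchronous detection against a reference channel, rather than by geometry on a screen. A rough Python sketch of that idea, again with made-up signal parameters, might look like this:

```python
import numpy as np

# Hypothetical sketch of the synchronous-detection idea behind a vector
# voltmeter: mix the test signal with in-phase and quadrature copies of
# the reference, average, and take atan2 to get amplitude and phase.

fs = 1e6                          # sample rate, Hz
f = 10e3                          # signal frequency, Hz
phi_true = np.deg2rad(-52.0)      # phase shift through the device under test

t = np.arange(0, 0.01, 1 / fs)
sig = 0.8 * np.cos(2 * np.pi * f * t + phi_true)  # device-under-test output

# The averaging plays the role of the detector's low-pass filter,
# rejecting the 2f mixing products and leaving the DC terms.
i = np.mean(sig * np.cos(2 * np.pi * f * t))      # in-phase component
q = np.mean(sig * -np.sin(2 * np.pi * f * t))     # quadrature component

amplitude = 2 * np.hypot(i, q)    # factor of 2 undoes the mixing loss
phase = np.degrees(np.arctan2(q, i))
print(f"amplitude = {amplitude:.3f}, phase = {phase:.1f} deg")
```

Because the phase falls straight out of an arctangent of averaged products, the precision is set by noise and averaging time rather than by the operator's eye, which is why this approach can resolve fractions of a degree where the eyeball method gives perhaps ten.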