I asked this question on the physics community, but I thought it would be appropriate to cross-post here.
I read that in VLBI, microwave signals (at millimetre wavelengths) are recorded and later combined. A famous example of this is the Event Horizon Telescope, which imaged a black hole. Presumably, the telescopes recorded the phase of the microwaves.
This tells me that it is possible to record the specific phase of a microwave, but I haven't seen any technology that can do this. Most oscilloscopes, for example, don't have enough bandwidth to resolve the short time scales required to measure microwave phase.
My two questions are simple. Answers to either one would be helpful.
Is there any piece of technology that can measure the phase of a microwave? What technology does VLBI use to record phases?
Edit: When I said "phase" I meant the actual "magnitude" of the wave at any point in time, but I see it is more complicated.
The phase of a signal is meaningless without a reference. We are therefore only ever talking about the relative phases of two (or, in the case of VLBI radio telescopes, N) signals.
We use a local oscillator to mix (superheterodyne) the reference and working signals down to a low intermediate frequency (IF), which is then sampled with a conventional ADC. Even in 2010, when I was still working, 250 MHz 14-bit ADCs were becoming commonplace, allowing IFs up to 100 MHz, or higher with under-sampling, and bandwidths up to 100 MHz. To give you an example of what's possible: for a carrier stability test, I mixed down to a 10 kHz IF, used a PC sound card to sample that alongside a 10 kHz tone derived from its reference, and was able to demonstrate steps of 0.01 degrees of phase at a 1 GHz carrier.
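To make the mixing step concrete, here is a toy numerical sketch (not any real instrument's code; all frequencies are illustrative and scaled down so the "RF" can be simulated directly). It shows that mixing with a local oscillator and averaging down at the IF recovers the carrier's phase relative to the LO chain:

```python
import numpy as np

# Toy sketch of heterodyne phase measurement (illustrative frequencies only).
fs = 10e6                      # simulation sample rate, 10 MHz
f_rf = 1.01e6                  # "carrier" frequency
f_lo = 1.00e6                  # local oscillator frequency
f_if = f_rf - f_lo             # difference frequency (IF) = 10 kHz
phi_true = 0.7                 # unknown carrier phase, radians

t = np.arange(100_000) / fs    # 10 ms of samples
rf = np.cos(2 * np.pi * f_rf * t + phi_true)

# Mix with a complex LO: products appear at f_rf - f_lo (the IF we keep)
# and f_rf + f_lo (averaged away below).
baseband = rf * np.exp(-1j * 2 * np.pi * f_lo * t)

# Demodulate the IF and average; the sum-frequency term integrates to ~0,
# leaving a phasor whose angle is the carrier phase relative to the LO.
phi_est = np.angle(np.mean(baseband * np.exp(-1j * 2 * np.pi * f_if * t)))
print(round(phi_est, 3))       # prints 0.7
```

The point is that the phase survives the frequency conversion untouched, so it can be measured at 10 kHz even though it "belongs" to a microwave carrier; only the LO's own phase stability limits the measurement.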
That's the conceptual approach. In practice, however, there are difficulties with supplying a common reference signal to two sites on opposite sides of the Earth. What is done instead is to mix the received signal down using a reference synthesised from a quiet local standard, such as a rubidium oscillator or a hydrogen maser, and record it along with mixed-down versions of other references (usually GPS satellites). This allows the timebases to be mutually synchronised when the recordings are brought together for reconstruction; the recordings used to travel on tapes, then hard drives, and now over fibre optics. The key thing is that the good mid-term stability of the local standard makes very long comparison times available, so the synchronisation uncertainty can be averaged down to many orders of magnitude below the single-observation noise.
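The "brought together for reconstruction" step can be sketched with a toy correlation example (this is an assumption-laden illustration, not the EHT correlator: real correlators work on filtered, clock-corrected streams). Two stations record the same noise-like sky signal, one geometrically delayed; cross-correlating the recordings recovers the relative delay, and the phase of the cross-spectrum at each frequency gives the relative phase:

```python
import numpy as np

# Toy sketch: two "stations" record the same noise signal, one delayed;
# cross-correlation of the recordings recovers the relative delay.
rng = np.random.default_rng(0)
n = 4096
sky = rng.standard_normal(n)            # common noise-like signal from the source
true_delay = 37                         # geometric delay between stations, in samples

station_a = sky
station_b = np.roll(sky, true_delay)    # circular delay keeps the toy simple

# Circular cross-correlation via FFT; the peak sits at the relative delay.
spectrum = np.fft.fft(station_b) * np.conj(np.fft.fft(station_a))
xcorr = np.fft.ifft(spectrum).real
est_delay = int(np.argmax(xcorr))
print(est_delay)                        # prints 37
```

Because the signal is noise, only the correctly aligned overlap adds coherently; everything else averages toward zero, which is the same averaging-down of uncertainty described above.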
The paper that Jim's answer linked to is well worth a read.