Well, you're not going to be making RF measurements up to 30 GHz without spending a bunch of money, so either path is big bucks.
Spectrum analyzers are typically used for frequency-domain measurements. You'll get a display of power vs. frequency, and the controls are set up for the relevant quantities: center frequency, span, resolution bandwidth, signal powers in dBm/dBc, etc.
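Since those units trip people up, here's a rough sketch (in Python, with hypothetical numbers, assuming a 50 Ω system) of what dBm and dBc actually mean:

```python
import math

def vrms_to_dbm(v_rms: float, r_ohms: float = 50.0) -> float:
    """Convert an RMS voltage across a load (50 ohms assumed) to dBm."""
    p_watts = v_rms ** 2 / r_ohms             # P = Vrms^2 / R
    return 10.0 * math.log10(p_watts / 1e-3)  # dBm is power referenced to 1 mW

def dbc(signal_dbm: float, carrier_dbm: float) -> float:
    """Power of a spur or harmonic relative to the carrier, in dBc."""
    return signal_dbm - carrier_dbm

print(vrms_to_dbm(0.5))    # 0.5 Vrms into 50 ohms -> ~+7 dBm
print(dbc(-40.0, 7.0))     # a -40 dBm spur next to a +7 dBm carrier: -47 dBc
```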
Digital oscilloscopes don't have sample rates high enough to directly sample a 30 GHz signal, so they'll undersample and assume the signal repeats. That's probably a safe assumption, but with no front-end filters built in, you've got dynamic range issues, as well as aliasing concerns that aren't present in a spectrum analyzer. You also won't directly get spectral plots out of a digital oscilloscope; you'll need to run an FFT on the samples. Now, that opens up a can of worms: FFT bin width, windowing function selection, etc. All stuff that can be worked through, but another set of questions to deal with.
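If you go the FFT route, here's a minimal sketch of what's involved, assuming NumPy and a 50 Ω system. The bin width (fs/N) and the window choice are exactly the knobs mentioned above:

```python
import numpy as np

def scope_fft_dbm(samples, fs, r_ohms=50.0):
    """One-sided FFT of scope samples (volts) -> power spectrum in dBm.

    Bin width is fs / N, so record length trades off against frequency
    resolution. A Hann window is used here; other windows trade main-lobe
    width against sidelobe leakage differently.
    """
    n = len(samples)
    window = np.hanning(n)
    # Compensate for the window's coherent gain so tone amplitudes stay correct
    coherent_gain = window.sum() / n
    spectrum = np.fft.rfft(samples * window) / (n * coherent_gain)
    v_peak = np.abs(spectrum)
    v_peak[1:] *= 2.0                                  # fold in negative frequencies
    p_watts = (v_peak / np.sqrt(2.0)) ** 2 / r_ohms    # Vrms^2 / R
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, 10.0 * np.log10(np.maximum(p_watts, 1e-30) / 1e-3)

# Example: a 1 kHz, 0.5 Vrms sine sampled at 100 kS/s (bin width 20 Hz)
fs = 100e3
t = np.arange(5000) / fs
freqs, dbm = scope_fft_dbm(0.5 * np.sqrt(2) * np.sin(2 * np.pi * 1e3 * t), fs)
print(freqs[np.argmax(dbm)], dbm.max())   # ~1000 Hz at about +7 dBm
```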
You won't get eye diagrams out of a spectrum analyzer; that's a meaningless measurement at RF, since an eye diagram is a demodulated-signal measurement.
Ultimately, if you want time-domain data, use an oscilloscope. If you want spectral information, use a spectrum analyzer.
An oscilloscope plots voltage as a function of time, so your display is reasonable as you show it. However, the term "time base" is meaningless as a label for the X-axis scale. What you want is "s/div" (or ms/div or µs/div). This is independent of the sample rate, although there is little point in using more than a few pixels per sample.
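To make that relationship concrete, a quick back-of-the-envelope calculation (all numbers hypothetical):

```python
# Samples per division follow directly from the two independent settings:
# horizontal scale (s/div) and sample rate (S/s).
s_per_div = 1e-3           # 1 ms/div horizontal scale
sample_rate = 1e6          # 1 MS/s
pixels_per_div = 50        # display geometry

samples_per_div = s_per_div * sample_rate             # 1000 samples per division
pixels_per_sample = pixels_per_div / samples_per_div  # 0.05 -> must merge samples
print(samples_per_div, pixels_per_sample)
```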
The sample times you mention are very slow for ordinary oscilloscopes. Some signals will be reasonably visible at those rates, but most things you encounter will not be.
I would probably figure out the fastest sample rate you can support, then always sample at that rate. If the application doesn't need samples that fast, you can merge multiple samples into one before sending them over the network.

In that case you don't want traditional decimation, which low-pass filters to eliminate frequencies that would alias. Instead, for each data point, send the minimum and maximum A/D samples covered by that point, and draw each data point as a vertical segment spanning that min/max range. If the user selects a slow sample rate while a faster signal is being sampled, one still within the capability of the A/D and the underlying fast sample rate, the display becomes a horizontal band whose vertical extent shows the signal peaks. That is a much better display than something that aliases.
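A minimal sketch of that min/max scheme, assuming NumPy and an integer decimation factor (the names are mine, not from any particular product):

```python
import numpy as np

def minmax_decimate(samples: np.ndarray, factor: int):
    """Peak-preserving decimation: for each output point keep the min and
    max of the raw samples it covers, instead of low-pass filtering.
    A fast signal then shows up as a solid vertical band rather than aliasing."""
    n = (len(samples) // factor) * factor
    blocks = samples[:n].reshape(-1, factor)
    return blocks.min(axis=1), blocks.max(axis=1)

# Example: a fast sine captured at the full A/D rate, one display point per 100 samples
t = np.arange(10_000)
fast = np.sin(2 * np.pi * 0.23 * t)    # far above the displayed rate
lo, hi = minmax_decimate(fast, 100)
# Each display point is drawn as a vertical segment from lo[i] to hi[i];
# here every point spans roughly -1..+1, i.e. a solid horizontal band.
print(lo[:3], hi[:3])
```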
Almost all analog 'scopes display slightly behind real time...perhaps by 50 to 100 ns.
Digital 'scopes likely add much more delay....
Waveform samples collected in real time are often stored in digital memory. From there, a microcontroller transforms the samples to drive the digital display. The microcontroller may also be required to do other calculations (for example: mean voltage, RMS voltage, peak voltage, frequency) that slow the display.
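As an illustration of the kind of per-acquisition math that can delay the display, here's a rough NumPy sketch of those measurements (illustrative only, not any vendor's algorithm):

```python
import numpy as np

def waveform_stats(samples: np.ndarray, fs: float) -> dict:
    """Per-acquisition measurements a 'scope CPU might compute between updates."""
    mean = samples.mean()
    rms = np.sqrt(np.mean(samples ** 2))
    vpp = samples.max() - samples.min()
    # Crude frequency estimate: count rising zero crossings of the AC component
    ac = samples - mean
    crossings = np.nonzero((ac[:-1] < 0) & (ac[1:] >= 0))[0]
    freq = ((len(crossings) - 1) / ((crossings[-1] - crossings[0]) / fs)
            if len(crossings) > 1 else float("nan"))
    return {"mean": mean, "rms": rms, "vpp": vpp, "freq_hz": freq}

# Example: 50 Hz sine with a DC offset, sampled at 10 kS/s
fs = 10_000.0
t = np.arange(2000) / fs
x = 1.0 + 0.5 * np.sin(2 * np.pi * 50 * t)
print(waveform_stats(x, fs))   # mean ~1.0, rms ~1.06, vpp ~1.0, ~50 Hz
```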
Display latency depends on processor speed and the display interface, and you cannot know it from outside: samples stored in memory can sit undisplayed for a very long time.
For example, my digital 'scope states that up to 2000 waveforms per second can be displayed, if no additional waveform calculations are requested. No mention is made of display latency, which is an entirely different matter.
If you must discover the timing relationship between two events, use a multichannel 'scope, or one with an independent trigger input. The timing between a trigger event and the displayed waveform is available to the user, with a decently defined specification in the manual. Similarly, timing between channels of a multichannel 'scope is well defined.