I want to measure how much noise a certain oscilloscope contributes. Until now my team has used the approach of injecting a square wave and measuring the RMS random jitter. This method assumes the signal generator contributes no (or at least negligible) noise, which is not a good enough assumption for us.
Apart from that assumption, is this a correct approach? If not, what is wrong with it?
Is there a way to evaluate the random noise generated from within the scope itself without using an external signal?
To answer the questions in the comments below: we are trying to evaluate an oscilloscope, but we have only a small noise budget allowed in the system while measuring (I cannot disclose other details). We are currently considering the DPO70000SX series with 33 GHz bandwidth and 200 GS/s sample rate. Our goal at the moment is to measure the RMS random jitter in the time domain, which is the standard deviation of the total jitter distribution.
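For reference, the quantity described above can be computed from measured edge timestamps: fit an ideal (jitter-free) clock to the rising-edge times and take the standard deviation of the residuals, i.e. the time interval error (TIE) distribution. A minimal sketch, assuming we already have a list of edge timestamps extracted from the capture (the function name and inputs here are illustrative, not any scope vendor's API):

```python
import statistics

def rms_jitter(edge_times):
    """Estimate RMS jitter from rising-edge timestamps.

    Least-squares fits an ideal clock t = a + b*k over the edge index k,
    then returns the standard deviation of the time-interval-error (TIE)
    residuals -- the 'standard deviation of the jitter distribution'.
    """
    n = len(edge_times)
    idx = list(range(n))
    mean_k = statistics.fmean(idx)
    mean_t = statistics.fmean(edge_times)
    # Slope b is the recovered ideal period; a is the time offset.
    b = sum((k - mean_k) * (t - mean_t) for k, t in zip(idx, edge_times)) \
        / sum((k - mean_k) ** 2 for k in idx)
    a = mean_t - b * mean_k
    # TIE: deviation of each measured edge from the ideal edge position.
    tie = [t - (a + b * k) for k, t in zip(idx, edge_times)]
    return statistics.pstdev(tie)

# Example: a perfect 1 ns clock has zero TIE, hence zero RMS jitter.
ideal = [k * 1e-9 for k in range(10)]
print(rms_jitter(ideal))
```

Note this yields *total* jitter std dev; separating the random (Gaussian) component from deterministic jitter requires a decomposition (e.g. a dual-Dirac fit), which scope jitter-analysis packages typically perform internally.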
Any other suggestions on how to evaluate this oscilloscope are most welcome.