I had thought that a (digital) oscilloscope with a higher sampling rate would automatically have a higher bandwidth. That seems intuitive because of the Nyquist sampling theorem. But I've read in several places that in an oscilloscope the "sampling rate is not directly related to the bandwidth specification" (see here for example). That doesn't make any sense to me. Could you explain the logic?