I am using a Keysight MSOX3102T oscilloscope, which has 1 GHz bandwidth. At these frequencies, the \$1~M\Omega\$ input impedance is not suitable due to reflections, so the scope offers a selectable \$50 \Omega\$ input port.
(Schematic created using CircuitLab)
Assuming the configuration shown in the figure, in which I measure an RMS voltage of \$V_{rms} = 2.2V\$ on the oscilloscope, is it correct to compute the power provided by the signal generator as:
$$ P = \frac{V_{rms}^2}{R} = \frac{2.2^2}{50} = 0.0968\ \text{W} = 96.8\ \text{mW}$$
where \$V_{rms}\$ is the RMS voltage measured on the oscilloscope and \$R\$ is the input port impedance?
UPDATE: Thank you everyone for the replies. Maybe I have not been clear enough in explaining my doubt. It can be summarized as: with the input impedance set to \$50 \Omega\$, can I take the load impedance to be just \$50 \Omega\$, or does it not work that way in an oscilloscope?
Best Answer
If you measure 2.2 volts RMS on the scope, then that equates to a power dissipated in the scope's 50 ohm internal impedance of 96.8 mW. The power provided by the signal generator is shared equally between its own 50 ohms and the scope's 50 ohms, so V1 (in your picture) provides a total power of 193.6 mW or, put another way, V1 has an RMS open-circuit output of 4.4 V.
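The bookkeeping above can be sketched numerically. This is a minimal check assuming a matched 50 ohm source driving the scope's 50 ohm input, using the values from the question:

```python
# Power bookkeeping for a matched 50-ohm source driving a scope's
# 50-ohm input, using the values from the question above.

R_SCOPE = 50.0    # ohms, scope input impedance
R_SOURCE = 50.0   # ohms, generator source impedance (assumed matched)

v_scope_rms = 2.2  # volts RMS, as read on the scope

# Power dissipated in the scope's 50-ohm termination: P = V^2 / R
p_scope = v_scope_rms**2 / R_SCOPE  # 96.8 mW

# With a matched source, the same current flows through the generator's
# own 50 ohms, so it dissipates an equal power internally; the generator
# therefore supplies twice the power delivered to the scope.
p_total = p_scope * (1 + R_SOURCE / R_SCOPE)  # 193.6 mW from V1

# The generator's open-circuit (EMF) voltage is twice the loaded voltage,
# because source and load form a 1:1 voltage divider.
v_source_rms = v_scope_rms * (1 + R_SOURCE / R_SCOPE)  # 4.4 V RMS

print(f"Scope power:  {p_scope * 1e3:.1f} mW")
print(f"Source power: {p_total * 1e3:.1f} mW")
print(f"Source EMF:   {v_source_rms:.1f} V RMS")
```

The divider factor `(1 + R_SOURCE / R_SCOPE)` makes explicit that the factor of two only holds when the source really is matched at 50 ohms.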
If the scope didn't make its input impedance 50 ohms, you would get signal reflections, and people would be up in arms.