Can signal arrival times be a function of receiver bandwidth?

bandwidth, electromagnetic, noise-floor, receiver, signal

Let's assume that the speed of light is exactly $3\times10^{8}\ \text{m/s}$. Suppose we have a transmitter on the surface of the earth emitting a signal at 1000 MHz, and that there is a receiver $3\times10^{8}\ \text{m}$ above the surface of the earth receiving the signal (line of sight, straight-line distance, normal to the surface of the earth). The bandwidth of the signal is 2 MHz, and the signal is a precision timed signal such that arrival times at the receiver can be accurately determined by an on-board computer. When the receiver's bandwidth is set to 10 MHz, the arrival time is calculated by the on-board computer to be $1\times10^{-3}\ \text{s}$, which is what it should be with the speed of light and distance as given above (ignoring atmospheric effects). Let's suppose that when the receiver's bandwidth is changed to 20 MHz, the arrival time is measured as $1.5\times10^{-3}\ \text{s}$.

Two questions:

1.) Taking the atmosphere into consideration and assuming non-ideal (i.e., real-world) receivers and transmitters, is it possible to have different arrival times for different receiver bandwidths as described above? (Atmospheric multipath should not be dependent on the receiver's bandwidth setting.)

2.) Could "noise power" affect arrival time calculations based on the receiver's bandwidth setting?

Best Answer

I think you mean a 1 second delay, not 1 ms: at $3\times10^{8}\ \text{m/s}$ over a $3\times10^{8}\ \text{m}$ path, the one-way travel time is $d/c = 1\ \text{s}$.

Arrival time is governed by the permeability and permittivity of the substance the radio wave travels through, nothing else. The wave speed in a medium is $v = 1/\sqrt{\mu\varepsilon} = c/\sqrt{\mu_r\varepsilon_r}$, so if the signal travels through a complex medium whose thickness is significant compared to the overall distance, it will slow down and arrive later. For instance, light (RF) takes a lot longer to pass through water than through a vacuum.
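To put rough numbers on that, here is a minimal sketch (assumptions: the question's $3\times10^{8}\ \text{m}$ path, a hypothetical 1 km slab of water somewhere along it, and a nominal relative permittivity of about 80 for water at RF, which is only an illustrative figure):

```python
import math

C = 3e8  # speed of light in vacuum, m/s (exact value assumed in the question)

def propagation_delay(distance_m, rel_permittivity=1.0, rel_permeability=1.0):
    """One-way delay: distance divided by the wave speed 1/sqrt(mu*eps)."""
    refractive_index = math.sqrt(rel_permittivity * rel_permeability)
    return distance_m * refractive_index / C

path = 3e8        # m, straight-line distance from the question
water_slab = 1e3  # m, hypothetical stretch of water along the path (illustrative only)

t_vacuum = propagation_delay(path)  # 1.0 s for the whole path in vacuum
# Same path, but with 1 km of it replaced by water (eps_r ~ 80, an assumed nominal value):
t_mixed = (propagation_delay(path - water_slab)
           + propagation_delay(water_slab, rel_permittivity=80))

print(f"vacuum-only delay : {t_vacuum:.6f} s")
print(f"with 1 km of water: {t_mixed:.6f} s (extra {(t_mixed - t_vacuum) * 1e6:.1f} us)")
```

Note that the receiver's bandwidth never enters this calculation; only the medium and the path length do.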

Noise has nothing to do with it, and why oh why would the yet-to-arrive radio wave know anything about the receiver's bandwidth?
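On question 2 specifically: what the bandwidth setting does change is the thermal noise power the receiver admits, $N = kTB$, so the noise floor rises by about 3 dB when the bandwidth doubles from 10 MHz to 20 MHz. A minimal sketch of that scaling, assuming a nominal 290 K noise temperature:

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T_NOISE = 290.0             # K, assumed standard noise temperature

def thermal_noise_dbm(bandwidth_hz, temperature_k=T_NOISE):
    """Thermal noise power kTB, expressed in dBm."""
    noise_watts = K_BOLTZMANN * temperature_k * bandwidth_hz
    return 10 * math.log10(noise_watts / 1e-3)

for bw_hz in (10e6, 20e6):
    print(f"{bw_hz / 1e6:.0f} MHz bandwidth -> noise floor ~ {thermal_noise_dbm(bw_hz):.1f} dBm")
```

That lowers the SNR at the wider setting, but it has no bearing on when the wave physically arrives.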