Wireless – Calculating Maximum Distance for Receiving LTE Signal


I work for a large retailer that recently started adding outdoor Yagi antennas to try to boost the LTE signal for the LTE-based modems at each store. I wanted to see if there is a way to calculate the maximum distance a cellular tower could be from us while still maintaining a usable signal.

Here is what I know:

  1. Our received signal with the built-in 1 dBi antenna is, for example, -80 dBm.
  2. I know that our Yagi antenna supposedly gives us a gain of 11 dBi, assuming we mount it properly, get the polarization correct, etc.
  3. I also have the SNR, if that is necessary. In this case, the SNR is 2 dB.
  4. I can find out how many kilometers away the nearest cell tower is for my carrier.
  5. I know the carrier frequency is LTE band 13 (Verizon), which is 700 MHz.

Is there a formula to extrapolate the signal strength, all else staying the same, given that we use an antenna with 11 dBi of gain instead of the existing antenna? Or are there too many variables? Even a ballpark figure would help.
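For concreteness, the naive extrapolation I have in mind is just adding the difference in antenna gain to the measured level. A minimal sketch of that arithmetic in Python, using the example values from the list above (I'm not sure this is valid):

```python
# First-order estimate: all else equal, swapping the 1 dBi built-in antenna
# for an 11 dBi Yagi adds the gain difference directly to the received power.
measured_rssi_dbm = -80.0   # example reading with the built-in antenna
old_gain_dbi = 1.0          # built-in antenna gain
new_gain_dbi = 11.0         # quoted Yagi gain (assumes proper mounting/polarization)

expected_rssi_dbm = measured_rssi_dbm + (new_gain_dbi - old_gain_dbi)
print(f"Expected received signal: {expected_rssi_dbm:.1f} dBm")  # -> -70.0 dBm
```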

Best Answer

Let's assume that the EIRP from the cell tower is about 30 dBm and that it is about 2 km away. Making yet another strong assumption that the path loss can be characterised by free space alone:

Free Space Path Loss = 20 log10(4 * pi * d * f / c) ~ 95.37 dB

That means the received power would be 30 - 95.37 + 11 = -54.37 dBm
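A minimal sketch of that link budget in Python; the 30 dBm EIRP, the 2 km distance, and the free-space assumption are the same placeholders used above:

```python
import math

# Free-space link budget with the assumed numbers above.
eirp_dbm = 30.0          # assumed tower EIRP
distance_m = 2_000.0     # assumed distance to the tower (2 km)
freq_hz = 700e6          # LTE band 13, ~700 MHz
rx_gain_dbi = 11.0       # Yagi gain at the receiver
c = 299_792_458.0        # speed of light, m/s

fspl_db = 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)
rx_power_dbm = eirp_dbm - fspl_db + rx_gain_dbi

print(f"Free-space path loss: {fspl_db:.2f} dB")        # ~95.4 dB
print(f"Estimated received power: {rx_power_dbm:.2f} dBm")
```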

However, this is far from accurate, since the real path loss can be much greater than ~95 dB, and you would also need to know the thermal noise at the receiver (and possibly the interference) to properly estimate the SNR.
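For the noise side, a rough sketch of how the thermal noise floor and the resulting SNR could be estimated; the 10 MHz channel bandwidth and 7 dB noise figure are assumptions for illustration, and interference is ignored entirely:

```python
import math

# Thermal noise floor: kT at ~290 K is about -174 dBm/Hz; scale by bandwidth
# and add an assumed receiver noise figure.
bandwidth_hz = 10e6      # assumed 10 MHz LTE channel
noise_figure_db = 7.0    # assumed modem noise figure

thermal_noise_dbm = -174 + 10 * math.log10(bandwidth_hz)
noise_floor_dbm = thermal_noise_dbm + noise_figure_db    # ~ -97 dBm

rx_power_dbm = -54.37    # from the free-space estimate above
snr_db = rx_power_dbm - noise_floor_dbm

print(f"Noise floor: {noise_floor_dbm:.1f} dBm")
print(f"Estimated SNR: {snr_db:.1f} dB")  # optimistic, free-space only
```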