Frequency allocation for telecommunication companies

Tags: bandwidth, bit rate, communication, data, frequency

A high bit rate in data communication requires a high bandwidth. Suppose there are hundreds of carriers in a particular region, all allocated microwave frequencies, and minimum-spacing rules are also applied during allocation. This reduces the bandwidth each carrier gets, so by the Shannon-Hartley theorem the achievable bit rate should also decrease. How, then, do carriers claim to offer high bit rates? Is there a different way in which frequencies are allocated?

Best Answer

According to Wikipedia, 4G networks (IMT-Advanced)

Have peak link spectral efficiency of 15 bit/s/Hz in the downlink, and 6.75 bit/s/Hz in the uplink (meaning that 1 Gbit/s in the downlink should be possible over less than 67 MHz bandwidth).
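The quoted bandwidth figure follows from simple arithmetic: the bandwidth needed is the target data rate divided by the spectral efficiency. A quick sanity check (the variable names are illustrative, not from any standard):

```python
# Verify the quoted figure: at 15 bit/s/Hz peak spectral efficiency,
# how much bandwidth does a 1 Gbit/s downlink need?
rate_bps = 1e9                  # 1 Gbit/s downlink target
efficiency_bps_per_hz = 15      # IMT-Advanced peak downlink spectral efficiency

bandwidth_hz = rate_bps / efficiency_bps_per_hz
print(round(bandwidth_hz / 1e6, 1))  # 66.7 (MHz), i.e. "less than 67 MHz"
```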

The possibility of transferring more than 1 bps/Hz is a direct consequence of the Shannon-Hartley theorem:

\$C = B \log_2\left(1+\mathrm{SNR}\right)\$

where \$C\$ is the channel capacity in bit/s, \$B\$ is the bandwidth in Hz, and SNR is the linear (not dB) signal-to-noise ratio.

Achieving 15 bps/Hz thus requires an SNR of at least \$2^{15}-1\$, or about 33,000.
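Solving the Shannon-Hartley formula for SNR gives \$\mathrm{SNR} = 2^{C/B}-1\$. A short sketch of that rearrangement (the helper function name is mine, not from any library):

```python
import math

def required_snr(spectral_efficiency_bps_per_hz):
    """Minimum linear SNR for a given spectral efficiency.

    From C = B * log2(1 + SNR), solving for SNR yields
    SNR = 2**(C/B) - 1, where C/B is the spectral efficiency.
    """
    return 2 ** spectral_efficiency_bps_per_hz - 1

snr = required_snr(15)         # 4G peak downlink: 15 bit/s/Hz
snr_db = 10 * math.log10(snr)  # same quantity expressed in decibels

print(snr)                 # 32767 (about 33,000)
print(round(snr_db, 1))    # about 45.2 dB
```

Expressing the result in decibels (about 45 dB) makes clear how demanding this is: such SNR is realistic only at short range and with techniques like MIMO contributing to the effective per-Hz throughput.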