Electronic – The difference between capacity and throughput in the context of wireless/cellular communication

cellular, communication, digital-communications, signal, wireless

From what I understand:

  • Throughput tells you how much data was actually transferred from a source in a given time
  • Bandwidth tells you how much data could theoretically be transferred from a source in a given time
  • Capacity is the maximum amount of information that can be transmitted per unit time

I'm not able to pin down the difference between throughput and capacity. Any help would be much appreciated.

Best Answer

Accurate definitions, with due attention to wording, should clarify the issue:

  1. Throughput is the amount of data received by the destination; the average throughput is that amount per unit of time.

  2. Bandwidth is a parameter of the signals transmitted over a communication link, specifically their width in the frequency domain. When one speaks of the bandwidth of a communication channel, one means that signals arrive at the receiver distorted by the channel's transfer function; the channel bandwidth characterizes that transfer function.

  3. Although a linear channel transfer function may distort signals, in the absence of noise it does not impair the ability of an ideal receiver to recover the data exactly, that is, to achieve reliable, error-free communication. Only the presence of noise causes transmission errors and limits the average throughput of a channel. The maximum data rate at which reliable communication is possible is called the capacity of the channel. Information Theory states that the two parameters defining the achievable average throughput are the bandwidth and the signal-to-noise ratio.
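To make the distinction concrete, here is a minimal sketch (all numbers hypothetical): throughput is something you *measure* at the destination, while capacity is a *bound* you compute from channel parameters.

```python
# Toy sketch with hypothetical numbers: average throughput is simply
# data actually delivered to the destination, divided by elapsed time.
received_bits = 3_000_000   # bits successfully received at the destination
elapsed_s = 2.0             # measurement window, in seconds

avg_throughput = received_bits / elapsed_s  # bit/s actually achieved
print(f"average throughput = {avg_throughput / 1e6:.1f} Mbit/s")
```

Whatever coding and modulation the link uses, this measured figure can never exceed the channel capacity discussed below.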

Before Claude Shannon invented Information Theory, the reliability of communication over noisy channels was understood as the robustness of each individual received signal against detection errors. The only way to achieve reliable communication within a limited bandwidth was to reduce the data rate (the average throughput) by repeating transmissions. Even so, zero error probability (reliable communication) at a finite, non-infinitesimal data rate was not achievable even in theory.

Shannon showed that by coding the data sent over a channel, one can communicate reliably at a strictly positive rate, possibly low, but not infinitely low. The theory does not specify the coding techniques; for particular channel and signal models one may need to develop very sophisticated codes, but the theory guarantees that such coding schemes exist. And there is a maximum rate at which reliable communication over a noisy channel is possible: this maximum rate is called the channel capacity. For the additive white Gaussian noise channel, Information Theory gives the capacity as $$ C_{AWGN} = W \log_2(1+SNR) $$ where $W$ is the bandwidth and $SNR$ is the signal-to-noise ratio: the ratio of the received signal power to the noise power at the receiver.
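The capacity formula above is easy to evaluate. A short sketch, with the channel parameters (20 MHz bandwidth, 20 dB SNR) chosen purely as a hypothetical example:

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity of an AWGN channel in bit/s: C = W * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def db_to_linear(snr_db: float) -> float:
    """Convert an SNR given in decibels to a linear power ratio."""
    return 10.0 ** (snr_db / 10.0)

# Hypothetical example: a 20 MHz channel at 20 dB SNR (power ratio 100).
c = awgn_capacity(20e6, db_to_linear(20.0))
print(f"capacity = {c / 1e6:.1f} Mbit/s")  # ≈ 133.2 Mbit/s
```

Note that doubling the bandwidth doubles capacity, while doubling the SNR only adds roughly one extra bit per symbol, which is why wide channels matter so much in cellular systems.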

The capacity is denoted $C_{AWGN}$ to stress that the noise is additive white Gaussian noise. This noise model is not too restrictive; on the contrary, Information Theory shows that AWGN is the worst case: for a given noise power, any other noise model yields a capacity at least as high.