Network Queuing Delay – Common Questions Answered


I am learning computer networks and am confused by queuing delay. My textbook says that when La/R approaches 1, and inter-arrival times are random, the average queuing delay approaches infinity. Here, R is the transmission rate in bits/sec, a is the arrival rate in packets/sec, and every packet carries L bits of data.

In my view, if the arrival rate equals the transmission rate, for example both are 500 packets/second, then sometimes arrivals will briefly outpace transmission and the queue will grow, but sometimes they will fall behind and the queue will shrink. It seems like the two should reach some kind of balance, and the queue should not become infinite. Someone told me this is a queuing theory model in which arrivals follow a Poisson distribution. Can anybody give me a more detailed explanation? Thanks a lot!

Here is what my textbook says:

Typically, the arrival process to a queue is random; that is, the arrivals do not follow any pattern and the packets are spaced apart by random amounts of time. In this more realistic case, the quantity La/R is not usually sufficient to fully characterize the queuing delay statistics. Nonetheless, it is useful in gaining an intuitive understanding of the extent of the queuing delay. In particular, if the traffic intensity is close to zero, then packet arrivals are few and far between and it is unlikely that an arriving packet will find another packet in the queue. Hence, the average queuing delay will be close to zero. On the other hand, when the traffic intensity is close to 1, there will be intervals of time when the arrival rate exceeds the transmission capacity (due to variations in packet arrival rate), and a queue will form during these periods of time; when the arrival rate is less than the transmission capacity, the length of the queue will shrink. Nonetheless, as the traffic intensity approaches 1, the average queue length gets larger and larger. The qualitative dependence of average queuing delay on the traffic intensity is shown in Figure 1.18.

[Figure 1.18: Dependence of average queuing delay on traffic intensity]
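A quantitative aside (not from the textbook, and assuming Poisson arrivals, as the question's mention of the Poisson distribution suggests): with arrival rate a and fixed-size packets of L bits, the queue above is the classic M/D/1 model, and the Pollaczek–Khinchine formula turns the curve in Figure 1.18 into a closed form:

$$
W_q \;=\; \frac{L}{R}\cdot\frac{\rho}{2(1-\rho)}, \qquad \rho = \frac{La}{R}
$$

The 1/(1−ρ) factor is what drives the average delay toward infinity as the traffic intensity ρ approaches 1: at ρ = 0.9 the average wait is 4.5 transmission times, and at ρ = 0.99 it is already 49.5.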

Best Answer

If arrival times are random, then occasionally packets will arrive faster than they can be transmitted, and the queue will grow. The quiet periods do not fully cancel the bursts: an empty queue cannot go negative, so spare capacity during a lull is simply wasted, while a backlog from a burst carries over into the next one. The closer La/R gets to 1, the longer those backlogs take to drain, and the average queuing delay grows without bound. In a real router the buffer is finite, so rather than an infinite queue you eventually get a full buffer and dropped packets.
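To see why the quiet periods never fully cancel the busy ones, here is a minimal simulation sketch (our own illustration, not part of the answer above; the function name and the chosen rates are arbitrary). It models Poisson arrivals, fixed-size packets, and an unbounded buffer (the M/D/1 setup from the aside above) via the Lindley recursion: each packet's wait equals the previous packet's wait plus one transmission time, minus the gap to the next arrival, floored at zero.

```python
import random

def avg_queuing_delay(rho, n_packets=200_000, service_time=1.0, seed=1):
    """Estimate the average queuing delay of an M/D/1 queue with the
    Lindley recursion W_next = max(0, W + S - A): S is the fixed
    transmission time L/R, and A is an exponential inter-arrival gap
    (Poisson arrivals at rate a = rho / service_time)."""
    rng = random.Random(seed)
    arrival_rate = rho / service_time            # a = rho * R / L
    wait, total_wait = 0.0, 0.0
    for _ in range(n_packets):
        total_wait += wait                       # record this packet's wait
        gap = rng.expovariate(arrival_rate)      # time until the next arrival
        wait = max(0.0, wait + service_time - gap)
    return total_wait / n_packets

for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
    sim = avg_queuing_delay(rho)
    theory = rho / (2 * (1 - rho))               # M/D/1, in units of L/R
    print(f"rho={rho:5.2f}  simulated={sim:7.2f}  theory={theory:7.2f}")
```

The max(..., 0) is exactly the asymmetry the asker is missing: idle time while the queue is empty is lost forever, while backlog carries over, so the average wait climbs toward ρ/(2(1−ρ)) transmission times rather than balancing out. (The ρ = 0.99 run converges slowly and needs a longer run to approach the theoretical value.)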
