# TCP – How to Calculate SampleRTT?


I'm building a TCP-like reliable transport on top of a UDP socket in C, and I'm trying to make the retransmission timeout adaptive with this algorithm:

```
estimatedRTT = (1 - alpha) * estimatedRTT + alpha * sampleRTT;
devRTT = (1 - beta) * devRTT + beta * fabs(sampleRTT - estimatedRTT);
timeout = estimatedRTT + 4 * devRTT;
```
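For reference, this is roughly how I've translated the update step into C (a sketch; the struct name and the alpha = 1/8, beta = 1/4 defaults are my assumptions, matching the usual textbook values):

```c
#include <math.h>

/* State for the adaptive timeout.  alpha = 1/8 and beta = 1/4 are the
   common textbook defaults; they are assumptions here, not requirements. */
typedef struct {
    double estimatedRTT; /* smoothed RTT estimate */
    double devRTT;       /* smoothed RTT deviation */
    double timeout;      /* retransmission timeout */
} rtt_state;

/* One update, applied whenever a new sampleRTT is measured. */
void rtt_update(rtt_state *s, double sampleRTT)
{
    const double alpha = 0.125, beta = 0.25;
    s->estimatedRTT = (1 - alpha) * s->estimatedRTT + alpha * sampleRTT;
    s->devRTT = (1 - beta) * s->devRTT
              + beta * fabs(sampleRTT - s->estimatedRTT);
    s->timeout = s->estimatedRTT + 4 * s->devRTT;
}
```

So a single inflated sample feeds straight into `timeout`, which is exactly where my problem shows up.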

but I don't understand how to calculate sampleRTT. I've run into two problems:

1. If I calculate sampleRTT as the round-trip time of each packet (the time between sending the packet and receiving the corresponding acknowledgment), the timeout grows forever under packet loss. For example, say the timeout is 100: I send a packet, it is lost, and it is retransmitted, say, 3 times. When the ACK finally arrives, the measured sampleRTT is about 300 (3 timeouts' worth of waiting), so the recomputed timeout is certainly larger than before. The next packet is also lost and retransmitted some number of times before arriving, inflating the timeout again, and so on: if the loss rate in the network is constant, the timeout never stops increasing. To me this doesn't reflect the state of network congestion, because if there is packet loss, a larger timeout won't reduce losses, at least not directly, and the bandwidth is underutilized.
2. Suppose the sender sends 3 packets; the first is lost while the other two arrive at their destination, so it retransmits only the first. The sender then receives a single cumulative ACK covering all 3 packets. How do I calculate sampleRTT in this case, given that I don't have a separate ACK for each packet?

I hope I was clear.