
Assume a scenario where the size of the congestion window of a TCP connection is 40 KB when a timeout occurs. The maximum segment size (MSS) is 2 KB, and the propagation delay is 200 msec. The time taken by the TCP connection to get back to a 40 KB congestion window is _________ msec.

I think the answer should be 5600, but 6000 is given. They have added the last RTT as well. Should it be added?

After a timeout, the congestion window in any case resets to 1 MSS.

1 | 2 | 4 | 8 | 16 | 20 | 22 | 24 | 26 | 28 | 30 | 32 | 34 | 36 | 38 | 40 → 15 RTTs * 400 = 6000 msec. (Even without counting the round after the last transmission, there are 15 RTTs.)
@Ashwin 1 MSS = 2 KB, so your first value should be 2 (not 1).
What about the RTT during which the timeout occurs?

When Wc = 40 KB and a timeout occurs, that means 40/2 = 20 segments were sent and the retransmission timer was started, but the ACK did not arrive within the timeout. Hence the slow-start phase begins again after the timeout.

So the time between the 40 KB transmission and the first transmission of the slow-start phase is the timeout interval.

But since we don't know the timeout value, we consider it to be one RTT.

Hence total RTTs = 15 (14 + 1, where the extra RTT is not the one after the last transmission but the round in which the timeout occurs).
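The growth pattern being debated can be sketched as a short simulation. This is a sketch under the usual assumptions: after the timeout, ssthresh is set to half the 40 KB window (20 KB), cwnd restarts at 1 MSS = 2 KB, slow start doubles the window each RTT up to ssthresh, and congestion avoidance then adds 1 MSS per RTT.

```python
# Assumptions: ssthresh = 40/2 = 20 KB; cwnd restarts at 1 MSS = 2 KB after timeout.
MSS = 2        # KB
ssthresh = 20  # KB, half of the 40 KB window at the time of the timeout
target = 40    # KB, window size we want to get back to
RTT = 400      # msec (2 * 200 msec propagation delay)

cwnd = MSS
rtts = 0
windows = [cwnd]
while cwnd < target:
    if cwnd < ssthresh:
        # Slow start: exponential growth, capped at ssthresh
        # (this is why 20 follows 16 instead of 32).
        cwnd = min(cwnd * 2, ssthresh)
    else:
        # Congestion avoidance: linear growth, +1 MSS per RTT.
        cwnd += MSS
    rtts += 1
    windows.append(cwnd)

print(windows)          # [2, 4, 8, 16, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40]
print(rtts, rtts * RTT) # 14 RTTs -> 5600 msec
```

Counting only the growth rounds gives 14 RTTs = 5600 msec; adding one more RTT for the round in which the timeout itself occurred gives the stated answer of 15 * 400 = 6000 msec.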

@Ashwin Kulkarni, how does 20 come after 16?