2 votes

Assume a scenario where the size of the congestion window of a TCP connection is 40 KB when a timeout occurs. The maximum segment size (MSS) is 2 KB. Let the propagation delay be 200 msec. The time taken by the TCP connection to get back to a 40 KB congestion window is _________ msec.

I am getting 5600 msec, but the answer given is 6000 msec. I am taking MSS as 2 KB. Can someone explain?

NOTE: the same question is also asked here: https://gateoverflow.in/1794/gate2014-1-27

4 Answers

0 votes
Take 1 MSS = 2 KB.

Threshold = 40/2 = 20 KB, i.e., 10 MSS.

The window grows per RTT as:

1 MSS = 2 KB

2 MSS = 4 KB

4 MSS = 8 KB

8 MSS = 16 KB

9 MSS = 18 KB (doubling 8 MSS would exceed the 10 MSS threshold, so growth becomes one MSS per RTT here)

...

increasing up to 20 MSS = 40 KB.

Hence the total is 15 RTTs, and RTT = 2 * 200 = 400 msec, so 15 * 400 = 6000 msec.
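A minimal sketch of this count in Python (names are illustrative, not from the question), assuming the convention used above: the window doubles per RTT only while the doubled value stays within the threshold, and grows by 1 MSS per RTT afterwards:

```python
MSS_KB = 2                      # maximum segment size
TARGET_KB = 40                  # window size to get back to
THRESHOLD_KB = TARGET_KB // 2   # new threshold = 40 / 2 = 20 KB
RTT_MS = 2 * 200                # RTT = 2 * propagation delay = 400 msec

cwnd = MSS_KB                   # after the timeout, cwnd restarts at 1 MSS
rtts = 0
while cwnd < TARGET_KB:
    if 2 * cwnd <= THRESHOLD_KB:
        cwnd *= 2               # slow start: double per RTT
    else:
        cwnd += MSS_KB          # additive increase: +1 MSS per RTT
    rtts += 1

print(rtts, rtts * RTT_MS)      # 15 RTTs -> 6000 msec
```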
0 votes
Given cwnd = 40 KB.

threshold = 20 KB (half of the 40 KB congestion window at the time of the timeout).

Assuming 1 MSS = 2 KB:

cwnd = 40 KB is equivalent to 20 MSS (since 1 MSS = 2 KB)

threshold = 20 KB is equivalent to 10 MSS

1 | 2 | 4 | 8 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 -------> 14 RTTs (doubling 8 MSS would overshoot the threshold, so the window jumps to 10 MSS and then grows by 1 MSS per RTT)

RTT = 2 * propagation delay = 400 msec

Time taken = 14 * 400 = 5600 msec
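A minimal sketch of this count (illustrative names), assuming the convention that a doubling which would overshoot the threshold instead jumps to the threshold, after which the window grows by 1 MSS per RTT:

```python
SSTHRESH = 10                   # in MSS: 20 KB / 2 KB
TARGET = 20                     # in MSS: 40 KB / 2 KB
RTT_MS = 2 * 200                # RTT = 2 * propagation delay = 400 msec

cwnd = 1                        # restart at 1 MSS after the timeout
rtts = 0
while cwnd < TARGET:
    if cwnd < SSTHRESH:
        cwnd = min(2 * cwnd, SSTHRESH)  # slow start, capped at threshold
    else:
        cwnd += 1               # congestion avoidance: +1 MSS per RTT
    rtts += 1

print(rtts, rtts * RTT_MS)      # 14 RTTs -> 5600 msec
```

Note that the difference from the 6000 msec answers is purely the counting convention, not the arithmetic.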
0 votes

At the time of the timeout, the congestion window size is 40 KB.

So, new threshold = 40 KB / 2 = 20 KB.

Now, start from 0 KB, increasing exponentially up to the threshold:

0 KB -> 2 KB, 2 KB -> 4 KB, 4 KB -> 8 KB, 8 KB -> 16 KB, 16 KB -> 20 KB (here 16 KB -> 32 KB is not possible because it would exceed the threshold).

And now increase by only one MSS at a time:

20 -> 22, 22 -> 24, 24 -> 26, 26 -> 28, 28 -> 30, 30 -> 32, 32 -> 34, 34 -> 36, 36 -> 38, 38 -> 40 KB.

That is 15 RTTs in total.

RTT = 2 * PD = 2 * 200 msec = 400 msec.

Total time = 15 * 400 = 6000 msec.
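A minimal sketch of this count (illustrative names), assuming this answer's convention of charging one RTT for the very first 2 KB window and doubling only while the doubled window does not exceed the 20 KB threshold:

```python
MSS_KB = 2
THRESHOLD_KB = 20               # 40 KB / 2
TARGET_KB = 40
RTT_MS = 2 * 200                # 400 msec

cwnd = 0                        # this answer starts the count from 0 KB
rtts = 0
while cwnd < TARGET_KB:
    if cwnd == 0:
        cwnd = MSS_KB           # first RTT: the initial 1-MSS window
    elif 2 * cwnd <= THRESHOLD_KB:
        cwnd *= 2               # exponential phase: 2, 4, 8, 16 KB
    elif cwnd < THRESHOLD_KB:
        cwnd = THRESHOLD_KB     # 16 -> 32 would overshoot, so 16 -> 20 KB
    else:
        cwnd += MSS_KB          # linear phase: +2 KB per RTT up to 40 KB
    rtts += 1

print(rtts, rtts * RTT_MS)      # 15 RTTs -> 6000 msec
```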

 
