Let the size of congestion window of a TCP connection be 38 KB when a timeout occurs. The propagation time of the connection is 100 msec and the maximum segment size used is 2 KB. The time taken (in msec) by the TCP connection to get back to 36 KB congestion window is ________.

1 Answer

Best answer

 If congestion is detected by a timeout:

  1. Set the slow-start threshold to half of the window size at which congestion was detected. New threshold = 38/2 = 19 KB.
  2. Restart the slow-start phase, so the new window size = 1 MSS = 2 KB.

Communication proceeds as ==>> 2 KB | 4 KB | 8 KB | 16 KB (the next doubling would cross the 19 KB threshold, so additive increase starts, adding 1 MSS = 2 KB per RTT) | 18 KB | 20 KB | 22 KB | 24 KB | 26 KB | 28 KB | 30 KB | 32 KB | 34 KB | 36 KB

NB: Each RTT is shown as | (a vertical line), so 13 RTTs are needed. One RTT = 2 × propagation time = 2 × 100 = 200 msec, so total time taken = 13 × 200 msec ==>> 2600 msec.
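The walk-through above can be sketched as a short simulation. This is an illustrative model only (the function name and the exact "double while the next doubling stays within the threshold, else add 1 MSS" rule are assumptions matching this answer's convention, not a real TCP stack's behaviour):

```python
MSS = 2        # maximum segment size in KB (given in the question)
RTT = 2 * 100  # one round trip = 2 * propagation time = 200 msec (assumed model)

def time_to_recover(cwnd_at_timeout, target):
    """Return the time (msec) for cwnd to climb back to `target` KB
    after a timeout at `cwnd_at_timeout` KB, per the model in this answer."""
    ssthresh = cwnd_at_timeout // 2  # threshold = half the window at timeout: 38/2 = 19 KB
    cwnd = MSS                       # slow start restarts at 1 MSS = 2 KB
    rtts = 0
    while cwnd < target:
        if cwnd * 2 <= ssthresh:
            cwnd *= 2                # slow-start phase: exponential growth
        else:
            cwnd += MSS              # additive increase: +1 MSS per RTT
        rtts += 1
    return rtts * RTT

print(time_to_recover(38, 36))  # prints 2600
```

The loop reproduces the sequence 2, 4, 8, 16, 18, 20, ..., 36, reaching 36 KB after 13 RTTs, i.e. 13 × 200 = 2600 msec.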

