Consider two TCP senders $T_{1}$ and $T_{2}$, each connected by a separate line to a router whose buffer capacity is $9000$ bytes. The bandwidth of each of these lines is infinite, and the delay from each TCP sender to the router is $11$ milliseconds. Two TCP receivers, $C_{1}$ and $C_{2}$, are connected to the router via a common shared line of $21{,}000{,}000$ bps. The delay from the router to the receivers is $11$ milliseconds.

 

$T_{1}$ communicates with $C_{1}$ and $T_{2}$ with $C_{2}$.

$T_{1}$ has a buffer of $9000$ bytes, and its application writes data at $300$ kbps.

$T_{2}$ has a buffer of $17000$ bytes, and its application writes data at $500$ kbps.

 

$C_{1}$ has a buffer of $66000$ bytes, and its application reads data at infinite speed.

$C_{2}$ has a buffer of $9000$ bytes, and its application reads data at $301$ kbps.

 

At time $0$ the application of $T_{1}$ begins to send data; $130$ milliseconds later, $T_{2}$ begins to send data.
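Before attacking the questions, it may help to express the given rates in packets/RTT, as the notes below suggest. Here is a minimal sketch (mine, not part of the question) that does the conversion, assuming the $2000$-byte MSS from the notes, an RTT of $2 \times 11 = 22$ ms (propagation only), and kbps/Mbps taken as $10^{3}$/$10^{6}$ bps:

```python
# Sketch: convert the given rates into packets/RTT.
# Assumptions: 2000-byte MSS (from the notes), RTT = 2 * 11 ms
# (propagation only), and kbps/Mbps as 10^3 / 10^6 bps.

MSS_BYTES = 2000
RTT_S = 2 * 0.011                      # sender->router + router->receiver

def bps_to_packets_per_rtt(rate_bps: float) -> float:
    """Convert a rate in bits/second into MSS-sized packets per RTT."""
    return rate_bps * RTT_S / (MSS_BYTES * 8)

rates = {
    "shared line": 21_000_000,         # bps, shared by C1 and C2
    "T1 app write": 300_000,           # 300 kbps
    "T2 app write": 500_000,           # 500 kbps
    "C2 app read": 301_000,            # 301 kbps
}
for name, bps in rates.items():
    print(f"{name:12s}: {bps_to_packets_per_rtt(bps):7.4f} packets/RTT")
```

On these assumptions the shared line serves about $28.875$ packets per RTT, while each application injects well under one packet per RTT.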

The questions:

1. After how long will a packet loss occur due to congestion?

and

2. After how long will the two senders send data at the same rate?

 

Useful notes:

- Time less than $1$ millisecond is negligible.
- Packet size equals $2000$ bytes $= 1$ MSS.
- Solve using packets/RTT.
- The results may be fractional.
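
One way to approach question 1 (a sketch under my own simplifying assumptions, not the official method): model window growth per RTT under slow start, let the router queue drain at the shared line's rate, and report the first RTT in which the queue would exceed the router's $9000$-byte buffer ($4.5$ packets):

```python
# Sketch: fluid slow-start model of the router queue, stepped per RTT.
# Assumptions (mine): cwnd doubles every RTT, the queue drains at the
# shared line's rate, and loss happens in the first RTT where the
# queue would exceed the router buffer (9000 B = 4.5 packets).

MSS = 2000                            # bytes per packet (from the notes)
RTT = 0.022                           # seconds: 2 * 11 ms, propagation only
LINK_PKTS = 21e6 * RTT / (MSS * 8)    # shared-line service, packets/RTT
BUF_PKTS = 9000 / MSS                 # router buffer, in packets

def first_loss_rtt(horizon: int = 64):
    """Index of the first RTT in which a lone slow-start sender overflows."""
    cwnd, queue = 1.0, 0.0
    for t in range(horizon):
        queue = max(0.0, queue + cwnd - LINK_PKTS)  # arrivals minus service
        if queue > BUF_PKTS:
            return t
        cwnd *= 2                     # slow start: double each RTT
    return None

print(first_loss_rtt())               # -> 6 under these assumptions
```

A fuller model would also cap each window by the sender and receiver buffers and by the application rates given above, and would start the second sender $130$ ms later.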

Is there any formula to use?

What I thought is:

$2000 \text{ kbps} / 21 \text{ Mbps} + 11 \text{ msec} = \ldots$

After the other TCP sender starts sending data, the shared common line will effectively be split, giving $21 \text{ Mbps}/2$ per sender.
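
Your expression can be evaluated directly; here is a sketch (reading "2000" as one $2000$-byte packet, i.e. $16000$ bits, on the $21$ Mbps shared line, which is my assumption about what you meant):

```python
# Sketch: evaluate the attempted expression. Reading "2000" as one
# 2000-byte packet (16000 bits) on the 21 Mbps shared line is my
# assumption; the 11 ms is the router->receiver propagation delay.

packet_bits = 2000 * 8
tx_time = packet_bits / 21e6            # ~0.762 ms transmission time
total = tx_time + 0.011                 # plus one-way propagation
print(f"tx = {tx_time * 1e3:.3f} ms, total = {total * 1e3:.3f} ms")

# Once both senders are active, each one's fair share of the line:
fair_share_bps = 21e6 / 2               # 10.5 Mbps per sender
print(f"fair share = {fair_share_bps * 0.022 / packet_bits:.4f} packets/RTT")
```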

