Suppose host A is sending a large file to host B over a TCP connection.
The two end hosts are 10 ms apart (20 ms RTT) and are connected by a 1 Gbps link.
Assume that they are using a packet size of 1000 bytes to transmit the file.
Also assume for simplicity that ACK packets are extremely small and can be ignored.
At least how big would the window size (in packets) have to be for the
channel utilization to be greater than 80%?
I am getting 2001 as the answer.
With a per-packet transmission time of $t_{trans} = \frac{8000 \text{ bits}}{1 \text{ Gbps}} = 8\,\mu\text{s}$ and utilization $= \frac{W_s \cdot t_{trans}}{RTT + t_{trans}} > 0.8$, my final calculation gave me

$$W_s > 0.8 \times \frac{20000 + 8}{8} = 2000.8 \text{ (window size, in packets)}$$

and hence my answer came out as 2001. Is it correct?
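As a quick numeric sanity check of the arithmetic (the utilization formula below is my assumption from the usual sliding-window analysis, with ACK time ignored as stated):

```python
# Sanity check: utilization = W * t_trans / (RTT + t_trans), ACKs negligible.

RTT = 20e-3                 # round-trip time: 20 ms
link_rate = 1e9             # 1 Gbps
packet_bits = 1000 * 8      # 1000-byte packets

t_trans = packet_bits / link_rate          # 8 microseconds per packet
W_min = 0.8 * (RTT + t_trans) / t_trans    # smallest real-valued window

print(W_min)                               # ≈ 2000.8
print(int(W_min) + 1)                      # 2001, first integer strictly above

# Verify: utilization with W = 2001 does exceed 80%
util = 2001 * t_trans / (RTT + t_trans)
print(util > 0.8)                          # True
```

This reproduces the $W_s > 2000.8$ bound, so rounding up to 2001 packets follows.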