30. A system uses the Stop-and-Wait ARQ Protocol. If each packet carries 1000 bits of
data, how long does it take to send 1 million bits of data if the distance between the
sender and receiver is 5000 km and the propagation speed is 2 × 10^8 m/s? Ignore transmission,
waiting, and processing delays. We assume no data or control frame is lost.
My solution -
Propagation time = Distance / speed = (5000 * 10^3 m) / (2 * 10^8 m/s) = 0.025 s = 25 ms
RTT = 2 * Propagation time = 2 * 25 = 50 ms
Total packets = 1 million bits / 1000 bits = 10^3 packets
1 packet takes 50 ms (in Stop-and-Wait the sender must wait for the ACK before sending the next packet, and transmission time is ignored, so each packet costs one full RTT)
10^3 packets take 10^3 * 50 ms = 50,000 ms = 50 s
So the answer is 50 s.
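As a sanity check, here is a small Python sketch of the same arithmetic, using only the numbers given in the problem statement (the variable names are just mine for illustration):

```python
# Stop-and-Wait ARQ timing check.
# Transmission, waiting, and processing delays are ignored,
# so each packet costs exactly one round trip (data out, ACK back).

distance_m = 5000 * 10**3        # 5000 km in metres
prop_speed = 2 * 10**8           # propagation speed in m/s
packet_bits = 1000               # data bits per packet
total_bits = 10**6               # 1 million bits to send

prop_time = distance_m / prop_speed        # one-way propagation time: 0.025 s = 25 ms
rtt = 2 * prop_time                        # per-packet cost: 0.05 s = 50 ms
num_packets = total_bits / packet_bits     # 1000 packets
total_time = num_packets * rtt             # 1000 * 0.05 s = 50 s

print(f"Propagation time: {prop_time * 1000:.0f} ms")
print(f"RTT per packet:   {rtt * 1000:.0f} ms")
print(f"Packets:          {num_packets:.0f}")
print(f"Total time:       {total_time:.0f} s")
```

Running it prints 25 ms, 50 ms, 1000 packets, and 50 s, which matches my hand calculation above.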
Am I right or wrong? If wrong, please correct me with the right approach.