2 votes
Suppose that you are measuring the time to receive a segment. When an interrupt
occurs, you read out the system clock in milliseconds. When the segment is fully processed,
you read out the clock again. You measure 0 msec 270,000 times and 1 msec
730,000 times. How long does it take to receive a segment?

1 Answer

0 votes
Total number of measurements = 270,000 + 730,000 = 1,000,000

Each reading is the clock value in whole milliseconds, so the average time per segment is the weighted average of the readings:

Average = (270,000 × 0 msec + 730,000 × 1 msec) / 1,000,000

= 0.73 msec

= 730 microseconds
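The weighted-average arithmetic above can be checked with a short sketch (the dictionary layout here is just an illustration, not part of the original problem):

```python
# Clock-granularity measurement: the clock reads whole milliseconds,
# so every measurement comes out as either 0 or 1 msec.
counts = {0: 270_000, 1: 730_000}   # clock reading (msec) -> number of samples

total = sum(counts.values())        # 1,000,000 measurements in total
avg_msec = sum(reading * n for reading, n in counts.items()) / total

print(avg_msec)   # → 0.73 (msec), i.e. 730 microseconds per segment
```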
