Suppose a server transmits one frame of a video every second, and the client starts playing the video at one frame per second as soon as the first frame arrives. Suppose the first ten frames arrive at times 0, 1.2, 1.99, 4.17, 4.01, 5.03, 8.05, 7.50, 8.90, 8.99, all in seconds. Which frames reach the client too late for playout?

(A) 7

(B) 3

(C) 6

(D) 5

1 Answer


Playback starts at time 0, the moment the first frame arrives, and the client then plays one frame per second. So frame n (numbering from 0) is scheduled for playout at time n seconds, and a frame is late if it arrives after its scheduled slot.

Playout time (s)   Frame no.   Arrival at client (s)   Delay past playout time (s)
0                  0           0.00                     0.00
1                  1           1.20                    +0.20
2                  2           1.99                    -0.01 (early)
3                  3           4.17                    +1.17
4                  4           4.01                    +0.01
5                  5           5.03                    +0.03
6                  6           8.05                    +2.05
7                  7           7.50                    +0.50
8                  8           8.90                    +0.90
9                  9           8.99                    -0.01 (early)

If the frames are numbered 0-9, frame 6 has the largest delay (2.05 s past its playout slot).

If the frames are numbered 1-10, that same frame is frame 7.

Hence the answer depends on the numbering convention, either (C) 6 with 0-based numbering or (A) 7 with 1-based numbering; the question does not say which is intended.
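
To double-check the table, here is a short Python sketch (not part of the original answer; the arrival times are copied from the question). It recomputes each frame's delay relative to its scheduled playout slot, flags the late frames, and picks out the one with the largest delay.

```python
# Arrival times of frames 0..9, in seconds (from the question).
arrivals = [0, 1.2, 1.99, 4.17, 4.01, 5.03, 8.05, 7.50, 8.90, 8.99]

for frame, arrival in enumerate(arrivals):
    # Frame n is scheduled for playout at time n, since playback
    # starts at t = 0 when the first frame arrives.
    delay = arrival - frame          # positive => frame misses its slot
    status = "late" if delay > 0 else "on time"
    print(f"frame {frame}: arrives {arrival:5.2f}, scheduled {frame}, "
          f"delay {delay:+5.2f} s ({status})")

# Frame with the largest delay: frame 6 (2.05 s), or frame 7
# if the frames are numbered 1-10 instead of 0-9.
worst = max(range(len(arrivals)), key=lambda n: arrivals[n] - n)
print(f"worst frame (0-indexed): {worst}")
```
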
