Is A the answer?


+1 vote

Assume the propagation delay in a broadcast network is $5 \: \mu s$ and the frame transmission time is $10 \: \mu s$. How long does it take for the last bit to reach the destination after the first bit has arrived?

- $10 \: \mu s$
- $5 \: \mu s$
- $15 \: \mu s$
- $50 \: \mu s$

0 votes

Propagation delay is the time taken for a bit, once placed on the link, to reach the other side.

Transmission delay is the time taken to place the entire frame onto the link.

So here it takes $10 \: \mu s$ to transmit the frame onto the link and $5 \: \mu s$ to propagate to the other end, so the last bit arrives $10 + 5 = 15 \: \mu s$ after transmission starts. But the question asks for the time after the first bit has arrived, and the first bit arrives at $5 \: \mu s$, so the answer is $15 - 5 = 10 \: \mu s$.


+6 votes

Best answer

The question asks how long it takes for the last bit to reach the destination after the first bit has already arrived, so we have to take the difference of their arrival times.

For the first bit there is no transmission time to wait for, so it arrives at $5 \: \mu s$.

The last bit is placed on the link only after the whole frame has been transmitted, so it arrives at $10 + 5 = 15 \: \mu s$.

The time difference is $15 - 5 = 10 \: \mu s$. Hope you get it now.

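The arithmetic above can be sketched in a few lines of Python. This is a minimal illustration (the function name and structure are my own), assuming a single link with no queuing or processing delay:

```python
def last_bit_after_first_bit_us(transmission_us: float, propagation_us: float) -> float:
    """Return how long after the first bit arrives the last bit arrives (microseconds)."""
    first_bit_arrival = propagation_us                   # first bit: 0 + 5 = 5 us
    last_bit_arrival = transmission_us + propagation_us  # last bit: 10 + 5 = 15 us
    return last_bit_arrival - first_bit_arrival          # difference = transmission time

print(last_bit_after_first_bit_us(10, 5))  # prints 10.0
```

Note that the propagation delay cancels out: the gap between first-bit and last-bit arrival always equals the frame transmission time.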
