

+1 vote

Assume the propagation delay in a broadcast network is $5 \: \mu s$ and the frame transmission time is $10 \: \mu s$. How long does it take for the last bit to reach the destination after the first bit has arrived?

- $10 \: \mu s$
- $5 \: \mu s$
- $15 \: \mu s$
- $50 \: \mu s$

0 votes

The definition of propagation delay is the time taken for a bit to travel from one end of the link to the other.

The definition of transmission delay is the time taken to place all the bits of the frame onto the link.

So here it takes $10 \: \mu s$ to transmit the frame onto the link and $5 \: \mu s$ for a bit to propagate to the other end.

=> $15 \: \mu s$ in total, measured from the start of transmission.


+5 votes

Best answer

The question asks how long it takes for the last bit to reach the destination after the first bit has already arrived, so we have to take the difference of their arrival times.

The first bit incurs no transmission wait, so it arrives at $5 \: \mu s$.

The last bit is placed on the link only after the whole frame has been transmitted, so it arrives at $10 + 5 = 15 \: \mu s$.

The time difference is $15 - 5 = 10 \: \mu s$.
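The arithmetic above can be sketched in a few lines of Python; the variable names are mine, and the two delay values are taken directly from the question:

```python
# Delay values from the question.
propagation_delay_us = 5    # time for one bit to cross the link
transmission_time_us = 10   # time to push the whole frame onto the link

# The first bit only needs to propagate across the link.
first_bit_arrival = propagation_delay_us                         # 5 us

# The last bit is placed on the link only after the full frame has
# been transmitted, and then it also needs the propagation delay.
last_bit_arrival = transmission_time_us + propagation_delay_us   # 15 us

# The quantity the question asks for: the gap between the two arrivals.
gap = last_bit_arrival - first_bit_arrival
print(gap)  # 10
```

Note that the gap equals the transmission time alone: the propagation delay cancels because both bits experience it equally.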


