0 votes
P7. In this problem, we consider sending real-time voice from Host A to Host B
over a packet-switched network (VoIP). Host A converts analog voice to a
digital 64 kbps bit stream on the fly. Host A then groups the bits into 56-byte
packets. There is one link between Hosts A and B; its transmission rate is 2
Mbps and its propagation delay is 10 msec. As soon as Host A gathers a
packet, it sends it to Host B. As soon as Host B receives an entire packet, it
converts the packet’s bits to an analog signal. How much time elapses from
the time a bit is created (from the original analog signal at Host A) until the
bit is decoded (as part of the analog signal at Host B)?
asked in Computer Networks by Junior (665 points) | 51 views

1 Answer

+1 vote
At 64 kbps, Host A encodes 64,000 bits per second, so each bit takes 1/64,000 sec to produce.

Time to fill (packetize) a 56-byte packet = (56 × 8)/(64 × 10³) sec = 7 msec (encoding delay).

Transmission time = (56 × 8)/(2 × 10⁶) sec = 224 microsec.

Propagation delay = 10 msec.

Note that Host B converts the packet's bits to analog as soon as the entire packet arrives, so the first bit of a packet incurs no additional decoding delay.

Hence total delay for the first bit = encoding delay + transmission time + propagation delay

= 7 msec + 224 microsec + 10 msec

= 17.224 msec
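As a sanity check, the arithmetic above can be reproduced with a short Python snippet (the variable names are my own, not from the question):

```python
# Delay components for the VoIP problem; all values taken from the question.
BITS_PER_BYTE = 8

encode_rate_bps = 64_000        # 64 kbps voice encoding rate at Host A
packet_bytes = 56               # packet size
link_rate_bps = 2_000_000       # 2 Mbps link between A and B
prop_delay_s = 0.010            # 10 msec propagation delay

packet_bits = packet_bytes * BITS_PER_BYTE            # 448 bits per packet

packetization_s = packet_bits / encode_rate_bps       # time to fill one packet
transmission_s = packet_bits / link_rate_bps          # time to push packet onto link

# The first bit of a packet waits for the packet to fill, is transmitted,
# then propagates; Host B decodes as soon as the whole packet arrives.
total_s = packetization_s + transmission_s + prop_delay_s

print(f"packetization: {packetization_s * 1e3:.3f} msec")    # 7.000 msec
print(f"transmission:  {transmission_s * 1e6:.0f} microsec") # 224 microsec
print(f"total:         {total_s * 1e3:.3f} msec")            # 17.224 msec
```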
answered by Active (3.9k points) | edited
Why 2*PT?

Why is the propagation delay taken twice?

Sorry for that; I had included something else by mistake. It has been edited.
