0 votes
P7. In this problem, we consider sending real-time voice from Host A to Host B
over a packet-switched network (VoIP). Host A converts analog voice to a
digital 64 kbps bit stream on the fly. Host A then groups the bits into 56-byte
packets. There is one link between Hosts A and B; its transmission rate is 2
Mbps and its propagation delay is 10 msec. As soon as Host A gathers a
packet, it sends it to Host B. As soon as Host B receives an entire packet, it
converts the packet’s bits to an analog signal. How much time elapses from
the time a bit is created (from the original analog signal at Host A) until the
bit is decoded (as part of the analog signal at Host B)?
asked in Computer Networks by Junior (563 points) | 47 views

1 Answer

+1 vote
64 kbps means 64,000 bits are generated per second, so each bit takes 1/64,000 s to be created.

Time to gather a 56-byte packet = 56 × 8 / (64 × 10^3) s = 7 ms (encoding / packetization delay). The first bit of a packet waits this long before transmission of the packet can even begin.

Transmission time = 56 × 8 / (2 × 10^6) s = 224 μs.

Propagation delay = 10 ms.

Host B converts the packet's bits to an analog signal as soon as the entire packet arrives, so no extra decoding delay needs to be added for that bit.

Hence, for the first bit of a packet,

total time = packetization delay + transmission time + propagation delay

= 7 ms + 224 μs + 10 ms

= 17.224 ms
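A quick Python sanity check of the same arithmetic (the constants are taken from the problem statement; the variable names are just illustrative):

```python
# Delay for the first bit of a packet in the VoIP problem above.
# All constants come from the problem statement; names are illustrative.

BIT_RATE    = 64_000      # Host A's voice encoding rate (bits/second)
PACKET_BITS = 56 * 8      # packet size: 56 bytes = 448 bits
LINK_RATE   = 2_000_000   # link transmission rate (bits/second)
PROP_DELAY  = 10e-3       # propagation delay (seconds)

packetization = PACKET_BITS / BIT_RATE    # 448 / 64,000   = 7 ms
transmission  = PACKET_BITS / LINK_RATE   # 448 / 2*10^6   = 224 us
total         = packetization + transmission + PROP_DELAY

print(f"packetization delay = {packetization * 1e3:.3f} ms")  # 7.000 ms
print(f"transmission delay  = {transmission * 1e3:.3f} ms")   # 0.224 ms
print(f"total delay         = {total * 1e3:.3f} ms")          # 17.224 ms
```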
answered by Active (3.6k points)
edited by
0
Why 2*PT?

Why is the propagation delay taken twice?
0
Sorry for that, I had taken something else into consideration; it has been edited now.
