Let us consider a scenario of sending real-time voice from Host A to Host B over a packet-switched network. Host A converts analog voice to a digital $32$ kbps bit stream on the fly. Host A then groups the bits into $44$-byte packets. There is one link between Hosts A and B; its transmission rate is $4$ Mbps and its propagation delay is $19$ msec. As soon as Host A gathers a packet, it sends it to Host B. As soon as Host B receives an entire packet, it converts the packet's bits to an analog signal. How much time elapses from the time a bit is created (from the original analog signal at Host A) until the bit is decoded (as part of the analog signal at Host B)?

1. $11$ ms
2. $88$ ms
3. $30$ ms
4. $30.088$ ms

Consider the first bit in a packet. Before this bit can be transmitted, all of the bits in the packet must be generated. This requires
$44 \times 8/(32 \times 10^3)$ seconds $= 11$ milliseconds.
The time required to transmit the packet is $44 \times 8/(4 \times 10^6)$ seconds $= 88$ microseconds.
Propagation delay = $19$ milliseconds.
The delay until decoding is $11$ milliseconds + $88$ microseconds + $19$ milliseconds $= 30.088$ milliseconds
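The three delay components above can be checked with a short calculation (variable names are illustrative, not from the original):

```python
# Delay from bit creation at Host A to decoding at Host B.
voice_rate_bps = 32_000        # 32 kbps analog-to-digital encoding rate
packet_bits = 44 * 8           # 44-byte packets
link_rate_bps = 4_000_000      # 4 Mbps link transmission rate
prop_delay_s = 0.019           # 19 msec propagation delay

# Time to accumulate a full packet's worth of bits at the voice rate.
packetization_s = packet_bits / voice_rate_bps
# Time to push the packet onto the link.
transmission_s = packet_bits / link_rate_bps
# Total end-to-end delay for the first bit of a packet.
total_s = packetization_s + transmission_s + prop_delay_s

print(f"packetization: {packetization_s * 1e3:.3f} ms")  # 11.000 ms
print(f"transmission:  {transmission_s * 1e6:.0f} us")   # 88 us
print(f"total:         {total_s * 1e3:.3f} ms")          # 30.088 ms
```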
by Junior (725 points)

@Applied Course The answer should be 30.088 milliseconds.

Yeah, I was also getting 30.088 and skipped this problem.