16. In a CSMA/CD network with a data rate of 10 Mbps, the maximum distance between any station pair is found to be 2500 m for the correct operation of the collision detection process. What should be the maximum distance if we increase the data rate to 100 Mbps? To 1 Gbps? To 10 Gbps?
in Computer Networks by Boss (35.7k points)

1 Answer


For collision detection to work, the transmission time of one frame must be at least twice the propagation delay:

Transmission time of 1 frame $= 2 \cdot T_p = \large \frac{\text{Frame size}}{ \text{Data rate}}$

Let Frame size be $x$,

Transmission time of 1 frame $= 2 \cdot \frac{D}{\text{Speed of Propagation}} = \large \frac{x}{\text{Data rate}}$

where $D$ is the distance between source and destination.

$D = \Large \frac{\text{Speed of Propagation} \ \cdot \ x}{2 \ \cdot \ \text{Data rate}} $

 

Since the speed of propagation and the frame size are constant, the maximum distance $D$ is inversely proportional to the data rate.

$10 \text{ Mbps} \rightarrow 2500 \text{ m}$

$100 \text{ Mbps} \rightarrow 250 \text{ m}$

$1 \text{ Gbps} \rightarrow 25 \text{ m}$

$10 \text{ Gbps} \rightarrow 2.5 \text{ m}$
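The inverse proportionality above can be sketched as a short computation. This is a minimal illustration, assuming (as the derivation does) that the minimum frame size and propagation speed are held constant, so the maximum distance scales as $1/\text{data rate}$; the names `BASE_RATE` and `max_distance` are ours, not from the question.

```python
# Maximum station separation for CSMA/CD collision detection,
# assuming frame size and propagation speed are held constant
# so that distance scales inversely with the data rate.

BASE_RATE = 10e6       # reference data rate: 10 Mbps
BASE_DISTANCE = 2500.0 # metres, given for 10 Mbps

def max_distance(rate_bps: float) -> float:
    """Maximum distance (m) at the given data rate (bps)."""
    return BASE_DISTANCE * BASE_RATE / rate_bps

for rate, label in [(10e6, "10 Mbps"), (100e6, "100 Mbps"),
                    (1e9, "1 Gbps"), (10e9, "10 Gbps")]:
    print(f"{label}: {max_distance(rate):g} m")
```

Running this reproduces the table above: 2500 m, 250 m, 25 m, and 2.5 m respectively.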

by Boss (35.7k points)
