16. In a CSMA/CD network with a data rate of 10 Mbps, the maximum distance between any station pair is found to be 2500 m for the correct operation of the collision detection process. What should be the maximum distance if we increase the data rate to 100 Mbps? To 1 Gbps? To 10 Gbps?

For collision detection to work, the transmission time of a frame must equal at least twice the propagation delay:

Transmission time of 1 frame = $2 \ T_p = \large \frac{\text{Frame size}}{ \text{Data rate}}$

Let the frame size be $x$. Then

Transmission time of 1 frame = $2 \ \large \frac{D}{\text{Speed of Propagation}} = \large \frac{x}{\text{Data rate}}$

where $D$ is the distance between source and destination. Solving for $D$:

$D = \Large \frac{ \text{ Speed of Propagation } * \ x }{2 \ * \ \text{Data rate}}$

The speed of propagation and the frame size are constant.

Hence distance and data rate are inversely proportional: multiplying the data rate by 10 divides the maximum distance by 10.

$10 \text{ Mbps} \rightarrow 2500 \text{ m}$

$100 \text{ Mbps} \rightarrow 250 \text{ m}$

$1 \text{ Gbps} \rightarrow 25 \text{ m}$

$10 \text{ Gbps} \rightarrow 2.5 \text{ m}$
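The scaling above can be sketched in a few lines of Python. This is only an illustration: it takes the 10 Mbps / 2500 m pair from the question as the reference point and uses the inverse proportionality derived above (the names `max_distance`, `REF_RATE`, and `REF_DIST` are made up for this sketch).

```python
# CSMA/CD: maximum distance scales inversely with data rate.
# Reference point taken from the problem statement (assumption).
REF_RATE = 10e6      # reference data rate: 10 Mbps, in bits per second
REF_DIST = 2500.0    # maximum distance at the reference rate, in metres

def max_distance(rate_bps):
    """Distance * data rate is constant, so scale the reference pair."""
    return REF_DIST * REF_RATE / rate_bps

for rate in (10e6, 100e6, 1e9, 10e9):
    print(f"{rate / 1e6:g} Mbps -> {max_distance(rate):g} m")
```

Running the loop reproduces the four answers in the table above.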
