Question:

Consider a 100 Mbps link between an earth station (sender) and a satellite (receiver) at an altitude of 2100 km. The signal propagates at a speed of \(3 \times 10^8 \ \text{m/s}\). The time taken (in milliseconds, rounded off to two decimal places) for the receiver to completely receive a packet of 1000 bytes transmitted by the sender is

Hint:

The total time for packet reception includes both propagation delay and transmission time.
Updated On: Jan 30, 2026

Correct Answer: 7.08

Solution and Explanation

First, calculate the propagation delay, i.e., the time for the signal to travel from the sender to the receiver:
\[ \text{Distance} = 2100\ \text{km} = 2.1 \times 10^6\ \text{m} \]
\[ \text{Propagation delay} = \frac{\text{Distance}}{\text{Speed}} = \frac{2.1 \times 10^6}{3 \times 10^8} = 7 \times 10^{-3}\ \text{seconds} = 7.0\ \text{ms} \]

Next, calculate the time required to transmit the 1000-byte packet over the 100 Mbps link:
\[ \text{Packet size} = 1000\ \text{bytes} = 8000\ \text{bits} \]
\[ \text{Transmission time} = \frac{\text{Packet size}}{\text{Bandwidth}} = \frac{8000}{100 \times 10^6} = 8 \times 10^{-5}\ \text{seconds} = 0.08\ \text{ms} \]

The total time for the receiver to completely receive the packet is the sum of the propagation delay and the transmission time:
\[ \text{Total time} = 7.0\ \text{ms} + 0.08\ \text{ms} = 7.08\ \text{ms} \]

Therefore, the time taken to completely receive the packet is:
\[ \boxed{7.08\ \text{ms}} \]
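The two-delay calculation above can be sketched in a few lines of code; the variable names are my own, and the numeric values are taken directly from the problem statement:

```python
# Total time to fully receive a packet = propagation delay + transmission time.
distance_m = 2100e3        # link distance: 2100 km in metres
speed_mps = 3e8            # signal propagation speed (m/s)
bandwidth_bps = 100e6      # link bandwidth: 100 Mbps
packet_bits = 1000 * 8     # packet size: 1000 bytes = 8000 bits

propagation_ms = distance_m / speed_mps * 1000        # 7.0 ms
transmission_ms = packet_bits / bandwidth_bps * 1000  # 0.08 ms
total_ms = propagation_ms + transmission_ms

print(round(total_ms, 2))  # 7.08
```

Note that for a single link the receiver finishes receiving the packet one propagation delay after the sender pushes out the last bit, which is why the two delays simply add.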