The entropy \( H(X) \) of a source is given by
\[ H(X) = - \sum_{i=1}^{n} p_i \log_2 p_i \]
where \( p_i \) is the probability of the \( i \)-th symbol. For five equally likely symbols, \( p_i = 0.2 \):
\[ H(X) = - 5 \times 0.2 \times \log_2(0.2) \]
\[ H(X) = -1 \times (-\log_2 5) = \log_2 5 \approx 2.32 \]
\(\text{Conclusion:}\) The entropy of the source is \( \log_2 5 \approx 2.32 \) bits, as given by option (b).
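The calculation above can be checked numerically. The sketch below (function name `entropy` is illustrative, not from the original) evaluates the Shannon entropy formula for a list of symbol probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p_i * log2(p_i))."""
    # Terms with p = 0 contribute nothing (lim p*log p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Five equally likely symbols, p_i = 0.2 each
H = entropy([0.2] * 5)
print(round(H, 4))  # log2(5) ≈ 2.3219
```

For equiprobable symbols the entropy reduces to \( \log_2 n \), which the code confirms for \( n = 5 \).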
| | LIST I | | LIST II |
|---|---|---|---|
| A | Controlling TV channels through remote | I | Circuit switching |
| B | Moving elevators up/down | II | Simplex communication |
| C | Communication between two computers | III | Half-duplex communication |
| D | Communication through telephone call | IV | Full-duplex communication |