The entropy \( H(X) \) of a source is given by
\[ H(X) = - \sum_{i=1}^{n} p_i \log_2 p_i, \]
where \( p_i \) is the probability of the \( i \)-th symbol. For five equally likely symbols, each with \( p_i = 0.2 \):
\[ H(X) = - 5 \times 0.2 \times \log_2(0.2) = -1 \times (-\log_2 5) = \log_2 5 \approx 2.32. \]
\(\text{Conclusion:}\) The entropy of the source is \( \log_2 5 \approx 2.32 \) bits, as given by option (b).
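The calculation above can be checked numerically. A minimal sketch in Python, assuming the same five-symbol source with equal probabilities (the function name `entropy` is illustrative, not from the original):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p_i * log2(p_i)).

    Zero-probability symbols are skipped, since p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Five equally likely symbols, each with probability 0.2
H = entropy([0.2] * 5)
print(round(H, 2))  # log2(5) ≈ 2.32
```

For a uniform distribution over \( n \) symbols, this always reduces to \( \log_2 n \), the maximum entropy any \( n \)-symbol source can have.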
Column-I has statements made by Shanthala, and Column-II has responses given by Kanishk.