The entropy \( H(X) \) of a source is given by
\[ H(X) = - \sum_{i=1}^{n} p_i \log_2 p_i \]
where \( p_i \) is the probability of each symbol. For a source with five equiprobable symbols (\( p_i = 0.2 \)):
\[ H(X) = - 5 \times 0.2 \times \log_2(0.2) \]
\[ H(X) = -1 \times (-\log_2 5) = \log_2 5 \approx 2.32 \]
\(\text{Conclusion:}\) The entropy of the source is \( \log_2 5 \approx 2.32 \) bits.
Column-I has statements made by Shanthala, and Column-II has responses given by Kanishk.
A closed-loop system has the characteristic equation given by: $ s^3 + k s^2 + (k+2) s + 3 = 0 $.
For the system to be stable, the value of $ k $ is:
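As a brief illustrative sketch of how such a stability range is typically found (assuming the standard Routh–Hurwitz criterion for this third-order polynomial), the Routh array would be:
\[
\begin{array}{c|cc}
s^3 & 1 & k+2 \\
s^2 & k & 3 \\
s^1 & \dfrac{k(k+2) - 3}{k} & 0 \\
s^0 & 3 &
\end{array}
\]
Stability requires every first-column entry to be positive: \( k > 0 \) and \( k(k+2) - 3 = (k+3)(k-1) > 0 \). Taken together, these conditions would give \( k > 1 \).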
A digital filter with impulse response $ h[n] = 2^n u[n] $ will have a transfer function with region of convergence:
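A short worked sketch (assuming the usual $ z $-transform of this causal sequence):
\[ H(z) = \sum_{n=0}^{\infty} 2^n z^{-n} = \frac{1}{1 - 2 z^{-1}}, \qquad \left| 2 z^{-1} \right| < 1 \;\Rightarrow\; |z| > 2. \]
The geometric series converges, and hence the transfer function exists, only for $ |z| > 2 $.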