Question:

The random variable \( X \) takes values in \( \{-1, 0, 1\} \) with probabilities \[ P(X = -1) = P(X = 1) = \alpha \quad \text{and} \quad P(X = 0) = 1 - 2\alpha, \qquad 0 < \alpha < \frac{1}{2}. \] Let \( g(\alpha) \) denote the entropy of \( X \) (in bits), parameterized by \( \alpha \). Which of the following statements is/are TRUE?

Hint

For discrete random variables, the entropy \( g(\alpha) \) quantifies the uncertainty in the variable. It increases as the probabilities of outcomes become more evenly distributed.
Updated On: Apr 15, 2025
  • (A) \( g(0.4) > g(0.3) \)
  • (B) \( g(0.3) > g(0.4) \)
  • (C) \( g(0.3) > g(0.25) \)
  • (D) \( g(0.25) > g(0.3) \)

The correct options are (B) and (C).

Solution and Explanation

Step 1: The entropy \( g(\alpha) \) of the random variable \( X \) is given by the formula: \[ g(\alpha) = -P(X = -1) \log_2 P(X = -1) - P(X = 0) \log_2 P(X = 0) - P(X = 1) \log_2 P(X = 1) \] Substituting the probabilities: \[ g(\alpha) = -\alpha \log_2 \alpha - (1 - 2\alpha) \log_2 (1 - 2\alpha) - \alpha \log_2 \alpha \] \[ g(\alpha) = -2\alpha \log_2 \alpha - (1 - 2\alpha) \log_2 (1 - 2\alpha) \] Step 2: Compare \( g(0.3) \) and \( g(0.4) \)
The entropy \( g(\alpha) \) is maximized when the three outcomes are equally likely, i.e. at \( \alpha = \tfrac{1}{3} \); it is increasing on \( \left(0, \tfrac{1}{3}\right) \) and decreasing on \( \left(\tfrac{1}{3}, \tfrac{1}{2}\right) \). Since \( 0.3 < \tfrac{1}{3} < 0.4 \), the two values lie on opposite sides of the maximum, so we evaluate directly: \[ g(0.3) = -0.6 \log_2 0.3 - 0.4 \log_2 0.4 \approx 1.571 \text{ bits}, \] \[ g(0.4) = -0.8 \log_2 0.4 - 0.2 \log_2 0.2 \approx 1.522 \text{ bits}. \] Therefore: \[ g(0.3) > g(0.4) \] Step 3: Compare \( g(0.3) \) and \( g(0.25) \)
Both \( 0.25 \) and \( 0.3 \) lie below \( \tfrac{1}{3} \), where \( g(\alpha) \) is increasing, so \( g(0.3) > g(0.25) \). Indeed, \[ g(0.25) = -0.5 \log_2 0.25 - 0.5 \log_2 0.5 = 1.5 \text{ bits} < g(0.3) \approx 1.571 \text{ bits}. \] Thus, the correct answers are (B) and (C).
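As a quick numerical cross-check of the comparisons above (an illustrative Python sketch, not part of the original solution), the entropy formula \( g(\alpha) = -2\alpha \log_2 \alpha - (1 - 2\alpha) \log_2 (1 - 2\alpha) \) can be evaluated at the values in the options:

```python
import math

def g(alpha):
    """Entropy (in bits) of X with P(X=-1) = P(X=1) = alpha, P(X=0) = 1 - 2*alpha."""
    return -2 * alpha * math.log2(alpha) - (1 - 2 * alpha) * math.log2(1 - 2 * alpha)

for a in (0.25, 0.3, 1 / 3, 0.4):
    print(f"g({a:.4f}) = {g(a):.4f}")
# g(0.25) = 1.5 exactly; g(0.3) ~ 1.571; g(1/3) = log2(3) ~ 1.585 (maximum); g(0.4) ~ 1.522
```

The printed values confirm \( g(0.3) > g(0.4) \) and \( g(0.3) > g(0.25) \), matching options (B) and (C).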
This question was asked in the GATE EC exam.