Question:

X and Y are Bernoulli random variables taking values in \( \{0,1\} \). The joint probability mass function of the random variables is given by:
\[ P(X = 0, Y = 0) = 0.06, \quad P(X = 0, Y = 1) = 0.14, \quad P(X = 1, Y = 0) = 0.24, \quad P(X = 1, Y = 1) = 0.56. \]
The mutual information \( I(X; Y) \) is ______ (rounded off to two decimal places).

Hint: Mutual information quantifies the amount of information shared between two random variables; it is zero if and only if the variables are independent.

Solution and Explanation

The mutual information \( I(X; Y) \) is given by:
\[ I(X; Y) = \sum_{x, y} P(x, y) \log_2 \frac{P(x, y)}{P(x)P(y)} \]

Step 1: Compute the marginal probabilities
The marginal probabilities are obtained by summing the joint PMF over the other variable:
\[ P(X = 0) = P(X = 0, Y = 0) + P(X = 0, Y = 1) = 0.06 + 0.14 = 0.20 \]
\[ P(X = 1) = P(X = 1, Y = 0) + P(X = 1, Y = 1) = 0.24 + 0.56 = 0.80 \]
\[ P(Y = 0) = P(X = 0, Y = 0) + P(X = 1, Y = 0) = 0.06 + 0.24 = 0.30 \]
\[ P(Y = 1) = P(X = 0, Y = 1) + P(X = 1, Y = 1) = 0.14 + 0.56 = 0.70 \]
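As a quick numerical check of Step 1, here is a minimal Python sketch (the names `joint`, `p_x`, and `p_y` are illustrative, not part of the original solution):

```python
# Joint PMF of (X, Y) as given in the problem statement.
joint = {(0, 0): 0.06, (0, 1): 0.14, (1, 0): 0.24, (1, 1): 0.56}

# Marginalize: sum the joint PMF over the other variable.
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

print(p_x)  # {0: 0.2, 1: 0.8} (up to floating-point rounding)
print(p_y)  # {0: 0.3, 1: 0.7} (up to floating-point rounding)
```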
Step 2: Compute the mutual information
Now we can compute the mutual information:
\[ I(X; Y) = P(X = 0, Y = 0) \log_2 \frac{P(X = 0, Y = 0)}{P(X = 0)P(Y = 0)} + \cdots \]
Checking each term, the joint PMF factorizes into the product of the marginals: \( 0.20 \times 0.30 = 0.06 \), \( 0.20 \times 0.70 = 0.14 \), \( 0.80 \times 0.30 = 0.24 \), and \( 0.80 \times 0.70 = 0.56 \). Hence \( P(x, y) = P(x)P(y) \) for every pair \( (x, y) \), i.e., \( X \) and \( Y \) are independent, so every logarithm in the sum equals \( \log_2 1 = 0 \) and
\[ I(X; Y) = 0 \]
Thus, the correct answer is 0.00. A numerical verification is sketched below.
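Under the same assumptions as the sketch above (illustrative names, not from the original solution), the full sum can be evaluated directly to confirm the result:

```python
import math

# Joint PMF and the marginals computed in Step 1.
joint = {(0, 0): 0.06, (0, 1): 0.14, (1, 0): 0.24, (1, 1): 0.56}
p_x = {0: 0.20, 1: 0.80}
p_y = {0: 0.30, 1: 0.70}

# I(X; Y) = sum over (x, y) of P(x, y) * log2(P(x, y) / (P(x) * P(y))).
mi = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in joint.items())

# Every ratio P(x, y) / (P(x) * P(y)) equals 1, so each log term is 0.
print(round(mi, 2))  # 0.0 (up to floating-point noise)
```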