Question:

Let \( X_1, X_2 \) be a random sample from a distribution with the probability mass function
\[ f(x\mid\theta) = \begin{cases} 1 - \theta, & \text{if } x = 0 \\ \theta, & \text{if } x = 1 \\ 0, & \text{otherwise}, \end{cases} \qquad 0 < \theta < 1. \] Which of the following is (are) unbiased estimator(s) of \( \theta \)?

Hint

For a discrete random variable, the expected value of an estimator is the sum of the products of its values and their corresponding probabilities.
Updated On: Nov 20, 2025
  • (A) \( \frac{X_1 + X_2}{2} \)
  • (B) \( \frac{X_1^2 + X_2}{2} \)
  • (C) \( \frac{X_1^2 + X_2^2}{2} \)
  • (D) \( \frac{X_1 + X_2 - X_1^2}{2} \)

The correct options are (A), (B), and (C).

Solution and Explanation

Step 1: Understanding the probability mass function.
We are given a Bernoulli-type probability mass function: \( X \) takes the value 1 with probability \( \theta \) and the value 0 with probability \( 1 - \theta \). Hence \[ E[X_i] = 0 \cdot (1 - \theta) + 1 \cdot \theta = \theta. \] Moreover, since each \( X_i \) takes only the values 0 and 1, we have \( X_i^2 = X_i \), and therefore \( E[X_i^2] = E[X_i] = \theta \).
Step 2: Checking unbiasedness.
An estimator \( T(X_1, X_2) \) is unbiased for \( \theta \) if \( E[T(X_1, X_2)] = \theta \). Checking each option:
- (A): \( E\left[\frac{X_1 + X_2}{2}\right] = \frac{E[X_1] + E[X_2]}{2} = \frac{\theta + \theta}{2} = \theta \), so (A) is unbiased.
- (B): \( E\left[\frac{X_1^2 + X_2}{2}\right] = \frac{\theta + \theta}{2} = \theta \), since \( E[X_1^2] = \theta \), so (B) is unbiased.
- (C): \( E\left[\frac{X_1^2 + X_2^2}{2}\right] = \frac{\theta + \theta}{2} = \theta \), by the same reasoning, so (C) is unbiased.
- (D): \( E\left[\frac{X_1 + X_2 - X_1^2}{2}\right] = \frac{\theta + \theta - \theta}{2} = \frac{\theta}{2} \neq \theta \) for \( 0 < \theta < 1 \), so (D) is biased. (Indeed, \( X_1 - X_1^2 = 0 \), so this estimator reduces to \( X_2/2 \).)
Step 3: Conclusion.
The correct answers are (A), (B), and (C).
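The expectations above can be verified exactly by enumerating the four equally structured outcomes \( (x_1, x_2) \in \{0,1\}^2 \) with their Bernoulli probabilities. The sketch below (function and estimator names are illustrative, not from the original) computes \( E[T(X_1, X_2)] \) for each option at a sample value \( \theta = 0.3 \):

```python
from itertools import product

def expectation(estimator, theta):
    """Exact E[T(X1, X2)] for i.i.d. Bernoulli(theta) samples,
    computed by summing over the four outcomes in {0, 1}^2."""
    total = 0.0
    for x1, x2 in product((0, 1), repeat=2):
        # P(X1 = x1) * P(X2 = x2), with P(X = 1) = theta
        p = (theta if x1 else 1 - theta) * (theta if x2 else 1 - theta)
        total += p * estimator(x1, x2)
    return total

# The four candidate estimators from the question
estimators = {
    "(A) (X1 + X2)/2":        lambda x1, x2: (x1 + x2) / 2,
    "(B) (X1^2 + X2)/2":      lambda x1, x2: (x1**2 + x2) / 2,
    "(C) (X1^2 + X2^2)/2":    lambda x1, x2: (x1**2 + x2**2) / 2,
    "(D) (X1 + X2 - X1^2)/2": lambda x1, x2: (x1 + x2 - x1**2) / 2,
}

theta = 0.3
for name, t in estimators.items():
    print(name, expectation(t, theta))
```

Options (A), (B), and (C) all print 0.3 (equal to \( \theta \)), while (D) prints 0.15 (equal to \( \theta/2 \)), confirming that only the first three are unbiased.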
Questions Asked in IIT JAM MS exam
