Question:

Let \( X_1, X_2, \dots, X_n \) be a random sample from a distribution with the probability density function \[ f(x; \theta) = \begin{cases} \theta^2 x e^{-\theta x}, & x > 0, \\ 0, & \text{otherwise}, \end{cases} \] where \( \theta > 0 \) is the unknown parameter. If \( Y = \sum_{i=1}^n X_i \), then which of the following statement(s) is (are) true? 
 

Hint: For this Gamma model, the sum \( Y = \sum_{i=1}^n X_i \) of the i.i.d. observations is itself Gamma distributed and is a complete sufficient statistic for \( \theta \); an unbiased estimator of \( \theta \) can then be obtained from the inverse moment \( E[1/Y] \).

Updated On: Dec 17, 2025
  • \( Y \) is a complete sufficient statistic for \( \theta \)
  • \( \frac{2n}{Y} \) is the uniformly minimum variance unbiased estimator of \( \theta \)
  • \( \frac{2n-1}{Y} \) is the uniformly minimum variance unbiased estimator of \( \theta \)
  • \( \frac{2n+1}{Y} \) is the uniformly minimum variance unbiased estimator of \( \theta \)
Verified By Collegedunia

The correct options are (A) and (C).

Solution and Explanation

Step 1: Identify the distribution

This is a Gamma distribution: $X_i \sim \text{Gamma}(2, \theta)$ with shape parameter $\alpha = 2$ and rate parameter $\theta$.

Step 2: Find the distribution of $Y$

The sum of independent Gamma random variables with a common rate parameter is again Gamma, with the shape parameters adding, so: $$Y = \sum_{i=1}^n X_i \sim \text{Gamma}(2n, \theta)$$

The PDF of $Y$ is: $$f_Y(y;\theta) = \frac{\theta^{2n} y^{2n-1} e^{-\theta y}}{\Gamma(2n)}, \quad y > 0$$
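This fact can be checked numerically with a short simulation (a sketch using only the Python standard library; the values `theta = 2.0`, `n = 5`, and the replicate count are illustrative choices, not part of the problem):

```python
import random
import statistics

random.seed(0)

theta = 2.0   # illustrative rate parameter (assumption)
n = 5         # illustrative sample size (assumption)
reps = 20000  # number of Monte Carlo replicates

# Each X_i ~ Gamma(shape=2, rate=theta). random.gammavariate takes
# (shape, scale), so the scale argument is 1/theta.
ys = [sum(random.gammavariate(2, 1 / theta) for _ in range(n))
      for _ in range(reps)]

# Gamma(2n, theta) has mean 2n/theta and variance 2n/theta^2.
print(statistics.mean(ys))      # should be near 2n/theta = 5.0
print(statistics.variance(ys))  # should be near 2n/theta^2 = 2.5
```

The simulated mean and variance of \( Y \) agree with the Gamma(\(2n, \theta\)) moments, consistent with the shape-addition property.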

Option (A): $Y$ is a complete sufficient statistic for $\theta$

Sufficiency: By the factorization theorem, the likelihood function is: $$L(\theta; x_1, ..., x_n) = \prod_{i=1}^n \theta^2 x_i e^{-\theta x_i} = \theta^{2n} \left(\prod_{i=1}^n x_i\right) e^{-\theta \sum_{i=1}^n x_i}$$

This can be written as: $$L(\theta; x_1, ..., x_n) = g(Y, \theta) h(x_1, ..., x_n)$$

where $Y = \sum_{i=1}^n x_i$. Thus, $Y$ is a sufficient statistic.

Completeness: The family $\{\text{Gamma}(2n, \theta) : \theta > 0\}$ is a full-rank exponential family with natural parameter $-\theta$, and such families are complete. Therefore, $Y$ is a complete sufficient statistic.

Option (A) is TRUE 

Option (B): $\frac{2n}{Y}$ is the UMVUE of $\theta$

First, find $E[Y]$ and $E[1/Y]$:

For $Y \sim \text{Gamma}(2n, \theta)$: $$E[Y] = \frac{2n}{\theta}$$

For the reciprocal moment (which exists when $2n > 1$): $$E\left[\frac{1}{Y}\right] = \int_0^\infty \frac{1}{y} \cdot \frac{\theta^{2n} y^{2n-1} e^{-\theta y}}{\Gamma(2n)}\, dy = \frac{\theta^{2n}}{\Gamma(2n)} \cdot \frac{\Gamma(2n-1)}{\theta^{2n-1}} = \frac{\theta}{2n - 1}$$
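The inverse-moment formula can also be verified by Monte Carlo (a sketch with illustrative values `theta = 2.0`, `n = 5`; these are assumptions for the check, not part of the problem):

```python
import random
import statistics

random.seed(1)

theta = 2.0   # illustrative rate (assumption)
n = 5         # illustrative sample size (assumption)
reps = 50000

# Draw Y ~ Gamma(2n, theta) directly: shape 2n, scale 1/theta.
inv_y = [1 / random.gammavariate(2 * n, 1 / theta) for _ in range(reps)]

mc = statistics.mean(inv_y)
exact = theta / (2 * n - 1)  # theoretical value theta/(2n-1)
print(mc, exact)  # the two values should be close
```

With these values the exact answer is \( 2/9 \approx 0.222 \), and the Monte Carlo average lands close to it.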

Therefore: $$E\left[\frac{2n}{Y}\right] = 2n \cdot \frac{\theta}{2n - 1} = \frac{2n\theta}{2n - 1} \neq \theta$$

So $\frac{2n}{Y}$ is biased.

Option (B) is FALSE 

Option (C): $\frac{2n-1}{Y}$ is the UMVUE of $\theta$

From above: $$E\left[\frac{1}{Y}\right] = \frac{\theta}{2n - 1}$$

Therefore: $$E\left[\frac{2n-1}{Y}\right] = (2n-1) \cdot \frac{\theta}{2n - 1} = \theta$$

So $\frac{2n-1}{Y}$ is unbiased.

Since $Y$ is a complete sufficient statistic and $\frac{2n-1}{Y}$ is an unbiased estimator of $\theta$, by the Lehmann-Scheffé theorem, $\frac{2n-1}{Y}$ is the UMVUE of $\theta$.

Option (C) is TRUE 

Option (D): $\frac{2n+1}{Y}$ is the UMVUE of $\theta$

$$E\left[\frac{2n+1}{Y}\right] = (2n+1) \cdot \frac{\theta}{2n - 1} = \frac{(2n+1)\theta}{2n - 1} \neq \theta$$

So $\frac{2n+1}{Y}$ is biased.

Option (D) is FALSE 
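The bias comparison across all three candidate estimators can be checked in one simulation (a sketch; `theta = 2.0` and `n = 5` are illustrative assumptions, and only the constant \(2n-1\) should recover \(\theta\)):

```python
import random
import statistics

random.seed(2)

theta = 2.0   # illustrative true rate (assumption)
n = 5         # illustrative sample size (assumption)
reps = 50000

# Y ~ Gamma(2n, theta); gammavariate takes (shape, scale).
ys = [random.gammavariate(2 * n, 1 / theta) for _ in range(reps)]

for c in (2 * n, 2 * n - 1, 2 * n + 1):
    est = statistics.mean(c / y for y in ys)
    # E[c/Y] = c * theta / (2n - 1); only c = 2n - 1 returns theta.
    print(c, est)
```

For \( n = 5, \theta = 2 \): the \(2n/Y\) average sits near \(20/9 \approx 2.22\) (biased high), \((2n-1)/Y\) near \(2.0\) (unbiased), and \((2n+1)/Y\) near \(22/9 \approx 2.44\) (biased high), matching the analysis above.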

Answer: (A) and (C) 

This question was asked in the IIT JAM MS exam.
