Question:

Let \( X_1, X_2, \dots, X_n \), where \( n \geq 2 \), be a random sample from a \( N(-\theta, \theta) \) distribution, where \( \theta>0 \) is an unknown parameter. Then which one of the following options is correct?

Hint

For a normal family in which a single parameter determines both the mean and the variance, the sum of squares of the observations is often the minimal sufficient statistic, since it carries the information about both moments.
  • (A) \( \sum_{i=1}^{n} X_i \) is a minimal sufficient statistic
  • (B) \( \sum_{i=1}^{n} X_i^2 \) is a minimal sufficient statistic
  • (C) \( \left( \frac{1}{n} \sum_{i=1}^{n} X_i, \frac{1}{n-1} \sum_{j=1}^{n} \left( X_j - \frac{1}{n} \sum_{i=1}^{n} X_i \right)^2 \right) \) is a complete statistic
  • (D) \( -\frac{1}{n} \sum_{i=1}^{n} X_i \) is a uniformly minimum variance unbiased estimator of \( \theta \)

The correct option is (B).

Solution and Explanation

We are given a random sample \( X_1, X_2, \dots, X_n \) from a normal distribution \( N(-\theta, \theta) \), where \( \theta > 0 \) is the unknown parameter. We need to determine which of the following options is correct. 
Step 1: Understanding the Distribution 
The random variables \( X_1, X_2, \dots, X_n \) are drawn from the normal distribution \( N(-\theta, \theta) \), which means:
  • the mean of the distribution is \( -\theta \),
  • the variance of the distribution is \( \theta \).
Note that the single parameter \( \theta \) determines both moments; this coupling is what drives the answer.
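For reference, each observation then has density \[ f(x; \theta) = \frac{1}{\sqrt{2\pi\theta}} \exp\left( -\frac{(x + \theta)^2}{2\theta} \right), \qquad x \in \mathbb{R}, \; \theta > 0. \]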
Step 2: Factorization Theorem 
To find a sufficient statistic, we apply the Factorization Theorem, which states that a statistic \( T(X) \) is sufficient for \( \theta \) if and only if the likelihood can be factored as \( L(\theta) = g(T(x), \theta)\, h(x) \), where \( h \) does not depend on \( \theta \). The likelihood function for this model is \[ L(\theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\theta}} \exp\left( -\frac{(X_i + \theta)^2}{2\theta} \right) = (2\pi\theta)^{-n/2} \exp\left( -\sum_{i=1}^{n} \frac{(X_i + \theta)^2}{2\theta} \right). \] Expanding \( (X_i + \theta)^2 = X_i^2 + 2\theta X_i + \theta^2 \) in the exponent gives \[ L(\theta) = (2\pi\theta)^{-n/2} \exp\left( -\frac{1}{2\theta} \sum_{i=1}^{n} X_i^2 - \frac{n\theta}{2} \right) \cdot \exp\left( -\sum_{i=1}^{n} X_i \right). \] The last factor is free of \( \theta \), so the \( \theta \)-dependent part of the likelihood involves the data only through \( \sum_{i=1}^{n} X_i^2 \); by factorization, \( \sum_{i=1}^{n} X_i^2 \) is sufficient for \( \theta \).
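A remark that goes beyond the original solution, using standard one-parameter exponential family theory: the factorization above can be written in exponential-family form, \[ L(\theta) = \underbrace{\exp\left( -\sum_{i=1}^{n} X_i \right)}_{h(x)} \cdot (2\pi\theta)^{-n/2} e^{-n\theta/2} \exp\left( \eta(\theta) \sum_{i=1}^{n} X_i^2 \right), \qquad \eta(\theta) = -\frac{1}{2\theta}. \] As \( \theta \) ranges over \( (0, \infty) \), the natural parameter \( \eta(\theta) \) fills the open interval \( (-\infty, 0) \), so this is a full-rank one-parameter exponential family; consequently \( \sum_{i=1}^{n} X_i^2 \) is complete as well as sufficient, a fact used again in Step 4 below.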
Step 3: Identifying the Minimal Sufficient Statistic 
By Step 2, \( T = \sum_{i=1}^{n} X_i^2 \) is sufficient for \( \theta \). It is also minimal: by the likelihood-ratio criterion for minimal sufficiency, a sufficient statistic \( T \) is minimal if, for any two sample points \( x \) and \( y \), the ratio \( L(\theta; x)/L(\theta; y) \) is constant in \( \theta \) exactly when \( T(x) = T(y) \); that holds here, as the computation below shows. Thus \( \sum_{i=1}^{n} X_i^2 \) is the minimal sufficient statistic.
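Spelling the criterion out for this model (a short derivation added for clarity): for sample points \( x = (x_1, \dots, x_n) \) and \( y = (y_1, \dots, y_n) \), the normalizing constants cancel and \[ \frac{L(\theta; x)}{L(\theta; y)} = \exp\left( -\frac{1}{2\theta} \left( \sum_{i=1}^{n} x_i^2 - \sum_{i=1}^{n} y_i^2 \right) \right) \exp\left( \sum_{i=1}^{n} (y_i - x_i) \right). \] The second factor never involves \( \theta \), so the ratio is constant in \( \theta \) if and only if \( \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} y_i^2 \), which is precisely the condition \( T(x) = T(y) \) for \( T(x) = \sum_{i=1}^{n} x_i^2 \).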
Step 4: Re-evaluating the Other Options
(A) \( \sum_{i=1}^{n} X_i \): This is not a sufficient statistic here. The \( \theta \)-dependent factor of the likelihood involves the data only through \( \sum_{i=1}^{n} X_i^2 \), which cannot be recovered from \( \sum_{i=1}^{n} X_i \).
(C) \( \left( \bar{X}, S^2 \right) \), where \( \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i \) and \( S^2 = \frac{1}{n-1} \sum_{j=1}^{n} (X_j - \bar{X})^2 \): This statistic is not complete. Since \( E[\bar{X}] = -\theta \) and \( E[S^2] = \theta \), we have \( E[\bar{X} + S^2] = 0 \) for every \( \theta > 0 \), yet \( \bar{X} + S^2 \) is not identically zero; a nonzero unbiased estimator of zero exists, so completeness fails.
(D) \( -\bar{X} = -\frac{1}{n} \sum_{i=1}^{n} X_i \): This is an unbiased estimator of \( \theta \), since \( E[-\bar{X}] = \theta \), but it is not a function of the complete sufficient statistic \( \sum_{i=1}^{n} X_i^2 \). By the Lehmann–Scheffé theorem the UMVUE must be a function of that statistic; indeed, Rao–Blackwellizing \( -\bar{X} \) by conditioning on \( \sum_{i=1}^{n} X_i^2 \) produces an unbiased estimator with strictly smaller variance. So \( -\bar{X} \) is not a uniformly minimum variance unbiased estimator. A simulation sanity check of these moment facts follows below.
Thus, the correct answer is (B) \( \sum_{i=1}^{n} X_i^2 \).
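As an illustrative sanity check, not part of the original solution: the following Python sketch (assuming NumPy is available; the values \( \theta = 2 \) and \( n = 10 \) are arbitrary choices) verifies empirically that \( E[-\bar{X}] = \theta \) and \( E[\bar{X} + S^2] = 0 \).

import numpy as np

rng = np.random.default_rng(0)

theta = 2.0       # true parameter (arbitrary illustrative value)
n = 10            # sample size per replication
reps = 200_000    # number of Monte Carlo replications

# Draw reps samples of size n from N(-theta, theta).
# NumPy's normal() takes the standard deviation, hence sqrt(theta).
samples = rng.normal(loc=-theta, scale=np.sqrt(theta), size=(reps, n))

xbar = samples.mean(axis=1)        # sample means
s2 = samples.var(axis=1, ddof=1)   # unbiased sample variances

# Unbiasedness of -Xbar: the average of -Xbar should be close to theta.
print("mean of -Xbar:   ", (-xbar).mean())       # approx 2.0

# Non-completeness witness: E[Xbar + S^2] = -theta + theta = 0 for
# every theta, yet Xbar + S^2 is not identically zero.
print("mean of Xbar+S^2:", (xbar + s2).mean())   # approx 0.0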
