Question:

Let \(x_1, x_2, \dots, x_n\) be an independent and identically distributed (iid) random sample drawn from a population that follows the normal distribution \(N(\mu, \sigma^2)\), where both the mean \(\mu\) and the variance \(\sigma^2\) are unknown. Let \(\bar{x}\) be the sample mean. The maximum likelihood estimator (MLE) of the variance, \(\hat{\sigma}^2_{MLE}\), is then characterized by

Updated On: Aug 21, 2025
  • \({\hat{\sigma}}^2_{MLE}=\frac{1}{n}\sum^n_{i=1}(x_i-\bar{x})^2\), which is a biased estimator of \(\sigma^2\)
  • \({\hat{\sigma}}^2_{MLE}=\frac{1}{n}\sum^n_{i=1}(x_i^2-\bar{x})^2\), which is a consistent estimator of \(\sigma^2\)
  • \({\hat{\sigma}}^2_{MLE}=\frac{1}{n-1}\sum^n_{i=1}(x_i-\bar{x})^2\), which is an unbiased estimator of \(\sigma^2\)
  • \({\hat{\sigma}}^2_{MLE}=\frac{1}{n-1}\sum^{n-1}_{i=1}(x_i-\bar{x})^2\), which is an unbiased and consistent estimator of \(\sigma^2\)

The Correct Option is A

Solution and Explanation

Step 1: Assumptions 
We are given a random sample \(x_1, x_2, \dots, x_n\) from: \[ X_i \sim N(\mu, \sigma^2), \quad i = 1,2,\dots,n \] Both \(\mu\) and \(\sigma^2\) are unknown. The sample mean is: \[ \bar{x} = \frac{1}{n} \sum_{i=1}^n x_i \]

Step 2: Likelihood function
The joint probability density function is: \[ L(\mu,\sigma^2) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_i-\mu)^2}{2\sigma^2}\right) \] Taking the natural log (log-likelihood): \[ \ell(\mu,\sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n (x_i-\mu)^2 \]

Step 3: Estimation of parameters
- Maximizing w.r.t. \(\mu\) gives \(\hat{\mu} = \bar{x}\). - Substituting into the log-likelihood and maximizing w.r.t. \(\sigma^2\) gives: \[ \hat{\sigma}^2_{MLE} = \frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2 \]
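As a numerical sanity check (not part of the original derivation), the closed-form MLEs can be verified by evaluating the log-likelihood from Step 2: at \(\hat{\mu} = \bar{x}\) and \(\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2\) it should exceed its value at nearby parameter points. A minimal Python sketch, where the true parameters, seed, and sample size are arbitrary choices:

```python
import math
import random

def log_likelihood(xs, mu, sigma2):
    # Normal log-likelihood: -(n/2) ln(2*pi*sigma^2) - sum((x-mu)^2) / (2*sigma^2)
    n = len(xs)
    ss = sum((x - mu) ** 2 for x in xs)
    return -0.5 * n * math.log(2 * math.pi * sigma2) - ss / (2 * sigma2)

random.seed(0)
xs = [random.gauss(5.0, 2.0) for _ in range(200)]  # illustrative sample

# Closed-form MLEs from Step 3
mu_hat = sum(xs) / len(xs)
s2_hat = sum((x - mu_hat) ** 2 for x in xs) / len(xs)

# The log-likelihood at the MLE beats nearby perturbed parameter values
ll_star = log_likelihood(xs, mu_hat, s2_hat)
assert ll_star > log_likelihood(xs, mu_hat + 0.1, s2_hat)
assert ll_star > log_likelihood(xs, mu_hat, s2_hat * 1.1)
```

The same check passes for any perturbation direction, since the log-likelihood is strictly maximized at \((\hat{\mu}, \hat{\sigma}^2)\).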

Step 4: Bias property
This estimator is biased (downward) because: \[ E\left[\hat{\sigma}^2_{MLE}\right] = \frac{n-1}{n}\sigma^2 < \sigma^2 \] An unbiased estimator would instead divide by \(n-1\): \[ S^2 = \frac{1}{n-1}\sum_{i=1}^n (x_i - \bar{x})^2 \] Note that the bias factor \(\frac{n-1}{n} \to 1\) as \(n \to \infty\), so the MLE is still a consistent estimator of \(\sigma^2\); it is only unbiasedness that fails.
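The bias factor \(\frac{n-1}{n}\) can be illustrated by simulation: averaging the MLE over many independent samples should come out close to \(\frac{n-1}{n}\sigma^2\), not \(\sigma^2\). A sketch where \(\mu = 0\), \(\sigma^2 = 4\), \(n = 5\), the seed, and the trial count are all arbitrary assumptions:

```python
import random

random.seed(42)
MU, SIGMA2, N, TRIALS = 0.0, 4.0, 5, 20000

def mle_var(xs):
    # MLE of the variance: sum of squared deviations divided by n
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

avg = sum(
    mle_var([random.gauss(MU, SIGMA2 ** 0.5) for _ in range(N)])
    for _ in range(TRIALS)
) / TRIALS

# Theory predicts E[MLE] = (n-1)/n * sigma^2 = (4/5) * 4 = 3.2, not 4
print(round(avg, 2))
```

With a small \(n\) such as 5 the bias is pronounced; rerunning with larger \(n\) shows the average drifting toward \(\sigma^2\), consistent with the factor \(\frac{n-1}{n} \to 1\).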

Final Answer:
The MLE of the variance is \[ \boxed{\hat{\sigma}^2_{MLE} = \frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2} \] and it is a biased estimator of \(\sigma^2\).
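This distinction maps directly onto Python's standard library: `statistics.pvariance` uses denominator \(n\) (the MLE under normality), while `statistics.variance` uses denominator \(n-1\) (the unbiased \(S^2\)). The data set below is an arbitrary illustrative example:

```python
import statistics

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # mean is 5.0
n = len(xs)

mle = statistics.pvariance(xs)  # divides by n   -> the MLE
unb = statistics.variance(xs)   # divides by n-1 -> unbiased S^2

# The two estimators differ exactly by the factor (n-1)/n
assert abs(mle - (n - 1) / n * unb) < 1e-12
print(mle, unb)  # 4.0 and 32/7 ≈ 4.571 for this data set
```

Both estimators shrink toward each other as \(n\) grows, which is why the bias of the MLE matters mainly for small samples.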


Questions Asked in GATE XH-C1 exam
