Question:

Let \( \hat{\lambda} \) be the Maximum Likelihood Estimator of the parameter \(\lambda\), based on a sample of size \(n\) from a population with probability mass function \( f(x; \lambda) = \frac{e^{-\lambda} \lambda^x}{x!} \); \(x = 0, 1, 2, \dots\), \(\lambda>0\). Then \( \text{Var}(\hat{\lambda}) \) is

Hint

For many common distributions (Normal, Poisson, Exponential, Bernoulli), the MLE of the mean parameter is the sample mean \( \bar{X} \). For an i.i.d. sample of size \(n\), the variance of the sample mean is \( \frac{\sigma^2}{n} \), where \( \sigma^2 \) is the population variance. For a Poisson distribution, \( \sigma^2 = \lambda \), so \( \text{Var}(\bar{X}) = \lambda/n \).
  • (A) \( \lambda \)
  • (B) \( \frac{\lambda}{n^2} \)
  • (C) \( \frac{1}{\lambda} \)
  • (D) \( \frac{\lambda}{n} \)

The Correct Option is (D): \( \frac{\lambda}{n} \)

Solution and Explanation

Step 1: Understanding the Concept:
The problem asks for the variance of the Maximum Likelihood Estimator (MLE) of the parameter \(\lambda\) of a Poisson distribution. The first step is to find the MLE, \( \hat{\lambda} \), and the second step is to calculate its variance.

Step 2: Key Formula or Approach:
1. Find the MLE of \(\lambda\); the standard result is \( \hat{\lambda} = \bar{X} \).
2. Find the variance of the MLE using the properties of variance:
\[ \text{Var}(\hat{\lambda}) = \text{Var}(\bar{X}) = \text{Var}\left(\frac{1}{n}\sum X_i\right) = \frac{1}{n^2}\text{Var}\left(\sum X_i\right) \]
3. Since the \(X_i\) are independent, \(\text{Var}\left(\sum X_i\right) = \sum \text{Var}(X_i)\).
4. For a Poisson(\(\lambda\)) distribution, \( \text{Var}(X_i) = \lambda \).

Step 3: Detailed Explanation:
Part 1: Finding the MLE \( \hat{\lambda} \)

The likelihood function for a sample \(x_1, \dots, x_n\) is:
\[ L(\lambda) = \prod_{i=1}^n \frac{e^{-\lambda}\lambda^{x_i}}{x_i!} = \frac{e^{-n\lambda}\lambda^{\sum x_i}}{\prod x_i!} \]
The log-likelihood is:
\[ l(\lambda) = \ln L(\lambda) = -n\lambda + \left(\sum x_i\right)\ln(\lambda) - \ln\left(\prod x_i!\right) \]
Differentiate with respect to \(\lambda\) and set the derivative to zero:
\[ \frac{dl}{d\lambda} = -n + \frac{\sum x_i}{\lambda} = 0 \implies \frac{\sum x_i}{\lambda} = n \implies \lambda = \frac{\sum x_i}{n} = \bar{x} \]
So the MLE is \( \hat{\lambda} = \bar{X} \).

Part 2: Finding the Variance of \( \hat{\lambda} \)

We need \( \text{Var}(\hat{\lambda}) = \text{Var}(\bar{X}) \):
\[ \text{Var}(\bar{X}) = \text{Var}\left(\frac{1}{n} \sum_{i=1}^n X_i\right) \]
Using the variance property \(\text{Var}(aX) = a^2\,\text{Var}(X)\):
\[ \text{Var}(\bar{X}) = \frac{1}{n^2} \text{Var}\left(\sum_{i=1}^n X_i\right) \]
Since the observations come from a random sample, they are independent, so the variance of the sum is the sum of the variances:
\[ \text{Var}(\bar{X}) = \frac{1}{n^2} \sum_{i=1}^n \text{Var}(X_i) \]
For a Poisson(\(\lambda\)) distribution, the variance of a single observation is \(\text{Var}(X_i) = \lambda\), so
\[ \text{Var}(\bar{X}) = \frac{1}{n^2} \sum_{i=1}^n \lambda = \frac{1}{n^2} (n\lambda) = \frac{\lambda}{n} \]
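As an optional cross-check (not part of the original solution), the derivative step in Part 1 can be reproduced symbolically. This is a minimal sketch assuming SymPy is available; the symbol names `lam`, `n`, and `s` are ad hoc, with `s` standing for \(\sum x_i\).

```python
# Minimal SymPy sketch: reproduce d l/d(lambda) = 0  =>  lambda_hat = (sum x_i)/n
import sympy as sp

lam, n, s = sp.symbols('lambda n s', positive=True)  # s = sum of x_i (assumption)

# Log-likelihood up to the constant -ln(prod x_i!), which does not involve lambda
log_lik = -n * lam + s * sp.log(lam)

# Solve d l / d lambda = 0 for lambda
mle = sp.solve(sp.Eq(sp.diff(log_lik, lam), 0), lam)
print(mle)  # [s/n], i.e. the sample mean
```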
Step 4: Final Answer:
The variance of the MLE \( \hat{\lambda} \) is \( \frac{\lambda}{n} \).
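As a further sanity check, here is a small Monte Carlo sketch (not part of the original solution; the choices \(\lambda = 4\), \(n = 50\), and the number of replications are arbitrary) showing the empirical variance of \( \hat{\lambda} = \bar{X} \) agreeing with \( \lambda/n \).

```python
# Monte Carlo check: empirical Var(lambda_hat) vs. theoretical lambda/n
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 4.0, 50, 200_000  # arbitrary illustrative values

# Draw `reps` Poisson samples of size n and compute the MLE (sample mean) of each
samples = rng.poisson(lam, size=(reps, n))
lam_hat = samples.mean(axis=1)

print(lam_hat.var())  # empirical variance, close to 0.08
print(lam / n)        # theoretical lambda / n = 0.08
```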