Question:

Let \( X_1, X_2, \dots, X_n \) be a random sample of size \( n \geq 2 \) from a population having probability density function \[ f(x; \theta) = \left\{ \begin{array}{ll} \frac{2}{\theta x} \left( -\log_e x \right) e^{-\frac{(\log_e x)^2}{\theta}} & \text{if } 0 < x < 1, \\ 0 & \text{otherwise} \end{array} \right. \] where \( \theta > 0 \) is an unknown parameter. Then which one of the following statements is true?

Hint:

- To find the maximum likelihood estimator, take the derivative of the log-likelihood function with respect to the parameter and set it equal to zero.
- The MLE is obtained by solving this equation for the parameter in terms of the sample data (see the generic template sketched below).
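
As a generic template for these two steps (a sketch, with \( \ell \) denoting the log-likelihood of the sample): \[ \ell(\theta) = \sum_{i=1}^{n} \log f(X_i; \theta), \qquad \frac{d\ell(\theta)}{d\theta} \bigg|_{\theta = \hat{\theta}} = 0, \qquad \frac{d^2 \ell(\theta)}{d\theta^2} \bigg|_{\theta = \hat{\theta}} < 0 \]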
  • \( \frac{1}{n} \sum_{i=1}^{n} (\log_e X_i)^2 \) is the maximum likelihood estimator of \( \theta \)
  • \( \frac{1}{n-1} \sum_{i=1}^{n} (\log_e X_i) \) is the maximum likelihood estimator of \( \theta \)
  • \( \frac{1}{n} \sum_{i=1}^{n} \log_e X_i \) is the maximum likelihood estimator of \( \theta \)
  • \( \frac{1}{n-1} \sum_{i=1}^{n} \log_e X_i \) is the maximum likelihood estimator of \( \theta \)

The Correct Option is A

Solution and Explanation

1) Understanding the Problem: 
The given probability density function (pdf) depends on the data only through \( \log_e x \), which suggests working with the transformed values \( (\log_e X_i)^2 \). The objective is to find the maximum likelihood estimator (MLE) of the parameter \( \theta \).
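One quick way to see the structure (a worked check, added for clarity): with the change of variables \( y = (\log_e x)^2 \), i.e. \( x = e^{-\sqrt{y}} \) and \( \left| \frac{dx}{dy} \right| = \frac{e^{-\sqrt{y}}}{2\sqrt{y}} \), the density of \( Y = (\log_e X)^2 \) is \[ f_Y(y) = \frac{2\sqrt{y}}{\theta} \, e^{\sqrt{y}} \, e^{-y/\theta} \cdot \frac{e^{-\sqrt{y}}}{2\sqrt{y}} = \frac{1}{\theta} e^{-y/\theta}, \quad y > 0, \] so the \( Y_i = (\log_e X_i)^2 \) are i.i.d. exponential with mean \( \theta \), and the MLE of an exponential mean is the sample mean. This previews why option (A) is correct.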
2) Likelihood Function: 
The likelihood function for \( n \) independent observations from this distribution is: \[ L(\theta) = \prod_{i=1}^{n} \frac{2}{\theta X_i} \left( -\log_e X_i \right) e^{-\frac{(\log_e X_i)^2}{\theta}} \] Taking the natural logarithm gives the log-likelihood: \[ \log L(\theta) = n \log 2 - n \log \theta + \sum_{i=1}^{n} \log\left( -\log_e X_i \right) - \sum_{i=1}^{n} \log_e X_i - \frac{1}{\theta} \sum_{i=1}^{n} (\log_e X_i)^2 \] Only the terms \( -n \log \theta - \frac{1}{\theta} \sum_{i=1}^{n} (\log_e X_i)^2 \) depend on \( \theta \). 
3) Maximizing the Log-Likelihood: 
To find the MLE, we differentiate the log-likelihood with respect to \( \theta \) and set it equal to zero: \[ \frac{d}{d\theta} \log L(\theta) = -\frac{n}{\theta} + \frac{1}{\theta^2} \sum_{i=1}^{n} (\log_e X_i)^2 = 0 \] Solving for \( \theta \): \[ \hat{\theta} = \frac{1}{n} \sum_{i=1}^{n} (\log_e X_i)^2 \] The second derivative at \( \hat{\theta} \) is \( \frac{n}{\hat{\theta}^2} - \frac{2}{\hat{\theta}^3} \sum_{i=1}^{n} (\log_e X_i)^2 = -\frac{n}{\hat{\theta}^2} < 0 \), confirming this critical point is a maximum. Thus the maximum likelihood estimator of \( \theta \) is \( \frac{1}{n} \sum_{i=1}^{n} (\log_e X_i)^2 \), which is option (A).
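
A quick numerical sanity check (a sketch, not part of the original solution; it assumes NumPy/SciPy are available, and the values of `theta_true` and `n` are arbitrary illustrative choices): simulate from the model using \( (\log_e X)^2 \sim \text{Exp}(\theta) \), then compare the closed-form MLE with a direct numerical maximization of the log-likelihood.

```python
# Sketch: numerical check of the MLE theta_hat = mean((log X_i)^2).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
theta_true, n = 2.5, 10_000  # illustrative values, not from the problem

# Simulate: Y = (log X)^2 ~ Exp(mean theta), so X = exp(-sqrt(Y)).
y = rng.exponential(scale=theta_true, size=n)
x = np.exp(-np.sqrt(y))

# Closed-form MLE from the derivation above.
theta_hat = np.mean(np.log(x) ** 2)

# Numerical maximization: minimize the theta-dependent part of -log L.
s = np.sum(np.log(x) ** 2)
neg_loglik = lambda t: n * np.log(t) + s / t
res = minimize_scalar(neg_loglik, bounds=(1e-6, 50.0), method="bounded")

print(f"closed-form MLE: {theta_hat:.4f}")  # close to theta_true
print(f"numerical  MLE: {res.x:.4f}")       # agrees with the closed form
```

Both estimates should coincide and land near `theta_true` for large \( n \), consistent with \( \hat{\theta} \) being the sample mean of the exponential variables \( (\log_e X_i)^2 \).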
