Question:

Let \( X_1, X_2, \dots, X_n \) be a random sample from a distribution with the probability density function \[ f(x; \theta) = \begin{cases} c(\theta) e^{-(x - \theta)}, & x \geq 2\theta, \\ 0, & \text{otherwise}, \end{cases} \] where \( \theta \in \mathbb{R} \) is the unknown parameter. Then which of the following statement(s) is (are) true? 
 

Hint: For distributions whose support depends on the parameter, the MLE is found by maximizing the likelihood over the range of \( \theta \) permitted by the support, which often yields the minimum or maximum of the sample values.
  • (A) The maximum likelihood estimator of \( \theta \) is \( \dfrac{\min\{X_1, X_2, \dots, X_n\}}{2} \)
  • (B) \( c(\theta) = 1 \), for all \( \theta \in \mathbb{R} \)
  • (C) The maximum likelihood estimator of \( \theta \) is \( \min\{X_1, X_2, \dots, X_n\} \)
  • (D) The maximum likelihood estimator of \( \theta \) does not exist

The Correct Option is A

Solution and Explanation

Step 1: Find $c(\theta)$

For this to be a valid PDF: $$\int_{2\theta}^{\infty} c(\theta) e^{-(x-\theta)} dx = 1$$

Let $u = x - \theta$, then $du = dx$. When $x = 2\theta$, $u = \theta$: $$c(\theta) \int_{\theta}^{\infty} e^{-u} du = c(\theta) \left[-e^{-u}\right]_{\theta}^{\infty} = c(\theta) e^{-\theta} = 1$$

Therefore: $c(\theta) = e^{\theta}$

So: $f(x;\theta) = e^{\theta} e^{-(x-\theta)} = e^{-x+2\theta}$ for $x \geq 2\theta$

Check Option (B): $c(\theta) = e^{\theta}$, which equals $1$ only at $\theta = 0$, not for all $\theta \in \mathbb{R}$.

Option (B) is FALSE 
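
As a quick numerical sanity check (a sketch, not part of the exam solution), one can confirm that with $c(\theta) = e^{\theta}$ the density integrates to $1$ for a few values of $\theta$, here using SciPy's `quad`:

```python
import numpy as np
from scipy.integrate import quad

def pdf(x, theta):
    # f(x; theta) = e^theta * e^{-(x - theta)} on x >= 2*theta, 0 otherwise
    return np.exp(theta) * np.exp(-(x - theta)) if x >= 2 * theta else 0.0

for theta in (-1.0, 0.0, 2.5):
    total, _err = quad(pdf, 2 * theta, np.inf, args=(theta,))
    print(f"theta = {theta:+.1f}: integral = {total:.6f}")  # ~1.000000 each
```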

Step 2: Find the likelihood function

For a sample $(x_1, x_2, ..., x_n)$, the likelihood is: $$L(\theta) = \prod_{i=1}^n f(x_i;\theta) = \prod_{i=1}^n e^{-x_i+2\theta} = e^{2n\theta} e^{-\sum_{i=1}^n x_i}$$

This is valid only when $x_i \geq 2\theta$ for all $i$, which means: $$\theta \leq \frac{\min\{x_1, x_2, \dots, x_n\}}{2}$$

Step 3: Maximize the likelihood

The log-likelihood is: $$\ell(\theta) = 2n\theta - \sum_{i=1}^n x_i$$

Taking the derivative: $$\frac{d\ell}{d\theta} = 2n > 0$$

Since the derivative is always positive, $\ell(\theta)$ is strictly increasing in $\theta$.

Therefore, the likelihood is maximized at the largest admissible value of $\theta$, which is: $$\hat{\theta}_{\text{MLE}} = \frac{\min\{x_1, x_2, \dots, x_n\}}{2}$$
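
To illustrate this numerically (a hedged sketch with simulated data, not part of the original solution): since $f(x;\theta) = e^{-(x - 2\theta)}$ for $x \geq 2\theta$, a draw from the model is $X = 2\theta + E$ with $E \sim \mathrm{Exp}(1)$. The snippet below evaluates $\ell(\theta)$ on a grid, returning $-\infty$ where the support constraint fails, and confirms the maximizer sits at $\min\{x_i\}/2$ (up to grid spacing):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 1.5                                    # arbitrary illustrative value
x = 2 * theta_true + rng.exponential(1.0, size=50)  # X = 2*theta + Exp(1)

def log_lik(theta, x):
    # log-likelihood 2*n*theta - sum(x), valid only while theta <= min(x)/2
    if theta > x.min() / 2:
        return -np.inf                              # support constraint violated
    return 2 * len(x) * theta - x.sum()

grid = np.linspace(theta_true - 1.0, theta_true + 1.0, 2001)
values = np.array([log_lik(t, x) for t in grid])
print("grid maximizer:", grid[values.argmax()])
print("min(x)/2      :", x.min() / 2)               # agree up to grid spacing
```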

Step 4: Verify the options

(A) The MLE of $\theta$ is $\frac{\min\{x_1, x_2, \dots, x_n\}}{2}$

This matches our result.

Option (A) is TRUE 

(C) The MLE of $\theta$ is $\min\{x_1, x_2, \dots, x_n\}$

Let $x_{\min} = \min\{x_1, \dots, x_n\}$. If $x_{\min} > 0$, then $\theta = x_{\min}$ violates the support constraint $x_{\min} \geq 2\theta$ (it would require $x_{\min} \geq 2x_{\min}$), so the likelihood there is $0$. If $x_{\min} < 0$, then $x_{\min} < x_{\min}/2$, so this value is admissible but, since $\ell(\theta)$ is strictly increasing, it gives a strictly smaller likelihood than $x_{\min}/2$. Either way, $\min\{x_1, \dots, x_n\}$ is not the MLE.

Option (C) is FALSE 

(D) The MLE of $\theta$ does not exist

We found that the MLE exists and equals $\frac{\min\{x_1, x_2, \dots, x_n\}}{2}$.

Option (D) is FALSE 

Answer: (A)
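
Optionally, a small simulation (again an illustrative sketch, not from the source) can check consistency: because $\min_i(X_i - 2\theta) \sim \mathrm{Exp}(n)$, the MLE $\min\{X_i\}/2$ exceeds $\theta$ by $1/(2n)$ on average, so it should converge to the true $\theta$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(42)
theta_true = -0.7                                   # works for negative theta too
for n in (10, 100, 10_000):
    x = 2 * theta_true + rng.exponential(1.0, size=n)
    print(f"n = {n:>6}: MLE = {x.min() / 2:+.4f}  (true theta = {theta_true})")
```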
