Let \( X_1, X_2, \dots, X_n \) be a random sample from a distribution with the probability density function \[ f(x; \theta) = \begin{cases} c(\theta) e^{-(x - \theta)}, & x \geq 2\theta, \\ 0, & \text{otherwise}, \end{cases} \] where \( \theta \in \mathbb{R} \) is the unknown parameter. Then which of the following statement(s) is (are) true?
(A) The maximum likelihood estimator of \( \theta \) is \( \dfrac{\min\{X_1, X_2, \dots, X_n\}}{2} \)
(B) \( c(\theta) = 1 \)
(C) The maximum likelihood estimator of \( \theta \) is \( \min\{X_1, X_2, \dots, X_n\} \)
(D) The maximum likelihood estimator of \( \theta \) does not exist
Step 1: Find $c(\theta)$
For this to be a valid PDF: $$\int_{2\theta}^{\infty} c(\theta) e^{-(x-\theta)} dx = 1$$
Let $u = x - \theta$, then $du = dx$. When $x = 2\theta$, $u = \theta$: $$c(\theta) \int_{\theta}^{\infty} e^{-u} du = c(\theta) \left[-e^{-u}\right]_{\theta}^{\infty} = c(\theta) e^{-\theta} = 1$$
Therefore: $c(\theta) = e^{\theta}$
So: $f(x;\theta) = e^{\theta} e^{-(x-\theta)} = e^{-x+2\theta}$ for $x \geq 2\theta$
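As a quick sanity check (not part of the original solution), the normalization can be verified numerically. The sketch below integrates $f(x;\theta) = e^{-(x-2\theta)}$ over its support with a plain midpoint rule; the cutoff `upper` is an assumption chosen so the truncated tail is negligible:

```python
import math

def pdf(x, theta):
    # f(x; theta) = exp(-(x - 2*theta)) for x >= 2*theta, and 0 otherwise
    return math.exp(-(x - 2 * theta)) if x >= 2 * theta else 0.0

def total_mass(theta, upper=40.0, steps=400000):
    # Midpoint-rule integral of f over [2*theta, 2*theta + upper];
    # the tail beyond 2*theta + 40 contributes only about e^{-40}
    h = upper / steps
    return sum(pdf(2 * theta + (k + 0.5) * h, theta) for k in range(steps)) * h

for theta in (-1.5, 0.0, 2.3):
    print(theta, total_mass(theta))  # each integral should be very close to 1
```

The result is close to 1 for every $\theta$, consistent with $c(\theta) = e^{\theta}$.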
Check Option (B): $c(\theta) = e^{\theta} \neq 1$
Option (B) is FALSE
Step 2: Find the likelihood function
For a sample $(x_1, x_2, ..., x_n)$, the likelihood is: $$L(\theta) = \prod_{i=1}^n f(x_i;\theta) = \prod_{i=1}^n e^{-x_i+2\theta} = e^{2n\theta} e^{-\sum_{i=1}^n x_i}$$
This is valid only when $x_i \geq 2\theta$ for all $i$, which means: $$\theta \leq \frac{\min\{x_1, x_2, \dots, x_n\}}{2}$$
Step 3: Maximize the likelihood
The log-likelihood is: $$\ell(\theta) = 2n\theta - \sum_{i=1}^n x_i$$
Taking the derivative: $$\frac{d\ell}{d\theta} = 2n > 0$$
Since the derivative is always positive, $\ell(\theta)$ is strictly increasing in $\theta$.
Therefore, the likelihood is maximized at the largest possible value of $\theta$, which is: $$\hat{\theta}_{\text{MLE}} = \frac{\min\{x_1, x_2, \dots, x_n\}}{2}$$
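This conclusion can be checked numerically (a sketch, not part of the original solution): if $E \sim \text{Exp}(1)$, then $X = 2\theta + E$ has exactly the density derived above, so we can simulate a sample and scan the log-likelihood over a grid of $\theta$ values; the maximizer should land at $\min\{x_i\}/2$. The sample size, seed, and grid spacing below are arbitrary choices:

```python
import math
import random

random.seed(42)
theta_true = 1.7
n = 50
# X = 2*theta + E with E ~ Exp(1) has density e^{-(x - 2*theta)} on [2*theta, inf)
sample = [2 * theta_true + random.expovariate(1.0) for _ in range(n)]

def log_lik(theta, xs):
    # log L(theta) = 2*n*theta - sum(x_i) on the feasible region theta <= min(x)/2
    if theta > min(xs) / 2:
        return float("-inf")  # some x_i < 2*theta, so the likelihood is zero
    return 2 * len(xs) * theta - sum(xs)

# Scan a grid; the maximizer should be the grid point just below min(sample)/2
grid = [i * 1e-4 for i in range(40000)]
best = max(grid, key=lambda t: log_lik(t, sample))
print(best, min(sample) / 2)  # best is within one grid step of min(sample)/2
```

The grid maximizer agrees with $\min\{x_i\}/2$ up to the grid resolution, matching the analytical result.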
Step 4: Verify the options
(A) The MLE of $\theta$ is $\frac{\min\{x_1, x_2, \dots, x_n\}}{2}$
This matches our result.
Option (A) is TRUE
(C) The MLE of $\theta$ is $\min\{x_1, x_2, \dots, x_n\}$
The likelihood is positive only when $\theta \leq \min\{x_i\}/2$. Setting $\hat{\theta} = \min\{x_i\}$ violates this constraint whenever $\min\{x_i\} > 0$ (it would require $x_{\min} \geq 2x_{\min}$), and even when $\min\{x_i\} \leq 0$, it lies strictly below the maximizer $\min\{x_i\}/2$.
Option (C) is FALSE
(D) The MLE of $\theta$ does not exist
We found that the MLE exists and equals $\frac{\min\{x_1, x_2, \dots, x_n\}}{2}$.
Option (D) is FALSE
Answer: (A)
Let \( X_1, X_2, \dots, X_7 \) be a random sample from a population having the probability density function \[ f(x) = \frac{1}{2} \lambda^3 x^2 e^{-\lambda x}, \quad x>0, \] where \( \lambda>0 \) is an unknown parameter. Let \( \hat{\lambda} \) be the maximum likelihood estimator of \( \lambda \), and \( E(\hat{\lambda} - \lambda) = \alpha \lambda \) be the corresponding bias, where \( \alpha \) is a real constant. Then the value of \( \frac{1}{\alpha} \) equals __________ (answer in integer).
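A sketch of one way to work this out (not given in the original): each $X_i$ has a Gamma density with shape $3$ and rate $\lambda$, so with $n = 7$ the log-likelihood and its critical point are
$$\ell(\lambda) = 7\ln\tfrac{1}{2} + 21\ln\lambda + 2\sum_{i=1}^{7}\ln x_i - \lambda \sum_{i=1}^{7} x_i, \qquad \frac{d\ell}{d\lambda} = \frac{21}{\lambda} - \sum_{i=1}^{7} x_i = 0 \;\Longrightarrow\; \hat{\lambda} = \frac{21}{T}, \quad T = \sum_{i=1}^{7} X_i.$$
Since $T$ is a sum of seven i.i.d. $\text{Gamma}(3, \lambda)$ variables, $T \sim \text{Gamma}(21, \lambda)$, and $E\!\left[\tfrac{1}{T}\right] = \tfrac{\lambda}{21 - 1} = \tfrac{\lambda}{20}$. Hence $E(\hat{\lambda}) = \tfrac{21}{20}\lambda$, so the bias is $E(\hat{\lambda} - \lambda) = \tfrac{\lambda}{20}$, giving $\alpha = \tfrac{1}{20}$ and $\tfrac{1}{\alpha} = 20$.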