Let \( X_1, X_2, \dots, X_n \) be a random sample from \( U(\theta, \theta + 1) \), where \( \theta \in \mathbb{R} \) is the unknown parameter. Let \[ U = \max\{X_1, X_2, \dots, X_n\} \quad \text{and} \quad V = \min\{X_1, X_2, \dots, X_n\}. \] Then which of the following statement(s) is (are) true?
(A) \( U \) is a consistent estimator of \( \theta \)
(B) \( V \) is a consistent estimator of \( \theta \)
(C) \( 2U - V - 2 \) is a consistent estimator of \( \theta \)
(D) \( 2V - U + 1 \) is a consistent estimator of \( \theta \)
Given: $X_1, X_2, ..., X_n$ is a random sample from $U(\theta, \theta + 1)$
Let $U = \max\{X_1, X_2, ..., X_n\}$ and $V = \min\{X_1, X_2, ..., X_n\}$
Key facts about the uniform distribution $U(\theta, \theta + 1)$:
- The support is $[\theta, \theta + 1]$, an interval of length 1.
- The sample maximum $U$ concentrates near the upper endpoint $\theta + 1$ as $n$ grows.
- The sample minimum $V$ concentrates near the lower endpoint $\theta$ as $n$ grows.
Option (A): $U$ is a consistent estimator of $\theta$
We need to check if $U \xrightarrow{P} \theta$ as $n \to \infty$.
For the maximum of a uniform distribution on $[\theta, \theta + 1]$: $$P(U \leq u) = P(X_1 \leq u, ..., X_n \leq u) = \left(\frac{u - \theta}{1}\right)^n = (u - \theta)^n$$
for $u \in [\theta, \theta + 1]$.
For any $\epsilon \in (0, 1)$, since $U \leq \theta + 1$ always: $$P(|U - (\theta + 1)| > \epsilon) = P(U < \theta + 1 - \epsilon) = (1 - \epsilon)^n \to 0 \text{ as } n \to \infty$$
Therefore: $U \xrightarrow{P} \theta + 1$, not $\theta$.
Option (A) is FALSE
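A quick simulation makes the inconsistency visible. A minimal sketch (with an arbitrary true value $\theta = 2$) showing the sample maximum drifting to $\theta + 1 = 3$ rather than $\theta$:

```python
import numpy as np

# Minimal sketch: the sample maximum of U(theta, theta + 1) data
# approaches theta + 1, not theta; theta = 2.0 is arbitrary.
rng = np.random.default_rng(0)
theta = 2.0
for n in [10, 100, 1000, 10000]:
    x = rng.uniform(theta, theta + 1, size=n)
    print(n, x.max())  # tends to theta + 1 = 3.0 as n grows
```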
Option (B): $V$ is a consistent estimator of $\theta$
For the minimum of a uniform distribution on $[\theta, \theta + 1]$: $$P(V \geq v) = P(X_1 \geq v, ..., X_n \geq v) = \left(\frac{\theta + 1 - v}{1}\right)^n = (\theta + 1 - v)^n$$
for $v \in [\theta, \theta + 1]$.
For any $\epsilon \in (0, 1)$, since $V \geq \theta$ always: $$P(|V - \theta| > \epsilon) = P(V > \theta + \epsilon) = (\theta + 1 - \theta - \epsilon)^n = (1 - \epsilon)^n \to 0 \text{ as } n \to \infty$$
Therefore: $V \xrightarrow{P} \theta$
Option (B) is TRUE
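The exact tail probability $(1 - \epsilon)^n$ can also be checked empirically. A minimal sketch (arbitrary $\theta = 2$, $\epsilon = 0.1$, and replication count) comparing the simulated frequency of $|V - \theta| > \epsilon$ with the closed form:

```python
import numpy as np

# Minimal sketch: empirical frequency of |V - theta| > eps versus the
# exact value (1 - eps)^n; theta, eps, and reps are arbitrary choices.
rng = np.random.default_rng(0)
theta, eps, reps = 2.0, 0.1, 5000
for n in [10, 50, 100]:
    mins = rng.uniform(theta, theta + 1, size=(reps, n)).min(axis=1)
    empirical = np.mean(np.abs(mins - theta) > eps)
    print(n, empirical, (1 - eps) ** n)  # the two columns should agree
```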
Option (C): $2U - V - 2$ is a consistent estimator of $\theta$
Since $U \xrightarrow{P} \theta + 1$ and $V \xrightarrow{P} \theta$, the continuous mapping theorem gives: $$2U - V - 2 \xrightarrow{P} 2(\theta + 1) - \theta - 2 = 2\theta + 2 - \theta - 2 = \theta$$
Option (C) is TRUE
Option (D): $2V - U + 1$ is a consistent estimator of $\theta$
Since $V \xrightarrow{P} \theta$ and $U \xrightarrow{P} \theta + 1$, the continuous mapping theorem gives: $$2V - U + 1 \xrightarrow{P} 2\theta - (\theta + 1) + 1 = 2\theta - \theta - 1 + 1 = \theta$$
Option (D) is TRUE
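Both linear combinations can be sanity-checked the same way. A minimal sketch (again with an arbitrary $\theta = 2$) showing $2U - V - 2$ and $2V - U + 1$ both settling near $\theta$:

```python
import numpy as np

# Minimal sketch: the estimators from options (C) and (D) should both
# concentrate at theta, since U -> theta + 1 and V -> theta.
rng = np.random.default_rng(0)
theta = 2.0
for n in [10, 100, 1000, 10000]:
    x = rng.uniform(theta, theta + 1, size=n)
    U, V = x.max(), x.min()
    print(n, 2 * U - V - 2, 2 * V - U + 1)  # both near theta = 2.0
```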
Answer: (B), (C), and (D)
Let \( X_1, X_2, \dots, X_7 \) be a random sample from a population having the probability density function \[ f(x) = \frac{1}{2} \lambda^3 x^2 e^{-\lambda x}, \quad x>0, \] where \( \lambda>0 \) is an unknown parameter. Let \( \hat{\lambda} \) be the maximum likelihood estimator of \( \lambda \), and \( E(\hat{\lambda} - \lambda) = \alpha \lambda \) be the corresponding bias, where \( \alpha \) is a real constant. Then the value of \( \frac{1}{\alpha} \) equals __________ (answer in integer).
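Before working the exact expectation, note that this density is a Gamma density with shape 3 and rate $\lambda$ (since $\Gamma(3) = 2$), so maximizing the log-likelihood gives $\hat{\lambda} = 3n / \sum_i X_i$. A minimal Monte Carlo sketch (arbitrary true $\lambda = 1.5$ and replication count) to sanity-check the bias constant $\alpha$:

```python
import numpy as np

# Minimal sketch: estimate alpha = E(lambda_hat - lambda) / lambda by
# simulation; the true lambda = 1.5 and reps = 200000 are arbitrary.
rng = np.random.default_rng(0)
lam, n, reps = 1.5, 7, 200_000
x = rng.gamma(shape=3.0, scale=1.0 / lam, size=(reps, n))  # Gamma(3, rate=lam)
lam_hat = 3 * n / x.sum(axis=1)  # MLE for each replicated sample
alpha = (lam_hat.mean() - lam) / lam
print(alpha, 1 / alpha)  # 1/alpha should sit near an integer
```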