Let $X_1, X_2, \ldots, X_n$ be a random sample from $U(\theta - 0.5, \theta + 0.5)$ distribution, where $\theta \in \mathbb{R}$. If $X_{(1)} = \min(X_1, X_2, \ldots, X_n)$ and $X_{(n)} = \max(X_1, X_2, \ldots, X_n)$, then which one of the following estimators is NOT a maximum likelihood estimator (MLE) of $\theta$?
Step 1: Recall the form of the likelihood for uniform distributions.
For $U(\theta - 0.5, \theta + 0.5)$, each observation has density $1$ on the support, so the likelihood is either $0$ or $1$; it is nonzero only if
\[
X_{(1)} \ge \theta - 0.5 \quad \text{and} \quad X_{(n)} \le \theta + 0.5.
\]
Thus, $\theta$ must satisfy
\[
X_{(n)} - 0.5 \le \theta \le X_{(1)} + 0.5.
\]
Hence every estimator taking values in $[X_{(n)} - 0.5,\, X_{(1)} + 0.5]$ attains the maximum value $1$ of the likelihood, and the MLE of $\theta$ is not unique. One standard choice is the midpoint of these bounds:
\[
\hat{\theta} = \frac{X_{(1)} + X_{(n)}}{2}.
\]
Step 2: Check whether each given estimator always lies in this interval.
Option (A) is the midpoint $\frac{X_{(1)} + X_{(n)}}{2}$, which always lies in the interval, so it is an MLE.
Options (B) and (C) also take values in $[X_{(n)} - 0.5,\, X_{(1)} + 0.5]$ for every sample, so they too attain the maximum of the likelihood and are MLEs.
Option (D) can take values outside this interval, so it does not always maximize the likelihood.
Step 3: Conclusion.
Hence, the estimator in (D) is NOT an MLE.
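As a quick numerical sanity check of the flat-likelihood argument, the short Python sketch below simulates a sample and evaluates the likelihood at several candidate values of $\theta$; the true $\theta$, sample size, and seed are arbitrary illustrative choices.

```python
import numpy as np

# Simulate a sample from U(theta_true - 0.5, theta_true + 0.5).
rng = np.random.default_rng(0)
theta_true = 2.3          # illustrative value, not part of the problem
x = rng.uniform(theta_true - 0.5, theta_true + 0.5, size=10)
x1, xn = x.min(), x.max()

def likelihood(theta):
    # The U(theta - 0.5, theta + 0.5) density is 1 on its support and 0
    # outside, so the joint likelihood is 1 iff every observation lies
    # in [theta - 0.5, theta + 0.5].
    return float(np.all((x >= theta - 0.5) & (x <= theta + 0.5)))

# The likelihood equals its maximum, 1, at the midpoint and at both
# endpoints of [X_(n) - 0.5, X_(1) + 0.5], and drops to 0 just outside.
for theta in [(x1 + xn) / 2, xn - 0.5, x1 + 0.5, xn - 0.51, x1 + 0.51]:
    print(f"theta = {theta:.3f}, likelihood = {likelihood(theta)}")
```

An estimator is an MLE precisely when, for every sample, it lands inside the interval where this likelihood equals $1$.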
Let \( X_1, X_2, \dots, X_7 \) be a random sample from a population having the probability density function \[ f(x) = \frac{1}{2} \lambda^3 x^2 e^{-\lambda x}, \quad x>0, \] where \( \lambda>0 \) is an unknown parameter. Let \( \hat{\lambda} \) be the maximum likelihood estimator of \( \lambda \), and \( E(\hat{\lambda} - \lambda) = \alpha \lambda \) be the corresponding bias, where \( \alpha \) is a real constant. Then the value of \( \frac{1}{\alpha} \) equals __________ (answer in integer).
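A sketch of one way to compute this, reading \( f \) as the \( \mathrm{Gamma}(3, \lambda) \) density: with \( n = 7 \) and \( T = \sum_{i=1}^{n} X_i \), the log-likelihood is \( \ell(\lambda) = 3n \log \lambda - \lambda T + \text{const} \), so \( \ell'(\lambda) = 3n/\lambda - T = 0 \) gives
\[
\hat{\lambda} = \frac{3n}{T} = \frac{21}{T}.
\]
Since \( T \sim \mathrm{Gamma}(21, \lambda) \) and \( E(1/T) = \lambda/(21 - 1) = \lambda/20 \),
\[
E(\hat{\lambda}) = \frac{21\lambda}{20}, \qquad E(\hat{\lambda} - \lambda) = \frac{\lambda}{20} = \alpha \lambda,
\]
so \( \alpha = \tfrac{1}{20} \) and \( \tfrac{1}{\alpha} = 20 \).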