Let $x_1 = 1, x_2 = 4$ be the data on a random sample of size 2 from a Poisson($\theta$) distribution, where $\theta \in (0, \infty)$. Let $\hat{\psi}$ be the uniformly minimum variance unbiased estimate of $\psi(\theta) = \sum_{k=4}^{\infty} e^{-\theta} \dfrac{\theta^k}{k!}$ based on the given data. Then $\hat{\psi}$ equals ............ (round off to two decimal places).
Step 1: Interpretation of $\psi(\theta)$.
$\psi(\theta) = P(X \ge 4)$ for $X \sim \text{Poisson}(\theta).$
Step 2: Identify sufficient statistic.
For the Poisson distribution, $T = \sum X_i$ is complete and sufficient for $\theta$.
Here, $T = 1 + 4 = 5.$
Step 3: Use Lehmann–Scheffé theorem.
The UMVUE of $P(X \ge 4)$ is obtained by conditioning the unbiased estimator $\mathbf{1}\{X_1 \ge 4\}$ on $T$, giving $P(X_1 \ge 4 \mid T = 5)$.
Given $T = X_1 + X_2 = 5$, the conditional distribution of $X_1$ is $\mathrm{Binomial}\!\left(5, \tfrac{1}{2}\right)$, which is free of $\theta$:
\[
P(X_1 = k \mid T=5) = \frac{\binom{5}{k}}{2^5}, \quad k=0,1,2,3,4,5.
\]
\[
\Rightarrow\quad P(X_1 \ge 4 \mid T=5) = \frac{\binom{5}{4} + \binom{5}{5}}{2^5} = \frac{5+1}{32} = \frac{6}{32} = 0.1875.
\]
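A quick numerical check of this conditional tail probability (the function name \texttt{umvue\_tail} is mine, not part of the problem):

```python
from math import comb

def umvue_tail(t, threshold, n=2):
    # For a Poisson sample of size n with total T = t, the conditional
    # distribution of X1 given T = t is Binomial(t, 1/n), so the UMVUE of
    # P(X >= threshold) is the conditional tail P(X1 >= threshold | T = t).
    p = 1.0 / n
    return sum(comb(t, k) * p**k * (1 - p)**(t - k)
               for k in range(threshold, t + 1))

print(umvue_tail(5, 4))            # 0.1875
print(round(umvue_tail(5, 4), 2))  # 0.19
```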
Thus,
\[
\boxed{\hat{\psi} \approx 0.19.}
\]
Let \( X_1, X_2, \dots, X_7 \) be a random sample from a population having the probability density function \[ f(x) = \frac{1}{2} \lambda^3 x^2 e^{-\lambda x}, \quad x>0, \] where \( \lambda>0 \) is an unknown parameter. Let \( \hat{\lambda} \) be the maximum likelihood estimator of \( \lambda \), and \( E(\hat{\lambda} - \lambda) = \alpha \lambda \) be the corresponding bias, where \( \alpha \) is a real constant. Then the value of \( \frac{1}{\alpha} \) equals __________ (answer in integer).
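A sketch of the computation (my derivation, not part of the original statement): $f$ is a Gamma density with shape $3$ and rate $\lambda$, so with $n = 7$ and $S = \sum_{i=1}^{7} X_i \sim \text{Gamma}(21, \lambda)$, the log-likelihood is $\ell(\lambda) = 21\log\lambda - \lambda S + \text{const}$, and
\[
\ell'(\lambda) = \frac{21}{\lambda} - S = 0
\;\Rightarrow\; \hat{\lambda} = \frac{21}{S}, \qquad
E\!\left[\frac{1}{S}\right] = \frac{\lambda}{21-1} = \frac{\lambda}{20}
\;\Rightarrow\; E(\hat{\lambda}) = \frac{21\lambda}{20},
\]
so the bias is $E(\hat{\lambda}) - \lambda = \frac{\lambda}{20}$, giving $\alpha = \frac{1}{20}$ and $\frac{1}{\alpha} = 20$.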