Step 1: Identify the sufficient statistic.
For $X_i \sim \text{Bin}(1,\theta)$, the sum $T = \sum X_i$ is a sufficient statistic for $\theta$.
Here, $T = 1 + 0 + 0 + 1 + 0 + 1 = 3.$
Step 2: Find the unbiased estimator for $\theta$.
The sample mean $\bar{X} = \frac{T}{6}$ is an unbiased estimator of $\theta$.
Thus, $E(\bar{X}) = \theta.$
Step 3: Construct unbiased estimator for $\theta(1+\theta)$.
\[
E(\bar{X}^2) = E\left(\frac{T^2}{36}\right).
\]
For $T \sim \text{Bin}(6,\theta)$, $E(T^2) = \text{Var}(T) + [E(T)]^2 = 6\theta(1-\theta) + 36\theta^2$.
Hence,
\[
E(\bar{X}^2) = \frac{6\theta(1-\theta) + 36\theta^2}{36} = \frac{\theta(1-\theta)}{6} + \theta^2.
\]
We need an unbiased estimator for $\theta(1+\theta) = \theta + \theta^2$.
Let $\hat{\psi} = a\bar{X}^2 + b\bar{X}$. Then
\[
E(\hat{\psi}) = a\left(\frac{\theta(1-\theta)}{6} + \theta^2\right) + b\theta = \left(\frac{a}{6} + b\right)\theta + \frac{5a}{6}\,\theta^2 = \theta + \theta^2.
\]
Comparing coefficients of $\theta$ and $\theta^2$:
\[
\begin{cases}
\frac{a}{6} + b = 1,\\[2pt]
\frac{5a}{6} = 1.
\end{cases}
\]
So $a = \frac{6}{5}$, $b = \frac{4}{5}$.
Step 4: Compute $\hat{\psi}$ from the sample.
\[
\bar{X} = \frac{3}{6} = 0.5 \;\Rightarrow\; \hat{\psi} = \frac{6}{5}(0.5)^2 + \frac{4}{5}(0.5) = 0.3 + 0.4 = 0.7.
\]
Since $\hat{\psi}$ is unbiased by construction, no finite-sample bias correction is needed, and the estimate is $\boxed{0.7}$.
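As an exact sanity check (a standalone Python sketch using only the standard library), one can solve the coefficient equations $\frac{5a}{6} = 1$, $\frac{a}{6} + b = 1$, sum over the Binomial$(6,\theta)$ pmf with rational arithmetic to confirm unbiasedness, and verify the equivalent factorial-moment form $\frac{T}{6} + \frac{T(T-1)}{30}$:

```python
from fractions import Fraction
from math import comb

n = 6
# Coefficients from matching E(a*Xbar^2 + b*Xbar) with theta + theta^2:
# theta^2 term: 5a/6 = 1, theta term: a/6 + b = 1.
a = Fraction(6, 5)
b = 1 - a / 6                      # = 4/5

def expect(g, theta):
    """Exact E[g(T)] for T ~ Binomial(n, theta), theta a Fraction."""
    return sum(comb(n, t) * theta**t * (1 - theta)**(n - t) * g(t)
               for t in range(n + 1))

psi = lambda t: a * Fraction(t, n)**2 + b * Fraction(t, n)

# Unbiased for theta*(1 + theta) at every theta tried (exact rational arithmetic).
for theta in (Fraction(1, 4), Fraction(1, 2), Fraction(9, 10)):
    assert expect(psi, theta) == theta + theta**2

# Same estimator via factorial moments: E[T(T-1)] = 30*theta^2, so
# T/6 + T(T-1)/30 is unbiased for theta + theta^2 and agrees with psi for every T.
for t in range(n + 1):
    assert psi(t) == Fraction(t, 6) + Fraction(t * (t - 1), 30)

print(psi(3))        # observed T = 3  ->  7/10
```

The factorial-moment form makes the arithmetic transparent: with $T = 3$, $\hat{\psi} = \frac{3}{6} + \frac{3 \cdot 2}{30} = 0.5 + 0.2 = 0.7$.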
Let \( X_1, X_2, \dots, X_7 \) be a random sample from a population having the probability density function \[ f(x) = \frac{1}{2} \lambda^3 x^2 e^{-\lambda x}, \quad x>0, \] where \( \lambda>0 \) is an unknown parameter. Let \( \hat{\lambda} \) be the maximum likelihood estimator of \( \lambda \), and \( E(\hat{\lambda} - \lambda) = \alpha \lambda \) be the corresponding bias, where \( \alpha \) is a real constant. Then the value of \( \frac{1}{\alpha} \) equals __________ (answer in integer).
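The density is the Gamma(shape $3$, rate $\lambda$) density, so the log-likelihood $3n\log\lambda - \lambda\sum x_i$ (up to constants) gives $\hat{\lambda} = 3n/\sum X_i$; since $\sum X_i \sim \text{Gamma}(21, \lambda)$ and $E(1/\sum X_i) = \lambda/20$, one gets $E(\hat{\lambda}) = \frac{21}{20}\lambda$, i.e. $\alpha = \frac{1}{20}$. A minimal Monte Carlo sketch (plain Python standard library; $\lambda = 1$ is an arbitrary choice, since the relative bias $\alpha$ does not depend on $\lambda$) checks this numerically:

```python
import random

random.seed(0)
n, lam = 7, 1.0          # n = 7 from the problem; lam = 1 w.l.o.g. (alpha is scale-free)
reps = 100_000

# f(x) = (1/2) lam^3 x^2 e^{-lam x} is the Gamma(shape=3, rate=lam) density,
# so the log-likelihood 3n*log(lam) - lam*sum(x) yields the MLE lam_hat = 3n / sum(x).
total = 0.0
for _ in range(reps):
    s = sum(random.gammavariate(3, 1 / lam) for _ in range(n))   # scale = 1/rate
    total += 3 * n / s

alpha_hat = total / reps / lam - 1   # empirical relative bias (E[lam_hat] - lam) / lam
print(alpha_hat)                     # should be close to the analytic alpha = 1/20
```

The printed value should land near $0.05$, consistent with $\alpha = \frac{1}{20}$ and hence $\frac{1}{\alpha} = 20$.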