Step 1: Understanding the properties of complete and sufficient statistics.
A statistic \( T \) is \emph{complete} if, for every measurable function \( g \), \( E_{\theta}[g(T)] = 0 \) for all \( \theta \) implies \( P_{\theta}(g(T) = 0) = 1 \) for all \( \theta \).
A statistic \( T \) is \emph{sufficient} for a family of distributions if the conditional distribution of the sample given \( T \) does not depend on the parameter \( \theta \). That \( T \) is complete and sufficient means it retains all the information in the sample about \( \theta \). However, if \( U \) is merely a sufficient statistic, \( T \) is not automatically a function of \( U \); Bahadur's theorem yields this implication only under additional conditions (for instance, when a minimal sufficient statistic exists).
Step 2: Analyzing the options.
Option (A): \( T^2 \) is still a complete statistic. If \( E_{\theta}[g(T^2)] = 0 \) for all \( \theta \), then \( h(T) := g(T^2) \) has zero expectation under every \( \theta \), so completeness of \( T \) forces \( g(T^2) = 0 \) almost surely. In other words, every measurable function of a complete statistic is complete; no one-to-one assumption is needed (indeed, \( t \mapsto t^2 \) is not one-to-one in general).
Option (B): the claim that \( T^2 \) is minimal sufficient relies on \( t \mapsto t^2 \) being a one-to-one transformation of the minimal sufficient statistic \( T \); this holds only when the range of \( T \) is sign-restricted (e.g., \( T \geq 0 \)), since squaring otherwise discards the sign of \( T \).
Option (C): \( T \) being a function of \( U \) is not guaranteed in full generality. While \( T \) is complete and sufficient, the conclusion that it is a function of every sufficient statistic \( U \) requires the additional regularity under which Bahadur's theorem applies.
Option (D): \( U \) being a function of \( T \) is not necessarily true. Sufficiency alone places no such restriction on \( U \): for example, the full sample is always sufficient but is generally not a function of \( T \). Thus, the correct answer is \( \boxed{(D)} \).
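As a concrete sanity check on option (D), here is a small Python sketch. The Bernoulli setup is my own illustrative example, not part of the original question: with a sample of size two, \( U = (X_1, X_2) \) is sufficient and \( T = X_1 + X_2 \) is complete and sufficient, yet \( U \) is not a function of \( T \).

```python
from itertools import product

# Toy illustration (Bernoulli(p), n = 2; an assumed example, not from the question):
# U = (X1, X2) is sufficient, T = X1 + X2 is complete and sufficient.
samples = list(product([0, 1], repeat=2))

# T is a function of U: each sample pair determines its sum.
# U is NOT a function of T: collect the samples mapping to each value of T.
preimages = {}
for u in samples:
    preimages.setdefault(sum(u), []).append(u)

print(preimages[1])  # two distinct samples, (0, 1) and (1, 0), share T = 1
```

Since \( T = 1 \) has two distinct preimages, no map from \( T \) back to \( U \) exists, which is exactly why (D) fails to be necessarily true.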
Let \( (X, Y)^T \) follow a bivariate normal distribution with \[ E(X) = 2, \, E(Y) = 3, \, {Var}(X) = 16, \, {Var}(Y) = 25, \, {Cov}(X, Y) = 14. \] Then \[ 2\pi \left( \Pr(X>2, Y>3) - \frac{1}{4} \right) \] equals _________ (rounded off to two decimal places).
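By Sheppard's orthant-probability formula, \( \Pr(X>\mu_X, Y>\mu_Y) = \tfrac{1}{4} + \tfrac{\arcsin\rho}{2\pi} \), so the requested quantity reduces to \( \arcsin\rho \) with \( \rho = 14/(4\cdot 5) = 0.7 \), i.e., about \(0.78\). A Monte Carlo sketch (sample size and seed are my own choices) that checks this numerically:

```python
import math
import random

random.seed(0)

# Parameters from the problem.
mx, my = 2.0, 3.0
sx, sy = 4.0, 5.0          # standard deviations: sqrt(16), sqrt(25)
rho = 14.0 / (sx * sy)     # correlation = Cov(X,Y) / (sd_X * sd_Y) = 0.7

# Monte Carlo estimate of P(X > 2, Y > 3) via a Cholesky-style construction:
# X = mx + sx*Z1,  Y = my + sy*(rho*Z1 + sqrt(1 - rho^2)*Z2),  Z1, Z2 iid N(0,1).
n = 400_000
hits = 0
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = mx + sx * z1
    y = my + sy * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    if x > mx and y > my:
        hits += 1

estimate = 2 * math.pi * (hits / n - 0.25)
print(estimate, math.asin(rho))  # both should be near arcsin(0.7) ≈ 0.78
```

The simulation only verifies the closed form; the exact answer is \( \arcsin(0.7) \approx 0.78 \).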
Let \( X_1, X_2 \) be a random sample from a population having probability density function
\[ f_{\theta}(x) = \begin{cases} e^{(x-\theta)} & \text{if } -\infty < x \leq \theta, \\ 0 & \text{otherwise}, \end{cases} \] where \( \theta \in \mathbb{R} \) is an unknown parameter. Consider testing \( H_0: \theta \geq 0 \) against \( H_1: \theta < 0 \) at level \( \alpha = 0.09 \). Let \( \beta(\theta) \) denote the power function of a uniformly most powerful test. Then \( \beta(\log_e 0.36) \) equals ________ (rounded off to two decimal places).
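Since the likelihood is \( e^{\sum x_i - 2\theta}\,\mathbf{1}\{\max(x_1,x_2) \leq \theta\} \), the standard UMP test here rejects \( H_0 \) for small values of \( \max(X_1, X_2) \). The cutoff solves \( P_{\theta=0}(\max < c) = e^{2c} = 0.09 \), so \( c = \log_e 0.3 \), and \( \beta(\theta) = e^{2(c-\theta)} \) for \( \theta \geq c \), giving \( \beta(\log_e 0.36) = (0.30/0.36)^2 = 25/36 \approx 0.69 \). A Monte Carlo sketch (my own simulation check, with an assumed seed and sample size) confirming the value:

```python
import math
import random

random.seed(1)

theta = math.log(0.36)      # alternative at which the power is evaluated
c = 0.5 * math.log(0.09)    # = log(0.3); solves P_{theta=0}(max(X1, X2) < c) = e^{2c} = 0.09

# Draw X via the inverse CDF: F_theta(x) = e^{x - theta} for x <= theta, so X = theta + log(U).
def draw(theta):
    return theta + math.log(1 - random.random())  # 1 - U keeps the log argument in (0, 1]

n = 200_000
rejections = sum(1 for _ in range(n) if max(draw(theta), draw(theta)) < c)
power = rejections / n
print(power)  # theory: (0.30/0.36)**2 = 25/36 ≈ 0.69
```

The simulated power agrees with the closed-form answer \( 25/36 \approx 0.69 \).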
Let \( X_1, X_2, \dots, X_7 \) be a random sample from a population having the probability density function \[ f(x) = \frac{1}{2} \lambda^3 x^2 e^{-\lambda x}, \quad x>0, \] where \( \lambda>0 \) is an unknown parameter. Let \( \hat{\lambda} \) be the maximum likelihood estimator of \( \lambda \), and \( E(\hat{\lambda} - \lambda) = \alpha \lambda \) be the corresponding bias, where \( \alpha \) is a real constant. Then the value of \( \frac{1}{\alpha} \) equals __________ (answer in integer).
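Here \( f \) is the Gamma density with shape \(3\) and rate \( \lambda \), so the log-likelihood gives \( \hat{\lambda} = 3n/\sum X_i = 21/\sum X_i \). Since \( \sum X_i \sim \text{Gamma}(21, \lambda) \) and \( E[1/\sum X_i] = \lambda/20 \), we get \( E(\hat{\lambda}) = 21\lambda/20 \), hence \( \alpha = 1/20 \) and \( 1/\alpha = 20 \). A Monte Carlo sketch (my own check, with an assumed seed and \( \lambda = 1 \)) estimating the relative bias:

```python
import random

random.seed(2)

# Each X_i has pdf (1/2) * lam^3 * x^2 * e^(-lam*x), i.e. Gamma(shape=3, rate=lam).
# MLE from n = 7 observations: lam_hat = 3n / sum(X_i) = 21 / sum(X_i).
lam, n = 1.0, 7
reps = 200_000
total_bias = 0.0
for _ in range(reps):
    s = sum(random.gammavariate(3, 1 / lam) for _ in range(n))  # gammavariate takes shape, scale = 1/rate
    total_bias += 21 / s - lam

alpha = total_bias / reps / lam  # estimated relative bias; theory: 1/20 = 0.05
print(1 / alpha)                 # should be near 20
```

The simulation reproduces \( \alpha \approx 0.05 \), matching the exact answer \( 1/\alpha = 20 \).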