Let \( X \) and \( Y \) be continuous random variables with the joint probability density function. Then \( E(X \mid Y = -1) \) equals
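With the joint density left unspecified here, the computation follows the usual conditional-density route; as a reminder of the mechanics,
\[
E(X \mid Y = y) = \int_{-\infty}^{\infty} x \, f_{X \mid Y}(x \mid y) \, dx, \qquad f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)},
\]
evaluated at \( y = -1 \), where \( f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx \) is the marginal density of \( Y \).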
Let \( X_1, X_2, X_3 \) be independent random variables with the common probability density function. Let \( Y = \min\{ X_1, X_2, X_3 \} \), \( E(Y) = \mu_Y \), and \( \text{Var}(Y) = \sigma_Y^2 \). Then \( P(Y > \mu_Y + \sigma_Y) \) equals
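Whatever the common density is, independence makes the survival function of the minimum factor; this is the standard first step:
\[
P(Y > t) = P(X_1 > t,\, X_2 > t,\, X_3 > t) = \big[ P(X_1 > t) \big]^3,
\]
so \( P(Y > \mu_Y + \sigma_Y) \) reduces to evaluating the common survival function at \( t = \mu_Y + \sigma_Y \), once \( \mu_Y \) and \( \sigma_Y \) have been computed from the density of \( Y \).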
Let \( X \) be a continuous random variable with the probability density function, where \( k \) is a real constant. Then \( P(1 < X < 5) \) equals
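Although the density itself is not reproduced above, the constant \( k \) is always pinned down by normalization, after which the probability is a direct integral:
\[
\int_{-\infty}^{\infty} f(x) \, dx = 1 \quad \text{determines } k, \qquad P(1 < X < 5) = \int_{1}^{5} f(x) \, dx.
\]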
\[ A = \begin{pmatrix} \frac{5}{6} & -\frac{1}{3} & -\frac{1}{6} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ -\frac{1}{6} & \frac{1}{3} & \frac{5}{6} \end{pmatrix}. \]
Let
\[ u_n = \frac{18n + 3}{(3n - 1)^2 (3n + 2)^2}, \quad n \in \mathbb{N}. \]
Then \( \sum_{n=1}^{\infty} u_n \) equals
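This sum can be evaluated in closed form by telescoping, since the numerator is exactly the difference of the two squared factors in the denominator: \( (3n+2)^2 - (3n-1)^2 = 18n + 3 \). Hence
\[
u_n = \frac{(3n+2)^2 - (3n-1)^2}{(3n-1)^2 (3n+2)^2} = \frac{1}{(3n-1)^2} - \frac{1}{(3n+2)^2},
\]
and because \( 3n + 2 = 3(n+1) - 1 \), the partial sums collapse:
\[
\sum_{n=1}^{N} u_n = \frac{1}{2^2} - \frac{1}{(3N+2)^2} \xrightarrow[N \to \infty]{} \frac{1}{4}.
\]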
Let \( X \) and \( Y \) be continuous random variables with the joint probability density function. Then \( 9 \, \text{Cov}(X, Y) \) equals
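With the joint density again unspecified, the computation runs through the standard identity
\[
\text{Cov}(X, Y) = E(XY) - E(X)\,E(Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy \, f_{X,Y}(x, y) \, dx \, dy - E(X)\,E(Y).
\]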
Let \( X_1, X_2, \dots, X_n \) (where \( n \geq 2 \)) be a random sample from a distribution with the probability density function. Let \( \bar{X} = \frac{1}{n} \sum_{i=1}^n X_i \). Which of the following statistics is (are) sufficient but NOT complete?
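For questions of this type, sufficiency is typically verified with the Fisher–Neyman factorization criterion: \( T \) is sufficient for \( \theta \) when the joint density factors as
\[
f(x_1, \dots, x_n; \theta) = g\big( T(x_1, \dots, x_n); \theta \big) \, h(x_1, \dots, x_n).
\]
Completeness fails when some non-trivial function of \( T \) has expectation zero for every \( \theta \); that is, \( T \) is complete only if \( E_\theta[g(T)] = 0 \) for all \( \theta \) forces \( g(T) = 0 \) almost surely.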
Let \( X_1, X_2, X_3 \) be a random sample from a distribution with the probability mass function. If \( \hat{\theta}(X_1, X_2, X_3) \) is an unbiased estimator of \( \theta \), which of the following CANNOT be attained as a value of the variance of \( \hat{\theta} \) at \( \theta = 1 \)?
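One screening device, under the usual regularity conditions on the family: the Cramér–Rao inequality puts a floor under the variance of any unbiased estimator based on a sample of size 3,
\[
\text{Var}_\theta\big( \hat{\theta} \big) \ge \frac{1}{3 \, I(\theta)}, \qquad I(\theta) = E_\theta\!\left[ \left( \frac{\partial}{\partial \theta} \log p(X_1; \theta) \right)^{\!2} \right],
\]
so any candidate value lying below this bound at \( \theta = 1 \) cannot be attained. Whether the bound applies depends on the (unstated) mass function satisfying the regularity conditions.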
Let \( X_1, X_2 \) be a random sample from a distribution with the probability mass function. Which of the following is (are) unbiased estimator(s) of \( \theta \)?
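Each candidate can be checked directly against the definition: \( T(X_1, X_2) \) is unbiased for \( \theta \) precisely when
\[
E_\theta\big[ T(X_1, X_2) \big] = \sum_{x_1} \sum_{x_2} T(x_1, x_2) \, p(x_1; \theta) \, p(x_2; \theta) = \theta \quad \text{for every admissible } \theta,
\]
where \( p(\cdot\,; \theta) \) is the common probability mass function.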