Let
\[ u_n = \frac{18n + 3}{(3n - 1)^2 (3n + 2)^2}, \quad n \in \mathbb{N}. \]
Then \( \sum_{n=1}^{\infty} u_n \) equals
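One way to evaluate the sum (a sketch, using a telescoping decomposition): since \( (3n+2)^2 - (3n-1)^2 = 3(6n+1) = 18n + 3 \),
\[ u_n = \frac{1}{(3n-1)^2} - \frac{1}{(3n+2)^2}, \]
so the partial sums telescope:
\[ \sum_{n=1}^{N} u_n = \frac{1}{2^2} - \frac{1}{(3N+2)^2} \longrightarrow \frac{1}{4} \quad \text{as } N \to \infty. \]
Hence the series equals \( \frac{1}{4} \).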
Let \( X \) and \( Y \) be continuous random variables with the joint probability density function
Then \( 9 \, \operatorname{Cov}(X, Y) \) equals
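Since the joint density itself is not reproduced above, only the general route can be noted here: the covariance follows from the standard identity
\[ \operatorname{Cov}(X, Y) = E(XY) - E(X)\,E(Y), \]
with each expectation computed by integrating against the joint density.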
Let \( X_1, X_2, \dots, X_n \) (where \( n \geq 2 \)) be a random sample from a distribution with the probability density function
Let \( \bar{X} = \frac{1}{n} \sum_{i=1}^n X_i \). Which of the following statistics is (are) sufficient but NOT complete?
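As a reminder of the two criteria being contrasted (the density is omitted above): by the factorization theorem, \( T \) is sufficient for \( \theta \) if the joint density factors as
\[ f(x_1, \dots, x_n; \theta) = g\big(T(x_1, \dots, x_n); \theta\big)\, h(x_1, \dots, x_n), \]
while \( T \) is complete if \( E_\theta[\phi(T)] = 0 \) for all \( \theta \) forces \( \phi(T) = 0 \) almost surely. The question asks for statistics with the first property but not the second.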
Let \( X_1, X_2, X_3 \) be a random sample from a distribution with the probability mass function
If \( \hat{\theta}(X_1, X_2, X_3) \) is an unbiased estimator of \( \theta \), which of the following CANNOT be attained as a value of the variance of \( \hat{\theta} \) at \( \theta = 1 \)?
Let \( X_1, X_2 \) be a random sample from a distribution with the probability mass function
Which of the following is (are) unbiased estimator(s) of \( \theta \)?
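With the pmf omitted, only the defining criterion can be recorded: \( T(X_1, X_2) \) is an unbiased estimator of \( \theta \) if
\[ E_\theta\big[ T(X_1, X_2) \big] = \theta \quad \text{for every admissible } \theta. \]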
The cumulative distribution function of a random variable \( X \) is given by
Which of the following statements is (are) TRUE?
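Whatever the omitted \( F \) looks like, the usual checklist applies: a CDF is non-decreasing and right-continuous, with \( \lim_{x \to -\infty} F(x) = 0 \) and \( \lim_{x \to \infty} F(x) = 1 \), and any point mass is read off as
\[ P(X = a) = F(a) - F(a^-). \]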
Let \( f: [0,1] \to [0,1] \) be defined as follows:
Which of the following statements is (are) TRUE?
Let \( f: [0, 1] \to \mathbb{R} \) be a function defined as
Let \( F: [0, 1] \to \mathbb{R} \) be defined as \[ F(x) = \int_0^x f(t) \, dt. \] Then \( F''(0) \) equals
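A general observation that applies once the omitted definition of \( f \) is in hand: by the fundamental theorem of calculus, \( F'(x) = f(x) \) wherever \( f \) is continuous, so
\[ F''(0) = f'(0), \]
provided \( f \) is continuous near \( 0 \) and differentiable at \( 0 \).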
Let \( X \) be a random variable having a probability density function \( f \in \{ f_0, f_1 \} \), where
For testing the null hypothesis \( H_0 : f = f_0 \) against \( H_1 : f = f_1 \), based on a single observation on \( X \), the power of the most powerful test of size \( \alpha = 0.05 \) equals
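Since \( f_0 \) and \( f_1 \) are not reproduced, only the general recipe can be sketched. By the Neyman–Pearson lemma, the most powerful size-\( \alpha \) test rejects \( H_0 \) for large values of the likelihood ratio:
\[ \text{reject } H_0 \text{ when } \frac{f_1(x)}{f_0(x)} > k, \quad \text{with } k \text{ chosen so that } P_{f_0}\!\left( \frac{f_1(X)}{f_0(X)} > k \right) = 0.05. \]
The power is then \( P_{f_1}(\text{reject } H_0) \).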
Let \( x_1 = 2, x_2 = 1, x_3 = \sqrt{5}, x_4 = \sqrt{2} \) be the observed values of a random sample of size four from a distribution with the probability density function
Then the method of moments estimate of \( \theta \) is
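With the density omitted, only the mechanics can be indicated: the method of moments equates \( E_\theta(X) \) to the sample mean and solves for \( \theta \). Here
\[ \bar{x} = \frac{2 + 1 + \sqrt{5} + \sqrt{2}}{4} = \frac{3 + \sqrt{5} + \sqrt{2}}{4} \approx 1.66, \]
so the estimate is the solution of \( E_\theta(X) = \bar{x} \).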
Let \( x_1 = 1.1, x_2 = 0.5, x_3 = 1.4, x_4 = 1.2 \) be the observed values of a random sample of size four from a distribution with the probability density function
Then the maximum likelihood estimate of \( \theta^2 \) is
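Again only the recipe, as the density is omitted: the MLE \( \hat{\theta} \) maximizes the likelihood
\[ L(\theta) = \prod_{i=1}^{4} f(x_i; \theta) = f(1.1; \theta)\, f(0.5; \theta)\, f(1.4; \theta)\, f(1.2; \theta), \]
and by the invariance property of maximum likelihood, the MLE of \( \theta^2 \) is \( \hat{\theta}^2 \).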
Let \( \{X_n\}_{n \geq 1} \) be a sequence of i.i.d. random variables with the probability mass function
Let \( \bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i \), \( n = 1, 2, \dots \). If \( \lim_{n \to \infty} P(m \leq \bar{X}_n \leq M) = 1 \), then possible values of \( m \) and \( M \) are
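A general note, since the pmf is omitted: by the weak law of large numbers, \( \bar{X}_n \to \mu = E(X_1) \) in probability (assuming \( \mu \) is finite), so
\[ \lim_{n \to \infty} P(m \leq \bar{X}_n \leq M) = 1 \quad \text{whenever } m < \mu < M. \]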
Let \( X \) and \( Y \) be continuous random variables with the joint probability density function
Then \( P\left( X + Y > \frac{1}{2} \right) \) equals
Let \( X \) and \( Y \) be continuous random variables with the joint probability density function
where \( c \) is a positive real constant. Then \( E(X) \) equals
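Two standard steps apply once the omitted joint density is in hand: \( c \) is fixed by normalization, and \( E(X) \) is then an integral against the joint density:
\[ \iint f(x, y)\, dx\, dy = 1, \qquad E(X) = \iint x\, f(x, y)\, dx\, dy. \]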
Let \( X_1 \) and \( X_2 \) be i.i.d. continuous random variables with the probability density function
Using Chebyshev's inequality, the lower bound of \( P\left( |X_1 + X_2 - 1| \leq \frac{1}{2} \right) \) is
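For reference (the density itself is omitted): writing \( S = X_1 + X_2 \) and assuming, as the centering at \( 1 \) suggests, that \( E(S) = 1 \), Chebyshev's inequality gives
\[ P\left( |S - 1| \leq \tfrac{1}{2} \right) \geq 1 - \frac{\operatorname{Var}(S)}{(1/2)^2} = 1 - 8\operatorname{Var}(X_1), \]
using \( \operatorname{Var}(S) = 2\operatorname{Var}(X_1) \) for i.i.d. \( X_1, X_2 \).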
Let \( \{X_n\}_{n \geq 1} \) be a sequence of i.i.d. random variables having common probability density function
Let \( \bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i \), \( n = 1, 2, \dots \). Then \( \lim_{n \to \infty} P(\bar{X}_n = 2) \) equals
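One observation holds regardless of the omitted density: since the \( X_i \) are continuous, each \( \bar{X}_n \) is again a continuous random variable, so
\[ P(\bar{X}_n = 2) = 0 \ \text{ for every } n, \quad \text{hence} \quad \lim_{n \to \infty} P(\bar{X}_n = 2) = 0. \]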
Let \( X \) be a continuous random variable with the probability density function
Then \( P\left( \frac{1}{2} \leq X \leq 2 \right) \) equals
Let
\[ u = \begin{pmatrix} 1 \\ 2 \\ 3 \\ 5 \end{pmatrix} \quad \text{and} \quad v = \begin{pmatrix} 5 \\ 3 \\ 2 \\ 1 \end{pmatrix}. \]
The imaginary parts of the eigenvalues of the matrix
are
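Although the matrix is not reproduced, one general fact constrains the answer, assuming the matrix is real: its characteristic polynomial has real coefficients, so non-real eigenvalues occur in conjugate pairs,
\[ \lambda = a \pm i b \implies \operatorname{Im}(\lambda) = \pm b, \]
and the imaginary parts therefore come in \( \pm \) pairs.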