Let $f : [-1,3] \to \mathbb{R}$ be a continuous function such that $f$ is differentiable on $(-1,3)$, $|f'(x)| \le \dfrac{3}{2}$ for all $x \in (-1,3)$, $f(-1) = 1$ and $f(3) = 7$. Then $f(1)$ equals .................
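One way to see the blank (a sketch): the average slope of $f$ over $[-1,3]$ is $\dfrac{7-1}{3-(-1)} = \dfrac{3}{2}$, which already attains the bound, so the Mean Value Theorem pins $f(1)$ down exactly:
\[ f(1) - f(-1) \le \tfrac{3}{2} \cdot 2 = 3 \;\Rightarrow\; f(1) \le 4, \qquad f(3) - f(1) \le \tfrac{3}{2} \cdot 2 = 3 \;\Rightarrow\; f(1) \ge 4, \]
so $f(1) = 4$.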
Let $X$ be a single observation drawn from $U(0, \theta)$ distribution, where $\theta \in \{1, 2\}$. To test $H_0: \theta = 1$ against $H_1: \theta = 2$, consider the test procedure that rejects $H_0$ if and only if $X > 0.75$. If the probabilities of Type-I and Type-II errors are $\alpha$ and $\beta$, respectively, then $\alpha + \beta$ equals ......... (round off to two decimal places).
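A minimal numerical check (a sketch using the $U(0, \theta)$ cdf $F(x) = x/\theta$ on $[0, \theta]$):

```python
# alpha = P(X > 0.75 | theta = 1), beta = P(X <= 0.75 | theta = 2),
# both from the uniform cdf F(x) = x / theta.
alpha = 1 - 0.75 / 1.0   # 0.25
beta = 0.75 / 2.0        # 0.375
print(alpha + beta)      # 0.625
```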
Let the sample mean of a random sample from the Exp($\lambda$) distribution be 3.7. Then the maximum likelihood estimate of $1 - e^{-\lambda}$ equals ........... (round off to two decimal places).
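A sketch of the computation, assuming Exp($\lambda$) is parametrized by rate (density $\lambda e^{-\lambda x}$), so that the MLE of $\lambda$ is $1/\bar{X}$ and invariance of MLEs carries this through $1 - e^{-\lambda}$:

```python
import math

# MLE of the rate lambda is 1 / xbar (rate parametrization assumed);
# by invariance, the MLE of 1 - exp(-lambda) is 1 - exp(-1/xbar).
xbar = 3.7
lam_hat = 1 / xbar
est = 1 - math.exp(-lam_hat)
print(est)  # ~0.2368, i.e. 0.24 to two decimal places
```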
Let $U \sim F_{5,8}$ and $V \sim F_{8,5}$. If $P[U > 3.69] = 0.05$, then the value of $c$ such that $P[V > c] = 0.95$ equals ................ (round off to two decimal places).
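The blank follows from the reciprocal property of $F$ distributions: if $V \sim F_{8,5}$ then $1/V \sim F_{5,8}$, the distribution of $U$. A sketch:

```python
# P[V > c] = P[1/V < 1/c] = P[U < 1/c] = 0.95, and P[U < 3.69] = 0.95,
# so 1/c = 3.69.
c = 1 / 3.69
print(c)  # ~0.271, i.e. 0.27 to two decimal places
```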
Let $X$ be a random variable having the Poisson(4) distribution and let $E$ be an event such that $P(E|X = i) = 1 - 2^{-i}$, $i = 0, 1, 2, \ldots$. Then $P(E)$ equals ........... (round off to two decimal places).
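Conditioning on $X$ gives a Poisson-weighted series that collapses to a closed form; a numerical sketch comparing the two:

```python
import math

# Total probability: P(E) = sum_i (1 - 2^{-i}) e^{-4} 4^i / i!
#                         = 1 - e^{-4} sum_i 2^i / i! = 1 - e^{-2}.
closed_form = 1 - math.exp(-2)
series = sum((1 - 2.0**-i) * math.exp(-4) * 4.0**i / math.factorial(i)
             for i in range(80))
print(closed_form, series)  # both ~0.8647, i.e. 0.86
```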
Let $(X, Y)$ have the joint pdf \[ f(x, y) = \begin{cases} \dfrac{3}{4}(y - x), & 0 < x < y < 2, \\ 0, & \text{otherwise.} \end{cases} \] Then the conditional expectation $E(X|Y = 1)$ equals ........... (round off to two decimal places).
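A midpoint-rule sketch of the conditional expectation, using $f(x \mid y = 1) = \frac{3}{4}(1 - x)/f_Y(1)$ on $(0, 1)$:

```python
# Numerically integrate f_Y(1) = int_0^1 (3/4)(1-x) dx and
# E(X | Y=1) = int_0^1 x (3/4)(1-x) dx / f_Y(1) by the midpoint rule.
n = 100_000
h = 1.0 / n
xs = [(k + 0.5) * h for k in range(n)]
fy1 = sum(0.75 * (1 - x) * h for x in xs)              # f_Y(1) = 3/8
ex = sum(x * 0.75 * (1 - x) * h for x in xs) / fy1
print(ex)  # ~0.3333, i.e. 0.33
```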
The rank of the matrix \[ \begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 2 & 3 & 2 \\ 2 & 5 & 6 & 4 \\ 2 & 6 & 8 & 5 \end{bmatrix} \] equals ................
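A quick check (a sketch using NumPy): row reduction leaves exactly three independent rows, since the last row becomes a combination of the others.

```python
import numpy as np

# Rank of the 4x4 matrix from the question.
A = np.array([[1, 1, 1, 1],
              [1, 2, 3, 2],
              [2, 5, 6, 4],
              [2, 6, 8, 5]])
print(np.linalg.matrix_rank(A))  # 3
```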
Consider the two probability density functions (pdfs): \[ f_0(x) = \begin{cases} 2x, & 0 \le x \le 1, \\ 0, & \text{otherwise}, \end{cases} \qquad f_1(x) = \begin{cases} 1, & 0 \le x \le 1, \\ 0, & \text{otherwise.} \end{cases} \]
Let $X$ be a random variable with pdf $f \in \{f_0, f_1\}$. Consider testing $H_0: f = f_0$ against $H_1: f = f_1$ at level of significance $\alpha = 0.05$. For which observed values of $X$ does the most powerful test reject $H_0$?
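A sketch of the Neyman-Pearson construction for this pair of pdfs:

```python
import math

# Neyman-Pearson: reject H0 when f1(x)/f0(x) = 1/(2x) is large, i.e. for
# small x.  Size 0.05 under f0 gives P0(X < c) = c^2 = 0.05.
c = math.sqrt(0.05)
print(c)  # ~0.2236: the MP test rejects H0 when x < 0.2236
```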
Let $\{X_n\}_{n \ge 1}$ be a sequence of i.i.d. random variables such that \[ P(X_1 = 0) = \frac{1}{4} \quad \text{and} \quad P(X_1 = 1) = \frac{3}{4}. \] Define \[ U_n = \frac{1}{n} \sum_{i=1}^n X_i \quad \text{and} \quad V_n = \frac{1}{n} \sum_{i=1}^n (1 - X_i)^2, \quad n = 1, 2, \ldots \] Then which of the following statements is/are TRUE?
Let $(X, Y)$ have the joint probability mass function \[ f(x, y) = \begin{cases} \binom{x + 1}{y} \binom{16}{x} \left(\frac{1}{6}\right)^y \left(\frac{5}{6}\right)^{x + 1 - y} \left(\frac{1}{2}\right)^{16}, & y = 0, 1, \ldots, x + 1; \, x = 0, 1, \ldots, 16, \\ 0, & \text{otherwise.} \end{cases} \]
Then which of the following statements is/are TRUE?
Consider a sequence of independent Bernoulli trials with probability of success in each trial being $\dfrac{1}{5}$. Then which of the following statements is/are TRUE?
For real constants $a$ and $b$, let \[ M = \begin{bmatrix} \dfrac{1}{\sqrt{2}} & \dfrac{1}{\sqrt{2}} \\ a & b \end{bmatrix} \] be an orthogonal matrix. Then which of the following statements is/are always TRUE?
Let $f : \mathbb{R}^2 \to \mathbb{R}$ be defined by $f(x, y) = x^2(2 - y) - y^3 + 3y^2 + 9y$, where $(x, y) \in \mathbb{R}^2$. Which of the following is/are saddle point(s) of $f$?
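The critical points can be classified with the second-derivative test; a sketch (the four candidate points solve $f_x = f_y = 0$, and a negative Hessian determinant marks a saddle):

```python
# Critical points solve f_x = 2x(2 - y) = 0 and
# f_y = -x^2 - 3y^2 + 6y + 9 = 0: (0, 3), (0, -1), (3, 2), (-3, 2).
def hessian_det(x, y):
    fxx, fyy, fxy = 2 * (2 - y), 6 - 6 * y, -2 * x
    return fxx * fyy - fxy ** 2

for pt in [(0, 3), (0, -1), (3, 2), (-3, 2)]:
    print(pt, hessian_det(*pt))  # negative only at (3, 2) and (-3, 2)
```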
Let the sequence $\{x_n\}_{n \ge 1}$ be given by $x_n = \sin \dfrac{n\pi}{6}$, $n = 1, 2, \ldots$. Then which of the following statements is/are TRUE?
Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution with probability density function \[ f_{\theta}(x) = \begin{cases} \theta (1 - x)^{\theta - 1}, & 0 < x < 1, \\ 0, & \text{otherwise}, \end{cases} \theta > 0. \] To test $H_0: \theta = 1$ against $H_1: \theta > 1$, the uniformly most powerful (UMP) test of size $\alpha$ would reject $H_0$ if
Let $X_1, X_2, \ldots, X_n$ be a random sample from $U(1,2)$ and $Y_1, Y_2, \ldots, Y_n$ be a random sample from $U(0,1)$. Suppose the two samples are independent. Define \[ Z_i = \begin{cases} 1, & \text{if } X_i Y_i < 1, \\ 0, & \text{otherwise}, \end{cases} \quad i = 1, 2, \ldots, n. \] If $\lim_{n \to \infty} P\left(\left|\dfrac{1}{n} \sum_{i=1}^n Z_i - \theta\right| < \epsilon\right) = 1$ for all $\epsilon > 0$, then $\theta$ equals
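By the weak law of large numbers the limit $\theta$ is $P(X_1 Y_1 < 1)$, and conditioning on $X_1$ reduces this to $E[1/X_1]$. A deterministic sketch:

```python
import math

# theta = P(XY < 1) = E[P(Y < 1/X | X)] = E[1/X] for X ~ U(1, 2),
# i.e. the integral of 1/x over [1, 2] = ln 2; midpoint-rule check.
n = 100_000
theta = sum(1.0 / (1 + (k + 0.5) / n) / n for k in range(n))
print(theta, math.log(2))  # both ~0.6931
```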
Let $X_1, X_2, \ldots, X_n$ be a random sample from $\text{Exp}(\theta)$ distribution, where $\theta \in (0, \infty)$. If $\bar{X} = \dfrac{1}{n}\sum_{i=1}^n X_i$, then a 95% confidence interval for $\theta$ is
Let $X_1, X_2, \ldots, X_n$ be a random sample from $U(\theta - 0.5, \theta + 0.5)$ distribution, where $\theta \in \mathbb{R}$. If $X_{(1)} = \min(X_1, X_2, \ldots, X_n)$ and $X_{(n)} = \max(X_1, X_2, \ldots, X_n)$, then which one of the following estimators is NOT a maximum likelihood estimator (MLE) of $\theta$?
Let $X_1, X_2, \ldots, X_n$ be a random sample from $N(\mu_1, \sigma^2)$ distribution and $Y_1, Y_2, \ldots, Y_m$ be a random sample from $N(\mu_2, \sigma^2)$ distribution, where $\mu_i \in \mathbb{R}, i = 1, 2$ and $\sigma > 0$. Suppose that the two random samples are independent. Define \[ \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i \quad \text{and} \quad W = \frac{\sqrt{mn} (\bar{X} - \mu_1)}{\sqrt{\sum_{i=1}^{m} (Y_i - \mu_2)^2}}. \] Then which one of the following statements is TRUE for all positive integers $m$ and $n$?
Let the joint probability density function of $(X, Y)$ be \[ f(x, y) = \begin{cases} 2e^{-(x + y)}, & 0 < x < y < \infty, \\ 0, & \text{otherwise.} \end{cases} \]
Then $P\left(X < \dfrac{Y}{2}\right)$ equals
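A sketch checking the double integral: integrating out $x$ first gives $P(X < Y/2) = \int_0^\infty 2 e^{-y} (1 - e^{-y/2})\, dy = 2(1 - \tfrac{2}{3}) = \tfrac{2}{3}$.

```python
import math

# Midpoint-rule check of int_0^inf 2 e^{-y} (1 - e^{-y/2}) dy on a
# truncated range (the tail beyond y = 40 is ~2e^{-40}, negligible).
n, ymax = 200_000, 40.0
h = ymax / n
p = sum(2 * math.exp(-(k + 0.5) * h) * (1 - math.exp(-(k + 0.5) * h / 2)) * h
        for k in range(n))
print(p)  # ~0.6667
```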
Consider a sequence of independent Bernoulli trials with probability of success in each trial being $\dfrac{1}{3}$. Let $X$ denote the number of trials required to get the second success. Then $P(X \ge 5)$ equals
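A sketch of the negative-binomial computation, summing the complement over the only values below 5:

```python
# X = trials to the 2nd success: P(X = k) = (k - 1) p^2 q^(k - 2),
# k = 2, 3, ...; so P(X >= 5) = 1 - sum over k = 2, 3, 4.
p, q = 1 / 3, 2 / 3
prob = 1 - sum((k - 1) * p ** 2 * q ** (k - 2) for k in range(2, 5))
print(prob, 16 / 27)  # both ~0.5926
```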
Let $M$ be an $n \times n$ non-zero skew-symmetric matrix. Then the matrix $(I_n - M)(I_n + M)^{-1}$ is always
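This map is the Cayley transform of $M$; a concrete sketch checking orthogonality on one instance:

```python
import numpy as np

# Cayley transform: for skew-symmetric M, I + M is invertible and
# (I - M)(I + M)^{-1} is orthogonal; a 2x2 numeric check.
M = np.array([[0.0, 2.0],
              [-2.0, 0.0]])
I = np.eye(2)
Q = (I - M) @ np.linalg.inv(I + M)
print(np.allclose(Q.T @ Q, I))  # True
```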
Let $T: \mathbb{R}^3 \to \mathbb{R}^4$ be a linear transformation. If $T(1,1,0) = (2,0,0,0)$, $T(1,0,1) = (2,4,0,0)$, and $T(0,1,1) = (0,0,2,0)$, then $T(1,1,1)$ equals
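A sketch using linearity: $(1,1,1)$ is half the sum of the three given input vectors, so $T(1,1,1)$ is half the sum of the three given images.

```python
import numpy as np

# (1,1,1) = ((1,1,0) + (1,0,1) + (0,1,1)) / 2, so by linearity
# T(1,1,1) = (T(1,1,0) + T(1,0,1) + T(0,1,1)) / 2.
t = (np.array([2, 0, 0, 0]) + np.array([2, 4, 0, 0]) + np.array([0, 0, 2, 0])) / 2
print(t)  # [2. 2. 1. 0.]
```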