Question:

Let \( X_1, X_2, \dots, X_n \) be a random sample from \( U(\theta, \theta + 1) \), where \( \theta \in \mathbb{R} \) is the unknown parameter. Let \[ U = \max\{X_1, X_2, \dots, X_n\} \quad \text{and} \quad V = \min\{X_1, X_2, \dots, X_n\}. \] Then which of the following statement(s) is (are) true?
 

Hint: For uniform distributions, the minimum and maximum order statistics converge to the endpoints of the support, so suitable linear combinations of them estimate the parameter consistently.
  • (A) \( U \) is a consistent estimator of \( \theta \)
  • (B) \( V \) is a consistent estimator of \( \theta \)
  • (C) \( 2U - V - 2 \) is a consistent estimator of \( \theta \)
  • (D) \( 2V - U + 1 \) is a consistent estimator of \( \theta \)

The correct options are (B), (C), and (D).

Solution and Explanation

Given: $X_1, X_2, \dots, X_n$ is a random sample from $U(\theta, \theta + 1)$.

Let $U = \max\{X_1, X_2, \dots, X_n\}$ and $V = \min\{X_1, X_2, \dots, X_n\}$.

Key facts about uniform distribution $U(\theta, \theta + 1)$:

  • Support: $[\theta, \theta + 1]$
  • As $n \to \infty$: $V \to \theta$ and $U \to \theta + 1$ (almost surely) — a short simulation sketch below illustrates this
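
As a quick illustration (not part of the original solution), here is a minimal simulation sketch; the value $\theta = 2.0$, the seed, and the sample sizes are arbitrary choices:

```python
import numpy as np

# Minimal sketch: watch V = min and U = max approach theta and theta + 1
# as the sample size grows. theta = 2.0 is an arbitrary illustrative choice.
rng = np.random.default_rng(0)
theta = 2.0
for n in [10, 100, 1_000, 10_000]:
    x = rng.uniform(theta, theta + 1, size=n)
    print(f"n={n:6d}  V={x.min():.4f} (-> {theta})  U={x.max():.4f} (-> {theta + 1})")
```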

Option (A): $U$ is a consistent estimator of $\theta$

We need to check if $U \xrightarrow{P} \theta$ as $n \to \infty$.

For the maximum of $n$ i.i.d. $U(\theta, \theta + 1)$ variables, independence gives: $$P(U \leq u) = P(X_1 \leq u, \dots, X_n \leq u) = \prod_{i=1}^{n} P(X_i \leq u) = (u - \theta)^n$$

for $u \in [\theta, \theta + 1]$.

From this CDF, for any $\epsilon \in (0, 1)$: $$P(|U - \theta| \leq \epsilon) = P(U \leq \theta + \epsilon) = \epsilon^n \to 0,$$ so $U$ does not converge in probability to $\theta$. Instead, since $U \leq \theta + 1$ always, $$P(|U - (\theta + 1)| > \epsilon) = P(U < \theta + 1 - \epsilon) = (1 - \epsilon)^n \to 0.$$

Therefore: $U \xrightarrow{P} \theta + 1$, not $\theta$.

Option (A) is FALSE 

Option (B): $V$ is a consistent estimator of $\theta$

For the minimum, independence gives: $$P(V \geq v) = P(X_1 \geq v, \dots, X_n \geq v) = \prod_{i=1}^{n} P(X_i \geq v) = (\theta + 1 - v)^n$$

for $v \in [\theta, \theta + 1]$.

Since $V \geq \theta$ always, for any $\epsilon \in (0, 1)$: $$P(|V - \theta| > \epsilon) = P(V > \theta + \epsilon) = (\theta + 1 - \theta - \epsilon)^n = (1 - \epsilon)^n \to 0 \quad \text{as } n \to \infty$$ (for $\epsilon \geq 1$ the probability is $0$ outright, since $V \leq \theta + 1$).

Therefore: $V \xrightarrow{P} \theta$

Option (B) is TRUE 
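
The tail formula $(1 - \epsilon)^n$ can also be checked numerically. A hedged sketch, where the values of $\theta$, $\epsilon$, $n$, and the replication count are arbitrary illustrative choices:

```python
import numpy as np

# Sketch: compare the analytic tail probability P(V > theta + eps) = (1 - eps)^n
# against a Monte Carlo estimate. All parameter values are illustrative.
rng = np.random.default_rng(1)
theta, eps, n, reps = 2.0, 0.05, 50, 100_000
v = rng.uniform(theta, theta + 1, size=(reps, n)).min(axis=1)
print("analytic :", (1 - eps) ** n)            # 0.95**50 ~ 0.0769
print("simulated:", np.mean(v > theta + eps))  # should agree closely
```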

Option (C): $2U - V - 2$ is a consistent estimator of $\theta$

Since:

  • $U \xrightarrow{P} \theta + 1$
  • $V \xrightarrow{P} \theta$

By continuous mapping theorem: $$2U - V - 2 \xrightarrow{P} 2(\theta + 1) - \theta - 2 = 2\theta + 2 - \theta - 2 = \theta$$

Option (C) is TRUE 

Option (D): $2V - U + 1$ is a consistent estimator of $\theta$

Since:

  • $V \xrightarrow{P} \theta$
  • $U \xrightarrow{P} \theta + 1$

By continuous mapping theorem: $$2V - U + 1 \xrightarrow{P} 2\theta - (\theta + 1) + 1 = 2\theta - \theta - 1 + 1 = \theta$$

Option (D) is TRUE 
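
As an aside (standard facts about uniform order statistics, not part of the original solution): since $E[V] = \theta + \frac{1}{n+1}$ and $E[U] = \theta + \frac{n}{n+1}$ for a sample of size $n$, both consistent combinations carry a small finite-sample bias, $$E[2U - V - 2] = \theta - \frac{3}{n+1}, \qquad E[2V - U + 1] = \theta + \frac{3}{n+1},$$ which vanishes as $n \to \infty$, in line with the conclusions above.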

Answer: (B), (C), and (D)
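
Finally, a Monte Carlo sketch comparing all four candidate estimators at one large $n$ (again with arbitrary illustrative values): $U$ should sit near $\theta + 1$, while the other three concentrate near $\theta$.

```python
import numpy as np

# Sketch: empirical check of consistency at large n. Expect U near theta + 1,
# and V, 2U - V - 2, 2V - U + 1 near theta (the last two with a small
# +/- 3/(n+1) bias, per the remark above). Values are illustrative.
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 5_000, 1_000
x = rng.uniform(theta, theta + 1, size=(reps, n))
U, V = x.max(axis=1), x.min(axis=1)
for name, est in [("U", U), ("V", V),
                  ("2U - V - 2", 2 * U - V - 2),
                  ("2V - U + 1", 2 * V - U + 1)]:
    print(f"{name:11s} mean={est.mean():.4f}  sd={est.std():.5f}")
```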
