Question:

Let \(X_1, X_2, X_3, X_4\) be a random sample of size 4 from an \(N(\theta, 1)\) distribution, where \(\theta \in \mathbb{R}\) is an unknown parameter. Let \(\bar{X} = \frac{1}{4}\sum_{i=1}^{4} X_i\), \(g(\theta) = \theta^2 + 2\theta\), and let \(L(\theta)\) be the Cramer-Rao lower bound on the variance of unbiased estimators of \(g(\theta)\). Then, which one of the following statements is FALSE?

  • 𝐿(πœƒ) = (1 + πœƒ) 2
  • 𝑋̅ + 𝑒 𝑋̅ is a sufficient statistic for πœƒ
  • (1 + 𝑋̅) 2 is the uniformly minimum variance unbiased estimator of 𝑔(πœƒ)
  • π‘‰π‘Žπ‘Ÿ((1 + 𝑋̅) 2 ) β‰₯ \(\frac{(1+ΞΈ) ^2}{ 2}\)

The Correct Option is C

Solution and Explanation

Let us evaluate each of the given statements to identify the false one.

We are given that \(X_1, X_2, X_3, X_4\) form a random sample from a normal distribution \(N(\theta, 1)\), where \(\theta\) is an unknown parameter, so the sample mean \(\bar{X} = \frac{1}{4} \sum_{i=1}^{4} X_i\) satisfies \(\bar{X} \sim N(\theta, \frac{1}{4})\). The function of interest is \(g(\theta) = \theta^2 + 2\theta = (1+\theta)^2 - 1\), and \(L(\theta)\) denotes the Cramer-Rao lower bound (CRLB) on the variance of unbiased estimators of \(g(\theta)\).
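As a quick check of the bound in statement 1, the CRLB for estimating \(g(\theta)\) can be computed directly; the sketch below uses the standard CRLB formula, which applies since the \(N(\theta, 1)\) family satisfies the usual regularity conditions:

\[
\begin{aligned}
I_1(\theta) &= 1 \quad \text{(Fisher information of one } N(\theta,1) \text{ observation)},\\
I_4(\theta) &= 4\, I_1(\theta) = 4,\\
L(\theta) &= \frac{[g'(\theta)]^2}{I_4(\theta)} = \frac{(2\theta + 2)^2}{4} = (1+\theta)^2 .
\end{aligned}
\]

This matches statement 1.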

  1. Statement 1: \(L(\theta) = (1 + \theta)^2\) 
  2. Statement 2: \(\bar{X} + e^{\bar{X}}\) is a sufficient statistic for \(\theta\)
  3. Statement 3: \((1 + \bar{X})^2\) is the uniformly minimum variance unbiased estimator (UMVUE) of \(g(\theta)\)
  4. Statement 4: \(\text{Var}((1 + \bar{X})^2) \geq \frac{(1+\theta)^2}{2}\)

Statement 1 is true: as computed above, \(L(\theta) = \frac{[g'(\theta)]^2}{4} = (1+\theta)^2\).

Statement 2 is true: \(\bar{X}\) is sufficient for \(\theta\) by the factorization theorem, and since \(t \mapsto t + e^{t}\) is strictly increasing, \(\bar{X} + e^{\bar{X}}\) is a one-to-one function of \(\bar{X}\) and is therefore also sufficient.

Statement 3 is false: since \(\bar{X} \sim N(\theta, \frac{1}{4})\), the estimator \((1 + \bar{X})^2\) is biased for \(g(\theta)\); its expectation exceeds \(g(\theta)\) by \(\frac{5}{4}\) (see the sketch below), so it cannot be the uniformly minimum variance unbiased estimator. Indeed, by the Lehmann-Scheffé theorem the UMVUE is \((1 + \bar{X})^2 - \frac{5}{4}\), because \(\bar{X}\) is a complete sufficient statistic for \(\theta\).

Statement 4 is true: \(\text{Var}\big((1 + \bar{X})^2\big) = (1+\theta)^2 + \frac{1}{8}\), which is at least \(\frac{(1+\theta)^2}{2}\) for every \(\theta\) (see the sketch below).

Hence the false statement is option C.
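For completeness, here is a sketch of the moment computations behind statements 3 and 4. It relies only on the facts that \(Y = 1 + \bar{X} \sim N(1+\theta, \frac{1}{4})\) and that \(\text{Var}(Y^2) = 4\mu^2\sigma^2 + 2\sigma^4\) for \(Y \sim N(\mu, \sigma^2)\):

\[
\begin{aligned}
E[Y^2] &= \text{Var}(Y) + (E[Y])^2 = \tfrac{1}{4} + (1+\theta)^2 = g(\theta) + \tfrac{5}{4},\\
\text{Var}(Y^2) &= 4(1+\theta)^2 \cdot \tfrac{1}{4} + 2 \cdot \tfrac{1}{16} = (1+\theta)^2 + \tfrac{1}{8} \;\geq\; \frac{(1+\theta)^2}{2}.
\end{aligned}
\]

The first line shows that \((1+\bar{X})^2\) overestimates \(g(\theta)\) by \(\frac{5}{4}\), which settles statement 3, and the second line verifies the inequality in statement 4.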
