Question:

Let $\bar{\alpha}_1$ and $\bar{\alpha}_2$ be two independent unbiased estimators of the parameter $\alpha$ with standard errors $\sigma_1$ and $\sigma_2$, respectively, with $\sigma_1 \ne \sigma_2$. The linear combination of $\bar{\alpha}_1$ and $\bar{\alpha}_2$ that yields an unbiased estimator of $\alpha$ with the minimum variance is

Hint:

Weights in a minimum-variance combination are inversely proportional to the variances: the lower-variance estimator receives the higher weight.
Updated On: Dec 5, 2025
  • $\left(\frac{\sigma_1}{\sigma_1 + \sigma_2}\right)\bar{\alpha}_1 + \left(\frac{\sigma_2}{\sigma_1 + \sigma_2}\right)\bar{\alpha}_2$
  • $\left(\frac{\sigma_2}{\sigma_1 + \sigma_2}\right)\bar{\alpha}_1 + \left(\frac{\sigma_1}{\sigma_1 + \sigma_2}\right)\bar{\alpha}_2$
  • $\left(\frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right)\bar{\alpha}_1 + \left(\frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\right)\bar{\alpha}_2$
  • $\left(\frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\right)\bar{\alpha}_1 + \left(\frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right)\bar{\alpha}_2$

The Correct Option is C

Solution and Explanation

Step 1: Linear combination setup.
Let the combined estimator be \[ \bar{\alpha} = w\bar{\alpha}_1 + (1 - w)\bar{\alpha}_2. \] Since both are unbiased, $E(\bar{\alpha}) = \alpha$ for any $w$.
Step 2: Minimize variance.
The variance is \[ Var(\bar{\alpha}) = w^2\sigma_1^2 + (1 - w)^2\sigma_2^2. \] Differentiate and set $\frac{dVar}{dw}=0$: \[ 2w\sigma_1^2 - 2(1 - w)\sigma_2^2 = 0 \Rightarrow w = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}. \]
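The minimizing weight found by differentiation can be checked numerically. The sketch below (with hypothetical values $\sigma_1 = 2$, $\sigma_2 = 3$ chosen purely for illustration) scans a fine grid of weights and confirms that the variance $w^2\sigma_1^2 + (1-w)^2\sigma_2^2$ is smallest at $w = \sigma_2^2/(\sigma_1^2 + \sigma_2^2)$:

```python
import numpy as np

# Hypothetical standard errors, for illustration only
sigma1, sigma2 = 2.0, 3.0

def var_combined(w, s1, s2):
    """Variance of w*a1 + (1-w)*a2 for independent estimators."""
    return w**2 * s1**2 + (1 - w)**2 * s2**2

# Closed-form minimizer derived above
w_star = sigma2**2 / (sigma1**2 + sigma2**2)

# Numerical check: minimize over a fine grid of weights in [0, 1]
grid = np.linspace(0.0, 1.0, 100001)
w_num = grid[np.argmin(var_combined(grid, sigma1, sigma2))]

print(w_star, w_num)  # both approximately 0.6923 (= 9/13)
```

For these values $w^\ast = 9/13 \approx 0.692$, so $\bar{\alpha}_1$ (the lower-variance estimator) indeed gets the larger weight.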
Step 3: Substitute $w$.
\[ \bar{\alpha} = \left(\frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right)\bar{\alpha}_1 + \left(\frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\right)\bar{\alpha}_2. \]
Step 4: Conclusion.
Hence, the minimum variance unbiased linear combination is as in option (C).
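A quick Monte Carlo sketch can illustrate the result (all numerical values here are hypothetical, chosen only for the demonstration): the optimally weighted combination has empirical variance close to the theoretical minimum $\sigma_1^2\sigma_2^2/(\sigma_1^2+\sigma_2^2)$, which is below the variance of either estimator alone.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, sigma1, sigma2 = 5.0, 2.0, 3.0  # hypothetical parameter and SEs
n = 200_000  # number of simulated draws

# Independent unbiased estimators of alpha
a1 = rng.normal(alpha, sigma1, n)
a2 = rng.normal(alpha, sigma2, n)

# Optimal weight from the derivation above
w = sigma2**2 / (sigma1**2 + sigma2**2)
combined = w * a1 + (1 - w) * a2

# Theoretical minimum variance of the combination
v_min = sigma1**2 * sigma2**2 / (sigma1**2 + sigma2**2)

print(combined.mean())  # close to alpha = 5 (unbiased)
print(combined.var(), v_min)  # empirical variance close to 36/13
```

With these values the minimum variance is $36/13 \approx 2.77$, smaller than both $\sigma_1^2 = 4$ and $\sigma_2^2 = 9$, showing that combining the estimators beats using either one alone.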

This question was asked in the IIT JAM EN exam.
