Question:

For given data \( (x_i, y_i) \), \( i = 1, 2, \dots, n \), with \( \sum_{i=1}^{n} x_i^2 > 0 \), let \( \hat{\beta} \) satisfy \[ \sum_{i=1}^{n} (y_i - \hat{\beta} x_i)^2 = \inf_{\beta \in \mathbb{R}} \sum_{i=1}^{n} (y_i - \beta x_i)^2. \] Further, let \( v_j = y_j - x_j \) and \( u_j = 2x_j \) for \( j = 1, 2, \dots, n \), and let \( \hat{\gamma} \) satisfy \[ \sum_{i=1}^{n} (v_i - \hat{\gamma} u_i)^2 = \inf_{\gamma \in \mathbb{R}} \sum_{i=1}^{n} (v_i - \gamma u_i)^2. \] If \( \hat{\beta} = 10 \), then the value of \( \hat{\gamma} \) is:

Hint

The least-squares estimator minimizes the sum of squared errors. When the variables are transformed, apply the same formula to the transformed model and relate the new estimator to the original one.
Updated On: Apr 9, 2025
  • 4.5
  • 5
  • 10
  • 9

The Correct Option is A

Solution and Explanation

The given problem involves two optimization problems for minimizing the sum of squared errors.
1. The first condition gives the least-squares estimate \( \hat{\beta} \), which minimizes the sum of squared errors in the linear regression model \( y_i = \beta x_i + \epsilon_i \). It is known that the least-squares estimator of \( \beta \) is given by the formula: \[ \hat{\beta} = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}. \] We are given that \( \hat{\beta} = 10 \).
2. The second condition defines \( v_j = y_j - x_j \) and \( u_j = 2x_j \), and \( \hat{\gamma} \) is the least-squares estimator for the model \( v_j = \gamma u_j + \epsilon_j \): \[ \hat{\gamma} = \frac{\sum_{i=1}^{n} u_i v_i}{\sum_{i=1}^{n} u_i^2}. \] Substitute \( v_i = y_i - x_i \) and \( u_i = 2x_i \): \[ \hat{\gamma} = \frac{\sum_{i=1}^{n} (2x_i)(y_i - x_i)}{\sum_{i=1}^{n} (2x_i)^2} = \frac{2 \sum_{i=1}^{n} x_i (y_i - x_i)}{4 \sum_{i=1}^{n} x_i^2} = \frac{\sum_{i=1}^{n} x_i y_i - \sum_{i=1}^{n} x_i^2}{2 \sum_{i=1}^{n} x_i^2}. \] Dividing numerator and denominator by \( \sum_{i=1}^{n} x_i^2 \) expresses \( \hat{\gamma} \) directly in terms of \( \hat{\beta} \): \[ \hat{\gamma} = \frac{1}{2} \left( \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2} - 1 \right) = \frac{\hat{\beta} - 1}{2}. \] With \( \hat{\beta} = 10 \), \[ \hat{\gamma} = \frac{10 - 1}{2} = \frac{9}{2} = 4.5. \] Therefore, the value of \( \hat{\gamma} \) is \( \boxed{4.5} \).
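As a quick numerical sanity check (not part of the original solution), the identity \( \hat{\gamma} = (\hat{\beta} - 1)/2 \) can be verified on synthetic data. The data below are illustrative assumptions, not taken from the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
# construct y so that the no-intercept least-squares slope is near 10
y = 10 * x + rng.normal(scale=0.5, size=50)

# least-squares estimator for the model y_i = beta * x_i + eps_i
beta_hat = np.sum(x * y) / np.sum(x**2)

# transformed variables and the corresponding estimator
v = y - x
u = 2 * x
gamma_hat = np.sum(u * v) / np.sum(u**2)

# the algebraic identity gamma_hat = (beta_hat - 1) / 2 holds exactly,
# regardless of the noise, because it follows from the formulas alone
print(beta_hat, gamma_hat, (beta_hat - 1) / 2)
```

Because the identity is purely algebraic, `gamma_hat` equals `(beta_hat - 1) / 2` to machine precision for any data set with \( \sum x_i^2 > 0 \).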

This question was asked in the GATE ST exam.