Concept:
In linear regression, the objective is to find the best-fitting line that describes the relationship between the independent variable(s) and the dependent variable.
The Ordinary Least Squares (OLS) method estimates the regression coefficients by minimizing the sum of the squared differences between the predicted values and the actual observed values.
The error between actual and predicted values is called the residual.
Step 1: Define the linear regression model.
The simple linear regression model can be written as:
\[
y_i = \beta_0 + \beta_1 x_i + \epsilon_i
\]
where:
- \(y_i\) = observed value of the dependent variable
- \(x_i\) = value of the independent variable (input feature)
- \(\beta_0, \beta_1\) = regression coefficients (intercept and slope)
- \(\epsilon_i\) = random error term
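To make the model concrete, the data-generating process above can be simulated. This is a minimal sketch in Python (NumPy) with illustrative coefficient values chosen here, not taken from the text:

```python
import numpy as np

# Hypothetical "true" coefficients, for illustration only
beta0, beta1 = 2.0, 0.5

rng = np.random.default_rng(0)
n = 100
x = rng.uniform(0, 10, size=n)    # input feature x_i
eps = rng.normal(0, 1, size=n)    # error term epsilon_i
y = beta0 + beta1 * x + eps       # y_i = beta_0 + beta_1 * x_i + epsilon_i
```

Each observed \(y_i\) is the linear signal \(\beta_0 + \beta_1 x_i\) plus random noise \(\epsilon_i\).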
Step 2: Define residuals.
The residual is the difference between the observed value \(y_i\) and the predicted value \(\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i\):
\[
e_i = y_i - \hat{y}_i
\]
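Computing residuals is elementwise subtraction of predictions from observations. A short sketch, using small toy data and hypothetical fitted coefficients chosen purely for illustration:

```python
import numpy as np

# Toy data and assumed fitted coefficients (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 2.9, 4.2, 4.8])
b0_hat, b1_hat = 1.2, 0.9

y_hat = b0_hat + b1_hat * x    # predicted values y_hat_i
residuals = y - y_hat          # e_i = y_i - y_hat_i
```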
Step 3: Apply the OLS objective function.
The OLS method minimizes the sum of squared residuals (SSR):
\[
\text{SSR} = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2
\]
Squaring the residuals ensures:
- All errors become positive
- Larger errors are penalized more heavily
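The SSR formula above translates directly into code: square the residuals and sum them. A minimal sketch with illustrative observed and predicted values:

```python
import numpy as np

y = np.array([2.1, 2.9, 4.2, 4.8])        # observed values
y_hat = np.array([2.1, 3.0, 3.9, 4.8])    # predicted values (illustrative)

residuals = y - y_hat
ssr = np.sum(residuals ** 2)    # SSR = sum of squared residuals
```

Note how squaring makes every term non-negative and weights the larger residual (0.3) nine times more heavily than the smaller one (-0.1).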
Step 4: Conclusion.
Therefore, the main objective of the Ordinary Least Squares method is to:
\[
\min_{\beta_0,\,\beta_1} \; \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 x_i \right)^2
\]
that is, minimize the sum of squared residuals.
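For simple linear regression, this minimization has a well-known closed-form solution: \(\hat{\beta}_1 = \sum (x_i - \bar{x})(y_i - \bar{y}) / \sum (x_i - \bar{x})^2\) and \(\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}\). A sketch on toy data (the values are illustrative), cross-checked against NumPy's built-in least-squares fit:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.2, 2.8, 4.5, 3.7, 5.5])

x_bar, y_bar = x.mean(), y.mean()

# Closed-form OLS estimates for simple linear regression
b1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0_hat = y_bar - b1_hat * x_bar

# Cross-check: np.polyfit(deg=1) also minimizes the sum of squared residuals
slope, intercept = np.polyfit(x, y, deg=1)
```

Any other choice of \((\beta_0, \beta_1)\) yields a strictly larger SSR, which is exactly what "best-fitting line" means under OLS.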