The least squares method minimizes the sum of squared differences between the observed values yᵢ and the predicted values f(xᵢ): ∑ᵢ [yᵢ − f(xᵢ)]². Because the residuals are squared, larger deviations are penalized disproportionately, which makes the method widely used in regression and optimization problems.
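As a minimal sketch of the idea, the snippet below fits a straight line f(x) = a·x + b by minimizing the sum of squared residuals with NumPy's `np.linalg.lstsq`; the data points and the linear model are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Illustrative (x_i, y_i) observations (hypothetical values).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix for the assumed linear model f(x) = a*x + b.
A = np.column_stack([x, np.ones_like(x)])

# lstsq finds the coefficients minimizing sum((y - A @ coeffs)**2).
coeffs, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
a, b = coeffs

print(f"fit: f(x) = {a:.3f}*x + {b:.3f}")
print("sum of squared residuals:", np.sum((y - A @ coeffs) ** 2))
```

Squaring the residuals before summing is what causes a single large deviation to dominate the objective more than several small ones of the same total magnitude.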