Step 1: Understanding the given condition.
We are given that the covariance matrix has eigenvalues \( \lambda_i = 100^{2-i} \) for \( 1 \leq i \leq 100 \), so the eigenvalues are strictly positive and strictly decreasing in \( i \). The direction of maximum variance, \( u \), corresponds to the first principal component: the unit eigenvector associated with the largest eigenvalue \( \lambda_1 = 100^{2-1} = 100 \).
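In symbols (introducing \( \Sigma \) for the covariance matrix and \( u_1, \ldots, u_{100} \) for its orthonormal eigenvectors; this notation is assumed here rather than given in the problem): \[ \Sigma u_i = \lambda_i u_i, \qquad \lambda_1 > \lambda_2 > \cdots > \lambda_{100} > 0, \qquad u = u_1. \]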
Step 2: Principal Component Analysis.
The expression \( u^T x^{(i)} \) is the scalar projection of the \( i \)-th data point onto the unit vector \( u \), and \( \left( u^T x^{(i)} \right)^2 \) is its square. The sum \( \frac{1}{n} \sum_{i=1}^{n} \left( u^T x^{(i)} \right)^2 \) is therefore the average squared projection of the data points onto the first principal component.
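Assuming the data are centered (zero mean), so that the covariance matrix is \( \Sigma = \frac{1}{n} \sum_{i=1}^{n} x^{(i)} x^{(i)T} \) (a standard convention the problem does not state explicitly), the average squared projection is exactly a quadratic form in \( \Sigma \): \[ \frac{1}{n} \sum_{i=1}^{n} \left( u^T x^{(i)} \right)^2 = u^T \left( \frac{1}{n} \sum_{i=1}^{n} x^{(i)} x^{(i)T} \right) u = u^T \Sigma u. \] This identity is what ties the average squared projection to the eigenvalues in the next step.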
Step 3: Eigenvalue decomposition of the covariance matrix.
The variance of the data along an eigenvector direction equals the corresponding eigenvalue. Since \( u \) is the unit eigenvector associated with the largest eigenvalue, the variance in the direction of \( u \) is \( u^T \Sigma u = \lambda_1 = 100 \). Therefore, the average squared projection is: \[ \frac{1}{n} \sum_{i=1}^{n} \left( u^T x^{(i)} \right)^2 = u^T \Sigma u = \lambda_1 = 100. \] Thus, the value of \( \frac{1}{n} \sum_{i=1}^{n} \left( u^T x^{(i)} \right)^2 \) is \( 100 \).
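As a sanity check (not part of the original solution), here is a minimal numerical sketch in Python using NumPy. It assumes a diagonal covariance whose diagonal entries are the eigenvalues \( 100^{2-i} \), so the top eigenvector is simply the first standard basis vector; the sample average squared projection should then approach \( \lambda_1 = 100 \):

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 100, 100_000
# Eigenvalues lambda_i = 100^(2-i) for i = 1..100; lambda_1 = 100.
eigvals = 100.0 ** (2 - np.arange(1, d + 1))

# Diagonal covariance: the eigenvectors are the standard basis vectors,
# so the direction of maximum variance u is e_1.
X = rng.normal(size=(n, d)) * np.sqrt(eigvals)  # rows are zero-mean x^(i)
u = np.zeros(d)
u[0] = 1.0

# Average squared projection: (1/n) * sum_i (u^T x^(i))^2
avg_sq_proj = np.mean((X @ u) ** 2)
print(avg_sq_proj)  # ~100; converges to lambda_1 as n grows
```

Using a diagonal covariance keeps the sketch self-contained: the direction of maximum variance is known in closed form, so no eigendecomposition is needed, and the printed value differs from \( 100 \) only by sampling noise that shrinks like \( 1/\sqrt{n} \).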