Given \( I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \) and \( A = \begin{bmatrix} a & b \\ c & -a \end{bmatrix} \), we need to find the condition for \( A^2 = I \). Calculating \( A^2 \):
\[ A^2 = \begin{bmatrix} a & b \\ c & -a \end{bmatrix} \begin{bmatrix} a & b \\ c & -a \end{bmatrix} = \begin{bmatrix} a^2 + bc & ab - ab \\ ac - ac & bc + a^2 \end{bmatrix} = \begin{bmatrix} a^2 + bc & 0 \\ 0 & a^2 + bc \end{bmatrix} \]
For \( A^2 = I \):
\[ \begin{bmatrix} a^2 + bc & 0 \\ 0 & a^2 + bc \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \]
This implies
\[ a^2 + bc = 1, \]
which rearranges to
\[ 1 - a^2 - bc = 0. \]
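The condition can be verified numerically: any choice of \( a, b, c \) with \( a^2 + bc = 1 \) should make \( A^2 \) the identity. A minimal sketch (the helper `matmul2` and the sample values are illustrative, not from the problem):

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

# Sample values with a^2 + bc = 1: a = 3, b = 4, c = -2 gives 9 + (-8) = 1.
a, b, c = 3, 4, -2
A = [[a, b], [c, -a]]
I = [[1, 0], [0, 1]]
print(matmul2(A, A) == I)  # True

# A matrix of the same shape with a^2 + bc = 2 fails the condition:
B = [[1, 1], [1, -1]]
print(matmul2(B, B) == I)  # False: B^2 = 2I
```

Note that the trace-zero shape of \( A \) is what forces the off-diagonal entries of \( A^2 \) to cancel, so the single scalar equation \( a^2 + bc = 1 \) is the only constraint.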
Thus, the correct answer is Option (B).