Question:

If \( 3A = \begin{bmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \\ a & 2 & b \end{bmatrix} \) and \( AA^T = I \), then \( \frac{a}{b} + \frac{b}{a} = \):

Hint

For the condition \( AA^T = I \), note that:
  • \( AA^T = I \) means that \( A \) is an orthogonal matrix.
  • The rows of an orthogonal matrix are mutually orthogonal unit vectors; use these properties to solve for the unknowns.
  • \( -\frac{5}{2} \)
  • \( \frac{13}{6} \)
  • \( \frac{13}{6} \)
  • \( \frac{5}{2} \)

The Correct Option is D

Approach Solution - 1

Step 1: Solve for matrix \( A \). Given that \( 3A = \begin{bmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \\ a & 2 & b \end{bmatrix} \), we have \[ A = \frac{1}{3} \begin{bmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \\ a & 2 & b \end{bmatrix}. \]

Step 2: Use the condition \( AA^T = I \). Computing \( AA^T \) and setting it equal to the identity matrix means the rows of \( A \) must be orthonormal, which gives the relations \( a + 2b = -4 \), \( a - b = -1 \) and \( a^2 + b^2 = 5 \).

Step 3: Solve for \( a \) and \( b \). These equations give \( a = -2 \) and \( b = -1 \).

Step 4: Compute \( \frac{a}{b} + \frac{b}{a} \). \[ \frac{a}{b} + \frac{b}{a} = \frac{-2}{-1} + \frac{-1}{-2} = 2 + \frac{1}{2} = \frac{5}{2}. \]
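
As a quick numerical check (a minimal sketch, not part of the original solution), the values \( a = -2 \), \( b = -1 \) can be verified with NumPy: the resulting \( A \) satisfies \( AA^T = I \) and the expression evaluates to \( \frac{5}{2} \).

```python
import numpy as np

# Assumed values from the solution: a = -2, b = -1
a, b = -2, -1
A = np.array([[1, 2, 2],
              [2, 1, -2],
              [a, 2, b]]) / 3

# A is orthogonal exactly when A @ A.T equals the 3x3 identity
print(np.allclose(A @ A.T, np.eye(3)))  # expected: True

# The requested expression a/b + b/a
print(a / b + b / a)                    # expected: 2.5
```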

Approach Solution - 2

Given: \[ 3A = \begin{bmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \\ a & 2 & b \end{bmatrix} \] and \[ AA^T = I, \] where \( I \) is the identity matrix. Find the value of \[ \frac{a}{b} + \frac{b}{a}. \]

Step 1: Express \( A \) \[ A = \frac{1}{3} \begin{bmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \\ a & 2 & b \end{bmatrix}. \]

Step 2: Use the orthogonality condition \( AA^T = I \). Since \( AA^T = I \), the rows of \( A \) are orthonormal vectors. Let the rows of \( A \) be \( R_1, R_2, R_3 \): \[ R_1 = \frac{1}{3} (1, 2, 2), \quad R_2 = \frac{1}{3} (2, 1, -2), \quad R_3 = \frac{1}{3} (a, 2, b). \]

Step 3: Use the orthogonality of \( R_1 \) and \( R_3 \) \[ R_1 \cdot R_3 = 0 \implies \frac{1}{3} (1, 2, 2) \cdot \frac{1}{3} (a, 2, b) = 0, \] \[ \frac{1}{9} (a + 4 + 2b) = 0 \implies a + 2b = -4. \]

Step 4: Use the orthogonality of \( R_2 \) and \( R_3 \) \[ R_2 \cdot R_3 = 0 \implies \frac{1}{9} (2a + 2 + (-2) b) = 0, \] \[ 2a + 2 - 2b = 0 \implies a - b = -1. \]

Step 5: Use the normalization of \( R_3 \) \[ R_3 \cdot R_3 = 1 \implies \frac{1}{9} (a^2 + 4 + b^2) = 1, \] \[ a^2 + b^2 = 5. \]

Step 6: Solve for \( a \) and \( b \). From Step 4: \[ a = b - 1. \] Substitute into Step 3: \[ a + 2b = -4 \implies (b - 1) + 2b = -4 \implies 3b = -3 \implies b = -1. \] Then, \[ a = -1 - 1 = -2. \] Check Step 5: \[ (-2)^2 + (-1)^2 = 4 + 1 = 5, \] which satisfies the condition.
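
The same system can also be solved symbolically; the short SymPy sketch below (illustrative only, using assumed variable names) reproduces the values from Steps 3, 4 and 5 and the final ratio.

```python
import sympy as sp

a, b = sp.symbols('a b')

# Orthogonality conditions from Steps 3 and 4
sol = sp.solve([sp.Eq(a + 2*b, -4), sp.Eq(a - b, -1)], [a, b])
print(sol)                            # expected: {a: -2, b: -1}

# Normalization condition from Step 5 should also hold
print(sol[a]**2 + sol[b]**2)          # expected: 5

# The requested expression
print(sol[a]/sol[b] + sol[b]/sol[a])  # expected: 5/2
```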

Step 7: Calculate \( \frac{a}{b} + \frac{b}{a} \) \[ \frac{a}{b} + \frac{b}{a} = \frac{-2}{-1} + \frac{-1}{-2} = 2 + \frac{1}{2} = \frac{5}{2}. \]

Final answer: \[ \boxed{\frac{5}{2}}. \]