Because we are conditioning on a future state here, the Markov property does not let us simply drop \( X_3 \): in general,
\[
P(X_2 = 1 \mid X_1 = 1, X_3 = 2) \neq P(X_2 = 1 \mid X_1 = 1).
\]
What the Markov property does give is that the future is independent of the past given the present, i.e. \( P(X_3 = 2 \mid X_2 = 1, X_1 = 1) = P(X_3 = 2 \mid X_2 = 1) \). Applying Bayes' rule together with this property:
\[
P(X_2 = 1 \mid X_1 = 1, X_3 = 2) = \frac{P(X_2 = 1 \mid X_1 = 1)\, P(X_3 = 2 \mid X_2 = 1)}{P(X_3 = 2 \mid X_1 = 1)} = \frac{p_{11}\, p_{12}}{\sum_k p_{1k}\, p_{k2}},
\]
where \( p_{ij} \) denotes the entry in row \( i \), column \( j \) of the transition matrix. From the transition matrix, \( p_{11} = P(X_2 = 1 \mid X_1 = 1) = \frac{1}{2} \).
Substituting the entries of the transition matrix into Bayes' rule then gives:
\[
P(X_2 = 1 \mid X_1 = 1, X_3 = 2) = 0.36
\]
Thus, the correct answer is 0.36.
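As a sanity check, the Bayes computation can be sketched in a few lines of Python. The full transition matrix is not reproduced in this solution, so the matrix `P` below is a hypothetical two-state example, chosen only so that it agrees with \( p_{11} = \tfrac{1}{2} \) and reproduces the answer 0.36; substitute the actual matrix from the problem to verify the original result.

```python
# Hypothetical two-state transition matrix (rows sum to 1).
# Chosen to be consistent with p_11 = 1/2 and a final answer of 0.36;
# it is NOT the matrix from the original problem.
P = [
    [1/2, 1/2],   # transitions out of state 1
    [1/9, 8/9],   # transitions out of state 2
]

def cond_prob(P, i, j, k):
    """P(X_2 = j | X_1 = i, X_3 = k) via Bayes' rule + the Markov property.

    Numerator:   p_ij * p_jk
    Denominator: sum_m p_im * p_mk  (= P(X_3 = k | X_1 = i))
    States are 1-indexed in the math, 0-indexed in the lists.
    """
    i, j, k = i - 1, j - 1, k - 1
    num = P[i][j] * P[j][k]
    den = sum(P[i][m] * P[m][k] for m in range(len(P)))
    return num / den

print(round(cond_prob(P, 1, 1, 2), 6))  # 0.36 for this matrix
```

Note that the naive shortcut would return `P[0][0] = 0.5` here, so the example also illustrates why conditioning on the future state changes the answer.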