Question:

Let \( \{X_n\}_{n \geq 1} \) be a Markov chain with state space \( \{1, 2, 3\} \) and transition probability matrix \[ P = \begin{bmatrix} \frac{1}{2} & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ 0 & \frac{1}{2} & \frac{1}{2} \end{bmatrix} \] Then \( P(X_2 = 1 \mid X_1 = 1, X_3 = 2) \) (rounded off to two decimal places) equals:

Hint

- The Markov property implies that, given the present state, the past and the future are conditionally independent. Here the conditioning involves a *future* state, so the condition \( X_3 = 2 \) cannot simply be dropped.
- Write the conditional probability as a ratio of path probabilities, and sum over the intermediate state to obtain the two-step probability \( P(X_3 = 2 \mid X_1 = 1) \).
Updated On: Aug 30, 2025

Solution and Explanation

The Markov property states that, conditional on the present state, the past and the future are independent. Here, however, we condition on both the past state \( X_1 \) and the future state \( X_3 \), so the condition \( X_3 = 2 \) cannot simply be dropped. By the definition of conditional probability and the Markov property, \[ P(X_2 = 1 \mid X_1 = 1, X_3 = 2) = \frac{P(X_2 = 1, X_3 = 2 \mid X_1 = 1)}{P(X_3 = 2 \mid X_1 = 1)} = \frac{p_{11}\, p_{12}}{\sum_{k=1}^{3} p_{1k}\, p_{k2}}, \] where \( p_{ij} \) denotes the \((i,j)\) entry of \( P \).
From the transition matrix, the numerator is \( p_{11}\, p_{12} = \frac{1}{2} \cdot \frac{1}{4} = \frac{1}{8} \), and the denominator is \[ p_{11} p_{12} + p_{12} p_{22} + p_{13} p_{32} = \frac{1}{2} \cdot \frac{1}{4} + \frac{1}{4} \cdot \frac{1}{3} + \frac{1}{4} \cdot \frac{1}{2} = \frac{1}{8} + \frac{1}{12} + \frac{1}{8} = \frac{1}{3}. \]
Thus, the probability is: \[ P(X_2 = 1 \mid X_1 = 1, X_3 = 2) = \frac{1/8}{1/3} = \frac{3}{8} = 0.375. \] Rounded off to two decimal places, the correct answer is 0.38.
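The two quantities above can be checked numerically. A minimal sketch in Python (the matrix and state indices follow the problem statement; variable names are illustrative):

```python
# Numerical check of P(X2 = 1 | X1 = 1, X3 = 2) for the given chain.
# States are 1, 2, 3; P[i][j] holds the transition probability (i+1) -> (j+1).
P = [
    [1/2, 1/4, 1/4],
    [1/3, 1/3, 1/3],
    [0,   1/2, 1/2],
]

# Numerator: P(X2 = 1, X3 = 2 | X1 = 1) = p(1,1) * p(1,2)
numerator = P[0][0] * P[0][1]

# Denominator: P(X3 = 2 | X1 = 1) = sum over k of p(1,k) * p(k,2)
denominator = sum(P[0][k] * P[k][1] for k in range(3))

answer = numerator / denominator
print(round(answer, 2))  # prints 0.38 (exactly 3/8 = 0.375)
```

The denominator is just the \((1,2)\) entry of \( P^2 \), i.e. the two-step transition probability from state 1 to state 2.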

Questions Asked in GATE ST exam
