Question:

Let \( \{X_n\}_{n \geq 1} \) and \( \{Y_n\}_{n \geq 1} \) be two sequences of random variables and \( X \) and \( Y \) be two random variables, all of them defined on the same probability space. Which one of the following statements is true?

Hint:

- Convergence in distribution to a constant implies convergence in probability to that constant.
- Convergence in probability does not imply convergence in mean of any order (e.g., 1st mean or 3rd mean).
Updated On: Aug 30, 2025
  • If \( \{X_n\}_{n \geq 1} \) converges in distribution to a real constant \( c \), then \( \{X_n\}_{n \geq 1} \) converges in probability to \( c \)
  • If \( \{X_n\}_{n \geq 1} \) converges in probability to \( X \), then \( \{X_n\}_{n \geq 1} \) converges in 3rd mean to \( X \)
  • If \( \{X_n\}_{n \geq 1} \) converges in distribution to \( X \) and \( \{Y_n\}_{n \geq 1} \) converges in distribution to \( Y \), then \( \{X_n + Y_n\}_{n \geq 1} \) converges in distribution to \( X + Y \)
  • If \( E(X_n) \) converges to \( E(X) \), then \( \{X_n\}_{n \geq 1} \) converges in 1st mean to \( X \)
Verified By Collegedunia

The Correct Option is A

Solution and Explanation

1) Convergence in distribution vs. probability:
If a sequence \( X_n \) converges in distribution to a real constant \( c \), the limiting distribution is degenerate at \( c \): its CDF satisfies \( F(x) = 0 \) for \( x < c \) and \( F(x) = 1 \) for \( x \geq c \). For any \( \epsilon > 0 \), the points \( c - \epsilon \) and \( c + \epsilon/2 \) are continuity points of \( F \), so
\[ P(|X_n - c| \geq \epsilon) \leq F_n(c - \epsilon) + \big(1 - F_n(c + \epsilon/2)\big) \to 0 + (1 - 1) = 0 \]
as \( n \to \infty \). This is exactly the definition of convergence in probability. Hence, convergence in distribution to a constant implies convergence in probability to that constant.
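As a quick illustration (a simulation sketch, not part of the original solution), take \( X_n \sim N(c, 1/n) \), which converges in distribution to the constant \( c \). Monte Carlo estimates of \( P(|X_n - c| \geq \epsilon) \) shrink toward 0 as \( n \) grows, consistent with convergence in probability:

```python
import numpy as np

# Sketch: X_n ~ N(c, 1/n) converges in distribution to the constant c.
# We estimate P(|X_n - c| >= eps) by Monte Carlo and observe it vanish.
rng = np.random.default_rng(0)
c, eps, trials = 2.0, 0.1, 100_000

def prob_far(n):
    """Monte Carlo estimate of P(|X_n - c| >= eps) for X_n ~ N(c, 1/n)."""
    samples = rng.normal(loc=c, scale=1 / np.sqrt(n), size=trials)
    return np.mean(np.abs(samples - c) >= eps)

probs = [prob_far(n) for n in (10, 100, 1000, 10_000)]
```

The estimated probabilities decrease steadily toward 0 as \( n \) increases.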
2) Explanation of the other options:
(B) Convergence in probability does not imply convergence in 3rd mean: the probability of a large deviation can shrink to zero while the deviations themselves grow fast enough that higher moments blow up. Convergence in probability is strictly weaker than convergence in mean.
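A standard counterexample (our own sketch, not from the original solution) for (B): let \( X_n = n \) with probability \( 1/n \) and \( 0 \) otherwise. Then \( P(|X_n| \geq \epsilon) = 1/n \to 0 \), so \( X_n \to 0 \) in probability, yet \( E|X_n|^3 = n^3 \cdot (1/n) = n^2 \to \infty \):

```python
# Counterexample for (B): X_n = n with probability 1/n, else 0.
def prob_far(n, eps=0.5):
    # P(|X_n - 0| >= eps) = P(X_n = n) = 1/n  (for eps < n)
    return 1 / n

def third_moment(n):
    # E|X_n - 0|^3 = n**3 * (1/n) = n**2, which diverges
    return n ** 2

probs = [prob_far(n) for n in (10, 100, 1000)]       # -> 0
moments = [third_moment(n) for n in (10, 100, 1000)]  # -> infinity
```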
(C) Convergence in distribution of \( X_n \) to \( X \) and of \( Y_n \) to \( Y \) concerns only the marginal distributions; it says nothing about the joint distribution of \( (X_n, Y_n) \). Hence \( X_n + Y_n \) need not converge in distribution to \( X + Y \). (The statement does hold when one of the limits is a constant, by Slutsky's theorem.)
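A counterexample sketch for (C) (our own illustration): let \( Z \sim N(0,1) \) and set \( X_n = Z \), \( Y_n = -Z \) for every \( n \). By symmetry both sequences converge in distribution to \( N(0,1) \), but \( X_n + Y_n = 0 \) identically, which is not distributed like the sum of two independent \( N(0,1) \) limits:

```python
import numpy as np

# Counterexample for (C): X_n = Z and Y_n = -Z, with Z ~ N(0,1).
# Each sequence is marginally N(0,1), but the sum is degenerate at 0.
rng = np.random.default_rng(1)
z = rng.normal(size=100_000)
x_n, y_n = z, -z
sums = x_n + y_n          # identically zero for every sample
var_sum = sums.var()      # 0, not the 2 expected for independent N(0,1) limits
```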
(D) Convergence of expectations, \( E(X_n) \to E(X) \), does not guarantee convergence in 1st mean, which requires \( E|X_n - X| \to 0 \). Convergence of expectations is a strictly weaker condition: the means can agree while \( X_n \) and \( X \) remain far apart with high probability.
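A counterexample sketch for (D) (our own illustration): take \( X_n \) and \( X \) independent, all \( N(0,1) \). Then \( E(X_n) = E(X) = 0 \) for every \( n \), yet \( X_n - X \sim N(0,2) \), so \( E|X_n - X| = 2/\sqrt{\pi} \approx 1.13 \) never shrinks:

```python
import numpy as np

# Counterexample for (D): X_n and X i.i.d. N(0,1), independent of each other.
# Expectations agree (both 0), but E|X_n - X| = 2/sqrt(pi) stays away from 0.
rng = np.random.default_rng(2)
trials = 200_000
x = rng.normal(size=trials)      # the limit candidate X
x_n = rng.normal(size=trials)    # X_n, independent of X, same law

mean_gap = abs(x_n.mean() - x.mean())   # ~ 0: expectations converge trivially
l1_distance = np.abs(x_n - x).mean()    # ~ 2/sqrt(pi) ≈ 1.13, bounded away from 0
```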

Asked in the GATE ST exam.
