Convergence in Probability Theory

Convergence in probability theory deals with a sequence of random variables (RVs) approaching a limiting RV, in one of several senses of "approaching."

Convergence in distribution. Pointwise convergence of the CDFs \(F_n\) of a sequence of RVs \(X_1, X_2, \dots\) to the CDF \(F\) of \(X\), at every point \(x\) where \(F\) is continuous: \[ \lim_{n\to \infty} F_n(x) = F(x). \] We write \[ X_n \xrightarrow{\mathcal{D}} X. \] The continuous mapping theorem (i.e., for a sequence of RVs \(\{X_n\}\) converging in distribution to some RV \(X\), continuous functions of \(X_n\) also converge in distribution to the same function of \(X\)) and Lévy's continuity theorem (convergence in distribution holds iff the characteristic functions of \(X_n\) converge pointwise to the characteristic function of \(X\)) are useful tools.
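A quick way to see convergence in distribution numerically is the central limit theorem: the standardized mean of \(n\) i.i.d. Uniform(0,1) draws converges in distribution to N(0,1). The sketch below (assuming NumPy is available; all names and the grid of evaluation points are illustrative choices) estimates the largest gap between the empirical CDF \(F_n\) and the standard normal CDF \(\Phi\), which should shrink as \(n\) grows:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def standard_normal_cdf(x):
    # Phi(x) computed via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_Xn(n, reps=50_000):
    # X_n = standardized mean of n Uniform(0,1) draws; by the CLT,
    # X_n converges in distribution to N(0, 1).
    u = rng.random((reps, n))
    # Uniform(0,1) has mean 1/2 and variance 1/12.
    return (u.mean(axis=1) - 0.5) / math.sqrt(1.0 / (12.0 * n))

gaps = {}
for n in (1, 5, 50):
    xs = sample_Xn(n)
    # Largest gap between the empirical CDF F_n and Phi on a grid.
    gaps[n] = max(abs(float(np.mean(xs <= x)) - standard_normal_cdf(x))
                  for x in np.linspace(-3.0, 3.0, 61))
    print(f"n={n:2d}  sup |F_n(x) - Phi(x)| ~ {gaps[n]:.3f}")
```

The gap is only required to vanish at continuity points of \(F\); here \(\Phi\) is continuous everywhere, so a grid of points suffices.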

Convergence in probability. For every \(\epsilon > 0\), the probability that \(X_n\) lies outside the ball of radius \(\epsilon\) centered at \(X\) tends to zero: \[ \lim_{n\to \infty} \Pr(| X_n - X | > \epsilon)= 0. \] We write \(X_n \xrightarrow{P} X\). The continuous mapping theorem holds here as well.
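The weak law of large numbers is the standard instance: the sample mean of i.i.d. Uniform(0,1) draws converges in probability to \(\mu = 1/2\). This sketch (NumPy assumed; the choice \(\epsilon = 0.1\) and the sample sizes are illustrative) estimates the exceedance probability \(\Pr(|X_n - \mu| > \epsilon)\) by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, eps = 0.5, 0.1  # mean of Uniform(0,1) and the tolerance epsilon

def exceedance_prob(n, reps=20_000):
    # X_n = sample mean of n Uniform(0,1) draws; by the weak law of
    # large numbers X_n -> mu in probability, so this probability
    # should shrink toward zero as n grows.
    means = rng.random((reps, n)).mean(axis=1)
    return float(np.mean(np.abs(means - mu) > eps))

probs = {n: exceedance_prob(n) for n in (5, 50, 500)}
for n, p in probs.items():
    print(f"n={n:3d}  Pr(|X_n - mu| > {eps}) ~ {p:.4f}")
```

Note the quantifier order: the limit must hold for each fixed \(\epsilon\) separately, which the loop over \(n\) at one fixed \(\epsilon\) illustrates.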

Almost sure convergence. The sample paths themselves converge with probability one: \[ \Pr(\lim_{n \to \infty} X_n = X) = 1. \] We write \(X_n \xrightarrow{a.s.} X\). If we disregard sets of measure zero (the only gap separating it from sure convergence), this is the strongest form of convergence. Practically, this form is the strongest: a.s. convergence implies convergence in probability, which implies convergence in distribution.
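Almost sure convergence is a statement about whole sample paths, not about any single index \(n\). The strong law of large numbers gives a concrete case: along almost every path, the running mean of Uniform(0,1) draws converges to \(1/2\). The sketch below (NumPy assumed; the tolerance, burn-in index, and path counts are illustrative) checks, path by path, whether the running mean stays within \(\epsilon\) of \(\mu\) for every \(n\) past a burn-in:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, eps = 0.5, 0.05          # limit value and tolerance (illustrative)
reps, N, burn_in = 500, 5_000, 1_000

# Strong law of large numbers: along almost every sample path, the
# running mean of Uniform(0,1) draws converges to mu.  We approximate
# Pr(lim X_n = mu) = 1 by the fraction of simulated paths whose
# running mean stays within eps of mu for all n past the burn-in.
draws = rng.random((reps, N))
running_means = np.cumsum(draws, axis=1) / np.arange(1, N + 1)
settled = np.all(np.abs(running_means[:, burn_in:] - mu) < eps, axis=1)
print(f"fraction of paths within eps after n={burn_in}: {settled.mean():.3f}")
```

Contrast this with convergence in probability, which only constrains the marginal distribution at each \(n\); here the condition must hold simultaneously across all large \(n\) on each path.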

Applications