
When does $L^2(P)$ convergence imply almost sure convergence?

I am reading Schilling's Brownian Motion, and it contains the sentence:

"Since $W(t)$ is an $L^2(P)$-convergent, hence stochastically convergent, series of independent random variables, we know from classical probability theory that $\lim_{N\to \infty} W_N (t)=W(t)$ almost surely."

However, I don't recall any such theorem from probability theory. I would appreciate it if anyone could point out a reference or the theorem.

  • What is the exact relation between $W_n$ and $W$? Is $W_n = \sum \limits_{i = 1}^n X_i$ for some independent random variables $X_i$ and $W_n \to W$ in $L^2$? – Dominik Dec 05 '16 at 13:55
  • It is a theorem due to Paul Lévy that a series of independent random variables converges almost surely if and only if it converges in probability (of course, only one implication is (possibly surprising and anyway) interesting). Thus, if such a series converges in $L^2$ then it converges in probability (always true), then it converges almost surely (Lévy). – Did Dec 05 '16 at 13:56
  • @Did: Could you please give a reference to the proof of this theorem of Paul Lévy? Is this still true if, instead of a series, there is a set (possibly uncountable) of independent random variables? Thank you. – Hans Jun 03 '18 at 04:16
  • @Hans https://math.stackexchange.com/a/226295/6179 – Did Jun 03 '18 at 08:05
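For completeness, the first (always true) step in the chain sketched in the comments above, that $L^2(P)$ convergence implies convergence in probability, follows from Chebyshev's inequality. A minimal sketch, using the notation $W_N(t) \to W(t)$ from the quoted passage:

```latex
% Chebyshev's inequality gives, for every $\varepsilon > 0$,
\[
  P\bigl(|W_N(t) - W(t)| > \varepsilon\bigr)
  \;\le\; \frac{E\bigl[\,|W_N(t) - W(t)|^2\,\bigr]}{\varepsilon^2}
  \;\xrightarrow[N \to \infty]{}\; 0,
\]
% so $L^2(P)$ convergence implies convergence in probability.
% Since $W_N(t)$ is a sequence of partial sums of independent random
% variables, Lévy's theorem (cited in the comment above) upgrades
% convergence in probability to almost sure convergence.
```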

0 Answers