1

Let $(X_i)_{i=1}^{\infty}$ be i.i.d. uniform $[0, 1]$ random variables.

How can I prove that $Y_n = X_1 X_2 \cdots X_n$ converges almost surely, and to what limit?

I have no idea how to start this type of question.


What I tried:

I tried to convert the product into a sum; for example, I know that $$ \log(Y_n) = \sum_{i=1}^{n} \log(X_i)$$

and I know that: $$Y_n = e^{\log(X_1 X_2 \cdots X_n)} = e^{\sum_{i=1}^{n} \log(X_i)}$$

But as you can see, I get a sum of random variables rather than an average, so I can't apply the law of large numbers in this case.

The same happens when trying to prove that $\frac{1}{n^2} (X_1 + \cdots + X_n)$ converges.
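For intuition only (this is not a proof), here is a small Python simulation I used to sanity-check what the product and the normalized log-sum do; the sample size and variable names are my own choices and are not part of the question.

```python
# Illustration only: simulate Y_n = X_1 * ... * X_n and (1/n) * sum log X_i.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.uniform(0.0, 1.0, size=n)

y_n = np.prod(x)              # Y_n for n = 10_000; numerically indistinguishable from 0
log_avg = np.mean(np.log(x))  # (1/n) * sum of log X_i

print(y_n)                    # ~ 0.0
print(log_avg)                # close to E[log X_1] = -1
```

Numerically the product collapses to $0$ very quickly, while the *average* of the logs stabilizes near $-1$, which is what suggests dividing by $n$ before invoking the law of large numbers.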

Sangchul Lee
  • 167,468
zoro
  • 161

3 Answers

1

Rewrite $$\prod_nX_n=\prod_n(1-(1-X_n)).$$ By the law of large numbers, $$\sum^N_{j=1}(1-X_j)=N\Big(\frac1N\sum^N_{j=1}(1-X_j)\Big)\sim \frac{N}{2}\quad\text{a.s.}$$ Hence $\sum_n(1-X_n)=\infty$ a.s., and it follows that $\prod_nX_n=0$ a.s. (see for example this posting and its solution).
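A quick numerical illustration (not a proof) of the two facts this argument rests on: the partial sums of $1-X_j$ grow like $N/2$, and the product is dominated by $e^{-\sum_j(1-X_j)}$ (the bound spelled out in the comments below). The seed and sample size here are arbitrary choices of mine.

```python
# Illustration only: partial sums of (1 - X_j) and the bound prod X_j <= exp(-sum (1 - X_j)).
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(size=1_000)
s = np.cumsum(1.0 - x)                # partial sums of (1 - X_j)
p = np.cumprod(x)                     # partial products of X_j
print(s[-1], len(x) / 2)              # roughly N/2, as the LLN predicts
print(bool(np.all(p <= np.exp(-s))))  # the termwise bound holds: True
```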

Alternatively, consider the power series (in $z$) $$S(z,\omega)=\sum^\infty_{n=1}X_1(\omega)\cdot\ldots\cdot X_n(\omega)\, z^n.$$ By the law of large numbers, $$\sqrt[n]{X_1\cdot\ldots\cdot X_n}=\exp\Big(\frac1n\sum^n_{j=1}\log X_j\Big)\xrightarrow{n\rightarrow\infty}\exp\Big(\int^1_0\log x\, dx\Big)=e^{-1}\qquad\text{a.s.}$$ Hence, for almost all $\omega\in\Omega$, the series $S$ converges for all $z$ with $|z|<e$. In particular, for almost all $\omega$, $S(1,\omega)$ converges; since the terms of a convergent series tend to zero, $\prod^n_{j=1}X_j\xrightarrow{n\rightarrow\infty}0$ a.s.
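Here is a small sketch (illustration only, with my own choice of sample sizes) of the geometric-mean limit used in this second argument; computing it via logs avoids the underflow of the raw product.

```python
# Illustration only: (X_1 * ... * X_n)^(1/n) = exp((1/n) * sum log X_i) -> exp(-1) ~ 0.3679.
import numpy as np

rng = np.random.default_rng(2)
for n in (10, 100, 10_000, 1_000_000):
    x = rng.uniform(size=n)
    geo_mean = np.exp(np.mean(np.log(x)))  # geometric mean, computed through logs
    print(n, round(geo_mean, 4))
```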

Mittens
  • 39,145
  • Sorry but I lost you after this part: It follows that... how you moved from multiplication to summation... – zoro Feb 21 '23 at 19:39
  • @zoro: $0\leq \prod^n_{j=1}(1-a_j)\leq e^{-\sum^n_{j=1}a_j} \xrightarrow{n\rightarrow\infty}0$ if $\lim_n\sum^n_{j=1}a_j=\infty$ and $a_n\leq 1$. – Mittens Feb 21 '23 at 20:04
0

Notice that $\log X_i\leq 0$. Therefore, if you know that $$\mathbb P\big[X_i <1-\epsilon\text{ infinitely often}\big]=1$$ for some $\epsilon>0$ (this holds by the second Borel–Cantelli lemma, since the events $\{X_i<1-\epsilon\}$ are independent and each has probability $\epsilon>0$), then you'll know that $\log(X_i)<\log(1-\epsilon)<-\delta$ infinitely often for some $\delta>0$. Hence $\sum_{i=1}^\infty \log(X_i)=-\infty$ almost surely, and so $Y_n=e^{\sum_{i=1}^n\log X_i}\to 0$ almost surely.

If you want to study the convergence of $(X_1+\dots+X_n)/n^2$ simply note that $$\frac{1}{n^2}(X_1+\dots+X_n)=\frac{1}{n}\cdot\frac{X_1+\dots+X_n}{n}.$$ You should be able to see what this limit equals.
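A quick numerical check of this hint (illustration only; the sample sizes are mine): the factor $(X_1+\dots+X_n)/n$ stays near $1/2$, so the whole expression shrinks roughly like $1/(2n)$.

```python
# Illustration only: (X_1 + ... + X_n) / n^2 = (1/n) * (sample mean), and the mean stays near 1/2.
import numpy as np

rng = np.random.default_rng(3)
for n in (10, 1_000, 100_000):
    x = rng.uniform(size=n)
    print(n, x.sum() / n**2)  # shrinks toward 0 roughly like 1/(2n)
```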

Small Deviation
  • 2,296
0

You can observe that the sequence $(Y_n)$ takes values in $[0,1]$ and is decreasing.
Therefore it converges almost surely to some $Y \in [0,1]$.

Finally, by independence $E[Y_n]=E[X_1]^n=2^{-n}$, so using dominated convergence you get $E[Y]=\lim_n E[Y_n]=0$, which implies $Y=0$ a.s.
(You could also conclude this from convergence in $L^1$ and uniqueness of the limit.)
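A hedged sketch of the expectation step (illustration only; the choice $n=10$ and the number of trials are mine): a Monte Carlo estimate of $E[Y_n]$ agrees with $2^{-n}$.

```python
# Illustration only: estimate E[Y_n] for n = 10 and compare with (1/2)^n.
import numpy as np

rng = np.random.default_rng(4)
n, trials = 10, 200_000
y = rng.uniform(size=(trials, n)).prod(axis=1)  # one sample of Y_n per row
print(y.mean(), 0.5 ** n)                       # both close to 2**-10 ~ 0.000977
```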

You can build a simple inequality for the second exercise as well. Can you see it?

Kolmo
  • 1,479