
I have come across the following exercise during a course on probability, and I'm nearly certain it has to be proved using the Kolmogorov 0-1 Law, but the theorem is only stated in the lecture notes, and no examples on how to apply it were given.


Let $A_1, A_2, \dots$ be any independent sequence of events and let $S_x := \{\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^{n}1_{A_{i}}\leq x\}$. Prove that for each $x\in\mathbb{R}$ we have $\mathbb{P}(S_x)\in\{0,1\}$.


I've found that the claim is trivial for any $x\notin[0,1)$, but that's obviously not the point of the exercise.

So far it does not seem possible to construct a sequence $B_i$ of independent events such that $S_x$ lies in the tail field, since there is seemingly no way to prove independence of the $B_i$ without more knowledge about $\mathbb{P}:\Omega\rightarrow [0,1]$.

A friend suggested that it might be possible to build a probability triple with a tail field as its sigma-algebra. Then if we can show that $f_n(\omega)=\frac{1}{n}\sum_{i=1}^{n}1_{A_{i}}$ is a measurable function, $\lim_{n\rightarrow\infty}f_n^{-1}([-\infty,x])$ would have to lie in the tail field, hence also have probability $0$ or $1$.

It seems that if this last approach is to work, we must first show that $1_{A_{i}}$ is measurable in this new probability triple, meaning we have to show that both $A_i$ and $A_i^C$ lie in the tail field. But since I have no prior experience with tail fields, it is not at all obvious how one could construct such a thing (if it is even possible). Any help would be appreciated.

One last note: for any $m\geq 1$ we have the equality of events $\{\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^{n}1_{A_{i}}\leq x\} = \{\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=m}^{n}1_{A_{i}}\leq x\}$.
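For intuition only (not part of the proof), here is a quick numerical sanity check of that note, assuming hypothetical independent events with $P(A_i)=0.5$: the average over all $n$ terms and the average that discards the first $m-1$ terms differ by at most $(m-1)/n$, which vanishes as $n\rightarrow\infty$.

```python
import random

random.seed(0)

# Draw independent Bernoulli(0.5) indicators 1_{A_i} (a hypothetical choice of
# events) and compare the running average over all terms with the average
# that discards the first m-1 terms.
n = 100_000
indicators = [random.random() < 0.5 for _ in range(n)]

m = 10
avg_all = sum(indicators) / n              # (1/n) * sum over i = 1..n
avg_tail = sum(indicators[m - 1:]) / n     # (1/n) * sum over i = m..n

# The difference is (1/n) * sum of the first m-1 indicators, hence at most (m-1)/n.
print(abs(avg_all - avg_tail))
```

Since the discarded terms contribute at most $m-1$ to the sum, the two averages agree in the limit, which is why the first $m-1$ events are irrelevant.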


I found a similar question here, but there seems to be a big jump in the logic of the first answer.

  • Did you mean $\sum_{i = m}^n A_i \leq x$? If so, then the event is a tail event as it belongs to $\sigma(A_k; k \geq m)$ for any $m,$ hence the result. – William M. Feb 23 '19 at 01:02
  • Kolmogorov's 0-1 law tells us that, if the $X_n$ are independent random variables and $\mathcal{T}=\bigcap_{n=1}^{\infty} \sigma(X_k:k\geq n)$ is the corresponding tail $\sigma$-algebra, then every $\mathcal{T}$-measurable event is $\mathbb{P}$-trivial, i.e., $\mathbb{P}(A)\in\{0,1\}$ for any $A\in\mathcal{T}$. Now with $X_n=\mathbf{1}_{A_n}$, it is easy to check that $S_x \in \mathcal{T}$, i.e., $S_x \in \sigma(A_k:k\geq n)$ for any $n$, and so $\mathbb{P}(S_x) \in\{0,1\}$. – Sangchul Lee Feb 23 '19 at 01:03
  • @WillM. No, it's exactly like above. – Jelle Dijkstra Feb 23 '19 at 01:07
  • What you wrote doesn't really make sense; what is $i$ then? Also, you are overthinking an easy peasy question. Simply note that whatever the value of $m$ may be, $m/n \to 0$ as $n \to \infty,$ hence the first $m$ variables are irrelevant; this is why the event is a tail event. – William M. Feb 23 '19 at 01:08
  • @SangchulLee Yes, but I don't see what the independent $X_n$ would be and how to show they are independent. – Jelle Dijkstra Feb 23 '19 at 01:08
  • @WillM. Well, it's what my teacher wrote down and just saying 'it's a tail event' isn't helpful. – Jelle Dijkstra Feb 23 '19 at 01:10
  • Is independence of random variables never taught in class? Then perhaps it is better if you provide the precise version of the theorem that is covered in the class. – Sangchul Lee Feb 23 '19 at 01:17
  • @SangchulLee The theorem in my book simply states that for any sequence $A_n$ of independent events - not r.v.s. - we have $P(A)\in\{0,1\}$ for all $A$ in the tail field $\tau = \bigcap^{\infty}_{n=1} \sigma(A_k: k \geq n)$. – Jelle Dijkstra Feb 23 '19 at 01:21
  • Then the obvious idea is to show that $S_x \in \tau$, i.e., $S_x \in \sigma(A_k : k \geq n)$ for each $n$. But this is obvious since, for each $n$, both $$\limsup_{N\to\infty} \frac{1}{N}\sum_{k=n}^{N}\mathbf{1}_{A_k}, \qquad \liminf_{N\to\infty} \frac{1}{N}\sum_{k=n}^{N}\mathbf{1}_{A_k}$$ are limits of $\sigma(A_k : k \geq n)$-measurable functions, hence themselves $\sigma(A_k : k \geq n)$-measurable as well. – Sangchul Lee Feb 23 '19 at 01:38

1 Answer


First we notice that for any $m\geq 1$ the following events are equal:

\begin{align} \{\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^{n}1_{A_{i}}\leq x\} &= \{\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^{m-1}1_{A_{i}} + \lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=m}^{n}1_{A_{i}}\leq x\} \\ &= \{\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=m}^{n}1_{A_{i}}\leq x\}, \end{align}

where the second equality holds because $0\leq\frac{1}{n}\sum_{i=1}^{m-1}1_{A_{i}}\leq\frac{m-1}{n}\rightarrow 0$ as $n\rightarrow\infty$.

Now let $\mathcal{T}:=\bigcap_{n=1}^{\infty}\sigma(A_i:i\geq n)$ be the tail field. From the Kolmogorov 0-1 Law we know that if we can show $S_x\in\mathcal{T}$, then we must have $P(S_x)\in\{0,1\}$. To achieve this we will show that

\begin{equation} \limsup_{n\rightarrow\infty} \frac{1}{n}\sum_{i=m}^{n}1_{A_{i}} \quad \text{and} \quad \liminf_{n\rightarrow\infty} \frac{1}{n}\sum_{i=m}^{n}1_{A_{i}} \end{equation}

are both $\sigma(A_i:i\geq m)$-measurable for every $m\geq 1$. Fix $m\geq 1$ and notice that each $1_{A_i}$ with $i\geq m$ is $\sigma(A_i:i\geq m)$-measurable. This in turn means that the simple functions $\frac{1}{n}\sum_{i=m}^{n}1_{A_{i}}$ are also $\sigma(A_i:i\geq m)$-measurable. Finally, the $\limsup$ and $\liminf$ of a sequence of measurable functions are again measurable. Combining this with what we proved at the start, we conclude that if $\frac{1}{n}\sum_{i=1}^{n}1_{A_{i}}$ converges pointwise to some function, then for any $m\geq 1$

\begin{equation} \lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^{n}1_{A_{i}} = \limsup_{n\rightarrow\infty}\frac{1}{n}\sum_{i=m}^{n}1_{A_{i}}. \end{equation}

Since the right-hand side is $\sigma(A_i:i\geq m)$-measurable for every $m\geq 1$, the left-hand side is measurable with respect to $\bigcap_{m=1}^{\infty}\sigma(A_i:i\geq m)=\mathcal{T}$. Hence $S_x\in\mathcal{T}$, and the Kolmogorov 0-1 Law gives $P(S_x)\in\{0,1\}$. This proves the claim.
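As an illustration of the dichotomy (not part of the proof, and assuming the special case where the $A_i$ are hypothetically taken i.i.d. with $P(A_i)=p$): by the strong law of large numbers the averages converge to $p$ almost surely, so $P(S_x)=0$ for $x<p$ and $P(S_x)=1$ for $x>p$. A minimal simulation sketch:

```python
import random

random.seed(1)

# Hypothetical special case: i.i.d. events with P(A_i) = p = 0.3.
# The SLLN forces the limiting average to equal p almost surely, so the
# fraction of sample paths with average <= x should be near 0 for x < p
# and near 1 for x > p, matching the 0-1 dichotomy.
p, n, trials = 0.3, 20_000, 200

def average_of_indicators():
    return sum(random.random() < p for _ in range(n)) / n

averages = [average_of_indicators() for _ in range(trials)]
frac_below_02 = sum(a <= 0.2 for a in averages) / trials  # estimates P(S_{0.2})
frac_below_04 = sum(a <= 0.4 for a in averages) / trials  # estimates P(S_{0.4})
print(frac_below_02, frac_below_04)
```

With $n$ this large the empirical averages concentrate tightly around $p=0.3$, so the two printed fractions sit at the two extremes of the dichotomy.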