
Let $\{A_n\}$ be a sequence of independent events. Show that, for every $x$, the event $$A=\left\{\omega:\frac1{n}\sum_{k=1}^n I_{A_k}(\omega) \rightarrow x\right\}$$ has probability either $0$ or $1$.

I think that if I can show this is a tail event, then it has probability $0$ or $1$ by the Kolmogorov $0$-$1$ law. Any help?

2 Answers


The event under consideration is independent of any finite number of the $A_n$. The limit of the averages is governed by the tail of the sequence: the event $A$ is determined by the whole sequence $\{A_1, A_2, \dots\}$ but not by any finite initial segment $\{A_1, A_2, \ldots, A_m\}$. Hence it belongs to the tail $\sigma$-algebra.


This is the basic idea, which you can formulate as follows. Let $$\sigma(A_m, A_{m+1}, \ldots)$$ be the $\sigma$-algebra generated by all events $A_k$ with $k\ge m$. The tail $\sigma$-algebra $\mathcal T$ is defined as $$\mathcal T:=\bigcap_{m=1}^{+\infty}\sigma(A_m, A_{m+1}, \ldots)$$ Now, for any $m\ge 1$, the event $A:=\left\{\omega:\frac1n\sum_{k=1}^n I_{A_k}(\omega) \rightarrow x\right\}$ depends only on the events $A_k$ with $k\ge m$ (see the explanation below): removing the first $m$ terms leaves the limiting behavior unchanged, so $A$ is the event that a limit of $\sigma(A_m, A_{m+1},\ldots)$-measurable functions equals $x$, and hence $A\in\sigma(A_m, A_{m+1},\ldots)$ for every $m\ge 1$. Therefore $A$ belongs to the intersection of these $\sigma$-algebras, which is by definition the tail $\sigma$-algebra $\mathcal T$.

By the Kolmogorov $0$-$1$ law, every event in $\mathcal T$ has probability either $0$ or $1$ (although it is usually hard to say which).
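For intuition only (this is not part of the proof), here is a minimal simulation sketch under the extra assumption that the $A_k$ are i.i.d. with $P(A_k)=p$, an assumption the question does not make: by the strong law of large numbers the averages converge to $p$ almost surely, so $P(A)=1$ when $x=p$ and $P(A)=0$ for every other $x$.

```python
import numpy as np

# Illustration only: assume the A_k are i.i.d. with P(A_k) = p (not assumed
# in the question).  By the SLLN the running average of the indicators
# converges to p a.s., so A = {average -> x} has probability 1 if x = p,
# and probability 0 otherwise.
rng = np.random.default_rng(0)
p, n = 0.3, 100_000

indicators = rng.random(n) < p                        # I_{A_1}, ..., I_{A_n}
running_avg = np.cumsum(indicators) / np.arange(1, n + 1)

print(running_avg[999], running_avg[-1])              # both should be close to p
```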


To see why $A$ depends only on the events $A_k$ with $k\ge m$, fix $m<n$ and write: $$\frac1n \sum_{k=1}^nI_{A_k}(\omega)=\frac1n \sum_{k=1}^mI_{A_k}(\omega)+\frac1n \sum_{k=m+1}^nI_{A_k}(\omega)$$ Since $0\le\frac1n \sum_{k=1}^{m}I_{A_k}(\omega)\le\frac{m}{n}\to 0$ as $n\to+\infty$, the left-hand side converges if and only if the second term converges, in which case \begin{align}\lim_{n\to+\infty}\frac1n \sum_{k=1}^nI_{A_k}(\omega)&=\lim_{n\to+\infty}\frac1n \sum_{k=m+1}^nI_{A_k}(\omega)=\lim_{n\to +\infty}\frac1n\sum_{k=1}^{n}I_{A_{m+k}}(\omega)\end{align} This shows that the limit (and whether it exists and equals $x$) does not depend on the first $m$ events, and since $m$ was arbitrary (but fixed), the assertion follows.
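A quick numerical illustration of this identity, under the same illustrative i.i.d. assumption as in the sketch above: the average of all $n$ indicators and the average obtained after discarding the first $m$ of them differ by at most $m/n$, so they have the same limit.

```python
import numpy as np

# Sketch: discarding a fixed prefix of length m does not change the limit of
# the averages (illustrative assumption: i.i.d. indicators with P(A_k) = p).
rng = np.random.default_rng(1)
p, n, m = 0.3, 200_000, 500

indicators = rng.random(n) < p
avg_all = indicators.sum() / n              # (1/n) * sum_{k=1}^{n} I_{A_k}
avg_tail = indicators[m:].sum() / n         # (1/n) * sum_{k=m+1}^{n} I_{A_k}

print(abs(avg_all - avg_tail) <= m / n)     # True: the averages differ by at most m/n
```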

Jimmy R.
  • "To measure it, you need to know infinitely many of the $A_n$'s" I fail to get this part of the argument. Care to explain? – Did Jan 13 '16 at 12:00
  • "Now,the event $A:=$ ... lies certainly in $σ(A_n, A_{n+1},\ldots)$ for every $n\ge 1$, because it depends on infinitely many $A_n$'s." Now this argument seems squarely wrong: lots of events depending on infinitely many $A_n$s are not in the tail sigma-algebra. – Did Jan 13 '16 at 12:02
  • @Did I think that I nailed it! What do you think? – Jimmy R. Jan 13 '16 at 12:31
  • Closer now, but it remains to explain more precisely why "the event $A:=\left{\omega:\frac1n\sum_{k=1}^n I_{A_k}(\omega) \rightarrow x\right}$, for any $n>0$ depends only on the events $A_k$ with $k\ge n$". – Did Jan 13 '16 at 13:27
  • @Did This is it. – Jimmy R. Jan 13 '16 at 13:58
  • I don't see how you can directly say that $A$ belongs to $\sigma(A_m,A_{m+1},\dots)$, this seems like a big jump. Also shouldn't it be $m=1$ in the definition of the tail-field? – Jelle Dijkstra Feb 23 '19 at 00:46

$\{I_{A_n}\}$ is a sequence of independent random variables. Consequently, you can apply the following argument, which is true for every sequence of independent r.v.'s $\{X_n\}$.

Let $S_n = \sum_{k=1}^n X_k$ and let $0 < b_n \uparrow +\infty$. Then $\limsup \frac{S_n}{b_n}$ and $\liminf \frac{S_n}{b_n}$ are tail functions; see Definition 1 and Proposition 2 below. Thus:

$$ \{ \omega \in \Omega: \frac{S_n(\omega)}{b_n} \rightarrow x \} = \left(\liminf \frac{S_n}{b_n}\right)^{-1}\{x\} \bigcap \left(\limsup \frac{S_n}{b_n}\right)^{-1}\{x\} $$

This implies that $\{ \omega \in \Omega: \frac{S_n(\omega)}{b_n} \rightarrow x \}$ is a tail event.
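As a loose numerical illustration of this decomposition (again under the illustrative assumption that the indicators are i.i.d. with $P(A_k)=p$, and with $b_n=n$): approximating $\limsup S_n/b_n$ and $\liminf S_n/b_n$ by the maximum and minimum of $S_n/n$ over a late block of indices, both are close to $p$, so the event $\{S_n/b_n \to x\}$ occurs for $x=p$ and fails for every other $x$.

```python
import numpy as np

# Sketch: approximate limsup and liminf of S_n / b_n with b_n = n and
# X_k = I_{A_k}, assuming (for illustration) i.i.d. indicators with
# P(A_k) = p.  Both tail functions concentrate near p.
rng = np.random.default_rng(2)
p, n = 0.3, 500_000

indicators = rng.random(n) < p
ratios = np.cumsum(indicators) / np.arange(1, n + 1)  # S_n / b_n

tail = ratios[n // 2:]                                 # a late block of indices
print(tail.max(), tail.min())                          # both approximately p
```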

Definition 1. Tail functions are random variables that are measurable with respect to the tail $\sigma$-algebra.

Proposition 2. $\limsup \frac{S_n}{b_n}$ and $\liminf \frac{S_n}{b_n}$ are tail functions.

Proof.

Only the proof for the limit superior is presented, since the proof for the limit inferior works similarly.

In order to prove the assertion, it is enough to prove that $\left[\limsup \frac{S_n}{b_n} > a\right]$ is a tail event for every $a \in \mathbb{R}$. This follows from the following analysis argument, which uses the assumption that $b_n \uparrow +\infty$:

Lemma. Let $x_n$ be an arbitrary sequence, $s_n = \sum_{k=1}^n x_k$ and $0 < b_n \uparrow +\infty$. For each $m \in \mathbb{N}$ and $a \in \mathbb{R}$:

$$ \limsup \frac{s_n}{b_n} > a \iff \limsup \frac{s_{n+m}-s_m}{b_{n+m}} > a $$

Proof.

Use that $\frac{s_m}{b_{n+m}} \rightarrow 0$ as $n \rightarrow \infty$, so that $\limsup_n \frac{s_{n+m}-s_m}{b_{n+m}} = \limsup_n \frac{s_{n+m}}{b_{n+m}} = \limsup_n \frac{s_n}{b_n}$. If you need more detail on the proof, then tell me; I don't want to fill the answer with analysis details. $\tag*{$\blacksquare$}$
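A small numerical sketch of the lemma (purely illustrative), with $b_n = n$, a fixed $m$, and an arbitrary bounded sequence $x_n$: over a late block of indices, $(s_{n+m}-s_m)/b_{n+m}$ stays within $s_m/b_{n+m}$ of $s_{n+m}/b_{n+m}$, whose limit superior over $n$ is just an index-shifted version of $\limsup s_n/b_n$, so the two limit superiors coincide.

```python
import numpy as np

# Illustrative check of the lemma with b_n = n and a fixed prefix length m:
# (s_{n+m} - s_m) / b_{n+m} differs from s_{n+m} / b_{n+m} by s_m / b_{n+m},
# which vanishes as n grows, so the limsups agree.
rng = np.random.default_rng(3)
n_max, m = 100_000, 50

x = rng.uniform(-1.0, 1.0, size=n_max + m)           # an arbitrary bounded sequence
s = np.cumsum(x)                                     # partial sums s_1, ..., s_{n_max + m}
b = np.arange(1, n_max + m + 1, dtype=float)         # b_n = n

n = np.arange(1, n_max + 1)
unshifted = s[n + m - 1] / b[n + m - 1]              # s_{n+m} / b_{n+m}
shifted = (s[n + m - 1] - s[m - 1]) / b[n + m - 1]   # (s_{n+m} - s_m) / b_{n+m}

late = slice(n_max // 2, None)                       # proxy for the tail behaviour
print(unshifted[late].max(), shifted[late].max())    # nearly equal values
```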

Thus, we have

$$ \left[ \limsup \frac{S_n}{b_n} > a \right] = \left[ \limsup \frac{S_{n+m}- S_m}{b_{n+m}} > a \right] \in \sigma(X_m, X_{m+1}, \ldots) \ \forall \ m \in \mathbb{N} $$

since $\frac{S_{n+m}- S_m}{b_{n+m}}$ is measurable with respect to $\sigma(X_m, X_{m+1}, \ldots)$ for every $n \in \mathbb{N}$, and thus so is its limit superior.

Finally, the event is in the tail $\sigma$-algebra, because the latter is the intersection of the $\sigma(X_m, X_{m+1}, \ldots)$ over all $m$. $\tag*{$\blacksquare$}$

andreshp