
Let $X_{i}\sim \operatorname{Bern}\!\left(\frac{1}{i}\right)$ be independent and set

$S_{n}=\sum_{i=1}^{n}X_{i}$.

Then $\frac{S_{n}}{\ln(n)}\xrightarrow{\text{a.s.}} 1$.

I think I have to use somewhere that $\frac{H_{n}}{\ln(n)}\to 1$, where $H_{n}=\sum_{i=1}^{n}\frac{1}{i}$ is the $n$th harmonic number, since the pairing of $H_{n}$ with $\ln(n)$ just screams Euler–Mascheroni constant to me.

So I try to bring in expectation somewhere to make that $H_{n}$ appear.

$$P\left(\left|\frac{S_{n}}{\ln(n)}-1\right|\geq \epsilon\right)\leq \frac{E\left[\left|\frac{S_{n}}{\ln(n)}-1\right|\right]}{\epsilon}$$ by Markov's inequality.

This gives me that $P\left(\left|\frac{S_{n}}{\ln(n)}-1\right|\geq \epsilon\right)\to 0$ as $n\to\infty$.
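
One way to see that the right-hand side really does tend to $0$ (a sketch of this step, using the triangle inequality and $E|S_{n}-H_{n}|\leq\sqrt{\operatorname{Var}(S_{n})}$ from Cauchy–Schwarz):

$$E\left[\left|\frac{S_{n}}{\ln(n)}-1\right|\right]\leq \frac{E\left|S_{n}-H_{n}\right|}{\ln(n)}+\frac{H_{n}-\ln(n)}{\ln(n)}\leq \frac{\sqrt{\operatorname{Var}(S_{n})}}{\ln(n)}+\frac{H_{n}-\ln(n)}{\ln(n)},$$

and since $\operatorname{Var}(S_{n})=\sum_{i=1}^{n}\frac{1}{i}\left(1-\frac{1}{i}\right)\leq H_{n}\sim\ln(n)$ while $H_{n}-\ln(n)\to\gamma$, both terms vanish as $n\to\infty$.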

But this does not give me almost sure convergence.

Also, I cannot apply the Strong Law of Large Numbers, as the random variables are not i.i.d. Can anyone tell me how I should proceed? I am lost for ideas.

  • Do you know martingale theory? – nejimban Nov 04 '21 at 11:13
  • No, man. I am in an introduction to probability course. What I mean to say is that this is my first time doing probability. –  Nov 04 '21 at 11:14
  • Try to prove that $\sum P\left(\left|\frac{S_{n}}{\ln(n)}-1\right|\geq\epsilon\right)<\infty$. That would be enough, by the Borel–Cantelli lemma. – Kavi Rama Murthy Nov 04 '21 at 11:32
  • @KaviRamaMurthy Yeah, I tried that also. But how do I do that? $P\left(\left|\frac{S_{n}}{\ln(n)}-1\right|>\epsilon\right)=P\left(S_{n}>\ln(n)+\epsilon\ln(n)\right)+P\left(S_{n}<\ln(n)-\epsilon\ln(n)\right)$. Now how do I proceed? I was stuck at this part, as I do not understand how to find this probability, let alone find a way to upper bound it by a convergent series. –  Nov 04 '21 at 11:42
  • You can try with Chernoff bound and other concentration inequalities. In particular you can compute $$\ln\mathbb E\Bigl[\mathrm e^{\theta S_n}\Bigr]=\sum_{i=1}^n\ln\Bigl(1+\frac1i\bigl(\mathrm e^\theta-1\bigr)\Bigr)\sim\bigl(\mathrm e^\theta-1\bigr)\ln n.$$ – nejimban Nov 04 '21 at 11:51
  • @nejimban I managed to use what you said to prove convergence of $\sum_{n=1}^{\infty}P(S_{n}>\ln(n)(1+\epsilon))$. What should I use for the sum $\sum_{n=1}^{\infty}P(S_{n}<\ln(n)(1-\epsilon))$? –  Nov 04 '21 at 12:45
  • @Hundred-eyes Doesn't it work in this direction as well? As in this paragraph. – nejimban Nov 04 '21 at 13:15
  • @nejimban It holds, but I could not make any use of it, mainly because when we use it we get $e^{\theta(1-\epsilon)\ln(n)}$ in the numerator. –  Nov 04 '21 at 14:37
  • (I haven't checked details) It can help centering the variables (replace $X_i$ by $X_i'=X_i-\mathbb E[X_i]$). At least [Bernstein's inequality](https://en.wikipedia.org/wiki/Bernstein_inequalities_(probability_theory)) seems to work (use it with $t:=H_n$, $M:=1$, once for the family $(X_i')_{i\ge1}$ and once for $(-X_i')_{i\ge1}$ to get the two bounds). You should have bounds approximately equal to $$\exp\left(-\frac{\frac12t^2}{\frac13Mt}\right)\approx\exp\left(-\frac32\ln n\right).$$ Bernstein's inequality is proved from the Chernoff bound. I can try to write a full answer later. – nejimban Nov 04 '21 at 15:37
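
Fleshing out the moment-generating-function computation suggested in the comments (only a sketch of that one step): since the $X_i$ are independent Bernoulli$\left(\frac{1}{i}\right)$,

$$\mathbb E\bigl[\mathrm e^{\theta X_i}\bigr]=1-\frac1i+\frac1i\,\mathrm e^{\theta}=1+\frac1i\bigl(\mathrm e^{\theta}-1\bigr),\qquad \ln\mathbb E\bigl[\mathrm e^{\theta S_n}\bigr]=\sum_{i=1}^{n}\ln\Bigl(1+\frac1i\bigl(\mathrm e^{\theta}-1\bigr)\Bigr),$$

and for fixed $\theta$ one has $\ln\bigl(1+\frac1i(\mathrm e^{\theta}-1)\bigr)=\frac1i(\mathrm e^{\theta}-1)+O\bigl(\frac{1}{i^2}\bigr)$, so the sum equals $(\mathrm e^{\theta}-1)H_n+O(1)\sim(\mathrm e^{\theta}-1)\ln n$.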

1 Answer


I think you can use the following result to answer this:

Suppose $\{X_n\}_{n\in \mathbb N}$ is a sequence of independent random variables with $\operatorname{Var}(X_n)<\infty$ for every $n\in \mathbb N$. If $S_n=\sum\limits_{i=1}^n X_i$, then

$$\sum_{n=1}^\infty \frac{\operatorname{Var}(X_n)}{b_n^2}<\infty\,,\, b_n \uparrow \infty \implies \frac{S_n-E(S_n)}{b_n}\stackrel{\text{a.s.}}\longrightarrow 0$$

The above can be shown by combining Kronecker's lemma and Kolmogorov's convergence criterion. See, e.g., A Probability Path by Sidney Resnick.
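
For reference, the combination goes roughly like this (a sketch): by Kolmogorov's convergence criterion, independence and $\sum_{n}\frac{\operatorname{Var}(X_n)}{b_n^2}<\infty$ imply that the centered random series

$$\sum_{n=1}^{\infty}\frac{X_n-E(X_n)}{b_n}\quad\text{converges a.s.},$$

and Kronecker's lemma, applied pathwise using $b_n\uparrow\infty$, then turns this into $\frac{1}{b_n}\sum_{i=1}^{n}\bigl(X_i-E(X_i)\bigr)=\frac{S_n-E(S_n)}{b_n}\stackrel{\text{a.s.}}\longrightarrow 0$.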

If you take $b_n= \ln n$ for $n\ge 2$ (and any $0<b_1\le \ln 2$, which changes nothing since $\operatorname{Var}(X_1)=0$), then

$$\sum_{n=1}^\infty \frac{\operatorname{Var}(X_n)}{b_n^2}=\sum_{n=2}^\infty \frac{n-1}{n^2 (\ln n)^2}<\sum_{n=2}^\infty \frac{1}{n (\ln n)^2}<\infty\,,$$

where the convergence of the last series is discussed here.
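
Concretely, the integral test gives the bound, since $x\mapsto\frac{1}{x(\ln x)^2}$ is decreasing on $[2,\infty)$:

$$\sum_{n=2}^\infty \frac{1}{n (\ln n)^2}\leq \frac{1}{2(\ln 2)^2}+\int_{2}^{\infty}\frac{\mathrm dx}{x(\ln x)^2}=\frac{1}{2(\ln 2)^2}+\frac{1}{\ln 2}<\infty.$$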

Therefore,

$$\frac{S_n-E(S_n)}{b_n}=\frac{S_n}{\ln n}-\frac{H_n}{\ln n} \stackrel{\text{a.s.}}\longrightarrow 0$$

and $$\frac{H_n}{\ln n}\to 1$$

together imply

$$\frac{S_n}{\ln n}\stackrel{\text{a.s.}}\longrightarrow 1$$

– StubbornAtom
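
A minimal simulation sketch of the setup above (independent $X_i\sim\operatorname{Bern}(1/i)$, $S_n=\sum_{i\le n}X_i$), just to eyeball $S_n/\ln n$ for large $n$; the variable names and the choice of $N$ are ad hoc:

```python
import numpy as np

# Simulate independent X_i ~ Bernoulli(1/i) for i = 1..N and form S_n = X_1 + ... + X_n.
rng = np.random.default_rng(0)

N = 10**6                      # largest n simulated (ad hoc choice)
i = np.arange(1, N + 1)        # indices 1, 2, ..., N
X = rng.random(N) < 1.0 / i    # X_i = 1 with probability 1/i, else 0
S = np.cumsum(X)               # S_n for every n <= N

# Print S_n / ln(n); by the result above these ratios should hover around 1,
# with fluctuations of order 1/sqrt(ln n).
for n in (10**3, 10**4, 10**5, 10**6):
    print(n, S[n - 1] / np.log(n))
```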