
We know that for i.i.d. random variables $X_i$, $i=1,2,\dots,n$, the following holds:

$\Pr\Big(\big|\frac{1}{n}\sum_{i=1}^n X_i -E[X]\big|<\epsilon\Big)\geq 1 - \frac{\sigma_X^2}{n \epsilon^2}$.
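(For reference, this is Chebyshev's inequality applied to the sample mean $\bar X_n=\frac{1}{n}\sum_{i=1}^n X_i$, using $\mathrm{Var}(\bar X_n)=\sigma_X^2/n$:

$\Pr\Big(\big|\bar X_n - E[X]\big|\geq\epsilon\Big)\leq \frac{\mathrm{Var}(\bar X_n)}{\epsilon^2}=\frac{\sigma_X^2}{n\epsilon^2},$

so the complementary event has probability at least $1-\frac{\sigma_X^2}{n\epsilon^2}$.)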

However, is it possible to obtain an upper bound on the above probability, i.e., to obtain

$\Pr\Big(\big|\frac{1}{n}\sum_{i=1}^n X_i -E[X]\big|<\epsilon\Big)\leq f(n,\epsilon),$

where $f(n,\epsilon)$ is a function of $n$ and $\epsilon$. If possible, could you please refer me to a reference? Thank you!

Nik

1 Answer


Yes, such a bound follows from the Kolmogorov–Rogozin theorem on the Lévy concentration function. See, e.g., Esseen [1], Ineq. (3.3), page 296.

[1] Esseen, Carl-Gustav. "On the concentration function of a sum of independent random variables." Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete 9, no. 4 (1968): 290-308.

https://link.springer.com/content/pdf/10.1007/BF00531753.pdf
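For context, the Lévy concentration function of a random variable $Y$ is $Q(Y;\lambda)=\sup_x \Pr(x\leq Y\leq x+\lambda)$. Writing $S_n=\sum_{i=1}^n X_i$, the event $\big\{\big|\frac{1}{n}S_n-E[X]\big|<\epsilon\big\}$ places $S_n$ in an interval of length $2n\epsilon$, so

$\Pr\Big(\big|\tfrac{1}{n}\sum_{i=1}^n X_i -E[X]\big|<\epsilon\Big)\leq Q(S_n;2n\epsilon).$

A common i.i.d. formulation of the Kolmogorov–Rogozin inequality (quoted from memory; see [1] for the precise statement and constant) is that for any $\lambda>0$ with $Q(X_1;\lambda)<1$ and any $L\geq\lambda$,

$Q(S_n;L)\leq \frac{C\,L}{\lambda\sqrt{n\,\big(1-Q(X_1;\lambda)\big)}},$

with $C$ an absolute constant. Taking $L=2n\epsilon$ (for $2n\epsilon\geq\lambda$) gives an upper bound of order $\sqrt{n}\,\epsilon$, which is informative only when $\epsilon$ is of order $n^{-1/2}$ or smaller, as noted in the comments below.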

Yuval Peres
  • Thank you! Does this mean that the above probability goes to 1 at a rate of at most $o(n^{-3/2})$? – Nik Nov 13 '22 at 15:49
  • No, these bounds are useful for very small $\epsilon$, that is, of order $C/n^{1/2}$. For fixed $\epsilon$, the probability you mention would go to 1 exponentially if the summands have exponential tails. – Yuval Peres Nov 18 '22 at 17:00
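A standard instance of the exponential rate mentioned in the last comment, under the added assumption of bounded summands $X_i\in[a,b]$, is Hoeffding's inequality:

$\Pr\Big(\big|\tfrac{1}{n}\sum_{i=1}^n X_i -E[X]\big|\geq\epsilon\Big)\leq 2\exp\Big(-\frac{2n\epsilon^2}{(b-a)^2}\Big),$

so for fixed $\epsilon>0$ the probability in the question tends to $1$ exponentially fast in $n$.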