I'm reading a script where it says that for $X\in L^{2}$ we have $$\mathbb{E}[X^2]=\int_0^{\infty}\mathbb{P}(X^2\geq s)\,\text{d}s<\infty,$$ and from this we can conclude with the integral test for convergence that $$\mathbb{P}(X^2\geq s)=o\left(\frac{1}{s\ln(s)}\right).$$ The formula with the expectation is clear to me, but I don't know how to arrive at the statement regarding $o\left((s\ln(s))^{-1}\right)$. More generally, I'm guessing it would then hold that if we have decreasing functions $f,g\geq 0$ with $\int_0^{\infty}f(s)\,\text{d}s<\infty$ and $\int_0^{\infty}g(s)\,\text{d}s=\infty$, then $\frac{f(x)}{g(x)}\to 0$ as $x\to\infty$. But I don't know for certain whether this is true or how I would prove it.
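As a quick numerical sanity check of the layer-cake formula above (my own sketch, assuming for illustration that $X$ is uniform on $[0,1]$, so $\mathbb{E}[X^2]=1/3$ and $\mathbb{P}(X^2\geq s)=1-\sqrt{s}$ on $[0,1]$ and $0$ beyond):

```python
# Layer-cake identity E[X^2] = ∫_0^∞ P(X^2 >= s) ds, checked numerically
# for the illustrative choice X ~ Uniform[0, 1]:
#   E[X^2] = 1/3  and  P(X^2 >= s) = 1 - sqrt(s) for 0 <= s <= 1 (0 beyond).
n = 1_000_000
ds = 1.0 / n
integral = sum((1.0 - (i * ds) ** 0.5) * ds for i in range(n))
print(integral)  # ≈ 1/3
```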
-
Hi, can you attach the script you are reading? Since the answer below provides a counterexample, it may be that the script imposes further conditions on $X$. Besides, it is good to give all the context you can. +1 – Sarvesh Ravichandran Iyer Nov 16 '20 at 15:07
-
I linked the relevant pages from the script. – LordOfNumbers Nov 16 '20 at 15:36
-
Thanks for that, The LordOfNumbers. – Sarvesh Ravichandran Iyer Nov 16 '20 at 15:45
1 Answer
I expect the statement regarding $o\left((s\ln(s))^{-1}\right)$ can fail because its counterpart for functions fails. For instance, for each $x\ge 0$ let $f(x)=(s(x)\ln(s(x)))^{-1}$, where $s(x)$ is the smallest number $s$ from the sequence $(\exp(k^2))_{k=1}^\infty$ such that $s(x)\ge x$. Then $f(x)\notin o\left((x\ln(x))^{-1}\right)$, because $f(x)\,x\ln(x)=1$ whenever $x=\exp(k^2)$, but $$\int_0^\infty f(x)\,dx=\int_0^{e} f(x)\,dx+\sum_{k=1}^\infty \int_{\exp(k^2)}^{\exp((k+1)^2)} f(x)\,dx$$ $$\le 1+\sum_{k=1}^\infty \int_{\exp(k^2)}^{\exp((k+1)^2)} \Bigl(\exp((k+1)^2)\,\ln\bigl(\exp((k+1)^2)\bigr)\Bigr)^{-1}\,dx$$ $$\le 1+\sum_{k=1}^\infty \exp((k+1)^2)\,\exp(-(k+1)^2)\,(k+1)^{-2}$$ $$=1+\sum_{k=1}^\infty (k+1)^{-2}=\frac{\pi^2}6.$$
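To make the counterexample concrete, here is a small numerical sketch (an editorial addition, not part of the original answer): it checks that $f(x)\,x\ln(x)=1$ along the sequence $x=\exp(k^2)$, so $f$ is not $o\left((x\ln x)^{-1}\right)$, while a partial sum for $\int_0^\infty f$ stays below $\pi^2/6$.

```python
import math

def s_of_x(x):
    """Smallest element of the sequence exp(k^2), k >= 1, that is >= x."""
    k = 1
    while math.exp(k * k) < x:
        k += 1
    return math.exp(k * k)

def f(x):
    s = s_of_x(x)
    return 1.0 / (s * math.log(s))

# f(x) * x * ln(x) = 1 along the sequence, so f is not o(1/(x ln x)).
for k in (2, 3, 4):
    x = math.exp(k * k)
    print(k, f(x) * x * math.log(x))  # 1.0 (up to rounding) each time

# The integral of f nevertheless converges: f is constant on each block
# (exp(k^2), exp((k+1)^2)], and each block contributes at most (k+1)^{-2}.
total = 1.0  # the piece over (0, e], where f = 1/e
for k in range(1, 26):  # exp(26^2) would overflow a float, so stop there
    block = math.exp((k + 1) ** 2) - math.exp(k ** 2)
    total += block / (math.exp((k + 1) ** 2) * (k + 1) ** 2)
print(total, "<", math.pi ** 2 / 6)
```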
Alex Ravsky
-
So you would say that the calculations in Remark 2.3.23 in the script are wrong? – LordOfNumbers Nov 17 '20 at 17:38
-
@LordOfNumbers I am much stronger in real analysis than in probability theory, so, as I understood the respective fragment of the remark, there are no calculations, only a reference to the integral test for convergence. Using it, we can indeed show that the series $\sum_{n=2}^\infty \tfrac 1{n\ln n}$ diverges. – Alex Ravsky Nov 18 '20 at 05:58
-
But we are interested in the divergence of the integral $\int_{s=2}^\infty \tfrac{ds}{s\ln s}$, which the test already gives us. Given a continuous function $f:\Bbb R\to\Bbb R$, the test implies that if $f(s)$ is bounded below by $\tfrac{1}{s\ln s}$ asymptotically, then $\int_{s=2}^\infty f(s)\,ds$ diverges. – Alex Ravsky Nov 18 '20 at 05:58
-
But the test does not imply that if $\int_{s=2}^\infty f(s)\,ds$ converges then $f$ is asymptotically dominated by $\tfrac{1}{s\ln s}$, and the answer provides a counterexample. – Alex Ravsky Nov 18 '20 at 05:59
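For the record, the divergence of $\sum_{n=2}^\infty \tfrac 1{n\ln n}$ invoked in these comments is very slow; a minimal numerical sketch (an editorial addition) shows the partial sums tracking $\ln\ln N$ up to a roughly constant offset:

```python
import math

# Partial sums of sum 1/(n ln n) grow like ln(ln N) plus a constant,
# which is why the series diverges, but only very slowly.
sums = {}
partial = 0.0
for n in range(2, 10**6 + 1):
    partial += 1.0 / (n * math.log(n))
    if n in (10**2, 10**4, 10**6):
        sums[n] = partial
        print(n, partial, math.log(math.log(n)))
```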