
Consider a sequence of measurable functions $X_n$ and a measurable function $X$ on some probability space $(\Omega,\mathcal{F},P)$.

I want to show that the following holds:

$$\lim_{n\rightarrow \infty} E[\min(|X_n-X|,1)]=0\quad\Rightarrow\quad\lim_{n\rightarrow \infty} E[|X_n-X|]=0$$

This statement is intuitively clear, but I fail to find the actual argument. Thanks in advance!

user408858

1 Answer


This is not true. Consider $(0,1)$ with Lebesgue measure, take $X=0$, and let $A_n=(0, 1 -\frac 1 n)$. Define $X_n=\frac 1 n$ on $A_n$ and $X_n=n$ on $A_n^{c}$. Then $E[X_n \wedge 1] =\frac 1 n P(A_n)+P(A_n^{c}) \leq \frac 2 n \to 0$, but $E[X_n] \geq nP(A_n^{c}) =1$ for all $n$.
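Since both expectations in this counterexample reduce to sums over the two events $A_n$ and $A_n^{c}$, they can be computed exactly. A quick numerical sketch (the helper name `expectations` is just for illustration):

```python
# Counterexample on (0,1) with Lebesgue measure:
#   X_n = 1/n on A_n = (0, 1 - 1/n),  X_n = n on A_n^c.
# Both expectations are finite sums over the two events:
#   E[min(X_n, 1)] = min(1/n, 1) * P(A_n) + min(n, 1) * P(A_n^c)
#   E[X_n]         = (1/n) * P(A_n) + n * P(A_n^c)

def expectations(n):
    p_A = 1 - 1 / n          # P(A_n)
    p_Ac = 1 / n             # P(A_n^c)
    e_min = min(1 / n, 1) * p_A + min(n, 1) * p_Ac
    e_full = (1 / n) * p_A + n * p_Ac
    return e_min, e_full

for n in (10, 100, 1000):
    e_min, e_full = expectations(n)
    print(f"n={n}: E[min(X_n,1)]={e_min:.6f}, E[X_n]={e_full:.6f}")
```

The first column shrinks like $2/n$ while the second stays above $1$, matching the computation in the answer.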

    @user408858: what you stated is that convergence in probability implies convergence in mean. That is false as Kavi Rama Murthy just showed with a counter example. Here is a related question: https://math.stackexchange.com/questions/477942/does-convergence-in-probability-implies-convergence-of-the-mean – Mittens Jun 21 '20 at 23:40
  • Hm, ok. But how can I conclude from $\lim_{n\rightarrow \infty} E[\min(|X_n-X|,1)]=0$ that $X_n\rightarrow X$ in probability? That's what you are saying, right? – user408858 Jun 21 '20 at 23:45
  • Ok, I will check whether the following question helps: https://math.stackexchange.com/questions/1656751/equivalence-definition-for-convergence-in-probability :-) – user408858 Jun 21 '20 at 23:47
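The implication asked about in the comments, that $E[\min(|X_n-X|,1)]\to 0$ forces convergence in probability, can be sketched via Markov's inequality: for any $\epsilon\in(0,1)$,

$$P(|X_n-X|>\epsilon)=P\big(\min(|X_n-X|,1)>\epsilon\big)\leq\frac{E[\min(|X_n-X|,1)]}{\epsilon}\xrightarrow[n\to\infty]{}0,$$

where the first equality holds because $|X_n-X|>\epsilon$ and $\min(|X_n-X|,1)>\epsilon$ are the same event when $\epsilon<1$.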