Let $X = (X_i : i=1,\ldots,n)$, where the $X_i \sim f$ are i.i.d., $f$ is supported on $[0,\infty)$, $E(|X_i|)<\infty$, and $E(X_i)=\mu$. Consider $\tilde{X} = (X_i : X_i\leq a_n)$ for some deterministic sequence $a_n \to \infty$, and denote by $m$ the number of elements of $\tilde{X}$.
By the law of large numbers we know that $\frac{1}{n}\sum_{i=1}^{n}X_i \to \mu$.
But is it true that $\frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i \to \mu$?
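As a quick numerical sanity check (just a sketch, with illustrative choices that are not part of the question: $X_i \sim \mathrm{Exp}(1)$, so $\mu = 1$, and $a_n = \sqrt{n}$):

```python
import numpy as np

# Monte Carlo sanity check: compare the truncated sample mean to mu.
# Illustrative assumptions only: X_i ~ Exp(1) (so mu = 1) and a_n = sqrt(n).
rng = np.random.default_rng(0)
mu = 1.0

for n in (10**3, 10**4, 10**5, 10**6):
    x = rng.exponential(scale=1.0, size=n)  # X_1, ..., X_n
    a_n = np.sqrt(n)                        # deterministic threshold, a_n -> infinity
    x_trunc = x[x <= a_n]                   # the truncated sample
    m = x_trunc.size                        # number of retained observations
    print(n, m, abs(x_trunc.mean() - mu))   # |truncated sample mean - mu|
```

If the claim holds, the printed discrepancies should shrink as $n$ grows, but of course this is no substitute for a proof.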
My idea is to try to use the LLN again. Since $\tilde{X}_i \sim f_n$ with $f_n(x) = \frac{f(x)\,\mathbf{1}_{\{x \leq a_n\}}}{F(a_n)}$, where $F$ is the c.d.f. of $X_i$, we have $E(\tilde{X}_i)\leq E(X_i)<\infty$ and $E(\tilde{X}_i) = \frac{\int_0^{a_n}x f(x)\,dx}{F(a_n)} =: \mu_n$.
Then, as $n\to\infty$, we also have $m\to\infty$, and $$ \left|\frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i - \mu \right| \leq \left|\frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i - \mu_n \right| + |\mu_n-\mu|. $$ Therefore we conclude by using the LLN on the first term, while the second term goes to $0$ since $a_n\to\infty$.
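To spell out why the second term vanishes (this part, at least, seems fine to me): since
$$ |\mu_n-\mu| = \frac{\left|\int_0^{a_n} x f(x)\,dx - \mu F(a_n)\right|}{F(a_n)} \leq \frac{\left|\int_0^{a_n} x f(x)\,dx - \mu\right| + \mu\,(1-F(a_n))}{F(a_n)}, $$
and $\int_0^{a_n} x f(x)\,dx \to \mu$ by monotone convergence while $F(a_n)\to 1$, we get $|\mu_n-\mu|\to 0$.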
Is this correct? My main concern is that I'm trying to use the LLN with a mean that depends on $n$. Is that valid?
Edit: OK, this clearly seems to be wrong. What I should be able to do, though, is the following:
$$ \left|\frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i - \mu \right| \leq \left|\frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i - \frac{1}{n}\sum_{i=1}^{n}X_i \right| + \left|\frac{1}{n}\sum_{i=1}^{n}X_i-\mu \right|, $$ so all I need to prove now is that $$ \left|\frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i - \frac{1}{n}\sum_{i=1}^{n}X_i \right| \to 0 $$ a.s. or in probability. How can I show that?
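In case it helps, writing $S_{\leq} = \sum_{i:\,X_i \leq a_n} X_i$ and $S_{>} = \sum_{i:\,X_i > a_n} X_i$ (my own notation), the difference can be rewritten as
$$ \frac{1}{m}\sum_{i=1}^{m}\tilde{X}_i - \frac{1}{n}\sum_{i=1}^{n}X_i = \frac{S_{\leq}}{m} - \frac{S_{\leq}+S_{>}}{n} = \left(\frac{1}{m}-\frac{1}{n}\right)S_{\leq} - \frac{S_{>}}{n}, $$
so it seems to come down to controlling $\frac{n-m}{n}$ and the contribution of the observations above $a_n$, but I don't see how to finish.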