I would like to seek clarification on the following question. (The same question was asked here: Prove that convex function on $[a,b]$ is absolutely continuous.)
Question: If $f\colon [a,b] \to \mathbb{R}$ is continuous, convex, and increasing, then $f$ is absolutely continuous on $[a,b]$.
I have two approaches.
- Since $f$ is convex on $(a,b)$, it is locally Lipschitz there, and since $f$ is continuous, the variation of $f$ can be written as $V_f[a,b] = \lim_{|\Gamma| \to 0} S_\Gamma(f;a,b)$, where $\Gamma = \{x_i\}_{i=0}^m$ is a partition of $[a,b]$ and $S_\Gamma(f;a,b) = \sum_{i=1}^m |f(x_{i}) - f(x_{i-1})|$. Moreover, $f$ is increasing, so $f'$ exists a.e. in $[a,b]$. Using an equivalent definition of absolute continuity, we have \begin{align*} V_f[c,d] = \int_c^d |f'(x)|\,dx \end{align*} for any $a < c < d < b$. Then, \begin{align*} V_f[a,b] &= \lim_{\substack{c \to a\\ d \to b}} V_f[c,d]\\ &= \lim_{\substack{c \to a\\ d \to b}} \int_c^d |f'(x)|\,dx\\ &= \int_a^b |f'(x)|\, dx, \end{align*} where the first equality follows from the result that continuity of $f$ implies continuity of the variation. By the same equivalent definition, $f$ is then absolutely continuous on $[a,b]$.
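To spell out how I am using this equivalence at the end (a sketch, assuming the standard facts that $V_f[c,d] \ge \int_c^d |f'(x)|\,dx$ on every subinterval and that the variation is additive over subintervals): for any finite collection of disjoint intervals $[a_i, b_i] \subseteq [a,b]$,
$$\sum_{i} |f(b_i) - f(a_i)| \le \sum_{i} V_f[a_i, b_i] = \sum_{i} \int_{a_i}^{b_i} |f'(x)|\,dx = \int_{\bigcup_i [a_i, b_i]} |f'(x)|\,dx,$$
where the middle equality holds because additivity of the variation together with $V_f[a,b] = \int_a^b |f'(x)|\,dx$ forces equality on each subinterval. Since $|f'| \in L^1[a,b]$, absolute continuity of the Lebesgue integral makes the right-hand side less than $\epsilon$ once $\sum_i (b_i - a_i)$ is small enough.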
- For the second approach, I try to prove it from first principles. Let $\{[a_i, b_i]\}_{i=1}^n$ be a finite collection of disjoint subintervals of $[a,b]$. Without loss of generality, assume $$a_1 < b_1 < a_2 < b_2 < \dots < a_n < b_n.$$ We have the following cases.
- If $a, b \notin [a_1, b_n]$, then $[a_1, b_n]$ is a compact subset of $(a,b)$, so $f$ is Lipschitz on $[a_1, b_n]$ and hence absolutely continuous there; thus, given $\epsilon > 0$, there exists $\delta > 0$ so that $\sum (b_i - a_i) < \delta \implies \sum |f(b_i) - f(a_i)| < \epsilon$.
- If $a \in [a_1, b_n]$ and $b \notin [a_1, b_n]$, then necessarily $a = a_1$. Let $M > 0$ be such that $x, y \in [a_2, b_n] \implies |f(x) - f(y)| \le M|x - y|$ (possible since $[a_2, b_n]$ is a compact subset of $(a,b)$, where $f$ is locally Lipschitz). Given $\epsilon > 0$, let $\delta_1 = \frac{\epsilon}{2M}$, so that $$\sum_{i=2}^n (b_i - a_i) < \delta_1 \implies \sum_{i=2}^n |f(b_i) - f(a_i)| < \epsilon/2,$$ and, since $f$ is uniformly continuous on $[a,b]$, there exists $\delta_2 > 0$ so that $|x - y| < \delta_2 \implies |f(x) - f(y)| < \epsilon/2$ for any $x, y \in [a,b]$. Taking $\delta = \min(\delta_1, \delta_2) > 0$, we have $$\sum (b_i - a_i) < \delta \implies \sum |f(b_i) - f(a_i)| < \epsilon$$ (the combined estimate is written out after this list).
- If $a, b \in [a_1, b_n]$, then $a = a_1$ and $b = b_n$, and the result follows by a similar argument: the end intervals $[a_1, b_1]$ and $[a_n, b_n]$ are handled by uniform continuity and the remaining ones by the Lipschitz bound on $[a_2, b_{n-1}]$.
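To make the estimate in the second case explicit (the third case is handled the same way, with both end intervals controlled by uniform continuity), this is the computation I have in mind, with $\delta = \min(\delta_1, \delta_2)$ as above: if $\sum_{i=1}^n (b_i - a_i) < \delta$, then in particular $b_1 - a_1 < \delta_2$ and $\sum_{i=2}^n (b_i - a_i) < \delta_1$, so
\begin{align*}
\sum_{i=1}^n |f(b_i) - f(a_i)| &= |f(b_1) - f(a_1)| + \sum_{i=2}^n |f(b_i) - f(a_i)|\\
&< \frac{\epsilon}{2} + M \sum_{i=2}^n (b_i - a_i)\\
&< \frac{\epsilon}{2} + M\delta_1 = \epsilon.
\end{align*}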
I believe that the second approach has gaps, because otherwise essentially the same argument would show that every uniformly continuous function is absolutely continuous, which is false (e.g. the Cantor–Lebesgue function). However, I am not sure where exactly the second approach goes wrong. Any clarification is appreciated.