I have the sequence $(y_n)_{n\in\mathbb N_0}$ of functions $$y_n\colon [0,\alpha] \to \mathbb R$$ defined recursively by $$ y_{n+1}(x) = \int_0^x g\bigl(y_{n}(\xi)\bigr)\,d\xi,\qquad n\in\mathbb N_0 $$ starting from the constant function $y_0(x)\equiv 0$. Nothing is known about the function $g$, except that all of these integrals exist.
Suppose we know that $(y_n)_{n\in\mathbb N}$ converges uniformly to a function $y^*$: $$\|y_n - y^*\|\to0,\qquad(n\to\infty)$$ where $\|\bullet\|$ is the supremum norm on $[0,\alpha]$: $$ \| y \| = \sup_{x\in [0,\alpha]} |y(x)|. $$
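To make the setup concrete, here is a small numerical sketch (my own illustration, not part of the question) that runs the iteration for the sample choice $g(y)=\cos y$ on $[0,1]$ — a Lipschitz case, so uniform convergence is guaranteed — and tracks the sup-norm gap between successive iterates:

```python
import numpy as np

# Illustration only: g(y) = cos(y) and alpha = 1 are arbitrary sample choices.
alpha = 1.0
x = np.linspace(0.0, alpha, 1001)
dx = x[1] - x[0]

def g(y):
    return np.cos(y)

def picard_step(y):
    """One step y_{n+1}(x) = int_0^x g(y_n(xi)) dxi, via the trapezoidal rule."""
    f = g(y)
    return np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * dx)))

y = np.zeros_like(x)                   # y_0 == 0
for n in range(20):
    y_new = picard_step(y)
    gap = np.max(np.abs(y_new - y))    # sup norm ||y_{n+1} - y_n||
    y = y_new
```

For this particular $g$ the limit is the Gudermannian function $y^*(x) = 2\arctan\bigl(\tanh(x/2)\bigr)$, since $(y^*)'(x)=\operatorname{sech} x = \cos\bigl(y^*(x)\bigr)$, and the computed gap shrinks roughly like $\alpha^n/n!$.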
The question is:
Can we prove that $y^*$ has to be differentiable?
If I knew that every $y_n$ was differentiable with $y_n'(x) = g\bigl(y_{n-1}(x)\bigr)$, then it would suffice to show that $(y_n')_{n\in\mathbb N}$ converges uniformly. But I don't know anything about the convergence of $(y_n')$, since I don't know anything about $g$. Would it be easier, or at least doable, if $g$ were continuous?
This question arises when you try to apply the Picard iteration to the initial value problem $$ y'(x) = g\bigl(y(x)\bigr),\qquad y(0) = 0, $$ in the case where $g$ is not Lipschitz continuous. There is a similar exercise in the standard German textbook „Gewöhnliche Differentialgleichungen“ (“Ordinary Differential Equations”) by Harro Heuser: Exercise III.12.5 (at least in the fourth edition).
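As a side remark (my own example, not from the book): for the classical non-Lipschitz case $g(y)=\sqrt{|y|}$, the iteration starting at $y_0\equiv 0$ is degenerate, since $$ y_1(x) = \int_0^x \sqrt{\bigl|y_0(\xi)\bigr|}\,d\xi = 0, $$ and hence $y_n\equiv 0$ for all $n$. The limit $y^*\equiv 0$ is then trivially differentiable and does solve the initial value problem, but uniqueness fails: $y(x)=x^2/4$ is another solution, because $y'(x)=x/2=\sqrt{x^2/4}$ on $[0,\alpha]$. So uniform convergence of the iterates can occur without Lipschitz continuity, which is what makes the question nontrivial.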