
Take the superposition principle for linear ODEs of the form $y'(t)=A(t,y(t)) + g(t)$ ($y\in \mathbb{R}^n$, $A(t,\cdot)$ linear in $y$). If $g(t)=\sum _{k=1}^N g_k(t)$, then $y(t)=\sum _{k=1}^N y_k(t)$ solves the system, where each $y_k$ solves the ODE with $g$ replaced by $g_k$.
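(For the finite case the verification is a one-line computation using the linearity of $A(t,\cdot)$:

```latex
y'(t)=\sum_{k=1}^N y_k'(t)
     =\sum_{k=1}^N\bigl(A(t,y_k(t))+g_k(t)\bigr)
     =A\Bigl(t,\sum_{k=1}^N y_k(t)\Bigr)+g(t).
```

The question is whether this computation survives the passage to an infinite series.)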

Can this be extended from finite sums to series?


1 Answer


I am surprised that none of the ODE people answered this question, because it is a common method to look for a solution of a differential equation in the form of a series that is a linear combination of simple functions (for instance, $x^n$ or $\sin nx$). So I recalled a theorem from the big book [Fich, Ch. 12, $\S 1$, 435, Theorem 8]. It concerns the one-dimensional case, but I expect that the situation is similar in higher dimensions.

Theorem. Let the functions $y_k(t)$ have bounded derivatives on a bounded segment $I$. If the series $y(t)=\sum_{k=1}^\infty y_k(t)$ converges at some point $t_0\in I$ and the series $y^*(t)=\sum_{k=1}^\infty y'_k(t)$ converges uniformly on $I$, then the series $y(t)$ converges uniformly on $I$ and $y'(t)= y^*(t)$ for each $t\in I$.

It easily implies a positive answer to your question for $n=1$ when the functions $y_k$ satisfy the conditions of the theorem, the function $A(t,y)$ is continuous, and the series $\sum _{k=1}^\infty g_k(t)$ converges to $g(t)$ for each $t\in I$.
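To spell this out (a sketch, writing $A(t,y)=a(t)\,y$ for the one-dimensional linear case): linearity and the convergence of the partial sums give

```latex
\sum_{k=1}^\infty y_k'(t)
  =\sum_{k=1}^\infty\bigl(a(t)\,y_k(t)+g_k(t)\bigr)
  =a(t)\sum_{k=1}^\infty y_k(t)+\sum_{k=1}^\infty g_k(t)
  =a(t)\,y(t)+g(t),
```

and by the theorem the left-hand side equals $y'(t)$, so $y$ solves the full equation.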

References

[Fich] Grigorii Fichtenholz, Differential and Integral Calculus, vol. II, 7th edition, Moscow: Nauka, 1970 (in Russian).

Alex Ravsky