
I very recently learned how to find the characteristic polynomial of a linear constant-coefficient homogeneous recurrence relation. But I also learned that the term "characteristic polynomial" appears in linear algebra (for matrices), in differential equations, and perhaps in other areas I'm unaware of.

Fundamentally, what is it about a characteristic polynomial that remains invariant across domains of math?

Scouring the internet, all I find are complex symbols and terminology that I can't make much sense of. Could you break down the relationship between the uses of the characteristic polynomial in different areas of math? I'd also ask that you keep the answer as intuitive as possible (even if that means it's not perfectly rigorous).

  • The use of the expression "characteristic polynomial" in linear recurrences and linear differential equations is basically the same, but has little to do with the use of this expression in linear algebra. – Captain Lama Jun 30 '22 at 19:50
  • @CaptainLama: Au contraire! The characteristic polynomial of a recurrence $R$ is the characteristic polynomial of the shift endomorphism on the space of solutions to $R$. – Jacob Manaker Jun 30 '22 at 19:52
  • @CaptainLama, actually, linear recurrences can be expressed in matrix terms and then the characteristic polynomial of the matrix is the same as the characteristic polynomial of the recurrence – lhf Jun 30 '22 at 19:52

1 Answer


The fundamental notion is the characteristic polynomial of a linear transformation $T : V \to V$ acting on a finite-dimensional vector space, which specializes to all of these:

  • the characteristic polynomial of a matrix is what happens when you pick a basis of $V$ and compute $T$ in that basis.
  • the set of solutions to a (homogeneous, constant-coefficient) linear recurrence forms a finite-dimensional vector space, and the characteristic polynomial of that recurrence is the characteristic polynomial of the shift map $S$ which acts on the vector space of solutions via $S(a_0, a_1, \dots) = (a_1, a_2, \dots)$. We can in fact rewrite the theory of linear recurrences in terms of the shift map: finding the solutions to a linear recurrence $a_{n+k} = c_{k-1} a_{n+k-1} + \dots + c_0 a_n$ amounts to solving an equation of the form $(S^k - c_{k-1} S^{k-1} - \dots - c_0 I) a = 0$, and we can do this by factoring the polynomial in $S$.
  • the set of solutions to a (homogeneous, constant-coefficient) linear ordinary differential equation again forms a finite-dimensional vector space, and the characteristic polynomial of that differential equation is the characteristic polynomial of the derivative map $D(f(x)) = f'(x)$. We can again rewrite the theory in terms of $D$: finding the solutions to a differential equation $f^{(k)}(x) = c_{k-1} f^{(k-1)}(x) + \dots + c_0 f(x)$ amounts to solving an equation of the form $(D^k - c_{k-1} D^{k-1} - \dots - c_0 I) f = 0$, and we can again do this by factoring the polynomial in $D$.
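To make the recurrence story concrete, here is a small sketch (my own illustration, not part of the original answer) that solves the Fibonacci recurrence $a_{n+2} = a_{n+1} + a_n$ by finding the roots of its characteristic polynomial $x^2 - x - 1$ and fitting the general solution to the initial conditions; all names here are invented for the example:

```python
# Sketch: solving a_{n+2} = a_{n+1} + a_n via its characteristic
# polynomial x^2 - x - 1 (i.e. the eigenvalues of the shift map).
import numpy as np

# Roots of x^2 - x - 1: the golden ratio and its conjugate.
phi, psi = np.roots([1, -1, -1])

# General solution a_n = A*phi^n + B*psi^n; fit A, B to a_0 = 0, a_1 = 1.
A, B = np.linalg.solve([[1, 1], [phi, psi]], [0, 1])

def fib_closed(n):
    """Closed-form n-th Fibonacci number, rounded back to an integer."""
    return round((A * phi**n + B * psi**n).real)

print([fib_closed(n) for n in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```

The same recipe works for any homogeneous constant-coefficient recurrence with distinct characteristic roots: each root $\lambda$ contributes a solution $\lambda^n$, and the initial conditions pin down the coefficients.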

As you can see, the last two stories are very closely related, and they can even be made isomorphic by thinking about how the derivative map acts on the polynomials $\frac{x^n}{n!}$ (it acts exactly by the shift map). This connects sequences satisfying linear recurrences to the Taylor series expansions of functions satisfying linear ODEs. All of this can also be understood in the language of generating functions and/or the Laplace transform.

In all of these cases the key is to understand the roots of the characteristic polynomial, which give the eigenvalues and subsequently the eigenvectors of $T$.
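As a final cross-check (again my own sketch, with invented names), the comment above about matrices can be verified directly: the companion matrix that advances the state $(a_n, a_{n+1})$ of the Fibonacci recurrence has characteristic polynomial $x^2 - x - 1$, so its eigenvalues coincide with that polynomial's roots:

```python
# Sketch: eigenvalues of the recurrence's companion ("shift") matrix
# equal the roots of its characteristic polynomial x^2 - x - 1.
import numpy as np

S = np.array([[0.0, 1.0],   # (a_n, a_{n+1}) -> (a_{n+1}, a_{n+2})
              [1.0, 1.0]])  # second row encodes a_{n+2} = a_{n+1} + a_n

eigenvalues = np.linalg.eigvals(S)
roots = np.roots([1, -1, -1])
print(sorted(eigenvalues), sorted(roots))  # the same two numbers
```

This is the general pattern: writing the recurrence (or ODE) as a first-order system turns the shift (or derivative) operator into a matrix whose characteristic polynomial, in the linear-algebra sense, is the recurrence's characteristic polynomial.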

Qiaochu Yuan
  • I will certainly revisit this answer in the future once I garner some further mathematical experience. –  Jun 30 '22 at 20:23