18

Every $n\times n$ matrix satisfies a polynomial equation of degree at most $n^2$, simply because the space of $n\times n$ matrices has dimension $n^2$.

By the Cayley–Hamilton theorem, every matrix satisfies a polynomial equation of degree $n$.

Is there a simple proof that every matrix satisfies a polynomial equation of degree at most $n$ without using the Cayley–Hamilton theorem?
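The dimension count in the first paragraph is easy to check numerically. A minimal sketch using NumPy (the matrix and seed are arbitrary illustrations): the $n^2+1$ powers $I, A, \dots, A^{n^2}$, flattened into vectors of length $n^2$, can never be linearly independent.

```python
import numpy as np

# Illustrative check of the dimension argument (random matrix, arbitrary seed).
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))

# Flatten the n^2 + 1 powers I, A, ..., A^{n^2} into vectors of length n^2.
M = np.stack([np.linalg.matrix_power(A, k).ravel() for k in range(n * n + 1)])

# n^2 + 1 vectors in an n^2-dimensional space are always linearly dependent,
# so some nonzero polynomial of degree at most n^2 annihilates A.
assert M.shape == (n * n + 1, n * n)
assert np.linalg.matrix_rank(M) <= n * n
```

The rank of `M` is in fact the degree of the minimal polynomial of $A$, which for a generic matrix is $n$, far below the crude $n^2$ bound.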

lhf
  • 216,483
  • Perhaps this may be the basis of a proof by induction: http://en.wikipedia.org/wiki/Minimal_polynomial_(linear_algebra)#Computation – lhf Jun 03 '15 at 13:06

4 Answers

6

This might not count as simple, but it provides another point of view:

$A$ makes $K^n$ into a $K[x]$-module and by the structure theorem of finitely generated modules over a PID, we get

$$K^n \cong K[x]/(f_1) \oplus \dotsb \oplus K[x]/(f_s).$$

Set $f = f_1 \dotsb f_s$. Clearly $f$ acts by zero on the right-hand side, hence $f$ acts by zero on $K^n$, which means $f(A) = 0$. Comparing the dimensions of the two sides, we get

$$n = \dim K^n = \sum \dim K[x]/(f_i) = \sum \deg f_i = \deg f.$$

Of course, the structure theorem is not really simpler than Cayley–Hamilton, but its standard proof does not, I think, rely on any Cayley–Hamilton-related methods.
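As a numerical sanity check (not part of the proof): the polynomial $f$ above is monic of degree $n$ and annihilates $A$. This sketch uses NumPy's `np.poly`, which builds a monic degree-$n$ polynomial from the eigenvalues of a random matrix, and verifies that it evaluates to zero at $A$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))

# np.poly builds a monic degree-n polynomial from the eigenvalues of A;
# coeffs[0] = 1 is the leading coefficient, so len(coeffs) == n + 1.
coeffs = np.poly(A)

# Evaluate the polynomial at the matrix: f(A) = sum_k coeffs[k] * A^(n-k).
f_A = sum(c * np.linalg.matrix_power(A, n - k) for k, c in enumerate(coeffs))

# A degree-n polynomial annihilates A, matching deg f = n above.
assert np.allclose(f_A, 0, atol=1e-6)
```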

MooS
  • 31,390
5

Proof that any matrix satisfies a polynomial of degree at most $n^2-1$:

Suppose $A$ satisfies no such polynomial. Then $I, A, A^2, \dots, A^{n^2-1}$ are linearly independent, and since there are $n^2$ of them, they form a basis for the space of $n \times n$ matrices. Every matrix is then a polynomial in $A$, and in particular commutes with $A$.

But the only matrices that commute with every matrix are the scalar multiples of the identity, so $A = \lambda I$ for some $\lambda$, and $A$ satisfies $x - \lambda$. This contradicts our hypothesis that $A$ satisfies no such polynomial.
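The key step can be illustrated numerically (random matrices and seed are arbitrary): every polynomial in $A$ commutes with $A$, so exhibiting a single $B$ with $AB \ne BA$ already shows that the powers of $A$ cannot span the full matrix space.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Every polynomial in A commutes with A, so the span of I, A, ..., A^{n^2-1}
# contains only matrices that commute with A.
assert not np.allclose(A @ B, B @ A)  # a generic B does not commute with A

# Hence those n^2 powers cannot span the full n^2-dimensional matrix space:
powers = np.stack([np.linalg.matrix_power(A, k).ravel() for k in range(n * n)])
assert np.linalg.matrix_rank(powers) < n * n
```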

Ben Grossmann
  • 225,327
2

This is a proof by induction:

The result is trivial for $n=1$, and we may assume the field $K$ to be algebraically closed: extending scalars does not change whether the powers $I, A, \dots, A^n$ are linearly dependent, so an annihilating polynomial of degree at most $n$ over the closure yields one over $K$.

Pick an eigenvector $v$ with $Av = \lambda v$ and set $V = \langle v \rangle$; then $V$ is a one-dimensional invariant subspace.

We get two induced endomorphisms: the restriction $A_{|V} \in \operatorname{End}(V)$ and the quotient map $A_{K^n/V} \colon [x] \mapsto [Ax]$ in $\operatorname{End}(K^n/V)$.

By induction we obtain $f$ of degree $1$ with $f(A_{|V})=0$ and $g$ of degree $n-1$ with $g(A_{K^n/V})=0$.

The product $h = fg$ has degree $n$ and satisfies the desired $h(A) = 0$:

$g(A_{K^n/V})=0$ translates into $g(A)(K^n) \subset V$, hence $h(A)(K^n)=f(A)(g(A)(K^n)) \subset f(A)(V)=0$.
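For an upper-triangular matrix the induction unwinds explicitly: each diagonal entry contributes one linear factor, and the product of all $n$ factors annihilates $A$. A quick numerical sketch (NumPy, random triangular matrix as an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
# A random upper-triangular matrix; its eigenvalues are its diagonal entries.
A = np.triu(rng.standard_normal((n, n)))

# The induction peels off one linear factor per invariant line, giving
# h(x) = (x - a_11)(x - a_22) ... (x - a_nn) with h(A) = 0.
h_A = np.eye(n)
for lam in np.diag(A):
    h_A = h_A @ (A - lam * np.eye(n))

assert np.allclose(h_A, 0, atol=1e-10)
```

(The factors commute, being polynomials in $A$, so the order of the product does not matter.)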

MooS
  • 31,390
1

Yes, use Jordan form over an algebraic closure of the ground field.
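A concrete illustration (a sketch using SymPy's `Matrix.jordan_form`; the matrix is an arbitrary example with a nontrivial Jordan block): reading the eigenvalues, with multiplicity, off the diagonal of the Jordan form gives a degree-$n$ product of linear factors annihilating $A$.

```python
from sympy import Matrix, eye, zeros

# An arbitrary example with a nontrivial Jordan block (eigenvalue 2 twice).
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 5]])

P, J = A.jordan_form()  # A = P * J * P**-1, with J in Jordan form

# Reading the eigenvalues (with multiplicity) off the diagonal of J gives
# a degree-n polynomial whose product of linear factors annihilates A.
h_A = eye(3)
for i in range(3):
    h_A = h_A * (A - J[i, i] * eye(3))

assert h_A == zeros(3, 3)
```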

paul garrett
  • 52,465
  • 11
    I don't think that qualifies as "simple", relative to using Cayley-Hamilton – Ben Grossmann Jun 03 '15 at 13:07
  • @Omnomnomnom, while "simplicity" is perhaps subjective, how do you prove Cayley-Hamilton, if you object to Jordan normal form? There is certainly an argument that doesn't use Jordan form, but it's not "elementary". – paul garrett Jun 03 '15 at 13:20
  • @paulgarrett I'd say it's more elementary to state that matrices are, in general, upper-triangularizable, than it is to say that they can be put in Jordan canonical form. This is sufficient for C-H. – Ben Grossmann Jun 03 '15 at 13:25
  • For instance, the proof I know of Jordan's normal form first uses Cayley-Hamilton to decompose the space to subspaces (the generalised eigenspaces) where $A - \lambda I_n$ is nilpotent. The heart of Jordan's form is then the combinatorial study of nilpotent matrices. It may be provable in other ways, but I really see Jordan's form as something strictly stronger than Cayley-Hamilton (or at least to a weaker form of it which would say that there exists a polynomial $P$ having only the eigenvalues of $A$ as roots s.t. $P(A) = 0$.) – PseudoNeo Jun 03 '15 at 13:27
  • @PseudoNeo, but why not just view the finite-dimensional vector space as a module over the polynomial ring $k[x]$, and use structure of f.g. modules over PIDs to get Jordan form? Then Cayley-Hamilton can be a corollary of that. But, sure, all these results are true, and standard. – paul garrett Jun 03 '15 at 13:34
  • That's a good idea! – PseudoNeo Jun 04 '15 at 07:50