
Suppose we already know that any real $m \times n$ matrix has a singular value decomposition. Can this fact be used to give an easy, quick proof of the spectral theorem from finite dimensional linear algebra?

To be concrete, here is the version of the spectral theorem I have in mind: if $A$ is a real symmetric $n \times n$ matrix, then there exists an orthonormal basis of eigenvectors for $A$.

1 Answer


It immediately follows from the SVD that for $A$ symmetric, $A^2$ is orthogonally diagonalisable: writing $A=U\Sigma V^T$, we get $A^2=A^TA=V\Sigma^2 V^T$. What remains is to show that we can find an orthonormal basis consisting of eigenvectors of $A$ itself.
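(Just as a quick numerical sanity check, not part of the argument: here is how this identity looks in NumPy for a random symmetric matrix.)

```python
import numpy as np

# Sanity check: for symmetric A with SVD A = U diag(S) V^T,
# A^2 = A^T A = V diag(S)^2 V^T, so the columns of V orthonormally diagonalise A^2.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                          # a random real symmetric matrix

U, S, Vt = np.linalg.svd(A)          # A = U @ np.diag(S) @ Vt

print(np.allclose(A @ A, Vt.T @ np.diag(S**2) @ Vt))  # True
print(np.allclose(Vt @ Vt.T, np.eye(4)))              # True: V is orthogonal
```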

For this, let $v$ be an eigenvector of $A^2$ with eigenvalue $\lambda$, and set $w:=Av$. Note that $\lambda\langle v,v\rangle=\langle A^2v,v\rangle=\|Av\|^2\ge 0$. If $w$ is a multiple of $v$ (in particular whenever $\lambda=0$, since then $\|Av\|^2=0$ forces $w=0$), then $v$ is itself an eigenvector of $A$; we add it to the collection and continue in $v^\perp$ (which is $A$-invariant). Otherwise $\lambda>0$ and we have $Av=w$, $Aw=A^2v=\lambda v$. This means that $A$ sends $\sqrt{\lambda}\,v\pm w$ to the $\pm\sqrt{\lambda}$-multiple of itself. These are eigenvectors of $A$ and, moreover, they are orthogonal to each other due to $$ \langle \sqrt\lambda\, v + w, \sqrt{\lambda}\, v - w\rangle =\lambda \langle v,v\rangle - \langle w,w\rangle =\lambda \langle v,v\rangle - \langle Av, Av\rangle =\lambda \langle v,v\rangle - \langle A^2v, v\rangle=0. $$ Again, you can continue inductively in $v^\perp\cap w^\perp$ (which is again $A$-invariant).
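If it helps, here is a small NumPy illustration of the second case (a sketch with a hand-picked matrix, not part of the proof): take $A=\begin{pmatrix}0&1\\1&0\end{pmatrix}$, so $A^2=I$ and $v=e_1$ is an eigenvector of $A^2$ but not of $A$; the vectors $\sqrt\lambda\,v\pm w$ then come out as orthogonal eigenvectors of $A$ for $\pm 1$.

```python
import numpy as np

# Illustration: A has eigenvalues +1 and -1, so A^2 = I and
# v = e_1 is an eigenvector of A^2 that is NOT an eigenvector of A.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

v = np.array([1.0, 0.0])             # eigenvector of A^2 with eigenvalue lam = 1
lam = 1.0
w = A @ v                            # w = [0, 1], not a multiple of v

p = np.sqrt(lam) * v + w             # candidate eigenvector for +sqrt(lam)
m = np.sqrt(lam) * v - w             # candidate eigenvector for -sqrt(lam)

print(np.allclose(A @ p,  np.sqrt(lam) * p))  # True
print(np.allclose(A @ m, -np.sqrt(lam) * m))  # True
print(np.isclose(p @ m, 0.0))                 # True: p and m are orthogonal
```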

This looks very awkward indeed. Usually one does it the other way around: first prove the spectral theorem and then derive the SVD from it.

Peter Franek
    You don't need to prove the spectral theorem for SVD: https://qchu.wordpress.com/2017/03/13/singular-value-decomposition/ – Qiaochu Yuan Nov 15 '17 at 19:11
  • @QiaochuYuan Thanks. Yes, I love that way of inventing the SVD, and I once tried to explain the intuition myself here: https://math.stackexchange.com/a/1740483/225419 . Since there is a very clear way to discover the SVD directly, I began to wonder if we can then view the spectral theorem as an easy corollary of the SVD. – eternalGoldenBraid Nov 15 '17 at 22:28