
I am curious how to quickly compute the eigenvalues of an arbitrary matrix, sparse or dense, restricted to some given interval of interest.

Suppose we have an arbitrary $n\times n$ matrix $A$. Normally the complexity of computing all the eigenvalues of $A$ is $O(n^3)$. I wonder whether there is an algorithm that does the same job, or more specifically computes only the eigenvalues on an interval $(a,b)$, with $O(n^2)$ or lower complexity. Is this possible at all?

For a symmetric matrix, to my knowledge, the bisection method can somewhat get the job done (see the sketch below); I would like to know whether there are other tricks for general matrices as well.
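
For the symmetric case, a minimal sketch, assuming SciPy is available: `scipy.linalg.eigh` exposes LAPACK's interval-restricted bisection/RRR drivers through its `subset_by_value` argument. Note that the underlying tridiagonal reduction is still $O(n^3)$, so this improves the constant factor, not the exponent.

```python
import numpy as np
from scipy.linalg import eigh

# Symmetric case: LAPACK's bisection/RRR drivers can return only the
# eigenvalues lying in a half-open interval (a, b]; SciPy exposes this
# through eigh's subset_by_value argument.  (The tridiagonal reduction
# underneath is still O(n^3), so this saves a constant, not the exponent.)
rng = np.random.default_rng(0)
M = rng.standard_normal((500, 500))
A = (M + M.T) / 2                     # random symmetric test matrix
a, b = 0.0, 5.0                       # interval of interest
vals = eigh(A, eigvals_only=True, subset_by_value=(a, b))
print(len(vals), "eigenvalues in", (a, b))
```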

Shuhao Cao
  • How large is your interval $(a,b)$? If you want an eigenvalue near a given point $\mu_0$, you can use the inverse power iteration. – Hui Zhang Mar 01 '12 at 17:30
  • If your matrix is general, it will have complex eigenvalues. What then is the meaning of finding the eigenvalues in $(a,b)$? – Arnold Neumaier Apr 14 '12 at 19:24

2 Answers


For a direct general result (as opposed to iterative approximation) you will have to compute the largest eigenvector first in order to find the smaller ones in the remaining orthogonal subspace. Basically: compute the dominant eigenvector by power iteration (each matrix-vector product costs $O(N^2)$), then deflate, i.e. subtract its projection from the matrix, and repeat in the remaining subspace.
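
Here is a minimal sketch of this deflation idea, assuming $A$ is symmetric so that rank-one deflation against orthogonal eigenvectors is safe; each sweep is built from $O(N^2)$ matrix-vector products:

```python
import numpy as np

def deflated_power_iteration(A, k, iters=1000):
    """Find the k largest-magnitude eigenpairs of a symmetric matrix A
    by power iteration plus deflation (a sketch, not production code)."""
    A = np.array(A, dtype=float)        # work on a copy; we deflate in place
    rng = np.random.default_rng(0)
    vals, vecs = [], []
    for _ in range(k):
        v = rng.standard_normal(A.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(iters):          # power iteration: v <- A v / ||A v||
            w = A @ v
            nrm = np.linalg.norm(w)
            if nrm == 0.0:              # A has been fully deflated
                break
            v = w / nrm
        lam = v @ (A @ v)               # Rayleigh quotient estimate of lambda
        vals.append(lam)
        vecs.append(v)
        A -= lam * np.outer(v, v)       # deflate: remove the found eigenpair
    return np.array(vals), np.column_stack(vecs)

# Usage: eigenvalues come out in order of decreasing magnitude.
A = np.diag([5.0, -3.0, 1.0, 0.5])
print(deflated_power_iteration(A, 2)[0])   # approx [ 5., -3.]
```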

I think 'interval' is a somewhat poorly formed concept here. Basically, if you compute the first eigenvector (and the corresponding eigenvalue), you may discover that its magnitude is already below the desired window, in which case you have hit your $N^2$ lower bound of computation. Since deflation produces eigenvalues in order of decreasing magnitude, the upper bound $b$ will not matter much, but the lower bound $a$ decides what fraction of the eigenvalues you must compute, hence leaving you somewhere between $N^2$ and $N^3$.

For sparse matrices, check out Arnoldi iteration and the corresponding math on Krylov subspaces/methods (see the sketch below). These algorithms do a really good job of approximating the largest eigenvalues quickly, but can grind for a long time when the matrix has a poor condition number or other undesirable spectral properties.
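
A hedged sketch of the sparse route, assuming SciPy: `scipy.sparse.linalg.eigs` wraps ARPACK's implicitly restarted Arnoldi, and the nonsymmetric tridiagonal test matrix below is only a placeholder:

```python
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Arnoldi iteration through ARPACK (SciPy's eigs wrapper).  Only
# matrix-vector products with A are needed, so sparsity is exploited
# directly; 'LM' requests the largest-magnitude eigenvalues.
n = 5000
A = sp.diags([-1.0, 2.0, -0.5], offsets=[-1, 0, 1], shape=(n, n), format='csr')
vals = spla.eigs(A, k=6, which='LM', return_eigenvectors=False)
print(vals)   # the six largest-magnitude eigenvalues
```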

meawoppl

The following methods are designed for exactly this problem:

  • Inverse subspace iteration (available in ARPACK): http://en.wikipedia.org/wiki/Inverse_iteration
  • Filter diagonalization: http://ab-initio.mit.edu/wiki/index.php/Harminv
  • Rayleigh-Chebyshev: ftp://ftp.math.ucla.edu/pub/camreport/cam10-05.pdf

Inverse iteration is the standard approach; a sketch follows below.
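
A sketch of that standard approach, assuming a symmetric problem and SciPy: passing `sigma` to `scipy.sparse.linalg.eigsh` switches ARPACK into shift-invert mode, which returns the eigenvalues closest to the shift, so placing the shift inside $(a,b)$ targets exactly that part of the spectrum. The 1D Laplacian below is only a placeholder:

```python
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Shift-invert via ARPACK: eigsh factors (A - sigma*I) internally and
# runs Lanczos on its inverse, so convergence is fastest for the
# eigenvalues nearest sigma.  CSC format suits the sparse factorization.
n = 10000
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format='csc')
sigma = 1.0                        # shift placed inside the interval (a, b)
vals = spla.eigsh(A, k=8, sigma=sigma, which='LM', return_eigenvectors=False)
print(vals)                        # the 8 eigenvalues nearest sigma
```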

Edit: I just noticed you may be looking at nonsymmetric eigenvalue problems. Filter diagonalization can still work, but it's trickier. See www.cs.tsukuba.ac.jp/techreport/data/CS-TR-08-13.pdf

dranxo