
I have a matrix $A\in \mathbb{C}^{N\times N}$ and I need to calculate $||A^{-1}||_{2}$ efficiently. Can it be done without having to evaluate the inverse explicitly?

In general, I am looking for methods of calculating $||A^{-1}||_{2}$ that are faster than just doing the following in Matlab:

norm(inv(A))

If no faster method is possible (or, actually, in general), I am also interested in efficient ways to compute an approximation or an upper bound of $||A^{-1}||_{2}$.

Does anyone have any insight on this?

Edit: Note that the matrix $A$ in fact has the form $A = I-B$, and this $I-B$ is very well conditioned (as $A$ has arisen from preconditioning). I don't know if this makes any difference to how efficiently $||A^{-1}||_2$ can be computed, but I thought I'd mention it just in case.

sonicboom

1 Answer


You might want to use the fact that:

$$ ||A||_2=\sigma_\max(A) $$

where $\sigma_\max$ is the largest singular value. If you want the details, this Math SO question is a good place to look. Thus,

$$ ||A^{-1}||_2=\frac{1}{\sigma_\min(A)} $$ where $\sigma_\min$ is the smallest singular value.

You certainly want to avoid the explicit calculation of the inverse, and at the very least replace it with the much more numerically stable Singular Value Decomposition (SVD). Unfortunately, the SVD is also an $\mathcal O(N^3)$ operation, where $N$ is the size of your matrix.
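To make the comparison concrete, here is a minimal MATLAB sketch (the matrix below is just a hypothetical well-conditioned example of the form $A=I-B$): both lines give $||A^{-1}||_2$, but the second avoids forming the inverse explicitly.

% Hypothetical well-conditioned test matrix of the form A = I - B
N = 500;
B = 0.1*randn(N)/sqrt(N);     % ||B||_2 well below 1, so A is well conditioned
A = eye(N) - B;

nrm_inv = norm(inv(A));       % explicit inverse, then the 2-norm
nrm_svd = 1/min(svd(A));      % ||A^{-1}||_2 = 1/sigma_min(A), no inverse formed

fprintf('norm(inv(A)) = %.6e, 1/sigma_min(A) = %.6e\n', nrm_inv, nrm_svd);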

Now, if $N$ is too large, you might be interested in algorithms for estimating singular values, or even condition numbers, since

$$ \kappa_2(A)=||A||_2||A^{-1}||_2=\frac{\sigma_\max(A)}{\sigma_\min(A)} $$

which might be your actual motivation for calculating $||A^{-1}||_2$ in the first place.
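For large (especially sparse) $A$, the full SVD can be skipped in favor of iterative estimates. A sketch of what that could look like in MATLAB; note that svds with the 'smallest' option needs a reasonably recent release, and condest estimates the 1-norm condition number rather than the 2-norm, so both results should be treated as estimates.

smin = svds(A, 1, 'smallest');   % iterative estimate of sigma_min(A)
inv_norm_est = 1/smin;           % estimate of ||A^{-1}||_2

kappa1_est = condest(A);         % cheap 1-norm condition estimate, as a sanity check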

If your matrix $A$ happens to be normal ($A^HA=AA^H$), then

$$ \kappa_2(A)=\frac{|\lambda_\max(A)|}{|\lambda_\min(A)|} $$ where $A^H$ denotes the conjugate transpose, while $\lambda_\max(A)$ and $\lambda_\min(A)$ are the eigenvalues of $A$ of largest and smallest magnitude, respectively. In this case, some calculations can be simplified.
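As a sketch of that simplification (assuming $A$ really is normal), an eigenvalue solver can replace the SVD; in recent MATLAB releases, eigs accepts the 'smallestabs'/'largestabs' options.

lam_min = eigs(A, 1, 'smallestabs');   % eigenvalue of smallest magnitude
lam_max = eigs(A, 1, 'largestabs');    % eigenvalue of largest magnitude

inv_norm = 1/abs(lam_min);             % ||A^{-1}||_2 for normal A
kappa2   = abs(lam_max)/abs(lam_min);  % kappa_2(A) for normal A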


Now, what to use in practice would depend on your particular reasons for calculating this quantity, the matrix size, the available computational resources, and, potentially, the matrix structure.

Anton Menshov
  • I find the Spectra to be a useful and easy library for calculating top-k eigenvalues. For appropriate matrices, these can then be manipulated through @Anton's answer to give the desired results. – Richard Dec 30 '19 at 23:54
  • I think the formula should actually be $||A^{-1}||_2 = \frac{1}{\sigma_\text{min}(A)}$. – Wolfgang Bangerth Dec 30 '19 at 23:57
  • It might also be useful to mention that $\sigma_\text{min}(A)$ is the magnitude of the smallest singular value, not the smallest eigenvalue. – Wolfgang Bangerth Dec 30 '19 at 23:58
  • 2
    @WolfgangBangerth I have no idea how did I miss that while typing. Thanks! Worth to mention that singular values are non-negative, thus magnitude is not needed. – Anton Menshov Dec 31 '19 at 00:07
  • I have a square matrix $A$, so is the SVD necessary? Could I not equivalently compute the smallest eigenvalue? Or is the smallest singular value more efficient to compute than the smallest eigenvalue? – sonicboom Jan 02 '20 at 10:36
  • Also, I have provided some more information on $A$ in the edit to my post; I'm not sure if it makes any difference, but just in case it does. – sonicboom Jan 02 '20 at 10:37
  • @sonicboom The smallest eigenvalue won't give you what you need. You need the smallest singular value (which is directly related to the eigenvalues of $AA^H$, not of $A$ itself). – Anton Menshov Jan 02 '20 at 10:44
  • Yes but I can compute the smallest singular value by taking the square root of the smallest eigenvalue. So is it more efficient to compute the singular value directly or compute the eigenvalue and then take the square root? – sonicboom Jan 03 '20 at 12:12
  • @sonicboom That's not how it works: singular values cannot be obtained by calculating the eigenvalues of $A$ and taking square roots in the general case. See my comment above and the edits to the answer; a small check is sketched after these comments. – Anton Menshov Jan 03 '20 at 15:53
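A minimal MATLAB check of this last point, using a small hypothetical non-normal matrix: the singular values come from the eigenvalues of $A^HA$, not from the eigenvalues of $A$ itself.

A = [1 10; 0 1];               % non-normal example: eig(A) = [1; 1]

sv        = svd(A);            % true singular values (about 10.099 and 0.099)
from_AHA  = sqrt(eig(A'*A));   % matches sv (up to ordering)
from_eigA = abs(eig(A));       % [1; 1] -- does NOT match sv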