
$M$ is an $n\times n$ positive definite matrix and we would like to compute its determinant. We observe another positive definite matrix $\hat{M}$ whose entries are each within a small multiplicative factor of the corresponding entries of $M$. In particular, $\hat{M}_{ij}=c_{ij}\cdot M_{ij}$ with each $c_{ij}\in (1-\varepsilon, 1+\varepsilon).$

How is $|\hat{M}|$ related to $|M|$? In particular, how small does $\varepsilon$ have to be, in terms of $M$'s entries, for strong bounds to exist?


When $c_{ij}=c_{i}$ depends only on the row index $i$, it's immediate that $|\hat{M}|=\prod_i c_i\cdot |M|,$ since scaling row $i$ by $c_i$ scales the determinant by $c_i$.
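For instance, a quick numerical check of this row-scaling case (a numpy sketch; the matrix and factors are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary positive definite M and row-scaling factors c_i in (1 - eps, 1 + eps).
A = rng.standard_normal((4, 4))
M = A @ A.T + 4 * np.eye(4)
c = rng.uniform(0.9, 1.1, size=4)       # eps = 0.1

# c_ij = c_i: every entry of row i is scaled by c_i, i.e. M_hat = diag(c) @ M.
M_hat = np.diag(c) @ M

print(np.linalg.det(M_hat))             # det of the perturbed matrix
print(np.prod(c) * np.linalg.det(M))    # prod_i c_i * det M  (agrees)
```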

There must be some relationship, since the determinant is a polynomial in the matrix's entries and hence continuous.

  • Without more information, the most you can really say is that $$(1-\varepsilon)^n \det M < \det \hat{M} < (1+\varepsilon)^n \det M.$$ This is a tight bound as well. You can take the identity matrix and modify each entry to be arbitrarily close to $1\pm \varepsilon$, for example. – EuYu Dec 04 '17 at 23:32
  • @EuYu, That's the sort of bound I was hoping for, but it's not true. Take $M=\left[\begin{smallmatrix} a&b\\c&d \end{smallmatrix}\right]$ and $\hat{M}=\left[\begin{smallmatrix}(1+\varepsilon)\cdot a&b\\c&(1+\varepsilon)\cdot d\end{smallmatrix}\right].$ Now if all the entries are positive then $|\hat{M}|>(1+\varepsilon)^2|M|$ (see the numerical check after these comments). – Christian Chapman Dec 04 '17 at 23:42
  • Work in the orthogonal coordinates of $M$. Then use Hadamard's inequality for an explicit upper bound, and the derivative of the determinant for asymptotic power bounds. – Bananach Dec 04 '17 at 23:47
  • This question and the references therein might be useful to you. – EuYu Dec 05 '17 at 00:09
  • $\operatorname{det}(A+\delta A)-\operatorname{det}(A)=\operatorname{det}(A) \operatorname{Tr} \left (A^{-1} \delta A \right ) + o(| \delta A |)$. Here $| \cdot |$ is any norm you wish (assuming $n$ is just fixed). (See https://mathoverflow.net/questions/214908/proof-for-the-derivative-of-the-determinant-of-a-matrix) – Ian Dec 05 '17 at 23:17
  • This question contains more explicit bounds. (The one in my answer is related to Bananach's advice). – Giuseppe Negro Mar 10 '18 at 01:52
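Here is a quick numerical check of the $2\times 2$ counterexample from the comments (a sketch with arbitrary positive entries), showing that the proposed $(1+\varepsilon)^n$ upper bound can fail:

```python
import numpy as np

eps = 0.1
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])               # positive definite, all entries positive

# Scale only the diagonal entries by the maximal factor (1 + eps).
M_hat = M.copy()
M_hat[0, 0] *= 1 + eps
M_hat[1, 1] *= 1 + eps

lhs = np.linalg.det(M_hat)               # (1+eps)^2 * a*d - b*c
rhs = (1 + eps) ** 2 * np.linalg.det(M)  # proposed upper bound
print(lhs > rhs, lhs, rhs)               # True: the bound fails
```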

1 Answer


One general answer is in the paper referenced in the comment by @EuYu:

I. C. F. Ipsen and R. Rehman. "Perturbation bounds for determinants and characteristic polynomials." SIAM Journal on Matrix Analysis and Applications 30.2 (2008): 762–776.

The following holds for general complex matrices.

If $M$ is the $n\times n$ complex matrix we are interested in and $\hat{M}$ is what we observe, take $E\triangleq \hat{M}-M.$ If $M$ is invertible, then by rearranging Theorem 2.13 in that paper, we have:

$$\left|\det\hat{M}-\det M\right|\leq \left|\det M\right|\cdot \left(\left(\kappa_M\frac{\|E\|_2}{\|M\|_2}+1\right)^n-1\right)$$

and if $M$ is singular with rank $r$, by Corollary 2.11 we have: $$\left|\det\hat{M}-\det{M}\right|=\left|\det \hat{M}\right|\leq \left\|E\right\|^{n-r}_2\left(\|M\|_2+\|E\|_2\right)^r,$$

where $\|\cdot\|_2$ is the operator norm (largest singular value) and $\kappa_{M}$ is the condition number of $M$, the ratio of its largest to smallest singular value. These bounds are tight when the singular vectors of $E$ align with those of $M$.
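As a sanity check of the invertible case, here is a small numpy sketch that evaluates both sides of the Theorem 2.13 bound above on an arbitrary positive definite matrix and small perturbation (both chosen for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Arbitrary positive definite M and a small perturbation E, for illustration.
A = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)
E = 1e-3 * rng.standard_normal((n, n))
M_hat = M + E

det_M, det_M_hat = np.linalg.det(M), np.linalg.det(M_hat)
norm_M = np.linalg.norm(M, 2)          # operator norm (largest singular value)
norm_E = np.linalg.norm(E, 2)
kappa = np.linalg.cond(M, 2)           # ratio of largest to smallest singular value

lhs = abs(det_M_hat - det_M)
rhs = abs(det_M) * ((kappa * norm_E / norm_M + 1) ** n - 1)
print(lhs <= rhs, lhs, rhs)            # the bound should hold
```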


In the OP, taking $C$ to be the matrix of $c_{ij}$'s, $J$ the all-ones matrix, and $\odot$ the Hadamard (entrywise) product, we have $E=\hat{M}-M=M\odot (C-J)$, i.e. $E_{ij}=(c_{ij}-1)M_{ij}$. Since each $|c_{ij}-1|<\varepsilon$, we can bound $\|E\|_2\leq\|E\|_F\leq \varepsilon\,\|M\|_F\leq \varepsilon\sqrt{n}\,\|M\|_2$.
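As an illustration, here is a small numpy sketch (the matrix and factors are arbitrary) checking the entrywise bound $\|E\|_2\leq\|E\|_F\leq\varepsilon\,\|M\|_F$ numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
n, eps = 6, 0.05

# Arbitrary positive definite M for illustration.
A = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)

# Entrywise factors c_ij in (1 - eps, 1 + eps); E_ij = (c_ij - 1) * M_ij.
C = rng.uniform(1 - eps, 1 + eps, size=(n, n))
E = (C - 1.0) * M                       # Hadamard product M \odot (C - J)

spec_norm_E = np.linalg.norm(E, 2)      # ||E||_2 (largest singular value)
frob_bound = eps * np.linalg.norm(M, 'fro')
print(spec_norm_E <= frob_bound, spec_norm_E, frob_bound)
```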


A little more work is needed to remove the bounds' dependence on $M$, but this can be done with some simple Frobenius norm inequalities and the specific form of $E$ and $\hat{M}$ above. I've written up the details here: http://www.public.asu.edu/~cdchapm2/approximating-determinants/index.html.