
I've implemented an algorithm that calculates the cofactor matrix of an $n \times n$ matrix in $\mathcal{O}(n^5)$.

The algorithm simply iterates over all entries of the matrix ($\mathcal{O}(n^2)$ of them) and, for every $(i,j)$, calculates the determinant of the submatrix obtained by deleting row $i$ and column $j$, using the Bareiss algorithm in $\mathcal{O}(n^3)$.
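For reference, a minimal sketch of this approach in Python/NumPy (with `np.linalg.det` standing in for the Bareiss determinant, which my actual code uses for exact integer arithmetic):

```python
import numpy as np

def cofactor_minors(A):
    """O(n^5) cofactor matrix: one O(n^3) determinant per entry.
    np.linalg.det stands in here for the Bareiss algorithm."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # delete row i and column j, then take the signed minor determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C
```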

Is there a faster way to do this?

  • Computing determinants of anything is so vastly expensive that it is almost always a good question to ask what you actually need it for, and whether what you want to do could not also be done without actually computing determinants. – Wolfgang Bangerth Feb 22 '20 at 01:12
  • You are working with integers, do I understand correctly? And you need an exact integer answer even if it is going to be astronomically huge? – Federico Poloni Feb 22 '20 at 09:45
  • No, I am working with vector<vector> in C++. –  Feb 22 '20 at 14:31
  • @chrysaetos99 then I would suggest switching to proper structures for matrices (raw double*, wrappers, external libraries), as while keeping the same asymptotic complexity, you will get the results much faster. Not sure it is your goal, though. – Anton Menshov Feb 22 '20 at 19:38
  • That's a great suggestion, thanks! –  Feb 22 '20 at 19:48
  • It seems to me that this question still does not have a satisfying answer. The most interesting case is the one where the matrix is singular or almost singular; in that case, using the formula $\det(A) A^{-T}$ is either outright impossible or probably still a bad idea in terms of stability. It looks like there should be an $O(n^3)$ solution even for this case. – Federico Poloni Feb 23 '20 at 11:04
  • @FedericoPoloni For an almost singular matrix $A$, SVD might still work well, but for an exactly singular matrix with $\mathrm{det}(A) = 0$, the cofactor can't be defined uniquely. We know that $A\mathbf{C}^{T} = \mathrm{det}(A) I$; if $\mathrm{det}(A) = 0$, the equation $AX = 0$, where $X = \mathbf{C}^{T}$, has many non-trivial solutions. So which one should be picked as the cofactor? – Mithridates the Great Feb 23 '20 at 16:32
  • @AloneProgrammer The cofactor matrix is well defined even for singular matrices. Its definition is not $\det(A) A^{-1}$, that is just a characterization that works for invertible matrices. It is defined in terms of minor determinants. – Federico Poloni Feb 23 '20 at 16:41
  • @FedericoPoloni I know, but in that case, for an exactly singular matrix, you won't get better than $\mathcal{O}(n^{5})$ by going through all the first minors and computing their determinants, and I don't think there is any better way to do it practically. Of course, there are some much more complex algorithms that compute the determinant in somewhere between $\mathcal{O}(n^{2})$ and $\mathcal{O}(n^{3})$, such as the Strassen or Coppersmith–Winograd algorithms, but those algorithms might not be easy to implement or available in common linear algebra libraries. – Mithridates the Great Feb 23 '20 at 16:53
  • @AloneProgrammer You won't get better than $\mathcal O(n^5)$ [citation needed]. Personally I would be surprised if there isn't a reasonably simple $\mathcal O(n^3)$ algorithm. – Federico Poloni Feb 23 '20 at 17:55
  • @AloneProgrammer Found it in another answer on this website: Fast algorithm for computing cofactor matrix. Voting to close this as a duplicate. Note that Anton Menshov's solution is $\mathcal{O}(n^3)$. – Federico Poloni Feb 23 '20 at 20:05

2 Answers


Determinants and matrix inversion are fairly numerically unstable, but if all you are going for is speed, you can compute $A^{-1}$ in $O(n^3)$ time, and then the cofactor matrix is given by $$ C = \mathrm{det}(A)(A^{-1})^T $$
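A minimal NumPy sketch of this formula (assuming, as the answer does, that $A$ is invertible and reasonably well conditioned):

```python
import numpy as np

def cofactor_inverse(A):
    """O(n^3) cofactor matrix via C = det(A) * inv(A)^T.
    Fails for singular A and loses accuracy for ill-conditioned A."""
    return np.linalg.det(A) * np.linalg.inv(A).T
```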

whpowell96

I prefer to use the SVD (singular value decomposition) instead of calculating the inverse and determinant directly. The SVD is still $\mathcal{O}(n^{3})$ in time complexity, but I think it is much more stable. From the singular value decomposition of $A$ you have:

$$A = U \Sigma V^{T}$$

where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix. So:

$$|\mathrm{det}(A)| = \prod_{i} \mathrm{diag}(\Sigma)_{i}$$

Note the absolute value in the formula above: the only thing we know about the orthogonal factors is that $\mathrm{det}(U), \mathrm{det}(V) = \pm 1$. The inverse can also be computed from the SVD, because $U$ and $V$ are orthogonal:

$$A^{-1} = V \Sigma^{-1} U^{T}$$

So the cofactor matrix is:

$$\mathbf{C} = \mathrm{det}(A) A^{-T}$$
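Putting the pieces together, a sketch in Python/NumPy (still assuming $A$ is nonsingular, so all singular values are positive):

```python
import numpy as np

def cofactor_svd(A):
    """Cofactor matrix from the SVD A = U @ diag(s) @ Vt.
    Assumes A is nonsingular (all singular values positive)."""
    U, s, Vt = np.linalg.svd(A)
    # det(A) = det(U) * det(Vt) * prod(s), where det(U), det(Vt) = +-1
    det_A = np.linalg.det(U) * np.linalg.det(Vt) * np.prod(s)
    # A^{-1} = V @ diag(1/s) @ U^T, using orthogonality of U and V
    A_inv = Vt.T @ np.diag(1.0 / s) @ U.T
    return det_A * A_inv.T
```

This recovers the sign of $\mathrm{det}(A)$ from $\mathrm{det}(U)\,\mathrm{det}(V) = \pm 1$, which the product of singular values alone cannot provide.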

Mithridates the Great
  • I know that A^T is the transposed matrix, but what is meant by A^(-T)? Is it the transpose of the inverse? This also doesn't work if det(A) = 0, right? –  Feb 22 '20 at 16:52
  • @chrysaetos99 $A^{-T} = (A^{-1})^T$. Of course, a matrix with zero determinant does not have a cofactor matrix by this formula. – Mithridates the Great Feb 22 '20 at 20:34