6

As is known, by the polar decomposition, any square matrix can be written in the form $$M=QR$$ where $Q$ is an orthogonal matrix and $R$ is a positive-semidefinite symmetric (Hermitian) matrix.

I need to find the $Q$ factor in the case of a $3\times 3$ matrix. For this purpose I use the well-known iterative formula

$$ Q_{i+1} = \frac{1}{2}\left[ Q_i+(Q_i^{-1})^T \right] $$

where $Q_0 = M$ and $\det Q_0\neq 0$. In practice, however, it converges rather slowly (it takes more than 15 iterations to find the right answer). Are there any other, faster algorithms for computing the polar decomposition? I have found an exact formula for the $Q$ factor in the case of $2\times 2$ matrices:
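For reference, the iteration above can be sketched as follows (a minimal NumPy version; the stopping tolerance and iteration cap are my own choices, not part of the question):

```python
import numpy as np

def polar_newton(M, tol=1e-12, max_iter=100):
    """Newton iteration Q_{i+1} = (Q_i + Q_i^{-T}) / 2, starting from Q_0 = M.

    Converges to the orthogonal polar factor of a nonsingular M;
    returns the factor and the number of iterations used."""
    Q = np.array(M, dtype=float)
    for i in range(1, max_iter + 1):
        Q_next = 0.5 * (Q + np.linalg.inv(Q).T)
        # relative change as a simple convergence test
        if np.linalg.norm(Q_next - Q) < tol * np.linalg.norm(Q_next):
            return Q_next, i
        Q = Q_next
    return Q, max_iter
```

The iteration is quadratically convergent near the solution, so 15+ iterations usually indicates a badly scaled input.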

$$Q = M + \mathrm{sign}(\det M)\begin{pmatrix} d & -c\\ -b & a\\ \end{pmatrix}$$ where the initial matrix is $$M=\begin{pmatrix} a & b\\ c & d\\ \end{pmatrix}$$ Does such a formula exist for a $3\times 3$ matrix?
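The $2\times 2$ formula above produces a matrix whose two columns are orthogonal and of equal length, so a single scalar normalization makes it orthogonal; a sketch (the normalization step is my addition, and $\det M = 0$ is not handled):

```python
import numpy as np

def polar_q_2x2(M):
    """Closed-form orthogonal polar factor of a 2x2 matrix.

    Forms M + sign(det M) * [[d, -c], [-b, a]]; its columns are
    orthogonal with equal norm, so dividing by that common norm
    yields an orthogonal matrix. Assumes det M != 0."""
    a, b = M[0]
    c, d = M[1]
    s = np.sign(a * d - b * c)
    Q = M + s * np.array([[d, -c], [-b, a]])
    return Q / np.linalg.norm(Q[:, 0])
```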

J. M.
  • 3,155
  • 28
  • 37
Stepan Loginov
  • 289
  • 3
  • 11
  • Maybe this question is a bit obvious, but just to be thorough: have you tried the LAPACK QR routine? – Geoff Oxberry Nov 01 '13 at 22:09
  • It's not the QR decomposition, the original poster just used the same letters. In the polar decomposition, the $R$ is positive semidefinite and not triangular. – Nick Alger Nov 05 '13 at 01:10
  • 3
    A faster iteration than the one you are using is described in: Yuji Nakatsukasa and Nicholas J. Higham, Backward stability of iterations for computing the polar decomposition, SIAM Journal on Matrix Analysis and Applications, Vol. 33, No. 2, pp. 460-479, 2012. However, I am not sure if this is the best way if you are only working with $3\times 3$ matrices; if you find a (stable) closed form as suggested below it is probably much faster. – Federico Poloni Nov 06 '13 at 09:54

3 Answers

2

You can reduce the problem to computing the singular value decomposition, for which there exist many fast methods and codes. For fast 3x3 SVD, I found this paper.

To reduce the polar decomposition to the SVD, suppose the polar decomposition is written in the following form, $$M = U P,$$

with orthogonal $U$ and positive semidefinite $P$ (i.e., $Q \rightarrow U$, $R \rightarrow P$ in your notation). Further, denote the eigendecomposition of $P$ by $$P = V \Sigma V^T.$$

Substituting the second equation into the first yields, $$M = U V \Sigma V^T.$$

In other words, if you compute the SVD of $M$, $$M = W \Sigma V^T,$$

then $P$ is given by the above formula, and $$U = W V^T.$$
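The reduction above is a few lines in NumPy (a sketch, using `numpy.linalg.svd`; the factor names follow the answer's notation):

```python
import numpy as np

def polar_via_svd(M):
    """Polar decomposition M = U P via the SVD M = W Sigma V^T."""
    W, sigma, Vt = np.linalg.svd(M)
    U = W @ Vt                        # orthogonal factor: U = W V^T
    P = Vt.T @ np.diag(sigma) @ Vt    # positive semidefinite: P = V Sigma V^T
    return U, P
```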


Incidentally, when you want to prove the existence of SVD-like decompositions for infinite dimensional operators, one basic strategy is to start with the polar decomposition which is easier to prove, and then do these steps backwards.

Nick Alger
  • 3,143
  • 15
  • 25
  • I think you need to use the conjugate transpose instead of the regular transpose. – Azmisov May 14 '14 at 22:39
  • 1
    Sure, if your input matrix is complex-valued then you need the conjugate transpose. However, if it is real valued then $V$ is real-valued so it doesn't matter. (unlike the eigenvalue decomposition where real matrices could have complex eigenvectors) – Nick Alger May 14 '14 at 23:32
1

A more recent publication has come out with a new method for solving the 3x3 polar decomposition.

An algorithm to compute the polar decomposition of a 3x3 matrix

(I'm really surprised the OP needed 15 iterations for the iterative method!)

Praxeolitic
  • 143
  • 6
1

Section 2.5 in Continuum Mechanics by A.J.M. Spencer is devoted to the 3x3 polar decomposition.

k20
  • 772
  • 3
  • 3