
While doing a computation, I needed to take numerical derivatives of eigenvectors of a 4x4 Hermitian matrix with respect to a parameter. I ran into the issue of phase jumps -- an unpredictable random phase attached to numerically computed eigenvectors -- and played with fixes like forcing the phase of the first element to be zero, setting the weighted average of the phases to zero, etc.
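For concreteness, the simplest of those fixes looked something like this (fixPhase is just an illustrative name, not code from my actual notebook):

    (* rotate the eigenvector so its first element is real and non-negative *)
    fixPhase[v_] := v Exp[-I Arg[v[[1]]]]

This pins the overall phase at each parameter value, but it misbehaves whenever the first element passes near zero, where Arg becomes noisy.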

Since this is only a 4x4 matrix, though, I can avoid the jumps by solving the full eigensystem analytically for an arbitrary Hermitian matrix and then plugging my matrix values into that result to get eigenvectors. Doing this at two nearby parameter values lets me compute the derivative by finite differences (a sketch of that step follows the code). Here's what I did:

x = {{aa/2, ab, ac, ad}, {0, bb/2, bc, bd}, {0, 0, cc/2, cd}, {0, 0, 0, dd/2}};
m = (x + ConjugateTranspose[x]);
mEigensystem = Eigensystem[m, Cubics -> True, Quartics -> True];

(* substitute the numeric entries of matrix into the precomputed symbolic eigensystem *)
give4x4Eigensystem[matrix_] :=
  mEigensystem /. {aa -> matrix[[1, 1]], ab -> matrix[[1, 2]],
    ac -> matrix[[1, 3]], ad -> matrix[[1, 4]], bb -> matrix[[2, 2]],
    bc -> matrix[[2, 3]], bd -> matrix[[2, 4]], cc -> matrix[[3, 3]],
    cd -> matrix[[3, 4]], dd -> matrix[[4, 4]]}
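The derivative step then looks roughly like this, where matrixOfLambda stands in for my actual parameter dependence and h is the step size (both names are placeholders):

    (* central finite difference of the eigenvectors, part [[2]] of the eigensystem *)
    eigenvectorDerivative[lambda_, h_] :=
     (give4x4Eigensystem[matrixOfLambda[lambda + h]][[2]] -
        give4x4Eigensystem[matrixOfLambda[lambda - h]][[2]])/(2 h)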

Not the most elegant solution, but it gave me the right answers. Unfortunately, the calculation took surprisingly long.

I understand that the formulas I'm plugging elements into are very large, but they're stored in memory, and all Mathematica has to do is substitute numbers into these formulas and evaluate.

I'm not great at Mathematica, so I'm wondering: is there something obvious I'm missing that slows this approach down? Am I implicitly asking it to symbolically simplify the huge expressions involved? How could I speed it up?

Thanks!

Steve
  • You can gain a factor of 25 in speed by setting Cubics -> False, Quartics -> False. – Roman Mar 14 '20 at 12:12
  • Finite differences are usually a bad idea. Instead, matrix perturbation theory might be precisely what you need (a sketch follows these comments). – Roman Mar 14 '20 at 12:18
  • related: https://mathematica.stackexchange.com/q/100647/34893 – AccidentalFourierTransform Mar 14 '20 at 17:29
  • Thanks for the response -- why are finite differences a bad idea? – Steve Mar 14 '20 at 18:43
  • Eigenvectors are only defined up to scalar multiple. What exactly is it that you wish to do here? A full example would be helpful. – Daniel Lichtblau Mar 14 '20 at 21:10
  • @DanielLichtblau hey so I was taking a derivative of a unitary matrix, which is built from eigenvectors of a hermitian matrix. But random phases were being added to the numerical eigenvectors Mathematica spat out. I thought this would be a way to resolve that issue. – Steve Mar 15 '20 at 19:31
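For reference, the perturbation-theory route Roman suggests gives the eigenvector derivatives from a single diagonalization, with no finite-difference step and no phase matching between two nearby solves. A minimal sketch, assuming nondegenerate eigenvalues; h0 is the numeric Hermitian matrix and dh its derivative with respect to the parameter (both names are placeholders):

    (* first-order perturbation theory: dv_n = Sum_{m != n} <v_m|dH|v_n>/(E_n - E_m) v_m *)
    eigenvectorDerivatives[h0_, dh_] :=
     Module[{vals, vecs},
      {vals, vecs} = Eigensystem[h0];
      Table[
       Sum[(Conjugate[vecs[[m]]].dh.vecs[[n]])/(vals[[n]] - vals[[m]]) vecs[[m]],
        {m, DeleteCases[Range[Length[vals]], n]}],
       {n, Length[vals]}]]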

1 Answer

You can try to autocompile this:

x = {{aa/2, ab, ac, ad}, {0, bb/2, bc, bd}, {0, 0, cc/2, cd}, {0, 0, 0, dd/2}};
m = (x + ConjugateTranspose[x]);
mEigensystem = Eigensystem[m, Cubics -> True, Quartics -> True];

evalMX =
  With[{mm = mEigensystem[[2]] // N},
   (* inert lowercase heads keep Compile and Block from evaluating too
      early: the large numeric expression mm is spliced in literally,
      then compile/block are swapped for the real heads and ReleaseHold
      triggers the actual compilation *)
   Hold[compile[
       {{mx, _Complex, 2}},
       block[
        {
         aa = mx[[1, 1]], ab = mx[[1, 2]], ac = mx[[1, 3]], ad = mx[[1, 4]],
         bb = mx[[2, 2]], bc = mx[[2, 3]], bd = mx[[2, 4]],
         cc = mx[[3, 3]], cd = mx[[3, 4]], dd = mx[[4, 4]]
         },
        mm
        ],
       CompilationTarget -> "C"
       ]
      ] /. {compile -> Compile, block -> Block} // ReleaseHold
   ];

evalMX[{{1, 2, 3, 4}, {2, 3, 4, 1}, {3, 4, 1, 2}, {4, 1, 2, 3}}] // Re // RepeatedTiming

{7.0*10^-6, {{-2.41421, -1., 2.41421, 1.}, {-1., 1., -1., 1.}, {0.414214, -1., -0.414214, 1.}, {1., 1., 1., 1.}}}
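As a quick sanity check, the compiled function should agree with the slow substitution route from the question up to numerical error (this snippet assumes give4x4Eigensystem from the question is still defined, and reuses the same test matrix):

    testM = {{1., 2, 3, 4}, {2, 3, 4, 1}, {3, 4, 1, 2}, {4, 1, 2, 3}};
    Max@Abs[evalMX[testM] - N[give4x4Eigensystem[testM][[2]]]]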
b3m2a1
  • This works incredibly well. I didn't even know this was possible in Mathematica. Thanks so much! – Steve Mar 14 '20 at 18:44
  • One of Mathematica's great uses is for automated code generation. Given how simple this kind of stuff is, you could imagine also marshalling down the expressions to something that could be used directly in another language if you primarily work in a different one. – b3m2a1 Mar 14 '20 at 20:46
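For instance, a minimal sketch of that marshalling idea: the built-in CForm prints an expression in C-like syntax, so the first component of the first symbolic eigenvector could be dumped with

    mEigensystem[[2, 1, 1]] // CForm

though the output is enormous and constructs like Conjugate[ab] would still need hand translation, so this is only a starting point.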