
I have to solve a system of fixed-point equations and then calculate the eigenvalues of the Jacobian at the fixed point. There are about 50 equations in 50 variables, and they contain a lot of numerical integrals, so it would be really hard for me to give an explicit working example. I have the system of equations in the form

eqnSys={expr1,expr2,expr3,..,expr50};

The point is, it is a list of expressions, each of which has to equal zero in the end; it is not defined as a function. I have the variables saved in

eqnVars={x1,x2,..,x50};

I have a very good initial guess for the root finding:

eqnGuess={{x1,1},{x2,2},..,{x50,50}};

The basic Newton method of FindRoot breaks down immediately, complaining about a singular Jacobian. So this

FindRoot[eqnSys,eqnGuess]

does not work. For a long time I used the secant method to find the roots:

eqnGuessSec={{x1,0.9,1.1},{x2,1.8,2.2},..,{x50,45,55}};
FindRoot[eqnSys,eqnGuessSec]

Recently I came across the "AffineCovariantNewton" method for FindRoot, which works like a charm and outperforms the secant method in time by a factor of 4:

FindRoot[eqnSys,eqnGuess,Method -> {"AffineCovariantNewton"}]

Judging from the evaluation monitor, it performs several Jacobian evaluations:

FindRoot[eqnSys,eqnGuess,Method -> {"AffineCovariantNewton"},Jacobian -> {Automatic, EvaluationMonitor :> Print["J evaluated here"]}]

My question is: it would be really valuable for me to be able to save the Jacobian directly from FindRoot. Is it possible to extract the Jacobian matrix constructed by FindRoot? I'm thinking of something like

Reap@FindRoot[eqnSys,eqnGuess,Method -> {"AffineCovariantNewton"},Jacobian -> {Automatic,Sow[jacobian]}]

I'm only interested in the purely numerical matrix, not a symbolic one. Bonus question: what is the most efficient way to turn the system of equations into a function? Something like

FeqnSys[x1_,x2_,...,x50_]:=eqnSys
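One idea I had, though I don't know whether it is the idiomatic way, is to build a pure function with the variable list and the pre-evaluated system injected into its body (just a sketch using the eqnVars and eqnSys from above):

```mathematica
(* force eqnVars and eqnSys to evaluate inside the otherwise held Function body *)
feqnSys = Function[Evaluate[eqnVars], Evaluate[eqnSys]];
(* feqnSys @@ values then gives the residuals at the given numeric values *)
```

but maybe there is something faster.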

Edit: I have implemented a very simple version of the problem. Increasing UTrunc increases the number of equations (but then additional initial conditions are needed). I basically need the object with the name ineedthisguy. I hoped that it could be obtained without this analytical differentiation, because for the real problem I can only generate the full matrix in chunks due to memory limitations.

d = 3;
WorPrec = 16;
\[Alpha] = 1;

UTrunc = 6;

Z[r_] := 0
W[r_] := 0

U[r_] := Sum[ToExpression["u" <> ToString[n]]/n! (r - \[Kappa])^n, {n, 2, UTrunc}];
\[Omega][r_] := U'[r] + 2 r U''[r]

MasterKernel1[d_, n1_, \[Omega]_?NumericQ, w_?NumericQ] :=
 MasterKernel1[d, n1, \[Omega], w] =
  -2 \[Alpha] NIntegrate[
    E^-y y^(-1 + d/2) (1 + y) (y + w y^2 + E^-y \[Alpha] + \[Omega])^-n1,
    {y, 0, \[Infinity]},
    Method -> {Automatic, "SymbolicProcessing" -> False},
    WorkingPrecision -> WorPrec]

Derivative[1][MasterL[n_, d_]][\[Rho]_] :=
 Derivative[1][MasterL[n, d]][\[Rho]] =
  -n (MasterL[n + 1, d][\[Rho]] \[Omega]'[\[Rho]] +
     MasterL[n + 1, d + 2][\[Rho]] Z'[\[Rho]] +
     MasterL[n + 1, d + 4][\[Rho]] W'[\[Rho]])

MasterL[n_, d_][\[Kappa]_] := MasterL[n, d][\[Kappa]] = MasterKernel1[d, n, 2 \[Kappa] u2, W[\[Kappa]]]

BetaU[r_] := -d U[r] + (d - 2) r U'[r] - 1/(4 \[Pi]^2) MasterL[1, d][r]

dExpr[f_, betafunc_, n_] := D[k D[f[r], k] == betafunc[r], {r, n}]

GenBeta[f_, betafunc_, min_, max_] :=
  Block[{expr, result, tmpres},
   expr = dExpr[f, betafunc, min];
   result = {(expr /. r -> \[Kappa])};
   Do[
    expr = D[expr, r];
    tmpres = Block[{r = \[Kappa]}, expr];
    result = Join[result, {tmpres}];,
    {i, min + 1, max}];
   Return[result];];

listU = GenBeta[U, BetaU, 1, UTrunc];
listU[[1]] = Thread[-listU[[1]]/u2, Equal];

FPEqn = ((Flatten@(List @@@ Flatten[listU]))[[2 ;; ;; 2]]);

varTrf = {g_[n_] :> ToExpression[ ToString[g] <> ToString[n]]};
varList = Flatten[{\[Kappa], Table[u[i], {i, 2, UTrunc}]}];

iniGuess = 
  Rationalize[
   List @@@ {\[Kappa] -> 0.04174875412610417566172053373396096686`12.,
      u2 -> 6.14584037490485804822706857376675685878`12., 
     u3 -> 60.04918116532118965443749174665446530096`12., 
     u4 -> 390.9010607033057646222`12., 
     u5 -> -3513.6112140902988423965`12., 
     u6 -> -93676.7079827356649900999`12.}, 0];
(*real solution:
{\[Kappa] -> 0.0726928522670547`, u2 -> 4.570711765672155`,
 u3 -> 28.871831592476088`, u4 -> 134.9966784017132`,
 u5 -> -371.15673934569224`, u6 -> -14195.11815231752`}
*)

fOPT = Experimental`OptimizeExpression[FPEqn, 
   "OptimizationLevel" -> 2]; (*I'm not sure whether this helps*)

fpLocator[initial__] := 
  FindRoot[fOPT // First, List @@@ initial, 
   Method -> {"AffineCovariantNewton"}, WorkingPrecision -> WorPrec, 
   StepMonitor :> {Print[initial[[All, 1]]]} ] ;

sol = fpLocator[iniGuess]


Derivative[0, 0, 1, 0][MasterKernel1][d_, n1_, \[Omega]_?NumericQ, w0_?NumericQ] :=
 Derivative[0, 0, 1, 0][MasterKernel1][d, n1, \[Omega], w0] =
  -n1 MasterKernel1[d, n1 + 1, \[Omega], w0]

Derivative[0, 0, 0, 1][MasterKernel1][d_, n1_, \[Omega]_?NumericQ, w0_?NumericQ] :=
 Derivative[0, 0, 0, 1][MasterKernel1][d, n1, \[Omega], w0] =
  -n1 MasterKernel1[d + 4, n1 + 1, \[Omega], w0]

ineedthisguy = Eigenvalues[D[FPEqn, {varList /. varTrf}] /. sol]
dzsoga
  • Do you just want D[eqnSys /. Equal -> Subtract, {eqnVars}]? If not, it must depend on eqnSys, which you haven't shared. The Jacobian might be evaluated numerically, and then there is no single matrix; the matrix is updated at each step (unless the method accepts the "UpdateJacobian" option, in which case you can specify how often it is updated). – Michael E2 Sep 03 '20 at 13:08
  • I can generate the Jacobian symbolically, but it takes a lot of time both to generate it and to substitute the values into it. I only need the numerically evaluated matrix, with purely numerical entries. – dzsoga Sep 03 '20 at 13:19
  • Do you want each matrix at each step, or just one matrix at some particular point in the domain? – Michael E2 Sep 03 '20 at 13:23
  • I only want the last one, which corresponds to the solution itself or close to it (the last few digits of the solution do not really matter). – dzsoga Sep 03 '20 at 13:26
  • Even though your real problem is too detailed to post, could you add a toy problem to give people something to play with? – Chris K Sep 03 '20 at 14:37
  • Sure, I'll cook something up! – dzsoga Sep 03 '20 at 14:49
  • +1 to this question. I've wondered before if it's possible to directly obtain the Jacobians/Hessians used by functions like FindRoot and FindMinimum. – Sjoerd Smit Sep 03 '20 at 15:56
  • 2
    You could try complex-step differentiation. I worked up an example before you posted yours, and it relied on the system being made up of Listable expressions; you could adapt it. Here's key code: jac = Im[ sys /. Thread[ vars -> x0 + $MachineEpsilon*I*IdentityMatrix[Length@vars] ] ]/$MachineEpsilon, but this was faster for me: jac = Hold@Evaluate@vars /. Hold[v_] :> Block[v, v = x0 + $MachineEpsilon*I*IdentityMatrix[Length@vars]; Im@sys/$MachineEpsilon ] – Michael E2 Sep 03 '20 at 17:38
  • @MichaelE2 I have tried your suggestion: SetAttributes[MasterKernel1, Listable]; vars = varList /. varTrf; jac = Im[FPEqn /. Thread[vars -> vars + \[Epsilon]*I*IdentityMatrix[Length@vars]]]/\[Epsilon] /. sol; jac // MatrixForm. Aside from speed, I am puzzled: the matrix is identical to the analytically computed one (in the post) for $10^{-6} < \epsilon < 10^{-4}$, but it is shifted away from the correct values for $\epsilon \to 0$. (I would expect the method to approach the correct values as $\epsilon \to 0$.) – dzsoga Sep 03 '20 at 19:16

1 Answer


Here is a way to extract the Jacobian. The idea is to rewrite the linear solver to Sow the Jacobian and then to Reap it.

Set up a simple problem:

f[X_] := Block[{x, y}, {x, y} = X; {Exp[x - 2] - y, y^2 - x}]
vars = {x, y};
start = {1, 1};
callFindRoot[f_, vars_, start_, opts___] := 
 vars /. FindRoot[f[vars], Evaluate[Transpose[{vars, start}]], opts]

Rewrite the linear solver to Sow the Jacobian:

MyLinearSolver = (Sow[#]; LinearSolve[##]) &;

Call FindRoot:

Reap[callFindRoot[f, vars, start, Method -> {"AffineCovariantNewton"
    , "LinearSolver" -> {MyLinearSolver}
    (*,"BroydenUpdates"\[Rule]False*)
    }]]

(* {{0.019026, 0.137935}, {{{{0.367879, -1.}, {-1., 2.}}, {{0.295112, -1.}, {-1., 1.7796}}, {{0.191064, -1.}, {-1., 1.28831}}, {{0.096665, -1.}, {-1., 0.121763}}}}} *)

I'll add this example to the documentation.
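If you only need the final Jacobian (the one at or near the solution), a small variation of the same idea stores the most recent matrix in a variable instead of sowing all of them. This is just a sketch; lastJacobian and MyLastLinearSolver are names I made up here:

```mathematica
(* keep only the most recent matrix handed to the linear solver *)
lastJacobian = None;
MyLastLinearSolver = (lastJacobian = #1; LinearSolve[##]) &;

callFindRoot[f, vars, start,
  Method -> {"AffineCovariantNewton", "LinearSolver" -> {MyLastLinearSolver}}];

lastJacobian (* matrix from the final step *)
```

Note that with Broyden updates enabled the later matrices are cheap updates rather than freshly evaluated Jacobians; switch them off with "BroydenUpdates" -> False (the commented-out option above) if you want a true Jacobian at every step.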

To turn the equations into a function you could use something along the lines of:

cf = With[{vars = vars, fun = f[vars]},
  Compile[{{X, _Real, 1}},
   Block[vars,
    vars = X;
    fun
    ]
   ]
  ]
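The compiled function can then be applied to a numeric vector, e.g. for the toy system above:

```mathematica
cf[{1., 1.}]
(* {-0.632121, 0.} *)
```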
user21