
I, like others before me, am struggling with FindFit. So, I tried a simple example.

data2 = Table[{x, x^2}, {x, -2, 2, 0.1}]

ListPlot[data2]

FindFit[data2, a x^d, {a, d}, x]

Which then produces the error:

"FindFit::nrjnum: The Jacobian is not a matrix of real numbers at {a,d} = {1.,1.}"

The plot looks good, but the error message doesn't mean anything to me. When the lower end of the x range is greater than zero, FindFit works fine and gives the right result. When the lower end of the range is zero or less, the error message comes back.
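
For example, starting the range at a positive value works:

FindFit[Table[{x, x^2}, {x, 0.1, 2, 0.1}], a x^d, {a, d}, x]

(* ==> {a -> 1., d -> 2.} *)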

Two questions, please: what does the error message mean, and how can I avoid running into it?

asked by Francis King, edited by Dr. belisarius

1 Answer


There are two reasons why it doesn't work.

You have negative x values, and raising these to fractional powers d gives complex results. For this reason the form a x^d is not suitable for fitting this data; you could use a Abs[x]^d instead.
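
For instance, raising a negative number to a non-integer power evaluates to a complex number:

(-1.5)^2.5

(* ==> a complex result, approximately 0. + 2.75568 I *)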

With this change it still won't work, though. The other reason is that FindFit works by minimizing (using FindMinimum) the sum of squared errors. The data includes the value x == 0, so the sum of squared errors contains 0^d as a subexpression. Most minimization methods calculate the gradient of the function being minimized, and Mathematica will try to do this symbolically. The derivative of 0^d with respect to d evaluates to 0^d Log[0], which yields Indeterminate and is what ultimately causes the trouble.
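
You can see this directly by evaluating the symbolic derivative at the offending point:

D[x^d, d] /. {x -> 0., d -> 1.}

(* ==> Indeterminate *)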

There are two possible solutions. First, remove the point where x == 0:

FindFit[DeleteCases[data2, {x_, _} /; x == 0], a Abs[x]^d, {a, d}, x]

(* ==> {a -> 1., d -> 2.} *)

Second, use a minimization method that doesn't calculate the gradient:

FindFit[data2, a Abs[x]^d, {a, d}, x, Method -> "PrincipalAxis"]

(* ==> {a -> 1., d -> 2.} *)
– Szabolcs
  • Oh, you beat me to it. The reason a Abs[x]^d doesn't work is because the Jacobian is Indeterminate at x = 0. Delete that from data and it works. – Michael E2 Mar 20 '14 at 22:46
  • @MichaelE2 It gives me the same error – Dr. belisarius Mar 20 '14 at 22:49
  • Nice analysis. +1 – ciao Mar 20 '14 at 22:51
  • @MichaelE2 I still don't understand something. The derivative of 0^d is well defined if d>0. So if it is approximated numerically (not symbolically) then there shouldn't be a problem. Yet all the methods except PrincipalAxis fail because of it. How can we force Mma to approximate the gradient purely numerically other than making the function a numerical black box? – Szabolcs Mar 20 '14 at 23:12
  • @MichaelE2 Also, FindMinimum[0^d + 2^d - 4, {d, 1}, Method -> "Newton"] fails as expected but FindMinimum[0^d + 2^d - 4, {d, 1}, Method -> "QuasiNewton"] works. So why doesn't QuasiNewton work for the fitting? And why does FindMinimum[N[0^d + 2^d - 4], {d, 1}, Method -> "Newton"] not fail? – Szabolcs Mar 20 '14 at 23:20
  • The Jacobian D[a Abs[x]^d, {{a, d}}] is {Abs[x]^d, a Abs[x]^d Log[Abs[x]]}, which is Indeterminate at x = 0. Any derivative-based algorithm is likely to have trouble. The "PrincipalAxis" method is derivative-free. It's the only derivative-free method I know.... – Michael E2 Mar 20 '14 at 23:29
  • Neither FindMinimum method works for me. "QuasiNewton" uses the (symbolic) first derivative and numerically estimates the second derivative (or Hessian). – Michael E2 Mar 20 '14 at 23:37
  • @MichaelE2 "QuasiNewton uses the (symbolic) first derivative and numerically estimates the second derivative (or Hessian)." <-- OK, that explains it. This is only a problem when doing the derivative symbolically. – Szabolcs Mar 20 '14 at 23:38
  • @belisarius Szabolcs's code that deletes 0 works for me. – Michael E2 Mar 20 '14 at 23:40
  • @MichaelE2 I'm trying data2 = Table[{x, x^2}, {x, -2, 2, 0.1}]; FindFit[DeleteCases[data2, {x_, _} /; x == 0], a x^d, {a, d}, x] That's why we get different results – Dr. belisarius Mar 20 '14 at 23:51