One can optimize within NDSolve.
OP's code, for comparison:
SeedRandom[0]; (* for reproducibility *)
pts = Table[{i, RandomReal[1]}, {i, 10}];
AbsoluteTiming[
s = NDSolve[{y'[x] == y[x] Cos[x + y[x]], y[0] == 1}, y, {x, 0, 11}];
Sum[NMinimize[
{(x - pts[[i, 1]])^2 + ((y[x] /. s[[1]]) - pts[[i, 2]])^2,
0 <= x <= 11}, x][[1]],
{i, Length[pts]}] / Length[pts]
]
(* {0.784672, 0.23634} *)
Collect the critical points of the distance inside NDSolve with WhenEvent: at a critical point the tangent vector {1, y'[x]} is perpendicular to the vector from the curve to the point p. Sow each such distance tagged by dist[p], and Reap the minimum for each point:
AbsoluteTiming[
dists = Last@Reap[
sol = First@ NDSolve[{y'[x] == y[x] Cos[x + y[x]], y[0] == 1,
Table[ (* a crit-pt event for each point *)
With[{p = p},
WhenEvent[Dot[{1, y'[x]}, p - {x, y[x]}] == 0,
Sow[Norm[p - {x, y[x]}], dist[p]]]], (* sow distance tagged by dist[p] *)
{p, pts}]},
y, {x, 0, 11}];
Do[ (* sow end points *)
Sow[Norm[p - {0., y[0] /. sol}], dist[p]];
Sow[Norm[p - {11., y[11] /. sol}], dist[p]],
{p, pts}],
dist /@ pts, (* tags for reaping point by point *)
Min[#2] &]; (* applied to each tag group of distances *)
Total[dists, 2]/Length@pts
]
(* {0.00991, 0.406317} *)
Almost 80 times faster.
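Note that the two results differ because the OP's code averages the squared distances, whereas the Reap approach averages the distances themselves. Wrapping each NMinimize value in Sqrt makes the comparison apples-to-apples (assuming pts and s from the OP's code above):

Sum[Sqrt@First@
    NMinimize[{(x - pts[[i, 1]])^2 + ((y[x] /. s[[1]]) - pts[[i, 2]])^2,
      0 <= x <= 11}, x],
  {i, Length[pts]}]/Length[pts]
(* 0.406317 *)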
This is just as fast, or a bit faster, using Nearest to get a good seed for FindMinimum. Newton's method is also fast, but it must be applied to an unconstrained problem, which could pose difficulties if the distance from a given point to the curve is still decreasing at an endpoint of the curve.
AbsoluteTiming[
s = NDSolve[{y'[x] == y[x] Cos[x + y[x]], y[0] == 1}, y, {x, 0, 11}];
nf = Nearest[
Transpose@{Flatten[y["Coordinates"] /. s], y["ValuesOnGrid"] /. First[s]} ->
Flatten[y["Coordinates"] /. s]];
Sum[Sqrt[FindMinimum[
(x - pts[[i, 1]])^2 + ((y[x] /. s[[1]]) - pts[[i, 2]])^2,
{x, First@nf@pts[[i]]}, Method -> "Newton"][[1]]],
{i, Length[pts]}]/Length[pts]
]
FindMinimum::lstol: The line search decreased the step size to within the tolerance specified by AccuracyGoal and PrecisionGoal but was unable to find a sufficient decrease in the function. You may need more than MachinePrecision digits of working precision to meet these tolerances.
(* {0.008901, 0.406317} *)
I'm not sure why FindMinimum has trouble with one of the points (the first in this case). That sometimes happens when using interpolating functions, but things seem well behaved near the minimum.
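Since the returned minimum looks fine, the FindMinimum::lstol message can be suppressed for just that message with Quiet; a minimal sketch (assuming s, pts, and nf from above):

Quiet[
 FindMinimum[(x - pts[[1, 1]])^2 + ((y[x] /. s[[1]]) - pts[[1, 2]])^2,
  {x, First@nf@pts[[1]]}, Method -> "Newton"],
 FindMinimum::lstol]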
Comments:

Replacing NMinimize with FindMinimum gives a $\sim 2$ factor speedup. NMinimize is for global minimization problems, while FindMinimum is for local minimization; in this case the latter is probably sufficient. Further, removing the constraint on x from FindMinimum also gives you a dramatic improvement ($\sim 0.01$ s on my laptop to run the minimization vs. the $\sim 0.9$ s with the original version). – glS Jan 02 '17 at 11:55

Changing s = NDSolve[...] to ss = NDSolveValue[...] and doing

len = Length@pts;
AbsoluteTiming[
 Total[Sqrt[Table[
     FindMinimum[(x - i)^2 + (ss[x] - pts[[i, 2]])^2, {x, 5.}][[1]],
     {i, len}]]]/len
]

gives an answer in 0.006 s on my laptop, compared to 0.7 s. A broader note: since you are doing a least-squares problem, FindMinimum has a method for that. – Marius Ladegård Meyer Jan 02 '17 at 13:11