@JimBelk asked in Interpolating an Antiderivative how to find a numerical antiderivative. I gave an answer that uses NDSolve with its default method for integrating $y'=f(x,y)$. However, for $f(x,y)=g(x)$, more powerful integration rules are available, such as Gauss-Kronrod.

Is there a way to use NIntegrate integration rules within NDSolve to solve IVPs of the form y'[x] == f[x]?

For instance, to find

NDSolve[{y'[x] == Sin[x^2], y[0] == 0}, y, {x, 0, 15}]

with a method from NIntegrate.
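
(For comparison, a minimal per-point sketch, with antiderivative as a placeholder name: wrapping NIntegrate and a Gauss-Kronrod rule in a numeric function gives single values, whereas NDSolve returns an InterpolatingFunction over the whole interval.)

antiderivative[x_?NumericQ] :=  (* placeholder name *)
 NIntegrate[Sin[t^2], {t, 0, x}, Method -> "GaussKronrodRule"]

antiderivative[15]  (* one value at a time, not an interpolation *)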

Michael E2

1 Answer

We construct an NDSolve method, nintegrate, that passes a method specification to NIntegrate to set up an integration rule, which is then used to take the integration steps. The requirements are

  • the ODE is of the form y'[x] == f[x], and
  • the NIntegrate method returns an interpolatory rule.

Example:

foo = NDSolveValue[{y'[x] == Sin[x^2], y[0] == 0}, y, {x, 0, 15}, 
  Method -> nintegrate, InterpolationOrder -> All]
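
A quick sanity check (a sketch, not part of the answer proper): compare the endpoint value against NIntegrate directly; the difference should be near NDSolve's default tolerances.

foo[15] - NIntegrate[Sin[x^2], {x, 0, 15}]  (* expect a small difference *)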

Error plot (RealExponent gives the base-10 exponent of the error):

Plot[
 Evaluate@RealExponent[Integrate[Sin[x^2], x] - foo[x]],
 {x, 0, 15},
 GridLines -> {Flatten@foo@"Grid", None}, (* show steps *)
 PlotRange -> {-18.5, 0.5}]

Another example, using a 33-point Clenshaw-Curtis rule at higher precision; StartingStepSize -> 15 and MaxStepFraction -> 1 allow the first step to span the whole interval:

foo = NDSolveValue[{y'[x] == Sin[x^2], y[0] == 0}, y, {x, 0, 15}, 
  Method -> {nintegrate, 
    Method -> {"ClenshawCurtisRule", "Points" -> 33}}, 
  InterpolationOrder -> All, WorkingPrecision -> 32, 
  PrecisionGoal -> 24, MaxStepFraction -> 1, StartingStepSize -> 15]
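
The steps the method actually takes can be inspected through the same "Grid" property used for the grid lines in the plots (a usage note, not in the original):

Flatten[foo["Grid"]]  (* the time steps taken by the method *)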

Error plot:

Block[{$MaxExtraPrecision = 500},
 ListLinePlot[
  Integrate[Sin[x^2], x] - foo[x] /. x -> Subdivide[0, 15, 1000] // 
   RealExponent, DataRange -> {0, 15}, PlotRange -> {-35.5, 0.5}, 
  GridLines -> {Flatten@foo@"Grid", None}]
 ]

Code for method

nintegrate::nintode = 
  "Method nintegrate requires an ode of the form ``'[``] == f[``]";
nintegrate::nintinit = 
  "NIntegrate method `` did not return an interpolatory integration rule.";

nintegrate[___]["StepInput"] = {"F"["T"], "H", "T", "X", "XP"}; nintegrate[___]["StepOutput"] = {"H", "XI"}; nintegrate[rule_, order_, ___]["DifferenceOrder"] := order; nintegrate[___]["StepMode"] := Automatic

Options@nintegrate = {Method -> "GaussKronrodRule"};

(* Map the number of abscissas and the rule type to the difference order *)
getorder[points_, method_] :=
  Switch[method,
   "GaussKronrodRule" | "GaussKronrod",
   (* check: points should be odd? *)
   With[{gp = (points - 1)/2}, If[OddQ[gp], 3 gp + 2, 3 gp + 1]],
   "LobattoKronrodRule",
   (* check: points should be odd? *)
   With[{glp = (points + 1)/2}, If[OddQ[glp], 3 glp - 2, 3 glp - 3]],
   "GaussBerntsenEspelidRule", 2 points - 1,
   "NewtonCotesRule", If[OddQ[points], points, points - 1],
   _, points - 1];

nintegrate /: 
  NDSolve`InitializeMethod[nintegrate, stepmode_, sd_, rhs_, state_, 
   mopts : OptionsPattern[nintegrate]] :=
  Module[{prec, order, norm, rule, xvars, tvar, imeth},
   xvars = NDSolve`SolutionDataComponent[state@"Variables", "X"];
   tvar = NDSolve`SolutionDataComponent[state@"Variables", "T"];
   If[Length@xvars != 1,
    Message[nintegrate::nintode, First@xvars, tvar, tvar];
    Return[$Failed]];
   If[! VectorQ[
      rhs["FunctionExpression"][
       N@NDSolve`SolutionDataComponent[sd, "T"], Sequence @@ xvars],
      NumericQ],
    Message[nintegrate::nintode, First@xvars, tvar, tvar];
    Return[$Failed]];
   prec = state@"WorkingPrecision";
   norm = state@"Norm";

   imeth = Replace[Method /. mopts, Automatic -> "GaussKronrodRule"];
   (* Run a dummy NIntegrate and grab the integration rule via the monitor *)
   rule = NIntegrate[1, {x, 0, 1},
     Method -> {"GlobalAdaptive", "SymbolicProcessing" -> 0, Method -> imeth},
     WorkingPrecision -> prec,
     IntegrationMonitor :> (Return[Through[#@"GetRule"], NIntegrate] &)];
   rule = Replace[rule, {
      {(NIntegrate`GeneralRule | NIntegrate`ClenshawCurtisRule)[idata_]} :> idata,
      _NIntegrate :> Return[$Failed],
      _ :> (* What happened here? *)
       (Message[nintegrate::nintinit, Method -> imeth]; Return[$Failed])}];
   order = getorder[Length@First@rule, imeth /. {m_String, ___} :> m];

   (* the method data object carried between steps *)
   nintegrate[rule, order, norm]];

(rule : nintegrate[int_, order_, norm_, ___])["Step"[rhs_, h_, t_, x_, xp_]] :=
  Module[{prec, tt, xx, dx, normh, err, hnew, temp},
   (* Norm scaling will be based on the current solution y *)
   normh = (Abs[h] temp[#1, x] &) /. {temp -> norm};
   tt = Rescale[int[[1]], {0, 1}, {t, t + h}];  (* abscissas for this step *)
   xx = rhs /@ tt;                              (* integrand values *)
   dx = h*int[[2]].xx;                          (* weighted sum: the increment *)
   (* Compute the scaled error estimate *)
   err = h*int[[3]].xx // normh;
   hnew = Which[
     err > 1,  (* rejected step: reduce h by half *)
      dx = $Failed; h/2,
     err < 2^-(order + 2), 2 h,
     err < 1/2, h,
     True, h Max[1/2, Min[9/10, (1/(2 err))^(1/(order + 1))]]];
   (* Return the new step size along with the solution increment *)
   {hnew, dx}];
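
As an illustration of getorder (the 11-point count is an assumption about the default rule size, corresponding to 5 Gauss points):

getorder[11, "GaussKronrodRule"]  (* gp = (11 - 1)/2 = 5, odd, so 3*5 + 2 = 17 *)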

Michael E2