
By default, NIntegrate works with MachinePrecision, and its PrecisionGoal is set to Automatic, which effectively corresponds to a value near 6:

In[1]:= Options[NIntegrate, {WorkingPrecision, PrecisionGoal}]

Out[1]= {WorkingPrecision -> MachinePrecision, PrecisionGoal -> Automatic}

I need substantially higher accuracy when computing integrals similar to this one:

dpdA[i_] := NIntegrate[
  Cos[φ] Cos[i*φ] Exp[Sum[-Cos[j*φ], {j, 11}]], {φ, 0, Pi}, 
  Method -> {Automatic, "SymbolicProcessing" -> None}]

The integral cannot be taken symbolically, so "SymbolicProcessing" is off.

Actually, I need to compute such integrals thousands of times during an optimization procedure, in order to find the best coefficients a[j] in the summation:

dpdA[i_] := 
 NIntegrate[Cos[φ] Cos[i φ] Exp[Sum[(-a[j]) Cos[j φ], {j, 11}]], {φ, 0, Pi}, 
  Method -> {Automatic, "SymbolicProcessing" -> None}]

Is there a way to precondition this integral in order to make integration with high WorkingPrecision faster? Perhaps using Experimental`NumericalFunction?

The problem is that when I increase PrecisionGoal to 15, and consequently WorkingPrecision to a value higher than MachinePrecision, performance becomes very poor.
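For concreteness, the higher-precision variant I'm testing looks roughly like this (the particular goals and the name dpdAhp are just an example):

dpdAhp[i_] := NIntegrate[
  Cos[φ] Cos[i*φ] Exp[Sum[-Cos[j*φ], {j, 11}]], {φ, 0, Pi},
  WorkingPrecision -> 30, PrecisionGoal -> 15,
  Method -> {Automatic, "SymbolicProcessing" -> None}]

dpdAhp[2] // AbsoluteTiming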

Alexey Popkov
  • With NIntegrate, performance is usually dictated by the appropriateness or otherwise of the chosen method. Try, for example, dpdA[i_] := NIntegrate[Cos[φ] Cos[i*φ] Exp[Sum[-Cos[j*φ], {j, 11}]], {φ, 0, Pi}, WorkingPrecision -> $MachinePrecision, MinRecursion -> 3, MaxRecursion -> 5, Method -> {"GlobalAdaptive", Method -> {"GaussKronrodRule", "Points" -> 20}, "SymbolicProcessing" -> False}]. This works well for small arguments, but for larger arguments the integrand is highly oscillatory so another method could be better. – Oleksandr R. Jul 28 '12 at 00:24
  • @Oleksandr is right, a different method would be better. You say you don't want symbolic processing, so you can't use "LevinRule" as the Method (though it seems to work nicely on your integral when I tried it, after removing "SymbolicProcessing" -> None to allow it). I would suggest trying "ClenshawCurtisOscillatoryRule" as the Method. See if it helps. – J. M.'s missing motivation Jul 28 '12 at 00:40
  • Maybe you can speed up the computation by transforming your integral with the substitution u==Cos[\[CurlyPhi]], \[CurlyPhi]==ArcCos[u], du/Sqrt[1-u^2]==d\[CurlyPhi]. With that transformation you get rid of all the costly trigonometric function calls so NIntegrate might give you shorter integration times. – Thies Heidecke Aug 10 '12 at 18:53
  • OK, I take back that suggestion. I just tested it, and it takes roughly double the integration time because the integrand has singularities at the ends. – Thies Heidecke Aug 10 '12 at 18:59
  • @Thies, in fact, when confronted with an integral with a $\sqrt{1-u^2}$ factor over the interval $(-1,1)$ to be evaluated numerically, the first instinct should be the substitution $u=\sin v$ or $u=\cos v$ to mitigate the endpoint singularities... – J. M.'s missing motivation Sep 07 '12 at 10:14
  • I can't reproduce the problem. I'm using Mathematica 8.0.4. I have tried {WorkingPrecision->60, PrecisionGoal->30} and {WorkingPrecision->30, PrecisionGoal->15} and the difference between the 2 answers is 10E-26. The answers are immediate. – andre314 Feb 11 '13 at 17:28
  • @andre AbsoluteTiming gives 0.8593750 second for WorkingPrecision->60, PrecisionGoal->30 and 0.5625000 for WorkingPrecision->30, PrecisionGoal->15 on my machine. I need to compute such integrals thousands of times with variable coefficients a[j] under the summation: NIntegrate[Cos[φ]*Cos[i*φ]*Exp[Sum[-a[j]*Cos[j*φ],{j,11}]],{φ,0,Pi}]. – Alexey Popkov Feb 11 '13 at 22:33

1 Answer


Maybe the solution is to forget about NIntegrate and do it 'by hand', e.g.

integrate[f_, nx_, prec_: MachinePrecision] :=
 Module[{xg, fg},
  (* midpoint collocation grid compatible with the type-2 DCT *)
  xg = Table[(2 i - 1)/(2*nx) \[Pi], {i, 1, nx}];
  fg = N[f /@ xg, prec];
  (* Pi*a0; the 1/Sqrt[nx] factor undoes the FourierDCT normalization *)
  \[Pi]/Sqrt[nx] First@FourierDCT[fg, 2]
 ];
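As a quick sanity check (the test integrand here is my own illustration, not part of the problem above), the routine should reproduce the known value $\int_0^\pi (3+2\cos x+\cos 5x)\,dx = 3\pi$ already on a small grid:

(* illustrative test; the exact value of this integral is 3 Pi *)
integrate[3 + 2 Cos[#] + Cos[5 #] &, 16, 30] - 3 Pi
(* the difference should be negligible at 30-digit working precision *)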

I'm assuming that the integrand has the following form

$$ f(x)=\sum_{j\geq 0}a_{j}\cos\left(j x\right) $$

then, the integral of $f(x)$ can be rewritten as

$$ \int_{0}^{\pi} f(x) dx = \int_{0}^{\pi} \sum_{j\geq 0}a_{j}\cos\left(j x\right) dx = \sum_{j\geq 0}a_{j}\int_{0}^{\pi}\cos\left(j x\right) dx $$

The last integral is $\pi\delta_{j0}$, so the final result is

$$ \int_{0}^{\pi} f(x) dx = \pi a_{0}$$

I'm using FourierDCT[] to get $a_{0}$; for that I need to probe the $f(x)$ function at a compatible set of collocation points (the factor $1/\sqrt{nx}$ comes from the normalization used by FourierDCT[]). This gives an exact result for a function that is a linear combination of cosines up to $\cos((nx-1)x)$. In your case

Do[a[j] = RandomReal[{0, 1}, WorkingPrecision -> 50], {j, 11}]; (* the range {0, 1} is an arbitrary choice for the test coefficients *)
f = Function[{\[CurlyPhi], i}, 
   Cos[\[CurlyPhi]] Cos[i \[CurlyPhi]] Exp[Sum[(-a[j]) Cos[j \[CurlyPhi]], {j, 11}]]];

The integrate[] procedure produces the result in roughly an order of magnitude less time (using a grid of 'just' 128 points):

NIntegrate[f[x, 2], {x, 0, \[Pi]}, AccuracyGoal -> 25, 
  PrecisionGoal -> 25, WorkingPrecision -> 50] // AbsoluteTiming
(* => {0.265530, -0.63130651358588386138570371796273647256851197182433} *)

integrate[f[#, 2] &, 128, 50] // AbsoluteTiming
(* => {0.030267, -0.631306513585883861385703717912986612789831494874} *) 

In addition, one can check how the error changes with the number of points (nx):

convData = Table[{nx, integrate[f[#, 2] &, nx, 50]}, {nx, 2^Range[3, 8]}];
ListPlot[MapAt[Log10@Norm[# - convData[[-1, 2]]] &, 
   convData, {All, 2}], Joined -> True, PlotMarkers -> Automatic, 
AxesLabel -> {"nx", "Log10[Error]"}]

(convergence plot: Log10[Error] versus nx)

Of course, due to the oscillatory character of your integrand, the caveat is that for higher $i$ the number of grid points nx needs to be increased.
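For example (a rough check with an arbitrary choice of i and grid sizes, assuming f and integrate[] as defined above), one can repeat the convergence test for the larger i and enlarge nx until the result stabilizes:

(* i = 20 and the grid sizes are arbitrary illustrative choices *)
Table[{nx, integrate[f[#, 20] &, nx, 50]}, {nx, 2^Range[5, 10]}] // TableForm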

mmal