
I am trying to do multiple integrations recursively. For instance, I would like to do the following equation for arbitrary integer $n$:

$\displaystyle R_n(t) = \int_0^t \mathrm dt' R_0(t-t') R_{n-1}(t')$

where $R_0(t) = e^{-k t}\cos\omega_0 t$. I can do it by brute force, but I would like to be able to do it automatically for any $n$.

J. M.'s missing motivation
BeauGeste

3 Answers


I think this should work:

ClearAll[r];
r[0, t_] := Exp[-k*t]*Cos[t];
r[n_, t_] := Integrate[r[0, t - td]*r[n - 1, td], {td, 0, t}]

E.g.,

r[2,t]

(*
(\[ExponentialE]^(-k t) (2 \[ExponentialE]^(k t) k^2 - 
   k (2 k + t + k^2 t) Cos[t] + (k - k^3 + t + k^2 t) Sin[t]))/(2 (1 +
    k^2)^2)
*)
Szabolcs
acl
  • An interesting way to speed this up considerably is to avoid the definite integrals by doing this: r[n_, t_] := Simplify[Subtract @@ Table[Integrate[r[0, t - td]*r[n - 1, td], td] /. td -> i, {i, {t, 0}}]]. It seems that by doing the indefinite integral first, I can circumvent additional convergence checks or simplifications that slow things down. But I'm surprised by how much faster it gets, e.g., for r[4,t]. – Jens May 28 '13 at 15:28
  • @Jens thanks; strange that it speeds it up so much. Do you mind if I add this piece of information to the answer? – acl May 28 '13 at 22:03
  • Of course you can add it to the answer! – Jens May 28 '13 at 22:13

As already mentioned, this is a convolution. Luckily, there's a more natural function to use for this problem than Integrate[], and that function is called, appropriately enough, Convolve[]. Now, since Convolve[] assumes an infinite integration region, we need a UnitStep[] multiplier in both the functions being convolved to limit the integration region to a finite interval. Here is one such implementation, making use of Convolve[], as well as a caching technique described here:

r[0, k_, t_] := Exp[-k t] Cos[t];
r[n_Integer, k_, t_] := 
  Module[{kl, tl, y}, 
   Set @@ Hold[r[n, kl_, tl_], 
     Simplify[
      Convolve[UnitStep[y] r[n - 1, kl, y], UnitStep[y] r[0, kl, y], 
       y, tl], tl >= 0]];
   r[n, k, t]];

Note that I took the liberty of adding k as an additional parameter; you can do the same thing for the $\omega_0$ factor inside the cosine. The advantage of using Leonid's version of caching is that the effort expended in computing, say, r[10, 4, t] is still usable for computing r[7, 8, t], since the cache stores results for a generic, as opposed to a specific, k value.
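A quick illustration of the cache reuse (assuming the definitions above have been evaluated):

r[1, k, t]  (* triggers the symbolic Convolve[] and caches a generic rule r[1, kl_, tl_] *)
r[1, 5, t]  (* matches the cached rule with kl -> 5; no new convolution is computed *)

The second call matches the stored pattern directly, which is why it is cheap even for a different numeric k.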

J. M.'s missing motivation

You can write the definition down like this in Mathematica:

r[0, t_] := Exp[-k t] Cos[ω0 t]
r[n_, t_] := r[n, t] = 
    Integrate[r[0, t - tt] r[n - 1, tt], {tt, 0, t}, 
              Assumptions -> k > 0 && ω0 > 0]

... or is that what you considered to be brute force, and you're looking for a general solution of the recursion relation?

Anyway, the code above is awfully slow, even though I used the memoization trick (r[n, t_] := r[n, t] = ...). It gets much faster when $k$ and $\omega_0$ are set to $1$.
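For comparison, here is a sketch of the numeric-parameter version, with $k = 1$ and $\omega_0 = 1$ baked in so that no Assumptions handling is needed (rnum is just an illustrative name):

ClearAll[rnum];
rnum[0, t_] := Exp[-t] Cos[t];
rnum[n_, t_] := rnum[n, t] = 
    Integrate[rnum[0, t - tt] rnum[n - 1, tt], {tt, 0, t}]

The trade-off is that the cached results are only valid for those specific parameter values.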

J. M.'s missing motivation
David