I have the following code:

epsilon[t_, alpha_] := If[t > 3, 1, 2]
L[t_, alpha_] :=
  Min[epsilon[t, alpha] - (x[t - 1, alpha] - Min[epsilon[t, alpha - 1], x[t - 1, alpha]]),
    x[t - 1, alpha - 1]]
d[t_] := 10
f = 5
x[0, alpha_] := 0
x[t_, f] := x[t - 1, f] + L[t, f] - Min[d[t - 1], x[t - 1, f]]
x[t_, 0] := 10
x[t_, alpha_] := x[t - 1, alpha] + L[t, alpha] - L[t, alpha + 1]
s = TimeUsed[];
Table[x[t, a], {t, 0, 8}, {a, 0, 10}] // MatrixForm
timeused = TimeUsed[] - s  (* CPU seconds spent building the table *)
The timeused output (in seconds) increases dramatically and nonlinearly as I increase the range of t in Table[...]:
1: 0.0
2: 0.016
3: 0.016
4: 0.063
5: 0.422
6: 2.90
7: 19.344
8: 131
Why does this happen? It seems to me that my equations should be computable in time linear in $t$: for every $t$, x[t, alpha] can be calculated solely from x[t - 1, alpha], so each computation should be roughly equally expensive, and it should be possible to just compute all the $x$'s sequentially for $t = 0$, $t = 1$, $t = 2$, ...
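To check whether the time is going into the sheer number of recursive evaluations rather than into any single step, one can count calls instead of seconds. The following is a minimal sketch, not part of the original code: it adds a counter to epsilon, which L evaluates twice every time L itself is evaluated, so the count is a rough proxy for the total recursive work. The name epsilonCalls is introduced here purely for illustration.

(* sketch (not in the original code): epsilonCalls counts epsilon
   evaluations as a proxy for the total recursive work *)
epsilonCalls = 0;
epsilon[t_, alpha_] := (epsilonCalls++; If[t > 3, 1, 2])

(* re-run the table for increasing upper limits of t and read off the counter *)
epsilonCalls = 0; Table[x[t, a], {t, 0, 6}, {a, 0, 10}]; epsilonCalls

The counter should grow geometrically with the upper limit of $t$, roughly in step with the timings above.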
What am I doing wrong?
Instead of SetDelayed, i.e. :=, use =. – zhk Apr 15 '17 at 09:02
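Presumably the comment refers to Mathematica's standard memoization idiom, which combines SetDelayed and Set: writing x[t_, alpha_] := x[t, alpha] = ... stores each computed value as its own definition, so every later reference is a cheap lookup instead of a full re-computation. A minimal sketch applied to the definitions above (same recurrences, only the caching added):

ClearAll[x]  (* drop the old x rules and any previously cached values *)
x[0, alpha_] := 0
x[t_, 0] := 10
x[t_, f] := x[t, f] = x[t - 1, f] + L[t, f] - Min[d[t - 1], x[t - 1, f]]
(* the := ... = ... combination caches each value on first use *)
x[t_, alpha_] := x[t, alpha] = x[t - 1, alpha] + L[t, alpha] - L[t, alpha + 1]

With the cached values in place each x[t, a] is computed only once, so the Table above should finish in time roughly linear in the upper limit of $t$ instead of blowing up.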