In a nutshell
I have a large system of ODEs. When I integrate the system from x = 0 to x = dx, NDSolve happily returns the result; it takes about a minute, and MaxMemoryUsed tells me that about 100 MB was used to perform the computation. Since I have 8 GB of RAM, I ask NDSolve to integrate the system from x = 0 to x = 2 * dx, and it just crashes! I would expect the memory required by NDSolve to increase linearly with the integration time; am I wrong?
Minimal Working Example
Background
I have an extremely large first-order ODE system (59582 equations) of the form $y'(x) = f[y(x)]$, where $y(x)$ is an array of reals with Dimensions {2, 31, 31, 31}. Despite its size, a compiled version of $f$ seems to compute the rhs quite quickly, so I decided to give NDSolve a try. Because NDSolve evaluates the system symbolically, I create a CompiledFunction for $f$ and wrap it in another expression to protect it from symbolic evaluation and to pass some extra data to the CompiledFunction:
ClearAll[$rhs, rhs];
$rhs = Compile[
{
{d1, _Integer},
{d2, _Integer},
{d3, _Integer},
{state, _Real, 4}
},
Table[
-(i + j + k + l)*state[[i, j, k, l]],
{i, 1, 2},
{j, 1, d1},
{k, 1, d2},
{l, 1, d3}
],
CompilationTarget -> "C"
];
rhs[{d1_, d2_, d3_}][y_ /; ArrayQ[y, 4, MachineNumberQ]] := $rhs[d1, d2, d3, y];
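For example, the wrapper only hands a machine-precision rank-4 array to the compiled code and stays unevaluated otherwise; a quick check along these lines (with small illustrative dimensions) shows this:
rhs[{3, 3, 3}][y[x]]
(* stays unevaluated, since y[x] is symbolic *)
Dimensions[rhs[{3, 3, 3}][RandomReal[1, {2, 3, 3, 3}]]]
(* {2, 3, 3, 3} *)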
The actual rhs is, of course, different from the one given above. I have chosen this one as an example that should not introduce any numerical problems (a linear system with negative real eigenvalues, what could be easier?).
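(Indeed, each component of this test system decays independently: $y_{ijkl}(x) = y_{ijkl}(0)\, e^{-(i+j+k+l)x}$.)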
All the rest is straightforward. Here is the wrapper around NDSolve:
ClearAll[solve];
solve[rhs_, y0_, xmax_, opts : OptionsPattern[]] := NDSolve[
{
y'[x] == rhs[y[x]],
y[0] == y0
},
y,
{x, 0, xmax},
FilterRules[{opts}, Options[NDSolve]]
];
Here is the code for the initial value:
BlockRandom[
dims = {31, 31, 31};
init = RandomReal[1, Prepend[dims, 2]];
init = init/Sqrt[Total[init^2, Infinity]];,
RandomSeeding -> 3
]
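A quick check (just to confirm the shape and normalization of the initial value):
Dimensions[init]
(* {2, 31, 31, 31} *)
Total[init^2, Infinity]
(* 1., up to rounding *)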
And here is the value of the integration time that leads to the crash on my system:
dx = 0.7;
I believe this value depends on the system Mathematica is running on. By the way, mine is
$Version
(*"11.2.0 for Microsoft Windows (64-bit) (September 11, 2017)"*)
The Problem
As long as xmax ≤ dx, NDSolve works normally, and the memory it consumes increases roughly linearly with the integration time:
mems = MaxMemoryUsed[
solve[
rhs[dims],
init,
#
]
] & /@ Range[0., dx, dx/10]
(*6385992, 52857744, 64484696, 71772376, 62706016, 66520768, 70335840, 80827824, 82735200, 89411304, 96087344*)
So it takes about 100 MB to perform the computation for xmax = dx. Therefore, I wouldn't expect any problems for xmax = 2 * dx, but NDSolve just crashes the kernel for this value of xmax! Is it a bug, or do the resources required by NDSolve increase faster than linearly?
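One diagnostic I can think of (a sketch; StepMonitor is an option of NDSolve, so it passes through the FilterRules in solve) is to count the integration steps and check whether the step count itself grows faster than linearly with xmax:
steps = 0;
solve[rhs[dims], init, dx, StepMonitor :> steps++];
steps
(* total number of steps taken up to xmax = dx *)
Comparing this count for, say, xmax = dx/2 and xmax = dx would show whether the integrator is forced to take progressively smaller steps.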
Comments
NDSolve will use progressively smaller time steps to maintain accuracy. So, memory requirements easily can grow much faster than linearly, depending on the equations. – bbgodfrey Sep 21 '18 at 01:27
RuntimeOptions -> EvaluateSymbolically -> False, or simply modify the definition to Compile[……] // Last, given you've compiled to C, to avoid symbolic calculation. – xzczd Sep 21 '18 at 07:22
… NDSolve: the bulk of the computation time should be spent on calculating the RHS anyway. That part (which is usually simple) can be programmed in C if needed. That is what is effectively being done here (with Compile). The complicated part is implementing the solver method, and that's exactly where it makes sense to rely on an existing implementation. If you get serious about this and decide to go all-C++, you'd likely still end up using a library that handles the method ... so why not NDSolve then? – Szabolcs Sep 23 '18 at 09:02
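For reference, the first of these suggestions amounts to compiling the right-hand side with symbolic evaluation switched off, so that the rhs wrapper with the ArrayQ test is no longer needed; a sketch along the lines of that comment (I have not timed it):
$rhs = Compile[
{
{d1, _Integer},
{d2, _Integer},
{d3, _Integer},
{state, _Real, 4}
},
Table[
-(i + j + k + l)*state[[i, j, k, l]],
{i, 1, 2},
{j, 1, d1},
{k, 1, d2},
{l, 1, d3}
],
CompilationTarget -> "C",
RuntimeOptions -> {"EvaluateSymbolically" -> False} (* no uncompiled fallback on symbolic arguments *)
];
With this setting, $rhs[31, 31, 31, y[x]] simply stays unevaluated for a symbolic y[x] and can be used in the NDSolve equations directly.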