
I am trying to solve numerically a system of linear ODEs with some quickly varying driving functions. The basic command is:

NDSolve[Eqs, Vars, {t, 0, 1}, MaxSteps -> 10^7, MaxStepSize -> 10^(-7),
  AccuracyGoal -> 10, PrecisionGoal -> 10];

The lists of equations and variables each contain 30 elements. My problem is that Mathematica V10.1.0.0 uses an inordinate amount of memory, more than 40 GB.

Even if I assume that at each point the InterpolatingFunction stores the value plus the first and second derivatives, all as 8-byte double-precision numbers, I get only 10^7 * 30 * 3 * 8 bytes = 7.2 GB.

I tried InterpolationOrder -> 2 in NDSolve, but that does not reduce the memory used.
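
For reference, here is a minimal stand-in of the same shape (the real equations run to several pages, so the names y and drive and the toy right-hand sides below are purely illustrative, not the actual system):

n = 30;
vars = Table[y[i], {i, n}];
drive[t_] := Sin[2 Pi 10^6 t]; (* illustrative quickly varying drive *)
eqns = Join[
  Table[y[i]'[t] == -y[i][t] + y[Mod[i, n] + 1][t] + drive[t], {i, n}],
  Table[y[i][0] == 0, {i, n}]];
sol = NDSolve[eqns, vars, {t, 0, 1}, MaxSteps -> 10^7,
  MaxStepSize -> 10^(-7), AccuracyGoal -> 10, PrecisionGoal -> 10];
(* caution: at this step size NDSolve stores on the order of 10^7 points per variable *)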

Mike
  • What exactly is your question here? Do you want to find out why it uses so much memory, do you need to reduce memory usage...? Perhaps sharing the list of equations and variables would be conducive to more specific answers. – MarcoB Feb 03 '16 at 19:20
  • The equations would take several pages, but it's not relevant; they are just algebraic functions. If I programmed this in C, my memory usage would just grow with the size of the output array as I go along and integrate. So, yes, the question is why Mathematica uses much more memory, whether it stores data at intermediate points between each step or something like that, and whether there is a way to reduce it. Even with 32 GB of RAM, it is starting to swap and slow down. – Mike Feb 03 '16 at 21:01
  • Here is a possibility: you are asking for specific accuracy and precision goals, so Mathematica is probably using arbitrary-precision numbers in the calculation. These take much more space in memory than 8 bytes. Consider for instance that machine-precision 3` takes 16 bytes (ByteCount[3`]) on my machine, but arbitrary-precision 3`10 occupies a whopping 80 bytes. At that rate with your settings, just storing the function value alone will cost you ca. 20 GB; value + 1st derivative may run into 40 GB or so. Do you see any difference without the accuracy and precision goals? – MarcoB Feb 04 '16 at 02:34
  • ByteCount indeed gives interesting results. – Mike Feb 05 '16 at 16:25
  • For example: ByteCount[{1, 2, 3, 4, 5, 6, 7, 8, 9, 10}] = 280 if I type the list explicitly, but ByteCount[Range[10]] = 184 and ByteCount[Range[10^6]]/10.0^6 = 8.00014. So it seems asymptotically it uses 8 bytes per element, but not always. Similarly, in NDSolve I sometimes get an InterpolatingFunction that uses Developer`PackedArrayForm and 48 bytes per point, and sometimes just a regular-looking array that uses 200 bytes per point. – Mike Feb 05 '16 at 16:37
  • The first difference is due to the fact that Range generates a packed array object (see Needs["Developer`"]; PackedArrayQ@Range[10]). Compare also ByteCount /@ FromPackedArray@Range[10^6], which returns 16 bytes per value once the list is unpacked. – MarcoB Feb 05 '16 at 16:51
  • Yes, PackedArray seems to be part of the issue here. If I look at the InterpolatingFunction object coming from NDSolve, in particular part [[4]], sometimes it contains Developer`PackedArrayForm and sometimes it doesn't. I haven't been able to determine why similar ODEs give different results. – Mike Feb 05 '16 at 18:54
  • A silly aside which I discovered recently as well: here's how to enter inline code that contains backticks so it is formatted correctly: How do I enter the backticks for contexts as inline code? :-) – MarcoB Feb 05 '16 at 18:59
  • Poking inside the InterpolatingFunction object here is a funny thing. Take part [[2,2]], which is an integer, and change it to n. The number of bytes used changes: for n = 0-3 it's equal to 200 per data point, for n = 4-7 it is 48, and for n = 8 it's 200 again, and it keeps repeating with a period of 4. The actual evaluation result does not change. – Mike Feb 05 '16 at 19:15
  • In NDSolve, changing the method to "ExplicitRungeKutta" gives solutions with small memory use (48 bytes/point), but setting the method to "Adams" gives 200 bytes/point in the InterpolatingFunction (a sketch for checking this follows the comment thread). – Mike Feb 05 '16 at 22:20
  • Do you need all the intermediate values or just the final values? all variables or just a subset? – Chris K Feb 08 '16 at 14:43
  • MaxStepSize -> 10^(-7) causes a lot of memory to be used. It seems unusually small. – Michael E2 Feb 08 '16 at 23:46
  • For more info about the internals of InterpolatingFunction, see (q28337); some more info at (a98349). – Michael E2 Feb 08 '16 at 23:49
  • @MichaelE2 – thanks for the reference to the InterpolatingFunction documentation. As for the step size, well, that is what makes the problem hard: the driving function has multiple time scales. – Mike Feb 09 '16 at 04:57
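
To make the packed-array diagnosis in the comments concrete, here is a minimal sketch for inspecting a scalar 1D InterpolatingFunction and rebuilding a smaller, machine-number-only copy. It assumes the illustrative sol and y[1] from the stand-in system in the question; the "Grid" and "ValuesOnGrid" properties are available in recent versions, and the rebuild discards the stored derivative data, so it can lose accuracy between grid points:

Needs["Developer`"];
if = y[1] /. First[sol]; (* one InterpolatingFunction from the stand-in system *)
ByteCount[if]
PackedArrayQ[if["Grid"]] (* are the abscissas stored as a packed array? *)
PackedArrayQ[if["ValuesOnGrid"]] (* are the ordinates packed? *)

(* rebuild from machine numbers only; this drops the stored derivatives *)
ifSmall = Interpolation[
  Transpose[{Flatten@N[if["Grid"]], N[if["ValuesOnGrid"]]}]];
ByteCount[ifSmall]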

1 Answer


Maybe this helps:

Set up a very large system of equations:

Clear["Global`*"]

(Format[#[n_]] := Subscript[#, n]) & /@ {x}; (* display x[i] in the subscripted form x_i *)
X[k_] := x[k];

n = 1000;
vars = Table[X[i][t], {i, n}];
eqns = Table[j = Mod[i, n] + 1;
   {X[i]'[t] == 1/(X[i][t] + X[j][t])^2, X[i][0] == 1/i}, {i, n}];

Solve for all of the dependent variables, but save only the solution for $x_{1}$:

ONEsol = NDSolve[eqns, X[1], {t, 0, 100}, DependentVariables -> vars];

This saves a lot of memory versus saving all the solutions:

ALLsol = NDSolve[eqns, vars, {t, 0, 100}];
{ByteCount[ONEsol], ByteCount[ALLsol]}
(* {16856, 16864248} *)
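
A short usage note, assuming the definitions above: since NDSolve was asked only for X[1], the single InterpolatingFunction can be extracted and used as usual.

x1 = X[1] /. First[ONEsol]; (* the lone InterpolatingFunction *)
Plot[x1[t], {t, 0, 100}]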
Mariusz Iwaniuk
  • You should avoid using Subscript while defining symbols (variables). Subscript[x, 1] is not a symbol but a compound expression: you expect to do $x_1=2$, but you are actually doing Set[Subscript[x, 1], 2], which assigns a DownValue to Subscript rather than an OwnValue to an indexed x as you may intend. Read how to properly define indexed variables here. – rhermans Feb 08 '16 at 20:04
  • @rhermans. May the Force be with you :) – Mariusz Iwaniuk Feb 08 '16 at 21:42
  • Yes, I figured out that one doesn't have to specify all the functions in the list of variables, and it will solve for only the functions one needs. That saves about half the memory in my case. Still, there seems to be either a bug or a feature where it generates an InterpolatingFunction with "exact" values depending on the NDSolve integration method. It doesn't matter whether I include AccuracyGoal and PrecisionGoal or not, and applying N[] to everything doesn't help either. – Mike Feb 09 '16 at 04:47
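
To reproduce the method-dependent storage described in the comments, here is a quick sketch using the answer's eqns, vars, and X as defined above (the 48 vs. 200 bytes/point figures may vary with version; a short time interval keeps the comparison fast):

Table[ByteCount[X[1] /. First[NDSolve[eqns, X[1], {t, 0, 1},
    Method -> m, DependentVariables -> vars]]],
  {m, {"ExplicitRungeKutta", "Adams"}}]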