
I've noticed strange peaks in memory usage (several GB) during evaluation of my code, sometimes freezing the operating system. I nailed it down to numerical evaluation of "bignum" expressions like this:

MaxMemoryUsed[N[Abs[Tanh[Power[E,Power[Pi,E]]]]]]

Unfortunately, MemoryConstrained does not work as expected, e.g.:

MaxMemoryUsed[MemoryConstrained[N[Abs[Tanh[Power[E,Power[Pi,E]]]]],65536]]

still requests 2 GB of memory. Setting SetSystemOptions["CatchMachineUnderflow" -> False] does not help either. I'm using Mathematica 11.2 for Windows 7 64-bit.
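For completeness, here is the attempted mitigation (a sketch; as noted above, it does not reduce the memory spike):

SetSystemOptions["CatchMachineUnderflow" -> False];
MaxMemoryUsed[N[Abs[Tanh[Power[E, Power[Pi, E]]]]]]  (* still requests ~2 GB *)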

  • Another similar example, this time without Abs: N[Tanh[Pi^Cosh[Cosh[2]]], 32]. This one immediately requests 18 GB of memory, so be prepared to kill the kernel. The only way to prevent this is to replace 2 with a numerical value. – Andrzej Odrzywolek Feb 07 '18 at 20:59
  • The problem can be prevented by changing Internal`$MaxExponent from its default Infinity to some finite value. Same for underflow, e.g. Internal`$MinExponent = -1024. Information from Technical Support. – Andrzej Odrzywolek Feb 08 '18 at 10:57
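A minimal sketch of the workaround from the last comment (the cap of 1024 for $MaxExponent is an illustrative guess; only the $MinExponent value is given above):

Internal`$MaxExponent = 1024;   (* default is Infinity *)
Internal`$MinExponent = -1024;  (* finite value guards the underflow side *)
N[Abs[Tanh[Power[E, Power[Pi, E]]]]]  (* per Technical Support, the spike is avoided *)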

2 Answers


Just put N as early as possible to avoid symbolic computations:

MaxMemoryUsed[N[Abs[Tanh[Power[N@E, Power[N@Pi, N@E]]]]]]

152

Henrik Schumacher
  • You only need one machine precision number to cause machine precision calculations. MaxMemoryUsed[Abs[Tanh[Power[E, Power[Pi, N@E]]]]] evaluates to 88. – Bob Hanlon Feb 07 '18 at 14:37

It seems that the symbolic evaluation is the culprit. More specifically, as @CarlWoll pointed out, it is Abs that is the problem:

{res = Tanh[Power[E, Power[Pi, E]]]; // MaxMemoryUsed // AbsoluteTiming, res}
{res = Abs[Tanh[Power[E, Power[Pi, E]]]]; // MaxMemoryUsed // AbsoluteTiming, res}
(*
  {{0.00004, 672}, Tanh[E^π^E]}
  {{2.84414, 2046458944}, Tanh[E^π^E]}
*)

My guess is that automatic, internal evaluation/simplification rules are being tried. It's quite possible that numeric routines are used internally to determine whether the absolute value can be simplified, but I'm not sure how to see that, since the internals appear to be hidden. In fact, they appear to be hidden even from the MemoryConstrained[] environment. This is not completely surprising, since some rules, such as those for Plus, are applied before the standard evaluation procedure for the sake of efficiency. This seems to be an edge case where the "efficiency" might be called into question.

MaxMemoryUsed[
 res = MemoryConstrained[Abs[Tanh[Power[E, Power[Pi, E]]]],
 65536]]
res
(*
  2046458968
  $Aborted  
*)
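One could try to expose the hidden internal steps with Trace and its TraceInternal option (an untested sketch: it re-runs the expensive evaluation, so expect another ~2 GB spike, and the interesting steps may still not show up):

(* caution: this evaluates the problematic Abs expression again *)
Trace[Abs[Tanh[Power[E, Power[Pi, E]]]], TraceInternal -> True] // Short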

If the symbolic evaluation happens first, then we see that numericizing it uses little memory. Interestingly, little memory is also used if you prevent symbolic evaluation with Unevaluated.

expr = Abs[Tanh[Power[E, Power[Pi, E]]]];
MaxMemoryUsed[res = MemoryConstrained[N@expr, 65536]]
res
(*
  192
  1.
*)

MaxMemoryUsed[
 res = MemoryConstrained[N[Unevaluated@Abs[Tanh[Power[E, Power[Pi, E]]]]],
 65536]]
res
(*
  192
  1.
*)

OTOH, evaluating expr with arbitrary precision ("bignum") uses about as much memory as the symbolic evaluation. It is also not limited by MemoryConstrained, at least not until some 2 GB of memory have been used. This coincidence suggests that bignums might underlie the symbolic evaluation.

MaxMemoryUsed[res = MemoryConstrained[N[expr, 10], 65536]]
res

Divide::infy: Infinite expression 2/0.*10^2147483648 encountered.

(*
  2046481544
  $Aborted
*)

Interestingly, setting $MaxExtraPrecision to 0 prevents the huge memory usage. It also prevents a useful result.

Block[{$MaxExtraPrecision = 0},
 MaxMemoryUsed[res = MemoryConstrained[N[expr, 10], 65536]]
 ]
res

Divide::infy: Infinite expression 2/0.*10^2147483648 encountered.

(*
  13136
  ComplexInfinity
*)

However, setting $MaxExtraPrecision to 0 does not help with the symbolic evaluation. The limit could be reset internally. One cannot tell. (Indeed, I tried to observe it with Dynamic@{$MaxExtraPrecision, Clock[]}, but dynamic updating hangs while the symbolic expression is being evaluated. That seems significant, but I don't know exactly what it signifies.)

Block[{$MaxExtraPrecision = 0},
 MaxMemoryUsed[
  res = MemoryConstrained[N[Abs[Tanh[Power[E, Power[Pi, E]]]]], 65536]]
 ]
res
(*
  2046458992
  $Aborted
*)
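For reference, the monitoring attempt mentioned above amounted to something like this (a sketch; the Dynamic output must already be displayed before the long evaluation starts):

Dynamic[{$MaxExtraPrecision, Clock[]}]  (* evaluate first, in its own cell *)
N[Abs[Tanh[Power[E, Power[Pi, E]]]]]    (* the Dynamic display hangs during this *)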
Michael E2
  • I think it would help to explicitly state that Abs is the problem, that is, Abs[Tanh[Power[E, Power[Pi, E]]]] causes the memory problem, and after removing it, N[Tanh[Power[E, Power[Pi, E]]]] evaluates quickly with little memory usage. – Carl Woll Feb 07 '18 at 16:39
  • @CarlWoll Thanks, I didn't think to check that. – Michael E2 Feb 07 '18 at 16:59
  • Another weird example for Abs simplification: Cos[Pi/2/Log[Tanh[Cosh[Pi]]]] // Abs // N (returns -0.989552). – Andrzej Odrzywolek Feb 07 '18 at 17:49
  • @AndrzejOdrzywolek That has nothing to do with Abs. Rather, it exposes issues with machine precision computations of trigs with large arguments, see (161124). – Carl Woll Feb 07 '18 at 17:56
  • @CarlWoll I understand this, but a negative numeric value of Abs is unexpected anyway. – Andrzej Odrzywolek Feb 07 '18 at 18:09
  • @AndrzejOdrzywolek I see. Maybe I'm belaboring the obvious, but I think you're assuming that N sees that its argument is an Abs object, and this is not quite true. N evaluates its arguments before working on them. Since Mathematica knows that Abs@Cos[Pi/2/Log[Tanh[Cosh[Pi]]]] equals Cos[Pi/2/Log[Tanh[Cosh[Pi]]]], N is evaluating Cos[Pi/2/Log[Tanh[Cosh[Pi]]]] (note the absence of Abs), and does a poor job of it with machine precision. – Carl Woll Feb 07 '18 at 18:23
  • @AndrzejOdrzywolek Perhaps comparing the evaluation sequences in Trace@N[Abs[Cos[Pi/2/Log@Tanh@Cosh@Pi]], 6] and Trace@N[Abs[Cos[Pi/2/Log@Tanh@Cosh@Pi]]] will illustrate Carl's point (the first one is accurate, the second has excessive rounding error). (I think Carl left off a minus sign.) – Michael E2 Feb 07 '18 at 19:37
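A quick way to see Carl's point in the comments above (a sketch based on his explanation; Abs is resolved during symbolic evaluation, before N ever sees it):

Abs[Cos[Pi/2/Log[Tanh[Cosh[Pi]]]]]  (* Abs disappears symbolically, up to sign per Michael's last remark *)
N[%]      (* machine precision then gives the inaccurate -0.989552 from the comments *)
N[%%, 6]  (* arbitrary precision is accurate, per the Trace comparison above *)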