
A few people have already asked the question of how to increase the maximum memory LaTeX is allowed to use (see here and here) because it is quite common for LaTeX to run out of memory when using, for example, pgfplots to plot experimental data.

Following the instructions from this answer, I managed to increase the memory limit. However, I could not raise it far enough to process a plot with 10,000 data points. When I ran fmtutil-sys --all as instructed, the process terminated with errors: according to texmf.cnf, I had exceeded the memory limit by setting main_memory = 50000000:
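For reference, the steps I followed were roughly these. This is a sketch of my own setup, not a definitive recipe; kpsewhich and fmtutil-sys are the standard TeX Live tools, and the paths they report will differ per installation:

```shell
# Locate the texmf.cnf that is actually in use (TeX Live)
kpsewhich texmf.cnf

# After editing main_memory in that file, rebuild all format files
fmtutil-sys --all

# Verify which value the tools actually pick up
kpsewhich -var-value=main_memory
```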

% Memory. Must be less than 8,000,000 total.

I was wondering why that is the case and whether there is a way to circumvent that limit. If it can't be circumvented, what are the maximum values I can set main_memory, extra_mem_top and extra_mem_bot to? Do the latter two count towards the total (I assume they do!)? Is there another variable more closely related to the kind of memory pgfplots uses, so that I can increase that instead?
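For concreteness, the configuration I am currently experimenting with looks like this. The extra_mem values are just what I am trying, not documented maxima, and the role of each variable is my reading of the comments in the stock texmf.cnf:

```
% texmf.cnf fragment (illustrative values, not documented maxima)
main_memory   = 7999999    % must stay below the 8,000,000 cap
extra_mem_top = 60000000   % extra "high" memory: chars, tokens, etc.
extra_mem_bot = 60000000   % extra "low" memory: boxes, glue, breakpoints, etc.
```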

For your information, I externalize my figures using mode=list and make, and I have already tried lualatex, but it still complains about running out of memory (even though it uses only around 1 GB of RAM).
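For context, my externalization setup is essentially the standard pgfplots one. A minimal sketch (data.dat stands in for my actual data file):

```latex
\documentclass{article}
\usepackage{pgfplots}
\usepgfplotslibrary{external}        % TikZ externalization, loaded via pgfplots
\tikzexternalize[mode=list and make] % writes a makefile instead of compiling inline
\begin{document}
\begin{tikzpicture}
  \begin{axis}
    \addplot table {data.dat};       % large experimental data set
  \end{axis}
\end{tikzpicture}
\end{document}
```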

I understand that people will argue that it is unnecessary to plot such a large number of data points in such a small space, but I would at least like to know why LaTeX imposes such memory limits, especially with memory being so cheap these days.

sudosensei
    'Classical' TeX uses fixed size arrays: it's not computer RAM that limits things. I'm no Pascal/C programmer, but my guess is that either hard-coded in the sources or within the compile chain there is a difficult-to-remove limit. Note that LuaTeX uses dynamic allocation, so this does not (usually) occur, and that this is not LaTeX's fault as it's at the engine level. There are also several different arrays to worry about! – Joseph Wright Feb 05 '14 at 22:05
  • @JosephWright: Interesting. I wasn't aware that TeX used fixed-size arrays. If LuaLaTeX uses dynamic memory allocation, why do I still get an out-of-memory error when I use it? It would simplify things immensely, but I can't seem to make it allocate memory dynamically as advertised. – sudosensei Feb 05 '14 at 22:12
  • I use LuaTeX for some files with 32k points, which don't plot with pdfTeX. In that case there is no need to fiddle with anything other than choice of engine. I think an example might be useful. – Joseph Wright Feb 05 '14 at 22:15
  • While main_memory can be 8000000 at most, you can use extra_mem_top and extra_mem_bot – egreg Feb 05 '14 at 22:16
  • @JosephWright: I think I figured out why LuaLaTeX wasn't working properly for me. It's because I wasn't really using it! The makefile produced by mode=list and make uses pdflatex even though I'm generating the document with lualatex. I changed pgfplots's system call to lualatex, which fixed the makefile, and I am now waiting to see whether the figure is produced. Thanks for the help. – sudosensei Feb 06 '14 at 13:01
  • @egreg: That's great. Thanks. Is there a limit to what those two values can be? – sudosensei Feb 06 '14 at 13:02
  • @sudosensei I don't really know. Do your own experiments. – egreg Feb 06 '14 at 13:11
  • @egreg: Thanks a lot for that link. If either of you would be so kind as to provide a short answer (a combination of Joseph's and your first comment?), I'll gladly accept it. – sudosensei Feb 06 '14 at 13:17
  • @sudosensei I believe that it's better to mark this as duplicate – egreg Feb 06 '14 at 13:19
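For anyone hitting the same makefile issue mentioned in the comments: the change that worked for me was pointing the externalization system call at lualatex. This follows the system call pattern from the pgf manual; the exact flags may vary with your setup:

```latex
% In the preamble, after \tikzexternalize:
\tikzset{external/system call={lualatex
  \tikzexternalcheckshellescape -halt-on-error
  -interaction=batchmode -jobname "\image" "\texsource"}}
```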

0 Answers