
What is the maximum number of macros one can define with pdflatex? I thought that the limit was around 35,000, but it looks like I am well past that and things are still working fine. This includes:

  • \newcommand
  • \def
  • \csdef
  • \NewDocumentCommand
  • etc.

Are there specific limits on each of these, or is there some global limit?

Background:

I am in the process of restructuring my code, so I wanted to know whether I am approaching some limit soon.

One thing I was thinking of doing was to replace something like

\csdef{Property A file1}{abcd}
\csdef{Property B file1}{efgh}
\csdef{Property C file1}{ijkl}
\csdef{Property D file1}{mnop}

with something like

\csdef{Properties file1}{{abcd}{efgh}{ijkl}{mnop}}

and then parsing this when needed to extract the individual properties, reducing the number of control sequences by a factor of 4. But if I am not near the limits, then I can delay this optimization until later.
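For example, the parsing could look like the following sketch (using etoolbox; \extractprop and \getprop are hypothetical helper names, and each list is assumed to hold exactly four brace groups):

% pick the #5-th of exactly four brace groups
\def\extractprop#1#2#3#4#5{\ifcase#5\relax\or#1\or#2\or#3\or#4\fi}
% #1 = index 1..4, #2 = file name
\newcommand\getprop[2]{%
  \expandafter\expandafter\expandafter\extractprop
  \csname Properties #2\endcsname{#1}}

\csdef{Properties file1}{{abcd}{efgh}{ijkl}{mnop}}
\getprop{1}{file1}% yields abcd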

Peter Grill

3 Answers


You can set \tracingstats=1 and see, at the end of the log file, how the TeX memory is used. The parameters mentioned there can be enlarged by editing the texmf.cnf file (and probably regenerating the format).
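A minimal way to switch it on (this can go anywhere in the preamble):

\tracingstats=1 % a memory-usage report is appended to the .log file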

Note that there is a maximum number of strings, a maximum amount of string data, and a maximum number of multiletter control sequences. If you declare a new multiletter control sequence (\def\something..., \let\otherthing...), then all three of these quantities are affected: the name occupies one string, its characters go into the string pool, and the control sequence takes a slot in the hash table.

There is also a maximum number of "words of memory" (main memory), where data such as boxes and the bodies of macros are saved. I.e., if you declare a macro, then each token of the macro body is stored here.

Your suggested optimization needs fewer strings and multiletter control sequences but more words of memory, because the braces {..}{..}{..}{..} also have to be saved in main memory.

The TeX memory concept is ancient: TeX allocates arrays of fixed length from operating memory at the start of execution. These lengths are declared in texmf.cnf as the parameters mentioned above, and no malloc operation (memory allocation from the operating system) is performed while the document is being processed. So you cannot use the full memory offered by the operating system on the fly. Note that LuaTeX is more modern: it does a normal malloc for each piece of string data, of main memory, and of font memory during processing, but the other parameters (number of strings, number of multiletter control sequences) are still fixed in a similar way as in ancient TeX.

wipet

I’m on my iPad right now so I can’t easily check, but texmf.cnf has the maximums for TeX Live distributions (MiKTeX uses a different configuration file).

The key numbers are max_strings and pool_size. The former is the total number of strings (which includes TeX’s own messages) and the latter is the total length of all strings. MiKTeX sets these to 500,000 and 3,250,000 respectively. I can’t find the current TeX Live values online, but I imagine they’re comparable, so you can stop worrying.
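For reference, the corresponding entries in texmf.cnf look something like this sketch (the values shown are just the MiKTeX figures quoted above, used as an illustration; on TeX Live, kpsewhich texmf.cnf locates the file):

% excerpt from texmf.cnf (illustrative values)
max_strings = 500000
pool_size = 3250000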

Don Hosek

At the end of the .log file, I find:

Here is how much of TeX's memory you used:
 53726 strings out of 478287
 1115535 string characters out of 5846764
 1926603 words of memory out of 5000000
 70958 multiletter control sequences out of 15000+600000
 524031 words of font info for 93 fonts, out of 8000000 for 9000
 1141 hyphenation exceptions out of 8191
 140i,10n,134p,1231b,2527s stack positions out of 5000i,500n,10000p,200000b,80000s
[...]
PDF statistics:
 129 PDF objects out of 1000 (max. 8388607)
 94 compressed objects within 1 object stream
 17 named destinations out of 1000 (max. 500000)
 32961 words of extra memory for PDF output out of 35830 (max. 10000000)

Try variants and compare these values...
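For example, a throwaway test file like this sketch (the counter name, csname pattern, and loop bound are arbitrary) mass-defines control sequences, so you can watch the "strings" and "multiletter control sequences" counters grow between runs:

\documentclass{article}
\usepackage{etoolbox}% for \csedef
\tracingstats=1
\newcounter{i}
\loop
  \stepcounter{i}%
  \csedef{Property \thei}{value \thei}% one new control sequence per pass
\ifnum\value{i}<10000 \repeat
\begin{document}
Defined \thei\ macros.
\end{document}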

Paul Gaborit