Is there a maximum number of new definitions such as
\def\myNewDefA{foo}
\def\myNewDefB{bar}
...
\newcommand\myNewMacroA{foo}
\newcommand\myNewMacroB{bar}
one can have in a document typeset with pdflatex?
TeX has lots of memory limitations.
The following TeX file for iniTeX tests some of them:
% test.tex
% iniTeX
% preamble
\tracingstats=1
\catcode`\{=1
\catcode`\}=2
\countdef\i=255 % loop counter
\countdef\m=100 % number of definitions, set below
\chardef\l=1 % constant one for \advance
\def\e{} % empty macro whose meaning all \x<n> will share
\i=0
\def\n{% loop: define \x1, \x2, ..., \x<\m> via \let
\ifnum\i<\m
\advance\i\l%
\expandafter\let\csname x\the\i\endcsname\e
\else
\let\n\end
\fi
\n
}
% main
\m=0 % number of macro definitions
\n
% end of test.tex
The test file defines \m macros of the form \x1, \x2, \x3, etc.
They are defined via \let, which costs no extra memory for the definition,
because the meaning is shared with an already existing macro (the empty macro \e in this case).
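The sharing can be made visible with \meaning; a minimal plain TeX sketch (the file name check.tex is just for illustration, run with tex check):
% check.tex
\def\e{}
\expandafter\let\csname x1\endcsname\e
% \x1 has the very same meaning as \e, no new definition text is stored:
\message{\expandafter\meaning\csname x1\endcsname}% prints "macro:->"
\bye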
After running the test file (e.g. tex --ini test), the file test.log contains short memory statistics.
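To pull just the statistics out of a longer log, something like the following works (GNU grep; the block is the seven lines after the header):
$ grep -A7 'Here is how much' test.log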
Some examples:
\m=0:
Here is how much of TeX's memory you used:
4 strings out of 498654
30 string characters out of 6225568
1060 words of memory out of 5000000
322 multiletter control sequences out of 15000+600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
1i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s
\m=100000:
Here is how much of TeX's memory you used:
100004 strings out of 498654
588925 string characters out of 6225568
1074 words of memory out of 5000000
100322 multiletter control sequences out of 15000+600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s
\m=498654:
Here is how much of TeX's memory you used:
498654 strings out of 498654
3379475 string characters out of 6225568
1074 words of memory out of 5000000
498972 multiletter control sequences out of 15000+600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s
For the same \m, pdftex --ini test already fails with:
! TeX capacity exceeded, sorry [number of strings=497943].
max_strings)
TeX Live 2013 uses max_strings=500000 in texmf.cnf. Some strings are already occupied (the names of the primitives, ...). The helper macros of the test file deliberately use single-character command names (\i, \m, \l, \e, \n), which are not stored as strings. Strings themselves are stored as indexes into the string pool.
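The compiled-in default can be checked with kpsewhich (shown here with the TL 2013 value); the same query works for hash_extra, pool_size, and main_memory below:
$ kpsewhich -var-value=max_strings
500000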
The number of strings can be increased, e.g. (bash):
$ max_strings=1000000 tex --ini test
But then the limit of the hash table is hit (with \m=800000):
! TeX capacity exceeded, sorry [hash size=615000].
hash_extra)
Each command name is also stored in a hash table for fast access.
Its base size of 15000 can be increased via hash_extra (600000 in TL 2013).
$ max_strings=1000000 hash_extra=1000000 tex --ini test
now works for \m=800000:
Here is how much of TeX's memory you used:
800004 strings out of 998654
5488925 string characters out of 6225568
1074 words of memory out of 5000000
800322 multiletter control sequences out of 15000+1000000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s
pool_size)
The memory where the string characters are actually stored is the string pool. Its size is also limited (the default for pool_size in TL 2013 is 6250000).
With \m=1500000 the following command
$ max_strings=1600000 hash_extra=1600000 pool_size=16000000 tex --ini test
succeeds with
Here is how much of TeX's memory you used:
1500004 strings out of 1598654
10888926 string characters out of 15975568
1076 words of memory out of 5000000
1500322 multiletter control sequences out of 15000+1600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,54b,6s stack positions out of 5000i,500n,10000p,200000b,80000s
main_memory)
With non-trivial definitions the main memory is used as well. Replacing the \let line in the loop with
\expandafter\edef\csname x\the\i\endcsname{\the\i}%
defines the macros \x1, \x2, \x3, ... with a replacement text that contains the respective number.
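For reference, this is the modified loop of test.tex, i.e. the \let line swapped for the \edef line; everything else stays unchanged:
\def\n{%
\ifnum\i<\m
\advance\i\l%
\expandafter\edef\csname x\the\i\endcsname{\the\i}%
\else
\let\n\end
\fi
\n
}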
Result for \m=1000000 and
$ max_strings=1600000 hash_extra=1600000 pool_size=16000000 main_memory=10000000 tex --ini test
Here is how much of TeX's memory you used:
1000004 strings out of 1598654
6888926 string characters out of 15975568
7889966 words of memory out of 10000000
1000322 multiletter control sequences out of 15000+1600000
7 words of font info for 0 fonts, out of 8000000 for 9000
0 hyphenation exceptions out of 8191
3i,0n,0p,62b,6s stack positions out of 5000i,500n,10000p,200000b,80000s
But the main memory cannot be increased arbitrarily; using main_memory=15000000, for example, results in an internal fatal error:
Ouch---my internal constants have been clobbered!---case 14
TeX was written at a time when memory was very limited and expensive and computers were slow. Therefore Knuth wrote his own memory management with fixed-size tables that do not grow dynamically when their maximum size is reached. Some table sizes can be increased before the TeX run, but some implementation limits remain.
LuaTeX has rewritten the core of TeX, so some of these limitations are gone. The last example with \m=1500000 runs with
$ max_strings=1600000 hash_extra=1600000 luatex --ini test
(pool_size is not needed here) and uses the following memory (LuaTeX 0.76.0):
Here is how much of LuaTeX's memory you used:
1500013 strings out of 1598958
500,14696376 words of node,token memory allocated
58 words of node memory still in use:
nodes
avail lists: 2:1,6:1
1500335 multiletter control sequences out of 65536+1600000
0 fonts using 0 bytes
3i,0n,0p,61b,6s stack positions out of 5000i,500n,10000p,200000b,100000s