I seem to often run into capacity issues and have been upping the limits each time. There are several questions on this site pertaining to these limits, but the information is sometimes conflicting and there is no single place where all of the limits are detailed.
Main Question:
So, what I am looking for, for each capacity limit, is:
- Name of the variable.
- Default value.
- Absolute maximum that it can be set to.
- What steps I can take to minimize its usage, i.e., the DOs and DON'Ts for each capacity limit.
The settings I am aware of are the ones reported at the end of the log file:
106477 strings out of 493042
2524840 string characters out of 39884661
12363848 words of memory out of 12435455
103471 multiletter control sequences out of 15000+600000
197845 words of font info for 228 fonts, out of 8000000 for 9000
1143 hyphenation exceptions out of 8191
86i,17n,86p,10523b,3398s stack positions out of 5000i,500n,10000p,200000b,80000s
but there may be others that are not listed here. My immediate issue is with words of memory, as the numbers above are from the first run of my indexing; if I attempt a second run, I get a TeX capacity exceeded error.
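One workaround from David Carlisle's comment below is luatex, which allocates its main memory dynamically, so this particular ceiling should not apply there. A minimal sketch, where myfile.tex is just a placeholder name:

# LuaTeX grows its main memory on demand, so the fixed
# main_memory ceiling discussed here should not apply:
lualatex myfile.tex

I would still like to understand the limits for the classic engines, though.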
Examples of Other Variables:
From David Carlisle's comments, there are also extra_mem_top and extra_mem_bot that can be used. But does this allow for more words of memory, or are these related to one of the other capacity limits in the report? Again, what is the maximum I can set these to?
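My current understanding, which I would like confirmed, is that main_memory is fixed when the format file is dumped, while extra_mem_top and extra_mem_bot are read at run time, so only the former requires regenerating the formats. A sketch of the two procedures as I understand them; the values are hypothetical:

% In texmf.cnf (hypothetical values):
main_memory   = 12435455   % read only when the format file is dumped
extra_mem_top = 10000000   % read at run time (one-word nodes: chars, tokens)
extra_mem_bot = 10000000   % read at run time (large nodes: boxes, glue, breakpoints)

# After changing main_memory, the formats must be regenerated:
sudo fmtutil-sys --all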
Possible DOs and DONTs:
Some of the questions I have regarding minimizing the usage are listed here, so perhaps these can be covered in the DOs and DON'Ts for each of the variables.
- Is it better to use \def\foo{xxx} instead of \newcommand*{\foo}{xxx}? I can perform the check for duplicate macro names outside of TeX.
- Do macro names with a large number of characters increase usage of some of these? For instance, does \expandafter\newcommand\csname dir1/dir1/long-file-name\endcsname{xxxx} have any issues in regard to capacity? Would it be better for me to map dir1/dir1/long-file-name to some shorter string such as ZXyw to save on these limits? Yes, readability is lost, but I can automate this replacement (a test harness for this is sketched after this list).
- Does repeated text result in increased use of the limits? Which of the following would be better:
\MyMacro{../../../../dirA/dirB/dirC/dirD/file0001.tex} \MyMacro{../../../../dirA/dirB/dirC/dirD/file0002.tex} ... \MyMacro{../../../../dirA/dirB/dirC/dirD/file5000.tex}
or
\def\Path{../../../../dirA/dirB/dirC/dirD/} \MyMacro{\Path/file0001.tex} \MyMacro{\Path/file0002.tex} ... \MyMacro{\Path/file5000.tex}
- Does \undef-ing macros after they are no longer needed help with these limits?
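As a starting point for these experiments, here is a minimal sketch of the kind of test file I have in mind (pdfTeX from TeX Live; all names are made up for illustration). Compile it once with the long names shown and once with a short prefix substituted, then compare the multiletter control sequences and string characters figures in the end-of-log summary:

% testnames.tex -- hypothetical test harness, not a definitive benchmark.
% Compile with pdflatex and compare the end-of-log capacity summary
% between the long-name and short-name variants.
\documentclass{article}
\newcount\testcnt
\begin{document}
\loop
  \advance\testcnt by 1
  % Variant A: long control-sequence names (swap in a short prefix,
  % e.g. ZXyw-, for Variant B).
  \expandafter\gdef\csname dir1/dir1/long-file-name-\number\testcnt\endcsname{xxxx}%
\ifnum\testcnt<5000 \repeat
Test page.
\end{document}

One caveat I believe applies to the \undef question: \let\foo\undefined releases the macro's meaning, but the name itself stays in the hash table and string pool, so the multiletter control sequences and string characters counts would not go back down.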
Certainly, I can experiment with the things I have thought of, but perhaps there are other things that can be done to reduce the usage of these limited strings and words of memory.
Example of Inconsistency:
In my /usr/local/texlive/2015/texmf.cnf I have main_memory = 12435455 (12.4M), which is the stated maximum: if I attempt to increase it by just 1, to 12435456, latex indeed gives me the error Ouch---my internal constants have been clobbered!---case 14, even after executing sudo vi texmf.cnf as per Increase LaTeX capacity. However, Thomas F. Sturm's solution to How to expand TeX's “main memory size”? (pgfplots memory overload) shows
100571134 words of memory out of 103000000
thus indicating that the max is at least 103M.
Furthermore, in the file /usr/local/texlive/2015/texmf-dist/web2c/texmf.cnf there is a comment that says Memory. Must be less than 8,000,000 total. That is clearly incorrect, as I am able to use more than that.
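As an aside, a quick way to check which value is actually being picked up is kpsewhich, which queries the same kpathsea configuration that the engines read; the output below is the value from my texmf.cnf:

$ kpsewhich --var-value=main_memory
12435455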
Notes:
- I am assuming that any changes would be made to the file reported by kpsewhich texmf.cnf as per Leo Liu's answer at How to expand TeX's “main memory size”? (pgfplots memory overload), which in my case is /usr/local/texlive/2015/texmf.cnf. If there are different ways to set some of these, I would like to know those as well (one possibility is sketched after these notes).
- My main problem right now is with indexing a large number of entries and the words of memory limit. I am less concerned about run time (within reason), more about exceeding these limits.
- I am using TeXLive 2015 in case it matters. But if the limits are dependent on the distribution, it would be good to know that as well.
- This question is not intended to be about error conditions that result in exceeding the capacity limits (such as my naive question Tex Capacity Exceeded (if remove % after use of macro)).
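Regarding other ways to set these: my understanding (worth verifying against the kpathsea documentation) is that environment variables take precedence over texmf.cnf, so the run-time parameters can be raised for a single run without editing any file. The values here are hypothetical:

# Overrides texmf.cnf for this run only:
extra_mem_top=10000000 extra_mem_bot=10000000 pdflatex myfile.tex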
References:
- Are there limits to the number of new macros or commands in TeX/LaTeX lists the variables mentioned here and provides a test case. However, it is focused specifically on the limits related to the number of macros.
- Components of (La)TeX's memory usage is an excellent reference, so perhaps these answers can be added there instead and this closed as a duplicate?
Comments:
- \expandafter\newcommand*\csname … will expand * rather than \csname. – Henri Menke Feb 11 '16 at 09:43
- […] main_memory, but you can up extra_mem_top and extra_mem_bot so the effective memory is beyond that. (Or use luatex, which uses dynamic allocation.) – David Carlisle Feb 11 '16 at 11:08
- […] extra_mem_top and extra_mem_bot? It isn't obvious to me. What is the difference between this extra memory versus the main_memory? What are the maximum values I can set these two? What will increasing these allow me to do? – Peter Grill Feb 11 '16 at 11:55
- […] words of memory beyond 12435455? – Peter Grill Feb 11 '16 at 12:15
- mem_bot and mem_top are INITEX-only special constants for the dumped pre-loaded format file - in the old times when INITEX and VIRTEX were two different executables. IMHO most (or all?) TeX systems in use have both integrated in one executable, and then these two variables need to be set to mem_min and mem_max (or the implementation handles the main_memory in a different way, replacing these TeX.web parts in their used change file). – Bernd Raichle Jan 09 '23 at 08:21