Occasionally, when running xelatex (using TeX Live 2018 on Linux Mint 19.2), I
encounter the following error message:
(/usr/local/texlive/2018/texmf-dist/tex/latex/microtype/mt-cmr.cfg))
(/usr/local/texlive/2018/texmf-dist/tex/generic/oberdiek/se-ascii-print.def) [1
! I can't write on file `xxx.pdf'.
(Press Enter to retry, or Control-D to exit; default file extension is `.pdf')
Please type another file name for output:
This happened to me yesterday evening, and I found myself spending an hour trying to nail down what could be causing such behavior.
I had a document viewer open with the PDF, which naturally could have been blocking access to the file, so I closed the program, but to no avail. (I'm using Evince, which is fine most of the time and automatically reloads on document updates, but has also occasionally been known to become unresponsive for no tractable reason.)
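One way to verify the viewer hypothesis, rather than just closing the program and hoping, would be to ask which processes actually hold the file open (a sketch; `xxx.pdf` stands in for the actual output file):

```shell
# List processes that currently have the PDF open; an empty result
# means no process is holding a handle on it ('xxx.pdf' is a
# placeholder for the actual output file).
lsof xxx.pdf
```

If Evince (or anything else) shows up here even after closing its window, a stale process could still be holding the file.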
I then tried removing all temporary files and the PDF from the target folder; in my case this included all
files matching *.aux, *.fls, *.log, *.out, and *.pdf. This had no discernible effect.
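For reference, the cleanup amounted to something like the following (assuming the job name `xxx`; where latexmk is installed, `latexmk -C` would be an alternative that removes all generated files it knows about):

```shell
# Remove the auxiliary files and the PDF from the target folder
# ('xxx' is a placeholder for the actual job name).
rm -f xxx.aux xxx.fls xxx.log xxx.out xxx.pdf
```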
I also tried to write to another target:
(Press Enter to retry, or Control-D to exit; default file extension is `.pdf')
Please type another file name for output: /tmp/x.pdf
! I can't write on file `/tmp/x.pdf'.
(Press Enter to retry, or Control-D to exit; default file extension is `.pdf')
This did not spark joy.
During my continued efforts to come to grips with this, I could produce a PDF in roughly one out of every two or three runs, but I was unable to make the behavior predictable: it's not the viewer, it's none of the existing files in the directory, and the packages loaded are always exactly the same ones.
Does anyone have experience with possible reasons for a failure to write to the target file, or to any other file?
If the specific target had been blocked, then surely entering a non-existing filename in the existing /tmp
directory should always work, right? It feels as if the 'file writing mechanism' of TeX has a washer
hopping around in the gear, sometimes causing it to grind to a halt.
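A quick sanity check of that reasoning, outside of TeX, would be to try writing to /tmp directly; if this succeeds while TeX still cannot, the problem presumably lies inside the TeX run rather than with filesystem permissions (the filename here is arbitrary):

```shell
# Try creating a file in /tmp from an ordinary shell; success here
# while TeX fails suggests the problem is not a permissions issue.
touch /tmp/write-test.pdf && echo "writable" && rm /tmp/write-test.pdf
```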
I will say that this work is being done on a recent but not terribly performant netbook; in other words, it might be the case that a shortage of RAM is the underlying problem and the failure to write is just its visible side effect. Maybe someone can corroborate or refute that; personally, I would of course prefer any software to fail with an out-of-memory error when that is indeed the problem.
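If memory pressure is suspected, checking free memory and the kernel log for OOM-killer activity might corroborate or rule that out (a sketch; reading `dmesg` may require elevated privileges on some systems):

```shell
# Show current memory usage, then look for out-of-memory events in
# the kernel log (dmesg may need elevated privileges on some setups).
free -h
dmesg 2>/dev/null | grep -i 'out of memory' || echo "no OOM events found"
```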