When I compile the following LaTeX code, the ∖ symbol does not show up:
\documentclass{article}
\usepackage{unicode-math}
\setmainfont{Latin Modern Roman}
\setmathfont{Latin Modern Math}
\begin{document}
This will not show: $\setminus$ \\
This will show: $\smallsetminus$
\end{document}
I know why this happens: \setminus maps to Unicode character U+29F5, and that character is not part of Latin Modern Math. What I would like to do instead is use the character that \smallsetminus produces, U+2216. So what I am really looking for is a way to remap the \setminus command to a different Unicode value than the one defined by the unicode-math package. I know one possible way to get a glyph for \setminus would be to use another math font that does contain U+29F5. Adding the line
\setmathfont[range={"29F5}]{XITS Math}
gives me a ∖ symbol, but I don’t want to use another font just for this one character.
Comments:
\def\setminus{\symbol{"2216}} or \renewcommand{\setminus}{\symbol{"2216}}? – dgoodmaniii Nov 21 '19 at 22:36
unicode-math defines its commands only at \begin{document}, so any \def or \let statements before \begin{document} are overwritten (which is why the accepted solution is to wrap the \let in \AtBeginDocument). – Semafoor Nov 21 '19 at 23:54
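Putting the hints from the comments together, a minimal sketch of the remap might look like the following. It defers the redefinition with \AtBeginDocument so that it runs after unicode-math has set up its own math symbol commands; this makes \setminus print the same U+2216 glyph as \smallsetminus.

\documentclass{article}
\usepackage{unicode-math}
\setmainfont{Latin Modern Roman}
\setmathfont{Latin Modern Math}
% Defer the remap: unicode-math (re)defines \setminus at \begin{document},
% so an earlier \let or \renewcommand would simply be overwritten.
\AtBeginDocument{\let\setminus\smallsetminus}
\begin{document}
This now shows: $\setminus$
\end{document}

With this in place both commands produce U+2216, which Latin Modern Math does contain, so no second font is needed.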