
Bug introduced in 10.0.2 on XUbuntu, persisting through 10.3 and later


I'm running a fresh install of XUbuntu 14.04.3 LTS with an NVidia GeForce GTX 970. I have installed NVidia driver 352.63 and OpenGL is working. Additionally, libglu1-mesa is installed.
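
For reference, a minimal sketch of how the GL stack can be checked from a terminal (this assumes the mesa-utils package, which provides glxinfo, is installed; it is not part of the original question):

glxinfo | grep "OpenGL renderer"    # should report the GeForce GTX 970
glxinfo | grep "direct rendering"   # should report "direct rendering: Yes"
dpkg -s libglu1-mesa | grep Status  # confirms libglu1-mesa is installed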

Nevertheless, anti-aliasing is not working and cannot be adjusted in the preference settings.

The problem, as ilian pointed out in a comment, is the following:

libMesaGL.so.1 isn't missing; it's included with Mathematica and is on LD_LIBRARY_PATH at runtime. The problem is not that gltest is missing a dependency, but that it runs and returns GLTest_Fail for some reason. This is being looked into. – ilian
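
As a hedged diagnostic sketch (not from the original post), the test can also be run by hand. The path below assumes a default installation under /usr/local/Wolfram, and $DISPLAY is assumed to correspond to the userDisplay variable used by the launcher script; the arguments mirror the GLTest call in that script:

GLTEST=$(find /usr/local/Wolfram -name GLTest -type f 2>/dev/null | head -n 1)  # locate the bundled test binary (assumed install path)
"$GLTEST" 1 1 1 2 "$DISPLAY"  # the launcher greps this output for GLTest_OK; here it reportedly prints GLTest_Fail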

Here is a related question.


1 Answer


Cross-posted from the related issue, this is the workaround that I found:

I had to modify the /usr/local/bin/mathematica script to fix 3D antialiasing. It seems that the GLTest script fails, and as a consequence Mathematica disables advanced 3D rendering. The fix is to replace the line

GLTestResult=`${GLTest} 1 1 1 2 ${userDisplay}  2> /dev/null | grep "GLTest_OK"`

with

GLTestResult="GLTest_OK"

and now antialiasing works. Seems like a bug or improper test procedure to me. Tested with Mathematica 10.3.0 on Xubuntu 15.10 with Nvidia GeForce GT 730 and libglu1-mesa 9.0.0-2. Note that I did not have to export MATHEMATICA_GL_FBO=1 to enable antialiasing.
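
As a convenience, the same edit can be scripted. This is only a sketch, assuming the GLTestResult assignment starts at the beginning of a line as shown above; the launcher path and the exact line may differ between Mathematica versions:

sudo cp /usr/local/bin/mathematica /usr/local/bin/mathematica.bak  # keep a backup of the launcher script
sudo sed -i 's|^GLTestResult=.*|GLTestResult="GLTest_OK"|' /usr/local/bin/mathematica  # force the test result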

  • Works with an Intel graphics card on Fedora 23/GNOME. By any chance, do you know if something like this can be used to fix antialiasing in other OpenGL programs, such as glxgears or Acrobat Reader for Linux (see http://tex.stackexchange.com/questions/300953/force-acrobat-to-render-asymptotes-figure-with-antialias?noredirect=1#comment734023_300953)? – alfC Apr 07 '16 at 05:00
  • Would there be a similar fix for Windows users? – Doug Kimzey Dec 31 '18 at 17:47