6

Why is the TeX Live installation so slow? It installs packages piecewise instead of extracting a single compressed file.

Is there a reason for this? I am doing this on Windows 10.

Gergely
  • No comment about the reasons why the installation works as it works, but you can get a lot faster by selecting a mirror that is closer to you; the default mirror selection is somewhat suboptimal. – Skillmon Feb 07 '20 at 08:03
    I haven't seen an option to choose a mirror – Gergely Feb 07 '20 at 08:15
    You can get an ISO image, but it will not contain the newest versions of the various packages, so you would have to update them afterwards. – Ulrike Fischer Feb 07 '20 at 08:17
  • @Gergely iirc, that option is accessible via command line parameters. What's also faster is to rsync a mirror which provides this functionality and then run the installation command using that local copy. This is especially the case if you install on multiple machines. – Skillmon Feb 07 '20 at 08:21
  • @UlrikeFischer On Fedora this would be done by the dnf package manager I guess. Now it is able to handle individual LaTeX packages. – Gergely Feb 07 '20 at 08:39
  • MiKTeX installs in a few minutes. – Roi Baer Oct 25 '21 at 18:16

1 Answer

5

The TeX Live distribution already ships compressed archives (.tar.xz), and these are used for installation. The reason it is slow is that there are a lot of packages, and for each one between 1 and 4 archives have to be downloaded, decompressed (xz), and unpacked (tar).

Furthermore, the installation is not multi-threaded, so the packages are processed one by one and not in parallel.
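The per-package loop described above can be sketched roughly as follows. This is an illustration, not the actual installer code: the package names, sizes, and worker count are made up, and tiny local archives stand in for the real downloads.

```python
# Sketch: per-package xz-decompress + tar-unpack, sequential vs. parallel.
# Everything here (names, sizes, worker count) is illustrative only.
import concurrent.futures
import io
import tarfile
import tempfile
import time
from pathlib import Path

def make_fake_package(directory: Path, name: str) -> Path:
    """Create a tiny .tar.xz archive standing in for one package container."""
    archive = directory / f"{name}.tar.xz"
    with tarfile.open(archive, "w:xz") as tar:
        payload = io.BytesIO(b"% fake package contents\n" * 1000)
        info = tarfile.TarInfo(name=f"{name}/{name}.sty")
        info.size = payload.getbuffer().nbytes
        tar.addfile(info, payload)
    return archive

def install(archive: Path, target: Path) -> str:
    """Decompress (xz) and unpack (tar) one package archive."""
    with tarfile.open(archive, "r:xz") as tar:
        tar.extractall(target)
    return archive.name

with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)
    archives = [make_fake_package(tmp, f"pkg{i:03d}") for i in range(20)]
    target = tmp / "texmf-dist"

    # Sequential, one package at a time -- the current behaviour.
    t0 = time.perf_counter()
    for a in archives:
        install(a, target / "seq")
    sequential = time.perf_counter() - t0

    # The same work spread over a thread pool.
    t0 = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        done = list(pool.map(install, archives, [target / "par"] * len(archives)))
    parallel = time.perf_counter() - t0

    print(f"sequential: {sequential:.3f}s, parallel: {parallel:.3f}s")
```

With such small archives the timings mean little; the point is only the structure: each package is an independent decompress-and-unpack job, so the work is in principle parallelizable.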

A possible solution would be to provide a pre-installed TeX Live (something MacTeX already does) that contains all the files in one big archive. We are volunteers, and preparing such a distribution is currently beyond our capacity, but everyone is free to provide something similar, and we (the TeX Live team) will be happy to look at it and maybe integrate it.

norbert
  • I wonder whether using zstd instead of xz would bring considerable gain. Arch Linux believes that it does and has recently switched from xz to zstd. – Henri Menke Feb 18 '20 at 02:53
  • Interesting statement there at the link: "decompression time for all packages saw a ~1300% speedup" — that is a lot! With TL the biggest problem is space, and fitting on the DVD, so losing a bit of compression ratio might be a problem. Furthermore, switching from one compression method to another is not trivial, although it can be done. I will keep it in mind. You might want to post something about this to the TL mailing list, or tlbuild, so that we can discuss it there. – norbert Feb 18 '20 at 03:05
  • One more thing: we have already switched to lz4 for the backup containers so that making and decompressing them is faster. We might contemplate something similar for the distribution, indeed. – norbert Feb 18 '20 at 03:07
  • No, for the on-disk database lz4 is just fine. The compression ratio is not great but the (de)compression speed is much better than zstd (see benchmarks on zstd homepage). However, when files have to be transferred over the network, smaller compressed size is important. – Henri Menke Feb 18 '20 at 03:16
  • But the biggest time consumer is the decompression of the .xz packages, so where should one switch to zstd then? – norbert Feb 18 '20 at 03:19
  • lz4 is a bit faster than zstd at worse compression ratio, whereas xz is slower than zstd at only a marginally better ratio. So xz could be replaced by zstd, whereas lz4 could stay for cases where file size is irrelevant. – Henri Menke Feb 18 '20 at 03:53
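The decompression cost discussed in the comments above can be measured with a short script. This is a minimal sketch using only the standard library's `lzma` (i.e. xz); the data and preset are arbitrary, and zstd would be timed the same way through the third-party `zstandard` module, which is not used here.

```python
# Sketch: measure xz compression ratio and decompression time for one
# synthetic "package". Payload and preset are arbitrary illustrations.
import lzma
import time

# Repetitive text, loosely imitating TeX sources, which compress well.
data = b"% repetitive TeX source line that compresses well\n" * 50_000

compressed = lzma.compress(data, preset=6)  # preset chosen arbitrarily

t0 = time.perf_counter()
restored = lzma.decompress(compressed)
elapsed = time.perf_counter() - t0

ratio = len(compressed) / len(data)
print(f"xz ratio: {ratio:.4f}, decompression: {elapsed * 1000:.1f} ms")
```

Running the same measurement with a zstd codec on the real container set is what would settle the xz-versus-zstd trade-off the comments debate.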