[Tex/LaTex] Do the online LaTeX compilers use a TeX daemon to speed up their compilation

Tags: compiling, online, performance, tools

There seem to be quite a few fabulous websites now that compile LaTeX online.

I was wondering what techniques these sites used to speed up compilation of documents. It would seem they would be particularly interested in reducing the CPU load from loading the compiler, especially as many of the documents they would be asked to create would be relatively small.

In my experience, the ordinary process of compiling a relatively small document with LaTeX, and especially XeLaTeX, spends a substantial amount of time starting the compiler (e.g. a couple of seconds) and a comparatively modest amount of time actually running it (a few milliseconds).

It would seem then that one could gain tremendous improvements in performance for relatively small documents by running (PDF/Xe)LaTeX as a daemon that produced a number of documents without having to restart the process.
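As a toy model of that arithmetic (the costs below are made-up stand-ins, not measurements of any real TeX engine): if startup dominates, a resident process amortizes it across jobs.

```shell
# Toy model, not a real TeX daemon. Pretend "startup" costs 0.2 s and the
# actual typesetting is near-instant, then compare paying startup per job
# versus paying it once for a resident process.
startup() { sleep 0.2; }   # stand-in for loading the compiler
run_job() { :; }           # stand-in for typesetting one small document

t0=$(date +%s%N)
for i in 1 2 3 4 5; do startup; run_job; done    # one-shot: startup per job
t1=$(date +%s%N)

startup                                          # daemon: startup paid once
for i in 1 2 3 4 5; do run_job; done
t2=$(date +%s%N)

echo "one-shot: $(( (t1 - t0) / 1000000 )) ms  daemon: $(( (t2 - t1) / 1000000 )) ms"
```

For five jobs the one-shot loop pays the startup cost five times, the "daemon" only once, so the second number should be roughly a fifth of the first.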

Others seem to have tried this, and it has been discussed (also) on TeX.SE before. The techniques seem to be somewhat dated, and in any case I couldn't get them to work on Linux with XeLaTeX (which just happens to be what I'm using).

Are any of the online compilers using a sort of TeX daemon to compile their documents in the background? Are there any other recent developments in this area? I am personally interested in XeLaTeX on Linux, but I would love to know more about what is happening in the LaTeX performance area.

There also seems to be quite a bit of discussion about pre-compilation (as seen under e.g. ), but when I tried it the benefits did not seem as great as those of daemonizing the process (though I would be happy to be proven wrong!).
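For concreteness, the pre-compilation technique I mean dumps the shared preamble into a custom format so later runs skip re-reading its packages. A rough sketch (file names are illustrative; this uses the mylatexformat package from CTAN):

```latex
% preamble.tex -- the slow-to-load shared preamble, i.e. everything
% that would normally appear before \begin{document}
\documentclass{article}
\usepackage{amsmath}
\usepackage{graphicx}

% Dump it into a format once (pdfLaTeX shown; XeLaTeX's support for
% custom formats is more limited, which matches my experience above):
%   pdflatex -ini -jobname=preamble "&pdflatex" mylatexformat.ltx preamble.tex
%
% Subsequent compiles load the dumped format instead of the packages:
%   pdflatex "&preamble" mydoc.tex
```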

Are there other techniques these online services may be using to improve the response time of compilation?

Best Answer

I'm a cofounder at writeLaTeX.

We don't currently use a background daemon. Our backend runs pdflatex on Linux, so I can't say much about XeLaTeX specifically (though XeLaTeX support is planned), but here's our experience.

The main factor that determines the compile time for a small document is whether the many source files for the packages it uses are already in the Linux disk cache. (See http://www.linuxatemyram.com for an overview of the disk cache and some good links.)

That is, the first LaTeX document you compile tends to be slow, because LaTeX has to read all of those files from their various locations on your hard disk. But when you compile the second document, the operating system has helpfully kept those files in main memory since it read them the first time, so reading them again is much faster.

I also know that jpallen of ShareLaTeX now maintains the CLSI, which is open source, so you can see how that backend works. I don't think it uses a background service, either.

As far as I can see, the daemon-based approaches in the links you provided still work in principle, but I don't know whether they're still supported.