Like many other students, I had to write my Bachelor thesis in LaTeX. So I had to choose between Overleaf, the fully integrated tool for LaTeX work, and setting up a local writing environment myself. Since I wanted full control over my setup and wanted to sync my work across multiple devices using Git (which Overleaf also offers, but only in a paid subscription), and since I also kinda hate myself a bit, I decided to go with the local setup.
As the work grew in page count and compilation got slower and slower, I had to find ways to optimize the build process to make writing more comfortable.
TL;DR
The general suggestion is to use as much precompilation as possible to avoid redundant work and thereby significantly speed up the document build. Use draft mode to avoid rendering the entire document in the early compilation steps. A pipeline setup that performed really well for me is as follows:
pdflatex precompile preamble
pdflatex draft
bibtex
makeglossaries
pdflatex draft
pdflatex final
With that I was able to achieve a compilation time of about 1 second per page on a 60-page document with many large images.
As Martin Isaksson [1] described in his article, compiling LaTeX to PDF used to require several separate steps that are largely avoidable today. You would typically run latex to produce a DVI file, convert that to PostScript with dvips, and finally produce a PDF using ps2pdf.
Later, dvipdfm and dvipdfmx simplified this process a bit by allowing direct conversion from DVI to PDF. The introduction of pdflatex and xelatex simplified the workflow even further by compiling LaTeX files directly to PDF.
⚙️ The Standard Workflow Today
Today we usually use pdfLaTeX (or its alternatives) in a larger process that typically includes multiple compilation steps. The exact sequence depends on document complexity and packages, but a common pattern is: [1]
pdfLaTeX parses the source and generates an .aux file containing information about citations, cross-references, and glossary entries.
BibTeX processes bibliography data and generates a .bbl file with formatted references.
pdfLaTeX incorporates the generated .bbl into the document; at this point some citations and cross-references may still show up as "[?]".
One or more additional pdfLaTeX runs resolve cross-references and finalize the PDF output.
Fig.1 - The standard build steps today
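Assuming the main file is called thesis.tex, this classic sequence looks like the following on the command line:

```shell
# Classic four-step build (assuming the main file is thesis.tex)
pdflatex thesis   # first pass: writes thesis.aux with citation keys
bibtex thesis     # reads thesis.aux and the .bib file, writes thesis.bbl
pdflatex thesis   # pulls in the .bbl; some references may still be "[?]"
pdflatex thesis   # final pass: all cross-references resolved
```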
📄 pdfLaTeX
This is the core tool of the described process. It uses pdfTeX under the hood to compile LaTeX documents directly to PDF files. Alternatives that you might know are XeLaTeX and LuaLaTeX. First, it builds the structure of the document described in the .tex and .sty files. Unless you pass arguments that say otherwise, it will also render images and markup and write the result to a PDF file.
📚 BibTeX
Another important tool, used to manage bibliographies and citations in LaTeX documents. It reads bibliography data from the .bib file and citation information from the .aux file, and generates a .bbl file containing the formatted bibliography. bibtex works hand in hand with pdflatex to ensure that all citations are correctly formatted and included in the final document.
📊 Measures and Benchmarks
In order to evaluate the optimizations I propose, I created a small benchmark script based on the template that I will provide later. The benchmark is also included in the GitHub repository. It uses a Docker container that encapsulates all dependencies and ensures a consistent environment for the tests (as far as that is even possible). The benchmark script measures the time taken for each step of the LaTeX build process so we can quantify the impact of each optimization.
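The full benchmark script lives in the repository; the timing idea behind it can be sketched roughly like this (the step names and the thesis filename are placeholders from my setup):

```shell
#!/bin/sh
# Minimal sketch of the benchmark idea: wrap each build step in a timer
# and report its duration in milliseconds.
step() {
  name=$1; shift
  start=$(date +%s%N)     # nanoseconds since epoch (GNU date)
  "$@" >step.log 2>&1     # run the step, keep its output in a log file
  end=$(date +%s%N)
  echo "$name: $(( (end - start) / 1000000 )) ms"
}

step "pdflatex draft" pdflatex -interaction=batchmode -draftmode thesis
step "bibtex"         bibtex thesis
```

The real script additionally sums the per-step times and divides by the page count.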
The machine used for the tests had the following specifications:
Hardware: AMD Ryzen 7 PRO 4750U with integrated Radeon GPU, 32 GB RAM, Samsung 980 1 TB NVMe.
Software: Windows 11 24H2, TeX Live 2024 Docker image with pdfTeX 3.141592653-2.6-1.40.26.
Using this system and the described benchmark we get the following results for the template document containing 18 pages:
Total build time: 30.778 seconds
Average build time per page: 1.710 seconds
Thirty seconds is still pretty usable, but there is definitely room for improvement. And keep in mind that these times will grow with document size.
Fig.2 - Today's build steps on a timeline
🛠️ Optimizations
So now let’s go through all the optimizations I tested and adopted to speed up the LaTeX build process step by step.
📑 Introduction to makeglossaries
makeglossaries is a Perl script that generates the index files from which the glossary itself, the list of acronyms, and all necessary cross-references are later built. The tool is part of the glossaries package and is invoked between the pdfLaTeX runs of the build process.
Calling makeglossaries explicitly lets the external script do the sorting and indexing work once, instead of having TeX itself resolve the glossary on every run, which speeds up the overall process. To use it we just have to switch to the corresponding declaration commands for the glossaries and, of course, call makeglossaries in our build pipeline.
Typical changes when switching from the no-index approach are:
Replace \makenoidxglossaries with \makeglossaries.
Replace \printnoidxglossary with \printglossary.
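In a glossaries-based preamble the change is small; a minimal sketch (the acronym package option is just an example):

```latex
\usepackage[acronym]{glossaries}

\makeglossaries                    % instead of \makenoidxglossaries

% ... glossary entries, \gls usage, document body ...

\printglossary                     % instead of \printnoidxglossary
\printglossary[type=\acronymtype]  % the acronym list, if used
```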
So of course I ran the benchmark with this change and produced the following results:
Total build time: 16.710 seconds
Average build time per page: 0.928 seconds
Speed increase: 45.7%
📋 Preamble Precompilation
Precompiling the preamble is one of the most effective ways to speed up LaTeX compilation [2]. The preamble (everything before \begin{document}) usually contains many package imports and definitions, and processing these takes up most of the compilation time. By precompiling them once beforehand, we save that time in every subsequent run. [3]
The preamble can be stored in a separate file or kept in the main document and dumped into a format file (.fmt) using format-dumping utilities. The resulting .fmt file is then loaded by pdfLaTeX, which avoids reprocessing package code on each run. [1]
Listing.1 - PdfLaTeX commands for using a preamble precompilation
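One common recipe looks like this, assuming the preamble lives in a separate preamble.tex and the main document is thesis.tex:

```shell
# Dump the preamble into a format file preamble.fmt
# (run once, or whenever the preamble changes)
pdflatex -ini -jobname="preamble" "&pdflatex preamble.tex\dump"

# Subsequent runs load the precompiled format instead of
# reprocessing all the packages
pdflatex -fmt preamble thesis
```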
Note that not every package is compatible with precompilation; test your document thoroughly after adopting this strategy.
As described before, this precompilation speeds up the overall process significantly. This is also shown by the following benchmark results:
Total build time: 13.160 secondsAverage build time per page: 0.731 secondsSpeed increase: 57.2%
So we already cut the build time to roughly half of the original. Not bad.
🎛️ pdfLaTeX Commandline Arguments
There are some special command-line arguments for pdfLaTeX that we can use to make document compilation faster and more efficient.
One of them is the "-interaction=batchmode" argument, which suppresses all interaction with the user [1], [2], [3]. If there are any errors or prompts, pdfLaTeX will not stop to ask for input, making it suitable for automated builds. If you want to see the errors, you have to open the log file, which is generated regardless. All that writing to the terminal costs time that we can save by suppressing it. This leads to the following benchmark results:
Total build time: 13.225 seconds
Average build time per page: 0.735 seconds
Speed increase: 57.0%
Another useful flag is the "-draftmode" argument [1], [2], [3]. It tells pdfLaTeX not to produce any PDF output: the document is still fully typeset and all auxiliary files are written, but no output pages are rendered. This makes it ideal for the intermediate passes, where only the auxiliary files matter, and it significantly reduces compilation time, as you can see:
Total build time: 16.171 seconds
Average build time per page: 0.898 seconds
Speed increase: 47.5%
📚 Using biblatex
Replacing BibTeX with biber as the backend for bibliography processing via the biblatex package is another avenue for optimization. biblatex offers a more modern and flexible interface for bibliographies and can reduce the number of required compilation steps in many cases.
I did not pursue this approach for the current project because it would have required restructuring bibliography handling across the document. Still, it is a promising option for future improvements.
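For reference, the switch would look roughly like this in the preamble (untested in this project; the bibliography filename is a placeholder), with biber replacing bibtex in the build pipeline:

```latex
\usepackage[backend=biber, style=numeric]{biblatex}
\addbibresource{references.bib}  % instead of \bibliography{references}

% ... \cite usage as before ...

\printbibliography               % instead of \bibliographystyle + \bibliography
```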
🚀 New Compilation Process
After trying out the different approaches, combining the techniques above yields a new compilation pipeline that balances speed and reliability. The adopted steps are as follows:
pdflatex precompile preamble
pdflatex -interaction=batchmode -draftmode
bibtex
makeglossaries
pdflatex -interaction=batchmode -draftmode
pdflatex
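Put together as a build script, the pipeline looks like this (preamble.tex and thesis.tex are the filenames of my template; adjust them to your project):

```shell
#!/bin/sh
set -e  # abort on the first failing step

# 1. Precompile the preamble into preamble.fmt
pdflatex -ini -jobname="preamble" "&pdflatex preamble.tex\dump"

# 2. First draft pass: writes the .aux files, produces no PDF
pdflatex -fmt preamble -interaction=batchmode -draftmode thesis

# 3. Resolve bibliography and glossary entries
bibtex thesis
makeglossaries thesis

# 4. Second draft pass: picks up the .bbl and glossary files
pdflatex -fmt preamble -interaction=batchmode -draftmode thesis

# 5. Final pass: produces the actual PDF
pdflatex -fmt preamble -interaction=batchmode thesis
```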
How these steps are connected together can be seen in the following process overview:
Fig.3 - Build process with optimizations
Of course, the most important part of this comparison are the benchmark results, which clearly show the improvement achieved by the new process.
Total build time: 9.212 seconds
Average build time per page: 0.512 seconds
Speed increase: 70.1%
Fig.4 - The optimized build steps on a timeline
🚢 Dockerizing the LaTeX Build
A very important point is installing all the tools and dependencies for LaTeX. As I wanted to make the setup as reproducible and easy as possible (and didn't want to install all those programs manually), I decided to use Docker for this purpose. You probably already know Docker: it encapsulates a whole environment, including all dependencies an application needs, inside a container. These containers are built from images that define the environment's configuration and structure. For LaTeX there are already some prebuilt Docker images you could use [4]. I chose the texlive/texlive:latest image from Docker Hub, which ships with pdflatex, bibtex, and makeglossaries preinstalled. So instead of running the tools locally, I run the Docker container and pass it the same command that I would normally execute on the command line [4].
docker run --rm -v "%WORKSPACE_FOLDER%:/workdir" -w /workdir texlive/texlive:latest pdflatex thesis
Listing.2 - Example for running PdfLaTeX through Docker
🧑💻 Setting up VS Code for LaTeX Development
So now that we have a Docker container set up for our LaTeX development, we need a text editor or IDE to write in. For this purpose, I chose Visual Studio Code (VS Code), as it has excellent LaTeX support through extensions. The main extension for overall LaTeX support is LaTeX Workshop [5]. With it you can set up pipelines and shortcuts for building your LaTeX documents efficiently. [6]
VS Code provides an easy solution for managing settings and configurations for your projects. You can configure project-specific settings to customize the behavior of the editor and the LaTeX Workshop extension by creating a .vscode/settings.json file in the repository [6]. This settings file can be transferred to other computers and shared with collaborators, ensuring a consistent development environment with the same configuration across different machines.
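A minimal sketch of such a settings file, wiring LaTeX Workshop to the Docker-based build (the build.sh script name is an assumption of my setup; %WORKSPACE_FOLDER% is a LaTeX Workshop placeholder):

```json
{
  "latex-workshop.latex.tools": [
    {
      "name": "docker-build",
      "command": "docker",
      "args": [
        "run", "--rm",
        "-v", "%WORKSPACE_FOLDER%:/workdir",
        "-w", "/workdir",
        "texlive/texlive:latest",
        "./build.sh"
      ]
    }
  ],
  "latex-workshop.latex.recipes": [
    { "name": "Full build", "tools": ["docker-build"] }
  ]
}
```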
📝 Additional Notes
For easy reuse I created a template in a GitHub repository that you can fork or clone and use by yourself. It comes with some special additional scripts and tools to enhance your LaTeX development experience even further:
Automatic PlantUML diagram generation using the FileWatcher extension for VS Code.
Scripts that detect unused/unreferenced labels and figures in your document.
Scripts for Markdown conversion using either Pandoc or Poppler utilities.
🔎 Further Possibilities for Experimentation
There are some topics that might be worth looking into more deeply.
For example some other build drivers like latexmk or arara could be explored to see if they offer any advantages over the current setup. They might provide more flexibility or additional features that could further optimize the build process. [1]
For users of TikZ and PGFPlots, consider precompiling heavy diagrams to PDF. The pre-generated graphics can then be reused in subsequent runs to speed up compilation. [2], [3]
If you include many large images, add an image-processing pipeline: Downscale images to the display size used in the document and convert raster images to appropriately compressed JPEGs, or prefer vector formats (PDF/SVG) where possible. These changes could significantly reduce build time. [2]
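As a sketch, such a pipeline could use ImageMagick (the target width, quality, and directory names here are made up):

```shell
# Downscale every PNG to at most 1600 px width and recompress as JPEG
# (assumes source images in images/src/, output in images/)
mkdir -p images
for f in images/src/*.png; do
  convert "$f" -resize '1600x>' -quality 85 \
    "images/$(basename "${f%.png}").jpg"
done
```

The `1600x>` geometry only shrinks images that are wider than the target, so smaller images pass through untouched.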
💡 Tips and Tricks collection
A brief set of miscellaneous suggestions from the literature. I did not need all of these, but they may be helpful in your context [2].
Consider removing todonotes for performance-sensitive builds as it relies on TikZ and can noticeably slow compilation.
Maintain two image variants (high-resolution for final builds, low-resolution for drafts) and switch between them in your build pipeline to speed iteration.
Avoid loading heavy math packages unless your document requires their specific features.
Be cautious when mixing inputenc and fontenc, as it can result in slower compilation.
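The two-image-variant tip above can be wired up with a simple switch in the preamble (the directory names are made up for illustration):

```latex
% Toggle between low-res draft images and high-res final images
\newif\ifdraftimages
\draftimagestrue   % comment out for the final build

\ifdraftimages
  \graphicspath{{images/lowres/}}
\else
  \graphicspath{{images/highres/}}
\fi
```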
🧾 Summary and Results
This article addresses a common and stubborn problem: a growing LaTeX project whose edit-compile cycle becomes progressively slower as content accumulates. Rather than resigning ourselves to perpetual waiting (and an unhealthy relationship with coffee), we instrumented the build, measured bottlenecks, and evaluated targeted, low-risk optimizations. In brief: precompiling the preamble and indices, applying a few well-chosen pdfLaTeX flags, and using a reproducible, containerized build yield substantial time savings.
Key numerical results from the template benchmark (18 pages):
Baseline total build time: 30.778 seconds (1.710 s/page)
Combined new pipeline (precompile + flags + prebuilt indices): 9.212 seconds (0.512 s/page); roughly a 70% speedup compared to baseline.
Practical takeaways (what to do now):
Precompile the preamble where possible. It removes repeated package processing and usually gives the largest single win.
Pre-generate indices and glossaries (makeglossaries) instead of regenerating them on each run.
Use pdfLaTeX flags in automated builds: -interaction=batchmode everywhere, and -draftmode for the intermediate passes, for faster feedback.
Consider modern bibliography tooling (biblatex+biber) and precompiling heavy TikZ/PGFPlots figures into PDFs for repeatable speed improvements.
Containerize the build (Docker) to make results reproducible across machines and CI.
See the repository README for the LaTeX template, Docker instructions, and the benchmark scripts.
I hope this deep dive into LaTeX compilation helps you speed up local LaTeX writing and provides a practical alternative to Overleaf.
Happy compiling.
[1] M. Isaksson, “Optimizing Your LaTeX Workflow: A Guide to Choosing a Build System,” Martin’s blog. Oct. 2023. Accessed: May 04, 2025. [Online]. Available: https://blog.martisak.se/2023/10/01/compiling/.