In this article I claim that the LaTeX developer experience can be improved by learning from and copying the tooling of programming languages. Automated package management in particular (including automatic package acquisition based on \usepackage directives) would be a big gain.
- I shouldn't have to install the packages that my document requires for compilation one by one, manually. This should be automated.
- The TeX Live installer asks too many questions.
- The LaTeX compiler is not user-friendly.
I'm a Linux user and I use LaTeX for university assignments. I acquired TeX Live through my distro's package manager; installing it was a cumulative download of about 2 GB. I got a ready-made LaTeX template from a fellow student and ran pdflatex. It found that some .sty file was missing and entered a strange interactive mode. I figured out how to suppress this interactive mode for good so that the compiler simply terminates on error. I then had to manually install all the packages that provide the .sty files that were not present on my system.
These are the first two problems I want to point out. The user should never be confronted with this strange interactive mode of the compiler. The worst part of the interactive mode is that it is so hard to exit (is it Ctrl+C, q, quit or exit?). Also, the retrieval of packages should be automated; installing them manually is tedious. An alternative is installing all texlive-* distribution packages, so that none would have to be installed manually. But that is a download of over 6 GB...
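For reference, the interactive prompt can be switched off with command-line flags to pdflatex (these also work with the other TeX engines):

```shell
# Make pdflatex fail fast instead of dropping into the interactive prompt.
# -interaction=nonstopmode: never stop to ask the user questions
# -halt-on-error: exit with a nonzero status at the first error
pdflatex -interaction=nonstopmode -halt-on-error main.tex
```

This is exactly the kind of thing a newcomer should not have to discover on their own.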
Then I wanted to work on my thesis on a university lab computer. It did not have a distribution package installed that was necessary to build my thesis, and lacking admin privileges I could not install it the regular way. What I could have done was download the package manually from CTAN and put it into my source path.
At that time I was not aware that TeX Live can be installed manually, independently of the system package manager. I could have installed TeX Live in my home directory and used tlmgr to install all packages. That would have worked. However, having recently tried this out, manually installing TeX Live is also tedious: the installer confronts the user with many choices, such as the installation location, the size of the set of packages to install and many other, less relevant things. Also, after installation, all programs (compilers, tlmgr etc.) need to be added to the PATH environment variable to make them executable from the command line.
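To be fair, the installer's questions can be pre-answered with a profile file. The sketch below shows the idea; the key names and paths are my assumptions (an interactive run writes its answers to such a profile, which can then be reused):

```shell
# Unattended home-directory install (sketch; adjust paths for your system).
cat > texlive.profile <<'EOF'
selected_scheme scheme-minimal
TEXDIR /home/me/texlive
TEXMFHOME /home/me/texmf
EOF
./install-tl --profile=texlive.profile

# Afterwards the binaries still have to be put on the PATH by hand:
export PATH="$HOME/texlive/bin/x86_64-linux:$PATH"
```

But this is exactly the kind of ceremony a newcomer should not have to learn just to compile a document.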
Programmers prefer using libraries and frameworks over writing everything from scratch, because this approach has many obvious advantages. Relying on externally developed code has some consequences: the foreign code needs to be locally available to build or run the software (depending on the type of language). Not only that, but the foreign code may rely on other libraries to perform its tasks. Manually downloading, integrating and setting up external code and its dependencies is tedious, and the process needs to be repeated for every new PC that is set up to build or run the software, and for every version update of the dependencies.
- There is an online registry/repository where libraries/frameworks (the more general term being 'packages') are published: http://npmjs.com/
- There is a CLI program, npm, to find, download and update packages from that registry.
- If you use npm, you put a file package.json in the root folder of your code project. This file provides the following information:
- It contains a list of all packages that the code project depends on, together with the desired versions of those packages.
- It contains a list of (build) scripts. Each consists of a script name and a CLI command.
- And, less importantly, this file contains all the metadata of your project in case you want to publish the project as an npm package. As such you specify title, authors, current version etc.
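A minimal package.json illustrating these three parts might look like this (names and version numbers are made up):

```json
{
  "name": "my-project",
  "version": "0.1.0",
  "dependencies": {
    "lodash": "^4.17.0"
  },
  "scripts": {
    "build": "webpack --mode production"
  }
}
```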
When somebody new starts working on a project, all they need to do is get the project code from version control (only the internal code and the package.json file are in version control, not the external dependencies) and then run npm install inside the root directory. This reads the dependency list in package.json, downloads all those packages and all their dependencies from the registry and puts them into a directory in the project root.
LaTeX is in a good position to easily establish its own tooling for automated dependency management. With CTAN we already have an online registry of packages. Also, LaTeX already has a means of describing the dependencies of a document: the \usepackage directives in the preamble.
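For illustration, a preamble already reads like a dependency manifest (the package names below are just examples). The one thing it lacks compared to package.json is version information:

```latex
\documentclass{article}
% each line below names a package a dependency manager could resolve via CTAN
\usepackage[T1]{fontenc}
\usepackage{amsmath}
\usepackage{booktabs}
\begin{document}
Hello, world.
\end{document}
```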
My vision for a LaTeX dependency manager is a little program that reads the main LaTeX source file and installs all required packages automatically. Installation could work in any of the following ways:
1. If TeX Live is installed on the system, use the executables from that installation. Download packages that are required by the document but not present directly from CTAN to ...
a. a directory in the project
b. a project-agnostic directory located in the user's home directory
2. Download and install TeX Live. Offer an express installation, i.e. do not ask for the install location etc., but use sane defaults. Install no LaTeX packages except the absolute bare minimum. After that, use tlmgr to install all packages that are needed for the document, and no more. There are multiple ways this can be done:
a. npm style: install in the project directory.
b. Install in the user's home directory. This will be enough for many users, because oftentimes a computer is used by only one person anyway.
c. Install globally for all users. This will require fiddling with admin privileges.
3. Use the system package manager. I think it should be doable to map LaTeX package names to the names of the corresponding packages in the system package manager.
These options are ordered by increasing difficulty.
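The "read the preamble, install what's missing" step can be sketched in a few lines of shell. This is deliberately naive: it ignores \RequirePackage, commented-out lines, and the fact that a CTAN/tlmgr package name does not always match the .sty file name, so a real tool would need a mapping table.

```shell
#!/bin/sh
# Extract package names from \usepackage[opts]{name1,name2} lines
# of a LaTeX source file (default: main.tex).
TEXFILE="${1:-main.tex}"

grep -o '\\usepackage\(\[[^]]*\]\)\?{[^}]*}' "$TEXFILE" \
  | sed 's/.*{\([^}]*\)}.*/\1/' \
  | tr ',' '\n' \
  | sort -u
# a real tool would pipe this list into something like: xargs tlmgr install
```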
- dev script: This typically serves the frontend to the browser by means of a locally running web server. It also watches the files in the project for changes and automatically rebuilds the code and reloads the served page.
- build script: This builds the project for deployment.
Scripts are run by invoking npm run <scriptname> from the command line.
'Build tool' is a common term for programs that automate the build process. With LaTeX, I think many people use Makefiles. The build process of a LaTeX document typically comprises bibliography processing and up to three LaTeX compilation runs. Additionally, you might want to perform some actions to keep the working directory clean of compilation byproducts. Maybe you're writing the body of the document in Markdown or some other markup and need a markdown-to-LaTeX translation step. Maybe there is some data that has to be plotted during the build.
TL;DR: build automation is pretty much the norm. Some thought should be given to whether there are benefits to be gained from including a build scripting feature in a possible LaTeX dependency manager, thereby turning it into a full-blown build tool like sbt or Maven. Maybe a plain Makefile does the job well enough. However, I would love to save TeX Live users from having to find out about the nonstop mode of the TeX compiler. Also, an authoring mode with automatic recompilation on file change would be neat.