
Discussion on: GitLab CI: Cache and Artifacts explained by example

Benoit COUETIL 💫

node_modules can be huge in the real world, and therefore unsuitable for artifacts, which are limited in size. Worth knowing: artifacts are also uploaded to the central GitLab instance, which can become a bottleneck on a large GitLab installation with many runners uploading to it.

Other than that, thank you, I learned that npm ci is slow because it deletes node_modules first 🙏

Agata Zurek

Yes, this! My project's node_modules is 2 GB, which is too big for artifacts. What is the recommended way to deal with that? I've had to run npm ci in every job just to get my pipeline to work at all.

Benoit COUETIL 💫 • Edited

You should use cache. This is exactly why cache exists, and it can even be shared across pipelines.

But cache has to be configured on your runners, or you will get cache misses whenever your jobs switch runners (which should not be a problem in itself; npm will simply reinstall).
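
For illustration, here is a minimal .gitlab-ci.yml sketch of that approach, assuming an npm project with a package-lock.json and a node:20 image (both assumptions, not from this thread). Since npm ci wipes node_modules anyway, it caches npm's download directory (.npm) keyed on the lockfile, so switching runners only costs a re-download on the first miss:

```yaml
# Minimal sketch: share npm's download cache across jobs and pipelines.
default:
  image: node:20                  # assumed image, adjust to your project
  cache:
    key:
      files:
        - package-lock.json       # one cache per lockfile revision
    paths:
      - .npm/                     # cache npm downloads, not node_modules

build:
  stage: build
  script:
    - npm ci --cache .npm --prefer-offline   # reuse cached packages when the cache is restored
    - npm run build

test:
  stage: test
  script:
    - npm ci --cache .npm --prefer-offline
    - npm test
```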

Anton Yakutovich

If you compare the timings on a clean system, I bet npm ci would be faster than npm install, because it just downloads the full dependency tree from package-lock.json. npm install has to check which dependencies can be updated and build a new dependency tree.
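
A quick way to check that on a clean runner is to time both commands in throwaway jobs; a hedged sketch (job names and the node:20 image are illustrative, not from this thread):

```yaml
# Two disposable jobs that time each command from a clean checkout.
time-npm-install:
  image: node:20
  script:
    - time npm install   # resolves the dependency tree, may update package-lock.json

time-npm-ci:
  image: node:20
  script:
    - time npm ci        # installs exactly what package-lock.json pins
```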