Episode 6: Caching & Artifacts (Faster Pipelines, Smarter CI)
So far in this series, we've focused on correctness:
- workflows
- runners
- actions
- secrets and environments
Now it's time to focus on something every developer cares about:
Speed ⚡
In this episode, we'll dig into caching and artifacts: two concepts that are often confused but serve very different purposes in GitHub Actions.
Why This Topic Matters
Without caching:
- every CI run downloads dependencies again
- pipelines are slow
- developers wait unnecessarily
Without artifacts:
- jobs can't share build output
- deploy steps become messy
- pipelines break in subtle ways
A good pipeline uses both, and uses each correctly.
1️⃣ What is Caching?
Caching means:
Reusing data from previous workflow runs to speed up future runs
Typical things to cache:
- npm / yarn / pnpm cache
- dependency downloads
- build tool caches
Caching is about:
- performance
- efficiency
- faster feedback loops
⚠️ The cache is not guaranteed to exist: workflows must still work without it.
2️⃣ What are Artifacts?
Artifacts are:
Files produced by a job that need to be used later
Common artifact examples:
- build output (`dist/`, `build/`)
- test reports
- coverage reports
- logs
Artifacts are about:
- sharing data
- traceability
- correctness
3️⃣ Cache vs Artifacts (Very Important)
| Aspect | Cache | Artifact |
|---|---|---|
| Purpose | Speed | Share files |
| Between runs | ✅ | ❌ |
| Between jobs | ❌ | ✅ |
| Guaranteed | ❌ | ✅ |
| Typical use | dependencies | build output |
👉 One-line rule
Cache = speed, Artifact = share
4️⃣ Dependency Caching (npm Example)
❌ Without caching
Every run:
npm ci → download everything → slow
✅ With caching (recommended approach)
```yaml
- name: Cache npm
  uses: actions/cache@v4
  with:
    path: ~/.npm
    key: ${{ runner.os }}-npm-${{ hashFiles('package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-npm-
```
Then install:
```yaml
- run: npm ci
```
What happens:
- First run → cache miss → downloads deps
- Next run → cache hit → much faster
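If you'd rather not manage the cache step yourself, `actions/setup-node` can handle it: its `cache` input caches npm's global cache directory keyed on your lockfile. A minimal sketch (the Node version is just an example):
```yaml
- uses: actions/setup-node@v4
  with:
    node-version: 20        # example version; pick whatever your project targets
    cache: npm              # caches ~/.npm, keyed on package-lock.json
- run: npm ci
```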
5️⃣ Why NOT Cache node_modules?
Caching `node_modules` is risky because:
- it's OS- and architecture-dependent
- it can go out of sync with `package-lock.json`
- it's very large
- failures become hard to debug
Best practice:
Cache what you download, not what you install
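In cache terms, that means pointing `path:` at the package manager's download cache rather than at the installed tree. A quick sketch of the distinction:
```yaml
# ✅ Cache what you download: npm's global cache, portable across runs
path: ~/.npm

# ❌ Avoid caching what you install: OS/arch specific and easily stale
# path: node_modules
```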
6️⃣ Uploading Artifacts (Build Job)
```yaml
- name: Upload build output
  uses: actions/upload-artifact@v4
  with:
    name: build-output
    path: dist/
```
This:
- stores `dist/`
- makes it available to other jobs
- keeps it for later inspection
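If you want more control, `upload-artifact` also accepts optional inputs such as `retention-days` and `if-no-files-found`. A sketch (the 7-day retention is an arbitrary choice):
```yaml
- name: Upload build output
  uses: actions/upload-artifact@v4
  with:
    name: build-output
    path: dist/
    retention-days: 7           # keep the artifact for a week instead of the repo default
    if-no-files-found: error    # fail the step if the build produced nothing
```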
7️⃣ Downloading Artifacts (Deploy Job)
```yaml
- name: Download build output
  uses: actions/download-artifact@v4
  with:
    name: build-output
```
Now the deploy job can use the exact files produced by the build job.
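By default the files are extracted into the working directory; if you want them in a specific folder, the action takes an optional `path` input (the `build-output/` folder here is just an example):
```yaml
- name: Download build output
  uses: actions/download-artifact@v4
  with:
    name: build-output
    path: build-output/        # download into this folder instead of the workspace root
```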
8️⃣ Real-World Example: Build → Deploy
```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      - run: npm run build
      - uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist/

  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: dist
      - run: echo "Deploying build output"
```
This pattern is clean, reliable, and production-safe.
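To tie the two ideas together, here is the same build job with dependency caching added on top (the deploy job stays exactly as above); a sketch reusing the cache key pattern from earlier:
```yaml
build:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - name: Cache npm                   # speed: reuse downloads between runs
      uses: actions/cache@v4
      with:
        path: ~/.npm
        key: ${{ runner.os }}-npm-${{ hashFiles('package-lock.json') }}
        restore-keys: |
          ${{ runner.os }}-npm-
    - run: npm ci
    - run: npm run build
    - uses: actions/upload-artifact@v4  # share: hand dist/ to the deploy job
      with:
        name: dist
        path: dist/
```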
9️⃣ Common Mistakes 🚨
❌ Using the cache to share build output
❌ Expecting the cache to always exist
❌ Using static cache keys (see the key sketch after this list)
❌ Uploading huge artifacts unnecessarily
❌ Forgetting `needs:` between jobs
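On the static-key point: a key that never changes means the cache never invalidates, even after your dependencies change. A sketch of the difference:
```yaml
# ❌ Static key: one cache entry forever, served even when the lockfile changes
key: npm-cache

# ✅ Dynamic key: a new lockfile hash produces a fresh cache entry
key: ${{ runner.os }}-npm-${{ hashFiles('package-lock.json') }}
```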
Final Mental Model (Lock This In)
Cache → reuse between runs
Artifact → share between jobs
If you understand this distinction, your pipelines will be:
- faster
- more reliable
- easier to debug
What's Next?
👉 Episode 7:
Reusable Workflows & DRY Pipelines
We'll learn how to:
- avoid copy-pasting workflows
- centralize CI/CD logic
- scale GitHub Actions across repos
Follow along; we're getting very close to production-level mastery 🚀
Thanks for reading!
Happy automating ⚡