Who This Is For
If you are deploying a frontend or full-stack app on AWS Amplify and your builds feel slower than they should be, this post is worth reading. We are going to talk about Amplify's caching system — what it is supposed to do, what it actually does, and why, in my experience, it makes things worse, not better.
No deep AWS knowledge is required. If you know what a build pipeline is and have used Amplify at least once, you will follow this completely.
The Problem I Ran Into
I was deploying an app on AWS Amplify. The build had two phases: install packages and build the app. Pretty standard setup.
The total build time was sitting at around 9 minutes. That felt too long. So I opened the build logs and started looking at where the time was actually going.
Here is what I found:
- 3 minutes to restore cache (fetch previously stored files)
- 3 minutes to install packages and build the app
- 3 minutes to save cache (store files for the next build)

So out of 9 minutes, the actual work — installing and building — was only 3 minutes. The other 6 minutes were spent entirely on cache operations.
That immediately felt wrong. Cache is supposed to speed things up. If it is consuming twice the time of the actual build, something is broken.
The First Experiment: Disable Cache Entirely
My first instinct was simple. What if I just removed the cache configuration completely and let Amplify install everything fresh every time?
I removed the cache settings from my amplify.yml build config and triggered a new build.
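For reference, a cache section in amplify.yml typically looks something like this — the `node_modules/**/*` path is the common pattern for frontend builds; your exact block may differ:

```yaml
frontend:
  phases:
    preBuild:
      commands:
        - npm ci
    build:
      commands:
        - npm run build
  cache:
    paths:
      - node_modules/**/*
```

Deleting the `cache` block (or leaving `paths` empty) is all it takes to turn the feature off.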
The result: 3 minutes and 30 seconds.
The build went from 9 minutes to 3 minutes 30 seconds just by removing cache. Yes, it took an extra 30 seconds to download packages compared to the ideal cached scenario. But it saved 6 full minutes of cache overhead.
This alone should raise a flag. The cache was not saving time. It was adding time.
What Is Amplify Cache, Exactly?
Before going further, let me explain how Amplify's caching works, because understanding the mechanism is key to understanding why it fails.
When Amplify runs a build, it can be configured to save certain folders — most commonly node_modules — by zipping them up and storing them in S3 (AWS's file storage service). On the next build, it fetches that zip, unzips it into the build environment, and in theory your packages are already there so the install step is faster.
The key operation here is: zip and upload after a build, download and unzip before the next build.
This is how Amplify's cache model works. It is essentially just copying folders in and out of storage between builds.
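As a rough illustration — not Amplify's actual implementation, and the S3 destination here is a placeholder — the cache step behaves something like this:

```shell
set -e
# Stand-in for a node_modules folder produced by a build.
mkdir -p node_modules/some-package
echo '{}' > node_modules/some-package/package.json

# After a build: zip the cached folder and upload it to S3.
tar -czf cache.tar.gz node_modules
# aws s3 cp cache.tar.gz s3://<amplify-managed-bucket>/cache.tar.gz

# Before the next build: download the zip and unpack it
# into a fresh build environment.
# aws s3 cp s3://<amplify-managed-bucket>/cache.tar.gz cache.tar.gz
rm -rf node_modules
tar -xzf cache.tar.gz
test -f node_modules/some-package/package.json
```

Every byte in node_modules gets compressed, uploaded, downloaded, and decompressed on every build — which is where those 3-minute restore and save steps come from.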
The Second Experiment: Maybe It Is My Project
After the first result, I thought maybe the problem was specific to my project. I had a reasonably large dependency tree. Maybe the node_modules folder was so big that zipping and unzipping it was always going to take longer than just reinstalling.
So I created a minimal test project — a simple website with almost no packages. Just enough to have a package.json and a basic build step. The kind of project where node_modules is tiny and cache should be trivially fast.
I deployed it on Amplify with cache enabled.
Same result. Amplify spent time fetching the cache, and then installed all dependencies from scratch anyway. The cache folder it had stored from the previous build was essentially ignored from a practical standpoint.
The Root Cause: Cache and npm Are Fundamentally Incompatible
After these experiments, I did some digging and found the real reason this does not work. It comes down to how npm (the package manager) behaves versus how Amplify's cache model works.
Amplify caches folders. That is it. It saves a folder, restores a folder.
But here is the problem:
If you use npm ci (which is the recommended command for CI/CD pipelines because it gives you clean, reproducible installs), it deletes node_modules entirely before installing. Every single time. It does not matter that Amplify just spent 3 minutes restoring that folder. npm ci will delete it and start over.
If you use npm install (the more common development command), it does not always delete node_modules, but it re-evaluates the dependency tree and may reinstall or update packages depending on what it finds. So even here, the cache is not reliably used.
In both cases, the cached node_modules folder is either deleted outright or partially ignored.
Amplify's own documentation recommends using npm ci for builds. But npm ci by design destroys exactly what Amplify's cache tries to preserve. These two things directly contradict each other.
The cache model and the install command are working against each other.
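You can see the contradiction in miniature with plain shell commands. This simulates the restore step and then the first thing npm ci effectively does (a real npm ci would then reinstall the whole tree from package-lock.json):

```shell
set -e
# Simulate Amplify restoring a cached node_modules folder.
mkdir -p node_modules/some-package
echo 'restored from cache' > node_modules/some-package/marker.txt

# npm ci's first step is, in effect, deleting node_modules wholesale
# so it can install a clean, reproducible tree.
rm -rf node_modules

# The restored folder is gone before any install work begins.
[ ! -d node_modules ] && echo "restored cache was discarded"
```

The minutes Amplify spent downloading and unpacking the folder buy nothing, because the very next command throws the folder away.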
A Simple Way to Think About It
Imagine you spend 10 minutes carefully organizing your desk every night before bed so it is ready for tomorrow. But every morning, the first thing you do is clear everything off the desk and start fresh. The organizing you did the night before is completely wasted.
That is exactly what is happening here. Amplify organizes the node_modules folder into cache. npm wipes the desk clean every build.
What the Numbers Look Like Side by Side
To make this concrete, here is a comparison of what I observed:
With cache enabled:
- Restore cache: ~3 minutes
- Install and build: ~3 minutes
- Save cache: ~3 minutes
- Total: ~9 minutes

With cache disabled:
- Install and build: ~3 minutes 30 seconds
- Total: ~3 minutes 30 seconds

The "optimized" build with cache took more than twice as long as the build with no cache at all.
What You Should Do Instead
Based on everything above, my recommendation is straightforward: disable Amplify cache unless you have a very specific reason to use it and have verified it is actually helping.
To disable it, remove or empty the cache section from your amplify.yml. Here is what a build config without cache looks like:
```yaml
version: 1
frontend:
  phases:
    preBuild:
      commands:
        - npm ci
    build:
      commands:
        - npm run build
  artifacts:
    baseDirectory: build
    files:
      - '**/*'
```
No cache block. Clean and simple.
If your builds are still slow after removing cache, the bottleneck is likely somewhere else — large dependencies, slow build tools, or the build machine itself. Those are worth investigating separately, but at least you will not be wasting time on a cache that is not working.
My Conclusion
AWS Amplify's cache feature is built on a model that zips and unzips folders between builds. That model does not account for how npm actually works. npm ci deletes node_modules before every install. npm install may partially reinstall anyway. The result is that the cache restore step costs real time — in my case, 3 minutes per build — and delivers no actual benefit.
I tested this on a large app and a minimal app. I tried npm ci and npm install. I made sure cache folders were correctly configured and permissions were in place. In every scenario, disabling cache made builds faster.
This feels like a fundamental design mismatch between Amplify's caching mechanism and how modern package managers work.
Has This Happened to You?
I am genuinely curious whether other developers have experienced this. Have you found a way to make Amplify cache actually work? Did you measure a real improvement? Or did you hit the same wall?
Drop a comment or reach out — I would love to hear if someone has cracked this or if this is a widely shared frustration in the community.
Need Help With Your Amplify Setup?
If you are running into build time issues or anything else with your Amplify deployment, feel free to reach out. Happy to help.
Email me at khantanseer43@gmail.com