Let’s be honest: how many times does your Angular app hit the same endpoint over and over again?
- Multiple components requesting the same data
- Identical parallel requests
- Users navigating back and forth between routes
- The same service invoked multiple times
Result: unnecessary backend load and wasted latency.
NgHttpCaching solves this problem in a clean and predictable way.
It’s an Angular HTTP interceptor that adds configurable, intelligent caching to your HttpClient without changing your architecture.
🚀 How It Works
NgHttpCaching intercepts outgoing HTTP requests and:
- Checks if a valid cached response exists
- If yes → returns it immediately
- If no → sends the request to the backend
- When the response arrives → stores it in cache
It also automatically handles simultaneous requests to the same endpoint:
only one real HTTP call is made, and the others subscribe to the same observable.
Clean. Efficient. Deterministic.
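The deduplication idea can be sketched outside Angular in a few lines. This is an illustration of the technique (share one in-flight promise per key), not the library's actual internals:

```typescript
// Sketch: deduplicate concurrent requests for the same URL by
// sharing a single in-flight promise among all callers.
type Fetcher = (url: string) => Promise<string>;

class RequestDeduper {
  private inFlight = new Map<string, Promise<string>>();

  constructor(private fetcher: Fetcher) {}

  get(url: string): Promise<string> {
    const pending = this.inFlight.get(url);
    if (pending) return pending; // reuse the call already in flight
    const p = this.fetcher(url).finally(() => this.inFlight.delete(url));
    this.inFlight.set(url, p);
    return p;
  }
}

// Two simultaneous calls trigger only one real fetch.
let calls = 0;
const deduper = new RequestDeduper(async (url) => {
  calls++; // counts real "network" calls
  return `body of ${url}`;
});
const a = deduper.get('/api/users');
const b = deduper.get('/api/users');
```

In the library the shared unit is an RxJS observable rather than a promise, but the principle is the same.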
✨ Key Features
- ✅ HTTP caching via interceptor
- ✅ Handles simultaneous/parallel requests
- ✅ Automatic garbage collector for expired entries
- ✅ Automatic cache invalidation on mutations (POST, PUT, DELETE, PATCH)
- ✅ MemoryStorage, LocalStorage, SessionStorage or custom store
- ✅ Optional support for `cache-control` and `expires` headers
- ✅ Fully configurable
⚡ Installation
```shell
npm i ng-http-caching
```
With Angular standalone:
```typescript
import { bootstrapApplication } from '@angular/platform-browser';
import { provideHttpClient, withInterceptorsFromDi } from '@angular/common/http';
import { provideNgHttpCaching } from 'ng-http-caching';
import { AppComponent } from './app.component';

bootstrapApplication(AppComponent, {
  providers: [
    provideNgHttpCaching(),
    provideHttpClient(withInterceptorsFromDi())
  ]
});
```
That’s it. Your GET requests are now cached.
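Consuming code does not change at all. A service keeps using `HttpClient` exactly as before, and the interceptor does the caching transparently (the endpoint below is a placeholder):

```typescript
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';

// Nothing here mentions caching: the interceptor handles it.
@Injectable({ providedIn: 'root' })
export class UserService {
  constructor(private http: HttpClient) {}

  // Repeated calls within the cache lifetime are served from cache.
  getUsers() {
    return this.http.get('/api/users');
  }
}
```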
🔧 Configuration
Everything is optional and customizable.
```typescript
import { NgHttpCachingConfig } from 'ng-http-caching';

const config: NgHttpCachingConfig = {
  lifetime: 1000 * 10, // 10 seconds
  allowedMethod: ['GET', 'HEAD'],
};
```
Then:
```typescript
provideNgHttpCaching(config)
```
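To make the lifetime semantics concrete, here is a tiny sketch of the kind of check an expiry test performs. The field names are illustrative, not the library's internals:

```typescript
// Hypothetical cache entry: an entry is stale once more than
// `lifetime` milliseconds have passed since it was stored.
interface CacheEntry {
  addedTime: number;   // timestamp (ms) when the response was cached
  response: unknown;
}

const isExpired = (entry: CacheEntry, lifetime: number, now: number): boolean =>
  now - entry.addedTime > lifetime;

const entry: CacheEntry = { addedTime: 0, response: { users: [] } };
```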
🧠 Smart Cache Invalidation on Mutations
Caching is easy. Keeping it consistent is not.
NgHttpCaching includes automatic mutation invalidation strategies:
```typescript
clearCacheOnMutation: NgHttpCachingMutationStrategy.COLLECTION
```
Available strategies:
- `NONE` → No automatic invalidation
- `ALL` → Clear the entire cache
- `IDENTICAL` → Clear entries with the same URL
- `COLLECTION` → Clear the resource and its parent collection
- Custom function → Fully custom logic
Example:
```typescript
clearCacheOnMutation: (req) => req.url.includes('/api/critical-data')
```
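A custom strategy is just a predicate over the mutating request: return `true` to clear the cache, `false` to leave it alone. A framework-free version of the one above behaves like this (a plain `url` object stands in for Angular's `HttpRequest`):

```typescript
// Clear the cache only when a mutation touches the critical endpoint.
const shouldClear = (req: { url: string }): boolean =>
  req.url.includes('/api/critical-data');
```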
💾 Persistent Storage
Default storage is in-memory.
But you can switch to:
LocalStorage
```typescript
import { withNgHttpCachingLocalStorage } from 'ng-http-caching';

provideNgHttpCaching({
  store: withNgHttpCachingLocalStorage()
});
```
SessionStorage
```typescript
import { withNgHttpCachingSessionStorage } from 'ng-http-caching';

provideNgHttpCaching({
  store: withNgHttpCachingSessionStorage()
});
```
Custom Store
Implement `NgHttpCachingStorageInterface` and plug in your own logic.
🏷 Tag-Based Cache Management
You can tag requests:
```typescript
headers: {
  [NgHttpCachingHeaders.TAG]: 'users'
}
```
Then clear them later:
```typescript
ngHttpCachingService.clearCacheByTag('users');
```
Perfect for domain-specific invalidation scenarios.
🎯 Per-Request Overrides
You can control caching behavior via headers:
- `X-NG-HTTP-CACHING-ALLOW-CACHE`
- `X-NG-HTTP-CACHING-DISALLOW-CACHE`
- `X-NG-HTTP-CACHING-LIFETIME`
- `X-NG-HTTP-CACHING-TAG`
Or override configuration methods using `HttpContext`.

You can customize:

- `isExpired`
- `isValid`
- `isCacheable`
- `getKey`

Even per request.
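For example, extending the lifetime of a single request might look like this (assuming the `NgHttpCachingHeaders` enum mirrors the raw header names listed above; verify against the library's exports):

```typescript
import { NgHttpCachingHeaders } from 'ng-http-caching';
import { HttpHeaders } from '@angular/common/http';

// Inside a service that injects HttpClient:
// cache just this call for one minute, regardless of the global lifetime.
this.http.get('/api/users', {
  headers: new HttpHeaders({
    [NgHttpCachingHeaders.LIFETIME]: (1000 * 60).toString(),
  }),
});
```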
🛠 Advanced Example: Safe POST Caching
By default, the cache key is `METHOD@URL_WITH_PARAMS`.
If you want to cache POST/PUT safely, define a custom key:
```typescript
getKey: (req) => {
  return req.method + '@' +
    req.urlWithParams + '@' +
    JSON.stringify(req.body);
}
```
Now even mutation requests can be uniquely cached.
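Extracted as a standalone function, that key builder behaves as follows (`ReqLike` mimics the relevant `HttpRequest` fields):

```typescript
// Stand-in for the fields of Angular's HttpRequest that the key uses.
interface ReqLike {
  method: string;
  urlWithParams: string;
  body: unknown;
}

// Same logic as the custom getKey above: two POSTs to the same URL
// with different bodies now produce different cache keys.
const getKey = (req: ReqLike): string =>
  req.method + '@' + req.urlWithParams + '@' + JSON.stringify(req.body);
```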
🧹 Manual Cache Control
Inject `NgHttpCachingService`:

```typescript
constructor(private cache: NgHttpCachingService) {}
```
Available methods:
- `clearCache()`
- `clearCacheByKey()`
- `clearCacheByRegex()`
- `clearCacheByTag()`
- `runGc()`
- `getFromCache()`
Full control when you need it.
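A typical use is a manual refresh action that evicts matching entries before refetching. A sketch, assuming `clearCacheByRegex` accepts a `RegExp` (the pattern here is illustrative):

```typescript
import { Injectable } from '@angular/core';
import { NgHttpCachingService } from 'ng-http-caching';

@Injectable({ providedIn: 'root' })
export class UserRefreshService {
  constructor(private cache: NgHttpCachingService) {}

  // Drop every cached entry whose key matches /api/users,
  // so the next GET hits the backend again.
  refreshUsers(): void {
    this.cache.clearCacheByRegex(/\/api\/users/);
  }
}
```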
📦 Live Demo
Try it here: