Request Deduplication in Angular

Many Angular applications share a common problem: using the same data in different components. There are many ways to solve this issue, but in this article I am going to explain a really elegant solution that helped me get rid of some misconceptions and improved my understanding of the overall architecture of an Angular application.

Over the last couple of years the feature-based architectural approach has gained a lot of popularity in the Angular community. You divide your app into small, self-sufficient parts and keep components, services and models within a so-called feature module. This makes it really easy to extend functionality and reuse a feature wherever it is needed in the app. Unfortunately, this approach also has its drawbacks. Since every feature is responsible for handling its own data, it is only a matter of time until you get duplicated requests. One solution is to fetch the data globally and forward it to children via input properties, but this adds complexity and makes a feature harder to reuse. Therefore we need a solution where everything can stay within the feature module.

I started to dig around the internet and, interestingly, found the solution in the Next.js docs. The data-fetching section explains the general concept: in Next.js all requests are made using an extended version of the fetch API, and requests are automatically memoized, so you can fetch the same data in multiple places in a React component tree while only executing the request once. For example, if you need the same data across a route (e.g. in a layout, a page and multiple components), you do not have to fetch it at the top of the tree and forward props between components. Instead, you can fetch the data in the components that need it, without worrying about the performance implications of making multiple network requests for the same data.

Request deduplication overview
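To make that concrete, here is a rough sketch of how this plays out in a Next.js App Router project. The endpoint and the components are made up for illustration; the point is that the identical fetch call only hits the network once per render pass.



// page.tsx — both server components run in the same render pass; the identical
// fetch call is memoized by Next.js, so only one network request is made.
async function getUser() {
  const res = await fetch('https://example.com/api/user'); // made-up endpoint
  return res.json();
}

async function UserName() {
  const user = await getUser(); // reuses the memoized result
  return <span>{user.name}</span>;
}

export default async function Page() {
  const user = await getUser(); // triggers the actual request
  return (
    <main>
      <h1>{user.email}</h1>
      <UserName />
    </main>
  );
}
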

While this sounds exactly like the behavior we are looking for, the Angular framework unfortunately does not provide such functionality out of the box. Therefore I sat down and built a solution myself.

A great place to start is the HttpInterceptor. First we create a new interceptor called CustomInterceptor and add it to the providers array of our app.module.ts.



// custom.interceptor.ts
@Injectable()
export class CustomInterceptor implements HttpInterceptor {

  intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
    // For now, simply pass every request through unchanged
    return next.handle(request);
  }
}

// app.module.ts
@NgModule({
  declarations: [
    ...
  ],
  imports: [
    ...
  ],
  providers: [
    {
      provide: HTTP_INTERCEPTORS,
      useClass: CustomInterceptor,
      multi: true
    }
  ],
  bootstrap: [AppComponent]
})
export class AppModule {}



Within the interceptor we can run custom logic right before a request is sent out to the network. Let's start by implementing a simple caching behavior.



// custom.interceptor.ts
@Injectable()
export class CustomInterceptor implements HttpInterceptor {

  // Cache of received responses, keyed by URL including query params
  cache: Map<string, HttpEvent<unknown>> = new Map();

  intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {

    // Only GET requests are cached
    if (request.method !== 'GET') {
      return next.handle(request);
    }

    // Serve the response from the cache if we already have one for this URL
    const cached = this.cache.get(request.urlWithParams);
    if (cached) {
      return of(cached);
    }

    // Otherwise forward the request and store the response in the cache
    return next.handle(request).pipe(
      tap(event => {
        if (event instanceof HttpResponse) {
          this.cache.set(request.urlWithParams, event.clone());
        }
      })
    );
  }
}



In the code above we first check whether the request is eligible for caching. Since we want to keep things simple, we only cache GET requests and immediately return next.handle(request) otherwise. We then look into our cache and, if it contains an entry for the current request, return that entry instead of making a new request. If the cache does not hold an entry yet, we tap into the response stream of next.handle(request) and use request.urlWithParams as the key to store the cloned response event in our cache.
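
To see the effect from a consumer's perspective, here is a minimal sketch. The PeopleService, the component and the swapi.dev endpoint are only examples and not part of the interceptor itself; the key point is that once a response has arrived, any later subscription to the same URL is answered from the cache instead of the network.



// people.service.ts — hypothetical consumer used for illustration
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class PeopleService {

  constructor(private http: HttpClient) {}

  getPerson$(id: number): Observable<any> {
    return this.http.get(`https://swapi.dev/api/people/${id}`);
  }
}

// person.component.ts — the first instance to receive a response fills the
// interceptor's cache; every instance created after that is served from it,
// so no further network request is made for the same URL.
import { Component, OnInit } from '@angular/core';

@Component({ selector: 'app-person', template: '{{ person?.name }}' })
export class PersonComponent implements OnInit {

  person: any;

  constructor(private peopleService: PeopleService) {}

  ngOnInit(): void {
    this.peopleService.getPerson$(1).subscribe(person => (this.person = person));
  }
}
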

When you run your application with this setup, all requests that have already received a response are served from the cache. While this is certainly going in the right direction, there is another problem left to solve: requests that are fired concurrently, before the first response has arrived, still go out to the network because nothing is in the cache yet. Therefore we now implement the actual deduplication behavior via a queue. We extend the HttpInterceptor as depicted below.



// custom.interceptor.ts
@Injectable()
export class CustomInterceptor implements HttpInterceptor {

  cache: Map<string, HttpEvent<unknown>> = new Map();

  // Queue to store ongoing requests
  queue: Map<string, Observable<HttpEvent<unknown>>> = new Map();

  constructor() {}

  intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {

    if (request.method !== 'GET') {
      return next.handle(request);
    }

    // Check if the request is currently in the queue
    const queued = this.queue.get(request.urlWithParams);
    if (queued) {
      return queued;
    }

    const cached = this.cache.get(request.urlWithParams);
    if (cached) {
      return of(cached);
    }

    const shared = next.handle(request).pipe(
      tap(event => {
        if (event instanceof HttpResponse) {
          this.cache.set(request.urlWithParams, event.clone());
        }
      }),
      // Delete from the queue once the request has been answered
      finalize(() => this.queue.delete(request.urlWithParams)),
      // Share the stream so additional subscribers don't cause duplicate requests
      shareReplay()
    );

    // Add the request to the queue
    this.queue.set(request.urlWithParams, shared);

    return shared;
  }
}



Similar to the cache, we add another Map called queue. We use it to store all ongoing requests until they have received a response. So again we first check whether a request is currently in the queue and, if so, return the queued Observable. Otherwise we store the entire next.handle(request) Observable in the queue. We add the finalize operator to delete the entry from the queue as soon as the Observable completes. Last but not least, we add shareReplay so that the additional subscriptions do not trigger duplicate requests.

If we now run the application again, even concurrently fired requests no longer cause additional network requests.
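
A quick way to convince yourself is to subscribe to the same request twice at the same time, for example with forkJoin. The demo component below is only a sketch and reuses the hypothetical PeopleService from the earlier example: both subscribers receive the response, but only a single request shows up in the browser's network tab.



// dedup-demo.component.ts — hypothetical component for illustration
import { Component, OnInit } from '@angular/core';
import { forkJoin } from 'rxjs';
import { PeopleService } from './people.service';

@Component({ selector: 'app-dedup-demo', template: '' })
export class DedupDemoComponent implements OnInit {

  constructor(private peopleService: PeopleService) {}

  ngOnInit(): void {
    // The first subscription puts the shared stream into the queue; the second
    // one gets handed that same stream, so only one request hits the network.
    forkJoin([
      this.peopleService.getPerson$(1),
      this.peopleService.getPerson$(1)
    ]).subscribe(([first, second]) => {
      console.log(first, second); // identical responses, one network request
    });
  }
}
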

Bonus 1: Cache only for current active route path

While it is certainly nice to cache data for the entire lifespan of your application, some applications need to frequently pull in up-to-date data. Some users might even expect your application to serve fresh data as soon as they navigate to a different route path. We can easily extend our CustomInterceptor to achieve this behavior.



// custom.interceptor.ts
@Injectable()
export class CustomInterceptor implements HttpInterceptor {

  cache: Map<string, HttpEvent<unknown>> = new Map();

  queue: Map<string, Observable<HttpEvent<unknown>>> = new Map();

  // Store the currently active url path
  currentUrl: string = '';

  constructor(private router: Router) {}

  intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {

    // Clear cache and queue whenever the active url path changes
    if (this.router.url !== this.currentUrl) {
      this.currentUrl = this.router.url;

      this.cache.clear();
      this.queue.clear();
    }

    if (request.method !== 'GET') {
      return next.handle(request);
    }

    // ...

  }
}



We add a variable currentUrl to hold the currently active route path of our application. As soon as the route path changes, we update currentUrl and clear both the cache and the queue.

Now your application deduplicates requests only while you stay on the current route. As soon as you navigate to a different view, the deduplication starts all over again.
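
As a rough illustration (again using the hypothetical PeopleService, plus made-up route names): within one route the request is deduplicated, but after a navigation the cache has been cleared and the data is fetched again.



// route-cache-demo.component.ts — hypothetical component for illustration
import { Component } from '@angular/core';
import { Router } from '@angular/router';
import { PeopleService } from './people.service';

@Component({ selector: 'app-route-cache-demo', template: '' })
export class RouteCacheDemoComponent {

  constructor(private router: Router, private peopleService: PeopleService) {}

  load(): void {
    this.peopleService.getPerson$(1).subscribe(); // network request, then cached
    this.peopleService.getPerson$(1).subscribe(); // deduplicated / served from cache

    this.router.navigate(['/details']).then(() => {
      // router.url has changed, so the interceptor clears its cache and queue
      this.peopleService.getPerson$(1).subscribe(); // fresh network request
    });
  }
}
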

Bonus 2: Skip cache for certain requests

Of course there are places in an app where you always need to fetch up-to-date data. Let's implement a simple flag that allows selected requests to skip the cache.



// custom.interceptor.ts
export const SKIP_CACHE = new HttpContextToken<boolean>(() => false);

@Injectable()
export class CustomInterceptor implements HttpInterceptor {

  // ...

  intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
    // ...

    // Bypass cache and queue whenever the SKIP_CACHE token is set to true
    if (request.context.get(SKIP_CACHE)) {
      return next.handle(request);
    }

    if (request.method !== 'GET') {
      return next.handle(request);
    }

    // ...

  }
}

// data.service.ts
@Injectable({ providedIn: 'root' })
export class DataService {

  constructor(private http: HttpClient) {}

  getData$(id: number): Observable<any> {
    const context = new HttpContext().set(SKIP_CACHE, true);
    return this.http.get(`https://swapi.dev/api/people/${id}`, { context });
  }
}



We introduce an HttpContextToken called SKIP_CACHE. As soon as this token is present on an HTTP request and its value is set to true, we immediately return next.handle(request).

Whenever you want a certain request to skip the cache, you simply set this token on an HttpContext and pass it to the HttpClient's get method.

Conclusion

Especially in the context of feature-based architectures, the presented solution is quite elegant. Just like in Next.js, when developing a feature you make all necessary requests right in your feature module, without worrying about the implications of duplicated requests.

In fact, the solution seems so obvious to me that I'm a bit disappointed such a technique is not mentioned anywhere in the Angular docs. Even though it would be great to get a native solution from the Angular team, I just published the discussed solution as a small npm package. Just head over to ngx-dedup and give it a try.

Follow Me

x.com (formerly twitter)
github.com
npmjs.com
medium.com
