
Chris Bongers

Originally published at daily-dev-tips.com

JavaScript Proxy the Fetch API

If you google JavaScript Proxy, you'll find many articles explaining the core concepts.

But there is one powerful thing almost nobody tells you about.
That one thing is:

You can use Proxy to overwrite existing APIs!

I know, it makes sense: a Proxy can extend any object, array, or function, so it's only logical. But let me explain with a real-world example in which I used the Proxy object.

Extending the Fetch API with a Proxy

You have probably heard of the Fetch API, the native API for making HTTP requests.

Let's say our app has a file that handles all API calls, and they all use the Fetch API.

As an example, we have the following class to handle API calls for our todos.

class TodoAPI {
  getTodos = async () =>
    await fetch('https://jsonplaceholder.typicode.com/todos');
  getTodo = async (id: number) =>
    await fetch(`https://jsonplaceholder.typicode.com/todos/${id}`);
}

We can use it with the following code.

const API = new TodoAPI();

(async () => {
  await API.getTodos()
    .then((data) => data.json())
    .then((res) => console.log(res));
  console.log('Fetching single TODO');
  await API.getTodo(3)
    .then((data) => data.json())
    .then((res) => console.log(res));
})();

Nothing crazy yet. We can call our API middleware, which uses the Fetch API under the hood.

This code works perfectly on our website, but when we introduce it to a Chrome extension, we quickly notice we can't use the fetch method directly.
CORS issues block it because we inject the script into different websites.

We should still accept all the Fetch request data but send it via a background worker.

So one idea is to create a new function that mimics the Fetch API, which could work.
But what happens when the Fetch API's arguments change?
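For illustration, such a wrapper could look something like this sketch, where we have to mirror fetch's signature by hand:

// A hand-rolled stand-in for fetch: any change to fetch's arguments
// has to be mirrored here manually.
const customFetch = (resource, options) => {
  console.log(resource, options);
  return fetch(resource, options);
};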

So a better way to tackle this is to leverage the Proxy object!

Yes, we can Proxy the Fetch API.

In a super simple example, it would look like this:

(async () => {
  const fetchHandler = {
    apply(target, thisArg, args) {
      console.log(args);
      // Forward the call to the real fetch so it still returns the response promise
      return Reflect.apply(target, thisArg, args);
    },
  };

  const proxiedFetch = new Proxy(fetch, fetchHandler);

  await proxiedFetch('https://jsonplaceholder.typicode.com/todos/3')
    .then((data) => data.json())
    .then((res) => console.log(res));
})();

Let's see what's going on here.
We create a proxy handler that implements the apply trap.
Before performing the request, we log the arguments and then forward the call to the original fetch.

We then wrap the fetch function in a Proxy with our handler.
And then we can use it just like the standard Fetch API!

The cool part about this is that all the Fetch arguments stay the same, so there is no need to change any existing implementation formats.

Now let's move this into our class so it can switch between the regular fetch and our proxied fetch!

We first have to introduce a constructor in our class that will define which method of fetching we should use.

constructor(fetchMethod = (...args) => fetch(...args)) {
    this.fetchMethod = fetchMethod;
}

This constructor lets us set the fetch method with all its arguments. By default, we fall back to the native fetch, wrapped in an arrow function so it's still called with the right context.

Then we can modify our existing calls to use the preferred fetch method.

getTodos = async () =>
  await this.fetchMethod('https://jsonplaceholder.typicode.com/todos');

As you can see, not much has changed. We moved fetch to this.fetchMethod, and all our arguments and callbacks stay the same.
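
Putting it together, the class now looks something like this (shown in plain JavaScript, with the same jsonplaceholder endpoints as before):

class TodoAPI {
  // By default, fall back to the native fetch, wrapped in an arrow function
  constructor(fetchMethod = (...args) => fetch(...args)) {
    this.fetchMethod = fetchMethod;
  }
  getTodos = async () =>
    await this.fetchMethod('https://jsonplaceholder.typicode.com/todos');
  getTodo = async (id) =>
    await this.fetchMethod(`https://jsonplaceholder.typicode.com/todos/${id}`);
}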

However, the example still uses the regular old fetch.

Let's set a new version to use a custom proxy fetch.

const proxyFetch = {
  apply(_, __, args) {
    console.log(args);
    return { message: 'proxy done' };
  },
};
const proxiedFetch = new Proxy(fetch, proxyFetch);

const API = new TodoAPI(proxiedFetch);

(async () => {
  await API.getTodos().then((res) => console.log(res));
  console.log('Fetching single TODO');
  await API.getTodo(3).then((res) => console.log(res));
})();

We create a new proxied fetch that, in our case, logs all requests to the console and then returns a message saying the proxy is done.

Then we pass this proxied fetch version to our class so that it will use this one.

Feel free to try it on this CodePen. You can switch between passing the proxied fetch or leaving it empty.

The background worker example

Earlier, I described a background worker example for an extension, where we mock the fetch request so that every request it receives is sent via browser runtime messages.

The code looks like this:

const proxyFetch = {
  apply(_, __, args) {
    browser.runtime.sendMessage({
      type: 'FETCH_REQUEST',
      url: args[0],
      args: args[1],
    });
    return null;
  },
};

export const proxiedFetch = new Proxy(fetch, proxyFetch);

As you can see, it's a similar concept to what we saw in the main article.
We proxy the existing fetch method but overwrite what it executes.
In this example, we send a message to the browser runtime.
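
The receiving end isn't shown here; as a rough sketch, assuming a background script that listens for these FETCH_REQUEST messages, it could look like this:

// Background script sketch: perform the real fetch for incoming requests.
// Returning a promise from the listener sends the resolved value back
// to the sender as the response.
browser.runtime.onMessage.addListener((message) => {
  if (message.type !== 'FETCH_REQUEST') return;
  return fetch(message.url, message.args).then((res) => res.json());
});

Since the content-script proxy above returns null, it never consumes that response; the comments below dig into that trade-off.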

Conclusion

With the Proxy object, we can proxy existing APIs such as the Fetch API.

This can be super powerful because we don't have to mock the entire function; we simply proxy it to do what we need.

Thank you for reading, and let's connect!

Thank you for reading my blog. Feel free to subscribe to my email newsletter and connect on Facebook or Twitter.

Top comments (6)

peerreynders

I think you're running into the issue here that proxies really aren't a "beginner" topic.

So one idea is to create a new function that mimics the Fetch API, which could work. But what happens when the Fetch API changes props?

The thing is Proxy is object centric. For functions it's just easier to write a wrapper function. On top of that I'm constantly amazed how many people aren't familiar with Function.prototype.apply() and Function.prototype.call() (especially "arrow functions by default" champions)—so handler.apply() is going to catch them off guard.
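
As a quick refresher, Function.prototype.apply() calls a function with an explicit this value and an array of arguments:

// apply: call greet with a given `this` and an argument array
function greet(greeting) {
  return `${greeting}, ${this.name}!`;
}
console.log(greet.apply({ name: 'Chris' }, ['Hello'])); // "Hello, Chris!"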

For demonstration purposes you may have been better off to proxy the API object as a whole. Ultimately that is the approach that Comlink takes as it wraps an entire worker behind a Proxy object.

The other issue is that your proxy doesn't honour the protocol of fetch(); for example it should return a promise. The example works because await automatically turns non-promises into resolved promises.

The final code sample only initiates the fetch but the code to listen for the message event with the data or error is missing (which would likely need to include a correlation ID when factoring out something as fundamental as fetch which will likely have many consumers).
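
For instance, one way to honour fetch's promise protocol would be to return the runtime's own request/response promise from the trap (a rough sketch, assuming the background listener replies to these FETCH_REQUEST messages):

const proxyFetch = {
  apply(_, __, args) {
    // browser.runtime.sendMessage returns a promise that resolves with
    // whatever the background listener responds with.
    return browser.runtime.sendMessage({
      type: 'FETCH_REQUEST',
      url: args[0],
      args: args[1],
    });
  },
};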


IMO a proxy over a MessagePort is the "wrong abstraction" (unless it's a "quick fix").

And to some degree Surma et al knew this:

By packaging functionality in an actor-like component it simply has to connect to the local message bus. It's the message bus's responsibility to route the messages either locally or to the appropriate web worker.

Not objects, not functions, just messages.

These components could then be easily moved around as they are appropriately decoupled. The API and UI actor could be both on the main thread or the API could be on a web worker.

Chris Bongers

Hi Peer,

Agreed it's not really a beginner topic.
Would love to have some more discussions around alternatives here.

I think via this article it's hard to explain the concept I was going for, as it should serve a monorepo code-base where one part should be able to mimic fetch requests but send to a background worker.

I'd love to hear some arguments why a non valid response is so desperately needed.

From my perspective we achieved a proxy that keeps all implementations the same, arguments that is.
But simply performs an action via a proxied call.

To me all other alternatives would make this whole process messy and unclear.
Again, would love to have a chat about this 🙏

Perhaps the wrapper function would be a good alternative here.

peerreynders

it's hard to explain the concept I was going for

I think the context in which your application of the proxy made sense was simply too large to fit into the scope of a single article. It's bound to happen when you try to mine your current work for material.

why a non valid response is so desperately needed.

This seems to be symptomatic of the "web development" industry in general. There seem to be a lot of practices in place for the sake of "saving time now" that would have been judged pre-web as prototype approaches that can't be tolerated anywhere near production. So really the meme-worthy 2 year rewrite cycle on the web shouldn't be too surprising—"save now, pay later". J.B. Rainsberger refers to this as "the scam" (The Well-Balanced Programmer).

So specifically in this case…

  • Given that we are doing a fetch one would assume that it is initiated in the service of some goal. Once the fetch is successful the processing can continue pursuing that goal.
  • If the fetch fails one would assume that the goal is at least temporarily unattainable.
  • One would assume that there is some level of importance associated with that goal (otherwise why initiate any action based on it?).
  • So a non-valid response is needed to initiate any mitigation necessary for not achieving the original goal; something like raising an error indicator or scheduling a retry.
  • If the goal is so unimportant that we don't care that the fetch failed, why waste resources pursuing the goal in the first place?

Now perhaps that mitigation happens entirely on the worker side so no return message is strictly necessary; if so, that wasn't clear.

From my perspective we achieved a proxy that keeps all implementations the same, arguments that is.

Fair enough.

But that isn't the canonical use case of a Proxy though.

Looking at Andrea Giammarchi's example:

const speedy = new Proxy(new Map, {
  get(map, key) {
    if (!map.has(key))
      map.set(key, compute(key));
    return map.get(key);
  }
});

Fundamentally that Proxy will behave just like the original Map it proxies with the exception that it's decorated to lazily add the generation of missing values.

Similarly reactive frameworks use Proxies to monitor the user supplied entities for mutations so that they can notify any subscribers but usually the overall shape and behaviour is not modified (just augmented).
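
A tiny sketch of that pattern, where a set trap notifies subscribers but leaves the object's shape and behaviour untouched:

const subscribers = new Set();

const state = new Proxy({ count: 0 }, {
  set(target, key, value) {
    target[key] = value;
    subscribers.forEach((notify) => notify(key, value)); // tell subscribers about the mutation
    return true; // a set trap must report success
  },
});

subscribers.add((key, value) => console.log(`${key} changed to ${value}`));
state.count = 1; // logs "count changed to 1"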

In the case of Comlink the Proxy is used to establish a Façade in front of the MessagePort. In the case of React + Redux + Comlink = Off-main-thread the Proxy becomes the reified interface to the logical Redux-based application running in the web worker.

While it acts as a "proxy" to the web worker based application, it is actually used as a generic, runtime malleable object to implement the application façade.

The default parameters for this code:

constructor(fetchMethod = (...args) => fetch(...args)) {
    this.fetchMethod = fetchMethod;
}

and the previous examples strongly suggest that the TodoAPI class wants access to the return value of the passed fetchMethod. But the proxiedFetch never returns the fetched data.

In effect the article starts out suggesting the intent to proxy the fetch function for TodoAPI but we end up with a "fetch façade" in front of the worker; that leads to cognitive dissonance from the reader's perspective because somehow the objectives switched midstream.

To me all other alternatives would make this whole process messy and unclear.

Sure.

But I think that "clarity" comes from the "one off" nature of the message (right now).

As more communication needs to be routed over that message channel it quickly becomes necessary to multiplex various "conversation contexts" over it.

Aside: How much longer are you going to pursue the "one article per day thing" (would stop me from getting into meatier stuff)? Though now that you are mining git you'll be set for material for a while.

Chris Bongers

Thanks a lot Peer,

That makes a lot of sense, love this kind of background story and validation.

I quite like the article a day approach, it gives me some room to try out different objectives in the morning, without getting caught up in writing long form again. (Tried this before and doesn't really work for me)

Not sure what you mean by mining git?
Would you like me stop writing these every day? 😅

peerreynders

You do you, as long as conformance to arbitrary goals doesn't keep you from what you really want to do.

crayoncode

Thanks, hoped to find a comment like yours.