
JM Santos

Originally published at Medium

Transduction in JavaScript

Photo by Ross Sneddon on Unsplash

This is a continuation of my previous article, Fusion in JavaScript. Fusion is a technique of combining pure functions, taking advantage of composition and removing the intermediary copies of data at each layer. If you haven't read it yet, please do so! You can find it here.

📖 Introduction to Transduction

The Fusion technique can only be applied when all of the functions have the same shape: the same shape of arguments and the same shape of return value. Here's our example from that article.
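
The original code gists aren't embedded here, so here's a minimal sketch of that example. add and multiplyBy2 are the helpers named in the article; the pipe utility and the sample list are my own assumptions:

const add = x => x + 1;         // takes a number, returns a number (a sum)
const multiplyBy2 = x => x * 2; // takes a number, returns a number (a product)

// pipe composes unary functions from left to right
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);

const list = [0, 1, 2, 4, 8]; // assumed sample data

// without fusion: two passes over the data, one intermediate array
list.map(multiplyBy2).map(add); // [1, 3, 5, 9, 17]

// with fusion: the mappers are composed into a single mapper, one pass
list.map(pipe(multiplyBy2, add)); // [1, 3, 5, 9, 17]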

You can see that our mapper functions have the same shape: both accept a number and both return a number. A sum and a product.

That's the Fusion technique. For us to "fuse" or compose functions, we have to follow one rule: the functions must have the same shape. In our example, add and multiplyBy2 both have the same shape, and that's why we were able to take advantage of composition.

But what if there's an additional requirement? Let's say we need to filter our result by keeping only the numbers below 10 and then get the total of all the remaining numbers.

Okay, I hear you. We will add an Array.prototype.filter() to drop the other items, since we only need the ones below 10, and an Array.prototype.reduce() to get the total of all the numbers. That's actually correct!
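
Continuing the sketch above, with an assumed isBelow10 predicate and a sum reducer, the chained version would look something like this:

const transform = pipe(multiplyBy2, add); // the fused mapper from before
const isBelow10 = currentValue => currentValue < 10;
const sum = (total, currentValue) => total + currentValue;

list
  .map(transform)    // [1, 3, 5, 9, 17] (new array)
  .filter(isBelow10) // [1, 3, 5, 9]     (another new array)
  .reduce(sum, 0);   // 18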

But this approach also suffers from the fact that each layer of the chain creates a new copy of the data and iterates over every item again to apply its function.

Maybe you are now starting to wonder, is it possible to combine Array.prototype.map(), Array.prototype.filter() and Array.prototype.reduce() into a single call to avoid creating intermediary copies of data on each layer?

The answer is YES, and that's where Transduction comes in! That is our goal: to collapse those Array.prototype.map(), Array.prototype.filter() and Array.prototype.reduce() calls into a single call.

🧬 Reduce Them All

Before we try to implement the Transduction technique, it's important to realize how powerful the specific method I'm about to show you is.

Array.prototype.reduce() is a powerful function because it allows you to implement almost anything you'd like. You can implement the logic of Array.prototype.filter() inside it, the logic of Array.prototype.map(), and so on!

Let’s see how we can implement our map and filter inside the reduce as we move forward.

I have removed the implementation of pipe for now to avoid extra confusion with the new functions we're creating. There's also an important shift in how data flows when using the pipe or compose utilities, which I'll discuss as we go on.
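
Here's a sketch of the two helpers and how they're used; the exact bodies in the original gist aren't shown, so treat this as a reconstruction:

const mapReduce = mapperFn => (accumulator, currentValue) => {
  accumulator.push(mapperFn(currentValue));
  return accumulator;
};

const filterReduce = predicateFn => (accumulator, currentValue) => {
  if (predicateFn(currentValue)) {
    accumulator.push(currentValue);
  }
  return accumulator;
};

list
  .reduce(mapReduce(transform), [])    // [1, 3, 5, 9, 17]
  .reduce(filterReduce(isBelow10), []) // [1, 3, 5, 9]
  .reduce(sum, 0);                     // 18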

We've created mapReduce and filterReduce as curried functions because, in functional programming, functions that take more than one argument at a time are inconvenient to compose. These helpers let us use our own functions inside Array.prototype.reduce() by making them "compatible" with the Array.prototype.reduce() signature. If you observe the two functions, you can see that the second call expects two inputs, (accumulator, currentValue), which is exactly the reducer signature Array.prototype.reduce() expects. We've curried the two functions because that allows us to partially apply them, in other words, lazy evaluation.

This is how it looks without those two utility functions, in raw form.
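
Something like this, with the reducers written inline (again a reconstruction):

list
  .reduce((accumulator, currentValue) => {
    accumulator.push(transform(currentValue));
    return accumulator;
  }, [])
  .reduce((accumulator, currentValue) => {
    if (isBelow10(currentValue)) {
      accumulator.push(currentValue);
    }
    return accumulator;
  }, [])
  .reduce(sum, 0); // 18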

If we can do it in this raw form, why did we implement some curried functions?

Look at those reducers (the functions passed to Array.prototype.reduce()) and you will see something they have in common.

Have you spotted it?

Yes, those accumulator.push calls followed by returning the accumulator are the work of what's called a combiner function. A combiner function is simply a function that combines results, and it is not limited to combining items into a list. In fact, it can combine anything! In our example it does accumulator.push, which is essentially a "concat" combiner. Let's create a combiner function and name it combinerConcat.
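
A sketch of that combiner (I'm using concat here to match the name; the original may well have kept push):

const combinerConcat = (accumulator, currentValue) =>
  accumulator.concat([currentValue]);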

Okay, that looks good… We've extracted our combiner function, and that gives us a somewhat generic combiner for our Array.prototype.reduce() calls.

But there's a problem with this raw version, and it's why switching to the curried functions matters: in this form we can't take advantage of composition, and it won't let us collapse our calls into a single call.

Let’s tidy it up as this will also prepare us for the succeeding steps.
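
In this tidied sketch the helpers now take the combiner as a second curried argument:

const mapReduce = mapperFn => combinerFn => (accumulator, currentValue) =>
  combinerFn(accumulator, mapperFn(currentValue));

const filterReduce = predicateFn => combinerFn => (accumulator, currentValue) =>
  predicateFn(currentValue)
    ? combinerFn(accumulator, currentValue)
    : accumulator;

list
  .reduce(mapReduce(transform)(combinerConcat), [])    // [1, 3, 5, 9, 17]
  .reduce(filterReduce(isBelow10)(combinerConcat), []) // [1, 3, 5, 9]
  .reduce(sum, 0);                                     // 18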

We still haven't reduced our calls into a single call. But hang on, we're almost there! It will make sense later why we needed to curry these functions!

I want you to take note of the result. It is 18, and that is what we should expect as the end result from here on.

📈 Going Above With Transduction

Transduction is the process of making Array.prototype.map(), Array.prototype.filter() and Array.prototype.reduce() compatible with each other even though they have different function shapes.

Kyle Simpson, in his Frontend Masters course, said that Transduction is a way to use a mathematical process to reshape map and filter into reducers, so that map, filter, and reduce can all be used in conjunction.

Transduction uses transducers to make multiple reducers composable with each other.

A transducer is a higher-order reducer, or a composed reducer: a function that accepts a reducer and returns a reducer.

Normal reducers, by contrast, are not composable because their signature is different: they accept two inputs (accumulator, currentValue) and return a single value. A transducer accepts a reducer and returns a reducer, and that makes it valid for composition.

In our last example, we were able to convert Array.prototype.map() and Array.prototype.filter() into the shape of Array.prototype.reduce(). That's great progress, because we've reshaped them into a common signature. And if functions have the same signature, it means we can take advantage of…? Guess what! Yes, composition!

We haven't reduced it into a single call yet, and that's what we are going to do now! Let's try it.
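
With the reconstructed helpers above, that attempt looks roughly like this:

const transducer = pipe(
  mapReduce(transform),
  filterReduce(isBelow10),
);

list
  .reduce(transducer(combinerConcat), []) // [1, 3, 5, 9, 17] -- the filter ran on the raw values!
  .reduce(sum, 0);                        // 35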

We've now un-commented our pipe and used it to create a transducer. We now know that a transducer is a higher-order reducer, or a composed reducer.

We have two new things here. The first is the transducer, which we will tackle shortly, and the second is the result. It is now 35 and not 18. Remember when I told you to take note of that? We will address it after we've covered the transducer. Hang on tight!

You might wonder about our transducer: why didn't we apply combinerConcat to it?

The reason is that doing so would break the signature of our reducers. Let me show you why.
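
Sketching what happens if we apply the combiner to each piece before composing:

// applying the combiner collapses each piece into a plain reducer
const mapReducer = mapReduce(transform)(combinerConcat);
// mapReducer is now (accumulator, currentValue) => accumulator

const filterReducer = filterReduce(isBelow10)(combinerConcat);
// filterReducer is also (accumulator, currentValue) => accumulator

// pipe and compose expect functions that take one input and return one output,
// so these two-argument reducers no longer compose with each other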

We can see that applying a combiner to the transducer collapses its signature into that of a normal reducer: it now accepts two inputs (accumulator, currentValue). And we already know that normal reducers aren't composable, because their signature is different from a transducer's.

Here’s our statement from the start of this topic:

Normal reducers, by contrast, are not composable because their signature is different: they accept two inputs (accumulator, currentValue) and return a single value. A transducer accepts a reducer and returns a reducer, and that makes it valid for composition.

In order for our transducer to be valid for composition, the function shapes should be the same for all the functions.

That is why our transducer doesn't take a combinerFn. I know that is hard to digest. Take your time. I still have a hard time wrapping my brain around it.

Let's now get back to our result.

Why is it 35 and not 18? Our pipe's flow looks the same as in our initial implementation.

Do you remember why I commented out our pipe function a while ago? The reason is that pipe and compose behave differently when applied in transduction.

When we say they behave differently, what do we mean by that? We understand that pipe executes from left to right and compose executes from right to left.
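
As a quick reminder, sketched with the usual one-line definitions:

const compose = (...fns) => x => fns.reduceRight((acc, fn) => fn(acc), x);
// pipe, from earlier: (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);

compose(add, multiplyBy2)(5); // add(multiplyBy2(5)) = 11 -- right to left
pipe(add, multiplyBy2)(5);    // multiplyBy2(add(5)) = 12 -- left to right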

We can see that compose executes the functions starting from the end (right) to the start (left). It is indeed running from right to left.

We can also see that pipe executes the functions starting from the start (left) to the end (right). It is indeed running from left to right.

Those rules are reversed when applied in transduction. I didn't know this at first. I spent almost two hours at midnight figuring out why this was happening. I did a lot of research, but something wasn't clicking; I couldn't seem to understand what I was reading in different articles.

My last option was to contact Kyle Simpson on Twitter and ask him to shed some light on it.

Shooting for the moon! When I woke up, he had indeed given an answer, and things started clicking and making sense! So grateful!

This is what he said about my problem.

https://gist.github.com/jmaicaaan/7f217cf3f7d638596d7c503a4083f491

It was confusing at first, but after re-reading it multiple times it started to click. Adding to that answer: the reason we are getting a different result is that we assume the "data" flowing through the pipe is the actual value, the numbers from our list. But that is incorrect.

A mental shift is needed.

The “data” that is flowing through the pipe is the “reducer” function and not the actual number from our array. It is actually our combinerFn.

With that in mind, let's replace our pipe with compose, as that makes the flow feel "natural" again.

After that change, let's update our transducer as well and see the result.
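
With compose in place, the sketch becomes:

const transducer = compose(
  mapReduce(transform),
  filterReduce(isBelow10),
);

list
  .reduce(transducer(combinerConcat), []) // [1, 3, 5, 9] -- mapped first, then filtered
  .reduce(sum, 0);                        // 18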

Hooray! Our result is now correct! Pat yourself on the back for sticking with it.

We're almost at the final step of this journey! We haven't reduced everything into a single call yet. We've managed to combine Array.prototype.map() and Array.prototype.filter() into a single reduce call, but there's still one more step to go.

Take a close look at the combinerConcat and sum functions.
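
Side by side (as sketched earlier):

const combinerConcat = (accumulator, currentValue) =>
  accumulator.concat([currentValue]);

const sum = (total, currentValue) => total + currentValue;

// both take (accumulator, currentValue) and return the new accumulated value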

What do you notice? They both have the same signature. They accept the same input signature and return the same value signature.

The sum function is also a combiner function! Knowing that, we can now remove our combinerConcat and put a sum combiner in its place!
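
Which gives us the final single-call sketch:

const combinerSum = (total, currentValue) => total + currentValue;

const transducer = compose(
  mapReduce(transform),
  filterReduce(isBelow10),
);

list.reduce(transducer(combinerSum), 0); // 18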

We've also changed the initial value from [] to 0, because our combiner function, combinerSum, deals with summing numbers rather than building a list.

We've now applied the Transduction technique, which should greatly help with performance and also make our code more readable and easier to reason about.

🤖 Bringing It All Together

We've converted those Array.prototype.map(), Array.prototype.filter(), and Array.prototype.reduce() calls into a single call by making them compatible with each other, giving their functions the same signature so that we can take advantage of composition. That is Transduction: the process of converting those functions into a compatible shape through transducers.

There are libraries such as Ramda.js and transducers-js that will help you with this, so you don't have to implement it on your own. The goal of this article is to build knowledge and understanding of how these things work, what problems they solve, and how we can apply them to our code.

If you are interested in checking it out more, here are some references:

Thank you for reading. I hope this will help you on your journey! ❤️


Top comments (3)

Iven Marquardt

Here is the identity transducer, which does nothing but pretend to be a transducer:

// identity transducer (2x for illustrative purpose)

const foo = f => (x, y) => f(x, y);
//               ^^^^^^^^^^^^^^^^^ partially applied foo

const bar = f => (x, y) => f(x, y);
//               ^^^^^^^^^^^^^^^^^ partially applied bar

// reducer (or rather monoid operation)
const append = (x, y) => x.concat([y]);

// function composition
const comp = (f, g) => x => f(g(x));

[1,2,3,4,5].reduce(comp(foo, bar) (append), []); // [1,2,3,4,5]

The trick is that when we apply comp(foo, bar) (append), foo's f becomes the partially applied bar and bar's f becomes append. reduce then applies the partially applied foo with (x, y), which in turn applies the partially applied bar, which in turn applies append. This is the reason why transducers compose from left-to-right.

JM Santos

I'm honestly still having a hard time wrapping my brain when it comes to this. I mean, it's easy to understand normal compose and pipe flow of data. But, it's challenging for me to 100% track the flow of reducers in the transducers.
Thanks for the example. Took me a lot of rereads :)

Iven Marquardt

Ya, it's hard stuff. If we dig even deeper, the reason for all this confusing behavior is the composition itself. comp doesn't return a value, but another composed function. I think you already mentioned this in your post. This is sometimes called abstraction over arity, because this new composed function expects more arguments than comp = (f, g) => x => f(g(x)) originally required. Whenever this happens, the order of evaluation switches to left-to-right, not only with transducers.