
Gabriel Lebec


Four Ways to Immutability in JavaScript

Abstract

This article presents four different techniques to immutably update data structures in JavaScript:

  • native spread syntax and array methods
  • lenses, via the Ramda library
  • Immer's produce function
  • persistent collections, via Immutable.js

Many of the code snippets are runnable and editable, and you are encouraged to experiment with them.

Introduction

Functional programming techniques and design patterns are increasingly popular in JavaScript applications. Tools and frameworks such as RxJS, Redux, and React take inspiration from functional reactive programming, for example.

Among the many varied features of FP, the interrelated facets of purity, immutability, and persistent data structures yield benefits such as referential transparency and idempotence. To put it simply, avoiding mutation and side effects makes code easier to reason about and compose.

JavaScript is a multi-paradigm programming language. It allows for functional approaches, but without the convenience or guarantees found in some dedicated functional languages. Accordingly, a JS developer must take special care if they intend to update data without mutation. For example, a Redux reducer "must never mutate its arguments, perform side effects…, [or] call non-pure functions", but Redux itself provides no tools for adhering to those strictures.

This article will not focus on the benefits of immutability, but rather explore how it can be achieved in JS if desired.

What is an Immutable Update

Consider a function which increments the .counter property of an object:

const obj1 = { bools: [true, false], counter: 7 }
const obj2 = { bools: [false, true], counter: 3 }

function incrementObjCounter (obj) { // naive solution
  const newObj = {
    bools: obj.bools, // same reference
    counter: obj.counter + 1 // updated data
  }
  return newObj
}

const newObj1 = incrementObjCounter(obj1)
const newObj2 = incrementObjCounter(obj2)

console.log(obj1, newObj1) // different objects
console.log(obj2, newObj2) // different objects
console.log(obj1.bools === newObj1.bools) // shared reference
console.log(obj2.bools === newObj2.bools) // shared reference

Rather than modify the original object, we generate a new object with changed data. Note that an even more naive solution might recursively copy all the data in the original object. This is usually referred to as a deep clone, and doubles the amount of memory in use.

If we never intend to mutate the data in our base object, however, it can be safely shared with the new object, as demonstrated with the .bools property above. This structural sharing means that an immutable update can often re-use existing memory.

Unfortunately, the code above is brittle; it uses hard-coded property names even for data not being updated. The function is neither reusable for other object shapes nor easily maintainable.

Difficulties with Immutable Updates

Suppose we have some nested data as follows:

const dino = {
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        {
          name: 'Rocky',
          type: 'dog'
        }
      ]
    },
    {
      name: 'Casey',
      type: 'human'
    }
  ]
}

Perhaps we might want to give Wally a new pet rabbit of the form { name: 'Ears', type: 'rabbit' }. Give it a try; the dino value is in scope in the following editable code block. Unlike in most real-world applications, we have deeply frozen dino so you will know if you accidentally attempt to mutate it – an error will be thrown.

const registerTests = require('@runkit/glebec/tape-dispenser/releases/1.0.0')

const test = (actual) => registerTests(t => {
  t.deepEqual(
    actual,
    {
      name: 'Denver',
      type: 'dinosaur',
      friends: [
        {
          name: 'Wally',
          type: 'human',
          pets: [
            { name: 'Rocky', type: 'dog' },
            { name: 'Ears', type: 'rabbit' }
          ]
        },
        { name: 'Casey', type: 'human' }
      ]
    }
  )
  t.end()
})

function deepFreeze(object) {
  var propNames = Object.getOwnPropertyNames(object);
  for (let name of propNames) {
    let value = object[name];
    object[name] = value;
    if ((typeof value) === 'object') {
      object[name] = deepFreeze(value)
    }
  }
  return Object.freeze(object);
}

const dino = deepFreeze({
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
})

await (function runTestInStrictMode () {
  'use strict'; // using functional strict mode because of runkit limitations
  // dino is in scope
  const updatedDino = undefined; // give Wally a new pet rabbit!
  return test(updatedDino);
})()

Did you manage to correctly generate a new dinosaur, without throwing any mutation-related errors? In an actual codebase, what is your confidence that you would have avoided those errors? Keep in mind that frequently, objects like this are not deeply frozen – they will silently accept mutation.

For this challenge, we had you add an object. What about updating a deeply-nested object, or deleting one? Feel free to experiment below, then read on for some review.

function deepFreeze(object) {
  var propNames = Object.getOwnPropertyNames(object);
  for (let name of propNames) {
    let value = object[name];
    object[name] = value;
    if ((typeof value) === 'object') {
      object[name] = deepFreeze(value)
    }
  }
  return Object.freeze(object);
}

const dino = deepFreeze({
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
}); // semicolon needed so the IIFE below is not parsed as a call on the frozen object

(function runTestInStrictMode () {
  'use strict'; // using functional strict mode because of runkit limitations
  // dino is in scope
  const dinoRenamedRocky = undefined // change dino.friends[0].pets[0].name
  const dinoRemovedRocky = undefined // remove dino.friends[0].pets[0]
  console.log(dino, dinoRenamedRocky, dinoRemovedRocky) // three different values
})()

Review of Vanilla JS Behavior

Primitives are Immutable (but also Re-assignable)

JavaScript primitives (undefined, null, boolean, number, bigint, string, and symbol values) are immutable by default, so you are already prevented from permanently altering primitive values themselves.

const str = 'hello'
str[0] = 'j' // silently ignores mutation attempt
console.log(str) // "hello", not "jello"

Of course, JavaScript does allow re-assignment of shared variables, which is not the same thing as mutation but which presents many of the same issues.

let x = 5
badInc = () => { x = x + 1; return x }
badDbl = () => { x = x * 2; return x }
console.log(badInc(x) + badDbl(x)) // neither commutative nor idempotent

Try commuting the terms in the above expression to badDbl(x) + badInc(x) and re-running the code. Because of the statefulness inherent in re-binding the shared x variable, we get different results, betraying our expectation of how addition works.

Objects are Mutable (but also Freezable)

Anything not of the primitive types listed above is considered an object in JS – including arrays and functions. A common beginner misunderstanding with JavaScript is that const declarations will prevent mutation. Not so; const only prevents re-assignment.

const obj = {}
const arr = []
const fnc = () => {}

obj.mutated = true
arr.mutated = true
fnc.mutated = true

console.log(obj.mutated, arr.mutated, fnc.mutated) // true x 3

JavaScript does have the static method Object.freeze(obj) to prevent re-assigning properties. In strict mode, attempting to set a property on a frozen object throws a helpful error. Sadly, in non-strict mode the attempt is merely rejected silently. (Note that freeze itself mutates the object.)

const obj = { example: 9000 }
obj.example++ // mutates obj
Object.freeze(obj) // mutates obj by freezing it
obj.example++ // silently ignored in non-strict mode (error in strict mode)
console.log(obj.example) // 9001, not 9002

Beware, however – freeze only prevents re-assigning top-level properties, and does not prevent mutating objects referenced in those properties. For that, you need a non-native "deep freeze" algorithm which recursively freezes objects.

const obj = { o: { x: 50 } }
Object.freeze(obj)
obj.o.x++ // still allowed
console.log(obj) // inner reference mutated, oops
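
For reference, a minimal recursive deep freeze might look like the following sketch (in the spirit of the helper used in the exercise snippets above; it does not handle cycles or other edge cases):

function deepFreezeSketch (object) {
  // freeze child objects first, then the object itself (no cycle handling)
  for (const name of Object.getOwnPropertyNames(object)) {
    const value = object[name]
    if (value && typeof value === 'object') deepFreezeSketch(value)
  }
  return Object.freeze(object)
}

const frozen = deepFreezeSketch({ o: { x: 50 } })
// frozen.o.x++ // now rejected (throws a TypeError in strict mode)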

Freezing objects could help reveal and debug errors by failing early (upon mutation attempt) rather than late (when an unanticipated state yields incorrect behavior), at least in strict mode. In practice however, freezing is not very common, perhaps because:

  • an author needs to predict that a given object is worth the extra effort of freezing, belying the YAGNI principle
  • deep freeze algorithms, like deep clone algorithms, have edge cases and pitfalls in JS (e.g. cycles, properly detecting objects)
  • recursively freezing deeply-nested objects might be a performance concern (valid or not)

Approach 1: Native Immutable Updates

So, what tools does JS provide when hoping to avoid mutation? As primitives are already immutable, and functions are infrequently used as mutable data containers, in practice the question is how to generate new versions of Objects and Arrays. For those, we have several built-in methods.

Spread Syntax

As of ES2015, spread syntax (...) has allowed for expanding iterable values into explicit arguments and array literals; ES2018 extended it to object literals. Used inside new array or object literals, this permits copying the properties of an existing value into the new value:

const obj1 = { type: 'data' }
const arr1 = [1, 2, 3]

const obj2 = { ...obj1, subtype: 'stuff' } // adds a key-val pair
const arr2 = [ ...arr1, 'cheese' ] // adds an element

// distinct old and new versions
console.log(obj1, obj2)
console.log(arr1, arr2)

With objects in particular, this allows for easy updating of a top-level key:

const obj1 = { type: 'data', age: 55 }
const obj2 = { ...obj1, age: obj1.age + 1 } // modified age
console.log(obj1, obj2)

In some environments, object spread syntax may not be supported, but Object.assign (an ES2015 static method) may be available.
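
For example, the same age update written with Object.assign (a sketch; the empty target object keeps obj1 unmutated):

const obj1 = { type: 'data', age: 55 }
const obj2 = Object.assign({}, obj1, { age: obj1.age + 1 }) // modified age
console.log(obj1, obj2) // original unchanged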

Problems with Native Methods

Spread makes it convenient to add new values to an array, add new key-value pairs to an object, or modify a key-value pair on an object. However, deletion of a key-value pair is not as straightforward. There are some tricks using destructuring but they are not convenient when working with nested objects.
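
One such trick, as a sketch – rest destructuring produces a copy that omits a key, with computed property syntax for dynamically-named keys:

const obj = { keep: 1, drop: 2 }

// "delete" a statically-known key
const { drop, ...rest } = obj
console.log(rest) // { keep: 1 }

// "delete" a dynamically-named key
const key = 'drop'
const { [key]: _removed, ...rest2 } = obj
console.log(rest2) // { keep: 1 }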

When it comes to arrays, removing a specific element can be done via either filtering or slicing:

const arr = ['red', 'orange', 'yellow', 'green', 'blue', 'indigo', 'violet']
const newArrA = [...arr.slice(0, 2), ...arr.slice(3)]
const newArrB = arr.filter(color => color !== 'yellow')
console.log(arr, newArrA, newArrB)

Updating an array element, meanwhile, can be accomplished with slice or map:

const arr = ['red', 'orange', 'yellow', 'green', 'blue', 'indigo', 'violet']
const newArrA = [...arr.slice(0, 2), 'sunshine', ...arr.slice(3)]
const newArrB = arr.map(color => color === 'yellow' ? 'sunshine' : color)
console.log(arr, newArrA, newArrB)

These methods work but are somewhat cumbersome and error-prone. And we are only dealing with shallow data; things become more difficult with nesting. Let's return to the original example from this article, adding a rabbit to Denver's friend Wally's list of pets. Here is one possible solution using vanilla JS:

const dino = {
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
}

const updatedDino = {
  ...dino,
  friends: [
    {
      ...dino.friends[0],
      pets: [...dino.friends[0].pets, { name: 'Ears', type: 'rabbit' }]
    },
    ...dino.friends.slice(1)
  ]
}

console.log(dino, updatedDino)

This works, but it has some notable downsides:

  • it is complex and verbose enough to be difficult to read, write, and maintain
    • frequent alternation between array and object spread makes it easy to lose track of what's going on
    • the deeper into the object one goes, the higher the likelihood that reference chains grow (e.g. pets: [...dino.friends[0].pets ])
    • modifying one specific value in an array requires adding map or slice into the mix, increasing noise
    • any mistakes in implementation will be silent mutation errors in practice, unless dino is deeply frozen and you are in strict mode
  • the performance is not ideal
    • spread syntax invokes the iteration protocol
    • creating a copy of changed objects / arrays (e.g. the pets and friends arrays) is O(n) time complexity

Approach 2: Ramda lens

Context

Lenses are an approach to interacting with nested data that originate in Haskell, a lazy (and therefore of necessity pure) functional language with Hindley-Milner inferred static typing. In that language, they are remarkably flexible and general tools that work across myriad data types seamlessly.

Lenses and related tools are a deep topic, and in JavaScript we do not have the type restrictions that make lenses especially valuable in Haskell. However, the core idea is simple and useful enough that it makes an appearance in Ramda, a popular functional JS utility library.

High-Level

Conceptually, a lens is a pair of functions:

  • a getter for a "focus" element a that can be derived from some "source" data s
  • a setter which can update a, immutably generating a new s

In terms of implementation, a lens is actually a single function which performs either of the above duties depending on how it is used. A given lens function is not invoked directly on the source; instead, helper functions like view, set, and over are responsible for correctly wielding the lens.

const { view } = require('ramda')
const extractedData = view(someLens, sourceData) // NOT someLens(wrappingData)
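
For illustration, here is a hand-rolled lens built with Ramda's R.lens from an explicit getter and an immutable setter (a sketch; the lensProp helper shown below constructs an equivalent lens for you):

const R = require('ramda')

const nameLens = R.lens(
  obj => obj.name,                              // getter: focus on .name
  (newName, obj) => ({ ...obj, name: newName }) // setter: copy with .name replaced
)

const person = { name: 'Ada', role: 'engineer' }
console.log(R.view(nameLens, person))         // 'Ada'
console.log(R.set(nameLens, 'Grace', person)) // { name: 'Grace', role: 'engineer' }
console.log(person)                           // unchanged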

Because lenses are individual functions, they can be directly composed together to form new lenses (an example of the combinator pattern).

const { compose } = require('ramda')
const deepLens = compose(outerLens, innerLens)

Demonstration

Most commonly with Ramda, one will not write a lens from scratch but use methods like lensIndex and lensProp to generate correct lenses.

const dino = {
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
}

const R = require('ramda')
// dino is in scope

// making a lens for the "friends" property
const friendsLens = R.lensProp('friends')

// viewing data using a lens
const friends = R.view(friendsLens, dino)
console.log(friends)

// immutably setting data using a lens
const lonelyDino = R.set(friendsLens, [], dino) // aww, no more friends
console.log(lonelyDino)

// immutably mapping data using a lens
const addEveryone = arr => [...arr, 'everyone']
const popularDino = R.over(friendsLens, addEveryone, dino)
console.log(popularDino)

console.log(dino) // unchanged

The set and over methods generate new data with the desired updates, but like Object/Array spread, the new values also share any non-updated references with the old values.
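
As a quick check of that sharing (a sketch, assuming the dino and R bindings from the snippet above):

const renamed = R.set(R.lensProp('name'), 'Denny', dino)
console.log(renamed === dino)                 // false – a new outer object
console.log(renamed.friends === dino.friends) // true – the untouched branch is shared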

All of this is well and good, but what about nested data? For that, standard right-to-left function composition can stitch together multiple lenses.

const dino = {
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
}

const R = require('ramda')
// dino is in scope

// making some lenses
const friendsLens = R.lensProp('friends')
const lens0 = R.lensIndex(0)
const petsLens = R.lensProp('pets')
const nameLens = R.lensProp('name')

// stacking lenses together into a single lens
const frns0pets0name = R.compose(
  friendsLens, // outermost lens first
  lens0,
  petsLens,
  lens0,
  nameLens // innermost lens last
)

// viewing data using a lens
console.log(R.view(frns0pets0name, dino))

// immutably setting data using a lens
console.log(R.set(frns0pets0name, 'Spot', dino))

// immutably mapping data using a lens
console.log(R.over(frns0pets0name, (s => s + '!'), dino))

console.log(dino) // unchanged

Compared to our Object/Array spread + map technique, the above is far more concise, declarative, easy, and foolproof. But it gets even easier; composing together lenses to focus on a nested property chain is common, so Ramda provides a convenient lensPath function.

const dino = {
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
}

const R = require('ramda')
// dino is in scope

const frns0pets0name = R.lensPath(['friends', 0, 'pets', 0, 'name'])

console.log(R.set(frns0pets0name, 'Spot', dino))
console.log(dino) // unchanged

You can use this lens with view and over as well, of course – try for yourself. Ultimately, our final solution has been condensed down to two easy-to-read lines – one constructing the lens from a path, another using the lens to immutably set a deeply-nested property in some data.
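
As a sketch of that suggested view / over experiment (using a trimmed-down dino for brevity):

const R = require('ramda')
const dino = { friends: [ { pets: [ { name: 'Rocky' } ] } ] } // trimmed for brevity
const frns0pets0name = R.lensPath(['friends', 0, 'pets', 0, 'name'])

console.log(R.view(frns0pets0name, dino))               // 'Rocky'
console.log(R.over(frns0pets0name, n => n + '!', dino)) // name mapped to 'Rocky!'
console.log(dino)                                       // unchanged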

All You Need is Lens?

Lenses, being vanilla composable functions with multiple capabilities (read/update/set), are a highly functional solution with all the attendant tricks that implies: partial application, higher-order usage, first-class portability, dynamic behavior etc. However, while Ramda's lens functions make it easy to view, set, and map nested properties, deleting is a little more indirect; you must map over a parent prop using vanilla JS removal techniques:

const R = require('ramda')

const obj = { items: ['a', 'b'] }
const removeFirst = arr => arr.slice(1)
const updated = R.over(R.lensProp('items'), removeFirst, obj)

The basic instantiation and usage of lenses via a library like Ramda may be simple enough, but users should be aware that lenses and other "optics"* have some advanced uses and important gotchas. For example, lens-like functions which violate the lens laws can result in odd edge cases.

*(A subset of lenses are isomorphisms, which enable you to operate on data structures as if they were other (equivalent) data structures. There are also prisms, which enable interacting with data that may or may not be of a certain shape, and traversals, which can focus on multiple subparts at a time.)
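
As for the lens laws mentioned above, they can be spot-checked directly. Here is a sketch using Ramda and one concrete, lawful lens (l is the lens, s a source object, v and w are focus values):

const R = require('ramda')

const l = R.lensProp('name')
const s = { name: 'Ada', role: 'engineer' }
const v = 'Grace'
const w = 'Hopper'

console.log(R.view(l, R.set(l, v, s)) === v)                       // set-get law
console.log(R.equals(R.set(l, R.view(l, s), s), s))                // get-set law
console.log(R.equals(R.set(l, w, R.set(l, v, s)), R.set(l, w, s))) // set-set law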

Approach 3: Immer

Some developers may prefer a more idiomatic approach to immutable updates in JS which doesn't involve importing a larger suite of tools or learning a new API. Michel Weststrate's library Immer is purpose-built for just that.

Immer's clever trick is to change what ordinarily-mutating actions mean – intercepting assignment (=), unsafe methods (push), and other cases – in a form of metaprogramming. Specifically, it uses the Proxy constructor to create a "draft" version of your original object. You can mangle this draft at will, even mutating deeply-nested references directly. Instead of allowing those changes to actually occur, Immer transforms them into an aggregate immutable update on the original object.

Demonstration

Immer's core function, produce, has the following signature:

produce(currentState, producer: (draftState) => void): nextState
const { produce } = require('immer')

const original = {
  color: 'blue',
  items: [5, 6],
  thing: { x: true },
  shared: {}
}

const updated = produce(original, draft => {
  draft.color = 'red'
  draft.items.push(917)
  delete draft.thing.x
  draft.thing.y = false
})

console.log(original, updated) // no mutation
console.log(original.shared === updated.shared) // no naive copying

As the final console.log above shows, Immer is using structural sharing just like our manual spread and Ramda lens solutions. The advantages include:

  • new properties are automatically frozen in development mode
  • produce has a curried version for partial application (see the sketch after this list)
  • immer is a small single-purpose library
  • no need to learn a new API (apart from the one function wrapper)
  • no need to take special care in most cases, less cognitive overhead
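
A sketch of that curried form (the addItem updater and its example state are hypothetical):

const { produce } = require('immer')

// passing only the recipe returns a reusable state updater
const addItem = produce((draft, item) => {
  draft.items.push(item)
})

const state1 = { items: ['a'] }
const state2 = addItem(state1, 'b')

console.log(state1) // { items: ['a'] } – untouched
console.log(state2) // { items: ['a', 'b'] }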

All You Need is Immer?

While Immer is very slick, it is ultimately doing the same Object.assign / spread / map type operations as our previous solutions. So while Immer benefits from structural sharing, some updates on large data may be slow (O(n) to copy an array, for example). Also, by default Immer only works on (certain properties of) plain arrays and objects, not other built-ins such as Map and Set. For other pitfalls, consult the docs.

Approach 4: Immutable.js

The three approaches we have examined so far all focus on essentially the same data structures – nested Objects and Arrays. Native methods, lenses, and immer all "play well with others" in the sense that other libraries and codebases will generally provide and expect such data types.

One disadvantage of those structures, however, is that they were not purposefully designed with immutable updates in mind. To add an item to an array, you have to create a copy of the array with all other elements – an O(n) operation. If you fail to take special care, you can call a native method like sort without realizing it mutates the original data structure.
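
The sort pitfall, for instance:

const arr = [3, 1, 2]
const sorted = arr.sort() // looks pure, but sorts the array in place
console.log(arr)            // [1, 2, 3] – the original was changed too
console.log(sorted === arr) // true – same reference, no new array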

An alternative approach is to switch to a purely functional data structure (PDF link), or at least a data structure with efficient immutable update operations. Such data structures would expose an API of only pure operations, reducing the chance of error.

Facebook's library Immutable takes advantage of such structures. While they are implemented in vanilla JavaScript, the details are hidden from the user, who now interacts with classes such as List and Map (distinct from the ES2015 class of the same name). For large amounts of data, these entities are much quicker to modify; for example, Map's set and get functions have O(log₃₂ n) time complexity, and List's push and pop methods are O(1) despite returning a new updated list.

Demonstration

Immutable is a large library with multiple classes, each of which has many methods. We will demonstrate just a small corner of that library, beginning with converting some typical JS data to a nested series of Maps and Lists.

const dino = {
  name: 'Denver',
  type: 'dinosaur',
  friends: [
    {
      name: 'Wally',
      type: 'human',
      pets: [
        { name: 'Rocky', type: 'dog' }
      ]
    },
    { name: 'Casey', type: 'human' }
  ]
}

const Immutable = require('immutable')
// dino is in scope

const immDino = Immutable.fromJS(dino)

// view nested (perhaps missing!) data
const dogName = immDino.getIn(['friends', 0, 'pets', 0, 'name'])
console.log(dogName)

// change nested data (immutably), perhaps at a brand new path
const immDino2 = immDino.setIn(['friends', 0, 'pets', 0, 'name'], 'Spot')
console.log(immDino2)

Some advantages are not directly demonstrated, but worth pointing out:

  • if a property doesn't exist, getIn will short-circuit and return undefined rather than throwing a cannot read x of undefined error
  • if a property doesn't exist, setIn will generate new Map objects along the way
  • there are many more things we can do with Immutable.js Maps and Lists, such as .map, .push, .sort, etc. – all in a pure, non-destructive way

At the end, if some other library or framework requires a plain JS object or array, we can use .toJS() to convert it back.
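
A short sketch tying those last few points together (the data here is hypothetical):

const Immutable = require('immutable')

const imm = Immutable.fromJS({ items: [3, 1, 2] })

console.log(imm.getIn(['missing', 'path']))              // undefined, rather than a thrown error
const withNote = imm.setIn(['meta', 'note'], 'hi')       // intermediate Map created at 'meta'
const sorted = imm.update('items', list => list.sort())  // pure sort, returns a new List

console.log(withNote.toJS()) // { items: [3, 1, 2], meta: { note: 'hi' } }
console.log(sorted.toJS())   // { items: [1, 2, 3] }
console.log(imm.toJS())      // { items: [3, 1, 2] } – original unchanged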

All You Need is Immutable?

Performance and safety are the primary reasons to favor using Immutable, but as always there are tradeoffs to consider.

  • Immutable requires developers to learn a new, relatively large API
  • Debugging immutable values is less ergonomic than plain objects and arrays
  • Projects using Immutable classes may need to convert to and from simpler datatypes, potentially undoing the performance benefit

I Want it All

Each of the approaches examined has pros and cons, as is often the case in programming. Thankfully, they are not mutually exclusive. For example, Brian Lonsdorf demonstrates using Ramda lenses to blend native objects/arrays with Immutable.js maps/lists in his article Lenses with Immutable.js. Similarly, one might use array spread inside a function passed to Ramda over, or inside a call to Immer's produce function. Knowing multiple approaches allows you to reach for the abstraction that fits your use case.

Tool | Learning Curve | Safety | Performance | Concept | TL;DR
Native JS | devs expected to know it, but takes time to master | Poor, common source of errors | Slow (linear) for large amounts of data | Manually copy nested properties, map / filter / slice to update or delete | Built-in
Ramda (lenses) · 52 kB | Small number of concepts, but potentially alien to FP newbies | Good, but be careful with paths | Comparable to native | Create lenses for nested data, use them with view / set / over, manipulate via FP techniques (composition, partial application) | Versatile patterns
Immer · 13 kB | One API method | Good, but has some edge cases | Comparable to native | Use normally-mutating native actions on a draft object, actions instead become immutable updates | Easy API
Immutable · 63 kB | Larger library with many API methods | Good, but be careful with paths | Excellent per se, but slowed by conversion | Use library's data structures and methods instead of native types, convert where necessary | Faster algos

Addendum: Immutability via Function Encoding

Prof. Dierk König brings up another technique for immutable data: store your data as functions.

The Vireo combinator (as it is named in Raymond Smullyan's famous book, To Mock a Mockingbird) is defined in JavaScript as follows:

const vireo = a => b => f => f(a)(b)


When vireo is applied twice (to two separate pieces of data), it returns a final function which has closed over both data points. The result is conceptually a pair of numbers, stored as a function closure. To access the stored numbers, a final function argument is provided:

const vireo = a => b => f => f(a)(b)

const pairOfNums = vireo(5)(8) // f => f(5)(8)

pairOfNums(a => b => console.log(a)) // logs 5
pairOfNums(a => b => console.log(b)) // logs 8

Note that with this example, there is no way to actually modify the values once the pair is generated – the closure acts as a form of privileged data. How can we "update" such a pair? We make a new function which defers to the old function:

const vireo = a => b => f => f(a)(b)
const pairOfNums = vireo(5)(8) // f => f(5)(8)

const fst = a => _ => a
const snd = _ => b => b

const newPairWithSecondNumDoubled = f => {
  const firstNum = pairOfNums(fst)
  const secondNum = pairOfNums(snd)
  return f(firstNum)(secondNum * 2)
}

console.log(
  newPairWithSecondNumDoubled(fst), // 5
  newPairWithSecondNumDoubled(snd)  // 16
)

To make this easier, we could write helper functions for getting, setting, and mapping…

const vireo = a => b => f => f(a)(b)

const getL = p => p(l => _ => l)
const getR = p => p(_ => r => r)
const getLR = p => p(l => r => [l, r])

const setL = newL => p => vireo(newL)(getR(p))
const setR = newR => p => vireo(getL(p))(newR)

const mapL = transform => p => vireo(transform(getL(p)))(getR(p))
const mapR = transform => p => vireo(getL(p))(transform(getR(p)))

const numPair = vireo(3)(7)
const show = pair => console.log(getLR(pair))

show(numPair)                   // [3, 7]
show(setL(0)(numPair))          // [0, 7]
show(mapR(x => x * 2)(numPair)) // [3, 14]

Storing data through closures, and creating new functions which defer to old functions, can scale up to more complex data and also be made more idiomatic in JS; this is a large enough topic for its own article, however. If this approach intrigues you, Sandy Maguire's Thinking with Types illustrates a clever method (in Haskell) of encoding Tic-Tac-Toe boards as functions from coordinates to data. Experiment with a JS version below!

const emptyBoard = (x, y) => null

const setPos = (atX, atY, newVal, oldBoard) => {
  const oldVal = oldBoard(atX, atY)
  const newBoard = (x, y) => {
    return (x === atX && y === atY) ? newVal : oldBoard(x, y)
  }
  return newBoard
}

const printBoard = (board, limitX, limitY) => {
  for (let x = 0; x < limitX; x++) {
    for (let y = 0; y < limitY; y++) {
      console.log('pos ' + x + ', ' + y + ' is ' + board(x, y))
    }
  }
}

const b2 = setPos(0, 0, 'x', emptyBoard)
const b3 = setPos(1, 1, 'o', b2)

printBoard(b3, 2, 2) // NB: if you log a larger board, runkit paginates the results!

Top comments (6)

András Tóth • Edited

Great write-up to illustrate the problems and solutions of immutability! I have a couple of remarks:

  • I loved immer's approach since it was the cleanest.
  • I think immutable.js has passed away silently: the last release was 14 months ago, and it also has problems with TypeScript which do not seem likely to ever be solved.
  • Finally, I have serious problems with Ramda's syntax; personally I think it is a horrible API from the point of view of clean code:
// Any function that relies on you memorizing the order of parameters 
// will make you do unnecessary doc checks. 
// If you don't use `R.set` for 2 days then any of these would look legit:
R.set(frns0pets0name, 'Spot', dino);
R.set('Spot', frns0pets0name, dino);
R.set(dino, 'Spot', frns0pets0name);

// An alternative to this could be this:
R.set({ source: dino, lens: frns0pets0name, value: 'Spot' });

// another problem: stringly typed; if you were relying on TypeScript
// for example it won't save you, it won't know the difference between
// any string and a string coinciding to be a property name
immDino.getIn(['friends', 0, 'pets', 0, 'name'])

// one day someone will rename `name` to `firstName` using a tool in their IDE,
// the tool will find all Dino objects, and update them; except these string ones

So the take-away: use the cleanest, most intuitive ones until you have a performance bottleneck; then use the complicated ones for those cases.

Gabriel Lebec • Edited

You make a very good point about Ramda. This generally isn't a problem in Haskell (one of the inspirations for Ramda) because of static typing, which enforces that a function like set takes a lens, source object, and target value in the correct argument order (else it simply won't compile). Then, the fact that set is curried allows for a lot of seamless interop and versatile usage patterns with other higher-order functions. In JS however, the lack of HS-style flexible/powerful static typing means you have more opportunities to make mistakes, passing the wrong object into the wrong argument position. Ramda still provides some very flexible usage patterns due to currying, but I don't demonstrate or highlight those techniques in this article as they fit into a much wider topic of FP approaches.

Likewise, I agree with you about the problems of "stringly-typed" APIs. Again in Haskell this isn't an issue for lenses for a variety of reasons, including the fact that lenses can be auto-generated from record field names and composed using the vanilla function composition operator (.) so you end up with lines like friends . first . pet . name which is 100% typed, non-string code.

The moral may be that trying to emulate Haskell in JavaScript can look almost as nice in a purely syntactic sense, but lose so many of the real benefits that it becomes correspondingly harder to justify the exercise.

Brian • Edited

lodash/fp may be of interest - the immutable/functional version of lodash. Similar in style to Ramda

Gabriel Lebec

NOTE: on 2020-08-13, github.com/forem/forem/issues/9773 noted that some Runkit embeds are failing to load correctly. This article is affected; apologies to any readers who may come here before that issue is resolved.

Gabriel Linassi

Very good article. It helped me quite a lot to understand a bit more about this advanced topic in JS. Thank you very much for taking the time to share your knowledge with the community.

Magne

ts-belt is the new shizz!