
Understanding JavaScript/TypeScript Memoization

Carlos Caballero on February 22, 2019

Originally published at www.carloscaballero.io on February 8, 2019. What does memoization mean? The definition of memoization from th...
Enrique Moreno Tent • Edited

Good job, important concepts explained with easy words.

Small improvements to your memoize method:

  • You overwrite the cache entry even if the value was already saved. This is unnecessary, since the value does not change.
  • undefined is actually a valid value for a function's return, so you should probably use hasOwnProperty instead of checking cache[stringifiedParams] === undefined (see the sketch below).
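
A rough sketch of both fixes, not the article's exact code:

// Sketch of the suggested fixes: compute and store only when the key is
// genuinely missing, and use hasOwnProperty so that a cached undefined
// still counts as a hit.
const memoize = fn => {
  const cache = {};
  return (...args) => {
    const stringifiedParams = JSON.stringify(args);
    if (!Object.prototype.hasOwnProperty.call(cache, stringifiedParams)) {
      cache[stringifiedParams] = fn(...args);
    }
    return cache[stringifiedParams];
  };
};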

Thanks for the article.

Carlos Caballero

Hi Enrique!

Thanks for your words. The intent of this post is to show the memoization concept (I think you can find this technique in your language/framework/tool and apply it in your project).

Regarding your points about the memoize method:

  1. Yes, it is not necessary to re-save the same value.
  2. Yes, I think it is better to use hasOwnProperty, as you said!

Now I can make a new version to teach in my class. Thanks very much for your comments :-).

Mihail Malo

I find it's best to use a Map or, for functions and other objects, a WeakMap.
The way I write my code, structural comparison would give almost no advantage, but JSON.stringify of huge objects adds overhead to every call (even a cache hit).
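
For instance (just a sketch of what I mean, with a made-up example function), keying the cache on the object reference itself avoids the stringify cost entirely:

// Sketch only: cache keyed by object identity via WeakMap, so there is no
// JSON.stringify of the argument on any call, and entries can be garbage
// collected once the key object is gone.
const memoizeByRef = fn => {
  const cache = new WeakMap(); // keys must be objects
  return obj => {
    if (cache.has(obj)) return cache.get(obj); // cache hit: no serialization
    const result = fn(obj);
    cache.set(obj, result);
    return result;
  };
};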

Carlos Caballero

Hi Mihail!

Good catch! In fact, the lodash implementation (github.com/lodash/lodash/blob/508d...) uses a Map.

In my article, I only wanted to show the memoization technique when you have a pure function!

If you want to use it in a real project, I think the lodash implementation is a good option.

Mihail Malo • Edited

I have so many comments about that function :/

  1. Line 58 is redundant if they also do line 62. It seems they have kept this redundancy through the last commit to that function.
  2. You can only set memoize.Cache, but you can't set it in the call to memoize (like memoize(fn, resolver, cacheClass)).
  3. You can't pass in an existing cache like memoize(fn, resolver, cacheInstance), which would be more flexible and no less ergonomic.
  4. Line 55: memoized.cache = cache.set(key, result) || cache. What if my custom cache returns a truthy value that isn't itself, for example the result value? Why not just set it to cache? What does this achieve? Supporting an immutable cache?

The closest I'd go with is something like this:

export const memoize = ({
  resolver,
  cacheClass = Map,
  cacheFactory = () => new cacheClass()
} = {}) => func => {
  const cache = cacheFactory()
  return typeof resolver === "undefined"
    ? arg => {
        if (cache.has(arg)) return cache.get(arg)
        const result = func(arg)
        cache.set(arg, result)
        return result
      }
    : (...args) => {
        const key = resolver(...args)
        if (cache.has(key)) return cache.get(key)
        const result = func(...args)
        cache.set(key, result)
        return result
      }
}
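
Usage would look something like this (hypothetical example functions, just to show the curried API):

// Single-argument function: no resolver needed, the argument itself is the Map key.
const square = memoize()(n => n * n);
square(4); // computed
square(4); // cache hit

// Multi-argument function: provide a resolver to build the cache key.
const add = memoize({ resolver: (a, b) => `${a}:${b}` })((a, b) => a + b);
add(1, 2); // computed
add(1, 2); // cache hit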

I don't think this-binding support should be there by default, especially passing this to both the resolver and the function.
Maybe this 😂👌:

export const memoize = ({
  resolver,
  cacheClass = Map,
  cacheFactory = () => new cacheClass()
} = {}) => func => {
  const cache = cacheFactory()
  return typeof resolver === "undefined"
    ? func.hasOwnProperty("prototype")
      ? function(arg) {
          if (cache.has(arg)) return cache.get(arg)
          const result = func.call(this, arg)
          cache.set(arg, result)
          return result
        }
      : arg => {
          if (cache.has(arg)) return cache.get(arg)
          const result = func(arg)
          cache.set(arg, result)
          return result
        }
    : func.hasOwnProperty("prototype")
      ? resolver.hasOwnProperty("prototype")
        ? function(...args) {
            const key = resolver.apply(this, args)
            if (cache.has(key)) return cache.get(key)
            const result = func.apply(this, args)
            cache.set(key, result)
            return result
          }
        : function(...args) {
            const key = resolver(...args)
            if (cache.has(key)) return cache.get(key)
            const result = func.apply(this, args)
            cache.set(key, result)
            return result
          }
      : (...args) => {
          const key = resolver(...args)
          if (cache.has(key)) return cache.get(key)
          const result = func(...args)
          cache.set(key, result)
          return result
        }
}

Obviously a combinatorial explosion, but that is my point: most of the time you won't use most of the features anywhere in your entire project, so it's better to write your own function. Maybe you want to use a WeakMap of Maps, or vice versa, instead of a resolver.

Memoization is already not a business logic requirement but a performance optimization. So why waste performance on a suboptimal generic solution instead of improving your algorithm by inlining a cache?
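
For example, an inlined cache for a hypothetical two-argument function could look like this (illustrative names only, not from the article or lodash):

// Sketch: hand-rolled cache for getScore(user, mode), no generic memoize helper.
// Outer WeakMap keyed by the user object, inner Map keyed by the mode string.
const scoreCache = new WeakMap();
const getScore = (user, mode) => {
  let byMode = scoreCache.get(user);
  if (!byMode) {
    byMode = new Map();
    scoreCache.set(user, byMode);
  }
  if (byMode.has(mode)) return byMode.get(mode);
  const result = computeScore(user, mode); // the expensive pure function (assumed to exist)
  byMode.set(mode, result);
  return result;
};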

Ritikesh

Hey, nice article. I've implemented a slightly tweaked version of memoization as well; you can read more about it here: dev.to/ritikesh/why-just-cache-whe....

Timkor • Edited

Cool article!
How did you generate source code example images?

Carlos Caballero

I'm using Carbon, which is also available as a VSCode plugin.

Thanks!

Shad Mirza

Thanks, I wanted to know that too.
Great article!

Jim Lynch

Really interesting post! I am wondering, though: what is "TurboFan"? 🤔

Richard Joseph

As this wasn't answered and I was also interested, I did a quick Google and found this:

v8.dev/docs/turbofan

"TurboFan is one of V8’s optimizing compilers leveraging a concept called “Sea of Nodes”. "

Obviously, V8 is the name of Google Chrome's JavaScript engine, which is what Node.js is built on.