Mateus Amorim

Understanding primitive vs reference types in JavaScript and what you can do with it


Primitive vs reference types is a very important concept you need to understand, both to avoid dealing with strange mutation bugs and to be able to use references to your advantage.

Primitive types

A primitive type in JavaScript (string, number, boolean, null, undefined, symbol, bigint) is compared by value: two values that hold the same data are strictly equal (===).


  1 === 1 // true
  'one' === 'one' // true
  null === null // true
  undefined === undefined // true
  Infinity === Infinity // true

In that sense, it doesn't matter where a primitive was declared or how it was produced; two equal values will always compare as equal. 1 is always 1, null is always null.
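For instance, two strings built in completely different ways are still the very same value:

```javascript
// Primitives are compared by value, so how a value was produced is irrelevant
const fromConcat = 'on' + 'e';
const fromLiteral = 'one';

console.log(fromConcat === fromLiteral); // true
console.log(0.25 + 0.75 === 1);          // true
```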

Reference type

Now, for reference types, it is a little different:

  // NaN is a special case: it is technically a primitive number, yet it never equals itself
  NaN === NaN // false

  ({} === {}) // false
  [] === [] // false
  (() => {}) === (() => {}) // false

In this case, even though the structure is the same on both sides, the comparison will always be false, and it is easy to understand why:


NaN stands for "Not a Number". Despite the name, it is actually a primitive value of type number; it represents the result of a numeric operation that has no meaningful numeric value. Two NaNs compare as unequal because the IEEE 754 floating-point standard defines NaN as not equal to anything, including itself, even when both came from the very same operation:

  parseInt('asd') === parseInt('asd') // false

Ironically, typeof NaN returns "number", which is a little confusing given the name. But take parseInt('asd') for example: it returns NaN because, even though the result of parseInt should be a number, there is no number it could sensibly be. So basically, NaN is a number value with no known representation.
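Because NaN never equals itself, you cannot detect it with ===; the reliable checks are Number.isNaN and Object.is:

```javascript
const result = parseInt('asd'); // NaN

// NaN is the only value in the language that is not strictly equal to itself
console.log(result === result); // false

// The reliable ways to test for it
console.log(Number.isNaN(result));   // true
console.log(Object.is(result, NaN)); // true
```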

Objects and Arrays

Both arrays and objects can be seen as collections of keys and values. The main difference is that arrays use non-negative integers as their keys (indexes).
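You can see those numeric keys by asking an array for them directly:

```javascript
const list = ['a', 'b', 'c'];

// Array indexes are just object keys (exposed as strings)
console.log(Object.keys(list));     // ['0', '1', '2']
console.log(list[1] === list['1']); // true
```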

With that in mind, when you compare {} === {}, you are actually asking whether both collections are the same collection, not whether they hold the same items, which is fundamentally different. Both collections are empty, but they are not the same one. You can have an empty basket, but it is not the same as someone else's empty basket; it is just in the same condition.

  // Not the same collection
  ({} === {}) // false

  const x = {}

  // Same collection
  x === x // true

This gets tricky in some situations. For example:

Dealing with aliases

  // Let's say you wanted a copy of an object and did this
  const x = {};
  const y = x;

  // now you want to set a value for this copy
  y.a = 5;

  // however you unintentionally also modified the original, since you assigned the reference, not the values :(
  console.log(x.a, y.a) // 5 5

This may be a simple example, but it's important to pay attention to that kind of destructive behavior. For example:

  const x = [1,2,3];
  const reversed = x.reverse(); // [3,2,1];

  // Looks ok right? However, .reverse modifies the array in place
  // So if we do this now

  console.log(x); // [3,2,1]
  // We get the reversed array :(

Well, if you were depending on this array for something else in your application, it might stop working after that.

That is why it is good practice to never modify anything outside the functions that you create, and to always return a new value instead.
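One way to follow that practice is to copy the array before calling a mutating method (newer runtimes also offer non-mutating variants like toReversed, but spreading works everywhere):

```javascript
const x = [1, 2, 3];

// Copy first, then reverse the copy; the original stays intact
const reversed = [...x].reverse();

console.log(reversed); // [3, 2, 1]
console.log(x);        // [1, 2, 3]
```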

Let's also take a look at the different ways you could go about cloning objects and the problems each could cause.

Spread and Object.assign

A common way to shallow clone an object is to use the spread syntax like this:

  const x = { a: 1 };
  const y = { ...x }; // same as Object.assign({}, x);

  y.a = 5;

  console.log(x.a, y.a) // 1 5

This works nicely for most cases, but a problem arises when there are nested reference types inside the object.

  const x = { a: { b: 2 } };
  const y = { ...x };

  y.a.b = 5;

  console.log(x.a.b, y.a.b); // 5 5

Well, we only cloned the top-level values of x, and unfortunately, x had an object inside it that we wanted to modify. This became a problem because the value of x.a is a reference, so the cloned object also points to that same reference. This can be a very bad thing in a large codebase that uses a shared store, for example, since you may modify the store without intending to and cause side effects in other places.
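When you know the shape of the object, one way around this is to spread each nested level explicitly. A minimal sketch for this exact shape:

```javascript
const x = { a: { b: 2 } };

// Spread the outer object AND the nested one, so neither level is shared
const y = { ...x, a: { ...x.a } };

y.a.b = 5;

console.log(x.a.b, y.a.b); // 2 5
```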


The Object.create() approach looks similar to the spread one. However, there are some differences:

  const x = { a: { b: 2 } };
  const y = Object.create(x);

  y.a.b = 5;

  // same result as the spread approach
  console.log(x.a.b, y.a.b); // 5 5

  // However
  console.log(y); // {}

  // Also
  console.log(y.a); // { b: 5 }

What happens here is that Object.create does not copy x at all: it creates a new, empty object with x as its prototype, so property lookups on y fall back to x. This can be useful if you want to overwrite one value without losing the original reference values and keep them in sync.
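You can confirm that y has no properties of its own and that x sits on its prototype chain:

```javascript
const x = { a: { b: 2 } };
const y = Object.create(x);

// y itself is empty; x is its prototype
console.log(Object.keys(y));                 // []
console.log(Object.getPrototypeOf(y) === x); // true

// Reads on y fall back to the prototype, so it is the very same nested object
console.log(y.a === x.a); // true
```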

Using JSON stringify and parse

A common approach to solve the shallow-copy problem is to serialize the object with JSON.stringify and parse the result again, creating a brand-new object.

  const x = { a: { b: 2 } };
  const y = JSON.parse(JSON.stringify(x));

  y.a.b = 5;

  console.log(x.a.b, y.a.b); // 2 5 :)

This is a nice approach for simple objects. However, JSON.stringify will not work with functions, Dates, Maps, Sets, class instances, and other values that cannot be represented in a JSON file. So use it with caution :).
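A quick illustration of what gets lost or transformed along the way: functions and undefined are silently dropped, and Dates become plain strings.

```javascript
const original = {
  when: new Date(),
  greet() { return 'hi'; },
  missing: undefined,
};

const cloned = JSON.parse(JSON.stringify(original));

console.log(typeof cloned.when);  // 'string' (no longer a Date)
console.log('greet' in cloned);   // false (functions are dropped)
console.log('missing' in cloned); // false (undefined is dropped)
```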

What is the best way to clone an object

Well, the "best way" really depends on what you need. In most cases, you can just use the spread syntax. If you want prototype inheritance, you can use Object.create. If you want to deep clone simple objects, like an API response, you can use JSON.stringify and JSON.parse. But if you really need to deep clone a complex object, you may have to check the type of each of its keys and pick the right approach for each one.
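It is also worth knowing that modern runtimes (Node 17+, current browsers) ship a built-in structuredClone, which deep clones many things JSON cannot handle, such as Date, Map, Set, and typed arrays, though it still throws on functions. A small sketch:

```javascript
const x = { a: { b: 2 }, when: new Date(0), tags: new Set(['js']) };

// structuredClone performs a real deep clone of supported types
const y = structuredClone(x);
y.a.b = 5;

console.log(x.a.b, y.a.b);           // 2 5
console.log(y.when instanceof Date); // true
console.log(y.tags.has('js'));       // true
```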

How to use reference types to our advantage

Well, we already saw a little bit of that with Object.create.
It keeps a live link to the original object through the prototype chain, so it still sees the original object's values even after they change.

  const x = { a: 1 };
  const y = Object.create(x);

  console.log(y.a) // 1

  x.b = 2;

  console.log(y.b) // 2

This is nice, but a place where this knowledge becomes interesting is when dealing with modules.

For example, let's say I have this simple module:

  // myModule.ts
  export default {};

Ok, now let's see what you can do with it.

  // index.ts
  import state from './myModule';

  state = { a: 5 } // This throws a TypeError: imported bindings are read-only

  // However, since the value is a reference, we can do this:
  state.a = 1;

  console.log(state); // { a: 1 }

You should probably never do that in production code, since you would have no idea what is inside the module, but it can be useful for quick test APIs, statistics, redefining a module's functionality, and so on.
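If you want the opposite guarantee, that nobody can mutate your module's export this way, you could freeze it before exporting. A sketch, with a plain object standing in for the module export:

```javascript
// In the module file you would write: export default Object.freeze({ a: 1 });
const state = Object.freeze({ a: 1 });

try {
  state.a = 5; // silently ignored in sloppy mode, throws a TypeError in strict mode
} catch (e) {
  // only reached in strict mode (ES modules are always strict)
}

console.log(state.a); // 1
```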

ex: test API


  // users.ts
  export default new Map([
    ['1', { id: '1', name: 'bob' }],
    ['2', { id: '2', name: 'foo' }],
    ['3', { id: '3', name: 'bar' }]
  ]);

  // server/index.ts
  import users from './users';

  app.get('/users', (req, res) => res.json([...users.values()]));
  app.delete('/users/:id', (req, res) => {
    users.delete(req.params.id);
    res.sendStatus(204);
  });

ex: Statistics


  // internal.ts
  export default { operationsPerformed: [] };

  // index.ts
  import internal from './internal';

  internal.operationsPerformed.push({ name: 'console.log', args: ['log'] });

  process.addListener('SIGINT', () => {
    console.log(internal.operationsPerformed);
    process.exit(0);
  });

ex: Redefinition of module

  // myModule.ts
  export default {
    a() {
      throw new Error('you should have never done that');
    }
  };

  // index.ts
  import myModule from './myModule';

  myModule.a = () => {
    console.log('everything is fine');
  };

  myModule.a(); // :)
