Sure. Mutation is a kind of side-effect, which is a common source of bugs in code.
For example, let's say I have a function called name() whose job is to set the name of the object in question, so I use it like:

```js
const obj = { n: 5 };
name(obj, "fred");
console.log(obj); // obj was mutated in place
```
Later on, your future self or a collaborator skims the code and misses that name() has a mutation side-effect. They see that obj is assigned and never (or can't be) reassigned, so they assume obj hasn't changed. A bug can easily be introduced the next time the code is touched, because the mutation side-effect obscured what the code does and made it harder to read.
A better way to write the code would be to use immutable operations like spread to return a new object, so the change shows up in the calling code as an explicit reassignment.
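For instance, name() could be written to copy instead of mutate (a minimal sketch, assuming it just spreads the old object and adds the field):

```js
// Hypothetical immutable version: returns a new object, never touches the input.
function name(obj, newName) {
  return { ...obj, name: newName };
}

let obj = { n: 5 };
obj = name(obj, "fred"); // the reassignment makes the change visible
console.log(obj); // { n: 5, name: 'fred' }
```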
Now it is very clear to the reader that obj is being changed, so it is less likely that the next programmer will introduce a bug.
It may be tempting to say that programmers should just be more careful readers of code, but when you introduce parallelism (which is very important for efficiency) it becomes clear that bugs are unavoidable without immutability.
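Consider something like this (a hypothetical sketch of the kind of race I mean; the names are made up):

```js
let user = { name: "fred" };

// A promise callback mutates the shared object...
Promise.resolve().then(() => {
  user.name = "wilma";
});

// ...while this line reads it. Which runs first?
console.log(user.name);
```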
I believe it might be impossible to predict what this code will do. Will the promise finish first, or will the console.log? That depends on how the particular browser or runtime is implemented and on how many cores are available on the user's machine and what else is running at the time... and so on. (Actually, I don't think this particular example runs in JavaScript because it sheds the outer context when it creates the promise, but this is the simplest example I could think of).
This is why I prefer languages like Elm, which doesn't support mutation at all. One might think that constantly creating new objects would be costly, but Elm benchmarks better than most popular JavaScript frameworks, in part because immutability enables optimizations such as cheap equality checks when deciding what to re-render.
For better examples of mutability bugs than my lazy ones, have a look here: elmprogramming.com/immutability.html
For further reading, look into immutability as a functional programming paradigm.
thank you so much <3 this was an amazing explanation
I enjoyed the reading; I just want to put in a good word for mutation.
Using proper names makes it pretty understandable. Even so, sure, it's better to create new instances to avoid bugs.
A common case is an object being used in several places, and a dev adding a mutation in the middle just for a new use case, breaking things in the other data paths. If you instantiate a new object, array or whatever and work with your own instance, this will hardly happen.
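That failure mode could look something like this (hypothetical names, a minimal sketch):

```js
// Hypothetical shared config read by several data paths.
const defaults = { retries: 3, verbose: false };

// One dev mutates it "just for a new use case"...
function noisyTask() {
  defaults.verbose = true; // side-effect on shared state
  return `noisy: verbose=${defaults.verbose}`;
}

// ...and an unrelated path silently changes behaviour.
function quietTask() {
  return `quiet: verbose=${defaults.verbose}`;
}

noisyTask();
console.log(quietTask()); // "quiet: verbose=true", the quiet path broke

// Working with your own instance avoids the breakage:
function quietTaskFixed() {
  const own = { ...defaults, verbose: false };
  return `quiet: verbose=${own.verbose}`;
}
console.log(quietTaskFixed()); // "quiet: verbose=false"
```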
There are other ways to avoid this, of course: interfaces in OOP, and good practices when using FP. Maybe the most important thing is to be clear about where you are writing imperative statements and where and how you use declarative ones.
I agree... but that's largely because I intentionally made the example trivial so that the issue would be easy to understand. Real-world mutation bugs can be extremely subtle.
Also agree with that