Anthony Lannutti

Why You Should Strive for Immutability

Just to get my biases out of the way right off the bat, I am not a fan of object-oriented programming (OOP). I tend to prefer functional programming (FP) whenever it is feasible. I'll go into more detail about why in a future post, but here is the short version: in my experience, OOP leads to code that is more difficult to create, maintain, parallelize, and test.

If you write FP, you learn to love immutability; it is one of the core principles of the paradigm. There are many reasons for this, and they have been explained far more eloquently than I could ever hope to. Still, I'm going to take a crack at the topic myself and put my own spin on it.
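To make that concrete, here is a minimal Python sketch of what an immutable update looks like. The `Account` type and `deposit` function are hypothetical, not from any particular codebase; the point is simply that instead of changing an object in place, you return a new value that describes the new state.

```python
from dataclasses import dataclass, replace

# A hypothetical Account type, frozen so instances cannot be mutated after creation.
@dataclass(frozen=True)
class Account:
    owner: str
    balance: int

def deposit(account: Account, amount: int) -> Account:
    # Instead of mutating `account`, return a new value describing the new state.
    return replace(account, balance=account.balance + amount)

before = Account(owner="alice", balance=100)
after = deposit(before, 50)

assert before.balance == 100   # the original value is untouched
assert after.balance == 150
```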

As for my assertion that OOP leads to code that is more difficult to unit test, I would posit that there are three kinds of tests, each with a different level of difficulty in actually extracting value from it. The easiest is the test that just validates a return value or an exception that is thrown. The second kind, which is a little harder to extract value from, is the test that depends on changes in state. The third kind, and the most difficult to extract value from, is the test that requires validating interaction with some external component. Dave Farley made a video some months ago that discusses these kinds of tests, and it can be found here. When you write immutable code, you naturally find yourself writing more unit tests that only check return values and exceptions. These provide more value than the other two forms of testing because you can completely ignore implementation details that the consumer of your API ultimately shouldn't need to concern themselves with.
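To illustrate the difference between the first two kinds, here is a small, hypothetical Python sketch using the standard `unittest` module. `apply_discount` and `Cart` are made-up examples: the first test only asserts on return values and exceptions, while the second has to know which piece of state the method mutates.

```python
import unittest

# Hypothetical pure function: everything the test needs is in the inputs,
# the return value, and the exception it may raise.
def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

# Hypothetical mutable counterpart: the result is only observable as a state change.
class Cart:
    def __init__(self, price: float) -> None:
        self.price = price

    def apply_discount(self, percent: float) -> None:
        self.price *= (1 - percent / 100)

class DiscountTests(unittest.TestCase):
    def test_return_value(self):
        # Easiest kind of test: assert on the returned value or the raised exception.
        self.assertEqual(apply_discount(100.0, 25.0), 75.0)
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150.0)

    def test_state_change(self):
        # Harder to extract value from: the test must know which field the method mutates.
        cart = Cart(100.0)
        cart.apply_discount(25.0)
        self.assertEqual(cart.price, 75.0)

if __name__ == "__main__":
    unittest.main()
```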

A while back, I watched an NDC Conference keynote by Kevlin Henney on "Refactoring for Immutability". I highly recommend that anyone who is a software developer watch this talk, as it had a big impact on how I view the code I write. In this keynote, one slide stood out to me above all the others. Stop, read it, process it, let it sink in.

How Mutability and State Sharing Affect the Need for Synchronisation

I know this isn't a new concept, but I like how it is visualized here, likely because it reminds me of root locus analysis from back in my college days, when I was studying to be an electrical engineer.

When you look at a root locus plot, you can very easily see whether a system is stable from the position of its poles: as long as none of them sit in the right half-plane (the side with a positive real part), the system is stable. In the case of the slide from Kevlin Henney's keynote, there isn't a whole half of the plane you need to stay out of, just the top-right quadrant. Why stay out of that quadrant, you ask? Because the moment you enter it, you need to concern yourself with locking. We want things to run in parallel so that they run faster, but locks, in addition to protecting state that gets mutated, are extremely good at making things run slowly. Code with locks is also harder to maintain than code without them, so if we can avoid them, we should.
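As a rough illustration of that top-right quadrant (the counter example below is hypothetical, not from the keynote): the moment state is both shared and mutated, every update has to pass through a lock.

```python
import threading

# Hypothetical shared, mutable counter: once state is both shared and mutated,
# a lock becomes necessary, and every update is serialized behind it.
counter = 0
lock = threading.Lock()

def increment(times: int) -> None:
    global counter
    for _ in range(times):
        with lock:  # without this, concurrent read-modify-write races lose updates
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000, but only because every increment waited its turn on the lock
```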

I'm sure you've reasoned your way to this by now, but immutability is the solution to this problem. Notice that, much like with the root locus plot I mentioned earlier, as long as you stay out of the top-right quadrant you don't need to worry about synchronization at all! You don't even really have to think about it beyond verifying that the APIs you are using in your parallelized code are thread safe. If those APIs are thread safe and you are not mutating any shared data structures, then there is no way for your threads or processes to stomp all over each other's work.
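Here is a hypothetical companion sketch to the lock example above: the shared data is an immutable tuple, each worker only reads it and returns its own result, and no synchronization appears anywhere. (In CPython the GIL limits true CPU parallelism, but the point here is the absence of locks, not raw speed.)

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical workload: every worker reads the same immutable tuple and returns
# its own result; nothing shared is ever written, so no lock is needed.
readings = tuple(range(1_000))

def partial_sum(chunk: range) -> int:
    return sum(readings[i] for i in chunk)

chunks = [range(i, i + 250) for i in range(0, 1_000, 250)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(partial_sum, chunks))

print(sum(results))  # same answer as sum(readings); no locks anywhere
```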
