
But the World is Mutable

Eric Normand ・ Originally published at lispcast.com ・ 3 min read

Immutability is a hard topic to broach. As a programmer used to modeling the world, you might object to immutable data structures. How do you model a changing world? Why would you choose immutable data structures when everything in the world is changeable?

Let's do a little thought experiment. Let's look at a nice mutable system: paper and pencil. You can write, erase, and write again. It's very convenient. It lets you correct mistakes. And when you don't need something anymore, you can easily erase it.

Now answer this: would you trust a bank that used pencils to record transactions? It would be easy: whenever you would withdraw money, they would erase the old balance and write the new balance. And if you transferred money from one account to another, they'd erase two balances and write the new ones in. It may sound great, but there's a reason banks don't use pencils: they want to be sure nothing has changed. That sounds like immutability.

Bank ledger

This is a bank ledger. Each transaction gets its own line. Always done in pen. It's an example of an append-only data structure. You can answer questions about the past like "How much money was in the account at the close of last Tuesday?" by going up lines until you find the last entry for Tuesday. And you can do that because you never modify existing entries. You only add new entries on blank lines.
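The ledger's two operations, "append an entry" and "read the history", can be sketched in a few lines of Python. This is an illustrative toy, not a real accounting system; the names (`Entry`, `record`, `balance_at`) are mine, not from the article.

```python
from dataclasses import dataclass
from datetime import date

# A minimal append-only ledger: entries are only ever added, never modified.
@dataclass(frozen=True)          # frozen=True makes each entry immutable
class Entry:
    day: date
    amount: int                  # positive = deposit, negative = withdrawal

ledger = []                      # the only permitted operation is append

def record(entry):
    ledger.append(entry)

def balance_at(close_of_day):
    """Answer 'how much was in the account at the close of this day?'
    by summing every entry up to and including that day."""
    return sum(e.amount for e in ledger if e.day <= close_of_day)

record(Entry(date(2021, 3, 1), 100))
record(Entry(date(2021, 3, 2), -30))
record(Entry(date(2021, 3, 5), 50))

print(balance_at(date(2021, 3, 2)))  # 70: old entries still answer questions about the past
```

Because no entry is ever overwritten, any past balance can be reconstructed, exactly like running a finger up the lines of the paper ledger.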

Medical Records

This is another example of an append-only data structure in the real world: medical records. Each patient gets a file that everything is added to. You never modify old records. That way, everything is recorded, even the wrong diagnoses (mistakes) of the doctor.

It turns out that traditional information systems that need a high degree of reliability create immutable records out of mutable paper. Even though you could, in theory, scratch out some data and write over it, or white it out, or find some other way to mutate the document, a mark of professionalism in these jobs is the discipline to adhere to strict append-only behavior.

Wouldn't it be nice if the machine took care of that discipline for us? Even though RAM and disk are mutable like paper and pencil, we can impose a discipline inside our program. We could rely on the programmer never to accidentally overwrite existing data, but that just shifts the burden. Instead, we can build immutability into our data structures and make a paper that cannot be overwritten.

That's how immutable data structures work. Every new piece of information is written to a new location in memory. A location is reused only once it is proven that it will never be used again.
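The "write to a new location" idea can be sketched with a copy-on-write update. This is a deliberately naive illustration (the `assoc` helper is hypothetical, named after Clojure's function of the same name); real persistent data structures share most of their structure between versions instead of copying, which is what makes them efficient.

```python
# A sketch of immutable updates: "changing" a value produces a new
# structure, and the old version remains intact for anyone holding it.

def assoc(d, key, value):
    """Return a NEW dict with key set to value; the original is untouched."""
    new = dict(d)        # copy-on-write: fresh memory for the new version
    new[key] = value
    return new

account_v1 = {"owner": "alice", "balance": 100}
account_v2 = assoc(account_v1, "balance", 70)

print(account_v1["balance"])  # 100 -- the past is still readable
print(account_v2["balance"])  # 70  -- the new version lives elsewhere
```

Anyone still holding `account_v1` sees exactly what they always saw, just as an old ledger line keeps saying what it always said.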

Reliable paper-based systems use immutable data. There was a time when computer memory was expensive and we had to reuse storage, so we couldn't make immutable systems. But RAM is cheap now! We should be using immutable data, just as banks have done for hundreds of years. Ready to join the 13th century?¹

If you're interested in a language with a very cool set of powerful immutable data structures, probably the most cutting edge immutable data structures in any language, you're in luck! Check out LispCast Introduction to Clojure at PurelyFunctional.tv. It's a video course with animations, exercises, and screencasts that teaches you Clojure.

Photo credits: Ledger and Medical Records


  1. The double-entry method of accounting traces its history back to 13th-century Florence.

Eric Normand

@ericnormand

Eric Normand is a long time functional programmer, writer, and teacher. He teaches Clojure and Functional Programming at PurelyFunctional.tv.

Discussion


Wow, very nice thought experiment! Although I'm already sold on the idea and have taken your Clojure course, it was still an interesting read.

 

Ready to join the 13th century?

Nice, I'll be using that in future.

I think our resistance to immutability comes from our cognitive coupling to databases. We've trained ourselves to turn specifications into tables, and we've gotten so good at it that we do it intuitively, without even realising we're translating requirements into implementations.

We end up viewing it as the way things are, rather than an implementation of our model.

I was able to break the spell by moving to Event Sourced systems: you see how useful immutable messages are and how they actually model the system, as opposed to a standard RDBMS system.
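The event-sourced view described here can be sketched in a few lines: the stored truth is an immutable log of events, and current state is derived by folding over it. This is a toy illustration; the event names and `apply` function are invented, not from any particular ES framework.

```python
from functools import reduce

# A toy event-sourced account: events are immutable facts that are only
# ever appended, and state is computed from the log, never stored in place.
events = [
    {"type": "Deposited", "amount": 100},
    {"type": "Withdrew",  "amount": 30},
    {"type": "Deposited", "amount": 50},
]

def apply(state, event):
    """Fold one event into the running balance."""
    if event["type"] == "Deposited":
        return state + event["amount"]
    if event["type"] == "Withdrew":
        return state - event["amount"]
    return state

balance = reduce(apply, events, 0)
print(balance)  # 120
```

Because the log is never rewritten, you get auditing and "how did we get here?" answers for free, which is exactly the draw described in this thread.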

 

I like the idea in the Event Sourced community that mature models eventually become event sourced. That is, eventually, you want auditing and a complete history of how you got there.

 

I definitely agree. As the business matures you'll want a deeper understanding of what's actually happening, so you can optimise and improve.

As an aside, the current trend is to just mash reporting software/code into our apps to solve this problem, but it doesn't really work. Making your application event sourced is really the best way to solve it.

There are those who think ES is a bad choice at the beginning of a project, but I disagree. Our model has evolved extensively, even radically, over the last year, and ES didn't interfere at all; instead it made us think about what we're doing and forced us to keep things clear and consistent. I doubt I could say the same if we were building a standard CRUD/RDBMS app.

 

I think the education in our domain is making these mistakes: it presents programming as a copy of the world, and it presents only one paradigm, OOP.

Software is not the real world and shouldn't try to replicate it, it should solve real world problems in its own way.

Rant over, sorry 🙏.

PS: you can use immutability in any language, though the degree of support differs; even JS has a famous immutable library.

 

Software is not the real world and shouldn't try to replicate it, it should solve real world problems in its own way.

I totally agree! I don't know why people are still teaching this idea that you should simulate the world.

We are mostly programming Information Systems: they gather, process, record, and transmit data. They're not simulations!

 

Besides, the world is not mutable. We can't change the here and now. We can only act on it now to change the future.

 

When you look at a waterfall, what is the waterfall? Is it the water you're seeing now (which will be hundreds of meters away before you blink)?

I'm not sure what I'm getting at but I think the general metaphor applies somehow 😂

 

Sure, but that's just a way to express the same problem: if you act now to change the future, you lose a record of the now you used to have.

We work really hard to preserve physical stuff: records, photos, certificates, manuscripts, paintings, physical artifacts. They decay over time if we don't take care of them. Why make our information systems simulate this part of the world? We should help them preserve things, too.