In his talk on Functional Programming, Robert "Uncle Bob" Martin discusses what he calls "The Failure of State":
In the first 21 minutes, Uncle Bob brings up what I have referred to as temporal data in the past. The side effects of stateful operations do lead to this, although sometimes the data is naturally temporal and that nature cannot be avoided.
Temporal Data
For example, one project I worked on years ago was disk volume synchronization: keeping a "mirror drive" in sync with the primary drive, while also allowing that constant synchronization to be paused (e.g. for a consistent backup from the mirror drive) and then resumed (after which it would eventually become synchronized again). While resyncing the changes that were made during the pause, further changes could arrive that kept the drives out of sync, until all the queued changes, and then the live changes, were also applied to the mirror drive.
This process didn't really work until I recognized and accepted that the changes to the mirror drive represented a temporal queue that could not be optimized without risking the fidelity of the mirror image. This was a new fundamental law for me.
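As a hypothetical sketch (the class name and operation set are my own for illustration, not from the original project), a temporal queue is just an append-only log of operations that must be replayed in exactly the order they were recorded:

```javascript
// A minimal temporal queue: an append-only log of filesystem-style
// operations, replayed strictly in recorded order against a mirror.
// The "mirror" here is simulated as a Map of path -> contents.
class TemporalQueue {
  constructor() {
    this.ops = []; // ordered log; the ordering is the whole point
  }
  record(op) {
    this.ops.push(op);
  }
  // Replay every operation, in order, against the mirror state.
  replay(mirror) {
    for (const op of this.ops) {
      op(mirror);
    }
    return mirror;
  }
}

// Example: operations captured while synchronization was paused.
const q = new TemporalQueue();
q.record(m => m.set("/a.txt", "hello"));
q.record(m => m.set("/a.txt", "hello, world"));
q.record(m => m.delete("/a.txt"));

const mirror = q.replay(new Map());
console.log(mirror.size); // the net effect happens to be empty here
```

The temptation is to notice that the net effect of a run of operations is "nothing" and skip them, which is exactly the optimization that fails below.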
Don't Mess With a Temporal Queue
Here's an example of why any attempt to optimize this temporal queue would fail: if you created and then deleted an item, such as a file or folder, and both the create and the delete were in the temporal queue of changes, you might think you could optimize the queue by skipping both operations during the resync. This fails, because that "optimization" does not take into account other operations on the intermediate temporal state.
The two operations that created and deleted state were not the whole story, because other operations could use that state. A good example is temp files. Say a file is created, written to, saved, and deleted. One optimization might be to skip all of those operations, since the net effect is that the file does not exist once they complete. But what if there was an intermediate operation that copied that file elsewhere? Or, more subtly, what if an operation copied a file to that same location, where a collision should cause the copy to fail? If the first file were optimized out, the second copy (which should have failed) would actually succeed, and the mirror would have an extra file. At that point the temporal queue has lost its fidelity and can no longer be relied on for subsequent operations. So this is the failure of state: the failure of a functional set of operations due to the presence of state.
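To make that failure concrete, here is a hypothetical simulation (my own construction, not code from the project): the mirror is a Map, a copy operation fails on collision, and replaying the full queue versus an "optimized" queue with the create/delete pair removed produces different mirrors:

```javascript
// Simulated mirror operations. copyTo fails silently if the target
// already exists (a collision), mimicking a copy-without-overwrite.
const create = (path, data) => m => { m.set(path, data); };
const del    = (path)       => m => { m.delete(path); };
const copyTo = (path, data) => m => {
  if (m.has(path)) return;   // collision: the copy must fail
  m.set(path, data);
};

// Replay a queue of operations, in order, against a fresh mirror.
const replay = ops => {
  const mirror = new Map();
  for (const op of ops) op(mirror);
  return mirror;
};

// Full temporal queue: a temp file is created, a copy to the same
// path collides and fails, then the temp file is deleted.
const fullQueue = [
  create("/work/t.tmp", "scratch"),
  copyTo("/work/t.tmp", "other data"), // fails: file exists
  del("/work/t.tmp"),
];

// "Optimized" queue: the create/delete pair was skipped because its
// net effect looked like a no-op.
const optimizedQueue = [
  copyTo("/work/t.tmp", "other data"), // now succeeds!
];

console.log(replay(fullQueue).has("/work/t.tmp"));      // false
console.log(replay(optimizedQueue).has("/work/t.tmp")); // true: fidelity lost
```

The two replays disagree about the final contents of the mirror, which is exactly the loss of fidelity described above.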
Sticking My Neck Out
That said, I disagree with Uncle Bob's claim (and I can't believe I'm saying that) that the recursive implementation in his talk is stateless. No, it's not. The stateful variable is simply stored on the stack, as a parameter to the recursive call, and the number of calls on the stack defines the state. The data is still there, and it is copied into a new stack variable on each call, even if it's not called a local variable. Parameters are state, and passing them to the same function recursively produces a different result at each level of recursion precisely because each frame carries its own state on the stack as a parameter.
In fact, even the expanded version, with 20 calls, is not really stateless either; rather, the state is stored in the code. It has 20 such stateless calls in it, but that 20 is effectively the parameter to the recursive function. It is not a general functional method because it accepts no input: f() always produces those 20 lines, just as the recursive f(20) produces the same output using recursion.
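As a hypothetical illustration of that point (my own example, not code from the talk), the parameter n below plays exactly the role of a loop counter; each recursive frame holds its own copy of it on the stack, so the state hasn't disappeared, it has just moved:

```javascript
// Recursive version: the state lives in the parameter n,
// copied into a new stack frame on each call.
function linesRecursive(n) {
  if (n === 0) return [];
  return [...linesRecursive(n - 1), `line ${n}`];
}

// Iterative version: the very same state lives in a mutable variable.
function linesIterative(n) {
  const out = [];
  for (let i = 1; i <= n; i++) out.push(`line ${i}`);
  return out;
}

console.log(linesRecursive(3));  // [ 'line 1', 'line 2', 'line 3' ]
console.log(linesIterative(3));  // same output
```

Both versions produce the same output because they track the same state; only where that state is stored differs (stack frames versus a local variable).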
Now, it can be argued that these are stateless since they have no side effects, but that is arguing over the definition of stateless. If I create a function containing console.log("12345"), it is not really stateless to me; it is storing the data in the function code.
This reminds me of Mark Nelson's compression challenge, where one of the rules was that you couldn't store any of the data, not even one byte, in the code that compressed or expanded it.
So those are my thoughts for the day on this talk, at least the first 21:40 of it.