I have come to consider object-oriented programming (at least the sort that has evolved from C++) to be an anti-pattern. An anti-pattern, in case anyone is unfamiliar with the term, is a commonly occurring solution that seems to solve a problem but actually creates more problems than it solves.
This may seem like quite a bold claim, especially given the prevalence of OOP in almost every development shop in the world; hence it requires some explanation.
Let me be very clear on one point: I think OOP as embodied in Smalltalk, Simula and other OOP languages is probably perfectly fine. I say "probably" because I honestly don't have a lot of experience with Smalltalk or Simula or OOP languages preceding C++. Mainly I'm going to focus on C++, Java and C# because those three languages are more similar than they are different (at least in the ways that cause me concern).
Defaults Matter
So, first of all, I'd say that language defaults are extremely important. I say this for a couple of reasons:
1.) I have yet to meet a developer who doesn't have tight deadlines and whose end-users don't want their software three weeks ago. None of us have the time to move off of the language defaults unless we're given an extremely good reason to do so. Hence, although some form of const, readonly, or final exists in all three languages, it's not idiomatic in any of them to use these keywords to any great extent.
2.) Defaults color our thinking. When we have mutable memory by default, we think in terms of mutating values in place; it's natural to think that way. If, on the other hand, the default is immutable, we are forced to (and quickly become accustomed to) copy and then change values. We can opt out of default immutability, but it feels unnatural and "wrong" in some sense (see the sketch just after this list).
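To make that concrete, here's a minimal C# sketch; the Point types are purely illustrative, not from any real codebase. Mutation in place is the zero-effort default, while immutability has to be requested member by member:

```csharp
// Mutable by default: nothing stops callers from changing state in place.
public class Point
{
    public double X;  // a bare field is the path of least resistance
    public double Y;
}

// Opting in to immutability takes extra keywords on every member.
public class ImmutablePoint
{
    public readonly double X;  // 'readonly' must be written out explicitly
    public readonly double Y;

    public ImmutablePoint(double x, double y)
    {
        X = x;
        Y = y;
    }

    // Copy then change: return a new value instead of mutating in place.
    public ImmutablePoint WithX(double newX) => new ImmutablePoint(newX, Y);
}
```

Nothing here is exotic; the point is simply how much extra typing the immutable version costs, which is exactly why deadline-driven developers rarely write it.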
So language defaults are important. Recently it seems as if C# has been pulling features in from F#. In fairness, from what I've seen, Java and C++ have also started to pull functional language features into their specs. But none of them, as far as I know, has modified the most important default of all: none of them has made variables immutable by default. Of course they can't; all three of them have lots and lots of legacy code that would be broken by such a change. And, in fairness, this is not solely a property of OOP; it's a property of most imperative languages.
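Both halves of that claim show up in one small, hypothetical example using C# 9 records (the Person record is invented for illustration): the copy-then-change style has indeed arrived from F#, but the mutable-binding default is untouched.

```csharp
// C# 9 'record' types borrow F#'s copy-then-change style via 'with'.
public record Person(string Name, int Age);

public static class Demo
{
    public static void Main()
    {
        var alice = new Person("Alice", 30);

        // Copy then change: 'with' builds a new Person instead of mutating.
        var older = alice with { Age = 31 };

        // But the binding itself is still mutable by default: reassignment
        // compiles without complaint, and C# (as of this writing) offers no
        // 'readonly' modifier for locals.
        alice = older;

        System.Console.WriteLine(alice); // Person { Name = Alice, Age = 31 }
    }
}
```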
Don't Use Inheritance!
It's very interesting to me that the current thinking (well, not so much current any more, but it's certainly not how things started out) is to avoid inheritance as much as possible. In other words: one of the central pillars of OOP should be avoided.
The advice I've seen over and over is to use interfaces instead. My guess is that the reason for this preference is the simple fact that inheritance leads to pathological couplings. Inheriting from a third-party class which is extremely unlikely to change without some sort of warning is fine, but don't ever, ever, ever inherit from a class written by one of your colleagues, because what happens when that colleague changes their class and doesn't tell you?
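The coupling being described here is commonly known as the fragile base class problem. Here's a hypothetical C# sketch of how it bites; ReportGenerator and its subclass are invented for illustration:

```csharp
// Version 1 of a colleague's class. Generate() happens to call
// LoadData() before Render(), but that ordering is not part of any contract.
public class ReportGenerator
{
    public virtual void Generate()
    {
        LoadData();
        Render();
    }

    protected virtual void LoadData() { /* fetch rows... */ }
    protected virtual void Render()   { /* write output... */ }
}

// A subclass that silently depends on that internal ordering.
public class AuditedReportGenerator : ReportGenerator
{
    private bool _loaded;

    protected override void LoadData()
    {
        base.LoadData();
        _loaded = true;
    }

    protected override void Render()
    {
        // Breaks at runtime if the base class ever stops calling LoadData()
        // first (say, because version 2 of Generate() caches the data).
        if (!_loaded)
            throw new System.InvalidOperationException("data never loaded");
        base.Render();
    }
}
```

Nothing in the subclass changed, yet a purely internal refactoring of the base class would silently break it. An interface, by contrast, never promised anything about call order, so there is nothing of the sort to depend on.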
When it has become almost codified good coding practice simply to avoid one of the main features of a paradigm, one has to question the wisdom of using the paradigm at all.
But It Models Reality!
One of the main defenses I've heard for OOP, when I've chatted on this subject with hard-core OOP supporters, is that OOP models reality much more closely than procedural or functional code does. Since they offer this as a benefit which cannot be realized in functional code, I have a couple of thoughts on it.
First, there's nothing that prevents anyone from building OOP on top of functional programming. Take, for example, Scala, or F#: both have classes and most of the features one would expect from any OOP language. That's part of the reason I said that when I call OOP an anti-pattern I'm specifically discussing C++, Java and C#.
Secondly, "it models reality" is not the large benefit that supporters of OOP think it is. Let's take one of the most common examples of an object--employee. Let's say I've got an employee class in my system. And further let's say that I am maintaining this code and I need to model a team lead. Do I add a field to employee to indicate that someone is a team lead? Do I have a "employee title" field in the employee class which would be set to team lead? If I do, how do I indicate which employees report to this team lead? I'll grant that DBA's run into the same data modeling issues but the point is that saying OOP is better because it "models" reality is hardly a huge benefit. All of us need to make decisions about how to model reality in software. OOP is not superior to any other paradigm in terms of helping us to abstract reality into software.
Future Trends
Finally, the very success of C# and Java may lead to their worst failing. Both languages have arrived at the interesting position of needing to add new features to keep users interested while maintaining backwards compatibility. It doesn't take a large leap of imagination to see both C# and Java, if they keep on their current path of adding features, becoming the same sort of slapdash mess that people rightly criticize C++ for having become. The saying "Everyone uses 10% of C++; the problem is that no one can agree on which 10%" may come to be even more true of C# and Java.
For all these reasons, I'd say we seriously need to ask ourselves, as technical experts, whether it's time to admit that C++, C# and Java are just long, expensive diversions from genuine progress in the art of software development. When the business asks us to recommend languages to build systems in, surely we should have some better justification than "we already know Language X and that's the way we should move into the future."