edA‑qa mort‑ora‑y

Originally published at mortoray.com

Invented here syndrome

Are you afraid to write code? Does the thought linger in your brain that somewhere out there somebody has already done this? Do you find yourself trapped in an analysis cycle where nothing is getting done? Is your product mutating to accommodate third-party components? If yes, then perhaps you are suffering from invented-here syndrome.

Most of us are aware of not-invented-here syndrome, but the opposite problem is perhaps equally troublesome. We can get stuck in the mindset that there must be a product, library, or code sample that already does what we want. We spend a lot of time testing out modules and trying to jam them into our system. At some point, we need to say, "stop!", and write the code ourselves.

Upon rereading this article, I'm wondering how much of this problem can be attributed to imposter syndrome. I talk about this in my book, where I also look at the skills of a good programmer. Perhaps it's easy to look at other people's code and assume they have it all figured out, while we ourselves don't? That would push us away from developing our own solution.

Varying levels of quality

As a general rule, we shouldn't write code that already exists. Often the choice is easy. We pick up a standard product, lump it in with our system, and it works as promised. These situations are delightful and do happen often enough.

Past this come the less-than-ideal choices. Perhaps a library covers only 95% of the requirements, the configuration is a bit more involved, or the purpose has to be twisted a little. It'll work, but not without some effort, or without rethinking a few requirements.

At some level, we enter the realm of crappy products. These are things advertised to do what we need but that fail utterly at the task. Or perhaps they do something in a completely different way than expected. Maybe the integration is extremely troublesome. Perhaps the product is too bug-ridden to be trusted, or the documentation is so bad that proper usage is a mystery.

From what I've seen, the vast majority of coding products, snippets, and libraries fall into this category of crappy software. Being available in a package manager, being downloaded by thousands of people, or having a fancy web page is no indication of a good product. It's trivial to publish programming products. Chances are, for any obscure requirement there is already some matching product. Just because it's there doesn't mean it should be used.

I should point out that the vast majority of any project is composed of standard products. Consider the operating system, the compiler, the file system, the shell, and the build system. Add to this the built-in libraries like SSL and HTTP, or even the browser that renders the HTML. The fear that a project is not using enough standard components is generally unfounded.

It's about the time

When we search for suitable products, it shouldn't take long to discover whether any exist. We will find modules, but either they are less than ideal or just crappy. As we add new search terms, we either come up empty or get the same results. Lists of popular, seemingly suitable products don't include any that really fit. That's it. The search is exhausted.

I'm not saying we should immediately jump from here to writing our own module. No, now is the point where evaluation becomes essential. Time is usually the critical factor here. How long will it take to write a custom component? How long will it take to adapt one of the existing products?

I feel that a less-than-suitable third-party product must save an order of magnitude of time over writing the component myself before it's worthwhile. Third-party code comes with a lot of open questions, regardless of how well it has been evaluated. That uncertainty must be factored into our consideration.

Getting stuck in analysis paralysis is very bad. I have no problem writing code before completing the analysis; I consider it a valid form of evaluation. Often I'm not confident about what I need until I've attempted an implementation. Stalling a project can be disastrous. Perhaps my quick code is enough for now, and the decision can be deferred until later.

It's about the features

It's easy to fall into the trap of features. The full feature set of a product can seem impressive and alluring. But who cares about everything a product can do? Our project has a specific list of requirements, and those are the only ones we should care about.

This is where a lot of popular products falter. They offer a complete package, but we aren't looking for a complete package. They have the feature we want, but there's no way to extract it efficiently. We need to view the individual features on their own. This is relevant to the time consideration. Clearly, product Xyz would take millions of man-hours to recreate, but perhaps module Q will only take a few days.

As a second aspect, consider how vital a requirement is to our own product. Compromising our key selling points will doom the project to failure. There's no value in saving time if it doesn't result in the intended product. We have to face reality sometimes: getting the desired feature set may involve writing a lot of code. We're programmers, though, so that shouldn't scare us.

Warning! Being delusional here is not helpful, and it can often lead to genuine not-invented-here syndrome. Features can be realized in many different ways; do we really need one exactly as we first envisioned it? The time invested in writing our own version has to be related to how critical that feature really is. It's best to involve marketing or product management in such decisions. It can be easy at times to lose sight of what is truly important.

Write some code

Fretting over a selection of inadequate products is not productive. While it's entirely reasonable to avoid not-invented-here syndrome, becoming overly frightened of writing code can land a project in a production quagmire. It shouldn't be surprising that software development involves coding.

A great mass of the libraries, modules, code, and other software products available on the net is either terrible in its own right or simply not suitable for our project. Forcing things to work together can be costlier than writing the required code on our own.

And what if coding turns out to be the wrong choice? Well, that's part of development. It's entirely possible that attempting our own component leads us to find the correct third-party product. Since we're following good design practices, it won't be a huge problem to swap parts in and out as desired.
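To make that concrete, here's a minimal C++ sketch (all names are hypothetical, not from any real project): the rest of the code depends on a small interface we own, so a quick in-house implementation can later be swapped for a third-party product, or vice versa, without disturbing anything else.

```cpp
#include <memory>
#include <string>

// The rest of the project talks to this small interface, not to any
// specific implementation.
class MarkupParser {
public:
    virtual ~MarkupParser() = default;
    virtual std::string titleOf(const std::string& document) const = 0;
};

// A quick in-house version: good enough to keep the project moving.
class NaiveParser : public MarkupParser {
public:
    std::string titleOf(const std::string& document) const override {
        const auto open = document.find("<title>");
        const auto close = document.find("</title>");
        if (open == std::string::npos || close == std::string::npos)
            return {};
        const auto start = open + 7; // skip past "<title>"
        if (close < start)
            return {};
        return document.substr(start, close - start);
    }
};

// Swapping in a third-party parser later only changes this factory,
// not the code that uses MarkupParser.
std::unique_ptr<MarkupParser> makeParser() {
    return std::make_unique<NaiveParser>();
}
```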


Do you want to learn how to overcome the uncertainty? Read my book, What is Programming? I provide a roadmap of the skills you need to be a great programmer and improve your confidence in making the right decisions.

Top comments (2)

Jason C. McDonald

Two stories from the development annals of MousePaw Media that go right along with this:

(1) We needed to implement tests for C++, but I had two unusual requirements:

  • Embed tests in production binaries, so they can be run on any machine without special frameworks or tools,
  • An embedded benchmarker, to catch performance regressions.

I knew that the major testing frameworks didn't have the first option; they required test harnesses and the like. I also knew that other, less ubiquitous libraries might have both features, but I didn't want to waste dozens or hundreds of hours researching all the obscure testing frameworks and then take my chances learning something that probably lacked sufficient documentation. (Learning is non-transferable, so I'd have to train the rest of my staff, mostly interns, on it too!)

Since I already knew what I wanted, I simply wrote my own testing framework for C++. Two years later, it's an investment I'm glad I made! I've been able to add features as we need them, and the benchmarker has proven itself incredibly useful in writing efficient code.

I've been mocked on occasion for writing a new testing framework, instead of just using something someone else wrote, but the payoff has been clear for me. Goldilocks is well-documented, easy to introduce my interns to, and well-suited to expansion as our testing needs evolve.


(2) On the flip side, we've also had cases where we started coding something, only to discover that we didn't need it after all!

We needed an XML parser for C++, so I assigned one of the interns to find one and get a proof-of-concept put together. (For anyone unfamiliar with the idea, this sort of proof-of-concept is a simple, bare-bones project that uses the target technology or library, and few or no other dependencies. It allows you to discover and solve basic implementation problems before you hit them in production, which is a huge time saver!)

The intern researched the topic, found that Xerces is almost ubiquitously recommended, and tried to work with it. It was obtuse, overpowered, and poorly documented. He checked for alternatives, as did another staff member and I, and Xerces seemed to be it, so we decided to create a wrapper for that library that would simplify its use.

About a year into it, we discovered another library that had utterly escaped our research: pugixml, which could do what we needed with a sane, well-documented API. We dropped Xerces and ended the wrapper project at that point.
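To give a sense of what "sane" means here, a rough sketch of typical pugixml usage (the file name and element names are made up for illustration, not taken from our code base):

```cpp
#include <iostream>
#include <pugixml.hpp>

int main() {
    pugi::xml_document doc;
    pugi::xml_parse_result result = doc.load_file("settings.xml");
    if (!result) {
        std::cerr << "Parse error: " << result.description() << "\n";
        return 1;
    }

    // Iterate over <setting> elements under the <settings> root.
    for (pugi::xml_node setting : doc.child("settings").children("setting")) {
        std::cout << setting.attribute("name").value() << " = "
                  << setting.child_value() << "\n";
    }
    return 0;
}
```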

edA‑qa mort‑ora‑y

I spent a lot of effort getting the Boost test framework to function as I wanted in C++. I don't think it was worth it in the end. In another project I wrote my own, and it went more quickly and smoothly.

These are absolutely the types of situations I'm talking about. They are more common than one might think.

But I curse you for bringing up the name Xerces again. I had successfully erased memory of that project, and now it's back!