Antifragile Software

Nathan Epstein ・3 min read

Software projects famously suffer from unforeseen complexities that slow development and undermine teams' ability to execute on high-level objectives. In light of this, the desirability of developing "antifragile" software projects that grow stronger through this complexity - as opposed to collapsing under it - should be obvious. What follows are a few principles aimed at achieving this.

Prefer dependency on software with a long history of use

Project requirements are generally complex, incompletely specified, and non-static over time. For the most part, this means that a demonstrated history of usefulness should be weighed more heavily than rationalizations about a technology's value.

Software that survives through a long period of wide use has a demonstrated ability to handle practical complexity beyond that of newer software. This translates directly to a longer expected shelf-life. From this, we get "Lindy effects" where the longer a technology has been used, the longer it is likely to continue to be in use.

When each was released, smart people spun compelling narratives around why web developers would want to adopt Backbone.js, CoffeeScript, Knockout.js, Meteor, Angular, Aurelia, Haml, and an expansive graveyard of forgotten frameworks and tools. And yet, all of these (to varying degrees) have seen their usage wane as they've failed to adequately match the requirements of real software projects.

By way of contrast, SQL has been around for decades and shows no signs of going away any time soon. Through extensive use, features which could easily be rationalized as weaknesses (a generally insecure text-based API for example) have revealed themselves to be strengths (by allowing non-technical business users to explore data without an engineer). Time, and not narrative, is the judge.
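The point about SQL's text-based interface is concrete enough to illustrate. Here is a minimal sketch using Python's built-in sqlite3 module (the `orders` table and its data are invented for the example): the query is just a string, which is exactly what lets a non-engineer compose and run it without touching application code.

```python
import sqlite3

# In-memory database with a small invented table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 30.0)],
)

# The query is plain text -- no API objects, no compilation step.
# A business user could write this line in any SQL console.
query = "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
print(conn.execute(query).fetchall())  # [('east', 150.0), ('west', 80.0)]
```

The same string works, modulo dialect differences, against most SQL databases written over the last several decades - which is the Lindy argument in miniature.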

This seems like an intuitive result. Given even odds, how many people would bet that SQL will be outlasted by a newer and "better" alternative like, say, MongoDB?

Prefer dependency on software that is used by its maintainers

A proven track record is ideal when choosing software, but it's not always an option. If you're picking a JavaScript web framework, for example, essentially all of your options are young projects (except perhaps jQuery, which, if it supports the requirements of your project, is a great choice).

But if you have to go with something new, it's preferable to use software which is used by its authors. This creates a sensitivity to unpredictable challenges and will tend to make the project stronger with time. If the author is using a project, emergent issues will prompt corresponding feature development. Conversely, an author who doesn't actively use their project will develop it to match a preconceived mental model of reality instead of the real thing.

Consider the example of AngularJS vs. React. AngularJS grew quickly, based largely on the excitement surrounding the fact that it was developed by Google. The rationalization was something like "Google has a lot of money and smart people, so their framework will obviously be great". But Google famously didn't use AngularJS for its own projects; the design turned out to be poorly suited to real projects and was abandoned for a wholesale rewrite in the form of Angular 2.

By way of contrast, React is used by Facebook (which maintains the framework). The project has been growing rapidly, is healthily maintained, and appears to be the best current bet for stability within the JS ecosystem.

According to the Stack Overflow Developer Survey (2017 and 2018), the percentage of respondents using Angular dropped from 44.3% to 36.9%, while those using React jumped from 19.5% to 27.8%. This indicates a substantial migration from Angular to React which shows no signs of slowing.

Prefer project owners over issue owners

In organizing software development work, prefer assigning people to high-level projects over low-level tasks. The reason is similar to the argument above for preferring software used by its authors. In completing small incremental tasks, it's easy to introduce technical debt for temporary expedience; project ownership creates incentives to avoid this kind of practice.

A developer tasked with specific, contained features has incentives to trade the long-term health of the project for short-term ease and efficiency of development. A project owner will recognize their own exposure to long-term issues and be inclined to make technical decisions which support the long-term health of the project; new challenges will lead to project growth opportunities - instead of buried time bombs.



Interesting subject.

SQL is simple. The core language is rarely used on its own; implementations usually add support for variables and other extensions that set them apart from the original query language.

It isn't well suited for all common database tasks either. There are good reasons to use column or graph databases too, and they tend to have other kinds of query languages. PicoLisp makes use of a Prolog-like inference engine, while kdb+ is queried through APL-like languages.

Also, being popular isn't much of an end in itself. As a measure of software quality, popularity hasn't been particularly accurate. The JRE is very fragile and Java is arguably among the least pleasurable and effective languages to work with, but it is still one of the most used and most popular in industry.

Personally I'm not so sure non-fragile software is possible on the hardware we're using today. The branch prediction that squeezes out gigahertz and lifted us above the architectures of earlier eras comes with great vulnerabilities, the end of which is nowhere in sight. Some embedded systems are more reliable, of course. But 'the cloud' and mainstream devices are not.

So I would probably recommend using and investing in simple technologies that allow the most expressiveness and the fastest development, to keep the attack surfaces as small and fast-changing as possible, rather than aiming for some development methodology or other, or some fairly arbitrary principles. The JRE will be brittle forever; it is that complex. SQL will always have performance issues in real applications, because those applications are complex and make demands that such a simple and portable query language must handle through extensions it cannot express itself (hence adding complexity, always).

React hasn't been used for long. It is quite ad hoc and tailored to a particular business's needs. This doesn't change because the company uses it; rather, the two are sides of the same coin, and letting project owners rather than issue owners control development doesn't affect it either. The company's business is also what drives their maintenance: if you find a bug in a marginal case that they will never run into, it is entirely arbitrary whether they fix it or not. Or perhaps they restructure their business due to changing markets, and with it their libraries, making breaking changes. You will then need to replace them in your projects, since they are too big and too complex to be rewritten or maintained easily.

Scheme is simple and reliable. So are WAM-based languages, like miniKanren and Prolog. They are flexible, easy to understand, and have endured for ages, much longer than frontend web development has been a distinct job. Still, they aren't used much in industrial applications compared to, say, Java or some other clay colossus that staggers around littering CVEs. This is due to the cost of educating, and thus of replacing, workers, and as it happens, this is also why trends and advertising in software development get such influence over jobs in the industry - rather than, say, the longevity or reliability of application dependencies.