Discussion on: The broken promise of static typing

Eljay-Adobe

My feeling -- just my feeling, not backed by any hard data -- is that the most important things are simplicity and writing the source code for maintainability and legibility. That's what Uncle Bob wrote about in his book Clean Code.

Some languages lend themselves to simplicity. For example, I'm impressed with D, Python, Lua, and F# ... all of which have a clean syntax and are rather free of excessive "ceremony". That's why I have a soft spot in my heart for those languages.

But the languages I use that pay the bills are C++ and C#, and I have a love-hate relationship with both of those languages. (More vehemence for C++, because I've been using it for a very long time.)

Bugs can be written in any language. But a language like C++, with so many areas of undefined behavior that are easy to stumble into accidentally, does no one any favors.

Languages that have contract programming, like Eiffel, D, and Ada 2012, make unit testing a lot less important because the contracts can be specified directly in the code instead of being encoded in unit tests. (That's what unit tests do: they express contracts.)
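TypeScript doesn't have contracts in the core language, but a hand-rolled sketch can convey the idea that D's `in`/`out` blocks and Eiffel's require/ensure clauses express natively. The `contract` helper below is hypothetical, not a library API:

```typescript
// Hypothetical assertion helper standing in for language-level contracts.
function contract(condition: boolean, message: string): asserts condition {
  if (!condition) throw new Error(`Contract violated: ${message}`);
}

function sqrt(x: number): number {
  contract(x >= 0, "sqrt requires a non-negative input"); // precondition ("in")
  const result = Math.sqrt(x);
  contract(Math.abs(result * result - x) < 1e-9, "result squared must match input"); // postcondition ("out")
  return result;
}
```

In D or Eiffel those conditions live alongside the function's signature and are checked by the language itself, which is exactly what makes separate unit tests for them less necessary.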

In my experience, statically typed languages -- like Go, C++, D, F#, Swift, TypeScript -- don't provide much better protection against "making bugs" than duck-typed languages like Python, JavaScript, or Boo. What static typing does provide is scaling. Small applications gain little benefit from static typing. But as applications grow, static typing helps to make sure the pieces fit together correctly.

Case in point: when Google's Angular was converted from JavaScript to TypeScript, they discovered that there had been a good number of bugs in their code that were caught once they had the static typing of TypeScript. (TypeScript transpiles to JavaScript, and the type annotation information is erased. It's a transpile-time safety net.)
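Here's a minimal sketch of the kind of bug that slips through plain JavaScript but gets caught at transpile time (the `User` type and `greet` function are made up for illustration):

```typescript
interface User {
  id: number;
  name: string;
}

function greet(user: User): string {
  return `Hello, ${user.name}`;
}

// Plain JavaScript lets this typo through, and you get "Hello, undefined" at runtime:
// greet({ id: 1, nmae: "Ada" });
// TypeScript rejects it: Object literal may only specify known properties,
// and 'nmae' does not exist in type 'User'.
greet({ id: 1, name: "Ada" }); // OK
```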

But I've also worked with a large system based on Objective-C, which has a mix of static type checking and runtime duck typing due to its use of message passing to objects. (The message passing is reminiscent of Smalltalk.)

When I think of duck-typed languages, I usually think of scripting languages. When I want to do something quick-and-dirty, I reach for Python. When I want to make something application-like, I reach for a statically typed compiled language.

But there are languages out there that bridge the two worlds, of sorts. Languages that minimize the ceremony around the static typing, like OCaml, F#, and Swift. They're still strongly typed, but the burden rests more on the shoulders of the compiler, rather than forcing the developer to dot all the i's and cross all the t's.
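TypeScript's inference gives some of that same low-ceremony feel (OCaml and F# go further still, inferring nearly everything). A small illustration:

```typescript
// No type annotations needed; the compiler infers all of the types below.
const numbers = [1, 2, 3, 4];                          // inferred: number[]
const doubled = numbers.map(n => n * 2);               // inferred: number[]
const total = doubled.reduce((sum, n) => sum + n, 0);  // inferred: number

// Still strongly typed: uncommenting this line fails to compile.
// const oops = total.toUpperCase(); // Property 'toUpperCase' does not exist on type 'number'.
```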

So I'd say that static typing catches a small category of bugs. For smaller applications, those kinds of bugs are few. For larger applications, those kinds of bugs can be crippling.

I don't know of any scripting language that supports contract programming as part of the core language. (Educate me if you know of any!)

A vastly bigger source of bugs in the programs I work on is mutable global state. In which I also include mutable member variables in a class instance... that's just global state with a smaller scope. Programs that I've seen and written that emphasize immutability, segregate immutable data from functions, and favor side-effect-free functions seem to produce a lot fewer bugs.
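Roughly what that style looks like in TypeScript (the `Account` type is hypothetical): immutable data on one side, side-effect-free functions on the other, and no shared mutable state to reason about.

```typescript
// Immutable data: the compiler rejects any mutation.
interface Account {
  readonly owner: string;
  readonly balance: number;
}

// Side-effect-free function: returns a new value instead of mutating shared state.
function deposit(account: Account, amount: number): Account {
  return { ...account, balance: account.balance + amount };
}

const before: Account = { owner: "Ada", balance: 100 };
const after = deposit(before, 50);
// before.balance = 0; // Error: Cannot assign to 'balance' because it is a read-only property.
console.log(before.balance, after.balance); // 100 150
```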

I'm not sure if the "fewer bugs" I'm seeing is because I'm a better programmer with those kinds of languages, or if I make fewer bugs in those languages because it's easier to reason about the correctness of the code. I don't think it has to do with all those languages being statically typed. I believe it has to do with the simplicity that immutable data and a lack of global state bring.

Another vast source of bugs I've run into is null pointers. (Damn you, Tony Hoare, for adding the null reference to ALGOL W!) That's another area where Haskell, F#, OCaml, and Swift outshine C, C++, and C#. Objective-C sort of sidestepped the problem with its treatment of the nil object quietly eating messages (well, almost quietly... the eaten message is output to the console log).
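The option-type approach those languages take can be approximated in TypeScript with strictNullChecks, where possible absence has to appear in the type, much like Haskell's Maybe or the option types in F#, OCaml, and Swift. The `findUser` lookup below is made up for illustration:

```typescript
// With strictNullChecks on, "may be absent" must be spelled out in the type.
function findUser(id: number): string | null {
  return id === 1 ? "Ada" : null; // hypothetical lookup
}

const userName = findUser(2);
// console.log(userName.length); // Error: 'userName' is possibly 'null'.
if (userName !== null) {
  console.log(userName.length); // OK: the compiler has narrowed 'userName' to string.
}
```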