DEV Community

Discussion on: What computer science concepts should most devs understand, at least a bit?

Ben Halpern

I think "Big O notation" may not be something most devs need to understand with mathematical rigor, but it's a good concept to have a grasp of.

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. Big O is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann–Landau notation or asymptotic notation. The letter O was chosen by Bachmann to stand for Ordnung, meaning the order of approximation.
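To make the definition concrete, here's a toy illustration (my own example, not from the thread): saying f(n) = 3n² + 5n is O(n²) means there's some constant C with f(n) ≤ C·n² for all large enough n.

```python
def f(n):
    # f(n) = 3n^2 + 5n, a function we claim is O(n^2)
    return 3 * n**2 + 5 * n

# Witness constant: 3n^2 + 5n <= 3n^2 + 5n^2 = 8n^2 for all n >= 1,
# so C = 8 works. Spot-check numerically over a range of n.
C = 8
assert all(f(n) <= C * n**2 for n in range(1, 10_000))
```

The lower-order 5n term gets absorbed into the bound, which is why Big O keeps only the dominant term.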

Understanding algorithmic complexity, and being able to estimate and weigh the tradeoffs of different algorithmic approaches, is just a really good skill to have. Your program is always going to run under some kind of constraint and load, and studying algorithm performance is a really solid foundation.
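A classic example of the kind of tradeoff being described (a generic sketch of mine, not something from the thread): checking a list for duplicates by comparing every pair is O(n²), while trading a bit of memory for a set brings it down to O(n) on average.

```python
def has_duplicate_quadratic(items):
    # Compare every pair of elements: O(n^2) time, O(1) extra space.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # Remember what we've seen in a set: O(n) average time, O(n) extra space.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both are correct; which one is appropriate depends on the constraints (input size, memory budget) the program actually runs under, which is exactly the judgment Big O helps you make.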

Certain parts of computer science just aren't going to apply broadly, and others only seem to be labelled "computer science" insofar as they've been around a while, but to me just seem like "programming" (not that the distinction matters all that much). But I think understanding algorithmic complexity stands out as a very good area to dip one's toes into.