Top comments (8)
I think "Big O Notation" may not be something most devs need to understand scientifically, but it's a good concept to have a grasp of.
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. Big O is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann–Landau notation or asymptotic notation. The letter O was chosen by Bachmann to stand for Ordnung, meaning the order of approximation.
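For anyone who wants the precise version of "limiting behavior", the standard textbook definition (the usual statement, not something from this thread) is:

```latex
f(x) = O\bigl(g(x)\bigr) \ \text{as}\ x \to \infty
\quad\Longleftrightarrow\quad
\exists\, M > 0,\ \exists\, x_0 \ \text{such that}\ |f(x)| \le M\,|g(x)| \ \text{for all}\ x \ge x_0
```

In other words, past some point, f never grows more than a constant factor faster than g.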
Understanding algorithmic complexity, and being able to estimate and account for the tradeoffs of different algorithmic approaches, just seems like a really valuable skill. Your program is always going to have to run under some kind of constraint and load, and studying algorithm performance is a really solid foundation.
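To make that concrete, here's a minimal sketch (my own example, not from the comment) of two approaches to the same problem, checking an array for duplicates:

```typescript
// O(n^2): compare every pair of elements.
function hasDuplicateQuadratic(items: number[]): boolean {
  for (let i = 0; i < items.length; i++) {
    for (let j = i + 1; j < items.length; j++) {
      if (items[i] === items[j]) return true;
    }
  }
  return false;
}

// O(n) time, O(n) extra space: trade memory for speed with a Set.
function hasDuplicateLinear(items: number[]): boolean {
  const seen = new Set<number>();
  for (const item of items) {
    if (seen.has(item)) return true;
    seen.add(item);
  }
  return false;
}
```

At ten items the difference is invisible; at a million, the quadratic version performs on the order of 10^12 comparisons while the linear one does about 10^6, at the cost of extra memory for the Set. That memory-for-time tradeoff is exactly the kind of constraint-and-load reasoning described above.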
Certain parts of computer science just aren't going to apply broadly, and others only seem to be labelled "computer science" insofar as they've been around a while; to me they just seem like "programming" (not that the distinction matters all that much). But I think understanding algorithmic complexity stands out as a very good area to dip one's toes into.
In my opinion, this includes a programming language like C++ or Java; essential computer science concepts like data structures, algorithms, and computer networking basics; essential tools like Git, Microsoft Word, and Excel; skills like SQL and UNIX; and IDEs like Eclipse or Visual Studio, plus text editors.
Details may not be important, but every programmer should know how computers work and have at least a basic grasp of what operations the computer has to perform to execute any given code. This fundamental knowledge seems lacking in a lot of modern devs who started in dynamic languages like JavaScript, where they will happily add half a dozen extra arrays and array operations to get the code onto one line and then act like they did a good thing.
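To illustrate the pattern (a hypothetical example of my own, not from the comment): the chained one-liner below allocates two intermediate arrays that a single-pass loop avoids entirely:

```typescript
const orders = [12.5, 99.0, 7.25, 50.0];

// One-liner: each chained call allocates a new intermediate array.
const totalChained = orders
  .filter((amount) => amount > 10) // intermediate array #1
  .map((amount) => amount * 1.2)   // intermediate array #2
  .reduce((sum, amount) => sum + amount, 0);

// Single pass: same result, no intermediate allocations.
let totalLoop = 0;
for (const amount of orders) {
  if (amount > 10) totalLoop += amount * 1.2;
}
```

Both compute 193.8 here; the point is knowing what the machine is being asked to do behind the one-liner.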
Programming in C. The basics of how an OS manages processes. The basics of how a filesystem works.
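The C and filesystem parts don't fit a thread snippet neatly, but the process basics can at least be sketched from Node/TypeScript (assuming a Node runtime and a POSIX-ish system where `ls` exists):

```typescript
import { spawnSync } from "node:child_process";

// Every running program is a process with an id, a parent,
// and an exit status -- the OS bookkeeping referred to above.
console.log(`this process: pid=${process.pid}, ppid=${process.ppid}`);

// Spawning a child: the OS sets up a new process, runs the program,
// and reports its exit code back to the parent.
const child = spawnSync("ls", ["-l"], { encoding: "utf8" });
console.log(`child exited with code ${child.status}`);
console.log(child.stdout);
```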
Binary operations. Especially in the front-end, even though JS supports them, they are woefully underused and oftentimes not even known or understood.
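For instance, here's a minimal sketch of bitwise flags in TypeScript (the flag names are made up for illustration); note that JS bitwise operators work on 32-bit integers:

```typescript
// Bitwise flags: pack several booleans into one number.
const CAN_READ  = 1 << 0; // 0b001
const CAN_WRITE = 1 << 1; // 0b010
const CAN_EXEC  = 1 << 2; // 0b100

let perms = CAN_READ | CAN_WRITE;           // set two flags
const canWrite = (perms & CAN_WRITE) !== 0; // test a flag -> true
perms &= ~CAN_WRITE;                        // clear a flag
perms ^= CAN_EXEC;                          // toggle a flag
```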
Theory of Computation. As you said, at least a bit. This bit is important.
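As a taste of what "a bit" might look like, here's a sketch (my own example) of about the smallest object in computation theory, a deterministic finite automaton:

```typescript
// A DFA that accepts binary strings containing an even number of 1s.
type State = "even" | "odd";

const transition: Record<State, Record<"0" | "1", State>> = {
  even: { "0": "even", "1": "odd" },
  odd:  { "0": "odd",  "1": "even" },
};

function accepts(input: string): boolean {
  let state: State = "even"; // start state
  for (const ch of input) {
    if (ch !== "0" && ch !== "1") throw new Error(`bad symbol: ${ch}`);
    state = transition[state][ch];
  }
  return state === "even"; // accepting state
}

accepts("1011"); // false -> three 1s
accepts("1001"); // true  -> two 1s
```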
This is a pretty popular topic hereabouts :) and there are plenty of posts from this search that you might like to summarise:
dev.to/search?q=computer%20science
I don't think any of these is a must-have, but a basic understanding of them can help a lot (a quick sketch tying all three together follows the list):
- Big O
- Data structures
- Graph theory
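Here's the sketch mentioned above (my own example) tying the three together: breadth-first search over an adjacency list visits every vertex and edge once, so it runs in O(V + E), and the queue and visited set are the data structures that make that bound possible:

```typescript
// Breadth-first search over an adjacency-list graph.
type Graph = Map<string, string[]>;

function bfs(graph: Graph, start: string): string[] {
  const visited = new Set<string>([start]); // data structure #1
  const queue: string[] = [start];          // data structure #2
  const order: string[] = [];

  while (queue.length > 0) {
    // Note: shift() is O(n); a real queue would keep this O(V + E) overall.
    const node = queue.shift()!;
    order.push(node);
    for (const neighbor of graph.get(node) ?? []) {
      if (!visited.has(neighbor)) {
        visited.add(neighbor);
        queue.push(neighbor);
      }
    }
  }
  return order;
}

const g: Graph = new Map([
  ["a", ["b", "c"]],
  ["b", ["d"]],
  ["c", ["d"]],
  ["d", []],
]);
bfs(g, "a"); // ["a", "b", "c", "d"]
```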