"It is crucial for JavaScript developers to understand how data structures work and how to design algorithms to build applications in the 21st century."
"JavaScript is like water. It can adapt to fit almost any shape. If you put JS on the front end, it can become the front end. If you put JS on the backend, it can become the backend. Now, JavaScript can execute, or it can crash. Be like JavaScript, my friend." (Adrian Twarog, @adrian_twarog)
Introduction (☝️)
The motivation (🔥🔥) for writing this was the lack of available information about data structures and algorithms written in JavaScript. As you can see, most data structure and algorithm books are written in Java and C++:
10 Best Books for Data Structure and Algorithms for Beginners in Java, C/C++, and Python (javinpaul)
I found this strange (😑) because many of the job opportunities for software engineering and development today require JavaScript knowledge; it is the only language that can be used to write the entire stack, including the front end and back end. (👌)
JavaScript follows the prototypal inheritance pattern, unlike Java and C++, which follow the classical inheritance pattern (sorry, Java and C/C++ 😅), so there are some differences in how data structures are written in JavaScript. The classical inheritance pattern allows inheritance by creating a blueprint-like class that objects follow during inheritance. Here is an example of inheritance in Java:
👉 Inheritance in Java:
class Teacher {
    String designation = "Teacher";
    String collegeName = "Beginnersbook";

    void does() {
        System.out.println("Teaching");
    }
}

public class PhysicsTeacher extends Teacher {
    String mainSubject = "Physics";

    public static void main(String args[]) {
        PhysicsTeacher obj = new PhysicsTeacher();
        System.out.println(obj.collegeName);
        System.out.println(obj.designation);
        System.out.println(obj.mainSubject);
        obj.does();
    }
}
👉 Output:
Beginnersbook
Teacher
Physics
Teaching
However, in the prototypal inheritance pattern, new objects are created from existing objects (📝📝), and their properties are then changed or extended. Here is an example of prototypes in JavaScript:
👉 Inheritance in JavaScript:
var animal = {
    legs: 4,
    walk: function() {
        console.log('walking on ' + this.legs + ' legs');
    }
};

var bird = Object.create(animal);
bird.legs = 2;
bird.fly = function() {
    console.log('flying');
};

bird.walk(); // walking on 2 legs
bird.fly();  // flying
👉 Output:
walking on 2 legs
flying
Part 01: Big-O Notation (😱 🔥 ⚡)
By the end of this chapter, you will understand how to analyze an implementation of an algorithm with respect to both time (execution time) and space (memory consumed).
Big-O Notation Primer (😮)
The Big-O notation measures the complexity of an algorithm. In Big-O notation, n represents the number of inputs. The question is: “What will happen as n approaches infinity? (❓❓)” Big-O notation tells you how efficient the algorithm is.

Figure 10. Common Big-O Complexities (Live)

Figure 11. Common Big-O Complexities (Static)
Figures 10 and 11 illustrate these common time complexities in live and static form.
Common Examples (🤘)
See Figure 11: ${O(1)}$ does not change with input size; hence, ${O(1)}$ is referred to as constant time. ${O(n)}$ is linear time. An example of an ${O(n)}$ algorithm is printing the numbers from 0 to ${n-1}$, as shown here:
Linear Time: (🕙)
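A minimal sketch of such a linear-time function (the name linearTime is an assumption; it returns the printed numbers so its behavior is easy to check):

```javascript
// Linear time, O(n): the loop body runs exactly once per input element.
function linearTime(n) {
    var printed = [];
    for (var i = 0; i < n; i++) {
        console.log(i);
        printed.push(i);
    }
    return printed;
}

linearTime(4); // prints 0, 1, 2, 3
```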
Similarly, ${O(n^2)}$ is quadratic time, and ${O(n^3)}$ is cubic time. Examples of these complexities are shown here:
Quadratic Time: (🕔)
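A sketch of quadratic time (the name quadraticTime is an assumption): two nested loops that each run n times, so the inner body executes n × n times:

```javascript
// Quadratic time, O(n^2): for each of the n outer iterations,
// the inner loop also runs n times.
function quadraticTime(n) {
    var operations = 0;
    for (var i = 0; i < n; i++) {
        for (var j = 0; j < n; j++) {
            console.log(i, j);
            operations += 1;
        }
    }
    return operations;
}

quadraticTime(3); // logs 9 pairs and returns 9
```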
Cubic Time: (🕣)
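And a sketch of cubic time (the name cubicTime is an assumption): three nested loops of n iterations each, so the body runs n × n × n times:

```javascript
// Cubic time, O(n^3): three nested loops of n iterations each.
function cubicTime(n) {
    var operations = 0;
    for (var i = 0; i < n; i++) {
        for (var j = 0; j < n; j++) {
            for (var k = 0; k < n; k++) {
                console.log(i, j, k);
                operations += 1;
            }
        }
    }
    return operations;
}

cubicTime(2); // logs 8 triples and returns 8
```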
Finally, another example, logarithmic time complexity, is printing the elements that are a power of 2 between 2 and n. For example, logarithmic_time(100) will print the following:
2,4,8,16,32,64
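The loop behind logarithmic_time can be sketched like this (this body is my reconstruction, returning the printed values for checkability); because i doubles on every iteration, the loop body runs only about log2(n) times:

```javascript
// Logarithmic time, O(log n): i doubles on every iteration,
// so the loop executes roughly log2(n) times.
function logarithmic_time(n) {
    var printed = [];
    for (var i = 2; i <= n; i *= 2) {
        console.log(i);
        printed.push(i);
    }
    return printed;
}

logarithmic_time(100); // prints 2, 4, 8, 16, 32, 64
```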
Rules of Big-O Notation
Let’s represent an algorithm’s complexity as ${f(n)}$. ${n}$ represents the number of inputs, ${f(n)_{time}}$ represents the time needed, and ${f(n)_{space}}$ represents the space (additional memory) needed for the algorithm. It can be challenging to calculate ${f(n)}$, but Big-O notation provides some fundamental rules that help developers compute it:
- Coefficient rule: If ${f(n)}$ is ${O(g(n))}$, then ${kf(n)}$ is ${O(g(n))}$, for any constant ${k>0}$.
- Sum rule: If ${f(n)}$ is ${O(h(n))}$ and ${g(n)}$ is ${O(p(n))}$, then ${f(n)+g(n)}$ is ${O(h(n)+p(n))}$.
- Product rule: If ${f(n)}$ is ${O(h(n))}$ and ${g(n)}$ is ${O(p(n))}$, then ${f(n)g(n)}$ is ${O(h(n)p(n))}$.
- Transitive rule: If ${f(n)}$ is ${O(g(n))}$ and ${g(n)}$ is ${O(h(n))}$, then ${f(n)}$ is ${O(h(n))}$.
- Polynomial rule: If ${f(n)}$ is a polynomial of degree ${k}$, then ${f(n)}$ is ${O(n^k)}$.
- Log of a power rule: ${log(n^k)}$ is ${O(log(n))}$ for any constant ${k>0}$.
Note: Don't try to master all of them for now. Just pay special attention to the first three rules and the polynomial rule, because they are the most commonly used. I’ll discuss each of those rules in the following sections.
Coefficient Rule: “Get Rid of Constants” (💯)
This rule simply requires you to ignore any constants that are not related to the input size. Coefficients (or constants) in Big-O are negligible with large input sizes. Therefore, this is the most important rule of Big-O notation.
If ${f(n)}$ is ${O(g(n))}$ , then ${kf(n)}$ is ${O(g(n))}$ , for any constant ${k>0}$ .
This means that both ${kf(n)}$ and ${f(n)}$ have the same Big-O notation of ${O(g(n))}$ when ${k}$ is a constant. Here is an example of a code block for ${f(n)}$:
function a(n) {
    var count = 0;
    for (var i = 0; i < n; i++) {
        count += 1;
    }
    return count;
}
This block of code has ${f(n)=n}$ because it adds to count ${n}$ times. Therefore, this function is ${O(n)}$. Here's another example code block, this time for ${kf(n)}$:
function a(n) {
    var count = 0;
    for (var i = 0; i < 5 * n; i++) {
        count += 1;
    }
    return count;
}
This block has ${kf(n)=5n}$. By the coefficient rule above, the first two examples both have a Big-O notation of ${O(n)}$, that is, ${O(g(n))}$. The following code block demonstrates another function with linear time complexity, but with one additional operation (count+=1) after the loop:
function a(n) {
    var count = 0;
    for (var i = 0; i < n; i++) {
        count += 1;
    }
    count += 1;
    return count;
}
This block of code has ${f(n)=n+1}$. The ${+1}$ comes from the last operation (count+=1). This still has a Big-O notation of ${O(n)}$, because that single operation does not depend on the input ${n}$, as we said earlier.
Sum Rule: “Add Big-Os Up” (➕)
The sum rule is easy to understand: time complexities can be added. Imagine a master algorithm (the function ${a(n)}$) that involves two other algorithms (the two for loops). The Big-O notation of that master algorithm is simply the sum of the other two Big-O notations.
If ${f(n)}$ is ${O(h(n))}$ and ${g(n)}$ is ${O(p(n))}$ , then ${f(n)+g(n)}$ is ${O(h(n)+p(n))}$
Note: It is important to remember to apply the coefficient rule after applying this rule.
The following code block demonstrates a function with two main loops whose time complexities must be considered independently:
function a(n) {
    var count = 0;
    for (var i = 0; i < n; i++) {
        count += 1;
    }
    for (var i = 0; i < 5 * n; i++) {
        count += 1;
    }
    return count;
}
In this example, ${f(n)=n}$ and ${g(n)=5n}$, so ${f(n)+g(n)=6n}$ (that is, ${h(n)+p(n)=6n}$). However, when applying the coefficient rule, the final result is ${O(n)}$, because both loops abide by the coefficient rule: if ${f(n)}$ is ${O(g(n))}$, then ${kf(n)}$ is ${O(g(n))}$.
Product Rule: “Multiply Big-Os” (❌)
The product rule simply states how Big-Os can be multiplied.
If ${f(n)}$ is ${O(h(n))}$ and ${g(n)}$ is ${O(p(n))}$ , then ${f(n)g(n)}$ is ${O(h(n)p(n))}$
The following code block demonstrates a function with two nested for loops (nesting is where the product rule applies):
function a(n) {
    var count = 0;
    for (var i = 0; i < n; i++) {
        count += 1;
        for (var j = 0; j < 5 * n; j++) {
            count += 1;
        }
    }
    return count;
}
In this example, ${f(n)=5n \times n}$ because the inner loop runs ${5n}$ times for each of the ${n}$ iterations of the outer loop. Therefore, this results in a total of ${5n^2}$ inner operations (plus ${n}$ from the outer loop, which is negligible). Applying the coefficient rule, the result is ${O(n^2)}$.
Polynomial Rule: “Big-O to the Power of k” (📈)
The polynomial rule states that a polynomial time complexity has a Big-O notation of the same polynomial degree.
If ${f(n)}$ is a polynomial of degree ${k}$ , then ${f(n)}$ is ${O(n^k)}$
The following code block has only one for loop, but it has quadratic time complexity because it iterates n*n times (equivalent to two nested loops):
function a(n) {
    var count = 0;
    for (var i = 0; i < n * n; i++) {
        count += 1;
    }
    return count;
}
In this example, ${f(n)=n^2}$ because the loop runs n*n iterations, which is ${n^2}$, in accordance with the polynomial rule: if ${f(n)}$ is a polynomial of degree ${k}$, then ${f(n)}$ is ${O(n^k)}$.
Summary (📚)
Big-O is important for analyzing and comparing the efficiencies of algorithms. The analysis of Big-O starts by looking at the code and applying the rules to simplify the notation; a single rule on its own is usually not enough. The following are the most often used rules:
- Eliminating constants
- Adding up Big-O notations
- Multiplying Big-O notations
- Determining the polynomial degree of the Big-O notation
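To tie the rules together, here is a small worked example (the function combined is my own illustration, not from the chapter). Its exact operation count can be simplified to a Big-O class by applying the sum, coefficient, and polynomial rules in turn:

```javascript
// f(n) = 3n^2 (nested loops) + 2n (single loop) + 1 (final increment).
// Sum rule:         O(3n^2 + 2n + 1)
// Coefficient rule: drop the constants 3, 2, and 1
// Polynomial rule:  keep only the highest-degree term -> O(n^2)
function combined(n) {
    var count = 0;
    for (var i = 0; i < n; i++) {          // outer loop: n iterations
        for (var j = 0; j < 3 * n; j++) {  // inner loop: 3n iterations each
            count += 1;                    // runs 3n^2 times in total
        }
    }
    for (var k = 0; k < 2 * n; k++) {      // second loop: 2n iterations
        count += 1;
    }
    count += 1;                            // one extra constant operation
    return count;                          // 3n^2 + 2n + 1
}

combined(2); // returns 12 + 4 + 1 = 17
```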
Discussion
Learn Data Structure & Algorithm in JavaScript, Part 02 (Edison Pebojot, Jul 1)