
Learn Data Structure & Algorithm in JavaScript | Part 01

Edison Pebojot ・ 8 min read

Introduction (โ˜๏ธ)

The motivation (🔥🔥) for writing this was the lack of available information about data structures and algorithms written in JavaScript. As you can see, most data structures and algorithms books are written in Java or C++:

I found this strange (😑) because many software engineering and development jobs today require JavaScript knowledge; it is the only language that can be used to write the entire stack, front end and back end alike. (👌)

JavaScript follows the prototypal inheritance pattern, unlike Java and C++, which follow the classical inheritance pattern (sorry Java and C++ 😅), so there are some differences in how data structures are written in JavaScript. Classical inheritance works through a blueprint-like class that objects follow when they inherit. Here is an example of inheritance in Java:

๐Ÿ‘‰ Inheretance in Java:

class Teacher {
   String designation = "Teacher";
   String collegeName = "Beginnersbook";
   void does(){
    System.out.println("Teaching");
   }
}

public class PhysicsTeacher extends Teacher{
   String mainSubject = "Physics";
   public static void main(String args[]){
    PhysicsTeacher obj = new PhysicsTeacher();
    System.out.println(obj.collegeName);
    System.out.println(obj.designation);
    System.out.println(obj.mainSubject);
    obj.does();
   }
}

๐Ÿ‘‰ Output:

Beginnersbook
Teacher
Physics
Teaching

However, the prototypal inheritance pattern works by copying existing objects (📝📝) and changing their properties. Here is an example of prototypal inheritance in JavaScript:

๐Ÿ‘‰ Inheretance in JavaScript:

var animal = {
    legs: 4,
    walk: function() {
        console.log('walking on ' + this.legs + ' legs');
    }
}

var bird = Object.create(animal); // new object with animal as its prototype

bird.legs = 2;
bird.fly = function() {
    console.log('flying');
}

bird.walk(); // walking on 2 legs
bird.fly(); // flying

๐Ÿ‘‰ Output:

walking on 2 legs
flying
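Incidentally, modern JavaScript (ES2015+) also offers a class syntax, which is syntactic sugar over the same prototype mechanism. Here is a rough sketch of the earlier Java example rewritten in that style (the field names mirror the Java code; this is my translation, not from the original post):

```javascript
class Teacher {
  constructor() {
    this.designation = 'Teacher';
    this.collegeName = 'Beginnersbook';
  }
  does() {
    console.log('Teaching');
  }
}

// extends wires up the prototype chain, much like Java's extends keyword
class PhysicsTeacher extends Teacher {
  constructor() {
    super(); // run Teacher's constructor first
    this.mainSubject = 'Physics';
  }
}

const obj = new PhysicsTeacher();
console.log(obj.collegeName); // Beginnersbook
console.log(obj.designation); // Teacher
console.log(obj.mainSubject); // Physics
obj.does();                   // Teaching
```

Under the hood, `PhysicsTeacher.prototype` still delegates to `Teacher.prototype`, so this is prototypal inheritance wearing classical clothes.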

Part 01: Big-O Notation (๐Ÿ˜ฑ ๐Ÿ”ฅ โšก)

By the end of this chapter, you will understand how to analyze an implementation of an algorithm with respect to both time (execution time) and space (memory consumed).

Big-O Notation Primer (๐Ÿ˜ฎ)

The Big-O notation measures the complexity of an algorithm. In Big-O notation, n represents the number of inputs. The question is: โ€œWhat will happen as n approaches infinity? (โ“โ“)โ€ Big-O notation tells you how efficient the algorithm is.


Figure 1-0. Common Big-O Complexities (Live)


Figure 1-1. Common Big-O Complexities (Static)

Figures 1-0 and 1-1 illustrate these common time complexities, as a live demo and a static chart respectively.

Common Examples (๐Ÿค˜)

See Figure 1-1: O(1) does not change as the input grows, hence O(1) is referred to as constant time. O(n) is linear time: the work grows in direct proportion to n. An example of an O(n) algorithm is printing the numbers from 0 to n-1, as shown here:

Linear Time: (๐Ÿ•™)

// linear time code goes here
const linear_time = (n) => {
  for (let i = 0; i < n; i++){
    console.log(i)
  }
}
linear_time(10)
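For contrast, here is a minimal sketch of an O(1) (constant-time) operation; the helper name `constant_time` is mine, not from the original. No matter how large the array is, it always performs a single step:

```javascript
// constant time: one array access regardless of how large the input is
const constant_time = (arr) => arr[0];

console.log(constant_time([5, 10, 15]));                // 5
console.log(constant_time(new Array(1000000).fill(7))); // 7 — still one step
```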

Similarly, O(n^2) is quadratic time, and O(n^3) is cubic time. Examples of these complexities are shown here:

Quadratic Time: (๐Ÿ•”)

// quadratic time code goes here
const quadratic_time = (n) => {
  for (let i = 0; i < n; i++){
    console.log(i)
    for (let j = i; j < n; j++){
      console.log(j)
    }
  }
}
quadratic_time(10)

Cubic Time: (๐Ÿ•ฃ)

// cubic time code goes here
const cubic_time = (n) => {
  for (let i = 0; i < n; i++){
    console.log(i)
    for (let j = i; j < n; j++){
      console.log(j)
      for (let k = j; k < n; k++){
        console.log(k)
      }
    }
  }
}
cubic_time(10)

Finally, another example is logarithmic time complexity: printing the elements that are a power of 2 between 2 and n. For example, logarithmic_time(100) will print the following:

2,4,8,16,32,64
const logarithmic_time = (n) => {
  for (let i = 2; i <= n; i = i * 2){
    console.log(i)
  }
}
logarithmic_time(100)
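Why is this logarithmic? Because i doubles on every pass, the loop runs roughly log₂(n) times rather than n times. A quick sketch that counts the iterations instead of printing them (the counter and the name `logarithmic_steps` are mine):

```javascript
// same loop shape as logarithmic_time, but counting passes
const logarithmic_steps = (n) => {
  let steps = 0;
  for (let i = 2; i <= n; i = i * 2){
    steps += 1;
  }
  return steps;
};

console.log(logarithmic_steps(100)); // 6 — one pass per value 2,4,8,16,32,64
```

Doubling n only adds about one more pass, which is exactly the O(log(n)) growth pattern.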

Rules of Big-O Notation

Letโ€™s represent an algorithmโ€™s complexity as f(n){f(n)} . n represents the number of inputs, f(n)time{f(n){time}} represents the time needed, and f(n)space{f(n){space}} represents the space (additional memory) needed for the algorithm. It can be challenging to calculate f(n){f(n)} . But Big-O notation provides some fundamental rules that help developers compute for f(n){f(n)} .

  • Coefficient rule: If f(n) is O(g(n)), then kf(n) is O(g(n)), for any constant k > 0.
  • Sum rule: If f(n) is O(h(n)) and g(n) is O(p(n)), then f(n) + g(n) is O(h(n) + p(n)).
  • Product rule: If f(n) is O(h(n)) and g(n) is O(p(n)), then f(n)g(n) is O(h(n)p(n)).
  • Transitive rule: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
  • Polynomial rule: If f(n) is a polynomial of degree k, then f(n) is O(n^k).
  • Log of a power rule: log(n^k) is O(log(n)) for any constant k > 0.
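To see the coefficient and polynomial rules working together, consider a hypothetical operation count f(n) = 3n^2 + 5n + 2 (the numbers and the name `countOps` are mine, for illustration). As n grows, the lower-order terms and the coefficient stop mattering, which is why f(n) is simply O(n^2):

```javascript
// hypothetical operation count: a degree-2 polynomial
const countOps = (n) => 3 * n * n + 5 * n + 2;

// divide by n^2 and watch the ratio settle near the constant 3
console.log(countOps(10) / (10 * 10));       // 3.52
console.log(countOps(1000) / (1000 * 1000)); // 3.005002
```

Since f(n)/n^2 approaches a constant, the polynomial rule gives O(n^2), and the coefficient rule lets us drop the 3.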


Note: Don't try to understand all of them at once. Pay special attention to the first three rules and the polynomial rule, because they are the most commonly used. I’ll discuss each of those rules in the following sections.

Coefficient Rule: โ€œGet Rid of Constantsโ€ (๐Ÿ’ฏ)

This rule simply requires you to ignore any constants not related to the input size. Coefficients (constant factors) in Big-O are negligible for large input sizes. That is why this is the most important rule of Big-O notation.

If f(n) is O(g(n)), then kf(n) is O(g(n)), for any constant k > 0.

This means that kf(n) and f(n) have the same Big-O notation of O(g(n)), as long as k is a constant. Here is an example of a code block for f(n):

function a(n){
  var count = 0;
  for (var i = 0; i < n; i++){
    count += 1;
  }
  return count;
}

This block of code has f(n) = n, because it adds to count n times. Therefore, this function is O(n). Here's another example code block, for kf(n):

function a(n){
  var count = 0;
  for (var i = 0; i < 5 * n; i++){
    count += 1;
  }
  return count;
}

This block has kf(n) = 5n. By the coefficient rule above, the first two examples both have the same Big-O notation, O(n) (that is, O(g(n))). The following code block demonstrates another function with linear time complexity, but with one additional operation (count += 1) after the loop:

function a(n){
  var count = 0;
  for (var i = 0; i < n; i++){
    count += 1;
  }
  count += 1;
  return count;
}

This block of code has f(n) = n + 1; the +1 comes from the last operation (count += 1). This still has a Big-O notation of O(n), because that single operation does not depend on the input n, as we said earlier.

Sum Rule: โ€œAdd Big-Os Upโ€ (โž•)

The sum rule is easy to understand: time complexities can be added. Imagine a master algorithm (the function a(n)) that involves two other algorithms (the two for loops). The Big-O notation of that master algorithm is simply the sum of the other two Big-O notations.

If f(n) is O(h(n)) and g(n) is O(p(n)), then f(n) + g(n) is O(h(n) + p(n)).

Note: It is important to remember to apply the coefficient rule after applying this rule.

The following code block demonstrates a function with two main loops whose time complexities must be considered independently:

function a(n){
  var count = 0;
  for (var i = 0; i < n; i++){
    count += 1;
  }
  for (var i = 0; i < 5 * n; i++){
    count += 1;
  }
  return count;
}

In this example, f(n) = n (the first loop) and g(n) = 5n (the second loop). Their sum is f(n) + g(n) = 6n, so h(n) + p(n) = 6n. However, after applying the coefficient rule, the final result is O(n), because each loop abides by that rule: if f(n) is O(g(n)), then kf(n) is O(g(n)).
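As a quick sanity check of the 6n count, here is a compact, self-contained version of the function above (the name `sumRuleCount` is mine):

```javascript
const sumRuleCount = (n) => {
  let count = 0;
  for (let i = 0; i < n; i++) count += 1;     // f(n) = n increments
  for (let i = 0; i < 5 * n; i++) count += 1; // g(n) = 5n increments
  return count;
};

console.log(sumRuleCount(10)); // 60, i.e. f(10) + g(10) = 6 * 10
```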

Product Rule: โ€œMultiply Big-Osโ€ (โŒ)

The product rule simply states how Big-Os can be multiplied.

If f(n) is O(h(n)) and g(n) is O(p(n)), then f(n)g(n) is O(h(n)p(n)).

The following code block demonstrates a function with two nested for loops (it is the nesting that makes the product rule yield quadratic time here):

function a(n){
  var count = 0;
  for (var i = 0; i < n; i++){
    count += 1;
    for (var j = 0; j < 5 * n; j++){
      count += 1;
    }
  }
  return count;
}

In this example, the inner loop runs 5n times for each of the n iterations of the outer loop, so f(n) = 5n × n. This results in a total of 5n^2 inner operations. Applying the coefficient rule, the result is O(n^2).
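As a self-contained check of that count (with the inner loop given its own counter j so it does not overwrite i, and a hypothetical name `productRuleCount`):

```javascript
const productRuleCount = (n) => {
  let count = 0;
  for (let i = 0; i < n; i++){
    count += 1;                       // n outer increments in total
    for (let j = 0; j < 5 * n; j++){
      count += 1;                     // 5n inner increments per outer pass
    }
  }
  return count;
};

console.log(productRuleCount(10)); // 510 = 10 + 5 * 10^2, dominated by 5n^2
```

The n outer increments are a lower-order term, so by the sum and coefficient rules the whole thing is still O(n^2).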

Polynomial Rule: โ€œBig-O to the Power of kโ€ (๐Ÿ“ˆ)

The polynomial rule states that a polynomial time complexity has a Big-O notation of the same polynomial degree.

If f(n){f(n)} is a polynomial of degree k{k} , then f(n){f(n)} is O(nk){O(n^k)}

The following code block has only one for loop, yet it has quadratic time complexity, because the loop runs n*n iterations:

function a(n){
  var count = 0;
  for (var i = 0; i < n * n; i++){
    count += 1;
  }
  return count;
}

In this example, f(n) = n^2, because the loop runs n*n iterations, which is n^2. In accordance with the polynomial rule, f(n) is a polynomial of degree 2, so f(n) is O(n^2).
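A quick check of that count, as a compact self-contained version of the function above (the name `polyRuleCount` is mine):

```javascript
// one loop running n*n times, so count = n^2
const polyRuleCount = (n) => {
  let count = 0;
  for (let i = 0; i < n * n; i++){
    count += 1;
  }
  return count;
};

console.log(polyRuleCount(10)); // 100, i.e. 10^2
```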

Summary (๐Ÿ“š)

Big-O is important for analyzing and comparing the efficiencies of algorithms. The analysis starts by looking at the code and applying the rules to simplify the resulting Big-O notation. The following are the most often used rules:

  • Eliminating constants
  • Adding up Big-O Notations
  • Multiplying Big-O Notations
  • Determining the polynomial of the Big-O notation

Up Next๐Ÿ‘‰ Part 02: JavaScript: Unique Parts ๐Ÿ”ฅ ๐Ÿ”ฅ (July 01, 2020)


Posted on Jun 29 by Edison Pebojot
