What is an Algorithm?
An algorithm is a step-by-step set of instructions to solve a problem.
- You might not have noticed, but we use algorithms in everyday life. For example, when you want to drink coffee, you first boil water and then pour the boiled water into a cup to make the coffee.
Terms that describe how efficient an algorithm is
There are three terms that describe how fast an algorithm runs and how much space it needs.
They are specific terms for describing the efficiency of an algorithm in the best case, worst case, and average case:
- Big Omega (Ω) is used to describe the best case (lower bound) for the runtime and space complexity of an algorithm.
- Big O ("Order of") is used to describe the upper bound, or worst case, for the runtime and space complexity of an algorithm.
- Big Theta (Θ) is used to indicate that the upper bound and the lower bound of our algorithm are the same, i.e. a tight bound on its growth.
How to calculate the efficiency of an algorithm?
To find out how fast an algorithm runs, look at how the number of operations the computer has to perform grows with the input: if it grows proportionally with the input it is O(n), if it grows quadratically it is O(n^2), and if the number of operations is constant it is O(1).
Think about the bigger picture. What I mean by this is that if the number of operations the computer has to perform looks like 100n + 12 or 19n^2 + 2n + 10, they are narrowed down to O(n) and O(n^2): we drop the constants and the lower-order terms. When scaled to a very large data set, say 1 million or 10 million items, those constants do not make much difference compared to the dominant term.
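To make these growth rates concrete, here is a small sketch (the function names are illustrative, not from any library):

```javascript
// O(1): one operation, no matter how large arr is.
const first = (arr) => arr[0];

// O(n): the work grows proportionally with the input size.
const sum = (arr) => {
  let total = 0;
  for (let val of arr) total += val;
  return total;
};

// O(n^2): a nested loop over the same input grows quadratically.
const pairs = (arr) => {
  let count = 0;
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) count++;
  }
  return count;
};

pairs([1, 2, 3]); // → 9 operations for 3 items
```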
The space required in memory by certain primitives is constant (exact sizes vary by language, but typical sizes are as follows):
- An integer and a float each need 4 bytes of memory
- A bool needs 1 byte
- A double and a long each need 8 bytes of memory
- A string requires 1 byte for each character, so its memory use grows with its length
- Arrays and objects also need O(n) space in memory
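A small sketch of the difference between constant and linear space (the function names are illustrative):

```javascript
// O(1) space: only a fixed number of variables, however big the input is.
const max = (arr) => {
  let biggest = arr[0];
  for (let val of arr) {
    if (val > biggest) biggest = val;
  }
  return biggest;
};

// O(n) space: builds a new array proportional to the input size.
const doubled = (arr) => arr.map((val) => val * 2);

max([3, 9, 2]); // → 9
```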
Algorithms
Searching Algorithms
Linear Search, as the name indicates, is a searching algorithm in which we search the data linearly, one item at a time.
It works on both sorted and unsorted arrays.
Example
```javascript
const linearSearch = (arr, num) => {
  // Walk through the array one item at a time.
  for (let val of arr) {
    if (num === val) {
      return true;
    }
  }
  return false;
};

linearSearch([10, 8, 5, 11], 11); // → true
```
Runtime of an algorithm
Best Case Omega(1); Worst Case O(n)
Binary Search is a logarithmic algorithm of the kind called divide and conquer. It searches for the target item in the array like this: 1) it looks at the middle; if the target item is in the middle, it returns; otherwise 2) it eliminates the half in which the target item cannot be, and repeats on the remaining half.
It works only on a SORTED ARRAY.
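Example (a minimal iterative sketch of the two steps above; the function name `binarySearch` is my own):

```javascript
// Iterative binary search on a sorted array.
const binarySearch = (arr, target) => {
  let low = 0;
  let high = arr.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (arr[mid] === target) return true; // found in the middle
    if (arr[mid] < target) low = mid + 1; // eliminate the left half
    else high = mid - 1;                  // eliminate the right half
  }
  return false;
};

binarySearch([2, 5, 8, 10, 11], 10); // → true
```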
Runtime of an algorithm
Best Case Omega(1); Worst Case O(log n)
Sorting Algorithms
Bubble Sort, as its name makes obvious, lets bigger items bubble up in the array. It uses a nested loop: it compares the item at the inner loop's index with the item at that index + 1, and the bigger one bubbles up.
Example (it is optimized for nearly sorted arrays, so the best case is O(n)):
```javascript
const swap = (arr, j) => {
  let temp = arr[j];
  arr[j] = arr[j + 1];
  arr[j + 1] = temp;
};

const bubbleSort = (arr) => {
  for (let i = 0; i < arr.length; i++) {
    let swapped = false;
    for (let j = 0; j < arr.length - 1 - i; j++) {
      // The bigger of each pair bubbles up toward the end.
      if (arr[j] > arr[j + 1]) {
        swap(arr, j);
        swapped = true;
      }
    }
    // If no swaps happened, the array is already sorted.
    if (!swapped) break;
  }
  return arr;
};

bubbleSort([6, 4, 1, 1, 1, 2, 3, 4, 5]);
```
Runtime of an algorithm
Average case O(n^2); but best could be O(n) on nearly sorted array
Selection Sort uses a nested loop. For each index of the outer loop, it finds the smallest item among the items from that index + 1 to the last item, and swaps the smallest item with the item at the outer loop's current index.
Example:
```javascript
const swap = (arr, i, min) => {
  const temp = arr[i];
  arr[i] = arr[min];
  arr[min] = temp;
};

const selectionSort = (arr) => {
  for (let i = 0; i < arr.length; i++) {
    // Find the index of the smallest item in arr[i..end].
    let min = i;
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[min]) {
        min = j;
      }
    }
    // Move the smallest item into position i.
    swap(arr, i, min);
  }
  return arr;
};

selectionSort([9, 1, 10, 0, 6, 3]);
```
Runtime of an algorithm
Average case O(n^2);
Insertion Sort sorts elements gradually by growing a sorted left-hand portion of the array. It is one of the most efficient simple sorts on nearly sorted arrays. It also uses a nested loop.
Example:
```javascript
const insertionSort = (arr) => {
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    // Shift every larger item in the sorted portion one slot right.
    let j;
    for (j = i - 1; j >= 0 && arr[j] > current; j--) {
      arr[j + 1] = arr[j];
    }
    // Drop the current item into its correct slot.
    arr[j + 1] = current;
  }
  return arr;
};

insertionSort([10, 2, 5, 1]);
```
Runtime of an algorithm
Average case O(n^2); on nearly sorted data Best case O(n)
Merge Sort is a divide-and-conquer algorithm which divides the array into left and right halves until 0 or 1 items are left; since 1 item is already sorted, it then merges the left and right halves back together, pushing smaller items before bigger ones.
Example:
```javascript
// Merge two already-sorted arrays into one sorted array.
const merge = (arr1, arr2) => {
  const mergedArr = [];
  let left = 0;
  let right = 0;
  // Repeatedly push the smaller of the two front items.
  while (left < arr1.length && right < arr2.length) {
    if (arr1[left] > arr2[right]) {
      mergedArr.push(arr2[right]);
      right++;
    } else {
      mergedArr.push(arr1[left]);
      left++;
    }
  }
  // Copy over whatever remains in either array.
  while (left < arr1.length) {
    mergedArr.push(arr1[left]);
    left++;
  }
  while (right < arr2.length) {
    mergedArr.push(arr2[right]);
    right++;
  }
  return mergedArr;
};

const mergeSort = (arr) => {
  // An array of 0 or 1 items is already sorted.
  if (arr.length <= 1) return arr;
  // Split into halves, sort each half, then merge them back.
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right);
};

mergeSort([10, 2, 5, 1, 8]);
```
Runtime of an algorithm
Average case O(n log n); Space complexity O(n)
Summary:
- You learned that an algorithm is how you arrive at a solution.
- You can now work out the runtime of your algorithms and talk about runtimes with your friends and colleagues.
- You now know some searching and sorting algorithms which you can use in your projects.
Read Data Structures for Complete Beginners here.