<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rishab Trivedi</title>
    <description>The latest articles on DEV Community by Rishab Trivedi (@rishabtrivedi).</description>
    <link>https://dev.to/rishabtrivedi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2559774%2F38c5d63c-3ea8-4c1d-8648-2cba1ffad4f3.png</url>
      <title>DEV Community: Rishab Trivedi</title>
      <link>https://dev.to/rishabtrivedi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/rishabtrivedi"/>
    <language>en</language>
    <item>
      <title>Sliding Window || Python || Data Structures and Algorithms</title>
      <dc:creator>Rishab Trivedi</dc:creator>
      <pubDate>Sun, 02 Feb 2025 03:07:26 +0000</pubDate>
      <link>https://dev.to/rishabtrivedi/sliding-window-python-data-structures-and-algorithms-16m</link>
      <guid>https://dev.to/rishabtrivedi/sliding-window-python-data-structures-and-algorithms-16m</guid>
      <description>&lt;p&gt;&lt;strong&gt;What is Sliding Window Technique ?&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Sliding Window Technique is a method where we define a window or range in the input data (arrays or strings) and then move that window across the data to perform some operation within the window.&lt;/p&gt;

&lt;p&gt;This technique is commonly used in algorithms like finding subarrays with a specific sum, finding the longest substring with unique characters, or solving problems that require a fixed-size window to process elements efficiently.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Two Types of Sliding Window-&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Fixed-Size Sliding Window -&lt;/strong&gt; The window has a fixed size, and we slide it across the array one position at a time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Variable-Size Sliding Window -&lt;/strong&gt; We extend the window by incrementing the right pointer on each iteration, and advance the left pointer only while the window violates the condition, stopping once the condition holds again or the left pointer reaches the right pointer.&lt;/p&gt;
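The fixed-size variant can be sketched with a small, self-contained example; this computes the maximum sum over any window of size k (a minimal illustration, with the function name chosen here):

```python
def max_window_sum(arr, k):
    # Sum of the first window of size k
    window_sum = sum(arr[:k])
    best = window_sum
    for right in range(k, len(arr)):
        # Slide the window: add the new right element, drop the old left one
        window_sum += arr[right] - arr[right - k]
        best = max(best, window_sum)
    return best
```

Each slide reuses the previous sum instead of re-adding k elements, which is what makes the technique linear.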

&lt;p&gt;&lt;strong&gt;When should we use sliding window?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Whenever a problem asks for the maximum or minimum over a subarray, or for some other computation on subarrays, there is a high chance the sliding window technique applies.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;General template for Sliding Window -&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;function fn(arr):
    left = 0
    for (int right = 0; right &amp;lt; arr.length; right++):
        Do some logic to "add" element at arr[right] to window

        while WINDOW_IS_INVALID:
            Do some logic to "remove" element at arr[left] from window
            left++

        Do some logic to update the answer
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Example - Minimum Size Subarray Sum&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Given an array of positive integers nums and a positive integer target, return the minimal length of a subarray whose sum is greater than or equal to target. If there is no such subarray, return 0 instead.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test Case 1:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Input: target = 7, nums = [2,3,1,2,4,3]&lt;/p&gt;

&lt;p&gt;Output: 2&lt;/p&gt;

&lt;p&gt;Explanation: The subarray [4,3] has the minimal length under the problem constraint.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test Case 2:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Input: target = 4, nums = [1,4,4]&lt;/p&gt;

&lt;p&gt;Output: 1&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test Case 3:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Input: target = 11, nums = [1,1,1,1,1,1,1,1]&lt;/p&gt;

&lt;p&gt;Output: 0&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Brute Force -&lt;/strong&gt; The brute-force approach is to try every pair of left and right indices, compute the sum of the elements between them, and keep the shortest subarray whose sum reaches the target.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So the time complexity will be O(n) * O(n(n+1)/2) = O(n³)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sliding Window Approach -&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu7vjjey9gztc20rz4lxd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu7vjjey9gztc20rz4lxd.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fju2akqm5o4qxao8ayari.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fju2akqm5o4qxao8ayari.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kh371wg8g019inc5ib5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kh371wg8g019inc5ib5.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sunivx2b0wx8mpu1no7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sunivx2b0wx8mpu1no7.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1achahm7cbe1xa43s60.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1achahm7cbe1xa43s60.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkou4uk3b9oltoj8n57ds.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkou4uk3b9oltoj8n57ds.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpddgvlxs2tmh38p8nro5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpddgvlxs2tmh38p8nro5.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff4970l9f8lcjjhga7vkn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff4970l9f8lcjjhga7vkn.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp01lt5njiozju6w1jdpl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp01lt5njiozju6w1jdpl.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp7yxay15v12exws2fsq5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp7yxay15v12exws2fsq5.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ti0umeofvgwevh8pv13.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ti0umeofvgwevh8pv13.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Code -&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkkf4c2odypfwkxq2ofue.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkkf4c2odypfwkxq2ofue.png" alt="Image description" width="800" height="499"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt; &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Initialize two pointers, left and right, and two more variables: one for the current window sum and one for the size of the minimum window.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The window size is always (right - left + 1), since array indices start at 0.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Increment the right pointer on each iteration, adding the new element to the current sum.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On each iteration, check whether the current window sum has reached the target; if so, try to shrink the window.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Increment the left pointer, removing elements from the sum, as long as the condition still holds.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;While shrinking, record the window size, since we want the minimum-size window.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, if the minimum-window variable was updated, return its value; otherwise return 0.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
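The steps above can be written out as a short Python function (a minimal sketch; the function and variable names are chosen here for illustration):

```python
def min_sub_array_len(target, nums):
    left = 0
    current_sum = 0
    min_window = float("inf")
    for right in range(len(nums)):
        current_sum += nums[right]           # add nums[right] to the window
        while current_sum >= target:         # window satisfies the condition
            min_window = min(min_window, right - left + 1)
            current_sum -= nums[left]        # shrink the window from the left
            left += 1
    return 0 if min_window == float("inf") else min_window
```

On the test cases above this returns 2, 1, and 0 respectively.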

&lt;p&gt;&lt;strong&gt;Time Complexity - O(n)&lt;/strong&gt;, since in the worst case each element is visited at most twice, once by each pointer: O(2n) simplifies to O(n).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity - O(1)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;By using the sliding window technique we reduced the time complexity from &lt;strong&gt;O(n³) to O(n)&lt;/strong&gt;.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>programming</category>
      <category>python</category>
      <category>career</category>
    </item>
    <item>
      <title>Stack and Queue || Python || Data Structures and Algorithms</title>
      <dc:creator>Rishab Trivedi</dc:creator>
      <pubDate>Sat, 21 Dec 2024 09:31:19 +0000</pubDate>
      <link>https://dev.to/rishabtrivedi/stack-and-queue-python-data-structures-and-algorithms-55hb</link>
      <guid>https://dev.to/rishabtrivedi/stack-and-queue-python-data-structures-and-algorithms-55hb</guid>
      <description>&lt;h2&gt;
  
  
  Stack
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Stack -&lt;/strong&gt; A stack is a storage container that supports retrieval in last-in, first-out (LIFO) order. Stacks are probably the right container to use when retrieval order doesn't matter at all, such as when processing batch jobs. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For example,&lt;/strong&gt; consider a stack of plates used in cafeterias: the order in which plates are removed from the stack is the reverse of the order in which they were added, as only the top plate is accessible.&lt;/p&gt;

&lt;p&gt;The INSERT operation on a stack is often called PUSH, and the DELETE operation, which does not take an element argument, is often called POP.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Push (x,s):&lt;/strong&gt; Insert item x at the top of stack s.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pop(s) :&lt;/strong&gt; Return (and remove) the top item of stack s.&lt;/p&gt;

&lt;p&gt;Food inserted into my refrigerator usually exits the same way, despite the incentive of expiration dates. Algorithmically, LIFO tends to happen in the course of executing recursive algorithms.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Applications of Stack -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Function calls:&lt;/strong&gt; Used to manage function execution and recursion.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Undo operations:&lt;/strong&gt; Tracks changes in editors for "Undo/Redo."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Browser history:&lt;/strong&gt; Stores visited pages for backtracking.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Expression parsing:&lt;/strong&gt; Evaluates and converts mathematical expressions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Syntax validation:&lt;/strong&gt; Matches parentheses or tags in code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory management:&lt;/strong&gt; Manages call stacks during program execution.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DFS:&lt;/strong&gt; Implements Depth-First Search in graph algorithms.&lt;/li&gt;
&lt;/ol&gt;
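The syntax-validation use case above can be sketched with a Python list acting as the stack (a minimal illustration; the function name is chosen here):

```python
def is_balanced(s):
    pairs = {")": "(", "]": "[", "}": "{"}
    stack = []
    for ch in s:
        if ch in "([{":
            stack.append(ch)          # push an opening bracket
        elif ch in pairs:
            # A closer must match the most recently pushed opener (LIFO)
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack                  # balanced only if every opener was matched
```
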

&lt;p&gt;&lt;strong&gt;Implement Stack Using Array -&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;push()&lt;/strong&gt; - Insert an element into the stack&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;pop()&lt;/strong&gt; -  Remove an element from the stack&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;top()&lt;/strong&gt; - Returns the top element of the stack.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;isEmpty()&lt;/strong&gt; -  Returns true if the stack is empty else false.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Stack:
    def __init__(self):
        #Initializes an empty stack 
        self.stack = []

    def isEmpty(self) -&amp;gt; bool:
        #Returns True if the stack is empty, False otherwise.

        return len(self.stack) == 0

    def push(self, item) -&amp;gt; None:
        #Pushes an item onto the stack.
        self.stack.append(item)

    def pop(self):
        #Removes and returns the top item from the stack.
        if self.isEmpty():
            return None
        return self.stack.pop()

    def peek(self):
        #Returns the top item from the stack without removing it.
        if self.isEmpty():
            return None
        return self.stack[-1]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Queue
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Queue -&lt;/strong&gt; It is a storage container which supports retrieval by first-in, first-out (FIFO) order. We call the INSERT operation on a queue ENQUEUE, and we call the DELETE operation DEQUEUE. Like the stack operation POP, DEQUEUE takes no element argument. Stacks and queues are dynamic sets in which the element removed from the set by the DELETE operation is predefined.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enqueue(x,q):&lt;/strong&gt; Insert item x at the back of queue q.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Dequeue(q):&lt;/strong&gt; Return (and remove) the front item from queue q.&lt;/p&gt;

&lt;p&gt;The queue has a head and a tail.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;When an element is enqueued, it takes its place at the tail of the queue, just as a newly arriving customer takes a place at the end of the line.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The element dequeued is always the one at the head of the queue, like the customer at the head of the line, who has waited the longest.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Applications of Queue -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Task scheduling:&lt;/strong&gt; Manages CPU processes and jobs in order.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;BFS:&lt;/strong&gt; Implements Breadth-First Search in graphs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Print queues:&lt;/strong&gt; Handles print jobs sequentially.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Network routing:&lt;/strong&gt; Buffers data packets for transmission.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Call centers:&lt;/strong&gt; Manages customer calls in waiting order.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Streaming:&lt;/strong&gt; Buffers video or audio streams in real-time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Input events:&lt;/strong&gt; Processes keyboard and mouse inputs in GUI systems.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Implement Queue Using Array-&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;enqueue() –&lt;/strong&gt; Insertion of elements to the queue.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;dequeue() –&lt;/strong&gt; Removal of elements from the queue.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;peek() or front()-&lt;/strong&gt; Acquires the data element available at the front node of the queue without deleting it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;isEmpty() –&lt;/strong&gt; Checks if the queue is empty.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class MyQueue:
    def __init__(self):
        # Store elements
        self.queue = []
        # A pointer to indicate the start position
        self.p_start = 0

    def enQueue(self, x):
        #Insert an element into the queue. 
        self.queue.append(x)
        return True # Return True if the operation is successful

    def deQueue(self):
        #Delete an element from the queue. 
        if self.isEmpty():
            return False
        self.p_start += 1
        return True #Return True if the operation is successful

    def Front(self):
        #Get the front item from the queue.
        if not self.isEmpty():
            return self.queue[self.p_start]
        return None  # Return None if the queue is empty

    def isEmpty(self):
        #Checks whether the queue is empty or not
        return self.p_start &amp;gt;= len(self.queue)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
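As a side note, idiomatic Python often uses collections.deque for queues, since popleft() is O(1), whereas list.pop(0) shifts every remaining element:

```python
from collections import deque

q = deque()
q.append("a")        # enqueue at the tail
q.append("b")
q.append("c")
front = q.popleft()  # dequeue from the head: FIFO order
```

This avoids the growing unused prefix that the p_start pointer works around in the array version above.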



&lt;p&gt;&lt;strong&gt;Implement Queue using Stacks -&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;push(x) -&lt;/strong&gt; Move all elements from stack1 to stack2 to reverse their order, push x onto stack2, and then move all elements back from stack2 to stack1.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;pop() -&lt;/strong&gt; Removes the element from the front of the queue and returns it&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;peek() -&lt;/strong&gt; Returns the element at the front of the queue&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;empty() -&lt;/strong&gt; Returns true if the queue is empty, false otherwise
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class MyQueue:

    def __init__(self):
        self.stack_1 = []  # Main stack for enqueue operations
        self.stack_2 = []  # Temporary stack for dequeue operations

    # Pushes element x to the back of the queue.
    def push(self, x):

        # Move all elements from stack1 to stack 2 to reverse their order
        while self.stack_1:
            self.stack_2.append(self.stack_1.pop())
        self.stack_2.append(x)

        # Move all elements back from stack2 to stack 1 as a queue
        while self.stack_2:
            self.stack_1.append(self.stack_2.pop())

    # Removes the element from the front of the queue and returns it
    def pop(self):
        return self.stack_1.pop()

    # Returns the element at the front of the queue
    def peek(self):
        return self.stack_1[-1]

    # Returns true if the queue is empty, false otherwise
    def empty(self):
        return not self.stack_1

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
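The version above does O(n) work on every push. A common variant instead keeps push O(1) and makes pop/peek amortized O(1), by transferring elements only when the output stack runs empty (a sketch, not the code above; class and method names are illustrative):

```python
class AmortizedQueue:
    def __init__(self):
        self.inbox = []    # receives pushed elements
        self.outbox = []   # holds elements in FIFO order for pop/peek

    def push(self, x):
        self.inbox.append(x)

    def pop(self):
        self._transfer()
        return self.outbox.pop()

    def peek(self):
        self._transfer()
        return self.outbox[-1]

    def empty(self):
        return not self.inbox and not self.outbox

    def _transfer(self):
        # Refill the outbox only when it is empty; each element is moved
        # at most once overall, so the cost is amortized O(1) per operation
        if not self.outbox:
            while self.inbox:
                self.outbox.append(self.inbox.pop())
```
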



&lt;p&gt;&lt;strong&gt;Implement Stack using Queues -&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;push(x) -&lt;/strong&gt; Add the new element to the second queue, then move all elements from queue1 to queue2 so the newest element stays at the front (LIFO), and swap the queues.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;pop() -&lt;/strong&gt; Removes the element on the top of the stack and returns it&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;peek() or top() -&lt;/strong&gt; Returns the element on the top of the stack&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;empty() -&lt;/strong&gt; Returns true if the stack is empty, false otherwise
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class MyStack:

    def __init__(self):
        # Initialize two queues for stack operations
        self.queue1 = [] # Main queue to hold elements
        self.queue2 = [] # Temporary queue used during push

    # Pushes element x to the top of the stack
    def push(self, x: int) -&amp;gt; None:

        # Add new element to second queue
        self.queue2.append(x)

        # Move all elements from queue 1 to queue 2 to maintain stack order (LIFO)
        while len(self.queue1) &amp;gt; 0:
            self.queue2.append(self.queue1.pop(0))

        # Swap the queues
        self.queue1, self.queue2 = self.queue2, self.queue1

    # Removes the element on the top of the stack and returns it
    def pop(self) -&amp;gt; int:
        return self.queue1.pop(0)

    # Returns the element on the top of the stack
    def top(self) -&amp;gt; int:
        return self.queue1[0]

    # Returns true if the stack is empty, false otherwise.
    def empty(self) -&amp;gt; bool:
        return len(self.queue1) == 0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
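The same idea also works with a single queue: after enqueueing x, rotate the older elements behind it so the newest element sits at the front (a sketch using collections.deque; the class name is illustrative):

```python
from collections import deque

class OneQueueStack:
    def __init__(self):
        self.q = deque()

    def push(self, x):
        self.q.append(x)
        # Rotate the older elements behind x, keeping the newest at the front
        for _ in range(len(self.q) - 1):
            self.q.append(self.q.popleft())

    def pop(self):
        return self.q.popleft()

    def top(self):
        return self.q[0]

    def empty(self):
        return not self.q
```
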



</description>
      <category>learning</category>
      <category>python</category>
      <category>programming</category>
      <category>algorithms</category>
    </item>
    <item>
      <title>Sorting Algorithms || Python || Data Structures and Algorithms</title>
      <dc:creator>Rishab Trivedi</dc:creator>
      <pubDate>Tue, 17 Dec 2024 03:24:28 +0000</pubDate>
      <link>https://dev.to/rishabtrivedi/sorting-algorithms-python-data-structures-and-algorithms-1dca</link>
      <guid>https://dev.to/rishabtrivedi/sorting-algorithms-python-data-structures-and-algorithms-1dca</guid>
      <description>&lt;p&gt;Sorting Algoritms&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Bubble Sort
&lt;/h2&gt;

&lt;p&gt;In this, we repeatedly compare adjacent elements and swap them when the left one is larger, so after each pass the largest remaining element ends up at its final position at the end of the array. We then shrink the boundary by one from the last position and repeat. At worst, we need n passes to sort the array.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False

        # Last i elements are already in place, so we don't need to check them
        for j in range(0, n-i-1):
            # Swap if the element found is greater than the next element
            if arr[j] &amp;gt; arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]  # Swap the elements
                swapped = True
        if not swapped:
            break

    return arr
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Compare all adjacent elements and swap them if the larger element comes before the smaller one. Keep doing this until you reach the end of the array.&lt;/li&gt;
&lt;li&gt;After each pass, the maximum of the remaining elements is in its final place at the end, so shrink the boundary by one.&lt;/li&gt;
&lt;li&gt;Maintain a flag: if no elements were swapped during a pass, break the loop, as the array is already sorted.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Best Case&lt;/strong&gt; - If the array is already sorted, only one pass is required. Time complexity - O(n)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Average Case&lt;/strong&gt; - If the array is randomly ordered, the time complexity is O(n²)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Worst Case&lt;/strong&gt; - If the array is in decreasing order, on the order of n² comparisons are required.&lt;/li&gt;
&lt;/ul&gt;
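The best-case behavior comes from the early-exit flag, which can be verified by counting passes (an illustrative instrumented copy of the function above):

```python
def bubble_sort_counting(arr):
    # Same bubble sort as above, but also reports how many passes ran,
    # to show the early-exit flag at work
    n = len(arr)
    passes = 0
    for i in range(n):
        passes += 1
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr, passes
```

On already-sorted input a single pass suffices, which is the O(n) best case.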

&lt;p&gt;&lt;strong&gt;Space Complexity -&lt;/strong&gt; O(1), No extra memory required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;No extra memory required.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Stable as elements keep their relative order.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Time complexity - O(n²), which is very high for large datasets.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications-&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Can be used only when the dataset is very small, and simplicity outweighs inefficiency due to high time complexity.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  2. Selection Sort
&lt;/h2&gt;

&lt;p&gt;In this, we find the smallest element in the array and replace it with the first element. Then, we increase the boundary by 1 and repeat the same steps until we reach the end of the array.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def selectionSort(a):
    i = 0
    while i&amp;lt;len(a):

        smallest = min(a[i:])
        index_of_smallest = a.index(smallest)
        a[i],a[index_of_smallest] = a[index_of_smallest],a[i]

        i=i+1
    return a
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Iterate over the array and find the minimum element.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Swap it with the first element and increase the pointer by 1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Repeat this process until we reach the end of the array.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt; It has a time complexity of O(n²) in all three cases: &lt;strong&gt;best, average, and worst.&lt;/strong&gt; This is because we have to select the minimum element and swap it every time, regardless of whether the array is already sorted or not.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity -&lt;/strong&gt; O(1), No extra memory required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;No extra memory required.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Fewer swaps are done than in bubble sort.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Time complexity - O(n²), which is very high for large datasets.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Not Stable, as it does not maintain the relative order of equal elements.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;It can be used in systems with limited memory as it does not require additional storage.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It is used in systems where minimizing the number of swaps is critical, such as in systems with slow write operations.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  3. Insertion Sort
&lt;/h2&gt;

&lt;p&gt;It is an algorithm that works by inserting an unsorted element into its correct position by iteratively checking backwards from the position of the element to the start of the array.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j &amp;gt;= 0 and key &amp;lt; arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Start from the second element of the array and compare it with the first element. If the current element is smaller than the first element, then swap them.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now increase the pointer and do this for the third element: compare it with the second and first elements.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Repeat the same process for the rest of the elements, comparing them with all the previous elements, and insert them at the suitable position.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Best Case -&lt;/strong&gt; If the array is already sorted, only one pass is required. Time complexity is O(n)&lt;br&gt;
&lt;strong&gt;- Average Case -&lt;/strong&gt; If the array is randomly ordered, the time complexity is O(n²)&lt;br&gt;
&lt;strong&gt;- Worst Case -&lt;/strong&gt; If the array is in decreasing order, on the order of n² comparisons are required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity -&lt;/strong&gt; O(1), No extra memory required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;No extra memory required.&lt;/li&gt;
&lt;li&gt;Stable, as elements keep their relative order.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Time complexity - O(n²), which is very high for large datasets.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications-&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It is efficient for scenarios where elements arrive in real-time and need to be sorted, such as live data streams.&lt;/li&gt;
&lt;/ol&gt;
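The live-stream use case above can be sketched with Python's bisect module, which performs exactly this "insert each arriving element at its correct position" step (a minimal illustration; the function name is chosen here):

```python
import bisect

def insert_sorted(stream):
    # Maintain a sorted list as items arrive, insertion-sort style:
    # each new element is placed at its correct position immediately
    sorted_items = []
    for x in stream:
        bisect.insort(sorted_items, x)
    return sorted_items
```
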
&lt;h2&gt;
  
  
  4. Merge Sort
&lt;/h2&gt;

&lt;p&gt;Merge Sort is an algorithm that follows the divide and conquer approach. It has two main steps: first, dividing the array recursively, and second, merging the divided arrays in sorted order.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def mergeSort(alist):
    if len(alist) &amp;gt; 1:
        mid = len(alist) // 2
        lefthalf = alist[:mid]
        righthalf = alist[mid:]

        # Recursively split and sort the halves
        mergeSort(lefthalf)
        mergeSort(righthalf)

        # Merge the sorted halves
        merge(alist, lefthalf, righthalf)

def merge(alist, lefthalf, righthalf):
    i = j = k = 0

    while i &amp;lt; len(lefthalf) and j &amp;lt; len(righthalf):
        if lefthalf[i] &amp;lt; righthalf[j]:
            alist[k] = lefthalf[i]
            i += 1
        else:
            alist[k] = righthalf[j]
            j += 1
        k += 1

    while i &amp;lt; len(lefthalf):
        alist[k] = lefthalf[i]
        i += 1
        k += 1

    while j &amp;lt; len(righthalf):
        alist[k] = righthalf[j]
        j += 1
        k += 1


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Divide the array into two halves by calculating the mid-point.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Continue dividing until the length of each sub-array is 1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Call the merge function on both halves: the left half and the right half.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use three pointers for the merging process:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;The first pointer for the left half array.&lt;/li&gt;
&lt;li&gt;The second pointer for the right half array.&lt;/li&gt;
&lt;li&gt;The third pointer for the sorted array.&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start="5"&gt;
&lt;li&gt;&lt;p&gt;Iterate through both halves and compare their elements. Insert the smaller element into the sorted array and increment the corresponding pointer by 1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Repeat this process recursively until the entire array is sorted.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
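&lt;p&gt;The three-pointer merge step above can be sketched on its own; merge_halves is an illustrative standalone version of the merge function shown earlier:&lt;/p&gt;

```python
# i walks the left half, j walks the right half, k fills the output.
def merge_halves(lefthalf, righthalf):
    merged = [0] * (len(lefthalf) + len(righthalf))
    i = j = k = 0
    while i < len(lefthalf) and j < len(righthalf):
        # copy the smaller front element and advance its pointer
        if lefthalf[i] < righthalf[j]:
            merged[k] = lefthalf[i]
            i += 1
        else:
            merged[k] = righthalf[j]
            j += 1
        k += 1
    # copy any leftovers from whichever half was not exhausted
    while i < len(lefthalf):
        merged[k] = lefthalf[i]
        i += 1
        k += 1
    while j < len(righthalf):
        merged[k] = righthalf[j]
        j += 1
        k += 1
    return merged

print(merge_halves([2, 5, 9], [1, 3, 8, 10]))  # [1, 2, 3, 5, 8, 9, 10]
```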

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt; Merge Sort has a time complexity of &lt;strong&gt;O(n log n)&lt;/strong&gt; in all three cases: best, average, and worst. This is because, irrespective of whether the array is already sorted or not, the same steps are followed for each division and merge.&lt;/p&gt;

&lt;p&gt;O(log n) - The array size is halved at each step during the divide phase.&lt;/p&gt;

&lt;p&gt;O(n) - During the merging process, we iterate over all the elements once.&lt;/p&gt;

&lt;p&gt;So the total time complexity is O (n) * O (log n) = &lt;strong&gt;O (n log n)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity - O(n)&lt;/strong&gt;, Extra memory is required during the merging process to store the temporary arrays.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Stable, as elements keep their relative order.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Time complexity is O(n log n), even for large datasets.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Suitable for parallel processing because sub-arrays are merged independently.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Extra memory is required for the merging process.&lt;/li&gt;
&lt;li&gt;Not in-place, as extra memory is required.&lt;/li&gt;
&lt;li&gt;Generally slower than Quick Sort for most in-memory datasets.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It is used in situations where data is too large to fit in memory, such as merging large files.&lt;/li&gt;
&lt;li&gt;It is used to sort linked lists because random access is not required.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  5. Quick Sort
&lt;/h2&gt;

&lt;p&gt;Quick Sort is an algorithm that follows the divide-and-conquer approach. We choose a pivot element and partition the array around the pivot element after placing the pivot in its correct sorted position.&lt;/p&gt;

&lt;p&gt;The first step is to choose the pivot element and then partition the array around the pivot. All elements smaller than the pivot will be on the left, and all elements greater than the pivot will be on its right. The pivot is then in its correct sorted position. Recursively, the same process is applied by dividing the array into two halves: the first half contains the elements before the pivot, and the second half contains the elements after the pivot. This process is repeated until the length of each sub-array reaches 1.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def partition(arr, low, high):

    # Choosing Last Element as Pivot
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] &amp;lt; pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    print(arr)
    print("Pivot is ", arr[high])
    return i + 1

    # Choosing First Element as Pivot

    # pivot = arr[low]
    # i = low
    # for j in range(low, high+1):
    #     if arr[j] &amp;lt; pivot:
    #         i += 1
    #         arr[i], arr[j] = arr[j], arr[i]
    # arr[i], arr[low] = arr[low], arr[i]
    # print(arr)
    # print("Pivot is ", arr[high])
    # return i 



def quicksort(arr, low, high):

    if low &amp;lt; high:
        pi = partition(arr, low, high)

        quicksort(arr, low, pi - 1)
        quicksort(arr, pi + 1, high)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Choose a pivot; any element can be chosen as the pivot.&lt;/li&gt;
&lt;li&gt;Iterate over the array and compare each element with the pivot.&lt;/li&gt;
&lt;li&gt;Place all elements smaller than the pivot on the left and all elements greater than the pivot on the right.&lt;/li&gt;
&lt;li&gt;Swap the pivot into the position just after the last smaller element, so that the pivot is at its correct sorted position.&lt;/li&gt;
&lt;li&gt;Recursively apply the same process to the sub-arrays on either side of the pivot until each partition has a length of 1.&lt;/li&gt;
&lt;/ol&gt;
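&lt;p&gt;The steps above, as a self-contained copy of the partition and quicksort functions shown earlier (with the debug prints removed) plus a quick usage check:&lt;/p&gt;

```python
def partition(arr, low, high):
    pivot = arr[high]          # last element as pivot
    i = low - 1                # boundary of the "smaller than pivot" region
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # place the pivot just after the last smaller element
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quicksort(arr, low, high):
    if low < high:
        pi = partition(arr, low, high)
        quicksort(arr, low, pi - 1)
        quicksort(arr, pi + 1, high)

nums = [10, 7, 8, 9, 1, 5]
quicksort(nums, 0, len(nums) - 1)
print(nums)  # [1, 5, 7, 8, 9, 10]
```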

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Best Case -&lt;/strong&gt; Time complexity - O(n log n), when the pivot divides the array into two equal halves.&lt;br&gt;
&lt;strong&gt;2. Average Case -&lt;/strong&gt; Time complexity - O(n log n), when the pivot divides the array into reasonably balanced parts, though not necessarily equal halves.&lt;br&gt;
&lt;strong&gt;3. Worst Case -&lt;/strong&gt; Time complexity - O(n²), when -&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The smallest element is chosen as the pivot in an already sorted array.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The largest element is chosen as the pivot in an array sorted in decreasing order.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
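&lt;p&gt;The worst case can be observed directly by counting comparisons. Here quicksort_counted is a hypothetical instrumented variant of the Lomuto-style code above: on an already sorted input with a last-element pivot, every partition leaves one sub-array of size n-1, so the total comparison count is n(n-1)/2.&lt;/p&gt;

```python
def quicksort_counted(arr, low, high, counter):
    if low < high:
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            counter[0] += 1            # one comparison per loop iteration
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        pi = i + 1
        quicksort_counted(arr, low, pi - 1, counter)
        quicksort_counted(arr, pi + 1, high, counter)

n = 50
counter = [0]
quicksort_counted(list(range(n)), 0, n - 1, counter)  # sorted input
print(counter[0])  # 1225 == 50 * 49 // 2 comparisons
```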

&lt;p&gt;O(log n) - In the best and average cases, the array is roughly halved at each level of recursion.&lt;/p&gt;

&lt;p&gt;O(n) - Each partition pass compares every element in its range with the pivot.&lt;/p&gt;

&lt;p&gt;So, the total time complexity is O(n) * O(log n) = &lt;strong&gt;O(n log n)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity :&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Best and Average case - O( log n)&lt;/strong&gt; - for the recursive stack.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Worst Case - O(n)&lt;/strong&gt; - for the recursive stack.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Efficient for large datasets unless the pivot is chosen badly.&lt;/li&gt;
&lt;li&gt;Cache-friendly, as we sort the array in place and do not copy data to any auxiliary array.&lt;/li&gt;
&lt;li&gt;One of the fastest general-purpose algorithms for large data when stability is not required.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Worst-case time complexity is O(n²), which is very high for large datasets.&lt;/li&gt;
&lt;li&gt;Not Stable, as it does not maintain the relative order of equal elements.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It is used in programming libraries and frameworks. For example, Java’s Arrays.sort() uses a dual-pivot Quick Sort for primitive types.&lt;/li&gt;
&lt;li&gt;It is used in database query optimization, efficiently sorting rows during query execution.&lt;/li&gt;
&lt;li&gt;It works well for in-memory sorting of large datasets due to its cache-friendly properties.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;
  
  
  6. Heap Sort
&lt;/h2&gt;

&lt;p&gt;Heap Sort is a comparison-based sorting algorithm. It is an extension of Selection Sort. In Heap Sort, we create a Binary Heap and swap the maximum or minimum element with the last element. Then, we reduce the heap size by 1. This process is repeated until the length of the heap is greater than 1.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Solution:
    def sortArray(self, nums: List[int]) -&amp;gt; List[int]:

        def heapify(nums,n,i):
            largest = i
            left = 2*i +1
            right = 2*i + 2

            if left &amp;lt; n and nums[left] &amp;gt; nums[largest]:
                # nums[left],nums[largest] = nums[largest],nums[left]
                largest = left

            if right &amp;lt; n and nums[right] &amp;gt; nums[largest]:
                # nums[right] , nums[largest] = nums[largest],nums[right]
                largest = right

            if largest != i:
                nums[i],nums[largest] = nums[largest],nums[i]
                heapify(nums,n,largest)


        n = len(nums)

        # Build Max Heap
        for i in range((n//2)-1,-1,-1):
            heapify(nums,n,i)

        for i in range(n-1,0,-1):
            nums[0],nums[i] = nums[i],nums[0]
            heapify(nums,i,0)

        return nums


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Build a max heap using the heapify process. Heapify is a method used to maintain the heap property of a binary heap data structure.&lt;/li&gt;
&lt;li&gt;If the array is of size n, it contains n//2 parent nodes.&lt;/li&gt;
&lt;li&gt;For a parent at index i:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;a. Its left child is at index 2i+1.&lt;/p&gt;

&lt;p&gt;b. Its right child is at index 2i+2.&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Iterate over all subtrees with parents from index n//2 - 1 down to 0 and call the heapify function on them.&lt;/li&gt;
&lt;li&gt;Now the array is a max heap, with the largest element at index 0.&lt;/li&gt;
&lt;li&gt;Swap the first element with the last element of the heap and decrease the size of the heap by 1.&lt;/li&gt;
&lt;li&gt;Repeat this process while the length of the heap is greater than 1.&lt;/li&gt;
&lt;/ol&gt;
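&lt;p&gt;The parent/child index arithmetic can be checked directly after the build phase: using the same heapify as in the solution above, every parent at index i should dominate its children at 2i+1 and 2i+2.&lt;/p&gt;

```python
def heapify(nums, n, i):
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and nums[left] > nums[largest]:
        largest = left
    if right < n and nums[right] > nums[largest]:
        largest = right
    if largest != i:
        nums[i], nums[largest] = nums[largest], nums[i]
        heapify(nums, n, largest)

nums = [3, 9, 2, 1, 4, 5]
n = len(nums)
for i in range(n // 2 - 1, -1, -1):   # parents live at indices 0 .. n//2 - 1
    heapify(nums, n, i)

# Max-heap property: each parent is >= both of its children
assert all(nums[i] >= nums[2 * i + 1] for i in range(n) if 2 * i + 1 < n)
assert all(nums[i] >= nums[2 * i + 2] for i in range(n) if 2 * i + 2 < n)
print(nums[0])  # 9, the maximum, sits at the root
```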

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt; Heap Sort has a time complexity of O(n log n) in all three cases: best, average, and worst. This is because, irrespective of whether the array is already sorted or not, the same steps are followed each time a max heap is created and an element is swapped.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;O(log n)&lt;/strong&gt; - Each heapify call sifts an element down at most the height of the heap.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;O(n)&lt;/strong&gt; - An element is swapped to the end and the heap is repaired n times.&lt;/p&gt;

&lt;p&gt;So the total time complexity is O (n) * O (log n) = O (n log n)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity&lt;/strong&gt; : For all cases - O( log n) - for the recursive stack.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Time complexity is O(n log n), even for large datasets.&lt;/li&gt;
&lt;li&gt;Memory usage is almost constant.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Not Stable, as it does not maintain the relative order of equal elements.&lt;/li&gt;
&lt;li&gt;More swaps are required compared to Merge Sort.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It is useful for implementing priority queues where extraction of the maximum or minimum element is frequent.&lt;/li&gt;
&lt;li&gt;Useful in systems where in-place sorting is necessary and memory usage is critical.&lt;/li&gt;
&lt;li&gt;It is used in scenarios where rankings are needed.&lt;/li&gt;
&lt;/ol&gt;
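&lt;p&gt;Application 1 in practice: Python's standard-library heapq module maintains a binary min-heap, so the smallest element can always be popped in O(log n). The task tuples here are illustrative.&lt;/p&gt;

```python
import heapq

tasks = [(3, "low"), (1, "urgent"), (2, "normal")]
heapq.heapify(tasks)                 # O(n) heap construction
# each pop returns the smallest remaining priority in O(log n)
order = [heapq.heappop(tasks)[1] for _ in range(3)]
print(order)  # ['urgent', 'normal', 'low']
```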

&lt;h2&gt;
  
  
  7. Counting Sort and Radix Sort
&lt;/h2&gt;

&lt;p&gt;Counting Sort is a non-comparison-based sorting algorithm. It is particularly efficient when the range of input values is small compared to the number of elements to be sorted. The basic idea behind Counting Sort is to count the frequency of each distinct element in the input array and use that information to place the elements in their correct sorted positions.&lt;/p&gt;

&lt;p&gt;Radix Sort uses Counting Sort as a subroutine. It applies Counting Sort to each digit place of a number and repeatedly sorts until it processes all the digits of the largest number in the array.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def countingSort(arr, place):
    indexarr = [0] * 10
    sortedArr = [0] * len(arr)

    for i in range(len(arr)):
        index = (arr[i] // place) % 10
        indexarr[index] += 1

    for i in range(1, len(indexarr)):
        indexarr[i] += indexarr[i - 1]

    i = len(arr) - 1
    while i &amp;gt;= 0:
        index = (arr[i] // place) % 10
        sortedArr[indexarr[index] - 1] = arr[i]
        indexarr[index] -= 1
        i -= 1

    for i in range(len(arr)):
        arr[i] = sortedArr[i]

def radixSort(arr):
    maxElement = max(arr)
    place = 1

    while maxElement // place &amp;gt; 0:
        countingSort(arr, place)
        place *= 10

    return arr

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
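&lt;p&gt;The digit that each Counting Sort pass buckets on comes from the expression (x // place) % 10, with place stepping through 1, 10, 100, and so on. A quick check:&lt;/p&gt;

```python
# Extract each decimal digit of 4829, from ones place to thousands place
x = 4829
digits = [(x // place) % 10 for place in (1, 10, 100, 1000)]
print(digits)  # [9, 2, 8, 4]
```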





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Separate negative and non-negative numbers
        negative_nums = [num for num in nums if num &amp;lt; 0]
        positive_nums = [num for num in nums if num &amp;gt;= 0]

        # Sort negative numbers by converting them to positive
        negative_nums = [-num for num in negative_nums]
        if negative_nums:
            max_negative = max(negative_nums)
            place = 1
            while max_negative // place &amp;gt; 0:
                countingSort(negative_nums, place)
                place *= 10
        negative_nums = [-num for num in reversed(negative_nums)]

        # Sort positive numbers normally
        if positive_nums:
            max_positive = max(positive_nums)
            place = 1
            while max_positive // place &amp;gt; 0:
                countingSort(positive_nums, place)
                place *= 10

        # Combine the sorted negative and positive numbers
        return negative_nums + positive_nums
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Algorithm -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Find the maximum number in the array and determine the number of digits (d) in it. If the length of the number is d, Counting Sort is called d times on the array.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Call Counting Sort for each digit place in the array, starting from the ones place, then tens place, and so on.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In Counting sort:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;First, create an index array and map the elements based on their values. For example, if a digit is 4, increment the value at index 4 of the index array.&lt;/li&gt;
&lt;li&gt;Create a prefix sum array from the index array.&lt;/li&gt;
&lt;li&gt;Using the prefix sum array, create a new sorted array equal in length to the input array.&lt;/li&gt;
&lt;li&gt;For example, if the prefix sum array has the value 2 at index 1, place the value 1 at position 2 in the sorted array and decrement the value in the prefix sum array by 1.&lt;/li&gt;
&lt;/ul&gt;
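&lt;p&gt;The three bullet points translate directly into a standalone Counting Sort. Note this illustrative counting_sort sorts small non-negative integers by value, whereas the countingSort above buckets on a single digit; the mechanics (frequency counts, prefix sums, stable right-to-left placement) are the same.&lt;/p&gt;

```python
def counting_sort(arr, k):
    # k is the exclusive upper bound on values in arr (the range)
    count = [0] * k
    for v in arr:                 # 1) frequency of each value
        count[v] += 1
    for i in range(1, k):         # 2) prefix sums give final positions
        count[i] += count[i - 1]
    out = [0] * len(arr)
    for v in reversed(arr):       # 3) place from the right to stay stable
        count[v] -= 1
        out[count[v]] = v
    return out

print(counting_sort([4, 2, 2, 8, 3, 3, 1], 9))  # [1, 2, 2, 3, 3, 4, 8]
```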

&lt;p&gt;&lt;strong&gt;Time Complexity :&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Counting Sort&lt;/strong&gt; has a time complexity of O(n + k), where n is the number of elements to sort and k is the range of values (the size of the index array). This complexity holds for all three cases: best, average, and worst.&lt;/p&gt;

&lt;p&gt;This is because, irrespective of whether the array is already sorted or not, the same steps are followed each time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Radix Sort’s&lt;/strong&gt; time complexity increases by a factor of d, where d is the number of digits in the largest number in the array. Time complexity is O (d * (n + k))&lt;/p&gt;

&lt;p&gt;So the total time complexity is O (d) * O(n + k) = O (d * (n + k))&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Space Complexity :&lt;/strong&gt; For all cases - O(n + k), where n is the length of the input array and k is the range of values in the index array.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Stable as elements keep their relative order.&lt;/li&gt;
&lt;li&gt;Counting Sort generally performs faster than comparison-based sorting algorithms, such as Merge Sort and Quick Sort, when the range of input values is of the same order as the number of elements.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Disadvantages -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Counting sort doesn’t work on decimal values.&lt;/li&gt;
&lt;li&gt;Counting sort is inefficient if the range of values to be sorted is very large.&lt;/li&gt;
&lt;li&gt;Not an in-place sorting algorithm, as it uses O(n + k) extra space.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Applications -&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It is used in applications like counting character occurrences in strings.&lt;/li&gt;
&lt;li&gt;Radix Sort is useful for sorting integers with a large range of values, such as IDs or phone numbers.&lt;/li&gt;
&lt;li&gt;Efficient for sorting data with multiple keys, like dates or tuples.&lt;/li&gt;
&lt;/ol&gt;
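&lt;p&gt;Application 1 in miniature: the counting idea applied to character occurrences, here via Python's standard-library collections.Counter.&lt;/p&gt;

```python
from collections import Counter

# Tally how many times each character appears in a string
counts = Counter("counting sort")
print(counts["n"], counts["o"], counts["t"])  # 2 2 2
```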

</description>
      <category>python</category>
      <category>algorithms</category>
      <category>interview</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Rishab Trivedi</dc:creator>
      <pubDate>Thu, 12 Dec 2024 05:13:38 +0000</pubDate>
      <link>https://dev.to/rishabtrivedi/-25hi</link>
      <guid>https://dev.to/rishabtrivedi/-25hi</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/rishabtrivedi" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2559774%2F38c5d63c-3ea8-4c1d-8648-2cba1ffad4f3.png" alt="rishabtrivedi"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://dev.to/rishabtrivedi/binary-search-python-2fej" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;Binary Search || Python || Data Structures and Algorithms&lt;/h2&gt;
      &lt;h3&gt;Rishab Trivedi ・ Dec 12 '24&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#python&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#datastructures&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#algorithms&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#leetcode&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
    </item>
    <item>
      <title>Binary Search || Python || Data Structures and Algorithms</title>
      <dc:creator>Rishab Trivedi</dc:creator>
      <pubDate>Thu, 12 Dec 2024 04:51:38 +0000</pubDate>
      <link>https://dev.to/rishabtrivedi/binary-search-python-2fej</link>
      <guid>https://dev.to/rishabtrivedi/binary-search-python-2fej</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;Binary Search&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Binary Search&lt;/strong&gt; is an algorithm that repeatedly divides the search space in half. This searching technique follows the divide and conquer strategy: the search space is reduced to half in every iteration, resulting in a time complexity of O(log(n)), where n is the number of elements.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Condition:&lt;/strong&gt; The array should be sorted. Binary search can also be applied to monotonic functions, where we search over a monotonically increasing or decreasing range of answers.&lt;/p&gt;

&lt;p&gt;It works when we need to narrow down the search space in logarithmic time.&lt;/p&gt;
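&lt;p&gt;The monotonic-function case mentioned above can be illustrated with a small sketch: finding the largest x with x·x ≤ target (the integer square root). The predicate x·x ≤ target is True then False as x grows, so the search space can be halved just like in a sorted array. The name int_sqrt is illustrative.&lt;/p&gt;

```python
def int_sqrt(target):
    left, right = 0, target
    answer = 0
    while left <= right:
        mid = (left + right) // 2
        if mid * mid <= target:
            answer = mid          # mid is feasible, try something larger
            left = mid + 1
        else:
            right = mid - 1       # mid overshoots, try something smaller
    return answer

print(int_sqrt(26))  # 5
```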

&lt;p&gt;We use two pointers, left and right. Take the average of left and right to find the mid element.&lt;/p&gt;

&lt;p&gt;Now, we check where we should move our left and right pointers based on the condition.&lt;/p&gt;

&lt;p&gt;Mainly, three steps are required to solve a problem:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Pre-processing:&lt;/strong&gt; Sort the input if it is not sorted.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Binary Search:&lt;/strong&gt; Use two pointers and find the mid to divide the search space, then choose the correct half accordingly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Post-processing:&lt;/strong&gt; Determine the output.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Advantages of Binary Search Algorithm -&lt;/strong&gt; Binary search is faster than linear search for large data because it cuts the array in half each time, instead of checking each element one by one. This makes it quicker and more efficient.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Limitations:&lt;/strong&gt; Binary search only works on sorted arrays, so it is not efficient for unsorted data, because sorting first takes extra time. For very small arrays, a simple linear scan can also be faster in practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Applications:&lt;/strong&gt; It is used to search for an element in a sorted array with O(log(n)) time complexity, and it can also be used to find boundary elements, such as the smallest or largest element satisfying a condition.&lt;/p&gt;
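&lt;p&gt;Python's standard-library bisect module performs the same O(log n) search on a sorted list; bisect_left returns the leftmost insertion point, which doubles as a membership test.&lt;/p&gt;

```python
import bisect

nums = [1, 3, 5, 7, 9]
i = bisect.bisect_left(nums, 7)        # leftmost position where 7 fits
found = i < len(nums) and nums[i] == 7  # True only if 7 is actually present
print(i, found)  # 3 True
```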

&lt;p&gt;Basic Binary Search Code - &lt;/p&gt;

&lt;h1&gt;
  
  
  Code
&lt;/h1&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def binarySearch(nums, target):
    if len(nums) == 0:
        return -1

    left, right = 0, len(nums) - 1

    while left &amp;lt;= right:
        mid = (left + right) // 2
        if nums[mid] == target:
            return mid
        elif nums[mid] &amp;lt; target:
            left = mid + 1
        else:
            right = mid - 1

    # End Condition: left &amp;gt; right
    return -1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;33. &lt;a href="https://leetcode.com/problems/search-in-rotated-sorted-array/solutions/6126488/binary-search-162-find-peak-element-33-search-in-rotated-sorted-array-python" rel="noopener noreferrer"&gt;Search in Rotated Sorted Array&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
Given the array nums after the possible rotation and an integer target, return the index of target if it is in nums, or -1 if it is not in nums.&lt;br&gt;
You must write an algorithm with O(log n) runtime complexity.&lt;br&gt;
Example 1:&lt;br&gt;
Input: nums = [4,5,6,7,0,1,2], target = 0&lt;br&gt;
Output: 4&lt;/p&gt;

&lt;p&gt;Example 2:&lt;br&gt;
Input: nums = [4,5,6,7,0,1,2], target = 3&lt;br&gt;
Output: -1&lt;/p&gt;

&lt;p&gt;Example 3:&lt;br&gt;
Input: nums = [1], target = 0&lt;br&gt;
Output: -1&lt;/p&gt;

&lt;h1&gt;
  
  
  Code
&lt;/h1&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Solution:
    def search(self, nums: List[int], target: int) -&amp;gt; int:
        left = 0
        right = len(nums)-1

        while left &amp;lt;= right:

            mid = (left + right)//2
            print(f'left is {left},right is {right} and mid is {mid}')
            if nums[mid]==target:
                return mid

            if nums[mid] &amp;gt;= nums[left]:
                # if nums[mid]&amp;lt; target and target &amp;gt;= nums[left]:
                if nums[left] &amp;lt;= target &amp;lt; nums[mid]:
                    right = mid -1
                else:
                    left = mid +1
            else:
                # if nums[mid] &amp;lt; target and target &amp;lt;= nums[right]:
                if nums[mid] &amp;lt; target &amp;lt;= nums[right]:
                    left = mid +1
                else:
                    right = mid - 1

        return -1

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Use two pointers, left and right, and iterate until they overlap.&lt;/li&gt;
&lt;li&gt;Find the mid element.&lt;/li&gt;
&lt;li&gt;Since the array is sorted but rotated, we can’t simply compare the left or right elements with the mid.&lt;/li&gt;
&lt;li&gt;First, determine which half (left or right) is sorted by comparing the mid element with the left or right pointer.&lt;/li&gt;
&lt;li&gt;Based on this, adjust the pointers accordingly.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Time Complexity - O(log(n))&lt;/strong&gt; as the search space is getting divided into half in each iteration.&lt;br&gt;
&lt;strong&gt;Space Complexity - O(1)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Monotonically Increasing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://leetcode.com/problems/find-peak-element/solutions/6138010/binary-search-162-find-peak-element-33-search-in-rotated-sorted-array-python" rel="noopener noreferrer"&gt;162. Find Peak Element&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A peak element is an element that is strictly greater than its neighbors.&lt;br&gt;
Given a 0-indexed integer array nums, find a peak element, and return its index. If the array contains multiple peaks, return the index to any of the peaks.&lt;br&gt;
You may imagine that nums[-1] = nums[n] = -∞. In other words, an element is always considered to be strictly greater than a neighbor that is outside the array.&lt;br&gt;
You must write an algorithm that runs in O(log n) time.&lt;/p&gt;

&lt;p&gt;Example 1:&lt;br&gt;
Input: nums = [1,2,3,1]&lt;br&gt;
Output: 2&lt;br&gt;
Explanation: 3 is a peak element and your function should return the index number 2.&lt;br&gt;
Example 2:&lt;br&gt;
Input: nums = [1,2,1,3,5,6,4]&lt;br&gt;
Output: 5&lt;br&gt;
Explanation: Your function can return either index number 1 where the peak element is 2, or index number 5 where the peak element is 6.&lt;/p&gt;

&lt;h1&gt;
  
  
  Code
&lt;/h1&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Solution:
    def findPeakElement(self, nums: List[int]) -&amp;gt; int:
        left = 0
        right = len(nums) -1
        while left &amp;lt;= right:
            mid = (left+right)//2

            if mid &amp;gt;0 and nums[mid] &amp;lt; nums[mid-1]:
                right = mid -1
            elif mid &amp;lt; len(nums)-1 and  nums[mid] &amp;lt; nums[mid+1]:
                left = mid+1
            else:
                return mid
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;In this type of problem, we check for a peak by comparing the mid element with its left and right neighbours.&lt;/li&gt;
&lt;li&gt;This helps determine whether the graph is trending upward or downward at mid.&lt;/li&gt;
&lt;li&gt;If the slope is upward (nums[mid] &amp;lt; nums[mid+1]), a peak must exist to the right, so explore the right subspace.&lt;/li&gt;
&lt;li&gt;If the slope is downward (nums[mid] &amp;lt; nums[mid-1]), explore the left subspace.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Time Complexity - O(log(n))&lt;/strong&gt; as the search space is getting divided into half in each iteration.&lt;br&gt;
&lt;strong&gt;Space Complexity - O(1)&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>python</category>
      <category>datastructures</category>
      <category>algorithms</category>
      <category>leetcode</category>
    </item>
  </channel>
</rss>
