<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: MD. Tahsin Ferdous</title>
    <description>The latest articles on DEV Community by MD. Tahsin Ferdous (@tahsin005).</description>
    <link>https://dev.to/tahsin005</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1605682%2Fc82b68f9-e585-495c-8c55-64e9ff9a325b.png</url>
      <title>DEV Community: MD. Tahsin Ferdous</title>
      <link>https://dev.to/tahsin005</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tahsin005"/>
    <language>en</language>
    <item>
      <title>Big O Notation and the Climb Over Constants.</title>
      <dc:creator>MD. Tahsin Ferdous</dc:creator>
      <pubDate>Fri, 03 Jan 2025 11:57:43 +0000</pubDate>
      <link>https://dev.to/tahsin005/big-o-notation-and-the-climb-over-constants-53lk</link>
      <guid>https://dev.to/tahsin005/big-o-notation-and-the-climb-over-constants-53lk</guid>
      <description>&lt;p&gt;I was recently going through the implementations and complexities of common sorting algorithms such as Quicksort and Mergesort. I noticed that many programmers, including myself, often ignore the constant factors when calculating time complexity. This is typically acceptable for a high-level analysis, but there are situations where this oversight can lead to misunderstandings about the efficiency of algorithms.&lt;/p&gt;

&lt;p&gt;To illustrate, both Quicksort and Mergesort have an average time complexity of O(n log n). Mergesort consistently achieves this complexity in all cases, making it reliable and predictable. Quicksort, on the other hand, has a worst-case time complexity of O(n²), which occurs when the pivot selection leads to highly unbalanced partitions. In these scenarios, Quicksort can perform as poorly as simple algorithms like Selection Sort. However, Quicksort is still generally considered more efficient in practice because it has an average-case time complexity of O(n log n) and tends to be faster due to its lower constant factors.&lt;/p&gt;

&lt;p&gt;This leads to a critical question: If Quicksort has a worst-case time complexity of O(n²) and Mergesort consistently performs at O(n log n), why is Quicksort often preferred? Isn't Mergesort inherently faster?&lt;/p&gt;

&lt;p&gt;Let's break it down. Suppose you have this simple function to print every item in an array.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;void printItem1(vector&amp;lt;int&amp;gt; &amp;amp;arr) {
    for (int i = 0; i &amp;lt; arr.size(); i++) {
        cout &amp;lt;&amp;lt; arr[i] &amp;lt;&amp;lt; " ";
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function goes through every item in the array and prints it out. Because it iterates over the whole array once, this function runs in O(n) time. Now suppose you change this function so it sleeps for 1 second before it prints out an item:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#include &amp;lt;thread&amp;gt; // For sleep functionality
#include &amp;lt;chrono&amp;gt; // For specifying time duration
void printItem2(vector&amp;lt;int&amp;gt; &amp;amp;arr) {
    for (int i = 0; i &amp;lt; arr.size(); i++) {
        this_thread::sleep_for(chrono::seconds(1)); // Sleep for 1 second
        cout &amp;lt;&amp;lt; arr[i] &amp;lt;&amp;lt; " ";
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Before it prints out an item, it will pause for 1 second. Suppose you&lt;br&gt;
print an array of five items using both functions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[1, 2, 3, 4, 5]
printItem1: 1 2 3 4 5
printItem2: &amp;lt;sleep&amp;gt; 1 &amp;lt;sleep&amp;gt; 2 &amp;lt;sleep&amp;gt; 3 &amp;lt;sleep&amp;gt; 4 &amp;lt;sleep&amp;gt; 5
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Both functions loop through the list once, so they're both O(n) time.&lt;br&gt;
Which one do you think will be faster in practice? I think printItem1&lt;br&gt;
will be much faster because it doesn't pause for 1 second before printing&lt;br&gt;
an item. So even though both functions are the same speed in Big O&lt;br&gt;
notation, printItem1 is faster in practice. When you write Big O&lt;br&gt;
notation like O(n), it really means this,&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;C * n, where C is some fixed amount of time per step and n is the size of the input.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;C is some fixed amount of time that your algorithm takes. It's called the&lt;br&gt;
constant. For example, it might be 10 milliseconds * n for printItem1 versus 1 second * n for printItem2.&lt;br&gt;
We usually ignore that constant, because if two algorithms have&lt;br&gt;
different Big O times, the constant doesn't matter. Take binary search&lt;br&gt;
and simple search, for example. Suppose both algorithms had these&lt;br&gt;
constants.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;10ms * n for simple search
1 sec * log n for binary search
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We might say, "Wow! Simple search has a constant of 10 milliseconds,&lt;br&gt;
but binary search has a constant of 1 second. Simple search is way&lt;br&gt;
faster!" Now suppose you're searching a list of 1 billion elements. Here&lt;br&gt;
are the times.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;simple search | 10ms * 1 billion       = 116 days
binary search | 1 sec * log 1 billion  = 30 seconds
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
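&lt;p&gt;As a quick sanity check on those numbers (a hypothetical sketch, using the same made-up per-step constants as above), you can compute both totals directly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#include &amp;lt;cmath&amp;gt;    // log2
#include &amp;lt;iostream&amp;gt;
using namespace std;

int main() {
    double n = 1e9;                   // 1 billion elements
    double simple = 0.010 * n;        // 10 ms per step, n steps
    double binary = 1.0 * log2(n);    // 1 sec per step, log2(n) steps
    cout &amp;lt;&amp;lt; simple / 86400 &amp;lt;&amp;lt; " days\n";  // about 116 days
    cout &amp;lt;&amp;lt; binary &amp;lt;&amp;lt; " seconds\n";       // about 30 seconds
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;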



&lt;p&gt;As you can see, binary search is still way faster. That constant didn't&lt;br&gt;
make a difference at all.&lt;br&gt;
But sometimes the constant can make a difference. Quicksort versus&lt;br&gt;
merge sort is one example. Quicksort has a smaller constant than&lt;br&gt;
merge sort. So if they're both O(n log n) time, quicksort is faster. And&lt;br&gt;
quicksort is faster in practice because it hits the average case way more&lt;br&gt;
often than the worst case.&lt;/p&gt;
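&lt;p&gt;To make that last point concrete, here is a minimal quicksort sketch (hypothetical code, not from the original discussion) that picks its pivot at random. Random pivots make the unbalanced partitions behind the O(n²) worst case extremely unlikely, which is why quicksort hits its O(n log n) average case in practice:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#include &amp;lt;cstdlib&amp;gt;  // rand
#include &amp;lt;utility&amp;gt;  // swap
#include &amp;lt;vector&amp;gt;
using namespace std;

void quicksort(vector&amp;lt;int&amp;gt; &amp;amp;arr, int lo, int hi) {
    if (lo &amp;gt;= hi) return;
    swap(arr[lo + rand() % (hi - lo + 1)], arr[hi]); // move a random pivot to the end
    int pivot = arr[hi], i = lo;
    for (int j = lo; j &amp;lt; hi; j++)
        if (arr[j] &amp;lt; pivot) swap(arr[i++], arr[j]);
    swap(arr[i], arr[hi]);                           // pivot lands in its final spot
    quicksort(arr, lo, i - 1);
    quicksort(arr, i + 1, hi);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;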

</description>
    </item>
    <item>
      <title>DSA before Development or Development before DSA?</title>
      <dc:creator>MD. Tahsin Ferdous</dc:creator>
      <pubDate>Fri, 03 Jan 2025 11:02:59 +0000</pubDate>
      <link>https://dev.to/tahsin005/dsa-before-development-or-development-before-dsa-1nm4</link>
      <guid>https://dev.to/tahsin005/dsa-before-development-or-development-before-dsa-1nm4</guid>
      <description>&lt;p&gt;I was speaking to one of my developer friends the other day. Suddenly, a topic came up: The question that we discussed for a while was 'Should we learn Data Structures and Algorithms (DSA) first before jumping to development or the other way around?' I attempted to persuade him out of it explaining the importance of mastering DSA.&lt;br&gt;
If I speak about it from personal experience I can say that because I come from the problem solving background, I used to be just like that web developer that wanted to build some stuff and see the effects right away. However, this way of structuring my tasks did not let me proceed as a slow and steady workhorse - I faced a problem as my projects began to get larger. It is at this point that I can really appreciate just how significant DSA actually is.&lt;/p&gt;

&lt;p&gt;So, here's why I believe learning Data Structures and Algorithms should come first before jumping into development.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Building a Strong Foundation:&lt;/strong&gt;&lt;br&gt;
Many modern development practices are built on the principles that DSA covers. For example, graph algorithms underpin routing in networks such as the internet, while trees play a significant role in database indexing. Understanding DSA deeply opens the door to many areas of development and makes you a more versatile engineer.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enhancing Problem-Solving Skills:&lt;/strong&gt;&lt;br&gt;
Problem solving is central to DSA, and practicing it teaches you how to approach a coding problem based on the kind of problem at hand. Working through sorting, searching, and traversal algorithms sharpens the way you think, which pays off in development, especially when you face problems that demand creative, out-of-the-box solutions. Developers with strong problem-solving skills meet those challenges with less effort and build reliable, high-performance applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Algorithmic Efficiency and Data Organization:&lt;/strong&gt;&lt;br&gt;
Studying time and space complexity in DSA tells you how different algorithms scale with the size of the data they are applied to, so you can choose the option that best fits your application's requirements when working with large data. It also teaches you to pick the right data structure from variants such as arrays, linked lists, stacks, hash tables, and trees. For instance, a linked list is appropriate when you frequently add or delete elements, while an array or hash table is ideal for fast lookups.&lt;/p&gt;
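&lt;p&gt;A small illustration of that trade-off (hypothetical example code): a linked list splices in a new element in O(1) once you hold an iterator, while a hash table answers lookups by key in O(1) on average:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#include &amp;lt;iostream&amp;gt;
#include &amp;lt;iterator&amp;gt;  // next
#include &amp;lt;list&amp;gt;
#include &amp;lt;string&amp;gt;
#include &amp;lt;unordered_map&amp;gt;
using namespace std;

int main() {
    // Frequent inserts/deletes at a known position: a list splices in O(1).
    list&amp;lt;int&amp;gt; seq = {1, 2, 4};
    auto it = next(seq.begin(), 2);      // iterator at the value 4
    seq.insert(it, 3);                   // no element shifting, unlike an array
    for (int x : seq) cout &amp;lt;&amp;lt; x &amp;lt;&amp;lt; " ";  // 1 2 3 4
    cout &amp;lt;&amp;lt; "\n";

    // Fast lookups by key: a hash table gives O(1) average access.
    unordered_map&amp;lt;string, int&amp;gt; ages = {{"alice", 30}, {"bob", 25}};
    cout &amp;lt;&amp;lt; ages["alice"] &amp;lt;&amp;lt; "\n";  // 30
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;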

&lt;p&gt;&lt;strong&gt;Preparing for Technical Interviews:&lt;/strong&gt;&lt;br&gt;
DSA skills are in high demand, especially at technology companies. Most coding interviews focus heavily on a candidate's ability to solve algorithmic problems and analyze their time complexity. With DSA already under your belt, these challenges won't catch you off guard; you will be ready to face them head on.&lt;br&gt;
Of course, like any other skill, learning DSA takes some effort at the beginning. Consider it an investment in your future opportunities, growth, and self-assessment. The learning process may not be easy at first, but the positives outweigh the negatives, and many tutorials, articles, and practice platforms (like LeetCode, Codeforces, etc.) are available to make it engaging and fun.&lt;br&gt;
Prioritizing DSA does not mean you should completely discard development. While establishing your DSA foundation, you can incrementally introduce basic development practice. Begin with small projects that let you apply fundamental data structures and sorting techniques; this will reinforce your theoretical knowledge of DSA and show you how each concept works in practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
Software development is a process that goes well beyond writing the actual program. By giving priority to Data Structures and Algorithms (DSA), you prepare yourself to think more clearly, write clean and optimized code, and meet the challenges of today's software engineering. This knowledge forms a base for ongoing learning and growth, making you a more versatile developer in the future.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
