<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: farizmamad</title>
    <description>The latest articles on DEV Community by farizmamad (@farizmamad).</description>
    <link>https://dev.to/farizmamad</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1256721%2F7a5142ff-461a-4c1a-9a9d-d3959f110398.jpeg</url>
      <title>DEV Community: farizmamad</title>
      <link>https://dev.to/farizmamad</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/farizmamad"/>
    <language>en</language>
    <item>
      <title>How Software Engineering Enables National-Scale Product Distribution</title>
      <dc:creator>farizmamad</dc:creator>
      <pubDate>Mon, 03 Feb 2025 03:01:02 +0000</pubDate>
      <link>https://dev.to/farizmamad/how-software-engineering-enables-national-scale-product-distribution-45dd</link>
      <guid>https://dev.to/farizmamad/how-software-engineering-enables-national-scale-product-distribution-45dd</guid>
      <description>&lt;p&gt;Are you looking to expand your physical product distribution beyond local markets? As a technical leader who's helped a company scale from local to national distribution, I've learned that success lies in the intersection of technology and business operations.&lt;/p&gt;

&lt;p&gt;Let me share how the right software engineering approach can transform your distribution capabilities:&lt;/p&gt;

&lt;h2&gt;The Business Challenge&lt;/h2&gt;

&lt;p&gt;Local success often creates an urgent need to expand. But scaling distribution isn't just about hiring more people or opening new warehouses. It requires systems that can:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Handle exponentially growing transaction volumes&lt;/li&gt;
&lt;li&gt;Maintain speed and reliability across regions&lt;/li&gt;
&lt;li&gt;Support real-time decision making&lt;/li&gt;
&lt;li&gt;Ensure accurate tracking and auditing&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;Key Success Metrics&lt;/h2&gt;

&lt;p&gt;Before diving into solutions, let's focus on what matters:&lt;/p&gt;

&lt;h3&gt;Sales Team Efficiency&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Application response time under 2 seconds for field sales&lt;/li&gt;
&lt;li&gt;99.9% system availability&lt;/li&gt;
&lt;li&gt;Automated visit scheduling and routing&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;Operation Reliability&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Real-time inventory tracking&lt;/li&gt;
&lt;li&gt;Trackable order processing&lt;/li&gt;
&lt;li&gt;Full audit capability&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;Customer Satisfaction&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Instant order confirmation&lt;/li&gt;
&lt;li&gt;Accurate delivery estimates&lt;/li&gt;
&lt;li&gt;Transparent tracking&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;The Strategic Approach&lt;/h2&gt;

&lt;p&gt;Success comes from aligning technology with business operations:&lt;/p&gt;

&lt;h3&gt;Infrastructure Strategy&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Regional server deployment for optimal performance&lt;/li&gt;
&lt;li&gt;Automated scaling to handle high traffic during peak hours&lt;/li&gt;
&lt;li&gt;Multi-platform accessibility via web and mobile apps&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;Team Alignment&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Shared, clear, and common objectives&lt;/li&gt;
&lt;li&gt;Teams structured according to well-defined roles&lt;/li&gt;
&lt;li&gt;Iterative product delivery&lt;/li&gt;
&lt;li&gt;Empowered and accountable teams&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;Technical Excellence&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Optimized response times&lt;/li&gt;
&lt;li&gt;Efficient database management&lt;/li&gt;
&lt;li&gt;Lightweight customer-facing applications&lt;/li&gt;
&lt;li&gt;Scalable architecture&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;The company saw improvements in its business processes:&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;40% reduction in order processing time&lt;/li&gt;
&lt;li&gt;2x increase in sales team productivity&lt;/li&gt;
&lt;li&gt;Significant sales leader planning time reduction through scheduling automation&lt;/li&gt;
&lt;li&gt;Increased revenue&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;A common framework for getting started&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Assess your current systems&lt;/li&gt;
&lt;li&gt;Identify bottlenecks&lt;/li&gt;
&lt;li&gt;Prioritize improvements&lt;/li&gt;
&lt;li&gt;Start with quick wins&lt;/li&gt;
&lt;li&gt;Build for scale&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;What distribution challenges is your business facing? Let's discuss in the comments.&lt;br&gt;
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - &lt;br&gt;
If you're looking for a technical leader who brings dedication to your business journey, let's connect. I'm open to discussing how my experience building scalable systems and leading high-performing teams could help drive your business growth.&lt;/p&gt;

</description>
      <category>softwareengineering</category>
      <category>business</category>
      <category>distribution</category>
      <category>retail</category>
    </item>
    <item>
      <title>MongoDB Model.bulkWrite method solved my issue importing documents from hours to 5 minutes</title>
      <dc:creator>farizmamad</dc:creator>
      <pubDate>Fri, 01 Mar 2024 01:07:03 +0000</pubDate>
      <link>https://dev.to/farizmamad/mongodb-modelbulkwrite-method-solved-my-issue-importing-documents-from-hours-to-5-minutes-25el</link>
      <guid>https://dev.to/farizmamad/mongodb-modelbulkwrite-method-solved-my-issue-importing-documents-from-hours-to-5-minutes-25el</guid>
      <description>&lt;h2&gt;
  
  
  Situation
&lt;/h2&gt;

&lt;p&gt;I have a Node.js backend service, built on the NestJS framework, that uses the Mongoose object modeling tool to connect to a MongoDB database. Its job is to import data from a CSV file into the database via an API endpoint. The CSV consists of &amp;gt;10k rows, each with multiple columns. The service has three tasks:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Convert the CSV data to an array of JavaScript objects.&lt;/li&gt;
&lt;li&gt;Calculate the value of new fields for each object using some business logic.&lt;/li&gt;
&lt;li&gt;Insert each object into the database OR update the existing document if the new field value matches.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;Task&lt;/h2&gt;

&lt;p&gt;Task 1 was relatively easy: the csv-parse library converts rows and columns of data into an array of JavaScript objects. The problem appeared in the calculation step and when writing each object to the database. Calculating each object's new fields might involve querying the database, doing arithmetic, manipulating strings, and so on, so each object could end up with a different update. Furthermore, the database operation could be either updateOne or create, depending on the value of the new field. Below is pseudocode for the first version of the code.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
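&lt;p&gt;The original gist is not embedded in this feed, so here is a minimal sketch of that first version. The model name, CSV fields, and matching logic are hypothetical stand-ins, not the real ones:&lt;/p&gt;

```javascript
// Sketch of the slow first version (hypothetical ProductModel and fields).
// Every iteration awaits its own database call, so n rows cost n round-trips.
async function importRows(rows, ProductModel) {
  for (const row of rows) {
    // Business-logic step: derive the new field from the raw CSV value.
    const matchKey = row.sku.trim().toUpperCase();

    const existing = await ProductModel.findOne({ matchKey });
    if (existing) {
      // The new field value matches an existing document: update it.
      await ProductModel.updateOne({ matchKey }, { $set: { stock: row.stock } });
    } else {
      // No match: insert a new document.
      await ProductModel.create({ matchKey, stock: row.stock });
    }
  }
}
```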


&lt;p&gt;Can you guess the performance? &lt;strong&gt;Processing 1k rows took an hour or more&lt;/strong&gt;! If an error occurred, it took even longer. Imagine how long I had to wait to process 10k rows. &lt;strong&gt;It was such a waste of time&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;Action and Result&lt;/h2&gt;

&lt;p&gt;So I dug into the root cause and found that the updateOne and create commands inside the loop consumed most of the time. Why? &lt;strong&gt;Because each updateOne or create makes a round-trip to the database&lt;/strong&gt;. Moreover, create needs time to index. Thus, n documents = n round-trips, where each round-trip may take 100 ms to 1000 ms.&lt;/p&gt;

&lt;p&gt;That is where I found the solution: &lt;strong&gt;Model.bulkWrite&lt;/strong&gt;. Instead of making n round-trips to the database, one per loop iteration, &lt;strong&gt;we write in a batch&lt;/strong&gt;. The idea is to do only the calculations inside the loop, collect the insert and update operations, and then send them all in a single call that performs the writes in batch. &lt;strong&gt;Now the service processes 8k rows in under 5 minutes&lt;/strong&gt;. Below is the code after the fix.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
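&lt;p&gt;Again, the gist is not embedded in this feed, so this is a sketch of the fixed version with the same hypothetical names. The loop now only builds an operation object per row; a single &lt;code&gt;bulkWrite&lt;/code&gt; call then ships the whole batch in one round-trip:&lt;/p&gt;

```javascript
// Sketch of the fixed version (hypothetical ProductModel and fields):
// do the per-row calculation in memory, then write everything in one batch.
function buildOps(rows) {
  return rows.map((row) => {
    const matchKey = row.sku.trim().toUpperCase();
    return {
      updateOne: {
        filter: { matchKey },
        update: { $set: { matchKey, stock: row.stock } },
        upsert: true, // insert when no document matches, update otherwise
      },
    };
  });
}

async function importRowsBulk(rows, ProductModel) {
  const ops = buildOps(rows);
  // One round-trip for the whole batch instead of one per row.
  return ProductModel.bulkWrite(ops);
}
```

&lt;p&gt;With upserted writes like this, the insert-or-update decision moves into MongoDB itself, so the service no longer needs a per-row lookup round-trip either.&lt;/p&gt;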


&lt;h2&gt;A Short Note on Model.bulkWrite&lt;/h2&gt;

&lt;p&gt;By definition from MongoDB, &lt;strong&gt;Model.bulkWrite&lt;/strong&gt; or &lt;strong&gt;db.collection.bulkWrite&lt;/strong&gt; is a method that &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Performs multiple write operations with controls for order of execution".&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;It receives write operations as input and returns counts for each write operation, and an array containing an _id for each inserted or upserted document.&lt;/p&gt;

&lt;p&gt;In &lt;strong&gt;Mongoose&lt;/strong&gt;, the details look like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# write operations:
- insertOne
- replaceOne
- updateOne
- updateMany
- deleteOne
- deleteMany
# options: some bulk write options.
# counts:
- insertedCount
- matchedCount
- modifiedCount
- deletedCount
- upsertedCount
# upsertedIds: array of ObjectIds
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fut65cp2swkf57rq0yn4s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fut65cp2swkf57rq0yn4s.png" alt="MongoDB logo" width="800" height="201"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So I recommend using Model.bulkWrite in your application if you have a similar use case, where your application does write operations on lots of data to MongoDB with different operations on each document.&lt;/p&gt;

</description>
      <category>mongodb</category>
      <category>backend</category>
      <category>nestjs</category>
    </item>
    <item>
      <title>Tool for Integration Testing in NestJS-based Backend using Mongodb Replica Set</title>
      <dc:creator>farizmamad</dc:creator>
      <pubDate>Mon, 15 Jan 2024 02:47:17 +0000</pubDate>
      <link>https://dev.to/farizmamad/tool-for-integration-testing-in-nestjs-based-backend-using-mongodb-replica-set-1pjg</link>
      <guid>https://dev.to/farizmamad/tool-for-integration-testing-in-nestjs-based-backend-using-mongodb-replica-set-1pjg</guid>
      <description>&lt;p&gt;Backend services built on top of NestJS framework is able to utilize the test framework Jest. NestJS and Jest come up with solutions to develop unit tests and end-to-end tests. While both are necessary, they are out of scope of this document. Here we present a documentation for integration tests, which is where the tests are run against a service integrated with its dependencies, e.g. database, cloud features, etc.&lt;/p&gt;

&lt;h2&gt;Situation&lt;/h2&gt;

&lt;p&gt;Previously, tests in my project were done manually by QA or Product Ops. For simple cases, this practice is enough. However, the project is growing and now has advanced features supporting business processes. Advanced features are hard to test in detail manually, because doing so is exhausting and prone to human error. Furthermore, QA and Product Ops are loaded with other daily tasks that take time.&lt;/p&gt;

&lt;p&gt;Another problem comes from service dependencies with advanced features. For example, MongoDB, through its Node.js library Mongoose, has a transaction feature that is really important for data consistency. My team and I experienced data inconsistency in mid-2022: QA and Product Ops found that the tests passed manually, yet in production the inconsistency still existed.&lt;/p&gt;

&lt;h2&gt;Task&lt;/h2&gt;

&lt;p&gt;Manual testing cannot cover advanced engineering problems, so test automation is the way to break through this hard issue. Problems coming from dependencies cannot be covered by unit tests, yet testing them end-to-end is overkill. That is why we, the engineers, proposed creating integration tests.&lt;/p&gt;

&lt;h2&gt;Action&lt;/h2&gt;

&lt;p&gt;We prepared the requirements and steps to run integration tests in our backend services.&lt;/p&gt;

&lt;h2&gt;Requirements&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;A shell script containing the MongoDB replica set initiation, placed at the project root as &lt;code&gt;scripts/setup.sh&lt;/code&gt;. Here, a pre-defined script sets up a replica set of up to 3 members with 1 primary.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;A docker-compose file placed at the root, &lt;code&gt;docker-compose.yml&lt;/code&gt;. This file creates the MongoDB replica set containers and one DB setup container.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Npm scripts for pretest, test, and posttest in &lt;code&gt;package.json&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;A file inside the test directory, &lt;code&gt;test/jest-integration.json&lt;/code&gt;, so that Jest can identify the test cases for integration testing.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;/ol&gt;
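&lt;p&gt;The gists for these files are not embedded in this feed. As an illustration only, the docker-compose.yml for requirement 2 could be sketched roughly like this; the image tag, service names, and replica set name are assumptions, not the real configuration:&lt;/p&gt;

```yaml
# Hypothetical sketch: three mongod containers in one replica set ("rs0")
# plus a one-shot setup container that runs scripts/setup.sh (rs.initiate).
services:
  mongo1:
    image: mongo:6
    command: ["mongod", "--replSet", "rs0", "--bind_ip_all"]
    ports:
      - "27017:27017"
  mongo2:
    image: mongo:6
    command: ["mongod", "--replSet", "rs0", "--bind_ip_all"]
  mongo3:
    image: mongo:6
    command: ["mongod", "--replSet", "rs0", "--bind_ip_all"]
  mongo-setup:
    image: mongo:6
    depends_on:
      - mongo1
      - mongo2
      - mongo3
    volumes:
      - ./scripts/setup.sh:/scripts/setup.sh:ro
    entrypoint: ["bash", "/scripts/setup.sh"]
```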

&lt;p&gt;5. A &lt;code&gt;.env&lt;/code&gt; file placed at the root of the project containing &lt;code&gt;DB_URI_TEST&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DB_URI_TEST=mongodb://localhost:27017/db-test?readPreference=primary&amp;amp;directConnection=true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;6. Create test cases in files named with the pattern &lt;code&gt;&amp;lt;any case&amp;gt;.integration.spec.ts&lt;/code&gt; inside the test directory. For example:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;test/order.integration.spec.ts
test/product.integration.spec.ts 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Make sure to connect to the primary host before all test cases.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
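&lt;p&gt;The gist is not embedded in this feed, but the idea can be sketched with a hypothetical helper that forces the &lt;code&gt;DB_URI_TEST&lt;/code&gt; connection string to target the primary member directly before handing it to mongoose.connect inside Jest's beforeAll hook:&lt;/p&gt;

```javascript
// Hypothetical helper: normalise the test URI so every connection goes
// straight to the primary member (same query params as the .env example).
function toPrimaryUri(baseUri) {
  const url = new URL(baseUri);
  url.searchParams.set('readPreference', 'primary');
  url.searchParams.set('directConnection', 'true');
  return url.toString();
}

// In a spec file, roughly:
//   beforeAll(async () => {
//     await mongoose.connect(toPrimaryUri(process.env.DB_URI_TEST));
//   });
//   afterAll(async () => {
//     await mongoose.disconnect();
//   });
```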



&lt;h2&gt;Steps to run the tests&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open a terminal at root project directory.&lt;/li&gt;
&lt;li&gt;Run the test command, &lt;code&gt;npm run test:integration&lt;/code&gt;. Note that we do not need to call the pretest and posttest commands ourselves: npm runs pretest before the test and posttest after all test cases pass.&lt;/li&gt;
&lt;li&gt;Npm will first run the pretest script.&lt;/li&gt;
&lt;li&gt;Wait for the tests to finish.&lt;/li&gt;
&lt;li&gt;The test run is done.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If the tests pass, npm runs the posttest script.&lt;/li&gt;
&lt;li&gt;Otherwise, the run stops without executing posttest, which lets us fix the bug and then rerun the test command.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>mongodb</category>
      <category>nestjs</category>
      <category>backend</category>
    </item>
  </channel>
</rss>
