<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Angha Ramdohokar</title>
    <description>The latest articles on DEV Community by Angha Ramdohokar (@angha_ramdohokar_0b6505c2).</description>
    <link>https://dev.to/angha_ramdohokar_0b6505c2</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F711659%2F8462408c-a6de-480d-af82-38955a39c149.JPG</url>
      <title>DEV Community: Angha Ramdohokar</title>
      <link>https://dev.to/angha_ramdohokar_0b6505c2</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/angha_ramdohokar_0b6505c2"/>
    <language>en</language>
    <item>
      <title>Exploring Software Architecture: My Path to Certification and Tactics</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Fri, 29 Sep 2023 11:43:59 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/exploring-software-architecture-my-path-to-certification-and-tactics-3dj9</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/exploring-software-architecture-my-path-to-certification-and-tactics-3dj9</guid>
      <description>&lt;p&gt;Software architecture is the backbone of every successful software project. &lt;br&gt;
As a software developer, I've always been fascinated by the art and science of software architecture. It's not just about writing code; it's about designing systems that are efficient, scalable, and maintainable. &lt;br&gt;
To deepen my understanding and skills in this crucial field, I embarked on a journey to earn a software architecture certification. &lt;/p&gt;

&lt;p&gt;In this blog, I'll take you through my personal journey of exploring software architecture, the challenges I faced, the valuable tactics I learned, and how this certification transformed my approach to software development.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Quest Begins&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;My journey into the world of software architecture started with a simple realization – there's so much more to building software than just writing code, much like an iceberg, where most of the mass lies beneath the surface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fozjiq2b2hkebegy1c215.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fozjiq2b2hkebegy1c215.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I wanted to understand the principles and patterns that underpin robust software systems. &lt;br&gt;
So, I decided to pursue a certification that would not only validate my knowledge but also push me to learn and grow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Navigating the Certification Process&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Earning a software architecture certification isn't a walk in the park. It required a structured approach and a commitment to continuous learning. &lt;br&gt;
The certification program I chose covered a wide range of topics, from design patterns and architectural styles to quality attributes and documentation.&lt;br&gt;
I had to manage my time effectively and strike a balance between work, study, and personal life. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Embracing Architectural Tactics&lt;/strong&gt;&lt;br&gt;
One of the most enlightening parts of my certification journey was delving into architectural tactics. &lt;br&gt;
These are practical techniques that architects use to address common architectural concerns. &lt;br&gt;
I learned how to make decisions about trade-offs, select appropriate design patterns, and ensure that the software system met its quality attributes. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff5jpdfruq98yd6xh95ey.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff5jpdfruq98yd6xh95ey.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Common Architectural Concerns:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Performance&lt;/strong&gt;: Tactics for optimizing system response times, throughput, and resource utilization.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability&lt;/strong&gt;: Tactics for ensuring that the system can handle increased workloads and growth in users or data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security&lt;/strong&gt;: Tactics for protecting the system from vulnerabilities, threats, and unauthorized access.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reliability&lt;/strong&gt;: Tactics for ensuring the system's availability, fault tolerance, and error handling.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Maintainability&lt;/strong&gt;: Tactics for making the system easier to evolve, update, and modify over time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Usability&lt;/strong&gt;: Tactics for enhancing the user experience and usability of the software.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;- Examples of Architectural Tactics:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Caching&lt;/strong&gt;: To improve performance, data that is frequently accessed can be cached in memory, reducing the need to fetch it from slower storage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Load Balancing&lt;/strong&gt;: To enhance scalability, incoming requests can be distributed evenly across multiple servers to prevent overloading a single server.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Encryption&lt;/strong&gt;: To address security concerns, sensitive data can be encrypted to protect it from unauthorized access.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Redundancy&lt;/strong&gt;: For reliability, critical components or services can be duplicated to provide fault tolerance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Modular Design&lt;/strong&gt;: To improve maintainability, the system can be designed with modular components that can be developed, tested, and updated independently.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;User Interface Guidelines&lt;/strong&gt;: To enhance usability, following user interface design principles and best practices can lead to a more user-friendly application.&lt;/li&gt;
&lt;/ol&gt;
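&lt;p&gt;To make the first tactic concrete, here is a minimal sketch of an in-memory cache with per-entry expiry. The class, names, and TTL value are my own inventions for illustration, not from any particular framework:&lt;/p&gt;

```python
import time

class TTLCache:
    """Minimal in-memory cache with per-entry expiry (illustrative sketch)."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key mapped to (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:  # stale entry: evict and report a miss
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

def fetch_user(user_id, cache, slow_lookup):
    """Return the cached value when fresh; fall back to the slow source otherwise."""
    cached = cache.get(user_id)
    if cached is not None:
        return cached
    value = slow_lookup(user_id)
    cache.set(user_id, value)
    return value
```

&lt;p&gt;The second call for the same key is served from memory, which is the whole point of the tactic: the expensive lookup runs only once per TTL window.&lt;/p&gt;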

&lt;p&gt;&lt;strong&gt;- Choosing the Right Tactics:&lt;/strong&gt;&lt;br&gt;
Selecting the appropriate tactics depends on the specific requirements of your project.&lt;br&gt;
Trade-offs come into the picture, as each tactic can affect different quality attributes in different ways. &lt;/p&gt;

&lt;p&gt;These tactics proved invaluable as I applied them to real-world projects, making my code more efficient and maintainable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Overcoming Challenges&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkumu873nzn6y5siy8nf4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkumu873nzn6y5siy8nf4.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
Every journey has its challenges, and my certification journey was no exception. &lt;br&gt;
There were moments of self-doubt, late-night study sessions, and juggling multiple responsibilities. &lt;br&gt;
But with determination and a support system of fellow learners and mentors, I was able to overcome these hurdles.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;My journey of exploring software architecture through certification and the application of architectural tactics has been a rewarding experience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpmaatarj1wv0qg9tv09.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpmaatarj1wv0qg9tv09.PNG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Remember that the world of software architecture is waiting to be explored. It's a journey that will challenge you, inspire you, and ultimately make you a better developer.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Happy Reading :) &lt;/p&gt;

</description>
    </item>
    <item>
      <title>My Journey of Making an App Faster and Stronger: Scaling for Everyone</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Fri, 30 Jun 2023 07:38:56 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/my-journey-of-making-an-app-faster-and-stronger-scaling-for-everyone-1f8o</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/my-journey-of-making-an-app-faster-and-stronger-scaling-for-everyone-1f8o</guid>
      <description>&lt;p&gt;Have you ever wondered how popular apps like Instagram or YouTube handle millions of users without crashing? Well, they achieve this by scaling their applications. &lt;br&gt;
Scaling means making an app faster, stronger, and able to handle lots of people using it. In this blog post, I'll share my experience of scaling an app, the strategies I used, and real-world examples to help you navigate this journey with confidence.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Do Apps Need Scaling?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Identifying the reasons why scalability is necessary is extremely important. &lt;br&gt;
Is your application taking too long to respond, frequently crashing, or having difficulty accommodating a large number of users? &lt;br&gt;
Recognizing these issues is the key to understanding why scaling is essential. &lt;/p&gt;

&lt;p&gt;I faced the same problems with my application, and identifying these pain points was the first step toward finding a solution.&lt;/p&gt;

&lt;p&gt;That's where scaling comes in.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--S5sSv83g--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lt68m7cwvqvn0781v52e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--S5sSv83g--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lt68m7cwvqvn0781v52e.png" alt="Scalability" width="800" height="407"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Boosting Performance&lt;/strong&gt;&lt;br&gt;
To make my app faster, I started looking for tactics that could help with performance. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cZAiiBkJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e7xt6oslhsirq8ngj9k8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cZAiiBkJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e7xt6oslhsirq8ngj9k8.png" alt="Performance fixes" width="620" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After some reading, I implemented the following techniques to make my application faster:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Use of data caching -&lt;br&gt;
I implemented data caching, which is like creating a temporary storage space for frequently accessed information.&lt;br&gt;
By storing commonly used data in the cache, the app could retrieve it more quickly, reducing the need to fetch it from the database every time. I used both server-side and client-side caching.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use of stored procedures and indexes -&lt;br&gt;
I optimized the way my app interacted with the database by utilizing stored procedures and indexes. &lt;br&gt;
Stored procedures are precompiled database commands that can be executed more efficiently than dynamic queries. &lt;br&gt;
Indexes, on the other hand, are like bookmarks that help the database quickly find specific data. &lt;br&gt;
By using these techniques, I minimized the time it took to retrieve and process the data, making the app more responsive.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use of async/await methods -&lt;br&gt;
I implemented asynchronous programming using the async/await keywords. This allowed my app to perform multiple tasks concurrently without blocking the main thread. &lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
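&lt;p&gt;To sketch the async/await idea, here is what it looks like in Python's asyncio. The two fetch functions are hypothetical stand-ins for real I/O calls; the point is that independent waits overlap instead of stacking up:&lt;/p&gt;

```python
import asyncio

async def fetch_profile(user_id):
    # Stand-in for a real I/O call such as a database query or HTTP request.
    await asyncio.sleep(0.01)
    return {"id": user_id}

async def fetch_orders(user_id):
    await asyncio.sleep(0.01)
    return [{"order": 1}, {"order": 2}]

async def load_dashboard(user_id):
    # gather() runs both I/O calls concurrently instead of one after the other,
    # so the total wait is roughly the slower of the two, not their sum.
    profile, orders = await asyncio.gather(
        fetch_profile(user_id), fetch_orders(user_id)
    )
    return {"profile": profile, "orders": orders}

result = asyncio.run(load_dashboard(42))
```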

&lt;p&gt;&lt;strong&gt;Horizontal vs. Vertical Scaling&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Horizontal scaling,&lt;/em&gt; also known as scaling out, involves adding more machines or servers to distribute the workload.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ese2q_zT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u9tic2s8o3cat3zkgm9d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ese2q_zT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u9tic2s8o3cat3zkgm9d.png" alt="Horizontal Scaling" width="739" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When I started exploring horizontal scaling, I found several benefits: it offered excellent scalability, since I could simply add more servers to accommodate increasing user demand, and it allowed me to handle a larger number of users without sacrificing performance. &lt;/p&gt;
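&lt;p&gt;The core of horizontal scaling is spreading requests across the server pool. A toy round-robin dispatcher (the server names here are made up) captures the idea:&lt;/p&gt;

```python
from itertools import cycle

class RoundRobinBalancer:
    """Toy load balancer: hand each request to the next server in rotation."""

    def __init__(self, servers):
        self._pool = cycle(servers)

    def route(self, request):
        server = next(self._pool)  # even distribution across the pool
        return server, request

balancer = RoundRobinBalancer(["app-1", "app-2", "app-3"])
assigned = [balancer.route(f"req-{i}")[0] for i in range(6)]
```

&lt;p&gt;Real load balancers add health checks, weighting, and session affinity on top of this, but the distribution principle is the same.&lt;/p&gt;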

&lt;p&gt;But I also faced a couple of challenges with horizontal scaling, mainly:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data consistency &lt;/li&gt;
&lt;li&gt;Managing multiple servers &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Next is &lt;em&gt;vertical scaling&lt;/em&gt;, also known as scaling up, which involves improving the capacity of a single machine or server.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--G5mv3S4t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p5c5j1p4gklvd5avzlhn.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--G5mv3S4t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p5c5j1p4gklvd5avzlhn.jpg" alt="Vertical Scaling" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With vertical scaling, I focused on enhancing the existing server's capabilities, such as increasing its processing power, memory, or storage capacity.&lt;/p&gt;

&lt;p&gt;Vertical scaling provided some distinct advantages. &lt;br&gt;
First, it simplified the management and configuration of my app, since I was working with a single server and there were none of the complexities of managing multiple servers. &lt;br&gt;
Second, vertical scaling allowed me to use the full potential of a server, which led to significant performance improvements.&lt;/p&gt;

&lt;p&gt;However, vertical scaling has its own limitations. &lt;br&gt;
Eventually, a server reaches its maximum capacity, and beyond that point scaling requires costly hardware upgrades.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Horizontal and vertical scaling together&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;After trying both approaches, I began to wonder whether I could combine them into a hybrid scaling strategy. So I implemented load balancing to distribute the workload across multiple servers (horizontal scaling), and at the same time applied vertical scaling by optimizing the performance of individual servers and leveraging their increased resources.&lt;/p&gt;

&lt;p&gt;This hybrid approach allowed me to strike a balance between scalability and performance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testing and Learning&lt;/strong&gt;&lt;br&gt;
I tested every change to see whether it actually improved the app's performance and scalability, and I kept a spreadsheet where I recorded the actual numbers from testing (with load, without load, and so on). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DHkczSzx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ljpbp0zgu4quzkpfqwdq.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DHkczSzx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ljpbp0zgu4quzkpfqwdq.jpeg" alt="Build-Measure-Learn" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sometimes things didn't work perfectly on the first try, but with each test, I got closer to finding the best solutions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Continuous Improvement&lt;/strong&gt;&lt;br&gt;
My journey of scaling the app taught me the value of continuous improvement. Even after achieving good results, I understood the importance of regularly monitoring the app's performance, analyzing user feedback, and making further optimizations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
Remember, &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;scaling is not a one-time task but a continuous process. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Embrace the feedback from your users, stay updated with the latest technologies, and always strive for further improvements. &lt;br&gt;
By prioritizing performance and scalability, you can ensure that your app remains competitive, relevant, and loved by users.&lt;/p&gt;

&lt;p&gt;I hope you found my journey and experiences of making an app faster and stronger through scaling insightful and valuable.&lt;/p&gt;

&lt;p&gt;Happy Reading :) &lt;/p&gt;

</description>
      <category>performance</category>
      <category>architecture</category>
      <category>learning</category>
      <category>scalability</category>
    </item>
    <item>
      <title>Strategies for Managing Last-Minute Challenges During an App Release</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Mon, 27 Mar 2023 07:07:53 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/strategies-for-managing-last-minute-challenges-during-an-app-release-504f</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/strategies-for-managing-last-minute-challenges-during-an-app-release-504f</guid>
      <description>&lt;p&gt;It’s a feeling that is familiar to software developers: the last-minute rush to get an application released. Whether it’s a looming deadline or an unexpected bug, software developers have come to expect the unexpected when it comes to the release process. &lt;/p&gt;

&lt;p&gt;Recently, my team and I were tasked with getting an application out the door. We had to move quickly and efficiently to ensure the release was successful, and, as you can imagine, the pressure was on.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lHWmjYvf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3zeslrc6x38g1ay9d9dv.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lHWmjYvf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3zeslrc6x38g1ay9d9dv.jpg" alt="Image description" width="258" height="196"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's talk about the challenges I faced during the application release.&lt;/p&gt;

&lt;h2&gt;
  Challenges during app release -
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Managing user acceptance testing :&lt;/strong&gt; &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In our case, UAT happened during the final stages, and we encountered a few issues. These were all missed requirements that needed to be developed after UAT.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Loading data for marketing team :&lt;/strong&gt; &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;During the final stages of the release, we got a requirement to load a large amount of data into the TEST environment so that the marketing team could use it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Addition of analytics tool to the application :&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We also needed to add code for various analytics tools to our application so that we could track traffic to it. This, too, was a last-minute request.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Access issues of production environment during deployment :&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Several permissions were missing for our users, so we had to wait until they were granted.&lt;br&gt;
Based on my experience, waiting for access during a production release can be extremely stressful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jO5pVAiQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n4mvkurh1mk8ar8q7p88.GIF" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jO5pVAiQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n4mvkurh1mk8ar8q7p88.GIF" alt="Image description" width="627" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These are a few of the challenges I faced; there can certainly be more. From my experience, I made a list of strategies that helped me achieve a successful release, and I think they can help you too.&lt;/p&gt;

&lt;h2&gt;
  Strategies for successful release -
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Establish a release timeline :&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It’s important to have a timeline for the release, which should include milestones for testing, bug fixes, and other important activities. &lt;br&gt;
This timeline should be shared with the entire team and regularly updated to ensure everyone is on the same page.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Prioritizing testing :&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Testing is a crucial part of the release process and should be done thoroughly. Make sure to prioritize testing and ensure that all features are working correctly prior to release.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Have a contingency plan :&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Unexpected problems can arise during a release, so it’s important to have a contingency plan in place. This plan should include how to handle issues such as data loss and unexpected downtime.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Monitor performance :&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Monitor the performance of the app before it’s released. This can help you identify any issues that weren’t detected earlier and resolve them quickly.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Communicate with team  :&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Make sure to communicate with your team, including infrastructure people, during the release process. They can help you tackle any last-minute access issues.&lt;/p&gt;

&lt;p&gt;Following these strategies can help ensure that the last-minute tasks in your application release go as smoothly as possible.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GHmj2LPN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x17acng2d1agt3znx4hm.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GHmj2LPN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x17acng2d1agt3znx4hm.jpg" alt="Image description" width="700" height="417"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the day of the release, I was nervous and excited. I wanted to ensure that everything went smoothly and that no issues would arise, and I was relieved when the application was released with no major issues.&lt;/p&gt;

&lt;p&gt;All of this was stressful, but it was an experience I was glad to have. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_uaiaKVH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4l8pymjfqhaf8o416m9v.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_uaiaKVH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4l8pymjfqhaf8o416m9v.jpg" alt="Image description" width="225" height="225"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Last-minute tasks during release can be an exciting challenge, and they are a great way to test your skills as a software developer.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I’ve learned a lot from this experience, and I’m sure I’ll be faced with similar situations in the future. &lt;br&gt;
No matter how stressful the situation, I'm now confident that I can handle it with the right plan and dedication, and my next release will go much more smoothly.&lt;/p&gt;

&lt;p&gt;Good luck !!&lt;/p&gt;

</description>
      <category>productivity</category>
      <category>releasemanagement</category>
      <category>webdev</category>
      <category>management</category>
    </item>
    <item>
      <title>Design considerations for large data import</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Mon, 02 Jan 2023 09:55:34 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/design-considerations-for-large-data-import-1bge</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/design-considerations-for-large-data-import-1bge</guid>
      <description>&lt;p&gt;First thing that comes to mind after hearing term large data is some data which is bigger in volume. This can be also be defined like, &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Big Data is data that contains greater variety, arriving in increasing volumes and with more velocity. (3Vs)&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The original relational database systems made it easy to work with data as long as the data was small enough to manage. &lt;br&gt;
However, when the data reaches a significant volume, it becomes very difficult to work with, because it can take a long time, or sometimes even be impossible, to read, write, and process successfully. The same problem occurred for me in a recent project: we had a large data system that was the source of our data, and we needed to import that data into another system.&lt;/p&gt;

&lt;p&gt;Here are the design considerations I followed when working on the large data import:&lt;/p&gt;

&lt;h2&gt;
  1. Data extraction
&lt;/h2&gt;

&lt;p&gt;In data extraction, I performed two types of extraction: data extraction for once (a one-time pull) and data extraction in sync (a recurring pull).&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;Data extraction for once&lt;/strong&gt;&lt;/em&gt; &lt;br&gt;
Here, we needed to extract all the data from the source system one time. &lt;br&gt;
There are multiple ways to extract data, depending on the source system.&lt;/p&gt;

&lt;p&gt;In my case, the source data was in SQL Server, so I followed these steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I executed SQL scripts to get only the data I wanted from the source system's tables.&lt;/li&gt;
&lt;li&gt;Then I saved that data in .csv files using the Save As option in the SQL Server results pane.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Another option is to use the Generate Scripts feature for the database objects you want to extract data from.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;Data extraction in sync&lt;/strong&gt;&lt;/em&gt;&lt;br&gt;
Here, we needed to extract the data in sync, i.e., extract it repeatedly at a specific time interval.&lt;/p&gt;

&lt;p&gt;For this, we created a stored procedure, executed through a web job, that contains the logic to extract the data from the source system and save it in a .csv file on the server.&lt;/p&gt;

&lt;p&gt;We set the web job to run every five minutes and replace the existing .csv files with new ones.&lt;/p&gt;

&lt;h2&gt;
  2. Data transformation
&lt;/h2&gt;

&lt;p&gt;Data transformation converts the data from one form to another.&lt;br&gt;
In the first step, we extracted data and saved it to .csv files. In this step, I reshaped that data to match the target system's format.&lt;/p&gt;

&lt;p&gt;I did the following as part of data transformation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Removing bad data &lt;/li&gt;
&lt;li&gt;Converting date fields&lt;/li&gt;
&lt;li&gt;Mapping columns to the target table's columns&lt;/li&gt;
&lt;li&gt;Deleting duplicate data&lt;/li&gt;
&lt;li&gt;Saving the file in Excel format for the load step&lt;/li&gt;
&lt;/ul&gt;
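&lt;p&gt;A rough sketch of those transformation steps in Python. The column names and date formats here are invented for illustration; ours were specific to the source and target systems:&lt;/p&gt;

```python
from datetime import datetime

def transform_rows(rows):
    """Apply the transformation steps: drop bad rows, de-duplicate,
    convert dates, and map columns to the target table's names."""
    seen = set()
    cleaned = []
    for row in rows:
        # Bad data removal: skip rows with a missing identifier.
        if not row.get("CustId"):
            continue
        # Duplicate deletion, keyed on the identifier column.
        if row["CustId"] in seen:
            continue
        seen.add(row["CustId"])
        # Date conversion from the (assumed) source format to the target format.
        created = datetime.strptime(row["Created"], "%d/%m/%Y")
        # Column mapping to the target table's column names.
        cleaned.append({
            "customer_id": row["CustId"],
            "created_on": created.strftime("%Y-%m-%d"),
        })
    return cleaned
```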

&lt;p&gt;Transformation may include other aspects as well, such as integrating data from multiple sources into one format and then manipulating it.&lt;/p&gt;

&lt;h2&gt;
  3. Data load
&lt;/h2&gt;

&lt;p&gt;In data load, we actually load the data extracted and transformed in steps 1 and 2. &lt;/p&gt;

&lt;p&gt;For the data load, my destination system was Azure SQL, so I used the SQL Server Import and Export Wizard and followed these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Choose a data source: select the Excel file containing the data&lt;/li&gt;
&lt;li&gt;Choose a destination: your destination database&lt;/li&gt;
&lt;li&gt;Specify the tables to copy and select the source tables&lt;/li&gt;
&lt;li&gt;Run the wizard&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj69kknemcfd3dgf7aeha.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj69kknemcfd3dgf7aeha.png" alt="SQL server import and export wizard" width="652" height="551"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You might face issues with bad data if the data was not cleaned properly or if the transformed data is not in the desired format.&lt;/p&gt;

&lt;p&gt;After the SQL Server Import and Export Wizard runs successfully, you can see your data imported from the source to the destination system.&lt;/p&gt;

&lt;p&gt;You can also load the data using SQL scripts if you extracted them using the Generate Scripts feature mentioned in step 1. &lt;/p&gt;

&lt;p&gt;After the data load, we checked the performance of all the pages and endpoints. Based on those numbers, we fixed the performance issues with the help of the following changes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Adding non-clustered indexes to the required data tables.&lt;/li&gt;
&lt;li&gt;Selecting only the required columns instead of returning all of them from an endpoint.&lt;/li&gt;
&lt;li&gt;Adding offset and limit to all the GET endpoints.&lt;/li&gt;
&lt;/ol&gt;
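&lt;p&gt;Points 2 and 3 can be sketched as a small helper (a simplified illustration on an in-memory array, not our actual endpoint code):&lt;/p&gt;

```javascript
// Simplified illustration of points 2 and 3 above: apply
// offset/limit to a result set and project only the required columns.
function paginate(rows, offset, limit, columns) {
  return rows.slice(offset, offset + limit).map((row) =>
    Object.fromEntries(columns.map((c) => [c, row[c]]))
  );
}

const rows = [
  { id: 1, name: 'a', blob: 'large payload' },
  { id: 2, name: 'b', blob: 'large payload' },
  { id: 3, name: 'c', blob: 'large payload' },
];

// Page of size 2 starting at offset 1, projecting two columns only.
console.log(paginate(rows, 1, 2, ['id', 'name']));
// [ { id: 2, name: 'b' }, { id: 3, name: 'c' } ]
```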

&lt;p&gt;We also added a spinner that is shown until all the endpoint calls complete. &lt;/p&gt;

&lt;p&gt;These are the performance numbers after all the improvements:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnb1do8rr5b5pdpcy2ak.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnb1do8rr5b5pdpcy2ak.png" alt="After improvement" width="478" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Nowadays, applications that need to process large amounts of data are increasingly common, so it's essential to understand the problem and design a solution that fits the need. These design considerations should be helpful in such scenarios. &lt;/p&gt;

&lt;p&gt;Happy Reading !! &lt;/p&gt;

</description>
      <category>webdev</category>
      <category>discuss</category>
    </item>
    <item>
      <title>API Authentication - JWT vs OAuth</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Thu, 29 Sep 2022 07:05:02 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/api-authentication-jwt-vs-oauth-802</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/api-authentication-jwt-vs-oauth-802</guid>
<description>&lt;p&gt;In my current project, we were designing a system with multiple APIs that other systems will consume. Whenever there is an API call, we need to secure that API so that we know only authenticated users are making use of it; that is API authentication.&lt;/p&gt;

&lt;h2&gt;
  
  
  API Authentication
&lt;/h2&gt;

&lt;p&gt;In simple terms, API authentication is all about proving or verifying the identity of the people accessing your system.&lt;br&gt;&lt;br&gt;
Its goal is to prevent attacks from cybercriminals who snoop around websites looking for the slightest vulnerability to take advantage of.&lt;br&gt;
There are various methods that can help us protect our APIs.&lt;/p&gt;
&lt;h2&gt;
  
  
  Methods of API Authentication
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9fggq3wdg4rxbpo5mghw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9fggq3wdg4rxbpo5mghw.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Our requirement was quite simple: we need to secure our APIs in a way that other systems can easily use them, but at the same time we don't want to share our username and password with those systems. Also, we may implement federated identity management in the future, so whatever approach we choose now for API authentication should still work when we switch to federated identities.&lt;/p&gt;

&lt;p&gt;Taking all the requirements into consideration, we narrowed it down to two API authentication methods:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;API Key Authentication&lt;/li&gt;
&lt;li&gt;OAuth Authentication&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this article, I will share these two authentication methods in detail.&lt;/p&gt;
&lt;h2&gt;
  
  
  API Key Authentication -
&lt;/h2&gt;

&lt;p&gt;API Key is a code used to identify and authenticate an application or user.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;IP84UTvzJKds1Jomx8gIbTXcEEJSUilGqpxCcmnx
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;API Keys are very simple to use from the consumer perspective:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You get an API key from the service.&lt;/li&gt;
&lt;li&gt;Add the key to an Authorization header.&lt;/li&gt;
&lt;li&gt;Call the API.&lt;/li&gt;
&lt;/ol&gt;
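&lt;p&gt;As a minimal sketch of these steps (the header scheme and the key below are placeholders; real services document their own), attaching the key to a request might look like:&lt;/p&gt;

```javascript
// Hypothetical API key; in practice this comes from the service
// and should live in configuration, never in source code.
const API_KEY = 'IP84UTvzJKds1Jomx8gIbTXcEEJSUilGqpxCcmnx';

// Build request options with the key in the Authorization header.
function withApiKey(options = {}) {
  return {
    ...options,
    headers: {
      ...(options.headers || {}),
      Authorization: `Api-Key ${API_KEY}`,
    },
  };
}

const req = withApiKey({ method: 'GET' });
console.log(req.headers.Authorization);
```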

&lt;p&gt;On the other hand, simplicity may raise security concerns. Here are its limitations -&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Shifting of Responsibility - What happens if someone else comes upon an API key that is not their own? In most cases, they can use the API key with all the privileges of the rightful owner. Depending on the API, they may be able to retrieve all the data, add incorrect content, or delete everything.  This way the security of the system as a whole is entirely dependent on the ability of the developer/consumer to protect their API keys and maintain security.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lack of Granular Control - &lt;br&gt;
One precaution some API designers take is to issue API keys for read-only data. This is especially useful for APIs that don’t need write permissions, and it limits risk. However, this approach falls short for APIs that require more granular permissions.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;What it comes down to is that API Keys are, by nature, not a complete solution.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This is where &lt;strong&gt;JWT (JSON Web Token)&lt;/strong&gt; comes into the picture; it can be used along with API key authentication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;JSON Web Token&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Both API keys and JWTs can provide authentication and authorization. An API key is scoped to a project, while a JWT is scoped to a user.&lt;br&gt;
A JWT can contain information like its expiration date and a user identifier to determine the rights of the user across the entire ecosystem.&lt;/p&gt;
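&lt;p&gt;To see what such a token carries, note that a JWT is three base64url-encoded parts (header.payload.signature), so the claims can be read without any library. A sketch in Node.js (the token here is hand-built for illustration, not a real signed token):&lt;/p&gt;

```javascript
// Build a toy JWT the way a real one is laid out:
// base64url(header).base64url(payload).signature
const payload = { sub: 'user-123', exp: 1735689600 };
const encode = (obj) => Buffer.from(JSON.stringify(obj)).toString('base64url');
const token = `${encode({ alg: 'HS256', typ: 'JWT' })}.${encode(payload)}.sig`;

// Decode the claims. Note: this does NOT verify the signature;
// always verify with the issuer's key before trusting the claims.
const claims = JSON.parse(
  Buffer.from(token.split('.')[1], 'base64url').toString('utf8')
);

console.log(claims.sub, new Date(claims.exp * 1000).toISOString());
```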

&lt;p&gt;JWT authorization offers flexibility, reliability, and more security.&lt;br&gt;
Any API that requires authentication can easily switch over to JWT authorization. With JWT authorization, you get user-based authentication: once the user is authenticated, the user gets a secure token that they can use on all systems. You set up access rights and give each user different rights for each system. The JWT authorization endpoint authenticates the user and creates the token. &lt;br&gt;
In sum, sometimes JWT is absolutely needed and sometimes it’s overkill.&lt;br&gt;
JWT is overkill in two situations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A simple ecosystem with only a few APIs&lt;/li&gt;
&lt;li&gt;Suppliers of third-party APIs. &lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  OAuth Authentication -
&lt;/h2&gt;

&lt;p&gt;OAuth is the answer to accessing user data with APIs.&lt;br&gt;&lt;br&gt;
If you’ve ever seen one of the dialogs below, that’s what I'm talking about. This is an application asking if it can access data on your behalf.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7j5uqzlcer3dcnl900z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7j5uqzlcer3dcnl900z.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is nothing but OAuth.&lt;br&gt;
OAuth is a delegated authorization framework for REST/APIs. It enables apps to obtain limited access (scopes) to a user’s data without giving away a user’s password.&lt;/p&gt;

&lt;p&gt;You can think of this like hotel key cards, but for apps. If you have a hotel key card, you can get access to your room. How do you get a hotel key card? You have to do an authentication process at the front desk to get it. After authenticating and obtaining the key card, you can access resources across the hotel.&lt;/p&gt;

&lt;p&gt;To break it down simply, OAuth is where:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;App requests authorization from User&lt;/li&gt;
&lt;li&gt;User authorizes App and delivers proof&lt;/li&gt;
&lt;li&gt;App presents proof of authorization to server to get a Token&lt;/li&gt;
&lt;li&gt;Token is restricted to only access what the User authorized for the specific App.&lt;/li&gt;
&lt;/ul&gt;
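&lt;p&gt;The third step above, presenting the proof of authorization to get a token, is an HTTP POST to the provider's token endpoint. A sketch of building that request (the endpoint, client id, and code are placeholder values):&lt;/p&gt;

```javascript
// Placeholder values; a real provider documents its own token
// endpoint and issues the client id and authorization code.
const tokenEndpoint = 'https://auth.example.com/oauth/token';

// OAuth 2.0 authorization-code grant: the app presents the code
// it received after the user authorized it.
const body = new URLSearchParams({
  grant_type: 'authorization_code',
  code: 'AUTH_CODE_FROM_REDIRECT',
  redirect_uri: 'https://app.example.com/callback',
  client_id: 'my-client-id',
});

// The token returned is restricted to the scopes the user granted.
console.log(`POST ${tokenEndpoint}`);
console.log(body.toString());
```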

&lt;p&gt;OAuth, specifically OAuth 2.0, is a standard for the process that goes on behind the scenes to ensure secure handling of these permissions.&lt;br&gt;
With a federated identity model, OAuth 2.0 offers security, scalability, and a good user experience. &lt;br&gt;
However, it’s also more work for developers and API providers to implement and maintain. &lt;/p&gt;

&lt;p&gt;Another tool to consider that complements OAuth 2.0 is OpenID Connect. It works as an identity layer you can deploy on top of the protocol so the API can verify a client’s identity and profile via authentication performed by the authorization server.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;You can use any authentication method to secure your APIs, but the choice should depend on your system's requirements.&lt;br&gt;
As we have seen, these authentication methods aren't mutually exclusive, so you can use all of them at once, or each independently of the others.&lt;br&gt;
Either way, securing your APIs and choosing the right level of authentication is very important.&lt;/p&gt;

&lt;p&gt;I hope this helps you choose a better API authentication method!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Happy Reading :)&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>api</category>
      <category>webdev</category>
      <category>elastikteams</category>
      <category>oauth</category>
    </item>
    <item>
      <title>Refactoring of API calls</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Thu, 17 Mar 2022 05:33:07 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/refactoring-of-api-calls-28o7</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/refactoring-of-api-calls-28o7</guid>
<description>&lt;p&gt;In a recent project, we wrote multiple Express APIs for different purposes and called them from React code. Each API has its own routes, definitions, and responses; each supports CRUD operations; and we had written separate code to call every one of them.&lt;/p&gt;

&lt;p&gt;What does this lead to? Code duplication and a messy codebase. &lt;br&gt;
So I wondered what I could do to avoid this mess and find a simple approach to calling these APIs.&lt;/p&gt;

&lt;p&gt;I spent time analyzing the code we had written to call these APIs: which sections are duplicated? Can we make this generic somehow? &lt;/p&gt;

&lt;p&gt;As a result of this analysis, I found that every API call uses a set of functions that can be reduced to generic ones and reused for every API.&lt;/p&gt;

&lt;p&gt;Here is what I implemented to refactor the API calls:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Division of code&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;At each API call site, I found that we perform only CRUD (Create, Read, Update, Delete) operations, which can be moved to separate files; the only difference is the name of the resource, e.g. /v1/api/&lt;strong&gt;users&lt;/strong&gt; &lt;br&gt;
/v1/api/&lt;strong&gt;companies&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;So &lt;strong&gt;users&lt;/strong&gt; and &lt;strong&gt;companies&lt;/strong&gt; are our resources; the first part of the API path is the same for all.&lt;br&gt;
Keeping this in mind, we made the following division:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;api-provider.ts&lt;/em&gt;&lt;/strong&gt; : &lt;br&gt;
This file holds the CRUD operation definitions for API calls. It uses axios, which is promise based, so we can handle the responses the way we want.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Define your api url from any source. Pulling from your .env   // file when on the server or from localhost when locally
const BASE_URL = Api_base_CRUD; 

/** @param {string} resource */ 
const getAll = (resource: string) =&amp;gt; { 
  return axios 
    (`${BASE_URL}/${resource}`) 
    .then(resp =&amp;gt; resp.data) 
    .catch(handleError); 
}; 

/** @param {string} resource */ 
/** @param {string} id */ 
const getSingle = (resource: string, id: string) =&amp;gt; { 
  return axios 
    .get(`${BASE_URL}/${resource}/${id}`) 
    .then(resp =&amp;gt; resp.data) 
    .catch(handleError); 
}; 

/** @param {string} resource */ 
/** @param {object} model */ 
const post = (resource: string, model: object) =&amp;gt; { 
  return axios 
    .post(`${BASE_URL}/${resource}`, model) 
    .then(resp =&amp;gt; resp.data) 
    .catch(handleError); 
}; 

/** @param {string} resource */ 
/** @param {object} model */ 
const patch = (resource: string, model: object) =&amp;gt; { 
  return axios 
    .patch(`${BASE_URL}/${resource}`, model) 
    .then(resp =&amp;gt; resp.data) 
    .catch(handleError); 
}; 

/** @param {string} resource */ 
/** @param {string} id */ 
const remove = (resource: string, id: AxiosRequestConfig&amp;lt;any&amp;gt; | undefined) =&amp;gt; { 
  return axios 
    .delete(`${BASE_URL}/${resource}/${id}`, id) 
    .then(resp =&amp;gt; resp.data) 
    .catch(handleError); 
}; 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;api-core.ts&lt;/em&gt;&lt;/strong&gt; : &lt;br&gt;
This class calls the provider file's methods. Here we can pass the resource URLs as well.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {apiProvider} from './api-provider';

export class ApiCore {
  getAll!: () =&amp;gt; any;
  getSingle!: (id: any) =&amp;gt; any;
  post!: (model: any) =&amp;gt; any;
  patch!: (model: any) =&amp;gt; any;
  remove!: (id: any) =&amp;gt; any;
  url!: string;

  constructor(options: { getAll: any; url: any; getSingle: any; post: any; patch: any; remove: any}) {
    if (options.getAll) {
      this.getAll = () =&amp;gt; {
        return apiProvider.getAll(this.url);
      };
    }

    if (options.getSingle) {
      this.getSingle = (id) =&amp;gt; {
        return apiProvider.getSingle(this.url, id);
      };
    }

    if (options.post) {
      this.post = (model) =&amp;gt; {
        return apiProvider.post(this.url, model);
      };
    }

    if (options.patch) {
      this.patch = (model) =&amp;gt; {
        return apiProvider.patch(this.url, model);
      };
    }

    if (options.remove) {
      this.remove = (id) =&amp;gt; {
        return apiProvider.remove(this.url, id);
      };
    }

  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;api-operation.ts&lt;/em&gt;&lt;/strong&gt; : &lt;br&gt;
This is the file we actually use when making API calls; it creates an object of the ApiCore class and specifies the parameters for the constructor.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { ApiCore } from "./api-core";

const apiOperation = new ApiCore({
  getAll: true,
  getSingle: true,
  post: true,
  patch: true,
  remove: true,
  url: "",
});
export default apiOperation;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2. Implementing API calls&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now it's time to call our APIs using the generic files we have created.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import apiUsers from '../../api-operation';

function callUsersData(){
  apiUsers.url = "users";
  apiUsers.getAll()
  .then((resp:any) =&amp;gt; {
    let user = resp.data?.rows; 
  })
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The only thing that differs for each API is its URL; everything else is now generic.&lt;/p&gt;
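&lt;p&gt;For example, with a hypothetical helper following the same convention, the full endpoint for any resource can be derived from its name alone:&lt;/p&gt;

```javascript
// The base path is shared; only the resource name varies.
const BASE_URL = '/v1/api';

// Build the endpoint URLs the generic provider would call.
const endpoint = (resource, id) =>
  id === undefined
    ? `${BASE_URL}/${resource}`
    : `${BASE_URL}/${resource}/${id}`;

console.log(endpoint('users'));        // /v1/api/users
console.log(endpoint('companies', 7)); // /v1/api/companies/7
```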

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt; :&lt;br&gt;
By dividing the files and using generic functions for API calls, the code base now looks simple and is easy to read, and most importantly we removed code duplication.&lt;br&gt;
I hope this helps you keep your API code structure manageable and understandable as your code base and team grow!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reference used during implementation&lt;/strong&gt; : &lt;br&gt;
&lt;a href="https://dev.to/mmcshinsky/a-simple-approach-to-managing-api-calls-1lo6"&gt;https://dev.to/mmcshinsky/a-simple-approach-to-managing-api-calls-1lo6&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Happy Reading :)&lt;/strong&gt; &lt;/p&gt;

</description>
      <category>react</category>
      <category>axios</category>
      <category>api</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Creating RESTful APIs with Node and MongoDB</title>
      <dc:creator>Angha Ramdohokar</dc:creator>
      <pubDate>Sun, 26 Sep 2021 16:20:20 +0000</pubDate>
      <link>https://dev.to/angha_ramdohokar_0b6505c2/creating-restful-apis-with-mongodb-and-node-js-25ap</link>
      <guid>https://dev.to/angha_ramdohokar_0b6505c2/creating-restful-apis-with-mongodb-and-node-js-25ap</guid>
<description>&lt;p&gt;During my career as a software developer, I have written RESTful APIs in different languages and frameworks, such as VB.NET, C#, Java, and ASP.NET. Recently I got an opportunity to create RESTful APIs using Node.js.&lt;/p&gt;

&lt;p&gt;Node.js is a server-side platform built on Google Chrome's JavaScript engine (V8). Node.js provides a backend web application framework called Express, which is designed for building web applications and APIs. MongoDB is an open-source document-oriented database.&lt;/p&gt;

&lt;p&gt;We’ll be building a RESTful CRUD (Create, Retrieve, Update, Delete) API with Node.js, Express and MongoDB. We’ll use Mongoose for interacting with the MongoDB instance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Install &lt;a href="https://nodejs.org/en/download/" rel="noopener noreferrer"&gt;Node.js&lt;/a&gt; and &lt;a href="https://docs.mongodb.com/manual/tutorial/install-mongodb-on-windows/" rel="noopener noreferrer"&gt;MongoDB&lt;/a&gt; on your machine if you have not done so already, and use any development environment, such as &lt;a href="https://code.visualstudio.com/download" rel="noopener noreferrer"&gt;Visual Studio Code&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Creating application&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open a new terminal and create a new folder for the application.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PS C:\&amp;gt; mkdir node-blog-app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2.&lt;strong&gt;Initialize the application with a package.json file&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At the root of the folder, type &lt;code&gt;npm init&lt;/code&gt; to initialize your app with a package.json file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PS C:\&amp;gt; cd node-blog-app      
PS C:\node-blog-app&amp;gt; npm init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package name: (blog-app) node-blog-app
version: (1.0.0)
description: Creates blogs easily and quickly.
entry point: (index.js) server.js
test command:
git repository:
keywords: Express,RestAPI,MongoDB,Mongoose,Blogs
author: dev.to
license: (ISC)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here we defined the entry point as the server.js file, so we will create it further down.&lt;/p&gt;

&lt;p&gt;3.&lt;strong&gt;Install application dependencies&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We will need express and mongoose. Let's install them with the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PS C:\node-blog-app&amp;gt; npm install express mongoose --save  
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;--save saves these dependencies in the package.json file.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;4.&lt;strong&gt;Setting up the web server&lt;/strong&gt;&lt;br&gt;
Now we will create the main entry point of our application named &lt;code&gt;server.js&lt;/code&gt; in the root folder of the application with the following contents-&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const express = require('express');

// create express app
const app = express();

// parse requests of content-type - application/x-www-form-urlencoded
app.use(express.urlencoded({ extended: true }))

// parse requests of content-type - application/json
app.use(express.json())

// define a simple route
app.get('/', (req, res) =&amp;gt; {
    res.json({"message": "Welcome to E-Blog. Creates blogs easily and quickly."});
});

// listen for requests
app.listen(3000, () =&amp;gt; {
    console.log("Server is listening on port 3000");
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;First we import express, then we create an Express app and add two parser middlewares using Express's app.use() method.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;If you are using Express &amp;gt;= 4.16.0, body parser has been re-added under the methods express.json() and express.urlencoded().&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Then we define a simple GET route which returns a welcome message to clients.&lt;br&gt;
Finally, we listen on port 3000 for incoming connections.&lt;/p&gt;

&lt;p&gt;Let’s now run the server and go to &lt;a href="http://localhost:3000" rel="noopener noreferrer"&gt;http://localhost:3000&lt;/a&gt; to access the route we just defined.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PS C:\node-blog-app&amp;gt; node server.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgeqmf8q2lrqwcchgc47f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgeqmf8q2lrqwcchgc47f.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;5.&lt;strong&gt;Database configuration and connection&lt;/strong&gt;&lt;br&gt;
Create a new file named &lt;code&gt;database.config.js&lt;/code&gt; inside the app/config folder with the following contents -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = {
    url: 'mongodb://localhost:27017/blogs'
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we will import the above database configuration in server.js and connect to the database using mongoose.&lt;/p&gt;

&lt;p&gt;Add the following code to the server.js.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Configuring the database
const dbConfig = require('./config/database.config.js');
const mongoose = require('mongoose');

mongoose.Promise = global.Promise;

// Connecting to the database
mongoose.connect(dbConfig.url, {
    useNewUrlParser: true
}).then(() =&amp;gt; {
    console.log("Successfully connected to the database");    
}).catch(err =&amp;gt; {
    console.log('Could not connect to the database. Exiting now...', err);
    process.exit();
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run server.js and make sure that you're able to connect to the database -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PS C:\node-blog-app&amp;gt; node server.js
Server is listening on port 3000
Successfully connected to the database
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;6.&lt;strong&gt;Defining the Blog model in Mongoose&lt;/strong&gt;&lt;br&gt;
Create a file called &lt;code&gt;blog.model.js&lt;/code&gt; inside app/models folder with the following contents -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const mongoose = require('mongoose');

const BlogSchema = mongoose.Schema({
    title: String,
    content: String
}, {
    timestamps: true
});

module.exports = mongoose.model('Blog', BlogSchema);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;7.&lt;strong&gt;Defining Routes using Express&lt;/strong&gt;&lt;br&gt;
Create a new file called &lt;code&gt;blog.routes.js&lt;/code&gt; inside app/routes folder with the following contents -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = (app) =&amp;gt; {
    const blog = require('../controllers/blog.controller.js');

    // Create a new Blog
    app.post('/blogs', blog.create);

    // Retrieve all Blogs
    app.get('/blogs', blog.findAll);

    // Update a Blog with blogId
    app.put('/blogs/:blogId', blog.update);

    // Delete a Blog with blogId
    app.delete('/blogs/:blogId', blog.delete);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;8.&lt;strong&gt;Writing Controller functions&lt;/strong&gt;&lt;br&gt;
Create a new file called &lt;code&gt;blog.controller.js&lt;/code&gt; inside app/controllers folder.&lt;/p&gt;

&lt;p&gt;Creating a new Blog-&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Create and Save a new Blog
exports.create = (req, res) =&amp;gt; {

    // Create a Blog
    const blog = new Blog({
        title: req.body.title, 
        content: req.body.content
    });

    // Save Blog in the database
    blog.save()
    .then(data =&amp;gt; {
        res.send(data);
    }).catch(err =&amp;gt; {
        res.status(500).send({
            message: err.message || "Some error occurred while creating the Blog."
        });
    });
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Retrieving all Blogs -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Retrieve and return all blogs from the database.
exports.findAll = (req, res) =&amp;gt; {
    Blog.find()
    .then(blogs =&amp;gt; {
        res.send(blogs);
    }).catch(err =&amp;gt; {
        res.status(500).send({
            message: err.message || "Some error occurred while retrieving blogs."
        });
    });
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Updating a Blog -&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Update a blog identified by the blogId in the request
exports.update = (req, res) =&amp;gt; {

    // Find blog and update it with the request body
    Blog.findByIdAndUpdate(req.params.blogId, {
        title: req.body.title,
        content: req.body.content
    }, {new: true})
    .then(blog =&amp;gt; {
        if(!blog) {
            return res.status(404).send({
                message: "Blog not found with id " + req.params.blogId
            });
        }
        res.send(blog);
    }).catch(err =&amp;gt; {
        if(err.kind === 'ObjectId') {
            return res.status(404).send({
                message: "Blog not found with id " + req.params.blogId
            });                
        }
        return res.status(500).send({
            message: "Error updating blog with id " + req.params.blogId
        });
    });
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;The {new: true} option in the findByIdAndUpdate() method is used to return the modified document to the then() function instead of the original.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Deleting a Blog-&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Delete a blog with the specified blogId in the request
exports.delete = (req, res) =&amp;gt; {
    Blog.findByIdAndRemove(req.params.blogId)
    .then(blog =&amp;gt; {
        if(!blog) {
            return res.status(404).send({
                message: "Blog not found with id " + req.params.blogId
            });
        }
        res.send({message: "Blog deleted successfully!"});
    }).catch(err =&amp;gt; {
        if(err.kind === 'ObjectId' || err.name === 'NotFound') {
            return res.status(404).send({
                message: "Blog not found with id " + req.params.blogId
            });                
        }
        return res.status(500).send({
            message: "Could not delete blog with id " + req.params.blogId
        });
    });
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Check out &lt;a href="https://mongoosejs.com/docs/api.html" rel="noopener noreferrer"&gt;Mongoose API documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testing of Blogs API&lt;/strong&gt;-&lt;br&gt;
Check out &lt;a href="https://www.postman.com/anagharamdohokar/workspace/blogsapi/overview" rel="noopener noreferrer"&gt;this&lt;/a&gt; in POSTMAN to test Blogs APIs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
In this blog, we learned how to build REST APIs in Node.js using the Express framework and MongoDB.&lt;br&gt;
Please ask any questions that you might have in the comment section below.&lt;/p&gt;

&lt;p&gt;Thanks for reading.&lt;/p&gt;

</description>
      <category>node</category>
      <category>express</category>
      <category>api</category>
      <category>elastikteams</category>
    </item>
  </channel>
</rss>
