<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rahul</title>
    <description>The latest articles on DEV Community by Rahul (@sairahul1).</description>
    <link>https://dev.to/sairahul1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F817401%2F18d05d6f-eb08-4d8e-903d-1e6637e6ab99.jpg</url>
      <title>DEV Community: Rahul</title>
      <link>https://dev.to/sairahul1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sairahul1"/>
    <language>en</language>
    <item>
      <title>How to perform realtime transformations on S3</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Tue, 25 Oct 2022 04:55:57 +0000</pubDate>
      <link>https://dev.to/sairahul1/how-to-perform-realtime-transformations-on-s3-51g6</link>
      <guid>https://dev.to/sairahul1/how-to-perform-realtime-transformations-on-s3-51g6</guid>
      <description>&lt;p&gt;&lt;strong&gt;Learn a new feature of AWS S3 which can be used to do realtime transformations like compression, resizing images etc.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s discuss a new feature of S3, S3 Object Lambda, which can do realtime transformations of files, as you retrieve them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XGW4nFlg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ftugo9ym8pchrlz6s42g.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XGW4nFlg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ftugo9ym8pchrlz6s42g.jpg" alt="Image description" width="880" height="660"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I recently came across an &lt;a href="https://aws.amazon.com/blogs/aws/introducing-amazon-s3-object-lambda-use-your-code-to-process-data-as-it-is-being-retrieved-from-s3/"&gt;S3 feature&lt;/a&gt; that lets you attach a Lambda function to any AWS S3 bucket and run your conversions, transformations and alterations on the fly, while still using all the features that S3 + CloudFront provide.&lt;/p&gt;

&lt;h2&gt;
  
  
  Use cases
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Compressing or decompressing files as they are being downloaded.&lt;/li&gt;
&lt;li&gt;Resizing and watermarking images on the fly.&lt;/li&gt;
&lt;li&gt;Converting across data formats, such as converting XML to JSON.&lt;/li&gt;
&lt;li&gt;Implementing custom authorization rules to access data.&lt;/li&gt;
&lt;li&gt;Many more (left to the devs’ creativity).&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Create an Access Point for the S3 bucket.&lt;/li&gt;
&lt;li&gt;Create an Object Lambda Access Point, attaching the Access Point created above and a Lambda function with the required permissions, such as GetObject.&lt;/li&gt;
&lt;li&gt;Write code in that Lambda to retrieve and modify the files according to your needs. &lt;a href="https://bit.ly/3Vr7ywC"&gt;Example Code&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1twRhynu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u57w67phfrb2nuaarqd7.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1twRhynu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u57w67phfrb2nuaarqd7.jpg" alt="Create an S3 Object Lambda Access Point from the S3 Management Console." width="880" height="660"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iX1VkUNs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9o62mwoekdynl7clqqp9.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iX1VkUNs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9o62mwoekdynl7clqqp9.jpg" alt="Create an Object Lambda Access Point from S3 Console&amp;lt;br&amp;gt;
" width="880" height="660"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QejQdc5N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8unjs0uz1ac50kpyd63e.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QejQdc5N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8unjs0uz1ac50kpyd63e.jpg" alt="Access Point and Lambda inputs, with required permissions like GetObject" width="880" height="660"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Example
&lt;/h2&gt;

&lt;p&gt;Let’s use the example of image transformation on the fly.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A user asks for sunset_600x400.jpg.&lt;/li&gt;
&lt;li&gt;The Lambda function gets invoked.&lt;/li&gt;
&lt;li&gt;If sunset_600x400.jpg is available, return 200 success and continue downloading the file.&lt;/li&gt;
&lt;li&gt;If sunset_600x400.jpg is not available, remove the suffix _600x400 and look for an image named sunset.jpg.&lt;/li&gt;
&lt;li&gt;Resize it to fit the maximum width and height encoded in the file name, and save it in S3 as sunset_600x400.jpg.&lt;/li&gt;
&lt;li&gt;Return 200 success and continue downloading the file.&lt;/li&gt;
&lt;/ul&gt;
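&lt;p&gt;The suffix handling in this flow can be sketched in a few lines of Python (parse_resize_request is a hypothetical helper name):&lt;/p&gt;

```python
import re

# Matches names like "sunset_600x400.jpg": base name, width, height, extension.
SIZE_SUFFIX = re.compile(r"^(.+)_(\d+)x(\d+)\.(\w+)$")


def parse_resize_request(key):
    """Split 'sunset_600x400.jpg' into ('sunset.jpg', (600, 400)).

    Returns (key, None) when the name carries no size suffix, meaning the
    object should be served as-is.
    """
    m = SIZE_SUFFIX.match(key)
    if m is None:
        return key, None
    base, w, h, ext = m.groups()
    return f"{base}.{ext}", (int(w), int(h))
```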
&lt;h2&gt;
  
  
  Caveats
&lt;/h2&gt;

&lt;p&gt;The Lambda gets invoked on every S3 file access, so you pay extra for those Lambda invocations.&lt;/p&gt;


&lt;h2&gt;
  
  
  In Conclusion…
&lt;/h2&gt;

&lt;p&gt;With this new feature from AWS, you can do file transformations, image resizing/compression, access restrictions etc. on the fly.&lt;/p&gt;

&lt;p&gt;Let me know if you find any other use cases.&lt;/p&gt;



&lt;p&gt;That’s it! Please let me know about your views and comment below for any clarifications.&lt;/p&gt;

&lt;p&gt;If you found value in reading this, please consider sharing it with your friends and also on social media 🙏&lt;/p&gt;

&lt;p&gt;Also, to be notified about my upcoming articles, subscribe to my newsletter below (I’ll not spam you 😂)&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
      &lt;div class="c-embed__cover"&gt;
        &lt;a href="https://blogofcodes.substack.com/" class="c-link s:max-w-50 align-middle" rel="noopener noreferrer"&gt;
          &lt;img alt="" src="https://res.cloudinary.com/practicaldev/image/fetch/s--BRaZ_y5S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://substackcdn.com/image/fetch/w_1008%2Ch_528%2Cc_fill%2Cf_jpg%2Cq_auto:best%2Cfl_progressive:steep/https%253A%252F%252Fblogofcodes.substack.com%252Ftwitter%252Fsubscribe-card.svg%253Fv%253Da6ca445a50b461c5c738c0161190d16f%2526version%253D7" height="461" class="m-0" width="880"&gt;
        &lt;/a&gt;
      &lt;/div&gt;
    &lt;div class="c-embed__body"&gt;
      &lt;h2 class="fs-xl lh-tight"&gt;
        &lt;a href="https://blogofcodes.substack.com/" rel="noopener noreferrer" class="c-link"&gt;
          Blog of Codes | Rahul | Substack
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;p class="truncate-at-3"&gt;
          Articles about cloud architecture and programming. Click to read Blog of Codes, by Rahul, a Substack publication. Launched 2 months ago.
        &lt;/p&gt;
      &lt;div class="color-secondary fs-s flex items-center"&gt;
          &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://res.cloudinary.com/practicaldev/image/fetch/s--Tp6HD1dD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://substackcdn.com/icons/substack/favicon.ico" width="64" height="64"&gt;
        blogofcodes.substack.com
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;You can find me on &lt;a href="https://twitter.com/sairahul1"&gt;Twitter&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/sairahul1/"&gt;LinkedIn&lt;/a&gt; ✌️&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>s3</category>
    </item>
    <item>
      <title>How to build caching service with 10% of your Redis, DynamoDB costs</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Mon, 17 Oct 2022 08:11:17 +0000</pubDate>
      <link>https://dev.to/sairahul1/how-to-build-caching-service-with-10-of-your-redis-dynamodb-costs-29ng</link>
      <guid>https://dev.to/sairahul1/how-to-build-caching-service-with-10-of-your-redis-dynamodb-costs-29ng</guid>
      <description>&lt;p&gt;&lt;strong&gt;Learn how to utilise FREE cloud services and cache like Redis, DynamoDB with 10x reduction in costs&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before we dive in: this is just a theory. I haven’t seen anyone implement it yet, so use it only if you understand the numbers.&lt;/p&gt;

&lt;p&gt;So recently, I was reading a &lt;a href="https://medium.com/@jaderd/no-s3-isnt-more-expensive-than-dynamodb-if-you-can-afford-the-extra-latency-50f6e5b06c17"&gt;blog&lt;/a&gt; arguing that S3 isn’t more expensive than DynamoDB, if you can afford the extra latency.&lt;/p&gt;

&lt;p&gt;That gave me an idea: why not use some cloud providers’ FREE services to build a caching service?&lt;/p&gt;

&lt;h2&gt;
  
  
  Limitations
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Latency of around 100 ms.&lt;/li&gt;
&lt;li&gt;No custom invalidation.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Services
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Backblaze&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Similar to S3&lt;/li&gt;
&lt;li&gt;10 GB of FREE storage&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Cloudflare&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The service is completely FREE.&lt;/li&gt;
&lt;li&gt;Provides edge caching.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
Here are the steps
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Create a storage bucket in Backblaze.&lt;/li&gt;
&lt;li&gt;Point a route on your Cloudflare DNS at the bucket above.&lt;/li&gt;
&lt;li&gt;Enable Cloudflare edge caching on that route.&lt;/li&gt;
&lt;li&gt;Now you can upload to the bucket with the filename as the key and the file data as the value.&lt;/li&gt;
&lt;li&gt;Get/serve your value/file from the Cloudflare route, which acts as the cache.&lt;/li&gt;
&lt;/ul&gt;
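&lt;p&gt;A minimal read-through client over these steps might look like this sketch (cache.example.com is a placeholder for your Cloudflare-proxied route):&lt;/p&gt;

```python
import urllib.parse
import urllib.request


class EdgeCache:
    """Minimal read-through client for the Backblaze-behind-Cloudflare setup.

    base_url is the Cloudflare-proxied route in front of the bucket,
    e.g. "https://cache.example.com" (a hypothetical domain).
    """

    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")

    def url_for(self, key):
        # The filename is the cache key, so URL-encode it into the object path.
        return f"{self.base_url}/{urllib.parse.quote(key, safe='')}"

    def get(self, key):
        # Served from Cloudflare's edge cache when hot; hits the B2 bucket otherwise.
        with urllib.request.urlopen(self.url_for(key)) as resp:
            return resp.read()
```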

&lt;h2&gt;
  
  
  Cost Estimation
&lt;/h2&gt;

&lt;p&gt;Let’s take an example of 10 billion entries of 1KB each with 100 billion reads (I know it’s massive)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DynamoDB&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;On-demand pricing of DynamoDB

&lt;ul&gt;
&lt;li&gt;$0.25 per GB of storage&lt;/li&gt;
&lt;li&gt;$1.25 per million write requests&lt;/li&gt;
&lt;li&gt;$0.25 per million read requests&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Calculations

&lt;ul&gt;
&lt;li&gt;10 billion * 1 KB = 10 TB of storage * $0.25 per GB =&amp;gt; $2,500&lt;/li&gt;
&lt;li&gt;10 billion inserts * $1.25 per million inserts =&amp;gt; $12,500&lt;/li&gt;
&lt;li&gt;100 billion reads * $0.25 per million reads =&amp;gt; $25,000&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Total = $2,500 + $12,500 + $25,000 = $40,000&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Backblaze + Cloudflare&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Backblaze cloud storage&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;$5 per TB of storage&lt;/li&gt;
&lt;li&gt;Unlimited FREE inserts/uploads&lt;/li&gt;
&lt;li&gt;$0.004 for 10k reads&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Considerations&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cloudflare DNS and edge caching is completely FREE&lt;/li&gt;
&lt;li&gt;90% of reads will be from Cloudflare cache, which is FREE. So only 10% will hit Backblaze bucket and get charged&lt;/li&gt;
&lt;li&gt;Cloudflare and Backblaze are in a &lt;a href="https://www.cloudflare.com/en-in/partners/technology-partners/backblaze/"&gt;bandwidth alliance&lt;/a&gt;, which gives us zero egress fees for these files.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Calculations&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;10 billion * 1 KB = 10 TB of storage * $5 per TB =&amp;gt; $50&lt;/li&gt;
&lt;li&gt;10 billion inserts * Unlimited FREE inserts =&amp;gt; $0&lt;/li&gt;
&lt;li&gt;100 billion reads * $0.004 per 10k reads * 0.1 (10% of reads) =&amp;gt; $4,000&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Total = $50 + $0 + $4,000 = $4,050&lt;/p&gt;

&lt;p&gt;$40,000 =&amp;gt; $4,050&lt;/p&gt;

&lt;p&gt;10x decrease 🤯&lt;/p&gt;
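&lt;p&gt;You can sanity-check the arithmetic above with a few lines of Python:&lt;/p&gt;

```python
entries = 10 * 10**9   # 10 billion items, 1 KB each
reads = 100 * 10**9    # 100 billion reads

# DynamoDB on-demand
storage_gb = entries / 10**6  # 1 KB per item -> 10,000 GB (10 TB)
dynamo = storage_gb * 0.25 + (entries / 10**6) * 1.25 + (reads / 10**6) * 0.25

# Backblaze + Cloudflare: 90% of reads served FREE from the edge cache
storage_tb = storage_gb / 1000
b2_reads = reads * 0.1
alt = storage_tb * 5 + 0 + (b2_reads / 10**4) * 0.004

ratio = dynamo / alt  # roughly a 10x reduction
```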




&lt;h2&gt;
  
  
  In Conclusion…
&lt;/h2&gt;

&lt;p&gt;I know this is completely theoretical. I also didn’t take many things into account while calculating the numbers: provisioned capacity, FREE tiers, and so on.&lt;/p&gt;

&lt;p&gt;But I would say that if you can afford slightly higher latency, and live with some caveats, this can work as a cache and an alternative to DynamoDB or Redis.&lt;/p&gt;




&lt;p&gt;That’s it! Please let me know about your views and comment below for any clarifications.&lt;/p&gt;

&lt;p&gt;If you found value in reading this, please consider sharing it with your friends and also on social media 🙏&lt;/p&gt;



&lt;p&gt;You can find me on &lt;a href="https://twitter.com/sairahul1"&gt;Twitter&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/sairahul1/"&gt;LinkedIn&lt;/a&gt; ✌️&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
    </item>
    <item>
      <title>How to schedule jobs at scale</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Mon, 10 Oct 2022 12:59:08 +0000</pubDate>
      <link>https://dev.to/sairahul1/how-to-schedule-jobs-at-scale-47mh</link>
      <guid>https://dev.to/sairahul1/how-to-schedule-jobs-at-scale-47mh</guid>
      <description>&lt;p&gt;Let's discuss how to schedule and process millions of jobs.&lt;/p&gt;

&lt;p&gt;Use cases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build a social media scheduler, where users can schedule their posts (say, every day at 7:30 PM) and have them posted to their social media accounts automatically.&lt;/li&gt;
&lt;li&gt;Build a product where users can trigger their ETL jobs or some code every 10 minutes.&lt;/li&gt;
&lt;li&gt;Build a status page product that checks your product’s health, speed etc. every minute.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now let's dive into some solutions and their cons.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cloudwatch/Eventbridge rules
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Add a rule in EventBridge for every customer schedule, and let that rule take care of triggering the particular job.&lt;/li&gt;
&lt;li&gt;Easy to set up, but there is a default limit of 300 rules per event bus, so it isn’t scalable.&lt;/li&gt;
&lt;li&gt;Although this solution doesn’t scale for your customers, it can be used for your own product’s jobs.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Code it
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Take all the jobs which run in the next 10 minutes.&lt;/li&gt;
&lt;li&gt;Add them to a "to-be-processed" table with their exact times.&lt;/li&gt;
&lt;li&gt;Continuously run a code/executor to get all the jobs which have execution time &amp;lt;= current time and are not yet processed, and execute them.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Let's say you are running this scheduler cron every 10 minutes, the current time is 7:28 PM, and you have job A scheduled at 7:32 PM.&lt;/li&gt;
&lt;li&gt;Now get all the jobs which need to run in the next 10 minutes, i.e. 7:28 PM to 7:38 PM, which includes job A.&lt;/li&gt;
&lt;li&gt;Then push them to the "to-be-processed" table with their exact times.&lt;/li&gt;
&lt;li&gt;Your executor code keeps fetching all the jobs which have execution time &amp;lt;= current time and are not yet processed, which eventually includes job A, and executes them.&lt;/li&gt;
&lt;/ol&gt;
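&lt;p&gt;The scheduler + executor flow above can be sketched with an in-memory list standing in for the "to-be-processed" table (a real system would use a database with atomic claims):&lt;/p&gt;

```python
from datetime import datetime, timedelta

to_be_processed = []  # stands in for the "to-be-processed" table


def schedule_window(jobs, now, window_minutes=10):
    # Steps 1-3: stage every job due within the window, with its exact time.
    horizon = now + timedelta(minutes=window_minutes)
    for job in jobs:
        if horizon >= job["run_at"] >= now:
            to_be_processed.append({**job, "processed": False})


def run_due(now):
    # Executor: run everything whose time has come and is not yet processed.
    executed = []
    for job in to_be_processed:
        if not job["processed"] and now >= job["run_at"]:
            job["processed"] = True  # with multiple executors, this flip
            # must be an atomic claim so the same job isn't run twice
            executed.append(job["name"])
    return executed


# Job A from the example: scheduled at 7:32 PM, scheduler runs at 7:28 PM.
job_a = {"name": "A", "run_at": datetime(2022, 10, 10, 19, 32)}
schedule_window([job_a], datetime(2022, 10, 10, 19, 28))
```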

&lt;p&gt;&lt;strong&gt;Issues:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Not scalable with one executor.&lt;/li&gt;
&lt;li&gt;If working with multiple executor threads, you need to handle concurrency so the same job isn’t executed multiple times.&lt;/li&gt;
&lt;li&gt;This can re-run failed jobs, but you should write logic to stop after a certain number of failures.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  SQS
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--I3fXKsum--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1r4wye15gaw3c0nknehd.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--I3fXKsum--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1r4wye15gaw3c0nknehd.gif" alt="Image description" width="500" height="281"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the delay feature in SQS. This lets us send messages now and consume them after some time.&lt;/li&gt;
&lt;li&gt;Now take all the jobs which run in the next 10 minutes.&lt;/li&gt;
&lt;li&gt;Add each to the SQS queue with its specific delay.&lt;/li&gt;
&lt;li&gt;Attach a Lambda consumer/executor to the SQS queue, which will consume messages as they become available and execute the jobs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Same example: let’s say you are running this scheduler cron every 10 minutes, the current time is 7:28 PM, and you have job A scheduled at 7:32 PM.&lt;/li&gt;
&lt;li&gt;Now get all the jobs which need to run in the next 10 minutes, i.e. 7:28 PM to 7:38 PM, which includes job A.&lt;/li&gt;
&lt;li&gt;Push each to SQS with its specific delay. Job A needs to run at 7:32 PM, which is 5 minutes after the current time of 7:28 PM, so push job A with a 5 minute delay.&lt;/li&gt;
&lt;li&gt;The Lambda consumer gets messages as they become available. So job A becomes visible to the consumer after 5 minutes, i.e. at 7:32 PM, and it executes the job.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Does it solve previous solutions issues?&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Since SQS and Lambda are serverless and highly scalable, this solution is scalable too.&lt;/li&gt;
&lt;li&gt;Since SQS delivers a message to only one consumer at a time, the same job doesn’t get executed multiple times.&lt;/li&gt;
&lt;li&gt;If a job fails and we don’t delete its message from SQS, it will re-run automatically after the visibility timeout we specify.&lt;/li&gt;
&lt;li&gt;Jobs that fail multiple times can be removed from further processing automatically via a dead-letter queue.&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  In Conclusion…
&lt;/h2&gt;

&lt;p&gt;With this architecture, you can schedule a very large number of jobs. Let me know if you have a better solution in mind.&lt;/p&gt;




&lt;p&gt;That’s it! Please let me know about your views and comment below for any clarifications.&lt;/p&gt;

&lt;p&gt;If you found value in reading this, please consider sharing it with your friends and also on social media 🙏&lt;/p&gt;



&lt;p&gt;You can find me on &lt;a href="https://twitter.com/sairahul1"&gt;Twitter&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/sairahul1/"&gt;LinkedIn&lt;/a&gt; ✌️&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>aws</category>
    </item>
    <item>
      <title>How to avoid top 5 mistakes in SQL</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Mon, 03 Oct 2022 15:04:32 +0000</pubDate>
      <link>https://dev.to/sairahul1/how-to-avoid-top-5-mistakes-in-sql-4lfb</link>
      <guid>https://dev.to/sairahul1/how-to-avoid-top-5-mistakes-in-sql-4lfb</guid>
      <description>&lt;p&gt;Let’s discuss some tips and tricks to avoid major mistakes in SQL&lt;/p&gt;

&lt;h2&gt;
  
  
Never use SELECT *
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;SELECT * gets the data from all the columns, which increases latency and is expensive for huge datasets.&lt;/li&gt;
&lt;li&gt;Instead, get only the required fields, which limits the size of each record.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6GPAXWzH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kgrqdyaaqklk3jujau0i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6GPAXWzH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kgrqdyaaqklk3jujau0i.png" alt="Image description" width="880" height="175"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Use EXISTS() Instead of COUNT()
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;If you want to know whether a record/result exists or not, it is better to use EXISTS() rather than COUNT().&lt;/li&gt;
&lt;li&gt;COUNT() browses the entire table to give you the number of records.&lt;/li&gt;
&lt;li&gt;Whereas EXISTS() returns as soon as it finds the first record for the query, saving you time and compute.&lt;/li&gt;
&lt;/ul&gt;
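&lt;p&gt;Here’s the idea in a runnable SQLite sketch (EXISTS returns 1/0 as soon as the answer is known):&lt;/p&gt;

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany(
    "INSERT INTO users (email) VALUES (?)",
    [("a@example.com",), ("b@example.com",)],
)

# EXISTS stops at the first matching row instead of counting them all.
found = conn.execute(
    "SELECT EXISTS(SELECT 1 FROM users WHERE email = ?)", ("a@example.com",)
).fetchone()[0]
missing = conn.execute(
    "SELECT EXISTS(SELECT 1 FROM users WHERE email = ?)", ("nobody@example.com",)
).fetchone()[0]
```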

&lt;h2&gt;
  
  
Functions on indexed columns render them useless
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Applying functions to indexed columns while querying prevents the database from using those indexes.&lt;/li&gt;
&lt;li&gt;In order to use the indexes, avoid wrapping indexed columns in functions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8u0rM95K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v39d747ridml2knznj19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8u0rM95K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v39d747ridml2knznj19.png" alt="Image description" width="880" height="221"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  LIMIT Statements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The most commonly used pagination is LIMIT with OFFSET, which is not optimal.&lt;/li&gt;
&lt;li&gt;It is fast for small, immediate sets like “LIMIT 0, 10”.&lt;/li&gt;
&lt;li&gt;But when the OFFSET is changed to 1000000, it takes too long: since the database doesn’t know where the 1000000th record is, it scans from the start until it finds that row.&lt;/li&gt;
&lt;li&gt;A better approach is to add another filter, preferably on an indexed column, like below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8nhKhUV1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b2t1y4ao0qdr9462l9h7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8nhKhUV1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b2t1y4ao0qdr9462l9h7.png" alt="Image description" width="880" height="204"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Use GROUP BY Instead of DISTINCT
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;DISTINCT is an expensive operation and may not use indexes even when they’re available.&lt;/li&gt;
&lt;li&gt;A faster and easier way to do the same is to use GROUP BY.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6Dbt0xB4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/24tiivzbne22hgs5s9xa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6Dbt0xB4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/24tiivzbne22hgs5s9xa.png" alt="Image description" width="880" height="196"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  In Conclusion…
&lt;/h2&gt;

&lt;p&gt;I haven’t covered every major SQL pitfall, but these are some of the top mistakes, made even by experienced developers. Now you get to avoid them 😁&lt;/p&gt;




&lt;p&gt;That’s it! Please let me know about your views and comment below for any clarifications.&lt;/p&gt;

&lt;p&gt;If you found value in reading this, please consider sharing it with your friends and also on social media 🙏&lt;/p&gt;



&lt;p&gt;You can find me on &lt;a href="https://twitter.com/sairahul1"&gt;Twitter&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/sairahul1/"&gt;LinkedIn&lt;/a&gt; ✌️&lt;/p&gt;

</description>
      <category>sql</category>
      <category>beginners</category>
    </item>
    <item>
      <title>How to save upto 50% of your AWS costs</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Mon, 26 Sep 2022 08:32:24 +0000</pubDate>
      <link>https://dev.to/sairahul1/how-to-save-upto-50-of-your-aws-costs-1149</link>
      <guid>https://dev.to/sairahul1/how-to-save-upto-50-of-your-aws-costs-1149</guid>
      <description>&lt;p&gt;Let’s discuss some tips and tricks to reduce the cost of the most used AWS services - RDS, EC2, S3, DynamoDB, CloudFront.&lt;/p&gt;

&lt;h2&gt;
  
  
  RDS
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Migrate to the latest Graviton-based instances. According to AWS, Graviton2 instances provide up to 35% better performance and up to 52% better price/performance.&lt;/li&gt;
&lt;li&gt;Opt for reserved instances if you know your load. You can save up to 60% compared to a normal On-Demand database.&lt;/li&gt;
&lt;li&gt;Upgrade to the latest SQL version for higher performance, speed and also security.&lt;/li&gt;
&lt;li&gt;Don’t use Multi-AZ if you don’t need it.&lt;/li&gt;
&lt;li&gt;Downgrade or auto-shutdown staging and testing instances when not in use.&lt;/li&gt;
&lt;li&gt;Disable Performance Insights for non-prod databases.&lt;/li&gt;
&lt;li&gt;Automatically delete snapshots that are too old.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  EC2
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Migrate to the latest Graviton-based instances.&lt;/li&gt;
&lt;li&gt;Opt for reserved instances if you know your load. Up to 72% savings compared to normal On-Demand EC2 instances.&lt;/li&gt;
&lt;li&gt;Move infrequent batch jobs to Lambda instead of running EC2 all day long. 1 million executions are FREE per month.&lt;/li&gt;
&lt;li&gt;You can also pick spot instances for testing and staging environments. Up to 90% discount compared to normal On-Demand EC2 instances.&lt;/li&gt;
&lt;li&gt;Upgrade to the latest EC2 instance generations.&lt;/li&gt;
&lt;li&gt;Downgrade or auto-shutdown staging and testing instances when not in use.&lt;/li&gt;
&lt;li&gt;Use several smaller instances instead of one large instance, so that you can scale up and down when necessary.&lt;/li&gt;
&lt;li&gt;Choose EC2 in the same region as other services like S3 and DynamoDB to avoid unnecessary data transfer costs.&lt;/li&gt;
&lt;/ul&gt;
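&lt;p&gt;As a quick sanity check on the Lambda tip: an hourly batch job uses only a tiny fraction of the free tier, while an always-on EC2 instance bills around the clock:&lt;/p&gt;

```python
# An hourly batch job needs only ~720 invocations a month, far below
# Lambda's free tier of 1,000,000 invocations, while an always-on
# EC2 instance bills for every hour whether the job runs or not.
FREE_INVOCATIONS = 1_000_000

def monthly_invocations(runs_per_day, days=30):
    return runs_per_day * days

hourly_job = monthly_invocations(24)  # 720
headroom = FREE_INVOCATIONS - hourly_job
print(f"{hourly_job} invocations/month, {headroom:,} free invocations to spare")
```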

&lt;h2&gt;
  
  
  S3
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Enable S3 Intelligent-Tiering to automatically move infrequently accessed files to lower-priced storage tiers. Can save up to 95% on storage costs for data that is not accessed for months, or even years, at a time.&lt;/li&gt;
&lt;li&gt;Store files in compressed formats where possible.&lt;/li&gt;
&lt;li&gt;Serve objects through CloudFront to reduce request and data transfer costs.&lt;/li&gt;
&lt;/ul&gt;
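&lt;p&gt;One way to enable this in code is a lifecycle rule that transitions objects to the INTELLIGENT_TIERING storage class. Here is a minimal sketch of the payload shape boto3 expects; the bucket name and rule ID are placeholders:&lt;/p&gt;

```python
# Sketch of an S3 lifecycle rule that moves every object to the
# INTELLIGENT_TIERING storage class immediately (after 0 days).
# This is the dict shape boto3's put_bucket_lifecycle_configuration
# expects; "my-bucket" and the rule ID are placeholders.
lifecycle_config = {
    "Rules": [
        {
            "ID": "move-to-intelligent-tiering",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix: apply to every object
            "Transitions": [
                {"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}
            ],
        }
    ]
}

# With AWS credentials configured, you would apply it like:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=lifecycle_config)
print(lifecycle_config["Rules"][0]["Transitions"][0]["StorageClass"])
```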

&lt;h2&gt;
  
  
  DynamoDB
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Check your minimum load and use provisioned capacity. 25 Write Capacity Units and 25 Read Capacity Units are FREE every month.&lt;/li&gt;
&lt;li&gt;Use on-demand capacity for rarely used tables.&lt;/li&gt;
&lt;li&gt;Prefer queries over scans.&lt;/li&gt;
&lt;li&gt;Use shorter attribute names.&lt;/li&gt;
&lt;li&gt;Avoid strongly consistent reads and transactions wherever possible.&lt;/li&gt;
&lt;/ul&gt;
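&lt;p&gt;Shorter attribute names help because names count toward the billed item size. A rough size estimate (string lengths only, ignoring DynamoDB’s per-type overhead; the attribute names are made up):&lt;/p&gt;

```python
# Approximate DynamoDB item size: attribute names and string values
# both count toward the billed item size (this ignores type overhead).
def approx_item_size(item):
    return sum(len(k) + len(str(v)) for k, v in item.items())

verbose = {"customerEmailAddress": "a@b.co", "registrationTimestamp": "1700000000"}
compact = {"email": "a@b.co", "ts": "1700000000"}

print(approx_item_size(verbose), "vs", approx_item_size(compact), "bytes")
```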

&lt;h2&gt;
  
  
  CloudFront
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Enable automatic compression in CloudFront to reduce data transfer.&lt;/li&gt;
&lt;li&gt;Opt for the savings bundle if you know your load; it can save up to 30% compared to on-demand pricing.&lt;/li&gt;
&lt;/ul&gt;
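&lt;p&gt;Compression pays off because CloudFront bills for bytes transferred out, and text assets often shrink by 60-80% under gzip or brotli. A rough estimate; the compression ratio and the per-GB rate are illustrative placeholders:&lt;/p&gt;

```python
# Rough data-transfer savings from compression. The compression
# ratio and the per-GB rate are illustrative placeholders; check
# the CloudFront pricing page for real rates.
def transfer_cost(gb, rate_per_gb):
    return gb * rate_per_gb

uncompressed_gb = 100
ratio = 0.70  # assume text compresses by ~70%
rate = 0.085  # placeholder $/GB; varies by region and usage tier

before = transfer_cost(uncompressed_gb, rate)
after = transfer_cost(uncompressed_gb * (1 - ratio), rate)
print(f"${before:.2f} before, ${after:.2f} after compression")
```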




&lt;h2&gt;
  
  
  In Conclusion…
&lt;/h2&gt;

&lt;p&gt;I haven’t covered every AWS service or every cost-saving tip, but these are some ways to cut your AWS cloud bill by up to 50%.&lt;/p&gt;




&lt;p&gt;That’s it! Please let me know about your views and comment below for any clarifications.&lt;/p&gt;

&lt;p&gt;If you found value in reading this, please consider sharing it with your friends and also on social media 🙏&lt;/p&gt;

&lt;p&gt;Also, to be notified about my upcoming articles, subscribe to my newsletter below (I’ll not spam you 😂)&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
      &lt;div class="c-embed__cover"&gt;
        &lt;a href="https://blogofcodes.substack.com/" class="c-link s:max-w-50 align-middle" rel="noopener noreferrer"&gt;
          &lt;img alt="" src="https://res.cloudinary.com/practicaldev/image/fetch/s--BRaZ_y5S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://substackcdn.com/image/fetch/w_1008%2Ch_528%2Cc_fill%2Cf_jpg%2Cq_auto:best%2Cfl_progressive:steep/https%253A%252F%252Fblogofcodes.substack.com%252Ftwitter%252Fsubscribe-card.svg%253Fv%253Da6ca445a50b461c5c738c0161190d16f%2526version%253D7" height="461" class="m-0" width="880"&gt;
        &lt;/a&gt;
      &lt;/div&gt;
    &lt;div class="c-embed__body"&gt;
      &lt;h2 class="fs-xl lh-tight"&gt;
        &lt;a href="https://blogofcodes.substack.com/" rel="noopener noreferrer" class="c-link"&gt;
          Blog of Codes | Rahul | Substack
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;p class="truncate-at-3"&gt;
          Articles about cloud architecture and programming. Click to read Blog of Codes, by Rahul, a Substack publication. Launched a month ago.
        &lt;/p&gt;
      &lt;div class="color-secondary fs-s flex items-center"&gt;
          &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://res.cloudinary.com/practicaldev/image/fetch/s--Tp6HD1dD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://substackcdn.com/icons/substack/favicon.ico" width="64" height="64"&gt;
        blogofcodes.substack.com
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;You can find me on &lt;a href="https://twitter.com/sairahul1"&gt;Twitter&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/sairahul1/"&gt;LinkedIn&lt;/a&gt; ✌️&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
    </item>
    <item>
      <title>How to host your product for FREE</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Mon, 19 Sep 2022 05:00:04 +0000</pubDate>
      <link>https://dev.to/sairahul1/how-to-host-your-product-for-free-c8d</link>
      <guid>https://dev.to/sairahul1/how-to-host-your-product-for-free-c8d</guid>
      <description>&lt;p&gt;Let’s discuss various cloud providers and their FREE services that let you host your product for FREE 😁, for up to 50k monthly active users 🤯. Frontend, backend, database, authentication, emails and more.&lt;/p&gt;

&lt;h2&gt;
  
  
  Authentication
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Firebase Authentication&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Can be used by up to 50k monthly active users for FREE. &lt;/li&gt;
&lt;li&gt;Supports Google, FB, Phone, and Email.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;AWS Cognito&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;FREE till 50k monthly active users.&lt;/li&gt;
&lt;li&gt;Supports Google, FB, Phone, and Email.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Although you could use the Google and FB SDKs to implement their logins directly, it’s much easier to implement and maintain users with Firebase Auth or AWS Cognito.&lt;/p&gt;

&lt;h2&gt;
  
  
  Frontend
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Cloudflare Pages&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;With Git integration, you can directly deploy your website with every commit.&lt;/li&gt;
&lt;li&gt;Can preview every commit in every branch before going live.&lt;/li&gt;
&lt;li&gt;No CI/CD setup is required.&lt;/li&gt;
&lt;li&gt;FREE plan supports up to 500 builds per month. That means you can push 16 times a day, preview and deploy to prod for FREE. &lt;/li&gt;
&lt;li&gt;Unlimited bandwidth. So you can have unlimited visitors to your website for FREE. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Other notable services are AWS Amplify Hosting and AWS S3 + CloudFront.&lt;/p&gt;

&lt;h2&gt;
  
  
  Media
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Backblaze + Cloudflare&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Store assets on Backblaze and serve them through the Cloudflare CDN.&lt;/li&gt;
&lt;li&gt;Backblaze provides 10 GB of storage for FREE, and the Cloudflare CDN is completely FREE to use.&lt;/li&gt;
&lt;li&gt;Also, the Bandwidth Alliance between Backblaze and Cloudflare gives us zero egress fees for this media.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Other notable services: AWS S3 + CloudFront/Cloudflare.&lt;/p&gt;

&lt;h2&gt;
  
  
  Backend
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AWS Lambda&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Serverless - deploy your backend code on Lambda and serve your API without worrying about scale.&lt;/li&gt;
&lt;li&gt;Lambda even has a function URL, so you can call it directly like an API, without any API Gateway.&lt;/li&gt;
&lt;li&gt;AWS Lambda has a very generous FREE tier of 1 million invocations (at 128 MB RAM and 3 sec runtime) per month, which translates to 33.3k API hits per day.&lt;/li&gt;
&lt;/ul&gt;
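&lt;p&gt;The 33.3k/day figure is just the monthly free tier spread over 30 days; the 400,000 GB-seconds of free compute also comfortably covers 1 million invocations at 128 MB and 3 seconds each:&lt;/p&gt;

```python
# Lambda's free tier is 1M requests AND 400,000 GB-seconds of compute
# per month. At 128 MB (0.125 GB) and 3 s per invocation, the compute
# allowance alone would cover about 1.07M invocations, so the 1M
# request cap is the effective limit: roughly 33.3k API hits per day.
FREE_REQUESTS = 1_000_000
FREE_GB_SECONDS = 400_000

covered_by_compute = int(FREE_GB_SECONDS / (0.125 * 3))  # 1066666
per_day = min(FREE_REQUESTS, covered_by_compute) // 30   # 33333
print(per_day, "invocations/day within the free tier")
```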

&lt;p&gt;&lt;strong&gt;AWS Elastic Beanstalk&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS gives 1 EC2 server with 1 GB RAM and 2 vCPUs, with elastic load balancing, for 750 hours FREE, which covers the entire month.&lt;/li&gt;
&lt;li&gt;With this config, I was able to serve 50 req/sec without any issues (using Golang 😁)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Database
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AWS RDS&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Postgres or MySQL with 1 GB RAM, 20 GB SSD, and 2 vCPUs is FREE for an entire month.&lt;/li&gt;
&lt;li&gt;Can easily handle 50 req/sec (if query latency is less than a second 😄)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;AWS DynamoDB&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;DynamoDB is a NoSQL key-value database.&lt;/li&gt;
&lt;li&gt;25 GB of storage, plus 25 provisioned Write and 25 Read Capacity Units, FREE each month.&lt;/li&gt;
&lt;li&gt;This is enough to handle up to 200 million requests per month, with 25 writes/sec up to 1 KB and 50 reads/sec up to 4 KB.&lt;/li&gt;
&lt;/ul&gt;
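&lt;p&gt;The 200 million figure follows from the free capacity units (1 WCU = 1 write/sec up to 1 KB; 1 RCU = 2 eventually consistent reads/sec up to 4 KB):&lt;/p&gt;

```python
# 25 WCU = 25 writes/sec (items up to 1 KB); 25 RCU = 50 eventually
# consistent reads/sec (items up to 4 KB, 2 reads per RCU).
SECONDS_PER_MONTH = 86_400 * 30

writes_per_sec = 25      # 1 WCU = 1 write/sec
reads_per_sec = 25 * 2   # 1 RCU = 2 eventually consistent reads/sec

monthly = (writes_per_sec + reads_per_sec) * SECONDS_PER_MONTH
print(f"{monthly:,} requests/month")  # 194,400,000 - roughly 200 million
```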

&lt;p&gt;&lt;strong&gt;AWS Elasticache&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Redis or Memcached with 0.5 GB RAM and 2 vCPUs can be used FREE for an entire month.&lt;/li&gt;
&lt;li&gt;At 500 B per key, that’s room for 1 million keys in RAM.&lt;/li&gt;
&lt;li&gt;You can use AWS MemoryDB for Redis too.&lt;/li&gt;
&lt;/ul&gt;
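&lt;p&gt;The 1 million figure is simple arithmetic on the cache size:&lt;/p&gt;

```python
# At roughly 500 bytes per entry (key, value, and Redis overhead),
# a 0.5 GB cache holds about a million entries.
CACHE_BYTES = int(0.5e9)  # 0.5 GB, using decimal gigabytes
BYTES_PER_KEY = 500

print(CACHE_BYTES // BYTES_PER_KEY)  # 1000000
```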

&lt;p&gt;&lt;strong&gt;AWS DocumentDB&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS’s MongoDB-compatible database service.&lt;/li&gt;
&lt;li&gt;With 4 GB RAM, 2 vCPUs, and 5 GB storage, you can host a MongoDB-compatible database for FREE.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Monitoring
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AWS CloudWatch&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Monitor your servers, log your requests, etc.&lt;/li&gt;
&lt;li&gt;FREE up to 10 custom metrics, 10 alarms, and 5 GB of log ingestion.&lt;/li&gt;
&lt;li&gt;If you set logs to expire after 7 days, you’ll most likely never touch this limit 😊&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Email
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AWS SES&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Send 62k emails per month for FREE. That means 2k emails per day 🤯&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Notifications
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;You can use both Firebase Cloud Messaging and OneSignal.&lt;/li&gt;
&lt;li&gt;Send unlimited notifications every day, completely FREE.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  In Conclusion…
&lt;/h2&gt;

&lt;p&gt;There are many more cloud providers and FREE services that I didn’t cover in this article.&lt;/p&gt;

&lt;p&gt;But you get the point: you can host your entire product tech stack, for up to 50k monthly users, for FREE without opening your wallet.&lt;/p&gt;




&lt;p&gt;That’s it! Please let me know about your views and comment below for any clarifications.&lt;/p&gt;



</description>
      <category>aws</category>
      <category>cloud</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Django (python) or Golang?</title>
      <dc:creator>Rahul</dc:creator>
      <pubDate>Fri, 18 Feb 2022 05:16:06 +0000</pubDate>
      <link>https://dev.to/sairahul1/django-python-or-golang-58n0</link>
      <guid>https://dev.to/sairahul1/django-python-or-golang-58n0</guid>
      <description>&lt;p&gt;I’m starting a side project, and I’m unsure what to use for the backend.&lt;br&gt;
I’m an expert in Golang, but I also love Django’s ease of development.&lt;/p&gt;

&lt;p&gt;Any suggestions?&lt;/p&gt;

</description>
      <category>go</category>
      <category>django</category>
      <category>python</category>
    </item>
  </channel>
</rss>
