<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Dustin Goodman</title>
    <description>The latest articles on DEV Community by Dustin Goodman (@dustinsgoodman).</description>
    <link>https://dev.to/dustinsgoodman</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F571749%2F2a1e74a1-4560-401a-974e-21b52a038a46.png</url>
      <title>DEV Community: Dustin Goodman</title>
      <link>https://dev.to/dustinsgoodman</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dustinsgoodman"/>
    <language>en</language>
    <item>
      <title>Migrating a classic Express.js to Serverless Framework</title>
      <dc:creator>Dustin Goodman</dc:creator>
      <pubDate>Thu, 21 Apr 2022 03:37:03 +0000</pubDate>
      <link>https://dev.to/thisdotmedia/migrating-a-classic-expressjs-to-serverless-framework-584f</link>
      <guid>https://dev.to/thisdotmedia/migrating-a-classic-expressjs-to-serverless-framework-584f</guid>
      <description>&lt;h2&gt;
  
  
  Problem
&lt;/h2&gt;

&lt;p&gt;Classic Express.js applications are great for building backends. However, their deployment can be a bit tricky. There are several solutions on the market for making deployment "easier", like Heroku, AWS Elastic Beanstalk, Qovery, and Vercel. However, "easier" often means special configuration or higher service costs.&lt;/p&gt;

&lt;p&gt;In our case, we were trying to deploy an Angular frontend served through CloudFront, and we needed a separately deployed backend to manage an OAuth flow. We needed an easy-to-deploy solution that supported HTTPS and could be automated via CI.&lt;/p&gt;

&lt;h2&gt;
  
  
  Serverless Framework
&lt;/h2&gt;

&lt;p&gt;The Serverless Framework is a framework for building and deploying applications onto AWS Lambda, and it allowed us to easily migrate and deploy our Express.js server at a low cost with long-term maintainability. The process was so simple that it took us only an hour to migrate our existing API and deploy it for use in our production environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Serverless Init Script
&lt;/h2&gt;

&lt;p&gt;To start this process, we used the Serverless CLI to initialize a new Serverless Express.js project. This is an example of the settings we chose for our application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ serverless
What do you want to make? AWS - Node.js - Express API
What do you want to call this project? example-serverless-express

Downloading "aws-node-express-api" template...
Installing dependencies with "npm" in "example-serverless-express" folder
Project successfully created in example-serverless-express folder

What org do you want to add this service to? [Skip]
Do you want to deploy your project? No

Your project is ready for deployment and available in ./example-serverless-express
Run **serverless deploy** in the project directory
    Deploy your newly created service
Run **serverless info** in the project directory after deployment
    View your endpoints and services
Run **serverless invoke** and **serverless logs** in the project directory after deployment
    Invoke your functions directly and view the logs
Run **serverless** in the project directory
    Add metrics, alerts, and a log explorer, by enabling the dashboard functionality
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's a quick explanation of our choices:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What do you want to make?&lt;/strong&gt; This prompt offers several possible scaffolding options. In our case, the Express API was the perfect solution since that's what we were migrating.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What do you want to call this project?&lt;/strong&gt; You should put whatever you want here. It'll name the directory and define the naming schema for the resources you deploy to AWS.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What org do you want to add this service to?&lt;/strong&gt; This question assumes you are using the serverless.com dashboard for managing your deployments. We're choosing to use GitHub Actions and AWS tooling directly, so we opted out.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Do you want to deploy your project?&lt;/strong&gt; This will attempt to deploy your application immediately after scaffolding, using your default profile unless your AWS credentials are configured otherwise. We needed a custom profile configuration since we have several projects on different AWS accounts, so we opted out of the default deploy.&lt;/p&gt;
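
&lt;p&gt;For reference, a named AWS profile lives in &lt;code&gt;~/.aws/credentials&lt;/code&gt; and can be selected at deploy time with &lt;code&gt;serverless deploy --aws-profile &amp;lt;name&amp;gt;&lt;/code&gt; or via the &lt;code&gt;profile&lt;/code&gt; key in serverless.yml. The profile name and values below are placeholders:&lt;/p&gt;

```ini
# ~/.aws/credentials (placeholder profile; substitute your own keys)
[exampleprofile]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```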

&lt;h2&gt;
  
  
  Serverless Init Output
&lt;/h2&gt;

&lt;p&gt;The init script from above outputs the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;.gitignore&lt;/li&gt;
&lt;li&gt;handler.js&lt;/li&gt;
&lt;li&gt;package.json&lt;/li&gt;
&lt;li&gt;README.md&lt;/li&gt;
&lt;li&gt;serverless.yml&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The key files here are the generated serverless.yml and handler.js.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;serverless.yml&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;service: example-serverless-express
frameworkVersion: '2 || 3'

provider:
  name: aws
  runtime: nodejs12.x
  lambdaHashingVersion: '20201221'

functions:
  api:
    handler: handler.handler
    events:
      - httpApi: '*'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;handler.js&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const serverless = require("serverless-http");
const express = require("express");
const app = express();

app.get("/", (req, res, next) =&amp;gt; {
  return res.status(200).json({
    message: "Hello from root!",
  });
});

app.get("/hello", (req, res, next) =&amp;gt; {
  return res.status(200).json({
    message: "Hello from path!",
  });
});

app.use((req, res, next) =&amp;gt; {
  return res.status(404).json({
    error: "Not Found",
  });
});

module.exports.handler = serverless(app);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, this gives us a standard Express server that works out of the box. However, we needed to make some quality-of-life changes to help us migrate with confidence and to allow us to run our API locally for development.&lt;/p&gt;

&lt;h2&gt;
  
  
  Quality of Life Improvements
&lt;/h2&gt;

&lt;p&gt;There are several things that Serverless Framework doesn't provide out of the box that we needed to help our development process. Fortunately, there are great plugins we were able to install and configure quickly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Environment Variables
&lt;/h3&gt;

&lt;p&gt;We need per-environment variables as our OAuth providers are specific per host domain. &lt;a href="https://www.serverless.com/framework/docs/environment-variables"&gt;Serverless Framework supports &lt;code&gt;.env&lt;/code&gt; files out of the box&lt;/a&gt; but it does require you to install the dotenv package and to turn on the &lt;code&gt;useDotenv&lt;/code&gt; flag in the serverless.yml.&lt;/p&gt;
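
&lt;p&gt;As a minimal sketch (the variable names here are hypothetical, not from our actual app), with &lt;code&gt;useDotenv&lt;/code&gt; enabled, values from the &lt;code&gt;.env&lt;/code&gt; file surface on &lt;code&gt;process.env&lt;/code&gt; inside the handler:&lt;/p&gt;

```javascript
// Hypothetical per-environment OAuth settings; with useDotenv enabled,
// Serverless loads .env values into process.env for local runs and packaging.
const requireEnv = (env, name) => {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
};

const oauthConfig = (env = process.env) => ({
  clientId: requireEnv(env, 'OAUTH_CLIENT_ID'),
  redirectUri: requireEnv(env, 'OAUTH_REDIRECT_URI'),
});

// Example with explicit values, as a .env file might provide:
console.log(
  oauthConfig({
    OAUTH_CLIENT_ID: 'local-client-id',
    OAUTH_REDIRECT_URI: 'http://localhost:4000/callback',
  })
);
```

Failing fast on a missing variable surfaces misconfiguration at startup rather than deep inside an OAuth redirect.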

&lt;h3&gt;
  
  
  Babel/TypeScript Support
&lt;/h3&gt;

&lt;p&gt;As you can see in the above handler.js file, we're getting CommonJS instead of modern JavaScript or TypeScript. To get those, you need webpack or some other bundler. &lt;a href="https://github.com/serverless-heaven/serverless-webpack"&gt;serverless-webpack&lt;/a&gt; exists if you want full control over your ecosystem, but there is also &lt;a href="https://github.com/AnomalyInnovations/serverless-bundle"&gt;serverless-bundle&lt;/a&gt;, which gives you a set of reasonable defaults on webpack 4 out of the box. We opted for the latter to get started quickly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Offline Mode
&lt;/h3&gt;

&lt;p&gt;With classic Express servers, you can use a simple Node script to get the server up and running for local testing. Serverless, however, wants to be run in the AWS ecosystem, which makes local testing harder. Lucky for us, &lt;a href="https://github.com/dherault"&gt;David Hérault&lt;/a&gt; has built and continues to maintain &lt;a href="https://github.com/dherault/serverless-offline"&gt;serverless-offline&lt;/a&gt;, allowing us to emulate our functions locally before we deploy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Configuration
&lt;/h2&gt;

&lt;p&gt;Given these changes, our serverless.yml file now looks as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;service: starter-dev-backend
frameworkVersion: '2 || 3'
useDotenv: true # enable .env file support

plugins: # install serverless plugins
  - serverless-bundle
  - serverless-offline

custom: # configure serverless-offline
  serverless-offline:
    httpPort: 4000

provider:
  name: aws
  profile: exampleprofile # use your own profile
  stage: production # use your specified stage
  runtime: nodejs14.x
  lambdaHashingVersion: '20201221'

functions:
  api:
    handler: handler.handler
    events:
      - httpApi: '*'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Some important things to note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The order of serverless-bundle and serverless-offline in the plugins is critically important.&lt;/li&gt;
&lt;li&gt;The custom port for serverless-offline can be any unused port. Keep in mind what port your frontend server is using when setting this value for local development.&lt;/li&gt;
&lt;li&gt;We set the profile and stage in our provider configuration. This allowed us to specify the environment settings and the AWS profile credentials to use for our deployment.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With all this set, we're now ready to deploy the basic API.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deploying the new API
&lt;/h2&gt;

&lt;p&gt;Serverless deployment is very simple. We can run the following command in the project directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ serverless deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command will deploy the API to AWS, and create the necessary resources, including the API Gateway and related Lambdas. The first deploy will take roughly 5 minutes, and each subsequent deploy will only take a minute or two! In its output, you'll receive a bunch of information about the deployment, including the deployed URL, which will look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://&amp;lt;hash-string&amp;gt;.execute-api.&amp;lt;region&amp;gt;.amazonaws.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can now point your app at this API and start using it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;p&gt;A few issues remain to be resolved, but they are easily fixed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;New Lambdas are not deployed with their environment variables, which currently have to be set via the AWS console. We're just missing &lt;a href="https://www.serverless.com/framework/docs/providers/aws/guide/variables"&gt;some minor configuration in our serverless.yml&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Our project doesn't deploy on merges to main. For this, we can just use the &lt;a href="https://github.com/serverless/github-action"&gt;official Serverless GitHub Action&lt;/a&gt;. Alternatively, we could purchase a license for the Serverless Dashboard, but that option is a bit more expensive, and we're not using all of its features on this project. However, we've used it on other client projects, and it really helped us manage and monitor our deployments.&lt;/li&gt;
&lt;/ul&gt;
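
&lt;p&gt;As a hedged sketch of that first fix, forwarding values into the deployed functions via the &lt;code&gt;provider.environment&lt;/code&gt; block is likely all that's needed (the variable name here is hypothetical):&lt;/p&gt;

```yaml
# Hypothetical sketch: pass values from .env (via useDotenv) through to the
# deployed Lambdas so they no longer need to be set in the AWS console.
provider:
  environment:
    OAUTH_CLIENT_ID: ${env:OAUTH_CLIENT_ID}
```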

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Given all the above steps, we were able to get our API up and running in a few minutes. And because it is a 1-to-1 replacement for an existing Express server, we were able to port our existing implementation into this new Serverless implementation, deploy it to AWS, and start using it in just a couple of hours.&lt;/p&gt;

&lt;p&gt;This particular architecture is a great means of bootstrapping a new project, but it does come with some scaling issues for larger projects. As such, we do not recommend a pure Serverless-plus-Express setup for monolithic projects; instead, we suggest utilizing the amazing capabilities of the Serverless Framework with AWS to horizontally scale your application into smaller Lambda functions.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;This Dot Labs is a development consultancy focused on providing staff augmentation, architectural guidance, and consulting to companies.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;We help implement and teach modern web best practices with technologies such as React, Angular, Vue, Web Components, GraphQL, Node, and more.&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Debugging Node Serverless Functions on AWS Lambda</title>
      <dc:creator>Dustin Goodman</dc:creator>
      <pubDate>Thu, 21 Apr 2022 03:31:37 +0000</pubDate>
      <link>https://dev.to/dustinsgoodman/debugging-node-serverless-functions-on-aws-lambda-48kh</link>
      <guid>https://dev.to/dustinsgoodman/debugging-node-serverless-functions-on-aws-lambda-48kh</guid>
      <description>&lt;p&gt;How many times have you written a function locally, tested it, and had it working only for it to fail when you deployed it to AWS? This is probably more common than you realize and it's usually caused by a misunderstanding of Node or an issue with lambda configuration. In this post, I'll cover some of the most common debugging problems you'll encounter when writing serverless functions and how to fix them.&lt;/p&gt;

&lt;h2&gt;
  
  
  Improper Use of &lt;code&gt;async/await&lt;/code&gt;
&lt;/h2&gt;

&lt;p&gt;When I first started writing serverless functions in Node.js, I had a misconception about how asynchronous functions behave. I was under the impression that you could fire off an asynchronous function as a background process and it would run on its own thread. However, this is not the case. Asynchronous functions are executed on the Node.js event loop, not on a separate background thread. This means that if you invoke an asynchronous function without awaiting it, your handler may return before it runs, and the function may never execute. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;randomBackgroundFunction&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;This function may never run&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// do some stuff ...&lt;/span&gt;

  &lt;span class="nf"&gt;randomBackgroundFunction&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt; &lt;span class="c1"&gt;// &amp;lt;-- this most likely won't run without an await&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;randomBackgroundFunction&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt; &lt;span class="c1"&gt;// &amp;lt;-- this function will definitely run&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;goodResponse&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I say "may" because if no other code is running and the event loop is idle, the function will run, but once your handler returns, it's a race against the CPU clock. The AWS Lambda implementation tries to shutdown the Lambda once the response has executed or the Lambda's timeout has been reached (more to come on this topic!). So it's possible, your invocation may run before the shutdown process beings and you'll get lucky that it ran.&lt;/p&gt;

&lt;p&gt;Now you may be asking, "Dustin, how do I run my function in the background and ensure execution?" Luckily, there are 2 great solutions: asynchronous Lambda invocations or AWS's Simple Queue Service (SQS).&lt;/p&gt;

&lt;h3&gt;
  
  
  Asynchronous Lambda Invocations
&lt;/h3&gt;

&lt;p&gt;AWS built Lambda to have &lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html" rel="noopener noreferrer"&gt;asynchronous invocations&lt;/a&gt; as an out-of-the-box feature. This means you can invoke a Lambda from your primary handler and have it run in its own execution environment without blocking your main function. So you can rewrite our example from above like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// background.js&lt;/span&gt;
&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// do our background stuff like we may have before&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;This function will definitely run&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// main.js&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;LambdaClient&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;InvokeCommand&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@aws-sdk/client-lambda&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// do some stuff ...&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;LambdaClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;config&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;command&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;InvokeCommand&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;FunctionName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;background&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;InvocationType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Event&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// default here is 'RequestResponse'&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;command&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// this acts as a fire and forget&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;See the &lt;a href="https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-lambda/classes/invokecommand.html" rel="noopener noreferrer"&gt;AWS SDK v3 Docs&lt;/a&gt; for more details about the API in use. What we're doing is utilizing the &lt;code&gt;'Event'&lt;/code&gt; invocation type to tell Lambda to just trigger this function and not wait for a response. From &lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html" rel="noopener noreferrer"&gt;Lambda's docs&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Lambda manages the function's asynchronous event queue and attempts to retry on errors. If the function returns an error, Lambda attempts to run it two more times, with a one-minute wait between the first two attempts, and two minutes between the second and third attempts. Function errors include errors returned by the function's code and errors returned by the function's runtime, such as timeouts.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;With this, we get the benefit of the event queue without having to set it up and manage it ourselves. The downside is that we have to accept Lambda's default retry behavior for error handling, which gives us less flexibility.&lt;/p&gt;

&lt;h3&gt;
  
  
  AWS SQS
&lt;/h3&gt;

&lt;p&gt;Similar to invoking another Lambda, we can utilize SQS to send a message to a queue and have a separate function process it. As with the above example, we can generate a message in an inconsequential amount of time and send it to the queue. With this, we get the benefit of configurable retry behavior, but it comes at the cost of having to manage the queue ourselves. It also means our Lambda needs to know how to read event data from the SQS records instead of parsing a direct payload.&lt;/p&gt;
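
&lt;p&gt;Concretely, an SQS-triggered handler receives a batch of &lt;code&gt;Records&lt;/code&gt; whose bodies are JSON strings, rather than a direct payload. A minimal sketch (the message shape and field names are made up for illustration):&lt;/p&gt;

```javascript
// Sketch of an SQS-triggered Lambda handler: each record's body arrives
// as a JSON string that must be parsed, unlike a direct invocation payload.
const handler = async (event) => {
  const processed = [];
  for (const record of event.Records) {
    const message = JSON.parse(record.body);
    processed.push(message.task); // 'task' is a hypothetical field
  }
  return processed;
};

// A fake SQS event shaped like the one Lambda would deliver:
handler({
  Records: [{ body: JSON.stringify({ task: 'send-welcome-email' }) }],
}).then((result) => console.log(result)); // logs [ 'send-welcome-email' ]
```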

&lt;h2&gt;
  
  
  Lambda Timeouts
&lt;/h2&gt;

&lt;p&gt;Lambda's default timeout settings are the next major hurdle. If your Lambda needs to run for a while or process a lot of data, you might see your function quit suddenly and never reach a later point in your code. &lt;strong&gt;By default, the Serverless Framework deploys Lambdas with a 6-second timeout&lt;/strong&gt; (the AWS console default is 3 seconds). If you're waiting on additional services, long-running queries, or a cold-starting Lambda, this could prove problematic. A quick way to check your Lambda's timeout is to load up the AWS Console and look at the Lambda's general configuration at the bottom of the page. In the screenshot below, you'll see the Lambda I'm inspecting has a 5 minute timeout.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdustingoodman.dev%2Fblog-assets%2F20220420-lambda-configuration.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdustingoodman.dev%2Fblog-assets%2F20220420-lambda-configuration.webp" alt="Lambda Console Configuration Tab"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Lambda timeouts can be configured in one-second intervals up to 15 minutes. When I use the Serverless Framework, I typically set my Lambdas attached to API Gateway triggers to 29 seconds and those attached to SQS triggers to 15 minutes via the configuration file. I choose 29 seconds because API Gateway's maximum timeout is 30 seconds, and due to latency between API Gateway and the Lambda, a full 30 seconds is never truly available, so AWS warns when the timeout is set to exactly 30 seconds. Use your deployment configuration method of choice for setting timeouts, but confirm they are what you set them to be.&lt;/p&gt;
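
&lt;p&gt;In a serverless.yml, those values can be set per function with the &lt;code&gt;timeout&lt;/code&gt; property (in seconds). A sketch based on my usual settings; the function names and queue ARN are placeholders:&lt;/p&gt;

```yaml
functions:
  api:
    handler: handler.handler
    timeout: 29 # just under API Gateway's 30-second cap
    events:
      - httpApi: '*'
  worker:
    handler: worker.handler
    timeout: 900 # 15 minutes, the Lambda maximum
    events:
      - sqs:
          arn: arn:aws:sqs:us-east-1:000000000000:example-queue # placeholder ARN
```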

&lt;h2&gt;
  
  
  Other Things to Look out for
&lt;/h2&gt;

&lt;p&gt;These were two of the larger problems I've faced with relatively easy fixes. The following are some smaller problems that are either easy to fix but specific to your use of Lambda, or things I haven't experimented with yet but am aware of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Ensure your Lambda has access to all the resources it interfaces with. You'll need to check the IAM Role attached to your function via the console to see what permissions it has. If you're using the Serverless Framework, you can set the IAM permissions in your serverless configuration file.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Verify your environment variables are set correctly. A Lambda keeps a copy of the environment variables that it accesses and you can verify it via the AWS console. Make sure your values match what you're expecting from your configuration.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If you're doing file I/O or large data operations, make sure you're not running out of memory. If you are, look into utilizing Lambda's new ephemeral storage feature.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
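
&lt;p&gt;On the first point, an IAM grant for an SQS-consuming function might look like this in serverless.yml (a sketch; the queue ARN is a placeholder):&lt;/p&gt;

```yaml
provider:
  name: aws
  iamRoleStatements: # classic syntax; newer versions also accept provider.iam.role.statements
    - Effect: Allow
      Action:
        - sqs:ReceiveMessage
        - sqs:DeleteMessage
        - sqs:GetQueueAttributes
      Resource: arn:aws:sqs:us-east-1:000000000000:example-queue
```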

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I hope you've found these tips and tricks helpful and they save you time down the road!&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>lambda</category>
      <category>aws</category>
      <category>node</category>
    </item>
    <item>
      <title>Announcing a Serverless Microservices Template with GraphQL</title>
      <dc:creator>Dustin Goodman</dc:creator>
      <pubDate>Thu, 21 Apr 2022 03:29:14 +0000</pubDate>
      <link>https://dev.to/dustinsgoodman/announcing-a-serverless-microservices-template-with-graphql-2703</link>
      <guid>https://dev.to/dustinsgoodman/announcing-a-serverless-microservices-template-with-graphql-2703</guid>
      <description>&lt;p&gt;For those that know me, they know I love talking about two things more than anything: Serverless Framework and GraphQL. Today, I'm excited to announce a project starter template that I've been developing that allows developers to build serverless microservices with GraphQL. It is built using Nx monorepos and provides a lot of quality of life developer tooling out-of-the-box. I'll discuss what's in the repo and how you can leverage it today for your own projects. If you want to jump into the code, you can &lt;a href="https://github.com/dustinsgoodman/serverless-microservices-graphql-template" rel="noopener noreferrer"&gt;find it on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Goals
&lt;/h2&gt;

&lt;p&gt;Over the past few years, I've developed multiple serverless projects, and they all tend to follow a similar pattern, especially when using GraphQL as the primary API gateway for the entire application. As the applications grew, the service architecture had to expand, and my teams found it difficult to maintain the different services and to stitch them together in a streamlined way. Another issue was that orchestrating local development through package.json scripts caused memory issues, and some scripts even caused system memory leaks. Every team also had to perform deep function analysis to keep bundle sizes down when deploying. This project set out to solve these key issues and provide a first-class developer experience (DX) for those wishing to build serverless microservices with GraphQL.&lt;/p&gt;

&lt;h3&gt;
  
  
  Managing Service Generation
&lt;/h3&gt;

&lt;p&gt;In each project, we always introduced a root-level &lt;code&gt;serverless.common.yml&lt;/code&gt; configuration file that centralized plugin configuration and acted as a central lookup for the ports used by the &lt;a href="https://www.serverless.com/plugins/serverless-offline" rel="noopener noreferrer"&gt;&lt;code&gt;serverless-offline&lt;/code&gt;&lt;/a&gt; plugin. Prior to Serverless v3 and TypeScript configuration files, we had to configure the ports and inject them into the application. This was troublesome for a number of reasons, but mainly it created problems when adding new services to the stack. Developers would have to create a new service directory, scaffold the service, and configure the global settings, and then, when they were ready to implement it, they would have to spell their service's name correctly at invocation time. The biggest issue was when two services were created at the same time and caused merge conflicts in the port configuration.&lt;/p&gt;

&lt;p&gt;With this template, we're leveraging the &lt;a href="https://nx.dev/" rel="noopener noreferrer"&gt;Nx monorepo toolchain&lt;/a&gt;, which provides us with the ability to create service generators! This project ships with a service generator that quickly solves the issues seen traditionally. You can simply run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;yarn workspace-generator service &amp;lt;service name&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a new service directory with the provided name&lt;/li&gt;
&lt;li&gt;Scaffold the serverless configuration, test suite, linting, typescript, and example lambda function&lt;/li&gt;
&lt;li&gt;Register the service to the Nx workspace&lt;/li&gt;
&lt;li&gt;Update the serverless.common.ts file to include the new service name in the service type, and map it to the next available ports based on the existing configuration&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;For example, if you run &lt;code&gt;yarn workspace-generator service my-service&lt;/code&gt;, you'll see the following changes made for you:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight diff"&gt;&lt;code&gt;&lt;span class="gd"&gt;- export type Service = 'public-api' | 'background-jobs' | 'example-service';
&lt;/span&gt;&lt;span class="gi"&gt;+ export type Service = 'public-api' | 'background-jobs' | 'example-service' | 'my-service';
&lt;/span&gt;&lt;span class="p"&gt;export const PORTS: PortConfig = {
&lt;/span&gt;  'public-api': {
    httpPort: 3000,
    lambdaPort: 3002,
  },
  'background-jobs': {
    httpPort: 3004,
    lambdaPort: 3006,
  },
  'example-service': {
    httpPort: 3008,
    lambdaPort: 3010,
  },
&lt;span class="gi"&gt;+ 'my-service': {
+   httpPort: 3012,
+   lambdaPort: 3014,
+ },
&lt;/span&gt;};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This gives you type safety when selecting services with the provided &lt;code&gt;invoke&lt;/code&gt; function, and identifies which local port to use.&lt;/p&gt;
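
&lt;p&gt;A plain JavaScript sketch of how such a lookup can be consumed (the template's actual &lt;code&gt;invoke&lt;/code&gt; helper is written in TypeScript and may differ):&lt;/p&gt;

```javascript
// Mirrors the generated port map above; in the template this lives in
// serverless.common.ts and is kept up to date by the service generator.
const PORTS = {
  'public-api': { httpPort: 3000, lambdaPort: 3002 },
  'background-jobs': { httpPort: 3004, lambdaPort: 3006 },
  'example-service': { httpPort: 3008, lambdaPort: 3010 },
};

const localEndpoint = (service) => {
  const config = PORTS[service];
  if (!config) {
    throw new Error(`Unknown service: ${service}`);
  }
  return `http://localhost:${config.httpPort}`;
};

console.log(localEndpoint('public-api')); // logs http://localhost:3000
```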

&lt;p&gt;The default scaffold attempts to provide reasonable defaults for the serverless configuration. If you don't like them, you can customize them by &lt;a href="https://github.com/dustinsgoodman/serverless-microservices-graphql-template/blob/main/tools/generators/service/files/serverless.ts__tmpl__" rel="noopener noreferrer"&gt;editing the generator's default template&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Solving Service Orchestration
&lt;/h3&gt;

&lt;p&gt;Most complex backends require a background jobs worker that relies on a service like SQS, or require a database to be running. Unfortunately, these services are difficult to emulate in a local environment and require custom setup. Previously, my teams would attempt to extend start scripts to initialize the required services. However, the Node process manager wouldn't manage the processes correctly, leaving services running in the background and causing memory issues for developers trying to run them.&lt;/p&gt;

&lt;p&gt;Since a majority of services rely on the same system services, we can leverage Docker to start all the needed project services prior to running them with the offline command. The template ships with a base &lt;code&gt;docker-compose.yml&lt;/code&gt; that stands up an instance of &lt;a href="https://github.com/softwaremill/elasticmq" rel="noopener noreferrer"&gt;ElasticMQ&lt;/a&gt; to emulate Amazon SQS's API. The Docker setup can be extended to include other services like DynamoDB, Redis, or PostgreSQL, and can be run using &lt;code&gt;yarn infrastructure:build&lt;/code&gt; or &lt;code&gt;yarn infrastructure:start&lt;/code&gt;, depending on whether it's your first run.&lt;/p&gt;
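
&lt;p&gt;As a rough illustration, a compose file along these lines could stand up ElasticMQ plus a database. The image names and ports below are typical ElasticMQ and PostgreSQL defaults, not necessarily the template's exact configuration:&lt;/p&gt;

```yaml
version: '3.8'
services:
  # ElasticMQ emulates the SQS API locally
  sqs:
    image: softwaremill/elasticmq
    ports:
      - '9324:9324' # SQS-compatible API
      - '9325:9325' # management UI
  # Example of extending the stack with a database
  db:
    image: postgres:14
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - '5432:5432'
```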

&lt;h3&gt;
  
  
  Function Analysis
&lt;/h3&gt;

&lt;p&gt;One of the most important aspects of serverless development is keeping an eye on your bundle sizes to reduce cold start times on Lambda. Keeping this in mind, the template utilizes &lt;a href="https://github.com/floydspace/serverless-esbuild" rel="noopener noreferrer"&gt;&lt;code&gt;serverless-esbuild&lt;/code&gt;&lt;/a&gt; and &lt;a href="https://github.com/adriencaccia/serverless-analyze-bundle-plugin" rel="noopener noreferrer"&gt;&lt;code&gt;serverless-analyze-bundle-plugin&lt;/code&gt;&lt;/a&gt; to provide function analysis out-of-the-box. I opted for &lt;code&gt;serverless-esbuild&lt;/code&gt; over &lt;a href="https://github.com/AnomalyInnovations/serverless-bundle" rel="noopener noreferrer"&gt;&lt;code&gt;serverless-bundle&lt;/code&gt;&lt;/a&gt; for a few reasons:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;serverless-bundle&lt;/code&gt; provides a lot of functionality that we don't need or use out of the box&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;esbuild&lt;/code&gt; is generally faster than &lt;code&gt;webpack&lt;/code&gt; for bundling and requires significantly less configuration&lt;/li&gt;
&lt;li&gt;I personally had issues with &lt;code&gt;serverless-bundle&lt;/code&gt; in the past with function analysis, which you can &lt;a href="https://dustinsgoodman.medium.com/resolving-serverless-webpack-issues-efae729e0619" rel="noopener noreferrer"&gt;read more about here&lt;/a&gt;, and have been reluctant to use it as a result. The plugin has since matured and now provides the functionality I fought with, but the analyzer tool it uses isn't my favorite, so the benefits just aren't there for me.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;With this project, you can run &lt;code&gt;yarn analyze &amp;lt;service name&amp;gt; --function=&amp;lt;function name&amp;gt;&lt;/code&gt; to get an analysis of your bundle size. For example, if you run &lt;code&gt;yarn analyze public-api --function=graphql&lt;/code&gt;, you'll see the following analysis:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdustingoodman.dev%2Fblog-assets%2F20220414-bundle-analysis.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdustingoodman.dev%2Fblog-assets%2F20220414-bundle-analysis.webp" alt="public-api graphql function analysis"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Default Services
&lt;/h2&gt;

&lt;p&gt;The default template ships with a few services that demonstrate how to utilize the project structure. The default services are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;public-api: Contains the GraphQL API setup and structure&lt;/li&gt;
&lt;li&gt;background-jobs: Contains SQS setup and Lambda runners&lt;/li&gt;
&lt;li&gt;example-service: Contains example resolver functions for cross-service communication and a simple SQS message generator&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Other projects will have more complexity and this doesn't begin to demonstrate all the different features and functionality you can include in your project. This is solely intended to generate enough of a baseline for you to develop your own applications without having to go through all the verbose setup. If you see a feature that would be useful for others, please drop an issue or pull request on the repo!&lt;/p&gt;

&lt;h2&gt;
  
  
  Developer Console
&lt;/h2&gt;

&lt;p&gt;I started my career building Ruby on Rails applications. One of my favorite features was the &lt;code&gt;rails console&lt;/code&gt;, which lets developers interact with their application code directly and greatly helps with testing and debugging. As such, I wanted to recreate this for this template. You can run &lt;code&gt;yarn console &amp;lt;library name&amp;gt;&lt;/code&gt; to get an interactive REPL for that library's code. I only included this for libraries and not services because of the code structure and the lack of entry points through which you should interact with the Lambda functions in your services. Below you can see how the console works:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdustingoodman.dev%2Fblog-assets%2F20220414-console-example.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdustingoodman.dev%2Fblog-assets%2F20220414-console-example.webp" alt="example utils console"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Next?
&lt;/h2&gt;

&lt;p&gt;Obviously, there is a lot here, and I could keep plugging away to continually improve the template and never release, but that wouldn't help anyone. I'll continue to iterate on the template, add useful features, and clean up the codebase. If you have any suggestions or want to help, the &lt;a href="https://github.com/dustinsgoodman/serverless-microservices-graphql-template" rel="noopener noreferrer"&gt;repo's issues and pull requests are open&lt;/a&gt;!&lt;/p&gt;

&lt;h2&gt;
  
  
  Thank you!
&lt;/h2&gt;

&lt;p&gt;I want to send a special thank you to the following people for their help in getting this project set up!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://github.com/sudokar" rel="noopener noreferrer"&gt;sudokar&lt;/a&gt; for the original Nx template in which this project was branched from. I frankly could not have solved some of the workspace generator issues I faced without this starting point.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/jtomchak" rel="noopener noreferrer"&gt;Jesse Tomchak&lt;/a&gt; for talking through key issues and helping to make some important architectural decisions.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/ktrz" rel="noopener noreferrer"&gt;Chris Trześniewski&lt;/a&gt; for assisting with some critical path AST parsing issues. The &lt;code&gt;serverless.common.ts&lt;/code&gt; automatic updates wouldn't have been possible without his help.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/TapaiBalazs" rel="noopener noreferrer"&gt;Tápai Balázs&lt;/a&gt; for convincing me to use Nx in the first place. I tried yarn workspaces and frankly, I couldn't make anything work right. Nx was the right call and his suggestion made for a great fit even if I had to modify a lot of their core features!&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/NachoVazquez" rel="noopener noreferrer"&gt;Nacho Vazquez&lt;/a&gt; and &lt;a href="https://github.com/dariodjuric" rel="noopener noreferrer"&gt;Dario Djuric&lt;/a&gt; for assisting with some Nx structural decisions that I was fighting and helping me come up with a better long term solution for this template.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>serverless</category>
      <category>graphql</category>
      <category>microservices</category>
      <category>nx</category>
    </item>
    <item>
      <title>Why you should upgrade to Serverless 3</title>
      <dc:creator>Dustin Goodman</dc:creator>
      <pubDate>Sat, 19 Feb 2022 21:44:29 +0000</pubDate>
      <link>https://dev.to/dustinsgoodman/why-you-should-upgrade-to-serverless-3-22h2</link>
      <guid>https://dev.to/dustinsgoodman/why-you-should-upgrade-to-serverless-3-22h2</guid>
      <description>&lt;p&gt;If you haven't read about the release and what's included in this update, I highly recommend you read both the &lt;a href="https://www.serverless.com/blog/serverless-framework-v3-beta"&gt;beta release announcement&lt;/a&gt; and the &lt;a href="https://www.serverless.com/blog/serverless-framework-v3-is-live"&gt;full release announcement&lt;/a&gt; first as they give a good overview of all the changes. In this post, I'll highlight a few of those core changes and their significance.&lt;/p&gt;

&lt;h2&gt;
  
  
  Upgrading from v2 to v3
&lt;/h2&gt;

&lt;p&gt;The Serverless Framework team did an amazing job of making the migration to v3 as seamless as possible without major breaking changes, and kept up their habit of making things backwards compatible. The upgrade is as easy as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Upgrade to the latest version of v2. If you're on v1, you'll have a bit more work to do.&lt;/li&gt;
&lt;li&gt;Check if you have &lt;em&gt;any&lt;/em&gt; deprecation notices.

&lt;ul&gt;
&lt;li&gt;If yes, you'll want to fix those but the good news is they should all be restricted to your &lt;code&gt;serverless.yaml&lt;/code&gt; files.&lt;/li&gt;
&lt;li&gt;Some deprecations may be related to your plugins. The Serverless Framework team worked with many of the most popular plugins to make them backwards compatible. Go read your plugins' documentation and make the needed upgrades.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Upgrade serverless to v3. Typically you can just do this on your system with the following:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;   npm i &lt;span class="nt"&gt;-g&lt;/span&gt; serverless
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="4"&gt;
&lt;li&gt;Now, you can update your &lt;code&gt;serverless.yaml&lt;/code&gt; to specify the new version.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;   &lt;span class="na"&gt;service&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;&amp;lt;your service name here&amp;gt;&lt;/span&gt;
   &lt;span class="na"&gt;plugins&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
     &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;&amp;lt;your plugins here&amp;gt;&lt;/span&gt;

   &lt;span class="na"&gt;frameworkVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3'&lt;/span&gt; &lt;span class="c1"&gt;# This is the line to change from 2 -&amp;gt; 3&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you find issues when you move to v3, you can easily flip this back to v2, so you should definitely give it a shot.&lt;/p&gt;

&lt;h2&gt;
  
  
  🎉 New Stage Parameters 🎉
&lt;/h2&gt;

&lt;p&gt;TL;DR:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3MlmT_PO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dustingoodman.dev/blog-assets/20220130-stage-params.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3MlmT_PO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dustingoodman.dev/blog-assets/20220130-stage-params.png" alt="stage parameters meme" width="492" height="489"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is by far my favorite feature included with this update, as it removes an old hack that many serverless developers implemented to achieve the end goal of this feature. Specifically, this feature allows you to set service configuration settings based on the current stage, or environment. This is incredibly important for larger teams implementing serverless applications, as you'll typically have a production, staging, and/or development environment that may need custom configuration.&lt;/p&gt;

&lt;p&gt;Before, you might have set up your &lt;code&gt;serverless.yaml&lt;/code&gt; like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# ...&lt;/span&gt;

&lt;span class="na"&gt;custom&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;my-profile&lt;/span&gt;

  &lt;span class="na"&gt;stage_env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;PROFILE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dev&lt;/span&gt;
      &lt;span class="na"&gt;SLS_STAGE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dev&lt;/span&gt;

    &lt;span class="na"&gt;local&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;PROFILE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dev&lt;/span&gt;
      &lt;span class="na"&gt;SLS_STAGE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;local&lt;/span&gt;

&lt;span class="na"&gt;provider&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;aws&lt;/span&gt;
  &lt;span class="na"&gt;stage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${opt:stage, "local"}&lt;/span&gt;
  &lt;span class="na"&gt;region&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${opt:region, "us-east-1"}&lt;/span&gt;
  &lt;span class="na"&gt;runtime&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;nodejs14.x&lt;/span&gt;
  &lt;span class="na"&gt;profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${self:custom.profile}-${self:custom.stage_env.${self:provider.stage}.PROFILE, 'dev'}&lt;/span&gt;
&lt;span class="c1"&gt;# ...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice the really hard-to-grok field, &lt;code&gt;profile&lt;/code&gt;, under &lt;code&gt;provider&lt;/code&gt;. This was what we had to do to configure our service variables per development stage. For example, if I wanted the dev environment setup, I would set &lt;code&gt;--stage local&lt;/code&gt; when running my serverless commands, and &lt;code&gt;profile&lt;/code&gt; would be set to &lt;code&gt;my-profile-dev&lt;/code&gt;. This is pretty hard to keep track of and led to many long configuration debugging sessions.&lt;/p&gt;

&lt;p&gt;Now, you can use the new stage parameters by setting their values under the &lt;code&gt;params&lt;/code&gt; key and then accessing them via the &lt;code&gt;${param:xxx}&lt;/code&gt; syntax. So our above &lt;code&gt;serverless.yaml&lt;/code&gt; can now be rewritten as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# ...&lt;/span&gt;
&lt;span class="na"&gt;params&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;PROFILE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;my-profile-dev&lt;/span&gt;
    &lt;span class="na"&gt;SLS_STAGE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dev&lt;/span&gt;
  &lt;span class="na"&gt;local&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;PROFILE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;my-profile-dev&lt;/span&gt;
    &lt;span class="na"&gt;SLS_STAGE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;local&lt;/span&gt;

&lt;span class="na"&gt;provider&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;aws&lt;/span&gt;
  &lt;span class="na"&gt;stage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${opt:stage, "local"}&lt;/span&gt;
  &lt;span class="na"&gt;region&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${opt:region, "us-east-1"}&lt;/span&gt;
  &lt;span class="na"&gt;runtime&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;nodejs14.x&lt;/span&gt;
  &lt;span class="na"&gt;profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${params:PROFILE}&lt;/span&gt;
&lt;span class="c1"&gt;# ...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Improved CLI
&lt;/h2&gt;

&lt;p&gt;I refer you to the &lt;a href="https://www.serverless.com/blog/serverless-framework-v3-is-live"&gt;full release post from the Serverless team&lt;/a&gt; to really see these highlights as they did a wonderful job showing you the before and after of the enhancements. I'll give my quick opinion here on these improvements:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The CLI now only informs you of what you need to know. Before, there was a little extra white noise which &lt;em&gt;could&lt;/em&gt; be helpful under the right circumstances but was mostly unhelpful. Now, it tells you the facts you care about: (1) where was my app deployed and (2) what endpoints can I use to access it.&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;--verbose&lt;/code&gt; flag now provides all that additional information that's now hidden when you need to debug and provides it in a much more readable format.&lt;/li&gt;
&lt;li&gt;Errors now look like errors! 🛑 Before, errors looked like the rest of the output returned by a command, so they didn't immediately capture your eye. Now, it's very obvious when a problem occurs, with seemingly better messaging. I haven't had too many run-ins with this experience yet, as the rest of my systems have upgraded so cleanly, but the few times I've seen an issue, it took me seconds to realize what the problem was instead of minutes.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're new to Serverless Framework, trust me when I say that the devs really thought about you with these updates. These were real problems for folks and now, they're a thing of the past. There's always room to improve so if you see something that could be better, let them know as they're clearly listening to their users. 🥰&lt;/p&gt;

&lt;h2&gt;
  
  
  Onboarding
&lt;/h2&gt;

&lt;p&gt;One of my least favorite things about the old CLI was starting a new project. Previously, it created your new project in your current working directory, so I would accidentally clutter the top-level &lt;code&gt;development&lt;/code&gt; folder that I keep and would have to manually clean things up. Additionally, the setup only gave you the minimum files needed to start: the README.md, serverless.yaml, and handler.js.&lt;/p&gt;

&lt;p&gt;The new CLI project starter made a ton of improvements. Now, it creates the new folder for your project and initializes all your files in there. The CLI offers you a set of standard templates for common usages and guides you to their full list of amazing starter kits for new projects.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rIkidItA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dustingoodman.dev/blog-assets/20220130-serverless-init.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rIkidItA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dustingoodman.dev/blog-assets/20220130-serverless-init.png" alt="serverless init" width="880" height="292"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In keeping with the old behavior, the CLI helps you quickly configure your project for the Serverless Dashboard and do your first deploy on project init, but allows you to opt out as well.&lt;/p&gt;

&lt;h2&gt;
  
  
  Plugins
&lt;/h2&gt;

&lt;p&gt;As I mentioned earlier, the Serverless team did a great job helping the community migrate popular plugins to be v3 compatible. I've never created a plugin myself, but the team has put out some great resources and a new plugin API for those looking to create their own tooling for Serverless Framework. I've experimented with the following plugins and can say they have played really nicely with v3:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/serverless-offline"&gt;Serverless Offline&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/serverless-webpack"&gt;Serverless Webpack&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/serverless-esbuild"&gt;Serverless ESBuild&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/serverless-bundle"&gt;Serverless Bundle&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/serverless-s3-remover"&gt;Serverless S3 Remover&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/serverless-dynamodb-local"&gt;Serverless DynamoDB Local&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This isn't to say they've all been officially upgraded to the latest plugin API and play nice with the new CLI, but they still work so if you're in need of any of these plugins, I'd say they're safe to use but are worth your own investigation. 😉&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Serverless Framework continues to be my favorite toolchain for building applications utilizing serverless architectures, and I'm really excited to see the developers continue to improve it in meaningful ways. The other changes I haven't covered here are some specific deprecated syntaxes that I think are uncommon. If you need more info on them though, &lt;a href="https://www.serverless.com/framework/docs/guides/upgrading-v3"&gt;read the upgrade guide&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I want to send a special thank you to the Serverless Framework team for their hard work in making this great tool even better! You all hit a home run with this one! 🎉&lt;/p&gt;

</description>
      <category>serverless</category>
    </item>
    <item>
      <title>Migrating from REST to GraphQL</title>
      <dc:creator>Dustin Goodman</dc:creator>
      <pubDate>Mon, 15 Feb 2021 20:25:25 +0000</pubDate>
      <link>https://dev.to/thisdotmedia/migrating-from-rest-to-graphql-1kd2</link>
      <guid>https://dev.to/thisdotmedia/migrating-from-rest-to-graphql-1kd2</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://graphql.org/"&gt;GraphQL&lt;/a&gt; has been gaining a lot of traction with enterprises and startups for their application data layers. Historically, the web has been built using &lt;a href="https://en.wikipedia.org/wiki/Representational_state_transfer"&gt;REST&lt;/a&gt; and &lt;a href="https://en.wikipedia.org/wiki/SOAP"&gt;SOAP&lt;/a&gt; APIs which have served their purpose successfully for years, but as applications have gotten more complicated and data has become richer, these solutions have created friction in developing performant software quickly.&lt;/p&gt;

&lt;p&gt;In this article, we'll briefly discuss some of the problems with traditional API solutions, the benefits of migrating to GraphQL, and the strategy for migrating to a GraphQL solution.&lt;/p&gt;

&lt;h1&gt;
  
  
  Traditional API Problems
&lt;/h1&gt;

&lt;p&gt;In traditional API systems, we typically suffer from a few common issues:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Data under-fetching or n+1 fetching&lt;/li&gt;
&lt;li&gt;Data over-fetching&lt;/li&gt;
&lt;li&gt;All-or-nothing Responses&lt;/li&gt;
&lt;li&gt;Lack of batch support&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Data Under-fetching
&lt;/h2&gt;

&lt;p&gt;Traditional resources require us to request data on a per-entity basis, e.g. only users or only posts. For example, using REST, if we want to get some user details and their posts, we'd have to make the following requests:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;code&gt;GET /users/1&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;GET /users/1/posts&lt;/code&gt;&lt;/li&gt;
&lt;/ol&gt;
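
&lt;p&gt;For comparison, GraphQL lets us fetch the same data in one round trip with a single query. The field names here are illustrative, not from a real schema:&lt;/p&gt;

```graphql
query UserWithPosts {
  user(id: "1") {
    name
    username
    posts {
      id
      content
    }
  }
}
```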

&lt;h2&gt;
  
  
  Data Over-fetching
&lt;/h2&gt;

&lt;p&gt;Conversely, when we request certain data, it will give us all the available information including data we might not care about. From our previous example, we might only want a user's name and username but the response might provide us their creation time and bio.&lt;/p&gt;

&lt;h2&gt;
  
  
  All-or-nothing Responses
&lt;/h2&gt;

&lt;p&gt;Worse, if there's an error somewhere in this process, we might not get any data at all. Instead, we receive an HTTP status code informing us of a failure with an error message, but none of the data that was fetchable.&lt;/p&gt;

&lt;h2&gt;
  
  
  Lack of Batch Support
&lt;/h2&gt;

&lt;p&gt;Finally, for a more complex page, we might need to run multiple requests that could be parallelized, but traditional APIs don't support batching them out of the box. Dashboards, for example, might need sales and marketing data, which requires our clients to make two separate requests to our server and wait on both results before displaying that data, causing perceived slowness in our application.&lt;/p&gt;

&lt;h1&gt;
  
  
  The GraphQL Advantage
&lt;/h1&gt;

&lt;p&gt;Out of the box, GraphQL solves all of these described issues due to its declarative querying syntax and data handling. When you fetch data, you can request the exact data you need, and using the connection among entities, you can retrieve those relationships in a single request. If any of the data fails to fetch, GraphQL will still tell you about the data that was successfully retrieved and about the failures in fetching the other data, allowing you to show your users data regardless of failures. GraphQL also allows you to group multiple operations in a single request and fetch all data from a single request, thus reducing the number of round trips to your server and increasing perceived speed of your application.&lt;/p&gt;
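
&lt;p&gt;As an illustration of the batching point, a single GraphQL document can request several root fields at once. The field names below are made up for the dashboard example:&lt;/p&gt;

```graphql
query Dashboard {
  sales: salesReport {
    total
  }
  marketing: campaignStats {
    impressions
    clicks
  }
}
```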

&lt;p&gt;In addition to these features, GraphQL creates a single gateway for your clients, reducing friction in team communication around how data should be fetched. Your API is now abstracted away behind a single endpoint that also provides documentation on how to use it.&lt;/p&gt;

&lt;p&gt;&lt;a href="//images.ctfassets.net/zojzzdop0fzx/37NhItaEgKCFvNSbLCoJP9/165f732ddeeb7f1e8bfeb7c01fde12bb/graphql_architecture.png" class="article-body-image-wrapper"&gt;&lt;img src="//images.ctfassets.net/zojzzdop0fzx/37NhItaEgKCFvNSbLCoJP9/165f732ddeeb7f1e8bfeb7c01fde12bb/graphql_architecture.png" alt="GraphQL Architecture"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Given all these advantages, it's no wonder teams are moving to GraphQL, but it leaves the question of: how?&lt;/p&gt;

&lt;h1&gt;
  
  
  Migration Strategy
&lt;/h1&gt;

&lt;p&gt;The GraphQL migration strategy is incremental so you don't have to slow down development to port over existing data or endpoints until you're ready to opt into those changes. &lt;/p&gt;

&lt;h2&gt;
  
  
  0. Before you begin
&lt;/h2&gt;

&lt;p&gt;Before you start migration, here are some suggestions to think about as you're building new features or modifying the system in any way.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't build any new REST endpoints.&lt;/strong&gt; Any new REST work is going to be additional GraphQL work later. Do yourself a favor and build it in GraphQL already.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't maintain your current REST endpoints.&lt;/strong&gt; Porting REST endpoints to GraphQL is simple and GraphQL will provide you more functionality to build the exact behavior you want.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Leverage your existing REST endpoints to prototype quickly.&lt;/strong&gt; You can use your existing REST API to power your GraphQL implementation. This won't be sustainable or performant long term, but it's a great way to get started.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Pick your GraphQL Implementation
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.apollographql.com/"&gt;Apollo&lt;/a&gt; and &lt;a href="https://relay.dev/"&gt;Relay&lt;/a&gt; are the two most popular fullstack GraphQL solutions, but you can also build your own solutions. Regardless of what you use, you'll use this to implement your server endpoint and connect to it with your client. All GraphQL requests go through a single endpoint, so once this is up and running, you can connect to it and begin porting functionality.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Select your first feature to build or port
&lt;/h2&gt;

&lt;p&gt;With our server in place, we can start adding to it. Following our earlier example, let's migrate user posts.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Define your schema types
&lt;/h2&gt;

&lt;p&gt;Now that we've decided on user posts, we have two routes here: (1) migrate users and posts or (2) migrate posts with a filter on user. For this, we're going to migrate posts and filter on user ID for now. To start, we'll define our &lt;code&gt;post&lt;/code&gt; type in the schema and define its query type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;type Post {
  id: ID!
  userId: ID!
  content: String!
}

type Query {
  posts(userId: ID): [Post]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We now have a &lt;code&gt;Post&lt;/code&gt; type that has an id and content and knows which user it belongs to. Additionally, we have a query called &lt;code&gt;posts&lt;/code&gt; that optionally accepts a userId as a filter and returns a list of &lt;code&gt;Post&lt;/code&gt;s. It's important to note that it is semantically incorrect in GraphQL to expose the &lt;code&gt;userId&lt;/code&gt; as a field. Instead, we should connect a post to its user and expose that entity relation, but those are choices you make as you design your API.&lt;/p&gt;
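
&lt;p&gt;If you do opt for the entity relation instead, the &lt;code&gt;Post&lt;/code&gt; type might look like this sketch, assuming a &lt;code&gt;User&lt;/code&gt; type is defined elsewhere in the schema:&lt;/p&gt;

```graphql
type Post {
  id: ID!
  content: String!
  user: User!
}
```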

&lt;h2&gt;
  
  
  4. Build our data resolver
&lt;/h2&gt;

&lt;p&gt;Now, we need to connect our schema type and query to our data. For this, we'll use a resolver. The exact syntax will vary slightly depending on your server implementation, but using JavaScript and the GraphQL specification, we'd end up with the following resolver object:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;node-fetch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;resolvers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;Query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;posts&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;obj&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;args&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;API_URL&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;userId&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;args&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

      &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;userId&lt;/span&gt;&lt;span class="p"&gt;){&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;API_URL&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/users/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;userId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/posts`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;

      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;API_URL&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/posts`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the &lt;code&gt;userId&lt;/code&gt; is present in the query arguments, we use our existing REST API to fetch the posts by user, but if no &lt;code&gt;userId&lt;/code&gt; is provided, we use the &lt;code&gt;posts&lt;/code&gt; route directly. Now, we can make the following request on the frontend to retrieve our data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight graphql"&gt;&lt;code&gt;&lt;span class="k"&gt;query&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;UserPosts&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$userId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;ID&lt;/span&gt;&lt;span class="p"&gt;!)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="n"&gt;posts&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;userId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$userId&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
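&lt;p&gt;From the client's perspective, that query is just an HTTP POST with a JSON body containing the query string and its variables. Here's a minimal sketch; the &lt;code&gt;/graphql&lt;/code&gt; endpoint URL is an assumption, but GraphQL servers conventionally accept this shape:&lt;/p&gt;

```javascript
// Minimal client-side call. The endpoint URL is hypothetical;
// GraphQL servers conventionally accept POSTed { query, variables } JSON.
async function fetchUserPosts(userId) {
  const response = await fetch('https://example.com/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query: `query UserPosts($userId: ID!) {
        posts(userId: $userId) { id content }
      }`,
      variables: { userId },
    }),
  });
  const { data } = await response.json();
  return data.posts;
}
```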



&lt;p&gt;I chose to use &lt;a href="https://www.npmjs.com/package/node-fetch"&gt;node-fetch&lt;/a&gt; for my implementation because it's simple, but you can use any HTTP library of your choice. If you're in the Apollo ecosystem, though, their &lt;a href="https://www.apollographql.com/docs/apollo-server/data/data-sources/"&gt;RESTDataSource&lt;/a&gt; class handles the boilerplate of wiring your resolvers to a microservice's REST API, so you only have to worry about fetching the data.&lt;/p&gt;
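&lt;p&gt;The pattern looks roughly like this simplified stand-in (this is not Apollo's actual API; their real &lt;code&gt;RESTDataSource&lt;/code&gt; adds caching and request deduplication on top of the same idea):&lt;/p&gt;

```javascript
// Simplified stand-in for the data-source pattern: centralize the base
// URL and path building so resolvers stay thin. The fetch implementation
// is injected so it is easy to test or swap.
class PostsAPI {
  constructor(baseURL, fetchImpl) {
    this.baseURL = baseURL;
    this.fetch = fetchImpl;
  }

  async getPosts(userId) {
    const path = userId ? `/users/${userId}/posts` : '/posts';
    const response = await this.fetch(`${this.baseURL}${path}`);
    return response.json();
  }
}
```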

&lt;h2&gt;
  
  
  5. Next Steps
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Extending Our Graph
&lt;/h3&gt;

&lt;p&gt;Now that we have our data integrated, we need to complete the graph by connecting related types. Instead of &lt;code&gt;Post&lt;/code&gt; having a &lt;code&gt;userId&lt;/code&gt;, it can have a &lt;code&gt;User&lt;/code&gt; and fetch the author details directly from the same query, e.g.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight graphql"&gt;&lt;code&gt;&lt;span class="k"&gt;query&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;UserPosts&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$userId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;ID&lt;/span&gt;&lt;span class="p"&gt;!)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="n"&gt;posts&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;userId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$userId&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="n"&gt;avatarUrl&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="n"&gt;displayName&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
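&lt;p&gt;Supporting that &lt;code&gt;user&lt;/code&gt; field amounts to adding a type-level resolver for &lt;code&gt;Post&lt;/code&gt;. A sketch, assuming a hypothetical &lt;code&gt;/users/:id&lt;/code&gt; endpoint exists on the REST API:&lt;/p&gt;

```javascript
// Hypothetical field resolver: resolves Post.user from the parent
// post's userId via the existing REST API (the /users/:id endpoint
// is an assumption).
const postResolvers = {
  Post: {
    user: async (post) => {
      const { API_URL } = process.env;
      const response = await fetch(`${API_URL}/users/${post.userId}`);
      return response.json();
    },
  },
};
```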



&lt;h3&gt;
  
  
  Monoliths
&lt;/h3&gt;

&lt;p&gt;Because we now have queries and types with full control of our schema, we can update our resolvers to rely on the codebase directly rather than on our REST API abstraction, which gives us some added performance benefits. We can keep stitching together new types and extend our API further.&lt;/p&gt;
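&lt;p&gt;For instance, the &lt;code&gt;posts&lt;/code&gt; resolver could call a data-access layer directly instead of making an HTTP round trip. In this sketch, &lt;code&gt;db&lt;/code&gt; and its methods are hypothetical, assumed to be injected through the GraphQL context:&lt;/p&gt;

```javascript
// Sketch of a monolith-style resolver: the hypothetical `db` object
// stands in for whatever data-access layer the codebase already has,
// skipping the REST API round trip entirely.
const monolithResolvers = {
  Query: {
    posts: async (_obj, { userId }, { db }) => {
      return userId ? db.findPostsByUser(userId) : db.findAllPosts();
    },
  },
};
```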

&lt;h3&gt;
  
  
  Microservices
&lt;/h3&gt;

&lt;p&gt;GraphQL and microservices go hand-in-hand. GraphQL supports schema stitching, which allows us to build individual GraphQL APIs in our microservices and then combine them into our larger interface. Instead of configuring our clients with all the different connections to the different services, our GraphQL server knows where to collect all the data from, reducing the amount of information the frontend needs in order to complete requests.&lt;/p&gt;
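&lt;p&gt;At its simplest, combining sub-schemas can look like concatenating each service's type definitions and merging their resolver maps. This is only a naive sketch with made-up services; real schema stitching (e.g. via graphql-tools) also handles type merging and cross-service delegation:&lt;/p&gt;

```javascript
// Naive combination of per-service schema fragments. The two services
// here are hypothetical; stitching libraries do much more than this.
const userService = {
  typeDefs: `type User { id: ID! displayName: String! }`,
  resolvers: { Query: { user: () => ({ id: '1', displayName: 'Ada' }) } },
};

const postService = {
  typeDefs: `type Post { id: ID! content: String! }`,
  resolvers: { Query: { posts: () => [] } },
};

const services = [userService, postService];
const combinedTypeDefs = services.map((s) => s.typeDefs).join('\n');
const combinedResolvers = {
  Query: Object.assign({}, ...services.map((s) => s.resolvers.Query)),
};
```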

&lt;h3&gt;
  
  
  Performance
&lt;/h3&gt;

&lt;p&gt;A major downside to GraphQL can be server-side overfetching, also known as the n+1 problem. Because GraphQL doesn't know exactly how data is structured in the database, it cannot optimize away redundant requests in the graph tree. However, the &lt;a href="https://github.com/graphql/dataloader"&gt;GraphQL DataLoader&lt;/a&gt; library solves exactly that: it batches requests made in the same tick and caches already-fetched data for reuse in any sub-query that follows.&lt;/p&gt;
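&lt;p&gt;The core batching idea can be illustrated with a toy loader (a deliberately simplified stand-in, not the real DataLoader library): keys requested in the same tick are queued and fetched with a single batch call, and repeated keys are served from a cache:&lt;/p&gt;

```javascript
// Toy illustration of DataLoader-style batching. Keys requested in the
// same tick are collected and resolved with one call to batchFn.
class TinyLoader {
  constructor(batchFn) {
    this.batchFn = batchFn;
    this.queue = [];
    this.cache = new Map();
  }

  load(key) {
    // Repeated keys reuse the cached promise instead of refetching.
    if (this.cache.has(key)) return this.cache.get(key);
    const promise = new Promise((resolve) => {
      this.queue.push({ key, resolve });
      // Schedule a single flush for the whole queue on the next microtask.
      if (this.queue.length === 1) {
        queueMicrotask(() => this.flush());
      }
    });
    this.cache.set(key, promise);
    return promise;
  }

  async flush() {
    const batch = this.queue.splice(0);
    const results = await this.batchFn(batch.map((b) => b.key));
    batch.forEach((b, i) => b.resolve(results[i]));
  }
}
```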

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;With all this power, it's no wonder GraphQL is picking up so much steam in the community. That being said, GraphQL isn't for everyone, and it might not be a good solution for your team today. However, I suspect many of the APIs we rely on will start utilizing GraphQL more heavily, and we'll see a trend away from traditional REST. Hopefully, you've seen the opportunity GraphQL presents for your codebase and how it can help your team deliver quality products faster, and you can have a conversation with your team about a possible migration.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;This Dot Labs is a modern web consultancy focused on helping companies realize their digital transformation efforts. For expert architectural guidance, training, or consulting in React, Angular, Vue, Web Components, GraphQL, Node, Bazel, or Polymer, visit &lt;a href="https://www.thisdotlabs.com"&gt;thisdotlabs.com&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;This Dot Media is focused on creating an inclusive and educational web for all.  We keep you up to date with advancements in the modern web through events, podcasts, and free content. To learn, visit &lt;a href="https://www.thisdot.co"&gt;thisdot.co&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>graphql</category>
      <category>javascript</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
