<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Andre</title>
    <description>The latest articles on DEV Community by Andre (@andre347).</description>
    <link>https://dev.to/andre347</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F146091%2F534ca9c5-7ea4-4b08-9ac8-0da3a24f07bb.jpeg</url>
      <title>DEV Community: Andre</title>
      <link>https://dev.to/andre347</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/andre347"/>
    <language>en</language>
    <item>
      <title>Scheduling dbt Core with Github Actions</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Wed, 08 Dec 2021 14:54:55 +0000</pubDate>
      <link>https://dev.to/andre347/scheduling-dbt-core-with-github-actions-25bk</link>
      <guid>https://dev.to/andre347/scheduling-dbt-core-with-github-actions-25bk</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published: &lt;a href="https://www.andredevries.dev/posts/schedule-dbt-github-actions" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;I have been using dbt more and more as my main tool for data transformation and data modelling. &lt;a href="https://www.getdbt.com/" rel="noopener noreferrer"&gt;dbt Labs&lt;/a&gt; (the company behind dbt) offers two products: dbt Cloud and dbt Core. The latter is an open-source CLI tool that gives you the main components of a dbt project. The cloud offering (dbt Cloud) includes a lot more functionality, including an easy way to schedule your dbt runs. However, if you don't want to use dbt Cloud, or want a bit more flexibility in configuring your dbt runs, you can also use GitHub Actions to deploy and run your models. In fact, the analytics pipeline of this website uses this particular setup with a daily GitHub Actions job. There are more options for orchestrating dbt Core, including Apache Airflow, but I found GitHub Actions to be an easy and lightweight solution.&lt;/p&gt;

&lt;h2&gt;
  
  
  What are GitHub Actions?
&lt;/h2&gt;

&lt;p&gt;dbt is built for &lt;a href="https://git-scm.com/" rel="noopener noreferrer"&gt;git&lt;/a&gt; and is a perfect fit for your continuous integration and continuous delivery (CI/CD) pipeline. &lt;a href="https://github.com/features/actions" rel="noopener noreferrer"&gt;GitHub Actions&lt;/a&gt; makes it easy to automate your software workflows, letting you automate your build, test, and deployment pipeline. GitHub typically provides the virtual machines for you to run your workflows on, making it super easy to get up and running, and there is a &lt;a href="https://docs.github.com/en/actions/learn-github-actions/usage-limits-billing-and-administration" rel="noopener noreferrer"&gt;generous free tier&lt;/a&gt;. Actions are typically &lt;a href="https://docs.github.com/en/actions/learn-github-actions/events-that-trigger-workflows" rel="noopener noreferrer"&gt;configured&lt;/a&gt; to run on particular &lt;em&gt;triggers&lt;/em&gt;, such as forking a repo, commenting on an issue, or pushing to a branch. You can also configure actions to run on a schedule via cron. That is the option we are going to look at in this blog post.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to configure GitHub Actions for dbt
&lt;/h2&gt;

&lt;p&gt;GitHub Actions workflows are defined in a YAML file, which is checked into your repository in a &lt;code&gt;.github/workflows&lt;/code&gt; folder. A workflow runs when it is triggered by an event in your repository, when it is triggered manually, or on a defined schedule (cron). You can configure GitHub Actions for any of your repositories on GitHub. There are a few steps involved for each workflow you want to configure. Let's take a look at configuring one for your dbt project:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a &lt;code&gt;.github/workflows/&lt;/code&gt; directory in the root of your dbt project to store your workflow YAML files&lt;/li&gt;
&lt;li&gt;In this folder create a file called &lt;code&gt;schedule_dbt_job.yml&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Copy/paste the YAML below&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

   &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;schedule_dbt_job&lt;/span&gt;

   &lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
   &lt;span class="na"&gt;schedule&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="c1"&gt;# run at 7AM every single day&lt;/span&gt;
       &lt;span class="c1"&gt;# https://crontab.guru &amp;lt;-- for generating CRON expression&lt;/span&gt;
       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;cron&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;0&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;7&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*"&lt;/span&gt;
   &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="c1"&gt;# run on push to development branch&lt;/span&gt;
       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;development&lt;/span&gt;
   &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
   &lt;span class="na"&gt;DBT_PROFILES_DIR&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;./&lt;/span&gt;

   &lt;span class="na"&gt;DBT_SNOWFLAKE_USERNAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DBT_SNOWFLAKE_USERNAME }}&lt;/span&gt;
   &lt;span class="na"&gt;DBT_SNOWFLAKE_PW&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DBT_SNOWFLAKE_PW }}&lt;/span&gt;
   &lt;span class="na"&gt;DBT_SNOWFLAKE_ROLE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DBT_SNOWFLAKE_ROLE }}&lt;/span&gt;

   &lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
   &lt;span class="na"&gt;schedule_dbt_job&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;schedule_dbt_job&lt;/span&gt;
       &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;

       &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Check out&lt;/span&gt;
           &lt;span class="s"&gt;uses&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@master&lt;/span&gt;

       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-python@v1&lt;/span&gt;
           &lt;span class="s"&gt;with&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
           &lt;span class="na"&gt;python-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3.7.x"&lt;/span&gt;

       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Install dependencies&lt;/span&gt;
           &lt;span class="s"&gt;run&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
           &lt;span class="s"&gt;pip install dbt&lt;/span&gt;
           &lt;span class="s"&gt;dbt deps&lt;/span&gt;

       &lt;span class="c1"&gt;# dbt related commands here - run use --target prod/dev to run for specific environments&lt;/span&gt;
       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run dbt models&lt;/span&gt;
           &lt;span class="s"&gt;run&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dbt run&lt;/span&gt;

       &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Test dbt models&lt;/span&gt;
           &lt;span class="s"&gt;run&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dbt test&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Create a &lt;code&gt;profiles.yml&lt;/code&gt; &lt;a href="https://docs.getdbt.com/reference/profiles.yml/" rel="noopener noreferrer"&gt;file&lt;/a&gt; in the root of your dbt project if it does not exist yet. In this file you configure the connection details for your data warehouse. In the YAML below you will see that I am using Snowflake, but this should also work for any of the other data warehouses supported by dbt (e.g. Google BigQuery and Amazon Redshift).&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

   &lt;span class="na"&gt;default&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
     &lt;span class="na"&gt;outputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="na"&gt;dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
       &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;snowflake&lt;/span&gt;
       &lt;span class="na"&gt;threads&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;1&lt;/span&gt;
       &lt;span class="na"&gt;account&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_ACCOUNT')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;user&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_USERNAME')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_ROLE')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;password&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_PW')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;database&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_DATABASE')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;warehouse&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_WAREHOUSE')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;schema&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;env_var('DBT_SNOWFLAKE_SCHEMA')&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
       &lt;span class="na"&gt;client_session_keep_alive&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;False&lt;/span&gt;
       &lt;span class="na"&gt;query_tag&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;github_action_query&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol start="5"&gt;
&lt;li&gt;Before we can schedule this dbt project we need to configure the &lt;a href="https://docs.github.com/en/actions/learn-github-actions/workflow-syntax-for-github-actions#jobsjob_idstepsenv" rel="noopener noreferrer"&gt;environment variables&lt;/a&gt; in your GitHub repo. Whenever your GitHub Action runs it will use these variables at run time. You can configure these secrets via Settings &amp;gt; Secrets &amp;gt; New Repository Secret. You need to add all the variables that are specified in the YAML above. For example, the Snowflake account variable would be &lt;code&gt;DBT_SNOWFLAKE_ACCOUNT&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;
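&lt;p&gt;If you prefer the command line, the same secrets can also be added with the GitHub CLI. This is just a sketch, assuming you have &lt;code&gt;gh&lt;/code&gt; installed and authenticated against the repo; the values are placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# add each secret referenced by the workflow and profiles.yml
gh secret set DBT_SNOWFLAKE_ACCOUNT --body "your-account-identifier"
gh secret set DBT_SNOWFLAKE_USERNAME --body "your-username"
gh secret set DBT_SNOWFLAKE_PW --body "your-password"
gh secret set DBT_SNOWFLAKE_ROLE --body "your-role"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;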

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw6h3tkapkv4hthj1irvm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw6h3tkapkv4hthj1irvm.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1638908821/dbt-blog-github-action/dbt-secrets-github-action.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrap-up
&lt;/h2&gt;

&lt;p&gt;You should now be able to push all of your changes to your GitHub repository. If you navigate to GitHub you will see an 'Actions' tab, where you can see all of your past runs. Whenever I develop and test my dbt setup on GitHub I also use the push-to-branch event in my workflow YAML, rather than waiting until the configured cron time!&lt;/p&gt;
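&lt;p&gt;As an aside: if you add a &lt;code&gt;workflow_dispatch&lt;/code&gt; trigger to the &lt;code&gt;on:&lt;/code&gt; block, GitHub also shows a 'Run workflow' button in the Actions tab so you can kick off a run manually. A minimal sketch of the trigger section:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;on:
  workflow_dispatch: # manual trigger from the Actions tab
  schedule:
    - cron: "0 7 * * *"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;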

</description>
      <category>github</category>
      <category>dbt</category>
    </item>
    <item>
      <title>How to easily create a Postgres database in Docker</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Thu, 28 Jan 2021 09:33:03 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-easily-create-a-postgres-database-in-docker-4moj</link>
      <guid>https://dev.to/andre347/how-to-easily-create-a-postgres-database-in-docker-4moj</guid>
      <description>&lt;p&gt;Have you ever had the problem where a tool or a piece of software works fine on your machine, but the moment you install it on someone else's you get all kinds of issues? Well, I have, and particularly for this reason Docker was invented! In this blog post we will take a quick look at what Docker is and how easy it is to run a database in a Docker container. This container will work on any machine. I promise. Along the way you also learn some Docker specific lingo.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Docker?
&lt;/h2&gt;

&lt;p&gt;According to the &lt;a href="https://docs.docker.com/get-started/overview/" rel="noopener noreferrer"&gt;official docs&lt;/a&gt;, Docker is an "&lt;em&gt;open platform for developing, shipping, and running applications. Docker enables you to separate your applications from your infrastructure so you can deliver software quickly. With Docker, you can manage your infrastructure in the same ways you manage your applications.&lt;/em&gt;"&lt;/p&gt;

&lt;p&gt;To fully understand Docker we also need to talk about the difference between Docker and a Virtual Machine (VM). VMs often run in cloud environments like AWS and Azure. Whenever you create a VM you are sharing the &lt;strong&gt;hardware&lt;/strong&gt; with other VMs: these cloud environments 'virtualise' the hardware, which is why you can have multiple (Guest) Operating Systems running on the same physical machine, managed by a Hypervisor such as VMware. Docker takes a different approach and virtualises the &lt;strong&gt;Operating System&lt;/strong&gt; instead: when you install Docker, the Docker Engine creates isolated entities on top of the host OS, and these entities are called containers. Docker therefore allows you to &lt;strong&gt;automate&lt;/strong&gt; the deployment of applications in these containers.&lt;/p&gt;

&lt;h2&gt;
  
  
  Let's install Docker
&lt;/h2&gt;

&lt;p&gt;Enough theory, let's jump into installing Docker and firing up a Postgres database. In order to use Docker you first need to install it. You can install Docker on a desktop machine (both Windows and Mac) or on a server (Linux-based installations). For this tutorial we're going to install Docker on a &lt;a href="https://docs.docker.com/docker-for-mac/install/" rel="noopener noreferrer"&gt;Mac&lt;/a&gt;. Windows installation instructions can be found &lt;a href="https://docs.docker.com/docker-for-windows/install/" rel="noopener noreferrer"&gt;here&lt;/a&gt;. For a Mac (and I think also for Windows) the installation is fairly straightforward: you download the app and drag it to Applications, then double-click &lt;a href="http://docker.app" rel="noopener noreferrer"&gt;Docker.app&lt;/a&gt; and it should start. You can check that it's working when there is a Docker icon (the whale-like image carrying containers) in the menu bar next to your other small icons. If this is the case you can quickly follow the 'Hello World' example to get up and running.&lt;/p&gt;
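&lt;p&gt;You can also verify the installation from a terminal. A quick sanity check (the exact version number will differ on your machine):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# print the installed Docker version
docker --version

# run the official hello-world image to confirm the engine works end to end
docker run hello-world
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;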

&lt;h2&gt;
  
  
  How to create a Postgres database
&lt;/h2&gt;

&lt;p&gt;Hands down the easiest way of running a clean Postgres database is by running this command in a terminal window (after Docker has been installed):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;--name&lt;/span&gt; postgres-db &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;docker &lt;span class="nt"&gt;-p&lt;/span&gt; 5432:5432 &lt;span class="nt"&gt;-d&lt;/span&gt; postgres
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;But what does it do?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The last section of the command grabs the latest &lt;code&gt;postgres&lt;/code&gt; Docker image from the &lt;a href="https://hub.docker.com/_/postgres" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;code&gt;-d&lt;/code&gt; tells Docker to run the container in the background (detached mode)&lt;/li&gt;
&lt;li&gt;&lt;code&gt;-p 5432:5432&lt;/code&gt; maps the container's port 5432 to port 5432 on the host - this allows you to connect to it from the outside&lt;/li&gt;
&lt;li&gt;&lt;code&gt;POSTGRES_PASSWORD&lt;/code&gt; sets the password to &lt;code&gt;docker&lt;/code&gt;. This is the password that gives you access to your database&lt;/li&gt;
&lt;li&gt;the &lt;code&gt;--name&lt;/code&gt; flag gives your container a name so you can easily find it back&lt;/li&gt;
&lt;/ul&gt;
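&lt;p&gt;To check that the container is actually up, you can list the running containers, and you can even open a &lt;code&gt;psql&lt;/code&gt; session inside the container without installing any client tools. A small sketch using the container name from above:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# list running containers - you should see 'postgres-db' in the output
docker ps

# open an interactive psql session inside the container
docker exec -it postgres-db psql -U postgres
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;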

&lt;p&gt;Now you can connect to this brand new Postgres database in any tool that allows you to communicate with databases. I tend to use RazorSQL or DBeaver. You need to use the following connection details to actually connect to the DB:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Host: localhost&lt;/li&gt;
&lt;li&gt;Port: 5432&lt;/li&gt;
&lt;li&gt;User: postgres&lt;/li&gt;
&lt;li&gt;Password: docker&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once connected you can do anything you want with the database (create tables, load data, etc.). But as you can see the database is completely empty. The real power of Docker shows when you want to easily provision a database that already has content in it. This can be a simple or a complex database structure and schema; the choice is all yours. It also means that you can easily spin such a container up (and shut it down). Let's see how below:&lt;/p&gt;

&lt;h2&gt;
  
  
  Create a Dockerfile and Docker Image
&lt;/h2&gt;

&lt;p&gt;In order to create a pre-populated database we need to create a Dockerfile. This is a bit of a strange file, at least the first time you see one, because it doesn't have a file extension. It's basically a text document with a set of instructions that tells Docker what to do. In our example we want to do the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pull down the latest Postgres image from the Docker Hub&lt;/li&gt;
&lt;li&gt;Set the environment variable for password to 'docker'&lt;/li&gt;
&lt;li&gt;Create a database, let's call it 'world'&lt;/li&gt;
&lt;li&gt;Use a sql dump file to create the table schema and populate it with data&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Above I described what I want in this file. Now let's create it. Create a new file and call it &lt;code&gt;Dockerfile&lt;/code&gt;. Use a text editor like VS Code to open it and add the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;FROM postgres
ENV POSTGRES_PASSWORD docker
ENV POSTGRES_DB world
COPY world.sql /docker-entrypoint-initdb.d/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The last line points at a SQL dump file. You can find the one I'm using &lt;a href="https://www.postgresql.org/ftp/projects/pgFoundry/dbsamples/world/dbsamples-0.1/" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Download it and put the .sql file in the same folder as the Dockerfile. Next step is to create our image by typing this command in the terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker build &lt;span class="nt"&gt;-t&lt;/span&gt; my-postgres-db ./
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above line tells Docker to build an image from the Dockerfile and give it a name of 'my-postgres-db'. In order to see your images you can run&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker images &lt;span class="nt"&gt;-a&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Great, now we've got our own image called 'my-postgres-db'. We can run it as a container by doing the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="nt"&gt;--name&lt;/span&gt; my-postgresdb-container &lt;span class="nt"&gt;-p&lt;/span&gt; 5432:5432 my-postgres-db
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can now connect to this database by using the login details specified in the Dockerfile.&lt;/p&gt;
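&lt;p&gt;As a quick check that the dump was loaded, you can query the new database directly. A sketch, assuming the &lt;code&gt;world&lt;/code&gt; dump creates a &lt;code&gt;city&lt;/code&gt; table (as the pgFoundry sample does):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# count the rows that world.sql loaded into the city table
docker exec -it my-postgresdb-container psql -U postgres -d world -c "SELECT count(*) FROM city;"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;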

&lt;p&gt;In case you want to remove images you can run this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker image &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="s1"&gt;'nameOfTheImage'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
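&lt;p&gt;Note that an image can only be removed when no container is using it, so you may need to stop and remove the container first:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# stop and remove the container, then remove the image it was created from
docker stop my-postgresdb-container
docker rm my-postgresdb-container
docker image rm my-postgres-db
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;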



</description>
      <category>docker</category>
      <category>aws</category>
      <category>cloud</category>
    </item>
    <item>
      <title>How to schedule an AWS Lambda function</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Tue, 01 Sep 2020 10:22:25 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-schedule-an-aws-lambda-function-3a8b</link>
      <guid>https://dev.to/andre347/how-to-schedule-an-aws-lambda-function-3a8b</guid>
      <description>&lt;p&gt;Efficiently and easily automating tasks and pieces of code is super important for those that want to stay productive. I'm a huge fan of scheduling my workloads so I don't need to look after tedious and repetitive tasks. Thanks to Amazon EventBridge (a.k.a. CloudWatch events) we can easily create rules for our Lambda functions and schedule them whenever we want. In this blog post we will take a look at how to do that!&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/_Nzw5w2PuOA"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h3&gt;
  
  
  What is AWS Lambda?
&lt;/h3&gt;

&lt;p&gt;AWS Lambda is probably the oldest and most popular serverless service out there. It allows you to focus on your code rather than having to take care of provisioning and maintaining (virtual) machines. It removes the need for such traditional compute services, thereby reducing complexity and operating costs. Lambda functions have their limitations, but they are great for small, isolated tasks: serverless websites, real-time data transformation, web authentication, chatbots, IoT workloads and a whole lot more. One of the things I like to do with it is schedule my tasks.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to schedule a Lambda function?
&lt;/h3&gt;

&lt;p&gt;The word 'serverless' already says it: there is no server! Well, actually there is, but you don't have to manage it. This means you only pay for the times you actually invoke your Lambda function. That is revolutionary compared with traditional compute services, where you pay constantly, even when you're not using your machine! It also makes scheduling even more attractive. Let's see how to do that. We're going to do the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a Lambda Function&lt;/li&gt;
&lt;li&gt;Setup a rule in Amazon EventBridge&lt;/li&gt;
&lt;li&gt;See our function being executed every N seconds (minutes or hours etc)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Almost sounds too simple, right? It definitely is super easy to set up!&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Create Lambda Function
&lt;/h3&gt;

&lt;p&gt;The quickest way to create a simple Lambda function is through the AWS Console. For more complex applications I would move to the &lt;a href="https://www.serverless.com/" rel="noopener noreferrer"&gt;Serverless Framework&lt;/a&gt;. First rule of using AWS: check which region you are in! I'm using London, but you can choose any region in which Lambda is supported. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm1jmdze7ludea0n4c7c6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm1jmdze7ludea0n4c7c6.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1598893296/aws-lambda-schedule-blog/01lambda.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next step is to create a function and give it a name. I called mine 'ScheduleLambdaFunction', but you can call this anything (1). After this you need to select a runtime (2). I'm most familiar with JavaScript so I'm using NodeJS but you can select any of the runtimes (Python, Java, Go etc). Last thing you need to do is create or select an execution role that has the correct permissions to execute the Lambda function (3). I'm going to create a new one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqqfgbykspa61o7qdmv6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqqfgbykspa61o7qdmv6.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1598893551/aws-lambda-schedule-blog/02lambda.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fac33b6ffenuozhfdzd5s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fac33b6ffenuozhfdzd5s.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1598893901/aws-lambda-schedule-blog/03lambda.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When your function has been created you can write your code in the integrated code editor. For this tutorial it's not very important what the code does so I'm just going to log the current date and time. If you're following along you can copy the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;

&lt;span class="nx"&gt;exports&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;statusCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`This fuction ran at &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;())}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7argol85zowsur4k1f7h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7argol85zowsur4k1f7h.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1598894352/aws-lambda-schedule-blog/04lambda.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure you hit save when you've edited the code!&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Setup a rule in Amazon EventBridge
&lt;/h3&gt;

&lt;p&gt;Once the function has been created and you've modified the code, you need to set up a rule in EventBridge. Go back to the console and navigate to 'Amazon EventBridge'. You can also schedule your Lambda function through CloudWatch 'events', but the easiest way is to do this in EventBridge; under the hood the two services use exactly the same API. The main difference is that with EventBridge you can also integrate third-party SaaS applications such as Zendesk, Datadog and Shopify.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft0fhljoku3ak41x19o8z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft0fhljoku3ak41x19o8z.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1598947928/aws-lambda-schedule-blog/Lambda08.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The setup of these rules is fairly straightforward. You need to create a rule (1), define a pattern and select an event bus (2), and assign it a target (our Lambda function) (3). You can configure the schedule either with a human-readable format or with a cron expression. Cron gives you a lot more flexibility, and there is no need to learn how cron expressions are actually written: head over to &lt;a href="http://crontab.guru" rel="noopener noreferrer"&gt;Crontab.guru&lt;/a&gt; to configure your own schedule expressions and copy-paste them into the AWS console. For this tutorial I chose to run the event every minute.&lt;/p&gt;
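&lt;p&gt;As a reference, here are a few example schedule expressions that EventBridge accepts. One caveat: EventBridge's cron format has six fields (it adds a year field, and one of day-of-month or day-of-week must be &lt;code&gt;?&lt;/code&gt;), so a five-field expression copied from Crontab.guru needs a small adjustment:&lt;/p&gt;

```
rate(1 minute)          # what this tutorial uses
rate(12 hours)          # every 12 hours
cron(0 12 * * ? *)      # every day at 12:00 UTC
cron(0/15 * * * ? *)    # every 15 minutes
```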

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjyt93460grpo6gohw474.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjyt93460grpo6gohw474.png" alt="https://res.cloudinary.com/dmim37dbf/image/upload/v1598948305/aws-lambda-schedule-blog/10lambda.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can optionally give your schedule a description and add some tags. Hit 'Create' and your scheduled Lambda function is ready! Pretty easy, right?&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Check the logs to see if the function executed successfully
&lt;/h3&gt;

&lt;p&gt;Now all the hard work has been done and you can head over to the CloudWatch logs to see your function results. You should see that the function gets executed every minute. &lt;/p&gt;

&lt;h2&gt;
  
  
  In conclusion
&lt;/h2&gt;

&lt;p&gt;In this blog post we created and scheduled a very simple Lambda function. It actually only logs out the current date and time. Not super exciting. But you can do so much more! If you head back into your Lambda function console you can see that there are lots of triggers and destinations. You could for example do some web scraping to extract information from a website and then load and save the results in an S3 bucket. Another use case would be to create a Twitter bot that tweets something at fixed times. There are so many options and I'm really keen to understand what you would do with scheduled Lambda. Let me know in the comments below or hit me up on &lt;a href="https://twitter.com/andre347_" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>100daysofcloud</category>
      <category>serverless</category>
    </item>
    <item>
      <title>How I became an AWS Certified Developer - how to pass the associate exam!</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Sun, 30 Aug 2020 19:54:02 +0000</pubDate>
      <link>https://dev.to/andre347/how-i-became-an-aws-certified-developer-how-to-pass-the-associate-exam-38hl</link>
      <guid>https://dev.to/andre347/how-i-became-an-aws-certified-developer-how-to-pass-the-associate-exam-38hl</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://andredevries.dev/posts/aws-certified-developer" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;A few months ago I passed the &lt;a href="https://aws.amazon.com/certification/certified-developer-associate/" rel="noopener noreferrer"&gt;AWS Certified Developer Associate Exam&lt;/a&gt;. This exam tests your knowledge of the core AWS services that you need to use for developing, deploying and debugging cloud-based applications. AWS Certifications are currently among the most sought-after credentials you can have in the industry, so I thought it would be helpful to share my experience and tips for this exam. This should therefore be a helpful blog post for those who want to start out in the tech sector or have an interest in becoming a developer who uses Amazon Web Services. I found the exam itself quite hard but it was really rewarding and I learned a ton!&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Amazon Web Services (AWS) is vast and there is a lot to learn. It currently comprises more than 200 services. I was pretty overwhelmed when I started delving into it. The first exam that I tried was the &lt;a href="https://aws.amazon.com/certification/certified-cloud-practitioner/" rel="noopener noreferrer"&gt;Cloud Practitioner&lt;/a&gt;. I passed that one earlier this year. To me, that exam felt like an entry exam into AWS and the cloud. There were lots of high-level questions about a myriad of services, ranging from virtualization to machine learning and databases. This made it really hard to know beforehand if I had studied enough. That was different for the Certified Developer Associate exam - &lt;strong&gt;the focus point of this blog post.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Exam details
&lt;/h2&gt;

&lt;p&gt;The format of the exam is multiple-choice. There are no open questions and you don't have to write an essay. Typically the exam is held in a test centre, but because we are living in a global pandemic AWS has moved all its exams to an online proctored environment. I'm still on the fence about whether I like that setup. A proctor is someone who constantly watches you while you take the exam. For the full 130 minutes! This means that even the slightest movement of your hands or face pauses the screen, and you will be asked why you are moving. I literally got stopped while grabbing the glass of water standing next to my laptop. I found this really frustrating and it broke my focus, especially the first time, when I did the Cloud Practitioner exam. The second time around, for the Developer Associate, I just accepted it (and didn't drink any water).&lt;/p&gt;

&lt;h2&gt;
  
  
  How I studied for the exam
&lt;/h2&gt;

&lt;p&gt;Before we jump into the specifics of the exam I want to explain how I prepared for it. Everyone has a different way of learning, so please don't take this as gospel. The time between registering for the exam and actually taking it was about 3.5 weeks. Even though I've used AWS quite a lot in the last few months, I had to learn a lot about certain services I had never used. I mainly studied through the &lt;a href="https://www.udemy.com/course/aws-certified-developer-associate/" rel="noopener noreferrer"&gt;Udemy&lt;/a&gt; course and the &lt;a href="https://tutorialsdojo.com/" rel="noopener noreferrer"&gt;practice exams&lt;/a&gt; that I mention below. The practice exams especially give you a good sense of what you know and what you need to focus on. For me this was mainly learning more about &lt;a href="https://aws.amazon.com/dynamodb/" rel="noopener noreferrer"&gt;DynamoDB&lt;/a&gt; and the CI/CD pipeline. Whenever I study a particular topic I try to immerse myself in the material. I do that via books, podcasts and YouTube videos. I have listed a whole bunch of resources at the end of this blog post, but the main reason I read books and listen to podcasts is that you get some 'real' opinions about the AWS services you are learning about. The Udemy course from A Cloud Guru is great, but it's a very structured and un-opinionated approach to learning about the cloud. If you, for example, listen to a podcast about serverless (e.g. the Real-World Serverless podcast by &lt;a href="https://theburningmonk.com/2020/03/announcing-the-new-real-world-serverless-podcast/" rel="noopener noreferrer"&gt;The Burning Monk&lt;/a&gt;) you get more in-depth explanations of the various study topics.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Let's start with the requirements for this exam. The Developer Associate exam is one of three associate-level exams. The other two are the Solutions Architect and the SysOps Administrator. Find more information about those exams &lt;a href="https://aws.amazon.com/certification/" rel="noopener noreferrer"&gt;here&lt;/a&gt;. AWS states that you need a certain amount of experience and in some cases even a few years of hands-on experience to pass them. However, there are no real prerequisites to &lt;em&gt;register&lt;/em&gt; for the exam. Anyone can therefore sign up, even if you don't hold any AWS certifications.&lt;/p&gt;

&lt;p&gt;The breakdown of the exam questions is as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deployment: 22%&lt;/li&gt;
&lt;li&gt;Security: 26%&lt;/li&gt;
&lt;li&gt;Development: 30%&lt;/li&gt;
&lt;li&gt;Refactoring: 10%&lt;/li&gt;
&lt;li&gt;Monitoring: 12%&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS recommends you have the following knowledge and experience for the Developer Associate exam:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In-depth knowledge of at least one high-level programming language&lt;/li&gt;
&lt;li&gt;Understanding of core AWS services, uses, and basic AWS architecture best practices&lt;/li&gt;
&lt;li&gt;Proficiency in developing, deploying, and debugging cloud-based applications using AWS&lt;/li&gt;
&lt;li&gt;Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications&lt;/li&gt;
&lt;li&gt;Ability to identify key features of AWS services&lt;/li&gt;
&lt;li&gt;Understanding of the AWS shared responsibility model&lt;/li&gt;
&lt;li&gt;Understanding of application lifecycle management&lt;/li&gt;
&lt;li&gt;Ability to use a CI/CD pipeline to deploy applications on AWS&lt;/li&gt;
&lt;li&gt;Ability to use or interact with AWS services&lt;/li&gt;
&lt;li&gt;Ability to apply a basic understanding of cloud-native applications to write code&lt;/li&gt;
&lt;li&gt;Ability to write code using AWS security best practices (e.g., not using secret and access keys in the code, instead using IAM roles)&lt;/li&gt;
&lt;li&gt;Ability to author, maintain, and debug code modules on AWS&lt;/li&gt;
&lt;li&gt;Proficiency writing code for serverless applications&lt;/li&gt;
&lt;li&gt;Understanding of the use of containers in the development process&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is a big list and you are tested on each item. But some are more important than others. I'll therefore break down the &lt;em&gt;most essential ones&lt;/em&gt; below:&lt;/p&gt;

&lt;h3&gt;
  
  
  In-depth knowledge of at least one high-level programming language
&lt;/h3&gt;

&lt;p&gt;This is the first one in the list and might scare a few people. Do I really need to know or even learn a complete programming language to be able to pass this exam? The answer is a &lt;em&gt;no, but it might help.&lt;/em&gt; What I mean by that is that you don't actually have to write code, but you have to understand it. It also helps with your further career in AWS. Even though you can do a lot in the AWS Console, many cloud developers use the &lt;a href="https://aws.amazon.com/cli/" rel="noopener noreferrer"&gt;CLI&lt;/a&gt; and &lt;a href="https://aws.amazon.com/cloudformation/" rel="noopener noreferrer"&gt;CloudFormation&lt;/a&gt; to deploy their 'infrastructure as code'. Because the questions are multiple choice, you can use the art of elimination to get to the right answers for these types of questions if you're not too certain. The ones I got that involved code were related to caching, specifically which code snippet demonstrated caching most effectively.&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding of core AWS services, uses, and basic AWS architecture best practices
&lt;/h3&gt;

&lt;p&gt;Even though this exam focuses heavily on the developer side of AWS, it is still important that you have a solid understanding of the core services such as &lt;a href="https://aws.amazon.com/ec2/" rel="noopener noreferrer"&gt;EC2&lt;/a&gt;, &lt;a href="https://aws.amazon.com/s3/" rel="noopener noreferrer"&gt;S3&lt;/a&gt;, &lt;a href="https://aws.amazon.com/iam/" rel="noopener noreferrer"&gt;IAM&lt;/a&gt; and &lt;a href="https://aws.amazon.com/rds/" rel="noopener noreferrer"&gt;RDS&lt;/a&gt;. Many questions list these services as possible answers, so knowing what you can (and can't) do with them is very important. I didn't get any questions around actually designing or architecting solutions in a &lt;a href="https://aws.amazon.com/vpc/" rel="noopener noreferrer"&gt;VPC&lt;/a&gt;. This is mainly because lots of the developer services are moving towards a 'serverless' setup, where much of the detailed architecture is already taken care of for you. Another point that always comes back in AWS exams is best practices. These often relate to security best practices. A rule of thumb when you work with AWS is that you are responsible for securing what's in the cloud, while AWS secures the cloud itself.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ability to use a CI/CD pipeline to deploy applications on AWS
&lt;/h3&gt;

&lt;p&gt;This one is super important and I got a lot of questions on it. CI/CD stands for Continuous Integration and Continuous Deployment. These are paradigms that developers use when they (automatically) build and deploy their code. Understanding what 'blue and green' deployments are is important, but you're not tested directly on the definitions of these terms. Often the questions are written as little case studies where you have to select which deployment strategy is the most suitable; 'suitable' is then also defined in the question itself. I would recommend using or playing around with &lt;a href="https://aws.amazon.com/codepipeline/" rel="noopener noreferrer"&gt;AWS Codepipeline&lt;/a&gt; on your own to understand the moving parts. This AWS service is kind of an umbrella service that comprises '&lt;a href="https://aws.amazon.com/codecommit/" rel="noopener noreferrer"&gt;Codecommit&lt;/a&gt;' (AWS's Github), &lt;a href="https://aws.amazon.com/codebuild/" rel="noopener noreferrer"&gt;Codebuild&lt;/a&gt; &amp;amp; &lt;a href="https://aws.amazon.com/codedeploy/" rel="noopener noreferrer"&gt;CodeDeploy&lt;/a&gt;. Another tool that you need to understand in this category is &lt;a href="https://aws.amazon.com/xray/" rel="noopener noreferrer"&gt;X-Ray&lt;/a&gt;, a tool that helps you analyse, debug and monitor your applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  Proficiency writing code for serverless applications
&lt;/h3&gt;

&lt;p&gt;You can take this requirement a bit broader and think about serverless applications as a whole, because in the exam there are quite a few questions related to serverless computing. You need to be able to articulate the differences between a 'traditional deployment' and a serverless deployment. In the former, the user has to provision their own instances, maintain and update the operating system, install applications and configure autoscaling themselves. None of this is needed when you move to a serverless framework. You only have to take care of writing the code of your application and monitoring it. Everything else, like autoscaling, is taken care of by AWS. Services that you need to know in this section are: &lt;a href="https://aws.amazon.com/lambda/" rel="noopener noreferrer"&gt;AWS Lambda&lt;/a&gt;, &lt;a href="https://aws.amazon.com/api-gateway/" rel="noopener noreferrer"&gt;API Gateway&lt;/a&gt;, &lt;a href="https://aws.amazon.com/dynamodb/" rel="noopener noreferrer"&gt;DynamoDB&lt;/a&gt;, &lt;a href="https://aws.amazon.com/step-functions/" rel="noopener noreferrer"&gt;Step Functions&lt;/a&gt;, &lt;a href="https://aws.amazon.com/athena" rel="noopener noreferrer"&gt;Athena&lt;/a&gt; and &lt;a href="https://aws.amazon.com/kinesis/" rel="noopener noreferrer"&gt;Kinesis&lt;/a&gt;. Of this list I would really recommend studying Lambda, API Gateway and DynamoDB.&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding of the use of containers in the development process
&lt;/h3&gt;

&lt;p&gt;This is something I didn't focus on at all while studying. I didn't get many questions about it either, so I wouldn't worry too much if you've never used Docker and Kubernetes. However, I would read up a bit on the AWS services that help you with deploying containers on AWS (&lt;a href="https://docs.aws.amazon.com/AmazonECS/latest/developerguide/docker-basics.html" rel="noopener noreferrer"&gt;AWS ECS&lt;/a&gt;) because they might be mentioned as potential answers in other, non-container-related questions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources I used to study
&lt;/h2&gt;

&lt;p&gt;There are a whole bunch of resources that I used to study. The most important thing to stress when you are studying for AWS exams is to get hands-on experience with the services. This hands-on experience gives you a good grip on the console. However, for the Developer Associate exam I would also focus on learning CloudFormation and how to use the &lt;a href="https://aws.amazon.com/serverless/sam/" rel="noopener noreferrer"&gt;AWS Serverless Application Model&lt;/a&gt; (SAM). Almost everything you can do in the console you can also do through the various APIs that AWS offers (e.g. the AWS CLI). The majority of resources that I list below will give you this firsthand experience:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;A Cloud Guru Developer Associate Course&lt;/strong&gt; (&lt;a href="https://www.udemy.com/course/aws-certified-developer-associate/" rel="noopener noreferrer"&gt;Udemy&lt;/a&gt;): a well-known resource for online cloud training. A Cloud Guru has a whole bunch of courses and this one is quite good and gives you a solid understanding of the developer side of AWS.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Exam Readiness AWS Certified Developer Associate&lt;/strong&gt; (&lt;a href="https://www.youtube.com/watch?v=HOPUwmq95kk" rel="noopener noreferrer"&gt;YouTube&lt;/a&gt;): recording of a presentation that walks you through the various elements of the exam. Contains practice questions and lots of tips &amp;amp; tricks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;TutorialsDojo&lt;/strong&gt; (&lt;a href="https://tutorialsdojo.com/" rel="noopener noreferrer"&gt;Website&lt;/a&gt;): this is where I bought access to practice exams. This was my primary method of testing which elements of the exam I had to pay more attention to.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS Simplified&lt;/strong&gt; (&lt;a href="https://www.youtube.com/channel/UCraiFqWi0qSIxXxXN4IHFBQ" rel="noopener noreferrer"&gt;YouTube&lt;/a&gt;): very informative YouTube channel that has lots of bite-size videos about various AWS services.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DynamoDB Guide&lt;/strong&gt; (&lt;a href="https://www.dynamodbguide.com/" rel="noopener noreferrer"&gt;Website&lt;/a&gt;): built by &lt;a href="https://www.alexdebrie.com/" rel="noopener noreferrer"&gt;Alex DeBrie&lt;/a&gt;. He's an AWS Hero and the author of &lt;a href="https://www.dynamodbbook.com/" rel="noopener noreferrer"&gt;The DynamoDB Book&lt;/a&gt;.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Off-by-one&lt;/strong&gt; (&lt;a href="https://offbynone.io/" rel="noopener noreferrer"&gt;newsletter&lt;/a&gt;): I signed up for two AWS related newsletters but also had a look through the backlog of this newsletter by &lt;a href="https://twitter.com/jeremy_daly" rel="noopener noreferrer"&gt;Jeremy Daly&lt;/a&gt;. It's a great way to learn more about the vast and growing ecosystem of serverless.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Last Week in AWS&lt;/strong&gt; (&lt;a href="https://www.lastweekinaws.com/" rel="noopener noreferrer"&gt;newsletter&lt;/a&gt;): not specifically focused on the developer side of AWS, but a good all-round weekly newsletter that keeps you informed about what's new in AWS.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Serverless Chats&lt;/strong&gt; (&lt;a href="https://www.serverlesschats.com/episodes/" rel="noopener noreferrer"&gt;podcast&lt;/a&gt;): a very informative but sometimes highly technical podcast from Jeremy Daly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Real World Serverless&lt;/strong&gt; (&lt;a href="https://theburningmonk.com/2020/03/announcing-the-new-real-world-serverless-podcast/" rel="noopener noreferrer"&gt;podcast&lt;/a&gt;): a podcast by Yan, an AWS Hero who specialises in everything related to serverless. Contains lots of very good examples of serverless implementations from various companies.&lt;/li&gt;
&lt;/ul&gt;
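&lt;p&gt;Since SAM comes up a few times above, here is a sketch of what a minimal SAM template looks like. This is an illustrative example only: the resource name, handler and code path are placeholders, not something from the exam or the resources above.&lt;/p&gt;

```yaml
# Minimal (hypothetical) SAM template defining one Lambda function.
# Deploy with `sam build` followed by `sam deploy --guided`.
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31

Resources:
  HelloFunction:              # placeholder resource name
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler  # index.js exporting a `handler` function
      Runtime: nodejs12.x
      CodeUri: src/           # placeholder path to the function code
```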

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>100daysofcloud</category>
    </item>
    <item>
      <title>How to host a React application on AWS S3</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Tue, 25 Aug 2020 16:49:52 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-host-a-react-application-on-aws-s3-53mc</link>
      <guid>https://dev.to/andre347/how-to-host-a-react-application-on-aws-s3-53mc</guid>
      <description>&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/BZcSUInHBfc"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;There are many ways to host your website on Amazon Web Services (AWS). One of the easiest is to use an S3 bucket to host your static website. Setup and configuration are fairly straightforward for this option. Take a look at the video and see how easy it is. In the video I explain how you can either manually build out your React application and then upload it to a bucket, or use the AWS CLI to automate the deployment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;AWS Account&lt;/li&gt;
&lt;li&gt;AWS CLI installed on your machine&lt;/li&gt;
&lt;li&gt;IAM User / role&lt;/li&gt;
&lt;li&gt;Local credentials of the AWS User&lt;/li&gt;
&lt;li&gt;NodeJS &amp;amp; npm installed&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Steps to Upload to S3
&lt;/h2&gt;

&lt;p&gt;As shown in the video:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scaffold a React application by running:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;   npx create-react-app nameofapp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;&lt;p&gt;Create an S3 bucket&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change Properties to allow static website hosting (index.html for the Index document.)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change Permissions of Bucket Policy (replace NameOFBucket with your bucket name from step 2)&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;   &lt;span class="o"&gt;{&lt;/span&gt;
       &lt;span class="s2"&gt;"Version"&lt;/span&gt;: &lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;,
       &lt;span class="s2"&gt;"Statement"&lt;/span&gt;: &lt;span class="o"&gt;[&lt;/span&gt;
           &lt;span class="o"&gt;{&lt;/span&gt;
               &lt;span class="s2"&gt;"Sid"&lt;/span&gt;: &lt;span class="s2"&gt;"AllowPublicReadAccess"&lt;/span&gt;,
               &lt;span class="s2"&gt;"Effect"&lt;/span&gt;: &lt;span class="s2"&gt;"Allow"&lt;/span&gt;,
               &lt;span class="s2"&gt;"Principal"&lt;/span&gt;: &lt;span class="s2"&gt;"*"&lt;/span&gt;,
               &lt;span class="s2"&gt;"Action"&lt;/span&gt;: &lt;span class="s2"&gt;"s3:GetObject"&lt;/span&gt;,
               &lt;span class="s2"&gt;"Resource"&lt;/span&gt;: &lt;span class="s2"&gt;"arn:aws:s3:::NameOFBucket/*"&lt;/span&gt;
           &lt;span class="o"&gt;}&lt;/span&gt;
       &lt;span class="o"&gt;]&lt;/span&gt;
   &lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="5"&gt;
&lt;li&gt;Build out the React application and copy contents of build folder over to S3
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt; yarn build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="6"&gt;
&lt;li&gt;Set up S3 sync, which syncs directories and S3 prefixes and recursively copies new and updated files from the source directory to the destination. Modify the package.json file and add a 'deploy' script that syncs the content of the build folder with the bucket:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  &lt;span class="s2"&gt;"deploy"&lt;/span&gt;: &lt;span class="s2"&gt;"aws s3 sync build/ s3://nameofbucket"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="7"&gt;
&lt;li&gt;Each time you want to deploy a new version of your app run:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  yarn build &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; yarn deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>react</category>
      <category>aws</category>
      <category>cloud</category>
      <category>automation</category>
    </item>
    <item>
      <title>A first look at Deno</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Wed, 06 May 2020 17:52:43 +0000</pubDate>
      <link>https://dev.to/andre347/a-first-look-at-deno-6dl</link>
      <guid>https://dev.to/andre347/a-first-look-at-deno-6dl</guid>
      <description>&lt;h3&gt;
  
  
  The release of Deno v1 is upon us. Today was the first day I played with it. Deno is a new runtime for JavaScript and TypeScript, created by the original creator of NodeJS. In the video I share my initial experiences.
&lt;/h3&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/bNJU458W2SQ"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>deno</category>
      <category>node</category>
      <category>javascript</category>
      <category>videos</category>
    </item>
    <item>
      <title>How to use a 'do... while' loop for API pagination</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Mon, 24 Feb 2020 18:44:11 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-use-a-do-while-loop-for-api-pagination-375b</link>
      <guid>https://dev.to/andre347/how-to-use-a-do-while-loop-for-api-pagination-375b</guid>
      <description>&lt;p&gt;There are various ways of looping in JavaScript. We have the regular 'for' loop, 'for/in', 'for/of' and the regular 'while' loop. Each has its strengths, weaknesses and use cases. By using an example of looping over multiple pages of an API I want to take a look at another type of loop, the &lt;strong&gt;'do... while'&lt;/strong&gt; loop.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem
&lt;/h2&gt;

&lt;p&gt;In my day-to-day job I have to work a lot with APIs. One characteristic of APIs is that they often provide the user with multiple 'pages' of data. There are clear reasons why you would want to split up your data into parts. One of them is the amount of resources it takes to serve up all the data in one API call; multiple smaller trips can be quicker and more efficient. However, very often we want to have access to all the data in one go. A solution for grabbing all of the data is to loop over all the pages and grab the pieces you are interested in.&lt;/p&gt;

&lt;p&gt;An example of a REST API that uses pagination is the Star Wars API, one of my favourite APIs for illustrating this problem. You can find the official documentation &lt;a href="https://swapi.co/" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Don't worry about rate limits or costs: this API is free to use. I use it in almost all my training sessions.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Syntax
&lt;/h2&gt;

&lt;p&gt;Before we implement this loop, let's take a look at the syntax of the 'do...while' loop. According to &lt;a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/do...while" rel="noopener noreferrer"&gt;MDN&lt;/a&gt; this type of statement &lt;em&gt;"creates a loop that executes a specified statement until the test condition evaluates to false. The condition is evaluated after executing the statement, resulting in the specified statement executing at least once."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The important part of the definition is that this loop executes at least once. This is especially useful with API calls, because you always make at least one request and can then check whether the resource you are requesting has more data; if there are no more pages, the loop doesn't continue. A regular 'while' loop, on the other hand, checks its condition before each iteration, so it may not run at all. Its syntax is a bit simpler than the do... while loop's. But let's take a look at how to create a do... while loop:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Basic do while loop&lt;/span&gt;
&lt;span class="c1"&gt;// Logs a message to the console&lt;/span&gt;
&lt;span class="c1"&gt;// @andre347_&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;doLoop&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// create an empty message&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="c1"&gt;// we want to log a message 5 times&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="c1"&gt;// execute this code block..&lt;/span&gt;
  &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;message&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="s2"&gt;`The number decreased to &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;i&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; \n`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="c1"&gt;// decrement i in each loop - so 5, 4, 3, 2, 1&lt;/span&gt;
    &lt;span class="nx"&gt;i&lt;/span&gt;&lt;span class="o"&gt;--&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;while &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="c1"&gt;// while i is more than 0 log something to the console&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// make sure we call our function&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Timer&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nf"&gt;doLoop&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;timeEnd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Timer&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you run this with Node.js (I used Node v12 with the experimental flag for ES modules) you will see the message logged five times. We want to use similar logic for our API pagination. The key to the do... while loop is the while condition: this is what controls how often the loop runs. Make sure you don't create &lt;a href="https://www.dummies.com/web-design-development/avoid-infinite-loops-javascript/" rel="noopener noreferrer"&gt;infinite loops&lt;/a&gt;, because a loop that never finishes can crash your browser or Node environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  API Pagination
&lt;/h2&gt;

&lt;p&gt;The Star Wars API contains all kinds of data related to the Star Wars universe, split into individual endpoints. One of them is '/people', which contains information about all the characters in the movies. When you inspect the first page of this endpoint you see a big object. The top section of this object is what we need for the pagination logic.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;87&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"next"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://swapi.co/api/people/?page=2"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"previous"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"results"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Luke Skywalker"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"height"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"172"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"mass"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"77"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;...etc&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This object contains a 'next' property, which holds the URL of the next page of data we want to grab in our API call. The logic to stop the loop is to check whether there is still a next page; if not, we have all the data. See the implementation below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getPages&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// set some variables&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;baseUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`https://swapi.co/api/people/?format=json&amp;amp;page=`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="c1"&gt;// create empty array where we want to store the people objects for each loop&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;people&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="c1"&gt;// create a lastResult array which is going to be used to check if there is a next page&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;lastResult&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// try catch to catch any errors in the async api call&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="c1"&gt;// use node-fetch to make api call&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;resp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;baseUrl&lt;/span&gt;&lt;span class="p"&gt;}${&lt;/span&gt;&lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
      &lt;span class="nx"&gt;lastResult&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;person&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="c1"&gt;// destructure the person object and add to array&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;height&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;films&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;person&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="nx"&gt;people&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;height&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;films&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
      &lt;span class="c1"&gt;// increment the page with 1 on each loop&lt;/span&gt;
      &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Oeps, something is wrong &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="c1"&gt;// keep running until there's no next page&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;while &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;lastResult&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;next&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="c1"&gt;// let's log out our new people array&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;people&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Time my API call&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nf"&gt;getPages&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;timeEnd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Time my API call&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should give you a nice array with all 87 characters, their heights, and the movies in which they appeared.&lt;/p&gt;
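As a quick illustration of what you can do with that result, here is a hypothetical sketch (the sample objects below are made up for illustration; note that SWAPI returns heights as strings, so convert them before comparing):

```javascript
// Hypothetical sample of the `people` array produced by getPages()
const people = [
  { name: "Luke Skywalker", height: "172", films: ["A New Hope"] },
  { name: "Yoda", height: "66", films: ["The Empire Strikes Back"] },
];

// Heights come back as strings, so convert with Number() before comparing
const tall = people.filter((person) => Number(person.height) > 100);

console.log(tall.map((person) => person.name)); // [ 'Luke Skywalker' ]
```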

&lt;p&gt;&lt;strong&gt;You can find all the code for this blog post in this &lt;a href="https://github.com/andre347/do-while-loop-api" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Originally posted at &lt;a href="https://andredevries.dev/posts/do-while-api-nodejs/" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>api</category>
      <category>node</category>
    </item>
    <item>
      <title>How to create a user snippet in VS Code</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Sun, 09 Feb 2020 13:15:59 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-create-a-user-snippet-in-vs-code-h1</link>
      <guid>https://dev.to/andre347/how-to-create-a-user-snippet-in-vs-code-h1</guid>
      <description>&lt;p&gt;Originally posted on &lt;a href="https://andredevries.dev/" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I write all my blog posts in markdown in VS Code, because my personal website is built with Gatsby and that makes it extremely easy to create pages from markdown files. What I don't like about writing markdown, especially for a Gatsby blog, is that I have to write frontmatter each time I start writing. Frontmatter is the first section of a blog post and I see it as the metadata of the post. It can contain anything, but generally you have the title of the post, the date and time of writing, a description, and a category. Fortunately, VS Code makes it very easy to create snippets, so I have written a quick snippet that scaffolds out the necessary frontmatter for my blog posts. Let's take a look at how I created it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F98p4xyj9a6dp8dkcq06h.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F98p4xyj9a6dp8dkcq06h.gif" alt="markdown-gif" width="600" height="296"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What are user snippets in VS Code?
&lt;/h2&gt;

&lt;p&gt;User snippets are a quick way to scaffold out some code or a text. They can save you a lot of time, especially when you have to write the same thing over and over again. I use these a lot when I work in React - I can quickly create a functional component that is exported by using &lt;a href="https://marketplace.visualstudio.com/items?itemName=dsznajder.es7-react-js-snippets" rel="noopener noreferrer"&gt;this snippet&lt;/a&gt; from the VS Code Marketplace. The only thing I have to type is 'rcfe' and it creates the snippet for me.&lt;/p&gt;

&lt;h2&gt;
  
  
  How can I create a user snippet?
&lt;/h2&gt;

&lt;p&gt;There is some handy documentation which you can find &lt;a href="https://code.visualstudio.com/docs/editor/userdefinedsnippets" rel="noopener noreferrer"&gt;here&lt;/a&gt;. I'm using a Mac, which means I have to go to 'Code &amp;gt; Preferences &amp;gt; User Snippets'. VS Code will then ask you if you want to create a 'global' snippet or a file-specific one. The global option means that you can use the snippet in any filetype (.js, .md, .mdx, etc.). You can also limit the snippet to the current workspace. Because I want to use this snippet in other folders in VS Code as well, I generated one scoped to markdown files.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgv5zzruvf98rhhjpbd1f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgv5zzruvf98rhhjpbd1f.png" alt="usersnippet" width="800" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frj24drm83gi2t4tsuomt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frj24drm83gi2t4tsuomt.png" alt="usersnippet2" width="800" height="165"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next step is to configure the snippet. You have to write it in JSON. The image below shows how I set up the snippet from the animated gif at the top of this page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fde4ptwc65o6tylvjt83k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fde4ptwc65o6tylvjt83k.png" alt="usersnippet3" width="800" height="718"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first thing you need to do is give your snippet a &lt;em&gt;name&lt;/em&gt;. I called mine 'Markdown Blog Frontmatter'. The &lt;em&gt;prefix&lt;/em&gt; is the actual shortcode that you have to write to trigger the creation of the snippet in a markdown file. You can trigger a prefix by hitting 'Ctrl + Space' on a Mac, which brings up the IntelliSense window.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3mxar0o45q1ow2f1o8g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3mxar0o45q1ow2f1o8g.png" alt="IntelliSense" width="800" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The most important part of the snippet goes into the &lt;em&gt;body&lt;/em&gt; section. This is the actual text that is going to be placed in the document. You can use a few variables which make it easy to scaffold out some information. In my example I used the filename (without file extension) as the title of the blog post, and a date variable that resolves to today's date. Another variable that I used is '$0', which places the cursor at a particular location. In my example I placed two cursors: one in the frontmatter title section and one at the start of the blog post.&lt;/p&gt;
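Since the screenshot can be hard to read, here is a hypothetical version of such a snippet (the prefix and field names are illustrative, not my exact configuration; the variables TM_FILENAME_BASE, CURRENT_YEAR, CURRENT_MONTH, and CURRENT_DATE are built into VS Code):

```json
{
  "Markdown Blog Frontmatter": {
    "prefix": "frontmatter",
    "body": [
      "---",
      "title: $TM_FILENAME_BASE",
      "date: $CURRENT_YEAR-$CURRENT_MONTH-$CURRENT_DATE",
      "description: ",
      "category: ",
      "---",
      "",
      "$0"
    ],
    "description": "Scaffold frontmatter for a blog post"
  }
}
```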

&lt;p&gt;You can see a full list of variables that you can use in snippets &lt;a href="https://code.visualstudio.com/docs/editor/variables-reference" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Variables make snippets really powerful: you can, for example, reference environment variables or prompt the user for input.&lt;/p&gt;

&lt;p&gt;I'm really impressed with the possibilities and how easy it is to set up these snippets. They can be a huge time saver. While browsing the VS Code marketplace I found a really handy extension: if you want to turn any text into a snippet you can use &lt;a href="https://marketplace.visualstudio.com/items?itemName=vincentkos.snippet-creator" rel="noopener noreferrer"&gt;this extension&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>vscode</category>
      <category>snippet</category>
      <category>extension</category>
      <category>plugin</category>
    </item>
    <item>
      <title>How to deploy a React application on AWS Amplify</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Tue, 31 Dec 2019 17:09:34 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-deploy-a-react-application-on-aws-amplify-3eik</link>
      <guid>https://dev.to/andre347/how-to-deploy-a-react-application-on-aws-amplify-3eik</guid>
      <description>&lt;h3&gt;
  
  
  While studying for one of the AWS exams I found a service called AWS Amplify. In this video I show you how easy it is to use this service to push a React application to a GitHub repository and then have it automatically deploy on AWS.
&lt;/h3&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/kKwyKQ8Jxd8"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Originally published at &lt;a href="https://andredevries.dev/posts/aws-amplify-react/" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>react</category>
      <category>javascript</category>
      <category>videos</category>
    </item>
    <item>
      <title>D3.js and Vue.js</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Sun, 17 Mar 2019 20:42:24 +0000</pubDate>
      <link>https://dev.to/andre347/d3js-and-vuejs--30c8</link>
      <guid>https://dev.to/andre347/d3js-and-vuejs--30c8</guid>
      <description>&lt;p&gt;In this blog, I will describe how you can integrate D3 into Vue.js. D3 is a popular JavaScript library for visualising data using web standards (HTML, CSS, JavaScript, and SVG). Vue.js is a rising &lt;a href="https://hasvuepassedreactyet.surge.sh/" rel="noopener noreferrer"&gt;star&lt;/a&gt; in the front-end and has lately gained a lot of popularity in the web development scene. It’s a front-end framework similar to React and Angular and allows you to build scalable user interfaces.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;I’m learning D3 as I go, and I’ve used Vue.js a lot in the last few months, I thought it would be good to practice how I can combine the two. I’m still learning and improving my D3 skills so I think that if I revisit this post in a few months I’ll see mistakes or things I would change, so please highlight any issues or suggestions in the comments below.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I borrowed the chart that I’m recreating for this blog post from &lt;a href="https://medium.com/@Elijah_Meeks" rel="noopener noreferrer"&gt;Elijah Meeks&lt;/a&gt; (image below), who wrote an excellent book about D3.js called &lt;a href="https://www.amazon.com/D3-js-Action-Data-visualization-JavaScript/dp/1617294489/ref=sr_1_2?ie=UTF8&amp;amp;qid=1546022089&amp;amp;sr=8-2" rel="noopener noreferrer"&gt;‘D3.js in Action’&lt;/a&gt;. If you want to learn more about this JavaScript library then this should be your first stop (so yeah, definitely buy this one).&lt;/p&gt;

&lt;p&gt;In Chapter 9 he writes about integrating D3 with React, and I have seen a lot of good examples of this combination on GitHub and bl.ocks.org. But there aren’t that many resources around integrating D3 with Vue.js. I found a few other articles on Medium and one on bl.ocks.org, but the best one so far was &lt;a href="https://github.com/sxywu/vue-d3-example" rel="noopener noreferrer"&gt;this&lt;/a&gt; repo from &lt;a href="https://twitter.com/sxywu" rel="noopener noreferrer"&gt;Shirley Wu&lt;/a&gt;, who is a freelance software engineer and data visualisation expert.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbbvlzhaqit6kqv2qa2k4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbbvlzhaqit6kqv2qa2k4.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;&lt;a href="https://andre347.github.io/d3-vue-example/" rel="noopener noreferrer"&gt;‘Circle Chart’&lt;/a&gt; for displaying hierarchical data - taken from Chapter 6.3.1 from D3js in Action&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to get started with Vue&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s get started with scaffolding a Vue Project — this is similar to &lt;code&gt;create-react-app&lt;/code&gt; for those that come from a React world. I’m using Vue CLI version 3. If you don’t have it installed then please run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;$&lt;/span&gt; &lt;span class="nx"&gt;npm&lt;/span&gt; &lt;span class="nx"&gt;install&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;g&lt;/span&gt; &lt;span class="p"&gt;@&lt;/span&gt;&lt;span class="nd"&gt;vue&lt;/span&gt;&lt;span class="sr"&gt;/cl&lt;/span&gt;&lt;span class="err"&gt;i
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I know that using the CLI for just one chart component is a bit overkill but I suppose you’re going to integrate D3 in a bigger application for which you want to use all Vue functionality. The -g flag means you’re installing it globally on your machine so there’s no need to run this again the next time you use Vue. After installing Vue it’s time to create a new project. Run the following lines one by one in your terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;$&lt;/span&gt; &lt;span class="nx"&gt;vue&lt;/span&gt; &lt;span class="nx"&gt;create&lt;/span&gt; &lt;span class="nx"&gt;d3&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;vue&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;example&lt;/span&gt;
&lt;span class="nx"&gt;$&lt;/span&gt; &lt;span class="nx"&gt;cd&lt;/span&gt; &lt;span class="nx"&gt;d3&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;vue&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;example&lt;/span&gt;
&lt;span class="nx"&gt;$&lt;/span&gt; &lt;span class="nx"&gt;npm&lt;/span&gt; &lt;span class="nx"&gt;run&lt;/span&gt; &lt;span class="nx"&gt;serve&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With npm run serve you’re starting a development server with ‘hot reload’ enabled. This means that changes to almost any file are immediately reflected in the browser. Once you have this server running it’s time to install D3. You can do that like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;$&lt;/span&gt; &lt;span class="nx"&gt;npm&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="o"&gt;--&lt;/span&gt;&lt;span class="nx"&gt;save&lt;/span&gt; &lt;span class="nx"&gt;d3&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you open up the d3-vue-example folder in your favourite editor (I use VS Code) then you see a bunch of files and folders listed. For now it’s important that the package.json file is there. This is where all the packages appear that you install through npm. D3 should now also show up under dependencies in this package.json file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a chart component&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The most important files and folders for us are in the &lt;code&gt;src&lt;/code&gt; folder. &lt;code&gt;App.vue&lt;/code&gt; is the main entrance to your application. In this file, you want to import all the components that you create. By default you see only one component being imported here; the HelloWorld. This component file is located in the components subfolder. It’s best practice to put all your components in this folder.&lt;/p&gt;

&lt;p&gt;Let’s also create a &lt;code&gt;Chart.vue&lt;/code&gt; file in this folder. Then go back into your App.vue file, duplicate line 9, and replace HelloWorld with the newly created Chart file. After this you have to add Chart to the components property of the JavaScript object that is being exported in this file. The next step is to reference this component in the template section of the &lt;code&gt;App.vue&lt;/code&gt; file.&lt;/p&gt;
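A minimal sketch of what the script section of App.vue could look like after that step (this is illustrative, not the exact file; in a real .vue file it sits between script tags, and your component name may differ):

```javascript
// Hypothetical script section of App.vue (shown without the surrounding script tags)
import HelloWorld from "./components/HelloWorld.vue";
import Chart from "./components/Chart.vue";

export default {
  name: "app",
  components: {
    HelloWorld,
    // registering Chart here makes it available in this file's template
    Chart,
  },
};
```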

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejqzemlnj0u1osxh8hi7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejqzemlnj0u1osxh8hi7.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Component ‘PackChart’ is being imported in App.vue and used in the template&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Okay, that might have confused you a bit if you’re new to front-end frameworks and working with npm. Head over to &lt;a href="https://github.com/andre347/d3-vue-example" rel="noopener noreferrer"&gt;my Github page&lt;/a&gt; to find the source code if you want a full look.&lt;/p&gt;

&lt;p&gt;Go to your localhost &lt;code&gt;port 8080&lt;/code&gt; (&lt;a href="http://localhost:8080" rel="noopener noreferrer"&gt;http://localhost:8080&lt;/a&gt;) and there you’ll be welcomed by the Vue default template. If you’re new to Vue.js then this new file extension &lt;code&gt;.vue&lt;/code&gt; might look a bit foreign. Actually, this is the beauty of Vue — within this one file you create your own components and have all your HTML (template), JavaScript, and CSS together. Going over all the basics of Vue is too much for this blog, so I recommend spending some time with &lt;a href="https://gitconnected.com/site/redirect/tutorials/2052" rel="noopener noreferrer"&gt;this&lt;/a&gt; course on Udemy from &lt;a href="https://twitter.com/maxedapps?lang=en" rel="noopener noreferrer"&gt;Maximilian Schwarzmüller&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Adding D3 to Vue&lt;/strong&gt;&lt;br&gt;
I tend to import D3 in all the components that I create (including App.vue) but it’s probably best practice to not do this and just import it once, or only import the elements of the API that you need. An explanation of the module import of D3 can be found &lt;a href="https://stackoverflow.com/questions/50606982/what-is-the-correct-way-to-import-and-use-d3-and-its-submodules-in-es6/50610922" rel="noopener noreferrer"&gt;here&lt;/a&gt;. You can import D3 in each component by referencing it in the top of the script section of the vue file like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;d3&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;d3&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Another way would be to include the &lt;a href="https://cdnjs.com/libraries/d3" rel="noopener noreferrer"&gt;CDN link&lt;/a&gt; in the head section of the &lt;code&gt;index.html&lt;/code&gt; file, but it is best practice to use the node modules, though with the CDN method D3 would be available everywhere in your application.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Build out the Chart component&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you go back to your &lt;code&gt;App.vue&lt;/code&gt; file then we’ll set up data props. Props are the data you want to send from your parent component, the App.vue file, to your child components, in this case &lt;code&gt;Chart.vue&lt;/code&gt;. Let’s first create a data property in which we’ll push the data (I called it loadData). We’re going to use the tweets.json file from Meeks’ book — you can get the file &lt;a href="https://github.com/emeeks/d3_in_action_2/blob/master/data/tweets.json" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Once downloaded, move the file into the public folder in the project folder.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff09l82vg228onm8qei8y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff09l82vg228onm8qei8y.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Load the data in App.vue when the app has mounted&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Vue.js has several ‘life-cycle’ hooks. These correspond to the different ‘states’ of your application. In the image above you see the ‘mounted’ property in the Vue instance. When the application has loaded it adds all the properties it can find in the ‘data’ object to its reactivity system. This means that if the data changes, your application also updates (it becomes reactive). If you’re new to front-end frameworks, it might be a bit difficult in the beginning to wrap your head around the concept of ‘state’ and how elements are removed and updated. But if you’re familiar with D3 then this might ring a bell: think of it as the whole ‘Enter-Update-Exit’ pattern.&lt;/p&gt;
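As a sketch, the mounted hook described here could look something like this in App.vue (illustrative only; it assumes tweets.json sits in the public folder with a top-level 'tweets' array, and that the data property is called loadData as above):

```javascript
// Hypothetical excerpt from App.vue (shown without the surrounding script tags)
import * as d3 from "d3";

export default {
  data() {
    return {
      loadData: [], // filled once the app has mounted
    };
  },
  async mounted() {
    // in D3 version 5, d3.json returns a promise
    const data = await d3.json("/tweets.json");
    // assigning to loadData triggers Vue's reactivity system
    this.loadData = data.tweets;
  },
};
```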

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F900utn6ppr7yjbwvx30g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F900utn6ppr7yjbwvx30g.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Add this to the Chart.vue file&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Back to the mounted hook. In D3 version 5 we use promises to load our data, which makes life a lot easier: previously you had to use callbacks, and those often became a bit messy. What mounted does in this case is load the data from the JSON file and make it available once the DOM has been ‘mounted’. After this, you need to add the data to the prop we created in the &lt;code&gt;Chart.vue&lt;/code&gt; component (see the image above). You then bind this prop to the Chart component in the &lt;code&gt;App.vue&lt;/code&gt; file like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;PackChart&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nx"&gt;loadData&lt;/span&gt; &lt;span class="o"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This means that whatever object is in ‘loadData’ is pushed into the child component (the Chart.vue file, registered as PackChart).&lt;/p&gt;
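&lt;p&gt;On the receiving side, the child component only needs to declare the prop. A minimal sketch (the prop name comes from the binding above; the type and default are assumptions):&lt;/p&gt;

```javascript
// Sketch of Chart.vue's (PackChart's) script section: declare a `data` prop
// so the parent can bind its loadData property to it.
const PackChart = {
  name: "PackChart",
  props: {
    data: {
      type: Array,       // assumption: the tweets arrive as an array
      default: () => [], // empty until the parent has loaded the JSON
    },
  },
};
```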

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If you’re new to this Async/Await syntax of handling Promises then I would suggest watching &lt;a href="https://www.youtube.com/watch?v=9YkUCxvaLEk" rel="noopener noreferrer"&gt;this talk&lt;/a&gt; from Wes Bos.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Create the D3 chart&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first section of the script part of the &lt;code&gt;Chart.vue&lt;/code&gt; file imports D3 and returns the &lt;code&gt;data&lt;/code&gt; object, which holds a message that is displayed in the component and the &lt;code&gt;width&lt;/code&gt; and &lt;code&gt;height&lt;/code&gt; of the SVG that contains the chart (1). This width and height are then bound to the SVG in the template (2).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhkwjseonwmcff4izwqlq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhkwjseonwmcff4izwqlq.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Width and Height of SVG specified in the data object of the Vue instance&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In the ‘created’ life-cycle hook, I define a scale function for the colours of the circle chart. Because the input is a set list of nested bubbles (a discrete domain), we can use the scaleOrdinal scale, which maps it to a discrete output of colours (the ones we define in the array). If you want to learn more about the different scales of D3, I recommend heading over to this page.&lt;/p&gt;
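&lt;p&gt;To make that behaviour concrete, here is the ordinal-scale idea sketched in plain JavaScript (the colours are placeholders, not necessarily the tutorial’s array):&lt;/p&gt;

```javascript
// Plain-JS sketch of what an ordinal scale does with a discrete domain:
// each new input gets the next colour in the range, and a repeated input
// always gets the same colour back.
const range = ["#fcd88a", "#cc7b6a", "#7b92aa", "#2d3142"]; // placeholder colours
const seen = new Map();

const colourScale = (key) => {
  // first time we see a key, assign the next colour in the range
  if (!seen.has(key)) seen.set(key, range[seen.size % range.length]);
  return seen.get(key);
};
```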

&lt;p&gt;The next step is to create a &lt;code&gt;computed&lt;/code&gt; property in which we restructure the data so that we can use it as a hierarchy. D3 has several useful functions that help you get your data ready for charts that display hierarchies. One of them is the &lt;a href="http://learnjsdata.com/group_data.html" rel="noopener noreferrer"&gt;nest()&lt;/a&gt; function, which turns a flat structure into a nested one (1 =&amp;gt; 2). You can define which property to nest by and how deep to nest. In our case, I use the ‘user’ as the top-level property, which means our new array contains four objects (one for each user).&lt;/p&gt;
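&lt;p&gt;The reshaping that nest() performs can be sketched in plain JavaScript, which makes the resulting { key, values } structure easy to see (the sample tweets below are made up):&lt;/p&gt;

```javascript
// Group a flat array of tweets by user, producing the same { key, values }
// entries that d3.nest().key(d => d.user).entries(tweets) would give.
const tweets = [
  { user: "Al", content: "first tweet" },
  { user: "Al", content: "second tweet" },
  { user: "Bo", content: "third tweet" },
];

// build a lookup of user -> tweets, then turn it into an array of entries
const byUser = tweets.reduce((groups, t) => {
  (groups[t.user] = groups[t.user] || []).push(t);
  return groups;
}, {});

const nested = Object.entries(byUser).map(([key, values]) => ({ key, values }));
// nested now has one object per user, each with its tweets as children
```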

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43i9wrcs412t53j3ja4u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43i9wrcs412t53j3ja4u.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;From one tweet per Object to one Object per user (with tweets as children) with &lt;code&gt;nest()&lt;/code&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In this same computed property, I’m using the hierarchy module. This module takes a root (the new object called packableTweets — see image below) and returns a new layout.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6crbi3i9196slzyjrag.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6crbi3i9196slzyjrag.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Creates the hierarchical structure as per previous image&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In order to actually draw something on the screen, we need to return some data that can be bound to the SVG in the template. For that, I created another computed property that takes in the previous one (&lt;code&gt;packData&lt;/code&gt;) and returns an array of JS objects with the x &amp;amp; y coordinates and the radius of the circles. It also uses the colourScale defined in the created hook (see image below).&lt;/p&gt;
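&lt;p&gt;Roughly, that computed property boils down to a map over the laid-out nodes. A sketch with hand-made stand-ins for the pack layout’s output (the field names and colours are assumptions):&lt;/p&gt;

```javascript
// Map pack-layout nodes to the plain attributes the SVG circles need.
// These two nodes are made-up examples of what the layout emits (x, y, r, depth).
const packedNodes = [
  { x: 150, y: 120, r: 40, depth: 1 },
  { x: 220, y: 180, r: 15, depth: 2 },
];

const circleData = packedNodes.map((node) => ({
  cx: node.x,
  cy: node.y,
  r: node.r,
  // in the tutorial the fill comes from the colourScale; hard-coded here
  fill: node.depth === 1 ? "#fcd88a" : "#cc7b6a",
}));
```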

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fewaidido6hgjo61dfb64.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fewaidido6hgjo61dfb64.png"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Returns an array of objects (containing the circle data)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We can then loop over this array with the &lt;code&gt;v-for&lt;/code&gt; directive and display the circles in the view with their corresponding &lt;code&gt;x&lt;/code&gt; and &lt;code&gt;y&lt;/code&gt; coordinates, their radius, and colour.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0h2pcdrtgwzs86z46r93.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0h2pcdrtgwzs86z46r93.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you made it here, you’ve followed quite a lot of steps. If you got stuck along the way, I recommend heading over to my GitHub to clone the &lt;a href="https://github.com/andre347/d3-vue-example" rel="noopener noreferrer"&gt;repo&lt;/a&gt; and inspect the chart &lt;a href="https://andre347.github.io/d3-vue-example/" rel="noopener noreferrer"&gt;here&lt;/a&gt;. I’m open to feedback, as I’m certain I’ve either overcomplicated things or overlooked elements in my code.&lt;/p&gt;

&lt;p&gt;Originally published at &lt;a href="https://andredevries.dev/posts/d3-vuejs/" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt; (January 2019)&lt;/p&gt;

</description>
      <category>d3js</category>
      <category>vue</category>
    </item>
    <item>
      <title>How to create a Task Scheduler in NodeJS</title>
      <dc:creator>Andre</dc:creator>
      <pubDate>Sat, 16 Mar 2019 10:35:37 +0000</pubDate>
      <link>https://dev.to/andre347/how-to-create-a-task-scheduler-in-nodejs-4lo2</link>
      <guid>https://dev.to/andre347/how-to-create-a-task-scheduler-in-nodejs-4lo2</guid>
      <description>&lt;h3&gt;
  
  
  Wes Bos posted a really useful &lt;a href="https://www.youtube.com/watch?v=rWc0xqroY4U" rel="noopener noreferrer"&gt;video&lt;/a&gt; explaining how to scrape data from the web with NodeJS. In his second &lt;a href="https://www.youtube.com/watch?v=9dIHjegGeKo" rel="noopener noreferrer"&gt;video&lt;/a&gt; he explained how to set up a schedule for this particular task. I'd never done that in Node before, so I thought it might come in useful in the future and decided to write a quick blog post about it.
&lt;/h3&gt;

&lt;p&gt;Whereas Wes grabs data from his own social media pages in his video, I'm going to create a small app that runs on a schedule and downloads a random image every day at 6PM. I know, right? Who doesn't want a random image popping up on their disk every day?!&lt;/p&gt;

&lt;p&gt;A few things we need to install first:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;    &lt;span class="c1"&gt;// create dir, go into it and install packages&lt;/span&gt;
    &lt;span class="nx"&gt;mkdir&lt;/span&gt; &lt;span class="nx"&gt;image&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;downloader&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;cd&lt;/span&gt; &lt;span class="nx"&gt;image&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;downloader&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt;
    &lt;span class="nx"&gt;npm&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;cron&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;fetch&lt;/span&gt; &lt;span class="nx"&gt;esm&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A quick breakdown of what you've just installed:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;node-cron&lt;/strong&gt;: this is the package for the task scheduler. It allows you to set up schedules that automatically run something (usually a function).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;node-fetch&lt;/strong&gt;: brings the fetch API to Node. fetch is a native browser API, but we don't have a browser when we use Node. You could also use another package here; Axios is a very popular one. It lets you download the content behind a URL, which you'd typically use for connecting to APIs or scraping the web.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;esm&lt;/strong&gt;: I hadn't used this one before, but it's super useful. It allows you to write your code as you would in client-side JavaScript, such as in Vue or React, which means you have access to things like import/export. To enable esm you have to install it and then add it to your run script. In my package.json file I added this line as the 'start' script:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;scripts&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;start&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node -r esm index.js&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You could then run this script by doing &lt;code&gt;npm run start&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create downloader
&lt;/h2&gt;

&lt;p&gt;Now that we have the necessary packages installed, it's time to create the first file, in which we'll fetch just one image: fetch.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// fetch.js&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node-fetch&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// create a function that grabs a random image&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fetchingData&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://picsum.photos/200?random&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;date&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;dest&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createWriteStream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`./image-&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;date&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.png`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;pipe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;dest&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// export the function so it can be used in the index.js file&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;fetchingData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In order to get a random picture each time you execute this script, I use &lt;a href="https://picsum.photos/" rel="noopener noreferrer"&gt;Picsum&lt;/a&gt;. This website generates a random image with a fixed width and height; you append the dimensions to the URL. I also create a variable with the current date, which is appended to the file name so that files don't get overwritten. Because we're working with promises here, I'm using async/await.&lt;/p&gt;

&lt;p&gt;If you want to test this file you can run it with &lt;code&gt;node -r esm fetch.js&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setup a schedule
&lt;/h2&gt;

&lt;p&gt;Next you want to create an index.js file. This will be the main entry file and this one contains the node-cron function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;cron&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node-cron&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nx"&gt;cron&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;schedule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;* * * * *&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`this message logs every minute`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is a very small app which, if you execute it, logs a message to the console every minute. Cool, but not very useful. Let's add our image fetcher by importing it. The index.js file will then look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;cron&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node-cron&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fetchingData&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./fetch&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nx"&gt;cron&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;schedule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;* * * * *&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`one minute passed, image downloaded`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nf"&gt;fetchingData&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;However, this runs the image downloader every minute. We can change the cron job via the first parameter we pass to the schedule function: the five stars mean the function runs every minute. You can modify it using the following pattern (taken from &lt;a href="https://github.com/node-cron/node-cron" rel="noopener noreferrer"&gt;here&lt;/a&gt;):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; # ┌────────────── second (optional)
 # │ ┌──────────── minute
 # │ │ ┌────────── hour
 # │ │ │ ┌──────── day of month
 # │ │ │ │ ┌────── month
 # │ │ │ │ │ ┌──── day of week
 # │ │ │ │ │ │
 # │ │ │ │ │ │
 # * * * * * *
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;At first, I didn't really understand what this meant. After a bit of Googling I found a website that makes a really useful cheatsheet: &lt;a href="https://crontab.guru/" rel="noopener noreferrer"&gt;crontab.guru&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This means you can set up a schedule for literally any time. Maybe once a year? Or every Tuesday at 8AM in January and July. There's really no limitation. I ended up scheduling the download for every day at 6PM by setting it to this: &lt;code&gt;0 18 * * *&lt;/code&gt;&lt;/p&gt;
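&lt;p&gt;A few example expressions, following the minute-hour-day-month-weekday order from the diagram above (the descriptions are my own readings of the fields):&lt;/p&gt;

```javascript
// Example cron expressions you could pass as the first argument to
// cron.schedule(). Field order: minute hour day-of-month month day-of-week.
const examples = {
  "* * * * *": "every minute",
  "0 18 * * *": "every day at 6PM",
  "0 8 * 1,7 2": "every Tuesday at 8AM, but only in January and July",
  "0 0 1 1 *": "once a year, at midnight on the 1st of January",
};
```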

&lt;p&gt;The complete and final &lt;code&gt;index.js&lt;/code&gt; file is then:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;cron&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node-cron&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fetchingData&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./fetch&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nx"&gt;cron&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;schedule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;0 18 * * *&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`one minute passed, image downloaded`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nf"&gt;fetchingData&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Want to take a look at the full app or clone it? Head over to my Github &lt;a href="https://github.com/andre347/nodejs-task-scheduling" rel="noopener noreferrer"&gt;here&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;First post at Dev.to so be gentle. Originally published at &lt;a href="https://andredevries.dev/posts/node-task-scheduler/" rel="noopener noreferrer"&gt;andredevries.dev&lt;/a&gt;&lt;/p&gt;

</description>
      <category>node</category>
      <category>javascript</category>
      <category>es6</category>
    </item>
  </channel>
</rss>
